Behind the scenes · 5 min read

Why we never train on your uploaded photos (and how to verify it)

Anton Viborniy

Co-founder & CEO of Apiway

AI brands talk a lot about “your data is safe.” Here is what Apiway actually does — and does not do — with your uploads, written for technical readers who want to verify rather than trust.

The policy statement

Apiway does not train any AI model on your uploaded photos (garments, model references, creator photo sets). Your uploads are stored privately to your workspace, used only to generate the outputs you explicitly request, and never included in any training dataset, whether for a model we operate ourselves or for a model run by a provider we send images to.

That is the entire policy. There is no exception clause for “research purposes”, no training-enabled default buried in settings that you have to opt out of, no anonymised-aggregate carve-out.

How this is actually implemented

Three concrete mechanisms keep uploads out of training.

1. Provider configuration. When Apiway sends an image to an external generation backend (Google Gemini, etc.), the provider request is configured with the no-training flag where the provider supports it. Major providers all expose this setting; we use it.

2. Storage isolation. Uploaded images live in workspace-scoped storage with row-level-security policies that prevent cross-workspace access. No background process iterates across the entire upload corpus.

3. No internal training pipeline. Apiway does not operate a training pipeline that consumes user uploads. We use pretrained models from third parties and custom-trained models built on our own non-user-derived data.
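To make mechanism 1 concrete, here is a minimal sketch of how a request to a generation backend might be assembled with the no-training setting attached at construction time. All names here (`build_generation_request`, `no_training`, the provider keys) are illustrative assumptions, not Apiway's or any provider's actual API.

```python
# Illustrative sketch only: field and function names are hypothetical.
# The point is that the no-training opt-out is set programmatically on
# every request, not left to a dashboard default.

PROVIDER_SUPPORTS_NO_TRAINING = {
    # Whether each (hypothetical) backend exposes a per-request flag.
    "gemini": True,
    "legacy-backend": False,
}

def build_generation_request(provider: str, image_bytes: bytes, prompt: str) -> dict:
    """Assemble a generation request, opting out of training where supported."""
    request = {
        "provider": provider,
        "prompt": prompt,
        "image": image_bytes,
        "options": {},
    }
    if PROVIDER_SUPPORTS_NO_TRAINING.get(provider, False):
        # Opt out of provider-side training on this request.
        request["options"]["no_training"] = True
    return request

req = build_generation_request("gemini", b"<jpeg bytes>", "swap garment")
```

The design point is that the flag lives in the request path itself, so a code review (or a network trace, as described below) can confirm it rather than taking a settings page on faith.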

How to verify yourself

For technical readers, three quick verification checks.

  1. Read the privacy policy linked from the footer. The no-training language is explicit and binding, not just marketing copy.
  2. Inspect the network requests when generating an image. The provider call includes the no-training parameter where applicable.
  3. Check the storage configuration. Image storage is workspace-scoped via Supabase row-level security, and we do not run any background analytics on image content.
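Check 2 can be automated: capture an outgoing generation request (for example, copied from the browser's network tab) and scan it for the no-training parameter. The payload shape and key names below are assumptions for illustration.

```python
import json

# Hypothetical captured payload; key names are illustrative.
captured = json.loads("""
{
  "model": "gemini",
  "prompt": "swap garment",
  "options": {"no_training": true}
}
""")

def has_no_training_flag(payload: dict) -> bool:
    """Recursively search a request payload for a truthy no-training key."""
    for key, value in payload.items():
        if key in ("no_training", "noTraining") and value is True:
            return True
        if isinstance(value, dict) and has_no_training_flag(value):
            return True
    return False

print(has_no_training_flag(captured))
```

Because providers name this setting differently, a real check would match against the specific provider's documented parameter rather than a generic key list.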

What is stored, and for how long

Uploads are stored as long as you keep the workspace active. Generated outputs are stored in your gallery until you delete them. Deleting an upload removes it from the active store; backup retention runs for a short window (days, not months) for disaster-recovery purposes.
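The deletion lifecycle above can be sketched as a small state model. The 7-day backup window is an illustrative assumption standing in for "days, not months"; the class and method names are hypothetical.

```python
from datetime import datetime, timedelta

BACKUP_RETENTION = timedelta(days=7)  # illustrative: "days, not months"

class Upload:
    """Hypothetical model of an uploaded image's storage lifecycle."""

    def __init__(self, name: str):
        self.name = name
        self.in_active_store = True
        self.deleted_at = None

    def delete(self, now: datetime) -> None:
        # Removal from the active store is immediate.
        self.in_active_store = False
        self.deleted_at = now

    def in_backups(self, now: datetime) -> bool:
        # Backups persist only for the disaster-recovery window.
        if self.deleted_at is None:
            return True
        return now - self.deleted_at < BACKUP_RETENTION

u = Upload("garment.jpg")
t0 = datetime(2025, 1, 1)
u.delete(t0)
```

Under this model, a deleted upload is gone from the active store instantly and drops out of backups once the disaster-recovery window elapses.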

For creator marketplace photo sets, the set is stored as long as the set is published. Unpublishing removes the set from the public marketplace; the underlying image files stay in your private workspace until you delete them.

What the creator marketplace actually does with photos

Brands buying generations against your set never receive your original photo files. They receive the AI-modified derivative output. Your originals stay in your workspace.

We do not anonymise, repurpose, or aggregate creator photo sets for any purpose other than the marketplace functionality the creator opted into. (For more detail, see: photo set licensing for AI.)

What about the LLM provider?

This is the question most users actually want answered. When Apiway sends your image to Google Gemini for generation, what does Google do with it?

Apiway uses the enterprise/no-training tier of major providers wherever it is available. The provider does not retain images for training under this tier. The image is used to fulfill the generation request and discarded by the provider after a short retention window.

This is true at the provider configuration level, not just promised in marketing copy. If a provider stops offering a no-training tier, we will move generation work to a provider that does.

Why this matters for creators specifically

Creators uploading photo sets grant commercial-use rights for AI overlay (per the marketplace consent), but they do not grant training rights. Their likeness is not used to train any model. The commercial-use license is narrowly scoped to per-generation overlays.

For working models and influencers, this distinction is the entire reason the marketplace works. If listing a set meant feeding it into an AI training pipeline, almost no professional creator would list. The narrow consent flow is what makes the marketplace acceptable to professionals.

Read the privacy policy directly

The privacy policy is linked from the footer of every page. The no-training language is in plain English, not legalese. Open a free account and walk through the upload flow if you want to inspect the actual mechanics.