“Virtual try-on” usually means an AR widget embedded on the storefront — a project that takes weeks to integrate and brings a long tail of bug reports. Most brands do not need that. They need try-on imagery: pre-rendered shots that show a garment on a model for marketing use. Here is how to build that capability without an SDK.
The two completely different products called “virtual try-on”
First, an embedded AR experience: the shopper turns on the camera and the garment renders on their body in real time. Useful for high-AOV eyewear and beauty, but expensive to integrate, brittle on mobile, and it adds a CRO experiment most small brands cannot afford to run.
Second, AI try-on imagery: a marketing-team workflow that ships on-model PDP shots, ad creatives, and lookbook content from a garment file plus a model image. No widget, no SDK, no on-page impact. This is the one most fashion brands actually want when they say “we should add try-on”.
Step 1: gather the two inputs
- A clean photo of the garment. Flat-lay on a phone is fine.
- A model image. Either an upload of a real model with rights, or a creator photo set from the Apiway marketplace, or a White Studio preset model.
Step 2: pick the Apiway template
For try-on imagery specifically, use the Virtual try-on template. It accepts the model image and the garment file and renders the garment onto the model. The Hollywood-anchor logic applies here: the model is a real human (real face, real eyes, real environment), and the garment is the AI overlay layer.
Output is delivered in your chosen aspect ratio (1:1, 4:5, 9:16, etc.) and on the original environment of the model image — not on a synthetic background. That is what makes the result feel real.
Step 3: batch it for marketing volume
A clothing brand running paid social typically needs 10–30 try-on shots per garment per week (creative refresh, A/B test variants, format ratios). Apiway's pricing makes this affordable: a single try-on shot costs a handful of credits, and one credit equals one US cent.
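To make the pricing concrete, here is a back-of-envelope sketch. The only figure the article states is the exchange rate (one credit = one US cent); the credits-per-shot number below is a hypothetical placeholder, not a published price.

```python
# Back-of-envelope cost for a weekly try-on batch.
# ASSUMPTION: CREDITS_PER_SHOT is hypothetical -- the article only says
# a shot costs "a handful of credits".
CREDITS_PER_SHOT = 5    # hypothetical figure, not a published price
USD_PER_CREDIT = 0.01   # stated in the article: one credit = one US cent

def weekly_cost_usd(shots_per_garment: int, garments: int) -> float:
    """Cost in USD for one week of try-on renders."""
    return shots_per_garment * garments * CREDITS_PER_SHOT * USD_PER_CREDIT

# 30 shots across 5 garments at the assumed rate:
print(round(weekly_cost_usd(30, 5), 2))  # 7.5
```

Even at the top of the 10–30 range across a five-garment catalog, the weekly spend stays in single-digit dollars under these assumptions.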
The fastest cadence: pick three creator photo sets per garment, run the garment through each, ship the three results into the ad rotation. Fresh creative every week without a studio booking.
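The cadence above can be sketched as a small orchestration loop. `render_tryon` below is a stand-in for whatever Apiway call or UI action produces one try-on image; it is illustrative, not a documented API.

```python
# Sketch of the weekly creative rotation from Step 3.
# ASSUMPTION: render_tryon() is a hypothetical placeholder for the
# Virtual try-on template run; Apiway's real interface is not shown here.
from typing import Dict, List, Sequence

def render_tryon(garment_file: str, photo_set: str, ratio: str) -> Dict[str, str]:
    # Placeholder: a real run would invoke the Virtual try-on template.
    return {"garment": garment_file, "model": photo_set, "ratio": ratio}

def weekly_rotation(garment_file: str,
                    photo_sets: Sequence[str],
                    ratios: Sequence[str] = ("9:16", "4:5", "1:1")) -> List[Dict[str, str]]:
    """Three creator photo sets x three ad ratios = nine fresh creatives."""
    return [render_tryon(garment_file, ps, r)
            for ps in photo_sets for r in ratios]

shots = weekly_rotation("linen-shirt.png", ["set-a", "set-b", "set-c"])
print(len(shots))  # 9
```

The design choice is simply a cross-product: every photo set is rendered in every ad ratio, so one garment yields a full week of rotation-ready creatives in a single pass.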
Step 4: where to use the output
- PDPs: additional on-model shots beyond the catalog hero.
- Paid ads: Meta and TikTok creative in 9:16, 4:5, and 1:1 formats.
- Email: hero imagery for newsletter sends.
- Lookbook PDFs: wholesale linesheets and buyer presentations.
- Organic social: Instagram feed and Reels covers.
Step 5: rights and licensing
For Apiway creator photo sets, the marketplace consent flow already covers commercial use of the AI-modified output by the brand. For uploaded model photos, ensure the rights agreement with the model permits AI modification and downstream commercial use. Apiway does not relicense your uploaded model imagery to anyone else.
When the AR widget is the right move instead
For high-AOV categories where the customer is choosing between sizes or shapes (eyewear, watches with diameter variation, jewelry sizing), AR try-on at the storefront level can lift conversion enough to justify the integration cost. For most clothing brands, where the "will this fit" question is about size rather than shape, AI try-on imagery is the higher-leverage move.
Generate try-on imagery this week
Pick one current SKU and one creator photo set. Run a single try-on pass. Free accounts ship with 100 one-time credits — enough to test on a real garment without committing to anything.
