How I choose between AI and traditional VFX for every shot
It's not AI versus VFX. It's knowing when to use what. After 20 years of post-production and a year of generative AI, here's how I decide.

The question everyone asks me
"With AI, are traditional VFX still needed?" Directors, producers, agencies all ask me this. The answer is yes — but not for everything. And not like before.
I use generative AI tools (Runway, Veo, Kling) every day, alongside After Effects, Nuke and 3ds Max. I haven't replaced one with the other. I've added tools to the toolbox. The choice depends on the shot, the project, the budget and the result needed.
Here's how I actually decide.
When I choose AI
Concept and pre-visualization. Need to explore ten creative directions for a director? AI generates variants in hours, not weeks. I can show three different looks for a scene before shooting a single frame. For anyone presenting an idea to a client or investor, this changes everything.
Environmental elements and backgrounds. Skies, landscapes, organic textures — AI produces excellent base material that I then refine in compositing. A dramatic AI-generated sky integrated into real footage can be indistinguishable from premium stock.
Social and digital content with tight timelines. When the budget is limited and the destination is Instagram or a website, AI delivers visually rich content at a fraction of the traditional cost.
Experimental narrative projects. Entirely AI-generated short films and concepts are new creative territory, where traditional film experience is the real differentiator: AI generates frames, but it doesn't tell stories. That's your job.
When I choose traditional VFX
Live footage integration. Need to insert a 3D element into real footage with moving camera? That requires precise camera tracking, lighting match, multilayer compositing. AI doesn't do this — not yet, and when it does, someone will still need to judge if the result is credible.
Pixel-level control. Luxury brand commercials, theatrical films, any project where every frame must be perfect. AI produces output with subtle artifacts that a distracted audience won't notice, but a creative director will. When perfection is required, I reach for my traditional tools.
Coherence across long sequences. AI struggles to maintain visual consistency between consecutive shots — same character, same lighting, same environment. For a narrative sequence of 30 seconds or more, the traditional workflow is still more reliable and often faster.
Rotoscoping and clean-up. Removing a microphone from frame, isolating a subject frame-by-frame, eliminating reflections. Precision work where AI helps (Runway has decent auto-roto) but doesn't replace the eye.
And when I combine them
This is the interesting part, and it's why the hybrid profile is the strongest one on the market.
A recent project: the director wanted an apocalyptic sky for an exterior scene. Traditional solution: matte painting in Photoshop, 3D projection, compositing in Nuke. Estimated time: two days. Hybrid solution: I generated 20 sky variants with AI, selected the best, then refined it in After Effects — color match, integration with footage, artifact correction. Actual time: four hours. Identical result.
AI did the "heavy" work (generating the base), experience did the "smart" work (selecting, refining, integrating). This is the future workflow — and it's already my present.
If you want to learn more about my traditional visual effects or my approach to AI video production, I have dedicated pages for both.
The rule I apply
It's not complicated: if the AI result is at 90% quality in a tenth of the time, I use AI and bring that 90% to 100% with traditional tools. If the AI result is at 60%, I go straight to traditional VFX. The ability to judge where we are on that scale is what makes the difference — and it only comes from experience.
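As a rough sketch, the rule above can be written as a simple threshold check. The numbers and function names here are illustrative only; in practice the quality estimate is a judgment call made by eye, not a measurable score.

```python
def choose_workflow(ai_quality: float, ai_speedup: float) -> str:
    """Illustrative version of the rule of thumb.

    ai_quality -- estimated quality of the raw AI output, 0.0 to 1.0
                  (in reality, judged by eye, not measured)
    ai_speedup -- how many times faster the AI route is
    """
    # If the AI gets ~90% of the way there in a tenth of the time,
    # start from the AI base and finish with traditional tools.
    if ai_quality >= 0.9 and ai_speedup >= 10:
        return "AI base, traditional finish"
    # Otherwise (e.g. AI lands around 60%), go straight to traditional VFX.
    return "traditional VFX"

print(choose_workflow(0.9, 10))  # AI base, traditional finish
print(choose_workflow(0.6, 10))  # traditional VFX
```

The point of the sketch is that the thresholds are easy; estimating where a given shot actually sits on that scale is the part that takes experience.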
Have a project in mind?
If this article gave you useful ideas and you want to understand how to apply them to your project, tell me what you need.

