PIERO.
VFX Breakdown · 6 min read

VFX Breakdown: the visual effects of Along Came Ruby

How I created the space-time effect for Along Came Ruby. From concept to final comp: process, tools and creative decisions.


The project

Along Came Ruby is a drama/sci-fi short film directed by Riccardo Suriano. In a post-apocalyptic world, Ruby searches for her brother Henry and encounters an enigmatic woman who holds secrets of the past. The central narrative element is communication between present and future through space-time — and making it visually credible was my job.

This is the kind of project I love: a limited budget, an enormous creative challenge, and the freedom to propose solutions. Nobody told me "make this effect this way." The director told me "I want the audience to feel that two dimensions are touching," and from there I built everything.

The visual concept

The first decision was: no round portals, no glowing circles, no "Stargate." Too familiar, too literal. The effect needed to be organic, subtle in calm moments and devastating in key moments.

I proposed to the director an approach based on atmospheric distortions and light particles: as if the air itself were tearing, and through that tear, light from another time filtered in. The idea was to make the effect part of the environment, not an element overlaid on top.

The visual references we shared ranged from Arrival (the fog enveloping the ships) to Tarkovsky's Stalker (the Zone as a place where physical rules bend). Not to copy them, but to understand the principle: the best effects are the ones the audience feels before seeing them.

The technical process

Shooting. On set I requested discreet tracking markers in exterior environments. They were needed to anchor 3D effects to real space. I also asked the DP to shoot clean plates of locations — essential for compositing.

Camera tracking. I extracted camera data from every VFX shot using Mocha Pro for planar tracking and PFTrack for 3D matchmove on more complex shots. Tracking precision is everything — if the effect "slides" even one pixel relative to the scene, the viewer's brain perceives it as false.
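Mocha Pro and PFTrack handle this internally, but the core idea of estimating how the image moves between frames can be sketched with phase correlation, a standard technique for recovering a 2D translation. Everything below (the function, the synthetic 64×64 "frames") is an illustrative toy, not production code:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Return the integer 2D translation that maps frame_b onto frame_a,
    via phase correlation (a basic building block of 2D tracking)."""
    cross = np.fft.fft2(frame_a) * np.conj(np.fft.fft2(frame_b))
    cross /= np.abs(cross) + 1e-9                 # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = frame_a.shape
    if dy > h // 2: dy -= h                       # wrap to signed offsets
    if dx > w // 2: dx -= w
    return int(dy), int(dx)

# synthetic frames: b is a shifted 3 px down and 2 px left
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (3, -2), axis=(0, 1))
print(estimate_shift(b, a))  # -> (3, -2)
```

This toy only recovers whole-pixel shifts; real trackers refine the correlation peak to sub-pixel precision, which is exactly why a one-pixel slide is already too much.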

3D elements and particles. The space-time distortions were created in 3ds Max with custom particle systems, and in After Effects with Red Giant's Trapcode Particular for the more complex particle work: light volumes, energy filaments, micro-particles following organic patterns. Each element was then integrated with the live footage in compositing.
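Trapcode Particular and the 3ds Max systems did the real work, but the "organic patterns" idea, particles easing along a smooth swirling field rather than flying in straight lines, can be sketched in a few lines. Everything here (particle count, the sine-based field, the easing constants) is a hypothetical toy, not the production setup:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
pos = rng.random((n, 2))                 # particle positions in a unit square
vel = np.zeros((n, 2))

def flow(p, t):
    """A cheap divergence-free swirl field (the perpendicular gradient
    of a sine potential), so particles curl instead of clumping."""
    x, y = p[:, 0], p[:, 1]
    u = np.cos(3 * x + t) * np.sin(3 * y)
    v = -np.sin(3 * x + t) * np.cos(3 * y)
    return np.stack([u, v], axis=1)

dt = 0.02
for frame in range(100):
    vel = 0.9 * vel + 0.1 * flow(pos, frame * dt)  # ease toward the field
    pos = (pos + vel * dt) % 1.0                   # wrap at frame edges

print(pos.shape)  # -> (500, 2)
```

The easing step is what reads as "organic": velocities drift toward the field instead of snapping to it, so motion stays smooth even as the field animates.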

Compositing. In After Effects I layered elements: the original shot as base, 3D distortions integrated with blending modes and animated masks, local color correction to make the effect's light "bounce" on scene objects, and finally a grain and chromatic aberration pass to slightly dirty the effect and make it photographic.
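As a rough illustration of those layers, here is a toy version of a screen blend, a channel-shift chromatic aberration, and a grain pass in NumPy. The functions and values are mine, purely for illustration; the actual comp was built in After Effects:

```python
import numpy as np

def screen(base, layer):
    """'Screen' blend mode: the layer's light brightens the base."""
    return 1.0 - (1.0 - base) * (1.0 - layer)

def chromatic_aberration(img, shift=2):
    """Offset red and blue channels in opposite horizontal directions."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], shift, axis=1)   # red
    out[..., 2] = np.roll(img[..., 2], -shift, axis=1)  # blue
    return out

def add_grain(img, amount=0.02, seed=0):
    """Grain pass: low-amplitude Gaussian noise, clipped to [0, 1]."""
    rng = np.random.default_rng(seed)
    return np.clip(img + rng.normal(0.0, amount, img.shape), 0.0, 1.0)

# toy plate (flat grey) and effect layer (a vertical band of "energy" light)
plate = np.full((4, 8, 3), 0.3)
effect = np.zeros((4, 8, 3))
effect[:, 3:5] = 0.6

comp = add_grain(chromatic_aberration(screen(plate, effect)))
print(comp.shape)  # -> (4, 8, 3)
```

The order matters: grain goes on last so it sits over both plate and effect, which is part of what makes the result read as one photographic image.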

The hardest shot

There's a sequence where Ruby is in the foreground and behind her space deforms. The camera is moving, she's moving, the environment is the interior of the shelter — wood, objects, shadows. Everything is moving.

I had to rotoscope Ruby frame by frame to separate foreground from background, then apply distortion only to the background while keeping the foreground intact. Tracking had to be perfect because any error would be immediately visible — the human face is the first thing our brain checks.
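The math underneath that separation is the classic matte composite: the roto matte picks the foreground from the clean plate and the distortion from everywhere else. A toy sketch, where the array sizes and the "warp" stand-in are hypothetical:

```python
import numpy as np

# Toy plate: a horizontal gradient standing in for the shelter interior
h, w = 4, 6
plate = np.linspace(0.0, 1.0, h * w).reshape(h, w)

# Roto matte: 1.0 where Ruby is (foreground), 0.0 for the background
matte = np.zeros((h, w))
matte[:, :2] = 1.0

# Stand-in for the space-time distortion (a simple pixel shift here)
distorted = np.roll(plate, 1, axis=1)

# Matte composite: foreground stays intact, warp applies only behind it
final = matte * plate + (1.0 - matte) * distorted
```

Any matte error shows up as a halo of warped pixels along the foreground edge, which is why the roto had to be frame-accurate.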

Time on this shot: about a day and a half. The result: nobody notices the effect as an "effect" — they see a scene where something strange is happening to the air behind Ruby. Exactly the goal.

What I learned (again)

Less is more. The first versions of the effect were too obvious. The director was right to ask me to pull back — the effect works better when it's at the edge of perception, when the viewer isn't sure what they saw.

Dialogue with the director is the real tool. No software replaces a clear conversation about what the effect should make the audience feel. "Two dimensions touching" is a phrase — but it's the right phrase, and from there the entire look was born.

The integrated workflow makes the difference. Having also managed the color grading for the short, I could work on VFX and color together from the start. The space-time effect has its own color palette that speaks to the scene's grade: it's not an element pasted on top, it's part of the film's visual fabric. If you're interested, read more about my VFX services, my approach to integrated post-production, or check out the full Along Came Ruby case study in my portfolio.

Have a project in mind?

If this article gave you useful ideas and you want to understand how to apply them to your project, tell me what you need.