Build Worlds on Faces: Your First AR Filter in Under 5 Minutes
Create Your Own Instagram Filter - Spark AR Studio beginner tutorial | HUD
You can strap a heads-up display to someone's face using free software and a handful of spinning circles. That's not science fiction. That's Tuesday.
Ben Marriott walks through the entire pipeline in a single tutorial... from Adobe After Effects to Spark AR Studio to published Instagram filter. No fluff. No filler. Just a clean three-stage workflow that turns flat animations into face-tracked augmented reality.
And here's what grabbed me... the constraints.
The Beauty of the Box
Instagram gives you four megabytes. That's it. Four. Your filter... every animation, every texture, every piece of visual magic... has to squeeze into a space smaller than most photos on your phone.
So you design at 600×600 pixels. You animate at 12 frames per second. You strip away everything that doesn't earn its place.
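If you want to sanity-check that budget before you ever open Spark AR, a few lines of back-of-envelope Python will do it. The 4 MB cap and 12 fps come from the tutorial; the folder layout and the 5-second loop length are assumptions for the sake of the example.

```python
import os

# Instagram's published cap for Spark AR filters: 4 MB total.
SIZE_LIMIT_BYTES = 4 * 1024 * 1024

def sequence_size(folder):
    """Sum the size of every PNG in an exported sequence folder."""
    return sum(
        os.path.getsize(os.path.join(folder, name))
        for name in os.listdir(folder)
        if name.lower().endswith(".png")
    )

def frame_budget(seconds, fps=12):
    """How many frames a loop needs, and the average bytes each frame
    can spend before the whole sequence blows the 4 MB cap."""
    frames = seconds * fps
    return frames, SIZE_LIMIT_BYTES // frames

# A 5-second loop at 12 fps is 60 frames... roughly 69 KB per frame
# before one element has eaten your entire filter budget.
frames, per_frame = frame_budget(5)
print(frames, per_frame)
```

Run `sequence_size` against each export folder and you'll know you're over budget before Spark AR tells you.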
Sound familiar? Constraints don't kill creativity. They sharpen it. Every limitation is just the universe handing you a chisel and saying, "Now carve."
Stage One: Build Your Elements
The HUD graphics start in After Effects. Simple shapes. An ellipse tool. A dash. A slow rotation. Nothing complex... just intentional motion designed to loop.
The critical export step: PNG sequence with RGB + Alpha channels. That Alpha channel preserves transparency. Without it, your beautiful spinning circle sits on an ugly opaque rectangle. The transparency is what lets your creation breathe on top of the real world.
Think of it like this... the Alpha channel is the thing that makes your art a window instead of a wall.
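One way to catch a missing Alpha channel before it bites you: every PNG declares its color type in its IHDR header, so a short standard-library check can flag exports that came out opaque. This is a hypothetical helper, not part of the tutorial's workflow... color types 6 (RGBA) and 4 (grayscale + alpha) are the ones that carry transparency.

```python
import struct

def png_has_alpha(path):
    """Read a PNG's IHDR chunk and report whether the file carries an
    alpha channel (color type 6 = RGBA, 4 = grayscale + alpha)."""
    with open(path, "rb") as f:
        header = f.read(26)  # 8-byte signature + IHDR length, type, data
    if header[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError(f"{path} is not a PNG file")
    # IHDR data begins at offset 16: width, height, bit depth, color type.
    _w, _h, _depth, color_type = struct.unpack(">IIBB", header[16:26])
    return color_type in (4, 6)

# Hypothetical usage: loop over an export folder and flag any frame
# that would land on an opaque rectangle instead of a window.
# for name in os.listdir("hud_export"):
#     if not png_has_alpha(os.path.join("hud_export", name)): ...
```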
Stage Two: Assemble in Spark AR
Spark AR Studio is free. Windows and Mac. No excuse not to open it.
The workflow inside is elegantly simple:
1. Import your sequences. Create Animation Sequence assets, point them at your exported PNG folders, and match the frame rate to 12fps.
2. Add a Face Tracker. This is the anchor. The thing that says "follow this human's face wherever it goes."
3. Parent a Plane to the Face Tracker. This is the core mechanic. A flat surface... a canvas... that now moves with the user's head in real time. Parent-child hierarchy. The Face Tracker leads, the Plane follows.
4. Create Materials, assign Textures. Link your animation sequences to the planes through materials. This is where your spinning circles finally show up on screen, locked to someone's face.
5. Scale to fit. Bump the X and Y to 2.5. The Z doesn't matter... planes have no depth.
Five steps. Your first element is now face-tracked and animating.
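The parent-child idea is worth internalizing, because it's how every scene graph works, not just Spark AR's. A toy sketch of the concept... not Spark AR's actual API, and the node names are made up:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A scene object with a local position. Children inherit the
    parent's world position, the way a Plane inherits from a Face Tracker."""
    name: str
    local: tuple = (0.0, 0.0, 0.0)
    children: list = field(default_factory=list)

    def world(self, parent_pos=(0.0, 0.0, 0.0)):
        """World position = parent's world position + local offset."""
        return tuple(p + l for p, l in zip(parent_pos, self.local))

# Hypothetical scene: the face tracker leads, the plane follows.
face_tracker = Node("faceTracker0", local=(0.12, -0.05, 0.0))
hud_plane = Node("plane0")
face_tracker.children.append(hud_plane)

# When the head moves, the tracker's position changes and every child
# plane picks up that movement for free. No extra wiring.
print(hud_plane.world(parent_pos=face_tracker.world()))
```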
Stage Three: Stack for Depth
This is where it gets beautiful.
Duplicate your planes. Assign different HUD elements to each. Then grab the blue Z-axis arrow and pull one plane slightly forward. Pull another even closer.
That's it. That's the whole trick.
When the user moves their head, those layers shift at different rates. Parallax effect. Flat 2D elements suddenly feel like they exist in three-dimensional space. No scripting. No 3D modeling. Just... separation along a single axis creating the illusion of depth.
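If you want that intuition in numbers, here's a toy parallax model... not Spark AR's real perspective projection, and the depths and units are invented. The only claim is the shape of the effect: apparent shift falls off with distance, so the planes you pulled forward slide farther than the ones left behind.

```python
def apparent_shift(head_offset, distance):
    """Toy parallax: a layer's apparent on-screen shift scales inversely
    with its distance from the camera, so nearer layers slide farther."""
    return head_offset / distance

# Hypothetical stack: three duplicated planes at slightly different depths.
# "front" is the one dragged closest to the camera with the blue Z arrow.
planes = {"back": 1.00, "mid": 0.95, "front": 0.90}
head_offset = 1.8  # arbitrary units of sideways head movement

shifts = {name: apparent_shift(head_offset, d) for name, d in planes.items()}
# The layers disagree about how far they moved. That disagreement is
# the parallax that makes flat art read as depth.
print(shifts)
```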
Layers. Separation. Depth through distance.
I think about that a lot outside of software, too. The things that give our lives richness aren't all on the same plane. They're stacked at different depths... some close, some further out... and it's the movement between them that creates the feeling of something real. ✨
Stage Four: Ship It
File > Export. Follow the prompts to Spark AR Hub. Submit for review.
Fair warning... the review process takes time. Ben mentions at least 10 days. Patience is part of the build. You don't get to skip the waiting just because you finished the making.
Why This Matters for Creators
This tutorial isn't just about Instagram filters. It's a gateway.
Augmented reality is becoming a literacy. The creators who understand how to layer digital elements onto physical reality... who can build responsive, face-tracked, interactive experiences... they're building the visual language of the next decade.
And the barrier to entry? Free software. A webcam. Some spinning circles.
The tools are here. The constraints are generous enough to challenge you and tight enough to focus you. The workflow spans skills most motion designers already have... After Effects compositing, understanding transparency, thinking in layers.
Ben Marriott made this tutorial approachable enough that someone with basic After Effects chops can have a working AR filter by the end of the afternoon. That's not scratching the surface. That's handing someone a key and showing them which door it opens.
Go build something. Put a universe on someone's face. 🚀
Four megabytes. Twelve frames per second. Three software stages. One filter that tracks a human face in real time and wraps it in light. The constraints aren't the enemy... they're the blueprint. Start small. Stack your layers. Ship it before it's perfect. The review period will teach you patience, and the next filter will teach you everything else.
--- Source: https://www.youtube.com/watch?v=xSgRgcNqzfU
From TIG's Notebook
Thoughts that surfaced while watching this.
The mediocre teacher tells; the good teacher explains; the superior teacher demonstrates; the great teacher inspires. — *William Arthur Ward* — TIG's Notebook — On Mentorship & Teaching
Schedule love. Because when someone needs you, it's never convenient. — TIG's Notebook — Core Principles
The two most important days in your life are the day you are born and the day you find out why. — *Mark Twain* — TIG's Notebook — On Purpose & Legacy
Echoes
Wisdom from across the constellation that resonates with this article.
A comedy maker builds three genuinely practical studio projects, revealing principles about finding shared patterns in chaos, tracking time as awareness instead of pressure, and using every available tool to close the gap between imagination and reality.
Practice visible humility by admitting mistakes first and publicly... this creates permission for your team to do the same
ChatGPT 5.4 treats tasks as pipelines to execute, not problems to understand.