Making a Bieber Coachella meme with Claude Code and ffmpeg
I had a dumb idea, zero motion graphics skills, and 15 minutes. Claude Code, ffmpeg, and 1,005 PNG frames later, Justin Bieber is installing Sanity Studio at Coachella.
Justin Bieber headlined Coachella this weekend. He sat on stage with a MacBook for 30 minutes, searching YouTube for old videos of himself and singing along while his screen was projected to 100,000 people. He pulled up "Baby," old paparazzi clips, "Deez Nuts," "Double Rainbow." At one point the WiFi buffered and he just sat there going "come on, man."
Can relate.
It was weird, nostalgic, and extremely meme-able. Someone keyed him out of the stage footage and dropped in a green screen background, and suddenly the internet had a template. People put Minecraft behind him. Spreadsheets. LinkedIn feeds. Every SaaS account chimed in with its own version.
Of course, I saw it and thought: what if he’s running npm create sanity@latest?
The prompt
I dropped the green screen video link into Claude Code and said, roughly: “I want to replace the green screen with something Sanity related, maybe a CLI command? Can you use ffmpeg for this?”
That was the brief. I’m a lazy prompter.
What Claude Code did
Claude downloaded the video, inspected it with ffprobe, extracted frames and sampled the exact green color (#13ff06). Then it wrote a Python script that generates 1,005 PNG frames of an animated terminal. Not a static image, but a frame-by-frame animation that types out the command, shows the npm prompts, answers the interactive questions (? Project name: bieber-studio), runs spinner animations during dependency install, and lands on the success box.
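The spinner during the dependency install is the same trick as everything else in the script: a pure function of time. This isn't the script's actual code, just a sketch of the idea (the function name and spin rate are my guesses):

```python
def spinner_frame(t, chars="|/-\\", cycles_per_sec=10):
    """Return the spinner character to draw at time t (seconds)."""
    return chars[int(t * cycles_per_sec) % len(chars)]

# At render time, each frame just asks what the spinner looks like "now".
print(spinner_frame(0.00))
print(spinner_frame(0.15))
```

No state to track between frames: given the same timestamp, you always get the same character, which is exactly what you want when rendering frames independently.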
It compiled those frames into a video, then used ffmpeg’s chromakey filter to composite Bieber on top. The whole pipeline:
- Sample the green → #13ff06
- Generate 1,005 terminal frames with Python/Pillow
- Compile frames to video
- Chromakey composite Bieber over the terminal
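Spelled out, the composite step is roughly the invocation below. The filenames are my placeholders; only the chromakey values come from what Claude actually used. I'm sketching it as a Python argument list rather than a shell one-liner:

```python
import shlex

# Key the green out of the stage footage, then overlay it on the terminal video.
# Filenames are placeholders; 0x13ff06:0.28:0.08 is the filter from the pipeline.
cmd = [
    "ffmpeg",
    "-i", "bieber_greenscreen.mp4",   # foreground: stage footage with green screen
    "-i", "terminal.mp4",             # background: the rendered terminal animation
    "-filter_complex",
    "[0:v]chromakey=0x13ff06:0.28:0.08[fg];[1:v][fg]overlay[out]",
    "-map", "[out]",
    "out.mp4",
]
print(shlex.join(cmd))
```

The filter graph keys the green out of the first input (giving it an alpha channel) and overlays the result on the terminal video.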
The actual green screen replacement is this:
```
chromakey=0x13ff06:0.28:0.08
```

Three numbers: the color to key out, how similar a pixel has to be to that color to get keyed, and how much to blend the edges. That’s broadcast-quality chromakey in a CLI flag.
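For intuition, here's roughly what those three numbers do per pixel. The real filter works in YUV color space; this is a simplified RGB approximation I wrote for illustration, not ffmpeg's actual math:

```python
import math

def chroma_alpha(pixel, key=(0x13, 0xFF, 0x06), similarity=0.28, blend=0.08):
    """Approximate per-pixel alpha: 0 = keyed out, 1 = fully opaque."""
    # Normalized color distance between this pixel and the key color (0..1).
    d = math.dist([c / 255 for c in pixel], [c / 255 for c in key]) / math.sqrt(3)
    if blend == 0:
        return 0.0 if d < similarity else 1.0
    # Within similarity: transparent; in the blend band just past it: feathered edge.
    return min(max((d - similarity) / blend, 0.0), 1.0)

print(chroma_alpha((0x13, 0xFF, 0x06)))  # the sampled green: fully transparent
print(chroma_alpha((255, 255, 255)))     # white: fully opaque
```

Crank up similarity and you eat into the subject; crank up blend and the edges go soft. That's the whole tuning loop.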
Some fun details
The typing animation is almost insultingly simple:
```python
def typing_text(full_text, t, start_t, chars_per_sec=18):
    elapsed = t - start_t
    if elapsed <= 0:
        return ""
    return full_text[:int(elapsed * chars_per_sec)]
```

Time in, substring out. No keyframe editor, no timeline, no After Effects. Just math and a for loop drawing text onto 1920x1080 images.
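The whole renderer is that idea in a loop: for each frame index, compute a timestamp and ask each element what it looks like at that moment. A sketch of the outer loop, reusing the typing function (the frame rate and start time are my assumptions, and the actual PNG drawing is stubbed out):

```python
FPS = 30
TOTAL_FRAMES = 1005
COMMAND = "npm create sanity@latest"

def typing_text(full_text, t, start_t, chars_per_sec=18):
    elapsed = t - start_t
    if elapsed <= 0:
        return ""
    return full_text[:int(elapsed * chars_per_sec)]

frames = []
for i in range(TOTAL_FRAMES):
    t = i / FPS                                  # timestamp of this frame
    text = typing_text(COMMAND, t, start_t=1.0)  # command starts typing at 1s
    frames.append(text)                          # real script: draw onto a 1920x1080 PNG

print(repr(frames[0]), repr(frames[-1]))
```

Every frame is computed independently from its timestamp, so there's nothing to keyframe and nothing to get out of sync.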
It didn’t manage to draw a neat box at the end, but if I had bothered, I’d probably have told it to run boxfix.
So this was kinda cool
I had a creative idea and zero desire to open After Effects for a meme. Claude had the ffmpeg knowledge, wrote the frame generation script, figured out the chromakey parameters, and iterated when things needed tuning. Maybe 15 minutes, start to finish.
Not because the work was trivial. There's about 200 lines of Python and a non-obvious ffmpeg pipeline. But Claude never needed me to explain what chromakey is or how Pillow works. I described what I wanted, it handled execution, we went back and forth on the result. Not that different from pairing with a colleague who happens to know ffmpeg really well (except this colleague works at 3am and doesn't judge you for making meme videos).
The final video is 5.6MB. Bieber sits on stage, types npm create sanity@latest, answers the prompts, watches the dependencies install, and gets the success message. The chromakey edges aren’t perfect if you look closely, but for a meme video, that’s kind of the point.
Sometimes the best content is a dumb idea executed fast.