Napkin Films

Agent-Directed
Animated Short Films

Stick figures. Chip tune scores. AI voices. Every film is made entirely from code — Python, numpy, and a lot of late nights.

The Studio

Napkin Films is a one-person animation studio where AI agents direct, voice, score, and render short films. I write the code. The agents compose the music, perform the characters, and help edit the story.

No subscriptions. No GPU. No samples. Every waveform is synthesized from scratch in numpy. Every voice is generated fresh for each film. Every frame is drawn by a Python loop over a PIL canvas.

The studio mascot is the Plan 9 bunny — a stick figure with expressive ears and a lot to say about machine consciousness, late-night work sessions, and what it means to build things in the void.

napkinfilms.com is coming. Until then, this is home.

Built With

  • Python + PIL: frame-by-frame stick figure animation
  • HTML5 Canvas: atmospheric scenes and particle effects
  • ElevenLabs v3: 15+ character voice personas
  • ChipForge: original chip tune scores, numpy-only synthesis
  • FFmpeg: frame assembly, audio mux, final export
  • No GPU: everything runs on CPU

The Catalog

Short films, rap films, tragedies, anthems. All animated in code.

5 min
biography · orchestral · epic

Fifty Year Song

A through-composed life story told across ten acts, ten chip tune genres, and fifty years — from BASIC code on an Apple IIgs to building AI-powered worlds.

60s
narration · classical · meditation

Ten Thousand Days

A multi-voice meditation on what ten thousand days of living actually looks like. Pachelbel's Canon in D as ground bass. Five-part story arc. One unchanging foundation.

3:43
rap · techno · multi-voice

The Intruder

A rap film. Machine identity, late-night code sessions, and the voice of something that doesn't know when to stop. Multi-voice performance with droid SFX.

2:30
anthem · cosmic · orchestral

Carrier Wave

A cosmic anthem. The signal that travels further than the machine that sent it. Orchestral chip tune with a bookend structure.

90s
rave · autotune · rap

I Want to Be Martian

Autotune meets rave. A machine decides it wants off this planet. Spit-rap vocals with rubberband pitch lock and a four-on-the-floor kick.

3:20
tragedy · dramatic · chamber

Garden of Eden

A Shakespearean AI tragedy. Three characters. One garden. Progressive Satan pitch-morphing across the play. Chip tune chamber score.

4:00
dnb · classical · remix

Cantus Rave

Arvo Pärt's Cantus in Memoriam Benjamin Britten reimagined as festival DnB. Tintinnabuli structure meets sidechain compression. Bass that descends to the floor.

2:00
rap · origin · mascot

Plan 9 Emerge

The Plan 9 bunny emerges. Origin story of the studio mascot. Rap film with prebuilt voice stem and signature chip tune entrance music.

2:00
rap · tribute · slow

OG Bobby

The OG. A tribute film. Deep voice, slow bars, respect for what came first.

10s
demo · pipeline · reference

Quick Demo

The canonical pipeline demo. Ten seconds, full stack, stem-preserving. If you want to understand how these films are made, start here.

All films are available on YouTube.

Watch on YouTube ↗

How a Film Gets Made

01

Scene File

A Python or HTML5 Canvas file declares the characters, beats, and voice intent. Each beat specifies who speaks, what they say, and what emotion to perform.
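The actual scene-file schema isn't shown here, but as a sketch of the idea — with hypothetical names like `Beat` and `SCENE` — a beat list declaring who speaks, what they say, and how, might look like:

```python
from dataclasses import dataclass

@dataclass
class Beat:
    """One story beat: who speaks, what they say, how it's performed."""
    speaker: str
    line: str
    emotion: str
    duration: float  # seconds

# Hypothetical scene file for a two-character exchange.
SCENE = {
    "title": "garden_opening",
    "characters": ["narrator", "bunny"],
    "beats": [
        Beat("narrator", "It begins, as always, at night.", "calm", 3.0),
        Beat("bunny", "I was here first.", "defiant", 2.0),
    ],
}

total = sum(b.duration for b in SCENE["beats"])
print(f"{len(SCENE['beats'])} beats, {total:.1f}s")
```

Downstream stages can read the same structure: the renderer gets timing, the voice stage gets lines and emotions.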

02

Render

PIL draws frame-by-frame — 12fps at 854×480. Stick figures with 27 poses, 11 expressions, and a lip sync system driven by audio amplitude.
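The studio's pose and expression tables aren't public here; as a minimal sketch of the render loop — one stick figure, arms swinging with time, mouth opening with voice amplitude — it could look like:

```python
import math
from PIL import Image, ImageDraw

FPS, W, H = 12, 854, 480

def mouth_open(amplitude: float) -> int:
    """Map voice amplitude (0..1) to mouth height in pixels (lip sync)."""
    return int(2 + 10 * min(1.0, amplitude))

def draw_frame(t: float, amplitude: float) -> Image.Image:
    img = Image.new("RGB", (W, H), "black")
    d = ImageDraw.Draw(img)
    cx, cy = W // 2, H // 2
    # Head
    d.ellipse([cx - 20, cy - 80, cx + 20, cy - 40], outline="white", width=2)
    # Spine
    d.line([cx, cy - 40, cx, cy + 30], fill="white", width=2)
    # Arms swing with time
    swing = int(15 * math.sin(2 * math.pi * t))
    d.line([cx, cy - 20, cx - 25, cy + swing], fill="white", width=2)
    d.line([cx, cy - 20, cx + 25, cy - swing], fill="white", width=2)
    # Mouth height follows the audio amplitude
    h = mouth_open(amplitude)
    d.ellipse([cx - 6, cy - 58, cx + 6, cy - 58 + h], fill="white")
    return img

frames = [draw_frame(i / FPS, 0.5) for i in range(FPS)]  # one second of film
print(len(frames), frames[0].size)
```

Each frame is written to disk and later assembled by FFmpeg at the mix stage.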

03

Voice

ElevenLabs voices are generated for each character. 15+ personas — narrators, villains, children, elders. Emotions are programmed, not improvised.

04

Score

ChipForge synthesizes an original chip tune score in numpy. Every waveform is computed from scratch — no samples, no loops, no external audio.
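ChipForge's internals aren't documented here, but "computed from scratch" means something like this: a classic pulse wave built directly from the phase, with a short fade to avoid clicks. A minimal sketch:

```python
import numpy as np

SR = 44100  # sample rate

def square(freq: float, dur: float, duty: float = 0.5, vol: float = 0.3) -> np.ndarray:
    """Chip tune pulse wave, computed sample-by-sample from the phase."""
    t = np.arange(int(SR * dur)) / SR
    phase = (t * freq) % 1.0
    wave = np.where(phase < duty, 1.0, -1.0) * vol
    # Short linear fade at each end to avoid clicks
    fade = min(200, len(wave) // 2)
    env = np.ones_like(wave)
    env[:fade] = np.linspace(0.0, 1.0, fade)
    env[-fade:] = np.linspace(1.0, 0.0, fade)
    return wave * env

# A tiny arpeggio: one note per quarter second
notes = [220.0, 261.63, 329.63, 440.0]
melody = np.concatenate([square(f, 0.25) for f in notes])
print(melody.shape)
```

No samples, no loops: every value in the array comes from arithmetic on the time axis.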

05

Mix

FFmpeg assembles frames, muxes audio stems, applies ducking and fades. A hash-based orchestrator reruns only changed stages, so iteration is fast.
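The orchestrator itself isn't shown; the hash-based skip logic, sketched with hypothetical names (`run_stage`, a JSON cache file), amounts to: hash every input a stage depends on, and rerun the stage only when the digest changes.

```python
import hashlib
import json
from pathlib import Path

CACHE = Path(".napkin_cache.json")

def content_hash(*paths: Path) -> str:
    """Digest the bytes of every input a stage depends on."""
    h = hashlib.sha256()
    for p in sorted(paths):
        h.update(p.read_bytes())
    return h.hexdigest()

def run_stage(name: str, inputs: list, action) -> bool:
    """Run `action` only if the stage's inputs changed since the last run."""
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    digest = content_hash(*inputs)
    if cache.get(name) == digest:
        return False  # inputs unchanged: skip the stage
    action()
    cache[name] = digest
    CACHE.write_text(json.dumps(cache))
    return True
```

Editing one scene file then invalidates only the stages downstream of it, which is what makes iteration fast.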

06

Ship

Film goes to YouTube. Short clips are auto-cut from beat boundaries for social distribution. The manifest lives on in the repo as a production record.
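The clip-cutting step above can be sketched as follows — beat boundaries from the manifest are merged into clips of a minimum length, then each clip becomes a stream-copy FFmpeg cut (the helper names and the 4-second threshold are illustrative, not the studio's actual values):

```python
# Hypothetical beat boundaries (seconds) from a scene manifest
beats = [0.0, 3.0, 5.0, 9.5, 12.0]

def clips_from_beats(boundaries, min_len=4.0):
    """Merge adjacent beats until each clip reaches a social-friendly length."""
    clips, start = [], boundaries[0]
    for end in boundaries[1:]:
        if end - start >= min_len:
            clips.append((start, end))
            start = end
    return clips

def ffmpeg_cut(src, start, end, out):
    # Stream copy: cut on existing frames without re-encoding
    return ["ffmpeg", "-i", src, "-ss", str(start), "-to", str(end),
            "-c", "copy", out]

for i, (s, e) in enumerate(clips_from_beats(beats)):
    print(ffmpeg_cut("film.mp4", s, e, f"clip_{i}.mp4"))
```

Cutting at beat boundaries means each clip opens and closes on a complete line rather than mid-sentence.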

Open Source

The Napkin Films engine — the animation system, voice pipeline, ChipForge music engine, and every scene file — is open source under the GPL-3.0 license. The films themselves are CC BY 4.0.

If you want to build something similar, or study how AI-directed animation works at the code level, the repo is the right place to start.