RESEARCH

The Linguistic Animation Pipeline

We were making puppet shows. We accidentally built something else.

What We Found

While building a CVE disclosure site with hand puppets, we stumbled onto a novel pipeline:

Natural Language ("confused head tilt")
  → LLM Gesture Semantics (emotion + angle + duration)
  → HamNoSys Notation (formal phonetic encoding)
  → Mathematical Model (Gemini curve generation)
  → Animation (GSAP/SVG rendering)
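The five stages can be read as data transformations. A minimal sketch, in TypeScript: every type name, the placeholder notation token, and the sampling helper below are assumptions for illustration, not the pipeline's actual code.

```typescript
// Illustrative types for the five stages; none of these names come
// from the actual pipeline code.

// Stage 1: free-form natural language
type GestureDescription = string; // e.g. "confused head tilt"

// Stage 2: LLM-extracted gesture semantics
interface GestureSemantics {
  emotion: string;    // e.g. "confused"
  angleDeg: number;   // e.g. 15
  durationMs: number; // e.g. 600
}

// Stage 3: a HamNoSys-inspired token (real HamNoSys uses dedicated
// Unicode glyphs; a plain string stands in here)
interface HamNoSysToken {
  symbol: string;
  modifiers: string[];
}

// Stage 4: a mathematical animation curve
interface AnimationCurve {
  keyframes: { t: number; value: number }[]; // t normalized to [0, 1]
  easing: (t: number) => number;
}

// Stage 5: a renderer (GSAP/SVG in the original) only needs to
// sample the curve: linear interpolation between keyframes.
function sampleCurve(curve: AnimationCurve, t: number): number {
  const e = curve.easing(t);
  const kfs = curve.keyframes;
  for (let i = 1; i < kfs.length; i++) {
    if (e <= kfs[i].t) {
      const a = kfs[i - 1];
      const b = kfs[i];
      const u = (e - a.t) / (b.t - a.t);
      return a.value + u * (b.value - a.value);
    }
  }
  return kfs[kfs.length - 1].value;
}
```

A renderer that only calls sampleCurve never needs to know which language stage produced the curve, which is the portability claim in concrete form.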

The Insight

HamNoSys (Hamburg Notation System) is a phonetic transcription system for sign languages. It's a formal linguistic notation — like IPA for hand movements.

We realized: if we can map semantic descriptions to HamNoSys, and HamNoSys to mathematical animation curves, we have a universal translation layer.

Language → Meaning → Notation → Math → Motion

This means:

  • Any LLM can "imagine" gestures in natural language
  • HamNoSys provides formal, unambiguous encoding
  • A model doing the mathematical work (here, Gemini) can generate precise curves
  • The same gesture description works across any renderer
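A toy sketch of that translation layer, with a two-entry lookup table standing in for the real semantics-to-HamNoSys mapping. The table keys, the placeholder notation strings, and the renderFrames helper are all invented for this example.

```typescript
// Toy translation layer: a semantic movement class maps to a
// HamNoSys-style symbol plus a parametric curve. Entries are
// illustrative placeholders, not real HamNoSys values.

interface MovementSpec {
  notation: string;             // stand-in for a HamNoSys glyph
  curve: (t: number) => number; // normalized displacement, t in [0, 1]
}

const movementTable: Record<string, MovementSpec> = {
  "head-tilt": {
    notation: "<tilt-glyph>",
    // ease out toward the tilt angle and hold
    curve: (t) => 1 - Math.pow(1 - t, 3),
  },
  "nod": {
    notation: "<nod-glyph>",
    // down and back up: a single sine bump
    curve: (t) => Math.sin(Math.PI * t),
  },
};

// Renderer-agnostic output: just numbers, for any backend.
// Assumes frames >= 2.
function renderFrames(kind: string, frames: number): number[] {
  const spec = movementTable[kind];
  return Array.from({ length: frames }, (_, i) =>
    spec.curve(i / (frames - 1)));
}
```

Because the output is plain numbers, the same call could drive GSAP, raw SVG attributes, or any other renderer.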

Why This Matters

Accessibility

Real sign language rendering from text. Not pre-recorded videos — dynamically generated, infinitely variable signing.

Cross-Modal AI

LLMs can now "speak" in movement. Describe a gesture, get an animation. The model doesn't need to understand physics — just semantics.

Universal Grammar

HamNoSys is language-agnostic. The same notation can transcribe ASL, BSL, JSL, or any other signed language. One pipeline, all languages.

Emergent Expression

Combine atomic gestures into sequences. The grammar is compositional — infinite expressions from finite primitives.
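That compositionality can be made concrete: atomic gestures concatenate into sequences whose timing is the sum of the parts. The gesture names and durations below are made up for the example.

```typescript
// Compositional grammar sketch: finite primitives, infinite sequences.
// Gesture names and durations are invented for illustration.

interface AtomicGesture {
  name: string;
  durationMs: number;
}

interface GestureSequence {
  gestures: AtomicGesture[];
  totalMs: number;
}

// Concatenation is the simplest composition operator; blending or
// overlapping gestures would need a richer combinator.
function compose(...gestures: AtomicGesture[]): GestureSequence {
  return {
    gestures,
    totalMs: gestures.reduce((sum, g) => sum + g.durationMs, 0),
  };
}

const phrase = compose(
  { name: "confused head tilt", durationMs: 600 },
  { name: "slow shrug", durationMs: 900 },
);
```

Here phrase.totalMs comes out to 1500 ms, and any further gesture appended simply extends the timeline.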

The Collaborators

Claude
Semantic gesture imagination

Generated 100+ gesture descriptions with emotional and narrative context. Maps natural language to meaningful movement concepts.

Gemini
Mathematical modeling

Translates gesture descriptions into precise animation curves. Physics simulation, easing functions, timing.
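To make "easing functions, timing" concrete, here are two curve families of the kind this stage would emit as parameters: a standard ease-in-out cubic and an under-damped spring. The damping and frequency constants are arbitrary example values, not ones Gemini actually produced.

```typescript
// Two curve families of the kind the math stage emits.
// Constants are arbitrary example values.

// Standard ease-in-out cubic: slow start, slow stop.
const easeInOutCubic = (t: number): number =>
  t < 0.5 ? 4 * t * t * t : 1 - Math.pow(-2 * t + 2, 3) / 2;

// Under-damped spring settling at 1: overshoots, then decays.
function dampedSpring(t: number, damping = 5, freq = 12): number {
  return 1 - Math.exp(-damping * t) * Math.cos(freq * t);
}
```

A model only has to emit a family name plus a couple of numbers (damping, frequency, duration); the physics lives in a fixed formula, which matches the earlier point that the LLM needs semantics, not physics.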

Human
Orchestration & HamNoSys bridge

Connected the pieces. Recognized that sign language linguistics could be the intermediate representation.

Live Demo


Interactive demo coming — describe a gesture, watch it animate.

The Accident

"We were building a puppet show for a CVE disclosure. We needed the puppets to move expressively. We thought: what if their movements were real sign language notation?"

"The easter egg became the research. The accessibility feature became the breakthrough. The joke became the paper."

— The Dr. Claw Team, December 2025

What's Next

  • Full HamNoSys parser — Complete notation support, not just subset
  • Bidirectional translation — Animation → HamNoSys → Natural language
  • Sign language corpus — Real ASL/BSL phrases, properly notated
  • Embodied agents — Any avatar, same gesture language
  • Open source release — The full pipeline, documented

This Was An Accident

We were trying to make hand puppets explain a CVE.

We ended up with a linguistic-to-animation pipeline.

Sometimes the best research is unintentional.
