Enchanted Solstice is a four-minute AI-driven art piece inspired by the elegance of classical dance and the elemental balance of fire and ice. Set to music by Vivaldi, the work features ballet and yoga choreography performed by two dancers, symbolizing a dynamic yin-yang relationship.
The San Francisco Ferry Building became the perfect canvas for projection mapping: a whirlwind of snow, ice, fire, and light that merged architecture and art into a breathtaking experience.
For Enchanted Solstice, we digitally transformed the dancers using AI projection-mapping techniques.
Use the interactive slider below to compare the original green-screen footage with the AI-enhanced version. See how advanced AI workflows transformed the performers into dynamic embodiments of Fire and Ice, blending art and technology into an immersive visual experience.
The video below features a side-by-side comparison of the original footage and the AI-enhanced version. This demonstrates how we can take any performance footage and transform it into stunning AI-generated visuals, customized for your creative projects and live events.
This project was created in collaboration with Ryan Uzilevsky of Light Harvest Studios, known for groundbreaking projection mapping and immersive digital art installations. Ryan filmed the dancers against a green screen, capturing every graceful movement and emotional gesture.
Through advanced AI-powered workflows built in ComfyUI, custom masks were created for each dancer to enable precise isolation and transformation. A tailored ComfyUI workflow was developed, integrating Stable Diffusion checkpoints, ControlNets, and AnimateDiff. This process transformed the performers into dynamic representations of fire and ice, turning live-action choreography into AI-generated visual art.
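The masking idea can be pictured with a minimal stand-alone sketch. This is not the production ComfyUI graph — the function name and threshold below are our own illustrative stand-ins — but it shows the core step of isolating a performer from green-screen footage by finding pixels where green dominates:

```python
import numpy as np

def green_screen_mask(frame, green_dominance=40):
    """Return a boolean mask that is True where the dancer is (non-green).

    Illustrative only: the real per-dancer masks were built inside ComfyUI.
    """
    frame = frame.astype(np.int16)  # avoid uint8 wrap-around when subtracting
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    # Background pixels: green clearly dominates both red and blue.
    background = (g - r > green_dominance) & (g - b > green_dominance)
    return ~background

# Tiny synthetic "frame": left half pure green screen, right half skin tone.
frame = np.zeros((2, 4, 3), dtype=np.uint8)
frame[:, :2] = (0, 255, 0)       # green-screen background
frame[:, 2:] = (200, 160, 120)   # performer
mask = green_screen_mask(frame)
```

A mask like this, computed per frame, is what lets later stages restyle each dancer independently without touching the background.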
AI Tools That Made It Possible
Creating “Enchanted Solstice” required a fusion of cutting-edge tools:
- Custom ComfyUI Workflow: A purpose-built pipeline combined multiple AI tools for precise processing. Per-dancer masks enabled frame-by-frame control during the AI-driven transformation.
- Stable Diffusion Checkpoints: Applied to generate visually detailed fire-and-ice textures that maintained artistic consistency across all animation frames.
- ControlNets: Used for pose tracking and motion consistency, ensuring the dancers’ original movements were accurately reflected throughout the projection.
- AnimateDiff: Integrated to create fluid motion by generating smooth transitions between animation frames. This process allowed the dancers’ live-action performances to evolve naturally, producing continuous, lifelike sequences as they transformed into fire and ice avatars.
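AnimateDiff performs its temporal smoothing inside the diffusion model itself. As a loose, self-contained illustration of why inter-frame smoothing matters, here is a hypothetical exponential moving average over frame pixel values (the function and parameters are our own, not part of the actual workflow):

```python
import numpy as np

def temporal_smooth(frames, alpha=0.3):
    """Blend each frame with the running average of preceding frames.

    A crude stand-in for a learned motion module: hard frame-to-frame
    jumps are softened into gradual transitions.
    """
    smoothed = [frames[0].astype(np.float32)]
    for frame in frames[1:]:
        prev = smoothed[-1]
        smoothed.append(alpha * frame + (1 - alpha) * prev)
    return [f.astype(np.uint8) for f in smoothed]

# Two synthetic 1x1 grayscale "frames" with a hard jump from 0 to 200:
frames = [np.array([[0]], dtype=np.uint8),
          np.array([[200]], dtype=np.uint8)]
out = temporal_smooth(frames)
# The jump is softened: the second frame lands at 0.3 * 200 = 60, not 200.
```

The actual motion module is far more sophisticated, but the goal is the same: consecutive frames that evolve continuously rather than flickering independently.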
Through this AI-powered creative workflow, live-action footage was seamlessly integrated with AI-generated effects, turning classical dance performances into dynamic digital art installations. The result was a visually stunning projection blending fire-and-ice-inspired visuals, choreography, and immersive storytelling.
Ready to Create Stunning Visual Experiences?
Looking to bring your creative vision to life through AI-driven projection mapping, immersive art installations, or AI-generated visuals? Let’s collaborate on your next groundbreaking project.