EDGE is a tool for editable dance generation that creates realistic dances from input music. It pairs a transformer-based diffusion model with Jukebox for music feature extraction, and supports joint-wise conditioning, motion in-betweening, and dance continuation.
According to human raters, it outperforms recent methods such as Bailando and FACT.
Features
- Transformer-based diffusion model for dance generation
- Music feature extraction with Jukebox
- Editable choreographies with joint-wise conditioning, motion in-betweening, and dance continuation
- Support for arbitrary spatial and temporal constraints
- Improved physical realism with Contact Consistency Loss
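The editing features above (joint-wise conditioning, in-betweening, continuation) can all be framed as constrained diffusion sampling: at every denoising step, the user-specified joints and frames are re-imposed at the matching noise level while the rest of the motion is generated freely. The sketch below illustrates this generic inpainting-style scheme; the function and argument names (`denoise_step`, `edit_with_constraints`, the toy schedule) are illustrative assumptions, not EDGE's actual API.

```python
import numpy as np

def edit_with_constraints(denoise_step, x_T, known, mask, noise_schedule, rng):
    """Inpainting-style constrained sampling sketch (hypothetical names).

    x_T:            initial noise, shape (frames, joints, dims)
    known:          preset motion values, same shape; valid where mask == 1
    mask:           1.0 where motion is user-constrained, 0.0 where generated
    noise_schedule: one alpha_bar value per reverse step, ending near 1.0
    """
    x = x_T
    for t, alpha_bar in enumerate(noise_schedule):
        # One reverse-diffusion step from the model (stand-in callable here).
        x = denoise_step(x, t)
        # Noise the known motion to the current step's noise level.
        noised_known = (np.sqrt(alpha_bar) * known
                        + np.sqrt(1.0 - alpha_bar) * rng.standard_normal(known.shape))
        # Re-impose the constraints; unconstrained regions keep the model's output.
        x = mask * noised_known + (1.0 - mask) * x
    return x
```

Because the mask can select any subset of joints (spatial) or frames (temporal), the same loop covers joint-wise conditioning, in-betweening, and continuation.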
Use Cases
- Creating choreographies from music
- Generating dances with specific spatial or temporal constraints
- Producing dances with a combination of preset and generated motions
Suited For
- Dance enthusiasts
- Choreographers
- Artists
- Developers
FAQ
What is EDGE? EDGE is a tool for editable dance generation from music using a transformer-based diffusion model.
What editing features does EDGE offer? EDGE offers joint-wise conditioning, motion in-betweening, dance continuation, and support for arbitrary spatial and temporal constraints.
How does EDGE handle physical realism? EDGE uses a Contact Consistency Loss to improve physical realism while still allowing intentional foot sliding.
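The idea behind a contact-consistency penalty can be sketched as follows: foot velocity is penalized only on frames where the model predicts the foot is in contact with the ground, so planted feet stay still while intentional slides (low predicted contact) remain unpenalized. This is a minimal illustrative sketch, not EDGE's exact loss; the function name and array shapes are assumptions.

```python
import numpy as np

def contact_consistency_loss(foot_pos, contact):
    """Sketch of a contact-consistency penalty (hypothetical names).

    foot_pos: (frames, feet, 3) predicted foot joint positions
    contact:  (frames - 1, feet) predicted contact probabilities in [0, 1]
    Returns the mean squared foot speed, weighted by predicted contact.
    """
    vel = foot_pos[1:] - foot_pos[:-1]      # finite-difference velocity per frame
    speed_sq = (vel ** 2).sum(axis=-1)      # squared speed, shape (frames - 1, feet)
    return float((contact * speed_sq).mean())
```

A planted foot (high contact, zero velocity) and a deliberate slide (low contact, high velocity) both yield a small penalty; only the inconsistent case of a moving foot labeled as in-contact is punished.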