Exploring Human-Machine Relationships Through Dance
Christopher Frauenberger, Professor of Human-Computer Interaction at IT:U, and Christoph Hubatschke, postdoctoral researcher in the Human-Computer Interaction Research Group, are exploring AI with and through contemporary dance. The research builds on the project DANCR, developed within the transdisciplinary research collective H.A.U.S. (Humanoids in Architecture and Urban Spaces), which Christoph Hubatschke co-founded.
Research at the Intersection of Dance and Technology
DANCR investigates the human-machine interface in contemporary dance through real-time collaboration between dancers and humanoid robots. It was developed by Oliver Schürer (TU Vienna), Eva Maria Kraft (dancer and choreographer), Christoph Hubatschke (IT:U), and Brigitte Krenn (Austrian Research Institute for Artificial Intelligence). At the heart of the project is the DANCR-tool, an AI-based system designed to explore improvisation and embodiment in dance. Trained on the movement data of individual dancers, the system allows the humanoid robot Pepper to perform alongside humans. In live performances, the robot mirrors and responds to the dancers, creating a dynamic interplay in which human and machine co-create movement in real time.
Bias by design, relationship by intention
While AI systems are usually trained on large and diverse datasets, this project intentionally embraces bias: the AI is deliberately trained on the highly specific movement styles of individual artists. This approach nurtures a unique, evolving relationship between human and machine, emphasizing artistic collaboration over automation.

“Usually, AI is trained on large and diverse datasets in the hope of avoiding bias, which is of course not possible. In our case, bias isn’t just unavoidable, it’s intentional and essential. Bias is the method. The robot is moved exclusively by the particular movement set of the dancers involved.” – Christoph Hubatschke
The DANCR-tool combines two AI systems:
- improAI, based on a k-nearest neighbors (KNN) algorithm, which generates movement in real time by analyzing the past motion patterns of a single dancer (see the sketch after this list).
- transferAI, which applies neural style transfer to translate the movement style of one dancer to another, opening new choreographic possibilities.
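To make the improAI principle concrete, here is a minimal sketch of a KNN-based movement improviser: given a live pose, it looks up the most similar moments in one dancer’s recorded material and proposes the average of the frames that followed them. The class name, pose representation, and parameters are illustrative assumptions, not the actual DANCR-tool implementation.

```python
# Minimal sketch of KNN-based movement improvisation, loosely inspired by
# the improAI idea described above. All names and parameters here are
# illustrative assumptions, not the DANCR-tool's actual code.
import numpy as np
from sklearn.neighbors import NearestNeighbors

class KNNImproviser:
    """Proposes a next pose from the recorded material of a single dancer."""

    def __init__(self, poses: np.ndarray, k: int = 5):
        # poses: (n_frames, n_joints) array of joint angles from one dancer.
        # Index every frame except the last, so each neighbor has a successor.
        self.poses = poses
        self.index = NearestNeighbors(n_neighbors=k).fit(poses[:-1])

    def next_pose(self, current_pose: np.ndarray) -> np.ndarray:
        # Find the k recorded frames most similar to the live pose ...
        _, idx = self.index.kneighbors(current_pose.reshape(1, -1))
        # ... and improvise by averaging the frames that followed them.
        successors = self.poses[idx[0] + 1]
        return successors.mean(axis=0)

# Usage: feed a live pose vector, get a proposed next pose back.
recorded = np.random.rand(1000, 14)      # stand-in for captured dancer data
improviser = KNNImproviser(recorded, k=5)
proposal = improviser.next_pose(np.random.rand(14))
```

Because such an improviser can only recombine material from one dancer’s archive, the “bias is the method” idea above falls out naturally: the output is always that dancer’s vocabulary, re-sequenced.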
Technical Highlights:
- Real-time sensor setup: four Kinect sensors, wearable tech, atmospheric sensors, and robot state tracking
- Real-time transformation of motion data for robot compatibility (see the sketch after this list)
- Distributed computing network of five computers
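As one plausible reading of the motion-transformation step, the sketch below maps captured human joint angles into a robot-safe range: each angle is clamped to per-joint limits and low-pass filtered so the robot receives smooth commands at sensor rate. The joint names and limit values are invented for illustration and are not Pepper’s actual specification.

```python
# Hedged sketch of retargeting captured human motion to a robot's joint
# limits. Joint names and limit values are illustrative assumptions.
import numpy as np

# Hypothetical per-joint limits in radians (min, max).
JOINT_LIMITS = {
    "shoulder_pitch": (-2.0, 2.0),
    "shoulder_roll": (-1.5, 0.3),
    "elbow_roll": (0.0, 1.5),
}

class MotionRetargeter:
    """Turns raw human joint angles into smooth, limit-safe robot commands."""

    def __init__(self, limits: dict, alpha: float = 0.3):
        self.limits = limits   # per-joint (min, max) in radians
        self.alpha = alpha     # smoothing factor: higher = more responsive
        self.prev = {}         # last command sent, per joint

    def retarget(self, human_angles: dict) -> dict:
        command = {}
        for joint, angle in human_angles.items():
            lo, hi = self.limits[joint]
            clamped = float(np.clip(angle, lo, hi))
            # Exponential smoothing against the previous command avoids jitter.
            prev = self.prev.get(joint, clamped)
            command[joint] = (1 - self.alpha) * prev + self.alpha * clamped
        self.prev = command
        return command

# One captured frame (illustrative values) mapped to robot-safe commands.
retargeter = MotionRetargeter(JOINT_LIMITS)
frame = {"shoulder_pitch": 2.4, "shoulder_roll": -0.2, "elbow_roll": 0.8}
print(retargeter.retarget(frame))
```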
AI Improvises Contemporary Dance
Inspired by the training process for the DANCR AI, H.A.U.S. also developed the performance “youAI”, shown at the imagetanz festival at brut Wien in 2023. It brings together technology, language, and contemporary dance to reflect on the reciprocal conditioning of humans and machines. With four dancers, one actress, two engineers, and two robots, it becomes a stage for examining how machines are becoming part of our embodied and expressive lives. The audience witnesses the mutual interaction between human dancers and the trained robots.
Rethinking AI and Art
The development of the DANCR-tool was made possible through intensive collaboration between artists, engineers, theorists, and scientists. It redefines AI in dance – not as a choreographic assistant replicating steps, but as a co-creative partner that inspires, adapts, and evolves alongside the dancer.
Projects like DANCR push the boundaries of what performance can be in the age of AI. They show that AI is not simply a tool but rather a counterpart. And through this evolving relationship, we might rediscover what it means to be human.

“The central question isn’t about results or AI as a potential replacement for humans, but rather the ongoing relationship with AI tools that is essential for humanity and society.” – Christopher Frauenberger
In May 2025, Frauenberger and Hubatschke took part in a symposium hosted by Anton Bruckner Privatuniversität to present DANCR and new research ideas building on the project. The symposium brought together educators, researchers, and artists to explore what artificial intelligence means for the arts and those who create them.
Motion Capture at IT:U’s Momo Lab
At IT:U, the Human-Computer Interaction Research Group focuses precisely on these questions of human-machine entanglement. In the Momo Lab, movement is not only studied but also experienced as a living dialogue between body and AI tools.