In-progress interactive dance piece
Two-dancer version of in-progress interactive dance piece
On October 13 and 14, 2025, NTNU hosted the MiU conference, a conference dedicated to music pedagogy education and research in a broad sense. Although my contribution may not align closely with the conference theme, I still wanted to take part in a conference hosted by my own department.
Unlike my usual conference contributions, I decided to make an artistic one with KaraOK!, a live electronic performance in which I myself am the performer. The performance was built on several instruments that I originally planned to use for wheelchair sonification, but I adapted them for a solo live electronic set played with a MIDI mixer and two NGIMU motion sensors.
KaraOK! is an artistic research project that explores the artistic potential of DAMP Intonation, a dataset collected and organised by CCRMA, consisting of over 4,000 recordings of karaoke performances without the accompaniment, gathered from all over the world. The dataset is both a snapshot of songs that were popular in the 2000s and a glimpse into the lives of the people who wanted to share their singing, in all their variety of personality, vocal ability, sincerity, and vulnerability.
In the project, I want to preserve some of the emotional qualities of the performances, without being exploitative or ridiculing, by creating digital instruments that use the performances as sound material. The focus is on sound-based music, where combining short clips from a large number of performances makes the original songs only occasionally discernible. Several of the instruments are also based on RAVE models that I trained on the dataset together with UiT researcher Shayan Dadman; these can be played in real time, either from MIDI controllers or with motion sensors attached to the body.
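To make the sensor-driven playing a little more concrete, here is a minimal sketch of one way a motion-sensor reading could be mapped to a latent vector that drives a neural synthesis model such as RAVE. All names and the mapping itself are illustrative assumptions, not the actual KaraOK! instruments, and the decoder call is omitted:

```python
# Hypothetical sketch: map a 3-axis accelerometer reading (e.g. from an
# NGIMU) to a latent vector that could be fed to a RAVE-style decoder.
# The axis cycling and clipping are illustrative design choices only.

def sensor_to_latent(accel, latent_dim=8, scale=2.0):
    """Map a 3-axis accelerometer reading to a latent vector.

    The three axes are cycled across the latent dimensions and the
    result is clipped to [-scale, scale], so small movements produce
    small, bounded timbral changes.
    """
    z = []
    for i in range(latent_dim):
        v = accel[i % 3] * scale
        z.append(max(-scale, min(scale, v)))
    return z

# Example: a gentle tilt mostly on the x-axis
z = sensor_to_latent((0.3, -0.1, 0.05))
```

In a real-time setup, a vector like `z` would be streamed to the model at control rate, so the body's motion continuously steers the synthesized sound.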

You can read more about the DAMP Intonation dataset in this article by Sanna Wager and colleagues.