Collaborametrum

“Collaborametrum” was created in 2018 and premiered on 14 August 2022 at the Sound Kitchen Festival, part of World Stage Design in Calgary. Cedric Blarry (clarinet), Richard Bugg (flute), and Busevín (electronics) premiered the work.

Gorka and Álvaro at AIMC 2022 during the conference performance

“Collaborametrum” was also programmed at the Artificial Intelligence Music Creativity (AIMC) 2022 conference in Tokyo. A new version of the work was produced for this event.


A machine learning-driven collaborative musical game for two mobile musicians, IoT devices, an algorithmic real-time score generation and notation system, and live electronics.

Can two musicians play their own collaboration on stage?

“Collaborametrum” means “measuring collaboration”. Collaborametrum is an experimental work about forms of artificial creativity based on collaborative intelligence. It is a collaborative, interactive, and algorithmic work in which the interpreters compose the piece through their collaborative movements on stage. Collaborametrum is conceived as a cooperative musical game. The algorithmic system that creates the score is controlled by a machine learning system that measures the collaboration between the musicians. The interpreters move freely on stage; their movement is captured with sensors and sent to the computer via Wi-Fi. The machine learning system searches for collaborative patterns, converting the collaboration into a metric. This metric is used by a generative algorithm to create the score, which is sent to the performers via a real-time notation system. The musicians play their collaboration in real time.

Premiere of the work at Sound Kitchen Festival, Calgary

The score has four parts: two are sent to the musicians via Wi-Fi, and the other two are sent to a synthesis engine that generates electronic sounds. In addition, the sound of the instruments is processed by a live-electronics system. The final result of the work is the sound of the instruments, the synthesized electronic sound, and the live electronics.
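The routing of the four parts could be sketched as follows. This is only an illustrative sketch: the metric range, the pitch mapping, and the part names are assumptions for the example, not the actual generative algorithm, which lives in the Max/MSP patch.

```python
# Hypothetical sketch: mapping a collaboration metric to a four-part score.
# The metric scale (1..5), the pitch logic, and the part names are invented
# for illustration; the real generative algorithm is in the Max/MSP patch.

def generate_score(metric, length=4):
    """Generate four parts from a collaboration metric in 1..5.

    Assumption for the sketch: higher collaboration -> narrower pitch
    spread between parts; lower collaboration -> wider, more dispersed
    pitches.
    """
    base = 60                     # middle C as a MIDI reference point
    spread = (6 - metric) * 2     # less collaboration -> wider spread
    parts = {}
    for i, name in enumerate(["clarinet", "flute", "synth_1", "synth_2"]):
        # each part offsets the base pitch differently, scaled by the spread
        parts[name] = [base + ((i + step) % 4) * spread
                       for step in range(length)]
    return parts

score = generate_score(metric=5)
# "clarinet" and "flute" would go to the players' devices over Wi-Fi;
# "synth_1" and "synth_2" would go to the synthesis engine.
```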


Cedric Blarry and Richard Bugg during the premiere of the work, Calgary

The scenic space turns into a score. The interpreters write the score by collaborating around the scenic space. The function of the composer is no longer to create the score but to create a collaborative framework where creativity arises. I conceive this work as a collaborative-intelligence environment where human and machine intelligence integrate into a new form of collaborative intelligence that creates the work of art itself.

This work uses the collaboration of the musicians as its main material. “Collaborametrum” uses a circular, self-referential system without any external model: the algorithmic systems react to the collaboration of the musicians, and the musicians react to the music, changing (or not) the way they collaborate. “Collaborametrum” is created by a hybrid network of two human and two computer nodes; the humans interact through their collaboration, and the machines interact using a parametric algorithm.

Using Machine Learning to measure collaboration.

First, I created a definition of a collaborative movement, considering a movement collaborative when the two musicians move similarly, and non-collaborative when they move without any relation to each other.

Second, I made a list of movements considered collaborative (for example, the most collaborative movement is when both musicians move at the same speed in the same position) and a list of movements considered non-collaborative (the least collaborative movement is when they move randomly, without regard for each other), and I assigned each movement a graduated number from most to least collaborative.

Third, I simulated those movements and recorded the data produced by the sensors while the musicians performed them.

Fourth, I trained a classification algorithm (a Support Vector Machine, using Wekinator) on this data and created a model.
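Wekinator trains its SVM through a graphical interface, so that step cannot be reproduced directly in code. As a rough stand-in for the idea, the sketch below trains a nearest-centroid classifier on simulated movement data; all feature values, labels, and the choice of classifier are assumptions for illustration, not the actual trained model.

```python
import random

# Stand-in for the Wekinator SVM: a nearest-centroid classifier trained
# on simulated movement data. Features and labels are invented here.

random.seed(0)

def simulate_block(collaborative):
    """Simulate one feature summarizing a block of sensor readings.

    Assumption: collaborative movement -> the two musicians' speeds are
    similar, so their speed difference is small; non-collaborative ->
    the difference is large.
    """
    if collaborative:
        return [random.uniform(0.0, 0.2)]
    return [random.uniform(0.5, 1.0)]

# label 1 = most collaborative, label 4 = least (a graduated scale,
# as described in the text; the exact numbers are illustrative)
training = ([(simulate_block(True), 1) for _ in range(50)] +
            [(simulate_block(False), 4) for _ in range(50)])

# "train": compute one centroid per label
centroids = {}
for label in {1, 4}:
    vecs = [x for x, y in training if y == label]
    centroids[label] = [sum(col) / len(vecs) for col in zip(*vecs)]

def classify(features):
    """Return the label of the nearest centroid (squared distance)."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(centroids[lab], features)))
```

For instance, `classify([0.1])` falls near the collaborative centroid and returns 1, while `classify([0.9])` returns 4.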

During the performance, Max/MSP sends the sensor data (in blocks of 20) to Wekinator, which runs the model; Wekinator classifies the incoming data and returns the number assigned to the corresponding movement.
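By default, Wekinator receives its input features as OSC messages on UDP port 6448 at the address /wek/inputs. A minimal sketch of how such a packet could be built outside Max/MSP is shown below; the sensor values are placeholders, and the encoder covers only the simple message shape used here, not the full OSC specification.

```python
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message: address, type tags, float32 args."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())                       # address pattern
    msg += pad(("," + "f" * len(floats)).encode())    # type tag string
    for v in floats:
        msg += struct.pack(">f", v)                   # big-endian float32
    return msg

# A block of 20 sensor readings (placeholder values) addressed to
# Wekinator's default input endpoint.
block = [0.5] * 20
packet = osc_message("/wek/inputs", block)

# It would then be sent over UDP, e.g. with the standard library:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 6448))
```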


Distance sensor and WEMOS D1 mini Arduino Wi-Fi board


MobMuPlat app where the musicians receive the notes to be played


Gorka and Álvaro in Crystal Ball Studio after producing “Collaborametrum”

Schema of the work




Main performance window of the Max/MSP patch

This work belongs to a series of works dedicated to researching the possibilities of new forms of creativity based on collaborative intelligence.

More information about this research at: