Users collaborate with this system to form an improvised musical ensemble. The group of human users accesses the software on Apple iPads, tapping GUI buttons to make musical sounds. After a predetermined amount of time, the system changes the GUI to present a new selection of sounds.
The system ‘triggers an interface update when it calculates that the amount of gestural change in the ensemble has exceeded a predetermined threshold. These moments can correspond to natural “segments” of ensemble improvisations, and so are appropriate times to intervene in the performance… This interaction results in a musical experience similar to “structured improvisation” or “game pieces” except with an ensemble-tracking agent director, rather than a human.’
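The agent-director logic described above can be sketched as a small threshold trigger. This is a minimal, hypothetical illustration, not the paper's actual implementation: the class name, the use of tap counts as the gesture measure, and the simple "change across a sliding window" metric are all assumptions made for clarity.

```python
from collections import deque

class EnsembleDirector:
    """Hypothetical sketch of the agent director: track ensemble
    gestural activity in a sliding window and signal an interface
    update when the change in activity exceeds a threshold."""

    def __init__(self, threshold=5.0, window=4):
        self.threshold = threshold
        # Recent ensemble-wide activity levels, oldest first.
        self.history = deque(maxlen=window)

    def observe(self, taps_per_player):
        """Record one time-step of per-player tap counts, e.g. [12, 3, 7]."""
        self.history.append(sum(taps_per_player))

    def should_update_interface(self):
        """True when activity change across the window exceeds the threshold,
        i.e. a plausible 'segment' boundary in the improvisation."""
        if len(self.history) < 2:
            return False
        change = abs(self.history[-1] - self.history[0])
        return change > self.threshold
```

In use, the director would be fed gesture counts each time-step; a sudden ensemble-wide jump (or drop) in activity crosses the threshold and prompts the GUI change, standing in for the human director of a "game piece".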
HB, CA, HFF, CA
Martin, C., Gardner, H., Swift, B., & Martin, M. (2016, May). Intelligent agents and networked buttons improve free-improvised ensemble music-making on touch-screens. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2295–2306). ACM.