Presented by

  • Charles Martin

    Charles Martin is a computer scientist specialising in music technology, musical AI and human-computer interaction at The Australian National University, Canberra. Charles develops intelligent musical apps such as MicroJam and PhaseRings, researches creative AI, and performs music with Ensemble Metatone and Andromeda is Coming. At the ANU, Charles leads research into intelligent musical instruments. His lab focuses on developing new intelligent instruments, performing new music with them, and bringing them to a broad audience of musicians and performers.


Charles Martin's music computing lab at the ANU is leading research into creating new kinds of musical instruments that sense and understand music using machine learning. These instruments actively respond during performances to assist musicians. The tools to create such instruments exist today, with single-board computers such as the Raspberry Pi and BeagleBone. We envision that musical instruments of the future will do more than react to musicians. They will predict their human player's intentions and sense the current artistic context. Intelligent instruments will shape their sonic output, seamlessly add expression to sounds, or even generate notes that the performer hasn't played (yet!). In this talk, Charles will discuss recent progress in creating intelligent musical instruments with machine learning. He'll talk about what intelligent instruments might mean for musicians and their music-making process, and what new music these tools can create. In particular, he'll introduce the process of developing intelligent musical instruments on the Raspberry Pi platform.