How an artificial intelligence can become a truly musical companion

Music has long been hailed as one of the last frontiers of human creativity, and music is not just composition: expressive performance is equally important. It is the performers who bring a piece to life, producing an expressive rendition with dramatic, affective, and emotional qualities that can engage and deeply move listeners; this is something a machine has not been able to replicate – until now.

Gerhard Widmer is an award-winning computer scientist, professor, and head of the Institute of Computational Perception at Johannes Kepler University Linz. As part of his team's long-term research on AI and music, they have developed a computer accompanist that permits expressive co-performance between a human and a machine, based on learned computational models of expressivity. It is arguably the first demonstration of a machine playing together with a human in a truly musical way. At Falling Walls, Gerhard will talk about what makes music come alive, what it means for a computer to learn fundamental principles of expressive (piano) performance, and how, using this learned knowledge, it can contribute to expressive music making – not as a replacement for human pianists, but as a musical companion.

Gerhard Widmer was voted Science Breakthrough of the Year in the category Art and Science at the Falling Walls Science Summit. To find out more, go to https://www.falling-walls.com

The Falling Walls Science Summit was part of Berlin Science Week 2021 and was hosted by the Falling Walls Foundation.