Flexible Score Following: The Piano Music Companion


Proceedings of the Third Vienna Talk on Music Acoustics, 16–19 Sept. 2015, University of Music and Performing Arts Vienna


Andreas Arzt (1,2), Werner Goebl (3), Gerhard Widmer (1,2)

(1) Department of Computational Perception, Johannes Kepler University, Linz, Austria
(2) Austrian Research Institute for Artificial Intelligence (OFAI), Vienna, Austria
(3) Institute of Music Acoustics, University of Music and Performing Arts, Vienna, Austria
andreas.arzt@jku.at


In our talk we will present a piano music companion that is able to follow and understand (at least to some extent) a live piano performance. Within a few seconds the system is able to identify the piece that is being played, and the position within the piece. It then tracks the performance over time via a robust score following algorithm. Furthermore, the system continuously re-evaluates its current position hypotheses within a database of scores and is capable of detecting arbitrary jumps by the performer. The system can be of use in multiple ways, e.g. for piano rehearsal, for live visualisation of music, and for automatic page turning. At the conference, we will demonstrate this system live on stage. If possible, we would also like to encourage (hobby) pianists in the audience to try the companion themselves. Additionally, we will give an outlook on our efforts to extend this approach to classical music in general, including heavily polyphonic orchestral music.


Score following, also known as real-time music tracking, is a big challenge. It involves listening to a live incoming audio stream, extracting features from the audio that capture aspects of the sound of the current moment, and tracking the most likely position in the score. This must work regardless of the specific tempo chosen by the musicians on stage, remain stable under tempo changes due to expressive timing, and be robust to varying sound quality and instrument sounds.
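The features themselves can take many forms; one common choice for this kind of audio-to-score alignment is a chroma vector, which folds the spectral energy of a short audio frame into the 12 pitch classes and is therefore fairly robust to timbre and dynamics. A minimal sketch, where the frame length, frequency range and mapping are illustrative assumptions and not the parameters of the system described here:

```python
import numpy as np

def chroma_frame(frame, sr=44100):
    """Fold the magnitude spectrum of one audio frame into 12 pitch classes."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    chroma = np.zeros(12)
    for f, mag in zip(freqs[1:], spectrum[1:]):      # skip the DC bin
        if f < 27.5 or f > 4200.0:                   # restrict to roughly the piano range
            continue
        midi = 69 + 12 * np.log2(f / 440.0)          # frequency -> MIDI pitch number
        chroma[int(round(midi)) % 12] += mag         # accumulate per pitch class
    total = chroma.sum()
    return chroma / total if total > 0 else chroma

# A pure 440 Hz tone should put most of its energy into pitch class A (index 9).
sr = 44100
t = np.arange(4096) / sr
c = chroma_frame(np.sin(2 * np.pi * 440.0 * t), sr)
assert int(np.argmax(c)) == 9
```

In a real tracker such vectors would be computed for overlapping frames of both the live audio and a synthesised rendition of the score, and compared with a distance measure during alignment.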

Real-time music tracking originated in the 1980s (see [1, 2]) and has attracted quite some research in recent years [3, 4, 5, 6, 7, 8, 9]. Although there are still many open research questions (such as on-line learning of predictive tempo models during a performance), score following is already being used in real-world applications. Examples include Antescofo, which is actively used by professional musicians to synchronise a performance (mostly solo instruments or small ensembles) with computer-realised elements, and Tonara, a music tracking application focusing on the amateur pianist and running on the iPad.

In this paper we will summarise some recent developments of our music tracking system. First, in Section 2 below, we will describe a system that allows flexible music tracking on a large database of piano music. In Section 3 we will discuss some of the challenges we encountered when we prepared our music tracker to follow orchestral music live at a world-famous concert hall. Finally, in Section 4, we will discuss further steps towards combining these two concepts (flexible score following on a database, and tracking orchestral music or any other kind of instrumental classical music), and possible applications enabled by this technology.


The piano music companion is a versatile system that can be used by pianists, and more widely by consumers of piano music, in various scenarios. It is able to identify, follow and understand live performances of classical piano music, at least to some extent. The companion has two important capabilities that we believe such a system must possess: (1) automatically identifying the piece it is listening to, and (2) following the progress of the performer(s) within the score over time.

The companion is provided with a database of sheet music in symbolic form. Currently the database includes, amongst others, the complete solo piano works by Chopin and the complete Beethoven piano sonatas, and consists of roughly 1,000,000 notes in total. When listening to live music, the companion is able to identify the piece that is being played, and the position within the piece. It then tracks the progress of the performers over time, i.e. at any time the current position in the score is computed. Furthermore, it continuously re-evaluates its hypothesis and tries to match the current input stream to the complete database. Thus, it is able to follow any action of the musician, e.g. jumps to a different position or to an entirely different piece, as long as the piece is part of the database. See Figure 1 for a very brief technical description. More information can be found in [10] and [11].
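As an illustration of how piece identification over a large symbolic database can work, the toy sketch below hashes note triples by their pitch intervals and the ratio of their inter-onset intervals. Such hashes are tempo-invariant, so a faster or slower performance still votes for the same (piece, offset) pair. The hashing scheme and all names here are our illustrative assumptions, not the actual method, which is described in [10] and [11]:

```python
from collections import Counter

def fingerprints(notes):
    """Tempo-invariant hashes from note triples: two pitch intervals
    plus the coarsely quantised ratio of the inter-onset intervals."""
    fps = []
    for i in range(len(notes) - 2):
        (t0, p0), (t1, p1), (t2, p2) = notes[i:i + 3]
        ioi1, ioi2 = t1 - t0, t2 - t1
        if ioi1 <= 0:
            continue
        ratio = round(ioi2 / ioi1, 1)                # quantise to absorb timing noise
        fps.append(((p1 - p0, p2 - p1, ratio), i))   # (hash, position in piece)
    return fps

def build_index(pieces):
    """Inverted index: hash -> list of (piece name, position)."""
    index = {}
    for name, notes in pieces.items():
        for h, pos in fingerprints(notes):
            index.setdefault(h, []).append((name, pos))
    return index

def identify(index, query):
    """Vote for (piece, offset) pairs; the winner locates the performance."""
    votes = Counter()
    for h, qpos in fingerprints(query):
        for name, pos in index.get(h, []):
            votes[(name, pos - qpos)] += 1
    return votes.most_common(1)[0][0] if votes else None

# Toy database: two "pieces" as (onset time, MIDI pitch) lists.
pieces = {
    "piece_a": [(i * 0.5, p) for i, p in enumerate([60, 62, 64, 65, 67, 69, 71, 72])],
    "piece_b": [(i * 0.5, p) for i, p in enumerate([72, 71, 69, 67, 65, 64, 62, 60])],
}
index = build_index(pieces)
# A faster performance starting in the middle of piece_a is still identified,
# because the hashes depend only on intervals and IOI ratios.
query = [(i * 0.3, p) for i, p in enumerate([64, 65, 67, 69, 71])]
print(identify(index, query))  # -> ('piece_a', 2)
```

The same idea scales to a large database because only the hash lists that actually occur in the query need to be touched.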

The piano music companion enables a range of applications, for both listeners and performers. Musicians might use this system as a convenient way to query and show sheet music. They can just sit down at the piano, play a few notes, and the piano music companion will show the respective piece, at the correct position. The companion will take care of turning the pages at the appropriate times, and will also recognise jumps to a different position or piece and show the sheet music accordingly. For listeners of classical music, the main purpose of the piano music companion is to enrich the experience of classical music concerts. This might be achieved e.g. by providing information synchronised to the music, in various formats (text, images, videos). We will discuss this in a bit more detail in Section 3 below.
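The tracking component itself (the Multi-Agent Music Tracking System in Figure 1) builds on an on-line variant of dynamic time warping: for each incoming live feature frame, the accumulated-cost lattice is extended by one column inside a band around the current position hypothesis, and the cheapest cell in the band becomes the new position. The sketch below is deliberately simplified (the actual on-line DTW and its multi-agent extension are more elaborate), and the one-dimensional features are synthetic:

```python
import numpy as np

class OnlineDTW:
    """Toy on-line DTW tracker (illustrative, not the actual system)."""

    def __init__(self, score, window=20):
        self.score = np.asarray(score, dtype=float)  # score feature frames
        self.window = window                         # half-width of the search band
        self.cost = np.full(len(score), np.inf)      # accumulated cost per score frame
        self.cost[0] = 0.0                           # assume we start at the beginning
        self.pos = 0                                 # current position hypothesis

    def step(self, frame):
        """Consume one live feature frame, return the new score position."""
        d = np.linalg.norm(self.score - frame, axis=1)    # local distances
        new = np.full(len(self.score), np.inf)
        lo = max(0, self.pos - self.window)
        hi = min(len(self.score), self.pos + self.window + 1)
        for j in range(lo, hi):
            prev = min(self.cost[j],                           # repeat a score frame
                       self.cost[j - 1] if j > 0 else np.inf,  # diagonal step
                       new[j - 1] if j > lo else np.inf)       # skip a score frame
            new[j] = d[j] + prev
        self.cost = new
        self.pos = lo + int(np.argmin(new[lo:hi]))
        return self.pos

# Synthetic 1-D score features; the "performance" plays every other frame,
# i.e. at double tempo. The tracker should still end up near the last frame.
score = np.arange(40, dtype=float).reshape(-1, 1)
tracker = OnlineDTW(score)
positions = [tracker.step(score[j]) for j in range(0, 40, 2)]
```

Because the band moves with the position hypothesis, the cost per incoming frame stays constant regardless of the length of the piece, which is what makes the approach usable in real time.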


Figure 1: The piano music companion takes as input a live audio stream. It first tries to transcribe this stream into symbolic information (Note Recogniser, an on-line audio-to-pitch converter). Then, it matches this information to a database of sheet music, also represented in symbolic form, via a fingerprinting algorithm (Symbolic Music Matcher). The output of this stage is fed to a Multi-Agent Music Tracking System, which is based on an on-line version of the dynamic time warping algorithm. This component tries to align the incoming audio stream to (synthesised versions of) the sheet music at the respective positions provided by the fingerprinter. At each point in time the best position (in terms of alignment cost) is returned.

So far, in our research we have focused mainly on classical piano music, which resulted in the piano music companion briefly described above. The multi-national European research project PHENICX then provided us with the unique opportunity (and challenge) to work with the famous Concertgebouw Orchestra, and to demonstrate our score following technology in the context of a big, real-life symphonic concert.

The event took place on February 7th, 2015, in the Concertgebouw in Amsterdam. The Royal Concertgebouw Orchestra, conducted by Semyon Bychkov, performed the Alpensinfonie (Alpine Symphony) by Richard Strauss. This concert was part of a series called Essentials, during which technology developed within the project can be tested in a real-life concert environment. For the demonstration, a test audience of about 30 people was provided with tablet computers and placed in the rear part of the concert hall. The music tracker was used to control the transmission and display of additional information, synchronised to the live performance on stage (a similar study was presented in [12]). The user interface and the visualisations were provided by our project partner Videodock.

Figure 2: The multi-agent tracker. The live input is fed to N independent instances of the tracking algorithm, each with its own score representation based on a different performance of the same piece (e.g. recordings by Jansons/RCO, Haitink/LSO, Previn/PSO). Then, the individual hypotheses are combined by a decision maker and the estimate of the current position in the score is returned.
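One simple way to combine the N hypotheses in Figure 2 into a single output is to weight each tracker's reported position by the inverse of its current alignment cost, so that trackers whose reference performance matches the live input well dominate the estimate. This is an illustrative rule of our own; the decision maker evaluated in [14] is more sophisticated:

```python
import numpy as np

def combine_hypotheses(positions, costs):
    """Combine N tracker hypotheses into one position estimate,
    weighting each tracker inversely to its current alignment cost
    (an illustrative rule, not necessarily the one used in the paper)."""
    positions = np.asarray(positions, dtype=float)
    weights = 1.0 / (np.asarray(costs, dtype=float) + 1e-9)  # avoid division by zero
    return float(np.sum(weights * positions) / np.sum(weights))

# Three trackers, aligned against different reference performances:
# two agree closely and have low alignment cost, one has drifted.
estimate = combine_hypotheses([120.4, 121.0, 97.5], [1.0, 1.2, 8.0])
print(estimate)  # close to the two agreeing, low-cost trackers
```

A robust statistic such as a weighted median could be substituted to suppress a single badly drifted tracker even more aggressively.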

During the concert the audience was presented with three different kinds of visualisations. The sheet music was shown synchronised to the live performance, with highlighting of the current bar and automatic turning of the pages. Textual information, prepared by a musicologist, was shown at the appropriate times, helping the audience to understand the structure of the piece. Additionally, artistic videos were shown, which told the story of the Alpensinfonie.

As this was a live experiment during a real concert, our main goal during the preparation was to make sure that the algorithm would not get lost at any point in time. In the end, this led to a method that increased both the robustness and the accuracy of the tracking process. The main idea is to use multiple actual performances (which are preprocessed automatically) as the basis of the tracking process, instead of a single synthesised version of the score (see Figure 2 for a brief explanation). For further information about the experiment at the Concertgebouw we refer the reader to [13], and to [14] for a larger-scale evaluation of the multi-agent tracking system.



