
The Problem of the Second Performer: Building a Community Around an Augmented Piano

Andrew P. McPherson* and Youngmoo E. Kim†

*School of Electronic Engineering and Computer Science
Queen Mary, University of London
Mile End Road
London E1 4NS, UK
†Department of Electrical and Computer Engineering
Drexel University
3141 Chestnut St.
Philadelphia, Pennsylvania 19104, USA
[email protected], [email protected]
http://andrewmcpherson.org, http://music.ece.drexel.edu

Abstract: The design of a digital musical instrument is often informed by the needs of the first performance or composition. Following the initial performances, the designer frequently confronts the question of how to build a larger community of performers and composers around the instrument. Later musicians are likely to approach the instrument on different terms than those involved in the design process, so design decisions that promote a successful first performance will not necessarily translate to broader uptake. This article addresses the process of bringing an existing instrument to a wider musical community, including how musician feedback can be used to refine the instrument's design without compromising its identity. As a case study, the article presents the magnetic resonator piano, an electronically augmented acoustic grand piano that uses electromagnets to induce vibrations in the strings. After initial compositions and performances using the instrument, feedback from composers and performers guided refinements to the design, laying the groundwork for a collaborative project in which six composers wrote pieces for the instrument. The pieces exhibited a striking diversity of style and technique, including instrumental techniques never considered by the designer. The project, which culminated in two concert performances, demonstrates how a new instrument can acquire a community of musicians beyond those initially involved.

Composers often speak of the problem of the second performance: Many ensembles commission and perform new works, but fewer offer chances for pieces premiered elsewhere to be heard a second time. In the field of digital musical instrument (DMI) design, we instead encounter the problem of the second performer: Once a new instrument has been built and the first performances have been given, how can the designer establish a continuing role for the instrument in the broader musical community? Very few new instruments have attracted a significant following, with the result that an instrument's designer is often its only performer and composer. Because designing an instrument is time-consuming, and many designers are constantly creating new instruments, even the most dedicated of designers will struggle to establish a musical presence for his or her creations. Sergi Jordà summarizes: "Many new instruments are being invented. Too little striking music is being made with them" (Jordà 2004, p. 326).

Computer Music Journal, 36:4, pp. 10–27, Winter 2012. © 2013 Massachusetts Institute of Technology.

This article addresses the stage in DMI design following the creation of the first prototype and the presentation of the first performances. Our primary purpose is not to specify how a DMI should be initially designed or evaluated, a topic covered in many excellent articles (Wanderley and Orio 2002; Jordà 2004; Cook 2009; Paine 2009; Gurevich and Fyans 2011; O'Modhrain 2011). Rather, we examine the process of bringing an instrument to a larger community of musicians, sometimes changing its design in the process. As a case study, we present our recent work with composers and performers using the magnetic resonator piano, an electronically augmented acoustic grand piano which gives the pianist continuous control over the pitch, dynamics, and timbre of every note. The article concludes with general recommendations for DMI designers seeking to establish a continuing musical life for their instruments.


Piece or Instrument?

Perry Cook advises DMI designers starting a project to "make a piece, not an instrument or controller" (Cook 2009, p. 218). DMI composer-designers have the unique luxury of being able to build the instrument to suit the needs of the piece, and this is indeed a common practice (Paine 2009). There are very good reasons to take this approach: Simply put, if an instrument is not useful to its creator, it is unlikely to be of use to anyone else.

Jordà describes the instrument creation process as digital lutherie, a mix of science and art (Jordà 2004). When the lutherie process breaks down, it is just as often from lack of artistic planning as from technical failure: Most attendees of electronic music concerts have at some point experienced unsatisfying "knob-twiddling" performances stemming from inadequate musical planning (Schloss 2002). It seems reasonable, then, for DMI design to initially focus on producing the strongest possible musical result rather than the theoretically broadest capabilities.

Musical communities are more easily formed around instruments than pieces, however. The DMI community includes many singular experts on their own creations, which have been carefully tailored to the creator's artistic requirements. Though this is a perfectly acceptable arrangement, engaging other users involves allowing them to assert their own individuality on the instrument. Every common acoustic instrument affords the performer the ability to develop a personal expressive style, and a DMI too closely designed for the needs of a single piece may pose unacceptable constraints on other musicians. In the second stage, Cook rightly argues, DMI designers should consider ways to make the instrument less piece-specific and more generically useful.

Musician-Focused Design

Consider an alternative approach, taken either from the outset of a project or at the revision stage. Instead of designing for a specific piece, or even for an individual composer, the designer can take a user-centered approach, shaping the instrument to fit the perceived requirements of a community. Literature surveys, musician interviews, and community observation can produce a set of design goals which ideally result in a more generically useful instrument.

One challenge in user-centered DMI design is that there are many potential stakeholders; the performer, the composer, the audience, the designer, and the manufacturer may each have different and sometimes contradictory perspectives on what constitutes a successful instrument (O'Modhrain 2011). For example, the easiest way to control a musical process may not be the one that gives the audience the clearest visual indication of cause and effect, the latter being highly important to a convincing performance (Schloss 2002). Moreover, a single individual may at different times occupy different roles. In particular, potential future DMI performers may begin as audience members, and if the instrument does not engage the audience, the opportunity for future collaboration may be lost.

The Pitfall of Over-Determination

Could user-centered DMI design in fact become counterproductive? Johan Redström argues that the very concept of a "user" is artificial, in that a person only becomes a user when there is a specific object to use (Redström 2006). Attempting to design for a particular set of usage scenarios risks "trapping people in a situation where the use of our designs has been over-determined and where there is not enough space left to act and improvise" (p. 129). Musicians frequently explore and repurpose devices in ways their designers did not anticipate; Redström cites the example of turntable use by DJs, one of many cases of repurposing throughout music history.

Former Apple CEO Steve Jobs, when asked what market research went into the iPad, replied: "None. It's not the consumers' job to know what they want" (New York Times 2011). Similarly, the performer's primary job is to express him- or herself using the instrument at hand. This job does not necessarily extend to imagining hypothetical future capabilities, though some performers also have a talent for this.


Particularly if we view digital lutherie as a mix of artistry and scientific research, the instrument designer can be well served to follow his or her own artistic intuition instead of trying to respond exclusively to the perceived needs of others.

Our view, which we will examine in more depth in the case of the magnetic resonator piano, is that user feedback is critical to later stages of redesign and community-building; in the first stage, designing for a community of one is sufficient. (This is not to say pre-design user studies should never be conducted, only that the designer's intuition has an important role to play in DMI creation.) To summarize our argument: Just build it, give it to musicians, and learn from what they do. The reason for this partly lies in the difference between how designers and new users explore the capabilities of an instrument.

Affordances and Constraints

Thor Magnusson examines DMI design from the twin perspectives of affordances and constraints (Magnusson 2010). An affordance can be defined as a system's perceived capacity for a particular action. It is a property of the relationship between human and system: A musical instrument affords certain actions to a human player (dragging a bow across a string, pressing a piano key). The constraints of an instrument are its limitations, which may be obvious or subtle (no pitch bending on the piano, limited polyphony on the violin).

When an instrument is designed around a piece, the designer is likely to consider affordances: The piece will set specific artistic goals, and the designer will find ways for the performer to achieve them. Subsequent performers and composers, however, will encounter a very different situation: The instrument will now be a fixed, complete object for which new music must be created. These musicians will explore the instrument's set of capabilities with the goal of making it serve their own artistic ends. Each musician may arrive at a different, even idiosyncratic interpretation of the instrument's affordances, but all musicians will be bound by essentially the same set of limitations, leading Magnusson to argue that "learning a digital musical instrument is therefore more appropriately described as 'getting a feeling' for the instrument's constraints, rather than engaging with its affordances" (Magnusson 2010, p. 65).

For some DMIs, however (especially augmented instruments building on traditional technique), performer skill is also an implicit constraint: Not all the techniques of an expert performer will be available to a novice. Nicolas Rasamimanana models the performer–instrument relationship as a "space of possibilities" defined by the intersections of instrument acoustics, general bio-mechanical limitations, and the skills of an individual performer (Rasamimanana 2012, p. 218). Particularly promising for augmented instrument design is the set of gestures that are bio-mechanically possible and within the performer's skill but that have no traditional acoustic function: In this case, the constraints of the traditional instrument can be relaxed to give the performer a wider expressive space to explore.

Two performer studies on opposite ends of the complexity spectrum support Magnusson's model of constraint exploration. Gurevich, Stapleton, and Marquez-Borbon (2010) asked musicians to develop a performance using an instrument consisting of a single button that played a fixed-frequency tone. The nine participants developed a striking diversity of techniques, and accidental discovery was found to be an important process in developing a personal style. Newton and Marshall (2011) developed a toolkit for performers to create their own augmented instruments. Instead of setting musical goals in advance, participants took an exploratory approach focused on limitations and permutations of the technology.

We observed a similar process of constraint exploration in working with musicians on the magnetic resonator piano. Because constraints play such a crucial role in a new performer's evaluation of an instrument, we argue that building a community around a DMI requires careful attention to constraints, especially to deciding which constraints can be relaxed in later design revisions. We can thus refine our argument above to: "Just build it. Musicians will do something unexpected with it anyway, so learn from what they do."


Figure 1. Pianist Feifei Zhang playing the magnetic resonator piano. (Photo: Tony Solitro.)

Access Considerations

The following sections discuss ongoing collaborations with composers and performers using the magnetic resonator piano (MRP). Like most augmented acoustic instruments, including a similar "electromagnetically prepared piano" (Berdahl, Backer, and Smith 2005), the MRP involves specialized hardware, so community-building is dependent on providing access to the equipment. Although a significant musical community can be built around a single physical instrument, DMIs that have achieved broad dissemination (e.g., the ReacTable [Jordà et al. 2007] and the Silent Drum [Oliver and Jenkins 2008]) often do so by replication of the hardware, either by the designer or by the community through publicly available plans. Hardware replication is also a goal of the MRP project, and the following discussion will consider both the experience of musicians and the practical requirements of providing access to a growing community of collaborators.

The Magnetic Resonator Piano

The magnetic resonator piano (MRP) is an electronic augmentation of the acoustic grand piano (see Figure 1). On the traditional piano, when a note is struck, it cannot be further shaped before its release. The MRP places 88 electromagnetic actuators inside the piano that induce the strings to vibrate independently of the percussive hammer mechanism (see Figure 2). By varying the signals to the actuators, it is possible to continuously shape the amplitude, frequency, and timbre of every note. All sound is produced by the strings and soundboard without external speakers, maintaining the richness and resonance of the acoustic piano (McPherson 2010).

Figure 2. The magnetic resonator piano uses electromagnetic actuators to induce vibrations in the piano strings. The amplifiers are seen along the top of the photo and the actuators are seen above the strings.

Figure 3. An optical sensor (modified Moog Piano Bar) detects the continuous motion of each key.

The MRP is played from the piano keyboard, which is equipped with an optical sensor strip measuring the continuous position of each key (see Figure 3). Traditional MIDI keyboards record only note onsets and releases; the MRP, by extracting features of key motion over time, can continuously control multiple independent parameters of each note (McPherson 2010). Though a computer is used to analyze key motion and produce signals for the actuators, the performer rarely if ever interacts with the computer directly. The user experience is similar to playing a traditional instrument: Physical gestures on the keyboard create sounds from within the piano. Nothing interferes with the hammer mechanism, so traditional and magnetic resonator sounds can be mixed in performance. The pianist can use a slow or shallow touch to produce resonator sounds with no hammer attack.

Building on Traditional Performance Technique

Performers spend decades developing and refining their instrumental technique. Most digital instruments, regardless of merit, do not receive the same amount of determined effort. For the MRP, as with most DMIs based on a traditional design, the need to attract new performers will naturally lead one to the community of musicians skilled in traditional practice. The response of these musicians will be significantly more positive when their existing skills can be deployed, repurposed, and extended.

Perry Cook observes that "some players have spare bandwidth, some do not" (Cook 2009). In designing (and redesigning) the MRP, the focus was on the word "spare." To be sure, not every player can maintain the entirety of their existing technique while also controlling new interface elements (one reason, perhaps, for limited adoption of augmented instruments so far). On the other hand, there are many aspects of pianists' existing performance gestures that can be fruitfully repurposed. The MRP extends traditional piano technique by examining continuous key motion, adding new means of control from the keyboard, including gradual key motion, pressure on depressed keys, taps on the key surfaces, and vibrato.

Two deliberate limitations maintain the playability of the MRP. First, nothing interferes with traditional hammer-actuated piano technique, ensuring that the years pianists spend learning their instrument can transfer to the MRP. Second, sound is only produced through interaction with the keyboard, so there is no risk of inadvertent body or arm movements triggering musical events. This strategy aligns with Steve Benford's argument that certain ancillary gestures, although important to the physical expression and theatricality of a performance, need not directly result in music production (Benford 2010). A related example in augmented instrument performance is pianist Sarah Nicolls's use of accelerometers: The sensors are placed in a hat rather than on the body, separating her deliberate engagement with the sensors from the ancillary motions of piano playing (Nicolls 2010).

Compositional Practice

The role of the composer in establishing a new instrument is arguably just as important as the role of the performer: A compelling repertoire of pieces will be a strong motivator for future performances. With notable exceptions (Ferguson and Wanderley 2010; Nicolls 2010), compositional practice is less studied than performance practice, but many of the same principles apply. Concert music composers spend many years honing the craft of writing for traditional instruments; writing for a completely new instrument without guidance from an established repertoire is a challenging task.

One author (McPherson) is a concert music composer. From personal experience, the ability to imagine and understand the sound of an instrument is a prerequisite to writing for it. The means by which the sound is controlled, including the interface and the gesture-sound mappings, are important in that they limit the space of possibilities, but the sound always comes first. A sound palette that is too wide or amorphous can be a powerful deterrent. The importance of constraints in guiding composition for DMIs has also been observed by other authors (Stewart 2009; Murray-Browne et al. 2011).

Original Design Goals

Original design and composition for the magnetic resonator piano followed Cook's model: The original MRP was built in 2009 in parallel with the composition of an extended work, Secrets of Antikythera.


Design decisions, especially the performance interface and gesture-sound mappings, were informed by the needs of the piece, and the instrument's capabilities shaped the music. Shortly thereafter, author McPherson wrote a second piece, d'Amore for viola and MRP. In place of a pianist, the violist's sound was pitch-tracked and used to selectively activate strings inside the piano, simulating the sympathetic vibrations of the Baroque viola d'amore. The hardware was fixed by the time of composition, but new mapping strategies were developed in parallel with the composition.

Affordances were a primary concern in the original MRP design. The instrument was intended to give the musician a specific set of capabilities:

1. Infinite sustain on a single note
2. Crescendos, including crescendos from silence
3. Harmonics on each piano string, in turn allowing microtonal intervals
4. Pitch bends
5. Timbre controllable independently of dynamic level
6. Continuous control of as many parameters as possible from the keyboard
7. Minimal interference with traditional piano technique

The first five capabilities are particularly relevant to compositional practice: They provide unique sonic resources while still relating to the traditional piano. As we will show in later sections, this combination of novel and familiar provided a point of departure for composers. The last two capabilities relate to performance practice, specifically, the goal of engaging traditional performance technique and making effective use of the player's bandwidth.

Musician Feedback

Secrets of Antikythera and d'Amore were performed in concert and later recorded for CD release. The performers were all new to the MRP; Secrets of Antikythera in particular was performed by three pianists between 2009 and 2011. Working with performers on these pieces provided a valuable opportunity for feedback on the MRP's capabilities and usability.

Over the same time period, several interactive demonstrations of the MRP were given at universities, conservatories, music festivals, and open public events. Audiences included composers, performers, musicologists, engineers, and members of the public (children and adults). Demonstrations began with a brief presentation of the instrument's capabilities, followed by a period for hands-on exploration of the instrument. Some events also included a longer musical performance.

Finally, we had the opportunity to solicit individual informal feedback from many composers and performers. These activities collectively constituted the first step toward building an MRP community. They served to raise awareness of the instrument, to identify possible collaborators, and to gather suggestions for future design revisions. Notably, the future of the instrument was intended to be independent of these specific compositions, and collaborating with other composers to write new MRP music became a priority.

Redesigning Constraints

Demonstrations of the MRP generally highlighted its affordances. Our presentations showed specific, identifiable techniques the performer could use to continuously shape the sound of the piano. We found, however, that extended musical interactions followed Magnusson's model: When a new musician sat down at the instrument, the process usually turned to an exploration of the instrument's constraints. Starting from the demonstrated set of sounds, players often sought to establish a personal sense of what could be achieved and how. Most feedback focused on how constraints could be relaxed ("It would be great if the instrument could . . ."). Over the course of two years, several recurring suggestions formed the basis for a redesign of the MRP.

Relaxing Constraints: Frequent User Requests

Musician feedback highlighted six areas where design adjustments would make the MRP more generically useful. Five are cases of relaxing musical constraints; the sixth is purely practical. By addressing these requests, the new instrument acquired a superset of the old instrument's capabilities.

Dynamic Range

The MRP produces sound by electromagnetically resonating the piano strings. Though these "resonator" sounds can be quite intense, the original design could not match the volume of the traditional piano being played forte. Musicians frequently requested louder resonator sounds to match louder passages on the piano.

Attack Time

In contrast to the impulse-like actuation of a hammer strike, electromagnetically driven strings require many oscillations to reach their loudest sound. On the original MRP, resonator sounds could take anywhere from tens of milliseconds to several seconds to reach their final amplitude, with the bass register speaking especially slowly (McPherson 2010). Musicians found that the slow speaking time limited the instrument's utility in faster passages.
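For intuition about why the bass register speaks slowly, each string mode can be idealized as a linearly driven resonator (a textbook approximation introduced here for illustration, not taken from the MRP literature). Driven at its resonant frequency $f_0$ with quality factor $Q$, the mode's amplitude envelope rises as

    $a(t) = a_{\infty}\left(1 - e^{-t/\tau}\right), \qquad \tau = \frac{2Q}{\omega_0} = \frac{Q}{\pi f_0}.$

Because the rise time $\tau$ scales with $Q/f_0$, the low, lightly damped bass strings take longest to reach full amplitude, consistent with the speaking times reported above.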

The MRP is intended to augment, not replace, the traditional piano. Because the hammers remain usable, we initially felt that slow resonator speaking time was not an issue. Many musicians wanted the ability to explore hammerless sounds of the instrument, however, even while using it in fast passages.

Timbre

Hammer-actuated piano notes contain dozens of harmonics, but the original MRP tones featured a strong fundamental with few overtones. As a result, piano and resonator timbres differed significantly. Much of the difference is rooted in the mechanics of the instrument, and no electromagnetic system can precisely replicate the effect of the hammer striking the string. Nonetheless, musicians frequently requested resonator sounds closer in timbre to the hammer-actuated piano.

Control and Mapping

Nearly universally, musicians wanted the ability to control as many dimensions of the sound as possible. Polyphonic control of amplitude, frequency, and timbre (itself multidimensional) presents a significant challenge, not only in sensor design but also in mapping. Piano performance is already challenging, and requiring the performer to manipulate too many additional controls risks making the instrument unplayable.

Some musicians preferred to see additional sensors and interface elements beyond the keyboard, including foot pedals, extra knobs, or touch-screen interfaces. Others wanted as many dimensions as possible integrated into the piano keyboard itself. Because DMIs decouple user interface and sound production, it is possible to pursue both strategies in parallel. The two strategies represent fundamentally different visions of the instrument's identity, however, and we have relied on our original artistic goals in deciding which path to pursue. Our research thus far focuses on integrating additional control into the piano keyboard.

Register

The original design covered four octaves of strings centered around middle C (C4). Higher pitches could be played as harmonics on lower strings, but their volume was limited. Musicians requested a resonator system covering all 88 notes of the piano.

Setup Time

The MRP installs in any grand piano and removes cleanly. Setup in a new instrument typically takes two hours, most of which comes from adjusting the actuator brackets to match the piano's string spacing. The original design could not be scaled up to 88 notes while still allowing installation in the time available before a concert. Broader adoption of the instrument thus required a more practical mechanical setup.


Other Suggestions

Some suggestions, though compelling, were impractical to implement. Some musicians requested that the resonators actively damp the strings as well as excite them. Active string damping is possible (Berdahl, Niemeyer, and Smith 2008), but its implementation on the grand piano is extremely challenging, requiring independent actuation of the three strings within a single note, individual pickups collocated with the actuators, and near-zero latency signal processing.

Other suggestions were not pursued because we felt they ran counter to the instrument's character. The primary such case was the creation of a sequencer to let the pianist trigger groups of sounds rather than individual notes (form-level rather than note-level or timbre-level control [Wanderley and Orio 2002]). We did use sequencing techniques to a limited extent within particular pieces, but in the general case we preferred to maintain the MRP as an instrument in the traditional sense.

Hardware Changes

Bringing a DMI to a larger musical community frequently entails design changes, especially to the mapping strategies between gesture and sound. For the MRP, relaxing constraints identified by musicians required a fundamental redesign of the electromagnetic actuation hardware.

Amplifier Design

Several constraints resulted from the amplifiers that drive the actuators:

1. Volume was limited by total amplifier power.
2. Attack time was limited partly by amplifier power, and partly by signal pre-processing steps between computer and amplifiers.
3. Timbre, especially in the case of bright timbres, was limited by the high-frequency performance of the amplifiers and actuators. This in turn was limited by the slew rate (maximum slope) of the actuator current, which was ultimately dependent on the voltage swing of the amplifier (see the model following this list).
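To make the third point concrete, the actuator coil can be idealized as a series resistance $R$ and inductance $L$ (an approximation added here for illustration). With drive voltage $v(t)$ and current $i(t)$,

    $v(t) = L\,\frac{di}{dt} + R\,i(t) \quad\Rightarrow\quad \left|\frac{di}{dt}\right| \leq \frac{V_{\max} - |i|\,R}{L},$

so the achievable current slew rate, and with it the high-frequency (bright-timbre) output, is bounded by the amplifier's available voltage swing $V_{\max}$. This is consistent with the decision, described below, to double the output voltage swing in the new amplifier design.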

Figure 4. New MRP design, showing the rapid-adjust bracket inside the piano.

The original amplifier design can be found in McPherson (2010). The new design produces enough power that actuator heating, not amplifier performance, becomes the limiting factor, and it doubles the output voltage swing. A technical analysis of the new design can be found in McPherson (2012); this article will instead maintain focus on the instrument's musical implications.

Actuator Design

The original MRP actuators were wound by hand, a tedious process. The new MRP uses machine-wound actuators, which provide better performance over all 88 notes. Because the new design is machine-producible, it allows multiple copies of the instrument to be built. As yet, only a single MRP has been produced on the new design, but replicability and access are important considerations in DMI community-building, especially for composers interested in having their pieces played repeatedly.

We also developed a new actuator bracket design (see Figure 4) that allows each actuator to be moved in two dimensions and locked with a single wing-nut. The new bracket can be machine-produced and helps address the setup time constraint.


Mapping for Performance

Redesign of the MRP hardware was necessary to relax constraints on volume, attack time, and timbre. Addressing musician requests on controlling the resonator sounds focused on refining the mapping between keyboard and electromagnet behavior. Albert Einstein is often quoted (apocryphally) as saying "everything should be made as simple as possible, but not simpler." Similarly, based on Cook's bandwidth observation, we can state our mapping principle as: "Put as much control as possible in the hands of the performer, but no more." The MRP will be played primarily by pianists with a highly developed gestural vocabulary. Most potential performers felt that extra dimensions of control would give them more expressive freedom, but existing piano technique leaves relatively few spare motions. Our mappings sought to find means of control from within traditional technique while minimizing the number of new interface elements.

Augmented Keyboard Interface

When Secrets of Antikythera was composed in 2009, the MRP was played from two keyboards: the piano keyboard, using a Moog Piano Bar (Computer Music Journal 2005), and an auxiliary MIDI keyboard atop the piano. We found that MIDI note onsets and releases were inadequate for continuously controlling multiple dimensions of a note.

We thus modified the Piano Bar to extract continuous key position, measured at 600 samples per second per key. Continuous key position enabled a variety of extended techniques: gradual and partial key presses, taps and sweeps on the key surfaces, pressure into the key-bed, and vibrato gestures. It also allowed a detailed investigation of piano "touch" that showed that a single press gesture could be independently varied in multiple dimensions beyond MIDI velocity (McPherson and Kim 2011).
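To illustrate the kind of information a continuous key-position stream provides, the Python sketch below derives two simple features from position samples at the stated 600 Hz rate. The feature names, normalization, and example gestures are invented for illustration and do not reproduce the MRP's actual feature extractor.

    # Illustrative only: simple continuous features from a key-position stream
    # sampled at 600 Hz (position normalized 0.0 = key at rest, 1.0 = key-bed).
    SAMPLE_RATE = 600.0

    def key_features(positions):
        """Return peak downward velocity (key depths per second) and maximum
        depth for one press gesture. Velocity comes from finite differences of
        successive samples; depth separates full presses from partial ones."""
        velocities = [(b - a) * SAMPLE_RATE for a, b in zip(positions, positions[1:])]
        peak_velocity = max(velocities, default=0.0)
        max_depth = max(positions, default=0.0)
        return peak_velocity, max_depth

    # A gradual, partial press and a fast, full press produce very different
    # feature values, even though MIDI would reduce each to a single onset.
    slow_partial = [i / 200 for i in range(120)]        # reaches depth 0.6 over 200 ms
    fast_full = [min(1.0, i / 12) for i in range(30)]   # reaches the key-bed in 20 ms
    print(key_features(slow_partial))   # low peak velocity, depth ~0.6
    print(key_features(fast_full))      # high peak velocity, depth 1.0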

One Mapping or Many?

Multiple continuous features could be extracted from key motion, and multiple parameters of each electromagnet can be controlled. The challenge is how the two parameter spaces should relate. In working with musicians, it became clear that the needs of composers and improvisers could be quite different, and consequently, no one mapping strategy could serve all cases. For the improviser, as many sounds as possible should be available at the fingertips; for notated music, the piece can determine what sounds need to be available at any given time. For pieces that need only a few sounds, limiting the dimensions of control reduces the complexity for the performer, potentially resulting in a better performance.

Figure 5. Two-layer mapping from key motion to actuator parameters. The second layer is dynamically adjustable; the standardized mapping is shown in solid black lines.

Toward a Standardized Mapping

Moving from a composer-specific instrument to a more generically usable one required a mapping suited to a wide variety of musical situations. This standardized mapping will be what a new performer first encounters, and we designed it to be one improvising musicians could use. Although it is not possible to simultaneously, independently control every sonic parameter, we aimed to provide as many dimensions as possible in a way that is repeatable and reliable and that encourages exploration. The standardized mapping uses a two-level strategy (see Figure 5):


1. Features of key motion are mapped to four intermediate parameters: intensity, brightness, harmonic, and pitch. The parameters, whose meanings are explained in the following, are quasi-independent in that not all combinations are possible but none is a linear combination of the others.

2. Intermediate parameters are mapped to actuator behavior, controlling amplitude, frequency (relative to string fundamental), and waveform (combinations of harmonics, affecting amplitudes, phases, and centroid in Figure 5).

Layer 1 is fixed in code; Layer 2 can be dynamically adjusted based on an XML parameter file. Mappings in Layer 2 are one-to-many and can be linear or logarithmic, so the entire system is a many-to-many mapping between features of key motion and actuator parameters. Both layers relate specifically to actuator behavior; because no mechanical alterations are made to the piano, hammer strikes (when present) maintain their normal function.

In Layer 1, raw key position from top to bottom controls intensity. When the key reaches the key-bed, further pressure controls brightness. (Because felt separates the key from the key-bed, pressure produces a slight deviation in position.) A vibrato gesture on a resting key, produced by light taps or by gripping the key between thumb and forefinger, controls harmonic, whose value increases linearly with time at a rate proportional to the vibrato speed. Pitch is controlled by a combination of two adjacent keys: When one "center" key is held, partial presses on neighboring white keys bend the pitch of the center key up or down. This mapping temporarily overrides the intensity parameter on the adjacent key.

With the exception of raw key position, none of the techniques just mentioned (pressure, vibrato, partial presses) appear in traditional piano performance. At the same time, the gestures can be easily learned and executed. The names of the parameters reflect their usual function, but they can be freely mapped to any sonic parameters. In the standardized mapping, intensity controls (logarithmic) amplitude and waveform, producing louder, brighter tones for deeper key presses. Brightness further controls waveform, pushing the centroid of the spectrum higher. Pitch maps logarithmically to frequency (a semitone in either direction) as well as to amplitude. Harmonic maps to frequency as an integer multiple of the fundamental and to amplitude. Both frequency-related effects increase the amplitude to compensate for the string's reduced response away from its fundamental frequency. An additional parameter, loop gain, is needed to bend pitches away from the string's natural frequency (McPherson 2010).
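A minimal Python sketch of this two-layer structure may clarify the data flow; the parameter names, curve shapes, and routing table below are invented for illustration and do not reproduce the MRP's actual mapping code.

    import math

    # Layer 1 (fixed in code): key-motion features -> four intermediate parameters.
    def layer1(key_position, key_pressure, vibrato_amount, neighbor_press):
        """Illustrative first layer: key depth drives intensity, key-bed pressure
        drives brightness, accumulated vibrato drives harmonic, and a partial
        press on an adjacent key drives pitch (in semitones, -1.0 to +1.0)."""
        return {
            "intensity": key_position,     # 0.0 = key at rest, 1.0 = key-bed
            "brightness": key_pressure,    # pressure beyond the key-bed
            "harmonic": vibrato_amount,    # grows with sustained vibrato gestures
            "pitch": neighbor_press,       # bend from the neighboring key
        }

    # Layer 2 (configurable per piece, e.g. loaded from an XML file): one-to-many
    # routes from intermediate to actuator parameters, linear or logarithmic.
    STANDARD_ROUTES = [
        # (source,      destination,      curve,  weight)
        ("intensity",  "amplitude",       "log",  1.0),
        ("intensity",  "spectral_slope",  "lin",  0.5),
        ("brightness", "spectral_slope",  "lin",  1.0),
        ("pitch",      "freq_semitones",  "lin",  1.0),
    ]

    def layer2(intermediate, routes):
        """Sum the routed contributions into actuator parameters."""
        out = {}
        for source, dest, curve, weight in routes:
            x = intermediate[source]
            if curve == "log":
                x = math.log10(1.0 + 9.0 * max(x, 0.0))  # 0..1 mapped logarithmically to 0..1
            out[dest] = out.get(dest, 0.0) + weight * x
        return out

    # A deep key press with a little after-pressure:
    print(layer2(layer1(0.9, 0.2, 0.0, 0.0), STANDARD_ROUTES))

Swapping the routing table then changes the instrument's response without touching the fixed gesture analysis, which is the property the per-piece XML configuration relies on.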

Real-Time Mapping Adjustment

The standardized mapping allows real-time exploration of volume, pitch, and certain aspects of timbre. It represents a compromise between detail of control and playability, and thus certain sounds are not easily accessible. Limitations include a restricted range of timbre, inability to polyphonically select different harmonics on different notes, and relatively constant volume when keys are held all the way down (as they are in traditional piano playing, regardless of attack velocity).

The XML file defining Layer 2 can be reconfigured for every piece. It can also define multiple mappings, selectable with MIDI Program Change or Open Sound Control (OSC) messages. Each mapping program can define behavior on a key-by-key basis; for example, allowing only a certain range of pitches to have the resonator added. Global resonator volume can also be externally set by MIDI or OSC. Mapping changes lack the expressive immediacy of keyboard control, but they allow access to a wider range of sounds and are especially useful in notated compositions. The next section shows how they were used in specific pieces.
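As an example of what such external control might look like, the sketch below sends a mapping-program change and a global volume value over OSC using the python-osc package. The host, port, and OSC addresses are hypothetical; the MRP software's actual address scheme is not given here.

    from pythonosc.udp_client import SimpleUDPClient

    # Hypothetical host, port, and addresses; the real MRP software defines its own.
    client = SimpleUDPClient("127.0.0.1", 7770)

    client.send_message("/mrp/program", 3)           # select mapping program 3 for the next piece
    client.send_message("/mrp/global-volume", 0.6)   # scale the overall resonator level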

Composer Collaboration

We undertook a yearlong project in which six composers from Philadelphia, Pennsylvania, and Princeton, New Jersey, were invited to write new pieces for the MRP. The composers were David Carpenter, William Derganc, Daniel Fox, Daniel Shapiro, Jeffrey Snyder, and Tony Solitro. At the time of composition, the group included one final-year undergraduate, three graduate students, and two post-PhD professional musicians. Each participant had previously seen the instrument at a concert performance or interactive demonstration, but none had been involved in its development. One composer (Snyder) worked primarily with electronic and novel instruments; the other five had previously composed primarily for acoustic instruments, though two (Solitro and Derganc) had incorporated electronics in previous pieces.

At the start of the project in early 2011, we decided to present two performances of the new pieces. Particularly with new or unusual instruments, multiple performances better justify the time composers spend creating new works. Four composers (Fox, Shapiro, Snyder, and Solitro) chose to write solo pieces for MRP; one composer (Derganc) wrote for violin and MRP; and one composer (Carpenter) wrote for baritone voice and MRP. Aside from the MRP, no other electronic elements were used in any piece.

Working Methods

Access to the instrument was a crucial component of the composition process, and this was made easier by the fact that all composers were local to Philadelphia, where the project was carried out. Most composers' first direct contact with the instrument was at a group demonstration session (described earlier). Each composer subsequently came to the Music and Entertainment Technology Laboratory at Drexel University, where the MRP was set up, to work individually with the instrument. Individual sessions with the instrument included guided explorations (engaging with the instrument's intended affordances) and unstructured improvisation (exploring constraints).

Composers took varying approaches to the writing process. One composed mostly at the MRP; three others composed mostly away from the MRP but at a keyboard; two more indicated they did not use a keyboard at all. In a follow-up survey, one composer explained:

[W]hile the entirety of my work was physically written away from the instrument, the bulk of the compositional process was shaped by several meetings well in advance (greater than six months) of the work's completion. I rarely work at the intended instrument(s) for which I compose, as it is usually not pragmatic; however, in this case, I (somewhat subconsciously) believed it might be a hindrance to my imagination if I were to content myself with producing the sounds I could produce at the instrument, thereby letting my own technical limitations dictate the scope of the work's sound-world.

In general, even the composers who worked away from the instrument indicated that the original meetings shaped their ideas or served as an opportunity to test sounds before including them in the piece. One composer wrote an initial version of the piece for piano and external electronics (Max/MSP), with the idea of later transferring it to MRP. He felt that this process "freed me to imagine the sounds I wanted, and then to find an effective realization after the fact."

Sometimes, either as a result of experimentation or as an idea developed during composition, composers would request sounds that were not easily achievable with the standard mapping between keyboard and actuators. We encouraged the composers not to feel limited by the default set of mappings, and in this case we would work to develop custom mappings to allow the intended sound to be performed. Composer Shapiro, exploring the pitch bends that are played with two adjacent keys in the standard mapping, was drawn to the sound of the resonator when bent slightly below the natural pitch of the string. He requested a mapping that would produce this out-of-tune sound on every note by default. This was achieved by adjusting the relative center frequency in the second-stage mapping layer described in the previous section; the resulting mapping was used in the second movement of Shapiro's piece (Sound Examples 1 and 2). [Editor's note: This article's sound examples are posted online at www.mitpressjournals.org/toc/comj/36/4 and will also appear on the 2013 Computer Music Journal Sound and Video Anthology DVD.]

Rehearsal and Performance

Once the draft scores were complete, we worked with the composers and performers to finalize how each piece would be performed. This step included implementing piece-specific mappings, getting feedback from performers on the most natural way to perform certain techniques, and working with the composer to develop a consistent notation for any novel effects.

Performers for the project included four pianists, a violinist, and a baritone vocalist. None of the pianists had played the MRP before the project began. Two pianists worked with the composers during the composition process, with pianist Feifei Zhang taking a role in three pieces over the months leading up to the concert; the other three pianists joined the project during the final rehearsals. The rehearsal process served to familiarize the performers with the MRP and to provide feedback to the composers, who sometimes made changes to the score in response to early rehearsals.

Two performances were given in December 2011, one at Drexel University and one at Temple University. The resonator system was installed in a Steinway D grand piano in both cases, though a different instrument at each venue. Because the response of the resonator system in the concert pianos differed from the 5-foot-long Knabe baby grand used for early rehearsals, an extended dress rehearsal before the first concert gave composers a chance to make last-minute adjustments.

During the performances, author McPherson sat in the front row of the auditorium, using an iPod touch to dynamically adjust mappings and overall volume via OSC. Separate mappings were loaded for each piece, and for some pieces, mappings and volume were adjusted several times within the piece. Though some of these controls could have been given to the pianist, we decided that it was better to minimize the number of new techniques the pianist needed to learn, especially for players who joined the project in the final rehearsals. In all cases, primary expressive control remained with the performer, as there were few scripted or sequenced actions in any piece.

Results and Observations

All six composers who started the project completed pieces, and all pieces were successfully presented in concert. In the week following the concert, composers were given a written questionnaire covering their experiences on the project and their suggestions for improvement. Based on this questionnaire, the completed scores, and observations of the composition process, several interesting trends emerged.

Notation

Ferguson and Wanderley argue that the ability to consistently reproduce a performance from notation is a primary metric of success for DMIs (Ferguson and Wanderley 2010). We decided early in the project not to attempt to prescribe a uniform set of notational conventions. The capabilities of the instrument had evolved since Secrets of Antikythera was written in 2009, and it was also expected that each composer would imagine different sounds and techniques. With guidance from the authors, each composer was encouraged to develop a personal notation system, with the idea that similarities in approaches could guide eventual standardization.

Notation strategies could be classified according to whether they noted the sound of a gesture or its means of execution. In the former category, all composers used standard hairpin notation for dynamics: Crescendos were notated as they would be on any other acoustic instrument. Likewise, pitch bends were mostly notated as they sounded rather than how they were played (a gesture involving two adjacent keys).

In the standard MRP mapping, a resonator sounds for as long as its key is held down. If the key is released with the damper pedal depressed, the note will then follow the natural decay of the piano. By contrast, in traditional piano technique the pianist can often choose whether to sustain notes by holding the keys or using the pedal without a large perceptual difference in sound. We adopted the term "organ sustain" to refer to notes sustained with the resonator and "piano sustain" for notes allowed to decay with the pedal down; some pianists found adjusting to this distinction one of the more difficult aspects of MRP technique.

Figure 6. Excerpt from Spectra of Morning by Tony Solitro. Long notes, as well as notes followed by black horizontal bars, are played with organ sustain. Staves below the piano indicate audio played through the piano strings. (Sound Example 3; excerpt starts at 0:24.) Copyright © 2011 by Tony Solitro, tonysolitro.com.

In Tony Solitro's Spectra of Morning (see Figure 6; Sound Example 3), the long notes are held with organ sustain. The effect is especially significant on the bass pedal tones, which can be held for multiple bars at a time. Solitro also uses solid black lines to indicate shorter notes within the texture that use the resonator to persist beyond their notated length. The remaining notes are played with piano sustain (the damper pedal held but no resonator added).

A related distinction concerned notes played with resonator but without hammer. Most composers used standard notation to indicate notes played with hammer, using the phrases "no hammer" or "n.h.", or diamond noteheads, to indicate resonator-only passages. David Carpenter's Job took the opposite approach (see Figure 7; Sound Example 4). By default, notes are played with resonator only and held with organ sustain. Triangular noteheads indicate pitches played with both hammer and resonator.

Figure 7. Excerpt from Job by David Carpenter. Standard noteheads are played with resonator only, triangular noteheads (m. 39) with resonator and hammer. (Sound Example 4.) Copyright © 2011 by David Carpenter. All rights reserved.

Figure 8. Excerpt from The Masons of Heidelberg (movt. I) by Daniel Shapiro. Extended techniques include harmonic glissandos (m. 30), aftertouch timbre modification, and pitch bends (m. 34). Sound Example 1 is from earlier in this movement; Sound Example 2 is from movement II.

Effects based on harmonics or timbre had fewer traditional notation strategies to draw on. In these cases, notation focused primarily on how the effects were executed, with a secondary emphasis on showing the sonic result. Daniel Shapiro's The Masons of Heidelberg (see Figure 8; Sound Examples 1 and 2) used harmonic glissandos, pitch bends, and timbre changes within notes. In the standard mapping, upward harmonic glissandos are produced by gently vibrating the key between thumb and forefinger; in consultation with the authors, Shapiro used a diagonal wavy line (m. 30) which signifies a combination of the vibrato gesture and the resulting sweep up in pitch. In the last measure of Figure 8, two extended techniques are used together. Aftertouch (key pressure) is used to create a brighter timbre; Shapiro notates this using hairpins and the marking "a.t." Simultaneously, a pitch bend pulls the pitch of the E upward. The pitch bend is executed by lightly pressing the neighboring key (F); Shapiro's notation focused on the method of execution by showing the direction of key motion rather than the direction of pitch bend. In all scores, unfamiliar notations are explained in a preface or appendix.

Relationship to the Traditional Piano

Augmented instruments provide a challenge for composers: To what extent should the long tradition of the underlying acoustic instrument guide the composition? The composers' approaches can be divided into three categories: piano-driven, resonator-driven, and oppositional. The piano-driven approach treats the MRP as an extension of the acoustic piano, using traditional piano technique as the foundation of the piece. Composer Fox (Sound Example 5) wrote: "I tried to use the resonator sounds as a continuous outgrowth of the piano sounds. I used the resonator for organ-like percussion-less attacks, for sustain, and for new tone qualities." Derganc (Sound Example 6) wrote: "I used the MRP in a very limited way. . . . I wanted a very even mixture of piano and MRP in my piece." Both pieces are driven primarily by traditional (hammer-actuated) piano technique with resonator used to color and extend the piano sound. For Solitro, too, the resonator acted as an outgrowth of the traditional piano: "For my piece, the resonator sounds often sustain notes and chords in ways that would otherwise be impossible. [MRP pedal tones] provide a harmonic foundation, while the piano plays traditionally sounding ornate and fantasy-like passage-work against the pedal."

Other composers concentrated on resonator sounds while pushing traditional piano technique to the background. Carpenter primarily used sounds with no hammer attack, eliciting an organ-like quality from the MRP. Shapiro also challenges the primacy of the piano: "I don't hear much 'traditional' piano in my work, as I don't hear the MRP as an augmented piano. It is simply a new instrument that bears a relationship to the piano, much the same as the harpsichord bears a relation to the piano, in terms of mechanism alone."

Jeffrey Snyder's approach to the MRP is best described as oppositional. Snyder writes: "In my piece, the resonator sounds are generally microtonal . . . by contrast, all the traditional piano writing uses equal temperament (by necessity), so the combination of resonator and traditional piano represent a sort of struggle and interaction between the tuning systems." Snyder's Fantasy is played from two keyboards, with the top keyboard controlling resonators alone and the bottom keyboard hammers alone, so there is no overlap in the sound sets.

Mapping and Control

Each piece used a different set of mappings, sometimes multiple mappings within a single piece. Mapping changes were primarily used to change timbre and which notes had the resonator added, a role analogous to stops on an organ. Carpenter, Derganc, Fox, and Shapiro used variations on the standard mapping, which uses continuous key position to shape each note. In contrast, Solitro used standard MIDI (onset and release) from the piano keyboard, which proved more reliable for sustain effects where continuous shaping was not required. Snyder used MIDI from an auxiliary keyboard, with the lowest octave of keys reserved for changing mappings. In Figure 9, notes C2–B2 change the harmonic (1–12, respectively) of the following notes. Notes C3 and above play the previously selected harmonic on the strings two octaves below notated; for example, the C# at the beginning of m. 3 will play the third harmonic (G#4) of the strings for the note C#3. The sounding pitches will thus be markedly different from the notated pitches.
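The selection logic Snyder’s score relies on can be summarized in a few lines of code. The sketch below is a reconstruction from the description above and from the Figure 9 caption, not the project’s own software; the class and function names are invented, and the example assumes the notated C# in m. 3 is C#5.

    # Hypothetical reconstruction (Python) of the harmonic-selection mapping
    # used in Snyder's Fantasy; not the actual MRP code.
    C2, B2, C3 = 36, 47, 48               # MIDI note numbers

    class HarmonicSelector:
        def __init__(self):
            self.harmonic = 1             # default to the fundamental

        def note_on(self, midi_note):
            """Return (string_note, harmonic) to actuate, or None for a selection key."""
            if C2 <= midi_note <= B2:
                self.harmonic = midi_note - C2 + 1       # C2 = 1 ... B2 = 12
                return None
            if midi_note >= C3:
                return (midi_note - 24, self.harmonic)   # strings two octaves below notated
            return None

    def harmonic_frequency(string_note, harmonic):
        """Ideal frequency of the requested partial (string inharmonicity ignored)."""
        return 440.0 * 2 ** ((string_note - 69) / 12) * harmonic

    # Selecting harmonic 3 (key D2), then playing a notated C#5 (MIDI 73),
    # drives the third harmonic of the C#3 string, approximately G#4.
    sel = HarmonicSelector()
    sel.note_on(38)                       # D2 selects harmonic 3
    string_note, h = sel.note_on(73)      # notated C#5
    print(round(harmonic_frequency(string_note, h)))    # about 416 Hz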

Tony Solitro’s Spectra of Morning contained several short passages in which a sequence of notes or a pre-recorded sound was played through the resonator system. In Figure 6, a recording of a vocalist is played into several resonators (G3, G#3, A3, B3, C4, and D4, one octave below each pitch in the excerpt) to produce a ghostly, reverberant shadow of the original sound from within the piano. These effects were triggered externally (by the author acting as patch-changer) and represent one of the few cases in which the pianist did not have note-level control over the result.
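One plausible way to realize this routing, sketched under the assumption that the recording is simply sent to the actuator channel of each target string, is shown below. The send_to_actuator callback and the channel addressing are hypothetical; the article does not describe the actual signal path.

    # Hedged sketch (Python): distribute a pre-recorded buffer to the strings
    # one octave below the notated pitches, as in Spectra of Morning.
    NOTE_NAMES = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                  "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

    def midi_number(name, octave):
        """MIDI note number, with C4 = 60."""
        return 12 * (octave + 1) + NOTE_NAMES[name]

    def route_recording(buffer, notated_pitches, send_to_actuator):
        """Send the same buffer to the string one octave below each notated pitch."""
        for name, octave in notated_pitches:
            send_to_actuator(midi_number(name, octave) - 12, buffer)

    # Notated pitches implied by the excerpt; the strings driven are
    # G3, G#3, A3, B3, C4, and D4 (MIDI 55, 56, 57, 59, 60, 62).
    excerpt = [("G", 4), ("G#", 4), ("A", 4), ("B", 4), ("C", 5), ("D", 5)]
    route_recording(b"vocal recording", excerpt, lambda note, buf: print(note))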



Figure 9. Excerpt from Fantasy by Jeffrey Snyder. (a) Notes in lowest octave select harmonics for following notes (C2 = 1 to B2 = 12). (b) “pno.” and “e.k.” indicate piano keyboard and electronic MIDI keyboard, respectively. (Sound Examples 7 and 8 [starting at 0:14].)

Composer Feedback

Feedback from composers was very positive. Most composers indicated that the results were close to their expectations. One composer wrote that his piece “turned out very true to my expectations, and sounded better on many things than I had imagined. Especially upon the switch to the Steinway D concert grand for the performance, many of the subtle details I had written into resonator-intensive sections of the piece came out beautifully.”

We asked each composer to suggest improvements to the instrument. One requested a GUI to allow easier exploration of new timbres. Two composers found the piano dampers limiting: The damper must be lifted for a resonator note to sound, requiring either the key or the pedal to be held. The composers suggested that a MIDI-enabled piano might allow dampers to be lifted automatically as required.

Several suggestions echoed feedback we had received about the first revision of the instrument. Two composers requested resonator tones that speak more quickly, allowing faster passages without hammers. One composer requested resonator timbres that more closely match the traditional piano. These comments suggest that the design revisions were targeted to relevant areas, but only partly satisfied the composers’ requests. Finally, several composers wanted greater control from the keyboard over resonator dynamics. We will explore alternative volume mappings that do not require the performer to partially press keys to achieve softer sounds. Like earlier feedback, composer comments focused on ways to relax constraints of the current instrument.
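One alternative of the kind the composers asked for would be to derive the resonator level from key velocity at onset, so that soft dynamics no longer require half-pressed keys. The sketch below is offered only as an illustration of that idea, with an invented taper; it is one possibility, not necessarily the mapping the authors will adopt.

    # Illustrative velocity-based volume mapping (Python); an assumption, not a
    # documented MRP feature.
    def resonator_level(velocity, curve=2.0):
        """Map MIDI velocity (0-127) to a 0..1 resonator drive level."""
        v = max(0, min(127, velocity)) / 127.0
        return v ** curve            # a power-law taper keeps low velocities soft

    print(resonator_level(32), resonator_level(100))   # soft vs. loud strike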

Conclusions

Establishing a continuing musical role for a digital musical instrument is a challenge. We have presented a case study in musical community-building using the magnetic resonator piano. Initial musician feedback guided a complete revision of the hardware system with a focus on relaxing constraints. A collaborative project with six composers and four pianists resulted in the performance of six new pieces for MRP. We intend to present further performances of this repertoire and establish more collaborations with performers and composers. Particular priority will be placed on working with improvising performers, who may take a different approach to the instrument than composers of notated music.

Recommendations for DMI Designers

Based on our experience with the MRP, we offer the following suggestions to instrument designers seeking to establish their instrument in a broader community:

1. Design for the first performance; then iterate. The instrument and the piece may initially be linked, but they are not equivalent; it is not possible to build a community around a piece. Once the first performances are finished, the designer should focus on how to let other musicians take personal artistic ownership of the sounds and techniques, a process which will likely focus on relaxing constraints. It is not necessary to involve a large number of musicians in the first design, but feedback is important to the revision process.

2. Demonstrate uniqueness, but connect to familiar models. Potential users of the instrument should have a compelling reason to use the new design rather than existing instruments. Every professional musician, however, has spent many years acquiring a specific skill set, and where the techniques and sounds of a new instrument can be connected to existing experience, greater adoption and better performances are likely to result.

3. Sell to the audience; follow up with the performer or composer. The first encounter a potential collaborator has with the instrument is likely to be as an audience member. A convincing performance from the audience perspective will help persuade performers and composers to explore the instrument in more detail.

4. Provide access. Access is important during the artistic creation process: Composers and performers will need regular interaction with the instrument to produce pieces. Even where it is possible to make multiple copies of an instrument, access to the designer is also important for answering questions and suggesting areas of exploration. Access is also important after the collaboration is finished. Composers want to know their pieces will have a continuing performance life, and performers will want to spend their time acquiring skills that can be reused. A strong commitment to pursuing the instrument beyond the duration of a research project can itself help in acquiring and maintaining a broad community of collaborators.

Acknowledgments

Many thanks to the composers involved in the project and to all the performers: Feifei Zhang, Katya Popova, Eric Wubbels, Matthew Bengtson, Lawrence Indik, and Noco Kawamura. Thanks also to members of the Drexel University Music Entertainment Technology Lab and the Temple University Music Department for making the concerts possible. This article is based upon work supported by the National Science Foundation under grant no. 0937060 to the Computing Research Association for the CIFellows Project.

References

Benford, S. 2010. “Performing Musical Interaction: Lessons from the Study of Extended Theatrical Performances.” Computer Music Journal 34(4):49–61.

Berdahl, E., S. Backer, and J. O. Smith. 2005. “If I Had a Hammer: Design and Theory of an Electromagnetically-Prepared Piano.” In Proceedings of the International Computer Music Conference, pp. 81–84.

Berdahl, E., G. Niemeyer, and J. O. Smith. 2008. “Feedback Control of Acoustic Musical Instruments.” Technical Report STAN-M-120. Palo Alto, California: Stanford University.

Computer Music Journal. 2005. “Products of Interest.” Computer Music Journal 29(1):104–113.

Cook, P. 2009. “Re-Designing Principles for Computer Music Controllers: A Case Study of SqueezeVox Maggie.” In Proceedings of the Ninth International Conference on New Interfaces for Musical Expression, pp. 218–221.

Ferguson, S., and M. M. Wanderley. 2010. “The McGill Digital Orchestra: An Interdisciplinary Project on Digital Musical Instruments.” Journal of Interdisciplinary Music Studies 4(2):17–35.

Gurevich, M., and A. Fyans. 2011. “Digital Musical Interactions: Performer-System Relationships and Their Perception by Spectators.” Organised Sound 16(2):166–175.

Gurevich, M., P. Stapleton, and A. Marquez-Borbon. 2010. “Style and Constraint in Electronic Musical Instruments.” In Proceedings of the Tenth International Conference on New Interfaces for Musical Expression, pp. 106–111.

Jordà, S. 2004. “Instruments and Players: Some Thoughts on Digital Lutherie.” Journal of New Music Research 33(3):321–341.

Jordà, S., et al. 2007. “The reacTable: Exploring the Synergy Between Live Music Performance and Tabletop Tangible Interfaces.” In Proceedings of the First International Conference on Tangible and Embedded Interaction, pp. 139–146.

Magnusson, T. 2010. “Designing Constraints: Composing and Performing with Digital Musical Systems.” Computer Music Journal 34(4):62–73.

McPherson, A. 2010. “The Magnetic Resonator Piano: Electronic Augmentation of an Acoustic Musical Instrument.” Journal of New Music Research 39(3):189–202.

McPherson, A. 2012. “Techniques and Circuits for Electromagnetic Instrument Actuation.” In Proceedings of the Twelfth International Conference on New Interfaces for Musical Expression (pages unnumbered).

McPherson, A., and Y. Kim. 2011. “Multidimensional Gesture Sensing at the Piano Keyboard.” In Proceedings of the 29th ACM Conference on Human Factors in Computing Systems, pp. 2789–2798.

Murray-Browne, T., et al. 2011. “The Medium is the Message: Composing Instruments and Performing Mappings.” In Proceedings of the Eleventh International Conference on New Interfaces for Musical Expression, pp. 56–59.

New York Times. 2011. Obituary: Steven P. Jobs. http://www.nytimes.com/2011/10/06/business/steve-jobs-of-apple-dies-at-56.html, 5 October.

Newton, D., and M. T. Marshall. 2011. “Examining How Musicians Create Augmented Musical Instruments.” In Proceedings of the Eleventh International Conference on New Interfaces for Musical Expression, pp. 155–160.

Nicolls, S. 2010. “Seeking Out the Spaces Between: Using Improvisation in Collaborative Composition with Interactive Technology.” Leonardo Music Journal 20:47–55.

Oliver, J., and M. Jenkins. 2008. “The Silent Drum Controller: A New Percussive Gestural Interface.” In Proceedings of the International Computer Music Conference, pp. 651–654.

O’Modhrain, S. 2011. “A Framework for the Evaluation of Digital Musical Instruments.” Computer Music Journal 35(1):28–42.

Paine, G. 2009. “Towards Unified Design Guidelines for New Interfaces for Musical Expression.” Organised Sound 14(2):142–155.

Rasamimanana, N. 2012. “Towards a Conceptual Framework for Exploring and Modelling Expressive Musical Gestures.” Journal of New Music Research 41(1):3–12.

Redström, J. 2006. “Towards User Design? On the Shift from Object to User as the Subject of Design.” Design Studies 27(2):123–139.

Schloss, W. A. 2002. “Using Contemporary Technology in Live Performance: The Dilemma of the Performer.” Journal of New Music Research 32(3):239–242.

Stewart, D. A. 2009. “Digital Musical Instrument Composition: Limits and Constraints.” In Proceedings of the Electroacoustic Music Studies Network Conference (pages unnumbered).

Wanderley, M. M., and N. Orio. 2002. “Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI.” Computer Music Journal 26(3):62–76.
