

Published on May 04, 2021

Please take a look here to see what events we ran for the network, and for documentation of these events.

FMN ‘What’s next?’ meeting

The funded period of the network officially finishes at the end of May, but of course we hope that activities can carry on. We would like to organise a Zoom meeting for anyone who would like to participate in any future activities - research papers, concerts, funding applications, artistic collaborations, etc. - and also to reflect on the activities of the past 12 months. This meeting will take place on May 23rd, 1pm - 2/3pm BST.

Here’s an export from the padlet we used in the meeting.

FMN Meeting 3:

University of Sussex, UK, May 2022

Public events (in-person and live streamed):


Volks, Brighton, 7-10pm, May 11th (full program) (facebook event link) (live stream)

The Meeting House, Uni Sussex Campus, Falmer, 6-9pm, May 12th (full program) (facebook event link) (live stream)


Cathy van Eck. Silverstone 121, 10am, May 13th (details) (live stream)

Symposium (limited capacity, by invitation)

Full Program

FMN Meeting 2: Athens, March 2022

FMN Meeting 1: Copenhagen, December 2021

Event report:

We will be streaming events from this meeting, event link:

Schedule: (all times UTC+1)

Nov 30th, 7-9pm Performances & 10:30-11:30pm improv session: UKirke (Dannebrogsgade 53, 1660 København) FEEDBACK EXPLORATIONS Concert Program (final):


Dec 1st, 10:30am, Konference Salen (1.001): Keynote from Nicolas Collins, followed by workshop & symposium.

Dec 2nd: Aalborg University Copenhagen (A.C. Meyers Vænge 15, 2450 København), livestream:

Links for participants:


Workshop info:

17th Sept 2021, 3-5pm UTC: online network meeting.

Feedback Musician Network Meeting #2

Øyvind Brandtsegg: Finger-mounted piezos - exploration by touch

3:00-3:45 pm UTC (last 5 minutes for discussion)

The talk + performance will present musical instruments where finger-mounted piezo pickups are used to explore feedback resonances by touch. The technique makes it easy to change the pickup position with relatively little noise from mechanical handling, and as such allows performative exploration of vibrational modes. The finger acts as a filter in the feedback circuit, and variations in performative gesture (finger pressure, angle of incidence, touching with the nail or the flesh) can thus selectively bring out potential resonances of the object.

Some methods of mapping the instrument's potential pitch space have been investigated, and these will be presented and discussed. Attendees are also invited to reflect on possible refinements in mapping strategy and representation, as well as the utility and usability of such maps in general.
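One way to picture such a pitch-space map is as a simple loop-gain model: the object offers several vibrational modes, the finger acts as a movable band-pass filter in the feedback loop, and whichever mode keeps a round-trip gain above 1 is the one that sounds. The sketch below is a toy linear model of that idea, not Brandtsegg's implementation; the mode frequencies, gains and Gaussian filter shape are all invented for illustration:

```python
import numpy as np

# Toy model: an object with four vibrational modes, each with a loop gain
# measured with the "finger filter" wide open. A mode self-oscillates only
# when its filtered round-trip gain exceeds 1.
modes = np.array([110.0, 220.0, 330.0, 440.0])   # resonance frequencies, Hz
mode_gain = np.array([1.4, 1.2, 1.1, 1.3])       # open-loop gain per mode

def finger_filter(freqs, center, bandwidth=60.0):
    """Gaussian band-pass standing in for the finger's selective damping."""
    return np.exp(-0.5 * ((freqs - center) / bandwidth) ** 2)

def dominant_mode(center):
    """Return the frequency that feeds back for a given touch position
    (modelled here as the filter's centre frequency), or None if no mode
    reaches a loop gain above 1."""
    loop_gain = mode_gain * finger_filter(modes, center)
    i = int(np.argmax(loop_gain))
    return modes[i] if loop_gain[i] > 1.0 else None

print(dominant_mode(115.0))  # touching near 110 Hz brings out that mode
print(dominant_mode(430.0))  # moving the finger selects the 440 Hz mode
```

Sweeping `center` across a range and recording `dominant_mode` would give a crude map from touch position to sounding pitch, which is roughly the kind of map the talk proposes to discuss.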


Øyvind Brandtsegg is a composer and performer working in the fields of computer improvisation and sound installations. He has a deep interest in developing new instruments and audio processing methods for artistic purposes, and has contributed novel extensions to both granular synthesis and live convolution techniques. Brandtsegg has played on more than 25 music albums in a variety of genres. Since 2010 he has been a professor of music technology at NTNU, Trondheim, Norway.

Semiconductor: Ruth Jarman and Joe Gerhardt: HALO: ring of sound and light

3:45 - 4:30 (last 5 minutes for discussion)

Semiconductor will talk about HALO, their large-scale immersive artwork, which embodies data collected at the world's largest experiment at CERN, the European Laboratory for Particle Physics. HALO is an instrument that exists as both an installation and a stage: 384 piano strings arranged in a circle are both struck and driven by a custom feedback system, creating an immersive experience that links the quantum world of particles, waves and fields.

HALO by semiconductor


Semiconductor is the UK artist duo Ruth Jarman and Joe Gerhardt. Over the past twenty years of collaboration they have become known for a unique and innovative body of visually and intellectually engaging artworks, which explore the material nature of our world and how we experience it through the lenses of science and technology.

They have undertaken fellowships and residencies at a number of prestigious scientific institutions, including CERN, Geneva, Switzerland; the Mineral Sciences Lab at the Smithsonian National Museum of Natural History, Washington DC, USA; the Gulbenkian Galapagos Artists Residency; and the NASA Space Sciences Laboratories, UC Berkeley, California.

Semiconductor exhibit and screen their work internationally. Selected exhibitions and commissions include: Helicase, DeepMind, 2020; The Technological Sublime, City Gallery, Wellington, New Zealand, 2019 (solo show); Superposition, 21st Biennale of Sydney, 2018; HALO, The 4th Audemars Piguet Art Commission, Art Basel, 2018 (solo show); Parting the Waves, Axiom Art and Science Gallery, Tokyo, Japan, 2017 (solo show); No Such Thing As Gravity, National Taiwan Museum of Fine Arts, Taiwan, 2017; The Universe and Art, Mori Art Museum, Tokyo, Japan, 2016; Infosphere, ZKM, Karlsruhe, 2016; Da Vinci: Shaping the Future, ArtScience Museum, Singapore, 2014; Let There Be Light, House of Electronic Arts, Basel, 2013 (solo show); Field Conditions, San Francisco Museum of Modern Art, 2012; International Film Festival Rotterdam, 2012; New York Film Festival: Views from the Avant Garde, 2012; Worlds in the Making, FACT, Liverpool, 2011 (solo show); Earth: Art of a Changing World, Royal Academy of Arts, London, 2009.

Wrap-up questions & discussion session

4:30-5:00pm UTC

4th June 2021, 3-5pm UTC+0: Launch Event (online).

FMN kickoff event

We will have a performance and Q&A from Adam Pultz Melbye, along with a quick introduction to the network and a group discussion about research in feedback musicianship. We will conclude with a group improv using Gianluca Elia’s networked feedback instrument:

Adam Pultz Melbye

Performance Notes:

The FAAB (feedback-actuated augmented bass) was developed in collaboration with Halldór Úlfarsson and is the current focus of my research as well as my artistic practice. The instrument is a double bass fitted with individual string pickups, a microprocessor, an amplifier and a built-in speaker. This concert represents the current stage of my research into developing a feedback performance practice, using both composition and improvisation to explore how traditional musical and technical skillsets become challenged through their encounter with the increased autonomy of a self-actuating, self-resonating instrument. A feature of the FAAB is the development of DSP algorithms that in subtle as well as not-so-subtle ways shape the response of the instrument. These algorithms operate through internal adaptive feedback mechanisms that contribute to increasing systemic complexity, posing a welcome challenge to traditional notions of instrumental mastery.

Adam Pultz Melbye is a Berlin-based double bass player, composer and researcher, currently undertaking PhD-studies at Queen’s University’s Sonic Arts Research Centre, Belfast. His recent practice focuses on joining acoustic and digital systems to expand and explore the outer reaches of double bass resonance and autonomy, often utilising feedback toward this end. He has composed and performed music for sound installations, theatre, film, computer games, dance and sculpture. He appears on around 50 recordings, has performed in Europe, the US, Australia and Japan, while his work has appeared at Wien Modern, Modern Art Museum Albury (Australia) and Kunsthal Nord Aalborg (Denmark). Adam’s work is supported by Senatsverwaltung für Kultur und Europa, Berlin.

Squidback is an adaptive Larsen-effect (audio feedback) generator running in your web browser. It uses your device’s microphone and speaker to generate feedback, then filters it through an automatic equalizer that promotes the rise of a variety of tones while controlling the explosive quality of the most dominant resonating frequencies. It has no user-controllable parameters, but since feedback is affected by the distance between speakers and microphones, and by the resonances of everything between and around them, different results can be obtained by moving devices to different parts of a room or putting them inside or near resonating objects or cavities (e.g. pots and pans, musical instruments, hands, mouths).

A special feature of this application is networked performance, where all participants share sounds in real time, connecting remote rooms and including the Internet among the paths sound can take while travelling back from a speaker to a microphone. It was also conceived as a way to permit collective performances, which would normally require participants to share the same room, under global pandemic conditions.

Squidback can be used on computers, smartphones and tablets, with built-in or external speakers, and participants can use one or more devices at the same time. A modern browser is recommended, preferably Firefox or Google Chrome. On iOS devices (such as iPhones and iPads) the only working option is Safari.
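The automatic-equalizer idea can be sketched as a per-band gain controller: any band whose energy climbs far above the median is cut quickly, while quieter bands are slowly allowed back up, so no single resonance can take over. The sketch below is a toy offline model of that behaviour, not Squidback's actual algorithm; the threshold, attack and release values are invented for illustration:

```python
import numpy as np

def update_gains(energies, gains, thresh=4.0, attack=0.5, release=0.02):
    """One control step of a Squidback-style automatic equalizer
    (illustrative parameters, not Squidback's real values): bands far
    above the median energy are cut hard, everything else recovers
    slowly, so no single resonance can dominate."""
    med = np.median(energies)
    for i, e in enumerate(energies):
        if e > thresh * med:
            gains[i] *= 1.0 - attack                 # runaway band: cut fast
        else:
            gains[i] += release * (1.0 - gains[i])   # quiet band: recover slowly
    return gains

# Toy loop: band 2 feeds back (its energy grows with its own gain),
# the other bands sit at a constant ambient level.
gains = np.ones(8)
for _ in range(20):
    energies = np.full(8, 1.0)
    energies[2] = 10.0 * gains[2]
    gains = update_gains(energies, gains)

print(gains.round(2))  # band 2 is held well below the other bands
```

In the real application this kind of update would run continuously on short analysis frames of the live microphone signal, with the per-band gains applied before the signal leaves the speaker and re-enters the loop.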

For more information, please refer to the SMC2021 paper about Squidback.

Gianluca Elia is a programmer and musician based in Copenhagen, working between open-source software development, improvised electro-acoustic music and the performing arts. Interested in the creative relationship between humans and machines, his recent work is centred on the observation and sonification of technological phenomena such as audio feedback, traffic on wifi networks and the contents of computers’ memory while running programs. He is a contributor to the SuperCollider project and has taught this language and platform for computer music at Rytmisk Musikkonservatorium Copenhagen, where he is also a technical assistant for Artistic Research. As a performer, he is active in the noise and improvised music scene in Europe, playing solo as “me(at), a plastic gun” and in more or less stable constellations such as Perfect Volume (DK/PL/IT) and the Great Danes (DK/IT/CO) performance group.
