Playing by ear in the laboratory

Richard Dobson sat at his computer in England, listening to a New Age cascade of electronic sounds. He had received the files from Argonne physicist Lily Asquith, with whom he was playing a particle-physics-inspired version of Name That Tune. Two files were the calculated sound of Higgs particles, two were quark jets, and two were random sounds. Dobson’s job was to tell Asquith which were which.

“It was a tantalizing exercise,” said Dobson, a musician and programmer with the Composer’s Desktop Project. “The patterns were so interesting; they showed that you could actually hear information and make observations.”

Asquith and Dobson are two of the developers behind LHCsound, a collaboration of physicists and musicians who translate data from the Large Hadron Collider into musical notes through a process called sonification. Their hope is that physicists could use sonified data to supplement traditional visual and numerical data from the machine, possibly picking up events with their ears that their eyes would miss.

The music-matching experiments are helping them develop a user-friendly graphical user interface (GUI, pronounced “gooey”) for their sonification process. The program will allow any experimenter, from a high school student to an LHC lead investigator, to input their own data and create a musical masterpiece.

Although LHCsound started in January 2010, the idea of turning data into audible sound is not new. Dobson gives the example of a Geiger counter, which emits a distinctive blip each time it detects radiation, as another audible indicator of changing data.

“The human ear is good at detecting subtle changes in sound,” he said. “It’s a survival instinct: a new sound turns up, our attention is drawn to it.”

The GUI is an extension of the original project, in which sounds are linked to variables in the data such as particle type, velocity or energy. Users will be able to upload their data as a list of numbers and make music by adjusting parameters such as instrument, tempo and pitch. For instance, a user might choose to map a particle’s energy to the pitch of a violin, creating an arpeggio as the energy rises and falls throughout the event.
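
For readers curious what such a mapping looks like in practice, here is a minimal Python sketch, using only the standard library, that maps an invented list of energy values onto pitch and writes the result as a short audio file. The example data, the pitch range and the linear mapping are illustrative assumptions, not LHCsound’s actual code.

    import math
    import struct
    import wave

    # Illustrative stand-in data: one "energy" value per step of an event.
    # Real LHC data and the LHCsound mappings are far richer; this only
    # sketches the idea of turning a data column into pitch.
    energies = [12.0, 35.5, 80.2, 140.7, 96.3, 44.1, 18.9]

    SAMPLE_RATE = 44100              # audio samples per second
    NOTE_SECONDS = 0.25              # length of each note; shrink it to raise the "tempo"
    LOW_HZ, HIGH_HZ = 220.0, 880.0   # pitch range the data is mapped onto

    lo, hi = min(energies), max(energies)

    def to_frequency(value):
        # Linear map from the data range onto the chosen pitch range.
        t = (value - lo) / (hi - lo) if hi > lo else 0.5
        return LOW_HZ + t * (HIGH_HZ - LOW_HZ)

    frames = []
    for e in energies:
        freq = to_frequency(e)
        for n in range(int(SAMPLE_RATE * NOTE_SECONDS)):
            sample = 0.3 * math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            frames.append(int(sample * 32767))

    with wave.open("sonified.wav", "w") as out:
        out.setnchannels(1)      # mono
        out.setsampwidth(2)      # 16-bit audio
        out.setframerate(SAMPLE_RATE)
        out.writeframes(struct.pack("<%dh" % len(frames), *frames))

Swapping the mapping target from pitch to note length or loudness is exactly the kind of choice the planned GUI is meant to expose as a simple setting.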

“What physicists are doing is like a treasure hunt or a detective story, and this could help them skim their data,” Dobson said. “The mere idea that you can listen to collisions could be another tool in their armory.”

Any experiment with a column of changing numbers can be sonified, even something as simple as a wooden car’s acceleration as it rolls down a ramp. The LHCsound crew received a grant from the Science and Technology Facilities Council to develop and present a workshop at middle schools in the U.K. so that students can sonify their own physics projects. Asquith said she would love to bring the outreach component to the U.S. as well.
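
The same recipe scales down to a classroom experiment. Under the textbook assumption of constant acceleration, the short sketch below generates the speed of a cart rolling down a ramp and maps each reading onto a five-note scale, so the melody climbs as the cart speeds up; the numbers and the scale are again illustrative choices, not part of any LHCsound workshop material.

    # Classroom-scale example: sonifying the speed of a cart on a ramp.
    # The kinematics are textbook (v = a * t); mapping speed onto a
    # pentatonic scale is an illustrative choice.
    ACCEL = 1.2        # assumed constant acceleration, m/s^2
    TIMESTEP = 0.1     # seconds between readings
    STEPS = 20

    speeds = [ACCEL * TIMESTEP * i for i in range(STEPS)]   # v = a * t

    SCALE = ["C", "D", "E", "G", "A"]                       # five notes, low to high
    top = max(speeds)
    notes = [SCALE[min(int(v / top * len(SCALE)), len(SCALE) - 1)] for v in speeds]

    print(" ".join(notes))   # a rising melody: C C C C D D D D E E ...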

“The teachers we’ve talked to have wonderful ideas for projects in their physics classrooms,” Asquith said. “The main benefit is that the kids are so excited about it.”

To develop sonification as a usable technology, Asquith said she needs more physicists involved in the project. “So far I’m the only one,” she said. “We need to find a proof of principle that we could use our ears to distinguish types of events. It’s potentially a large research project.”

In the meantime, the ideas from musicians, artists, educators and others keep coming as fast as particle collisions in the LHC.

“We’re getting more people interested all the time,” Asquith said. “I’ve permanently got a list of people who want me to do some work for them.”

Among the proposed projects: a dance choreographed to the sounds, a composition that could be performed on an actual instrument, and a real-time sonic display for ATLAS that would play the events as they happen in the collider.

“It all has the potential to become very exciting,” Asquith said.