Unlocking Big Data: Lessons Learned From The God Particle

It’s a puzzle wrapped in an enigma wrapped in a symphony. It’s the Higgs boson, the so-called God particle, the greatest physics find of the 21st century, turned into music.

Chamber music, to be exact.

Admittedly—and especially for fans of Pythagoras—this conversion is a little mind-blowing. But once you get beyond the cosmic significance, what’s equally interesting is that the resulting symphony—aptly titled “LHC Chamber Music” (with LHC being short for Large Hadron Collider, the particle accelerator that helped us find the Higgs)—gives us a window into the future of data visualization and creative innovation.

But first, the music.

To commemorate the 60th anniversary of CERN—the Swiss institute where the LHC is housed—scientists converted Higgs measurement data into two pieces of music—a piano composition and a full chamber orchestra symphony. The conversion, known as a “sonification,” involves assigning notes to numbers, with the numbers representing “particle collision events per unit of mass.”

In other words, every time the calculations spit out the number 25, that bit of data is converted to a middle C. Then 26 becomes a D, 27 an F, and so on.
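To make the mapping concrete, here is a minimal sketch in Python of the kind of value-to-note assignment described above. It is purely illustrative: it anchors the value 25 at middle C and walks up a plain C-major scale, whereas the actual CERN sonification used its own note assignments.

```python
# Hypothetical sketch of a data-to-note mapping ("sonification"), for illustration only.
# The value 25 is anchored at middle C (C4); larger values climb a C-major scale,
# crossing into a higher octave every seven steps.

SCALE = ["C", "D", "E", "F", "G", "A", "B"]  # one octave of a C-major scale
ANCHOR_VALUE = 25    # data value assigned to middle C
ANCHOR_OCTAVE = 4    # middle C is C4 in scientific pitch notation

def value_to_note(value: int) -> str:
    """Map an integer data value to a note name such as 'C4' or 'D5'."""
    steps = value - ANCHOR_VALUE
    octave = ANCHOR_OCTAVE + steps // len(SCALE)
    name = SCALE[steps % len(SCALE)]
    return f"{name}{octave}"

# Turn a (made-up) stream of "collision events per unit of mass" counts into notes.
measurements = [25, 26, 27, 30, 32, 25]
melody = [value_to_note(v) for v in measurements]
print(melody)  # ['C4', 'D4', 'E4', 'A4', 'C5', 'C4']
```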

This whole process works—meaning it produces something that sounds like music—because “harmonies in natural phenomena,” as the LHC Open Symphony blog recently pointed out, “are related to harmonies in music.”

At a macroscopic level, the purpose of the sonification was to give non-science types an intuitive sense of the vast complexity of the Higgs boson and, as physicist and the music’s composer Domenico Vicinanza said, “[to] be a metaphor for scientific collaboration; to demonstrate the vast and incredible effort these projects represent—often between hundreds of people across many different continents.”

In other words, the Higgs sonification is also a data visualization technique (in this case, data acoustification), meaning it gives us a different way to interact with huge amounts of information, a different way to try and detect novel patterns.

Why is this a big deal? Big data is the deal. As we all know, the modern world is awash in data. And while we’re starting to get better at utilizing this information, there’s still a very long way to go.

The problem is not pattern recognition. It turns out we humans are actually great at pattern recognition (which is why, for example, projects like Foldit are so successful). Our trouble starts with holding giant data sets in our heads—something we’re not very good at (and why computers beat humans at chess: better access to giant data sets allows for brute-force solutions).

Put differently, right now, the biggest hurdle to big data is that there is no user-friendly interface for big data. No way in for the common person.

Think about the ARPANET, the precursor to the Internet. Made operational in 1975, ARPANET was mostly text-based, complicated to navigate, and used mainly by scientists. All of this changed in 1993, when Marc Andreessen coauthored Mosaic—the first widely adopted graphical web browser and the Internet’s first user-friendly interface. Mosaic unlocked the Internet. By adding graphics and bringing the browser from Unix to Windows—the operating system then running nearly 80 percent of the computers in the world—Andreessen mainstreamed a technology developed for scientists, engineers, and the military. As a result, a worldwide grand total of twenty-six websites in early 1993 mushroomed into more than 10,000 sites by August 1995, then exploded into several million by the end of 1998.

Today, no similar interface exists for big data. Ask a data scientist for the best way to take advantage of the big data revolution, and the most frequent answer is “hire a data scientist.” (This is from personal experience; I’ve been asking that question for over a year now while researching my next book.)

If we all have to become data scientists to take advantage of big data, well, that strikes me as a fairly inefficient way forward.

But sonification offers one solution to the problem of representing big data sets in a way humans can comprehend. It’s a kind of user-friendly interface. As a result, one of the possibilities raised by the release of the Higgs symphony is that some listener might detect a novel pattern in the music, something the physicists involved have not noticed, something in the melody that hints at deeper structure in the universe. Given the strength of the human pattern-recognition system, this is not an impossibility.

To come at this from a different angle, I know of a number of teams working on novel ways to represent the stock market. One is trying to render the market as natural terrain, like snow-covered mountains. Why? Instead of turning on the computer to check how your stocks are performing, you could don virtual reality goggles and ski the stock market.

The idea is that bringing multiple sensory streams to bear on stock market data might (a) help us assimilate the data more quickly and (b) unlock hidden patterns within it.
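As a rough illustration of how such a terrain might be built, here is a small hypothetical sketch: it normalizes closing prices for a few made-up tickers into a heightmap that a rendering engine could extrude into ridges and valleys. None of the teams mentioned above have published their methods; this is just one plausible way to frame the mapping.

```python
# Hypothetical "market as terrain" sketch: rows are stocks, columns are trading
# days, and normalized prices become terrain heights. Tickers and prices are made up.
import numpy as np

prices = {                       # made-up closing prices over five days
    "AAA": [101, 103, 102, 108, 110],
    "BBB": [55, 54, 57, 56, 60],
    "CCC": [230, 228, 231, 240, 238],
}

def prices_to_heightmap(series: dict) -> np.ndarray:
    """Normalize each stock's prices to [0, 1] so peaks and valleys are comparable."""
    grid = np.array(list(series.values()), dtype=float)
    lo = grid.min(axis=1, keepdims=True)
    hi = grid.max(axis=1, keepdims=True)
    return (grid - lo) / (hi - lo)   # each row becomes one ridge line of the terrain

heightmap = prices_to_heightmap(prices)
print(heightmap.round(2))
# A VR renderer could extrude each row into a ridge: rising prices become peaks to ski.
```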

And this is nowhere near as weird as it sounds. Our subconscious is capable of astounding pattern detection. But the visual system is only one of myriad possible inputs to an information-processing system. Consider that fifty percent of your nerve endings are in your hands, feet, and face. Each of those nerve endings represents data-processing power. Right now, we’re only using visual information (numbers read off a screen) to analyze the stock market; engaging more senses unlocks more processing power, which means—quite possibly—better analysis.

And better data analysis leads, obviously, to better innovation.