Current trends in music technology (2014-2015)

As we reach the end of 2014, here’s a look at several key trends cutting across the technologies we use to create, distribute and consume music. I’ve called them immersive, crowd-everything, health-boosting and nostalgic.

1. Immersive

2014 has seen a continuation of a trend towards using music to create or enhance immersive environments. One beautiful example of immersive musical installations is Living Symphonies, which took place in four forests across England in summer 2014. Artists Daniel Jones and James Bulley painstakingly surveyed each forest site in the months before each installation, and used that information to develop a simulation that modeled the activity of the forest, second by second. The result was a musical piece where the ecosystem itself became the conductor, triggering various musical motifs representing various organisms within the forest site. A network of speakers hidden in the forest canopy and undergrowth played back the resultant music for visitors.
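The underlying idea – a simulation of the ecosystem acting as conductor, with organisms triggering musical motifs – can be sketched in a few lines. This is a toy illustration only: the species, motifs and activity model below are my own assumptions, not Jones and Bulley’s actual implementation.

```python
import random

# Toy "ecosystem as conductor": each simulated second, a set of
# organisms is active, and each active organism triggers its motif.
# Species names and motifs are illustrative assumptions.
MOTIFS = {
    "oak": ["C3", "E3", "G3"],    # slow, low drone
    "wren": ["G5", "A5", "B5"],   # quick, high phrase
    "fox": ["D4", "F4"],          # sparse, mid-range figure
}

def simulate_tick(rng):
    """Return the set of species active this second (toy model)."""
    return {species for species in MOTIFS if rng.random() < 0.5}

def conduct(seconds, seed=0):
    """Run the simulation and collect the motifs it triggers."""
    rng = random.Random(seed)
    score = []
    for _ in range(seconds):
        active = simulate_tick(rng)
        notes = [n for s in sorted(active) for n in MOTIFS[s]]
        score.append(notes)
    return score

score = conduct(5)
```

Seeding the random generator keeps each run reproducible; in the real installation the "randomness" came from live survey data about the forest itself.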

Many of us use music to create personal “bubbles” within which to navigate our daily lives; whilst travelling, to focus on work, to escape from noisy environments into our own selected soundscapes, and so on. The marriage of music and virtual reality (VR) or augmented reality (AR) is bringing about truly immersive experiences of sound and vision. Collider is a VR experience with full support for the Oculus Rift DK1 and DK2 head-mounted displays. It combines 3D immersion with raw infrared imagery to create a real-time, interactive environment of light and sound inspired by the Large Hadron Collider. In Oculus Rift mode, it uses Leap Motion’s experimental VR tracking, so that you can still see your hands in front of you as you navigate your virtual world. Funktronic Labs founder Eddie Lee, who co-created Collider with prominent Kyoto DJ Baiyon, describes the vision behind the project:

“Having a musical experience that is real-time interactive hasn’t really been explored yet. For example, if an artist wants to drop an album in VR, someone can really immerse themselves in the audio and visuals as well. In a way, that’s what we’re trying to do – it’s like a music video, but you progress yourself through with the music.”

J-pop star Kumi Koda claimed the world’s first virtual reality music video title in 2014 for “Dance in the Rain”, filmed in 360 degrees and viewable in VR. Since you can’t see it on standard YouTube, read Sam Byford’s review on The Verge. Also using Leap Motion technology, Japan’s Aliceffekt created Telekinetic, an album made entirely through 3D interaction and the first known album release to use the Leap Motion Controller.

Using gesture to create sound is not new technology; the theremin was patented in 1928 and Jean-Michel Jarre popularised the laser harp back in the 1980s. The trend, however, is becoming more mainstream and the technology ever more sophisticated. Imogen Heap’s “The Gloves” use a range of sensors to allow her not only to control and trigger pieces of music during on-stage performances, but also to free her from the computer screen whilst composing or mixing by mapping particular gestures to software commands. The project, launched in 2012, got its own Kickstarter campaign this year in a bid to make the gestural music creation technology more affordable.
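The core of a system like The Gloves is a mapping layer between recognised gestures and software commands. A minimal sketch of that idea follows; the gesture names and commands are hypothetical examples, not Heap’s actual bindings.

```python
# Minimal gesture-to-command dispatcher: a performer's recognised
# gestures are looked up and mapped to software actions, freeing
# them from the screen. All bindings here are illustrative.
from typing import Callable, Dict

class GestureMapper:
    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, gesture: str, command: Callable[[], str]) -> None:
        """Associate a recognised gesture with a command."""
        self._bindings[gesture] = command

    def handle(self, gesture: str) -> str:
        """Run the bound command, or report the gesture as unmapped."""
        action = self._bindings.get(gesture)
        return action() if action else "unmapped"

mapper = GestureMapper()
mapper.bind("fist_close", lambda: "record_start")
mapper.bind("palm_open", lambda: "record_stop")
mapper.bind("swipe_left", lambda: "undo")
```

In a real system the gesture strings would come from a sensor-fusion classifier and the commands would send MIDI or OSC messages to the music software, but the dispatch structure is the same.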

The theremin has also been futuristically re-envisioned by music technologist Milton Villeda, who has just successfully crowd-funded a hybrid theremin, synthesiser and visual controller which allows the user to create music and control video and light via hand gestures.

Photo of the M!lTone open-source synth and MIDI controller

Generative music isn’t limited to deliberate gesture, of course. Belfast-based Gawain Morrison and Shane McCourt, who came from an augmented reality background, created a responsive cinema project for SXSW in which the film responded to audience biodata (heart rate, galvanic skin response) by changing elements such as bass levels in the soundtrack and camera angles to amp up fear or otherwise manipulate emotion. (See New Scientist’s coverage of the project.)

From this project they developed the Sensum product, which measures emotional response to music and other media via galvanic skin response (minute changes in sweat levels) and other physiological feedback. This data is invaluable to music marketers and advertisers, as well as to any media creator interested in measuring emotional impact. (I did some work for Sensum back in 2012 and asked Imogen Heap if we could audience-test one of her music videos for her, using the Sensum gear; she agreed – read the results here.) Results of a Sensum/Goldsmiths University study into the use of music in advertising are soon to be published, and Sensum are currently collaborating on a project audience testing the emotional impact of musicals. Besides that, the technology can be used to generate music and visuals, using biophysical data as controllers and triggers.
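The general principle of using biophysical data as a controller can be sketched roughly: smooth a noisy sensor stream, then map the reading onto a mix parameter. The GSR range and the bass-gain mapping below are my own illustrative assumptions, not Sensum’s algorithm.

```python
# Sketch: turn a noisy galvanic-skin-response stream into a
# 0..1 bass-gain control. Ranges and mapping are assumptions.

def smooth(samples, alpha=0.3):
    """Exponential moving average to tame noisy sensor readings."""
    out, level = [], samples[0]
    for s in samples:
        level = alpha * s + (1 - alpha) * level
        out.append(level)
    return out

def arousal_to_bass_gain(gsr, lo=2.0, hi=10.0):
    """Map a GSR reading (assumed microsiemens range) to gain 0..1."""
    x = (gsr - lo) / (hi - lo)
    return max(0.0, min(1.0, x))

readings = [2.5, 3.0, 6.5, 9.0, 8.0]   # mock rising-arousal stream
gains = [arousal_to_bass_gain(g) for g in smooth(readings)]
```

As the mock audience’s arousal rises, the computed bass gain rises with it; a real system would stream this value to the mixer many times per second.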

Biometric beatboxing is a cool application of the technology: at last year’s futuristic Σsigma event in Tokyo, Ryo Fujimoto attached electrical sensors to his body and used a Leap Motion Controller to track his finger movements to create an interactive, live audio-visual performance.

But why stop at using our physical bodies to create music? Neil Harbisson, pioneering co-founder of the Cyborg Foundation, believes we should dream beyond the limitations of our current physical realities. Colour-blind from birth, Harbisson can now “hear” colour thanks to his “eyeborg” – a piece of technology combining a sensor which detects colour frequencies in front of him and a receiver chip installed in a bone in the back of his skull. With this technology, Harbisson’s brain has been synaptically remapped to recognise 360 distinct hues by the tones they produce – he claims he now dresses for the chords he wants to hear. “Biohacker” Rich Lee has had magnets embedded in his ears which act as internal headphones; the magnets pick up the signal sent from his music player to a wire coil worn around his neck.
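As a rough illustration of the hue-to-tone idea, the sketch below maps a colour’s hue onto an audible frequency. The one-octave range and the logarithmic spacing are assumptions for the example, not the eyeborg’s published scale.

```python
# Toy "sonochromatic" mapping: hue in degrees -> audible tone.
# Equal steps in perceived pitch are exponential steps in
# frequency, hence the logarithmic interpolation. The bounds
# below are illustrative assumptions.

F_LOW, F_HIGH = 220.0, 440.0   # assumed one-octave span (A3 to A4)

def hue_to_frequency(hue_degrees: float) -> float:
    """Map hue in [0, 360) onto [F_LOW, F_HIGH) logarithmically."""
    h = (hue_degrees % 360) / 360.0
    return F_LOW * (F_HIGH / F_LOW) ** h

red = hue_to_frequency(0)     # bottom of the range
cyan = hue_to_frequency(180)  # halfway up the octave in pitch
```

With 360 distinguishable hues, each degree step corresponds to a fixed pitch interval – which is consistent with Harbisson’s claim of hearing distinct tones per hue, even though his actual scale differs.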

This new trend of biohacking – embedding technology into the body to enhance sensory abilities – has been foreshadowed in science fiction for decades. Although the mainstream is still slow to catch up with wearable technology, people have long been used to the idea of implants and embedded technology for medical or cosmetic reasons, so it seems reasonable that in time the trend will become more commonplace. With any movement that aims to integrate technological developments directly into natural organisms or phenomena there are always ethical questions to be debated and legislation to be hammered out; it will be interesting to see how quickly and widely embedded technology becomes accepted.

Next: Crowd-everything >>
