Drowned in Sound: the new (Virtual) Reality of immersive music

As a follow-up to my end-of-year post on music trends, here’s another look at what’s afoot in music technology – specifically, virtual reality. Hyped as the next big thing a couple of decades ago, VR is in the ascendancy again, but in a very different technological and socially connected landscape. Is it finally about to come into its own?

Back to the Future: Virtual Reality in the 1990s

In 1992, popular PBS computing show The Computer Chronicles broadcast an episode dedicated to Virtual Reality (VR). The show featured a clip from the world’s first ever virtual reality trade show, hosted by AI Expert in Santa Clara, California, in which Crystal River Engineering founder Scott Foster talked about his company’s focus on audio, at a time when most people in the VR space were focused on visuals:

Since the visual systems that we’re working with today are not very good – it’s very difficult to build a very precise stereo imagery system – but we can build a very precise audio imagery system. And we’re closer to closing the loop all the way to perfection – that is, building systems that are almost mistakable for real life.

Foster founded Crystal River Engineering in 1989, after NASA awarded him a contract to create the audio component of a VR-based training simulator for astronauts. Foster and his team were pioneers in HRTF-based, real-time binaural sound processing software and hardware, as evidenced by the long list of publications on Foster’s NASA personnel profile.

Later in the show, host Stewart Cheifet asked technology journalist Linda Jacobson about the state of the technology. Her reply was that it was moving “out of the labs and into our laps, almost”. Whilst VR is not yet as mainstream as its 1990s evangelists thought it might be by now, interest in the technology is certainly on the rise again, thanks in part to decreasing technology costs, the crowd-funding and open source movements, and the rise of social networking. Facebook’s 2014 acquisition of Oculus VR for $2 billion, the $542 million investment round in Magic Leap that Google led the same year, the Samsung Gear VR (developed in collaboration with Oculus VR), Sony’s upcoming Project Morpheus VR headset and Google Cardboard all point to an upswing in the market for augmented reality (AR) and VR technologies.

Immersive music videos

Whilst current AR and VR innovations in the entertainment sphere have tended to centre around gaming, the technologies have been enthusiastically embraced by some pioneer music artists and big music industry players. J-pop star Kumi Koda claimed the world’s first virtual reality music video title in 2014 for Dance in the Rain, filmed in 360 degrees and viewable in VR.

Writing about the experience in The Verge, Sam Byford described the technology as “the perfect fit for the overwrought futurism of the average J-pop ballad”. By contrast, iconic music innovator Björk’s recent VR music video for her song Stonemilker, made for Oculus Rift and directed by Andrew Thomas Huang, takes the technology in an entirely different direction. Here, it’s used to let the music deliver the emotional punch, creating a deceptively simple, deeply intimate experience that blends 360-degree visuals with a 3D binaural soundtrack powered by Edinburgh company Two Big Ears’ 3DCeption audio technology. Two Big Ears founder Varun Nair described the process to Road to VR:

Bjork and her mixing engineer (Chris Pike, who is senior audio R&D engineer at the BBC) remixed the song specifically for VR and our engine. They could position different instruments at any point in space at a very high resolution and surround the listener with her music and voice. On the mobile device, our engine takes all that information and recreates a binaural mix in real-time based on head tracking data. She’s stripped down the mix to reflect the visuals (it was shot on a beach in Iceland).
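
That pipeline – positional stems plus head-tracking data feeding a real-time binaural renderer – can be sketched in a few lines. The example below is not Two Big Ears’ 3DCeption engine, whose internals aren’t public; it’s a minimal, hypothetical illustration of the same general idea using the browser’s standard Web Audio API, which offers HRTF panning out of the box:

```typescript
// Minimal sketch of head-tracked binaural playback using the standard Web Audio API.
// Illustration only – not Two Big Ears' 3DCeption engine.

const audioCtx = new AudioContext();

// Place one instrument stem at a fixed point in the space around the listener.
function playSpatialStem(buffer: AudioBuffer, x: number, y: number, z: number): void {
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;

  const panner = audioCtx.createPanner();
  panner.panningModel = 'HRTF';      // binaural rendering via head-related transfer functions
  panner.distanceModel = 'inverse';
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.positionZ.value = z;

  source.connect(panner).connect(audioCtx.destination);
  source.start();
}

// Called whenever new head-tracking data arrives (e.g. yaw from a headset or a
// phone's gyroscope); the binaural mix is re-rendered as the head turns.
function updateHeadOrientation(yawRadians: number): void {
  const listener = audioCtx.listener;
  listener.forwardX.value = Math.sin(yawRadians);
  listener.forwardY.value = 0;
  listener.forwardZ.value = -Math.cos(yawRadians);
  listener.upX.value = 0;
  listener.upY.value = 1;
  listener.upZ.value = 0;
}
```

The principle is the same whether the orientation data comes from an Oculus headset or a phone slotted into a Cardboard-style viewer: the sources stay fixed in the virtual space, and only the listener moves.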

Björk herself shared the recording process with fans, on her Facebook page:

i had recorded the strings with a clip on mike on each instrument . we have made a different mix where we have fanned this in an intimate circle around the listener .

so as you watch this in the virtual reality headset it will be as if you are on that beach and with the 30 players sitting in a circle tightly around you [sic]

LA company VRSE.works and VFX specialist Digital Domain created a virtual reality world from Huang’s 360-degree film footage; UK-based Third Space Agency (3SA) then worked with digital production and VR specialists Rewind to create the final VR presentation. Rewind’s Dave Black reported that people had been moved to tears by the music video.

Speaking about the capabilities of VR in an interview for Fast Company, Björk said, “It’s almost more intimate than real life. It also has this crazy panoramic quality. I think it’s really exciting”. The artist relates how she had been discussing a 360-degree camera’s potential for intimacy with director Andrew Huang when he suggested they take it to the beach where Stonemilker was written. The idea, she says, resonated with her because “that location has a beautiful 360 panoramic view which matches the cyclical fugue like movement in the song”. This points to possible implications of VR and AR technologies for embodied cognition and the way audiences interact with music. Leman suggests that moving to music affects the listener’s emotional response, increasing positive valence and arousal, by mirroring the affective content of the music through corporeal articulations¹. If the impression of 360-degree movement afforded by Stonemilker’s panoramic footage matched the cyclical movement in the song’s structure, then perhaps viewers were able to experience a sense of embodied empathy even in a passive (i.e. non-moving) viewing state, and this may have been partly responsible for the intense emotional response to the video. If so, VR and AR audiovisual experiences have the potential to create much deeper emotional connections between the viewer, the music and the performer.

Purchasers of Björk’s album Vulnicura at Rough Trade East in London were able to watch the video for Stonemilker using a headset that turned a smartphone into a VR device. The VR video was also viewable at Rough Trade New York until late March, and currently features in a Björk exhibition at MoMA, running until June 7th. Update: it’s now available on YouTube.

The MoMA exhibition includes Björk’s earlier innovation, the music app Biophilia, which was the first app in MoMA’s collection. When asked whether she might make a full album using the VR technology, as a follow-up to the app album, Björk replied,

I only did that album because I felt like I had content that made sense, that could relate to the technology. It can’t just be working with the gadget for the sake of the gadget. But also it’s about budgets. You can do apps cheaply. Apps was kind of punk, actually. It was like starting a punk band again. Filming for Oculus Rift is not.

Indeed, AR and VR are less punk than they are “cyberpunk”: a 1980s subgenre of science fiction featuring artificial intelligence, cybernetics, rapid technological change, big data and mega-corporations. Oculus chief scientist Michael Abrash credits the seminal cyberpunk novel Snow Crash with setting him on the path that led him to Oculus.

In any case, it may be that we are seeing the early days of VR and AR music videos becoming mainstream. Letting fans interact with an artist’s work is a growing trend, enabled in ever more creative ways by new technologies. New York musician Azealia Banks released an interactive video built for Chrome earlier this year: her ‘mirror’ video for Wallace lets the viewer control the movements of Banks’ face with his or her webcam.
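
The webcam side of this kind of interaction relies on standard browser capabilities rather than anything exotic. As a rough, hypothetical sketch (not the actual Wallace implementation, which responds to the viewer’s movements), a page can capture the camera with getUserMedia and reduce each frame to a simple control value:

```typescript
// Hypothetical sketch: turn the webcam feed into a single control signal.
// Not the actual Wallace code – just the standard getUserMedia/canvas approach.

async function startWebcamControl(onValue: (value: number) => void): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement('video');
  video.srcObject = stream;
  await video.play();

  const canvas = document.createElement('canvas');
  canvas.width = 64;
  canvas.height = 48;
  const ctx2d = canvas.getContext('2d')!;

  // Sample the feed a few times a second and reduce it to average brightness (0..1),
  // which could then drive a parameter of the video playing on the page.
  setInterval(() => {
    ctx2d.drawImage(video, 0, 0, canvas.width, canvas.height);
    const pixels = ctx2d.getImageData(0, 0, canvas.width, canvas.height).data;
    let total = 0;
    for (let i = 0; i < pixels.length; i += 4) {
      total += (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
    }
    onValue(total / (pixels.length / 4) / 255);
  }, 200);
}
```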

Chris Milk, founder of VRSE.works (the studio behind Björk and Huang’s Stonemilker experience), had previously collaborated with Google’s Aaron Koblin on the award-winning Wilderness Downtown interactive music video, a Chrome project with Arcade Fire’s We Used To Wait as its soundtrack. Koblin went on to work with director Vincent Morisset on Arcade Fire’s 2013 interactive experience Reflektor, which lets the viewer create various digital effects on screen using their webcam, a mobile phone (via its gyroscope data), a mouse or a tablet. (Watch it at justareflektor.com.) In a Rolling Stone interview, Koblin said of the project’s aim,

With this film, we wanted the user to participate in the experience. And rather than click the mouse or type on the computer, we wanted the interaction to be both unique and human. That’s why we incorporated the phone and webcam as input devices. It lets the viewer participate in the music video, and do so in a novel way using gestures.
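
The “phone as input device” part of this is readily available to web developers: a browser page can read a handset’s gyroscope-derived orientation through the standard DeviceOrientationEvent. The snippet below is a generic, hypothetical illustration of that mechanism, not Reflektor’s actual code (which pairs a phone with the desktop experience):

```typescript
// Hypothetical sketch: read the phone's orientation in the browser and map it
// onto a visual effect. Not Reflektor's actual implementation.

function watchOrientation(onChange: (beta: number, gamma: number) => void): void {
  window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
    // beta: front-back tilt in degrees, gamma: left-right tilt in degrees
    onChange(event.beta ?? 0, event.gamma ?? 0);
  });
}

// Example: tilt the phone to push a highlight element around the page.
watchOrientation((beta, gamma) => {
  const highlight = document.getElementById('highlight');
  if (highlight) {
    highlight.style.transform = `translate(${gamma * 4}px, ${beta * 4}px)`;
  }
});
```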

The Reflektor video is split into two halves, with viewers encouraged to interact in the first half, and to enjoy the second half – marked by the main character breaking a mirror to leave her world and enter ours – in the traditional fashion. “When she breaks the mirror, there was this idea of getting back to something more real and grounded,” said director Morisset, in an interview for Creative Review. This decision highlights one key issue surrounding interactive music videos, including VR videos, namely the risk of the musical content being overshadowed by what could be seen as gimmickry or glossy marketing.

Next: Beyond the front row >>
