The idea behind ‘Expanded Field Recording’ was to investigate how the experience of landscape and time can be intensified and expanded by working with VR and 3D technologies. The locations and areas of interest were former industrial, landfill and military sites: locations that have been repurposed and where the old function and associated memory have been rendered invisible or only indirectly observable. Their traces are barely noticeable without technological extensions. The concept behind Expanded Field Recording aims to make the complex history of such places experienceable.

First, we had to get acquainted with the tools. VR and 3D are driven by specific technologies in which every part of the chain plays an important role for the whole to work properly.

Equipment:

  1. Ambisonic microphone (Røde NT-SF1)
  2. 4-channel recorder (Sound Devices MixPre-6)
  3. Insta360 Pro camera
  4. Software (Insta360 Pro Stitcher, Waves Nx head tracker, Røde SoundField, Harpex, Reaper, GRM Tools, Waves 360° Surround Tools, Audioease, Abbey Road Studio 3)
  5. VR headset (Oculus Rift) for playback of audio and visuals

We started by getting familiar with the Ambisonic B-format. It is the industry standard for recording, mixing and playing back audio in a full spherical 360-degree soundfield, and the audio format most widely used in VR/AR and in Facebook and YouTube 360° videos. You can convert stereo and surround audio to B-format, mix B-format audio, and monitor on regular stereo headphones. B-format consists of four basic components: W, X, Y and Z. W is an omnidirectional polar pattern, containing all sounds in the sphere, coming from all directions at equal gain and phase. X is a figure-8 bidirectional polar pattern pointing forward. Y is a figure-8 bidirectional polar pattern pointing to the left. Z is a figure-8 bidirectional polar pattern pointing up.
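As a concrete illustration of how the four components combine, here is a minimal NumPy sketch (our own, not part of the residency toolchain) of a first-order ‘virtual microphone’ steered inside a B-format recording; channel normalization conventions (FuMa vs. AmbiX) are ignored for clarity:

```python
import numpy as np

def virtual_mic(w, x, y, z, azimuth, elevation, p=0.5):
    """Steer a first-order virtual microphone within a B-format signal.

    p controls the polar pattern: 0.0 = omni (pure W), 0.5 = cardioid,
    1.0 = figure-8. Azimuth/elevation are in radians; azimuth 0 is
    forward (X), positive azimuth turns left (Y), positive elevation up (Z).
    Normalization conventions are deliberately ignored in this sketch.
    """
    # Unit vector of the look direction.
    dx = np.cos(azimuth) * np.cos(elevation)
    dy = np.sin(azimuth) * np.cos(elevation)
    dz = np.sin(elevation)
    # Blend the omni component with the steered figure-8 components.
    return (1 - p) * w + p * (dx * x + dy * y + dz * z)
```

For a plane wave arriving from the front (equal signal in W and X), a cardioid pointed forward passes it at full gain, while the same cardioid pointed backward sits in its null, which is exactly the behaviour one monitors for when checking a B-format recording.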

During the residency we did a number of field tests: at Kunstfort Vijfhuizen (one of forty-two forts in the Defense Line of Amsterdam), the Volgermeerpolder (the largest waste disposal site in the Netherlands), URENCO (uranium enrichment facilities in Enschede), COVRA (long-term storage of radioactive waste) and Borsele (an operational nuclear reactor), all in the Netherlands, plus a field trip to Žirovski Vrh, a former uranium mine in Slovenia. During these field tests we carried out a series of practical experiments that helped us better understand how the equipment behaves and functions (and fails to function) in the field. They also helped us exchange knowledge, develop new ideas and gather information.

Expanding the knowledge.

In line with the concept of Expanded Field Recording, we recorded various sources of data simultaneously, capturing both inaudible and audible information (GPS, radioactivity, infrasound, electromagnetic radiation, atmospheric changes, hydrophone and audio field recordings) using various applicable detectors and microphones. We also recorded various formats of video and images (multichannel video and 360° footage).
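Combining such simultaneous streams for joint playback means putting them on a common clock. A minimal sketch of that idea (a hypothetical helper of our own, not the actual recording software), aligning each timestamped sensor reading to a shared timeline by sample-and-hold:

```python
import bisect

def align_to_timeline(timeline, stream):
    """For each point on a shared timeline, pick the most recent sensor
    reading at or before that moment (sample-and-hold alignment).

    `stream` is a list of (timestamp, value) pairs sorted by timestamp;
    timeline points that precede the first reading yield None.
    """
    times = [t for t, _ in stream]
    aligned = []
    for t in timeline:
        i = bisect.bisect_right(times, t) - 1
        aligned.append(stream[i][1] if i >= 0 else None)
    return aligned

# Hypothetical example: sparse Geiger readings resampled to one value per second.
geiger = [(0.4, 12), (2.1, 15), (3.7, 9)]
per_second = align_to_timeline([0, 1, 2, 3, 4], geiger)
```

The same alignment would apply to GPS fixes, EMF readings or atmospheric data, so that every layer can later be scrubbed against the Ambisonic recording and the 360° footage.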

I think we reached a starting point for what we wanted to explore: simultaneously recording various forms of data. The idea is that one can experience the ‘inaudible’ information, such as radioactivity, EMF and geophonic vibrations, together with the ‘audible’ information (reality as we hear it), such as Ambisonic field recordings of wildlife, everyday human activity and weather. This opened up many ideas and possibilities towards a VR ‘sensory device’ that adds an extra spatial and visual layer to the experience of field recording: an absolute Expanded Field Recording experience where one can zoom in and out within the different layers of visual and audible information and thus improve the VR audience experience.

One of the research questions was how to build a (sensory) device that can manage all this data in real time. We got to the point where we started prototyping and managed to build a software environment in which we could navigate in real time within the 360° footage recorded on the various field trips. Using Max 8 it is now possible to play back 4K video material at 60 fps, even in stereoscopic view. This allows for a whole new way of navigating and interacting in 360° and VR setups.
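The environment itself is a Max 8 patch, but the math behind navigating a 360° frame can be sketched independently: for a given yaw, pitch and field of view, each pixel of a virtual viewport is mapped to coordinates in the equirectangular source frame. A hedged NumPy sketch of that projection (function and parameter names are our own, not Max objects):

```python
import numpy as np

def equirect_sample_coords(width, height, yaw, pitch, fov, out_w=640, out_h=360):
    """Map a virtual viewport to pixel coordinates in an equirectangular frame.

    width/height: size of the 360° source frame; yaw/pitch/fov in radians.
    Returns (src_x, src_y) arrays of shape (out_h, out_w) suitable for
    remapping the source into the viewport.
    """
    # Ray directions for each viewport pixel (pinhole camera model,
    # x forward, y left, z up).
    fx = 0.5 * out_w / np.tan(0.5 * fov)
    u = np.arange(out_w) - 0.5 * out_w
    v = np.arange(out_h) - 0.5 * out_h
    uu, vv = np.meshgrid(u, v)
    x = np.full_like(uu, fx)
    y = uu
    z = -vv
    # Rotate the rays by pitch (about y) then yaw (about z).
    xp = x * np.cos(pitch) + z * np.sin(pitch)
    zp = -x * np.sin(pitch) + z * np.cos(pitch)
    xy = xp * np.cos(yaw) - y * np.sin(yaw)
    yy = xp * np.sin(yaw) + y * np.cos(yaw)
    # Convert ray directions to longitude/latitude, then to source pixels.
    lon = np.arctan2(yy, xy)
    lat = np.arctan2(zp, np.hypot(xy, yy))
    src_x = (lon / (2 * np.pi) + 0.5) * (width - 1)
    src_y = (0.5 - lat / np.pi) * (height - 1)
    return src_x, src_y
```

With yaw and pitch fed from a head tracker, recomputing (or GPU-evaluating) this mapping per frame is what makes real-time navigation inside the recorded footage possible.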

Excerpt here (password: EFR): https://vimeo.com/437860544

Additional research was done on triggering the data into modular synthesizers and digital signal processing applications. By feeding the machines the information harvested from our field trips, we let them lead us into more musical forms, harmonics and radioactivity-triggered rhythms. Sound material was analysed and edited with applications such as iZotope RX 7, GRM Tools and Logic Pro X.
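As an illustration of the ‘radioactivity-triggered rhythms’ idea, here is a minimal sketch (our own simplification, not the actual Max/modular patch) that turns per-second Geiger counts into gate events, which could then drive a synthesizer as triggers:

```python
def counts_to_gates(counts_per_second, threshold=3):
    """Turn per-second radioactivity counts into gate events (1 = trigger).

    Seconds with more counts than `threshold` fire a gate, so background
    radiation becomes an irregular, site-specific rhythm.
    """
    return [1 if c > threshold else 0 for c in counts_per_second]

# Hypothetical readings from a Geiger counter, one value per second.
counts = [1, 5, 2, 0, 7, 3, 4, 1]
gates = counts_to_gates(counts)
```

Because the underlying counts are Poisson-distributed, the resulting pattern never repeats exactly, and raising or lowering the threshold changes the density of the rhythm.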

As Covid-19 hit, we were forced to look into online platforms, and successful experiments were conducted with multichannel online live streaming. A Safari browser, together with applications such as Audiomovers, Twitch and BlackHole, opened up the possibility of multichannel live broadcasts or installations from high-risk environments (such as radioactive, toxic or pandemic areas) into gallery spaces or online exhibitions.

Overall, the experience of working with and researching 3D and VR has been very positive, although we have only scratched the surface of these new technologies. Of course, there are a lot of technical challenges that one has to overcome and find solutions for, as is always the case with new technologies. For example, one has to plan carefully what to do at every location, as the equipment does not allow an ‘on the fly’ style of operation due to the set-up and calibration of software. Weather is another issue, as the cameras are very sensitive to any type of rough weather. These issues will most likely change in the future as the equipment gets faster and more rugged editions appear. The workflow is challenging as well: all the parts in the chain (visual and audible) have to connect in order to work, both in hardware and software. There is even some de-learning to do to get away from the conventional way of producing audio-visual artwork.

Through the Expanded Field Recording project, I have investigated how I can expand my current practice of sound art and field recording to the visual domain through experiments with VR and 3D technologies. It allowed me to experiment with a lot of new technologies, both in the lab and through field experiments, and we have developed some new technological applications that I will be incorporating in upcoming projects.

The necessity for Expanded Field Recording stemmed from an interest in investigating how the experience of landscape and time can be intensified and expanded through VR and 3D technologies. Our joint research enhanced our understanding of how these technologies could be applied within our practices, and of which hardware and software are essential. We aimed to personalise and fine-tune all methods to attune them to our own concepts of the inaudible and the unseen, using various applicable detectors and microphones as well as multichannel video and 360° footage. We did a number of practical experiments both in the studio and in the field, in the Netherlands and beyond. We interviewed experts in the field and attended VR workshops and seminars on immersive audio. The knowledge we acquired was shared among ourselves and with a wider community in the form of a presentation, through spin-offs and through online channels. All of this prepared fertile ground for future collaborations.
This research project produced a lot of new technical knowledge and unearthed artistic themes that we want to explore further. It also fuelled many intellectual discussions on how, especially in the midst of the Covid-19 pandemic, we can continue to present new work, how to make it interactive and how to make it possible for the audience to access such work from a distance, for example in an online environment. We have decided that we want to continue working together on the further production of a sensory device that will lead to the presentation of a new work. As a next step, we will therefore transform our plans and ideas into a project plan and start a fundraising mission to realize an expanded field recording VR installation in the near future, based on the research and experience gained during Expanded Field Recording.

With the project Expanded Field Recording, BJ Nilsen investigates, together with SML and Telcosystems, how the experience of landscape and time can be intensified and expanded by working with VR and 3D technologies. The locations and landscapes of interest are former industrial sites such as landfills and military sites: sites that have been repurposed and where the old function and the associated history are rendered invisible or only indirectly observable. Their traces are barely noticeable without technological extensions. Expanded Field Recording aims to make the complex history of such places experienceable.

SML presentation at De Player, Nov 28, 2019.

Supported by Creative Industries Fund NL