Procedural audio & interactive scenes

Workshops & Masterclasses (PM - GROUP #2)

Thursday, June 2, 2016 - 15:00

The limitless imagination of world builders now invites us into living environments where we can move about and live through powerful experiences. Real-time world building requires adapting not only the creation tools but also the compositional approaches.

Building credible audio and visual spaces that adapt instantly to the position, movements and behaviours of visitors will be the focus of these workshops on procedural audio and immersive scene construction.

Workshop list:

MDA: OPEN STANDARD FOR IMMERSIVE AUDIO CONTENT
+ SATIE: DENSE REAL-TIME AUDIO SCENE RENDERING ENVIRONMENT

Part 1: MDA (Multi-Dimensional Audio) is an open and future-proof multi-channel audio content production and interchange format designed for the creation, archiving and distribution of immersive content. It extends legacy multi-channel audio formats to support three-dimensional sound field encoding and the optional addition of audio object waveforms accompanied by positional rendering metadata. This new paradigm breaks the constraints that tie the creation format to the playback configuration in traditional workflows, while enabling a natural immersive listening experience in movies, VR and games. In this session, we’ll review the principles and technology components of MDA object-based, configuration-agnostic spatial audio production and rendering, and how they enable an “author once, play everywhere” approach to feature film soundtrack production for the theater, the home and mobile devices.
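The “author once, play everywhere” idea can be illustrated with a toy object renderer: the same object position metadata is turned into loudspeaker gains only at playback time, for whatever layout is present. The sketch below is purely illustrative (a simple nearest-pair crossfade, not MDA’s actual rendering algorithm):

```python
import math

def render_gains(obj_azimuth, speaker_azimuths):
    """Toy object renderer: spread an object's gain over the two
    loudspeakers nearest to its azimuth (all angles in radians).

    The same object metadata works for any layout passed in; the
    playback configuration is only known at render time.
    """
    def ang_dist(a, b):
        d = abs(a - b) % (2 * math.pi)
        return min(d, 2 * math.pi - d)

    dists = [ang_dist(obj_azimuth, s) for s in speaker_azimuths]
    order = sorted(range(len(speaker_azimuths)), key=lambda i: dists[i])
    a, b = order[0], order[1]
    total = dists[a] + dists[b]
    gains = [0.0] * len(speaker_azimuths)
    if total == 0.0:
        gains[a] = 1.0
    else:
        # Linear crossfade between the two nearest speakers
        gains[a] = dists[b] / total
        gains[b] = dists[a] / total
    return gains

stereo = [math.radians(-30), math.radians(30)]
quad = [math.radians(a) for a in (-45, 45, 135, -135)]
# The same object position renders to either layout:
g_stereo = render_gains(0.0, stereo)  # front-center: equal gain on both speakers
g_quad = render_gains(0.0, quad)      # energy on the two front speakers only
```

Here the stereo and quad layouts are known only when `render_gains` is called; the object’s metadata never changes, which is the point of decoupling the creation format from the playback configuration.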

By Jean-Marc Jot (US)
DTS / MDA

Part 2: Recent advances in computing make it possible to scale real-time 3D virtual audio scenes to hundreds of simultaneous sound sources and large numbers of audio outputs. Our Spatial Audio Toolkit for Immersive Environments (SATIE) allows us to render these dense audio scenes to large multi-channel loudspeaker systems (e.g. 32 or more channels) in real time.

SATIE offers highly efficient low-level computation and is designed for scalability: dependencies between nodes in the DSP graph are minimised so audio can be computed in parallel, sound objects can be controlled in groups, and geometry computation is load-balanced, reducing the number of messages needed to control large numbers of sound sources simultaneously. As an audio rendering process, SATIE uses OSC-based protocols that interface with any software environment to provide real-time audio scene rendering.

We will present the SATIE core, focusing on implementation and performance, followed by two examples, one with Unity3D and one with Blender. The presentations will feature use-case scenarios accompanied by demonstrations, including, among others, a novel "sonic depth of field" effect we recently observed.
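Because SATIE is driven over OSC, any environment that can emit OSC datagrams can control a scene. The sketch below hand-encodes minimal OSC messages in plain Python; the `/satie/...` addresses are hypothetical placeholders for illustration, not SATIE's actual message names:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a simple OSC message (int, float and string arguments only)."""
    msg = osc_pad(address.encode())
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        else:
            tags += "s"
            payload += osc_pad(str(a).encode())
    return msg + osc_pad(tags.encode()) + payload

# Hypothetical scene control: create a source, then move it in 3D space.
create = osc_message("/satie/scene/createSource", "src1")
move = osc_message("/satie/source/position", "src1", 1.0, 0.5, -2.0)

# The datagrams would then be sent over UDP, e.g.:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(move, ("127.0.0.1", 18032))  # port number is an assumption
```

In practice one would use an existing OSC library, but hand-encoding shows why the protocol interfaces so easily with any host environment: a message is just an address, a type-tag string and packed arguments.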

By Nicolas Bouillot, Zack Settel and Michal Seta (CA)
Society for Arts and Technology

CREATIVE CODING WITH VVVV + MODULAR SYSTEM & SPACE / ANALYSIS, SPATIALIZATION & VISUAL SOUND REPRESENTATION

Part 1: This workshop will revolve around the creation of self-generated, immersive, real-time content with vvvv. Approaching the subject from technical, mapping, VR and creative points of view, the artist will present his step-by-step creative workflow, going over, among other things, the use of shaders and other generative algorithms. Interfacing with different audio software and interaction design will also be discussed. vvvv is a hybrid graphical/textual real-time programming environment for large multimedia installations.

By Guillaume Pouchoux - Desaxismundi (FR)
desaxismundi.blogspot.ca

Part 2: This workshop will present the system on which the sig.int ambisonic A/V performance is based. After describing the modular synthesis system, the only sound source in this project, Julien Bayle will present the different strategies he considered for spatializing the sound, and why he wanted to include space directly as one of the elements of the synthesis itself. He will then describe how all of the sound sources and the ambisonic spatialization system in Max/MSP are linked with the HOALibrary developed by the CICM (Centre de recherche Informatique et Création Musicale).

By Julien Bayle (FR)
julienbayle.net

RECURSIVE AND EMERGENT SYSTEMS AS SPATIAL VIRTUAL INSTRUMENTS FOR PROCEDURAL COMPUTER MUSIC USING
GAME ENGINES

This workshop will provide hands-on experience with the use of game-engine technologies for the real-time manipulation and sonification of simulated spatial and kinematic models for procedural audio (non-conventional 3D virtual instruments) in the context of electroacoustic music composition. These models are based on emergent and recursive phenomena that minimise visual information and maximise the exploration of aural space through gesture and motion. Spatial data mapping strategies in Unity/C# and SuperCollider will be discussed.

We recommend bringing a laptop with Unity 5 (free Personal Edition), SuperCollider (with sc3-plugins) and the UnityOSC library (available at github.com/jorgegarcia/UnityOSC) installed.
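As a taste of the kind of spatial data mapping discussed, here is a hedged sketch (not the workshop's actual code) of a listener-relative mapping one might compute in the game engine before sending values to SuperCollider over OSC: inverse-distance gain plus horizontal-plane azimuth.

```python
import math

def spatial_params(source, listener, ref_dist=1.0, rolloff=1.0):
    """Map a 3D source position to (azimuth_radians, gain) relative to a listener.

    Inverse-distance gain and a horizontal azimuth are a common minimal
    mapping before handing values to a spatializer; the parameter names
    here are illustrative.
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dz = source[2] - listener[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Inside the reference distance the gain is clamped to 1.0
    gain = ref_dist / (ref_dist + rolloff * max(dist - ref_dist, 0.0))
    azimuth = math.atan2(dx, dz)  # 0 rad = straight ahead (+z)
    return azimuth, gain

# A source two units to the listener's right:
az, g = spatial_params((2.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

The resulting pair would typically be streamed each frame as an OSC message and mapped to synth parameters on the SuperCollider side.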

By Ignacio Pecino (ES)
Recursive Arts

MODERN INTERACTIVE AUDIO CHALLENGES: FROM PROCEDURAL AUDIO TO VIRTUAL REALITY
The field of sound is changing drastically. From sound creation to playback, the field is booming and offers fascinating new prospects. Through two examples, we will see how sound creation is more alive than ever: first, with a specific approach to procedural sound, we will see how to generate sounds with realistic behaviours; then we will examine how to create immersive interactive atmospheres, referring to our Notes on Blindness project, recent winner of the Storyscapes Award at the Tribeca Film Festival.

By Amaury La Burthe (FR)
Audio Gaming

IN-VR AUTHORING TOOLS AND PROCESSES
This workshop explores the tools and processes of real-time immersive in-VR authoring and visualization. It uses the HTC Vive platform to demonstrate the collaborative capabilities of 1) the Unreal VR Editor and 2) Google’s Tilt Brush, as well as the Autodesk authoring workflow from 3ds Max and Revit to VR using the Stingray real-time engine. Participants will test the systems and discuss their value for the creation of immersive experiences.

By Jean-Luc Labelle (CA)
MTLVR

DESIGNING AND INTEGRATING PROCEDURAL/INTERACTIVE AUDIO CONTENT IN UNITY WITH PURE DATA/HEAVY
This workshop takes you through the process of creating interactive audio content that can easily be integrated into Unity and manipulated by your game to create engaging experiences, especially in virtual reality.

By Joe White (UK)
Enzien Audio


GET INDIVIDUAL WORKSHOP TICKETS
MAX OF 2 CHOICES BETWEEN THESE 6 WORKSHOPS (1 AM WORKSHOP, 1 PM WORKSHOP)
WORKSHOPS ARE PRESENTED TWICE PER DAY: 10 AM + 3 PM

Each pass holder (5-day or 1-day pass) will choose 2 workshops from the list.

Instructors

Jean-Marc Jot (US)
Nicolas Bouillot (CA)
Zack Settel (CA)
Michal Seta (CA)
Desaxismundi (FR)
Julien Bayle (FR)
Ignacio Pecino (ES)
Amaury La Burthe (FR)
Jean-Luc Labelle (CA)
Joe White (UK)