 

Sonna
Your Seventh Sense

Sonna is an augmented audio reality that enhances environmental perception through data sonification—turning streams of data into sound. Our technology consists of two main components: a network radio that transmits open data streams as sound, and a bone-conducting headset that projects this three-dimensional soundscape directly into your head. In essence, Sonna is a tool for experiencing what lies beyond our realm of perception, whether that is something outside our field of vision or a signal we are unable to naturally process.
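The core idea—sonification—can be sketched in a few lines: each value in a data stream is mapped to a pitch and rendered as a short tone. This is a minimal illustration, not Sonna's actual signal chain; the frequency range, note length, and sample rate below are arbitrary assumptions.

```python
import math

def sonify(values, f_min=220.0, f_max=880.0, sample_rate=8000, note_s=0.25):
    """Map each data point linearly onto a pitch range and render
    it as a short sine tone. Parameters are illustrative choices."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for flat streams
    samples = []
    for v in values:
        freq = f_min + (v - lo) / span * (f_max - f_min)  # linear pitch mapping
        n = int(sample_rate * note_s)
        samples.extend(math.sin(2 * math.pi * freq * i / sample_rate)
                       for i in range(n))
    return samples

# A rising data stream becomes a rising sequence of tones.
tones = sonify([1, 3, 7, 12])
```

Any monotonic mapping works; a linear one keeps the example easy to follow.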

Credits
In collaboration with Allison Rowe & Alexandre Kitching
Completed at RCA & Imperial College 

Press
Featured on TechRadar & interviewed by Sky News
 

 

Applications

We envision Sonna's applications across three main categories: enhanced mobility (such as an intuitive navigation aid for the visually impaired), enhanced everyday life (for instance, an immersive addition to video games), and enhanced performance (for example, giving cyclists an intuitive sense of their speed and position relative to their competitors).

 

Testing with the Prototype

We built a functioning prototype of the headset, housing an array of bone-conducting transducers that project a three-dimensional soundscape directly into the user's head. Through a series of navigation demos, we created an entirely new and highly intuitive channel of comprehension while leaving the user's other senses fully available.
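Steering a sound around the head with a transducer array comes down to a panning law: each transducer is weighted by how close it is to the source direction, then the weights are normalised for constant power. The four-transducer layout and angles below are assumptions for illustration, not the prototype's actual hardware layout.

```python
import math

# Assumed layout: four transducers around the head, angles in degrees
# clockwise from straight ahead (illustrative, not the real prototype).
TRANSDUCERS = {"front": 0, "right": 90, "back": 180, "left": 270}

def transducer_gains(source_deg):
    """Weight each transducer by angular closeness to the source
    direction, then normalise so total power is constant."""
    weights = {}
    for name, angle in TRANSDUCERS.items():
        diff = abs((source_deg - angle + 180) % 360 - 180)  # shortest angular distance
        weights[name] = max(0.0, 1.0 - diff / 90.0)  # only transducers within 90 degrees
    norm = math.sqrt(sum(w * w for w in weights.values())) or 1.0
    return {n: w / norm for n, w in weights.items()}

gains = transducer_gains(45)  # a source at front-right
```

A source at 45° excites the front and right transducers equally, which is what makes the direction readable to the wearer.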


Process

Sonna began with our inquiry into the future of superheroes—exploring what it meant to augment humans by enhancing our senses.

Our hearing's frequency response is a thousand times more accurate than our vision's, yet our sense of hearing is heavily underutilized. At any given moment, countless signals pass us by unnoticed because our ears can access only a small fraction of those frequencies. We saw potential in expanding our perception beyond our physical reach by creating a new sensory network, where users become sources of input and form a network of "eyes and ears". They can then tap into whatever they deem important, whether that is listening for Wi-Fi, locating a friend, or tracking a train's movements.

Our explorations ranged from recording and analyzing the sounds of our internal organs, to experimenting with Max/MSP and Kinect, to streaming data from our phones, to tapping into transit systems' open APIs and sonifying streams of data that otherwise exist only on screens and in numbers.
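Sonifying a transit feed reduces to mapping each vehicle's geometry onto sound parameters—for instance, bearing onto pan and distance onto loudness. The payload shape below is hypothetical (real transit APIs differ), and the mapping is a sketch of the idea rather than Sonna's implementation.

```python
import json

# Hypothetical feed shape; real transit APIs return different fields.
sample = json.loads('''[
  {"id": "train_42", "bearing": 135, "distance_m": 800},
  {"id": "train_07", "bearing": 310, "distance_m": 250}
]''')

def to_sound_params(vehicle, max_dist=2000.0):
    """Map a vehicle's bearing to stereo pan (-1 left .. +1 right,
    with +/-180 behind) and its distance to loudness (nearer = louder)."""
    pan = ((vehicle["bearing"] + 180) % 360 - 180) / 180.0
    gain = max(0.0, 1.0 - vehicle["distance_m"] / max_dist)
    return {"id": vehicle["id"], "pan": pan, "gain": gain}

events = [to_sound_params(v) for v in sample]
```

Each event can then drive a tone generator like the pitch-mapping sketch above, so an approaching train literally gets louder in the ear it is approaching from.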
