Nuno created a soundscape for each of the 19 photographs in the exhibition, and when the photo’s QR code is scanned in the app, the relevant soundscape is played back.
We also went one crucial step further and made each of the soundscapes reactive, adding layers of augmented reality into the mix. Some elements of the soundscapes can only be heard in certain parts of the world. Other soundscapes react to the time of day, meaning they sound different in the morning than they do in the evening. Every soundscape is also treated with a layer of real-time audio from the listener's current environment: ambient sounds from the listener's immediate surroundings, captured via the phone's built-in microphone, are woven into the music, creating a completely unique experience on each listen.
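The reactive behaviour described above could be sketched roughly as follows. This is an illustrative Python example, not the app's actual implementation: the layer names, gain values, and geofencing flag are all assumptions made to show how per-layer gains might be driven by time of day and location, and how live microphone audio could be blended into a pre-composed track.

```python
# Illustrative sketch only -- layer names, gains, and thresholds are
# hypothetical, not taken from the actual Project Paperclip app.

def layer_gains(hour: int, in_trigger_zone: bool) -> dict:
    """Return per-layer gains (0.0 to 1.0) for one soundscape."""
    morning = 6 <= hour < 12
    return {
        "base": 1.0,                                        # always audible
        "morning_birds": 0.8 if morning else 0.0,           # time-of-day layer
        "location_motif": 0.6 if in_trigger_zone else 0.0,  # geofenced layer
    }

def mix(soundscape: list, mic: list, mic_gain: float = 0.3) -> list:
    """Weave ambient microphone samples into the soundscape, clamped to [-1, 1]."""
    return [max(-1.0, min(1.0, s + mic_gain * m))
            for s, m in zip(soundscape, mic)]
```

In practice the same idea would run per audio buffer on the phone, with the gains cross-faded smoothly rather than switched instantly.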
Find more information and test the concept at: www.discloseprojectpaperclip.com
Read some great reviews about Project Paperclip here.