Holding on to her elbow a little too tightly, I was guided blindfolded through a bustling area by Linda Chandler of Liquid Real Estate Innovation, as part of an immersive experience for Microsoft’s Soundscape app.
Soundscape is a research project, originally developed in partnership with Guide Dogs UK, that explores how innovative audio-based technology can help people, particularly those with blindness or low vision, build a richer awareness of their surroundings through means other than vision.
Watch the video below to see Soundscape in action.
Liquid REI, which is supporting the growth of the app, is an organisation set up to drive change, with a vision to create a real estate sector that better serves the world’s needs by embracing the possibilities of digital transformation.
I was first led blindfolded without the Soundscape app, which meant I could hear an overwhelming amount of noise and had no idea where I was being led or what was around me.
However, once I put the headphones on, the app greeted me with 3D audio cues to my surroundings, and the binaural audio, a special form of recording, gave me a sense of distance and direction to landmarks and streets.
Binaural recording is a method of recording sound that uses two microphones to create a 3D effect, making the listener feel as though they are in the same room as the object or person being recorded. In terms of the app, that meant I knew the Midland Hotel was closer to me than the Costa.
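Underlying any such audio cue is a simple piece of geometry: given the user’s position and a landmark’s position, compute how far away it is and in which direction it lies. The sketch below illustrates this with the standard haversine distance and initial-bearing formulas; the coordinates are illustrative guesses near Manchester city centre, not data from the app itself.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees,
    0 = north, clockwise) from point 1 to point 2."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula for the distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    dist = 2 * r * math.asin(math.sqrt(a))
    # Initial bearing towards the landmark
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Hypothetical positions (illustrative only)
user = (53.4774, -2.2430)
landmarks = {"Midland Hotel": (53.4768, -2.2446), "Costa": (53.4790, -2.2400)}
for name, (lat, lon) in landmarks.items():
    d, b = distance_and_bearing(*user, lat, lon)
    print(f"{name}: {d:.0f} m at bearing {b:.0f} deg")
```

With results like these, a 3D audio engine can place each call-out at the right apparent direction and loudness, which is what created the sense that one landmark was nearer than another.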
I then set a beacon and led the way back to where we had started, whilst blindfolded. Along the way the app pointed out the ‘Bling Bee’, part of the Manchester bee art installation spread throughout the city.
Features in the app include:
- Calling out landmarks in relation to the user’s position and as they’re walking
- Users can set an audio beacon on a destination. Along the way Soundscape will call out roads and intersections
- The ‘around me’ and ‘ahead of me’ buttons. ‘Around me’ calls out four points of interest in a full 360 degrees around where the user is standing, and ‘ahead of me’ calls out five items in front of the user
How it works
Currently, the app uses information from OpenStreetMap, which local residents can update with hyperlocal information about their area. The downside, however, is that the information is not routinely updated, which means it can quickly go out of date.
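OpenStreetMap data can be queried programmatically through the public Overpass API, which is one way the freshness of hyperlocal information can be checked or consumed. The snippet below is a sketch of fetching named amenities in a small bounding box; the box (roughly central St Albans), the amenity choice, and the endpoint are illustrative assumptions and nothing here reflects how Soundscape itself ingests the data.

```python
import json
import urllib.parse
import urllib.request

# Overpass QL query: cafes within an assumed bounding box around St Albans
QUERY = """
[out:json][timeout:25];
node["amenity"="cafe"](51.748,-0.345,51.755,-0.330);
out body;
"""

def poi_names(overpass_json):
    """Extract human-readable names from an Overpass JSON response."""
    return [el.get("tags", {}).get("name", "(unnamed)")
            for el in overpass_json.get("elements", [])]

def fetch_cafes(endpoint="https://overpass-api.de/api/interpreter"):
    """POST the query to an Overpass endpoint and return the POI names."""
    data = urllib.parse.urlencode({"data": QUERY}).encode()
    with urllib.request.urlopen(endpoint, data) as resp:
        return poi_names(json.load(resp))
```

Calling `fetch_cafes()` returns whatever named cafes volunteers have mapped in that box, which makes the maintenance problem concrete: if nobody has edited the area recently, that is exactly what the app will announce.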
The project is currently crowdfunding for its ‘St Albans Unlocked’ initiative, which aims to organise a ‘mapathon’ to train community volunteers to curate data in OpenStreetMap, engage volunteers in user-testing the curated information, launch ‘St Albans Unlocked’ to raise awareness and drive adoption, and recruit and train volunteers to maintain and update the local data.
“Most people will judge a ‘smart city’ by whether it makes a difference to them within the few streets where they live and work. St Albans Unlocked has started to give us real insight into how a community can start to impact its own digital future, at a hyperlocal level, for the benefit of everyone,” explained Linda.
I’m excited to see the potential impact this technology could have on real estate. It has the opportunity to transform spaces by making them more accessible and to provide great social value. I can already think of potential uses in sectors such as retail, for example deploying the app in shopping centres, as well as mapping and applying a digital layer to a variety of buildings to help people understand their layout and provide guidance.
With sensors becoming more prevalent, perhaps this technology could be enhanced by using them to create beacons: for example, helping a person with blindness or low vision who is travelling independently keep track of their luggage.