
Augmented Reality Audio Games

30-06-2018 16:19

In this research the Augmented Reality Audio Game (ARAG) “Audio Legends” was designed, implemented and tested to evaluate the utilisation of sonic interaction in an Augmented Reality (AR) environment. 

The Implementation of Audio Legends

Audio Legends is an Augmented Reality Audio Game (ARAG) designed to investigate sonic interaction as a means not only to navigate within the augmented environment, but also to perform the gestures required for achieving the game’s objectives. To investigate the implications of such mechanics for the usability of the system and for immersion in the game world, it was designed to include both long- and close-range tasks. The scenario hosting them draws on folktales of Corfu. More specifically, the prototype is based on the local myths regarding the protector of the island, Saint Spyridon, who is believed to have saved the island on four different occasions:

  • in the 16th century he ended a famine by guiding Italian ships carrying wheat out of a storm and to the starving Corfiots,
  • twice in the 17th century he drove the plague off the island by chasing the disease, personified as a creature half old woman and half beast, and beating it with a cross,
  • in the 18th century he defended the island against the naval siege of the Turks by destroying most of their fleet with a storm.

In relation to these achievements three Game Modes (GM) were designed:

  • “the Famine”, in which players have to locate a virtual Italian sailor and then guide him to the finish point,
  • “the Plague”, in which players have to track down and then defeat in combat a virtual monster,
  • “the Siege”, in which players have to find the bombardment and shield themselves from incoming virtual cannonballs.


Mechanics Design

All GM consist of two Interaction Phases (IP): Exploration and Action. The former involves locating the virtual sound and navigating into its proximity, whereas the latter is initiated when players enter a 5-meter radius around the sound and focuses on gestural behavior in response to virtual sonic stimuli. In both IP, no visual contact whatsoever is required with the game’s interface, which is hidden behind a blank screen.
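
This phase switch can be sketched roughly as follows with Core Location; the PhaseManager type is illustrative rather than part of the game’s published code, and only the 5-meter radius is taken from the design above.

    import CoreLocation

    // Hypothetical phase manager: only the 5 m Action radius comes from the game design.
    enum InteractionPhase { case exploration, action }

    final class PhaseManager: NSObject, CLLocationManagerDelegate {
        private let soundLocation: CLLocation
        private let actionRadius: CLLocationDistance = 5.0   // meters
        private(set) var phase: InteractionPhase = .exploration

        init(soundLocation: CLLocation) {
            self.soundLocation = soundLocation
            super.init()
        }

        func locationManager(_ manager: CLLocationManager,
                             didUpdateLocations locations: [CLLocation]) {
            guard let player = locations.last else { return }
            // Enter the Action phase once the player is within 5 m of the virtual sound.
            if phase == .exploration, player.distance(from: soundLocation) <= actionRadius {
                phase = .action
            }
        }
    }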

In the Exploration phase, standard ARA techniques of amplitude and panning facilitate the localisation of the virtual sound. The field of the game is divided into concentric proximity zones around the emitting source. The position of the player is GPS-tracked and assigned to the respective zone. Amplitude is then scaled accordingly: the closer one gets to the source, the louder one hears it. Stereo positioning is dynamically modified in relation to the direction in which the device is held. The only difference between the three GM is that in Famine and Plague the sound is heard only when the tablet is facing it, whereas in Siege the sound is heard independently of the device’s orientation. Thus, in Famine and Plague players are required to use the device as an instrument to scan their surroundings.
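
The zone-based amplitude scaling and heading-based panning could look roughly like the sketch below; it assumes an AVAudioPlayer for playback and illustrative zone radii, whereas the actual game renders its audio through SceneKit and its real zone boundaries are not stated here.

    import CoreLocation
    import AVFoundation

    // Sketch of the Exploration-phase rendering: zone radii and player type are assumptions.
    func updateExplorationAudio(audio: AVAudioPlayer,
                                playerLocation: CLLocation,
                                playerHeading: CLLocationDirection,
                                sourceLocation: CLLocation) {
        // Concentric proximity zones: the closer the player, the louder the source.
        let distance = playerLocation.distance(from: sourceLocation)
        let zoneBounds: [CLLocationDistance] = [5, 15, 30, 60, 120]        // assumed radii in meters
        let zone = zoneBounds.firstIndex(where: { distance <= $0 }) ?? zoneBounds.count
        audio.volume = 1.0 - Float(zone) / Float(zoneBounds.count)

        // Stereo panning from the angle between the device heading and the bearing to the source.
        let bearing = bearingDegrees(from: playerLocation, to: sourceLocation)
        var relative = bearing - playerHeading
        if relative < -180 { relative += 360 }
        if relative > 180 { relative -= 360 }
        audio.pan = Float(max(-1.0, min(1.0, relative / 90)))              // -1 = hard left, 1 = hard right
    }

    // Great-circle bearing between two coordinates, in degrees from north.
    func bearingDegrees(from: CLLocation, to: CLLocation) -> Double {
        let lat1 = from.coordinate.latitude * .pi / 180
        let lat2 = to.coordinate.latitude * .pi / 180
        let dLon = (to.coordinate.longitude - from.coordinate.longitude) * .pi / 180
        let y = sin(dLon) * cos(lat2)
        let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
        return atan2(y, x) * 180 / .pi
    }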

In the Action phase the three GM differentiate themselves considerably from one another. In Famine, once players enter the sound’s closest range, it becomes attached to them so that they can lead it to the required destination. Every 20 to 30 seconds the sound stops following them and moves to the closest of a set of pre-distributed positions along the possible tracks. Players must then re-collect the sound and repeat this process until the completion of their objective. In the Action phase of the Plague, the sound moves between random positions within a conical field of 150° azimuth and 60° elevation in front of the player, who needs to aim the device in the sound’s direction and perform an abrupt forward-and-back gesture, like hitting a nail with a hammer. Seven successful hits complete the mode’s objective. Finally, in the Action phase of the Siege, the sound, originating from random positions within the same frontal cone, moves towards the player along a curved path. Players must estimate the sound’s trajectory and hold the device like a shield to block the anticipated impact. If they don’t succeed, the sound passes right through them. Five successful saves win the GM.
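
The hammer-like hit of the Plague could be detected along these lines with Core Motion; the HitDetector type and the acceleration threshold are assumptions made for illustration, not values taken from the game.

    import CoreMotion

    // Hypothetical detector for an abrupt forward-and-back "hammer" gesture.
    final class HitDetector {
        private let motion = CMMotionManager()
        private let threshold = 2.0   // g; assumed magnitude for an "abrupt" thrust

        func start(onHit: @escaping () -> Void) {
            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 1.0 / 60.0
            motion.startDeviceMotionUpdates(to: .main) { data, _ in
                guard let accel = data?.userAcceleration else { return }
                // A sharp push along the device's z-axis reads as a hammer-like hit.
                if abs(accel.z) > self.threshold {
                    onHit()
                }
            }
        }

        func stop() {
            motion.stopDeviceMotionUpdates()
        }
    }

A shield block for the Siege could be detected analogously, by comparing the device’s attitude with the incoming sound’s direction at the moment of the anticipated impact.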

Technical Implementation

Audio Legends was designed for an area of 120 x 46 meters. Outdoor positioning of the virtual sounds was based on GPS, with a minimum attainable accuracy of 10 meters. The game was developed in Xcode 10.2 using Swift 5. A major design principle was compatibility with as many iOS devices as possible; for this reason only basic libraries were employed: Core Location (for GPS location events), MapBox (a library to display the position on a map), Core Motion (for device-specific motions) and SceneKit (for rendering 3D audio). The end-user equipment consists of a binaural audio recording headset (Sennheiser Ambeo Smart Headset) and an iPad Air (early 2014 model). In the Exploration phase, Core Location was combined with MapBox to create a mapping between the actual geographical coordinates of the user and the virtual 3D coordinates of SceneKit. The distance and the orientation in the virtual world (SceneKit) were used to render spatial audio. In the Action phase, the Core Motion library was used to obtain the orientation of the device, and SceneKit was used to render moving 3D audio targets. Geolocation was disabled during this phase, and moving objects were always oriented in front of the player regardless of the orientation at which they entered the scene.
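
The coordinate mapping and the positional audio rendering could look roughly as follows; the flat-earth approximation, the function names and the parameter values are illustrative assumptions rather than the game’s actual code.

    import CoreLocation
    import SceneKit

    // Map a GPS coordinate to SceneKit's virtual space relative to an origin point.
    // A local flat-earth approximation is adequate for a 120 x 46 m playing field.
    func sceneKitPosition(for location: CLLocation, origin: CLLocation) -> SCNVector3 {
        let metersPerDegreeLat = 111_320.0
        let metersPerDegreeLon = 111_320.0 * cos(origin.coordinate.latitude * .pi / 180)
        let dx = (location.coordinate.longitude - origin.coordinate.longitude) * metersPerDegreeLon
        let dz = (location.coordinate.latitude - origin.coordinate.latitude) * metersPerDegreeLat
        return SCNVector3(Float(dx), 0, Float(-dz))   // east -> +x, north -> -z
    }

    // Attach a positional (mono) audio source to a SceneKit node.
    func attachSpatialSound(named file: String, to node: SCNNode) {
        guard let source = SCNAudioSource(fileNamed: file) else { return }
        source.isPositional = true     // positional playback requires a mono sample
        source.loops = true
        source.load()
        node.addAudioPlayer(SCNAudioPlayer(source: source))
    }

Keeping the spatially positioned samples monophonic, as noted in the sound design below, is what allows SceneKit to place them around the listener.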

Sound Design

The sonic content of Audio Legends consists of recorded or electronically created samples, which were processed using an audio editor and sequencer, and various sound design software modules. All samples intended for spatial positioning are monophonic.
