Recursive Arts Blog Archive


First Residency Days started!

On Augmented Virtuality in Maia

March 9, 2018

EASTN-DC Residency

Recursive Arts
Dr Ignacio Pecino, Creative Technologist


MAIA is a mixed reality simulation framework designed to materialise a digital overlay of creative ideas in synchronised trans-real environments. It has been proposed as an extension to the author’s previous research on Locative Audio (SonicMaps).

For this purpose, a number of hyper-real virtual replicas (1:1 scale) of real-world locations are to be built in the form of 3D parallel worlds where user-generated content is accurately localised, persistent, and automatically synchronised between both worlds: the real and the virtual counterparts.

The focus is on blurring the boundaries between the physical and digital worlds, facilitating creative activities and social interactions in selected locations regardless of the ontological approach. We should thus be able to explore, create, and manipulate “in-world” digital content whether we are physically walking around these locations with an AR device (local user), or visiting their virtual replicas from a computer located anywhere in the world (remote user). In both cases, local and remote agents will be allowed to collaborate and interact with each other while experiencing the same existing content (i.e. buildings, trees, data overlays, etc.). This idea resonates with some of the philosophical elaborations by Umberto Eco in his 1975 essay "Travels in Hyperreality", where models and imitations are not just a mere reproduction of reality, but an attempt at improving on it. In this context, VR environments in transreality can serve as accessible simulation tools for the development and production of localised AR experiences, but they can also constitute an end in themselves, an equally valid form of reality from a functional, structural, and perceptual point of view.

Content synchronisation takes place in the MAIA Cloud, an online software infrastructure for multimodal mixed reality. Any changes in the digital overlay should be immediately perceived by local and remote users sharing the same affected location. For instance, if a remote user navigating the virtual counterpart of a selected location decides to attach a video stream to a building’s wall, any local user pointing at that physical wall with an AR-enabled device will not only watch the newly added video but might be able to discuss it with its creator, who will also be visible in the scene as an additional AR element.
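As an illustration, this synchronisation flow can be sketched as a per-location publish/subscribe hub, where every client sharing a location receives each overlay change as it is published. The class and field names below are hypothetical and not part of any actual MAIA Cloud API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class OverlayEvent:
    """A single change to the digital overlay of one location (illustrative shape)."""
    location_id: str
    object_id: str
    action: str      # e.g. "attach", "update", "remove"
    payload: dict

class MaiaCloudHub:
    """Minimal in-memory stand-in for a cloud sync layer.

    Clients (local AR users and remote VR users) subscribe per location;
    any published change is pushed to every client sharing that location.
    """
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[OverlayEvent], None]]] = {}

    def subscribe(self, location_id: str, callback: Callable[[OverlayEvent], None]) -> None:
        self._subscribers.setdefault(location_id, []).append(callback)

    def publish(self, event: OverlayEvent) -> None:
        # Fan the event out to every client currently sharing this location.
        for callback in self._subscribers.get(event.location_id, []):
            callback(event)
```

In this sketch, the remote user attaching a video stream would call `publish` with an `"attach"` event, and every local AR device subscribed to that location would receive it through its callback; a production system would of course replace the in-memory dictionary with networked message delivery.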

A number of technical and conceptual challenges need to be overcome in order to meet the project’s goals, namely: 

  • Developing appropriate abstractions (API) for the structuring and manipulation of MR objects in the MAIA server.
  • Modelling precise 3D reproductions of real world locations via 3D scanning, architectural plans, photographs, etc.
  • Finding the best networking and multiplayer solutions to operate within the simulation engine.
  • Designing reliable outdoor 3D object-recognition algorithms (computer vision).
  • Providing robust localisation and anchoring of digital assets in world absolute coordinates.
  • Enabling interactions between virtual and physical objects in AR mode.
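The localisation and anchoring challenge above amounts to mapping world absolute coordinates into the local frame of a MAIA instance. One common approach, sketched here with a hypothetical helper (the function and constant names are illustrative, not MAIA code), is converting WGS84 latitude/longitude/altitude into a local East/North/Up frame centred on the instance origin, using a flat-earth approximation that is adequate over the few hundred metres a single location would cover:

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS84 equatorial radius in metres

def geodetic_to_local_enu(lat, lon, alt, origin_lat, origin_lon, origin_alt):
    """Convert WGS84 lat/lon/alt (degrees, metres) into a local
    East/North/Up frame centred on the instance origin.

    Uses an equirectangular (flat-earth) approximation, which keeps the
    error negligible at the scale of a single urban location.
    """
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    north = d_lat * EARTH_RADIUS_M
    up = alt - origin_alt
    return east, north, up
```

A digital asset stored with geodetic coordinates in the cloud could then be placed at the returned metre offsets in both the AR view and its virtual replica, keeping the two worlds aligned to the same anchor.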

The outcome of this research project will be made publicly available to content aggregators and general users as multiple separate apps (the MAIA client) for specific platforms. The options currently under consideration, depending on the access and navigation medium, are:

Virtual Reality Client (Virtual World Navigation)

  • A Unity WebGL online app to explore virtual spaces using an HTML5-compliant web browser.
  • Fully immersive stereoscopic rendering of virtual environments using VR headsets such as Oculus Rift or HTC Vive.
  • A dedicated iOS app (iPad only).

Augmented Reality Client (Physical World Navigation)

  • A custom AR app for Android and iOS mobile devices (smartphones and tablets).

Ideally, the Maia VR iPad app will also include an AR mode, so that a single app is all that is required to explore local and remote locations. Virtual reality visualisation on smartphones has not been considered because of the small viewport: a large screen is recommended for optimal content editing and management in these 3D environments.

All the apps will share similar functionality, as they should provide multiplayer access to selected locations and their characteristic objects and urban features (e.g. buildings, traffic signs, benches, statues, etc.). These physical objects—either visualised through a mobile device’s camera or as virtual representations in the parallel virtual world—may be used as anchors, markers or containers for our digital content (i.e. text, images, sounds, static/dynamic 3D models), and we shall be able to set privacy levels to decide who can access it. If no privacy restrictions are imposed, the new content will be added to MAIA’s public layer, where it can be accessed by anyone in the MAIA network.
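A minimal sketch of such privacy levels, assuming a simple owner/allow-list model on top of a public layer. The data model below is hypothetical and does not reflect MAIA's actual schema:

```python
from dataclasses import dataclass, field

PUBLIC = "public"
PRIVATE = "private"

@dataclass
class MaiaContent:
    """User-generated content anchored to a physical object
    (illustrative model: field names are not MAIA's API)."""
    owner: str
    anchor_id: str                              # e.g. a wall or bench in the parallel world
    privacy: str = PUBLIC                       # defaults to the public layer
    allowed: set = field(default_factory=set)   # extra users granted access

def can_access(content: MaiaContent, user: str) -> bool:
    """Public-layer content is visible to anyone on the network;
    otherwise access is limited to the owner and explicitly allowed users."""
    if content.privacy == PUBLIC:
        return True
    return user == content.owner or user in content.allowed
```

Under this model, a video attached to a wall with no restrictions lands in the public layer, while a private note on the same anchor stays visible only to its creator and anyone they add to the allow-list.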

As a curiosity, the name MAIA was chosen considering the etymological explanation given by Indian philosophy where the word maia is interpreted as “illusion”, “magic”, or the mysterious power that turns an idea into physical reality—ideas being “the real thing”, and their physical materialisation “just an illusion” to our human eyes. In this sense, the digital overlay here proposed constitutes a world of existing ideas in the MAIA cloud, materialising in trans-real environments via virtual or augmented reality.




1-9 March 2018


NOVARS Research Centre, The University of Manchester


The main goals for this first visit to the University of Manchester are:

  • To find a suitable location for a MAIA instance in the city of Manchester, collecting all necessary data for its 3D modelling. This prototype will be exhibited in June 2018 during the corresponding EASTN-DC event.
  • To take advantage of NOVARS studios in order to test a fully immersive VR implementation of the current prototype at “Plaza de la Iglesia” in Marbella (Spain), including spatial audio effects and development of an “in-world” UI for content management. 
  • To communicate and discuss multiple aspects of the project with research fellows, looking for informed feedback and potential collaborators/contributors.


Ignacio Pecino initially studied Physics at the University of Seville (Spain) but soon focused on sound, music and interactive media, starting a career as a composer, sound engineer and software developer. He completed his BMus (Hons) Degree in Music Composition at the “Conservatorio Superior de Malaga”, where he also worked as a sound engineer. In 2007, he attended master classes with members of the INA-GRM in Paris (Daniel Teruggi, Bernard Parmegiani and François Bayle), increasing his interest in electroacoustic music and other avant-garde genres.

As a researcher at The University of Manchester (MA, PhD) he investigated technical and fundamental aspects of locative audio and procedural composition using the Unity Game Engine ("Dynamic Audio Composition via Space and Motion in Virtual and Augmented Environments"). These disciplines were explored in the context of electroacoustic music composition and they are strongly informed by issues of accessibility and perceptual organisation (multimodality), using recursion and emergent phenomena as a means to minimise visual information and reinforce musical gesture and spatialisation.

As a software developer, in 2012 Ignacio founded the independent studio Recursive Arts, specialising in Unity game and app development with a focus on GIS/locative audio, procedural art, virtual instruments, and mixed reality (VR/AR).

 As a composer, Ignacio Pecino has premiered acousmatic, audiovisual and interactive works at multiple international festivals including: IX Symposium (Montreal), Primavera en la Habana (Cuba), AudioMostly (Corfu, Greece), MANTIS Festival (Manchester, UK), NIME (Baton Rouge, USA), NYEMF (New York), MAEM (Madrid) and Festival Zeppelin (Barcelona). His research output also includes articles, conference presentations and academic papers in specialised publications such as ICMC, RMA Research Conferences, Filomúsica or SulPonticello.

 Full curriculum available at: