THE AUGMENTED CARNIVAL GEO-LOCATABLE AUGMENTED REALITY, THE BODY AND TIME
NEIL ST JOHN
Welcome to the Augmented Carnival. Using the QR code provided, scan the code with the Junaio application (available from Apple's App Store or Google Play), then take another look at the cover page.
ABSTRACT This thesis is a technical investigation into the spatial relationships that exist between virtual and actual information, that is to say, information derived from the interaction between virtual and actual spaces. Through investigations into human interactions with, and reactions to, spatial scenarios, this work seeks to establish how Augmented Reality (AR) can become more than a series of holographic projections. The possibility of feedback loops in geo-locatable augmented reality provides a unique starting point for this investigation. The term 'geo-locatable' itself invites additional factors such as site and narrative. Given that geo-located AR can be influenced by site location, this research will support the development of an AR design proposal that celebrates the Caribbean histories and narratives surrounding the West India Quay area of Docklands, London. A considerable amount of research will be carried out through interviews and through tests on augmented reality Software Development Kit (SDK) technology. Such a development kit allows Augmented Reality applications to be created for a number of computing platforms. The SDK will be tested to establish its current range of capabilities as well as its limitations. All of this research will be framed within the context of a design project. The design project has a historical focus on the Caribbean, and this narrative will be used to guide the development of the thesis and to geo-locate the event which the design project proposes. In conjunction with interviews and tests, research from articles, online sources and books will provide further information to support the project's research agenda. A large majority of these interviews were carried out with leading professionals as well as hardware and software developers within the augmented reality industry, allowing first-hand information to be gathered from the creators of the different software packages.
Additionally, they provided their thoughts, plans and predictions on where Augmented Reality will be in the future. The results gained throughout this thesis will be cross-referenced with research from gaming and virtual reality, largely because these industries play a significant role in influencing the development of AR technology. In conclusion, the information gathered will help form the basis for speculations on the use of geo-locatable AR technology within the framework of the design project. The results will deal with issues involving the siting of the project, exploring the user's ability for on-site manipulation as well as interaction with the architectural space. The interactive element of the research creates a platform which allows users to rewind, slow down and, in essence, experience a playback of augmented architectural spaces.
METHODOLOGY
An Introduction to AR starts with an A.R. Technology Overview, which introduces the different Augmented Reality applications, what they provide and how they differ from each other. This information was acquired by attending the two-day Augmented Planet Conference that took place in November. Documented information from the conference will help in comparing the applications and in highlighting the best software on which to focus my testing. None of the top-rated applications is perfect; each has its own set of limitations that may be rectified with each update to the program and/or as technology advances over time. The key difference between these applications comes down to the capabilities of each Software Development Kit and developer API. According to results released at The Augmented Planet Conference, the top augmented developer tool at the time, Wikitude ARchitect, did not provide 3D capabilities. This option was essential for testing the augmented reality based design project, which explores Caribbean history within the Docklands area. These tests hope to prove that a basic level of augmentation can be carried out with the technology currently available. Both 'Junaio/metaio Mobile SDK' and 'Layar' provide limited 3D support when developing augmented reality based elements. Within their lines of company products, both provide 'Creator' software to allow non-developers to generate augmented content over printed media. Within the Creator, users can create interactive buttons and image, sound and video overlays; the Metaio Creator takes this one step further by allowing 3D content to be placed onto printed media. This 3D content can also be animated, allowing some advanced capabilities to be explored within an easy-to-use interface. Software Limitations will begin by identifying, from the applications listed in the Overview, the AR technology I will use for testing.
This selection will be based on which software has the most advanced features while also providing somewhat free access. I will begin to test this software to see what can be augmented and how rich the resulting experience can be. These tests will document the software's imaging, video, 3D and animation capabilities, while highlighting the current limitations of each, should any arise. The deployment of these augmented options will also be explored by testing the software's capabilities in that field. Optical tracking options to be tested include QR codes, picture markers, markerless tracking and markerless 3D. More important will be the testing of non-optical tracking technology through GPS/inertial sensors; these tests aim to demonstrate location-based augmented reality and its capabilities. The testing will be carried out using my Android phone and tablet, as well as a Windows-based PC on which the Software Development Kits run, test and create applications. Technological Advancements: I will gather and summarise industry developers' thoughts and predictions about the future of the technology. This information will come from a number
of interviews with Andreas Hauser (Head of Mobile Development at Wikitude), Markus Meixner (CTO at ViewAR), Antoine Brachet (Channel Director at Total Immersion) and David Lock (Director of Operations, EMEA Region, at Vuzix), carried out at the Augmented Planet Conference. This information will help to explore the various software and hardware developments taking place within the AR community that will enhance the experience and take the technology beyond phone, tablet and computer screens. This will be done through research on Google Glass, by translating the experience of using Vuzix glasses, and by providing information on how their hardware is developing (interview with David Lock, Director of Operations, EMEA Region). The hardware and software discussed within Technological Influences will be based on technology not currently developed for AR, but which may become very influential to the experience in the future. The research will be carried out mainly through articles and online sources covering developments in gaming and virtual reality, highlighting which developing hardware and software could play a role in providing a fully immersive A.R. system. Augmented Technology – Possibilities of The Futures and Fourth Dimension – Time Based Spatiality will conclude with my speculations on what is to come, based on the results from the testing, research and interviews carried out. Additionally, they will form conclusions on how this system can change our experience of time, and on the advantages of the information these AR systems will provide. Design and architectural integration has been pushed forward recently with the advancement of augmented SDKs and the increase in processor performance. The testing provided will show how these SDKs can deliver continuous rendering and tracking of digital objects in real space.
The research gained from online sources will show how architects are getting involved in augmented reality; studios include Daniel Libeskind and Acconci Studio. In addition to this research, I will also implement my own ideas, currently developing through the design project that accompanies this thesis and through my previous design work, both of which rely heavily on geo-locatable augmented technology. These speculations will cover how we will experience or re-experience moments of time, as well as how we will share moments in time. Finally, it will also consider the advantages of the augmented information provided, expanding on how these can affect architecture and change the way we approach design. Will geo-locatable technology allow all architectural representations to develop an augmented permanence, somehow creating another dimension to what an architectural intervention is and how it will be experienced?
CONTENTS
Introduction
An Introduction to AR - A.R. Technology
Technological Advancements
Augmented Technology – Possibilities of The Futures
Fourth Dimension – Time Based Spatiality
Conclusion
Appendix 1
Appendix 2
References
INTRODUCTION According to the writer Wolfgang Hohl, augmented reality (AR) is defined as the ''real-time overlapping of human sensations with computer-generated models''1. The definition of AR also extends to fields of research and development into technologies that allow for the 'real time fusion of computer generated digital content with the real world.'2
[Diagram: the Mixed Reality continuum, spanning Reality – Augmented Reality – Augmented Virtuality – Virtual Environment]
Image: Author's own version of the mixed reality continuum as originally defined by Milgram and Kishino, 1994
Ronald T. Azuma's definition of AR highlights three distinct characteristics that need to be met in order to identify AR:
• The combination of real and virtual content
• The system performs in real-time and is interactive
• The virtual content is registered with the real world3
1 W. Hohl, 'Interactive Environments with Open-Source Software', Springer-Verlag/Wien, Austria, 2009, p. 10
2 M. Haller, M. Billinghurst & B. Thomas, 'Emerging Technologies of Augmented Reality: Interfaces and Design', Idea Group Inc., London, 2007, p. vi
3 R. Azuma, 'A Survey of Augmented Reality', In Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, 1997, p. 356.
Regardless of the varying definitions of AR given over the years, a fundamental core attribute recurs in most of them, namely the need for real-time interaction. The overlap between the 'real world' and the 'virtual world', and the interaction of these two spaces, is what, in essence, would be considered augmented reality. This evidently points towards the need for these virtual projections to be 'placed' in the real world in order to carry out the function of augmentation. This act of 'placing' virtual objects/spaces within the real world brings issues regarding site and context into play. Within the realms of architecture, film and literature, site and location have always played a pivotal role in the development of a design or story. If AR technology includes elements such as these, it would not be far-fetched to speculate on the influence these inputs might have on an AR projection. This thesis shall take a narrative journey to investigate cultural spaces and histories through technical tests as well as design proposals. AR technology and its possible uses in architecture shall play a central role in the development of this narrative, with particular focus paid to the potential this technology affords for the re-experiencing of time-based spatial events. The final supporting design proposal shall propose a reflexive, responsive and contextually aware AR space along the West India Quay area of the Docklands. Overall, this paper is a journey to celebrate the growing and ever-changing relationships between architecture, movement, culture and history.
AN INTRODUCTION TO AR - A.R. TECHNOLOGY
OVERVIEW "A critical mass is forming to support augmented reality products and services as a major tech/media industry."4 - Kipper and Rampolla Over the past few years the development of AR technology has been closely linked to the development of supportive platforms. At present, anyone wishing to experience AR needs access to a camera and a screen5, more precisely 'a smartphone that has a GPS'6, as this is the current minimum requirement for an AR browser. According to the writer Tony Mullen, the current popularity of smartphones represents a 'major milestone on the road to AR ubiquity'7. Their 'universal' adoption (that is to say, in societies with technologically advanced systems in place), coupled with the smartphone's integrated camera, display and constant connectivity, makes this mobile technology the ideal propeller for AR technologies. This section aims to give a brief overview of the main AR browsers available, which provide an augmented experience through their embedded technology.
AUGMENTED REALITY APPLICATIONS
Author’s Image, taken at Augmented Planet
As of October 2012, Wikitude ARchitect, Junaio/Metaio Mobile SDK and Layar are the top three developer tools for supporting AR experiences.8 Developed and released on October 15th 2008, Wikitude was one of the first augmented reality browsers. With a global reach of 10 million users over 32 languages,9 Wikitude was the first publicly available AR technology with geo-locatable features.10 According to Andreas Hauser, Head Developer at Wikitude, the company achieved this global reach by developing its browser ''on all major Operating Systems''11, including Android, Apple's iOS, BlackBerry, Windows Phone and Symbian devices. 4 G. Kipper & J. Rampolla, 'Augmented Reality: An Emerging Technologies Guide to AR', Elsevier, Inc., Waltham, MA, 2013, p. xv 5 L. Madden, 'Professional Augmented Reality Browsers for Smartphones: Programming for Junaio, Layar and Wikitude', John Wiley & Sons, United Kingdom, 2011, p. 22 6 ibid. 7 T. Mullen, 'Prototyping Augmented Reality', John Wiley & Sons, Indiana, Canada, 2011, p. 189 8 L. Madden, 'Augmented Planet Readers Choice Awards', Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012. 9 A. Hauser, 'Wikitude Architect Presentation', Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012. 10 ibid. 11 ibid. Wikitude's augmented reality
SDK, known as SDK ARchitect, is based on HTML5/CSS/JavaScript web technologies. The SDK allows augmented reality content to be created for iOS, Android and BlackBerry 10 through one coding method which can be deployed to each platform. Metaio was established in 2003; two years later, in 2005, the company introduced its SDK. Through the advancement of the Metaio SDK over a period of four years, in 2009 the company launched the Junaio AR browser,12 which allows augmented reality to be displayed through the camera. The browser offers location-based services which provide additional information about the immediate environment.13 "Unlike Layar and Wikitude, junaio was not released as an AR browser. It was released as the world's first AR social networking browser. The original version of junaio focused on enabling users to post 3D objects to the virtual world and, in turn, other users could use these 3D scenes to create their own scenes to share with their friends. Since the release of junaio 2.0 in early 2010, junaio has become a fully fledged AR browser with around 150 Channels and another 650 in development."14 In association with its junaio AR browser, metaio provides the Metaio Creator, which is designed to be used by non-developers and offers a desktop drag-and-drop interface,15 allowing anyone to create an augmented reality experience. At this stage, only a few methods of tracking are available within the Creator; for the more advanced features, the metaio SDK and junaio provide developer options. The junaio developer Application Programming Interface (API) allows web coding options such as PHP/XML, HTML and JavaScript to supply the junaio browser, through your own server, with augmented reality that can be experienced through a range of tracking options.
The metaio SDK also provides a range of tracking options while including a 3D rendering engine, allowing developers to create Android, iOS and Windows applications through the Android SDK, iOS SDK and Microsoft Visual Studio, using the same coding method. According to the Founder and Director of Augmented Planet, Lester Madden, "Layar was the first AR browser released, pipping Wikitude to the Android platform in 2008."16 He goes on to highlight its success just two years after release: "Layar has produced some amazing stats: more than 1 million user, nearly 2,000 third-party layers created, and a thriving community building open-source tools to make developers' lives easier."17 The Layar application is available for iOS and Android; the content created within the program is referred to as layers. These layers are created using the Layar Creator, API and SDK, which provide the option to "embed Layar Vision functionality directly into your very own iPhone or Android app".18 Layar Vision stands for the multiple methods used to augment objects in the real world. Similar to the Metaio Creator, the Layar Creator is a software application which allows an easy, step-by-step method of augmenting digital content onto printed media; no developer skills are required, but it has limited augmentation options. The Layar API and SDK are development tools which provide advanced features for creating augmented content. The API uses MySQL and PHP to build layer content,19 while the SDK allows for the creation of your own application over the Android SDK and iOS SDK. 12 metaio, Augmented Reality Factsheet, brought to you by metaio, metaio GmbH, 2013, retrieved 1 April 2013, <http://www.metaio.com/company/>. 13 junaio, Frequently Asked Questions, metaio Inc., 2013, retrieved 1 April 2013, <http://www.junaio.com/faq/>. 14 L. Madden, 'Professional Augmented Reality Browsers for Smartphones: Programming for Junaio, Layar and Wikitude', John Wiley & Sons, United Kingdom, 2011, p. 44 15 M. Silbey, Make your own reality with Metaio's augmented reality software, smartplanet, 2012, retrieved 05 October 2012, <http://www.smartplanet.com/blog/thinking-tech/make-your-own-reality-with-metaios-augmentedreality-software/11573>. 16 Madden, op.cit., p. 36 17 ibid. 18 Layar, Layar SDK, Layar, 2013, retrieved 1 April 2013, <http://www.layar.com/sdk/> 19 Madden, op.cit., p. 103
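The server-side pattern shared by the junaio and Layar developer APIs — the browser reports the device's GPS position and the server answers with nearby points of interest — can be sketched in plain JavaScript. The function and field names below are illustrative only, not the actual junaio or Layar response schema, and the coordinates are approximate values chosen for the Docklands example.

```javascript
// Sketch of a geo-located "channel"/"layer" endpoint: the AR browser sends
// the device position, the server filters its points of interest by distance.
const POIS = [
  { title: "West India Quay warehouses", lat: 51.5076, lon: -0.0235 },
  { title: "Museum of London Docklands", lat: 51.5074, lon: -0.0239 },
  { title: "Greenwich foot tunnel",      lat: 51.4872, lon: -0.0101 }
];

// Haversine great-circle distance in metres between two lat/lon points
function distanceMetres(lat1, lon1, lat2, lon2) {
  const R = 6371000, toRad = d => d * Math.PI / 180;
  const dLat = toRad(lat2 - lat1), dLon = toRad(lon2 - lon1);
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// What a getPOIs-style handler would return for a device at (lat, lon)
function nearbyPois(lat, lon, radiusMetres) {
  return POIS.filter(p => distanceMetres(lat, lon, p.lat, p.lon) <= radiusMetres);
}
```

A device standing at West India Quay querying with a 500 m radius would receive the two Docklands entries while the Greenwich entry, roughly 2.5 km away, is filtered out.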
AR APPLICATION BREAKDOWN With the best three development tools presented at The Augmented Planet 2012 Conference, based on readers' opinions, this paper takes those opinions as a means of narrowing down a wide range of available augmented platforms. Following the brief introduction of Wikitude, Junaio/Metaio and Layar, this section shall look at each of these toolsets in an effort to provide a platform, albeit in a limited sense, for determining the application and performance of each one. It is hoped that the results will inform which application would provide the most suitable toolset to support the accompanying design project, 'The Augmented Carnival'. None of the top-rated applications is perfect; each has its own set of limitations that may be rectified with each update to the program and/or as technology advances over time. The key difference between these applications comes down to the capabilities of each SDK and developer API. According to results released at The Augmented Planet Conference, the top augmented developer tool at the time, Wikitude ARchitect, did not provide 3D capabilities. This option was essential for testing the augmented reality based design project, which explores Caribbean history within the Docklands area. These tests hope to prove that a basic level of augmentation can be carried out with the technology currently available. Both 'Junaio/metaio Mobile SDK' and 'Layar' provide limited 3D support when developing augmented reality based elements. Within their lines of company products, both provide 'Creator' software to allow non-developers to generate augmented content over printed media. Within the Creator, users can create interactive buttons and image, sound and video overlays; the Metaio Creator takes this one step further by allowing 3D content to be placed onto printed media. This 3D content can also be animated, allowing some advanced capabilities to be explored within an easy-to-use interface.
Developers using the junaio developer API and the Layar API for augmented 'channels' and 'layers' respectively are required to set up a publicly visible web server to host their content. An FTP application is also required to upload files to the server, as well as a PHP editor to edit PHP files and, for the Layar API, an additional MySQL database.20 A fundamental understanding of these applications is required to set up and use each one successfully. Even though, at this stage, Wikitude did not support 3D, its Wikitude ARchitect offers developers the easiest method of creating augmented worlds. The requirements are "some familiarity with XML", using "an XML editor, such as Microsoft's XML Notepad"21 and a web server such as Dropbox. Developing within the browsers allows anyone to access your channel and engage with your augmented experience. These AR browsers are frequently updated, enriching the developer APIs over time with easier methods of deploying content and clearer, step-by-step documentation for carrying out these tasks. For those without knowledge of hosting web servers, or with limited FTP and PHP experience, another option is to use the programs' SDKs to build a smartphone application for the desired platform. This method is by no means easier 20 Madden, op.cit., pp. 103, 189 21 Madden, op.cit., p. 69
than using the developer APIs, as coding experience is still required through the chosen operating system's SDK; what it allows for, however, is your own application, controlled by the user, providing an augmented experience using only his or her content. This application can be made available for download through the chosen smartphone app store, or for a Windows-based PC. Software development kits, such as those for Android and iOS, allow augmented SDKs to be used through Integrated Development Environments (IDEs) such as Eclipse and Xcode. Unlike the Layar SDK and Metaio SDK, the Wikitude SDK is used in conjunction with the ARchitect API to allow "communication between the ARchitect World and the native application"22. The Layar SDK and Metaio SDK provide the tools, built in, to enhance your application with augmented reality without the need to communicate with either's API. Most features of the SDKs mirror those of their APIs; the Layar SDK, however, does not support one of the key features widely available within its API: geo-location based content. Without this feature, locating the design project within the Docklands would not be possible through the software development kit. The Metaio SDK appears to provide all of the same features as its API while also including "a powerful 3-D rendering engine in addition to plug-ins for Unity"23 (Unity is a gaming engine used for the development and creation of video games). Using the Metaio SDK, one can 'Create your own application once and deploy to all major operating systems and devices via AREL, the Augmented Reality Experience Language.'24
THE TESTING DEVICES The devices used to carry out the tests will consist of Android smartphones and tablets and a Windows-based computer system. The range of devices should allow for a maximum testing area in case one of the devices fails. The Windows-based PC is mainly used to build and deploy application code to be read by the smartphones. The Metaio SDK allows AR applications to be developed and experienced on Windows PCs using a web cam. The Windows PC can also provide emulation support to virtually create a smartphone device on which to carry out development tests. This method is limited to applications which do not require access to the phone's hardware components for testing. As the digital camera is considered a hardware component, testing augmented reality applications must be carried out on either the smartphone or the tablet, which can be detected and used in developer mode (the USB debugging option allows the device to be detected).
22 Wikitude, Wikitude SDK Documentation for Android, Liferay, 2013, retrieved 10 April 2013, <http://forum.wikitude.com/documentation?p_p_id=2_WAR_knowledgebaseportlet&p_p_lifecycle=0&p_p_state=normal&p_p_mode=view&p_p_col_id=column-2&p_p_col_count=1&_2_WAR_knowledgebaseportlet_mvcPath=%2Fdisplay%2Fview_article.jsp&_2_WAR_knowledgebaseportlet_resourcePrimKey=129086>. 23 metaio, Augmented Reality SDK for App Development, metaio GmbH, 2013, retrieved 1 April 2013, <http://www.metaio.com/products/sdk/>. 24 ibid.
SOFTWARE CAPABILITIES
TESTING The advanced features within the Junaio/Metaio line of augmented applications provide a starting point for testing to begin. Their range of capabilities will help provide an adequate amount of data with which to reasonably gauge/predict the possible future applications of this augmented technology. The idea of speculating on possible future applications of this technology poses an exciting prospect, particularly when developed in collaboration with, and for, architects and design professionals. The ease of creating augmented elements using the 'Metaio Creator' is highlighted through testing. Although the current methods for augmentation are limited, the control method and familiar icons make it very attractive for any 3D-design-based user. The testing describes the process of importing an image and a 3D model to be augmented on an image marker, which was taken of an object in the real world. The free metaio platform and its publishing method for augmented experiences are welcome, but come with some limited capabilities and watermarks, which would be removed with purchase. The free version of the Creator limits the number of items added to each scene and requires low complexity in its models; this first test reflects that limitation and incorporates very simple geometries within the model displayed.25
Image: Test result26
A wide range of tracking methods is possible between the metaio Creator and the metaio SDK. Although the metaio Creator is limited to three of these methods, along with image tracking it provides the ability to create point-cloud data from objects, and environment tracking, which allows indoor and/or outdoor track points to be created so that elements can be augmented into the chosen environment without a marker or image. The Metaio SDK provides the basis of the second set of tests, as it explores most, if not all, of the tracking types available. This method, even with the base tutorials provided by the program, was no easy task. Leaving the easy drag-and-drop interface of the 'Creator' for the JavaScript environment of Eclipse, used to develop apps within the Android SDK, is a steep learning curve to say the least. That being said, the junaio developer API was not tested, because the prerequisites and experience required for development were too high to grasp in the time available. Although working within the Android SDK is similar to the coding environment used within the API, an attempt is 25 See Appendix 2 for full testing carried out by the Author, AR Test Recording 1, 20 December 2012 26 See Appendix 2 for images from the test carried out by the Author, AR Test Recording 1, 20 December 2012
being made to highlight the current capabilities of this software and, in turn, of AR at present. Tracking methods within the metaio SDK are highlighted in the second test, which shows the different possibilities of the SDK. Some of these tracking methods are highlighted below; the test also shows animation and video capabilities.27
The regular updates to the SDK allow greater improvements to be realised, mainly within camera quality, 3D texturing and model sizes, to list a few.
“We have the clear goal to have augmented reality on every smartphone by 2014, and to reach that goal we are collaborating with the hardware makers, to solve a lot of details which can already be addressed now and we are seeing improvements in the possible model size that we can display, the quality of the camera image that is being displayed in the background…”28 - Peter Meier

GEO-LOCATION Geo-location based content is provided through the metaio SDK, enabling site-specific placement of an object by utilising the GPS information from the device together with the latitude and longitude coordinates of the location where the augmented element is to be placed. These coordinates are entered into the code written for the built application. Depending on the coding method, the geo-located element may not have specific coordinates but instead be geo-located relative to your position. This method gives the model an offset from your GPS location, as highlighted within the second test.29 The resulting screenshots from the test are displayed below; image 1 shows the offset incorrectly calculated by the application as it built the test program, while image 2 displays the correct calculation of the object's offset from the device's GPS position.
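The offset placement described above can be sketched as a small calculation: converting an east/north offset in metres into latitude/longitude deltas relative to the device's GPS fix. This is a schematic of the underlying geometry only; the function and variable names are invented for illustration and are not the metaio SDK API.

```javascript
// Sketch: place an augmented element at a metre offset from the device's
// GPS position by converting the offset into lat/lon deltas
// (equirectangular approximation; adequate over short AR-scale distances).
const EARTH_RADIUS = 6378137; // metres, WGS84 equatorial radius

function offsetToGeo(lat, lon, eastMetres, northMetres) {
  const dLat = (northMetres / EARTH_RADIUS) * (180 / Math.PI);
  const dLon = (eastMetres / (EARTH_RADIUS * Math.cos(lat * Math.PI / 180))) * (180 / Math.PI);
  return { latitude: lat + dLat, longitude: lon + dLon };
}

// Example: a model placed 50 m north of a device standing at
// approximately West India Quay (51.5076 N, 0.0235 W)
const placed = offsetToGeo(51.5076, -0.0235, 0, 50);
```

The same arithmetic run in reverse explains the incorrect offset seen in the first screenshot: an error in either the metre-to-degree conversion or the sign of the offset shifts the model away from its intended site.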
27 See Appendix 2 for Full testing carried out by Author, AR Test Recording 2, 28 February, 2013 28 Metaio’s Augmented City 2012 [online video] video time code 4:36, metaioAR, 2012, http://www.youtube.com/ watch?v=LGXiRzN2X_g, retrieved 21 March 2013 29 See Appendix 2 for Full testing carried out by Author, AR Test Recording 2, 28 February, 2013
CONTEXT The video capabilities to record and play back moments in time, as highlighted in the test carried out (see Appendix 2), present a number of interesting scenarios, some of which allude to the future development of this technology. Currently, Google Glass presents a possible fully immersive environment, and allowing developers to program this hardware with enhanced video recording and playback capabilities could enrich these possibilities. These programmable elements can be adjusted to provide a temporally based experience which, in the context of this design project, can be used to display histories in an immersive environment. An immersive re-envisioning of history produces a closer connection to the source presented.
Image: Immersive AR history - Author’s supporting design project The Augmented Carnival
Geo-location adds a layer of context to the AR experience; through my tests, augmented reality can be seen to move beyond a personal point of view and become a more shared experience. By allowing multiple interactions with one or more augmented elements, geo-location provides a sense of permanence to the element deployed at that location.
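The playback idea running through this thesis — rewinding, slowing down and re-experiencing an augmented spatial event — amounts, in data terms, to recording states of the augmented scene against a timestamp and resampling them on a user-controlled clock. A minimal conceptual sketch, with all names invented for illustration rather than drawn from any AR SDK:

```javascript
// Sketch: a time-indexed buffer of augmented-scene states supporting
// playback at variable speed, including slow motion and rewind.
class AugmentedTimeline {
  constructor() { this.events = []; } // entries: { t: seconds, state: any }

  record(t, state) { this.events.push({ t, state }); }

  // Last recorded state at or before time t (simple sample-and-hold)
  stateAt(t) {
    let latest = null;
    for (const e of this.events) {
      if (e.t <= t && (latest === null || e.t > latest.t)) latest = e;
    }
    return latest ? latest.state : null;
  }

  // Map elapsed wall-clock seconds onto the timeline: speed 0.5 plays the
  // event in slow motion, negative speed rewinds from the given start point.
  playback(startT, speed, wallClockSeconds) {
    return this.stateAt(startT + speed * wallClockSeconds);
  }
}
```

A carnival procession recorded this way could be replayed on site at half speed, or scrubbed backwards, by later visitors standing at the same geo-located point.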
“…AR technology brings with it the potential to turn entire cities into virtual art museums…”30 – Shoshanna Pokras
30 Augmented Reality Emerging Technology - See How works [online video] video time code 6:02, TheTechindia, 2013, http://www.youtube.com/watch?v=RB8QStA8HRI, retrieved 21 March 2013
TECHNOLOGICAL ADVANCEMENTS SOFTWARE AND HARDWARE DEVELOPMENTS Hardware development for augmented reality has, over the years, been limited to smartphones and web-cam-equipped computers. Developing hardware has now reached a stage where devices are being specifically tailored for augmented reality experiences. This has led to the development of several types of device, which will be discussed in this chapter. Two of the leading producers in the visual category are Google and Vuzix.
GOOGLE GLASSES DEVELOPMENT
“Google glasses takes all the functionality of a smartphone and places it into a wearable device that resembles eyeglasses.”31 - David Goldman. Google Glass is a head-mounted display (HMD) which functions as a new form of interface for performing common day-to-day mobile device functions. The Google Glass HMD allows a portion of screen to actively display information directly in front of the user’s viewing range, as opposed to viewing on a hand-held mobile device screen. Its display includes the full range of user tasks commonly found on hand-held devices, such as text messages, maps and turn-by-turn directions, to name a few, all viewed through its new interface.
Image: Google Glasses
CNET has confirmed that, “Glass will be able to connect via Bluetooth to both Android phones and the iPhone. Glass can pull down data from wifi or use the 3G or 4G feed from a connected phone, but it won’t have its own cellular radio.”32 After having tested Google Glass at the company’s New York headquarters, technology journalist Joshua Topolsky noted that Google Glass provided an experience “nearly identical” to the actual user experience.33 Deriving from this latest 31 D, Goldman, Google unveils ‘Project Glass’ virtual-reality glasses , CNNMoney, 2012, retrieved 28 March 2013, <http://money.cnn.com/2012/04/04/technology/google-project-glass/?source=cnn_bin>. 32 E, Mack, Confirmed: Google Glass arrives in 2013, and under $1,500 , CNET, 2013, retrieved 25 February 2013, <http://news.cnet.com/8301-17938_105-57570779-1/confirmed-google-glass-arrives-in-2013-and-under-$1500/>. 33 ibid.
test of the Google Glass form of HMD product, one could speculate on the possibilities this form of user interface offers a field such as architecture. The notion of a fully immersive experience of one’s architectural proposal, sited in reality, makes room for a great amount of possibility. Although Joshua Topolsky goes on to highlight that ‘Glass isn’t yet perfect and that slow data connections can quickly render the device useless,’34 these are issues that can be addressed and do not pose an overall threat to the idea of experiencing AR within one’s field of view. The user’s experience through an HMD interface allows for a hands-free, natural interaction with augmented information. While moving through real space, one can record what one sees, overlay information where one wishes it to be, and even share the viewing experience live, all within the eye’s field of view.
Image: Google glass prototype
When describing how the glasses might be worn and fit in with day-to-day life, Google notes that “The Glass design is modular, so you will be able to add frames and lenses that match your prescription.”35 However, they go on to add that they are “still perfecting the design for prescription frames.”36 The use of HMD types of AR interface allows for a more natural navigation through a fully immersive AR environment. As highlighted in the image below, architectural experimentation, investigations and proposals can all be viewed in this manner.
34 ibid. 35 C, Matyszczyk, Here’s who can’t wear Google Glass: People who wear glasses , CNET, 2013, retrieved 28 March 2013, <http://news.cnet.com/8301-17852_3-57573657-71/heres-who-cant-wear-google-glass-people-who-wearglasses/>. 36 ibid.
Image on the previous page: Total immersive AR environment taken from viewer’s point of view – Author’s supporting design project The Augmented Carnival
VUZIX GLASSES DEVELOPMENT According to David Lock, Director of Operations for the EMEA region at Vuzix, the current Vuzix glasses rely on a module with LED displays using diffraction techniques: by looking through the display, the image appears projected onto a surface within your view, but the image is in fact digitally displayed on the device. He adds that their new head-mounted display incorporates waveguide and laser technology, allowing it to look similar to a pair of glasses; it uses a small portion of the glasses to show augmented objects. With this technology, the company allows any software SDK to interact with its product, making it widely accessible for developers to test.37
Image: Future of augmented reality online video
OTHER DEVELOPING HARDWARE Microsoft has also acquired a patent on its own form of HMD-interface AR glasses. Microsoft describes their glasses as having the potential to “enhance sports and other live events with streams of information beamed directly in front of the user – even including action replays and lyrics of songs.”38 Microsoft, however, is not alone in this development, as technology company Apple has also successfully applied for patents on its own form of AR system, which works alongside hand-held communication devices.39 As awareness continues to spread of the ever-increasing potential benefits of AR, technology companies have continued to widen their investment and shareholding in this sector.
37 D. Lock, ‘Vuzix glasses and development’ [Interviewed by Author], 30 October 2012, David Lock, See Appendix 1, no. 1 38 C, Arthur, Microsoft gets patent on augmented reality glasses as ‘AR wars’ start , theguardian, 2012, retrieved 17 February 2013, <http://www.guardian.co.uk/technology/2012/nov/27/microsoft-augmented-reality-glass-googleapple>. 39 ibid.
In an effort to increase the interactive nature of its product, Microsoft demonstrated what it titled a ‘live language translation’,40 translating to and from Chinese and giving new meaning to the phrase “live event”,41 which may now refer to events ‘much wider than just sports.’42 With the ability, according to Microsoft, to provide ‘instant replays directly through its AR glasses, as well as annotating what’s happening on the field’,43 the possible ways one might begin to experience augmented real space begin to grow. The ability to replay events just moments after they happen could, on a purely experiential level, be compared to the idea of ‘time travel’. That is not to say that one literally travels back through time; rather, the experience of replaying moments that have just passed, while still in that same space, might allow one to feel, even if only for a moment, that one has relived a moment passed. Several companies have invested in developing their own unique applications and uses for AR technology in an effort to augment their customers’ experiences of the everyday routines surrounding their products. “Nokia’s Lumia range with Windows Phone 8 includes “City Lens” which shows the distance and approximate direction of places of interest arranged on the screen when you hold it up.”44 Nokia’s City Lens application turns sight into the “next interface for searching the world around you.”45 By providing an augmented reality overlay of buildings and instantly highlighting places of interest, Nokia provides a platform to view live information that can then be acted upon.
After type pads, touchscreens and voice recognition, “sight recognition [could] be another standard way to interact with the world around you.”46 LiveSight is defined as “a collection of mechanisms” in which a “3D sight interface” allows buildings to be detected by “a collection [of] technologies with high accuracy and feeling of depth.”47 Within these fields of information, issues such as line of sight come into play, with the “line of sight view” only displaying Points of Interest (POIs) directly in your line of sight. A further enhancement to the AR experience, particularly when recording live events, is the “freeze frame” capability available on some devices. According to Nokia, this allows a user to “save a live view to inspect the city without having to hold the camera pointed at the target”.48
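The ‘line of sight view’ described above can be sketched as a simple geometric filter: a POI is shown only if the compass bearing from the user to the POI falls within the camera’s field of view. The function names, the 60-degree field of view and the test coordinates below are illustrative assumptions, not details of Nokia’s LiveSight implementation.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0-360 degrees) from the user to a point of interest."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def in_line_of_sight(user, heading_deg, poi, fov_deg=60):
    """Display a POI only if it lies within the camera's field of view."""
    b = bearing_deg(user[0], user[1], poi[0], poi[1])
    # signed angular difference in the range (-180, 180]
    diff = (b - heading_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2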
“With LiveSight you can orient yourself by simply lifting up your phone and looking through the camera view finder and find your destination whether it is right in front of you or three blocks away”49 - Peter Skillman, Head of UX Design
When considering whether or not AR is to become a dominant form of technology used in the near future, its practicality in terms of use in everyday life plays a large role in influencing its 40 ibid. 41 ibid. 42 ibid. 43 ibid. 44 ibid. 45 P, Bonetti, LiveSight: immersive experiences you can act on , Conversations by Nokia, 2012, retrieved 25 March 2013, <http://conversations.nokia.com/2012/11/13/livesight-immersive-experiences-you-can-act-on/>. 46 ibid. 47 ibid. 48 ibid. 49 ibid.
dissemination. Design ergonomics and ease of use play a large role in this, and a few applications have started to make strides in this direction. Leap Motion is a company developing advanced motion-sensing technology, focused on its main product of the same name. With the ability to detect all 10 fingers, Leap Motion’s hardware and software allow users, behind an interface, to take control over “8 cubic feet of virtual space.”50 According to Leap Motion, the design of the technology enables “infrared cameras in the Leap Motion detector to see a three-dimensional space, down to the millimetre.”51 With the ability to recognise both expansive motions, such as might be applicable to games consoles, and very fine movements, the added value of this form of technology to AR only improves its usability.
Image A: Leap motion in Use
“My take is that Leap Motion won’t replace your mouse and keyboard, but is a powerful tool for augmenting control.”52 - Lance Ulanoff
Image B: Leap Motion in Use
50 L, Ulanoff, Hands on With Leap Motion’s Controller , Mashable, 2013, retrieved 25 March 2013, <http:// mashable.com/2013/03/09/hands-on-with-leap-motion/>. 51 ibid. 52 ibid.
TECHNOLOGICAL INFLUENCES VIRTUAL REALITY DEVICES With moves to make interaction between virtual and real space more natural and fluid, ‘Z-Space’, developed by the company Infinite Z, is making strides towards delivering software that allows interaction with virtual models in much the same way as with real physical ones. One should ideally be able to pick up its pieces and parts; according to Infinite Z, “you can move your head around an object to look at it from the side or from below, for instance, and the Z-Space display will adapt and show you the correct perspective.” As a system, Z-Space is said to provide “virtual 3D holographs that are actually interactive [and] with the help of special eyeglasses and a stylus, you as a designer can grab parts of your model, move your head around them in 3D space to view parts from different angles, and you can drop them back into place as you see fit.”53
“Suddenly, the virtual model is becoming more adaptable — as it adjusts itself to the way you work within your design process.”54 – Maria Lehman
Image: Z-Space
“Imagine being able to reach out and grab some of the 3-D objects you saw in the movie Avatar with your own hands... this sort of interactive experience—albeit on a smaller scale…could have many possible uses.”55 - Will Knight. With eyeglasses equipped with markers that reflect infrared light, Z-Space enables cameras “embedded in the display to track the movement of your head (and thus your eyes) as you change your point of view,”56 allowing the user to manipulate virtual objects as if they “really were floating just inches in front of you. It’s easy to see how such technology could be very useful for designers, architects, and animators.” Infinite Z has created a software development kit (SDK) for its Z-Space displays, which lets other companies, as well as independent programmers, create software for it.57
53 M, Lehman, Can Interactive 3D Modeling Change the Way You Design? , Sensing Architecture, 2013, retrieved 17 February 2013, <http://sensingarchitecture.com/9067/can-interactive-3d-modeling-change-the-way-you-design/>. 54 ibid. 55 W, Knight, A Display That Makes Interactive 3-D Seem Mind-Bogglingly Real , MIT Technology Review, 2012, retrieved 24 February 2013, <http://www.technologyreview.com/view/508991/a-display-that-makes-interactive-3-dseem-mind-bogglingly-real/>. 56 ibid. 57 ibid.
GAMING HARDWARE ADVANCEMENTS Microsoft’s ‘IllumiRoom’ uses a Kinect for Windows camera and a projector to extend on-screen content into and onto the environment we live in, allowing the overlay of the virtual over the real physical world. According to Microsoft, the system can “change the appearance of the room, induce apparent motion, extend the field of view, and enable entirely new game experiences.”58
Image: Microsoft’s ‘IllumiRoom’
“This is about enhancing your experience – tapping into the background, like surround sound does. In some cases, context can be added – snowflakes sifting downward as you race on a MarioKart winter level. In other cases, useful peripheral elements can show you the broader built environment beyond your main area of focus – skies and streets above, below or off to the side in wireframe format.”59 - Urbanist
58 Urbanist, Surround Screen: ‘Illumiroom’ Immersive Gaming Projection , Web Urbanist, 2013, retrieved 27 March 2013, <http://weburbanist.com/2013/01/10/surround-screen-illumiroom-immersive-gaming-projection/>. 59 ibid.
AUGMENTED TECHNOLOGY – POSSIBILITIES OF THE FUTURE
ARCHITECTURAL INTEGRATION HIGHLIGHTING ARCHITECTURAL INTEGRATION WITH THE TECHNOLOGY Architects have already been getting involved in augmented reality applications. Architecture Omi presented an exhibition entitled “Augmented Reality: Peeling Layers of Space Out of Thin Air”,60 offering “a peek at the future of architecture and representation, but more generally, it marks a further integration of a novel technology to our everyday experience.”61 For this exhibition, eight architects were commissioned to create augmented sculptures and environments. Using the Layar application, scanning a QR code was required for the phone to download the content and reveal each installation. Geo-location was used in building the augmented content “to overlay and GPS-pin their three dimensional projects.”62
“Just point your iPhone or Android upwards and experience the fourth dimension!”63 – Samuel Medina. In a 300-acre park, each studio was allotted a plot, guiding users through the experience of that studio’s proposal. One of the architects, SHoP, “sees Augmented Reality as another tool to define and describe space. AR’s geo-location based positioning, along with mobile smartphone devices allow the traditional rendering or on-screen 3D model to leave the office and uniquely bind the virtual to the physical. The experience of being on location and viewing the as-yet [un]built designs adds an entirely new dimension to architectural representation.”64 SHoP goes on to note its predictions for AR, envisioning “even greater possibilities for this technology from pre-construction all the way to post occupancy, from visualization of designs to real-time environmental performance metrics and facilities management.”65 The other architects involved in the project were Vito Acconci Studio, Asymptote, John Cleater Studio, KOL/MAC Architects, SITE, Thomas Leeser Architecture and Studio Daniel Libeskind. The curator, John Cleater, was the AR developer for the project.
60 Architecture Omi, Augmented Reality: Peeling Layers of Space Out of Thin Air , Architecture Omi, 2011, retrieved 12 January 2013, <http://www.artomi.org/cms/uploads/augmented-reality-brochure.pdf>. 61 Peter Franck, Peeling Layers of Space Out of Thin Air: Augmented Reality at Architecture OMI , Architecture Omi, 2011, retrieved 11 April 2013, <http://www.artomi.org/program.php?Architecture-OMI-8>. 62 S, Medina, Architects Do Augmented-Reality , Architizer, 2011 , retrieved 18 October 2012, <http://www. architizer.com/en_us/blog/dyn/24498/architects-do-augmented-reality/>. 63 ibid 64 J, Cleater, SHoP, “Dunescape Version 3.0” , Architecture Omi, 2011, retrieved 12 January 2013, <http://www. artomi.org/cms/uploads/SHoP.pdf>. 65 ibid
Image: Augmented Reality: Peeling Layers of Space Out of Thin Air
“In the 1990s, we saw the place of public discourse migrate on to the internet. Now with Augmented reality technology, that virtual public sphere can now be overlaid back into the physical environment”66 – John Craig Freeman
66 Augmented Reality Emerging Technology - See How works [online video] video time code 3:23, TheTechindia, 2013, http://www.youtube.com/watch?v=RB8QStA8HRI, retrieved 21 March 2013
ARCHITECTURE
“One way to start thinking about these questions is to approach the design of augmented space as an architectural problem. Augmented space provides a challenge and an opportunity for many architects to rethink their practice since architecture will have to take into account the fact that virtual layers of contextual information will overlay the built space.”67 - Lev Manovich. Given the technological strides AR has made over the past few years, particularly in its articulation of new forms of interface which give users greater mobility and allow for hands-free experiences of AR, the application of AR technology as a part of everyday life is becoming ever more practical. With Wi-Fi, 3G and 4G technology becoming part of daily life, the age of constant connectivity is here, and with it the age of AR. The development of the “World’s first Augmented Reality chipset”,68 Google Glass, Leap Motion and Z-Space all highlight developments towards this new AR world.
Metaio’s augmented reality chipset is the first of its kind in the world,69 and its combination of cutting-edge technologies allows it to insert and overlay almost any form of 3-D and virtual content into the real world. According to Metaio, this technology “recognizes images, objects and entire environments in a mobile future that clearly requires smart devices to be ‘always on’ and connected.”70 With additional augmented reality hardware IP, known as the ‘AREngine’, Metaio is able to drastically reduce the power consumption of devices, allowing all-day AR experiences to become not only possible but also practical.
“The AR Engine will do for augmented reality what the GPU did years ago for the gaming industry. This is a great leap in the AR space, and we strongly believe that the AR Engine working with ST‐Ericsson platforms will help realize the Augmented City—the idea of a completely connected environment powered by augmented reality and made possible with next‐gen, optimized mobile platforms.”71 - Peter Meier, Metaio CTO
In the building proposal highlighted below, a school based in Spitalfields, East London, was designed early last year. The core idea of this project was to propose the development of smart architecture: architecture which is designed, specified and tailored to cater to an AR-ready world.
67 L, Manovich, ‘The Poetics of Augmented Space’, Visual Communication, vol. 5, no. 2, 2006, pp. 225 68 metaioAR, Introducing the Worlds First Augmented Reality Chipset - Metaio & ST-Ericsson , Youtube, 2013, retrieved 28 February 2013, <http://www.youtube.com/watch?v=6br7NreTwD4>. 69 ibid 70 ibid 71 J, Donovan, Metaio Develops First Chipset To Improve Augmented Reality Performance In Mobiles , Tech Crunch, 2013, retrieved 28 February 2013, <http://techcrunch.com/2013/02/21/metaio-develops-first-chipset-to-improveaugmented-reality-performance-in-mobiles/>.
Image A: Spitalfields AR School East London - Author’s own design work
“…AR can take training and education even further by displaying interactive information in geospatial context – greatly enhancing its usefulness.”72 - Kipper and Rampolla. Among the proposed supporting systems for the Spitalfields School design project were applications such as LearnAR, a learning tool that, according to its developers, “brings investigative, interactive and independent learning to life using Augmented Reality.”73 Currently on the market, this product package provides “a pack of ten curriculum resources for teachers and students to explore by combining the real world with virtual content using a web cam.” According to LearnAR, “the resource pack consists of interactive learning activities across English, maths, science, RE, physical education and languages that bring a wow-factor to the curriculum.”74 Architectural author and researcher Maria Lehman notes that “a new breed of architectural ‘rooms’ will be designed as augmented reality becomes more mainstream. Just think, a new sort of ‘game room’ or an interactive ‘movie room’.”75
“One of the biggest potential uses for AR is also where the Internet shines [It] is in democratising access to information. This goes beyond access to trivia to include serious training and education.”76 - Kipper and Rampolla
Image B: Spitalfields AR School East London - Author’s own design work 72 Kipper & Rampolla, loc. cit. 73 J, Alliban, LearnAR – eLearning with Augmented Reality , James Alliban, 2010, retrieved 29 March 2013, <http://jamesalliban.wordpress.com/category/augmatic/>. 74 ibid. 75 M, Lehman, 5 Reasons Augmented Reality is Good for Architecture , Sensing Architecture, 2009, retrieved 18 October 2012, <http://sensingarchitecture.com/1281/5-reasons-augmented-reality-is-good-for-architecture/>. 76 Kipper & Rampolla, loc. cit.
FOURTH DIMENSION – TIME BASED SPATIALITY “Space is no more concrete than time, nor is it easier to represent…the kinds of world we inhabit, and our understanding of our places in these worlds are to some extent an effect of the ways in which we understand space and time”77 - Elizabeth A. Grosz. THE MORPHOLOGY OF BODY AND SPACE AR technology gives room to explore architectures, and the movement of bodies through architecture, live on site. Furthermore, the body’s role in influencing the languages of architectural space syntax has evolved over centuries and continues to do so.78 Not only does AR technology make geo-located architectural proposals possible, it also gives room for the exploration of visionary forms of architectural design. AR spaces, though subject to new forms of constraint of their own, also open up new forms of design freedom for architectural designers. Through tests, it can be speculated that these spaces need not be bound by the usual forces that ground architectural structures in the real world, and can therefore become a platform for the real-world testing of speculative architecture.79 AR architecture would be site specific as well as site responsive: the AR architecture ‘reads’ the body and its movement through the proposed architectural spaces and, in essence, reacts and rearranges itself in response. If certain movements recur with a high enough frequency to be considered significant, the architecture responds, possibly rearranging in scale, form, shape or colour. This continual cycle between the readings of these two data sets, that is to say the body and the AR building information, would begin to establish a type of feedback loop.
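The feedback loop between body and AR building information can be sketched as a minimal event loop: movements are counted, and once a movement recurs often enough to be significant, the architecture rearranges one of its properties and the cycle begins again. The threshold, movement labels and responses below are illustrative assumptions, not a specification of the design project's behaviour.

```python
from collections import Counter

class ResponsiveSpace:
    """Minimal sketch of the body/architecture feedback loop: readings of
    the body's movement accumulate, and sufficiently frequent movements
    trigger a rearrangement of the augmented space."""

    def __init__(self, threshold=3):
        self.threshold = threshold   # how often a movement must recur to count as significant
        self.movements = Counter()   # readings of the body's movement through the space
        self.scale = 1.0             # one property the architecture may rearrange

    def observe(self, movement):
        """Record one reading of the body's movement and respond if significant."""
        self.movements[movement] += 1
        if self.movements[movement] >= self.threshold:
            self.respond(movement)
            self.movements[movement] = 0  # reset: the response feeds back into new readings

    def respond(self, movement):
        # Illustrative responses: a recurring gathering movement enlarges
        # the space, a dispersing movement shrinks it.
        if movement == "gather":
            self.scale *= 1.5
        elif movement == "disperse":
            self.scale /= 1.5
```

The reset after each response is what makes this a loop rather than a one-off trigger: the rearranged space is read afresh by the bodies moving through it.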
“It’s all about making the digital information surrounding us, a much more natural experience…the augmented city for us is a symbol, on what we can achieve with augmented reality as a new user interface paradigm”80 - Peter Meier
Image: AR Carnival - Author’s own design work 77 E, Grosz, ‘Space, Time, and Perversion: Essays on the Politics of Bodies’ Routledge, London, 1995, p. 97 78 R, Bhatt, ‘Rethinking Aesthetics: The Role of Body in Design’ Routledge, Oxon UK, 2013, p. 8 79 See Appendix 2 for full test carried out by Author, AR Test Recording 3, 6 April 2013 80 Metaio’s Augmented City 2012 [online video] video time code 0:32 & 0:16, metaioAR, 2012, http://www.youtube.com/watch?v=LGXiRzN2X_g, retrieved 21 March 2013
Image on previous page: Interacting data sets AR Carnival - Author’s own design work
THE FOURTH DIMENSION OF THE AUGMENTED CARNIVAL In physics and mathematics, the word ‘dimension’ is often informally defined as the minimum number of coordinates needed to specify any point within a space or object.81 As explained on Wolfram MathWorld, the notion of dimension is important “because it gives a precise parameterization of the conceptual or visual complexity of any geometric object…and the concept can even be applied to abstract objects which cannot be directly visualized.”82 More specifically, when considering the term ‘fourth dimension’, scientific reference can be made to Einstein’s theory of special relativity, which calls time the fourth dimension; Einstein, however, went on to note that time is inseparable from space.83 Within the context of this thesis, the ‘Fourth Dimension’ shall not refer to space and time as standalone elements, but will instead refer to the relationships existing between the body’s movements through these. The Fourth Dimension shall be a reading and recording of the relationships between the body’s movement through a specific space over a certain duration or period of time. These readings would, in essence, be taken from people’s interactions with an AR design proposal.
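Following the definition above, any moment in the augmented event can be specified by four coordinates: three spatial and one temporal. The sketch below records a body's path as a list of such points and reads off its spatial and temporal extent; the field and function names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class EventPoint:
    """A point in the augmented event: three spatial coordinates plus time."""
    x: float
    y: float
    z: float
    t: float

def duration(path):
    """The path's extent in the fourth dimension: how long the movement lasts."""
    return path[-1].t - path[0].t

def displacement(path):
    """Straight-line spatial distance between the start and end of the path."""
    p0, p1 = path[0], path[-1]
    return ((p1.x - p0.x) ** 2 + (p1.y - p0.y) ** 2 + (p1.z - p0.z) ** 2) ** 0.5
```

A recorded path of such points is exactly the 'reading and recording' the text describes: the same three spatial coordinates at two different values of t are two different points in the event.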
“We experience place very personally, we want to see how it has changed, we want to imagine how it’s going to be in the future and AR lets us do this”84 – Deb Boyer. The supporting design proposal for this piece of work is sited in the London Docklands area of West India Quay; the idea of augmenting elements of the site’s history and past through the use of AR technology and time-based readings will be further explained here. Among the main influences on the naming of the area West India Quay is the large influence that people of the Caribbean had on the economic development of that part of London. Within certain Caribbean cultures, carnivals have a long-standing place in cultural history. Modern-day Caribbean carnivals tend to comprise a number of ‘bands’, consisting of members of the public who join and proceed along a predetermined carnival route. Bearing this in mind, the AR proposal itself shall be in the spirit of an augmented carnival event, in which proposed paths and routes involving a certain number of changes can be followed.
“Think of the implications, visiting a site and being able to see past, present and future all at the same time”85 – Shoshanna Pokras
Images below: AR Carnival - Author’s own design work
81 E, Weisstein, Dimension , MathWorld--A Wolfram Web Resource, 2013, retrieved 9 April 2013, <http:// mathworld.wolfram.com/Dimension.html>. 82 ibid. 83 M, Edmonds, Can our brains see the fourth dimension? , howstuffworks, 2010, retrieved 9 April 2013, <http:// science.howstuffworks.com/science-vs-myth/everyday-myths/see-the-fourth-dimension.htm>. 84 Augmented Reality Emerging Technology - See How works [online video] video time code 3:05, TheTechindia, 2013, http://www.youtube.com/watch?v=RB8QStA8HRI, retrieved 21 March 2013 85 ibid. video time code 2:10
As the body’s reactions are ever changing along with the ‘carnival’ experience, the user taking part in the event is given the choice to remain, change, or re-experience the journey from different viewpoints. The ability to re-experience moments in time, while still on site as they continue to occur, allows for the ‘time travel experience’ while taking part in the event. The different carnival bands provide different experiences but come together at the end of the route to become one, allowing users only at that point to take part in every band. The ability to travel back in time may not be possible; however, the AR Carnival aims to simulate and stimulate, even if for very brief moments, the feeling of temporal freedom.
“Science fiction aficionados may recognize that union [that of time and space] as space-time, and indeed, the idea of a space-time continuum has been popularized by science fiction writers for centuries. Researchers have used Einstein’s ideas to determine whether we can travel through time. While we can move in any direction in our 3-D world, we can only move forward in time. Thus, traveling to the past has been deemed near-impossible, though some researchers still hold out hope for finding wormholes that connect to different sections of spacetime.”86 - Molly Edmonds. Commenting on the experiential capabilities of AR, Maria Lehman notes that “virtual reality technologies are able to bridge the gap by getting rid of distance; building occupants will be able to virtually travel to far-away lands. Just imagine seeing, hearing, smelling (and, yes, tasting and touching) some of the wonders that make a culture what it is.”87 In the particular case of the Augmented Carnival, one would experience a more site-specific invitation to participate in the past histories and cultures that colour West India Quay. Colour within the carnival helps, in some cases, to distinguish certain bands from other bands, as well as sections of a band from other sections. This normally occurs at the beginning, when the bands are judged. Each section of a band is grouped by colour combinations which carry particular representations of history, community, natural elements or popular interest. Then, as everyone from each section proceeds down the route, the colours mix to create a kaleidoscope formed through body movement. Lehman goes on to highlight that “technologies found within buildings will be able to transport us back in time to re-creations that help us learn more about history. Architectural environments may not be limited to a real set of geographical coordinates – virtual traveling within buildings will turn them into a sort of augmented reality ‘transportation vessel’.”88
“Digital design tools are omnipresent in design practice and helped architects both past and present to explore the functional and formal solution space for architectural problems. Consequently, these digital aids span from dabbling to construction and are already beyond the constraints of pen and paper or other conventional media. Digital design tools were predominantly developed by fostering the capabilities of conventional tools in such a way that they appear as a logical enhancement from their predecessors.”89 – Wang and Schnabel. Image below: AR Carnival - Author’s own design work 86 Edmonds, loc. cit. 87 Lehman, loc. cit. 88 ibid. 89 X, Wang & M Schnabel, ‘Mixed Reality in Architecture, Design & Construction’, Springer Science + Business Media B.V, Sydney, 2009, p. 27
CONCLUSION
“Augmented reality (AR) is a technology whose time has come.”90 - Kipper and Rampolla
Now, more than ever before, AR technology has become a platform that provides viable, practical solutions to varied industry problems. Having existed and evolved over the past five decades,91 AR has fast grown into possibly one of the most ubiquitous technologies. The current capabilities of augmented technology provide an ideal platform for the development of time-based architectural investigations and proposals. The inclusion of time, the ‘fourth dimension’, and the inherent interactivity that comes with AR allow projected spaces to be experienced from several different points of view at different points in time. These proposals would have the ability to shift and alter; elements could be switched on and off (in much the same way as in computer drafting and modelling); the opportunities are endless, and thus the ways of retelling the history of West India Quay are limitless. Time-based representation of architectural space in AR form gives room for response. Geo-located AR projections ‘read’ the body’s movement through their holographic location and, through interaction, can alter and respond. Throughout the carnival procession, the body’s movement plays a vital role in the progression of the event. One person’s action triggers a response from other members of the carnival ‘bands’; this is a continual process of interaction, in which action-reaction body movements propel the event forward. As with the body’s movement in carnival, so too would interactions in the proposed augmented reality space trigger an action and reaction cycle, creating a feedback loop within these spaces. The system would continue to loop and grow in an evolving adaptation to the bodies’ continual movement through it. Augmented reality not only provides a level of control over these experiences but also leaves room for the unexpected.
AR provides not only a combination of ways to view a space, but also the option to re-experience, by rewinding, alternative narratives within the museum’s exhibit. One could choose to follow the single story of an individual in one option, while in another decide to experience the space through reading the collective history of West India Quay. The possibilities are undoubtedly endless. In conclusion, through the use of geo-located AR technology, a new layer of plasticity over the space and the material world is possible, carrying with it ideas of fixed and unfixed moments in time. This allows for future research and speculation on the nature and future of physical, spatial and virtual relationships. Image below: AR Carnival Chronogram Development - Author’s own design work 90 Kipper & Rampolla, loc. cit. 91 ibid.
APPENDIX 1 INTERVIEWS BY AUTHOR:
1. David Lock (Director of Operations EMEA Region at Vuzix)
Question: Can you tell me about your product?
Answer: These glasses rely on a module with LED displays using diffraction techniques. By looking through the display, the image appears projected onto a surface within your view, but actually the image is digitally displayed on the device.
Question: How does this work?
Answer: It utilises a small portion of the glasses to display the augmented object; the glasses are using waveguide and laser technology. It’s in early development, there is still a lot of refining to be done, but you can see me and you could be watching something else going on.
Question: Are you locked into any augmented reality software?
Answer: No, no, we are agnostic; our products will work with Metaio, Total Immersion… because we use a standard web cam. So if you look through the demo, you will see the desktop being replicated. I’m just going to fire up a little application and you will see the object in 3D; when you hold the marker in your hand you can move it around and move around the 3D model.
Question: In the future, do you think augmented reality will replace desktop and laptop systems?
Answer: Oh yea, with consumer products 2 years away, you will be talking about no wires or cables and looking through a pair of glasses that look just like my one.
2. Antoine Brachet (Channel Director at Total Immersion)
Question: What is the limitation to the type of 3D model you can add into the program?
Answer: If the 3D model has too many polygons, it will not load into the program; textures are supported but not too complex, and limited particles are supported within the program. Here you have face tracking, marker tracking and image tracking. With Studio (D’Fusion Studio), you create your scenario using a scripting language to create the augmented reality using one of the tracking methods.
Question: Do you think in the future augmented reality will develop to a point where it replaces the desktop computer?
Answer: I think replacing it would be difficult; maybe phone displays would be replaced: you connect your phone to the glasses with an earpiece and that would be the display, having the controls augmented in the glasses.
3. Markus Meixner (CTO at ViewAR)
Question: How does your software work and where do you think it’s going?
Answer: Basically our software is a 3D engine that is hosting a model database. So we can upload models to a website and then the apps access this database on demand. Basically, for the architecture scenario we would produce an app for an architecture office/company, and they would have an application branded with their logos, and this (ViewAR application) would access their support channel in our system…
Question: Ok, I see, so everything would be accessed from your server?
Answer: Exactly, and this gives you the ability to keep the app slim and to update models.
Question: Is there any limit to the complexity of the model?
Answer: I mean our engine is performing really great; I could show you demos with 400,000 to 800,000 polygons.
Question: Is this application available to try for Android?
Answer: Currently we didn’t release an Android version, but we have already ported our systems to Android… this is a 25MB model with around about 100,000 polys, and you can keep adding models within one scene, up to 1,200,000 polys.
Question: It’s good that this can be displayed without markers. Do you have an option where you can fix the model, like to a table, or can it just be displayed in a space?
Answer: You can build markers and so on, but we didn’t focus on showing marker stuff as it’s very simple.
4. Andreas Hauser (Head of Mobile Development at Wikitude)
Question: Where do you think augmented reality will be in the future?
Answer: In general, it depends on how far in the future you are looking. My personal opinion is not that all the people out there will have the glasses and only talk to each other with the glasses, cause I don’t think the interaction will be above a searching level.
Cause sometimes every one of us, even the geeks around: you’re at home and you turn off everything; it doesn’t matter what is on TV or what is online, you forget about your smartphone and just be offline, disconnected. But I’m pretty sure there will be specific scenarios, especially for tertiary or university research, architecture, even in businesses or setting up your flat, where your architect can tell you: hey, you know what! I have a sample room here, just put on those glasses and walk around, and it’s perfect for those 15-30 mins to get a better feeling of what this will look like. But you wouldn’t use AR glasses all the time, well at least not within the next 5 years… and no one knows what will happen in 10 years’ time.
Question: There have been a number of online videos and movies which highlight different levels of controlling AR: eye movements, voice control, etc. What do you think about this?
Answer: Perfect stuff for film scenarios, Hollywood; also perfect for the gaming industry and the military. It’s perfect for gaming, as the Sony PlayStation Portable has these augmented reality games, but although only a small number of children play them all day long, the technology is here. But I think it’s not the technology or the computer systems or whatever; I’m pretty sure within the next 2 years the technical stuff will be solved, but then the big challenge will be to get it ready for the masses. Because the guys out there, even if they don’t see that you have AR glasses, or you may not even have AR, someone could know my name only because of his AR glasses. Private stuff… is it really necessary to have these gadgets on you all day long?
Question: It could be worrying, in cases of security, where people can also be recording you using these displays without your knowledge.
Answer: Exactly! …I just want to highlight that even though the technology is ready, it may not be accepted.
Question: In relation to your SDK, what would be the best way for me to start using it?
Answer: Well yes, if you want to start from scratch, then I recommend starting with a ‘vacation days’ approach, so, for example, you would take some samples I have here. You start getting any location-based API (Application Programming Interface) and display it on a map. Just decide whatever API you prefer, whatever POIs (Points of Interest) you want, and maybe you have a server or an XML file or something to set up some location-based tracking; display it on a map. Think about how you want to visualise it from the user’s side: how should this look, how should content be represented?
Show one of your colleagues beforehand, because space is very limited; take time to visualise it, decide which problems to solve, think before placing. It is really 80% concept, UI, API, foreign API, so only 20% is really then AR related. Deciding how to make a bubble float around is no big thing, but the more important thing is to know how you can display the POIs, how you can style them, even how many POIs you want to show and which POIs. Start on a plain map, and as long as it works on a map, then check it in AR. At least with our technology you can then reuse a huge amount of your coding; since we are JavaScript and HTML based, you can copy your code and paste it in AR and replace the example code.
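The "map first" workflow Hauser describes, gathering POIs from any location-based API and deciding which to show before moving to AR, can be sketched in outline. The sketch below is my own illustration in Java (the language of the Metaio SDK example files tested in Appendix 2), not Wikitude code; the class name, search radius and POI structure are all hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative "map first" POI workflow: model the POIs as plain data,
// filter them by distance from the user, and only then hand them to an AR view.
public class PoiFilter {
    record Poi(String name, double lat, double lon) {}

    // Great-circle distance in metres between two lat/lon points (haversine formula).
    static double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
        double r = 6371000.0; // mean Earth radius in metres
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * r * Math.asin(Math.sqrt(a));
    }

    // Keep only POIs within radiusMetres of the user, nearest first.
    static List<Poi> nearby(List<Poi> pois, double userLat, double userLon, double radiusMetres) {
        List<Poi> result = new ArrayList<>();
        for (Poi p : pois) {
            if (distanceMetres(userLat, userLon, p.lat(), p.lon()) <= radiusMetres) result.add(p);
        }
        result.sort((a, b) -> Double.compare(
                distanceMetres(userLat, userLon, a.lat(), a.lon()),
                distanceMetres(userLat, userLon, b.lat(), b.lon())));
        return result;
    }
}
```

This mirrors the 80/20 split he suggests: the distance filtering and data shaping are plain code that can be debugged on a map, and only the final rendering step is AR-specific.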
APPENDIX 2 FULL TESTING CARRIED OUT BY AUTHOR:
AR TEST RECORDING 3
Test 3 – 6th April, 2013 11am - Metaio SDK testing
In preparation for the geo-located test, a set of Points of Interest (POIs) was collected. The intention was not to use them all, but to have the POI data near at hand whenever I chose to start testing. Locations at various distances were gathered and recorded for safe-keeping until the coding stage. Google Maps was used to harvest latitude and longitude coordinates: right-click the target on the map and select ‘What’s here?’ from the menu, and the search box is replaced with the latitude and longitude of the destination.
My POIs:
1. Home – Garden: 51.448521, -0.013457
2. Home – Garden 2: 51.448514, -0.013461
3. Home – Front: 51.44882, -0.013469
4. Playing Field – Mountsfield Park: 51.446501, -0.010957
5. Jubilee Park: 51.503518, -0.018973
6. Upper Bank St.: 51.503401, -0.016688
7. Canada Square Park: 51.504769, -0.018035
8. Cabot Square: 51.505313, -0.022944
9. Greenwich Library Outside: 51.483167, -0.007448
The first step was to create my own application based on the tutorial files provided. To achieve this, a new application folder was created and named, and a layout for the cover of the application (app) was provided. The metaioSDK and SDK Example folders remain within the project to provide the files required to start programming. The first file copied from the Example folder is “MetaioSDKViewActivity.java”, which provides the link to the metaioSDK; the second file required is “MainActivity.java”. The “activity_main.xml” remains the cover layout, while a new activity, “ARActivity.java”, is created alongside the two copied files.
“ARActivity.java” allows for the coding of the application. The complexity of the code proved more difficult to familiarise with than expected. Only a few lines of code were programmed before another method had to be explored with the time remaining.
Update 9th April, 2013 4am - As geo-location was shown in AR Test Recording 2, the code for that example was revisited to see if it was possible to adapt my model into the scene.
Update 9th April, 2013 5pm - This turned out to be possible by locating the assets area in the code and changing the file-name to “yourmodelsname”; this requires the 3D model to be placed in the assets folder where the previous files are located. The file types supported for importing 3D models are .obj, .md2 and .fbx. Not only is changing the assets area necessary, but any reference to the previous file-name throughout the code must be changed as well. The last area to adjust is the scale of the model. Where to adjust the rotation is not highlighted within the code; it seems to be based on the model’s translation into the program. The model was tested at home, but the same issues that occurred with the example GPS in Test Recording 2 also happened here. A second outdoor POI was chosen and tested at that location, with the same results. The model below appears with no texture and incorrect rotation to the ground plane.
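A small guard for the import limits observed above can be written as an extension check before a model is copied into the assets folder. This is my own sketch (the class and file names are hypothetical, not part of the tutorial code); the set of extensions is exactly the list the SDK accepted in this test.

```java
import java.util.Locale;
import java.util.Set;

// Guard for the 3D model formats the metaio SDK loaded in this test:
// reject anything else before copying it into the assets folder.
public class ModelFormatCheck {
    static final Set<String> SUPPORTED = Set.of(".obj", ".md2", ".fbx");

    static boolean isSupported(String filename) {
        int dot = filename.lastIndexOf('.');
        // files without an extension cannot be matched against the supported set
        return dot >= 0 && SUPPORTED.contains(filename.substring(dot).toLowerCase(Locale.ROOT));
    }
}
```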
Update 11th April, 2013 8am - With these poor results, and no clear way to change the rotation of the object, the decision was made to simplify the model even further, reset the coordinates and bake the textures in. An animation of the model was also exported but failed to preview. A further decision was made to edit one of the more advanced tutorials to allow interaction with the model based on a GPS offset from the device.
Update 12th April, 2013 6am - As with tutorial 5, the process was carried out to replace one of the models in tutorial 7 with my model. This took a lot longer than expected, as the code is twice as long as tutorial 5 and allows for full interaction with the model.
Update 12th April, 2013 7pm - Updated the tutorial 5 programming code as well, to test the basic geo-location again. Tested over the weekend on site. The tutorial 5 results remained the same: the model’s rotation was off and the model also appeared with an odd texture applied. Perhaps due to the low polygon count of the model, the textures appear semi-transparent.
Using the edited tutorial 7 GPS method, a range of interactions with the model became possible. The model also remained in a fixed location as the screen was panned from side to side, highlighting, through my model, the ability to fix augmented objects to a set location.
Exploring the model around the site, the display through the metaio SDK application is of low quality. Better quality is available once the application is published.
Another point of interest on the site was explored with the model, and interactions with the model were carried out to play with its scale and rotation.
Exploring the model at the site - another point of interest was looked at with the model, and interactions were carried out to play with its scale and rotation using hand gestures.
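Two-finger scale and rotation gestures of the kind used here generally reduce to the distance and angle between the touch points. The sketch below shows that standard arithmetic as my own illustration in Java; it is not the metaio SDK's implementation, and the class and parameter names are hypothetical.

```java
// Illustrative two-finger gesture maths: the scale factor is the ratio of the
// current pinch distance to the initial one, and the rotation is the change in
// angle of the line joining the two touch points.
public class Gesture {
    static double distance(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    // > 1 when the fingers move apart (scale up), < 1 when they pinch in.
    static double scaleFactor(double[] start1, double[] start2, double[] now1, double[] now2) {
        return distance(now1[0], now1[1], now2[0], now2[1])
             / distance(start1[0], start1[1], start2[0], start2[1]);
    }

    // Rotation in radians of the line between the two touch points.
    static double rotation(double[] start1, double[] start2, double[] now1, double[] now2) {
        double a0 = Math.atan2(start2[1] - start1[1], start2[0] - start1[0]);
        double a1 = Math.atan2(now2[1] - now1[1], now2[0] - now1[0]);
        return a1 - a0;
    }
}
```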
The image below shows the model scaled up on the site; a video clip of the interaction is also overlaid onto this image through the use of augmented reality.
Video showing the interaction with the model: with the QR Code provided, please scan the code using the Junaio application available from Apple’s App Store or Android’s Google Play Store, then proceed to look at the image above.
Images of the model being rotated.
Images of the model being scaled.
AR TEST RECORDING 2
Software and version history:
Metaio SDK: Ver. 4.0.2 at the start; updated to Ver. 4.1.1, then Ver. 4.1.2
Metaio Creator Demo: Ver. 2.0.1 at the start; updated to Ver. 2.5.1
Android Junaio Plug-in: Ver. 4.5 at the start; updated to Ver. 4.6
(Updates installed 19/12/2012 and 04/02/2013.)
Test 2 – 28th February, 2013 10am - Metaio SDK testing
Downloaded and installed the Android SDK to develop applications. The Android SDK requires a further application to work, so the Java SE Development Kit (JDK) was downloaded and installed.
To start building an Android application, a program called Eclipse.exe was installed within the Android SDK. This program allows the building of the application and allows the Metaio SDK to function over this platform. Also included within the Android SDK is an SDK Manager, which provides the ability to download drivers for the Android platform. If required, this would allow the application built to support a number of different smartphones running older and/or newer versions of Android.
Started Eclipse.exe – explored the program.
Testing stopped - unfamiliar with the controls and not sure how to create and/or run Android applications.
Update 5th March, 2013 5pm – Metaio SDK testing continued: emulator issues; the tutorials failed to run on the Windows PC virtual Android device, and Eclipse.exe wouldn’t recognise any Android smartphone or tablet plugged in to run the test on.
Update 13th March, 2013 5am – Metaio SDK testing continued: issues understanding how to run the Metaio SDK tutorials over Eclipse.exe.
1. Start an existing Android project
2. Import the SDK Android Example folder
3. Then import the metaioSDK folder
4. With both imported, plug in an Android device
5. Switch the phone to USB debugging so it can be detected
6. Click on the Example folder, then click ‘Run’
Basic steps to run the ‘Example’ folder once all issues were addressed. Testing on the 14th March, 2013 8pm: Some tracking methods were successful through the tutorial development code. Images below show the application automatically starting on the device.
The first tracking method in the Example application, photo tracking with animation, is shown below.
Markerless 2D - This tracking method attempts to predict the location at which to place an object. The image below shows the 3D object placed on the screen of the laptop and on the ceiling.
Markerless 3D is still in beta, and tracking proves to be unpredictable as shown below. This type of tracking first attempts to create a motion tracking map to place the object on.
Interaction with the model is also supported. This is programmed into the code to activate by tapping the device’s touch screen.
Testing on the 15th March, 2013 7pm: The first geo-location example was successful through the tutorial development code, but the orientation wasn’t correct. The model appeared far away and horizontal to the ground plane.
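The "horizontal to the ground plane" symptom often means the model's up-axis disagrees with the renderer's (for example a Z-up export loaded into a Y-up coordinate system), which a 90-degree rotation about the X axis corrects. The sketch below shows that rotation applied to a vector; it is my own illustration of the general fix, not code from the SDK tutorial.

```java
// Rotating a point about the X axis: a -90 degree rotation maps a Z-up vector
// onto the Y axis, a common fix for models that import lying flat.
public class UpAxisFix {
    static double[] rotateAboutX(double[] v, double radians) {
        double c = Math.cos(radians), s = Math.sin(radians);
        return new double[]{
            v[0],                    // x is unchanged by a rotation about X
            v[1] * c - v[2] * s,     // y' = y*cos - z*sin
            v[1] * s + v[2] * c      // z' = y*sin + z*cos
        };
    }
}
```

Applying such a rotation once to the model's transform (or re-exporting with the correct up-axis) is usually enough to stand the model upright relative to the ground plane.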
Testing on the 2nd April, 2013 4pm - The second geo-location test, outdoors, was more successful. The model didn’t remain in one position all the time: with the GPS used as an offset from the device, the model would shift as the device’s GPS became unstable.
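The drift seen here is typical of raw GPS fixes. A common mitigation, offered here as my own suggestion rather than anything the SDK tutorial does, is to low-pass filter the incoming coordinates, for example with an exponential moving average, trading a little lag for a much steadier anchor point. The smoothing factor below is a hypothetical choice.

```java
// Exponential moving average over incoming GPS fixes: each new fix only pulls
// the smoothed position part of the way towards it, damping jitter.
public class GpsSmoother {
    private final double alpha;   // 0..1; lower = smoother but laggier
    private double lat, lon;
    private boolean started = false;

    GpsSmoother(double alpha) { this.alpha = alpha; }

    double[] update(double newLat, double newLon) {
        if (!started) {
            lat = newLat; lon = newLon; started = true; // seed with the first fix
        } else {
            lat += alpha * (newLat - lat);
            lon += alpha * (newLon - lon);
        }
        return new double[]{lat, lon};
    }
}
```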
Video playback capabilities allow for windowed or full-screen playback of recordings. Windowed playback within a 3D model of a TV is highlighted below.
AR TEST RECORDING 1
Software and version history:
Metaio SDK: Ver. 4.0.2 at the start; updated to Ver. 4.1.1, then Ver. 4.1.2
Metaio Creator Demo: Ver. 2.0.1 at the start; updated to Ver. 2.5.1
Android Junaio Plug-in: Ver. 4.5 at the start; updated to Ver. 4.6
(Updates installed 19/12/2012 and 04/02/2013.)
Test 1 – 20th December, 2012 5pm
1. Downloaded and installed Metaio Creator Demo and Metaio SDK
2. Started Metaio Creator Demo version 2.0.1
Limitations of the Demo version:
• you can add only two trackables and two content items at a maximum
• you can preview your magazine; nevertheless, “publish” is only available with the full license.
Demo version (2012) http://dev.metaio.com/creator/getting-started/demoversion/
3. The program started with an option to open the issue page provided or to create your own issue page. The decision was made to open the provided issue to explore what it could do.
4. Numbered steps provided direction on how to get started. A page was added as directed to begin.
5. A dialog box appeared requesting an image to get started. This image will be used to place the AR objects on
6. After import, the image is rated according to its quality; the test image used received two stars out of three. I proceeded and attempted to add the first object, deciding to add an image to the scene.
7. A range of standard image types is available for import. The decision was made to use a .png image from the thesis design project, which didn’t have a background, to see how the image would appear over the base image.
8. The image appears to sit flat on the base image. When selected, the imported image can be scaled and rotated in the provided 2D space.
9. Attempting to add another object to the scene, a 3D object was chosen.
10. Quick observations were made about the limited number of 3D file types available to use with the software. The range of file types includes: .fbx, .dae, .md2, .obj, .mfbx and .zip. An .fbx model of a barrel with a low polygon count was selected. This model type was used because it maintains its materials within standard 3D modelling programs, and the test was to see if it would maintain this material mapping.
11. The 3D model imported into the scene and maintained its texture map. Within the 2D view, it allowed for the same level of control as the image.
12. Switching to 3D view, the control system offers movement, scale and rotation controls similar to standard 3D modelling programs. This system of control is also provided for the image, allowing it to be moved to any angle, scale or rotation along the x, y and z axes.
13. In an attempt to add another model to the scene, the previous process was repeated. This time an .obj model was selected.
14. The attempt failed at this point as the software limits the scene to two objects. This error message stops the process from proceeding.
15. With the maximum number of objects in the scene, uploading the scene was the next step. The login icon was selected.
16. An account was created to connect to metaio AR Cloud
17. A new AR channel was created to host the scene on the metaio server
18. An upload option was provided, where the project is stored on a cloud server; after a few minutes, a unique QR code was given to scan using any smartphone with the software.
19. To show the results of this test, an android tablet was used. The Junaio plug-in was installed from Google Play (application hub for Android based phones).
20. The Junaio application loads up with a splash screen that introduces the product and highlights that the program is powered by metaio.
21. The first information to appear on the screen shows Places of Interest with their distance away and the ability to select any of them to gain more information.
22. Using the scan feature within the Junaio program, the bar-code image provided can be scanned from any location. This allows Junaio to pick up the unique channel with the elements created.
23. As soon as the information loaded from the cloud, the tablet’s camera was faced towards the image. The elements are instantly displayed on top of the image, making an augmented reality experience possible.
24. The elements appear to work well viewed from multiple angles, but this marker-based system does have a limited view distance, depending on how well the image can be seen by the camera and the quality of the image used.
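The limited view distance of marker tracking can be reasoned about with simple pinhole-camera geometry: the marker must still cover enough pixels for the tracker to recognise it. The sketch below is my own back-of-envelope estimate, not anything from the metaio documentation; the field of view, image width and minimum pixel footprint are all hypothetical numbers chosen only to illustrate the trade-off.

```java
// Pinhole-camera estimate of the farthest distance at which a marker of a given
// physical size still covers at least minPixels pixels in the camera image.
public class MarkerRange {
    // Focal length in pixels, derived from the horizontal field of view and image width.
    static double focalPx(double imageWidthPx, double hFovDegrees) {
        return imageWidthPx / (2.0 * Math.tan(Math.toRadians(hFovDegrees) / 2.0));
    }

    // Marker pixel footprint p = size * f / d, so d_max = size * f / minPixels.
    static double maxDistanceMetres(double markerSizeMetres, double imageWidthPx,
                                    double hFovDegrees, double minPixels) {
        return markerSizeMetres * focalPx(imageWidthPx, hFovDegrees) / minPixels;
    }
}
```

The linear relationship matches the observation above: doubling the printed marker size (or the camera resolution) roughly doubles the usable tracking distance, while a lower-quality image raises the effective pixel footprint the tracker needs and shortens it.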
REFERENCES LIST Books: Bhatt, R, ‘Rethinking Aesthetics: The Role of Body in Design’, Routledge, Oxon UK, 2013. Fisher, B, K Dawson-Howe & C O’Sullivan, ‘Virtual and Augmented Architecture (VAA’01)’, Springer-Verlag London Limited, Great Britain, 2001. Grosz, E, ‘Space, Time, and Perversion: Essays on the Politics of Bodies’, Routledge, London, 1995. Haller, M, M Billinghurst & B Thomas, ‘Emerging Technologies of Augmented Reality: Interfaces and Design’, Idea Group Inc., London, 2007. Kipper, G & J Rampolla, ‘Augmented Reality: An Emerging Technologies Guide to AR’, Elsevier, Inc., Waltham, MA, 2013. Krell, D, ‘Architecture: Ecstasies of Space, Time, and the Human Body’, State University of New York, New York, 1997. Madden, L, ‘Professional Augmented Reality Browsers for Smartphones: Programming for Junaio, Layar and Wikitude’, John Wiley & Sons, United Kingdom, 2011. Mitchell, W, ‘City of Bits: Space, Place and the Infobahn’, The MIT Press, Cambridge, Massachusetts, 1996. Mullen, T, ‘Prototyping Augmented Reality’, John Wiley & Sons, Indiana, 2011. Spiller, N, ‘Cyber Reader: Critical Writings for the Digital Era’, Phaidon Press, London, 2002. Wang, X & M Schnabel, ‘Mixed Reality in Architecture, Design & Construction’, Springer Science + Business Media B.V., Sydney, 2009. Wolfgang, H, ‘Interactive Environments with Open-Source Software’, Springer-Verlag/Wien, Austria, 2009.
Theses: Matsuda, K, ‘Domesti/City: The Dislocated Home in Augmented Space’, MArch thesis, London, The Bartlett School of Architecture, 2010, retrieved 14 October 2012, <http://keiichimatsuda.com/thesis.php>. Tran, G, ‘Mediating Mediums: The Digital 3D’, MA thesis, Cambridge, MA, Harvard Graduate School of Design, 2011.
66
Conference Presentations: Madden, L, ‘Augmented Planet Readers Choice Awards’ , Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012. Hauser, A, ‘Wikitude Architect Presentation’ , Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012. Lock, D, ‘The Future of Augmented Reality Eyewear – Vuzix Presentation’ , Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012. Powell, W, ‘DIY Project Glass’ , Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012. Misslinger, S, ‘Junaio’s new API + 3D Tracking’ , Augmented Planet - The Leading UK Augmented Reality Event, Kensington Town Hall, London, 2012.
Journal articles: Azuma, R, ‘A Survey of Augmented Reality’, Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, 1997, pp. 355-385. Colquhoun Jr, H & P Milgram, ‘A Taxonomy of Real and Virtual World Display Integration’, University of Toronto, Canada, no. 1, 1999, pp. 1-26. Manovich, L, ‘The Poetics of Augmented Space’, Visual Communication, vol. 5, no. 2, 2006, pp. 219-240.
Websites: Augmented.org, Making the digital a natural experience! - The insideAR 2011, augmented.org [web blog], 28 September 2011, retrieved 5 October 2012, <http://www.augmented.org/blog/2011/09/making-the-digital-a-natural-experience-the-insidear-2011/>. Alliban, J, LearnAR – eLearning with Augmented Reality, James Alliban [web blog], 16 March 2010, retrieved 29 March 2013, <http://jamesalliban.wordpress.com/category/augmatic/>. Architecture Omi, Augmented Reality: Peeling Layers of Space Out of Thin Air, Architecture Omi, 2011, retrieved 12 January 2013, <http://www.artomi.org/cms/uploads/augmented-realitybrochure.pdf>. Arthur, C, Microsoft gets patent on augmented reality glasses as ‘AR wars’ start, theguardian, 2012, retrieved 17 February 2013, <http://www.guardian.co.uk/technology/2012/nov/27/microsoftaugmented-reality-glass-google-apple>. Bonetti, P, LiveSight: immersive experiences you can act on, Conversations by Nokia, 2012, retrieved 25 March 2013, <http://conversations.nokia.com/2012/11/13/livesight-immersive-experiences-you-can-act-on/>. Bonsor, K, How Augmented Reality Works, howstuffworks, 2010, retrieved 2 August 2012, <http://computer.howstuffworks.com/augmented-reality.htm>. Cleater, J, SHoP, “Dunescape Version 3.0”, Architecture Omi, 2011, retrieved 12 January 2013, <http://www.artomi.org/cms/uploads/SHoP.pdf>. Chima, C, Where augmented reality and architecture collide [Video], THE NEXT WEB, 2011, retrieved 18 October 2012, <http://thenextweb.com/dd/2011/08/30/where-augmented-reality-andarchitecture-collide-video/>. Donovan, J, Metaio Develops First Chipset To Improve Augmented Reality Performance In Mobiles, Tech Crunch, 2013, retrieved 28 February 2013, <http://techcrunch.com/2013/02/21/metaiodevelops-first-chipset-to-improve-augmented-reality-performance-in-mobiles/>. Dredge, S, Qualcomm: ‘Augmented reality as a technology is starting to mature’, theguardian, 2013, retrieved 27 March 2013, <http://www.guardian.co.uk/technology/appsblog/2013/feb/26/qualcomm-vuforia-augmented-reality>. Edmonds, M, Can our brains see the fourth dimension?, howstuffworks, 2010, retrieved 9 April 2013, <http://science.howstuffworks.com/science-vs-myth/everyday-myths/see-the-fourthdimension.htm>. Franck, P, Peeling Layers of Space Out of Thin Air: Augmented Reality, Architecture OMI, 2011, retrieved 11 April 2013, <http://www.artomi.org/program.php?Architecture-OMI-8>. Goldman, D, Google unveils ‘Project Glass’ virtual-reality glasses, CNNMoney, 2012, retrieved 28 March 2013, <http://money.cnn.com/2012/04/04/technology/google-project-glass/?source=cnn_bin>. junaio, Become a junaio developer!, metaio Inc., 2013, retrieved 1 April 2013, <http://www.junaio.com/develop/>. junaio, Frequently Asked Questions, metaio Inc., 2013, retrieved 1 April 2013, <http://www.junaio.com/faq/>.
Knight, W, A Display That Makes Interactive 3-D Seem Mind-Bogglingly Real , MIT Technology Review, 2012, retrieved 24 February 2013, <http://www.technologyreview.com/view/508991/a-displaythat-makes-interactive-3-d-seem-mind-bogglingly-real/>. Layar, Layar SDK , Layar, 2013, retrieved 1 April 2013, <http://www.layar.com/sdk/>. Layar, What is Layar? , Layar, 2013, retrieved 1 April 2013, <http://www.layar.com/what-is-layar/>. Lehman, M, 5 Reasons Augmented Reality is Good for Architecture , Sensing Architecture, 2009, retrieved 18 October 2012, <http://sensingarchitecture.com/1281/5-reasons-augmented-realityis-good-for-architecture/>.
Lehman, M, Can Interactive 3D Modeling Change the Way You Design? , Sensing Architecture, 2013, retrieved 17 February 2013, <http://sensingarchitecture.com/9067/can-interactive-3d-modelingchange-the-way-you-design/>. Mack, E, Confirmed: Google Glass arrives in 2013, and under $1,500 , CNET, 2013, retrieved 25 February 2013, <http://news.cnet.com/8301-17938_105-57570779-1/confirmed-google-glassarrives-in-2013-and-under-$1500/>. Matsuda, K, Augmented (hyper) Reality: Augmented City , KEIICHI MATSUDA, 2011, retrieved 14 October 2012, <http://keiichimatsuda.com/augmentedcity.php>. Matyszczyk, C, Here’s who can’t wear Google Glass: People who wear glasses , CNET, 2013, retrieved 28 March 2013, <http://news.cnet.com/8301-17852_3-57573657-71/heres-who-cant-weargoogle-glass-people-who-wear-glasses/>. Medina, S, Architects Do Augmented-Reality , Architizer, 2011 , retrieved 18 October 2012, <http:// www.architizer.com/en_us/blog/dyn/24498/architects-do-augmented-reality/>. Medina, S, Real Talk on Augmented Reality , Architizer, 2011, retrieved 18 October 2012, <http:// www.architizer.com/en_us/blog/dyn/23422/real-talk-on-augmented-reality/#.UVQ1QxyeN8H>. metaio, Augmented Reality Factsheet, brought to you by metaio , metaio GmbH, 2013, retrieved 1 April 2013, <http://www.metaio.com/company/>. metaio, Augmented Reality SDK for App Development , metaio GmbH, 2013, retrieved 1 April 2013, <http://www.metaio.com/products/sdk/>. Newman, J, Google’s ‘Project Glass’ Teases Augmented Reality Glasses , PCWorld, 2012, retrieved 28 March 2013, <http://www.pcworld.com/article/253200/googles_project_glass_teases_ augmented_reality_glasses.html> Perrin, W, Hyperlocal websites and augmented reality – hypARlocal , talk about local, 2012, retrieved 14 March 2013, <http://talkaboutlocal.org.uk/hyperlocal-websites-and-augmented-realityhyparlocal/>. 
Ray, Greg Tran - “Mediating Mediums: The Digital 3D”, CORE77 design magazine & resource, 2011, retrieved 13 November 2012, <http://www.core77.com/blog/architecture/greg_tran_-_mediating_mediums_the_digital_3d_20442.asp>. Silbey, M, Make your own reality with Metaio’s augmented reality software, smartplanet, 2012, retrieved 5 October 2012, <http://www.smartplanet.com/blog/thinking-tech/make-your-own-reality-with-metaios-augmented-reality-software/11573>. Trenholm, R, Google Glass spotted at MWC, CNET, 2013, retrieved 28 March 2013, <http://reviews.cnet.com/8301-13970_7-57571749-78/google-glass-spotted-at-mwc/>. Ulanoff, L, Hands on With Leap Motion’s Controller, Mashable, 2013, retrieved 25 March 2013, <http://mashable.com/2013/03/09/hands-on-with-leap-motion/>. Urbanist, Surround Screen: ‘Illumiroom’ Immersive Gaming Projection, Web Urbanist, 2013, retrieved 27 March 2013, <http://weburbanist.com/2013/01/10/surround-screen-illumiroom-immersivegaming-projection/>. Webb, M, Vuzix display Wrap 920AR augmented reality glasses, Gizmag, 2010, retrieved 15 August 2010, <http://www.gizmag.com/vuzix-wrap-920ar-augmented-reality-glasses/13847/>. Weisstein, E, Dimension, MathWorld--A Wolfram Web Resource, 2013, retrieved 9 April 2013, <http://mathworld.wolfram.com/Dimension.html>. Wikitude, Wikitude SDK Documentation for Android, Liferay, 2013, retrieved 27 March 2013, <http://forum.wikitude.com/documentation?p_p_id=2_WAR_knowledgebaseportlet&p_p_lifecycle=0&p_p_state=normal&p_p_mode=view&p_p_col_id=column-2&p_p_col_count=1&_2_WAR_knowledgebaseportlet_mvcPath=%2Fdisplay%2Fview_article.jsp&_2_WAR_knowledgebaseportlet_resourcePrimKey=129086>. Wikitude, Working with the Augmented Reality SDK from Wikitude, Wikitude GmbH, 2013, retrieved 27 March 2013, <http://www.wikitude.com/developer>.
Films and Videos: Augmented Reality Emerging Technology - See How works [online video], TheTechindia, 2013, retrieved 21 March 2013, <http://www.youtube.com/watch?v=RB8QStA8HRI>. Golden Age - Somewhere [online video], Paul Nicholls, 2012, retrieved 5 August 2012, <https:// vimeo.com/25678978>. How It Feels [through Glass] [online video], Google, 2013, retrieved 25 February 2013, <http:// www.youtube.com/watch?v=v1uyQZNg2vE>. How To Create Augmented Reality Apps [online video], gigafide, 2009, retrieved 5 October 2012, <http://www.youtube.com/watch?v=jU6PcBS1pWw>. Introducing the Worlds First Augmented Reality Chipset - Metaio & ST-Ericsson [online video], metaioAR, 2013, retrieved 28 February 2013, <http://www.youtube.com/watch?v=6br7NreTwD4>. Metaio’s Augmented City 2012 [online video], P. Meier, metaioAR, 2012, retrieved 21 March 2013, <http://www.youtube.com/watch?v=LGXiRzN2X_g>. The Future of Augmented Reality [online video], hiddencreative, 2010, retrieved 14 October 2012, <http://www.youtube.com/watch?v=tnRJaHZH9lo&feature=related>. “The World in 2030” by Dr. Michio Kaku [online video] , CUNYQueensborough, 2009, retrieved 29 September 2012, <http://www.youtube.com/watch?v=219YybX66MY&feature=related>.
Images: Image: Author’s own version of the mixed reality continuum as originally defined by Milgram and Kishino, 1994. Image: Author’s image, taken at Augmented Planet. Image: Immersive AR history, author’s supporting design project The Augmented Carnival. Image: Google Glasses [online image], Jailbreak Nation, 2013, retrieved 9 April 2013, <http://jailbreaknation.net/wp-content/uploads/2013/03/Google-Glasses-2.jpg>. Image: Google glass prototype [online image], CNET, 2013, retrieved 9 April 2013, <http://asset1.cbsistatic.com/cnwk.1d/i/tim/2013/02/22/Google-Glass-photo_610x179.jpg>. Image: The Future of Augmented Reality [online video], hiddencreative, 2010, retrieved 14 October 2012, <http://www.youtube.com/watch?v=tnRJaHZH9lo&feature=related>. Image A: Leap Motion in Use [online image], <http://thetechblock.com/leap-3d-motion-controlsystem-puts-wii-kinect-to-shame-with-its-amazing-accuracy/>. Image B: Leap Motion in Use [online image], <http://allthingsd.com/intromessage/>. Image: Z-Space [online image], <http://clustr.com/images/sized/images/news/687/zspace_00600x338.jpg>. Image: Microsoft’s ‘IllumiRoom’ [online image], Web Urbanist, 2013, retrieved 27 March 2013, <http://img.weburbanist.com/wp-content/uploads/2013/01/illumiroom-surround-systemprojection.jpg>. Image: Augmented Reality: Peeling Layers of Space Out of Thin Air [online image], Architecture Omi, 2011, retrieved 12 January 2013, <http://www.artomi.org/cms/uploads/augmented-realitybrochure.pdf>. Image A: Spitalfields AR School East London, Author’s own design work. Image B: Spitalfields AR School East London, Author’s own design work. Image: AR Carnival, Author’s own design work. Image: Interacting data sets AR Carnival, Author’s own design work. Images: AR Carnival, Author’s own design work. Image below: AR Carnival, Author’s own design work. Image below: AR Carnival Chronogram Development, Author’s own design work.