INTERACTIVE SPATIAL NARRATIVES IN AUGMENTED REALITY / VIRTUAL REALITY


A research project presented to the University of Florida Graduate School of Architecture in partial fulfillment of the requirements for the degree of Master of Architecture. University of Florida, Spring 2019.

INTERACTIVE SPATIAL NARRATIVES IN AUGMENTED REALITY / VIRTUAL REALITY

Armando Urenda-Velazquez
Trevor Halcomb

Faculty Advisors: Lee-Su Huang, Stephen Belton, Lisa Huang




INTERACTIVE SPATIAL NARRATIVES IN AUGMENTED REALITY / VIRTUAL REALITY

Armando Urenda-Velazquez
Trevor Halcomb

Chair: Lee-Su Huang
Co-Chair: Stephen Belton
Co-Chair: Lisa Huang

A research project presented to the University of Florida Graduate School of Architecture in partial fulfillment of the requirements for the degree of Master of Architecture. University of Florida, Spring 2019.


©2019 Armando Urenda-Velazquez / Trevor Halcomb. All Rights Reserved.



TABLE OF CONTENTS

ACKNOWLEDGMENTS
ABSTRACT
CHAPTER 1  AR IN ARCHITECTURE
    History
    Current Applications in Architecture
CHAPTER 2  HARDWARE / SOFTWARE
    HMD
    Peripherals
    Software / Engine
    Computer Specs
CHAPTER 3  PRELIMINARY CONCEPTS
    Initial Scenario / Elements
    Obelisk / Gallery
    Physical Pieces
    Midterm Scenario
    Conceptual Scenarios
CHAPTER 4  SPATIAL NARRATIVE
    Obelisk
    Scenario #1 - Infinite Plane
    Scenario #2 - Room Pivot
    Scenario #3 - Lantern Fog
    Scenario #4 - Surrealist Paintings
CONCLUSION

APPENDIX A  IMAGE LIST
APPENDIX B  GLOSSARY
APPENDIX C  INSTALLATION
CODING REFERENCES


ACKNOWLEDGMENT By: Armando Urenda

I want to give thanks to all of my professors who have supported my crazy ideas since the beginning of my education, especially my chair, Lee-Su Huang. His understanding and supportive attitude truly made me grow as a person and as a designer. I would not be the person I am today without his help. I would also like to express my gratitude to my co-chair, Stephen Belton, for supporting my endeavors along the way even though this was not his area of expertise; his advice became some of the most helpful and productive along this journey. I could not have made it to this point alone; therefore, I would like to thank my mom and dad and all my friends who were there for me along this crazy journey. I could not be more thankful for getting to know all of you.



ACKNOWLEDGMENT By: Trevor Halcomb

I have never thought of myself as a conventional architect, and it has taken me up to this point to realize that this characteristic is a strength. I would like to give my thanks to Professor Lee-Su Huang, who has always seen this strength in me and has continued to help and mentor me along this path. To my classmates, without whom I would not have been able to make it this far. To my mentors, Scott Krehl and John Maze, thank you for supporting me from the very beginning. To Brittany, for always being there for me. Lastly, my father Jeff, who has always believed in me and has pushed me to be the best man that I can be. I would like to dedicate this thesis to those who are unsure of the path they are on; to those, I say do not feel discouraged when you do not fit the mold of an architect. It simply means you make your own.



ABSTRACT

The development of Virtual Reality / Augmented Reality (VR/AR) has not yet been explored to its fullest potential within architecture. There are three main ways in which we plan to explore VR/AR. Firstly, we plan to explore the potential of bringing digital objects into the physical world and how these objects could be used within architecture. Secondly, we would like to explore the potential of spatial manipulation with the use of AR. Lastly, we would like to explore the medium as a design tool in an attempt to push the boundaries of conventional architecture. Exploring the potential of bringing virtual objects into the physical world is by far one of our main focuses. We believe that integrating hand tracking with virtual reality would give the user the ability to interact with the virtual world in a way that has never been done before. The potential of using this device within architecture could be revolutionary, as it would allow the user to manipulate objects directly.



In addition to exploring the potential of manipulating virtual elements with your hands in the physical world, we are also planning to explore the potential of spatial manipulation with the use of AR. This allowed us to create a series of surreal experiences for an individual within a given space. Beyond these surreal experiences, we are exploring ways in which we could merge physical and digital elements within a spatial condition. An example of this would be to map digital elements onto physical elements, allowing the user to change the texture or color of the physical element with the help of the digital element. Finally, through these experiences we are able to demonstrate, in an extreme way, how this medium could change the way architects perceive spatial elements and shape experiences. These experiences are no longer limited to pen and paper; they can be extended into the virtual world through AR, allowing architects to shape not only physical space but the future of digital space.





Chapter 1

History

Pushing the boundaries of Virtual Reality (VR) was our main goal in this research project. Understanding the history of VR/AR was a crucial step in conceptualizing the potential limitations and capabilities of this technology.

Figure 1-0. Disney Quest



HISTORY OF VR/AR

Over the past several years the concepts behind Virtual Reality (VR) have become some of the most talked-about subjects in our modern world. The desire to escape the boundaries of our real world has started to impact most of our industries, including the military, the medical field, and architecture. Due to new applications and quality standards, many have called VR the revolutionary idea of the 21st century. What many do not realize is that this desire to escape the boundaries and limitations of our reality has been around for at least 541 years. In 1478 Francesco di Giorgio Martini, an artist and designer, designed a room that would allow a person to escape reality. This room is known as the Studiolo from the Ducal Palace (Figure 1-1 and Figure 1-2). The success of Giorgio Martini's idea was recorded in a letter written by Niccolò Machiavelli to a friend. He described the studio as a space that would transport

Figure 1-1. Studiolo from the Ducal palace door view


an individual to another reality. Machiavelli said, "I forget the world, remember no vexations, fear poverty no more, tremble no more at death: I pass into their world."1 This quote clearly shows that the idea of a space that would allow an individual to escape the boundaries of reality has been around for quite some time. Despite the technological limitations of his era, Giorgio was able to translate the idea into a physical room. Unfortunately, people at this time did not have the technology to develop anything close to a VR headset. It took another 360 years for humanity to produce the first stereoscopic photos, which led to the first headset.

Figure 1-2. Studiolo from the Ducal palace door view

1 National Gallery of Art. "A Room of One's Own: The Studiolo." Italian Renaissance Learning Resources. 2019. Accessed May 07, 2019. http://www.italianrenaissanceresources.com/units/unit-4/essays/a-room-of-ones-own-the-studiolo/.


Figure 1-3. Charles Wheatstone

In 1838 an English physicist and inventor, Charles Wheatstone (Figure 1-3), was researching whether the brain could process two different images at the same time. He developed the very first device that would allow a human to see two different images at the same time, and called this device the Stereoscope (Figure 1-4 and Figure 1-5).

He found that, "Viewing two images or photos through a stereoscope gave the user a sense of depth and immersion."2 Unfortunately, his first invention was not very practical. Still, his research gave birth to the idea of Head Mounted Displays (HMDs). It was not until a later iteration that he was able to create the very first commercial HMD (Figure 1-6) for viewing images.

Figure 1-4. Wheatstone Stereoscope

Figure 1-5. Reflecting Stereoscope
Figure 1-6. Commercial Stereoscope

2 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.



Figure 1-7. Edward Link

Figure 1-8. The Link Trainer

After the development of the very first HMD, the world lost most of its interest in this technology due to it being marketed as a toy. It wasn't until the end of World War I that the United States government expressed a desire to push this technology further. The primary motivation was to develop a flight simulator to train airplane pilots for a future war. In 1929 Edward Link (Figure 1-7) managed to develop the very first flight simulator, called The Link Trainer (Figure 1-8). This simulator tried to recreate as accurately as possible the experience that a pilot might go through while learning to fly a plane. He accomplished this by adding "...two motors that linked to the rudder and steering column to modify the pitch and roll. A small motor-driven device mimicked turbulence and disturbances.... the realism of this

Figure 1-9. Pygmalion’s Spectacles


simulator helped in the training of over 500,000 pilots."3 The realism of The Link Trainer sparked the interest of science fiction writers. Shortly after its release to the public, Stanley G. Weinbaum wrote Pygmalion's Spectacles (Figure 1-9), which "...contains the idea of a pair of goggles that let the wearer experience a fictional world through holographic, smell, taste and touch."4 This was the very first time virtual reality, as we know it, was conceptualized. One year after the publication of Pygmalion's Spectacles, the very first practical glasses/headset was mass-produced and released to the market. This device was known as the View Master (Figure 1-10). The View Master was used by kids as a toy, for virtual tourism, and in commercial applications.

Figure 1-10. View Master

3 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.
4 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.


The 1950s were a fascinating period for the development of virtual reality. Inventor Morton Heilig (Figure 1-11) pushed the concepts of virtual reality to their limits. During his research, Heilig developed The Sensorama (Figure 1-11, Figure 1-12). The Sensorama was a simulator that explored the idea of 5D by "...[featuring] stereo speakers, a stereoscopic 3D display, fans, smell generators, and a vibrating chair. The Sensorama was intended to fully immerse the individual in the film."5

Ten years after Morton Heilig's first invention, he developed the very first wearable HMD, named the Stereoscopic-Television Apparatus for Individual Use (Figure 1-13). According to the patent paper's drawings, this first wearable headset is nearly identical to the ones used in 2012.

Figure 1-11. Morton Heilig / Sensorama interview

Figure 1-12. The Sensorama Patent Papers

Figure 1-13. Patent Paper of First HMD

5 “History Of Virtual Reality.” Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html



In 1961, one year after the development of the very first HMD, the military funded a redesign of Morton Heilig's original HMD. This redesign included a magnetic motion tracking system, which allowed the user to look around the image that they were seeing in the headset, providing a 360-degree experience. The redesign allowed military personnel to monitor dangerous situations via a camera.6

The development by the military inspired Ivan Sutherland (Figure 1-14), a computer scientist and professor at the University of Utah, to develop a headset that would be fully immersive. He stated, "The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked."7 Shortly after making this statement public, Sutherland began working on a project that would strive to achieve all of the ideas in the statement above. This project later became known as The Sword of Damocles (Figure 1-15). Sutherland and his student Bob Sproull created "...the first VR/AR head mounted display that was connected to a computer and not a camera."8 As you can see from Figure 1-15, this new headset was nothing like the original. Sutherland's design was much bigger and more cumbersome, mainly because this was his first iteration and he was focused on making the concept work.

Figure 1-15. Bob Sproull testing Sword Damocles

Even though Sutherland managed to make his concept of a VR headset a reality, he was not able to display anything more than red lines. It took him another year after completion to find someone to develop something useful for the headset. A computer artist known as Myron Krueger "...developed a series of experiences which he termed 'artificial reality' in which he developed computer-generated environments that responded to the people in it. The projects named GLOWFLOW, METAPLAY, and PSYCHIC SPACE..."9 In other words, video games for VR.

Figure 1-14. Ivan Sutherland


6 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.
7 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.
8 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.
9 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.


1987 was the year in which VR development started to focus on portable VR systems; practically speaking, VR sets became smaller and more nimble. Jaron Lanier, the founder of Visual Programming Lab (VPL) (Figure 1-16), put his company in charge of researching and developing products that would bridge the gap between the virtual world and reality (Figure 1-17).

Figure 1-16. Jaron Lanier

From 1990 through 2000 there was a loss of interest in the development of VR. This was mainly caused by Nintendo, SONY, and SEGA, some of the most well-known video game companies of the time. Their poorly designed VR headsets managed to give everyone vertigo and headaches. This was especially true of Nintendo's Virtual Boy (Figure 1-18), which was only capable of displaying the color red, as you can see in Figure 1-19. This gave the technology a terrible reputation in the market.

Figure 1-18. Virtual Boy

Figure 1-19. Innsmouth No Yakata

The Virtual Reality Society notes, "Jaron developed a range of virtual reality gear including the Data glove (along with Tom Zimmerman) and the EyePhone head-mounted display. They were the first company to sell Virtual Reality goggles (EyePhone 1 $9400; EyePhone HRX $49,000) and gloves ($9000)."10 The development of these devices was one of the most important achievements in the history of VR, as they were the proof of concept people needed to start developing VR headsets for home use.

It wasn't until 1998, when Disney World launched Disney Quest (Figure 1-20), that interest in VR technology started to regain its popularity. For the very first time a company managed to deliver a delightful experience to millions of users who had never heard of or experienced VR before. As you can see from Figure 1-20, Disney Quest was a very popular attraction in the early 2000s thanks to its full-color display and improved graphics.

Figure 1-17. EyePhone and Data Glove

Figure 1-20. Disney Quest

10 "History Of Virtual Reality." Virtual Reality Society. 2017. Accessed May 06, 2019. https://www.vrs.org.uk/virtual-reality/history.html.



2012 was the year Oculus, a tech company in Silicon Valley, shocked the world by releasing the Oculus DK2 (Figure 1-21). With the release of this headset, Oculus fixed the issues of vertigo and headaches that many other manufacturers had yet to solve. They truly managed to create an immersive VR experience that was affordable to the public. After the release of the Oculus DK2, the market realized the potential of this technology, sparking rapid growth in the development of VR headsets. Shortly after, many different companies were trying to compete with Oculus, such as HTC, Google, and Microsoft. The rapid adoption of VR technology by consumers encouraged companies

Figure 1-21. Oculus DK2

to push the technology further. The Void was one of the first companies to advance VR by adapting and modifying the technology to create a full-body VR experience. As you can see in Figure 1-23, a player is able to interact with digital objects while walking around a room without being tethered to a computer. Soon after The Void accomplished wireless movement in VR, Microsoft developed one of the most accurate Augmented Reality (AR) headsets, capable of interacting with real-world objects, known as the Hololens (Figure 1-22).

Figure 1-22. Microsoft Hololens



Figure 1-23. The Void



Figure 1-24. Enscape Interface (Video Screenshot)

Current Applications in Architecture

As we took a closer look into how VR is influencing architecture, we quickly realized that the industry is using VR as a design tool, in education, and in real-world applications. It is crucial to understand how this technology is being implemented in order to identify the potential improvements we could create.

Enscape
One of the main uses for Enscape (Figure 1-24) in the industry is to bridge the gap between standardized BIM programs and gaming engines such as Unity or Unreal. The creators of Enscape accomplish this by making the program act like a gaming engine packaged as a plug-in. This means the designer does not have to learn a new program or know how to code. Unfortunately, there are major trade-offs for not using a gaming engine to develop a VR project. One of the main downsides is that the graphics are not as good as they could be. Designers are also limited by what the program's presets produce rather than having full control of the output.



IRIS VR
IRIS VR (Figure 1-25) is a program designed to help architects develop spatial conditions in an immersive manner. Even though this might seem like a good idea, designing and developing a project in an effective and timely manner is nearly impossible; one reason is that the controllers are not designed like a keyboard and are not 100% accurate. The designer will experience problems when working on demanding projects due to these limitations.

Figure 1-25. IRIS VR

VR
One of the most useful applications of VR in architecture is in studying and reviewing projects as they are being designed. This helps the architect better understand the spatial conditions they are designing. The use of VR in the design process allows architecture firms to present a much more complex project to a client, thereby saving time and money spent on renders, videos, and other presentations.

Figure 1-26. Revolutionizing School Design with VR

AR
Just like VR, AR is being used by architects to aid in the learning process and in understanding the building; the main difference is that AR focuses on projecting 3-dimensional objects into the real world, which means that the user remains grounded in reality. This is of great importance if you want to analyze a project and move around a room without bumping into objects.

Figure 1-27. AR



Figure 1-28. AR helping Construction Workers Build a Wall (Screenshot)

AR in Building Processes
One of the most well-known ways in which AR is being used in a practical manner within architecture is in helping construction workers build complex brick walls without having to place physical guide lines. The image above (Figure 1-28) gives an accurate representation of how the headset displays the digital objects within the job site. A worker on the University of Tasmania wall project stated that "...something as complex as this would have taken us two weeks yet it only took us 6 hours."11


11 Fologram. “Fologram Talks: Holographic Brickwork.” Vimeo. December 12, 2018. Accessed May 05, 2019. https://vimeo.com/305901280.


AR Collision Detection
AR is also being used live on construction sites (Figure 1-29). Workers are able to do live collision detection as they are in the process of constructing a building. This means that a contractor is able to have a much smoother construction process with fewer mistakes.

Figure 1-29. BIM Collision Detection

AR Construction Time Line
AR is also being used in the development of construction time-lines. This has made time-lines easier to understand and manage. Contractors are now able to obtain data from workers in the field and know with accuracy what products are needed.

Figure 1-30. Construction Time Line (Screenshot)

Site Mapping / Analysis
The concept of first-person drone mapping evolved from the popular sport of drone racing, in which pilots use a VR headset to view the drone's camera and control it with much higher accuracy. This has allowed workers in the design and construction industry to do building analysis and site mapping in a much more effective way, mainly because the VR headset makes the drone easier to fly. (See footnote 12 for video footage.)

Figure 1-31. Drone FPV

12 RC, Frédéric Dauch. YouTube. January 27, 2017. Accessed May 07, 2019. https://www.youtube.com/watch?v=QBoBIjHYBuU.





Chapter 2

Hardware / Software

Due to the increasing demand for Virtual Reality (VR) in many different industries, a multitude of companies have started developing their own hardware and software at an alarming rate. Unfortunately, not all the products being produced are of a high standard. The number of products on the market made it quite difficult to pick the perfect VR developer's kit; therefore, it was crucial to understand the capabilities and limitations of all of the hardware and software we could use. If we had chosen poorly, our research would have been critically limited by the hardware and software and not by our understanding of the subject matter.

Figure 2-0. AMD Ryzen Thread Ripper Die



Google Cardboard

Figure 2-1. Google Cardboard

Google Cardboard was the first product that came to mind due to its popularity and price. It seemed like a good option to start developing VR scenarios with a phone.

Price: $10

Pros:
• Low price.

Cons:
• There is no spatial tracking in the headset that would allow the user to walk around a room.
• Google Cardboard is limited to phones only.
• Phones are limited to video only.

Google Daydream VR
The second device we looked into was the Google Daydream VR, which is a better version of the Google Cardboard, mainly because it has an added controller.

Price: $100

Pros:
• It has a good starting price.

Cons:
• The controller can only be used like a TV remote.
• The headset is not capable of doing room-scale spatial tracking.

Figure 2-2. Google Daydream VR

Gear VR

Figure 2-3. GearVR


The Samsung Gear VR is a better version of the Google Daydream VR; it includes a controller, which allows the user to experience some spatial tracking.

Price: $130

Pros:
• The headset has a good starting price.

Cons:
• The Gear VR display is determined by the Samsung phone that is available.
• It requires purchasing an $800 Samsung phone.
• The headset is not capable of tracking 360 degrees.
• The added controller does not deliver the accuracy desired.


PlayStation VR

Figure 2-4. Play Station VR

The PlayStation VR's integrated display and tracking seemed like a much better option than phone-oriented headsets.

Price: $266

Pros:
• Good starting price for a developer kit.
• The display is integrated in the headset.

Cons:
• Low display resolution (1920x1080 HD, not 4K); users will see the pixels.
• The headset tracking is limited by what the cameras can see: only 180 degrees of tracking.
• It is much harder to develop for due to SONY.
• Not capable of doing AR.

Oculus Rift

Figure 2-5. Oculus Rift

The Oculus Rift was the very first developer's kit that we considered after realizing that the PlayStation VR was much harder to develop for due to SONY and its partial tracking.

Price: $399

Pros:
• Good price for a full developer's kit.
• Capable of doing 360-degree tracking.
• It is compatible with PC, making it easy to design with.

Cons:
• Display resolution could be better (1080x1200 per eye, not 4K); users will see the pixels.
• Room tracking could be better.
• Not capable of doing AR.

HTC VIVE
The HTC VIVE is a much better headset in comparison with the Oculus Rift. Its tracking is more accurate and reliable, and its display is better.

Figure 2-6. HTC VIVE

Price: $500

Pros:
• Good price for a full developer's kit.
• Capable of doing 360-degree tracking.
• Display resolution is much better, at 1080x1200 per eye for a total of 2160x1200.
• It is compatible with PC, making it easy to design with.

Cons:
• Not capable of doing AR.


Pimax 8k Series

Figure 2-7. Pimax 8k Series

The Pimax 8k series seemed to be a potential alternative to the HTC VIVE due to its new and improved ultra-wide display. After reviewing the headset we realized that it induces vertigo.

Price: $900

Pros:
• The display is very high resolution at 3840x2160 (4K).
• The display is ultra-wide for better viewing.
• It is compatible with PC, making it easy to design with.

Cons:
• The display has delay problems that can cause vertigo.
• It is not capable of doing AR.
• The headset does not come with sensors.

Lenovo Mirage AR Headset
The Lenovo Mirage AR headset was the very first AR headset we looked into as we realized we needed to experiment with the concepts of AR. Unfortunately, this headset is very limiting due to its reliance on a phone.

Price: $200

Pros:
• The price of getting started with AR is really good.

Cons:
• The headset can only hold a phone.
• Tracking is not very accurate.
• The headset does not come with sensors.
• It is not capable of doing VR.

Figure 2-8. Lenovo Mirage AR Headset

Microsoft Hololens

Figure 2-9. Microsoft Hololens


The Microsoft Hololens seemed to be a much better fit, since this headset has all the hardware integrated and is intended to be an AR developer kit. Unfortunately, it is very pricey and the graphics are not the best.

Price: $3,500

Pros:
• Capable of doing 360-degree tracking.
• There is a computer in the headset, so no external computer is needed.

Cons:
• The view that displays AR is very small.
• The price is too high for what you get.
• Graphics are limited by the onboard computer.
• Not capable of doing VR.


Magic Leap One

Figure 2-10. Magic Leap One

The Magic Leap One is another AR headset that is very similar to the Hololens. The difference is that this headset is smaller and has much better tracking as well as graphics.

Price: $2,295

Pros:
• Much better 360-degree tracking than the Hololens.
• There is a much more powerful computer attached to the headset compared to the Hololens.

Cons:
• The view that displays AR is very small.
• The price is too high for what you get.
• Not capable of doing VR.

HTC VIVE PRO (HMD Chosen)
The HTC VIVE PRO proved to be a much better option. The headset is capable of delivering the performance we needed, performing as a VR headset and an AR headset simultaneously.

Figure 2-11. HTC VIVE PRO

Price: $1,098

Pros:
• Capable of doing 360-degree tracking up to 20 feet.
• Capable of doing VR and AR.
• It has a much higher resolution than any other headset listed, at 2880x1600 combined.
• The tracking is much more accurate.
• It is compatible with PC, making it easy to design with.

Cons:
• The AR cameras are only 480p, which makes the pass-through image somewhat blurry.



Figure 2-12. Leap Motion

Peripherals

Leap Motion
The Leap Motion is an external device that can be used in conjunction with a VR or AR headset. This device captures the user's hand movements, allowing for interaction with 3D objects.

Price: $100

Pros:
• Capable of being used with VR and AR.
• The price for hand-motion-tracking hardware is one of the best.
• It gives the user a new understanding of VR/AR.

Cons:
• The hand input could be better.
• In order to use the Leap Motion, custom code must be used.
• Doing any updates will crash the program.


ZED MINI

Figure 2-13. ZED MINI

The ZED Mini is a pair of stereo pass-through cameras that mimic the eyes; it is intended to be used with any VR headset that is not capable of doing AR, giving it that functionality.

Price: $500

Pros:
• The camera quality is 720p.

Cons:
• The price is high for what the product is able to do.
• You need a modified gaming engine to be able to work with it.
• Any update will break the code.
• Peripheral vision is limited.

Plexus Feedback Glove
The Plexus Feedback Glove is very similar to the Leap Motion in the sense that it takes the data from your hands and displays it in the virtual world, allowing the user to interact with a digital object. The difference is that the Plexus also sends feedback to the user's hands, giving the user haptic feedback from the 3D object.

Figure 2-14. Plexus Haptic Feedback Glove

Price: $250

Pros:
• It has two-way feedback, allowing a user to have a haptic response from digital objects.

Cons:
• There is no temperature feedback.

HTC VIVE Tracker
The HTC VIVE Tracker allows the user to add additional tracking points that can be mapped to real-world objects, allowing them to become unconventional controllers.

Price: $99

Pros:
• Being able to track regular objects.

Cons:
• The tracker battery life is only rated for about 2 hours.
• Somewhat expensive, considering that a regular controller is $130.

Figure 2-15. HTC VIVE Tracker



Figure 2-16. UNITY (Screenshot)

Software

Understanding the potential capabilities of a gaming engine was a crucial step in the development of our project. We needed to select the software that would grant us the freedom to push our concepts of VR/AR to their limits; not doing so would have kept us from reaching our full potential.

Unity
Unity is one of the most well-known gaming engines among developers due to its simplicity with coding. The freedom it gives to coding experts is one of its most desirable aspects.

Price: Free

Pros:
• It is free.
• It is easy to code for.
• Its popularity makes it easy to learn, as there is a lot of information available.


Cons:
• It is hard to do renderings.
• Everything has to be coded.
• The program is not finished, and there are some plug-ins that you must pay for.
• In order to get materials you have to pay for them.
• It is not compatible with architecture programs.


Figure 2-17. UNREAL ENGINE 4 (Screenshot)

Unreal Engine 4
Unreal Engine 4 is another well-known gaming engine that is very popular with designers due to its implementation of visual scripting instead of coding.

Price: Free

Pros:
• It is free.
• Visual scripting is easier to learn.
• Its popularity makes it easy to learn, as there is a lot of information available.
• Compatible with architecture programs.
• Very fast video and photo rendering.
• Most materials are free.
• First gaming engine that supported VR.

Cons:
• Coding in C++ is much harder.
• When using external plug-ins the engine can crash.



Computer Hardware

To aid in the development of our research, we had to have a very specific computer capable of running multiple virtual reality projects at once; the list that follows gives the minimum recommended specifications for anyone interested in the development of VR.

Figure 2-18. Gaming PC

AMD Ryzen Threadripper 2920X
Price: $600
Choose a CPU similar to the Threadripper, with a minimum of 8 cores and 16 threads running at 3.0-4.0 GHz; anything lower than this will start to have problems running all of the programs simultaneously.

Figure 2-19. AMD Ryzen Threadripper 2920X

ROG ZENITH EXTREME
Price: $600
The ROG ZENITH EXTREME was our motherboard of choice due to its reliability and compatibility with our CPU. In addition, we needed a motherboard capable of running multiple M.2 hard drives. This is highly recommended due to the very large files that are created when developing VR projects.

Figure 2-20. ROG ZENITH EXTREME



DDR4 VENGEANCE RAM
Price: $150
For RAM, we decided to use DDR4 VENGEANCE RAM due to its reliability when paired with multiple RAM sticks. This is a crucial choice, since the bare minimum requirement for running a VR project is 16 GB of RAM. We recommend installing 32 GB or more depending on the project type.

Figure 2-21. DDR4 VENGEANCE RAM

M.2 Samsung 970 EVO
Price: $150
We decided to use a 500 GB M.2 Samsung 970 EVO hard drive due to its performance. This drive is capable of delivering transfer speeds of over 1 GB/s; it is essential to use something equivalent to or better than this, since most of the files that are created are over 60 GB each. Using a traditional hard drive will slow down the development of any project quite significantly.

Figure 2-22. M.2 Samsung 970 EVO

NVIDIA GTX 1080
Price: $500-800
The NVIDIA GTX 1080 was our graphics card of choice. Its reliability and performance are some of the best compared to other graphics cards. Due to the resolution of the VR headset and the other displays that will be running at the same time, a similar or better graphics card is required.

Figure 2-23. NVIDIA GTX 1080





Chapter 3

Preliminary Concepts

The initial concept of our research was to develop scenarios expressing an interaction between the physical and the digital. The first physical construction resulted in a 1:1 scale door mock-up. This physical doorway allowed us to test many digital-to-physical applications. These applications were then translated into four experiential scenarios based on the video game Myst and other digital puzzle games. These scenarios focused heavily on the interaction between the physical and digital. Numerous physical pieces were constructed to supplement the digital modeling and coding that was being implemented. One of these designs was refined and became the basis from which the Midterm presentation was developed. Following the Midterm review, a period of refining and refocusing began. Understanding the experiential goals of each of the spatial narratives ultimately became the design focus while developing conceptual ideas. These ideas came in the form of conceptual charrettes, each focused on a single specific architectural or surrealist concept. The most successful of these charrettes were then refined and presented as a final presentation.

Figure 3-1. Preliminary Applications



Figure 3-2.1 Sand Ruin 1

Figure 3-2.2 Sand Ruin 2

Figure 3-2. Air Scenario

Air Scenario
The Air Scenario was conceived around the air element and consisted, conceptually, of a place in which the wearer would use air to solve the scenario's puzzle. The puzzle consisted of two main parts: the sand and the ruins. Each scenario was broken down into three phases: Phase 0, Phase 1, and Phase 2. Phase 0 is the default starting phase, where the wearer is initially presented with the scenario. In this phase the user has to start the scenario, which involves interacting with the Obelisk. Upon starting the scenario, the physical object housing the interactive portion of this scenario would be covered with sand, and it is up to the wearer to find a way to remove the sand. Phase 1 consists of moving the corresponding Obelisk stone into the physical object, removing the sand with a gust of air. This would reveal the lower half of the physical object and allow continuation on to Phase 2. This phase was yet to be realized and would have involved finally solving the scenario.



Figure 3-3.1 Sunken Ship 1

Figure 3-3.2 Sunken Ship 2

Figure 3-3. Water Scenario

Water Scenario
The Water Scenario was to be located at the threshold between the gallery's main space and its entry space. This threshold created a natural doorway that we could use digitally and physically. Phase 0 of this scenario involves activating the Obelisk, which creates a doorway on the threshold. This doorway would house a portal which depicted a vast ocean on both sides of the surface. Phase 1 involved the wearer interacting with a physical lever, which would then flood the gallery with water coming from the portal. This water would float a digital piece of the puzzle that was tied to a physical piece in the real world. The final phase of the scenario required a piece of the Obelisk to be inserted into the digital object now floating on the surface of the water. This physical/digital object was represented as a broken ship that had crashed ashore.



Figure 3-4.1 Broken Rocks

Figure 3-4.2 Floating Rocks

Figure 3-4. Earth Scenario

Earth Scenario
The Earth Scenario is the furthest-developed scenario and was the one presented during the Midterm Review. It was chosen because it was deemed the most involved and difficult of the scenarios, which would allow us to resolve problems that might be present in the remaining three. After the activation of the scenario in Phase 0, a corner of the gallery's two walls would turn to stone. These stone walls were the basis of this scenario and the physical interaction amongst its pieces. Phase 1 involved the wearer placing a piece of the Obelisk within the first physical structure, thus removing some of the stone walls, which then formed into floating pillars next to the wearer. Phase 2 further implemented the physical/digital interaction by having digital pieces overlap a physical object, allowing the wearer to complete a sequence puzzle on the face of one of the stone pillars. Completing this puzzle sequence finished the scenario and offered a look into the surreal world beyond the stone walls.



Figure 3-5.1 Train Boiler

Figure 3-5.2 Foundry

Figure 3-5. Fire Scenario

Fire Scenario
The Fire Scenario is the least actualized, and because of this the majority of the scenario was still in its conceptual stages. The scenario would have revolved around the idea of the fire element and what forms of physical/digital interaction could stem from this concept. Having the wearer interact with fire, such as through a train's boiler room or a foundry in which they were creating something, were the most agreed-upon conceptual directions. Having the wearer move mechanical pieces, such as gantries, that would direct molten metal in certain directions was the conceptual starting point.





Figure 3-6. Scenario Location / Obelisk



Figure 3-7. Phase 2 Interaction



Main Board
The main board of Phase 2 consists of a solder board and an Arduino Micro. These two served as the motherboard onto which the entire scenario was programmed, allowing the wiring to be centrally located within the structure.

Figure 3-8. Main Board

Power and Communication
One of the most difficult requirements of the scenario was that it needed to be portable; this was solved with quick disconnects connecting the main board to the Phase 1 piece and the Obelisk.

Figure 3-9. Power Distribution

Arduino
The initial programming was done with the Arduino Mega board. This board was later changed to the Arduino Micro due to how each board interacts with the Unreal Engine software.

Figure 3-10. Arduino Mega / Micro



Figure 3-10.1 Arduino Code



Arduino Code
The Arduino code created for the midterm scenario associates a button press on the physical panel with a corresponding keystroke sent to the Unreal Engine. The code binds these two so that when the wearer presses a button on the physical panel, a keystroke message is sent to the Unreal Engine, thus progressing the scenario.
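A minimal sketch of this binding, written for the Arduino Micro's native USB keyboard library and assuming a single panel button on pin 2 bound to the key '1' (the actual pins and key bindings used in the installation are not documented here):

    #include <Keyboard.h>  // native USB HID keyboard on the Arduino Micro (ATmega32U4)

    const int BUTTON_PIN = 2;    // hypothetical input pin for one panel button
    const char BOUND_KEY = '1';  // hypothetical key read by Unreal's input bindings
    bool wasPressed = false;

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);  // button pulls the pin LOW when pressed
      Keyboard.begin();
    }

    void loop() {
      bool isPressed = (digitalRead(BUTTON_PIN) == LOW);
      if (isPressed && !wasPressed) {
        Keyboard.press(BOUND_KEY);    // key down: Unreal receives the bound event
      } else if (!isPressed && wasPressed) {
        Keyboard.release(BOUND_KEY);  // key up
      }
      wasPressed = isPressed;
      delay(10);  // crude debounce
    }

On the Unreal side, the same key is simply bound to an input event that advances the scenario's phase logic.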



Figure 3-11. Obelisk and Phase 1 Piece



Phase 1 Wiring
Phase 1 consists of placing the Obelisk's relic within the secondary physical structure. A housing for the relic was 3D printed and placed within the structure, attached to a physical button. When the relic was placed within the housing, this button was triggered, sending a signal back to the main board within the Phase 2 structure.

Figure 3-12. Solder Board / Obelisk Piece

Obelisk Time Relic
The first relic to be created was the time relic. This relic would handle any portion of a scenario that required time to solve. The relic depicted an hourglass that was hollow, allowing an LED to light up when the relic was activated.

Figure 3-13. Obelisk Time Relic Front

Time Relic Shape / Structure
Each relic had a specific ring on the backside of the piece. Each scenario could use different relics, but by varying the location of the ring we could make sure that only certain relics worked in certain relic housings. This created a keying system that limited the wearer, so that only the correct relic would solve each scenario.

Figure 3-14. Obelisk Time Relic Back



Obelisk Phase 0
The Obelisk was the hub for all of the scenarios. It housed all of the Phase 0 buttons, which allowed for the start of all of the scenarios. Each Phase 0 scenario button was a unique button depicting one of the four elements.

Figure 3-15. Obelisk Panel

Figure 3-16. 3D Prints / Tests



3D Printed Buttons
All of the physical buttons within the scenarios were 3D printed through various test prints. These buttons were eventually designed around a prefabricated button that would mesh with the face of the 3D-printed housing. These were then wired and attached to LEDs that would signal whether the button press was correct or not.

Figure 3-17. Physical Buttons

Initial Keyboard
In order for the physical button signal to be sent to the Unreal Engine software, a keystroke was needed behind each of the buttons. Initially this was done with a modified keyboard, but that proved much too fragile. It was later replaced by the Arduino Micro, which could, through the use of specific firmware, act as a keyboard.

Figure 3-18. Test Keyboard

Figure 3-19. Keyboard Layout



Figure 3-20. Earth Scenario Concept

Midterm Scenario
Earth Scenario
The finalized Earth Scenario for the Midterm Review consisted of phases in which the wearer would interact with physical pieces overlapped with digital sections. This interaction between the digital and the physical allowed the wearer to complete the scenario, which involved using the corner of the gallery to create and manipulate stone structures. These structures would move and adapt to the inputs from both the Phase 1 and Phase 2 structures.



Phase 1 / Phase 2 Puzzle
The Phase 1 puzzle consisted of a time relic being placed within the first structure. This would crumble the surrounding stone walls and merge the broken stones into the floating pillars. These pillars would overlap with the physical piece in Phase 2, creating a sequence puzzle that the wearer needed to solve in order to complete the scenario.

Figure 3-21. Physical Interactions
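A minimal sketch of a sequence check of this kind, assuming numbered buttons and a reset on any wrong press (illustrative logic, not the project's actual Blueprint graph):

    // Call once per button press; returns true when the full sequence
    // has been entered in order. A wrong press resets progress to zero.
    bool CheckPress(int pressed, const int* sequence, int length, int& progress) {
        progress = (pressed == sequence[progress]) ? progress + 1 : 0;
        return progress == length;
    }

    // Example: a hypothetical four-step sequence on the pillar face.
    // static const int kSequence[] = {2, 0, 3, 1};
    // bool solved = CheckPress(buttonIndex, kSequence, 4, progress);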

Phase 1 Reveal
When the first phase of the puzzle was completed and the walls began to crumble, we wanted to create depth where the walls once were, so we created a series of scenes beyond the walls. These scenes would later influence how the final scenes were created.

Figure 3-22. Phase 1 Reveal

Phase 2 Reveal
The final phase would reveal an even more surreal diorama, depicting a cave scene much larger than the first phase reveal.

Figure 3-23. Phase 2 Reveal



Figure 3-24. Infinite Plane Scenario

Conceptual Scenario #1
Infinite Plane
Inspired by M.C. Escher's conceptual drawings regarding infinity and perception, this scenario explores the concept of an infinite plane. The focus of this scenario is to transport the user to a dimension which is physically impossible in the real world and impossible within conventional architecture. This is the first scenario that deals exclusively with VR as opposed to AR, due to the technical limitations of the software's rendering distance. This exclusive use of VR allowed the scenario to be a total shift from real to virtual.



Infinite Plane
The conceptual idea of the infinite plane involves a disconnect from preconceived notions of space. Space is understood on a scalar level relative to our bodies. On an infinite plane, scale is irrelevant, as the plane reads at the same scale regardless of view. Using this concept allows the scenario to remove any semblance of scale for the user, resulting in an experience that is vastly different from what our eyes are used to understanding.

Figure 3-25. Infinite Plane Scenario

Scalar Supplements
The infinite plane within the rendering software is very successful in removing the user's ability to perceive the scale of their surroundings. By placing objects, such as the mirrored sphere, that reflect the top and bottom planes of the infinite scenario, the user is able to better comprehend infinite depth as a concept. These objects reinforce not only the concept of infinity but also the truly incomprehensible scale of such a concept.

Figure 3-26. Infinite Plane Scenario

Portal Beyond
Perhaps the most difficult portion of this scenario is the change between the physical world and the virtual world. This switch is done through a digital portal, initially conceived as a vertical portal that the user would step up into. Upon hitting the digital portal, the software would switch from the pass-through camera to no camera, essentially rendering only the digital scene elements.

Figure 3-27. Infinite Plane Scenario



Figure 3-28. Focal Change Scenario

Conceptual Scenario #2
Focal Change
The initial concept behind the Focal Change scenario stems from the idea of obscuring vision of a space based on location. This obfuscation was intended to force the wearer to understand their location relative to a central pivot point, thus creating a means of knowing not only their surroundings but also the spatial characteristics of the room they are in.



Camera Focus
The mechanical portion of the scenario involves the shift between a maximum and a minimum focal length. This focal length is created from a large digital disc that has a central cutout. The disc is scaled up or down depending on the desired focal distance and positioned in front of the camera, creating the effect of reduced focus and peripheral vision.

Figure 3-29. Focal Change Camera

Focal Distance
This scenario involves the use of the Vive Tracker. The tracker's world position within the gallery is compared to the world position of the HMD, and the difference is calculated. This difference is branched against a set of intervals that scales the focal disc depending on the value. The bigger the difference (the further the HMD is from the Vive Tracker), the smaller the focus distance.

Figure 3-30. Focal Distance
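A minimal sketch of this interval mapping, written here as plain C++ rather than the Blueprint graph actually used; the interval breakpoints and scale values are illustrative assumptions:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Straight-line distance between the HMD and the Vive Tracker, in meters.
    float Distance(const Vec3& hmd, const Vec3& tracker) {
        float dx = hmd.x - tracker.x, dy = hmd.y - tracker.y, dz = hmd.z - tracker.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Larger distance -> smaller focal disc cutout (narrower field of view),
    // mirroring the branch-of-intervals approach described above.
    float FocalDiscScale(float distance) {
        if (distance < 1.0f) return 1.00f;  // near the tracker: full view
        if (distance < 2.0f) return 0.60f;
        if (distance < 3.0f) return 0.35f;
        return 0.15f;                       // far away: flashlight-like focus
    }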

Maximum Focal Range
At the maximum interval between the HMD and the Vive Tracker, the focus is very small, taking up a small percentage of the total viewing range of the HMD. The effect is much like using a flashlight, forcing the wearer to focus on only certain aspects of the room.

Figure 3-31. Max Focal Range



Figure 3-32. Room Pivot Scenario

Conceptual Scenario #3
Room Pivot
Room Pivot was developed around the concept of vection. Vection is the sensation of movement within a space without actual movement taking place; it is induced by the movement of other objects, which forces the perception of self-movement. This idea was condensed into a scenario that makes the user experience a movement that is not actually happening.




Coordinates
The base mechanics of this scenario revolve around the X and Y coordinates of both the HMD and the Vive Tracker. These coordinates sit on a Cartesian coordinate system and are compared against a branch of set values. These values depend on where the HMD is placed relative to the tracker.

Figure 3-33. X / Y Calculation
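A minimal sketch of how the relative X/Y offsets could drive the pivot, with a continuous gain standing in for the project's branch of set values (all numbers illustrative):

    struct Pivot { float aboutX, aboutY; };  // pivot angles in degrees

    // The room pivots about the tracker based on where the wearer
    // stands relative to it on the floor plane.
    Pivot RoomPivot(float hmdX, float hmdY, float trackerX, float trackerY) {
        const float degreesPerMeter = 5.0f;  // assumed gain: offset -> pivot angle
        Pivot p;
        p.aboutY = (hmdX - trackerX) * degreesPerMeter;  // east/west offset tips one axis
        p.aboutX = (hmdY - trackerY) * degreesPerMeter;  // north/south offset tips the other
        return p;
    }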

Single Movement
The initial concept set out to see if the HMD and Vive Tracker could communicate in a way that would allow the wearer to feel movement. This single-movement step proved successful when objects were placed within the render area.

Figure 3-34. Single Movement

Cartesian Coordinates
The addition of both the X and the Y directions allowed for multi-pivot angles, giving greater depth. This depth furthered the experience by allowing the wearer to relate their actual location in the room to the amount and position of the pivot.

Figure 3-35. Cartesian Movement



Figure 3-36. Lighting Change Scenario

Conceptual Scenario #4
Lighting Change
This scenario focused on skills applicable to real-world decisions. It involves the manipulation of designed lighting conditions within a future design space. The lighting condition can be controlled in real time and mapped to geographic locations in order to better understand the effects of lighting on a design. This scenario tries to capture the physical manipulation of lighting conditions.



Control Object
The sunlight around which this scenario revolves is attached to a physical object which can be manipulated to alter the characteristics of the light. The simple shape of a cone represents the direction and face of the light. As the wearer turns the cone, the sun moves in turn. However, the object is not the light source itself, only the physical controller for the sun overhead.

Figure 3-37. Control Object
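A minimal sketch of this coupling, assuming the tracked cone's forward vector is copied into the virtual sun each frame (names are illustrative, not the project's actual code):

    struct Vec3 { float x, y, z; };

    // The physical cone is only a controller: sunlight travels opposite
    // to the direction the tracked cone points.
    Vec3 SunDirectionFromCone(const Vec3& coneForward) {
        return Vec3{ -coneForward.x, -coneForward.y, -coneForward.z };
    }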

Wall Reflection
Due to the nature of AR objects within the real world, shadows cannot be cast upon the physical walls in an effective way. This is the limiting factor within this scenario. Figure 3-38 demonstrates the lack of lighting qualities on transparent wall surfaces.

Figure 3-38. Base and Wall Lit

Floor Shadow
The most successful way of incorporating the digital and physical is to create a virtual plane with which the shadows can interact. Although this is not aesthetically effective on the walls, the floor offers a much better platform for the scenario to interact with.

Figure 3-39. Floor Lit



Figure 3-40. Lantern Scenario

Conceptual Scenario #5
Lantern Fog
Lantern Fog is a further development of the Focal Change scenario in the direction of a much more experiential scenario. It is built on the idea of limiting the wearer's eyesight in order to facilitate a better understanding of one's place. In this scenario a black sphere follows a lantern, held digitally via the Leap Motion; the wearer carries the lantern around, revealing the gallery walls and floor as they travel.



Lantern Mesh Collision / Lantern Emitter
The basis of the scenario involves the interaction with and use of the lantern. The lantern has two properties: the mesh collision and the emitter. The lantern mesh collision is attached to a dome that surrounds the lantern; this dome hides the wearer's physical environment while the lantern is held. The emitter is used to signify the removal of the dome, allowing the user a limited field of view.

Figure 3-41. Lantern

Spatial Understanding
Much like the Focal Change scenario, understanding one's place is the main goal. However, this method of spatial limiting allows a much greater concentration on the immediate space around the wearer.

Figure 3-42. Base and Wall Lit

Fog Dome
The difficulty in using this method is that the user is allowed to roam freely when not interacting with the lantern. This has the negative effect of allowing the wearer to "leave" the scenario. This issue was a driving point in the later development of the final scenarios.

Figure 3-43. Floor Lit



Figure 3-44. Moving Spheres Scenario

Conceptual Scenario #6 Moving Spheres Moving Spheres is based on the concepts of spatial manipulation. The idea was divided into three phases, each exploring a different variation of spatial manipulation within the gallery: spatial pulling, spatial shearing, and spatial tearing. Each of these phases was intended to explore the potential capabilities of AR while manipulating our reality.



Phase 1 Spatial Pulling The idea of phase 1 was to manipulate the physical space within the gallery in order to make it appear as if it were being pulled by a 3D object. If the wearer looks in the direction in which the spheres move against each other, the wearer can visualize the idea of spatial pulling, as shown in Figure 3-45.

Figure 3-45. Spatial Pull

Phase 2 Spatial Shearing Phase 2 is very similar to phase 1; the only difference is that this phase was designed around the concept of shearing. The idea was to manipulate the spatial conditions within the gallery in a much more aggressive manner. The wearer experiences this by seeing the multitude of spheres moving in opposite directions past one another, as shown in Figure 3-46.

Figure 3-46. Spatial Shear

Phase 3 Spatial Tearing Phase 3 was designed to allow the wearer to experience the concept of spatial tearing. This was accomplished by having spheres spawn in random locations within the room while at the same time adding vertical movement, thereby accomplishing spatial tearing. A sketch of such a spawner follows the figure below.

Figure 3-47. Spatial Tearing
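A C++ sketch of one such spawner (again an approximation of the project's Blueprint logic): a sphere is placed at a random point in the gallery volume and given an upward velocity. The gallery extents and the rise speed are assumed values.

#include "Engine/StaticMeshActor.h"
#include "Engine/World.h"

// Spawn one sphere at a random point in the gallery volume and let it rise,
// producing the vertical "tearing" movement described above.
void SpawnTearingSphere(UWorld* World, UStaticMesh* SphereMesh)
{
    if (!World || !SphereMesh)
    {
        return;
    }
    const FVector RandomPoint(
        FMath::RandRange(-300.f, 300.f),  // gallery X extent in cm (assumed)
        FMath::RandRange(-300.f, 300.f),  // gallery Y extent in cm (assumed)
        FMath::RandRange(0.f, 250.f));    // floor to ceiling in cm (assumed)

    AStaticMeshActor* Sphere =
        World->SpawnActor<AStaticMeshActor>(RandomPoint, FRotator::ZeroRotator);
    if (!Sphere)
    {
        return;
    }
    Sphere->SetMobility(EComponentMobility::Movable);
    UStaticMeshComponent* Mesh = Sphere->GetStaticMeshComponent();
    Mesh->SetStaticMesh(SphereMesh);
    Mesh->SetSimulatePhysics(true);
    Mesh->SetEnableGravity(false);
    Mesh->SetPhysicsLinearVelocity(FVector(0.f, 0.f, 120.f)); // rise at 1.2 m/s (assumed)
}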



Figure 3-48. Fire Scenario

Conceptual Scenario #9 Fire Scenario The fire scenario was inspired by the concept of fire safety. Being able to simulate fire analysis and study the space being designed with the use of AR was one of our main priorities. To accomplish this would allow architects a greater understanding of how a space would react in a fire situation.



Fire The fire particle was designed to mimic real-world physics and simulate a real fire. The spreading of the fire is determined by the materials that were overlaid within the scenario. This is how the fire is able to determine how fast it should grow and spread across the room.

Figure 3-49. Fire

Smoke When the fire starts it is programmed to begin releasing smoke. The smoke particle is capable of interacting with the room boundaries, allowing for a much more accurate representation of a real-world fire.

Figure 3-50. Smoke

Darkness If the wearer stays long enough in this scenario, they will experience the same levels of darkness that a building fire might produce. This is due to the design of the smoke particles, which increase in volume as time goes by; a sketch of this ramp follows the figure below.

Figure 3-51. Darkness
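A sketch of the darkness ramp in C++, assuming the smoke particle system exposes a float parameter named "SpawnRate" (an assumed name; the thesis drove this from Blueprint) and that the time and rate ranges are tuned by hand:

#include "Particles/ParticleSystemComponent.h"

// Thicken the smoke the longer the wearer stays: map elapsed time onto an
// increasing emission rate. The 0-120 s window and 10-400 rate range are
// assumed values; "SpawnRate" is an assumed parameter name.
void UpdateSmokeDensity(UParticleSystemComponent* Smoke, float ElapsedSeconds)
{
    if (!Smoke)
    {
        return;
    }
    const float Rate = FMath::GetMappedRangeValueClamped(
        FVector2D(0.f, 120.f), FVector2D(10.f, 400.f), ElapsedSeconds);
    Smoke->SetFloatParameter(TEXT("SpawnRate"), Rate);
}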





Chapter 4

Spatial Narrative Following the midterm review, a reevaluation and a process of distilling the core concepts of the initial design began. These core concepts took the form of spatial narratives derived from architectural and surrealist origins. By removing the textures and photo-realistic aspects of the initial design, a much more focused and surrealist concept emerged that, although less visually impressive, delivered a design whose main focus was experiential rather than visual aesthetics. Following the design charettes, a decision was made on which of the previous designs were to be chosen as the final design pieces. These choices were made based on uniqueness and success in giving the wearer a purely experiential scenario. The scenarios were chosen to create a holistic experience without overlap of concepts. Note: This thesis includes a video recording of the following scenarios during the final review.

Figure 4-1. Infinite Plane Scenario



Figure 4-2. In-Software Obelisk

Obelisk Anchor Between Digital and Physical Following the selection of the final scenarios, a centralized obelisk was chosen in contrast to the midterm and preliminary physical objects. In this design all of the scenarios inhabit a single object, the Obelisk. This object represents a home location to which the user returns after each of the scenarios. Much like the mock door, this design represents a focus on the threshold between the digital and the physical. It is this connection that is at the core of this thesis and allows the successful juxtaposition of those elements.



Figure 4-3. Physical Obelisk

Vive Tracker Simplification Perhaps one of the most difficult aspects, not only of the midterm but of using augmented reality in general, is the overlay of the digital over the physical. By attaching the Vive Tracker to the physical obelisk we are able to digitally design everything around a 0,0,0 focal point. With an accurate model of the gallery space we are able to place the obelisk within the gallery and be accurate to within a few inches, given the limitations of the tracking hardware. A sketch of this alignment follows the figure below.

Figure 4-3.1 Vive Tracker
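A minimal C++ sketch of the alignment, assuming the tracker is exposed through a UMotionControllerComponent whose MotionSource is set to one of SteamVR's tracker sources (for example "Special_1"); the thesis did this in Blueprint, and the function name is an assumption.

#include "MotionControllerComponent.h"
#include "GameFramework/Actor.h"

// Overlay the digital gallery on the physical one: the gallery model was
// built with the obelisk at its own origin, so placing the model at the
// tracker's transform makes the physical obelisk the shared 0,0,0.
void AlignGalleryToObelisk(const UMotionControllerComponent* ObeliskTracker,
                           AActor* GalleryModel)
{
    if (!ObeliskTracker || !GalleryModel)
    {
        return;
    }
    GalleryModel->SetActorTransform(ObeliskTracker->GetComponentTransform());
}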



Figure 4-4. Obelisk Code



Unreal Engine 4 Design Code At the design heart of this project is the code through which we talk to the software. This code is used to do everything within the Unreal Engine. The code has been a learning experience as well as a tool through which our design ideas can be actualized. Perhaps the biggest hurdle in using the code is the translation between the digital and the physical; devising a scheme for writing the code in a way that translated those ideas was, at times, a struggle. As much of this technology, as well as the engine, is geared toward virtual reality applications, the transition to augmented reality required innovative steps around these limitations. The following sections including code have been reformatted to describe specific sections of the code pertaining to the final scenarios. The code is as much a way to visually see our process as it is a way to see how the code reinforces the wearer's experiences.

Obelisk - Unreal Engine Code This section of the code revolves around the interaction between the various portals and their functions within the obelisk. The most difficult portion of this code involves switching between different scenarios while still being in a scenario. Switching was done through a planar object that has hit collision: when the HMD passes through this invisible plane, a command is sent to load the scenario. This action then hides all of the other scenarios and removes their hit planes, which disables their ability to load a different scenario. In order to load the next scenario, the wearer has to walk back through the portal through which they entered, resetting the state and allowing the other portals to reappear. The sketch below approximates this logic.
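A C++ approximation of the portal logic (the thesis version is the Blueprint shown in Figure 4-4; the class and member names here are assumptions): AScenarioPortal is a hypothetical actor holding its scenario's root actor (ScenarioRoot), its invisible hit plane (HitPlane), the list of the other portals (OtherPortals), and a bScenarioActive flag.

#include "Components/BoxComponent.h"
#include "GameFramework/Pawn.h"

// Bound to the hit plane's OnComponentBeginOverlap.
void AScenarioPortal::OnPortalOverlap(UPrimitiveComponent* OverlappedComp,
    AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
    bool bFromSweep, const FHitResult& SweepResult)
{
    if (!OtherActor || !OtherActor->IsA<APawn>())
    {
        return; // only the wearer's pawn may trigger a scenario
    }

    // Passing through once loads the scenario; passing back through resets it.
    bScenarioActive = !bScenarioActive;
    ScenarioRoot->SetActorHiddenInGame(!bScenarioActive);

    // While a scenario runs, the other portals' hit planes stop generating
    // overlaps, so only the entry portal can end the scenario.
    for (AScenarioPortal* Other : OtherPortals)
    {
        Other->ScenarioRoot->SetActorHiddenInGame(true);
        Other->HitPlane->SetGenerateOverlapEvents(!bScenarioActive);
    }
}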



Figure 4-5. Obelisk Code



Obelisk - Pass-through Camera The HMD software combines the digital objects being rendered within the Unreal Engine with the two cameras providing information about the physical world. Combining this information gives way to augmented reality. Since some of the final scenarios involve switching this pass-through camera off to allow a greater render range, issues may arise if multiple load planes are triggered. By attaching a “Master” hit box within the obelisk, all scenarios and applications can be fully reset to their starting positions. Figure 4-5 depicts a portion of this code, which forces the reset of the pass-through camera regardless of what state it may be in at the time. A sketch of this reset follows.
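A hedged C++ sketch of the master reset, reusing the hypothetical AScenarioPortal members from the previous sketch. SetPassThroughEnabled is a placeholder name standing in for the project's SRWorks-specific call, which is not reproduced here.

// "Master" hit box handler inside the obelisk: force every scenario and the
// pass-through camera back to the starting state, whatever state they are in.
void AObelisk::OnMasterReset(UPrimitiveComponent* OverlappedComp,
    AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
    bool bFromSweep, const FHitResult& SweepResult)
{
    for (AScenarioPortal* Portal : Portals)
    {
        Portal->bScenarioActive = false;
        Portal->ScenarioRoot->SetActorHiddenInGame(true); // all scenarios start hidden
        Portal->HitPlane->SetGenerateOverlapEvents(true); // all portals reappear
    }
    // Re-enable the camera feed of the physical world. SetPassThroughEnabled
    // is a placeholder for the actual SRWorks SDK call.
    SetPassThroughEnabled(true);
}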



Final Scenario #1 Infinite Plane



Figure 4-6. Infinite Plane



Figure 4-7. Final Infinite Plane

Final Scenario #1 Infinite Plane Infinite Plane was chosen as the first final scenario due to the inclusion of 100% augmented reality as well as its success in transporting the wearer to an otherworldly plane. The revised scenario begins to solve issues revolving around the removal of the overhead plane portal. The initial concept was to step up into the portal, teleporting you to the infinite plane at eye level. The final concept, like all of the final concepts chosen, was to allow the wearer to step through a doorway to a different world.



Infinite Plane - Portal The portal threshold for this scenario depicts the meeting of the two infinite planes, the sky and the ground plane. Together they create an infinite horizon that symbolizes the main theme of the scenario.

Figure 4-8. Control Object

Gallery Outline The addition of an outline to the gallery walls and floor provides an anchor point against which to juxtapose the infinite plane. This grounds the wearer back in reality and is essential in a scenario composed of 100% augmented reality.

Figure 4-9. Base and Wall Lit

Vertical Infinite Plane The most significant change from the initial design is the addition of a vertical plane. This vertical plane meets the horizontal planes and gives a second dimension to the infinite concept. By positioning the entrance portal relatively close to the meeting edge of both planes, the wearer can experience both planes simultaneously.

Figure 4-10. Floor Lit



Figure 4-11. Physical Plane

Physical and Digital Interaction The constant focus of this thesis is the interaction between the digital and physical and how that changes the perspective of the wearer in an experiential way. This scenario represents an entry into that field by offering the wearer an experience that is unlike the real world. The experience of the infinite is both a surrealist concept and a physical interaction. The feeling of stepping over the edge of the infinite planes becomes a very real emotion and physical experience.



Figure 4-12. Digital Plane



Figure 4-13. Infinite Plane Code



Infinite Plane - Unreal Engine Code Given that this scenario involves switching from partial to full augmented reality, the scenario is mostly virtual and therefore has the least code. The majority of the code involves switching the pass-through camera on and off. All scenarios start hidden, and their visibility is toggled only when passing through the correct portal. The blue highlighted portion of the code indicates the wearer's physical interaction with this section of the code.



Final Scenario #2 Room Pivot

Figure 4-14. Final Room Pivot





Figure 4-15. Room Pivot Scenario

Final Scenario #2 Room Pivot The Room Pivot scenario was chosen as the second final scenario due to its success in portraying movement through the HMD. This movement offers an experience unlike the other scenarios in that it creates a disconnect between the eyes and what the body is experiencing. This scenario incorporates many of the same ideas as the initial design concept but adds a secondary movement through the Unreal Engine software.



Room Pivot - Portal The Room Pivot scenario portal tries to describe the basic principle of movement. The portal consists of a horizontal plane that has been skewed, with the addition of the spheres. These spheres were inspired by the original design concept of additional movement to supplement the feeling of vection.

Figure 4-16. Room Pivot Portal

HMD - Rotation Unlike the original design, the revised design incorporates headset rotation through the Unreal Engine code. This code adjusts the values of the headset rotation according to the distance calculated between the headset and the Vive Tracker. The two values are compared and adjusted through a branch, and the resulting values rotate the headset along with the rotation of the floor. This combination of headset and floor rotation drastically increases the feeling of movement.

Figure 4-17. HMD Rotation

Additional Gallery Geometry Given that the gallery floor moves according to the headset location, the wall beneath the horizontal plane is exposed. Since these walls do not exist in the real world, digital replacements are applied below the floor of the gallery. These black walls offer a place for the floor to rotate into, giving the illusion that the floor is moving into itself.

Figure 4-18. Basement Geometry



Figure 4-19. Real World Room Pivot

Physical and Digital Interaction Perhaps more than any other scenario, this scenario consists of a large departure from the actual events in the physical world. These events are directly controlled by the digital scenario, producing a strong physical reaction, as seen in the photo above. Because the wearer feels movement that is not their own, they have a profound reaction to the digitally induced motion. This movement creates a hesitation to advance further than a few feet away from the pivot point, the obelisk.



Figure 4-20. Digital View Room Pivot



Figure 4-21. Room Pivot Code



Room Pivot - Unreal Engine Code The Room Pivot scenario code deals with the very specific area of locational tracking and the difference between the HMD and the Vive Tracker. The areas highlighted in blue are where the wearer physically interacts with the digital objects within the scenario. The areas highlighted in green are where the code reads the physical location of the HMD and compares it to the 0,0,0 location of the Vive Tracker. The difference between the two drives both the rotation of the HMD and the X and Y rotation of the gallery floor, approximated in the sketch below.
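A per-frame C++ driver approximating the Blueprint in Figure 4-21: the gap between the HMD and the tracker's 0,0,0 tilts the gallery floor, and a fraction of that tilt is fed into the camera's parent so the view rotates with the floor. FloorMesh, CameraRoot, TrackerLocation and the 0.04 / 0.5 factors are assumed members and tuning values.

#include "HeadMountedDisplayFunctionLibrary.h"

void ARoomPivot::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    FRotator HmdRotation;
    FVector HmdLocation;
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HmdRotation, HmdLocation);

    // Distance of the headset from the obelisk tracker drives the pivot.
    const FVector Offset = HmdLocation - TrackerLocation;
    const float Pitch = FMath::Clamp(Offset.X * 0.04f, -8.f, 8.f);
    const float Roll  = FMath::Clamp(Offset.Y * 0.04f, -8.f, 8.f);

    FloorMesh->SetWorldRotation(FRotator(Pitch, 0.f, Roll));
    // The HMD pose itself cannot be overwritten, so rotating the camera's
    // parent component approximates the Blueprint's "headset rotation".
    CameraRoot->SetWorldRotation(FRotator(Pitch * 0.5f, 0.f, Roll * 0.5f));
}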



Final Scenario #3 Lantern Fog



Figure 4-22. Final Scenario #3



Figure 4-23. Lantern Fog Scenario Physical

Final Scenario #3 Lantern Fog A large departure from the initial design concept, the Lantern Fog scenario has moved in a direction that allows for a more controlled and experiential scenario by eliminating the dome. A particle effect has been substituted for the dome, acting as the physical interaction with the lantern, and the lantern itself has been refocused as a singular light orb. This orb is much easier to hold and walk around with, and it now physically removes the black fog. Many of the concepts from the Focal Change scenario have been incorporated into this scenario.



Lantern Fog - Portal The portal for this scenario is simple in design, as the main concept of the scenario is finding one's path through the fog. The portal gives a starting idea of the possible interaction with the orb.

Figure 4-24. Lantern Fog Portal

Orb Light The start of the scenario involves the addition of an on and off command for the orb. The orb initially floats on a pedestal a few feet away from the portal entrance. Upon touching the orb, through the use of the Leap Motion, the orb is turned on and the scenario begins.

Figure 4-25. Lantern On / Off

Orb Interaction The light orb directly interacts with the fog particle, removing it within a predefined radius. This effect is temporary, and after a few seconds the fog is allowed to return. Because the light removes the fog, the wearer must slowly use the orb to explore the gallery space, experiencing the space in a much more focused manner; a sketch of this interaction follows the figure below.

Figure 4-26. Orb Interaction
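One way to sketch the orb-fog interaction in C++: the orb's world position is written into a Material Parameter Collection each frame, and the fog material masks itself out within a radius of that point. Parameter names and the radius are assumptions, and the fog's timed return would be handled by fading the mask back in; the thesis built the equivalent in Blueprint.

#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Feed the orb's location to the fog material so it can carve out a clear
// sphere around the orb. "OrbLocation", "ClearRadius" and the 150 cm radius
// are assumed names/values, not taken from the thesis.
void UpdateFogMask(UObject* WorldContext,
                   UMaterialParameterCollection* FogParams,
                   const FVector& OrbLocation)
{
    UKismetMaterialLibrary::SetVectorParameterValue(WorldContext, FogParams,
        TEXT("OrbLocation"),
        FLinearColor(OrbLocation.X, OrbLocation.Y, OrbLocation.Z));
    UKismetMaterialLibrary::SetScalarParameterValue(WorldContext, FogParams,
        TEXT("ClearRadius"), 150.f);
}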



Figure 4-27. Starting Lantern Fog Scenario

Physical and Digital Interaction This scenario makes great use of the Leap Motion. Through this hardware we are able to physically interact with the lantern orb. The scenario focuses on the physical interaction between the user and the spatial characteristics of the gallery. By limiting the wearer's perception and view range, it forces the wearer to better understand their location within the gallery. The wearer explores the gallery through the orb, and only through the orb can they traverse it.



Figure 4-28. Moving the Lantern



Lantern Fog - Unreal Engine Code Similar to the code presented in the Room Pivot scenario, the code within the Lantern Fog scenario deals greatly with the location of the HMD but also with the location of the orb. An issue with the Leap Motion tracking is that the hand mesh is only present when the hands are in front of the tracker. If the physical hands move away from the Leap Motion tracker while the orb is being manipulated, the orb has a tendency to be thrown or teleported to unreachable locations. We have provided redundant code, sketched below, to reset the orb in this event.
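A C++ sketch of that redundancy (an approximation of the Blueprint in Figure 4-29): each frame, if the orb has escaped the reachable volume, it snaps back to its pedestal. GalleryBounds, PedestalLocation, and OrbMesh are assumed members.

#include "GameFramework/Actor.h"

// Redundancy for lost Leap Motion tracking: if the orb is thrown or
// teleported outside the reachable volume, snap it back to its pedestal.
void ALanternOrb::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (!GalleryBounds.IsInside(GetActorLocation()))
    {
        SetActorLocation(PedestalLocation);
        OrbMesh->SetPhysicsLinearVelocity(FVector::ZeroVector); // kill any residual throw
    }
}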



Figure 4-29. Lantern Orb Code



Figure 4-30. Control Object





Final Scenario #4 Surrealist Painting

Adapted Design The conceptual idea behind the surrealist paintings was the desire to evolve the sphere scenario to a level that was much more interactive and spatial. We wanted to explore the volume of the gallery, so we created a scenario that allows the wearer to experience both the horizontal and vertical volumes of the gallery. Two paintings are presented in the scenario, each with its own properties consisting of multiple animations.

Figure 4-31. Paintings





Ocean Painting Texture The ocean painting depicts a ship set on an ocean in constant movement. This isolation and turbulence influenced the scenario and its ability to fill a room with water. The volume of water directly relates to the painting depicted.

Figure 4-32. Ocean Painting Texture

Ocean Painting Hitbox The ocean painting has the particular problem of being relatively close to the Lantern Fog scenario trigger box. Because of this, the trigger box for the ocean painting had to be much smaller and positioned in a way that would not interact with the other scenarios.

Figure 4-33. Ocean Painting Hitbox

Activation Upon triggering the hitbox, three separate particle systems and the ocean animation are activated. Two of the particle systems control the water flowing out of the painting, and the third controls the water hitting the ocean below. The floor animation raises the ocean texture from below the gallery to a predetermined height; a sketch of this handler follows the figure below.

Figure 4-34. Ocean Painting Activated
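A C++ sketch of the activation handler, approximating the Blueprint in Figure 4-42; all component names are assumptions.

#include "Components/TimelineComponent.h"
#include "Particles/ParticleSystemComponent.h"
#include "GameFramework/Pawn.h"

// Bound to the ocean painting's trigger box: start the two waterfall particle
// systems and the splash, then play the timeline that raises the ocean plane.
void AOceanPainting::OnHitboxOverlap(UPrimitiveComponent* OverlappedComp,
    AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
    bool bFromSweep, const FHitResult& SweepResult)
{
    if (!OtherActor || !OtherActor->IsA<APawn>())
    {
        return;
    }
    WaterfallLeft->Activate(true);      // water flowing out of the painting
    WaterfallRight->Activate(true);
    OceanSplash->Activate(true);        // water hitting the ocean below
    OceanRiseTimeline->PlayFromStart(); // raise the ocean to its final height
}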



Fire Painting Texture The fire painting was designed second, and as a result its texture was designed to fit the theme of ships on the water and the conceptual designs that could stem from them. A ship on fire was chosen, as the resulting smoke could be used as a volumetric object.

Figure 4-35. Fire Painting Texture

Fire Painting Hitbox Unlike the ocean hitbox, the fire hitbox was made as large as possible and positioned as close to the wall as possible, allowing the wearer to get as close to the ship as possible before the fire activated.

Figure 4-36. Fire Painting Hitbox

Activation The activation of the fire scenario begins with the fire particle on the painting, followed by three strategically placed smoke particles angled in such a way as to fill the gallery with an even layer of smoke, comparable to the layer of ocean beneath.

Figure 4-37. Fire Painting Activated



Figure 4-38. Ocean Painting Inactive

Figure 4-39. Ocean Painting Active



Figure 4-40. Fire Painting Inactive

Figure 4-41. Fire Painting Activated





Figure 4-42. Painting Code



Surrealist Paintings - Unreal Engine Code The code in this scenario involves activating and controlling two main animations: the fire animation and the ocean animation. These animations have predefined locations and areas that are triggered from their respective trigger boxes. The blue portions of the code mark where, and with what, the wearer interacts to trigger the specific animations.

Figure 4-43. Painting Code







Conclusions Reflecting on the MRP as a whole, an understanding of one's relationship between the digital and the physical is realized. Throughout the process of conceptualizing the scenarios, a deep understanding of the human psyche and how it relates to these two facets has been at the core of this project. The surrealist union between the physical and digital, and the connection made while traversing between them, creates a spatial condition that conventional architecture is unable to actualize. This MRP has conceptually proven that, through the use of AR/VR, architecture can be created that traverses this unique medium, a surrealist threshold between what is and what is not. This digital architecture, we believe, is the inevitable future, and through the union of the physical and digital a new era of architectural and experiential design will occur.

Figure 4-44. Physical / Digital



LIST OF FIGURES Chapter 1 Figure 1-0. Pape, Dave. “DisneyQuest’s ‘Ride the Comix.” Digital image. ROAD TOVR. July 10, 2015. Accessed May 7, 2019. https://www.roadtovr.com/end-of-an-era-disneyquest-first-vr-attraction-set-to-close/. Figure 1-1. Giorgio Martini, Francesco Di. “Studiolo from the Ducal Palace in Gubbio.” Digital image. THE MET. 2019. Accessed May 2, 2019. https://www.metmuseum.org/art/collection/search/198556. Figure 1-2. Giorgio Martini, Francesco Di. “Studiolo from the Ducal Palace in Gubbio.” Digital image. THE MET. 2019. Accessed May 2, 2019. https://www.metmuseum.org/art/collection/search/198556. Figure 1-3. Christopher McFadden. “Charles Wheatstone.” Digital image. INTERESTING ENGINEERING. March 01, 2018. Accessed May 2, 2019. https://interestingengineering.com/sir-charles-wheatstone-father-of-the-wheatstone-bridge-and-british-electric-telegraph. Figure 1-4. Charles Wheatstone. “Wheatstone Stereoscope.” Digital image. Alarmy. July 4, 2017. Accessed May 2, 2019. https://www.alamy.com/stock-image-engraving-depicting-a-wheatstone-stereoscope-sir-charles-wheatstone-165994083.html. Figure 1-5. Charles Wheatstone. “Reflecting Stereoscope.” Digital image. Science & Society. 2019. Accessed May 2, 2019. https://www.ssplprints.com/image/129529/reflecting-stereoscope-originally-used-by-charles-wheatstone-19th-century. Figure 1-6. Magnus Manske. “Charles Wheatstone Stereoscope.” Digital image. INTERESTING ENGINEERING. March 01, 2018. Accessed May 2, 2019. https://interestingengineering.com/sir-charles-wheatstone-father-ofthe-wheatstone-bridge-and-british-electric-telegraph. Figure 1-7. Unknown. “Edward Link.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-8. Unknown. “The Link Trainer.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-9. Sffaudio.com. “Pygmalion’s Spectacles.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html.



Figure 1-10. William Gruber. “View Master.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-11. ITSUOSAKANE. YouTube. January 05, 2011. Accessed May 02, 2019. https://www.youtube.com/watch?v=vSINEBZNCks. Figure 1-12. Heilig, Morton. SENSORAMA SIMULATOR. US Patent 81,864, filed January 10, 1961, and issued August 28, 1962. http://www.mortonheilig.com/SensoramaPatent.pdf. Figure 1-13. Heilig, Morton. Stereoscopic-Television Apparatus for Individual Use. US Patent 664325, filed May 24, 1957, and issued October 4, 1960. http://www.mortonheilig.com/Experience_Theater_Patent.pdf. Figure 1-14. Unknown. “Ivan Sutherland.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-15. Unknown. “Bob Sproull testing Sword Damocles.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-16. Unknown. “Jaron Lanier.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-16. Robertson, Adi. “Why Author and Virtual Reality Pioneer Jaron Lanier Rejected the Modern Web.” The Verge. December 27, 2012. Accessed May 08, 2019. https://www.theverge.com/2012/12/27/3809390/why-you-are-not-a-gadget-author-jaron-lanier-rejected-web. Figure 1-17. Unknown. “EyePhone and Data Glove.” Digital image. VIRTUAL REALITY SOCIETY. 2019. Accessed May 2, 2019. https://www.vrs.org.uk/virtual-reality/history.html. Figure 1-18. Rutherford, Sam. “Nintendo’s Virtual Boy.” Digital image. Toms Guide. February 2, 2017. Accessed May 7, 2019. https://www.tomsguide.com/us/nintendo-vr-return,news-22199.html. Figure 1-19. Edwards, Benj. “Innsmouth No Yakata.” Digital image. PCMag Digital Group. February 7, 2019. Accessed May 7, 2019. https://www.pcmag.com/feature/366223/7-forgotten-nintendo-virtual-boy-classics/2.



Figure 1-20. Pape, Dave. “DisneyQuest’s ‘Ride the Comix.” Digital image. ROAD TOVR. July 10, 2015. Accessed May 7, 2019. https://www.roadtovr.com/end-of-an-era-disneyquest-first-vr-attraction-set-to-close/. Figure 1-21. Lang, Ben. “Oculus DK2.” Digital image. ROAD TOVR. March 19, 2014. Accessed May 7, 2019. https:// www.roadtovr.com/oculus-rift-developer-kit-2-dk2-pre-order-release-date-specs-gdc-2014/. Figure 1-22. Microsoft. “Microsoft HoloLens.” Digital image. Microsoft. 2019. Accessed May 26, 2019. https://www. microsoft.com/en-us/p/microsoft-hololens-development-edition/8xf18pqz17ts?activetab=pivot:overviewtab. Figure 1-23. VOID. “The Void.” Digital image. THE VOID. 2019. Accessed May 2, 2019. https://www.thevoid.com/. Figure 1-24. Borup, Ruben. “Enscape for Sketchup.” YouTube. December 22, 2017. Accessed May 07, 2019. https:// www.youtube.com/watch?v=Bvxb4W6rhPo. Figure 1-25. Valdes, George. “IRIS VR.” Digital image. IrisVR. September 15, 2016. Accessed May 7, 2019. https:// blog.irisvr.com/blog/category/immersive-design-and-immersive-review. Figure 1-26. Unknown. “Revolutionizing School Design with VR.” Digital image. HMC Architects. 2019. Accessed May 7, 2019. https://hmcarchitects.com/news/revolutionizing-school-design-with-virtual-reality/. Figure 1-27. Skansaska USA. “AR WITH HOLOLENS.” Digital image. Next Reality. May 28, 2017. Accessed May 7, 2019. https://next.reality.news/news/studio-216-constructs-architectural-visions-mixed-reality-0177352/. Figure 1-28. Fologram. “Fologram Talks: Holographic Brickwork.” Vimeo. December 12, 2018. Accessed May 05, 2019. https://vimeo.com/305901280 Figure 1-29. Downey, Sarah. “BIM Collision Detection.” Digital image. UPLOAD. November 22, 2016. Accessed May 7, 2019. https://uploadvr.com/vr-and-ar-in-construction/. Figure 1-30. DAQRI. YouTube. November 15, 2016. Accessed May 07, 2019. https://www.youtube.com/watch?time_ continue=91&v=U9t6Osl1Lbc. Figure 1-31. Perlman, Allan. “Drone FPV.” Digital image. AUV COACH. November 4, 2019. Accessed May 7, 2019. https://uavcoach.com/6-tips-first-person-view-fpv-drone-racers/.



Chapter 2 Figure 2-0. AMD. “AMD Threadripper.” Digital image. Ars TECHNICA. July 13, 2017. Accessed May 2, 2019. https://arstechnica.com/gadgets/2017/07/amd-threadripper-16-cores-and-32-threads-for-999-arrives-in-august/. Figure 2-1. Google. “Google Cardboard.” Digital image. Cardboard. 2019. Accessed May 26, 2019. https://vr.google. com/cardboard/. Figure 2-2. Google. “Google Daydream VR.” Digital image. Daydream. 2019. Accessed May 26, 2019. https://vr.google.com/daydream/. Figure 2-3. SAMSUNG. “Gear VR.” Digital image. SAMSUNG. 2019. Accessed May 26, 2019. https://www.samsung. com/global/galaxy/gear-vr/. Figure 2-4. SONY. “Play Station VR.” Digital image. Play Station. 2019. Accessed May 26, 2019. https://www.playstation.com/en-us/explore/playstation-vr/. Figure 2-5. Oculus. “Oculus Rift ” Digital image. 2019. Accessed May 26, 2019.https://www.oculus.com/rift/#oui-cslrift-games=robo-recall Figure 2-6. VIVE. “HTC VIVE .” Digital image. VIVE. 2019. Accessed May 26, 2019. https://www.vive.com/us/product/vive-virtual-reality-system/. Figure 2-7. Pimax. “Pimax 8k Series.” Digital image. Pimax. 2018. Accessed May 26, 2019. https://pimaxvr.com/collections/store/products/8k?variant=19912761606203. Figure 2-8. Lenovo. “Lenovo Mirage AR Headset” Digital image. Lenovo. 2019. Accessed May 26, 2019. https://www. lenovo.com/us/en/jedichallenges. Figure 2-9. Microsoft. “Microsoft HoloLens.” Digital image. Microsoft. 2019. Accessed May 26, 2019. https://www. microsoft.com/en-us/p/microsoft-hololens-development-edition/8xf18pqz17ts?activetab=pivot:overviewtab. Figure 2-10. Magic Leap. “Magic Leap One.” Digital image. Magic Leap. 2019. Accessed May 26, 2019. https://www. magicleap.com/magic-leap-one. Figure 2-11. VIVE. “VIVE PRO.” Digital image. VIVE. 2019. Accessed May 1, 2019. https://www.vive.com/us/product/vive-pro/. 112


Figure 2-12. Leap Motion. “Leap Motion.” Digital image. Amazon. 2019. Accessed May 1, 2019. https://www.amazon. com/Leap-Motion-Controller-Packaging-Software/dp/B00HVYBWQO. Figure 2-13. STEREO LABS. “ZED Mini.” Digital image. STEREO LABS. 2019. Accessed May 26, 2019. https://www. stereolabs.com/zed-mini/. Figure 2-14. PLEXUS. “PLEXUS Haptic Glove.” Digital image. PLEXUS. 2018. Accessed May 26, 2019. http://plexus. im/. Figure 2-15. VIVE. “HTC VIVE Tracker.” Digital image. Vive. 2019. Accessed May 26, 2019. https://www.vive.com/us/ accessory/. Figure 2-16. UNITY. “UNITY EDITOR.” Digital image. UNITY DOCUMENTATION. 2019. Accessed May 26, 2019. https://docs.unity3d.com/Manual/LearningtheInterface.html Figure 2-17. EPICgames. “UNREAL EDITOR.” Digital image. UNREAL ENGINE. 2019. Accessed May 26, 2019. https://docs.unrealengine.com/en-us/GettingStarted/FromUnity. Figure 2-18. Image By Author Figure 2-19. Boerme, Finn D. “AMD Ryzen Threadripper 2920X.” Digital image. NoteboookCheck. December 12, 2018. Accessed May 1, 2019. https://www.notebookcheck.net/Review-AMD-Ryzen-Threadripper-2920X-12cores-24-threads.376601.0.html. Figure 2-20. ASUS. “ROG ZENITH EXTREME.” Digital image. ASUS. 2019. Accessed May 1, 2019. https://www.asus. com/us/Motherboards/ROG-ZENITH-EXTREME/gallery/. Figure 2-21. CORSAIR. “DDR4 VENGEANCE RAM.” Digital image. CORSAIR. 2019. Accessed May 26, 2019. https://www.corsair.com/us/en/Categories/Products/Memory/VENGEANCE®-LPX-16GB-(2-x-8GB)-DDR4DRAM-3000MHz-C16-Memory-Kit---Black/p/CMK16GX4M2D3000C16. Figure 2-22. SAMSUNG. “SSD 970 EVO NVMe M.2.” Digital image. SAMSUNG. 2019. Accessed May 26, 2019. https://www.samsung.com/us/computing/memory-storage/solid-state-drives/ssd-970-evo-nvme-m2-500gb-mzv7e500bw/.


Figure 2-23. Burnes, Andrew. “GeForce GTX 1080.” Digital image. NVIDIA. 2019. Accessed May 26, 2019. https:// www.geforce.com/whats-new/articles/geforce-gtx-1080-founders-edition.


Chapter 3 Figure 3-1 Image By Author Figure 3-2 Image By Author Figure 3-2.1 Raayaar. “[UE4] Desert Ruins.” Polycount. February 01, 2016. Accessed May 05, 2019. https://polycount. com/discussion/164703/ue4-desert-ruins. Figure 3-2.2 “Desert - Ancient and Modern.” TrekNature. Accessed May 05, 2019. https://www.treknature.com/gallery/Africa/Libya/photo228504.htm. Figure 3-3 Image By Author Figure 3-3.1 Jovicic, Miso. “Sunk by Monroe Snook.” Fine Art America. Accessed May 05, 2019. https://fineartamerica.com/featured/sunk-monroe-snook.html. Figure 3-3.2 Frymann, Abigail. “The Astrid: Is This the End for the 95-year-old Tall Ship That Sank in High Winds off the Irish Coast?” Daily Mail Online. July 27, 2013. Accessed May 05, 2019. https://www.dailymail.co.uk/news/ article-2379972/The-Astrid-Is-end-95-year-old-tall-ship-sank-high-winds-Irish-coast.html. Figure 3-4 Image By Author Figure 3-4.1 “Make Floating Rocks with the Power of Math™! (Part 1).” Broad Strokes. May 08, 2016. Accessed May 05, 2019. https://www.broad-strokes.com/2016-05/floating-rocks-the-power-of-math-part-1/. Figure 3-4.2 “Floating Rocks.” DeviantArt. Accessed May 05, 2019. https://www.deviantart.com/astonvillafc/art/Floating-Rocks-580034205. Figure 3-5 Image By Author Figure 3-5.1 Hebard, Daniel, Rose-marie Karlsen, and Elena Nosyreva. “Steam Locomotive Fire Tube Firebox Art Print by Gary Keesler.” Fine Art America. May 04, 2019. Accessed May 05, 2019. https://fineartamerica.com/featured/steam-locomotive-fire-tube-firebox-gary-keesler.html?product=art-print. Figure 3-5.2 Dace, Marine Picard. “Lava Forge.” ArtStation. Accessed May 5, 2019. https://cdna.artstation.com/p/assets/images/images/006/421/344/large/marine-picard-dace-forge.jpg?1498484709. 114


Figure 3-6 Image By Author Figure 3-7 Image By Author Figure 3-8 Image By Author Figure 3-9 Image By Author Figure 3-10 Image By Author Figure 3-10.1 Learnelectronics. “Arduino Keyboard Emulator.” YouTube. February 13, 2017. Accessed May 08, 2019. https://www.youtube.com/watch?v=SHIcliL4O14&t=62s. Figure 3-11 Image By Author Figure 3-12 Image By Author Figure 3-13 Image By Author Figure 3-14 Image By Author Figure 3-15 Image By Author Figure 3-16 Image By Author Figure 3-17 Image By Author Figure 3-18 Image By Author Figure 3-19 Image By Author Figure 3-20 Image By Author 115


Figure 3-21 Image By Author Figure 3-22 Image By Author Figure 3-23 Image By Author Figure 3-24 Image By Author Figure 3-25 Image By Author Figure 3-26 Image By Author Figure 3-27 Image By Author Figure 3-28 Image By Author Figure 3-29 Image By Author Figure 3-30 Image By Author Figure 3-31 Image By Author Figure 3-32 Image By Author Figure 3-33 Image By Author Figure 3-34 Image By Author Figure 3-35 Image By Author Figure 3-36 Image By Author 116


Figure 3-37 Image By Author Figure 3-38 Image By Author Figure 3-39 Image By Author Figure 3-40 Image By Author Figure 3-41 Image By Author Figure 3-42 Image By Author Figure 3-43 Image By Author Figure 3-44 Image By Author Figure 3-45 Image By Author Figure 3-46 Image By Author Figure 3-47 Image By Author Figure 3-48 Image By Author Figure 3-49 Image By Author Figure 3-50 Image By Author Figure 3-51 Image By Author



Chapter 4 Figure 4-1 Image By Author Figure 4-2 Image By Author Figure 4-3 Image By Author Figure 4-3.1 “Buy HTC Vive Tracker 2018.” Microsoft Store. Accessed May 05, 2019. https://www.microsoft.com/enus/p/htc-vive-tracker-2018/93m98dd4bhxx?activetab=pivot:overviewtab. Figure 4-4 Image By Author Figure 4-5 Image By Author Figure 4-6 Image By Author Figure 4-7 Image By Author Figure 4-8 Image By Author Figure 4-9 Image By Author Figure 4-10 Image By Author Figure 4-11 Image By Author Figure 4-12 Image By Author Figure 4-13 Image By Author Figure 4-14 Image By Author Figure 4-15 Image By Author 118


Figure 4-16 Image By Author Figure 4-17 Image By Author Figure 4-18 Image By Author Figure 4-19 Image By Author Figure 4-20 Image By Author Figure 4-21 Image By Author Figure 4-22 Image By Author Figure 4-23 Image By Author Figure 4-24 Image By Author Figure 4-25 Image By Author Figure 4-26 Image By Author Figure 4-27 Image By Author Figure 4-28 Image By Author Figure 4-29 Image By Author Figure 4-30 Image By Author Figure 4-31 Image By Author 119


Figure 4-32 Image By Author Figure 4-33 Image By Author Figure 4-34 Image By Author Figure 4-35 Image By Author Figure 4-36 Image By Author Figure 4-37 Image By Author Figure 4-38 Image By Author Figure 4-39 Image By Author Figure 4-40 Image By Author Figure 4-41 Image By Author Figure 4-42 Image By Author Figure 4-43 Image By Author Ocean Painting “Painting Ship Storm Wallpaper and Background.” Download Awesome Collection of Handpicked Wallpapers and Images. Accessed May 05, 2019. https://www.tokkoro.com/2974335-painting-ship-storm.html. Fire Painting Walser, Robert. “Catastrophe.” The Paris Review. November 05, 2015. Accessed May 05, 2019. https:// www.theparisreview.org/blog/2015/11/05/catastrophe/.



APPENDIX B GLOSSARY

This appendix contains a glossary of terms commonly used throughout this book. These terms are derived from Wikipedia and edited to better reflect the context of this thesis.

4K: A display resolution usually referred to as Ultra-HD, with a horizontal pixel density of 4000+.
ANIMATION: A set of single movements saved in sequence.
ARDUINO: A device used to store digital processes and coding.
AUGMENTED REALITY (AR): A display type of current headsets that allows for physical pass-through to a digital format. This format combines both the digital (software rendering) and what the cameras are physically tracking.
CARTESIAN COORDINATES: A grid based upon the X, Y, and Z planes that are equal distances from a central reference point of 0, 0, 0.
CHARETTES: A quick design phase used to produce numerous different design ideas.
CPU: Central Processing Unit.
CODE: A digital set of instructions.
EXTERNAL DEVICE: A device that is usually referenced as a supplemental device to the main headset.
FOCAL RANGE: The peripheral range that the eyes can see. In this context it is a circular area.
GPU: Graphics Processing Unit.
HAPTIC: Allowing the sensation of touch.
HARDWARE: Physical components of a computer or device.
HITBOX: An area within the digital world that allows for the interaction of physical objects.
HMD: Head-mounted Display.
LATENCY: The time calculated between two digital events.
OBELISK: A central hub where all pieces are interconnected digitally; an anchor point.
PARTICLES: A moving object that has predetermined characteristics.
PASS-THROUGH: A camera feature of augmented reality that allows the capture of physical objects through to the digital software.


PHASE: Used within this thesis for different sections within a scenario.
PORTAL: A digital connection between two locations that are not near one another.
RAM: Random Access Memory.
RELIC: A 3D printed object used to interact with the digital world through physical pieces.
RESOLUTION: The pixel density of a screen display, usually measured in X and Y, for example 1920 x 1080.
ROOM TRACKING: A technology that uses external sensors to track the location of the headset within a predetermined space.
SCENARIO: As used within this thesis, a digital / physical world with which the wearer interacts.
SOFTWARE: A computer program that serves a specific purpose.
SPATIAL TRACKING: A feature of headsets that allows the digital coordinates of the headset to be located within the real world.
SURREALIST: A juxtaposition between reality and the unimaginable.
TELEPORT: Instant movement between two points.
YAW: The rotation about the Y axis, which controls left and right movements.
VERTIGO: A sensation usually associated with sickness and dizziness.
VIRTUAL REALITY (VR): A display type of headset that renders the digital world three-dimensionally and allows for physical tracking of the headset relative to the digital content.

Term Origin: Wikipedia. March 24, 2019. Accessed May 08, 2019. https://www.wikipedia.org/.


APPENDIX C INSTALLATION

This installation guide will allow you to use AR on the HTC VIVE PRO as well as hand tracking with the Leap Motion.

1. Download and install Unreal Engine 4 (https://www.unrealengine.com/en-US/feed?sessionInvalidated=true).
2. Download and install Github (https://github.com/).
3. Download and install the Leap Motion SDK (https://developer.leapmotion.com/vr-setup/vive).
4. In order to make the Leap Motion work with Unreal you must clone the Leap Motion code in the Github program (https://github.com/leapmotion/LeapUnreal).
5. You must obtain the link that is under the green “Clone or download” button. YOU MUST USE THE LINK TO CLONE THE CODE. If you download the file the code will not work.
6. Download the SRWorks SDK from the HTC VIVE PRO website (https://developer.vive.com/resources/knowledgebase/intro-vive-srworks-sdk/).
7. Follow the instructions from the website given in step 6 in order to install the plug-in.
8. In order to make the Leap Motion and the AR SDK code work in the same project, install the SRWorks SDK into the Leap Motion project that was created with Github.
9. Restart your computer.
10. Turn on the HTC VIVE PRO before you open Unreal.
11. Open Steam and the viewport.
12. If the code crashes, a computer restart may be required.


CODE REFERENCES

Arduino Code: Learnelectronics. “Arduino Keyboard Emulator.” YouTube. February 13, 2017. Accessed May 08, 2019. https://www.youtube.com/watch?v=SHIcliL4O14&t=62s.
Fog Particle (Tutorial 1-3): Master, Marvel. “UE4 Tutorial - Local Volumetric Fog 1/3 - Basics - Unreal Engine 4.” YouTube. June 21, 2018. Accessed May 08, 2019. https://www.youtube.com/watch?v=McBiRCSWM9Q&t=309s.
Leap Motion: Leap Motion SDK for the Unreal Engine. Computer code. Leap Motion, April 4, 2019. Accessed May 5, 2019. https://github.com/leapmotion/LeapUnreal.
Meshes / Texture / Animations:1 Marketplace - UE4 Marketplace. Accessed May 08, 2019. https://www.unrealengine.com/marketplace/en-US/store.
Portal Particle: UnrealCG. “Portal Particle System Tutorial - [Unreal Engine 4].” YouTube. June 12, 2017. Accessed May 08, 2019. https://www.youtube.com/watch?v=HobKiwNHNxc.
VIVE: VIVE SRWorks SDK. Computer code. VIVE Developers, March 27, 2018. https://developer.vive.com/resources/knowledgebase/intro-vive-srworks-sdk/.
Unreal Engine Setup / Starter Guide: Developing for SteamVR. Accessed May 08, 2019. https://docs.unrealengine.com/en-us/Platforms/SteamVR.
Unreal Engine 4 Forum:2 Unreal Engine Forums. Accessed May 08, 2019. https://forums.unrealengine.com/.

1. Basis for Midterm Scenario and Various Final Effects Used in Final Scenarios
2. Information Used in Various Coding Questions


