Embodied Interface
Graduate Thesis 2022
Qihang Fan & Yu Cheng Huang
Contents
Thesis Statement
Foundational Research
Design Methodology
Thesis Final
Thesis Reference
Bibliography & Glossary
E-mail: qihang_fan@sciarc.edu yucheng_huang@sciarc.edu
We, Qihang Fan & Yucheng Huang, are aspiring designers working across multiple interfaces, specializing in user-experience design, AR & VR interactions, and architectural design. Our practice focuses on connecting ideas with user-centered design through rapid prototyping and immersive interaction, creating uncharted solutions for user experiences.
Thesis Statement
“Embodied virtuality is the process of drawing computers out of their electronic shells,” wrote Mark Weiser in his 1991 article “The Computer for the 21st Century.” He demonstrated a vision of ubiquitous computing that weaves into the fabric of everyday life until computers are indistinguishable from it. Predictably, human modality and behavior will adapt correspondingly. In fact, as the books The Measure of Man by Henry Dreyfuss and Architects’ Data by Ernst Neufert suggest, contemporary studies and design approaches to human modality focus on spatial dimensions, varying from industrial products to architecture. However, human-computer interaction behavior, a prevailing factor that inevitably influences human modality, is largely missing from these discussions. In our thesis project, we build on these positions to envision how the domestic environment adapts under the context of ubiquitous computing and human-computer interaction design. The supplemental layer of mixed-reality technology catalyzes environmental re-arrangement as a form of embodied virtuality. As a result, we propose a vision-driven design exploring “mixed-reality ergonomics” in our contemporary everyday life.
Our design methodology in this project encourages discussions on human modality, tangible user interfaces (TUI), and digital twins within domestic environments. Specifically, we explore the object-oriented relationships among three categories — humans, spatial elements, and discrete objects. Following the concept of digital twins, we design supplemental embodied-virtuality content superimposed (in placement and scale) on physical elements in real time to form new conceptual models in daily environments. Such conceptual models open up alternative possibilities for humans to perceive the same collection of mundane objects differently, in a way that gives more agency to modern human modality. Meanwhile, this thesis project challenges conventional ergonomics, whose primary focus is human physical comfort and efficiency. Instead, our proposal introduces a new ergonomic perspective, mixed-reality ergonomics, speculating on the attributes of contemporary domestic space when immersed in a context of pervasive virtuality. The result will be presented as a real-time interactive simulation (application) next term.
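As a working illustration of this object-oriented pairing, the minimal sketch below (in Python, with hypothetical names such as PhysicalElement, VirtualOverlay, and DigitalTwinPair) shows how a physical element and its embodied-virtuality overlay could stay synchronized in placement and scale; it is a conceptual diagram in code, not the implementation of the final application.

    from dataclasses import dataclass

    @dataclass
    class PhysicalElement:
        """A tracked real-world item: a human, a spatial element, or a discrete object."""
        name: str
        category: str                      # "human" | "spatial" | "object"
        position: tuple = (0.0, 0.0, 0.0)  # meters, reported by the tracking system
        scale: tuple = (1.0, 1.0, 1.0)

    @dataclass
    class VirtualOverlay:
        """Embodied-virtuality content superimposed on a physical element."""
        content: str
        position: tuple = (0.0, 0.0, 0.0)
        scale: tuple = (1.0, 1.0, 1.0)

    @dataclass
    class DigitalTwinPair:
        physical: PhysicalElement
        virtual: VirtualOverlay

        def sync(self):
            """Lock the overlay's placement and scale to its physical counterpart."""
            self.virtual.position = self.physical.position
            self.virtual.scale = self.physical.scale

    # Example: a mundane dining table re-presented as a work-surface interface.
    table = PhysicalElement("dining table", "object", position=(1.2, 0.0, 0.75))
    pair = DigitalTwinPair(table, VirtualOverlay("work-surface interface"))
    pair.sync()
    print(pair.virtual.position, pair.virtual.scale)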
THESIS STATEMENT theory background, design value, methodology, final presentation
Foundational Research
There is no reason why the objects displayed by a computer have to follow the ordinary rules of physical reality with which we are familiar.
If the task of the display is to serve as a looking-glass into the mathematical wonderland constructed in computer memory, it should serve as many senses as possible. So far as I know, no one seriously proposes computer displays of smell, or taste. Excellent audio displays exist, but unfortunately we have little ability to have the computer produce meaningful sounds. I want to describe for you a kinesthetic display. The force required to move a joystick could be computer controlled, just as the actuation force on the controls of a Link Trainer are changed to give the feel of a real airplane. With such a display, a computer model of particles in an electric field could combine manual control of the position of a moving charge, replete with the sensation of forces on the charge, with visual presentation of the charge’s position. Quite complicated “joysticks” with force feedback capability exist. For example, the controls on the General Electric “handyman” are nothing but joysticks with nearly as many degrees of freedom as the human arm. By use of such an input/output device, we can add a force display to our sight and sound capability. The computer can easily sense the positions of almost any of our body muscles. So far
only the muscles of the hands and arms have been used for computer control. There is no reason why these should be the only ones, although our dexterity with them is so high that they are a natural choice. Our eye dexterity is very high also. Machines to sense and interpret eye motion data can and will be built. It remains to be seen if we can use a language of glances to control a computer. An interesting experiment will be to make the display presentation depend on where we look. There is no reason why the objects displayed by a computer have to follow the ordinary rules of physical reality with which we are familiar. The kinesthetic display might be used to simulate the motions of a negative mass. The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked. - The Ultimate Display
Ubiquitous Computing
SIMULATION HEADSET highlighting the prototype of the VR headset & sensory simulator - Sensorama
Architects' Data
UNITY SLICES: TABLE showing how Passthrough VR can support mixed reality social experiences focused on tangible interfaces.
“five fundamental psychological concepts: affordances, signifiers, constraints, mappings, and feedback.”
We live in a world filled with objects, many natural, the rest artificial. Every day we encounter thousands of objects, many of them new to us. Many of the new objects are similar to ones we already know, but many are unique, yet we manage quite well. How do we do this? Why is it that when we encounter many unusual natural objects, we know how to interact with them? Why is this true with many of the artificial, human-made objects we encounter? The answer lies with a few basic principles. Some of the most important of these principles come from a consideration of affordances. The term affordance refers to the relationship between a physical object and a person (or for that matter, any interacting agent, whether animal or human, or even machines and robots). An affordance is a relationship between the properties of an object and the capabilities of the agent that determine just how the object could possibly be used. A chair affords (“is for”) support and, therefore, affords sitting. Most chairs can also be carried by a single person (they afford lifting), but some can only be lifted by a strong person or by a team of people. If young or relatively weak people cannot lift a chair, then for these people, the chair does not have that affordance, it does not afford lifting.
The presence of an affordance is jointly determined by the qualities of the object and the abilities of the agent that is interacting. This relational definition of affordance gives considerable difficulty to many people. We are used to thinking that properties are associated with objects. But affordance is not a property. An affordance is a relationship. Whether an affordance exists depends upon the properties of both the object and the agent. Glass affords transparency. At the same time, its physical structure blocks the passage of most physical objects. As a result, glass affords seeing through and support, but not the passage of air or most physical objects (atomic particles can pass through glass). The blockage of passage can be considered an anti-affordance—the prevention of interaction. To be effective, affordances and anti-affordances have to be discoverable—perceivable. This poses a difficulty with glass. The reason we like glass is its relative invisibility, but this aspect, so useful in the normal window, also hides its anti-affordance property of blocking passage. If an affordance or anti-affordance cannot be perceived, some means of signaling its presence is required: I call this property a signifier (discussed in the next section). - The Design of Everyday Things
Human-centered Design
NORMAN TEACUP & NORMAN DOOR improper design leading to functional confusion
Human-computer interaction (HCI) is a multidisciplinary field of study focusing on the design of computer technology and, in particular, the interaction between humans and computers.
HCI surfaced in the 1980s with the advent of personal computing, just as machines such as the Apple Macintosh, IBM PC 5150 and Commodore 64 started turning up in homes and offices in society-changing numbers. For the first time, sophisticated electronic systems were available to general consumers for uses such as word processors, games units and accounting aids. Consequently, as computers were no longer room-sized, expensive tools exclusively built for experts in specialized environments, the need to create human-computer interaction that was also easy and efficient for less experienced users became increasingly vital. From its origins, HCI would expand to incorporate multiple disciplines, such as computer science, cognitive science and human-factors engineering. HCI soon became the subject of intense academic investigation. Those who studied and worked in HCI saw it as a crucial instrument to popularize the idea that the interaction between a computer and the user should resemble a human-to-human, open-ended dialogue. Initially, HCI researchers focused on improving the usability of desktop computers (i.e., practitioners concentrated on how easy computers are to learn and use). However, with the rise of technologies such as the Internet and the smartphone, computer use would
increasingly move away from the desktop to embrace the mobile world. Also, HCI has steadily encompassed more fields: “…it no longer makes sense to regard HCI as a specialty of computer science; HCI has grown to be broader, larger and much more diverse than computer science itself. HCI expanded from its initial focus on individual and generic user behavior to include social and organizational computing, accessibility for the elderly, the cognitively and physically impaired, and for all people, and for the widest possible spectrum of human experiences and activities. It expanded from desktop office applications to include games, learning and education, commerce, health and medical applications, emergency planning and response, and systems to support collaboration and community. It expanded from early graphical user interfaces to include myriad interaction techniques and devices, multi-modal interactions, tool support for model-based user interface specification, and a host of emerging ubiquitous, handheld and context-aware interactions.” — John M. Carroll, author and a founder of the field of human-computer interaction.
Human-computer Interaction
HUMAN COMPUTER INTERACTION illustrations
Unity Slices: Table explores the future of tangible interfaces, blending realities using the Oculus Passthrough feature to bring people together around humanity’s most enduring social hub, the table.
Inform: Material has to inform users of its transformational capabilities (affordance). In 1977 Gibson proposed that we perceive the objects in our environment through what action objects have to offer, a property he coined as affordance. For example, through evolution we instinctively know that a cup affords storing volumes of liquid. Industrial design in the past decade has established a wide set of design principles to inform the user of an object’s affordance—for example, a hammer’s handle tells the user where to grip the tool. In the case of dynamic materials, these affordances change as the interface’s shape alters. In order to interact with the interface, the user has to be continuously informed about the state the interface is in, and thus the function it can perform. An open interaction design question remains: How do we design for dynamic affordances? Unity Slices: Table. This Unity Labs project explores the future of tangible interfaces, blending realities using the Oculus Passthrough feature to bring people together around humanity’s most enduring social hub, the table. We view Unity Slices: Table as a social, mixed reality, proof of concept built to bring people together, whether they share the same physical table, or are an ocean apart.
This upcoming proof-of-concept, vertical-slice game made for Oculus App Lab brings up to four players together for classic tabletop fun. Over the past year, our team at Unity Labs has been prototyping and iterating on this experience, and we’re excited to give you a behind-the-scenes look at how we made it happen. The first milestone we had to reach while building our prototype was to align the virtual representation of a table to a physical table or desk. Since Oculus Quest 2 does not yet offer a way to accurately detect planes in Unity, we had to adopt a manual approach using the controllers, so that users could quickly and precisely align the virtual desk to the physical table. Once the alignment was complete, we needed to find a way to network it. While this might seem simple, there are a number of factors to consider when centering your social experience around a table. Whether you’re in the same space or connected remotely, the tangible table interface is shared and needs to look and feel right for everyone participating.
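As a rough sketch of that first alignment milestone, the Python fragment below derives a shared table pose (anchor point and yaw) from two corner points tapped with the controllers along one edge of the physical table; the function and variable names are hypothetical, and Unity's actual C# implementation is not reproduced here.

    import math

    def align_virtual_table(corner_a, corner_b):
        """Compute a pose for the virtual table from two controller-sampled corners
        of one physical table edge. Points are (x, y, z) with y up; the midpoint of
        the edge becomes the anchor, and yaw is the heading of the edge around the
        vertical axis."""
        ax, ay, az = corner_a
        bx, by, bz = corner_b
        anchor = ((ax + bx) / 2.0, (ay + by) / 2.0, (az + bz) / 2.0)
        yaw_deg = math.degrees(math.atan2(bx - ax, bz - az))
        return anchor, yaw_deg

    # Example: the user taps the two near corners of a desk with the controllers.
    anchor, yaw_deg = align_virtual_table((0.40, 0.72, 1.00), (1.60, 0.72, 1.10))
    print(anchor, yaw_deg)

In principle, once this pose is computed locally, the anchor, yaw, and table dimensions are the only state that would need to be replicated over the network so that every participant, co-located or remote, sees the same shared table.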
Mixed Reality
UNITY SLICES: TABLE showing how Passthrough VR can support mixed reality social experiences focused on tangible interfaces.
A digital twin is a virtual representation that serves as the real-time digital counterpart of a physical object or process. Though the concept originated earlier (it is attributed to Michael Grieves, then of the University of Michigan, in 2002), the first practical definition of the digital twin originated from NASA in 2010, in an attempt to improve the physical-model simulation of spacecraft (Elisa Negri, “A Review of the Roles of Digital Twin in CPS-Based Production Systems,” Procedia Manufacturing, 2017). Digital twins are the result of continual improvement in the creation of product design and engineering activities. Product drawings and engineering specifications progressed from handmade drafting to computer-aided drafting/computer-aided design to model-based systems engineering. The digital twin of a physical object is dependent on the digital thread—the lowest
level design and specification for a digital twin—and the “twin” is dependent on the digital thread to maintain accuracy. Changes to product design are implemented using engineering change orders (ECOs). An ECO made to a component item results in a new version of the item’s digital thread and, correspondingly, of the digital twin. The future of Mixed Reality may mean that we will only need a single device to replace our screens. This will bring forth new ways to create content (data), as well as new ways of consuming it. As with other great technological disruptions that have changed our current way of life, Mixed Reality will create more industries and more jobs. MR will unlock a future where our natural world and powerful digital information meet.
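A minimal sketch of that dependency, using hypothetical names (DigitalThread, DigitalTwin, apply_eco): an engineering change order produces a new version of a component's digital thread, and the twin refreshes itself from the newest thread to stay accurate.

    from dataclasses import dataclass

    @dataclass
    class DigitalThread:
        """Lowest-level design and specification record for a component."""
        component: str
        version: int
        spec: dict

    @dataclass
    class DigitalTwin:
        """Virtual counterpart of the physical component; mirrors the latest thread."""
        thread: DigitalThread

        def refresh(self, latest: DigitalThread):
            self.thread = latest

    def apply_eco(thread: DigitalThread, changes: dict) -> DigitalThread:
        """An engineering change order (ECO) yields a new version of the thread."""
        return DigitalThread(thread.component, thread.version + 1, {**thread.spec, **changes})

    # Example: an ECO revises a chair component's seat height.
    thread_v1 = DigitalThread("chair", 1, {"seat_height_mm": 450})
    twin = DigitalTwin(thread_v1)
    thread_v2 = apply_eco(thread_v1, {"seat_height_mm": 470})
    twin.refresh(thread_v2)
    print(twin.thread.version, twin.thread.spec)   # 2 {'seat_height_mm': 470}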
Digital Twins
UNITY SLICES: TABLE showing how Passthrough VR can support mixed reality social experiences focused on tangible interfaces.
“Tabletop TUIs utilize dynamic representations such as video projections, which accompany tangibles in their same physical space to give dynamic expression of underlying digital information and computation.”
Urp uses physical scale models of architectural buildings to configure and control an underlying urban simulation of shadow, light reflection, wind flow, and other properties. Urp also provides a variety of interactive tools for querying and controlling parameters of the urban simulation. These include a clock to change the position of the sun, a material wand to change building surfaces between brick and glass (thus reflecting light), a compass to change wind direction, and an anemometer to measure wind speed. Urp’s building models cast digital shadows onto the workbench surface (via video projection). The sun’s position in the sky can be controlled by turning the hands of a clock on the tabletop. The building models can be repositioned and reoriented, with their solar shadows transforming according to their spatial and temporal configuration. Changing the compass direction alters the direction of a computational wind in the urban space. Urban planners can identify potential problems, such as areas of high pressure that may result in challenging walking environments or hard-to-open doors. Placing the anemometer on the tabletop shows the wind speed at that point in the model. Like Illuminating Light (its more primitive predecessor) Urp is built atop the I/O Bulb
infrastructure and employs the glimpser-and-voodoo vision analysis pipeline to identify and locate its component objects. Both applications also demonstrate luminous-tangible interaction, a style in which a participant’s relations with the system consist of manipulation of physical objects and the resultant ongoing projection of visual information onto and around these same objects; indeed, Urp extends the variety of such interactions, as we will see later. The paper has two principal parts: in the first, we describe Urp. This entails a brief introduction to the collection of concerns in the urban planning domain that motivate the present work, including a review of some traditional means of addressing these concerns; a recapitulation of basic material introduced elsewhere regarding the I/O Bulb and Luminous Room infrastructures that make the Urp application possible; and finally the implementation issues and a function-by-function description of the Urp system itself. The second part begins with short descriptions of several other projects built with I/O Bulb technology (some of which have not yet been otherwise published or publicly presented) and uses a comparison among these and Urp to suggest two ‘Luminous-Tangible Issues’, early thought-tools for the design and analysis of systems that subscribe to luminous-tangible interaction styles. - Urp: A Luminous-Tangible Workbench for Urban Planning and Design
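To make the kind of computation behind Urp's digital shadows concrete, the small sketch below estimates a building model's ground shadow from the sun position set by the tabletop clock: the shadow length is height / tan(altitude) and it points directly away from the sun. The formula is standard solar geometry; the function name and units are illustrative and not taken from the Urp implementation.

    import math

    def shadow_on_ground(building_height_m, sun_altitude_deg, sun_azimuth_deg):
        """Length (m) and bearing (degrees clockwise from north) of the shadow cast
        by a building of the given height for a sun at the given altitude/azimuth."""
        if sun_altitude_deg <= 0:
            return None  # sun below the horizon: nothing to project
        length = building_height_m / math.tan(math.radians(sun_altitude_deg))
        bearing = (sun_azimuth_deg + 180.0) % 360.0  # shadow points away from the sun
        return length, bearing

    # Example: a 12 m building model with the clock set to a mid-morning sun.
    print(shadow_on_ground(12.0, sun_altitude_deg=35.0, sun_azimuth_deg=120.0))
    # -> roughly (17.1, 300.0): a 17 m shadow pointing west-northwest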
Tangible User Interface
URP PROTOTYPE a workbench for urban planning and design.
Design Methodology
Interaction System
VIRTUALITY & NATURE by Magdiel Lopez
Design Factors
VIRTUALITY & NATURE by Magdiel Lopez
Thesis Final
Thesis Reference
“Alessio’s thesis project intervenes in urban space through cutting-edge technology, re-envisioning giant urban surfaces as a playful and creative space.”
“Games of Deletion,” or GoD, is an augmented reality game based on the use of virtual deletion in the field of design and architecture. The game is inspired by the semi-fictional novel “Games of Deletion,” which will be released alongside the game on 09.09.18 at the Southern California Institute of Architecture in Los Angeles. The game is supported on Android (Google Pixel 2) and iOS (iPhone X), and in the future it could be adapted for some of the main AR headsets, such as HoloLens or Mira Prism. GoD is a one-versus-one, real-scale design visual battle. Player A, located in a certain Location A, plays against Player B in a certain Location B. To be activated, the game needs the smartphone camera pointing at a mural that has been previously associated with the game. In this way, the game is constantly associated with real, existing space. GoD challenges the following themes in AR development and architecture. “A Game of Deletion” is a real-scale mixed reality game, or RSMR, based on additive and subtractive visual layers of information mapped onto a determined urban environment. The first version of the game happened to be
developed in the city of Los Angeles because of its large quantity of murals. Murals in the city of Los Angeles receive a different treatment from any other place. Indeed, the DCA [Department of Cultural Affairs] regulates mural artists through the Citywide Mural Program, establishing a comprehensive network of mural activity and engagement by muralists, property owners, community stakeholders, educators, technicians, technologists, and preservationists. Murals can be considered powerful augmented reality markers. A marker is any object that can be placed in a scene to provide a fixed point of reference for position or scale. In AR, these markers can provide an interface between the physical world and the augmented reality content, such as 3D models or videos. At their core, these markers allow the device generating the AR content to calculate the position and orientation of its camera. When this calculation is done in real time, the process is known as tracking. Virtual reality is potentially a very complete tool for designers, but it is still putting us at the center of the platform. I see the future of design somewhere else, where we are cooperating with the tool rather than fully controlling it.
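The sketch below condenses that marker-tracking idea into code, using OpenCV's ArUco fiducials as a stand-in for a registered mural (the actual game presumably relies on an AR SDK's image tracking). It assumes opencv-contrib-python 4.7 or newer and a pre-calibrated camera; the marker size and function name are illustrative.

    import numpy as np
    import cv2

    MARKER_SIZE_M = 1.0  # physical size of the registered marker, in meters

    # 3D corner coordinates of the marker in its own (planar) coordinate frame.
    OBJ_POINTS = np.array([
        [-MARKER_SIZE_M / 2,  MARKER_SIZE_M / 2, 0],
        [ MARKER_SIZE_M / 2,  MARKER_SIZE_M / 2, 0],
        [ MARKER_SIZE_M / 2, -MARKER_SIZE_M / 2, 0],
        [-MARKER_SIZE_M / 2, -MARKER_SIZE_M / 2, 0],
    ], dtype=np.float32)

    def camera_pose_from_marker(frame_gray, camera_matrix, dist_coeffs):
        """Detect one fiducial marker and recover the camera pose relative to it.
        Returns (rotation_vector, translation_vector) or None if no marker is found.
        Running this per frame is the real-time 'tracking' described in the text."""
        dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
        detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
        corners, ids, _ = detector.detectMarkers(frame_gray)
        if ids is None or len(corners) == 0:
            return None
        img_points = corners[0].reshape(4, 2).astype(np.float32)
        ok, rvec, tvec = cv2.solvePnP(OBJ_POINTS, img_points, camera_matrix, dist_coeffs)
        return (rvec, tvec) if ok else None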
Game of Deletion
CONCEPTUAL DRAWING presenting how Game of Deletion influences urban typology.
Game of Deletion
CONCEPTUAL DRAWING presenting how Game of Deletion influences urban typology.
“Ilaria’s thesis project explores how the architect could create different typologies of domestic space based not on current traditional conventions but on a screen-oriented idea.”
Taking as its main reference La Casa Telematica by Ugo La Pietra, an art installation held during the International Trade Fair of Milan in 1982, the thesis aims to emphasize the subversive reality by re-constructing a screen-oriented domestic space in order to explore the interaction between humans, space, and technology. A glimpse is also taken into the bubble architecture by Haus-Rucker-Co back in the 60s, which concentrates on psychic experience through building individual utopias. In fact, the project reflects on a paradoxical relation with space and technology in a post-pandemic time that has increased the myopic behavior of a ‘me-centered’ digital narcissism. The circular floor plan happens to be an encouragement of non-democratic displacement, as it reminisces about the ideal prison, the Panopticon, by theorist Jeremy Bentham, where a representation of authority at the center is able to observe all prisoners without them knowing whether they are being watched. A giant 360-degree camera is located in the middle. It captures and produces images for entertainment, family archive, and self-advertisement, turning the house into a live theater, an interface where the actor is also a viewer. Different large screens surround the house, replacing conventional windows; they’ve
become the new medium as well as obstacles to approaching the outside world. The furniture embraces a highly responsive design that adapts to users’ preferences, and yet it is arranged toward the center. Ironically, although some of the furniture is distributed in a way that encourages collective activity, the screens still appear to be the major force of distraction and distancing. At the dining table, the three chairs are placed in a row, and the table is shaped in a way that orients it toward the TV. In the bedroom, the beds are carved out based on past user experience; although the two people are meant to face each other, the beds are separated so each person can watch their own TV. The significant amount of digital integration that extends to every daily activity reflects a conflictual post-pandemic mindset. Insecurity and exclusivity are the two sides of the mirror. The project uses the architectural experience as a representation of a mindscape to discover the unconscious search for tranquility, relaxation, and security.
Ilaria's Thesis
CONCEPTUAL RENDERING showing possibilities for future architectural design
Bibliography & Glossary