Portfolio - Zhuoneng Wang


Zhuoneng Wang - SELECTED WORKS -


Figure: Remake of the designer's understanding of a user interface, relating the user's understanding, the designer's understanding, the meanings the interface metonymically invokes, the artifact's internal dynamics, and actions and externalities. a_t describes an action taken at time t; s_t+n describes a follow-up sensation (adapted from Krippendorff, 2005).


TABLE OF CONTENTS 目錄

CHAPTER 1 - INTANGIBILITY 無形性
#1 Mixed Reality Hand-Gesture Interaction 增強現實手勢交互
#2 Mixed Reality in AEC Industry 混合現實在建築領域的應用

CHAPTER 2 - TANGIBILITY 有形性
#3 Brainwave Interactive Installation 腦波交互裝置
#4 Soft Robotic Arm 軟體機械臂

CHAPTER 3 - SPATIAL MEDIUM 空間介質
#5 Nomadic Cloud - Mars Habitat Design 遊牧雲 - 火星居住地設計
#6 Vulcan Pavilion Vulcan亭

CHAPTER 4 - OTHER WORKS 其它作品
#7 SignAR 手語翻譯AR
#8 The Interaction Design of VR Kayaking VR皮劃艇的人機交互與規則設計
#9 Photography Works 攝影作品


Source: Zishan Liu, China, 2017


CHAPTER ONE

INTANGIBILITY

The chapter develops an understanding of how user interfaces manifest in virtual and augmented reality, with the goal of developing a set of best practices for designing and interacting with these interfaces, as well as tools and workflows to help design in mixed reality. I also wanted to explore the impact these technologies will have on our society and the changing relationship between people and digital content. How we interact with physical objects in the real world is determined by the limitations of the physical world: objects have physical properties, and physical laws determine how we interact with the material world. We took metaphors from the physical world and adapted them to help us interact with the digital world.




#1 MIXED REALITY HAND-GESTURE INTERACTION

The Exploration of Mixed-Reality Interaction Design
Individual Research Project

Hand gestures have been largely overlooked as a way of interacting with digital content in mixed-reality environments. To fully utilize the ubiquitous nature of hand gestures for this purpose, I worked primarily with Leap Motion hand tracking in MR, documenting how familiarity develops from direct, embodied engagement with novel hyperphysics through a series of prototypes spanning body-environment fusion (in depiction and interaction) and multisensory integration. This project exhibits a set of concerns and possibilities for spatial interactions, with the hope that such awareness might provoke novel frameworks for conceiving of spatial interactions and the role of embodiment in spatial computing.

Workflow & Platform

ZED Camera

Unity

Leap Motion

Visual Studio

Rhino 3D


INTRODUCTION | XR TECHNOLOGY
WHAT IS MR?

The virtuality continuum runs from the real environment on one end to the virtual environment on the other (illustrated in Figure 2). The area between both ends is named mixed reality, which includes augmented reality and augmented virtuality, both concepts that mix real and virtual display content. Virtual reality herein is defined as consisting of solely synthetic objects, created by a computer to make the user believe the synthetically created world is real and that he or she is present within that world.

Figure 2: The virtuality continuum - Real Environment | Augmented Reality (AR) | Mixed Reality (MR) | Augmented Virtuality (AV) | Virtual Environment (by author, based on Milgram & Kishino, 1994).

A description of VR by Mazuryk & Gervautz (1996) ties in with Milgram's, as they specify three factors to define VR: Immersion (presence), Simulation (solely synthetic objects) and Interaction. They further differentiate between contexts of use for VEs.

THE EVOLUTION OF HCI

How we interact with physical objects in the real world is determined by the limitations of the physical world - objects have physical properties and there are physical laws that determine how we interact with the material world. We took metaphors from the physical world and adapted them to help us interact with the digital world - we still use metaphors like windows and a desktop, as well as nested folders for searching for content in a non-visual way.

As technology has advanced, so too have the metaphors that we use to interact with the digital world around us - we have touchscreen phones and new metaphors along with them, like swiping and pinch-to-zoom.

Right now we have to use clunky controllers to interact with mixed reality. These interactions feel unnatural and clumsy. These controllers are designed for gaming and are in no way meant to be used in a productivity context. Some of their designs are already moving toward injecting the user's hands into the environment instead of being traditional controllers.

VIRTUAL REALITY

MIXED REALITY

Mixed reality is the name given to a spectrum of technologies from augmented reality to virtual reality. In this case, I am using it to describe when digital content is injected into the world around you through the use of a head-mounted display, such as the Microsoft HoloLens. These HMDs scan the environment so that programs and content are treated like a normal part of the environment.


NEW PARADIGM | HAND-GESTURE
THE MOST NATURAL WAY TO INTERACT

Benefits of MR

Digital content now has depth to it, allowing it to inhabit the world as if it were a real object - this opens up a whole new relationship with 3D models and content - we can leverage spatial memory to create new ways of navigating content.

In the real world, you can only interact with things that are within arm’s reach, but taking this limitation into XR would be seriously stunting the potential of the technology - you should have complete agency over the digital environment you’re in.

Because Mixed Reality superimposes digital content into the real world, and it is able to be anchored to the environment, you are able to augment real world objects - even yourself - this changes the ways in which we consume information.

Hand as Input Device

Hyper-Reality Keiichi Matsuda

We use our hands to control the world around us - they are our best input device, but the ways we use them are dictated by the limits of our physical world - it makes sense, therefore, that we would use our hands to control a spatial computing environment.


EQUIPMENT | COMPONENTS ASSEMBLY

CONNECTION AND INFORMATION TRANSMISSION OF DEVICES



EQUIPMENT | TESTING

KEY ELEMENTS AND DEVICE COMPARISON

Keys to the Sense of Reality

SLAM: simultaneous localization and mapping
Feeling of depth: monocular depth cues
Material: rendering quality and object opacity

Comparison of Devices

The HoloLens, Magic Leap, HoloLens 2, and the Leap Motion + ZED Camera rig were compared on rendering quality, occlusion, shadow casting, hand interaction, field of view (FOV), SLAM, and portability.

PROBLEMS OF MAINSTREAM HMDs
After testing a series of mainstream mixed reality HMDs on the market, including the HoloLens and Magic Leap, I found that the main problems are poor rendering, low opacity, and the lack of occlusion, which prevent the content from integrating well with the environment.


RESEARCH | HAND-GESTURE
DIFFERENT MEANINGS AMONG GESTURES

Tracking Points of Hand

Basic Pose Type


Basic Moving Gesture


HAND-GESTURE INTERACTION
If you look at existing AR inputs for inspiration for how we might interact with virtual interfaces, you come across a lot of gestures that are abstract and meaningless, meaning they have to be learned by the user. I wanted to see if it was possible to use this technology to create a natural user interface where you can learn how to use every aspect of the interface without having to be taught it explicitly.


EXECUTION | HAND-GESTURE INTERACTION CHOOSE THE BEST GESTURES BASED ON DEMANDS

Gesture Testing

Model Manipulation

Test Run

HAND-GESTURE INTERACTION
Gestures break down into three types - abstract, grounded, and real - and I tried to create only interactions based on metaphors we use in the real world to help generate this natural user interface.
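As a hedged illustration of what such a grounded, metaphor-based gesture can look like in code, the minimal Unity C# sketch below maps a closed fist onto a "grab and move" interaction. The palmPosition and grabStrength fields are assumptions standing in for values a hand-tracking SDK such as Leap Motion would supply every frame; the class and field names are illustrative, not the project's actual implementation.

// A minimal Unity sketch of a grounded "grab to move" gesture. The grab strength and palm
// position are assumed to be fed by the hand-tracking SDK; they are placeholders here.
using UnityEngine;

public class GrabToMove : MonoBehaviour
{
    public Vector3 palmPosition;      // world-space palm position, updated by the tracking provider
    public float grabStrength;        // 0 = open hand, 1 = closed fist, updated by the tracking provider
    public float grabRadius = 0.1f;   // how close the palm must be to an object to grab it (metres)

    private Transform grabbed;        // the object currently held, if any
    private Vector3 offset;           // keeps the object from snapping to the palm centre

    void Update()
    {
        bool fistClosed = grabStrength > 0.8f;

        if (fistClosed && grabbed == null)
        {
            // Look for a model within reach of the palm and pick it up.
            Collider[] hits = Physics.OverlapSphere(palmPosition, grabRadius);
            if (hits.Length > 0)
            {
                grabbed = hits[0].transform;
                offset = grabbed.position - palmPosition;
            }
        }
        else if (fistClosed && grabbed != null)
        {
            // While the fist stays closed, the model follows the hand.
            grabbed.position = palmPosition + offset;
        }
        else
        {
            // Opening the hand releases the model where it is, mirroring the real-world metaphor.
            grabbed = null;
        }
    }
}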


RESEARCH | MR INTERFACES THE POSSIBILITIES OF MR INTERFACES

Two Types of Spatial Interfaces

Principle


Now that digital content has become spatial, the ways in which we are able to interact with it and control it are completely different. I set the goal of creating new paradigms and metaphors for interacting with that content. The first step was to understand the different ways content can manifest, in order to understand the associated interactions that could be designed for it.

Environment-anchored UI Elements


1. SPATIAL UI PANEL

Body-anchored UI Elements

DEFINE AREAS OF HAND

2. RESPONSIVE BUTTON


EXECUTION | VIRTUAL WEARABLES EXPANDABLE INTERFACES THAT FOLLOW THE HAND

1. WRIST INTERFACE
An interface on the left wrist displaying basic information.

2. RAY CAST BUTTON
A button on the right wrist that turns on the ray. Objects' basic information is displayed when they are pointed at by the ray (see the sketch after this list).

3. OPTIONS SWITCHING
Users can switch design options on the left palm panel.

4. SUNLIGHT ADJUSTMENT
Users can adjust sunlight brightness on the left palm panel.

5. SECTION TOOL
By hitting the button floating above the model, users will see a translucent plane. Similar to the way of translating the model, users make their left hand a fist to move the plane remotely and see the section of the model.
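A minimal sketch of how the ray cast button (item 2 above) could be wired up in Unity, assuming a transform driven by the tracked right wrist and a UI Text field on the wrist interface. The names rightWrist, infoPanel, and rayEnabled are illustrative placeholders, not the project's actual code.

// Casts a ray from the wrist and shows basic information about whatever it hits.
using UnityEngine;
using UnityEngine.UI;

public class WristRayInfo : MonoBehaviour
{
    public Transform rightWrist;    // driven by the hand-tracking provider
    public Text infoPanel;          // the wrist interface text field
    public bool rayEnabled;         // toggled by the ray cast button
    public float maxDistance = 10f;

    void Update()
    {
        if (!rayEnabled) return;

        Ray ray = new Ray(rightWrist.position, rightWrist.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            // Show the object's name and distance on the wrist panel.
            infoPanel.text = hit.collider.name + "\n" +
                             "distance: " + hit.distance.ToString("F2") + " m";
        }
        else
        {
            infoPanel.text = "";
        }
    }
}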

Full Demonstration Video: https://www.youtube.com/watch?v=nHlN_FSRD6o


DESIGN | CONCEPTS

IN ORDER TO MAKE THE INTERACTION SMOOTHER

Pull Content Out of Screen

Stream the Changes Simultaneously


SYNCHRONIZATION
Changes made in Rhino, Maya, or 3ds Max can be streamed to the object in the real environment right away.
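One possible way to sketch such synchronization in Unity C#, under the assumption that a plugin in the modeling program pushes updated values over the network. Here only a position is streamed as a UDP text message on an arbitrary port (9050); the port, message format, and class name are illustrative simplifications, not the project's actual pipeline.

// Listens for "x,y,z" messages from the modeling program and applies them to the pulled-out model.
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class ModelSyncReceiver : MonoBehaviour
{
    public Transform syncedModel;              // the model the user pulled out of the screen

    private UdpClient udp;
    private readonly object gate = new object();
    private Vector3 latest;
    private bool hasUpdate;

    void Start()
    {
        udp = new UdpClient(9050);              // illustrative port
        udp.BeginReceive(OnReceive, null);
    }

    void OnReceive(IAsyncResult result)
    {
        IPEndPoint source = new IPEndPoint(IPAddress.Any, 0);
        byte[] data = udp.EndReceive(result, ref source);

        // Expected message: "x,y,z" describing the model's new local position.
        string[] parts = Encoding.UTF8.GetString(data).Split(',');
        if (parts.Length == 3)
        {
            lock (gate)
            {
                latest = new Vector3(float.Parse(parts[0]), float.Parse(parts[1]), float.Parse(parts[2]));
                hasUpdate = true;
            }
        }
        udp.BeginReceive(OnReceive, null);      // keep listening for the next change
    }

    void Update()
    {
        // Unity objects may only be touched on the main thread, so apply the newest value here.
        lock (gate)
        {
            if (hasUpdate) { syncedModel.localPosition = latest; hasUpdate = false; }
        }
    }

    void OnDestroy()
    {
        if (udp != null) udp.Close();
    }
}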


EXECUTION | INTEGRATION WITH GESTURES ACHIEVEMENT OF THE DESIGN CONCEPTS

1. OPEN THE PORTAL
Lightly touch the screen to open the portal that connects to the digital world.

2. PULL THE MODEL OUT
Make the right hand a fist to pull the model out of the modeling program, then put the model in place.

3. MODEL MANIPULATION
Interact with the digital model.

4. MAKE CHANGES
Make changes in the modeling program. The changes will be reflected simultaneously on the model users pulled out into the real world.

Full Demonstration Video: https://www.youtube.com/watch?v=3wg6SyOynW4


MR x IoT The Future of Mixed Reality and Internet of Things

Imagine that when your MR headset is connected to all the appliances at home, you can access all their functions and precisely control them through virtual wearables and basic gestures. With the assistance of this technology, the convenience of life will be significantly improved not only for my grandmother, but also for more people with reduced mobility in the world.


In the future of interactive technology, I can't help but imagine that physical devices will be miniaturized to the point of being invisible, and people will be equipped with the ability to affect reality through near naked-eye VR interaction, which acts as an enlargement and extension of their physical capabilities. It is like a superpower. This new generation of interactive methods can enable that superpower to help make up for, or even eliminate, people's congenital physical impairments.

TV: Control on/off, change the channel, and adjust the volume through virtual interfaces.

SPEAKER: Switch music, control on/off, and adjust the volume.

CEILING LAMP: Control the on/off, color temperature, and brightness of the ceiling lamp through virtual interfaces.

DIGITAL PHOTO FRAME: Switch to any photo or video through virtual interfaces.

CHAIR WITH WHEELS: Move the chair remotely.

THERMOSTAT: Adjust the temperature in the room.


Adapted from The Disciplines of User Experience, Dan Saffer (2008)


#2 MIXED REALITY IN AEC INDUSTRY

iPad-based Augmented Reality Applications
Individual Work | Company: RCH Studios

Mixed reality (MR) combines the virtual and the physical realities into one space and offers an exciting new design paradigm for architects. By projecting a BIM model directly over a physical site in mixed reality, architects can communicate design ideas to the team and clients in an immersive and interactive way. This project demonstrates case studies of mixed reality on the iPad applied to different phases of architectural projects. I will share my explorations in utilizing this technology in different aspects. Finally, I will examine the potential of mixed reality tools for construction administration and envision on-site mixed reality clash detection using the iPad.

Full Demonstration Video: https://youtu.be/f1sRahb9ssA

Workflow & Platform

Unity

Revit

Visual Studio

Rhino 3D


RESEARCH | PAIN POINTS ROOM FOR IMPROVEMENT

Diagram: Owner, Architect, Engineer, and Contractor communicating across the pricing, concept, and schematic stages.

LOW-EFFICIENCY COMMUNICATION

HUGE COST
The global construction market will hit $10.3 trillion in 2020, and it is a field where mistakes are costly: rework typically makes up 12 percent of project costs, reaching tens of millions of dollars on big projects. Augmented reality can change that, which is why architecture is one of the first industries starting to explore the use of AR in day-to-day work.


RESEARCH | REASON AND METHOD WHY AND HOW TO USE MR TECHNOLOGY

Benefits of MR to AEC Industry

Context Specific
The ability to visualize our designs in context is crucial for understanding our impact and communicating our ideas with others.

Interactivity
Visualizing our designs in context is time-consuming and often only produces images from single views. Augmented reality allows anyone to view our designs in context from their own unique perspective.

Rapid Prototyping Rapid prototyping in XR can immediately clarify issues of scale, position, sight lines.

Flow Process Chart

BIM SOFTWARE (Revit) → BIM DATA in .xlsx file format → converted from .xlsx to .csv file format.
SOFTWARE FOR MASSING STUDY (Rhino, SketchUp, Maya, 3ds Max, CAD) → ANALYSIS DATA in .xlsx file format and MODEL GEOMETRY in .fbx or .obj file format.
UNITY 3D: Import the model geometry, BIM data, and analysis data into a project in the Unity 3D game development engine.
VISUAL STUDIO: Build projects and develop interactions using C# in Visual Studio.
TARGET DEVICES: MR HMDs (HoloLens / Magic Leap) and tablets (iPad / Android devices).
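As a hedged sketch of the "BIM data into Unity" step of this flow chart, the C# below parses the converted .csv into a per-element dictionary and looks entries up by GameObject name. The column names and the convention of naming geometry after the element id are assumptions for illustration, not the project's actual schema.

// Loads BIM parameters from the converted .csv so they can be shown next to the geometry.
using System.Collections.Generic;
using UnityEngine;

public class BimDataImporter : MonoBehaviour
{
    public TextAsset bimCsv;   // the converted .csv file, added to the Unity project

    // Maps an element id to its BIM parameters (column name -> value).
    public Dictionary<string, Dictionary<string, string>> elements =
        new Dictionary<string, Dictionary<string, string>>();

    void Start()
    {
        string[] lines = bimCsv.text.Split('\n');
        string[] header = lines[0].Trim().Split(',');   // e.g. "ElementId,Category,Level,Material"

        for (int i = 1; i < lines.Length; i++)
        {
            string line = lines[i].Trim();
            if (line.Length == 0) continue;

            string[] cells = line.Split(',');
            var row = new Dictionary<string, string>();
            for (int c = 1; c < header.Length && c < cells.Length; c++)
                row[header[c]] = cells[c];

            elements[cells[0]] = row;                    // first column is the element id
        }

        Debug.Log("Imported BIM rows: " + elements.Count);
    }

    // Example lookup: describe the geometry piece the user tapped, assuming it is named after its element id.
    public string DescribeElement(GameObject geometry)
    {
        if (!elements.TryGetValue(geometry.name, out var row)) return "No BIM data";
        var sb = new System.Text.StringBuilder();
        foreach (var pair in row) sb.AppendLine(pair.Key + ": " + pair.Value);
        return sb.ToString();
    }
}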


RESEARCH | DESIGN PHASES
WHERE CAN MR TECHNOLOGY BE APPLIED

COMMUNICATION DESIGN: time consumption before and after applying XR, mapped for the architect / interior designer and for the client & contractor across the project timeline.

The timeline covers conceptual design, research, outline specs, the preliminary site plan, schematic design, design development, construction documents, construction administration, and exhibition. Typical items along it include the project program, zoning review, as-built drawings, interior concept, contractor recommendation, preliminary cost estimate, interior mood images, program massing models, diagrammatic floor plans, daylight design and verification, acoustic verification, FF&E selection, 3D design models of the exterior, interior finish palettes, reflected ceiling plans, detail drawings, consultant drawings, cost estimate revisions, shop drawings, the punch list, site meetings with contractor and client, physical models, booth setup, and final review.

SCHEMATIC DESIGN (SD)
The architect does precedent research and any analysis of the property, including zoning and building code issues that may affect the specific development.

DESIGN DEVELOPMENT (DD)
The architect revises the initial drawings based on the client's comments from the SD phase, capturing more specifics and details with these freshly revised sketches.

CONSTRUCTION DOCUMENTS (CD)
By now, the client and architect will have settled on a final design and will begin preparing the drawings, notes, and most technical specifications necessary for bidding, construction, and the permit application.

CONSTRUCTION ADMINISTRATION (CA)
In this phase, architects will periodically visit the job site to see progress and ensure the contractor is following the plans per the architectural design intent.

EXHIBITION
Some projects will be exhibited for the public. Exhibitions have been extremely important in motivating progress and sharing ideas throughout architectural history.


APPLICATION | USE CASES
HOW CAN MR TECHNOLOGY BE UTILIZED

Use cases: construction, interior design, presentation.

Full Demonstration Video: https://youtu.be/f1sRahb9ssA

CONSTRUCTION SITE | VR SPHERE INTEGRATE THE MODEL WITH REAL-WORLD SITES

Floor Plan


VR SPHERE
The sphere allows you to place a portal which, when walked through, gives you a completely different experience. It can feel like you've teleported to another scene.
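A minimal Unity sketch of how such a walk-through sphere could behave, assuming the sphere carries a trigger collider and the headset camera carries a small trigger-capable collider tagged "MainCamera". The two environment roots are illustrative placeholders, not the project's scene setup.

// Swaps environments when the user's head passes through the sphere.
using UnityEngine;

public class PortalSphere : MonoBehaviour
{
    public GameObject siteEnvironment;   // shown before the user walks in
    public GameObject immersiveScene;    // shown once the user is inside the sphere

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("MainCamera")) return;
        siteEnvironment.SetActive(false);
        immersiveScene.SetActive(true);   // walking through "teleports" the user into the scene
    }

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("MainCamera")) return;
        immersiveScene.SetActive(false);
        siteEnvironment.SetActive(true);  // stepping back out restores the site view
    }
}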


CONSTRUCTION SITE | VR SPHERE INTEGRATE THE MODEL WITH REAL-WORLD SITES

1. CAFE

2. BARISTA

3. HALL

4. WORKSTATIONS

After gaining a better understanding of SLAM and the construction schedule of the project, we came to realize it was better for us to test out the MR tools in a more controlled environment before going onto an active project site. The tools that needed to be developed and programmed were alignment, layering, and annotation. After the steel structure is erected, the MR technology will be a much more effective tool in that it can virtually tie itself to the superstructure and begin to overlay augmented virtual information on the site.


CONSTRUCTION SITE | MODEL OVERLAY INTEGRATE THE MODEL WITH REAL-WORLD SITES

Piping Layout


FLOOR 2

FLOOR 1

Simultaneous Localization and Mapping (SLAM)


USING SLAM FOR MODEL ALIGNMENT The model alignment between the virtual and the physical worlds is the first step needed to anchor the model onto the exact location in the real world. In order to align the two spaces, it is necessary to lock three virtual points to three physical points in order to lock the X, Y, Z axes.
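The sketch below reconstructs the idea of that three-point lock in Unity C#: three points picked in the model and their three measured counterparts on site define a rigid rotation and translation that anchors the model. It assumes the two point triples are congruent (no scaling) and that the model root starts out in the virtual coordinate system; it illustrates the principle only and is not the project's alignment code.

// Computes a rigid transform from three virtual/physical point pairs and applies it to the model.
using UnityEngine;

public class ThreePointAlignment : MonoBehaviour
{
    public Transform modelRoot;                         // the BIM model, authored in "virtual" coordinates
    public Vector3[] virtualPoints = new Vector3[3];    // picked in the model
    public Vector3[] physicalPoints = new Vector3[3];   // measured on site via the SLAM tracking

    // Builds an orientation frame from three points: forward along p0->p1,
    // with p2 used to fix the plane (and therefore the roll about that axis).
    static Quaternion FrameFromPoints(Vector3 p0, Vector3 p1, Vector3 p2)
    {
        Vector3 forward = (p1 - p0).normalized;
        Vector3 up = Vector3.Cross(forward, (p2 - p0).normalized).normalized;
        return Quaternion.LookRotation(forward, up);
    }

    public void Align()
    {
        Quaternion virtualFrame  = FrameFromPoints(virtualPoints[0],  virtualPoints[1],  virtualPoints[2]);
        Quaternion physicalFrame = FrameFromPoints(physicalPoints[0], physicalPoints[1], physicalPoints[2]);

        // Rotation that carries the virtual frame onto the physical frame,
        // then a translation that makes the first point pair coincide.
        Quaternion rotation = physicalFrame * Quaternion.Inverse(virtualFrame);
        Vector3 translation = physicalPoints[0] - rotation * virtualPoints[0];

        // Apply the rigid transform to the model root.
        modelRoot.SetPositionAndRotation(rotation * modelRoot.position + translation,
                                         rotation * modelRoot.rotation);
    }
}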


CONSTRUCTION SITE | MODEL OVERLAY INTEGRATE THE MODEL WITH REAL-WORLD SITES

MODEL OVERLAY
Designers place the virtual model using the two-point alignment method so that the model is anchored correctly onto the real world at full scale. The designer can then review the various models at full scale, such as the piping layout. Being at full scale, designers can walk around and experience the space in an immersive fashion.


Source: Zishan Liu, China, 2017


CHAPTER TWO

TANGIBILITY

This chapter is based on formal and kinematic logics. The intention behind this chapter is to make the formal system "intelligent" by providing the ability to respond to its context, not only through the kind of physical transformation that is mostly inherent in natural organisms but also by overlaying a digital "interface" that allows the model to negotiate realities between the digital and the physical worlds.



Source: Study of the human figure, Anterior view, from 'A comparative anatomical exposition of the structure of the human body with that of a tiger and a common fowl', George Stubbs


#3 BRAINWAVE INTERACTIVE INSTALLATION

Brainwave-controlled Transformational Cubes
Instructor: Guvenc Ozel
Collaborator: Wai Ching Cheng

The intent of this studio is to explore the logics and formal outputs of organizational behaviors by researching specific terminologies and finding precedents in the fields of digital and interactive design, and to develop an architectural prototype that acts as a formal vessel fulfilling a number of idealized conditions related to hierarchy, structure, etc. In our project, for example, the goal is to transform a surface into a volume. The latter half of the studio turns our physical models into autonomously reactive systems of mixed realities based on the formal and kinematic logics we established earlier. We used the EMOTIV Epoc+ headset and electroencephalography to bridge the physical and digital worlds, and to allow our physical model to respond to its context "intelligently" through physical transformation and a digital interface.

Video Link https://youtu.be/IQvZyXJZQ6o

Workflow & Platform


FOLDING METHOD | EDGES
A 3-DIMENSIONAL FOLDING WAY

The Introduction

Core Motion

Critical Edges

Constraint Pattern: location of the required edge constraints/hinges to allow a successful transition from Phase 1 to Phase 3. Connection to the other half.

Small units connection

Second phase connection


FOLDING METHOD | CUBES

IMPROVEMENT BASED ON THIS BASIC METHOD

Cube Locations

CUBE LOCATIONS

+ Square

+ Big Cut

+ Bottom Cut

+ Upper Cut

Side Cuts

Motion Sequence

MOTION SEQUENCE

MOTION

Expand Upwards

Open both sides

RESULTS

PHASE 1

PHASE 2

VIEWS

Side View

Plan View

Fold back

Descend towards center PHASE 3

Plan View

Side View

CUBE COMBINATION
Movement and change of direction of each block when expanding from Phase 1. Location of the required edge constraints/hinges to allow a successful transition from Phase 1 to Phase 3.


THREE PHASES | TRANSFORMATION A 3 DIMENSIONAL FOLDING WAY

THE THREE PHASES

PHASE 1 - Division

PHASE 2 - Volume

PHASE 3 - Mass

FINAL DESIGN FORM
The motors are divided into three sets, with each set controlling a motion phase. For instance, two servo motors work together to lift up a number of cubes to create a rotational effect.


THREE PHASES | TRANSFORMATION A 3 DIMENSIONAL FOLDING WAY

PHASE 1-2

PHASE 2-3

FOLDING PROCESS
The six servos must constantly work in relationship to each other to create harmonic movements.


INTERACTION DESIGN | CONFIGURATIONS A 3 DIMENSIONAL FOLDING WAY

INPUT
The user thinks about a trained action (e.g., a pushing motion). The EEG headset (Electroencephalography, EMOTIV Epoc+) converts the recognized action into a letter or number keystroke.

PROCESS
In the visual programming environment (Grasshopper with Firefly), the keystroke is "pressed" once and triggers true/false actions, which start a number counter counting from 0-90 or 0-180. These numbers control the rotation degree of the servo motors.

OUTPUT
Six servo motors (Power HD) are placed inside different cubes to rotate them at different times and to different degrees.

ANTICIPATED RESULTS
The motors are divided into three sets, with each set controlling a motion phase. For instance, two servo motors work together to lift up a number of cubes to create a rotational effect. The six servos must constantly work in relationship to each other to create harmonic movements. The transition diagrams record the rotation applied by each set (Set 1, Set 2, Set 3) at each step: Phase 1 to Phase 1.5, Phase 1.5 to Phase 2, and Phase 2 to Phase 3.

MECHANISM DETAILS
Standard servo motor: PowerHD LF-20MG.
Servo motor gear: 3D printed 18-tooth plastic gears.
Rotation axis holder: attached to the cube to make sure the rotation axis is in the right place.
Metal hinge: increases the stability of the cubes.
Gear of supporting cube: attached to the neighboring cubes to provide movement.
Motor holders: attached to the inner wall of the cube to provide motor location accuracy.


INTERACTION DESIGN | CONFIGURATIONS A 3 DIMENSIONAL FOLDING WAY

ARDUINO BOARD AND SERVO MOTOR
The user thinks about a trained action (e.g., a pushing motion). The action is converted into a letter (e.g., a number or letter of the alphabet). The letter is "pressed" once in Grasshopper and triggers true/false actions, which start a number counter counting from 0-90 or 0-180. These numbers control the rotation degree of the servo motors.
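A minimal sketch of the same input-process-output chain, written in Unity C# instead of Grasshopper/Firefly for illustration: the EMOTIV software is assumed to emit the trained keystroke, a counter then climbs to 90 (or 180) degrees, and each value is sent over a serial port to an Arduino that moves the servo. The port name, key, plain-text message format, and the availability of System.IO.Ports (which requires the .NET 4.x API level in Unity) are all assumptions.

// Turns a trained "thought keystroke" into a climbing servo angle sent over serial.
using System.IO.Ports;
using UnityEngine;

public class ThoughtToServo : MonoBehaviour
{
    public string portName = "COM3";        // illustrative serial port
    public float degreesPerSecond = 45f;    // how fast the counter climbs
    public float targetAngle = 90f;         // 90 for the half rotation, 180 for the full one

    private SerialPort port;
    private float angle;                    // current counter value
    private bool counting;

    void Start()
    {
        port = new SerialPort(portName, 9600);
        port.Open();
    }

    void Update()
    {
        // The trained mental command is assumed to arrive as a simulated keystroke.
        if (Input.GetKeyDown(KeyCode.A)) counting = true;

        if (counting && angle < targetAngle)
        {
            angle = Mathf.Min(angle + degreesPerSecond * Time.deltaTime, targetAngle);
            port.WriteLine(Mathf.RoundToInt(angle).ToString());   // the Arduino parses this into a servo angle
        }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}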


INTERACTION DESIGN | EVER-CHANGING PATTERN
A 3-DIMENSIONAL FOLDING WAY

Box Pattern

Scene from HoloLens

LENTICULAR PATTERNS Lenticular printing is a technology in which lenticular lenses (a technology that is also used for 3D displays) are used to produce printed images with an illusion of depth, or the ability to change or move as the image is viewed from different angles.


INTERACTION DESIGN | BRAINWAVE
A 3-DIMENSIONAL FOLDING WAY

EEG HEADSET CONTROLLING
The user thinks about a trained action (e.g., a pushing motion). The action is converted into a letter (e.g., a number or letter of the alphabet). The letter is "pressed" once in Grasshopper and triggers true/false actions, which start a number counter counting from 0-90 or 0-180. These numbers control the rotation degree of the servo motors.



Source: Study of the human figure, Anterior view, from 'A comparative anatomical exposition of the structure of the human body with that of a tiger and a common fowl', George Stubbs


#4

SOFT ROBOTIC ARM

Instructor: Guvenc Ozel
Collaborators: Wai Ching Cheng, Yifan Chen

We designed a robotic couture for the small robots in the IDEAS Lab in the form of a functional prosthesis. The focus is a scenario where soft robotics are combined with hard or rigid robotics and how they interact with a geometrical object. In this case, the small robotic arms serve as the site for our design, and the task is to equip or extend them like a tool or gripper. The soft extensions we designed have to incorporate the possible movement of the rigid robots and combine the properties of soft and rigid. The prosthetic extension we design has to interact with a distinct object with a particular geometry and topology. We focus on how the final design and object of our robotic extensions are articulated through the interaction with the distinct geometrical object. This interaction can be based on lifting, rotating, wrapping, deforming, or gripping.

Video Link: https://youtu.be/GbvkOWO1ymg

Workflow & Platform


INTRODUCTION | SOFT-ROBOTICS GETTING IN SOFT TOUCH

The Introduction

ROBOTIC COUTURE: ABOUT ROBOTIC PROTOCOLS, HERETIC MACHINES AND A PROSTHETIC FOR THE CONTEMPORARY

As soft robotics becomes more and more elaborate, its practice appears in fields such as medical systems, aerospace, industrial assembly systems, catastrophe intervention, and other industries involving direct contact with human beings. Due to their soft and pneumatic nature, soft robots have a compelling record in the sense of interaction, extension, and mutualism with the human body.


This record gives the possibility of a more compliant, sensitive and corporeal conception of technology in architecture and design. In the scenario of the contemporary, these heretical machines challenge the constraints of permanence and imply the notion of a spatial prosthetic. Just as the introduction of virtual and augmented reality enlarges our capability of perception and creation of space, responsive building parts trigger a new conceptualization of vitalism and dynamism in architecture.

Design intent

Object’s curvature analysis

Object Form Study: the designed position of the metaball object will be symmetrical.

Symmetrical position of the metaball object

Asymmetrical position of the metaball object

THE SOFT-ROBOTICS We work on the concept of how the extensions can interact, grab and approach the object and the first task will be to develop a small excerpt of the overall concept design which follows one specific movement and function in relation to the object. This can involve gestures like lifting, folding, inflation, deflation, and rotation.


Mold analysis

SOFT EXTENSION | MODELING

THE RELATIONSHIP BETWEEN THE MOLD AND THE SOFT ROBOT

Molds

Soft robotics arm attachment

Thick silicone surface designed to bend the shape downward

Thin silicone surface designed to give texture to inflated chambers

Soft Robots

Incorporated chambers: filled with silicone to be incorporated as the thick skin.

Air circulation: airway providing air to the chambers.

DESIGNING THE MOLDS We investigate in CAM (Computer Aided Manufacturing) to obtain our molds and to cast our soft robotics. Therefore we will utilize tools like CNC, Laser cutting and 3D Printing. Within this process, we investigate how the design and topology of molds will influence and guide the final performance of the responsive and transformable object.


PROSTHETIC EXTENSION | GRABBING THE PROCESS OF GRABBING OBJECT

End Effector Design

Object Form Study

THE TRANSFORMATION OF TWO SOFT ROBOTS More precisely, we sharpen the concept of prototypes and the overall design and focus on three particular moments of their transformative performance. These moments and gestures can already be linked to the small robots in the robot lab and show specific states of interaction with the object and the prototypes act as an excerpt of the Full-Scale Design.


Design intent

Object’s curvature analysis

PROSTHETIC EXTENSION | MOUNTING

THE CONNECTION PART FOR MOUNTING THE WHOLE EQUIPMENT

Object Form Study

Connection to the soft robotics, designed to mimic the curvature of the object

Wire connection

Air pump tube connection

Connection from the robotic arm to the soft robotic

Hall sensor placed under the connector

AUTOMATIC OPERATION

In this case, the small robotic arms serve as site for our design and the task is to equip or extend them like a tool or gripper. The correlation and performance between the rigid robots and the soft robotics is crucial and we are focusing on how the overall Gestalt and spatial performance of the rigid robots is changed over time. The soft extensions, which we design, have to incorporate the possible movement of the rigid robots and combine the properties of soft and rigid.



RIGID ROBOTS | INTERACTION
THE THREE DISTINCT STAGES OF MOVEMENT AND INTERACTION, AND MACHINE VISION

We also focus on the integration and refinement of the concept in terms of techniques in the notion of machine vision. We are looking for the interactivity, vitalism and dynamism of the design. This means we will involve a solution for object sensing and interaction with the context.

We integrate an external magnetic sensor to control the transformation of the design that results from the perception of the external devices. Therefore we created a live interaction pipeline from machine vision devices to the design via Grasshopper (Firefly, gHowl), Processing and Arduino boards.


Source: Zishan Liu, China, 2017


CHAPTER THREE

SPATIAL MEDIUM

This chapter lays stress on spatial form and transcends the purpose of form to raise rhetorical questions about the essence of space as well as its profound and dense internality. In my view, to create an architectural work is far more than producing a visible shell. To be specific, an architectural work ought to burst out from the intersection point of different vectors - it is neither an object nor a form; on the contrary, it is a site, a container, and a uniquely confined state.



#5 NOMADIC CLOUD

Mars 3D Printed Habitat

Recognition:
Second Prize: NASA [Project Mars] international competition | 2018
[Best Depicts Humans in Space] Prize: NASA [CineSpace] competition | Houston | 2019
Honorable Mention: [OUTER SPACE] competition | 2019
Finalist: [Marsception] Mars habitat design competition | 2018
Exhibition: [RUMBLE] UCLA graduation exhibition | Los Angeles | 2017
Screening: Regal LA Live Movie Theater | Los Angeles | 2019

Instructor: Guvenc Ozel
Collaborator: Wai Ching Cheng

The studio seeks to develop the fundamental designs and technological ideas necessary to manufacture an off-world habitat using recycled mission materials and/or local indigenous materials. The vision is that autonomous habitat manufacturing machines will someday be deployed to the Moon or Mars to construct shelters for human habitation. On Earth, these same habitat manufacturing capabilities could be used to produce housing wherever affordable housing is needed and access to conventional building materials and skills is limited.

Video Link: https://youtu.be/xJjEQYohgG4

Workflow & Platform


CONCEPT | PROBLEMS

TWO REASONS FOR THE NEED TO KEEP MOVING


THE NEED TO KEEP MOVING
To perform the multiple tasks listed by NASA, the first batch of astronauts needs to travel to different places. Besides, keeping moving helps them avoid frequent dust storms.


RESEARCH | MATERIAL

THE LIGHTEST SOLID MATERIAL IN THE WORLD


DESIGN | ROBOTIC FABRICATION THE AUTONOMOUS CONSTRUCTION PROCESS

Construction Process

3 Modes

3 Phases

PORTABLE & EXPLORABLE Due to its lightness, the habitat can be moved & lifted for multiple exploration tasks.


DESIGN | TECHNICAL DRAWING THE AUTONOMOUS CONSTRUCTION PROCESS

Auto-folded central core (assembly sequence, steps 1-8)

Section and Floor Plan

STRUCTURE / MEMBRANE SECTION

INNER GRAPHENE MEMBRANE: Clear graphene membrane that is part of the chamber.

OUTER GRAPHENE MEMBRANE: Graphene membrane produced on Mars with solar cells embedded within.

SPHELAR SOLAR CELLS: Spherical solar cells that can absorb solar radiation from different directions to supply energy for the heating of the membrane (cell layers: Ag electrode, n+ shell, p core, Al electrode).

AEROGRAPHITE SUPPLY PIPE: Pipeline structure that regulates the distribution of hydrogen gas and water.

ICE / HYDROGEN CHAMBER: Chamber that houses either hydrogen or ice depending on the location of the habitat; this area fills with hydrogen when the habitat needs to float and with ice when it is anchored to the ground.

AEROGRAPHITE FILM: A thin layer that separates the inner living area and the iced structure.

AEROGRAPHITE STRUCTURE: 3D printed aerographite structure with pores that could freeze or melt H2O to alter the weight and stability of the whole structural system.


DESIGN | CENTRAL CORE DETAIL
THE DETAILS OF THE FOLDABLE CENTRAL CORE


INTERACTION DESIGN | AR APP AN INTERACTIVE EXPERIENCE BASED ON TABLET


APP Demo Video: https://youtu.be/M1YiS1VJo7A


INTERACTION DESIGN | AR APP THE DESIGN OF UX & UI

Principle of Operation

Poster for Scan

FUNCTIONS This APP includes AR experience, cinematic video and information about Mars and the project, enabling people to better understand the project in an interactive way.


MODELS & HOLOLENS | MOCK-UP THE SIMULATION OF REAL FABRICATION PROCESS


ROBOTIC FABRICATION FULL SCALE MOCK-UP
We imitate the robotic fabrication process we designed for Mars in order to prove the validity of the method. The wires are first laid on a foam mould and then dipped in resin to hold their shape.


INTERACTIVE DESIGN | HOLOLENS AR AN INTERACTIVE EXPERIENCE BASED ON MODEL

HOW THE HOLOLENS WORKS
With the HoloLens, we have the ability to understand and track the model within a 3D environment. It can track the model and superimpose related information on top of it. In this way, you can establish a composite view of the augmented information and the real model.



Source: Study of the human figure, Anterior view, from 'A comparative anatomical exposition of the structure of the human body with that of a tiger and a common fowl', George Stubbs

Photos Credit: Z. Wang


#6 VULCAN PAVILION

Recognition: Guinness World Record [The Biggest Architecturally 3D Printed Structure]
Location: Parkview Green, Beijing
Construction period: 19 days
Contributions: Preliminary design; Construction; Photography
Collaborator: Archi-SolutionWorkshop

VULCAN comes from Latin and means 'volcano' in English. In Roman mythology it is the name given to the God of Fire. The idea of VULCAN symbolises a sense of fear and respect towards the unpredictable forces of nature, while suggesting the fragility yet courage of human civilisation. The process of 3D printing fabrication is the materialisation of the architect's thought. Additive fabrication helps the designer control the final form directly and bridges the gap between design and construction. We design the form of every component in the computer and then transfer the form into a series of data and code. The data contains the spatial information of every point of the component. The 3D printer moves along the path defined by the code and extrudes material.

Workflow & Platform


RESEARCH | CONCEPT
VOLCANO & COCOON



RESEARCH | FORM
VOLCANO & COCOON


PHOTO | HUMAN PERSPECTIVE

Photos Credit: Zhuoneng Wang


CONCEPT | RESEARCH FORM AND COCOONS

BREAKTHROUGH
It is the world's biggest 3D printed architectural space to date (a piece of architecture by academic standards: it takes up nearly 100 sqm and can be taken apart and reassembled at will).


DESIGN | PREPARATION FOR PRINTING

Flattening Hexagons
This step is for the convenience of 3D printing, because the working principle of the 3D printer is to print planar layers and then stack them.

As this series of graphs shows, the original spatially curved hexagons are optimized step by step into planar hexagons, so as to make the arch constructible.
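A hedged sketch of the planarity test behind the deviation/flatten values recorded below: each hexagonal panel is measured against a best-fit plane and marked flatten=true once its deviation is driven to (near) zero. The Newell-style normal and the tolerance value are illustrative choices made here for the sketch, not the project's actual Grasshopper definition.

// Measures how far a hexagonal panel is from being planar, and flags it once flat enough to print.
using UnityEngine;

public static class HexFlattening
{
    // Maximum distance of any vertex from the panel's best-fit plane.
    public static float Deviation(Vector3[] verts)
    {
        // Centroid of the panel.
        Vector3 centroid = Vector3.zero;
        foreach (var v in verts) centroid += v;
        centroid /= verts.Length;

        // Newell's method gives a robust normal for a near-planar polygon.
        Vector3 normal = Vector3.zero;
        for (int i = 0; i < verts.Length; i++)
        {
            Vector3 a = verts[i];
            Vector3 b = verts[(i + 1) % verts.Length];
            normal += new Vector3((a.y - b.y) * (a.z + b.z),
                                  (a.z - b.z) * (a.x + b.x),
                                  (a.x - b.x) * (a.y + b.y));
        }
        normal.Normalize();

        // Distance of the worst vertex from the plane through the centroid.
        float worst = 0f;
        foreach (var v in verts)
            worst = Mathf.Max(worst, Mathf.Abs(Vector3.Dot(v - centroid, normal)));
        return worst;
    }

    // A panel counts as flattened once its deviation falls below the tolerance.
    public static bool IsFlat(Vector3[] verts, float tolerance = 1e-4f)
    {
        return Deviation(verts) <= tolerance;
    }
}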

Optimization snapshots list each hexagon's deviation from planarity and whether it has been flattened (flatten=true once the deviation reaches 0.0). The number of flattened hexagons grows from 0, to 207-438, and finally to all 1023.


DESIGN | ACCUMULATION OF MODULES
FORM AND COCOONS

Flatten Hexagon Block

MODULES
VULCAN consists of 1023 different 3D printed constructive units. This system is a complete breakaway from the traditional method of arch construction.


CONSTRUCTION | FABRICATION COLLABORATION

Individual Panel 3D Print

silk structure

connecting structure

individual panel structure

path of 3D print

3D Print Process

The process of 3D printing fabrication is the materialisation of the architect's thought. Additive fabrication helps the designer control the final form directly and bridges the gap between design and construction. We design the form of every component in the computer and then transfer the form into a series of data and code. The data contains the spatial information of every point of the component. The 3D printer moves along the path defined by the code and extrudes material. In our studio, we have more than 20 3D printers. We produced all the bricks in one month and then began to construct the pavilion on site.

1. Coding single component

2. Materiality of code

3. 3D printing process


CONSTRUCTION | DETAILS

Construction Details

Micro structure:
We use a tenon structure to connect two plates. The operation method is to put the cork into a hole on one side, then use a screwdriver to pry the cork into the other end through a reserved slot.

Flexible temporary joint:
For the hollows in the middle part, in addition to the use of tenon and mortise, we also used ties to improve the overall strength and stability.

Immobilization of bottom:
At the bottom, where the tension is greatest, a glue gun is used to spray hot wax to increase the strength.




SignAR

Location: MIT, Boston Role: Developer Responsibility: Coding

AR sign language translation

Time: 2020.01 Website: http://signar.cargo.site/

Process of Gesture Recognition

Leap Motion Controller → Preprocessing → Calculation of the Value of Gestures → Computing Feature Set → Hidden Markov Models (HMMs) → Recognized Dynamic Gesture
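As a hedged illustration of the recognition step at the end of this pipeline, the C# sketch below scores a quantized observation sequence against a set of small discrete HMMs using the standard forward algorithm and reports the most likely gesture. The model layout and the assumption that the Leap Motion features are already quantized into symbols are simplifications for illustration, not the project's trained models.

// Discrete hidden Markov model: scores an observation sequence with the forward algorithm.
using System;

public class GestureHmm
{
    public string Name;
    public double[] Initial;        // pi[i]: probability of starting in state i
    public double[,] Transition;    // A[i, j]: probability of moving from state i to state j
    public double[,] Emission;      // B[i, o]: probability of observing symbol o in state i

    // Likelihood of the observation sequence under this model (forward algorithm).
    public double Likelihood(int[] observations)
    {
        int states = Initial.Length;
        double[] alpha = new double[states];

        for (int i = 0; i < states; i++)
            alpha[i] = Initial[i] * Emission[i, observations[0]];

        for (int t = 1; t < observations.Length; t++)
        {
            double[] next = new double[states];
            for (int j = 0; j < states; j++)
            {
                double sum = 0;
                for (int i = 0; i < states; i++)
                    sum += alpha[i] * Transition[i, j];
                next[j] = sum * Emission[j, observations[t]];
            }
            alpha = next;
        }

        double total = 0;
        foreach (double a in alpha) total += a;
        return total;
    }

    // Picks the gesture model with the highest likelihood for the observed sequence.
    public static string Recognize(GestureHmm[] gestures, int[] observations)
    {
        string best = null;
        double bestScore = double.NegativeInfinity;
        foreach (var g in gestures)
        {
            double score = g.Likelihood(observations);
            if (score > bestScore) { bestScore = score; best = g.Name; }
        }
        return best;
    }
}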



Sign Language Translation

We are inspired to use augmented reality to create an experience that allows people who are hearing or speech impaired to communicate via sound and text.

Other People's Voice to Text
Exploring Watson integration will allow machine learning to power up the vocabulary and scale the potential for meaningful communication around the world.

Customize Hot Keys
The feature of customizing vocabulary and phrases (hot keys) allows non-verbal communicators to communicate with friends, family, coworkers and loved ones more easily.

Full Demonstration Video: https://youtu.be/sbpE6keVX5o


VR KAYAKING

Company: Rios Clementi Hale Studios Location: Austin, Houston Role: Developer Responsibility: Coding Time: 2019.11

MASTER PLAN

Website: https://www.rchstudios.com/news/austin-design-week-rch-studios/

Flow Chart

Real World: the user logs in to the VR system, puts on the VR goggles, holds the oar while the controllers are positioned, paddles to perceive the movement of the oars, paddles to accelerate/decelerate/turn, and finally takes off the VR goggles.

Virtual World: a personalized choice is made, the system loads the scene and environment, the user enters the scene, the paddle motion is tracked through the controllers, and the user exits the scene.

EXPERIENCE THE PROJECT FROM THE RIVER
Kayaking and drifting on the Colorado River, users are able to experience this emerging public space from a different perspective.


KEY CODES

Left/right paddle rowing:

// Row when the left paddle blade dips below the waterline and has moved since the last frame.
if (paddleLeft.localPosition.y < -0.3f && paddleLeft.localPosition != leftPaddlePos)
{
    // Displacement of the blade in the kayak's local X/Z plane since the stroke started.
    leftPaddleDisonX = transform.InverseTransformPoint(paddleLeft.position).x - transform.InverseTransformPoint(leftPaddleUniPos).x;
    leftPaddleDisonZ = transform.InverseTransformPoint(paddleLeft.position).z - transform.InverseTransformPoint(leftPaddleUniPos).z;

    // Sideways displacement turns the kayak; displacement along Z pushes it forward.
    if (leftPaddleDisonX < 0)
        transform.localEulerAngles += new Vector3(0f, -leftPaddleDisonX * 15f, 0f);
    transform.Translate(Vector3.forward * -leftPaddleDisonZ * 2f, Space.Self);

    // A strong enough stroke tops up the cruising speed.
    if (Mathf.Abs(leftPaddleDisonZ) > 0.1f)
        Speed = 10f;

    // Remember whether the stroke was pulling forwards or backwards.
    if (leftPaddlePos.z > paddleLeft.localPosition.z) forward = true;
    if (leftPaddlePos.z < paddleLeft.localPosition.z) forward = false;
}

Start decelerating when the paddle is out of the water:

// Both blades are above the waterline, so the kayak coasts and slowly loses speed.
if (paddleRight.localPosition.y >= -0.3f && paddleLeft.localPosition.y >= -0.3f)
{
    if (Speed > Deceleration * Time.deltaTime)
        Speed = Speed - Deceleration * Time.deltaTime;
    else
        Speed = 0;
}

EXHIBITION IN AUSTIN

Full Demonstration Video: https://vimeo.com/378620945


HYPERLOOP STATION

Company: Hyperloop Transportation Technologies Location: Abu Dhabi Role: Designer & VR specialist (contract worker) Responsibility: VR Experience, Design, Rendering Time: 2017.10

FLOOR PLAN

HYPERLOOP STATION HTT’s proposed transit hub is a prototype station for a line that would connect Abu Dhabi and Al Ain in just 12 minutes.


TELEPORT: Users are able to move through teleporting.

DIRECTION INDICATOR: An indicator that follows the users and leads them to the boarding gate.

AUTO SLIDING DOOR: The door slides open when users approach (see the sketch below).

MATERIAL SWITCHING: Users are able to switch the material of the seat.

ENVIRONMENT SWITCHING: Users are able to switch the environment outside the window among a lake, Mars, a city, and space.

SPATIAL INTERFACE: The interface reminds users which boarding gate to go to.
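A minimal Unity sketch of the auto sliding door interaction above, assuming a trigger volume in front of the door and a player rig tagged "Player". The panel references, slide distance, and speed are illustrative values, not the project's actual parameters.

// Slides the door panels apart when the user approaches and closes them when the user leaves.
using UnityEngine;

public class AutoSlidingDoor : MonoBehaviour
{
    public Transform leftPanel;
    public Transform rightPanel;
    public float slideDistance = 1.2f;   // metres each panel moves when opening
    public float speed = 2f;

    private Vector3 leftClosed, rightClosed;
    private bool open;

    void Start()
    {
        leftClosed = leftPanel.localPosition;
        rightClosed = rightPanel.localPosition;
    }

    void OnTriggerEnter(Collider other) { if (other.CompareTag("Player")) open = true; }
    void OnTriggerExit(Collider other)  { if (other.CompareTag("Player")) open = false; }

    void Update()
    {
        // Move panels toward their open or closed positions along the door's local X axis.
        Vector3 leftTarget  = open ? leftClosed  + Vector3.left  * slideDistance : leftClosed;
        Vector3 rightTarget = open ? rightClosed + Vector3.right * slideDistance : rightClosed;

        leftPanel.localPosition  = Vector3.MoveTowards(leftPanel.localPosition,  leftTarget,  speed * Time.deltaTime);
        rightPanel.localPosition = Vector3.MoveTowards(rightPanel.localPosition, rightTarget, speed * Time.deltaTime);
    }
}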

Full Demonstration Video: https://www.youtube.com/watch?v=VaFmNZJKapU


PHOTOGRAPHY | LANDSCAPE



PHOTOGRAPHY | CREATIVE


