Viewing to Change


by Komal Mittal


CSA Research Report

Project Details

Project Lead:

Komal Mittal

Design Participants:

Lucy Jones

Title:

Viewing to Change

Location:

Herbert Read Gallery, UCA Canterbury Campus

Project Dates:

31 May 2018 - 31 July 2018: Project design
1 August 2018 - 23 August 2018: Building and testing on site
24 August 2018 - 31 August 2018: Installation open to public

Design Period:

June 2018 - August 2018

Budget:

£250

Scale:

3.65 m x 3.65 m

Support:

University for the Creative Arts



Research Statement

Research Agenda and Process Overview

Climate change has led to areas not previously under flood risk becoming prone to flooding. This increased risk has created great uncertainty and the need for drastic change to provide safety and sustainability. While some countries have recognised the immediate need for change by adopting resilient design practices at both micro and macro level (for example, the Netherlands), this awareness is still not widespread. For everyone to work on this problem, creating awareness of its magnitude is the first step.

Hyper-sensory experience of being a spatial witness to a flood.

To witness an environment of destruction spatially can be a way of confronting the need to change. This can be achieved through a participatory experience, which allows the user to witness the environment of a natural disaster and come to terms with its enormity. By creating a virtual experience of a flood, my aim is to make people understand how uncertain and devastating a flood can be. I hope the outcome of this experience will make people more understanding towards victims of flooding and prompt them to make changes in their lives that would leave them less vulnerable in a similar situation. An experience like this could be used in any context and might act as the final push towards change in regard to safety against flood risk.

Research Questions

1. In what ways can a 'spatial witness' approach develop uses of virtual reality in design research?
2. How can virtual experiences be used to increase understanding of flood risk?
3. How can a virtual experience go beyond the visual?
4. How real can hyper-sensory experiences be?

Fig.01 (previous page) 3D scanned point cloud of Greyfriars Gardens

Significance and Contribution

The magnitude of the threat of climate change can make its impact hard to conceive. When hearing about natural disasters around the world, the reaction or sympathy is now short-lived; people are no longer moved by news alone. A new dimension is needed to engage people with climate change, and an experience close to reality could awaken the reaction the news no longer provokes. Virtual reality has become a platform for exploring new environments, testing the limits of an experience and bringing it as close to reality as possible. Consequently, I decided to explore how this technology could be used to design the experience of a disaster, and how that experience could be enhanced. By using the 3D scanned data as a point cloud, the recreated virtual space is very close to reality. The enhancement comes from making it a hyper-sensory experience that includes the senses of hearing and touch alongside the visuals, making the whole experience more immersive for the user.

Methodologies

1. Digital design of flood simulations in a virtual environment using the 3D laser scanned point cloud of Greyfriars Gardens.
2. Design of an installation in which the senses of hearing and touch enhance the visual experience.
3. Display of the image and sound of the user's experience to other people around the installation.

Proposal & Context

Design Proposal

Viewing to Change is an immersive virtual reality experience of a flood, created by making the user a spatial witness to the event. The flood is staged in Greyfriars Gardens, Canterbury. The entire space was laser scanned as a series of points, and this point cloud was brought into a virtual environment. Because a point cloud is more an aesthetic experience than an editable object, the possible interaction was limited. The installation is primarily focused on how real a first-person flood experience can become, and on understanding its effect on the user.

The installation includes a 3 m x 3 m surface of grass and pebbles, which helps the user connect physically with the visual unfolding of points in the form of the garden space. The user can also hear the sound of the river flowing along with rainfall, thunder and wind, experienced as the water level rises in the virtual space. These visuals are projected on a TV screen in the real space so that other people can see what the user wearing the headset is experiencing.

Fig.02 3D laser scan extract Greyfriars Gardens

Key Technological Outcomes of Proposal

1. Using the 3D scanned point cloud data of Greyfriars Gardens and making it available in a virtual environment using Unity, accessible by wearing the virtual reality headset.
2. Adding water (a river) to the point cloud data and controlling its movement.
3. Getting the point cloud to interact with the water when the space is flooded.
4. Adding a surface of grass and pebbles, plus sound effects of rain and flood, while a person is in the virtual environment.
5. Projecting the same scene onto a TV screen with speakers so that other people can see what the user with the headset is experiencing (one possible setup is sketched below).

Fig.03 3D laser scan extract Greyfriars Gardens

Fig.04 3D laser scan extract Greyfriars Gardens
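Outcome 5 above, mirroring the headset view to the gallery TV, can be handled in more than one way; the sketch below shows one possible Unity setup using a second display. Unity also mirrors the headset view in its main window by default, so this is an illustrative variant, and the component and field names are my own assumptions, not taken from the project files.

using UnityEngine;

// Hypothetical sketch: shows the VR user's view on a second display (the
// gallery TV) by routing a non-VR spectator camera to Display 2.
public class SpectatorView : MonoBehaviour
{
    public Camera spectatorCamera; // a flat camera that follows the headset

    private void Start()
    {
        // Displays beyond the first must be activated before use.
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();
            spectatorCamera.stereoTargetEye = StereoTargetEyeMask.None; // render flat, not stereo
            spectatorCamera.targetDisplay = 1; // send output to Display 2 (the TV)
        }
    }
}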


Design Research Context

Field of Work

The design research is based on increasing flood risk, 3D scanned point cloud data, and the recreation of a first-person flood experience in a virtual space and its effect on the user.

Virtual reality was originally used primarily as a tool for gaming, but it is now being applied in various educational and training sectors. It helps the user understand a concept by turning it into a reality that a person can explore. Recreating an experience in this way can affect people differently, and these effects can be used as a medium for tackling increased flood risk. Virtual reality is proving to be a vital tool for educational and technological development, and it can create enhanced, hyper-sensory experiences. A great deal of work is being done around the world in which such immersive hyper-sensory experiences are created by involving senses other than vision alone.

Work by Others

Hyper-sensory experiences by others have highlighted the sense of sound (for example, In the Eyes of the Animal; Odyssey) and the sense of touch (for example, ScanLAB Projects).

Fig.05 shows a 3D scanned point cloud of Venice made by ScanLAB Projects to explore the depths of the city's underground and its historic fabric, which has not changed over time. ScanLAB create visual experiences using only 3D scanned data; these unique experiences allow the audience to navigate the same data sets captured on location by the scanning team and used to make the original show graphics.

Fig.05 Venice - ScanLAB Projects, 2017

Fig.06 is a still from a project called 'IMMERSED', developed by FEMA as a disaster-relief training tool. The project places people in a situation after a flood and gives them the opportunity to make decisions that ensure all participants move to safety. It helped participants develop better decision-making abilities and learn how to work together in such situations.

Fig.06 A still from IMMERSED, 2018


Design Methodologies

The project is a virtual reality spatial experience created using different modes of technology, 3D laser scanning and virtual reality combined with photographs, with the involvement of other senses to enhance the user's experience.

The experience is built on the 3D laser scan of Greyfriars Gardens, Canterbury. The scan of the site resulted in a point cloud which originally comprised over a billion points. This point cloud was processed down to around 15 million points and then brought into a virtual space using Unity.

A further aim was enabling 'sensory transportation' for the user by adding environmental factors to the virtual space. This included the creation of rain, fog and clouds, along with the sound of rainfall, lightning and wind.

The next step was adding water into the scene and getting it to interact with the point cloud. To test this, I added the different types of water bodies available in Unity, which allow a variety of control systems (Water Basic Daytime, Water Pro Advanced). After trying these methods I found that the water cannot directly interact with the point cloud and needs a medium. So I traced out the point cloud as a 3D model which would be used for interaction but would not be visible to the user. The water was made part of a mesh built using different points, but it was still a 2D surface.
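Since the traced model exists only for interaction, it can be hidden while its physics stays active. Below is a minimal sketch of that idea, assuming the traced mesh carries a MeshRenderer; the component name is my own, not from the project files.

using UnityEngine;

// Hypothetical sketch: the traced 3D model acts as invisible collision
// geometry, so the user sees only the point cloud but objects such as
// water and terrain still have a surface to interact with.
public class InvisibleCollisionMesh : MonoBehaviour
{
    private void Awake()
    {
        // Hide the traced mesh without disabling the GameObject.
        var meshRenderer = GetComponent<MeshRenderer>();
        if (meshRenderer != null) meshRenderer.enabled = false;

        // Make sure a collider exists so interaction still works.
        if (GetComponent<MeshCollider>() == null)
        {
            gameObject.AddComponent<MeshCollider>();
        }
    }
}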

Critical Design Elements

1. Using the 3D scanned point cloud of Greyfriars Gardens in a virtual space.
2. Getting objects and landscapes to interact with the point cloud.
3. Creating an interactive environment for a spatial witness.
4. Customising water by controlling its movement in a virtual environment (one approach is sketched below).
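Element 4 can be approximated by scrolling the water texture's UV offset over time so the river reads as flowing. This is a minimal sketch under assumptions: the material exposes the standard "_MainTex" texture slot, and the speed values are illustrative rather than the project's actual settings.

using UnityEngine;

// Hypothetical sketch: simulates river flow by scrolling the water
// texture across the surface each frame.
public class WaterFlow : MonoBehaviour
{
    public float flowSpeedX = 0.05f; // UV units per second along the river
    public float flowSpeedY = 0.0f;

    private Renderer surfaceRenderer;

    private void Start()
    {
        surfaceRenderer = GetComponent<Renderer>();
    }

    private void Update()
    {
        // Offset the texture continuously; a wrapping texture loops seamlessly.
        Vector2 offset = new Vector2(Time.time * flowSpeedX, Time.time * flowSpeedY);
        surfaceRenderer.material.SetTextureOffset("_MainTex", offset);
    }
}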


Process & Methods

Prototyping and Testing

The project started with getting the point cloud into a virtual space, then understanding the creation of different environments in that space using Unity. To do this, I began building distinct landscapes around the scanned site in an attempt to allow the user to encounter different seasons around the same space; this was also a way to understand and work with the technology. After finding my way around the technology, I decided it would be better to experiment only with the point cloud rather than the environments around it.

Because a point cloud is just a large set of points in space produced by 3D scanners, getting it to interact with other objects in Unity was the biggest challenge. A point cloud is primarily an aesthetic experience for the viewer, so there is little that can be done to change or edit it. To keep the experience as real as possible, I also tried not to add too much to the point cloud: the idea was to use the scan data and manipulate the areas around it. However, since the point cloud forms clusters of points at random, the amount of change possible was limited.

To track the progress made at every step, testing was done by repeatedly experiencing the virtual space myself. Every test would uncover issues such as changes in ground level, placement of background images, the river flowing too fast or too slow, and whether the experience felt immersive. Different elements of the scene were built separately and then integrated as part of the final scene. It was also necessary to understand how others felt about the experience; this was tested by gathering feedback on details that needed changing, which made it easier to understand how the whole experience was perceived by the user.



Fig.07 3D laser scanning Greyfriars Gardens

Fig.08 Greyfriars Gardens in a virtual space

Fig.09 Greyfriars Gardens in a virtual space

Fig.10 Greyfriars Gardens in a virtual space

Fig.11 Building landscape 1

Fig.12 Checking for details in virtual space

Fig.13 Building landscape 2

Fig.14 Building landscape 3

Fig.15 Creation of Clouds

Fig.16 Creation of Rain

Fig.17 Virtual Reality Toolkit Setup 1

Fig.18 Virtual Reality Toolkit Setup 2

Fig.19 Virtual Reality Toolkit Setup 3

Fig.20 Virtual Reality Toolkit Setup 4



Fig.21 Building Staircase 1

Fig.22 Building Staircase 2

Fig.23 Building Staircase 3

Fig.24 Building Inner Walls 1

Fig.25 Building Inner Walls 2

Fig.26 Building Inner Walls 3

Fig.27 Building Inner Walls 4

Fig.28 Setting Up Navigation 1

Fig.29 Setting Up Navigation 2


Fabrication Techniques

a. Real Space

Building a 3 m x 3 m surface of grass and pebbles, with an anti-static black mat beneath it to avoid any spillage, which extends the overall size to 3.65 m x 3.65 m. This limitation of space was due to the movement range of the motion sensors, which is 4 m diagonally. The black mat was cut into the required sizes, the pieces were joined using tape, and it was placed directly on the ground. The artificial grass was then fixed onto the black mats using adhesive, and the pebbles were scattered around the surface.

b. Virtual Space

The 3D laser scan provided an accurate replica of Greyfriars Gardens, which was brought into Unity. Around it I added a background made from photographs of the site's surroundings. In addition, different environmental factors were added. The river that flows through the centre of the garden space was added as a 2D surface with a water texture. Terrain and different ground levels were added to the point cloud to enable movement in the scene.

c. Water Surface and Movement

The water surface available in Unity is a 2D surface. The idea was to add water to the scene and then raise its level, creating an underwater effect beneath the risen surface, as the water in Unity is not volumetric. To customise the water in the scene I started with a Unity asset called Water Basic Daytime, which had limited customisation options. The next type of water was Water Advanced Pro, which offered a variety of customisation options allowing me to control the colour, flow and reflection of the water. This water texture was then added to a custom mesh created to fit the shape of the river in the space.

Fig.30 Setting Up Background with photographs

Fig.31 Photographs of background 1

Fig.32 Photographs of background 2

Fig.33 Photographs of background 3

Fig.34 Photographs of background 4

Fig.35 Surface detail diagram
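Because the Unity water is not volumetric, the underwater effect described in (c) has to be faked. One common approach, sketched below, is to switch on tinted scene fog whenever the headset camera drops below the water plane; the thresholds, colours and names here are illustrative assumptions, not the project's actual values.

using UnityEngine;

// Hypothetical sketch: creates an "underwater" impression by enabling
// dense, tinted fog while the camera is below the flat water surface.
// Assumes the headset camera is tagged MainCamera.
public class UnderwaterEffect : MonoBehaviour
{
    public Transform waterSurface; // the 2D water plane in the scene
    public Color underwaterFog = new Color(0.1f, 0.3f, 0.4f);

    private bool wasUnderwater;

    private void Update()
    {
        bool underwater = Camera.main.transform.position.y < waterSurface.position.y;
        if (underwater == wasUnderwater) return;
        wasUnderwater = underwater;

        // Thicken and tint the fog below the surface; clear it above.
        RenderSettings.fog = underwater;
        RenderSettings.fogColor = underwaterFog;
        RenderSettings.fogDensity = underwater ? 0.08f : 0.0f;
    }
}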



Fig.36 Proposal View for Installation

Fig.37 Surface Detail - Artificial Grass

Fig.37.2 Surface Detail - Pebbles

Fig.38 Surface Detail - Anti Static Mat

Fig.39 Exhibition Space

Fig.40 Setting up gallery space 1

Fig.41 Setting up gallery space 2

Fig.42 Setting up gallery space 3

Fig.43 Setting up gallery space 4

Fig.44 Setting up gallery space 5

Fig.45 Set up complete in gallery space



Control Systems

Unity and HTC Vive:


To create the virtual reality experience I used Unity, software primarily intended for game design, combined with a SteamVR setup: a virtual reality headset, two hand controllers and two base stations which act as motion sensors. The headset controls the direction of vision, the controllers enable movement inside the virtual space, and the base stations/motion sensors provide invisible boundaries in the real space, restricting movement for the user wearing the headset.

Sound System:

To add the sense of hearing, I created a sound system connected to the rainfall and lightning in the scene; it also includes the sound of the river flowing. The user can wear headphones with the VR headset to hear the audio while walking in the virtual space, and the same sound is played in the real space through speakers.

Water System:

The biggest challenge while working with the point cloud was getting the water in the scene to interact with it. This was done by adding water into the scene and then raising its height, which created an underwater effect below the surface of the risen water. The rise in water level works on a trigger activated by the user in the scene, leading to a flood unfolding in front of their eyes.
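The trigger-driven rise in water level could be set up along the following lines. This is a minimal sketch, assuming the player rig is tagged "Player" and carries a collider and rigidbody; the component name, target height and speed are assumptions, not the project's actual code.

using UnityEngine;

// Hypothetical sketch: when the user enters a trigger volume, the flat
// water plane rises towards a flood level, and everything left below the
// plane reads as underwater.
public class FloodTrigger : MonoBehaviour
{
    public Transform waterSurface;   // the 2D water plane
    public float floodHeight = 2.5f; // target water level in metres
    public float riseSpeed = 0.3f;   // metres per second

    private bool flooding;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) flooding = true;
    }

    private void Update()
    {
        if (!flooding) return;

        // Raise the water plane gradually until it reaches the flood level.
        Vector3 pos = waterSurface.position;
        pos.y = Mathf.MoveTowards(pos.y, floodHeight, riseSpeed * Time.deltaTime);
        waterSurface.position = pos;
    }
}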

Fig.46 Water Test 1

Fig.47 Water Test 2

Fig.48 Water Test 3

Fig.49 Water Test 4

Fig.50 HTC Vive Control System

Fig.51 Audio System Setup
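The sound system (Fig.51) layers looping ambient clips (rainfall, river flow, thunder) into the scene. A minimal sketch of one such loop follows; the component name and settings are assumptions, and the clip stands in for the Ambient Sound Pack assets listed in the appendix.

using UnityEngine;

// Hypothetical sketch: a looping, positional ambient sound such as the
// flowing river, heard through the headphones and the gallery speakers.
[RequireComponent(typeof(AudioSource))]
public class AmbientLoop : MonoBehaviour
{
    public AudioClip clip; // e.g. rainfall or river flow

    private void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;
        source.spatialBlend = 1f; // fully 3D, so the river sounds positional
        source.Play();
    }
}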



Review of Outcomes

Occupation and Interaction

The project is a virtual reality installation set up in the Herbert Read Gallery. It also integrates the real space into the virtual space, because the user is provided with a surface to walk on while wearing the VR headset. The experience created is a first-person flood experience elevated using the senses of hearing and touch, creating a hyper-sensory experience for the user.

The main aim of this project is to find out how virtual experiences can be used to increase understanding of rising levels of flood risk. To create awareness of the subject it was essential to make the experience as real as possible; this was done by making it a hyper-sensory experience and by using the data from the 3D laser scan. These technological aspects ensured that the user becomes a spatial witness to a flood after entering the virtual space.

Fig.52 Setting up terrain mesh from point cloud

Fig.53 Before the flood 1

Fig.54 After the flood 1


Fig.55 Before the flood 2

Fig.56 After the flood 2

Fig.57 Before the flood 3

Fig.58 After the flood 3


Dissemination and Future Work

This project will be published on my personal website along with the research details and the process of development: https://komalmittal62.wixsite.com/website. The next step is to understand how far a point cloud can be manipulated, and to test whether there are ways of scanning water into a space. The interaction of point clouds with objects could lead to a system in which real-time scanned data is used in simulation work to enhance the visual quality of a project. I also want to find out how hyper-sensory effects can be made more powerful, and whether the involvement of these senses can be heightened to the point where they control the entire project.

Fig.59 Before the flood 4

Fig.60 After the flood 4


Fig.61 After the flood 5


Appendix

Annotated Code Excerpts

Object to Terrain (editor utility used to convert a mesh into a Unity terrain):

using UnityEngine;
using UnityEditor;

public class Object2Terrain : EditorWindow {

    // Adds a "Terrain > Object to Terrain" entry to the Unity editor menu.
    [MenuItem("Terrain/Object to Terrain", false, 2000)]
    static void OpenWindow() {
        EditorWindow.GetWindow<Object2Terrain>(true);
    }

    // Conversion settings: heightmap resolution, position offset,
    // sampling direction and vertical shift.
    private int resolution = 512;
    private Vector3 addTerrain;
    int bottomTopRadioSelected = 0;
    static string[] bottomTopRadio = new string[] { "Bottom Up", "Top Down" };
    private float shiftHeight = 0f;

Water Base (from the Unity Standard Assets water, controlling quality and shader detail):

using System;
using UnityEngine;

namespace UnityStandardAssets.Water {

    public enum WaterQuality {
        High = 2,
        Medium = 1,
        Low = 0,
    }

    [ExecuteInEditMode]
    public class WaterBase : MonoBehaviour {

        public Material sharedMaterial;
        public WaterQuality waterQuality = WaterQuality.High;
        public bool edgeBlend = true;

        public void UpdateShader() {
            // Allow the most detailed water shader only at high quality.
            if (waterQuality > WaterQuality.Medium) {
                sharedMaterial.shader.maximumLOD = 501;
            }



Materials and Suppliers List

Artificial Grass, 3 m x 3 m (from B&M Stores, Canterbury)
Pebbles and Stones, 1 kg (from Homebase, Canterbury)
Anti-Static Black Mat, 3.65 m x 3.65 m (from UCA Equipment Store)
Timber, 250 mm x 1000 mm x 1200 mm (from UCA Wood Workshop)
HTC Vive VR Set, 1 (from UCA Equipment Store)
Speaker Set, 1 (from UCA Equipment Store)
TV Screen, 96 cm x 60 cm (from UCA Equipment Store)
USB 2.0 A-Male to A-Female Extension Cable, 3 m/9.8 feet, 2 (from Amazon)
High-Speed HDMI 2.0 Cable, 3 m/10 feet, 1 (from Amazon)

Other sundries:
River Material Asset (from Unity Asset Store)
Ambient Sound Pack Asset (from Unity Asset Store)
Particle System Effect Asset (from Unity Asset Store)


Bibliography

About | ScanLAB Projects (s.d.) At: https://scanlabprojects.co.uk/about/

Cazenave, A. and Nerem, R.S. (2004) 'Present-day sea level change: Observations and causes' In: Reviews of Geophysics 42 (3) [online] At: https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2003RG000139

Farra, S. et al. (2013) 'Improved Training for Disasters Using 3-D Virtual Reality Simulation' In: Western Journal of Nursing Research 35 (5) pp.655–671.

IMMERSED: A VR Experience About Flood & Resilience | FEMA.gov (s.d.) At: https://www.fema.gov/immersed

Klein, N. (2014) This changes everything: capitalism vs. the climate. (s.l.): (s.n.).

Watson, D. and Adams, M. (2011) Design for flooding: architecture, landscape, and urban design for resilience to flooding and climate change. Hoboken, N.J.: John Wiley & Sons.


Image Credits

All figures are copyright the author unless noted as follows:

Figure 05: Venice – ScanLAB Projects (2017) [Poster] At: https://scanlabprojects.co.uk/work/italys-invisible-cities/

Figure 06: A still from Immersed (2018) [Poster] At: https://www.fema.gov/immersed



Credits

MA Interior Design Course Leader: Lucy Alice Jones

MA Interior Design Tutors: Lucy Alice Jones, Owian Caruana Davies

Special Thanks: JJ Brophy, Anna Holder, Kim Trogal, Chris Settle


