Aijin Ying Portfolio and research


PORTFOLIO

Sample works

Work 1: Creative coding works
Work 2: Monitoring 1972
Work 3: Cyber mouth
Work 4: Cinema4D work
Work 5: Mutating
Work 6: AR application - Work in progress

Postgraduate (2019-2021)
MArch Design for Performance and Interaction
Bartlett School of Architecture, University College London


Creative coding works
Using Processing and Arduino, I programmed many dynamic patterns and built several simple interactive installations. Two of the Processing works were selected for the Tate Britain annual exhibition; one of them is shown on the right.


Sample works (Processing) e.g. Particle noise
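As an indication of how such a pattern works, here is a minimal particle-noise sketch rewritten in plain Python (the original works were Processing sketches; the particle count, step size, and the sum-of-sines pseudo-noise standing in for Perlin noise are illustrative assumptions).

```python
# A minimal particle-noise sketch rewritten in plain Python; the original
# works were Processing sketches. Particle count, step size, and the
# sum-of-sines pseudo-noise standing in for Perlin noise are assumptions.
import math
import random

WIDTH, HEIGHT = 800, 600
NUM_PARTICLES = 500
STEPS = 200

def flow_angle(x, y, t):
    """Smooth pseudo-noise field; Processing's noise() would go here."""
    return math.sin(x * 0.010 + t * 0.02) + math.cos(y * 0.013 - t * 0.017)

particles = [[random.uniform(0, WIDTH), random.uniform(0, HEIGHT)]
             for _ in range(NUM_PARTICLES)]

trail_points = 0
for t in range(STEPS):
    for p in particles:
        angle = flow_angle(p[0], p[1], t) * math.pi
        p[0] = (p[0] + 2.0 * math.cos(angle)) % WIDTH   # wrap around the canvas
        p[1] = (p[1] + 2.0 * math.sin(angle)) % HEIGHT
        trail_points += 1                               # in Processing: point(p[0], p[1])

print(f"simulated {trail_points} trail points over {STEPS} frames")
```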



Sample works (Processing)


Sample works (Arduino)

DQ_L & AJ_Y

Connecting the circuit / Coding


Sample works (Arduino)


Link of sample works (Processing): https://www.bilibili.com/video/BV1aK411A7Ym

Link of sample works (Arduino): https://www.bilibili.com/video/BV1M54y1r7Ea


Monitoring 1972
Machine learning
This project grew out of my experiments with the DenseCap model. Its ability to recognise visual concepts in an image (e.g., objects, object parts, and the interactions between them) and to generate dense language descriptions made me think of facial recognition in surveillance systems.

Facial recognition in surveillance. Image courtesy of bigstockphoto.com.


Machine learning



Video screenshots. London Slums | East Street | 1972 | a time without recognition technology. Video available: https://www.youtube.com/watch?v=_z05IPtQGEc

Workflow: video frames → DenseCap generates dense captions → extract the text → AttnGAN generates images from the text.

Generated images in DenseCap.

Generated images in AttnGAN.
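The workflow above can be sketched in code. This is a structural outline only: densecap_describe and attngan_generate are hypothetical placeholder functions standing in for the published DenseCap and AttnGAN models (which are run from their own repositories), and the captions returned below are dummy values, not real model output.

```python
# Structural sketch of the frame -> dense captions -> text -> image pipeline.
# densecap_describe() and attngan_generate() are hypothetical placeholders
# standing in for the real DenseCap and AttnGAN models; the captions returned
# below are dummy values, not actual model output.

def densecap_describe(frame_path):
    """Placeholder: DenseCap would return region captions for this frame."""
    return ["a man wearing a hat", "a man wearing a hat", "a brick wall"]

def extract_text(captions, max_phrases=3):
    """Drop duplicate phrases, keep order, and join them into one prompt."""
    unique = list(dict.fromkeys(captions))
    return ", ".join(unique[:max_phrases])

def attngan_generate(prompt):
    """Placeholder: AttnGAN would synthesise an image from the text prompt."""
    return f"<image generated from: {prompt}>"

frames = ["frame_001.png", "frame_002.png"]   # screenshots from the 1972 footage
for frame in frames:
    captions = densecap_describe(frame)
    prompt = extract_text(captions)
    print(frame, "->", prompt, "->", attngan_generate(prompt))
```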

Project link. https://www.bilibili.com/video/BV1BK411A73P


Cyber mouth
TouchDesigner
Project link. https://www.bilibili.com/video/BV1pr4y1w7vn

TouchDesigner is software that can be programmed to generate sound-driven motion graphics. First, I constructed a basic structure to clarify what the input to TouchDesigner is and what becomes the output. The input we selected is a scenario from reality, which is translated into a scenario in virtual reality. The scenario we chose is people's ordinary behaviour of interacting with the online world by tapping the keyboard and uploading their words to online platforms. People's emotional state while typing can, to some extent, be read from the strength and speed with which they press the keys. In this project I mainly use the strength and speed of pressing the keyboard as the input, since different strengths and speeds generate sounds with different volumes and frequencies. Via TouchDesigner, these diverse sounds can be translated into visible images. The visual image we apply is a series of distorted mouths. The mouth is an interesting metaphor: typing in the virtual world to express your words is like speaking with your mouth in real life. So the mouth translated by TouchDesigner is what we call the cyber mouth.
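As a rough illustration of the mapping described above (the project itself does this with audio CHOPs inside TouchDesigner), the sketch below analyses a recording of keyboard sounds offline: short-window loudness stands in for typing strength, simple onset counting stands in for typing speed, and both drive a hypothetical distortion parameter for the mouth graphic. The file name keyboard.wav and all thresholds are illustrative assumptions.

```python
# Offline sketch of the keyboard-sound -> distortion mapping; the project
# itself does this with audio CHOPs inside TouchDesigner. "keyboard.wav"
# (assumed mono, 16-bit PCM) and all thresholds are illustrative assumptions.
import wave
import numpy as np

with wave.open("keyboard.wav", "rb") as wf:
    rate = wf.getframerate()
    raw = wf.readframes(wf.getnframes())
samples = np.frombuffer(raw, dtype=np.int16).astype(np.float32) / 32768.0

window = rate // 20                       # 50 ms analysis windows
n_windows = len(samples) // window
rms = np.array([np.sqrt(np.mean(samples[i * window:(i + 1) * window] ** 2))
                for i in range(n_windows)])

strength = rms / (rms.max() + 1e-9)       # typing strength proxy, 0..1
onsets = np.sum((strength[1:] > 0.3) & (strength[:-1] <= 0.3))
speed = onsets / (n_windows * window / rate)   # rough key presses per second

# Map strength and speed to parameters the mouth graphic could respond to.
for i, s in enumerate(strength):
    distortion = 0.2 + 0.8 * s            # louder press -> stronger warp
    jitter = min(1.0, speed / 5.0)        # faster typing -> more jitter
    print(f"window {i:03d}: distortion={distortion:.2f} jitter={jitter:.2f}")
```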


Concept of translation

Concept sketch for cyberbullying

Emotional collage in the Internet world.


Sound-generated motion graphic

Workflow: recording of keyboard sounds → translating the mood of the sound into motion graphics → sound-generated motion graphic.


Cinema4D
Artwork “Bauhaus 100”
This work was inspired by Oskar Schlemmer's Triadic Ballet (1920). I present Schlemmer's ideas in a modern way to honour the 100th anniversary of the Bauhaus's influence on the world. The work was selected for the Bauhaus 100th-anniversary exhibition at Tate Britain.

Triadic Ballet, Oskar Schlemmer (1920)


Cinema4D workspace

Project link. https://www.bilibili.com/video/BV1kz4y1o7XU


Mutating - Work in progress
Virtual reality
“Mutating” is a speculative design work that addresses how human habits and habitats are shifting under COVID-19. Look around: people are fully armed to avoid being infected. We are advised to keep our distance from one another and to reduce interpersonal communication. We are living through an unprecedented situation in this locked-down world, which, from another angle, drives us to consider the possibilities of human living states in the future. For this work, we assumed a future life scenario. Through a multimedia experience, including sound synthesis, interactive motion graphics, and VR, “Mutating - work in progress” offers spectators a conversation between human and post-human. http://www.interactivearchitecture.org/mutating-work-in-progress.html


Investigation


Experiments on creating characters

3D modelling based on the collected data.


Virtual experience demo

Demo link. https://www.bilibili.com/video/BV1nT4y1F7vB


Bechat - Work in progress
Research
Bechat is a social app with AR functions. The epidemic has changed human social life considerably: shaking hands, hugging, and chatting face to face have become inappropriate, and people are unprecedentedly active online. Is a social mode that depends so heavily on online communication too narrow, and can Bechat bring more possibilities to both online and offline interpersonal interaction?

Part of Bechat prototype


The “Protection” Zeitgeist
Personal protective equipment is influencing human behaviour and habits.

Peter Parks, People wearing masks to protect against the SARS virus in Hong Kong’s Mass Transit Railway (March 31, 2003).

Picture showing the situation in the cabin during the pandemic (2020).

Emerging designs for personal protective equipment.

WASP, My face mask (2020).

Health code instructions (2020). Some of the measures used to present personal identity during the epidemic.

Aijin Ying drinking at Heathrow Airport (28/03/2020).

WASP, My space (2020).

Aijin Ying, Infrared thermometer taking visitors' temperatures, Shanghai Minsheng Art Museum (2020).


Interviews and personas

Observations and Insights on Social Identity Perception


Nonverbal social interaction is missing in these environments and occasions
Social interaction, as defined by Erving Goffman, is the process by which we act and react to those around us. Social expectations determine what behaviours we expect during a social interaction. Our shared social expectations and experience guide us through new social interactions and help us predict possible outcomes. Humans have learned to adjust their social expectations according to environments and occasions, creating countless social activities and behaviours.
References:
Calhoun, C. (ed.), 1994. Social Theory and the Politics of Identity. Oxford: Blackwell.
Bucci, A., 2018. We Are Not Alone: Perception and the Others. BrainFactor, ISSN 2035-7109.

Field observations of social life in London and Shanghai (2020).

Web resource: What’s Missing From Zoom Reminds Us What it Means to be Human!


The transduction of human behaviours

Different ways to wear the 3D-printed creature.

The AR camera scans the creature and retrieves its behaviour information. The wearer's behaviour forms the movements of the creature.

3D printed creature attached to the garments.

Motion capture - PoseNet

Dynamics of creature.
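A toy sketch of the transduction idea: PoseNet returns 17 keypoints in COCO order with confidence scores, and those keypoints can be remapped to parameters that animate the creature. estimate_pose below is a hypothetical stand-in for the real model, and the flap/lean mapping is an illustrative assumption.

```python
# Toy sketch: turning PoseNet-style keypoints into creature movement.
# estimate_pose() is a hypothetical stand-in for the real PoseNet model,
# and the flap/lean mapping below is an illustrative assumption.
import math

KEYPOINTS = ["nose", "leftEye", "rightEye", "leftEar", "rightEar",
             "leftShoulder", "rightShoulder", "leftElbow", "rightElbow",
             "leftWrist", "rightWrist", "leftHip", "rightHip",
             "leftKnee", "rightKnee", "leftAnkle", "rightAnkle"]

def estimate_pose(frame):
    """Placeholder: PoseNet would return normalised (x, y, score) per keypoint."""
    return {name: (0.5, 0.5, 0.9) for name in KEYPOINTS}

def creature_params(pose):
    """Map the wearer's arm height and shoulder tilt to creature motion."""
    left_wrist_y = pose["leftWrist"][1]
    left_shoulder_y = pose["leftShoulder"][1]
    right_shoulder_y = pose["rightShoulder"][1]
    # Image y grows downwards, so a wrist above the shoulder gives a positive flap.
    flap = max(0.0, left_shoulder_y - left_wrist_y)
    # Shoulder height difference over an assumed shoulder width of 0.3.
    lean = math.atan2(right_shoulder_y - left_shoulder_y, 0.3)
    return {"flap": flap, "lean": lean}

print(creature_params(estimate_pose(frame=None)))
```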


The new mode of social interaction

The 3D-printed creature could be placed in the environment.
The development of the AR app

A puts his/her 3D-printed creature in the street and sets its information.

B scans A's creature and checks the information it provides.

Behavioural interaction via a chat app


“BeChat” function organization - Work in progress

Homepage

BeChat UI Prototype: https://modao.cc/app/4a61508fe577a0dc61d410c01f71e6bfb4560995?simulator_type=device&sticky


PORTFOLIO

Field projects

Project 1: Portrait of Human
Project 2: Processing...

Research

Thesis: When it comes to the materiality in interactive art, digital or physical?

Postgraduate (2018-2019)
MA Spatial Performance and Design
Architectural Association Interprofessional Studio
Architectural Association School of Architecture


14.03.2019 - 15.03.2019 Spatial Experience Event

Portrait of Human
Are we human? What is human? When did we become human? Are we still human? Were we ever human? Are we human yet? -- "Are We Human"

Scene 1 - TV Human
A landscape of the past, a nostalgic version of our imagination, one where the Game Boy and the dial-up sound are at home, one where we turn to hyper-connectivity as a way forward. We think of technology as the answer to all our problems, as an imagined future, as a dream, as a fiction of sorts.

Scene 2 - Plastic Human
Leaping into a world where we look at a reflection - is this a reflection or a projection? Guilt, greed, embarrassment, and shame come to light through the character of plastic fat. Plastic and human skins blur together to create artificial flesh; the artificial lends us its form until we reach a point of not being able to distinguish between the two.

Scene 3 - Breathless
Breathless, out of air, extinguished. Material conditions of a feeling that relates to a sense of precarious time, lonely time, a time of experience from below. A state of emergency. We end to begin again.


Scene 1 - TV Human


Scene 2 - Plastic Human


Scene 3 - Breathless


More details:
Link: https://pan.baidu.com/s/1sEk37myb6o_hwywGj5ga9A (Password: 82uv)
Website: https://issuu.com/yingaijin2018/docs/a4_design_documentation_compressed

Project link https://www.bilibili.com/video/BV1na4y1s7DH/


14.06.2019 - 15.06.2019 Spatial Experience Event

Processing...
Concept
“Processing...” is a multi-disciplinary production that aims to open a conversation with the audience about the identity of human beings in the contemporary world. With the invention of social media, data collection, and surveillance, the act of watching has become highlighted in our everyday life, transforming us from subject into object. We are constantly being observed, as our data is collected and turned into reports and predictions on our behaviours. We voluntarily upload ourselves online to be seen, yet we have no idea of, and no control over, who is watching. Using installations, choreography, and projections, we examine the different ways in which people access information in space, both through physical perception and through representations of the area. As the creators of this piece, we want to highlight the control of data to emphasise the power relations it brings. The piece examines the connection established between the subject and the object within the act of watching, with a humorous twist, transforming the audience into an unpredictable part of the performance.


Space Arrangement
The space was generally split into two parts. Half of the area consisted of five balloons hung from the ceiling. During the first scene, performers choreographed movements that attempted to create interactions among the balloons. The Inflatable occupied most of the remaining space; in Scene 1 it acted as the auditorium, with the audience sitting on it and facing the main stage.


Scene 1
In the first scene there will be eye-shaped installations of reflective material. The performers will interact with the reflective installations while both have wireless cameras attached to their bodies. The two will film each other, and the views from the cameras will be projected on the wall.


Scene 2
In the second scene, the Inflatable will be inflating. Performer A will start approaching it, sprinkling flour on the ground to mark a line. When Performer A enters the Inflatable installation, the UV lights installed on the Inflatable will switch on, highlighting the lines marked on the performers' bodies. While Performer A hides in the centre of the Inflatable and interacts with its surface, Performer B will start to approach the Inflatable, following the line drawn by Performer A.


Scene 3
In the third scene, the singer will gradually move to the main stage, singing. When the song is over, the live projection, controlled from backstage, will move onto the space where the audience is, positioning them within the dots. Parts of the dots will change to highlight and interact with the audience. The piece ends in chaos within the digital patterns, which finally collapse.


More details:
Link: https://pan.baidu.com/s/1sEk37myb6o_hwywGj5ga9A (Password: 82uv)
Website: https://issuu.com/yingaijin2018/docs/a4_design_documentation_compressed

Project link https://www.bilibili.com/video/BV1vy4y1B7ZK


20.09.2019 Thesis Research

When it comes to the materiality in interactive art, digital or physical?


Thesis link: http://book.baige.me/view/Txi


PORTFOLIO

Sample projects

Project 1: Bubble run
Project 2: The obesity experience spaces design
Project 3: The construction of a girls' dormitory
Project 4: Yangchenghu Bamboo Festival

Undergraduate (2014-2018)
Environmental Design
College of Design and Innovation, Tongji University




























