Jia-Rey Chang's portfolio 2021






EMAIL: archgary@gmail.com TEL: 302-763-2042 WEB: http://www.archgary.com/ ADDRESS: 1122 Chrisinam Mill Dr., Newark, DE, USA

Jia-Rey (Gary) Chang was born in Taiwan. After completing his M.Arch degree in the Architecture and Urban Design Department at UCLA under the direction of Neil Denari in 2009, he returned to his alma mater, the Architecture Department at TamKang University, Taiwan, to research interactive and parametric architecture. In 2010, he established “P&A LAB” (Programming and Architecture LAB: http://pandalabccc.blogspot.com, later integrated into archgary.com: http://www.archgary.com) to explore new possible relationships between programming and architecture. Meanwhile, he also worked in the Architecture Department of the National Taipei University of Technology as an adjunct lecturer. In 2011, he joined the Hyperbody LAB (http://www.hyperbody.nl/), Department of Architectural Engineering and Technology, TU Delft, for his Ph.D. research on interactive architecture. Cooperating with choreographers, visual artists, composers, and programmers, he was involved in the EU project MetaBody (http://metabody.eu/) during 2011-2014, exploring the pro-activeness and intra-action relationship between body movement and spatial quality. In early 2018, he completed his Ph.D. research with the dissertation titled “HyperCell: A Bio-inspired Design Framework for Real-time Interactive Architectures” (http://www.archgary.com/research/), proposing the idea of self-intelligent building components by exploring the fields of computation, embodiment, and biology in design. He is also deeply interested in the transdisciplinary topics of interactive architecture, tangible interactive design/art, immersive sensory experience, bio-inspired design, AI (artificial intelligence), creative coding/generative art/visualization, 3D modeling, fashion design, wearable technology, and motion-tracking technology, and has conducted numerous related workshops over the years. He is now an assistant professor in the IXD Lab, Department of Art & Design, University of Delaware, continuing his research philosophy of 'space as a living being' through immersive spatial interaction design and exploring his artistic trajectory in creating experimental immersive sensory (audio/visual/VR/AR) spaces.



CONTENTS

Generative_Art: Living Wonderland | Fragments | Fragments_ColorPalette

Body_Instrument: SkyWindow | Myth from the Future

VR / AR: WonderForest | [FishTank] | Monitoring Room | Goddess

A.I.: AI_Jam

Others: Others




II Living Wonderland


Generative Art | Creative Coding | Video
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/livingwonderland/

project description

Living Wonderland is a creative-coding piece that tries to see through our inner vision as an abstract expression of the time spent in quarantine. “Living Wonderland” not only metaphorically reveals our craving for freedom but also illustrates the kindness embedded in everyone during this COVID-19 epidemic/quarantine period. Whether it represents the lust or the kindness of every human being, the Wonderland deep in everyone's awareness is like a "living thing" eager to break through the "frame" of any pre(post)-set constraints, illness, and boundaries to look for hope. However, we all know that keeping our distance at this time benefits the entire world. Our inner nature swings drastically between furious thoughts (fears) and a peaceful mind, just like a heartbeat, just like this living wonderland.

A bit of technique

Practically, the entire piece is created with code, as generative art based on swarm intelligence (creative coding) in Processing, mainly using the Camera3D library for the stereoscopic effects. This short, 2-minute film can therefore be viewed with a pair of 3D glasses for the stereoscopic effect, though the colorful visuals can still be enjoyed without them. The background music is generated with Cables.gl (a web-based visual coding tool) and Ableton Live.
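As a rough illustration of the kind of swarm logic such a piece builds on, here is a minimal Processing sketch. It is a sketch only, not the project's actual code: the particle count, colors, and steering weights are invented, and the stereoscopic pass via the Camera3D library is omitted.

```processing
// A minimal swarm sketch (illustrative only): agents steer weakly toward the
// swarm's centroid while wandering with Perlin noise, and wrap around the
// frame. The stereoscopic pass via the Camera3D library is omitted here.
int NUM = 400;
PVector[] pos = new PVector[NUM];
PVector[] vel = new PVector[NUM];

void setup() {
  size(800, 450);
  for (int i = 0; i < NUM; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = PVector.random2D();
  }
}

void draw() {
  background(10);

  // centroid of the swarm, used as a weak cohesion target
  PVector center = new PVector();
  for (PVector p : pos) center.add(p);
  center.div(NUM);

  for (int i = 0; i < NUM; i++) {
    PVector toCenter = PVector.sub(center, pos[i]).setMag(0.03);
    float a = noise(pos[i].x * 0.005, pos[i].y * 0.005, frameCount * 0.01) * TWO_PI * 2;
    PVector wander = PVector.fromAngle(a).mult(0.15);
    vel[i].add(toCenter).add(wander).limit(2.5);
    pos[i].add(vel[i]);

    // wrap around the edges so the swarm keeps "breaking through" the frame
    pos[i].x = (pos[i].x + width) % width;
    pos[i].y = (pos[i].y + height) % height;

    stroke(lerpColor(color(255, 80, 120), color(80, 180, 255), i / float(NUM)));
    strokeWeight(3);
    point(pos[i].x, pos[i].y);
  }
}
```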







web_link: http://www.archgary.com/works/livingwonderland/ video_link: https://vimeo.com/424888833




II Fragments


Generative Art | Creative Coding | Video
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/fragments/

project description

“Fragments” is a generative art/video piece that illustrates the artist’s expression of the meaning of life. The fragmented color pixels distributed in the container metaphorically represent individuals living in this existing world. As human beings, we might wander throughout our entire lives searching for our destined roles, just like the flying color dots that roam around the space and finally settle into their spots to complete a painting. Famous painting or not, each pixel finds its destined position in the work and actively lives out the very best of its life. Every single color pixel can also be read as a fragment of one’s life; whether bitter or sweet, the fragments ultimately mingle together into a beautiful artwork of life, each one finding its inner peace in its resting position and shining brightly in its own way. (The reference painting is Vincent van Gogh, “Self-Portrait”, 1889, oil on canvas, collection of Mr. and Mrs. John Hay Whitney, downloaded through the National Gallery of Art, Washington’s open data program under a CC0 license: https://www.nga.gov/open-access-images/open-data.html)
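As a hedged sketch of the mechanic described above, the following Processing code scatters color agents and lets each one ease toward a target pixel sampled from a reference image. It is illustrative only, not the project's actual code; the file name "reference.jpg", the sampling step, and the easing rate are assumptions.

```processing
// Illustrative sketch of the "fragments find their place" mechanic: color
// agents start scattered and ease toward target pixels sampled from a
// reference image. Not the project's actual code; "reference.jpg" is a
// placeholder file name in the sketch's data folder.
PImage ref;
PVector[] pos, target;
int[] cols;        // target pixel colors (Processing colors are ints)
int step = 6;      // sample every 6th pixel to keep the agent count manageable

void setup() {
  size(600, 740);
  ref = loadImage("reference.jpg");
  ref.resize(width, height);

  ArrayList<PVector> targets = new ArrayList<PVector>();
  ArrayList<Integer> targetCols = new ArrayList<Integer>();
  for (int y = 0; y < ref.height; y += step) {
    for (int x = 0; x < ref.width; x += step) {
      targets.add(new PVector(x, y));
      targetCols.add(ref.get(x, y));
    }
  }

  int n = targets.size();
  pos = new PVector[n];
  target = new PVector[n];
  cols = new int[n];
  for (int i = 0; i < n; i++) {
    pos[i] = new PVector(random(width), random(height));  // start scattered
    target[i] = targets.get(i);
    cols[i] = targetCols.get(i);
  }
}

void draw() {
  background(0);
  noStroke();
  for (int i = 0; i < pos.length; i++) {
    pos[i].lerp(target[i], 0.02);   // ease each fragment toward its destined spot
    fill(cols[i]);
    ellipse(pos[i].x, pos[i].y, step, step);
  }
}
```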









web_link: http://www.archgary.com/works/fragments/ video_link: https://vimeo.com/568312803




II Fragments_ColorPalette


Generative Art | Creative Coding | Video
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/fragments_colorpalette/

project description

“Fragments_ColorPalette” is another version of “Fragments”. The concept remains the same: the fragmented color pixels distributed in the container metaphorically represent individuals living in this existing world. As human beings, we might wander throughout our entire lives searching for our destined roles, just like the flying color dots that roam around the space and finally settle into their spots to complete a painting. Famous painting/installation or not, each pixel finds its destined position in the work and actively lives out the very best of its life; every single color pixel can also be read as a fragment of one’s life, whether bitter or sweet, ultimately mingling with the others into a beautiful artwork of life and shining brightly in its own way. “Fragments” is coded so that the reference image can be replaced and the floating color agents regrow accordingly. The project also comes with a VR version.









web_link: http://www.archgary.com/works/fragments_colorpalette/ video_link: https://vimeo.com/619927027




II SkyWindow


Body-Visual-Audio Immersive Interactive Installation
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/skywindow/ Play= https://editor.p5js.org/archgary/sketches/ulT-lzH5k

project description

The project “SkyWindow” emerged from three months of being quarantined in our tiny apartment. The concept of “SkyWindow” is a mental escape from reality, especially during this unprecedented time. Quarantined in a fully enclosed space for countless hours and days, people desperately look for relief in any possible way. Through the artist’s interactive design, looking up at an imaginary sky could be the most enjoyable way to get immediate comfort without going out. “SkyWindow” is an immersive and intimate interactive installation with sky-like projections on the ceiling, as if a void had been opened in it. A dark environment with a projected sky/universe on the ceiling entices the audience to walk closer and stand underneath. The visuals then invite the audience to reach out with their hands, as if touching the sky, to trigger raindrops (a meteor shower) and sounds falling from the “SkyWindow”. The "SkyWindow" metaphorically represents a piece of “hope” people can hold onto during the pandemic. Whether it shows a distant planet in the dark or sunlight in the bright, it offers unexpected joy and surprise. Beyond exposing the audience to different spatial scenes, waving hands in the air triggers the (meteor) shower falling from the sky, which ironically points to the sense of control people have been losing under such unpredictable circumstances. The (meteor) shower also implicitly refers to washing away all the illness and sadness, returning us to clean and pure spirits.
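The published sketch linked above lives in the p5.js editor; as a rough Processing illustration of the interaction idea (webcam motion such as a waving hand spawning falling raindrops), one might sketch it as follows. The motion threshold, sampling stride, and drop behavior are assumptions, and the sound layer is omitted.

```processing
// Illustrative sketch of the interaction idea: webcam motion (e.g. a waving
// hand) spawns raindrops that fall from the top of the frame. Not the
// installation's actual code; the real piece also drives sound and is
// published as a p5.js sketch (see the Play link above).
import processing.video.*;

Capture cam;
PImage prev;
ArrayList<PVector> drops = new ArrayList<PVector>();

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prev = createImage(width, height, RGB);
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();
    prev.loadPixels();
    // sparse sampling: where brightness changed a lot, spawn a drop at that
    // x-position falling from the "sky" (z stores the fall speed)
    for (int i = 0; i < cam.pixels.length; i += 200) {
      float diff = abs(brightness(cam.pixels[i]) - brightness(prev.pixels[i]));
      if (diff > 60) drops.add(new PVector(i % width, 0, random(2, 6)));
    }
    prev.copy(cam, 0, 0, width, height, 0, 0, width, height);
  }

  background(5, 5, 30);
  stroke(180, 200, 255);
  for (int i = drops.size() - 1; i >= 0; i--) {
    PVector d = drops.get(i);
    d.y += d.z;
    line(d.x, d.y, d.x, d.y + 10);
    if (d.y > height) drops.remove(i);
  }
}
```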













web_link: http://www.archgary.com/works/skywindow/ video_link: https://vimeo.com/467863317




II Myth from the Future


Body-Visual-Audio Immersive Interactive Installation
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/myth-from-the-future/

project description

“Myth from the Future” focuses on reflecting on how current technology has blended into our daily lives and our physical and mental bodies, by referring to the ancient Chinese classic “The Classic of Mountains and Seas”. Can you imagine humans evolving into three-headed beings in the future? There is no need to wait: we already ARE this kind of three-headed “tech-species” (a species merged with tech devices) if you count our smartphone and laptop as heads alongside our own. Although many of the strange creatures written down in “The Classic of Mountains and Seas” no longer exist, this project takes the creatures of these tales of marvels as metaphors for people who wear or carry high-tech gadgets, and for robotic creatures, as “tech-species”. Through this 3-4 minute live experience, one audience member becomes the “tech-species” by controlling an instrument that influences the immersive audio and visual effects in real time.

audience experience

It is a journey of bodily sensory experience! The project is an interactive installation lasting around 3-4 minutes per cycle. One audience member volunteers to wear the easy-to-take-off interactive instrument, a gamepad-based garment, as a controller attached to his/her body and limbs. By actively moving the body, this active audience member influences and interacts with the visual and sound effects of the projected background, "be/coming" the “tech-species” of the experience, while the other, passive audience members can stand freely and observe within the dynamic, immersive visual-audio interactive environment.




Tech-Description

The visuals are scripted in openFrameworks, and the sound is generated with PureData (along with SimpleSynth and Ableton Live). The wearable instruments are built from old-fashioned wired gamepad devices. The gamepad signal is sent wirelessly to the computer through its USB ports, and the internal protocol is based on OSC communication. The protocol is quite simple: PureData acts as the receiver, taking the data from the wearable instrument, and then passes it on to openFrameworks, which generates the corresponding visuals.
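A minimal sketch of the OSC hand-off described above, written here in Processing with the oscP5 library purely for illustration (the actual pipeline uses PureData as the OSC receiver and openFrameworks for the visuals; the address pattern "/gamepad" and port 9000 are assumptions):

```processing
// Illustrative Processing/oscP5 sketch of the OSC hand-off described above.
// In the actual piece PureData is the OSC receiver and openFrameworks renders
// the visuals; the address pattern "/gamepad" and port 9000 are assumptions.
import oscP5.*;
import netP5.*;

OscP5 oscIn;
float stickX = 0, stickY = 0;   // latest controller values, normalized -1..1

void setup() {
  size(640, 480);
  oscIn = new OscP5(this, 9000);   // listen for incoming OSC on port 9000
}

// called by oscP5 whenever an OSC message arrives
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/gamepad")) {
    stickX = msg.get(0).floatValue();
    stickY = msg.get(1).floatValue();
  }
}

void draw() {
  // map the incoming controller values onto a simple visual response
  background(0);
  float x = map(stickX, -1, 1, 0, width);
  float y = map(stickY, -1, 1, 0, height);
  noStroke();
  fill(255, 150);
  ellipse(x, y, 60, 60);
}
```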

[System diagram: wearable sensors / instrument → wireless USB → computer → OSC → PureData (sound → wired speakers) and openFrameworks (visuals → projectors)]









web_link: http://www.archgary.com/works/myth-from-the-future/ video_links: https://vimeo.com/374976087 https://vimeo.com/374981214




II WonderForest


VR interactive installation | Generative Art | Creative Coding
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/wonderforest/

project description

“WonderForest” is an immersive, interactive VR piece asking the question “what is real?”. If life is composed of a pile of sensory experiences, should VR be included in this game of life? Can “VR” experiences expand and break the stereotypical ideas of the “real” and the “natural”? This conceptual exploration was inspired by the French philosopher Gilles Deleuze’s statement in “Bergsonism” that the virtual is opposed to the actual but not to the real. Hence, a virtual experience should also deliver real sensations, and even an extended version of reality. “WonderForest” provides an immersive, digitalized visual/audio environment as a new Nature. Just as when walking in a forest, the audience can sense a natural atmosphere, but one built from totally different environmental elements. By creating meshes of waves as the landscape, free-floating cubes and flying dots as living species, and noise mixed with birdsong as ambient sound, the project challenges the general stereotypical notion of Nature. It also aims to convince the audience that what they have experienced in the VR environment should be considered “real”. For example, who can be certain that a place multiple light-years away has the same physical materialization as we have here, and is not like the “WonderForest”? Besides, whether virtual or real, experiences all imprint on our body/mind as real sensations and memories once we have lived through them, just like watching movies, playing video games, or dreaming. That is what “WonderForest” attempts to deliver. As for the actual play in “WonderForest”, the audience can hit the floating cubes, attract the swarm of dots, produce colorful rain, and make flashing lights, none of which is possible in the so-called “real” world. That is what makes the VR experience fun, and possibly an extension of, or escape from, reality.









web_link: http://www.archgary.com/works/wonderforest/ video_link: https://vimeo.com/613781503




II [FishTank]


VR interactive installation | Generative Art | Creative Coding
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/fishtank/

project description

[FishTank] is an immersive, interactive VR piece questioning “what is the meaning of life?”. If life is composed of a pile of sensory experiences, what role does VR play here? If we are living in just another virtual-reality world, do we retain our own free will? The title, [FishTank], explicitly describes the VR environment the audience will experience. In a large cubic space (the tank) with a skylight, there is a small floating cube and a school of fish “freely” swimming inside. Beyond simply observing, the audience can add or remove cubes around the original floating cube to create sculpture-like, continuous artificial reefs for the fish to navigate. This landscape-creation process is just like building an environment in a video game such as Minecraft. It seems as though the audience takes the lead, owning the power to manipulate the fish as the God/Creator here. However, from an empathetic perspective, taking the “fish” as a metaphor for human beings, we are all just programmed and living in this water tank (virtual world), created and manipulated by another supreme species. Eventually, what is life? Is it just another programmable virtual-reality environment we are living in, just like these fish in the tank? Beyond this philosophical concept, the piece is meant to be fun to play. The artificial cubic reefs/environment can be passed through and extended continuously by every audience member who experiences the project, as a collective creation. The VR experience is mainly built in Unity, while real-time data is sent out to PureData and Ableton Live for simultaneous sound production. Even without interactively adding or removing cubes, it is enjoyable to immerse oneself in this VR environment and escape from the real world for a bit, just like staring at the fish in your [FishTank].
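As an illustration of the kind of real-time data hand-off described above (interaction events sent out over OSC to a sound engine), here is a hedged Processing/oscP5 sketch; the actual piece sends from Unity to PureData and Ableton Live, and the address "/fishtank/cube" and the port numbers are assumptions.

```processing
// Illustrative Processing/oscP5 sketch of sending interaction events out over
// OSC to a sound engine. The actual piece sends from Unity to PureData and
// Ableton Live; the address "/fishtank/cube" and the ports are assumptions.
import oscP5.*;
import netP5.*;

OscP5 oscOut;
NetAddress soundEngine;

void setup() {
  size(400, 400);
  oscOut = new OscP5(this, 12000);                  // local OSC instance
  soundEngine = new NetAddress("127.0.0.1", 8000);  // where the sound patch listens
}

void mousePressed() {
  // every "cube added" interaction becomes an OSC message for the sound patch
  OscMessage m = new OscMessage("/fishtank/cube");
  m.add(mouseX / float(width));    // normalized position drives the sound
  m.add(mouseY / float(height));
  oscOut.send(m, soundEngine);
}

void draw() {
  background(0, 30, 60);
  if (mousePressed) {
    fill(255);
    noStroke();
    rect(mouseX - 10, mouseY - 10, 20, 20);
  }
}
```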











web_link: http://www.archgary.com/works/fishtank/ video_link: https://vimeo.com/613882284




II Goddess


AR Filter | Generative Art | Creative Coding
Visual \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/goddess/ Filter= https://www.instagram.com/ar/520051082609956/

project description

Goddess expresses the urgent need for hope, whether in the physical or the digital world, especially after the tragedy of COVID-19. Across cultures, the goddess is usually a metaphor for a warm, comforting, encouraging icon that represents the power of the Mother. When it comes to “Faith”, there is no particular difference wherever you are physically or digitally located. Faith/Hope can be the portal that transcends the physical and virtual worlds, and even all religions, which is what the Goddess symbolizes here.





web_link: http://www.archgary.com/goddess/ filter_link: https://www.instagram.com/ar/520051082609956/




II Monitoring Room


VR interactive installation | Generative Art | Creative Coding
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/monitoring-room/

project description

“Monitoring Room” specifically addresses the topic of “monitoring”. Before stepping into the actual VR experience, a small-scale Lego room (model) is placed at the entrance of the exhibition space for the audience to peer into while waiting in line. Inside the model, the audience sees a small chair at the center, a Lego minifigure wearing a VR headset, and a working webcam. At this moment, there is no clue as to what the Lego model is for. After putting on the VR headset, the audience member suddenly realizes that she/he has become the figure she/he just saw in the Lego model and is being “monitored” by the audience waiting in line. Through this setup, she/he can see the waiting audience’s faces through the skylight of the virtual room, like the giant in “Jack and the Beanstalk” peeping into the room. A single chair with rain falling directly onto it in the virtual room heightens the helplessness and loneliness of sitting in an interrogation room (a physical chair is also provided in the middle of the exhibition room for sitting). The audience waiting in line and looking into the Lego model metaphorically stands not only for Big Brother (the government, the corporations), who holds the privilege of constantly watching and manipulating people’s private data under surveillance, but also for the power of personal social media interfering with everyone’s daily life. In “Monitoring Room”, playing with the “scaling” effect between the virtual and physical spaces is a playful way of drawing attention to the current social issue of “monitoring”. This immersive experience also raises the critical inquiry of "what is reality", matching the main theme of this series of VR experiments, the "Void Rooms". According to the French philosopher Gilles Deleuze, the "virtual" is not opposed to the "real" but to the "actual", which, in my interpretation, implies that the "virtual is real". In other words, anything that happens in virtual reality should be considered a real experience, and that is exactly how VR can expand human senses and what the “Void Rooms” series sets out to explore.















web_link: http://www.archgary.com/works/monitoring-room/ video_link: https://vimeo.com/495587701




II AI_Jam


AI | Interactive Instrument | Generative Art | Creative Coding
Visual \ Sound \ Code designed by Jia-Rey Chang

Website= http://www.archgary.com/works/ai_jam/ Play= https://editor.p5js.org/archgary/sketches/PL27zPToi

project description

AI_Jam is an audio-visual, interactive, web-based installation that people can use as an instrument to compose sound together with an AI in real time. The concept is to seek a stronger and closer bond, and to find harmony, between humans and AI by creating a direct interaction for making music. The background drum beat is generated by AI (Google Magenta), and the beat pattern can be switched according to the audience’s preference. While the beat is playing in the background, the audience can freely drag the mouse to different positions on the screen to generate sound (or use the *Wii-nstrument simultaneously). By turning on the “color-tracking mode”, the audience can, instead of dragging the mouse, move an assigned color object (yellow in this code/case), detected by the webcam (tracking.js), to generate sound (Tone.js). Imagine the audience as a lead guitarist playing the melody by dragging the mouse or moving the color object, while the artificial intelligence takes part as the drummer keeping the rhythm. The piece attempts to lower the difficulty of playing instruments and to realize the dream of jamming with an AI without any sophisticated skill set. The whole piece is built with p5.js (https://p5js.org/). *The Wii-nstrument is an instrument set composed of four Wii Nunchuks, designed mainly for real-time/live audio-visual performances as well as interactive installations.
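As a rough illustration of the “drag to play a melody” half of the interaction, here is a minimal sketch written in Processing with the processing.sound library. The actual piece is built with p5.js, Tone.js, tracking.js, and Google Magenta as described above; the pentatonic scale and mappings here are assumptions, and the AI drum track and color tracking are omitted.

```processing
// Illustrative sketch of the "drag to play a melody" idea, written with the
// processing.sound library. The actual piece uses p5.js with Tone.js for
// synthesis, tracking.js for color tracking, and Google Magenta for the AI
// drum beat; those parts are omitted here and the scale/mappings are assumptions.
import processing.sound.*;

SinOsc osc;
// a C-major pentatonic scale keeps whatever the audience plays roughly in tune
float[] notes = {261.63, 293.66, 329.63, 392.00, 440.00,
                 523.25, 587.33, 659.25, 783.99, 880.00};

void setup() {
  size(800, 400);
  osc = new SinOsc(this);
  osc.amp(0);
  osc.play();
}

void draw() {
  background(20);
  if (mousePressed) {
    // x position picks a note, y position sets the loudness
    int idx = constrain(int(map(mouseX, 0, width, 0, notes.length)), 0, notes.length - 1);
    osc.freq(notes[idx]);
    osc.amp(map(mouseY, height, 0, 0.05, 0.5));
    fill(255, 200, 0);
    noStroke();
    ellipse(mouseX, mouseY, 30, 30);
  } else {
    osc.amp(0);   // silence when not dragging
  }
}
```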













web_link: http://www.archgary.com/works/ai_jam/ video_link: https://vimeo.com/434878576




II Others






Facebook: https://www.facebook.com/archgary/ Instagram: https://www.instagram.com/jiareychang/




