Preface
This portfolio contains a collection of my explorations in computation, digital fabrication, and analysis over the course of the last year. This work was conducted independently of my courses; these projects have been fueled by my curiosity, by questions I have asked myself, and by ideas I have had. Enjoy!
VRxCNC
Looking to combine VR and CNC milling technology, I assigned myself a project to design a sun shading module that follows a set of instructions which behaves like a Grasshopper script. I started by importing seven hexagons into the VR modeling software Oculus Medium; then, following my scripts, I perforated the hexagons.
Example Script:
1. Define a set of stripes on the surface of a hexagon
2. Using a paraboloid shape, place four perforations through the hexagon
3. With the same paraboloid, carve out craters that are tangent to at least two other craters and centered on the stripes; adjust the radius of each crater by changing its depth
4. Only the four original craters should perforate the hexagon; adjust placement and depth to avoid excessive perforations
5. Repeat steps 3 & 4 until the surface of the hexagon is covered with craters
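The crater steps above can be sketched in code. The key relation is that a paraboloid cut z = k·r² carved to depth d leaves a crater of rim radius √(d/k), and tangency fixes the spacing between crater centers. A minimal sketch along a single stripe; the steepness constant and stripe layout are my assumptions, not values from the project:

```python
import math

# Hypothetical paraboloid "carving tool": z = K * r^2 (assumed steepness).
K = 4.0

def crater_radius(depth, k=K):
    """Rim radius of a crater carved to `depth` with the paraboloid z = k*r^2:
    depth = k * r^2  =>  r = sqrt(depth / k)."""
    return math.sqrt(depth / k)

def place_tangent_craters(stripe_length, depths, k=K):
    """Place crater centers along one stripe so each crater is tangent to the
    previous one (center spacing = sum of the two rim radii).
    Returns a list of (center_x, radius) pairs; stops at the stripe end."""
    prev_r = crater_radius(depths[0], k)
    x = prev_r  # first crater starts at the stripe edge
    centers = [(x, prev_r)]
    for d in depths[1:]:
        r = crater_radius(d, k)
        x += prev_r + r  # tangency condition
        if x + r > stripe_length:
            break
        centers.append((x, r))
        prev_r = r
    return centers
```

Varying the depths in the list reproduces the script's rule that radius is controlled through depth, while the spacing rule keeps adjacent craters tangent.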
Starting Geometry
When milling the hexagons, I experimented with settings to get different finish textures. I started by trying a finishing tool path with a large step-over percentage, which resulted in scalloped grooves cut into the surface in rastered and offset patterns. I then realized that I could combine multiple finishing tool paths to get new textures. Using two rastered tool paths perpendicular to one another, the scalloped grooves created diamond patterns. Combining a rastered tool path with an offset one resulted in another diamond pattern, this time distorted and skewed by the offset pattern.
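The scalloped grooves come from standard ball-nose finishing geometry: the larger the step-over, the taller the ridge left between adjacent passes. The original doesn't give tool sizes, so the numbers here are illustrative, but the formula itself is the usual one for a ball-nose cutter on a flat surface:

```python
import math

def scallop_height(tool_diameter, stepover):
    """Ridge ("scallop") height left between adjacent passes of a ball-nose
    finishing tool on a flat surface:
        h = r - sqrt(r^2 - (s/2)^2)
    where r is the tool radius and s is the step-over distance."""
    r = tool_diameter / 2.0
    if stepover >= tool_diameter:
        raise ValueError("stepover must be smaller than the tool diameter")
    return r - math.sqrt(r * r - (stepover / 2.0) ** 2)
```

For example, a 6 mm ball nose at a 50% step-over (3 mm) leaves scallops roughly 0.4 mm tall, which is large enough to read as a deliberate texture rather than a defect.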
12 PLASTER PENTAGONS
During a workshop I learned of a technique that uses a single mould to make a complex shape over the course of multiple pours of concrete. Looking to try something similar, I 3D printed a pentagonal mould with side angles that would let me form a dodecahedron. At first I wanted to use concrete and have each consecutive pour bond to the already hardened pieces, but this did not work out as well as I had hoped: the concrete did not bond together well and was wearing down my mould. I switched from concrete to plaster, hoping that it would bond together better. I had some success with the plaster bonding, but it was too fragile to survive the production process. I adapted my plan to make different modules of a dodecahedron from the same mould. I printed some inserts which filled parts of the mould and allowed me to make individual segments of a pentagon, which could be glued together to form a dodecahedron. To assemble the dodecahedron I needed one full pentagon, five pentagons with three sides, and five pentagons with only two sides.
Mould setup for Full Pentagon
Mould setup for 3 Sides of Pentagon
Mould setup for 2 Sides of Pentagon
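The portfolio doesn't state the mould's side angles; for a regular dodecahedron they follow from the solid's dihedral angle. A quick check of that geometry (the symmetric-bevel reading of the mould walls is my assumption):

```python
import math

# Dihedral angle of a regular dodecahedron — the angle at which two
# adjacent pentagonal faces meet: arccos(-1/sqrt(5)) ≈ 116.57 degrees.
dihedral = math.degrees(math.acos(-1 / math.sqrt(5)))

# If each cast edge is beveled symmetrically, a mould side wall leans at
# half the dihedral angle from the face plane, so two mating pieces
# close up to the full angle when glued.
wall_angle = dihedral / 2  # ≈ 58.28 degrees
```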
There were many challenges which I ran into during this project. I was not expecting the concrete to wear down the 3D printed mould pieces as much as it did, so before switching to plaster I had to remake some pieces. I had also made the mould too stiff; removing the plaster from it was difficult, resulting in the plaster cracking a couple of times. I learned many things during this process that I was not expecting going into it. I learned how to properly mix plaster and consistently get a good water-to-plaster ratio. I learned that a good mould should have some flexibility if it is to be reused and if the pieces it makes are fragile.
DRONE ANALYSIS
During my Integrated Design Studio, I took a trip to La Antigua, Guatemala to document the ruins of monasteries and convents which collapsed during an earthquake in 1773. Before the trip, the School of Architecture purchased a drone so that I could create a photogrammetry scan of the ruins. When I returned from Guatemala, I had the images the drone took processed, which resulted in a detailed digital mesh model of the ruins, texture files made from the drone’s images, and an orthomosaic of the site. Throughout the rest of the semester I was able to use this mesh for many tasks.
The first thing I did with the mesh was import it into Oculus Medium, a VR modeling software, to clean up the mesh and repair sections that were distorted. After this work in VR I was left with a model of the ruins which could be 3D printed. I also used Oculus Medium to create massing models in and amongst the ruins. I developed a fluid workflow of designing a form in VR, exporting an .OBJ file, scaling it in Rhino, slicing it in Ultimaker Cura, then sending the file to print, all within an hour or two.
The model was also essential for creating architectural drawings of the ruins. Prior to the trip to Guatemala, my class had only some inaccurate plans, sections, and elevations. With the intention of providing the municipality of La Antigua with more accurate drawings, my class worked on making a better set of drawings of the ruins. We used the orthomosaic of the site to make corrections to the plans and add details that were missing. Sections of the ruins would have been a daunting task were it not for the digital model the drone afforded us. I set up viewports and clipping planes in Rhino to give me cut lines through the ruins, then traced details which were only visible because of the mesh textures. This process produced sections that looked like realistically aged ruins, with details that brought the sections to life even further.
The textured mesh of the ruins was used again to help present my intervention on the site, both through renderings and by experiencing it in VR. Prior to my trip, I did not comprehend the true size of the ruins, and I found that VR was the best way to convey the ruins’ scale. When I was explaining the ruins to colleagues and professors, I would have them put on the headset, at which point they understood the size of the ruins by moving their head and body to look around them.
When VR was not an option, I had renderings which placed my intervention within the ruins and the context beyond the edge of the model. In Rhino I used the “perspective match to image” command to set up a view for my renderings. I started with photos I had taken on site, placing one as a wallpaper in Rhino; next I would move the viewport to the approximate area where the photo was taken. I then placed a set of points on distinct features in the photo and on the corresponding features on the mesh, such as the top right corner of a wall which was visible in both. After matching up the photo and mesh points, Rhino would calculate the exact placement of the viewport camera and the lens focal length. With the camera placed, I would set the position of the sun based on the date and time when the wallpaper image was taken, which created shadows and lighting that matched the photo. I then rendered the scene and composited it in Photoshop.
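The math behind this kind of perspective match is camera resection from point correspondences. This is not Rhino's actual implementation, just a sketch of the underlying idea using the classic Direct Linear Transform: given at least six 3D points on the mesh and their pixel positions in the photo, one linear system recovers the full projection matrix (camera position, orientation, and focal length together):

```python
import numpy as np

def dlt_camera(points3d, points2d):
    """Estimate a 3x4 pinhole projection matrix P from >= 6 point
    correspondences via the Direct Linear Transform. Each correspondence
    (X,Y,Z) <-> (u,v) contributes two linear equations in P's 12 entries;
    the solution is the null-space vector of the stacked system."""
    A = []
    for (X, Y, Z), (u, v) in zip(points3d, points2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Smallest right singular vector minimizes ||A p|| with ||p|| = 1.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, point3d):
    """Project a 3D point with P and dehomogenize to pixel coordinates."""
    x = P @ np.append(np.asarray(point3d, dtype=float), 1.0)
    return x[:2] / x[2]
```

Once P is recovered, any other mesh point can be projected into the photo's frame, which is exactly what makes the rendered intervention line up with the site photograph.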