Volume IX Issue I Fall 2023


Attractiveness AI: Can Artificial Intelligence Really Rate Your Looks?

Re-Capturing the Road to Net-Zero Emissions

BREAKING DOWN CARBON CAPTURE AND STORAGE

Plants, Cells, and Bugs: The Good, the Bad, and the Ugly of Making Meat Sustainable

Fur-Free Future: Revolutionizing Research with 3D Tissue Culture


Letter from the Editor

Dear Reader,

It is with immense pride and anticipation that we unveil Volume IX, Issue I of the Journal of Undergraduate Science and Technology (JUST). Our journal was originally founded to create a platform where like-minded undergraduates could showcase cutting-edge research and scientific advancements on both local and global scales; since then, it has grown into something far greater than its original form. From our twice-yearly publications to photography, design, and our recently launched short-form newsletter, our journal has evolved into a community of undergraduates working toward a common mission: making science accessible to broader audiences. At UW-Madison, we have the unique opportunity to let undergraduate students publish their work in a peer-reviewed journal and to give them a glimpse into the publication process of academic journals. We believe that such experiences are important in setting students up for success in academic settings as they continue on their future paths. We are grateful to play a part in the larger effort of making scientific writing and research available to those beyond academia.

Additionally, it is with immense gratitude that I take some time to acknowledge the extraordinary team behind this publication. While I may be writing this letter, the truth is that this journal is a vibrant symphony performed by a multitude of talented undergraduates. Our staff writers, editors, designers, marketing, and web design teams worked diligently over the past semester to translate knowledge onto paper, all while balancing their lives as students and individuals. To our advisors Dr. Todd Newman and Dr. Joan Jorgenson, I want to send a special thanks for all the time and invaluable expertise you shared with us this semester. Lastly, I would like to recognize the Wisconsin Institute for Discovery, the Associated Students of Madison, and the College of Agriculture and Life Sciences for their unwavering support of undergraduate science journalism.

As you delve into the pages that follow, remember that science isn’t confined to labs and (boring) textbooks; it’s woven into the very fabric of life, the air we breathe, the food we eat, and the technology we hold close. May the following editorials, photography, and manuscripts not only enlighten you but also empower you to see the world through a more scientific lens. Even if you haven’t always felt drawn to STEM, I hope this journal ignites your inner explorer and shows you the wonder hidden in everyday things. In this issue, you will find a smorgasbord of diverse pieces – from computers that solve mathematical puzzles to the capture of carbon in the environment and even the startling power of worms! Have fun on this adventure!

Sincerely,

Pictured above: 2023-24 Executive Board: Top row (from left to right): N. Martinson, A. Shaikh, Q. Ruzicka, S. Dekate. Bottom row (from left to right): D. Hamdan, A. Li, M. Simhan, C. Hansen. Not pictured: T. Pham and D. Chatterjee

who we are

The Journal of Undergraduate Science and Technology (JUST) is an interdisciplinary journal for the publication and dissemination of undergraduate research conducted at the University of Wisconsin-Madison. Encompassing all areas of research in science and technology, JUST aims to provide an open-access platform for undergraduates to share their research with the university and the Madison community at large.

Submit your research to be featured in an upcoming issue!

Our sponsors

EDITOR-IN-CHIEF

Manasi Simhan

MANAGING EDITORS

Adina Shaikh

Dima Hamdan

Trang Pham

HEAD WRITERS

Amy Li

Natalie Martinson

MARKETING DIRECTOR

Chloe Hansen

MARKETING ASSISTANT

Mackenzie Twomey

DESIGN DIRECTOR

Quinn Ruzicka

DESIGN ASSISTANT

Olive Haller

Our team

HEAD WEB DEVELOPER

Shreyanshu Dekate

WEB ASSISTANT

Sophie Liu

FINANCE AND OUTREACH CHAIR

Dev Chatterjee

EDITORS OF CONTENT

Analise Coon

Carmela De Leon

Ivy Raya

LeAnn Aschkar

Mihir Manna

Walid Lakdari

STAFF WRITERS

Adrian Sieck

Alex Wong

Daniel Molina

Kaashvi Agarwwal

Manal Aditi

Vedaa Vandavas

Zane Brinnington

SHORT-FORM EDITORS

Ayah Amer

Dannah Altiti

Mackenzie Twomey

SHORT-FORM WRITERS

Daniel Del Carpio

Khadijah Dhoondia

Mahathi Karthikeyan

Sean Hugelmeyer

CONTRIBUTING WRITER

Jackson Kunde

CONTRIBUTING RESEARCHER

Dara Safe

We would like to sincerely thank the College of Agriculture and Life Sciences (CALS), the Wisconsin Institute for Discovery, the Associated Students of Madison (ASM), and the Wisconsin Alumni Research Foundation for financially supporting the production of JUST’s Fall 2023 issue.

Thank you!

contents Fall 2023

Editorials

02 One Man’s Trash is Another Worm’s Treasure: Benefits of Vermicompost (Adrian Sieck)
04 Re-Capturing the Road to Net-Zero Emissions (Daniel Molina)
06 Computers are learning to do math, but can they learn to think along the way: A Look at Computer-Assisted Proofs (Jackson Kunde)
09 From Thought to Text (Manal Aditi)
12 Attractiveness AI: Can Artificial Intelligence Really Rate Your Looks? (Alexandra Wong)
15 Plants, Cells, and Bugs: The Good, the Bad, and the Ugly of Making Meat Sustainable (Zane Brinnington)
18 Nanoscopic Warriors: Revolutionizing Cancer Treatment with Nanoparticles (Kaashvi Agarwwal)
21 Fur-Free Future: Revolutionizing Research with 3D Tissue Culture (Vedaa Vandavas)

photography

26 pixels (Andrew Akindele and Karsey Renfert)

manuscript

30 Exploring the Feasibility of Hydrogen as an Alternative Fuel for Industrial Steel Reheat Furnaces (Dara Safe)

One Man’s Trash is Another Worm’s Treasure: Benefits of Vermicompost

When it comes to food waste solutions, most people have heard of composting as a way to turn leftovers into something useful. Composting is great, but if you want an upgrade, worms are a better substitute. Vermicompost (from the Latin word vermis, meaning ‘worm’) is a type of compost that uses worms to break down food waste. In “Vermicomposting for Beginners,” Rick Carr describes vermicompost as “the product of earthworm digestion and aerobic decomposition using the activities of micro and macroorganisms at room temperature” (2019). In other words, vermicompost comprises worm-digested food waste and organic material decomposed by microorganisms (microscopically small organisms, like bacteria and fungi). ‘Aerobic decomposition’ refers to decomposition that happens in the presence of oxygen, carried out by microorganisms that need oxygen to live.

The use of vermicompost has many benefits while also being an eco-friendly practice. It has been shown to improve nutrient availability, soil structure, and moisture retention; encourage microbial activity; promote plant growth; and even suppress pests.

Vermicompost Bins

Vermicompost is primarily made in bins, as they are effective and easy to find. This makes vermicomposting fairly beginner-friendly, and bins can come in a variety of sizes depending on how much waste and how much room you have. According to Carr’s guide, bins need small holes in the sides and bottom to allow airflow throughout. The bin should be filled halfway with soil and topped with a layer of moistened, shredded newspaper. The best worms to use are red wigglers, like Eisenia fetida and Eisenia andrei, due to their living and eating habits. The bin should be kept at room temperature with adequate moisture. For more information, read “Vermicomposting for Beginners” by Rick Carr (2019).

Nutrients and Hormones

Decomposers play a huge role in the ecosystem, converting dead material into usable material for other organisms. They make it possible for the energy and nutrients stored in plants to return to the soil after death. Earthworms are a great example, as they eat organic material in soil and on forest floors. What they leave behind after feeding is full of nutrients and bacteria, which is why vermicompost is so effective.


Vermicompost contains very high levels of essential plant nutrients, more than regular soil. Not only does it contain the most common essential nutrients (nitrogen, phosphorus, potassium), but it also contains less common nutrients like zinc, magnesium, and iron in forms that plants can absorb (Ur Rehman et al., 2023). Plants cannot simply take up any chemicals and pick out their desired nutrients; they need these nutrients around them in specific forms, which vermicompost, like any good soil, has available. As a cherry on top, vermicompost does not only supply nutrients: it also carries microorganisms that continue to make more nutrients available over time, like a grocery store that keeps restocking.

Applying vermicompost to soil increases plant growth and improves yield. Part of this comes from the nutrient availability of vermicompost, but another factor is the plant hormones it contains. Microbes and earthworms secrete hormones like auxins, gibberellins, and cytokinins (Ur Rehman et al., 2023). These hormones have a variety of functions; most notably, they stimulate root growth, cell division, germination, and stem elongation. Faster growth and germination from these hormones help farmers and gardeners achieve better yields.

Soil Structure and Drought

Earthworms can improve the structure of soil by breaking down plant material and minerals, as well as mixing organic matter to form soil aggregates (Ur Rehman et al., 2023). Soil aggregates are small clumps of sand, silt, and clay bound together. Aggregates are essential because they form the soil structure, while the space between the clumps leaves room for air and water to pass through. Stable soil aggregates that hold their shape are crucial for soil ecosystem health, as their structure consistently supplies oxygen to soil organisms and absorbs water well for plants.

Surprisingly, vermicompost can also reduce drought stress: the higher the soil’s water-holding capacity and porosity, the less destructive drought’s effects appear. During drought, plants respond in several ways to keep water in. One method is to keep the stomata, openings in the leaves, closed. Stomata let gases such as carbon dioxide into the plant; carbon dioxide is a vital part of photosynthesis, so plants need stomata to allow it in. However, in addition to letting things in, stomata run the risk of letting things out of the plant, namely water. In drought conditions, plants keep their stomata closed to prevent water from escaping, but the resulting lack of carbon dioxide stresses them. Because of this, stomatal closure is a crucial indicator of drought, with high stomatal closure indicating water stress and lower stomatal closure indicating more available water. Vermicompost reduced stomatal closure in plants grown under drought conditions (Ur Rehman et al., 2023), indicating that it helps prevent water stress.

Disease-Free Advantage

Vermicompost can be very influential, with the ability to affect microbial activity, soil temperature, porosity, infiltration, nutrient content, and even crop yield. Its positive effect on microbes increases the number of beneficial microbes, boosting plant growth (Ur Rehman et al., 2023). Beneficial microbes are microorganisms that can perform a variety of roles: nutrient processing, hormone production, and pathogen control. These beneficial microbes keep plants happy, well-fed, quickly growing, and disease-free.

Indeed, vermicompost can help prevent disease, mainly because earthworms excrete a substance called coelomic fluid. Coelomic fluid serves multiple purposes in worms unrelated to microorganisms, chiefly moving and storing materials around the body. But when combined with other compounds released by earthworms, coelomic fluid has insecticidal and antifungal properties that help suppress insect pests and disease (Ur Rehman et al., 2023). Additionally, by boosting beneficial microbes, vermicompost helps them out-compete less beneficial ones.

Conclusion

Overall, vermicompost provides a whole host of benefits. Whether plants experience biotic or abiotic stress, there’s a chance that vermicompost can fix it. The evidence shows that vermicompost can reduce water stress, salinity (salt) stress, pest and disease stress, and nutrient deficiency. Earthworms fulfill an important role in decomposition, nutrient cycling, soil health and structure, and plant success, and vermicompost harnesses this power to benefit plants and plant-eaters alike. Given these benefits, vermicompost should be considered a capable and worthwhile way to fertilize plants and improve soil health.

Works Cited

Carr, R. (2019, November 1). Vermicomposting for Beginners. Rodale Institute.

https://rodaleinstitute.org/science/articles/vermicomposting-for-beginners/

Ur Rehman, S., De Castro, F., Aprile, A., Benedetti, M., & Fanizzi, F. P. (2023). Vermicompost: Enhancing Plant Growth and Combating Abiotic and Biotic Stress. Agronomy, 13(4), 1134. https://doi.org/10.3390/agronomy13041134

Image: Kyle Spradley | © 2014 - Curators of the University of Missouri


Re-Capturing the Road to Net-Zero Emissions

One person can save roughly one tonne of carbon dioxide emissions by converting to vegetarianism for one year. A round trip from Chicago to London emits about the same amount. From reusable straws to shower times to the three R’s (Reduce, Reuse, Recycle), it’s hard to remember a time when sustainability was not at the forefront of young people’s minds; but this individualistic view of climate change and pollution portrays just a small part of the picture. Most greenhouse gases are emitted by industrial processes like the burning of fossil fuels, not by individual Wisconsinites.

While combating climate change is still the responsibility of every person, being informed about the bigger picture is part of that responsibility, and understanding how net-zero emissions can be achieved is necessary for effective activism. One powerful tool for accomplishing this goal is a technology called carbon capture and storage (CCS). Carbon capture allows the carbon dioxide emitted through combustion to be absorbed and stored securely in the earth instead of entering the atmosphere.

Carbon capture, more often than not, acts as an add-on to existing industrial plants, absorbing CO2 as it is emitted. The international think tank the Global CCS Institute was formed to investigate and promote the use of carbon capture, which it defines as “technologies that capture the greenhouse gas carbon dioxide (CO2) and store it safely underground, so that it does not contribute to climate change... CCS includes both capturing CO2 from large emission sources (referred to as point-source capture) and also directly from the atmosphere” (Global CCS Institute, 2022). Although encouraging, carbon capture technologies are extremely energy intensive, regardless of how they are implemented. For this reason, carbon capture infrastructure projects must be paired with appropriate renewable energy sources; otherwise, they merely perpetuate the unsustainable behavior they were implemented to curb.

The process can take many forms, and the science is extremely complex, varying from case to case. There are generally three methods of carbon capture: post-combustion, pre-combustion, and ‘oxyfuel combustion’ (Omodolor et al., 2020). Post-combustion is the most common and involves a filter that scrubs the fumes produced by fossil fuel combustion, ‘filtering’ out the carbon dioxide using special chemical solvents designed to bind to CO2. This method is the most popular, as it can be retrofitted to most existing power plants. Pre-combustion involves the use of a gasifier, a tank that uses high pressure and temperature to convert fuel into a gas, separating carbon dioxide from hydrogen gas. The carbon dioxide is stored, and the hydrogen gas is used as a cleaner fuel to be burned for electricity. Lastly, oxyfuel combustion is a system in which fuels are burned in pure oxygen instead of air, which contains a much smaller percentage of the gas. This highly energy-intensive and hazardous process ensures that the resulting byproducts are largely carbon dioxide and water vapor. Using basic condensation, the water vapor is separated, and the carbon dioxide is isolated at higher rates than with the previous two methods. Carbon capture technology is crucial to undercutting emissions and easing the transition toward renewable energy, and the field is constantly changing and growing; these are just the methods already in use.


Carbon dioxide storage is another question entirely. After being captured, the carbon dioxide is compressed into a dense, liquid-like state, allowing it to be stored more effectively. The beauty and genius of this storage system lies in the answer to the following questions: where can we store all this carbon dioxide? Where is there enough space to hold it securely? Why not in the very same places where it had originally been mined? Depleted oil reservoirs and other deep-lying geological formations are ideal locations: they are already capable of holding large amounts of stored carbon dioxide.


A point to keep in mind is that this technology is not a solution; it is merely a method of abating damages that would otherwise run rampant. Carbon capture technology should by no means be an excuse to continue investing in non-renewable resources, but should instead be viewed as a cushion to allow for a smoother transition to greener and more sustainable forms of energy production. Wisconsin is home to two of the top 100 most polluting power plants in the country: Elm Road Generating Station and Columbia Energy Center. Additionally, the state’s top ten most productive power plants, in terms of electricity, produce nearly 90% of emissions but only 63.5% of electricity (Wisconsin Environment Research and Policy Center, 2022). This is not enough. Power plants must either become more efficient with the emissions they produce or find ways to curtail the pollution they create. Carbon capture technologies provide an avenue through which to accomplish this.

Nevertheless, it is the responsibility of everyone to do their part for the environment by recycling when possible and limiting waste and energy usage, but also by advocating for what they want for the planet. Climate change is never black and white; it requires understanding the nuances of technology and accepting that there is no perfect solution. Carbon capture has been derided by many as an anti-solution: a way to validate the burning of excess fossil fuels under the cover of new technology. People want a miracle solution. Carbon capture is a band-aid that gives us the wiggle room to pursue alternative solutions like investing in green energy, but eventually, that time will run out without proper decision-making.

Works Cited

CCS explained: Capture. Global CCS Institute. (2022, July 12). https://www.globalccsinstitute.com/ccs-explained-capture/

Omodolor, I. S., Otor, H. O., Andonegui, J. A., Allen, B. J., & Alba-Rubio, A. C. (2020). Dual-function materials for CO2 capture and conversion: A review. Industrial & Engineering Chemistry Research, 59(40), 17612–17631. https://doi.org/10.1021/acs.iecr.0c02218

Wisconsin’s dirtiest power plants. Wisconsin Environment Research & Policy Center. (2022, June 22). https://environmentamerica.org/wisconsin/center/resources/wisconsins-dirtiest-power-plants/

Image: VectorMine/Shutterstock.com


Computers are learning to do math, but can they learn to think along the way: A Look at Computer-Assisted Proofs

Since its nascent days in the minds of 19th-century mathematicians and its earliest implementations breaking codes during World War II, computer science has been intertwined with the study of mathematics. Mathematical logic formed the basis of computing, while computers allowed for fast arithmetic calculations. In short, mathematics advanced computing, and computing advanced mathematics. While these sciences are inextricably linked, it wasn’t until the 1970s that computer science found applications in the abstract branches of mathematics known as “pure” math. Now, artificial intelligence is developing new algorithms and assisting some of the most brilliant pure mathematicians of our lifetime, like Fields Medalist Terence Tao (Tao, 2023).

The First Computer-assisted Proof

The first theorem that relied heavily on computer assistance is the four-color theorem. The problem is as follows: take any map, for example, the map of the United States, and color each state such that no two touching states share a color. For example, we may color California and Wisconsin the same color, but not California and Oregon, as they border one another. This idea can be described as a problem in graph theory where each state is a vertex with an edge between each pair of bordering states. Mathematicians posed the question: can we color any map using only four colors?
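To make the graph formulation concrete, here is a small sketch (not from the article, and with a deliberately simplified border list) that models a handful of western U.S. states as a graph and checks that a proposed four-coloring is “proper,” meaning no two bordering states share a color:

```python
# A tiny, simplified border graph: each state is a vertex;
# edges connect states that share a border.
borders = {
    "CA": ["OR", "NV", "AZ"],
    "OR": ["CA", "NV", "WA", "ID"],
    "NV": ["CA", "OR", "AZ", "ID", "UT"],
    "AZ": ["CA", "NV", "UT", "NM"],
    "WA": ["OR", "ID"],
    "ID": ["OR", "NV", "WA", "UT"],  # simplified: MT and WY omitted
    "UT": ["NV", "AZ", "ID", "NM"],  # simplified
    "NM": ["AZ", "UT"],              # simplified
}

def is_proper_coloring(borders, coloring):
    """A coloring is proper if no edge joins two vertices of the same color."""
    return all(coloring[a] != coloring[b] for a in borders for b in borders[a])

# One proper assignment using only four colors (numbered 1-4).
coloring = {"CA": 1, "OR": 2, "NV": 3, "AZ": 2, "WA": 1, "ID": 4, "UT": 1, "NM": 3}
print(is_proper_coloring(borders, coloring))  # True
```

Appel and Haken’s achievement was proving that such a four-color assignment exists for every possible map, which is why their proof had to check so many configurations rather than just one.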

The problem originated in the 1850s, when South African mathematician Francis Guthrie noticed that the counties of England could be colored with only four colors. He speculated that any map could be colored using only four colors (Richeson, 2023). For over a hundred years, no one was able to produce a rigorous mathematical proof of Guthrie’s conjecture. When mathematicians Kenneth Appel and Wolfgang Haken at the University of Illinois took on the problem in 1976, they had a new trick up their sleeve: computers. They devised a proof that used computers to test many different cases. Over six months and thousands of hours of computing time, Appel and Haken exhaustively checked thousands of configurations (Richeson, 2023). The reward for their work: a long sought-after proof of the four-color problem. With this breakthrough, the relationship between pure mathematics and computing was established.

Computers Start Learning

That proof was established nearly 50 years ago. Since then, computer development has exploded — the IBM computer that the Illinois mathematicians relied on is now less powerful than a modern cell phone (Love, 2014). With this improved hardware, scientists have developed methods to teach computers new tasks, such as games and algorithms. One of Google’s artificial intelligence labs, DeepMind, is a pioneer of an approach to teaching computers known as “reinforcement learning.” Training a computer with a reinforcement learning algorithm is much like training a dog to do tricks. We cannot simply explain to dogs how to roll over; instead, we give them treats when they complete a trick correctly or get close. Likewise, rather than trying to explain and encode a good approach to a game, which we often cannot even verbalize or define, we can give the computer a list of possible “moves” to choose from and reward it when it performs well.
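As a minimal illustration of this reward loop (a toy sketch, not DeepMind’s system), the agent below tries three hypothetical “moves,” receives noisy rewards, and gradually learns which move pays best, exploring randomly 10% of the time:

```python
import random

random.seed(0)

# Hypothetical game: three possible moves with hidden average rewards.
true_rewards = {"a": 0.2, "b": 0.8, "c": 0.5}

values = {m: 0.0 for m in true_rewards}  # the agent's reward estimates
counts = {m: 0 for m in true_rewards}

for step in range(2000):
    # Explore occasionally; otherwise exploit the best-known move.
    if random.random() < 0.1:
        move = random.choice(list(true_rewards))
    else:
        move = max(values, key=values.get)
    # The "treat": a noisy reward signal for the chosen move.
    reward = true_rewards[move] + random.gauss(0, 0.1)
    counts[move] += 1
    values[move] += (reward - values[move]) / counts[move]  # running average

print(max(values, key=values.get))  # the agent settles on "b", the best move
```

The agent is never told the rules of a good strategy; it discovers them purely from the reward signal, which is the core idea behind systems like AlphaZero, albeit at vastly greater scale.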

Through this approach, DeepMind built artificial intelligence that achieved superhuman performance in a variety of games. In the Chinese strategy game Go, its AlphaGo system defeated Lee Sedol, one of the best Go players of all time (DeepMind). In chess, its artificial intelligence, AlphaZero, learned to play better than many grandmasters (Silver et al., 2017). Furthermore, this approach is not limited to existing games: DeepMind discovered that it could teach AI new tasks by formulating them as games like chess and Go. It was this approach that allowed DeepMind to teach computers to develop new

Figure 1: The four-color theorem exemplified in a map and a graph (Source: Quanta Magazine)

mathematical algorithms.

DeepMind taught artificial intelligence to generate new, faster algorithms to do matrix multiplication. Matrix multiplication is a mathematical operation that underlies many of the calculations that computers perform regularly. It is used in computer graphics, physics simulations, and even in the algorithms used to train artificial intelligence. Due to matrix multiplication’s ubiquity, developing efficient algorithms for it could improve computing speeds across numerous computer programs.

To accomplish this, DeepMind created a single-player game where each move that the AI could take would correspond to a step in the matrix multiplication algorithm. The reward for the player would be finding fewer steps to successfully multiply the matrix.

The “game” of matrix multiplication is extremely difficult — the number of possible algorithms is far greater than the number of possible games of chess or Go. In spite of the difficulty, the computer successfully discovered a variety of new algorithms to multiply matrices without any prior knowledge of previous algorithms. It rediscovered current state-of-the-art algorithms and developed new ones. Where simple algorithms take 100 steps and state-of-the-art approaches solve the problem in 80 steps, the computer learned to do it in just 76 (Fawzi et al., 2022). This research demonstrated that computers can advance modern mathematics through learning.
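The kind of step-saving these discovered algorithms achieve can be seen in a classic precursor (not one of DeepMind’s results): Strassen’s 1969 scheme multiplies two 2x2 matrices using 7 multiplications instead of the schoolbook 8. A small sketch:

```python
def naive_2x2(A, B):
    """Schoolbook 2x2 matrix product: 8 scalar multiplications."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    return [[a*e + b*g, a*f + b*h],
            [c*e + d*g, c*f + d*h]]

def strassen_2x2(A, B):
    """Strassen's scheme: the same product with only 7 multiplications."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(naive_2x2(A, B) == strassen_2x2(A, B))  # True
```

Applied recursively to large matrices split into blocks, saving even one multiplication per level compounds dramatically; the DeepMind system (Fawzi et al., 2022) searched for analogous schemes at larger block sizes.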

These findings, although groundbreaking, have various drawbacks. To continue making strides on any problem in mathematics, we would have to formulate a new game, with all the necessary rewards and possible moves, similar to what the researchers at DeepMind did for matrix multiplication. Each problem would come with its own set of rules that would have to be encoded into the computer each time — a process that is difficult, time-consuming, and simply not possible for every single problem out there. Furthermore, the new approaches that AI models produce are not guaranteed to be understandable to humans. Though we could make use of the findings provided by artificial intelligence, we would not necessarily understand why they work. This is a problem: mathematical proofs are formed from logical steps and arguments that establish the validity of a new theorem. Without that support, it would be difficult for future mathematicians to build on the work of AI.

An AI That Can Explain Why

In recent years, a new type of artificial intelligence known as large language models (LLMs) has skyrocketed in popularity. These models have tantalized researchers with the idea that a computer system could not only develop new ideas but also explain the underlying logic that led to them. These models learn from vast amounts of textual data, encompassing sources such as books, articles, websites, and a substantial portion of the internet. OpenAI’s GPT-4, one of the most well-known LLMs, learned from approximately thirteen trillion words (OpenAI, 2023). During training, the model is presented with input text and must predict the next word. To put that enormous amount of data into perspective, if someone were to read 300 words per minute, which is around average, it would take them about 82,000 years to read all the words that GPT-4 was trained on.
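That back-of-the-envelope figure checks out:

```python
# Reading 13 trillion words at 300 words per minute, nonstop:
words = 13e12
words_per_minute = 300
years = words / words_per_minute / 60 / 24 / 365  # minutes -> hours -> days -> years
print(round(years))  # about 82,445 years, matching the article's "about 82,000"
```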

One of the compelling features of this approach is how much simpler this training process is than the last approach. Instead of creating an entirely new game with actions and rewards, these models can use the same information, textbooks, articles, and blog posts that humans use to learn about different subjects. If the last approach was like training a dog, LLMs are more like training humans.

This approach has yielded some very impressive results: OpenAI’s GPT-4 scored a 5 on the AP Biology exam, scored over 1500 on the SAT, and passed the bar exam (OpenAI, 2023). Beyond this AI’s competitive college application and its status as a licensed attorney, scientists across domains are interested in how intelligent these systems can become. While some scientists are hopeful that these models can become the first artificial general intelligence and surpass human intelligence, many say that LLMs are no more than “stochastic parrots” that simply predict the next word (Bubeck et al., 2023). The ability to do mathematics, a purely logical yet exceptionally creative science, may be a proxy for a model’s intelligence.


Unlike the standardized tests mentioned earlier, mathematics has proven very difficult for these AI models. Early versions released in 2022 would often fail at middle- and high-school-level mathematics (Frieder et al., 2023). More recent models, like GPT-4, have improved, and early tests show that GPT-4 is proficient at basic math and can do some university-level mathematics. However, this is still far from research-level mathematics, which requires not only significant domain knowledge but also clever, original approaches. When GPT-4 has been asked to complete math problems that require a clever solution, such as questions from the International Mathematical Olympiad, it “fails

Figure 2: A visual of how GPT predicts the next word from a sequence

spectacularly” (Frieder et al., 2023). Similarly, GPT-4 fails when asked questions that require advanced approaches and knowledge, such as graduate-level mathematics (Frieder et al., 2023).

One method that has improved LLMs’ mathematical reasoning is “process supervision.” Essentially, if the LLM is the student, process supervision is the teacher asking the student to “show their work.” Researchers train another AI model to evaluate an LLM’s solutions based on the mathematical steps it took to arrive at its final answer. This process has been shown to lead to more correct answers and has the added benefit of pushing LLMs to explain the rationale behind their solutions, rather than just outputting an answer (Lightman et al., 2023).

Conclusion

In conclusion, as we imagine the development of artificial general intelligence, the symbiotic relationship between mathematics and computer science continues. A computer system that can understand mathematics and meaningfully contribute would not only revolutionize the field but would prove there is a model that can follow logical steps and be creative — in other words, a computer that can think.

Works Cited

Tao, T. (2023, October 24). Embracing change and resetting expectations. Microsoft Unlocked. https://unlocked.microsoft. com/ai-anthology/terence-tao/ OpenAI. (2023, March 27). GPT-4 technical report. arXiv.org. https://arxiv.org/abs/2303.08774

Bubeck, S., Chandrasekaran, V., Eldan, R., Gehrke, J., Horvitz, E., Kamar, E., Lee, P., Lee, Y. T., Li, Y., Lundberg, S., Nori, H., Palangi, H., Ribeiro, M. T., & Zhang, Y. (2023, April 13). Sparks of artificial general intelligence: Early experiments with GPT-4. arXiv.org. https://arxiv.org/abs/2303.12712

Richeson, D. S. (2023, August 8). Only computers can solve this map-coloring problem from the 1800s. Quanta Magazine. https://www.quantamagazine.org/only-computers-can-solve-this-map-coloring-problem-from-the-1800s-20230329/

Toal, S. J. (2017, October 18). Celebrating the four color theorem. College of Liberal Arts & Sciences at Illinois. https://las.illinois.edu/news/2017-10-18/celebrating-four-color-theorem

Overbye, D. (2013, April 28). Kenneth I. Appel, mathematician who harnessed computer power, dies at 80. The New York Times.

Frieder, S., Pinchetti, L., Griffiths, R.-R., Salvatori, T., Lukasiewicz, T., Petersen, P. C., Chevalier, A., & Berner, J. (2023). Mathematical capabilities of ChatGPT. arXiv.org.

Love, D. (2014, May 19). The specs on this 1970 IBM mainframe will remind you just how far technology has come. Business Insider. https://www.businessinsider.com/ibm-1970-mainframe-specs-are-ridiculous-today-2014-5

DeepMind. (n.d.). AlphaGo. Google DeepMind. https://deepmind.google/technologies/alphago/

Silver, D., Hubert, T., Schrittwieser, J., Antonoglou, I., Lai, M., Guez, A., Lanctot, M., Sifre, L., Kumaran, D., Graepel, T., Lillicrap, T., Simonyan, K., & Hassabis, D. (2017, December 5).

Image: DeepMind

| 8

From Thought to Text

From pop culture to science fiction, the art of reading minds has been a tantalizing idea that comes with great power and consequence. In the 1950s, science fiction saw a boom in extrasensory perception (ESP), mind-reading, and psychic abilities, with media such as Star Trek and The Twilight Zone introducing advanced beings that use mind-reading to their advantage (Harris, 2015). The unprecedented ability to capture and interpret an individual’s thoughts has long been foreseen as holding unparalleled power, and just this past May, artificial intelligence (AI) achieved a milestone, bringing us closer than ever to attaining telepathy.

Breakthroughs in Translated Brain Activity

At the University of Texas at Austin, a study led by computer scientists Alex Huth and Jerry Tang translated brain activity into a continuous stream of text. The procedure is non-invasive, relying on functional magnetic resonance imaging (fMRI) and large language models (LLMs). Invasive language decoding systems, by contrast, typically require the implantation of electrodes or neural recording devices into brain regions specializing in speech and language processing. By observing neural activity and decoding patterns of brain signals, both kinds of systems strive to decipher spoken or intended language, which could one day give individuals unable to speak a chance to communicate. By linking the brain’s language-related neural activity to the generation of intelligible text or speech, these systems offer a promising avenue for individuals with communication impairments and, potentially, more efficient and natural means of expression.

A separate study at UC San Francisco and UC Berkeley, headed by Edward Chang, illustrated the improvement of invasive language decoding systems, displaying their potential to decode thoughts from neural signals (Garner, 2022). In the Texas study, the three participants listened to 16 hours of podcasts, from The Moth to Modern Love, while an fMRI scanner measured their brain activity (Osborne, 2023). Notably, the individuals had to consent to the decoding by opening their minds and thinking solely about each stimulus in order for the machine to read their thoughts and generate corresponding text as they listened.

Comparison of Invasive and Noninvasive Language Decoding Systems

While recording the brain activity, the model produced a general gist of the participants’ thoughts or speech. This was a breakthrough for non-invasive methods, since previously only isolated words and phrases were decipherable by similar techniques. Over the last ten years, decoders have permitted unconscious individuals to answer “yes or no” questions and have singled out a single idea that an individual hears from a list. The accomplishment overcomes a significant impediment of fMRI: while the method can map brain activity to a particular region with impressively high resolution, there is an inherent lag, which makes it impossible to track thoughts in real time. This lag exists because fMRI measures the blood-flow response to a thought, which rises and returns to baseline over roughly ten seconds, a limit that even the most powerful scanner cannot overcome (Parshall, 2023). This inherent limitation has impeded the capacity to decipher brain activity in response to natural speech, yielding a scattered “mishmash of information” stretched across a few seconds. However, the emergence of large language models, AI such as OpenAI’s ChatGPT, offered an alternative path (Airhart, 2023). These models can numerically represent the semantic meaning of speech, permitting the researchers to match patterns of neural activity to sequences of words with a particular meaning, rather than attempting to read out activity word by word.

Dr. Huth has characterized this achievement as a significant improvement, noting that the team has enabled the model to comprehend and interpret complex ideas in continuous language over extended durations (Zeisloft, 2023). The decoder has an accuracy rate of 72 to 82 percent,

Figure 1: Comparison of the stimulus given with decoded LLP stimulus (Source: The University of Texas at Austin)
Written by Manal Aditi, Edited by Adina Shaikh and Trang Pham
science

far higher than previous attempts to translate thoughts into text (Parshall, 2023). Although this has been a great victory for semantic decoders, much is still lost in the process: grammatical nuances such as pronouns elude the model, and proper nouns are often left by the wayside, with places and names misheard if not reproduced entirely incorrectly. These remaining challenges indicate that further improvements are necessary to advance this groundbreaking technology.

Ethical Considerations and Consequences

The persisting concerns about AI technology echo historical apprehensions about the potential of mind-reading. Critics express unease that this technological advancement could function similarly to a polygraph exam, introducing the risk of criminalizing individuals based on the contents of their mental processes. In neuroethics, opinion is divided on how to evaluate recent advancements and their impact on cognitive privacy. Notably, Gabriel Lázaro-Muñoz, a prominent bioethicist affiliated with Harvard Medical School in Boston, underscores the importance of heightened vigilance, emphasizing that the progression of intricate, non-invasive technologies, exemplified by this innovation, appears to be advancing more rapidly than previously anticipated (Malleck, 2023). This development serves as a powerful call for policymakers and the broader public, asserting the need for careful consideration of the ethical and societal implications.

“ The persisting concerns about AI technology echo historical apprehensions about the potential of mind-reading.

Nevertheless, some eminent experts posit that the technological complexity and high margin of error of this approach render it an unlikely tool for malevolent purposes. They point to the immobility of fMRI scanners, which makes it challenging to access an individual’s thoughts without their willing cooperation. Furthermore, they question the viability of investing considerable time and resources in developing decoders for purposes beyond the restoration of communication abilities (Reardon, 2023). Experts state that, at the current moment, the technology does not warrant excessive apprehension, as it does not presently possess enough power to analyze thoughts and brain processes. Nevertheless, the anticipated evolution of this technology will necessitate policy formulation and the implementation of safeguards to ensure its responsible and equitable use.

Future Applications

Proactive measures were integrated during this study to anticipate damaging consequences. Tang emphasized the University of Texas at Austin team’s commitment to addressing concerns about potential misuse, expressing that they take these concerns seriously. Their primary purpose is to ensure the responsible and intended use of these technologies while promoting their advantageous applications. Notably, when tested on individuals it had not been trained on, the decoder produced disjointed and incoherent output. Moreover, participants retained the ability to influence the machine’s output by contemplating alternative ideas, such as visualizing animals or engaging in imaginative diversions. These measures underscore the researchers’ emphasis on responsible use and ethical limitations of the new technology.

Tang emphasized that the current capabilities do not facilitate malicious purposes, but the team remains dedicated to establishing robust policies to prevent potential misuse before it becomes a significant concern (Airhart, 2023). Although substantial ramifications are possible, these triumphs may revolutionize communication, most significantly for those who are mentally sound but unable to speak. Patients with physical impairments caused by strokes or other ailments may one day be able to communicate using technology developed from these models (Airhart, 2023). The key will be balancing scientific innovation with proper safeguards to ensure a more accessible future.

| 10
Figure 2: Speech synthesis from neurally decoded spoken sentences (Source: Chartier, 2018)

Works Cited

Airhart, M. (2023). Brain Activity Decoder Can Reveal Stories in People’s Minds. College of Natural Sciences.

Cottier, C. (2023). Can AI Read Your Mind? Discover Magazine.

Chartier, J., Anumanchipalli, G. K., Johnson, K., & Chang, E. F. (2018). Encoding of articulatory kinematic trajectories in human speech sensorimotor cortex. Neuron, 98(5), 1042-1054.e4. doi:10.1016/j.neuron.2018.04.031

Devlin, H. (2023). AI Makes Non-Invasive Mind-Reading Possible by Turning Thoughts into Text. The Guardian, Guardian News and Media.

Garner, I. (2022). No Longer at a Loss for Words.

Hamilton, J. (2023). A Decoder That Uses Brain Scans to Know What You Mean - Mostly.

Harris, J. (2015). Has Telepathy Become an Extinct Idea in Science Fiction? Auxiliary Memory.

Malleck, J. (2023). Mind-Reading Tech at UT Austin Raises Ethical Questions. GovTech.

Metzger, S. L., Liu, J. R., Moses, D. A., Dougherty, M. E., Seaton, M. P., Littlejohn, K. T., … Chang, E. F. (2022). Generalizable spelling using a speech neuroprosthesis in an individual with severe limb and vocal paralysis. Nature Communications, 13(1). doi:10.1038/s41467-022-33611-3

Parshall, A. (2023). A Brain Scanner Combined with an AI Language Model Can Provide a Glimpse into Your Thoughts. Scientific American.

Reardon, S. (2023). Mind-reading machines are here: Is it time to worry? Nature, 617(7960), 236. doi:10.1038/d41586-023-01486-z

Samuel, S. (2023). Mind-Reading Technology Has Arrived. Vox.

Zeisloft, B. (2023, May 1). Scientists Unveil AI That Can Turn Thoughts into Written Text. The Daily Wire. https://www.dailywire.com/news/scientists-unveil-ai-that-can-turn-thoughts-into-written-text

Tang, J. (2023). Semantic Reconstruction of Continuous Language from Non-Invasive Brain Recordings. Nature News, Nature Publishing Group.

Whang, O. (2023). A.I. Is Getting Better at Mind-Reading. The New York Times.


Figure 3: Jerry Tang helping a patient during an MRI (Source: University of Texas, Austin)

Attractiveness AI: Can Artificial Intelligence Really Rate Your Looks?

The phrase “beauty is in the eye of the beholder” has often been used to express the differing opinions individuals hold about perceived physical attractiveness, as well as the difficulty we have in pinpointing which features make someone beautiful. Researchers have thus sought to answer whether AI can provide more clarity by using quantifiable metrics to create a modern formula for beauty.

Although AI platforms aimed towards judging the attractiveness of everyday individuals are already on the market, the true potential for influence lies with platforms such as Instagram and TikTok. TikTok, which has already been accused of manually suppressing “uploads by unattractive, poor, or otherwise undesirable users” to boost retention, would have the potential to automate current human moderation through AI (Biddle et al., 2020).

When it comes to selecting which metrics machine predictive models should focus on, research on the evolutionary biological preference for fitter features reveals certain facial characteristics that are indicative of good genes and reproductive health (Little et al., 2011). Several indicators include averageness, facial symmetry, and sexually dimorphic traits.

Averageness

Researchers used a series of individual facial photos and superimposed them to create an “average” face from the dataset. Afterward, subjects were asked to compare and rank the attractiveness of the original and compiled faces. The study found that only 4 of the 96 original faces were rated significantly more attractive than the averaged faces, while 75 of the 96 were rated significantly less attractive (Langlois et al., 1989). Simply put, humans seem to prefer features that lie near the population average.
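Computationally, superimposing aligned photos amounts to a pixel-wise mean. The 3x3 grayscale "faces" below are invented for illustration; the original studies used aligned photographs:

```python
# Toy sketch of the "averaging" manipulation: blending aligned face photos
# is a pixel-wise mean over images of the same size.

def average_faces(faces):
    """Pixel-wise mean of equally sized grayscale images (nested lists)."""
    n = len(faces)
    rows, cols = len(faces[0]), len(faces[0][0])
    return [[sum(f[r][c] for f in faces) / n for c in range(cols)]
            for r in range(rows)]

# Two made-up 3x3 "faces" (brightness values 0-255).
face_a = [[10, 20, 10],
          [30, 90, 30],
          [10, 50, 10]]
face_b = [[20, 40, 20],
          [10, 70, 10],
          [30, 30, 30]]

print(average_faces([face_a, face_b]))
```

Averaging more faces smooths out each individual's idiosyncrasies, which is exactly what pushes the composite toward the population-typical features the study found attractive.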

Facial Symmetry

Though differing research has been published on the importance of facial symmetry in determining beauty, a study in which Japanese subjects rated photos of Japanese men and women, using a range of blended mirrored faces, found that manipulated photos with perfect symmetry had a higher mean attractiveness score than those with normal symmetry (Rhodes et al., 2000). On average, the perfectly symmetric

| 12
computer science

faces had a score of 4.2/10 for attractiveness, whereas the normal faces scored a 4/10. This study supports the theory that higher facial symmetry is associated with beauty.

Sexually Dimorphic Traits

The biological preference for sexually dimorphic features may have provided evolutionary advantages as indicators of health during reproduction (Kleisner et al., 2021). Sexual dimorphism, or the development of physically different masculine and feminine traits, is closely tied to levels of testosterone and estrogen. Testosterone is found at higher levels in men, whereas estrogen is found more in women; both hormones play important roles in sexual development and the maintenance of overall health. Consistently across various cultures, feminine features in women are rated more highly in terms of attractiveness. In contrast, for men, while traditionally masculine traits such as a strong jaw and brow ridge are generally well-regarded, conflicting research exists regarding the attractiveness of feminine features in men. Thus, a stronger link seems to exist between feminine characteristics in women and attractiveness than between exclusively masculine characteristics in men and attractiveness. It may be worth noting the influence of current beauty trends and their role in shaping our tastes when judging attractiveness.

The Role of AI in Making Sense of Biological Preferences

Using datasets of ranked faces, computer vision artificial intelligence, and evolutionary biological preferences, researchers have been able to train models which predict the attractiveness rankings of individuals.

In a broad definition of the term, artificial intelligence focuses on matching or exceeding human capabilities. Rather than being coded to a direct result, AI is “trained” on example data to form predictions. Machine learning, a sub-field of AI, conducts either supervised or unsupervised learning. Supervised learning models rely on categorized and labeled data, such as specific facial measurements for different features on a face, whereas unsupervised learning identifies patterns within unlabeled data to develop groupings. Deep learning neural networks, which mimic the layered structure of learning in human brains, are used in supervised learning to process data through stacked layers.

Published in 2021, a machine learning study on the accuracy of attractiveness AI used deep learning neural networks to compare datasets of images against one another (Iyer et al., 2021). Researchers focused on extracting specific facial features from the dataset using Facial Landmark Localization, a computer vision tool that detects and measures the relative positions of features in comparison to the position of the eyes. The collected ratios were then separated into descriptive categories and used alongside human ratings

in helping to predict attractiveness. Other aspects researchers considered were facial colors, shapes, and textures. By using a GLCM (Grey-Level Co-occurrence Matrix), comparisons could be drawn between one color pixel and its neighbors, forming a facial map that assigned a value to the texture of the skin. An additional study was also conducted to find the facial tones that were rated most highly when associated with different pinpointed facial landmarks.
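The GLCM idea can be made concrete with a toy example. The 4x4 image and the single horizontal offset below are made up for illustration; real pipelines tabulate several offsets and then derive texture statistics (contrast, homogeneity, and so on) from the matrix:

```python
# Minimal grey-level co-occurrence matrix (GLCM) sketch: count how often a
# pixel of grey level i sits immediately left of a pixel of level j.

def glcm(image, levels):
    """Co-occurrence counts for the offset (0, +1), i.e. horizontal neighbors."""
    m = [[0] * levels for _ in range(levels)]
    for row in image:
        for left, right in zip(row, row[1:]):
            m[left][right] += 1
    return m

# Made-up 4x4 image with four grey levels (0-3).
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 2, 2, 2],
         [2, 2, 3, 3]]

for row in glcm(image, levels=4):
    print(row)
```

Smooth skin produces counts concentrated on the matrix diagonal (neighbors share the same grey level), while rough texture spreads the counts off-diagonal, which is what lets a single number summarize skin texture.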

Four regression models, statistical techniques used to predict values for new data, were used, with their predicted ratings compared against the manual human ratings to determine each model’s accuracy. K-Nearest Neighbors (KNN) modeling showed the highest correlation with human raters. KNN assumes that points located near each other are similar, making predictions based on the graphed distance between the testing data and training data. Despite not showing the highest correlation in this study, another frequently used technique is the Artificial Neural Network (ANN), which has been widely cited in research on beauty-related AI for its ability to mimic human learning patterns through connected “neural” layers.
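As a rough sketch of how KNN regression produces a rating, consider the toy model below. The features and numbers are invented (the study's actual feature set came from landmark ratios, colors, and textures), but the mechanics match the description above: find the k closest training faces and average their human ratings:

```python
# Hand-rolled k-nearest-neighbors regression. Each training sample is
# ((symmetry score, averageness score), human rating) -- invented values
# for illustration only.

def knn_predict(train, query, k=3):
    """Average the ratings of the k training points closest to `query`."""
    def dist(sample):
        features, _ = sample
        return sum((a - b) ** 2 for a, b in zip(features, query)) ** 0.5
    nearest = sorted(train, key=dist)[:k]
    return sum(rating for _, rating in nearest) / k

train = [
    ((0.9, 0.8), 7.5),
    ((0.85, 0.9), 7.0),
    ((0.4, 0.5), 4.0),
    ((0.3, 0.4), 3.5),
    ((0.88, 0.85), 8.0),
]

# A new face with high symmetry and averageness scores lands near the
# highly rated training faces, so its predicted rating is high.
print(knn_predict(train, query=(0.87, 0.86), k=3))
```

This also makes the bias discussion below concrete: the prediction is literally an average of nearby human ratings, so any bias in those ratings is reproduced verbatim.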

Bias in Data Collection and Attractiveness AI Output

Though research has shown that to some extent attractiveness can be analyzed through facial “rules”, cultural beauty standards, racial bias, and limitations of current technology raise crucial considerations for the future developments of attractiveness AI.

When rating “beauty”, the biases of human subjects are transferred into the training data of AI systems, potentially resulting in a degree of human-like bias in AI results. The biases human testers and researchers hold, shaped by racial bias and the facial features they are most exposed to, may be further compounded by the lack of representation in certain studies. The limited racial diversity in facial data may be partially attributed to the limited data sources researchers can pull from. Despite Yahoo and ImageNet storing 100 million and 14 million images respectively, the majority of the data is rendered unusable due to the compositional quality of the photos. Due to limited data

...cultural beauty standards, racial bias, and limitations of current technology raise crucial considerations for the future developments of attractiveness AI.
Figure 1. Faces edited to be more masculine (a) and more feminine (b) (Source: Little et al, 2011)

accessible to the researcher, trained models may not be fed enough diverse faces to accurately match human ratings across different races. Another issue in data collection is the limited preset range of shades most cameras can capture, resulting in darker skin tones being incorrectly represented in the data that computer vision models are trained on.

The Future of Attractiveness AI

Beauty AI ventures have already sprung up, with platforms such as Face++, valued at $4 billion for its facial detection and beauty scoring services, and Quoves, a facial analysis report service that charges fees in the $150-$300 range. Although practical commercial applications of beauty AI are still in their beginning stages, further research utilizing different machine learning and visualization tools will surely redefine and expand upon what we currently know in the field.

Moreover, visual data is expected to play a larger role in the future of AI, with the steady increase of photos taken worldwide likely to address some of the current diversity and quality shortcomings in facial data. Along with developments in data collection, the IBM Global AI Adoption Index found in 2022 that 35% of companies were already using some form of AI in their businesses, with an additional 42% exploring the use of AI (IBM Global AI Adoption Index, 2022). Thus, we can expect the accuracy and complexity of facial assessment tools to respond to the growing market in the coming years.

Works Cited

Biddle, Sam, et al. “Invisible Censorship.” The Intercept, 16 Mar. 2020, theintercept.com/2020/03/16/tiktok-app-moderators-users-discrimination/.

Binar, Kurina. “Trend and Visualization of Artificial Intelligence Research in the Last 10 Years.” Research Gate, May 2023, www.researchgate.net/figure/Scheme-of-AI-DL-MLand-ANN_fig1_371205057.

Buolamwini, Joy. “Artificial Intelligence Has a Racial and Gender Bias Problem.” Time, 7 Feb. 2019, time.com/5520558/artificial-intelligence-racial-gender-bias/.

File:Average of two faces 2.jpg. Wikipedia, https://en.m.wikipedia.org/wiki/File:Average_of_two_faces_2.jpg.

IBM Global AI Adoption Index 2022, May 2022, www.ibm.com/downloads/cas/GVAGA3JP.

Iyer, Tharun J., et al. Machine Learning-Based Facial Beauty Prediction and Analysis of Frontal Facial Images Using Facial Landmarks and Traditional Image Descriptors, National Library of Medicine, 25 Aug. 2021, www.ncbi.nlm.nih.gov/pmc/articles/PMC8413070/.

Kagian, Amit, et al. A Machine Learning Predictor of Facial Attractiveness Revealing Human-like Psychophysical Biases, 1 Feb. 2008, www.sciencedirect.com/science/article/pii/S0042698907005032.

Kleisner, Karel, et al. “How and Why Patterns of Sexual Dimorphism in Human Faces Vary across the World.” Scientific Reports, U.S. National Library of Medicine, 16 Mar. 2021, www.ncbi.nlm.nih.gov/pmc/articles/PMC7966798/.

Little, Anthony C., et al. “Facial Attractiveness: Evolutionary Based Research.” Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, U.S. National Library of Medicine, 12 June 2011, www.ncbi.nlm.nih.gov/pmc/articles/PMC3130383/.

Moridani, Mohammad Karimi, et al. Human-like Evaluation by Facial Attractiveness Intelligent Machine, June 2023, www.sciencedirect.com/science/article/pii/S2666307423000165.

Rhodes, Gillian, et al. “Attractiveness of Facial Averageness and Symmetry in Non-Western Cultures: In Search of Biologically Based Standards of Beauty.” Sage Journals, 15 Aug. 2000, journals.sagepub.com/doi/epdf/10.1068/p3123.

Ryan-Mosley, Tate. “I Asked an AI to Tell Me How Beautiful I Am.” MIT Technology Review, 5 May 2021, www.technologyreview.com/2021/03/05/1020133/ai-algorithm-rate-beauty-score-attractive-face/.

Image: MIT Technology Review Commons

| 14

Plants, Cells, and Bugs: The Good, the Bad, and the Ugly of Making Meat Sustainable

If you’re a food consumer in America, there’s little doubt that you’re aware of the numerous alternatives to meat that have emerged in recent years. Whether it be the Burger King Impossible Whopper, the Beyond offerings at KFC and Pizza Hut, or the Impossible and Black Bean Burgers sold in the University of Wisconsin’s student Unions, nearly every big restaurant has boasted some sort of other-worldly sounding alternative meat. However, there’s been little incentive for someone without dietary restrictions, and who isn’t vegetarian, to select any of these over tried-and-true real meat. At least, there hadn’t been until now.

The ethics of consuming other living beings is no longer the only moral quandary carnivores have to come to terms with; as the years go on, the meat industry as it exists today is growing increasingly less sustainable for the planet. Livestock farming is responsible for about 12-18% of total greenhouse gas emissions, more than half as much as the entire transportation industry, mostly due to the natural methane production of cattle (GFI, 2023). One-third of habitable land is used to farm animals, mostly as grazing land for cows, contributing to growing deforestation rates and carbon emissions (Direct, 2012).

With the world population expected to increase by roughly 1.5 billion in the next 30 years (United Nations, 2019), the meat industry would surely have to grow to match, meaning that the risk to the Earth would grow as well. Recognition of this has led to the proposition and development of several alternatives that aim to fill the gustatory, nutritional, and cultural void that the loss of meat would leave in the general population. The question, then, is which alternatives currently lead the front, and how well do they balance sustainability and substitution?

Plant-Based Meats (Impossible and Beyond)

Probably the least unnerving alternative to meat are the traditional “veggie burgers” that mimic animal flesh using vegetables and legumes already well-established in the American diet.

Carbon emissions (kg CO2e) and land usage (m2) to produce a kg of product (Source: Moonloft, 2020)

sustainability

The two titans of the mass-produced plant-based meats of today are Impossible and Beyond, both companies that aim to recreate ground beef with plants, though by notably different means.

Beyond Meat combines peas, mung and fava beans, and rice with cocoa butter and coconut oil to mimic beef’s essential structural components (protein and fat), along with potato starch and methylcellulose for texture and beet juice to replicate the red color of raw beef. In addition, minerals like calcium, iron, salt, and potassium chloride are added, both to replicate beef’s taste and its nutritional value (Beyond, 2023). Impossible meat, on the other hand, is primarily constructed from soy and potato proteins, with its fats and textures the product of coconut oil and sunflower oil injections. Its approach to replicating meat flavor is vastly different from Beyond’s, based not on combining the flavors of multiple vegetables but on the presence of a single molecule: heme. Heme is ubiquitous in plants and animals because it is essential for oxygen transport (Gruss et al., 2012). It also undergoes a multitude of chemical reactions when cooked, producing a slew of flavors, which Impossible alleges are the ones that consumers of meat know and love.

While heme naturally exists in high enough concentrations in beef to produce these flavors, the heme found in Impossible meat is derived from the DNA of soy roots: that DNA is inserted into genetically engineered yeast, which is fermented before the heme is extracted and added to the raw Impossible product (Impossible, 2022).

But how do these compare to beef, and what are the environmental effects of their production? It’s difficult to objectively compare how the two products taste and how well they functionally resemble meat. It is easier, however, to compare their nutritional properties with those of real meat, and the result is that they are remarkably similar. The plant burgers contain only 30-40 fewer calories per patty than a normal beef patty, and all three contain between 19-20 grams of protein, 6-8 grams of saturated fat, and 370-390 mg of sodium. The biggest differences are in fat and carbohydrate content, which range from 14 g to 23 g and from 0 g to 9 g, respectively, across plant and beef burgers (Nutritionix, 2019).

Considering the environmental impacts, there is zero doubt that plant-based meats are more sustainable. While the precise numbers vary between Impossible, Beyond, and the numerous additional brands of plant-based meats, in general, producing the same mass of plant-based beef as normal beef requires 96% less land, produces 89% less greenhouse gas (kg-CO2eq/kg), reduces water use by 87-99%, and even reduces the risk of aquatic pollution by 51-91% (GFI, 2023). While some sacrifices in taste are to be expected with plant-based burgers, their reduction of the environmental consequences of beef farming is remarkable.

Lab-Grown

When it comes to replicating the taste, texture, and mouthfeel of meat, cultured meat is easily the best alternative, as it is biologically identical to real animal flesh. The only difference is where it comes from – not from an animal, vegetable, or legume, but a laboratory.

Technically, the process does begin with an animal. Whether it be from a cow, pig, or chicken, muscle tissue is extracted and is used as the source for stem cell derivation (Post, 2012). Stem cells are cells in their early stages, before they have differentiated into specific cells (liver cells, skin cells, brain cells, etc.), a process triggered by their environment.

These cells are placed into a bioreactor, which contains media designed to mimic the conditions of the animal body, complete with the oxygen, amino acids, vitamins, sugars, and other nutrients essential for their growth (GFI, 2023). This enables the stem cells to proliferate, or replicate, until their concentration has increased thousands-fold. From there, the conditions are altered so that the stem cells differentiate into one of the three essential cell types of meat: muscle, fat, and connective tissue.
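A back-of-the-envelope check makes that proliferation step concrete: under ideal exponential growth, a thousand-fold increase takes about ten doublings. The 24-hour doubling time below is an assumption for illustration, not a figure from the cited studies:

```python
import math

# How many population doublings does a 1,000-fold increase require?
# 2^10 = 1024, so roughly ten. The doubling time is an assumed value.
doublings = math.ceil(math.log2(1000))
doubling_time_hours = 24  # assumption for illustration

print(doublings)                        # number of doublings needed
print(doublings * doubling_time_hours)  # total hours under that assumption
```

Ten doublings at one day each puts the proliferation phase on the order of a week and a half, which gives a feel for why bioreactor growth rates matter so much to the economics of cultured meat.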

These cells may be combined with little order to replicate the ground meat of a hamburger, or structured more intricately with 3D-printer-like machinery that portions out the components and organizes them into more recognizable shapes, like steak or bacon (Post, 2012), or even meat from a different animal entirely.

Converting stem cells into steak (Source: Amsbio, 2021) | 16

Since these cultivated meats are grown from the exact same cells as traditional meat and are chemically identical, they share the same nutritional profile as the meat they replicate. Any potential nutritional side effects of the production process remain to be studied.

The environmental impacts, by contrast, are currently the subject of heavy dispute. Early studies estimated that, compared to real meats, cultured meat production results in 78-96% lower greenhouse gas emissions, 99% less land use, and 82-96% lower water use (Tuomisto & Mattos, 2011). While this sentiment is echoed in current studies, a new revelation suggests that if the current procedure for making cultivated meat were scaled up to the extent necessary to replace real beef production, the energy usage required would render the global warming potential anywhere from 4 to 25 times that of real beef (Risner, 2023). The fact of the matter is that there is a lot of conflicting data regarding the environmental impact, and further research needs to be done before a more solid conclusion is reached. It’s critical to remember that this technology is still in its infancy, and it is reasonable to expect that any potential risks to the environment could decrease as further developments are made.

Insects

The third potential solution requires nothing artificial: no labs, no bioreactors, no heme to mimic the taste of meat. It does, however, require a lot of people to get out of their comfort zone. Entomophagy, the practice of eating insects, has been part of humanity for thousands of years and remains in several cultures to this day. What, then, would large-scale insect consumption look like if it were implemented in the United States?

Producing enough insect meat to substitute for the meat in our diet today would require insect farming, a process not dissimilar to the way animals are farmed: insects are raised indoors or outdoors, given feed to grow, and then harvested for food, fertilizer, or whatever else they might be needed for. Trillions of insects are already produced this way yearly, with the number increasing as countries like Ghana adopt weevil farms to combat the sustainability crisis (Kennedy, 2023). In fact, a version of this practice has been normalized in America for hundreds of years: beekeepers oversee the breeding and rearing of thousands of bees a year, harvesting their honey and facilitating pollination.

Preparing insects for consumption isn't dissimilar to how we prepare meat every day: cook them, season them, combine them with other ingredients to create unique dishes. Thousands of recipes involving insects already exist worldwide. Mexico, for example, is home to a dish called chapulines, which consists of toasted grasshoppers seasoned with garlic, lime, salt, and chiles. Cockchafer soup, a crab-soup-like dish made from delimbed beetles, was a delicacy in France and Germany until the 1950s.

While integrating them into our cuisine would take some getting used to, how well do insects replicate the nutritional aspects of meat? Despite varying wildly in their nutritional profiles, insects contain roughly between 9.96 g and 35.2 g of protein per 100 g versus the 16.8–20.6 g in meat (Payne, 2015), making them a more than suitable replacement as a protein source. Fat content, however, varies greatly depending on the bug in question. Meats generally have a saturated fat content of 1–3.8 g per 100 g, whereas for bugs the range is 2.28 to 9.84 g per 100 g. This could be of concern given the link between saturated fat and heart disease, though it is of note that certain bugs, namely crickets and honeybees, contain saturated fat contents of 2.28 g and 2.25 g per 100 g respectively, much closer to meat. Furthermore, bugs tend to be higher in both unsaturated fats, which combat heart disease, and iron (Payne, 2015).
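The per-100 g figures above can be summarized in a quick back-of-the-envelope comparison. This sketch only restates the ranges quoted from Payne (2015) in the text; the simple overlap check is our own illustration, not part of the study.

```python
# Per-100 g nutrient ranges quoted in the text (Payne, 2015).
# Each value is a (low, high) range in grams.
nutrients = {
    "protein": {"insects": (9.96, 35.2), "meat": (16.8, 20.6)},
    "saturated_fat": {"insects": (2.28, 9.84), "meat": (1.0, 3.8)},
}

def ranges_overlap(a, b):
    """True if two (low, high) ranges share at least one value."""
    return a[0] <= b[1] and b[0] <= a[1]

for nutrient, sources in nutrients.items():
    insects, meat = sources["insects"], sources["meat"]
    print(f"{nutrient}: insects {insects} g vs meat {meat} g per 100 g, "
          f"overlap = {ranges_overlap(insects, meat)}")
```

Both ranges overlap, which matches the article's point: depending on the species, an insect can land squarely inside the nutritional range of conventional meat, or well outside it.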

While insects provide roughly the same nutritional value as animal meat, they are also significantly more efficient at converting the food they consume into that nutrition. In other words, less land, water, and energy are required to produce the same mass of insect as of cow, pig, or chicken. Furthermore, while only about 40% of a cow is edible, 80% of an insect is, meaning that less of the energy that goes into producing them goes to waste (Sjogren, 2017). It should be unsurprising, then, that insect farms have the potential to reduce carbon emissions by 70%.


Comparing the Three: Sustainability vs. Substitution

Understanding all of these potential meat substitutes primes us to better answer the question of which would be the best replacement for meat. As stated previously, the two most important factors to consider are sustainability and substitution.

Plant-based meats, though they have enjoyed somewhat of a renaissance as of late, still haven't entirely shaken their reputation for unpleasant tastes and odd textures. Nonetheless, they are currently the most widely accepted and available substitute for meat in the US and have proven to be significantly better for the environment than real meat.


Lab-grown meats, biologically identical to the real thing, are undoubtedly the best substitute for the taste, texture, and nutritive properties of real meat. Nevertheless, our limited understanding of their environmental effects, combined with the current small-scale execution, makes them somewhat unsustainable for the near future.

Insects are quite sustainable and, given that farming them is already practiced daily in countries all over the world, insect farming would be quite feasible to implement. They are a lackluster substitute, however: completely different in taste, in texture, and in the recipes by which they would need to be prepared.

The fact of the matter is that none of these are perfect replacements for meat in both categories, and finding an alternative for one of the most essential and beloved parts of the American diet is an almost impossible task. Hard as it may be, the meat industry's lack of sustainability, combined with the fact that the population will increase by billions by 2050, makes it an essential one. Seeing as lab-grown meat is far from perfected and it's incredibly unlikely the general population will unanimously add insects to their diet in under three decades, consider ordering a Beyond Burger next time you're craving meat. It might be time to start getting used to it.

Works Cited:

Beyond Meat. “Ingredients | What Is Plant Based Meat? | Beyond Meat.” Www.beyondmeat.com, 2023, www.beyondmeat.com/en-US/about/our-ingredients/.

Williams, Zoe. “3D-Printed Steak, Anyone? I Taste Test This ‘Gamechanging’ Meat Mimic.” The Guardian, 16 Nov. 2021, www.theguardian.com/food/2021/nov/16/3d-printed-steak-taste-test-meat-mimic.

“Cultured Meat Products | AMSBIO.” Amsbio, 2021, www.amsbio.com/cultured-meat/.

Djekic, Ilija. “Environmental Impact of Meat Industry – Current Status and Future Perspectives.” Procedia Food Science, vol. 5, no. 5, 2015, pp. 61–64, https://doi.org/10.1016/j.profoo.2015.09.025.

Driver, Alice. “Opinion: Lab-Grown Meat Is an Expensive Distraction from Reality.” CNN, 5 July 2023, www.cnn.com/2023/07/05/opinions/lab-grown-meat-expensive-distraction-driver/index.html.

Eufic. “Lab Grown Meat: How It Is Made and What Are the Pros and Cons.” Www.eufic.org, 17 Mar. 2023, www.eufic.org/en/food-production/article/lab-grown-meat-how-it-is-made-and-what-are-the-pros-and-cons.

GFI. “Environmental Benefits of Plant-Based Meat Products | GFI.” Gfi.org, gfi.org/resource/environmental-impact-of-meat-vs-plant-based-meat/.

Global Orphan Foundation. “Palm Weevil Larvae: The Other Other White Meat.” Global Orphan Foundation, 2 Oct. 2023, www.globalorphanfoundation.org/our-stories/palm-weevil-larvae. Accessed 18 Nov. 2023.

Moonloft. “Climate Impact of Meat, Vegetarian and Vegan Diets.” Ethical Consumer, 14 Feb. 2020, www.ethicalconsumer.org/food-drink/climate-impact-meat-vegetarian-vegan-diets.

Gruss, Alexandra, et al. “Chapter Three - Environmental Heme Utilization by Heme-Auxotrophic Bacteria.” ScienceDirect, Academic Press, 1 Jan. 2012, www.sciencedirect.com/science/article/abs/pii/B9780123944238000032. Accessed 9 Dec. 2023.

“Heme + the Science behind Impossible™.” Impossiblefoods.com, 2022, impossiblefoods.com/heme.

Kennedy, Adrienne Katz. “11 Bug and Insect-Eating Practices across the Globe.” Tasting Table, 12 Jan. 2023, www.tastingtable.com/1165294/bug-and-insect-eating-practices-across-the-globe/.

Merck. “IR Spectrum Table & Chart.” Merck, vol. 1, no. 1, 2021, www.sigmaaldrich.com/MX/en/technical-documents/technical-article/genomics/cloning-and-expression/blue-white-screening.

Nutritionix. “Nutritional Properties of Various Meats and Meat Alternatives.” The Current, 10 Sept. 2019, nsucurrent.nova.edu/2019/09/10/whats-the-real-beef-with-fake-meat/.

Oaxaca. “Chapulines Sazonados from Oaxaca 30 Grms - Seasoned Grasshoppers.” Insect Gourmet - Your Guide to Edible Insects, www.insectgourmet.com/product/chapulines-sazonados-from-oaxaca-30-grms-seasoned-grasshoppers/. Accessed 18 Nov. 2023.

Payne, C L R, et al. “Are Edible Insects More or Less “Healthy” than Commonly Consumed Meats? A Comparison Using Two Nutrient Profiling Models Developed to Combat Over- and Undernutrition.” European Journal of Clinical Nutrition, vol. 70, no. 3, 16 Sept. 2015, pp. 285–291, www.ncbi.nlm.nih.gov/pmc/articles/PMC4781901/, https://doi.org/10.1038/ejcn.2015.149.

Post, Mark J. “Cultured Meat from Stem Cells: Challenges and Prospects.” Meat Science, vol. 92, no. 3, Nov. 2012, pp. 297–301, https://doi.org/10.1016/j.meatsci.2012.04.008. Accessed 4 Mar. 2019.

Risner, Derrick, et al. Environmental Impacts of Cultured Meat: A Cradle-To-Gate Life Cycle Assessment. 21 Apr. 2023, https://doi.org/10.1101/2023.04.21.537778. Accessed 9 Dec. 2023.

Sjøgren, Kristian. “How Much More Environmentally Friendly Is It to Eat Insects?” Www.sciencenordic.com, 17 May 2017, www.sciencenordic.com/agriculture--fisheries-climate-climate-solutions/how-much-more-environmentally-friendly-is-it-to-eat-insects/1445691.

Sundry Photography. “Impossible vs Beyond Meat.” Tasting Table, 15 Oct. 2022, www.tastingtable.com/1047017/whats-the-difference-between-an-impossible-burger-and-a-beyond-burger/. Accessed 18 Nov. 2023.

United Nations. “Population.” United Nations, 2019, www.un.org/en/global-issues/population#:~:text=The%20world.


Nanoscopic Warriors: Revolutionizing Cancer Treatment with Nanoparticles

Since the birth of modern medicine in the 19th century, the field has advanced rapidly, with astounding breakthroughs and innovations that have transformed the medical landscape. From Edward Jenner’s smallpox vaccine to the discovery of the DNA double helix by James Watson and Francis Crick in 1953, scientists have come a long way in the fight against the most complex diseases.

Cancer, also called “the emperor of all maladies,” is highly multifaceted, challenging our understanding of the biological rules that govern life. This enigmatic disease has left scientists grappling with the relentless quest for effective prevention, early detection, and curative therapies. This article delves into the labyrinth of cancer, unravelling the multifaceted nature of the disease and the ongoing efforts to conquer its complexity through the application of nanotechnology. Nanotechnology is the manipulation of matter on a near-atomic scale to produce new structures and systems. Even though nanoparticles and some of their properties were exploited as far back as ancient times, nanotechnology has gained significant attention in recent years because of its potential to drive substantial advances at the cellular and atomic levels, which could benefit cancer treatment vastly. It is believed that nanotechnology can help with earlier cancer detection, decreased radiation dosage, and improved therapeutic specificity, which could eliminate the systemic toxicities associated with conventional methods, thereby improving prognosis and patient quality of life.

Background Information About Nanotechnology

Nanotechnology involves structures and properties at the nanometer scale (one billionth of a meter) and has the potential to revolutionize industries and impact various aspects of our lives by harnessing the unique properties that emerge at that scale. Molecules of nanometric size can exploit the human body’s own biological mechanisms to cross natural barriers, access new delivery sites, and interact with DNA (Mundekkad & Cho, 2022). Nanoparticles are advantageous in medicine and technology because of their large surface-area-to-volume ratio, which allows them to carry substantial amounts of medication and move quickly through the bloodstream. Furthermore, they have possible applications in novel diagnostic instruments, targeted medicinal and pharmaceutical products, and tissue engineering, such as the reconstruction of cartilage. Nanoparticles can be produced nanotechnologically, for example from a critical mix of manganese and citrate, allowing their use in tailored mechanisms for medication administration, new diagnostic methods, and nanoscale medical devices (Haleem et al., 2023). Nanoparticles such as nanotubes, nanodiamonds, and fullerenes therefore hold potential for use in therapy.
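The advantage of being small comes down to simple geometry: for a sphere, the surface-area-to-volume ratio is 3/r, so the ratio grows as the particle shrinks. A minimal sketch (the radii below are arbitrary illustrative values, not from any cited study):

```python
import math

def surface_to_volume_ratio(radius_nm):
    """SA/V of a sphere: (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r, in 1/nm."""
    area = 4 * math.pi * radius_nm ** 2
    volume = (4 / 3) * math.pi * radius_nm ** 3
    return area / volume

# A 50 nm nanoparticle exposes 20x more surface per unit volume
# than a 1000 nm (1 micron) particle: 3/50 vs 3/1000.
for r in (1000, 100, 50):
    print(f"r = {r:5d} nm -> SA/V = {surface_to_volume_ratio(r):.3f} /nm")
```

More exposed surface per unit volume means more drug molecules can be attached or adsorbed per particle mass, which is why shrinking the carrier pays off.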

Types of Nanoparticles

Fullerenes are a unique subset of nanoparticles, discovered in 1985, made up of 60 carbon atoms (C60) arranged in a spherical structure (Walton & Kroto, 2023). Researchers have explored the potential of fullerenes in drug delivery, controlling the release of therapeutic agents in the body: fullerenes enhance drug solubility and stability, minimizing damage to healthy tissues and reducing side effects. Additionally, fullerenes can be used in photodynamic therapy (in the presence of oxygen), which can be employed to destroy undesired biological tissue. Fullerenes have potent anticancer activities; however, their potential toxicity to normal tissues limits their further use. Therefore, nanocrystals of C60 (nano-C60) with negligible toxicity to normal cells have been developed as radiosensitizers, substances that enhance the radiation sensitivity of cells or tissues (Gong et al., 2021). Functionalized fullerenes help in early detection, monitoring tumor progression, and understanding the cancer cell’s environment using biosensors. Additionally, carbon nanotubes (CNTs) have been suggested for in vivo applications, due to their strong optical absorption at specific wavelengths, and as an active tool for bioimaging and drug delivery. Recently, fullerenes and nanodiamonds, a different type of nanoparticle, have been investigated and received much attention as drug delivery carriers (Al-Tamimi & Farid, 2021).

Written by Kaashvi Agarwwal, Edited by Analise Coon

Nanoparticles can be broadly divided into two types: natural and designed. Natural nanoparticles are formed in nature by the erosion of geological materials, the decomposition of biological materials (mainly plant residues), or the combustion of fuel products. Designed nanoparticles are manufactured by the nanotechnology industry (Lisik & Krokosz, 2021). One further class, magnetic nanoparticles, is gaining popularity due to its unique superparamagnetic nature, controllable size, high chemical stability, desirable surface properties, and biocompatibility (Malehmir et al., 2023).

Applications of Nanoparticle in Oncology

As of January 2022, an estimated 18.1 million cancer survivors were living in the United States (Kemp & Kwon, 2021). The rising number of survivors reflects advances in cancer treatment, including the use of nanotechnology, whose chemical properties and size allow it to deliver normally insoluble drugs to local and distant tumors more effectively, reducing the systemic side effects associated with conventional drug therapies, as indicated by Figure 3 below. A majority of cancers are diagnosed at a later stage, making detection difficult and worsening patient survival. Nanotechnology, however, provides enhanced imaging, faster diagnosis, and increased therapeutic efficiency, as displayed in Figure 3.

These nanodrugs are invariably biocompatible, non-immunogenic, non-toxic, and biodegradable, which in turn reduces the risk of the unpredicted loss of function or adverse effects encountered in traditional therapy (Wang et al., 2021). The flexibility of nanoparticles in size, shape, selective binding capacity, permeability and retention, and surface modification makes them well suited to cancer therapy, especially in ovarian, breast, and non-small cell lung cancers (Mundekkad & Cho, 2022). Further, their nanometric size lets them engage many of the human body’s biological mechanisms, crossing natural barriers to access new delivery sites and interact with DNA or small proteins at different levels, in blood or within body tissue cells (Mundekkad & Cho, 2022). Nanomaterials can also function as radiosensitizers, creating highly specific and uniform radiation dosing to tumors while sparing healthy tissue (Xie et al., 2018).


Nanoparticle-based therapeutics can induce tumor cell death and in turn increase neo-antigen release from the tumor, which can be utilized to improve antigen presentation and T-cell activation. They can deliver pro-inflammatory agents to tumor microenvironments to enhance the cancer immunotherapy response. In drug development, nanoparticles are also specialized to deliver immunostimulatory or immunomodulatory molecules in combination with chemotherapy or radiotherapy, or as adjuvants to other immunotherapies. In addition, nanoparticles can capture antigens shed from tumors following radiotherapy, making them powerful agents in drug therapies.

Furthermore, nanoparticle-based vaccines are being designed to raise T-cell responses through antigen-adjuvant co-delivery, multi-antigen activation of dendritic cells, and continuous antigen release, as exemplified by Figure 4 above. Other applications of nanotechnology here include in situ vaccination with artificial antigen-presenting cells and immune depots placed near tumors. These strategies will advance and be refined as our understanding of cancer immunotherapy deepens (“Nanotechnology cancer therapy and treatment,” 2023). Moreover, researchers are exploring nanotechnology-based delivery of nucleic acids as an effective treatment strategy for a variety of cancers.

Conclusion

In conclusion, the application of nanotechnology in oncology represents a paradigm shift in the way we approach the diagnosis, treatment, and management of cancer. This review has revealed the remarkable potential of nanoparticles, such as fullerenes, nanotubes, and nanodiamonds, to revolutionize the field of oncology. Their versatility as drug carriers, imaging agents, and therapeutic tools has opened doors to unprecedented precision and efficacy in cancer treatment. These nanoparticles hold the promise of tailored medication delivery, limiting collateral damage to healthy tissues while boosting impact on cancer cells. They offer more precise diagnostic imaging, allowing for earlier cancer identification and monitoring. Furthermore, they can be used in novel therapeutic techniques such as photodynamic therapy and radiation sensitization. While the potential benefits are compelling, challenges such as safety, scalability, and regulatory considerations must be addressed to bring these nanotechnology-based treatments to patients worldwide; with collaborative effort between researchers, clinicians, and regulatory bodies, this is possible.

Figure 1: Different categories of nanoparticles used in nanotechnology: fullerenes, carbon nanotubes, graphene, carbon dots, and nanodiamonds (Source: Liu & Liang, 2012)

As challenges persist and patients hope for better outcomes, nanotechnology stands as a beacon of progress. It holds the potential not only to improve the efficacy of existing treatments but also to inspire entirely new therapeutic strategies. The quest for innovative and personalized cancer care continues, and the integration of nanotechnology promises to be a powerful catalyst in the battle against cancer, offering renewed hope to patients and their families.

In the future, research should focus on green synthesis methods that produce nanoparticles in an eco-friendly way, which would make nanotechnology more acceptable to the global public. Finding ways to use less hazardous chemicals is another complex task, and ethical considerations and regulatory compliance are essential to prioritizing patient safety, environmental responsibility, and sustainability. By embracing these approaches, we can shape a more responsible, ethical, and sustainable future for healthcare and technology. As discussed above, the application of nanotechnology in oncology has been advantageous to medicine; a promising further area of research is the application of nanoparticles in organ transplantation, especially to reduce the risk of graft failure and to serve as an agent for immunomodulation, modulating the immune response during and after transplantation. This approach would align with the broader trend of responsible and ethical innovation, which is essential for the long-term acceptance and success of emerging technologies.

Works Cited

Walton, D. R. M., & Kroto, H. W. (2023, September 13). Fullerene | Definition, properties, uses, & facts. Encyclopedia Britannica. https://www.britannica.com/science/fullerene

Gong, L., Zhang, Y., Liu, C., Zhang, M., & Han, S. (2021). Application of radiosensitizers in cancer radiotherapy. International Journal of Nanomedicine, 16, 1083-1102. https://doi.org/10.2147/ijn.s290438

H. Al-Tamimi, B., & B.H. Farid, S. (2021). Fullerenes and Nanodiamonds for medical drug delivery. Nanocrystals [Working Title]. https://doi.org/10.5772/intechopen.97867

Haleem, A., Javaid, M., Singh, R. P., Rab, S., & Suman, R. (2023). Applications of nanotechnology in medical field: A brief review. Global Health Journal, 7(2), 70-77. https://doi.org/10.1016/j.glohj.2023.02.008


Kemp, J. A., & Kwon, Y. J. (2021). Cancer nanotechnology: Current status and perspectives. Nano Convergence, 8(1). https://doi.org/10.1186/s40580-021-00282-7

Learn about types of nanoparticles. (n.d.). Nano Products Online Store | Nanoproducts, Nanoparticles, Nanopowders, Dispersions, Nanotubes, Additives & Nano Lubes Supplier: MKNano.com. https://www.mknano.com/info-guide/types-of-nanoparticles.aspx

Lisik, K., & Krokosz, A. (2021). Application of carbon nanoparticles in oncology and regenerative medicine. International Journal of Molecular Sciences, 22(15), 8341. https://doi.org/10.3390/ijms22158341

Liu, Z., & Liang, X. (2012). Nano-carbons as Theranostics. Theranostics, 2(3), 235-237. https://doi.org/10.7150/thno.4156

Malehmir, S., Esmaili, M. A., Khaksary Mahabady, M., Sobhani-Nasab, A., Atapour, A., Ganjali, M. R., Ghasemi, A., & Moradi Hasan-Abad, A. (2023). A review: Hemocompatibility of magnetic nanoparticles and their regenerative medicine, cancer therapy, drug delivery, and bioimaging applications. Frontiers in Chemistry, 11. https://doi.org/10.3389/fchem.2023.1249134

Mundekkad, D., & Cho, W. C. (2022). Nanoparticles in clinical translation for cancer therapy. International Journal of Molecular Sciences, 23(3), 1685. https://doi.org/10.3390/ijms23031685

Nanotechnology cancer therapy and treatment. (2023, September 30). National Cancer Institute. https://www.cancer.gov/nano/cancer-nanotechnology/treatment#:~:text=Nanoparticle%20delivery%20vehicles%20can%20play,presentation%20and%20T%20cells%20activation

Chattopadhyay, S., Chen, J.-Y., Chen, H.-W., & Hu, C.-M. J. (2017). Nanoparticle vaccines adopting virus-like features for enhanced immune potentiation. Nanotheranostics. https://www.ntno.org/v01p0244.htm

Truini, A., Alama, A., Dal Bello, M. G., Coco, S., Vanni, I., Rijavec, E., Genova, C., Barletta, G., Biello, F., & Grossi, F. (2014). Clinical applications of circulating tumor cells in lung cancer patients by CellSearch system. Frontiers in Oncology, 4. https://doi.org/10.3389/fonc.2014.00242

Wang, J., Li, Y., & Nie, G. (2021). Multifunctional biomolecule nanostructures for cancer therapy. Nature Reviews Materials, 6(9), 766-783. https://doi.org/10.1038/s41578-021-00315-x

Xie, J., Gong, L., Zhu, S., Yong, Y., Gu, Z., & Zhao, Y. (2018). Emerging strategies of nanomaterial-mediated tumor radiosensitization. Advanced Materials, 31(3). https://doi.org/10.1002/adma.201802244

Figure 2: Diagrammatic explanation of the role of synthetic nanoparticles in vaccine design and development (Source: Chattopadhyay et al., 2017)

Fur-Free Future: Revolutionizing Research with 3D Tissue Culture

From research to ethics, science bridges the gap between discovery and morality

Introduction and Overview

In Massachusetts, scientists at the biotechnology company Emulate Inc. have been developing human cell-based technology that recreates organ-level functions. This state-of-the-art system mimics the structure and function of human body systems and is used to model organs in healthy and diseased states. Emulate creates “organs-on-chips” to enhance biological research and development by overcoming the limitations associated with animal models and even replacing animal testing (Ingber, 2022). Emulate is not the only company working on animal model alternatives. In fact, the previous decade has seen exponential growth in labs and companies in this field. For instance, the EpiDerm “tissue model” by MatTek Life Sciences is a three-dimensional model derived from human cells, specifically designed to assess the corrosive or irritating properties of chemicals on the skin. This innovative model serves as a humane alternative to conventional experiments involving rabbits.

With these recent advancements in technology and the development of in-vitro and computational models, ethical concerns about animal testing have led to discussions about the necessity of using animals when alternative, less harmful methods are available. While ethical regulations have brought greater scrutiny to animal research, public opinion, activism, and pressure from animal rights organizations also shape research practices. All of these factors highlight our ethical progress as we move away from harming animals in chemical and drug testing.

Historical Perspective on Animal Model Testing

From the early Greek physicians who vivisected live animals to understand anatomy, to the 12th-century surgeons who used animals to practice their surgical procedures, humans have long used animals for exploratory and practical research. This practice was backed by the then-uniform view of human superiority over animals and persisted throughout the centuries. Philosophers like Descartes referred to animals as ‘automata,’ or machines, and publicly vivisected them to demonstrate their supposed inability to experience pain (Hajar, 2011).

It was around the 18th century that people began to question the ethics of animal testing for scientific discovery. By the beginning of the 19th century, the topic of discussion was no longer whether animals could feel, and to what extent, but whether vivisection was justifiable at all. In 1874, Queen Victoria expressed her own concern over the treatment of animals, coinciding with wide-scale English public opposition in the 1870s. Much later, an animal rights movement emerged in North America around 1980. Books such as Richard Ryder’s Victims of Science and Bernard Rollin’s Animal Rights and Human Morality were crucial in the resurgence of popular interest in animal welfare. With the realities of how animals were being treated becoming public knowledge, scientists could no longer defend all of their experiments as some of their predecessors had done. History in this area has marked encouraging progress, but a great deal of advancement is yet to be made.

Controversies and Limitations of Animal Model Testing

Animal testing, also known as in vivo testing, is a highly controversial topic in today’s society. There has always been much debate as to whether laboratory animals should be allowed to suffer for the sake of science. While many scientists are well measured in their research and attempt to minimize the pain suffered by laboratory animals, some remain oblivious to the fact that animals are living beings, treating them like machines that produce varieties of data.

Russell and Burch introduced the “3Rs” principle in their 1959 book The Principles of Humane Experimental Technique, and it remains the guiding framework for more ethical use of animals in product testing and scientific research (Arora et al., 2011). The first “R,” Replacement, emphasizes the need to replace animal testing with alternative methods whenever possible. This involves finding non-animal alternatives such as in vitro (test tube) experiments or computer modeling to reduce the use of animals in research. The second “R,” Reduction, focuses on reducing the number of animals used when they are necessary for experimentation. This can be achieved through more efficient experimental design, using statistical techniques like data pooling, modeling, and randomization to involve fewer animals, and sharing data to minimize duplication. The third “R,” Refinement, calls for the refinement of experimental procedures to minimize the pain, suffering, and distress experienced by animals involved in research. The 3Rs principle has been widely adopted and promoted as a framework to ensure more ethical and humane treatment of animals in scientific research while also encouraging the development and use of alternative methods that do not involve animals.

Model of an organ-on-a-chip

Furthermore, these ethical issues sit alongside another limitation of animal testing: animal models do not always accurately represent human responses. Biological variability among animals, variations in experimental conditions, and even subtle differences in animal strains can impact the reproducibility of results, making animal testing an unreliable method with limited predictive value. For instance, in the late 1950s, a German pharmaceutical company introduced thalidomide, a drug widely used to alleviate morning sickness in pregnant women. The drug had devastating consequences, resulting in the death of approximately 2,000 unborn babies and causing malformations in over 10,000 children. In response to this tragedy, a new testing method known as embryotoxicity testing was introduced, involving the use of 3,200 rats and 2,100 rabbits per drug to assess potential toxicity to embryos (Hartung, 2022).


At the time, the only way to check whether newly created drugs were safe was to test them on animals and observe the effects, mimicking what would happen in humans. Cases like this one illustrate the historical reliance on animal testing to ensure the safety of drugs and chemicals for human use. Yet using animals as models for human responses is far from perfect.

Rise of In-Vitro Cell Models

In response to these various concerns about animal models, there has been an amplifying effort to develop and adopt in-vitro methods as a more humane and scientific alternative to traditional animal testing.

In vitro, which means “in glass” in Latin, refers to experiments conducted in a controlled laboratory environment using cells, tissues, or organs rather than live animals. Recent advances have enabled the development of many such methods: cell and tissue cultures, 3D tissue culture, biological devices like organs-on-a-chip, bioreactors, in silico models, and in vitro skin models. In vitro methods offer several advantages, such as increased precision, reduced costs, and the ability to simulate human biological responses more accurately. These methods have gained prominence in recent years, driven by advancements in cell culture technologies, microfluidics, and high-throughput screening.

3D Tissue Culture: Journey of Organ-on-Chip

The organ-on-chip (OoC) is an intriguing development in 3D tissue culture, in which biology is coupled with microtechnology to create a miniaturized platform mimicking human organs. The chip is composed of a clear, flexible polymer about the size of a USB memory stick that contains hollow microfluidic channels (micro-channels through which fluids pass) lined with living human organ cells and human blood vessel cells. These living, three-dimensional cross-sections of human organs provide a window into their inner workings and the effects that drugs can have on them, without involving humans or animals. Working at the microscale also lends a unique opportunity to attain a higher level of control over the microenvironment that ensures tissue life support, as well as a means to directly observe cell and tissue behavior.

Traditional two-dimensional (2D) cell culture methods, in which cells grow on flat surfaces, have provided considerable insight into human physiology since the early 1900s and have been used to culture stem cells, study diseases and cell-cell interactions, image tissues, and support drug discovery, toxicology, and drug metabolism research. However, they fall short in many scientific investigations because they fail to mimic the complex, dynamic microenvironments of living organisms, limiting their accuracy for drug discovery.

Fall 2023
Figure 1. The inner structure of an organ-on-chip (Source: hDMT Public Commons)

In contrast, 3D tissue culture models provide a more physiologically relevant environment for studying cell behavior, tissue development, disease mechanisms, and drug responses (Singh et al., 2022).

The shift from 2D to 3D cell culture has opened new possibilities for researchers seeking to bridge the gap between in vitro studies and in vivo realities. Drug discovery and cancer studies conducted in 3D cell culture achieve better accuracy, and gene expression studies correlate better with 3D cell models. 3D cell culture has also reduced reliance on animal models, making research more humane. Organ-on-a-chip models offer several further benefits, including the ability to downsize experimental setups, seamlessly integrate various components, minimize resource consumption, and maintain precise control over critical parameters such as concentration gradients, fluid shear stress, and the interactions between different organs and tissues.

These miniaturized systems find applications across the drug development process, serving as valuable tools for tasks ranging from refining initial drug hits to lead optimization, conducting essential toxicological assessments, performing physiological studies, investigating pharmacokinetics, and screening for phenotypic variations.

Manufacturing and Components:

Organ-on-a-chip devices are typically made using microfabrication techniques. The specific steps involved in creating an organ-on-chip include:

1. Design and Conceptualization: The process begins with the conceptualization of the organ-on-a-chip model. Researchers define the specific organ or tissue they aim to mimic and outline the intended purpose and experimental objectives.

2. Microfabrication: Microfabrication techniques, typically used in the semiconductor industry, are applied to create the microfluidic chip. This chip serves as the platform for the organ-on-a-chip model. Substrates such as glass or polymers (e.g., polydimethylsiloxane) are used to create the chip’s structure.

3. Cell Culture and Seeding: Appropriate human cells, derived from the organ or tissue of interest, are cultured in a laboratory setting. These cells are carefully selected and prepared to retain their biological properties. The cultured cells are then seeded onto the microfluidic chip, where they attach and grow.

4. Microchannel Network: The microfluidic chip contains a network of microchannels that serve as a circulatory system for the culture. These microchannels are designed to mimic the vasculature and fluid flow within the targeted organ. Fluids, such as cell culture media, can be introduced into these microchannels to simulate blood or interstitial fluid.

5. Perfusion System: To replicate the dynamic environment within the organ, a perfusion system is integrated. This system provides a continuous flow of culture media through the microchannels and maintains essential parameters, such as shear stress, oxygen levels, and nutrient supply, which are crucial for the cells’ health and behavior.

6. Physiological Conditions: The chip is designed to recreate the physiological conditions experienced by the specific organ. For instance, a lung-on-a-chip might simulate breathing motions through mechanical actuation, while a liver-on-a-chip could incorporate metabolic functions. By introducing mechanical and electrical cues, the organ-on-a-chip model can closely mimic the tissue’s native microenvironment.

In recent years, researchers have developed models for various organs on a chip, for example the kidney, lung, heart, skin, pancreas, and brain on a chip, as well as a blood-brain barrier on a chip (Ingber, 2022).

Organ on Chip system: Regulatory Landscape and Future

While organ-on-a-chip technology has made remarkable strides, there are still several hurdles to overcome. Challenges such as surface adsorption effects and inefficient fluid mixing in microfluidic devices persist. Additionally, various modifications are needed to enhance the technology, as noted in organ-on-a-chip reviews (Mastrangeli et al., 2019). Currently, most organ-on-a-chip models focus on individual organs or combinations of a few of them.


However, to create a holistic “body on a chip,” it is crucial to interconnect these models into one unified system that mimics multiple major organs. Notably, there are numerous organs, such as adipose tissue, the retina, and the placenta, that have seen limited exploration using this technology.

Market research indicates that North America is likely to dominate the organ-on-a-chip technology market, with a 49% share, as it embraces the shift from 2D and 3D cell cultures. Key adopters of this technology include pharmaceutical companies, biotechnology firms, and academic and research institutions. Furthermore, increased research funding in the pharmaceutical and biotechnology sectors and a growing number of clinical trials involving cell-based therapies are expected to boost the organ-on-a-chip market.

Organ-on-a-chip technology represents a paradigm shift in scientific research and discovery, offering a wealth of opportunities while aligning with the principles of bioethics, particularly through its instrumental role in reducing the number of animals used in scientific experiments. The ability to replicate human responses in vitro significantly decreases reliance on animal models, promoting a more humane and ethical approach to science. One of the primary shortcomings of animal testing has been its limited ability to predict human responses accurately. Organ-on-a-chip models, on the other hand, offer greater relevance to human biology, making research outcomes more applicable to human health and reducing the risks associated with poor translation from animal models to human patients. Ultimately, organ-on-a-chip technology not only offers tremendous opportunities for advancement but also champions the principles of ethical and responsible research, paving the way for a more compassionate and effective era of scientific exploration.

Works Cited:

Arora T, Mehta AK, Joshi V, Mehta KD, Rathor N, Mediratta PK, Sharma KK. Substitute of Animals in Drug Research: An Approach Towards Fulfillment of 4R’s. Indian J Pharm Sci. 2011 Jan;73(1):1-6. doi: 10.4103/0250-474X.89750. PMID: 22131615; PMCID: PMC3224398.

Hartung T (2022) Replacing Animal Testing: How and When? Front. Young Minds. 10:959496. doi: 10.3389/frym.2022.959496

Leung, C.M., de Haan, P., Ronaldson-Bouchard, K. et al. A guide to the organ-on-a-chip. Nat Rev Methods Primers 2, 33 (2022). https://doi.org/10.1038/s43586-022-00118-6

Mastrangeli, M.; Millet, S.; partners, T.O.; van den Eijnden-van Raaij, J. Organ-on-Chip In Development: Towards a roadmap for Organs-on-Chip. Preprints 2019, 2019030031. https://doi.org/10.20944/preprints201903.0031.v1

Singh D, Mathur A, Arora S, Roy S, Mahindroo N. Journey of organ on a chip technology and its role in future healthcare scenario. Applied Surface Science Advances. 2022 Jun;9:100246. doi: 10.1016/j.apsadv.2022.100246. Epub 2022 Apr 11. PMCID: PMC9000345.

Image: The Guardian


pixels

ON THE COVER: Andrew Akindele
Photography: Andrew Akindele, Karsey Renfert

manuscript


Exploring the Feasibility of Hydrogen as an Alternative Fuel for Industrial Steel Reheat Furnaces

Dara Safe

Department of Mechanical Engineering, University of Wisconsin-Madison

Abstract: Amidst the global push for sustainable energy, this study explores hydrogen as a viable fuel source in industrial settings, specifically in reheat furnaces for steel production. Utilizing a comprehensive technoeconomic analysis, various hydrogen production pathways were assessed for their costs, challenges, and benefits. Methods involved a comparative analysis of production costs, efficiency, and environmental impact. Results indicate that while hydrogen presents higher upfront costs, its long-term benefits in efficiency and sustainability are noteworthy. The study concludes that hydrogen, despite its challenges, offers a promising alternative for sustainable industrial applications. Limitations include the narrow focus on steel reheat furnaces and the evolving nature of hydrogen production technologies.



Diving into the manuscript

As the world scrambles for cleaner energy, scientists are looking at hydrogen as a potential power source for factories. This study focused on using hydrogen to fire the furnaces that reheat steel, a big energy guzzler. It compared different ways to make hydrogen, along with the cost and efficiency of each method. The study suggests that while hydrogen would be expensive up front, it is more efficient and less polluting in the long run, a promising step towards a cleaner future for industry.

Read on to find out more!


Introduction

Rising greenhouse gas emissions from human activities involving fossil fuels are a global concern. The global steel sector is the second-largest industrial contributor to greenhouse gas emissions, responsible for roughly eight percent of total carbon dioxide emissions from fossil fuels (Muslemani et al., 2021). One of the most energy-intensive processes within steelmaking is steel reheating, which requires 2.3 gigajoules per ton of steel, roughly 20% of the industry’s total energy consumption (NREL, 1998). The UNFCCC’s Paris Agreement, adopted in 2015 to combat climate change globally, has incentivized the steel industry to become carbon neutral by 2050 (Schmitz et al., 2021). This ambitious effort requires alternative fuels to replace fossil fuels as the principal energy source in industrial steel reheating. Hydrogen is considered one of the most promising routes to net-zero carbon emissions, partly due to its wide range of production sources and high heating value. As shown in Figure 1, the reheating furnace is an integral part of the rolling mill: the steel must be heated to its recrystallization temperature so that plastic deformation can occur and the end-use steel product can be made.
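For scale, the 2.3 GJ per ton figure can be translated into a rough hydrogen requirement. The sketch below is a back-of-envelope estimate, not from the paper: the hydrogen lower heating value of roughly 120 MJ/kg is an assumed textbook value, and furnace efficiency losses are ignored.

```python
# Back-of-envelope: hydrogen mass needed to supply the reheating energy per ton of steel.
# Assumption: H2 lower heating value ~120 MJ/kg (textbook value); furnace losses ignored.
REHEAT_ENERGY_MJ_PER_TON = 2300   # 2.3 GJ per ton of steel (NREL, 1998)
H2_LHV_MJ_PER_KG = 120            # assumed lower heating value of hydrogen

h2_kg_per_ton = REHEAT_ENERGY_MJ_PER_TON / H2_LHV_MJ_PER_KG
print(f"~{h2_kg_per_ton:.0f} kg of H2 per ton of steel")  # ~19 kg
```

In other words, on the order of twenty kilograms of hydrogen per ton of steel would be needed just for reheating, before accounting for furnace efficiency.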

Methods

The study reviewed existing literature to understand hydrogen’s use in industrial heating, looking at how well it burns and its effects on the environment and costs. It considered hydrogen’s energy content and how furnace architecture may need to change to handle hydrogen’s low density. It compared the carbon emissions from making hydrogen using renewable energy versus fossil fuels with carbon capture. The cost analysis weighed these methods against current and predicted hydrogen prices. The paper also explained how hydrogen is made through steam methane reforming and three common water electrolysis methods.

Results

Table 1 lists hydrogen’s critical combustion property values compared to commonly used compounds in reheat furnace burners. The fuel properties dictate the burner operating parameters, such as fuel volumetric flow and air supply.

Table 1: Comparison of various fuel properties used in industrial reheat furnaces adapted from (ToolBox, 2018; von Scheele, 2021)
Figure 1: Overview of the steel plant rolling mill

Based on a case study by Mukherjee and Singh, increasing the hydrogen content in a hydrogen-methane fuel mixture from 10% to 90% results in a 2.2% increase in fuel efficiency when equating heat input (Mukherjee & Singh, 2021). However, on a volumetric basis, the flow rate of the 90% hydrogen mixture must be roughly two and a half times that of the 10% hydrogen mixture to achieve equal heat input, due to the considerably lower density of hydrogen compared to methane (Mukherjee & Singh, 2021). Consequently, the gas piping system must be designed to accommodate the additional volumetric flow.
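The factor of roughly two and a half follows directly from the volumetric heating values of the two gases. The quick check below uses assumed textbook lower heating values of about 10.8 MJ/m³ for hydrogen and 35.8 MJ/m³ for methane; the exact figures depend on reference conditions.

```python
# Why the 90% H2 mixture needs ~2.5x the volumetric flow of the 10% H2 mixture.
# Assumed volumetric lower heating values (MJ per normal m^3): H2 ~10.8, CH4 ~35.8.
LHV_H2, LHV_CH4 = 10.8, 35.8

def mix_lhv(h2_fraction):
    """Volumetric heating value of an H2/CH4 blend (MJ/m^3)."""
    return h2_fraction * LHV_H2 + (1 - h2_fraction) * LHV_CH4

# Equal heat input => required flow ratio is the inverse ratio of heating values.
flow_ratio = mix_lhv(0.10) / mix_lhv(0.90)
print(f"flow ratio (90% H2 vs 10% H2): {flow_ratio:.2f}")  # ~2.5
```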

A notable difference in the fuel properties is that the flame propagation speed of methane is only about one sixth that of hydrogen; hydrogen’s high reactivity and diffusivity make its flame hot and quick to spread, raising safety concerns about flashback. Burners must be able to withstand flashback to avoid damaging the burners and potentially causing an explosion if the flame reaches the fuel supply (Vance et al., 2022). Furthermore, the flue gas from hydrogen combustion consists of water vapor and NOx, unlike the carbon dioxide produced by commonly used carbon-based fuels. Observations indicated an impact on the water concentration within the reheat furnace, with potential implications for scale formation, decarburization, and temperature uniformity.

Steam Methane Reforming and Carbon Capture

Natural gas reforming is the most dominant process for hydrogen production globally: nearly half of the hydrogen produced worldwide, and 95% of that produced in the United States, is made this way (Liu et al., 2021; Office, 2022). The process converts methane, typically from natural gas, into hydrogen and carbon monoxide via a reaction with steam over a catalyst (Ji & Wang, 2021). The production process follows four steps, shown in Figure 2. The natural gas is first purified to remove impurities, leaving behind a pure form of methane. Next, the methane feedstock reacts with steam at about 800℃ to produce syngas, a mixture of carbon monoxide and hydrogen gas.

The next reaction is a water-gas shift, in which the carbon monoxide reacts with steam over a metal-based catalyst to form hydrogen and carbon dioxide. Finally, the steam methane reforming gas mixture is purified by adsorption with carbon dioxide removal devices and undergoes methanation to remove leftover carbon oxides (NYSERDA, 2019).
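The chemistry described above follows standard reforming and water-gas-shift stoichiometry (textbook reactions, added here for reference rather than taken from the cited sources):

```latex
\begin{align*}
\mathrm{CH_4} + \mathrm{H_2O} &\longrightarrow \mathrm{CO} + 3\,\mathrm{H_2}
  && \text{(steam reforming, } \sim 800\,^{\circ}\mathrm{C}\text{, endothermic)} \\
\mathrm{CO} + \mathrm{H_2O} &\longrightarrow \mathrm{CO_2} + \mathrm{H_2}
  && \text{(water-gas shift, exothermic)} \\
\text{Net: } \mathrm{CH_4} + 2\,\mathrm{H_2O} &\longrightarrow \mathrm{CO_2} + 4\,\mathrm{H_2}
\end{align*}
```

The net reaction makes clear why SMR is carbon-intensive: every mole of methane converted ultimately yields one mole of carbon dioxide.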

According to Ji & Wang, the final composition of the gas is 70-75% hydrogen, 7-10% carbon monoxide, 6-14% carbon dioxide, and 2-6% methane (Ji & Wang, 2021). More advanced steam methane reforming plants utilize a two-stage pressure and vacuum swing adsorption (PSA/VSA) instead to purify the product to attain nearly pure hydrogen from the steam methane reforming gas and pure carbon dioxide from the tail gas (Shi et al., 2018). Soltani et al. point out that the typical steam methane reforming plant utilizing the processes described above emits 7 kilograms of carbon dioxide per kilogram of hydrogen, accounting for 3% of the global industrial sector emissions (Soltani et al., 2014). Despite steam methane reforming being a carbon-intensive process, carbon capture and sequestration (CCS) can mitigate most carbon dioxide emissions into the atmosphere (Soltani et al., 2014). According to Padro & Putsche, the two most common forms of sequestering are ocean disposal and underground injection (Padro & Putsche, 2010).

Water Electrolysis

Water electrolysis is an electrochemical redox reaction that uses an electric current to decompose water into hydrogen and oxygen. The technology is currently used on a small scale, supplying about 4% of global hydrogen demand. Electrolysis is classified into three methods, shown in Figure 3: alkaline electrolysis (AEL), proton exchange membrane (PEM) electrolysis, and solid oxide electrolysis (SOEC). The following briefly overviews these three technologies for large-scale hydrogen production.

AEL and PEM are the most developed water electrolysis technologies for commercial applications. AEL typically uses a liquid electrolyte of potassium hydroxide or sodium hydroxide. The hydroxide ions within the electrolyte are the charge carriers, transferring from the cathode to the anode as hydrogen gas is produced. Rego de Vasconcelos & Lavoie explain that AEL can also operate at pressures up to 690 bar, which is favorable for reducing the costs associated with the high-pressure requirements of transportation (Rego de Vasconcelos & Lavoie, 2019). Nevertheless, pressurized electrolysis has a higher specific energy consumption, at 56 to 60 kWh/kg H_2, compared to 48 to 60 kWh/kg H_2 for large-scale ambient-pressure electrolysis (Rego de Vasconcelos & Lavoie, 2019).

PEM electrolysis pumps water to the anode, where the water molecule is separated into oxygen, hydrogen ions, and electrons. The hydrogen ions then migrate across a proton-conducting membrane to the cathode, driven by direct current, where hydrogen gas is produced.

Figure 2: Steam methane reforming process reproduced from (NYSERDA, 2019)

SOEC is essentially the reverse reaction of PEM: steam reacts with electrons at the cathode to produce hydrogen gas and oxygen ions (Rego de Vasconcelos & Lavoie, 2019). The oxygen ions then travel through the gas-tight membrane to the anode, where they form oxygen gas and release electrons (Rego de Vasconcelos & Lavoie, 2019). The process is unique in that it uses both heat and electricity to drive the breakdown of water molecules. Because heat provides part of the supplied energy, the electricity demand, and consequently the overall specific energy consumption, drops to 28 to 39 kWh/kg H_2 (Rego de Vasconcelos & Lavoie, 2019). Unlike the other technologies mentioned, SOEC is still in the prototyping stages, and further validation of cyclic stability, lifetime, and operating pressure is required before commercial use (Rego de Vasconcelos & Lavoie, 2019). Table 2 summarizes the operating conditions for each technology with corresponding specific energy consumption values, as discussed in this section.
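The specific energy consumption figures above can be put in perspective against hydrogen's higher heating value, about 39.4 kWh/kg (a standard reference value; the range midpoints below are also assumptions for illustration):

```python
# Electrical efficiency relative to hydrogen's higher heating value (HHV ~39.4 kWh/kg).
# Specific energy consumption (SEC) midpoints are taken from the ranges quoted above;
# both the HHV and the midpoint choices are illustrative assumptions.
HHV_H2_KWH_PER_KG = 39.4

sec_midpoints = {
    "AEL (pressurized)": 58.0,   # 56-60 kWh/kg H2
    "AEL (ambient)":     54.0,   # 48-60 kWh/kg H2
    "SOEC":              33.5,   # 28-39 kWh/kg H2 (electricity only)
}

for tech, sec in sec_midpoints.items():
    eff = HHV_H2_KWH_PER_KG / sec
    print(f"{tech}: {eff:.0%} of HHV recovered per unit of electricity")
```

An electrical "efficiency" above 100% for SOEC is not a free lunch; it simply reflects that part of the input energy arrives as heat rather than electricity.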

Discussion

A comparative assessment of the considered hydrogen production methods is summarized in Table 3 and Table 4. Given current trends away from fossil fuels, the electrolysis technologies are assumed to be powered by renewable electricity, thus emitting roughly zero carbon (Bartels et al., 2010). On the other hand, SMR alone produces significant emissions. However, CCS reduces the global warming potential by 71.7% to 81.7%, depending on the efficiency of carbon dioxide sequestration, as measured by the amount of carbon dioxide successfully captured and stored per unit of energy used (Ji & Wang, 2021). With CCS, SMR is considered quasi-clean, as shown in Table 3.
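Combining this reduction range with the roughly 7 kg of CO2 per kg of hydrogen quoted earlier for a typical SMR plant gives a rough sense of the residual footprint. This is an illustrative calculation only; the GWP reduction is defined per unit of energy, so applying it directly to the per-kilogram figure is an approximation.

```python
# Illustrative residual CO2 from SMR with CCS, combining the ~7 kg CO2 / kg H2
# baseline (Soltani et al., 2014) with the 71.7-81.7% GWP reduction range
# (Ji & Wang, 2021). Approximation: the reduction is applied per kg of H2.
BASELINE_CO2 = 7.0  # kg CO2 per kg H2, SMR without capture

for reduction in (0.717, 0.817):
    residual = BASELINE_CO2 * (1 - reduction)
    print(f"{reduction:.1%} reduction -> ~{residual:.2f} kg CO2 / kg H2")
```

That is, SMR with CCS would still emit on the order of 1.3 to 2 kg of CO2 per kilogram of hydrogen, hence "quasi-clean" rather than clean.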

The efficiency of the process is critical to the overall viability of hydrogen as a fuel source for commercial applications. The developing electrolysis technology, SOEC, has high energy efficiency because its high process temperatures reduce the required voltage and hence the energy consumption (Chi & Yu, 2018). PEM is the next best in average efficiency and has the advantage of high power density and pressure within the cell. Lastly, the most mature technology, AEL, is comparable in efficiency to SMR, according to Table 3.

In electrolysis, the capital cost is typically related to the electrolyzer power capacity, operating strategy, and technology efficiency (Buttler & Spliethoff, 2018). Within electrolysis, AEL has favorable capital costs compared to PEM and SOEC. Notably, SMR alone is currently the cheapest in production cost and capital expenditure, according to Tables 3 and 4. Nevertheless, capital costs can nearly double when implementing CCS (Nikolaidis & Poullikkas, 2017).

Table 3: Cleanliness, efficiency, and capital expenditure for each hydrogen production technology adapted from (Dawood et al., 2020; IEA, 2022a; Ji & Wang, 2021)

Natural gas costs vary from $0.5 to $1.7 per kilogram of hydrogen, depending on regional gas prices (IEA, 2021). Using CCS technology to reduce emissions increases the cost to $1 to $2 per kilogram of hydrogen (IEA, 2021). El-Emam and Özcan point out that the cost of electrolytic hydrogen depends strongly on the mix of renewable energy used to generate the electricity and on the capital costs of the hydrogen generation plant, which explains the wide range of current hydrogen costs via renewable electricity (El-Emam & Özcan, 2019). Estimates from the IEA of hydrogen costs for the emerging production routes are included in Table 4 (IEA, 2022b).

Conclusion

Natural gas is one of the most viable pathways for introducing hydrogen as an energy carrier in reheat furnaces because it is commercially proven and is the least expensive feedstock for producing hydrogen in this analysis. However, CCS is required to address the high greenhouse gas emissions associated with using natural gas. According to the U.S. Department of Energy (DOE), the goal is to reduce the cost of distributed hydrogen production from renewable electricity to $2 per kilogram of hydrogen by 2025 and $1 per kilogram of hydrogen in the next decade (EERE, 2022). Based on the comparison section, the significant capital costs and lack of validation at a large scale make hydrogen via water electrolysis less feasible as an immediate solution for high carbon emissions in the steel industry. Significant research and development for electrolyzers and renewable electricity production efficiency are required to bring the cost of hydrogen using renewable sources to DOE targets. In terms of hydrogen application in reheat furnaces, the unique combustion characteristics of hydrogen, such as high flame speed and adiabatic flame temperature, pose safety and operational challenges that must be considered before wide-scale integration (Baukal et al., 2021).

Figure 3: Methods for water electrolysis with respective anode and cathode reactions reproduced from (Rego de Vasconcelos & Lavoie, 2019)
Table 2: Operating conditions and development status of water electrolysis technologies adapted from (Dincer & Zamfirescu, 2016)
Table 4: Cost of hydrogen for each energy source adapted from (IEA, 2021, 2022b)

It should be noted that this paper neglects critical analysis of alternative hydrogen production routes, additional methods of reducing emissions in reheat furnaces, and other costs associated with hydrogen implementation. Emerging routes of hydrogen production using other forms of feedstock should be evaluated: biomass, coke oven gas, solar, oil, etc. (Hanley et al., 2018). Additionally, using thermal instead of electrical energy to power electrolysis seems promising, since the cost of electricity is significantly higher than that of thermal energy, giving incentive for further development of SOEC (El-Emam & Özcan, 2019). Lastly, a thorough life cycle impact assessment is needed to evaluate information missing from this paper, such as production capacity, plant construction requirements, operational expenses, and fuel transportation costs (Cetinkaya et al., 2012).

Acknowledgments

The author extends gratitude to Professor David Rothamer for his support as the sponsor of this independent study.

References

Bartels, J. R., Pate, M. B., & Olson, N. K. (2010). An economic survey of hydrogen production from conventional and alternative energy sources. International Journal of Hydrogen Energy, 35(16), 8371-8384. https://doi.org/10.1016/j.ijhydene.2010.04.035

Baukal, C., Johnson, B., Haag, M., Theis, G., Henneke, M., Varner, V., & Wendel, K. (2021). High Hydrogen Fuels in Fired Heaters.

Buttler, A., & Spliethoff, H. (2018). Current status of water electrolysis for energy storage, grid balancing and sector coupling via power-to-gas and power-to-liquids: A review. Renewable and Sustainable Energy Reviews, 82, 2440-2454. https://doi.org/10.1016/j.rser.2017.09.003

Cetinkaya, E., Dincer, I., & Naterer, G. F. (2012). Life cycle assessment of various hydrogen production methods. International Journal of Hydrogen Energy, 37(3), 2071-2080. https://doi.org/10.1016/j.ijhydene.2011.10.064

Chi, J., & Yu, H. (2018). Water electrolysis based on renewable energy for hydrogen production. Chinese Journal of Catalysis, 39(3), 390-394. https://doi.org/10.1016/S1872-2067(17)62949-8

Dawood, F., Anda, M., & Shafiullah, G. M. (2020). Hydrogen production for energy: An overview. International Journal of Hydrogen Energy, 45(7), 3847-3869. https://doi.org/10.1016/j.ijhydene.2019.12.059

Dincer, I., & Zamfirescu, C. (2016). Chapter 3 - Hydrogen Production by Electrical Energy. In Sustainable Hydrogen Production (pp. 99-161). Elsevier. https://doi.org/10.1016/B978-0-12-801563-6.00003-0

EERE. (2022). Hydrogen Production. Hydrogen and Fuel Cell Technologies Office. https://www.energy.gov/eere/fuelcells/hydrogen-production

El-Emam, R. S., & Özcan, H. (2019). Comprehensive review on the techno-economics of sustainable large-scale clean hydrogen production. Journal of Cleaner Production, 220, 593-609. https://doi.org/10.1016/j.jclepro.2019.01.309

Hanley, E. S., Deane, J. P., & Gallachóir, B. P. Ó. (2018). The role of hydrogen in low carbon energy futures–A review of existing perspectives. Renewable and Sustainable Energy Reviews, 82, 3027-3045. https://doi.org/10.1016/j.rser.2017.10.034

Muslemani, H., Liang, X., Kaesehage, K., Ascui, F., & Wilson, J. (2021). Opportunities and challenges for decarbonizing steel production by creating markets for ‘green steel’ products. Journal of Cleaner Production, 315, 128127. https://doi.org/10.1016/j.jclepro.2021.128127

IEA. (2021). Global Hydrogen Review 2021. https://www.iea.org/reports/global-hydrogen-review-2021

IEA. (2022a). Electrolysers. https://www.iea.org/reports/electrolysers

IEA. (2022b). Global average levelised cost of hydrogen production by energy source and technology, 2019 and 2050. https://www.iea.org/data-and-statistics/charts/global-average-levelised-cost-of-hydrogen-production-by-energy-source-and-technology-2019-and-2050

Ji, M., & Wang, J. (2021). Review and comparison of various hydrogen production methods based on costs and life cycle impact assessment indicators. International Journal of Hydrogen Energy, 46(78), 38612-38635. https://doi.org/10.1016/j.ijhydene.2021.09.142

Liu, W., Zuo, H., Wang, J., Xue, Q., Ren, B., & Yang, F. (2021). The production and application of hydrogen in steel industry. International Journal of Hydrogen Energy, 46(17), 10548-10569. https://doi.org/10.1016/j.ijhydene.2020.12.123

Mukherjee, R., & Singh, S. (2021). Evaluating hydrogen rich fuel gas firing. DigitalRefining: Engineers India Limited (EIL).

Nikolaidis, P., & Poullikkas, A. (2017). A comparative overview of hydrogen production processes. Renewable and Sustainable Energy Reviews, 67, 597-611. https://doi.org/10.1016/j.rser.2016.09.044

NREL. (1998). Steel Reheating for Further Processing.

NYSERDA. (2019). Hydrogen Production - Steam Methane Reforming (SMR). New York State Energy Research and Development Authority.

Office, U.S. Hydrogen and Fuel Cell Technologies. (2022). Hydrogen Production: Natural Gas Reforming. U.S. Department of Energy Office of Fossil Energy and Carbon Management. https://www.energy.gov/eere/fuelcells/hydrogen-production-natural-gas-reforming

Rego de Vasconcelos, B., & Lavoie, J. M. (2019). Recent Advances in Power-to-X Technology for the Production of Fuels and Chemicals. Frontiers in Chemistry.

Schmitz, N., Sankowski, L., Kaiser, F., Schwotzer, C., Echterhof, T., & Pfeifer, H. (2021). Towards CO2-neutral process heat generation for continuous reheating furnaces in steel hot rolling mills – A case study. Energy, 224, 120155. https://doi.org/10.1016/j.energy.2021.120155

Shi, W., Yang, H., Shen, Y., Fu, Q., Zhang, D., & Fu, B. (2018). Two-stage PSA/VSA to produce H2 with CO2 capture via steam methane reforming (SMR). International Journal of Hydrogen Energy, 43(41), 19057-19074. https://doi.org/10.1016/j.ijhydene.2018.08.077

Soltani, R., Rosen, M. A., & Dincer, I. (2014). Assessment of CO2 capture options from various points in steam methane reforming for hydrogen production. International Journal of Hydrogen Energy, 39(35), 20266-20275. https://doi.org/10.1016/j.ijhydene.2014.09.161

ToolBox, E. (2018). Air - Diffusion Coefficients of Gases in Excess of Air. Engineering ToolBox. https://www.engineeringtoolbox.com/air-diffusion-coefficient-gas-mixture-temperature-d_2010.html

Vance, F. H., de Goey, P., & van Oijen, J. (2022). Development of a flashback correlation for burner-stabilized hydrogen-air premixed flames. Combustion and Flame, 243, 112045. https://doi.org/10.1016/j.combustflame.2022.112045

von Scheele, J. (2021). Decarbonisation and Use of Hydrogen in Reheat Furnaces.


Journal of Undergraduate Science and Technology

The Journal of Undergraduate Science and Technology (JUST) is an interdisciplinary journal for the publication and dissemination of undergraduate research conducted at the University of Wisconsin-Madison. Encompassing all areas of research in science and technology, JUST aims to provide an open-access platform for undergraduates to share their research with the university and the Madison community at large.

Submit your research to be featured in an upcoming issue!

https://justjournal.club/
@wisc.just
