PENNSCIENCE PennScience is a peer-reviewed journal of undergraduate research and related content published by the Science and Technology Wing at the University of Pennsylvania and advised by a board of faculty members. PennScience presents relevant science features, interviews, and research articles from many disciplines, including the biological sciences, chemistry, physics, mathematics, geological science, and computer science.
VISIT PENNSCIENCE.ORG FOR PREVIOUS ISSUES, UPCOMING EVENTS, AND MORE!
Interested in joining the team? Email us at pennscience@gmail.com!
Team at PENNSCIENCE

EDITORS IN CHIEF
Magnolia Wang
Brian Song
Faculty Advisors
Dr. M. Krimo Bokreta
Dr. Jorge Santiago-Aviles

Editing Managers
Anushka Dasgupta
Mehek Dedhia

Writing Managers
DaHyeon Choi
Sagar Gupta

Design Managers
Amarachukwu Okafor
Bianca Vama

Business Managers
Glen Kahan
Sneha Sebastian

Editing
Anya Jayanthi, Sukhmani Kaur, Zoe Lu, Jessica Lvov, Danish Mahmood, Esha Mishra, Avinash Singh, Konstantinos Tsingas

Writing
Benjamin Beyer, Guyin (CiCe) Chen, Junle (Richard) Chen, Isabel Engel, Kevin Guo, Beatrice Han, Brian Lee, Rebecca Nadler, Emily Ng, Michelle Paolicelli, Sarah Pham, Cynthia Schneider

Design
Jessica Hao, Phuong Ngo, Caitlyn Prabowo, Avinash Singh, Saraswati Sridhar

Business
Elena Cruz-Adames, Eashwar Kantemneni, Neha Shetty

Technology Manager
Grace Lee
fall 2021 | PENNSCIENCE JOURNAL 3
TABLE OF CONTENTS

Future Technologies and Concerns with Self-Driving Cars
Jonathan Tran ..................................................................... 07

Defense Against Airborne Pathogens: 3 Research Based Practices
Guyin (CiCe) Chen .............................................................. 10

Defense Against Food Insecurity: Novel Policies, Technologies, and Scientific Research
Junle (Richard) Chen .......................................................... 14

Mobilizing the Immune System to Defend Against Cancer
Rebecca Nadler ................................................................... 17

Technology to Keep Women Safe on Campus
Isabel Engel ........................................................................ 20

The Effects of Sleep Circuitry on the Management of Neurological Disorders
Sarah Pham ........................................................................ 22

Drones, Cameras, and AI: Firefighters of the Future
Michelle Paolicelli .............................................................. 25

Drug Development, the FDA Approval Process and its Major Concerns
Kevin Guo ........................................................................... 28

Environmental Cleanup
Benjamin Beyer .................................................................. 31

The 3-Parent Baby: New Methods for Mitochondrial Replacement Therapy
Emily Ng ............................................................................. 33

Protecting Who You Are: Protecting Biomedical Data in the 21st Century
Brian Lee ............................................................................ 36

Synthetic Microbes: A Future for Cancer Drug Delivery
Beatrice Han ....................................................................... 38

Monkeys & More: Lab Animal Safety at Penn
Cynthia Schneider .............................................................. 40

FEATURED RESEARCH: Crosstalk between Macrophage and Adipocyte
Magnolia Wang ................................................................... 43
To our Readers,
On behalf of the entire team at PennScience, we are excited to present the Fall 2021 and twenty-first issue of the PennScience Journal of Undergraduate Research. For many years, PennScience has explored a variety of topics across the fields of science and technology. In consideration of the global COVID-19 pandemic and its repercussions, we focused on the theme of Safety and Defense for our publication this semester. We highly commend our writers, who have developed this idea in many nuanced directions. Jonathan Tran examines the safety and challenges presented by autonomous vehicles, while Guyin Chen discusses the timely topic of defense against airborne pathogens. Junle Chen highlights current research on synthesizing starch from carbon dioxide as an innovation towards a more sustainable food supply, and Rebecca Nadler delves deeper into the mechanisms of the immune system as a therapy to fight cancer. Isabel Engel brings the theme onto campus by featuring technologies that can help keep women safe at school, while Sarah Pham brings the topic of sleep health to the spotlight, addressing its importance as a safety concern. Michelle Paolicelli explains the defense strategies in place to combat recurring forest fires, and Kevin Guo discusses drug safety in light of current trends in global drug abuse. Benjamin Beyer explores the environmental and ecological cleanups following chemical spills and other human-caused chemical disasters, and Emily Ng conducts a thorough analysis of the three-parent baby method that uses mitochondrial DNA to treat infertility. Brian Lee focuses on genomic and biometric data in the context of biomedical data security, and Beatrice Han investigates how genetically modified bacteria can serve as drug delivery systems. Finally, Cynthia Schneider explains the importance of animal safety in science laboratories in an interview with Dr. Yale Cohen, Professor of Otorhinolaryngology and Director of Penn Animal Welfare.
PennScience is dedicated to showcasing the breadth and depth of scientific research in the Penn community. In line with our theme of Defense, we are pleased to feature an examination of macrophage differentiation and immune regulation of metabolic activity in adipose tissue, a thesis paper written by our very own Magnolia Wang under the guidance of Dr. Patrick Seale at the Perelman School of Medicine. We are beyond grateful to our Writing, Editing, Design, Business, and Technology Committees at PennScience for putting forth their great dedication and passion toward the Journal. We would also like to thank the Science and Technology Wing of the King's Court College House, as well as the Student Activities Council, for the funding that made our publication possible. Finally, we extend our gratitude to our faculty mentor Dr. Krimo Bokreta for his unending support, guidance, and zeal for PennScience. Last but not least, we would like to thank all of our readers! We highly value your support of PennScience, and we hope that you enjoy reading our issue. Sincerely, Magnolia Wang (C'23) and Brian Song (C'22) Co-Editors-in-Chief
Looking for a chance to publish your research?

PennScience is accepting submissions for our Spring 2022 issue! Submit your independent study projects, senior design products, reviews, and other original research projects to share with fellow undergraduates at Penn and beyond. Email submissions and any questions to pennscience@gmail.com. Research in any scientific field will be considered, including but not limited to: Biochemistry, Biological Sciences, Biotechnology, Chemistry, Computer Science, Engineering, Geology, Mathematics, Physics, and Physiology.
Features
Written By Jonathan Tran Designed By Caitlyn Prabowo
In 2019, the National Highway Traffic Safety Administration (NHTSA) reported that 36,096 people were killed in motor vehicle crashes across the United States.1 Furthermore, NHTSA also found that "94% of serious crashes are due to human error." Faced with these grim statistics, companies such as General Motors and Tesla are offering a solution in the form of autonomous vehicles. From rearview cameras to lane change warnings, assistive driving technologies are already commonplace in modern vehicles. Automotive companies claim that autonomous vehicles such as self-driving cars have the potential to revolutionize travel and safety. However, there is still much skepticism around the idea. From ethical and legal complications to hacking and data security, there is much to consider. Breaking down the potential benefits and drawbacks of autonomous vehicles provides a clearer picture of how this technology will influence the future of transportation.

The Society of Automotive Engineers International developed a standard measure of the degree of autonomy that a vehicle possesses, ranging on a scale from zero to five.2 The degree of automation increases with each level. Level zero represents a car that is fully controlled by the driver. Vehicles currently on the market are at level one or two. At level three, the driver can conditionally cede control but must be ready to retake command when alerted, and at level five, a human driver is no longer necessary. While level five autonomy may not be possible in the near future, companies like Tesla are rapidly working towards making highly autonomous vehicles available for public use. Autonomous vehicles rely on several pieces of hardware to capture their surroundings.
Their hardware primarily consists of cameras, radar, LiDAR (light detection and ranging), a computational power source, and a variety of other sensors.3 Many autonomous vehicles rely on cameras, which are responsible for collecting visual data around the vehicle. This process is essential in helping vehicles safely navigate their surroundings. In fact, in 2021, Tesla announced that its autopilot system, Tesla Vision, will rely solely on cameras.4 Other models for autonomous vehicles tend to supplement the visual data provided by cameras with radar and LiDAR. Radar uses radio waves to provide low-resolution images of the surrounding environment. On the other
hand, LiDAR is light-based.5 It provides higher-resolution images at the cost of not functioning as well in low-visibility conditions. Therefore, radar is a somewhat redundant system in normal weather, but often essential to passenger safety in less optimal conditions when LiDAR is ineffective. Along with these primary components, autonomous vehicle designs also include global positioning systems (GPS), inertial navigation systems, and ultrasonic sensors.

The data inputs gathered by these hardware pieces are then processed by algorithms in the software. Autonomous vehicles are trained on numerous data sets from real-world scenarios. As the algorithms are provided with more data, the software becomes increasingly adept at making decisions based on its surroundings. The machine learning algorithms in self-driving cars can roughly be broken down into four categories: regression algorithms, pattern recognition, cluster algorithms, and decision matrix algorithms.6 These work together to provide the framework with which these vehicles are able to make real-time decisions. For example, pattern recognition algorithms are responsible for recognizing objects and filtering out irrelevant environmental data. These algorithms use line segments and circular arcs to model the vehicle's surroundings. Using this information, decision matrix algorithms are able to make predictions about the outcomes of various decisions, such as whether to proceed through an intersection or not. These algorithms consist of several independently trained decision models, which are integrated to minimize risk in decision making.6

A promising new development in autonomous vehicle software is happening at Caltech. A research group led by Soon-Jo Chung, a Caltech professor and research scientist at NASA's Jet Propulsion Laboratory, is revolutionizing current visual terrain-relative navigation (VTRN) systems.7 A VTRN system operates by comparing images of local terrain to high-resolution satellite images in order to figure out its current position. When this system is integrated into autonomous vehicle software, the technology would allow vehicles to pinpoint their location. For decades, scientists have struggled to harness the potential of VTRN systems because obstructions like leaves and snow from seasonal changes cause the system to be unable to determine its location. Dr. Chung's research group is turning toward deep learning and AI to solve this long-standing problem. These deep learning algorithms are able to teach themselves about the surroundings by picking out fine details that humans would likely miss. They are able to account for seasonal changes to create a stable and invariant domain, which current registration software in the vehicle can readily process.8 While previous algorithms were "no better than a coin flip, with 50 percent of attempts resulting in navigation failures," Dr. Chung's work has demonstrated an astonishing 92 percent success rate.7 This software development has the potential to drastically improve autonomous vehicle safety. With this technology, vehicles are able to recognize their surroundings using images that the VTRN systems have already been trained on. This is especially important when GPS is not available, as it provides autonomous vehicles with essential information to safely navigate their surroundings. Improvements like these are promising for the future of autonomous vehicles. Many companies are successfully capitalizing on such innovations. In 2019, Tesla's autopilot system logged over two billion miles of use.3 With enormous economic incentives on the line, these companies are encouraging politicians to pass legislation that would allow for more freedom in autonomous vehicle testing and production.
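The core idea behind VTRN, matching a small local terrain image against a larger satellite map to recover position, can be illustrated with a toy sketch. This is plain normalized cross-correlation on synthetic data: the deep-learning front end that removes seasonal appearance changes is omitted, and nothing here reflects the Caltech group's actual implementation.

```python
import numpy as np

def locate_patch(satellite_map, patch):
    """Find where `patch` best matches inside `satellite_map` using
    normalized cross-correlation (a toy stand-in for VTRN matching)."""
    H, W = satellite_map.shape
    h, w = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            window = satellite_map[i:i + h, j:j + w]
            wn = (window - window.mean()) / (window.std() + 1e-9)
            score = float((p * wn).mean())  # correlation in [-1, 1]
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score

# Toy usage: cut a patch out of a synthetic "terrain map" and recover
# exactly where it came from.
rng = np.random.default_rng(0)
terrain = rng.random((40, 40))
patch = terrain[12:20, 23:31]          # local camera view at row 12, col 23
pos, score = locate_patch(terrain, patch)
```

In a real system the patch never matches pixel-for-pixel (different season, lighting, and sensor), which is exactly the gap the group's learned invariant representation is meant to close before this kind of registration step runs.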
However, some scientists, engineers, and policy makers contend that instead of pushing ahead, it is necessary to pause and consider current safety issues. In a 2016 Senate hearing, Dr. Mary L. Cummings, director of Humans and Autonomy Laboratory and Duke Robotics at
Duke University, argued that the federal government should further regulate the production and testing of autonomous vehicles.9 Dr. Cummings highlighted how "corner cases," such as extreme weather conditions, pose a safety threat if not properly tested for. Another major concern with autonomous vehicles is the potential for hacking. In 2015, security researchers discovered that a Jeep Cherokee could be hacked from more than 10 miles away. The researchers were able to control the AC settings, radio, windshield wipers, and even the engine.10 Even though the Cherokee was not an autonomous vehicle, the threat of hacking is even greater for autonomous vehicles, as the driver has far less control of the vehicle in comparison. Unfortunately, hacking isn't the only way to sabotage self-driving cars. In her testimony, Dr. Cummings pointed out that widely available laser devices are able to trick self-driving cars into perceiving objects that aren't present in their surroundings.9 Proponents of increased regulation of autonomous vehicles argue that this technology, in fact, increases the risk of accidents until such problems can be resolved. As scientists like Dr. Chung grapple with such issues, one can only imagine the possibilities for autonomous vehicles to reshape the future of safety and travel. The applications for this technology are endless. From using VTRN in space to account for planetary seasonal changes to autonomous food delivery systems, the developments in self-driving technology have the potential to make profound impacts on our daily lives.
References
DEFENSE AGAINST AIRBORNE PATHOGENS: 3 RESEARCH BASED PRACTICES
WRITTEN BY CICE (GUYIN) CHEN DESIGNED BY AMARA OKAFOR
INTRODUCTION
With the appearance of SARS-CoV-2, airborne transmission has become a major research focus. There are three primary modes of airborne transmission: first, inhalation of respiratory droplets and aerosol particles; second, deposition of respiratory droplets on mucous membranes in the mouth, nose, or eyes; and lastly, contact with mucous membranes that have been soiled with respiratory droplets.1 Common airborne diseases include influenza, chickenpox, mumps, measles, and tuberculosis. The usual symptoms include inflammation, coughing, congestion, runny nose, sore throat, and swollen glands.2 Airborne transmission has not been an area of intense study because its exact route is hard to trace and determine, owing to its prevalence and lack of containability.3 It was previously thought that airborne infection was caused by droplets emitted by infected individuals via coughing, sneezing, and transference to the mucous membranes. However, due to their large size and short traveling distance (droplets are usually larger than 100 μm and can only travel a distance of 1 to 2 meters), droplets are actually not the main mode of transmission.4 Airborne diseases are usually transmitted through the inhalation of virus-laden aerosols, which, unlike droplets, are smaller than 5 μm in size and travel more than 1 to 2 meters. Aerosols are released during all types of vocalization activities: breathing, talking, shouting, sneezing, and coughing. Interestingly, more virus-laden aerosols are released through breathing, speaking, and continuous vocalization than through less frequent coughing.4 Several factors affect aerosol transmission. For instance, temperature impacts the stability of proteins, lipids, and genetic material, changing the stability of aerosols. UV radiation inactivates viruses by damaging their genetic material.
Aerodynamic factors like airflow, ventilation, and filtration influence the viral load: a low ventilation rate increases the risk posed by virus-laden aerosols indoors, and physical plexiglass barriers impede airflow, trapping aerosols in a confined area and increasing the transmission efficacy of the virus.4 Here, we will focus on three important prevention mechanisms: masking, computational fluid dynamics (CFD) simulations, and building design.
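The droplet-versus-aerosol distinction above follows from simple physics. As a rough sketch, treating respiratory particles as water spheres settling in still air under Stokes' law (an idealization that ignores evaporation and room airflow) shows why 100 μm droplets fall out within a meter or two while 5 μm aerosols stay aloft:

```python
def stokes_settling_velocity(diameter_m, density=1000.0,
                             air_viscosity=1.8e-5, g=9.81):
    """Terminal settling speed (m/s) of a small sphere in still air:
    v = rho * g * d^2 / (18 * mu). Valid only at low Reynolds number;
    particles are modeled as water spheres, evaporation ignored."""
    return density * g * diameter_m ** 2 / (18 * air_viscosity)

# A 100 um droplet falls at roughly 0.3 m/s: it drops out of the air
# within seconds, which is why large droplets travel only 1-2 meters.
droplet_v = stokes_settling_velocity(100e-6)

# A 5 um aerosol falls at well under 1 mm/s: from face height (~1.5 m)
# it stays aloft for over half an hour, drifting with the room air.
aerosol_v = stokes_settling_velocity(5e-6)
seconds_aloft = 1.5 / aerosol_v
```

The d-squared dependence is the whole story: shrinking the diameter by a factor of 20 slows settling by a factor of 400, turning a ballistic droplet into a suspended aerosol.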
MASKING MECHANISM Masking is the most well-known prevention method. Masks serve as source control, reducing both the emission of aerosols and droplets and the inhalation of viruses. However, surgical masks are only effective in environments with low virus abundance, since masking efficacy depends heavily on virus abundance in the environment. While N95 respirators have a 5% particle penetration rate, surgical masks have a 30% to 70% particle penetration rate. Despite this high penetration rate, masking still proves effective in everyday settings: proper wearing can achieve a large reduction in infection when the maskless infection probability is low. In 1 mm droplets, there are approximately 50,000 viruses in a hundred million milliliters of the fluid released during respiration, and simple masks remove all of the large droplets.5 Different viruses have different abundances. Past scholars have shown that over thirty minutes, coronavirus emits approximately 53 particles, influenza emits 38 particles, while rhinoviruses emit 94 particles. Thus, although respiration alone emits approximately 3 million particles over a 30-minute period for a normal person, only a small fraction of them are actually virus-laden, and masks can still work effectively in these cases. However, advanced masks should be used in virus-ridden indoor environments, especially in medical centers and hospitals.
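The penetration figures quoted above compound when both parties mask. A minimal sketch, assuming the infected person's mask and the susceptible person's mask filter independently (an idealization that ignores fit leakage), multiplies the two penetration rates:

```python
def combined_penetration(source_pen, receiver_pen):
    """Fraction of exhaled particles that pass BOTH the infected wearer's
    mask and the susceptible wearer's mask, assuming independent filters."""
    return source_pen * receiver_pen

# Figures from the text: N95 respirators ~5% penetration,
# surgical masks 30% to 70% penetration.
n95 = 0.05
surgical_worst = 0.70  # pessimistic end of the surgical-mask range

# Even at 70% penetration each, two surgical masks together pass
# only about 49% of particles; two N95s pass about 0.25%.
both_surgical = combined_penetration(surgical_worst, surgical_worst)
both_n95 = combined_penetration(n95, n95)
```

This multiplicative effect is one reason universal masking outperforms what either mask's single-pass rating suggests, especially in the low-virus-abundance settings the article describes.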
CFD SIMULATIONS Another prevention method is to use accurate computational fluid dynamics (CFD) simulations to estimate aerosol concentration and its associated risk, both of which are crucial factors for optimizing strategies that aim to mitigate exposure. CFD is a branch of fluid mechanics that analyzes and solves fluid flow problems numerically. It is widely used to simulate the flow of fluids and their interactions, and the technique is applied to various fields, including aerodynamics, weather simulation, and engine analysis.6 In the case of viral transmission prevention, CFD can effectively characterize the transport of virus-laden aerosols and model the spread of disease in crowded environments like cars, buses, and aircraft. Collecting data that quantifies the size and concentration of aerosols, it can also analyze the interaction between aerosol particles and face masks, including the airflow between the wearer's face and the mask.7 Various real-world implementations have proven the efficiency of CFD. For instance, by simulating the airflow pattern in Abravanel Hall and the Capitol Theater, two concert venues in Utah, CFD revealed a strong circulation at the center of the stage where emissions easily build up. This discovery led to many ventilation optimization strategies. To increase airflow and circulation, players were rearranged on stage to alter airflow patterns: instruments that don't have air going through them were placed in the middle, while those with high emission possibilities were placed near the doors and vents.8

Figure 1: Masking is the most well-known prevention method. Masks serve as source control, reducing both the emission of aerosols and droplets and the inhalation of viruses.
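Full CFD resolves the spatial flow field, which is far beyond a short sketch, but the role ventilation plays in the aerosol budget can be shown with the much simpler well-mixed box model. This is a deliberate simplification, not CFD: it assumes the room mixes instantly and has no spatial structure, and all numbers below are hypothetical:

```python
import math

def box_model_concentration(emission, volume, ach, t_hours):
    """Aerosol concentration (particles/m^3) in a well-mixed room:
    dC/dt = E/V - a*C  =>  C(t) = (E / (V*a)) * (1 - exp(-a*t)),
    where E = emission rate (particles/h), V = room volume (m^3),
    a = air changes per hour (ACH). Instant mixing is assumed."""
    steady_state = emission / (volume * ach)
    return steady_state * (1 - math.exp(-ach * t_hours))

# Hypothetical room: 50 m^3, one occupant emitting 3e6 particles/h
# (the article's respiration figure scaled to an hour is of this order).
low_vent = box_model_concentration(3e6, volume=50, ach=1, t_hours=8)
high_vent = box_model_concentration(3e6, volume=50, ach=2, t_hours=8)
```

The model makes the headline behavior explicit: at steady state, doubling the air-change rate halves the aerosol load. CFD earns its cost precisely where this model fails, in rooms with dead zones and recirculation like the stage hotspot found in the Utah concert halls.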
ARCHITECTURAL DESIGN
Architectural design can effectively increase indoor air quality and inhibit infectious diseases. One of the earliest and most prominent designs is the Pavilion model designed by Florence Nightingale, who is known for modernizing hospitals. In her hospital ward model, she emphasized the importance of natural daylight and cross ventilation, both of which can minimize airborne infection.9 Social distancing is one of the most effective ways to reduce the spread, since the traveling distance of respiratory droplets is short. Thus, providing sufficient spacing in hallways, lobbies, and waiting areas in buildings can reduce contact transmission. Moreover, research has shown that adequate ventilation, especially natural ventilation, in public spaces is better at preventing infection, as it can drastically decrease the concentration of droplet nuclei. In one study, tuberculosis transmission risk in a consulting room was reduced by 72% after proper ventilation was implemented. Thus, design methods including open-ended corridors and the integration of courtyards into ventilation schemes should be implemented to reduce transmission.10
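Ventilation's effect on infection risk is commonly estimated with the classic Wells-Riley model, in which risk decays exponentially with the clean-air supply. The sketch below uses hypothetical round numbers, not parameters from the cited tuberculosis study; it simply shows that a five-fold ventilation increase yields a risk reduction of the same order as the 72% figure quoted above:

```python
import math

def wells_riley_infection_prob(infectors, quanta_rate, breathing_rate,
                               hours, ventilation_rate):
    """Wells-Riley estimate of airborne infection probability:
    P = 1 - exp(-I * q * p * t / Q)
    infectors: number of infectious people in the room
    quanta_rate: infectious doses ("quanta") emitted per infector per hour
    breathing_rate: m^3/h inhaled by a susceptible occupant
    ventilation_rate: m^3/h of clean air supplied to the room"""
    exposure = infectors * quanta_rate * breathing_rate * hours / ventilation_rate
    return 1.0 - math.exp(-exposure)

# Hypothetical consulting room: one infector, 10 quanta/h, 0.5 m^3/h
# breathing rate, 4 hours of exposure.
poor = wells_riley_infection_prob(1, 10, 0.5, 4, ventilation_rate=20)
good = wells_riley_infection_prob(1, 10, 0.5, 4, ventilation_rate=100)
reduction = 1 - good / poor  # roughly a 71% drop in risk
```

Because ventilation sits in the denominator of the exponent, the first air changes buy the largest risk reductions, which is why opening windows and cross ventilation pay off so quickly in poorly ventilated rooms.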
Another crucial factor that should be considered in architectural design is natural lighting, which delivers solar (UV) radiation that can kill a variety of bacteria, including anthrax and tuberculosis. Based on a recent study, sunlight from a north window through two layers of glass could effectively kill hemolytic streptococci, the bacteria that cause strep throat and skin rashes and that can survive for up to 195 days in the dark at room temperature.11 Accordingly, buildings should install more openable windows to allow sunlight penetration, which could effectively inhibit the survival of airborne pathogens.

Figure 2: Air quality can be improved by ameliorating indoor ventilation. Buildings can have good air quality (bottom) by incorporating air conditioning, fans, and ventilation, as opposed to bad air quality (top), where there are no open windows or airflow.

CONCLUSION With the extensive research on airborne transmission of diseases, we are gradually gaining more knowledge of its mechanisms and prevention methods. Aerosols, the small virus-carrying particles, are the main vehicle of transmission. Many preventive measures implemented now, including social distancing and masking, should be followed strictly, alongside other means, such as CFD simulations and building design, that identify virus-laden spaces and mitigate the risk. It is also crucial for the WHO and CDC to recognize the importance of airborne transmission routes and implement policies with a scientific basis. Through integrating science and policy, the world will be better prepared for the current situation and potential pandemics in the future.12

REFERENCES
Defense Against Food Insecurity: Novel Policies, Technologies, and Scientific Research Written By Richard Chen Designed By Bianca Vama
One in ten people in the world is malnourished. One in four is considered medically obese. Over one third of the entire global population cannot afford a healthy diet. The number of people experiencing hunger increased by 15% in 2020 due to wars and the COVID-19 pandemic. Emitting about 30% of total greenhouse gases, the food sector depletes essential agricultural resources through poor farming practices and amplifies aquatic pollution, soil health degradation, and biodiversity loss.1 In turn, these factors lead to climate change problems like heat waves, floods, and droughts, all of which play significant roles in disrupting the current food system.2 With great uncertainty in acquiring stable sources of nutrition and low availability of healthy and safe food affecting a significant portion of the world population, the world's food system is in great disarray.3 As the imminence of these problems grows, it is ever more essential to revamp the global food system through policy and institutional changes along with technological advancements.

Past and Now: Policy and Technologies

Policy, along with related programs, is one lens for tackling institution-induced hunger by improving diets, protecting rights, and ensuring that changes are implemented effectively. Past studies show that food insecurity is not only caused by but also compounds the health, monetary, and emotional challenges faced by disadvantaged households.4,5 With younger generations impacted by food insecurity to an unprecedented extent in the contemporary era, many policies strive to help disadvantaged families via nutrition programs.6 For instance, by providing families in the US with health referrals, nutritional education, and nutritionally balanced food packages, federal programs like the Supplemental Nutrition Assistance Program (SNAP), the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC), and the Child and Adult Care Food Program (CACFP) significantly alleviate food insecurity and hunger by decreasing obesity rates and improving health status among children.7 Similarly, resource distribution policies that advocate for the rights of marginalized groups can also protect resources and alleviate inequalities associated with age, gender, and ethnicity, all of which are crucial factors associated with poverty and food insecurity.
For instance, research has shown theoretically that if policy distributed similar resources among farmers of both sexes in Ethiopia, the productivity of corn would almost double.8 Similarly, by protecting the rights of indigenous people and smallholders, policies like the UN Voluntary Guidelines on Tenure can promote investment and land acquisitions in lower-income nations and reduce economic stress on food insecurity by promoting equitable tenure rights.9

Often working alongside policies are novel environmental science and agricultural technologies that amend past mistakes and monitor environmental health. Many of these technologies focus on improving soil health. The Land Life Company promotes and monitors reforestation and soil regeneration by applying algorithms that generate reforestation models to physical products that provide water and shelter to plants' deep root structures.10 Aiming to achieve a similar goal, Trace Genomics analyzes microbiomes and the microbes' respective DNA sequences in the soil and designs plans that farmers can use to improve practices and yield rates.11 Many other technologies focus on tackling environmental problems and alleviating climate change's stress on food systems. For instance, the incorporation of nanotechnology in food packaging can result in new forms of packaging that coat nanoparticles on recycled materials to maintain sustainability and food freshness.12 On a macroscopic level, soundscape technologies developed by Hitachi effectively use acoustic-monitoring stations not only to algorithmically monitor the biodiversity in rainforests but also to flag intrusions by illegal loggers.13 Although seemingly disconnected from food insecurity, these technologies effectively tackle long-lasting environmental problems, which in turn alleviate climate change issues and reduce the respective stress on agricultural components of food security. Through rigorous theorizing and testing, extant policies and technologies are certainly effective at providing positive structural changes to food systems and tackling problems resulting from food insecurity. However, the sole existence of policy and technology will not solve food insecurity, as these two components only focus on amending the past and reacting passively to problems.
Consistent global effort is needed to implement new methods of not only counteracting but directly addressing food insecurity in the future.

Future: Science and Research

Complementarily, science and research can tackle food insecurity problems at their root and amplify the positive impacts of policy and technology. One field of research that can greatly alleviate
world hunger is extracellular chemoenzymatic starch synthesis, a project developed by researchers at the Tianjin Institute of Industrial Biotechnology (TIB) of the Chinese Academy of Sciences, in which starch is chemically generated from carbon dioxide, a process previously thought possible only through photosynthesis in plant cells. Starch, a form of carbohydrate, is a main source of human food calories. It is mainly produced by crops through carbon dioxide fixation via natural photosynthesis, where plants trap light energy through their leaves and use that energy to convert carbon dioxide and water into glucose. Using the produced glucose as an energy source, plants are able to generate substances such as cellulose, the main component of plant cell walls, and starch. However, starch synthesis and storage are inefficient and complicated, as they involve around 60 complex metabolic reactions and physiological regulations but achieve only about 2 percent energy conversion efficiency.14 Building upon past research that had synthetically generated starch only from cellulose or with enzymes, the TIB group produced starch from carbon dioxide by designing a new synthetic route that uses 11 core biochemical reaction steps to complete the conversion from carbon dioxide to starch. Starting from theoretical calculations, researchers used formic acid as the starting reactant and drafted a possible reaction route through computer analysis. To optimize the reaction path, researchers split the entire starch conversion process into four modules and optimized them individually. Then, after utilizing various engineered enzymes to increase the reaction's conversion rate, researchers combined carbon dioxide and hydrogen into methanol and concretized the entire process of converting carbon dioxide to starch.
Comparing the final version of the reaction pathway to the prototype, the former's yield is a hundred times higher than the latter's, and the pathway synthesizes starch around 8.5 times faster than maize. This indicates that, theoretically, a one cubic meter bioreactor can produce starch equivalent to the annual output of five acres of maize. In addition, by using solar-generated electricity to produce the hydrogen that is essential to the synthesis reaction and by reducing the carbon dioxide concentration in the atmosphere, TIB makes the project more environmentally friendly and reduces potential climate-change-related stress on food insecurity.15 The high yield of starch and the environmental benefits of the project provide a direct response to food insecurity and hunger. The incorporation of scientific research like TIB's starch synthesis project with policy and technological advancements can tackle food insecurity from fundamental, institutional, and societal perspectives. By applying similar research to future technologies, it is possible not only to transform and recycle carbon dioxide in the atmosphere into oxygen, but also to produce sufficient food sources that human beings can rely on.
REFERENCES
MOBILIZING THE IMMUNE SYSTEM TO DEFEND AGAINST CANCER
Features
WRITTEN BY REBECCA NADLER DESIGNED BY AMARA OKAFOR

Beginning in the first century with ancient Roman physicians such as Celsus, surgery has been used to physically excise tumors from patients. In more recent years, options for cancer therapeutics have improved exponentially.1 Surgical intervention alone does not always produce lasting remission, because leaving behind even one cancer cell provides a platform for exponential proliferation into a tumor mass. The advent of radiation therapy and chemotherapeutic agents in the 1900s significantly increased overall survival rates among cancer patients. However, such therapies pose noteworthy risks of toxicity, as they are nonspecific and can damage healthy tissue in the body. Immunotherapy is the most recent paradigm in cancer treatment, promising to limit adverse toxicity by supercharging the immune system's own natural defenses against cancer. Immunotherapy consists of cancer therapeutics that help the immune system fight cancer. At large, the immune system recognizes and defends against "nonself" pathogenic proteins: immune cells constantly survey the bloodstream and become activated upon binding to an unknown protein. Nonspecific, fast-acting responses form the innate immune system, while the adaptive immune system is responsible for slower responses governed by immunological memory.2 These two branches work in successive steps to mount a response against a pathogen, as innate cells present antigens to activate adaptive immune cells: initial cytokines released upon recognition of a pathogen activate macrophages, which present antigens to T cells that subsequently trigger antibody production by B cells to eliminate the pathogen.
The cancer immunosurveillance hypothesis proposed by Burnet and Thomas in 1957 stated that the immune system has a protective role in neoplastic disease, preventing tumor formation by targeting distinctive features of tumor cells.3 At the time, it was speculated that lymphocytes, a type of adaptive immune cell, are capable of recognizing and eliminating nascently transformed cells prior to clinical indications of malignancy. Within the last thirty years, rodent models for the study of human cancers have been crucial in confirming this hypothesis. For example, using a tumor model in mice with functioning immune systems, the cytokine interferon γ (IFN-γ) was shown to promote the detection and elimination of tumor cells. However, studies have also demonstrated that IFN-γ shapes the magnitude of the immune response to a tumor and promotes the outgrowth of tumor cells with immune-evasive properties, revealing the dual role of the cytokine.4 The current cancer immunoediting hypothesis explains the role of the immune system in preventing cancer proliferation while taking into account the survival of tumor cells in immunocompetent patients.3 Via uncontrolled mutations and epigenetic changes acquired throughout the course of cancer development, a subset of surviving cancerous cells capable of escaping immunologic detection expands, ultimately leading to clinically observable malignancy.

The intricacies of the immune system's role in cancer progression make it an attractive target for cancer therapeutics. Given the potential of precision immunity, immunotherapies can specifically target cancer cells while leaving healthy cells unharmed. Research in the field of immuno-oncology is bolstered by the potential for longer-lasting effects than current treatment options, as immunotherapy can harness the adaptive immune response's capacity for immunological memory to detect cancer relapses. Immunotherapy may even serve as a universal therapeutic for cancer treatment due to the incredible specificity of the immune system in conjunction with its ubiquitous nature among patients.

Several types of immunotherapies are FDA-approved and integrated into standard oncology treatment guidelines. For instance, immune checkpoint therapy is designed to inhibit the natural
[Figure: Mechanisms of T cell activation and regulation. Before activation, a naive T cell's TCR and CD28 engage antigen-bound MHC and B7-1/B7-2 on an antigen-presenting cell (APC) in lymphoid tissue. Early after activation, CTLA4 competes for B7-1/B7-2; late after activation, in peripheral tissue, PD1 on the T cell binds PDL1/2. Treg cells further restrain the response.]
immune checkpoints that hamper the strength of the immune response, allowing T cell activation and proliferation to occur. CTLA4 is one such immune checkpoint: it blocks signals from T cell receptors to prevent further immune activation. The administration of anti-CTLA4 antibodies targeting this negative regulator was shown to enhance antitumoral immunity and induce long-lasting immunological memory in multiple murine models.5 This theory is supported by clinical data: ipilimumab, a human anti-CTLA4 antibody, was FDA approved in 2011 following evidence that it conferred a 3.6-month short-term survival benefit in melanoma patients, and 22 percent of advanced melanoma patients gained an additional three or more years of life with ipilimumab.6

Rather than regulating the endogenous immune response, adoptive T cell (ATC) therapy involves the infusion of self- or donor-derived T cells into cancer patients. In one variation of ATC therapy, immune cells that have invaded a tumor are derived from cancer biopsies and expanded for intravenous infusion into the same patient. Successful studies have been reported in many solid tumor types with varying rates of tumor reactivity, including metastatic melanoma, cervical cancer, renal cell cancer, breast cancer, and non-small cell lung cancer.7 In CAR T cell therapy, T cells are engineered to express chimeric antigen receptors (CARs) that are specific for a cancer antigen, allowing for better extracellular targeting of tumor cells. Most research utilizing CAR T cell therapy has been performed in liquid cancers that begin in blood-forming tissue, due to the limitations of localizing the T cells to solid tumors.
Therapeutic cancer vaccines are also emerging in cancer immunotherapy. Tumor-associated antigens, which are highly expressed on tumor cells and expressed to a lesser extent in normal tissues, present targetable molecules for T cells, although neoantigens unique to a malignancy are better suited for vaccine targeting.6 Castle et al. demonstrated a novel next-generation sequencing technique coupled with bioinformatics to identify immunogenic somatic mutations that serve as potential candidates for cancer vaccines.8

Immunotherapy has greatly changed the practice of medicine in the context of cancer treatment. Progress in the field is evident in improvements in pancreatic cancer survival, traditionally a fatal disease with high mortality and poor prognosis. In a meta-analysis of clinical trials employing specific immunotherapy against pancreatic cancer, the three-year overall survival rate was significantly improved in patients who received specific immunotherapy compared with patients who did not undergo treatment. On a molecular level, the meta-analysis also revealed that the antibody response in patients receiving specific immunotherapy was improved and that a greater number of T cells were activated, suggesting a potent immune response
and enhanced cellular immune function and systemic antitumor activity.9 In considering the future of immunotherapy, it is imperative to note the targeting of neoantigens in cancer vaccine development. Autoimmunity, in which the immune system attacks healthy cells, can arise with checkpoint inhibitors as well as with the targeting of tumor-associated antigens, because these treatments are not exclusively specific to tumor cells.10 The greater mutational burden of cancer cells can be harnessed to target neoantigens that result only from the expression of mutated tumor cell DNA. This may also allow for more universal cancer therapeutics, as some neoantigens arise in common across many tumor types. Furthermore, there is an expected rise in the study of immunotherapies in non-cancer settings, such as immune-mediated inflammatory disease and autoimmune disease.10 Research in immuno-oncology and beyond is primed to enhance cancer treatment options for patients in the coming years.
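To illustrate the kind of comparison that neoantigen discovery automates, the toy sketch below (hypothetical sequences and function names, not the Castle et al. pipeline) scans a tumor protein sequence against the patient's matched normal sequence and reports the mutated peptides that could serve as vaccine candidates:

```python
# Toy sketch: find candidate neoantigen peptides by comparing a tumor
# protein sequence against the patient's matched normal sequence.
# Sequences and window size are illustrative, not real pipeline data.

def candidate_neoantigens(normal: str, tumor: str, flank: int = 4) -> list[str]:
    """Return short tumor peptides centered on each somatic substitution."""
    assert len(normal) == len(tumor), "toy model assumes aligned sequences"
    peptides = []
    for i, (n_aa, t_aa) in enumerate(zip(normal, tumor)):
        if n_aa != t_aa:  # somatic mutation -> potential 'nonself' epitope
            start, end = max(0, i - flank), min(len(tumor), i + flank + 1)
            peptides.append(tumor[start:end])
    return peptides

normal = "MKTAYIAKQRQISFVKSHFSRQ"
tumor  = "MKTAYIAKQRQLSFVKSHFSRQ"  # single I -> L substitution
print(candidate_neoantigens(normal, tumor))  # one 9-mer around the mutation
```

A real pipeline would of course start from sequencing reads rather than clean protein strings and would score each peptide for immunogenicity; this sketch only shows the core "tumor versus normal" comparison.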
REFERENCES
Technology to Keep Women Safe on Campus

Written By Isabel Engel Designed By Bianca Vama

Amidst the "Me Too" movement, the safety of women has taken the national stage. As institutions have investigated rates of sexual harassment, young adult women have emerged as a key focus. According to the Rape, Abuse & Incest National Network (RAINN), women between the ages of 18 and 24 are at the highest risk for sexual violence. On college campuses, where the majority of students fall into this age bracket, such violence has become prevalent. A survey conducted by the Association of American Universities concluded that at 33 of the nation's major universities, around one in four undergraduate students reported having experienced sexual misconduct or assault. Additionally, 41.8 percent of all students surveyed reported experiencing at least one instance of 'sexually harassing behavior' since college enrollment, and 18.9 percent reported that such behavior impacted their academic experience and participation or created a hostile college environment. With such rates of sexual harassment on college campuses and among young women generally, many companies are turning to new and innovative technology to keep women safe.

News stories of assaults on women sparked the idea of wearable preventative technology. Herman Veenstra, founder of the Safelet Bluetooth bracelet, was among the first to transition wearable Bluetooth technology from the field of fitness to safety. Since his creation of Safelet in 2014, many companies have adopted similar ideas, employing Bluetooth technology as a means of keeping women safe. Two main fields of protective technologies have recently emerged: wearable technologies and mobile applications. These technologies are reshaping the concept of personal protection for women across the country.
Wearable Technologies

A primary focus of the growing preventative technology industry, wearable technologies hold much potential for promoting women's safety. From bracelets to personal alarms, a wide array of wearable technologies is changing the way women approach self-defense and personal protection. The Safelet SOS tracker bracelet was among the first products promoted as a wearable safety device. Safelet works in conjunction with a Bluetooth application to activate alarms, track users' locations, and send audio messages to emergency contacts. Safelet allows users to create a network of emergency 'Guardians' to be alerted in case of an emergency. Its success as a safety bracelet has inspired a wave of other fashion-based safety designs. InvisaWear, another smart-jewelry company, sells a patented protective technology that comes in the form of keychains, necklaces, bracelets, and scrunchies, among other products. The InvisaWear products pair Bluetooth and American District Telegraph (ADT) security technologies to create protective equipment disguised as fashionable accessories. The accessories are specifically tailored to the young adult demographic, as they provide a sleek and fashionable
way to carry an emergency alert device. The InvisaWear products allow users to subtly stay connected to emergency contacts and keep safety services readily accessible. Other companies, like Birdie, have created personal, portable safety alarms for users to carry as they navigate busy and potentially unsafe areas. The Birdie is embedded with a strong 130 dB alarm and a flashing strobe light. When users activate (pull) the alarm, the sound and light are released to ward off perpetrators of harassment, violence, assault, or other unsafe behavior. Sophomore Victoria Conroy carries a Birdie alarm in her backpack. "While I, fortunately, have never had to use my Birdie in an emergency situation, I feel safer having it on me," Victoria explained. "It is definitely a great resource to have, especially because we go to school in a busy city." Encompassing technologies like Safelet, InvisaWear, and Birdie, among a wide range of others, the field of wearable technologies has taken strides in protecting women in urban areas and on college campuses.
Mobile Applications

Many technology companies are also turning their attention to mobile applications that employ Bluetooth technology as a means of protecting consumers. Bluetooth operates using a low-power radio in the 2.4 GHz band that hops its data stream across many channels. While the technology has been adopted by a wide range of fields, the safety industry has applied it specifically to track consumers' locations and connect users with emergency points of contact. Several applications, like Noonlight, are making notable strides in using Bluetooth technology to keep women safe. The Noonlight app employs simultaneous Bluetooth and GPS technologies to connect users with local police. By holding down a button on the application's home page, users can connect with police in real time. The application also supports a safety network, allowing emergency contacts to trace a user's location in case of an emergency. Another application widely used by college students, especially students at Penn, is the Guardian app. The Guardian app is "a custom-branded personal safety app that helps higher education institutions, businesses and healthcare organizations connect and engage with their communities wherever they are." Using
GPS technology and a direct student-to-police connection, the Guardian app allows Penn students to access immediate safety resources while on campus. The app offers a wide variety of features, from a HELP line to a walking escort, all geared toward student safety in a busy urban area. Freshman Maya Hardy feels an increased sense of safety through the Guardian app. "The app's built-in resources allow me to feel more secure in Philadelphia, as I know I can always contact Penn emergency services in case of an emergency," she said. Other applications, like bSafe, are working to combine an array of safety technologies into one mobile platform. The application is designed to keep users safe wherever they are. With features like an SOS button, voice-activation technology, live streaming, recording, GPS tracking, fake calling, emergency contact networks, and timer alarms, the app strives to ensure users' protection and confidence when navigating potentially unsafe areas. Noonlight, Guardian, and bSafe represent just a few of the applications in the growing safety-application field. These apps, among many others, work to ensure users' safety in a growing digital world.
Conclusions and Applications to Penn Students

Safety is often at the forefront of the minds of Penn students, who study in a busy urban area. By using the ample technological resources available in this growing online world, Penn community members can take steps toward ensuring personal protection. The wide array of emerging technologies addressing women's safety, particularly the safety of young women at higher risk for violence, holds promise for keeping Penn students safe, on campus and beyond. As the fields of wearable and mobile safety technologies continue to grow, so too will the resources available to Penn students to stay safe in Philadelphia.
REFERENCES
Written By Sarah Pham Designed By Avi Singh

On average, a person will spend one-third of their lifetime sleeping. Not only is sleep a vital component of healthy brain function, it also defends against alterations within the neural circuits that underlie neurological disorders. In the absence of sleep, major proteins, pathways, and processes within the brain undergo deformations that can accumulate over prolonged periods and progress into causative factors of neurological and psychiatric disorders. To understand the link between sleep and neural disorders, scientists aim to elucidate the role of the underlying neural sleep circuits in the regulation of health and disease.

Brain Regions Involved in Sleep Regulation

Within specific regions of the brain, the cell bodies of neurons produce a regulated flow of neurotransmitters that allows us to maintain a sleep-wake circadian rhythm. Circadian rhythms are 24-hour cycles that serve as the body's internal clock. Directly influenced by environmental cues, circadian rhythms are tied to the cycles of day and night and prompt the brain to switch between sleep and wakefulness.1 In the hypothalamus, groups of neurons serve as control centers for sleep and arousal. Specifically, cholinergic neurons (which communicate using the neurotransmitter acetylcholine) in the suprachiasmatic nucleus of the hypothalamus regulate the sleep-wake cycle by receiving information about light exposure and projecting to the pineal gland, which releases the sleep-promoting neurohormone melatonin.1
Meanwhile, the neural and homeostatic regulation of rapid eye movement (REM) sleep is localized in the network of sleep-promoting neurons in the brain stem (midbrain, pons, and medulla oblongata) and hypothalamus.2 The amygdala, a brain region known to control the processing of emotions, plays a substantial role in sleep studies due to its high activation during REM sleep.2 The linkage between amygdala activity and REM sleep helps to explain the co-occurrence of mood disorders with abnormal sleeping patterns. The thalamus also plays a role in the sleep-wake cycle by acting as a relay center for sensory information bound for the cerebral cortex.2 During most stages of sleep, the thalamus decreases in activity, allowing us to fall into a state of relaxation. During REM sleep, however, the thalamus becomes increasingly active and sends the cortex sensory information that essentially makes us "experience" dreams.

Regulation of Wakefulness, NREM/REM

Studying the wavelengths of brain activity during sleep allows scientists to distinguish periods of wakefulness, non-REM (NREM), and REM sleep. When we are not asleep, we are in a state of wakefulness. In switching to sleep, we cycle through REM and non-REM sleep several times during the night, and with each cycle we fall into longer REM periods.

[Figure: One sleep cycle, from wakefulness (beta waves) through Stage 1 NREM (theta waves, ~2.5 min), Stage 2 NREM (sleep spindles and K-complexes, ~12.5 min), Stages 3 and 4 NREM (delta waves), back up through Stages 3 and 2, and into REM.]

Non-REM sleep occurs in three stages.3 Stage 1 is the "changeover from wakefulness to sleep," consisting of several minutes of light sleep. At this point, our brain waves begin to slow along with our muscles, heartbeat, eye movements, and breathing. Stage 2 is where we spend most of our repeated sleep cycles. Brain wave activity slows further, and bursts of neural oscillatory activity in the form of sleep spindles are most prevalent during this stage. Sleep spindles serve as temporally precise events signaling the coordinated activity of the thalamus and neocortex in processing information during sleep.4 Previous work has underscored the importance of sleep spindles through correlations between NREM brain activity and sleep-related advantages in memory.4 Stage 3 occurs in longer periods during the first half of the night and refers to the period of deep sleep required for us to truly "recharge." Brain waves become even slower, and the body is less reactive to external waking cues. REM sleep first occurs about 90 minutes after falling asleep. In this stage, our eyes move rapidly from side to side and our brain waves resemble those of wakefulness.5 Breathing speeds up and becomes irregular, while blood pressure and heart rate increase to levels close to those of wakefulness. REM sleep is most important to our sleep cycle because it stimulates brain areas essential for memory consolidation and the strengthening of synaptic connections.3
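Using approximate NREM stage durations (illustrative numbers chosen to be consistent with the roughly 90-minute onset of REM described in the text, not clinical measurements), a short script can walk one sleep cycle and report when the first REM period begins:

```python
# Walk one illustrative sleep cycle and report when REM first begins.
# Stage durations (minutes) are rough illustrative values, not measurements.
cycle = [
    ("Stage 1 NREM", 2.5),    # light sleep, theta waves
    ("Stage 2 NREM", 12.5),   # sleep spindles and K-complexes
    ("Stage 3 NREM", 12.5),   # deepening sleep
    ("Stage 4 NREM", 37.5),   # deep delta-wave sleep, longest early in the night
    ("Stage 3 NREM", 12.5),   # ascending back through the stages
    ("Stage 2 NREM", 12.5),
    ("REM", 12.5),            # rapid eye movement sleep
]

elapsed = 0.0
for stage, minutes in cycle:
    if stage == "REM":
        print(f"First REM period begins ~{elapsed:.0f} min after sleep onset")
        break
    elapsed += minutes
```

Summing the NREM stages before REM gives 90 minutes in this sketch; across a full night, the deep-sleep stages shrink and the REM block grows with each repetition of the cycle.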
Effects of Sleep Deprivation on Some Neurological Disorders

Sleep deprivation (SD) refers to a reduction in time spent sleeping below an individual's baseline requirement, while sleep restriction (SR) refers to a partial loss of sleep.2 SD and SR reduce the effectiveness of the immune system and impair cognitive function, memory consolidation, learning, and emotional wellbeing. Over prolonged periods without good sleep, the accumulation of lipids and proteins in the brain can produce causative factors of neurological and psychiatric disorders.2 Most notably, Alzheimer's disease is characterized by surplus deposits of extracellular amyloid β plaques, intracellular tangles, and neuronal loss. Generally, the toxic metabolites of amyloid β (Aβ) are cleared by astroglial cells during REM sleep. Studies show a bidirectional relationship between sleep and Alzheimer's disease.2 During sleep, Aβ produced in the brain is cleared at a rate about 25-30 percent higher during REM sleep than in the wake state. During SD, however, aggregates of Aβ increase in the hippocampus, precuneus, thalamus, and cortex, inducing structural and functional changes characteristic of Alzheimer's disease development.2 As a result, neurogenesis (the growth of new neurons) is inhibited and cognitive dysfunction ensues. Sleep deprivation also affects cholinergic neurons
(neurons mainly utilizing the neurotransmitter acetylcholine), which are essential in regulating sleep-wake states, memory, and learning. Specifically, SD disrupts the switch between sleep and wakefulness by increasing adenosine levels in the basal forebrain (the major cholinergic output of the central nervous system), thereby inhibiting the cholinergic system.2 Adenosine acts as an inhibitory neurotransmitter that dampens many processes associated with wakefulness. During wakefulness, adenosine levels gradually rise throughout the day, building an accumulation of "sleepiness." When the adenosine threshold is reached, we inevitably fall asleep, and only then do adenosine levels begin to decrease. SD, however, can alter the various pathways promoting sleep-wake cycles, contributing to the formation of neurodegenerative disease pathology. Similarly, Parkinson's disease (PD), a distinctive motor neurodegenerative disease causing rigidity, bradykinesia, and tremors, is driven by an accumulation of lipids and proteins such as α-synuclein, Aβ, TDP-43, and tau. Notably, clinical studies demonstrate improvement in the cognitive and mental ability of PD patients with optimal sleeping patterns.2 In the case of Huntington's disease (HD), a neurodegenerative disease causing motor impairment,
cognitive deterioration, and behavioral problems, diagnosed patients suffer from a faulty gene that damages parts of the brain over time. The disturbed sleep patterns seen in HD patients are linked to neuronal dysfunction, immune dysregulation, and shrinkage in brain volume.2 In studying sleep circuit networks, new tools and techniques involving simultaneous imaging of neuron activity across multiple brain regions and the incorporation of computational analysis have become especially relevant in current neuroscience research. By understanding the implications of sleep loss and revealing the intricacies of the complex neural sleep circuitry, research can open avenues to treatments for various neurological disorders. Specific neural circuit mechanisms must continue to be explored in order to elucidate the relationship between sleep and its effects on Alzheimer's, Parkinson's, Huntington's, autism spectrum disorder, multiple sclerosis, and more. Sleep deprivation plays an apparent role in the alteration of cellular signaling cascades, demonstrating the importance of identifying sleep networks as a foundational focus in developing treatments.
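The adenosine "sleep pressure" mechanism described earlier, in which adenosine builds during wakefulness until a threshold forces sleep and is then cleared, can be captured in a toy simulation. The accumulation rate, clearance rate, and threshold below are arbitrary illustrative values, not physiological measurements:

```python
# Toy model of adenosine-driven sleep pressure: adenosine accumulates each
# waking hour, and once it crosses a threshold the model switches to sleep,
# during which adenosine is cleared. Rates are illustrative, not physiological.

def simulate(hours: int, build: float = 1.0, clear: float = 2.0,
             threshold: float = 16.0) -> list[str]:
    """Return the hour-by-hour state ('wake' or 'sleep') of the model."""
    adenosine, state, history = 0.0, "wake", []
    for _ in range(hours):
        if state == "wake":
            adenosine += build              # sleepiness accumulates
            if adenosine >= threshold:
                state = "sleep"             # threshold reached: fall asleep
        else:
            adenosine = max(0.0, adenosine - clear)
            if adenosine == 0.0:
                state = "wake"              # fully cleared: wake up
        history.append(state)
    return history

day = simulate(24)
print(day.count("wake"), day.count("sleep"))  # 16 waking hours, 8 sleeping
```

Because clearance here runs twice as fast as accumulation, the model settles into a 16:8 wake-to-sleep split over a 24-hour day; sleep deprivation corresponds to interrupting the clearance phase before adenosine returns to baseline.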
[Figure: Conditions linked to sleep deprivation, including stunted learning and cognition, neuropathic pain, Parkinson's disease, multiple sclerosis, autism spectrum disorders, epilepsy, Alzheimer's disease, Huntington's disease, stroke, and glioma.]
DRONES, CAMERAS, AND AI: Firefighters of the Future
Written By Michelle Paolicelli Designed By Jessica Hao
FOREST FIRES, GOOD AND BAD
Huge wildfires across the country and the world have been headlining the news frequently in recent years. But as often as we see and hear about these blazes in the media, their long-lasting effects reach far beyond the headlines. In 2021 alone, 5.7 million acres of forest in the United States have burned, including nearly 1 million acres lost in the Dixie Fire in California.1 Skies have turned to an ominous orange haze, streets have been muted grey with ash, and homes and lives (human, animal, and plant) have been lost. Our capacity to contain these fires is shrinking as the ferocity and frequency of wildfires grow. There is a clear need to shift toward preventative measures so that the damage caused by forest fires can be mitigated despite increased drought, rising temperatures, and other conditions favorable for ignition.
"Good fires" refer to a technique used to proactively clear out overgrown sections of forest before they can contribute to the uncontrollable spread of wildfires.2 Historically, these intentional, controlled burns have been successful in mitigating the damage caused by wildfires, but recently they have become a point of contention. Because of the number of wildfires active at any given time, there are not enough resources to continue these preventative measures while also combating the forest fires already spreading destruction. This leads to a vicious cycle of new fires igniting and then spreading uncontrollably through overgrown forests. The goal of "good fires" is to create
a fire break that lessens the likelihood of uncontrollable spread. This practice has been around for centuries, but it is clear that a more technologically advanced solution is needed to combat the volume and ferocity of present-day wildfires.3

Most people are familiar with the image of planes flying over actively burning forests and dumping large clouds of extinguishing foam onto glowing trees. Different extinguishing foams are used, but one of the most common types is Class A foam concentrate. The name comes from the foam's ability to penetrate Class A fuels (ordinary combustibles like wood) while giving water a foaming ability.4 Class A foams gradually release the moisture contained within them once deposited and absorb heat more efficiently than plain water.5 Although the material properties of Class A foams are favorable for fire suppression, their release is untargeted and often not overwhelming enough to fully extinguish the flames. Additionally, foam is typically deployed only once fires are intense and already widespread.6 Furthermore, Class A foam has potentially harmful environmental effects, making it less than ideal as a long-term, sustainable solution to forest fires.
fall 2021 | PENNSCIENCE JOURNAL 25
ADVANCES IN FIREFIGHTING TECHNOLOGIES

Some alternatives to traditional foam-based fire suppression for preventing, detecting, and extinguishing wildfires include extensive camera networks, drones, algorithmic smoke detection, and artificial intelligence.7-11 Rain Industries, a startup based on the West Coast, uses a combination of machine learning and drone technology to detect and extinguish fires in their infancy, faster than humans can physically reach the often remote locations where fires ignite.7 This technology relies on computer vision to detect an active fire without human intervention. Once a fire has been detected, a drone stocked with compressed foam autonomously takes off, flies to the coordinates of the blaze, and releases bombs that detonate to disperse extinguishing material. Rain Industries aims to eliminate the need for bulk release of Class A foam from manned planes by stopping fires before they span hundreds of acres.
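The detect-then-dispatch loop described above can be sketched as a simple pipeline. Everything here is hypothetical (the confidence threshold, the names, and the data), intended only to show the control flow from a camera detection to a drone sortie:

```python
# Hypothetical sketch of a wildfire detect-and-dispatch pipeline:
# camera frames are scored by a vision model, and any detection above a
# confidence threshold triggers an autonomous drone sortie to that location.
from dataclasses import dataclass

@dataclass
class Detection:
    lat: float
    lon: float
    confidence: float  # vision model's belief that smoke/fire is present

def dispatch_drones(detections: list[Detection],
                    threshold: float = 0.8) -> list[tuple[float, float]]:
    """Return the coordinates a suppression drone is sent to."""
    sorties = []
    for d in detections:
        if d.confidence >= threshold:   # ignore low-confidence smoke scores
            sorties.append((d.lat, d.lon))
    return sorties

frames = [
    Detection(39.80, -121.40, 0.95),  # strong smoke signature -> dispatch
    Detection(39.81, -121.38, 0.30),  # likely fog or dust -> no action
]
print(dispatch_drones(frames))  # [(39.8, -121.4)]
```

The threshold is the key operational knob: set it too low and drones chase fog banks; set it too high and a young fire gets the head start the whole system exists to deny it.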
Rain Industries' primary focus is the development of the drone technology and computer vision needed to target a fire once the drone is in the air. To become aware of active forest fires, the company taps into an already established network of cameras and sensors. The result of a consortium among the University of Nevada, Reno, the University of California San Diego, and the University of Oregon, ALERTWildfire provides a network of Pan-Tilt-Zoom (PTZ) cameras to Rain Industries as well as to firefighters and first responders on the ground.8 For both drones and humans, these cameras enhance situational awareness by providing information on the location and ferocity of fires. This information informs the actions of responders and allows for continuous monitoring of fires throughout evacuation, rescue, and extinguishing efforts. ALERTWildfire has been actively used during wildfires in California and Nevada, and the network and reach of its work continue to expand. A primary goal of the project is to install 1,000 cameras in California alone by 2022, creating a true statewide network. Another indicator of fire monitored by Rain Industries is ITRON smart meters. These electric meters serve as monitoring devices for when power lines go down.12 Since the path of power lines is predetermined and easily mapped, immediate knowledge of downed lines can help direct resources to the front of a fire as well as its potential path.
COLLABORATION IMPROVES EFFECTIVENESS

The use of ALERTWildfire's extensive camera network by companies like Rain Industries and by first responders would not be possible without integrated artificial intelligence (AI). Machine learning and AI provided by the Korea-based company Alchera make it possible for the PTZ cameras to detect smoke and fire in real time, minimizing the delay between ignition and response.10 While ALERTWildfire provides the physical network of cameras that makes it possible to see deep into hundreds of acres of forest, Alchera has, since 2019, made these cameras "smart" by processing images at a rate not possible for humans. The final piece, fire suppression, is traditionally handled by firefighters on the ground. More recently, startups like Rain Industries have been challenging the norm to improve success rates. ALERTWildfire and Alchera have been working in tandem for some time, while Rain Industries is new to the scene. This new opportunity for collaboration offers a comprehensive solution to fighting forest fires, from detection to suppression.
Technologies and strategies currently pursued by Rain Industries, ALERTWildfire, Alchera, ITRON, and other groups indicate a promising future for combatting wildfires. Work still needs to be done to make these mostly early-stage products and services widely available. Additionally, the coalescence of these technologies into a unified front against forest fires has the potential to save homes, native flora and fauna, and human lives. Likewise, serious efforts need to be made to stop and reduce climate change so that the conditions ideal for fires, such as high temperatures and extensive droughts, are minimized. The future of tools to fight wildfires is bright, but a future without the need to fight these fires at all is even more so.

REFERENCES
FDA and Drug Safety
Written By Kevin Guo Designed By Avi Singh
In 1957, the family-owned German pharmaceutical company Chemie Grunenthal discovered a sedative analogue known as thalidomide. The company realized its effectiveness in curbing morning sickness symptoms like vomiting, nausea, and headaches. The company immediately mass-produced the compound and marketed it in Europe under the brand name Contergan as a “wonder drug” that could cure many ailments. However, people eventually realized that women taking thalidomide often gave birth to children with birth defects. More than 10,000 infants were affected with physical defects such as polydactyly, with 2,000 succumbing to their physiological disorders.1
When Chemie Grunenthal sought approval from the FDA, the official in charge of FDA drug review, Frances Oldham Kelsey, ran independent tests for drug safety and deemed thalidomide unsafe for pregnant women. She subsequently prevented the suffering and deaths of thousands of infants in the U.S. and was awarded the President’s Award for Distinguished Federal Civilian Service. In 1961, not long after Kelsey’s life-saving decision, the drug was recalled from the market due to mounting public pressure regarding its effects.2,3

In order to prevent any future accidents such as the thalidomide incident, the FDA must maintain its long history of ensuring drug and public health safety in the U.S. Many regulations set by the FDA have arisen as a result of public health crises involving unsafe drugs that had not been tested or traced for toxicity in the general public. Today, the FDA follows
a strict drug development process with many checkpoints to ensure a drug is safe for public use.4 After a new drug or compound is discovered in the lab, it joins the pool of thousands of other drug candidates that annually apply to reach the next step in the discovery process. Approximately 32 percent of the candidates are then evaluated and compared on several criteria, such as dosage, mechanisms of action, and effectiveness.4 Preclinical research is then performed to determine toxicity in humans using in vitro (mechanism at the cellular level) and in vivo (how the compound affects a multicellular organism) experiments. If deemed suitable for testing in humans, the drug candidate is then used in carefully designed clinical studies to investigate its impact on the general population.4 Clinical trials occur in three phases, with each requiring more time and participants. The first phase, which 70% of drugs pass, tests safety and dosage in healthy participants. After proof of safety, the drug candidates are evaluated for efficacy and side effects, with 33% moving to the third phase. In the third phase, where 25-30% of drugs are approved, the FDA and drug developers monitor any adverse reactions to the drug. A fourth phase may be
run to test safety and efficacy in a larger participant population compared to previous phases.4 If a clinical trial is successful and the drug has sufficient evidence for its safety and efficacy, the drug developer can submit a New Drug Application (NDA) to market the drug. The NDA details labeling, safety updates, potential side effects (e.g., drug addiction), directions for use, and patient information. If the FDA considers the NDA complete and there is no evidence of fabrication or manipulation of data, the drug can be approved for commercial use. However, the FDA must still work closely with the drug developer to specify the labeling (e.g., contraindications, potential side effects) and resolve previous issues with existing data.4

The general public and the scientific community often debate the decision-making capacity of FDA approval for drug safety and efficacy. For example, shortcuts in the development process such as Emergency Use Authorizations (EUAs) allow certain drugs and vaccines to be approved on an accelerated timeline.4 While some EUAs, such as those for the Pfizer-BioNTech COVID-19 vaccine, have been successful in preventing hospitalizations and deaths among the public, others, such as that for Remdesivir, have had little benefit for public health and safety.5 Furthermore, those informed on the policy decision-making of the FDA’s approval process take issue with its assumption that surrogate endpoints inform clinical outcomes, in effect treating correlation as causation. For example, the FDA has recently been embroiled in controversy for approving Aducanumab (a drug intended to treat Alzheimer’s disease) based on the reduction of β-amyloid levels in patients. The FDA had never explicitly confirmed
that such levels could serve directly as a clinical outcome; rather, it used β-amyloid levels as a surrogate endpoint. Furthermore, according to the FDA’s statistical review, β-amyloid changes correlated little with cognitive improvements, suggesting that β-amyloid levels have not changed patient recovery significantly. The FDA’s reliance on surrogate endpoints to inform clinical decisions despite an inconsistent relationship with clinical outcomes is a major cause of concern for the public and scientists alike.2,6
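The phase pass rates quoted earlier compound multiplicatively. The short Python sketch below is purely illustrative (the 27% figure is simply the midpoint of the article's 25-30% range for the third phase; real attrition varies by drug class and era):

```python
# Illustrative only: compounding the approximate pass rates quoted in the
# article to estimate how many candidates survive all three clinical phases.
phase_pass_rates = [
    ("Phase 1: safety and dosage", 0.70),
    ("Phase 2: efficacy and side effects", 0.33),
    ("Phase 3: adverse-reaction monitoring", 0.27),  # midpoint of 25-30%
]

overall = 1.0
for name, rate in phase_pass_rates:
    overall *= rate  # fraction of Phase 1 entrants still in the pipeline
    print(f"{name}: {rate:.0%} pass, {overall:.1%} of entrants remain")
```

Under these assumed rates, only about 6% of candidates entering Phase 1 would clear all three phases, which is consistent with the article's point that very few discovered compounds ever reach an NDA.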
A non-medical, yet nonetheless major, issue that often concerns the public is the FDA’s potential political influence by third parties, or regulatory capture. Regulatory capture is defined as a shift in the priorities of regulatory agencies from protecting public-health interests towards industrial commercial interests.9 This is especially pertinent given an FDA officer’s duty in negotiating with scientists, physicians, legal experts, and other people involved with marketing and selling the drug. Without governmental oversight or stringent guidelines for preventing regulatory capture, the FDA risks being controlled by pharmaceutical companies instead of by the need for improving medicine and public health.

Even in less contemporary history, the FDA has been criticized for delaying the time it takes to approve certain drugs. After the thalidomide crisis, the 1962 Kefauver Harris Amendment was passed to require proof of efficacy of the drug in addition to safety, which was much more difficult and time-consuming than only requiring safety. This significantly increased drug approval time from seven months to seven years between 1962 and 1998. This led to the delay in the approval of the first life-saving HIV drug AZT in the 1980s, as well as the frustration and even protests of many HIV/AIDS activist groups. The FDA had to respond immediately by expediting the approval process for the drug so that it was approved in only two years.7,8

Regardless of the current state of the drug approval process, it is essential that the FDA continues to promote public health and guide drug developers in making more effective and safe drugs that will help those afflicted by debilitating diseases such as cancer and Alzheimer’s disease. To avoid future mistakes in drug safety and effectiveness, the FDA must constantly adapt to new technologies that revolutionize medicine and improve how drugs are being approved.
REFERENCES
Environmental Cleanup
Written By Benjamin Beyer Designed By Phuong Ngo
Throughout the past few decades, instances of environmental disasters and pollution have dramatically increased. Even in recent weeks, an oil leak occurred off the coast of southern California. According to the New York Times, 126,000 gallons of oil leaked out of a pipe in the Pacific Ocean.1 This environmental disaster poses a vast danger to Pacific flora and fauna, and it prevents people from using southern Californian beaches for recreation and for work. On October 11, 2021, the Associated Press reported that some beaches were reopening, though they remained closed for fishermen and other marine-based trades.2 While the California oil spill halted beach life for a few weeks, caused ecological damage, and incurred economic losses to oil companies and coastal communities alike, this spill was not of the same magnitude as others that have occurred in recent years. The Deepwater Horizon oil spill was one of the most catastrophic ecological disasters of the twenty-first century due to the volume of oil spilled off the Gulf Coast, causing many of the same issues as the California oil spill to a much greater extent. In addition to these instances of marine pollution, recent years have also shown a drastic increase in smog and air pollution rates. Air pollution and smog have been linked with heart disease, cancers, and many respiratory diseases, along with being detrimental to the environment. Smog can often irritate the lungs, nose, and eyes, and can also exacerbate respiratory and circulatory issues, or
can cause these issues in perfectly healthy individuals. In Philadelphia, the American Lung Association found that ozone smog has continually worsened.3 As society continues to innovate in driving and other technologies that contribute to these kinds of pollution, we will continue to experience drastic increases in pollution rates worldwide. Although pollution continues to increase alarmingly, there have been innovations that can help us clean up these disasters. Crude oil is made up of hydrocarbons with many impurities of nitrogen, sulfur, and oxygen, and its density is often less than that of water. Thus, crude oil often floats on top of water when spilled.4 When this oil is spilled, there are a few ways by which we can clean it up. One commonly employed method is chemical dispersants. In this method, planes and boats distribute dispersants over a large area. Among the many different types of available dispersants, Corexit is one of the more common brands.5 Corexit is made of 1,2-propanediol and 2-butoxyethanol along with other acidic salts.6 These chemicals work to disperse the oil into the ocean as opposed to letting it wash onto beaches. Similar chemicals can also be released underwater. Critics, however, assert that these chemicals can cause more pollution and can harm marine life. Both of these types of pollution control were copiously used during 2010’s Deepwater Horizon incident. Other types of non-chemical cleanup have
also been proposed, like using skimmers—boats that skim oil from the water’s surface—to collect the concentrated oil. While this is a good method to help clean up oil without introducing other chemicals into the system, the scope of skimmers is very limited by the types of debris present and by weather conditions.7 Another commonly employed method is that of sorbent materials, where the oil is absorbed into a material which can then be removed from the water. This method is also difficult to apply on a large scale and is adversely affected by weather.8 While many of these technologies were invented in recent years, more recent oil spills have put new and emergent technologies to the test.

While many solutions to oil spills are responsive, many methods used to combat smog and air pollution are preventative. Some are legislative; for example, in the United States, the government has passed the Clean Air Act, which aims to prevent pollution through measures like emissions reduction and further industrial regulation.9 Air purifiers are one commonly employed way to filter pollutants out of the air. While common air purifiers are often made to the scale of small homes and rooms, large air purifiers have been built on the scale of cities and towns. For example, in the Chinese city of Xi’an, a 200-foot tower was constructed to reduce air pollution and smog in the city.10 This plant collects energy from solar heat and then uses this energy to purify the air entering the tower, and it has yielded promising results.11 While many methods have been created to clean up oil spills, governments and organizations have often taken a much more preventative approach to cleaning air pollution, relying on legislation as opposed to proactive cleanup methods. As rates of air pollution and instances of oil spills have increased in the past decades, many cleanup measures have made progress in environmental cleanup, yet pollution continues to be a pervasive issue throughout the world.
Many new and emerging technologies for pollution cleanup focus on improving and bolstering current approaches.12 As our societies continue to grow, these issues of pollution will only grow in their effect on human communities. Thus, as we continue to battle pollution, we must continue to adapt our technologies to clean our planet for years to come.
REFERENCES
The 3 Parent Baby: New Methods for Mitochondrial Replacement Therapy
Written By Emily Ng Designed By Saraswati Sridhar
Besides being the powerhouse of the cell, the mitochondrion and recent studies of its extranuclear genome point to the possibility of revolutionizing gene therapies. Mitochondrial replacement therapy (MRT) is a new experimental form of reproductive in vitro fertilization (IVF) involving the replacement of mutated mitochondrial DNA with that of a healthy donor.4 New processes in enacting this concept may be capable of preventing the transfer of mitochondrial diseases from mother to offspring.
THE IMPORTANCE OF MITOCHONDRIA

Given its primary role in facilitating aerobic respiration, the mitochondrion is an organelle essential to the production of energy in cells. What distinguishes mitochondria from all organelles besides the nucleus, however, is the presence of their own DNA, called mitochondrial DNA (mtDNA). Mitochondria are also found abundantly in oocytes, the immature ova of the mother, and support oocyte development.6 As such, mitochondrial DNA is inherited only from the mother.2 This implies that mitochondrial replacement therapy is specifically targeted to biological mothers whose
mitochondrial DNA might be at risk of transferring debilitating diseases. Such information also supports requiring a female donor’s mitochondrial DNA for mitochondrial replacement therapy.
MITOCHONDRIAL DISEASES

Many inherited metabolic diseases are due to mutations in mitochondrial DNA. Mitochondrial diseases are responsible for deterioration in function among major organs, such as the brain, heart, and lungs.4 The severity of each disease depends on the particular mutation and its heteroplasmy level, which refers to the ratio of healthy to mutated molecules in each affected cell.5 Since the 1980s, over 250 large deletions or point mutations in mtDNA have been recognized as causes of disease.2 To make matters worse, all mitochondrial diseases are incurable.3 Only a handful even respond to currently known treatments.4 Because mitochondrial DNA is highly unstable, it has proven difficult to find the numerous mutations that cause mitochondrial diseases even with various techniques, further hindering the discovery of potential cures.4
MATERNAL SPINDLE TRANSFER VS. PRONUCLEAR TRANSFER

Recent methods of mitochondrial replacement therapy, however, seek to prevent the development of mitochondrial diseases prior to conception by replacing the mitochondrial DNA of the mother. Currently, there are two popular methods of mitochondrial replacement therapy – maternal spindle transfer and pronuclear transfer. Both involve the transfer of the biological mother’s nuclear DNA to the oocyte of a female donor.

Maternal spindle transfer (MST) involves the transfer of a biological mother’s nucleus from her mature oocyte to a healthy female donor’s enucleated oocyte. An enucleated oocyte refers to an immature ovum whose nucleus is removed.2 This approach occurs during metaphase II, in which nuclear DNA forms a meiotic spindle by assembling metaphase chromosomes.5 The spindle is separated from the mature oocyte and transferred to the cytoplasm of the donor’s enucleated and unfertilized oocyte. This is followed by fertilization with the biological father’s sperm in the reconstructed oocyte.2

In 2009, MST was initially studied in primates, but recent examinations have primarily focused on humans bearing pathogenic mutations. Such extensive studies have allowed MST to be considered the foremost of techniques regarding MRT. Metaphase II is an important stage during MST; the chromosomes are lined up along the metaphase plate with guidance from spindle fibers as they proceed to be separated in the next stage. Additionally, the complex lacks mitochondria, which minimizes the risk of carrying mutated mtDNA during MST. Issues inhibiting the success of MST have been resolved in recent years as well. These important milestones include the creation of a technique for transferring the spindles with the use of a laser objective.5

The alternative technique is pronuclear transfer (PNT), whose process is quite similar to that of MST. PNT occurs during the embryo or zygote stage, in which both the biological mother’s and female donor’s oocytes have been fertilized by the biological father’s sperm.2 A pronucleus, which, in this case, is the haploid nucleus of the egg, forms in each egg cell after fertilization is complete. The pronucleus from the donor’s egg cell is replaced with the pronucleus from the biological mother’s egg cell.4

[Figure: The mother’s egg with faulty mitochondrial DNA is fertilized with the father’s sperm in the lab; the parents’ genes are transferred into a donor egg with healthy mitochondrial DNA whose own genes have been removed; the reconstructed embryo is implanted back into the mother, resulting in three genetic parents.]

Both methods result in healthy mitochondrial DNA for the child, eliminating major opportunities for mitochondrial diseases to develop. Given how recent these findings are, however, there is no guarantee that such techniques are effective, thus leading to speculation about ethical concerns in regards to wasting zygotes in the process and the act of manipulating germ cells as a whole.
ETHICAL CONCERNS SURROUNDING MRT

Several concerns about the regulation of mitochondrial replacement therapy have emerged. A primary matter is the need to use human embryos in experimentation. This debate was first introduced after the birth of the first baby conceived through IVF in the UK, when the Warnock Committee began to reevaluate embryology and fertilization, including whether embryo research should be allowed.1 The UK passed regulation in 2015 (the Human Fertilisation and Embryology (Mitochondrial Donation) Regulations), becoming the first country to regulate mitochondrial transfer.3 To this day, many arguments continue to concentrate on the embryo’s moral status.1

There is also discourse about the identity of infants resulting from mitochondrial replacement therapy. A few legal issues have highlighted how children born after these therapies would technically have genetic connections with three parents—mother, father, and donor. Opponents of this characterization have argued that sequence variation between mitochondrial haplotypes consists of only a few amino acid substitutions. Utilizing donors with the same mitochondrial haplotype can significantly reduce mtDNA differences between the donor and the mother.2
Mitochondrial replacement therapy, though expanded through advancements in its various methods, still remains far from its potential scope in eliminating applicable diseases. More methods beyond MST and PNT are currently undergoing extensive research, and ethical issues continue to hamper progress. Nonetheless, this field of study is an exciting venture that scientists are continuing to expand, explore, and experiment with.
REFERENCES
MST is frequently cited as being more ethically acceptable than PNT, since PNT requires the destruction of fertilized zygotes rather than unfertilized gametes. Thus, this field of study has experienced, and will continue to experience, intense discourse about the ethics behind the contested processes within mitochondrial replacement therapy.5
For this reason, it has been suggested that mitochondrial replacement therapy be applied only to male embryos. Given that mtDNA inheritance is exclusively maternal, female embryos are highly susceptible to passing mitochondrial diseases down to future generations. Such an idea was rejected in the UK but accepted in the US. However, numerous questions about discarding female embryos and requiring parents to be involved in sex selection are still debated today.5
Who are you?
Written By Brian Lee Designed By Bianca Vama

Countless philosophers, anthropologists, and historians alike have posited upon what constitutes a person’s identity: what makes someone who they are and what makes them different from someone else. It may come as a surprise, then, to see how machines, tech companies, and hospital systems can use a handful of algorithms and logic rules to reduce your experiences and biology down to a core set of numbers, signals, and files. Such a reduction of someone’s physical existence and health can be extremely powerful and insightful – and thus, dangerous in the wrong hands. Biomedical and biological data – ranging from a copy of your blood test results to a genetic profile to a scan of your fingerprint – can reveal a surprising amount about your habits, fears, and even your future. Therefore, it is important to understand how our data is kept safe, how our identities can be dissociated from biomedical research data, and how the legal system reacts to a breach in case current safeguards fail.

Encryption and Limiting Access

First and foremost, we need to understand how the bytes that make up our electronic health records, track variations in our genome, and encode our eye color are kept secure.

Encryption is fundamentally a very simple process: the ancient Romans have been credited with creating some of the first widely known and understood encryption schemes, colloquially referred to as Caesar ciphers. Essentially, one of Julius Caesar’s officers or trusted advisors would use a shared, secret, and reversible process (such as shifting all letters in a message by a predefined amount, for instance changing an ‘a’ to a ‘b’) to transform a readable message – such as battle plans – into a series of characters unintelligible to an adversary. Having such a scheme would allow parties to transmit the message in public without worrying too much about prying eyes due to the hidden secret. Although modern parties no longer use the Caesar cipher or its relatives due to their simplicity, the purpose of encryption has remained the same. Today, modern algorithms for encrypting data, including biomedical data used in research or for identification, are extremely complex: computationally expensive processes involving multiplication between large primes, establishing secure protocols by which information is transferred (like HTTPS), and ensuring a sufficient amount of entropy is involved in calculating encryption keys.

In addition to employing multiple encryption schemes, there are also strict technological safeguards that guarantee biomedical and biometric data is only shared if absolutely necessary. On an individual consumer level, data such as your fingerprint or the FaceID scan that unlocks your iPhone is often protected and private by design; your encrypted data is stored on a separate disk, making it significantly more difficult for an adversary to gain an image of your face or fingerprint without your consent. Research institutions may have a similar infrastructure. For instance, biomedical research data is only stored on approved computer clusters and devices to ensure proper use.

These technical safeguards aren’t the only way institutions and companies keep data safe. In the research world, there are also extensive processes involving review by scientists and ethicists to ensure the use of data is appropriate, necessary,
0101010110101010010101010101010101010010101010100101010001010101010 0101010110101010010101010101010101010010101010100101010001010101010 Features and minimizes harm. Almost all research projects or investigations involving human health data must 0101010110101010010101010101010101010010101010100101010001010101010 be approved by an Institutional Review Board (IRB) 0101010110101010010101010101010101010010101010100101010001010101010 consisting of a diverse range of individuals, including scientists, clinicians, and members of the public. The 0101010110101010010101010101010101010010101010100101010001010101010 IRB plays a significant role in encouraging scientists to elaborate on the impact of their research on the 0101010110101010010101010101010101010010101010100101010001010101010 patients, ensure their use of data is justified, and minimize risks to patients by limiting who has access 0101010110101010010101010101010101010010101010100101010001010101010 to the data. In having a large number of conversations and outlining the rules of accessing and utilizing data, 0101010110101010010101010101010101010010101010100101010001010101010 an IRB can serve as an additional layer of security to ensure that private, sensitive medical data is not 0101010110101010010101010101010101010010101010100101010001010101010 mishandled or shared publicly. images, even when all other personal information has been removed. 
Anonymity

In addition to the safeguards used to ensure only the right people see your biomedical data, it is important (particularly in genetic studies) that released research data from your patient file cannot be traced back to you. Some believe that everyone has a right to privacy, and many people may not want to publicly share certain diagnoses or test results with the world. Therefore, companies and the research community at large perform several steps to dissociate pieces of medical data from a living human being.

Genetic data has traditionally taken a forefront in this challenge, given its potential to predict multiple aspects of a person's biology, ranging from eye color to lifespan. Currently, efforts to anonymize data have included the "stripping" of demographic data, including a patient's age, gender, and race. However, with increasingly powerful computers and increasingly efficient algorithms, removing aspects of demographic data is not always enough, as previous studies have unfortunately shown that publicly available information can be leveraged to identify individuals who have taken part in large, public genetic surveys. As a result, many large organizations sharing genetic data (such as the NIH) have shifted to a limited-access model necessitating the manual evaluation of each researcher requesting data access.

Anonymizing imaging data has also been of significant interest in recent years. With improvements in scan quality and facial recognition software, participants can be identified by a 3D render of their face. To prevent this, researchers may use one of several software tools to remove facial features before data are shared or openly released.9 However, as with genetic data, although many of these techniques are excellent today, further research and technological advancements may make today's efforts obsolete. Therefore, research institutions continue to keep imaging data under lock and key and employ a limited-access model that ensures researchers only see and analyze data when absolutely necessary.

Through measures such as complex encryption schemes, secure hardware disks, institutional review limiting access to data, and imaging anonymization protocols, scientists and engineers employ a number of methods to keep personally identifying biomedical data safe. Although much work has been done so far, no single measure can guarantee full safety or anonymity. Therefore, researchers continue to take advantage of the newest advances and findings in computer science, biology, mathematics, and related fields to keep our data safe, allowing us to continue our daily lives and allowing researchers to continue examining data to improve healthcare.

fall 2021 | PENNSCIENCE JOURNAL 37
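As a toy illustration of what "stripping" demographic data can look like in practice — removing direct identifiers and coarsening quasi-identifiers so records are harder to re-link to public data — here is a minimal sketch. The field names and generalization rules are hypothetical, not any institution's actual pipeline.

```python
# Hypothetical record de-identification sketch: direct identifiers are
# dropped entirely, while quasi-identifiers (age, ZIP code) are generalized
# so that records are harder to re-link to publicly available data.

DIRECT_IDENTIFIERS = {"name", "mrn", "date_of_birth"}

def deidentify(record: dict) -> dict:
    """Return a copy of `record` with direct identifiers removed and
    quasi-identifiers coarsened."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "age" in out:                      # 37 -> "30-39"
        decade = (out["age"] // 10) * 10
        out["age"] = f"{decade}-{decade + 9}"
    if "zip" in out:                      # "19104" -> "191**"
        out["zip"] = out["zip"][:3] + "**"
    return out

patient = {"name": "Jane Doe", "mrn": "12345", "age": 37,
           "zip": "19104", "diagnosis": "T2D"}
print(deidentify(patient))
# {'age': '30-39', 'zip': '191**', 'diagnosis': 'T2D'}
```

As the article notes, this kind of stripping alone is often not enough — which is exactly why limited-access models now supplement it.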
REFERENCES
Synthetic Microbes: A Future for Cancer Drug Delivery
Written By Beatrice Han
Designed By Phuong Ngo

Many synthetic biology labs
have dedicated themselves to pioneering cancer treatments. Although discovering new therapeutic compounds is exciting and expensive, the most pressing issues lie elsewhere. There is an ample supply of potential new drugs, but options to deliver those drugs to their intended targets remain scarce. This shortage has severe consequences, since optimizing drug delivery is just as crucial as the development of the compound itself: safe and effective drugs must localize at tumors while interacting minimally with healthy tissue. To tackle these key challenges, pharmaceutical companies are developing drug delivery systems – which ideally shield drugs during transport through the body before releasing them at the desired destination – in parallel with pharmacological compounds.1
Standard chemotherapies, the most ubiquitous cancer treatments, bombard the body with small cytotoxic – or cell-killing – molecules to destroy tumor cells. But because these therapies are administered systemically (through the bloodstream) with minimal targeting mechanisms, they trigger side effects throughout the body.3,4

So why not simply direct drugs to cancer cells? Targeting tumors is difficult: tumor microenvironments lack a sufficient supply of blood vessels, which renders them hypoxic, or oxygen-poor, allowing them to escape elimination by the immune system. Furthermore, the high interstitial pressure between tumor cells hinders movement of therapeutic compounds into affected tissues.3,4 Complications arise even if therapeutic molecules successfully reach their target system: the majority of the drug diffuses all over the body, creating massive off-target effects.5,6 Although newer treatments – including monoclonal antibodies – display advancements in targeting ability, they struggle to accumulate at tumor sites and still create "on-target off-tumor toxicity": toxicity that emerges in the correct area but targets healthy cells in place of cancerous ones.2 Thus, scientists are investigating new options for cancer treatment. One of the more radical options: synthetic microbes.

While it seems counterintuitive to use pathogens to combat disease, the concept of exploiting the microbiome for therapeutic purposes is not novel. In 1891, surgeon William B. Coley went as far as to infect his patients with Streptococcus bacteria, theorizing that the pathogens would activate the immune system against cancer.3 In doing so, Coley pioneered the earliest version of immunotherapy. His patients saw regression of tumors that would have otherwise been inoperable.7

However, injecting cancer patients with bacteria did not become a standard method of treatment. Instead, advances in genetic engineering turned bacteria into factories for producing therapeutic proteins to treat diseases, including cancer.8 Grown in large bioreactors (with trillions of cells per milliliter), bacteria can be programmed to make large amounts of recombinant therapeutic proteins by editing their genetic code. As researchers looked for alternatives to the harsh synthetic drugs formerly prescribed to treat disease, these new bacteria-produced proteins, which could only be derived from engineered living cells, were revolutionary. Named "biotherapeutics," they became targets of mass production by pharmaceutical companies.9 For instance, the earliest recombinant protein to be commercialized was bacteria-produced insulin, first synthesized in the 1980s.10 Today, recombinant proteins can be found in hormone therapies, blood clotting factors, and anti-inflammatory molecules.11 As the most rapidly growing field for treating complex diseases like cancer, biotherapeutics are often preferred because their basis in natural proteins suits them to functioning within a host body. 400 recombinant protein-based therapies are currently on the market, and 1300 other protein candidates are under development.12 As many as 30% of the host cells associated with commercially available biotherapeutics are bacterial.13

But what about the possibility of putting bacterial therapeutic factories inside patients? Synthetic microbes contain complete cellular machinery and can thus respond to extracellular signals while self-replicating, presenting an avenue to control their localization and therapeutic production.2,14 Although eukaryotic therapeutic vehicles like human macrophages or T-cells could be created from the same engineering principles, prokaryotic bacteria have unique benefits as vehicles for drug delivery: they are better at adapting to changing conditions in the human body, and they are more cost-effective to produce.2

There are multiple steps involved in creating synthetic microbes. Like biotherapeutics, synthetic microbes begin with a gene engineered to encode a therapeutic protein. After the gene is fused with regulatory sequences to control its expression level, it is placed in a vector – usually a bacterial plasmid – and inserted into a bacterial cell.13 Bacterial host systems are generally the best choice for recombinant proteins because their genetic mechanisms are better elucidated than those of eukaryotic cells; however, although simpler and easier to control, bacteria sometimes lack the machinery to produce complicated human proteins that need extensive folding or chemical modification to function.15,17

In addition to a modified genome, synthetic microbes must be equipped with a system that allows deposition of therapeutic payloads at the correct locations.18 In the case of tumors, many strains of anaerobic bacteria – bacteria that cannot survive in the presence of oxygen – preferentially accumulate within the oxygen-poor tumor microenvironment. However, because unmodified bacteria inconsistently colonize tumor cells in clinical trials, further optimization is necessary to create synthetic microbes that reliably localize to tumors.19 To enhance the targeting abilities of synthetic microbes, researchers have engineered bacteria to express receptor proteins on their outer membranes that target antigens specifically found on cancer cells. While this is a common route for most cancer drug delivery, more success has been found in expressing proteins that recognize and bind to healthy cells throughout the body but are only expressed in the hypoxic environments where cancerous tissue is present. So although the receptors can bind to any tissue, they are only activated upon reaching a tumor site.20 This model has been validated in in vivo animal trials.
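The hypoxia-gated targeting logic described above can be sketched in a few lines: the receptor gene sits behind a promoter that fires only when oxygen is low, so binding capability switches on only near hypoxic tumor tissue. The threshold value and function names below are illustrative assumptions, not parameters from the cited work.

```python
# Toy model of a hypoxia-gated targeting circuit: the surface receptor
# could bind any tissue, but it is only *expressed* in the oxygen-poor
# tumor microenvironment. Threshold is a made-up illustrative value.

HYPOXIA_THRESHOLD = 0.05   # O2 fraction below which the promoter is induced

def receptor_expressed(o2_fraction: float) -> bool:
    """The promoter driving the receptor gene fires only under hypoxia."""
    return o2_fraction < HYPOXIA_THRESHOLD

def can_bind(o2_fraction: float) -> bool:
    # Binding requires the receptor to have been made at all, so the
    # circuit is effectively gated on local oxygen level.
    return receptor_expressed(o2_fraction)

print(can_bind(0.21))   # well-oxygenated healthy tissue -> False
print(can_bind(0.01))   # hypoxic tumor core -> True
```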
After mice were injected with a strain of anaerobic bacteria engineered to produce a receptor-binding surface protein only under hypoxic conditions, a reduction in tumor size was observed.21 Finally, synthetic microbes require some mechanism to release their therapeutic compounds in a controlled manner. To achieve this control, scientists have taken advantage of bacterial quorum sensing systems, which link gene expression to bacterial population density.22 In one animal model, researchers both engineered bacteria to express an antitumor molecule and programmed a "kill switch" that caused synchronized cell lysis – bursting the cell membranes and killing the bacteria – once a certain threshold population density was reached. This kept the bacterial population in check and minimized the immune response to the pathogens.23
Features
Despite the extent of preclinical work involving synthetic microbes, there remain relatively few published clinical trials involving this technology. Those that exist show a wide discrepancy from the success of preclinical models: the first engineered bacterial strain for cancer therapeutics, tested in patients with metastatic melanoma and metastatic renal carcinoma, colonized tumor cells but failed to reduce tumor size.24 Though a more recent clinical trial involving Clostridium bacteria successfully induced the regression of an advanced bone tumor, that study analyzed data from only a single patient.25 Despite these tepid results, researchers have continued to launch studies; there are over 40 clinical trials underway investigating synthetic microbes.14

One of the critical barriers to clinical application of synthetic microbes is the immune response that the microbes, as foreign pathogens, might induce. Researchers have attempted to address this issue by choosing microbial hosts that are already prevalent in the body (such as E. coli strains commonly found in the human gut) or that induce little hostile response from the host, such as Lactobacillus lactis, which cannot invade mammalian tissues or cause infection.16,26 Furthermore, the immunogenicity of engineered microbes can be attenuated by deleting the gene clusters responsible for making toxic molecules that might provoke an immune response.3 Yet problems remain, as any component of a synthetic microbe might induce an immune response.3 Thus, scientists must develop mechanisms that not only eliminate the microbes when they are no longer producing the therapeutic but also destroy the internal material released from the dead cells. The development of such mechanisms remains challenging.

Synthetic microbes represent a promising new technology that may address many of the issues currently plaguing drug delivery. Yet the failure to translate the results of preclinical models into human patients, combined with the difficulty of neutralizing microbe-host interactions, has hampered the approval of microbial therapeutics. Further investigation is needed before synthetic microbes can be considered a viable clinical option. But if synthetic microbes prove successful in clinical trials, they could represent the future of care – not just for victims of cancer, but for all patients who depend on toxic and unpredictable drugs.

REFERENCES
MONKEYS & MORE: LAB ANIMAL SAFETY AT PENN
INTERVIEW BY Cynthia Schneider
DESIGNED BY Amara Okafor

Animals
have been used, and sometimes mistreated, in research for decades. One example is Ivan Pavlov and his studies on dogs in relation to classical conditioning, a psychological process of learning through the association of a natural stimulus (dogs naturally liking food) with a neutral one (a bell). Another case is that of Albert, a rhesus monkey sent to space by the US in 1949, who died of suffocation. Subsequently, five other monkeys (Albert II, III, etc.) were sent to space and also died of other complications, all to understand the effects of space travel on the body.1 In 1965, a Sports Illustrated article was written about a farmer's Dalmatian, Pepper, who was kidnapped and then sold for research and experimentation. The researchers attempted to implant a pacemaker into Pepper, who later died. This article sparked a movement to implement animal cruelty laws at a nationwide level, rather than just statewide. These nationwide regulations on animal welfare now play a major role in how animal research can be conducted in laboratories across the country, including ones in the Penn community. National regulations on animal welfare were enacted in 1966 when the Animal Welfare Act was first passed, bringing clear guidelines on how animals in research
should be properly treated.2 It has ensured that there is a standard for handling vertebrate animals in research. Now, national and institutional entities are vital in overseeing animal welfare in research facilities. One national organization involved in animal safety is the Public Health Service (PHS), which includes the National Institutes of Health (NIH) and other sub-organizations. More specifically, the Office of Lab Animal Welfare (OLAW) falls under the NIH. These organizations, in tandem with the USDA (United States Department of Agriculture), have passed further laws and guidelines to give lab animals greater care. It must be noted that there isn't any one federal body responsible for these duties; rather, they all have overlapping and independent oversight responsibilities.
With the Health Research Extension Act of 1985, PHS established the Institutional Animal Care and Use Committee (IACUC), which must be instituted in every university or company that conducts animal research. Further guidelines on the care of animals in biomedical or behavioral research at the institutional level were also included in the Health Research Extension Act of 1985.

OLAW (under the NIH) has its own guidelines on the care and use of animals in research. This is best seen in the NIH's vetting process for grants, which determines whether or not the NIH will fund someone's research based on many parameters and on justification not only of why the research is conducted but of why a certain animal is the appropriate model. OLAW works closely with PHS and also monitors research sites across the country.3

Any university, institution, or company that does animal research needs an IACUC. The IACUC serves to establish general animal care protocols, determines what research is done based on specific justification of the research and the animal model used, monitors labs to enhance their protocols, and works with full-time professional staff as well as veterinarians on campus to take care of the physiological needs of these animals. The animals on campus are seen every day, whether in the context of research or their general care, which includes their time spent with veterinary staff. It is an ethical responsibility to make sure that these animals are safe and well taken care of. Even during the height of the pandemic, the animals were seen daily because OLAW already required IACUCs to have a disaster plan. This plan includes maintenance of animal facilities, proper supplies (food, water, power, etc.), and a focus on animal care rather than research for situations like COVID-19.4

The IACUC follows the principle of the 3Rs: Replacement, Reduction, and Refinement.5 This means questioning why a specific animal is used in research and not another (Replacement), questioning why a certain number of animals is used (Reduction), and refining the techniques and technology used to minimize pain and provide the best welfare for these animals (Refinement). The 3Rs are instrumental to how an IACUC committee evaluates research in a lab.

I was able to interview Dr. Yale Cohen, the Otorhinolaryngology Chair and IACUC Chair at Penn, to better understand IACUC functions and to gain more context on animal safety research at Penn.

Pictured above is the interviewee, Dr. Yale Cohen, who researches how the brain combines sensory, motor, and cognitive cues to form internal models of the external world.

Among the 100,000 animals in research labs at Penn, there is a wide variety of animal life that is the backbone of research on campus. Dr. Cohen mentioned something known as the "Krows Principle" when discussing how researchers choose an animal model: researchers use the animal model that is most appropriate for their topic of interest. Some examples he gave include sound localization studies using barn owls because of their heightened hearing abilities, vision studies using cats because of their wider field of vision, and other studies using transgenic animals, usually mice, which have a gene deliberately inserted into them to mimic diseases like diabetes or Parkinson's disease.

His own lab works with monkeys, or nonhuman primates. Its main focus is the neural basis of perception and cognition in humans, and monkeys are the closest models to humans because of their brains and their ability to complete complex behavioral tasks. As mentioned earlier, cats may be used because of their excellent vision, but a cat's visual system is different from a human's: even if the behavior can be shown in these models, they are not as comparable to humans as monkeys are. Dr. Cohen uses the rhesus monkey, or rhesus macaque, a species of nonhuman primate commonly used in research because it shares about 93 percent of its genes with humans.6 Rhesus monkeys can also perform certain high-order processes such as auditory decision making, perceptual grouping, and the categorization and discrimination of vocalizations.7

This is his justification for the use of the rhesus monkey as the animal model for his lab, connecting to the first R (Replacement): there is no other animal that could "replace" the monkey and yield the same intended results. Dr. Cohen needed to address the first R to justify why he needed to use rhesus monkeys in his lab, in adherence with IACUC protocol. This is known as a "scientific justification": the justification for why one's research should be conducted and therefore receive grants, funding, and support. Through the lens of the second R, Reduction, labs like Dr. Cohen's do not use any more animals than needed for their purpose. Dr. Cohen also expressed the idea of the third R, the refinement of animal care practices as technology develops: animal anesthesia for mitigating pain, scientific justification for any chronic pain (no needless suffering), and proper care for sick animals. Above all else, he stressed that "animal welfare is never compromised for research."

Lab animals are the backbone of much of the research conducted at Penn and around the world. The use of animal models has been in practice for many years, but through legislation and the proper maintenance of their welfare, animals have been given much more respect for their contribution to research. Every time a new drug being studied is advertised on the news or in a research journal, the animals used in these cases are kept safe and appreciated for their contribution to scientific discovery.
REFERENCES
Characterization of Macrophage Differentiation and Regulation of Metabolic Activity in Adipose Tissue: Crosstalk between Macrophage and Adipocyte
Research
Magnolia Wang
Patrick Seale Lab Institute for Diabetes, Obesity and Metabolism Perelman School of Medicine
ABSTRACT
The chronic state of obesity has been widely shown to trigger an immune response that involves a systemic increase in circulating pro-inflammatory cytokines, the recruitment of monocytes and lymphocytes to inflamed tissues, and the generation of tissue repair (Reilly, S., & Saltiel, A., 2017). This infiltration and activation of immune cells, in particular macrophages in adipose tissue (AT), contributes to immunological dysregulation and the pro-inflammatory paradigm (Ellulu, M. S. et al., 2017). A thorough examination of the interactions at play between adipocytes and adipose tissue macrophages (ATM) could help elucidate the bidirectional dynamics between the two cell populations. The ultimate goal of this research project is to characterize the interactions of ATM with brown adipose tissue (BAT) in co-culture, and to examine the impact of AT on ATM differentiation, phenotype, and activity. The experimental approaches involve the use of flow cytometry and Luminex analyses to characterize ATM phenotypes and secreted cytokine profiles, respectively, in co-culture with BAT as compared to white adipose tissue (WAT). Characterization of ATM phenotypes is critical to understanding their role in the metabolic function of AT and in the inflammatory pathways responsible for metabolic diseases. Initial efforts were allocated towards setting up an experimental system by establishing stress conditions through a panel of cytokines, in particular IL-6. The cytokines in the panel were evaluated on their ability to induce 3T3-L1 cells to secrete various cytokines including IL-6, IL-7, IL-4, IL-10, IL-33, and IFN-g; these stress conditions serve to replicate the impact of obesity on adipocytes.
Through qPCR and RT-PCR analyses, and based on the overall cytokine production profile, it was found that treatment of 3T3-L1 cells with IL-6 and CCL2 induced cellular stress to the greatest extent; these cytokines were therefore utilized in the Transwell co-culture system to stress-stimulate the 3T3-L1 cells and mimic conditions of obesity. Establishing a co-culture system under stress conditions paves the way for further analysis of the interactions between ATM and BAT under metabolic disorders such as obesity or high-fat diet (HFD).

INTRODUCTION
Obesity induces a chronic and low-grade inflammatory state that contributes to metabolic dysfunction, and inflammatory pathways are induced in visceral AT due to quantitative and phenotypic changes in immune cells, with ATMs being the predominant population (Monteiro, R., & Azevedo, I., 2010). In the lean state, BAT promotes a healthy metabolic phenotype by expending excess nutritional energy and breaking down toxic metabolites, while resident macrophages are mostly of the M2 phenotype. In the obese state, however, ATMs undergo a "phenotypic switch" from an anti-inflammatory M2 subtype to a pro-inflammatory M1 subtype with
excess WAT, secreting pro-inflammatory cytokines and chemokines including TNF-α, IL-6, and MCP1 (Lumeng C.N. et al., 2008). How BAT and WAT influence the M1/M2 balance and its subsequent effect on AT thermogenesis remains to be elucidated. Thus, previous work gives rise to the research objective of understanding the interactions between AT and innate immunity, which could enable the harnessing of metabolic inflammation to improve tissue remodeling and insulin sensitivity and ultimately enhance a healthy life span. It has been established that the expansion of adipose tissue (AT) in obesity is accompanied by the accumulation of adipose tissue macrophages (ATMs) that contribute to a state of low-grade,
chronic inflammation and dysregulated metabolism. Furthermore, in BAT, thermogenic adipocytes express greater levels of uncoupling protein 1 (UCP1), which in turn dissipates energy as heat (von Loeffelholz C., & Birkenfeld A., 2018). However, the inactivity of WAT gives rise to obesity-associated inflammation, in turn leading to insulin resistance and greater metabolic dysfunction (Vieira-Potter VJ., 2014). The research question arising from these findings seeks to elucidate the communication pathways between AT and ATMs, and how AT and ATMs may exert a reciprocal impact on the metabolic regulation of AT and the differentiation of ATMs. Furthermore, the fact that the processes of recruitment and differentiation in macrophage subpopulations are contingent upon the local AT led to the hypothesis that ATM regulation can be controlled by the distinct thermogenic capacities of BAT and WAT. Taken together, the objectives of this independent study are (1) to examine the impact of brown and white adipocytes on the differentiation, phenotype, and activity of macrophages, and (2) to characterize the impact of ATM phenotypes on the metabolic function of WAT and BAT. Ultimately, understanding the bidirectional interaction between ATM and AT will serve as a springboard for further examining the regulation of innate immunity pathways and their contribution to metabolic and stress conditions such as obesity or diabetes.

METHODS
Cell Culture
Adipocyte cell lines transformed from 3T3-L1 cells were utilized and stress-stimulated with the cytokines HIF1a, CCL2, IL-6, and IL-33. RAW 264.7 macrophages from a murine model, unpolarized (M0) with differentiation potential to either M1 or M2, were activated by LPS.

3T3-L1 pre-committed pre-adipocytes: 3T3-L1 is a pre-committed pre-adipocyte cell line derived from murine 3T3 cells.
The cells have a fibroblast-like morphology and, under certain circumstances, may differentiate to adopt an adipocyte-like phenotype with increased synthesis and accumulation of triglycerides. 3T3-L1 CL-173 cells were obtained from the Institute for Diabetes, Obesity, and Metabolism of the Perelman School of Medicine and were differentiated into mature adipocytes.
RAW 264.7 macrophages: RAW 264.7 cells are macrophage-like cells and originate from the Abelson leukemia virus transformed cell line that is derived from BALB/c mice. Upon stimulation with LPS, RAW cells increase nitric oxide (NO) production and enhance phagocytosis. Additionally, RAW cells have the ability to kill target cells through antibody dependent cytotoxicity. RAW 264.7 cells attach to tissue culture-grade plastic via cation–dependent integrin receptors as well as other cation-independent receptors. Given that they are extremely sensitive to lipopolysaccharide (LPS) endotoxin from gram-negative bacteria, only sterile disposable tissue culture ware and solutions, buffers, and media with endotoxin tested distilled deionized water were used during handling. RAW 264.7 cells utilized in this project were obtained from the Institute for Diabetes, Obesity, and Metabolism of the Perelman School of Medicine. LPS Preparation for Activation of RAW 264.7: The RAW 264.7 macrophages were activated by subjecting samples to lipopolysaccharide (LPS) stress. KDO lipid A from Avanti was resuspended in DPBS to a 1 μg/μL concentration. This was sonicated for 1 minute in an ultrasonic water bath. The RAW 264.7 cells were cultured in this medium and stimulated for 24 hours at a final concentration of 100 ng/mL. Preparation of MDI induction medium: Stock solutions of 50 mM IBMX and 1 mM dexamethasone were prepared in DMSO. IBMX was added to DMEM to a final concentration of 0.5 mM to generate 1 mL of IBMX stock solution per 100 mL of medium. Next, dexamethasone was added to a final concentration of 1 µM to generate 100 µL dexamethasone stock solution per 100 mL medium. Afterwards, insulin was added to a final concentration of 10 µg/mL. Preparation of insulin medium: Insulin was added to DMEM to a final concentration of 10 µg/mL. 
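The medium recipes above are ordinary C1·V1 = C2·V2 dilution arithmetic. A small helper makes the calculation explicit; the calls below reproduce the IBMX and dexamethasone steps stated in the text (50 mM stock to 0.5 mM final gives 1 mL per 100 mL of medium, and 1 mM stock to 1 µM final gives 100 µL per 100 mL).

```python
def stock_volume(c_stock: float, c_final: float, v_final: float) -> float:
    """Volume of stock to add so that c_stock * v = c_final * v_final.
    Units must match between c_stock and c_final; result is in the same
    unit as v_final."""
    return c_final * v_final / c_stock

# IBMX: 50 mM stock diluted to 0.5 mM final in 100 mL of DMEM
print(stock_volume(50.0, 0.5, 100.0))    # 1.0 (mL of stock)

# Dexamethasone: 1 mM stock diluted to 1 uM (0.001 mM) final in 100 mL
print(stock_volume(1.0, 0.001, 100.0))   # 0.1 (mL) = 100 uL of stock
```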
Co-culture of Adipocytes and Macrophages: Macrophages and adipocytes were co-cultured to allow for contact between the cell populations and for the continuous, stable secretion of different mediators. The co-culture system was utilized to simulate the cell-to-cell interactions within in vivo HFD murine models, in which the ATM/AT ratio can then be determined.
The co-culture system was established using Transwell Permeable Support plates. The individual Transwell inserts were placed into each well of the 6-well reaction plate during the cell seeding procedure; the wells of the reaction plate already contained a population of adipocytes. The macrophages were added on top of the Transwell permeable membrane, which facilitates contact between the two cell populations and makes it possible to examine the bidirectional relationship and metabolic activities between macrophages and adipocytes in the in vitro system.
RNA extraction of 3T3-L1 cells: The PureLink RNA Mini Kit was utilized for RNA extraction of 3T3-L1 cells. Proper aseptic RNA handling techniques were followed to prevent RNase contamination of reagents and RNA samples.

Fibroblast differentiation of 3T3-L1 preadipocytes into adipocytes: Following the passage of 3T3-L1 cells, the procedure for 3T3-L1 differentiation into mature adipocytes was performed. Cells were fed on alternate days using 3T3-L1 Preadipocyte Medium and incubated another 48 hours before the differentiation process was initiated. After 48 hours, the medium was replaced with a sufficient volume of 3T3-L1 Differentiation Medium, and the 3T3-L1 cells were further incubated for 2-3 days. The cells were then fed using 3T3-L1 Adipocyte Maintenance Medium every 2-3 days until ready for assay; cells were suitable for assay 7-14 days post-differentiation. The differentiated adipocytes were treated with the cytokines HIF1a, CCL2, IL-6, and IL-33 as inducers of cellular stress.

Reverse Transcription Polymerase Chain Reaction (RT-PCR): Quantitative RT-PCR was performed using a fluorogenic SYBR Green Master Mix kit (Thermo Fisher) and a Mx3005P QPCR System (Agilent). Spectrophotometric analysis was performed to determine the concentrations of purified total RNA samples before reverse transcription. The reaction mix for all mRNA samples is outlined in Table 1.

Thermal Cycling: Reactions were loaded into a thermal cycler programmed with the conditions shown in Table 2.

Quantitative Polymerase Chain Reaction (qPCR): qPCR was employed, and samples were run under standard cycling parameters, outlined in Table 3.
fall 2021 | PENNSCIENCE JOURNAL 45
Statistics and Data Analysis
RT-PCR and qPCR were employed to monitor the production of cytokines in response to cellular stress, characterized by marked stress in the endoplasmic reticulum. Data were analyzed using the comparative Ct method. Data in Figures A through D in the Results section are expressed as mean ± SEM. Statistics were performed with GraphPad software (Prism).

DISCUSSION
Obesity triggers an immune response that involves a systemic increase in circulating inflammatory cytokines, recruitment of immunocytes to inflamed tissues, activation of leukocytes, and stimulation of tissue repair responses (Tanaka, T. et al., 2014). The results of this study highlight the complex roles that pro-inflammatory IL-6 and CCL2 play in inflammation and metabolism. In particular, the experimental findings demonstrate that treatment with IL-6 and CCL2 stressed 3T3-L1 cells to the greatest extent among the cytokine panel of HIF1a, CCL2, IL-6, and IL-33. This finding can be attributed to stress-induced, IL-6- and CCL2-dependent metabolic programming, wherein stress in the endoplasmic reticulum induces gluconeogenesis in both net positive and negative energy balance states (Amen, O.M., 2019). Based on the overall cytokine production profile, it is clear that both IL-6 and CCL2,
especially IL-6, induced cellular stress to the greatest extent and thus simulate conditions of obesity in 3T3-L1 cells. These two pro-inflammatory factors will be considered as stimuli for future experiments. An apparent trend indicates increased expression of IL-6 mRNA resulting from treatment with HIF1a, CCL2, IL-6, and IL-33; relative mRNA expression levels were normalized to the control group. In the HIF1a treatment (Figure A), the relative IL-6 mRNA level was 17.5, much greater than that of the control group or other experimental groups, which all had values of 5 or below. In the CCL2 treatment (Figure B), both IL-6 and IFN-g were expressed at higher levels, with IL-6 at a relative value of 19 and IFN-g at a relative value of 21; compared to these two groups, the mRNA expression levels for the other cytokines and control groups were much lower, at values of 5 or below. In the IL-6 treatment (Figure C), IL-6 was expressed at a high relative value of 33, greatly exceeding the expression levels of the control group and the other experimental cytokines, which had relative values of 7 or lower. In the IL-33 treatment (Figure D), IL-6 mRNA expression was likewise much greater, at a relative value of 42.5, than that of the control group and the other experimental cytokines, which all had relative values of 10 or below.
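The comparative Ct method mentioned under Statistics computes relative expression as 2^(-ΔΔCt): the target gene's Ct is first normalized to a reference gene within each sample, then to the control sample. A minimal worked example with made-up Ct values (not the study's actual data):

```python
def relative_expression(ct_target_treated: float, ct_ref_treated: float,
                        ct_target_control: float, ct_ref_control: float) -> float:
    """Fold change by the comparative Ct (2^-ddCt) method."""
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                 # normalize to control sample
    return 2 ** (-dd_ct)

# Hypothetical Ct values: IL-6 vs. a housekeeping gene, treated vs. control.
# dCt(treated) = 22 - 18 = 4; dCt(control) = 27 - 18 = 9; ddCt = -5.
fold = relative_expression(22.0, 18.0, 27.0, 18.0)
print(fold)   # 2**5 = 32-fold induction
```

Lower Ct means earlier amplification, so a negative ΔΔCt corresponds to higher expression in the treated sample — which is why the exponent is negated.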
Figures A-D. Relative mRNA levels measured by qPCR in 3T3-L1 cells following stimulation with the cytokines HIF1a, CCL2, IL-6, and IL-33, normalized to control.
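The relative mRNA values in Figures A-D follow from the comparative Ct (2^-ddCt) method named in the statistics section. The sketch below is a minimal illustration of that calculation and of the mean ± SEM summary, assuming a single housekeeping reference gene; the Ct values shown are illustrative only and are not the study's data.

```python
import math

def relative_expression(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Comparative Ct (2^-ddCt): fold change of a target gene relative
    to a reference gene, normalized to the control group."""
    d_ct_treated = ct_target - ct_ref              # dCt in treated sample
    d_ct_control = ct_target_ctrl - ct_ref_ctrl    # dCt in control sample
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

def mean_sem(values):
    """Mean and standard error of the mean (SEM) for a replicate set."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, math.sqrt(var) / math.sqrt(n)

# Illustrative Ct values only: target gene Ct = 22.0 in treated cells and
# 27.0 in control cells; housekeeping reference Ct = 18.0 in both.
fold = relative_expression(22.0, 18.0, 27.0, 18.0)
print(fold)  # 32.0 -> a 32-fold increase over control
```

In practice this arithmetic is performed per biological replicate, and the replicate fold changes are then summarized as mean ± SEM, as the figures report.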
From the experimental findings, it can be reasoned that IL-6 and CCL2 are potent stimuli that drive gluconeogenesis in net positive energy balance states. Therefore, IL-6 and CCL2 will be considered for use in the Transwell co-culture procedure, since they produced a marked increase in the induction of stress in 3T3-L1 cells compared with the other treatments, mimicking the conditions and inflammatory state of chronic obesity. This experiment utilized 3T3-L1 cell lines, transformed from murine white adipocytes, in an in vitro system. The absence of primary macrophages and of interactions with other cell types may limit the translation of the conclusions drawn from this project to physiological or pathological settings. In the future, ex vivo systems involving primary macrophages and fresh adipose tissue isolated from human subjects could be established. Obese mice could also be utilized to more closely mimic metabolic disease conditions, in which the bi-directional immunological response between adipocytes and macrophages can be investigated in vivo.
Following the incubation of macrophages and adipocytes in the Transwell co-culture system, flow cytometry and Luminex multiplex analyses will be employed to characterize ATM phenotypes as well as the cytokine and adipokine profiles secreted by macrophages and adipocytes, respectively, into the culture medium. In addition, primary tissues will also be utilized to replicate a more natural model.

Flow cytometry: Flow cytometry will be employed to identify cell surface markers for further analysis of the different cell types, including M0, M1, M2, BAT, and WAT; specifically, CD11c for M1, CD206 for M2, and F4/80 for both.

Luminex assay: A host of cytokine biomarkers, including TNF-a, IL-6, IFN-g, and IL-1B, have been shown to play a crucial role in driving the pathogenesis of obesity (Vieira-Potter, 2014). Considering that much of the relationship between these biomarkers and the chronic obese state remains to be uncovered, a more sensitive and accurate immunological profiling technique will be harnessed in order to accurately gauge the dynamic response of cytokines. The Luminex assay is known for its sensitivity, precision, and linearity of dilutions in sample medium matrices, making it an ideal system for obesity screening in clinical research. This multiplex bead suspension array system is based upon the sandwich immunoassay technique and utilizes 100 distinct sets of polystyrene beads containing fluorescent dyes at varying ratios. Each set of beads can be conjugated with a different capture molecule, which is prepared and incubated with the sample in a microplate to react with specific analytes. A fluorescently labeled reporter molecule that specifically binds the analyte is then added to detect and quantitate each captured analyte. In the last step, the contents of each microplate well are drawn into the array reader, and precision fluidics align the beads in single file, passing them individually through lasers. The Luminex assay ultimately provides the high-caliber sensitivity needed to precisely detect cytokines at a limit of less than 1 pg/mL, allowing analyses of the types of cytokines, adipokines, and other inflammatory mediators secreted by macrophages and adipocytes in the co-culture environment (Houser, 2012). Moreover, the phenotypes and activities of ATMs in the cell populations of each sample will be analyzed by comparing the experimental BAT and control WAT groups. The degree of M1 and M2 differentiation will be examined in order to determine whether BAT contributes to the perpetuation of, or protection against, the inflammation associated with metabolic disease.

ACKNOWLEDGEMENTS

I would like to thank Dr. Patrick Seale for his continued support and mentorship during my research experience this semester. I am particularly thankful for his insightful advice, which helped to further develop my research hypothesis and experimental aims.

AUTHOR CONTRIBUTIONS

MW designed and performed all research, and PS provided input into the research question, objectives, and design of the project.

REFERENCES

Amen, O. M., Sarker, S. D., Ghildyal, R., & Arya, A. (2019). Endoplasmic reticulum stress activates unfolded protein response signaling and mediates inflammation, obesity, and cardiac dysfunction: Therapeutic and molecular approach. Frontiers in Pharmacology, 10, 977. https://doi.org/10.3389/fphar.2019.00977
Ellulu, M. S., Patimah, I., Khaza’ai, H., Rahmat, A., & Abed, Y. (2017). Obesity and inflammation: The linking mechanism and the complications. Archives of Medical Science, 13(4), 851–863. https://doi.org/10.5114/aoms.2016.58928
Houser, B. (2012). Bio-Rad’s Bio-Plex® suspension array system, xMAP technology overview. Archives of Physiology and Biochemistry, 118(4), 192–196. https://doi.org/10.3109/13813455.2012.705301

Lumeng, C. N., DelProposto, J. B., Westcott, D. J., & Saltiel, A. R. (2008). Phenotypic switching of adipose tissue macrophages with obesity is generated by spatiotemporal differences in macrophage subtypes. Diabetes, 57(12), 3239–3246. https://doi.org/10.2337/db08-0872

Reilly, S., & Saltiel, A. (2017). Adapting to obesity with adipose tissue inflammation. Nature Reviews Endocrinology, 13, 633–643. https://doi.org/10.1038/nrendo.2017.90

Monteiro, R., & Azevedo, I. (2010). Chronic inflammation in obesity and the metabolic syndrome. Mediators of Inflammation, 2010, Article ID 289645. https://doi.org/10.1155/2010/289645

Tanaka, T., Narazaki, M., & Kishimoto, T. (2014). IL-6 in inflammation, immunity, and disease. Cold Spring Harbor Perspectives in Biology, 6(10), a016295. https://doi.org/10.1101/cshperspect.a016295

Vieira-Potter, V. J. (2014). Inflammation and macrophage modulation in adipose tissues. Cellular Microbiology, 16(10), 1484–1492. https://doi.org/10.1111/cmi.12336

Von Loeffelholz, C., & Birkenfeld, A. (2018). The role of non-exercise activity thermogenesis in human obesity. In K. R. Feingold, B. Anawalt, A. Boyce, et al. (Eds.), Endotext [Internet]. MDText.com, Inc. https://www.ncbi.nlm.nih.gov/books/NBK279077/