Science in Society Review - Spring 2018


Spring 2018 | University of Chicago
A Production of The Triple Helix

The Science in Society Review

ISSN 2164-4316

The International Journal of Science, Society, and Law

ASU - Berkeley - Brown - Cambridge - CMU - Cornell - Georgia Tech - Georgetown - GWU - Harker - Harvard - JHU - NUS - OSU - UC Davis - UCSD - UChicago - Melbourne - Yale


THE TRIPLE HELIX A global forum for science in society

The Triple Helix, Inc. is the world’s largest completely student-run organization dedicated to taking an interdisciplinary approach toward evaluating the true impact of historical and modern advances in science.

Work with tomorrow’s leaders
Our international operations unite talented undergraduates with a drive for excellence at over 25 top universities around the world.

Imagine your readership
Bring fresh perspectives and your own analysis to our academic journal, The Science in Society Review, which publishes International Features across all of our chapters.

Reach our global audience
The E-publishing division showcases the latest in scientific breakthroughs and policy developments through editorials and multimedia presentations.

Catalyze change and shape the future
Our new Science Policy Division will engage students, academic institutions, public leaders, and the community in discussion and debate about the most pressing and complex issues that face our world today.

All of the students involved in The Triple Helix understand that the fast pace of scientific innovation only further underscores the importance of examining the ethical, economic, social, and legal implications of new ideas and technologies; only then can we completely understand how they will change our everyday lives, and perhaps even the norms of our society. Come join us!

TRIPLE HELIX CHAPTERS

North America Chapters
Arizona State University
Brown University
Carnegie Mellon University
Cornell University
Georgia Institute of Technology
George Washington University
Georgetown University
The Harker School
Harvard University
Johns Hopkins University
The Ohio State University
University of California, Berkeley
University of California, Davis
University of California, San Diego
University of Chicago
Yale University

Europe Chapters
Cambridge University
Aristotle University

Asia Chapter
National University of Singapore

Australia Chapter
University of Melbourne


TABLE OF CONTENTS

Quantum Cryptography: Uncertainty in a Secure World
Praveen Balaji ............................................................ 5

Innovation and Medical Ethics
Benjamin Hsu .............................................................. 9

The Impacts of Checkpoint Blockade on the Future of Cancer Research
Maya Pimentel ............................................................ 10

Fighting the Flu: How Epidemic is Avoided Each Year
Annagh Devitt ............................................................ 13

The Occasional Glass of Red: Ending the Debate on the Risk and Rewards of Moderate Alcohol Consumption
Vanessa Ma ............................................................... 15

Moral Implications of the Atomic Bomb
Priya Lingutla ........................................................... 20


STAFF AT UCHICAGO

President: Salman Arif
Vice President: Nila Ray
Editors in Chief, SISR: Aya Nimer, Rachel Gleyzer
Production, SISR: Ariel Goldszmidt, Ariel Pan
Production, Scientia: Elle Rathbun, Bonnie Hu
Events Director: Peter Ryffel
Events Coordinators: Franklin Rodriguez, William Rosenthal
E-Publishing Directors: Isabella Pan
Editors in Chief, Scientia: Jeremy Chang, Clara Sava-Segal

Message from Chapter Leadership

Dear Reader,

It is with great excitement that we bring to you the 2018 Spring Issue of The Science in Society Review. A new year has introduced new directions to consider in some of the most pressing issues of science in society, and at The Triple Helix we understand the need to investigate these questions in an interdisciplinary manner. In this vein, our writers, aided by a strong support system of undergraduate editors and faculty mentors, strive to incorporate the perspectives of multiple fields in their articles. For this reason and others, we at The Triple Helix pride ourselves on the fact that we bring our writers together with eminent University professors and field professionals for one-on-one collaboration. We are proud to encourage our future leaders in their rigorous exploration of the key issues in society today. It is our hope that the articles presented herein will stimulate and challenge you to join our dialogue.

Salman Arif
President, The Triple Helix UChicago
uchicago.president@thetriplehelix.org


THE TRIPLE HELIX Spring 2018

© 2018, The Triple Helix, Inc. All rights reserved.


Quantum Cryptography: Uncertainty in a Secure World

Praveen Balaji

Cryptology, the mathematical science of secret communication, has always had a special role in human history. Whether it be the meager monoalphabetic substitution cipher appearing in Sir Arthur Conan Doyle’s 1903 story “The Adventure of the Dancing Men” or the infamous Enigma cipher machine utilized in WWII with more than 100 quintillion possible combinations, people around the globe have invested much effort to avoid eavesdroppers.6 It is within this world of secret messages that we find a constant battle between those who wish to keep their privacy and those who wish to sabotage it. Cryptology has two branches, both of which are relevant to private communication: cryptography and cryptanalysis. The former is devoted to developing techniques to better encrypt, or conceal, information passed between intended parties. The latter is, as one would expect, focused on creating algorithms and methods to decrypt, or break, such ciphers in order to expose weaknesses. While both branches are equally important to cryptology as a whole, cryptography is what safeguards the very basis of our communication.4

The safety of today’s Internet traffic relies partly on a popular form of cryptography called “public-key cryptography.”4 In this form of cryptography, the parties involved have “keys” that allow them to perform cryptographic operations such as encryption and decryption. Intuitively, if a party sends an encrypted message, only a party with a compatible key can decrypt the message. This basic idea is utilized in many interactions. For example, when a user visits a website whose address starts with the famous “https,” the computer is actually connecting securely to a web server by combining a series of cryptographic operations to ensure confidentiality, integrity of the information being sent, and authenticity of the source. When the server originally encrypts any message through an algorithm, only a computer with a matching decryption technique and key can obtain the message.
An authentication tag that goes along with the sent information verifies that it comes from the proper party and not an imposter. The strength of these measures depends on the algorithm employed.1 These algorithms are generally built around problems that are easy to solve in one direction and hard to solve in the other. The most widely used encryption technique to date is RSA encryption, which is based on the multiplication of two large prime numbers. While computing the product of two numbers is very easy for classical computers, finding those two numbers given only the product is a rather difficult problem. Given this sort of problem, we can then send an encrypted message. An individual, say George, can pick two large prime numbers, which constitute his secret key to the message. He can then make their product public along with an “encryption exponent,” the public key, that allows anyone to create their own encrypted message. Since only George knows the two prime numbers, he is the only one able to decipher the message. Given current technology, it would



not matter if eavesdroppers were tapping into the communication, for they would not be able to decrypt the message in a reasonable time.4 In the given example with a computer and a web server, the computer downloads a public key with which it can send encrypted messages to the server. Only the server has the necessary resources (the private asymmetric key) to decrypt them and communicate with the computer. It would seem that with all the protocols stated, along with the additional safety of a “message-authentication code” associated with every key distribution, communication could not be more secure.1

However, this is where quantum computers introduce a level of uncertainty into our everyday protection. Quantum computers differ from their classical counterparts through their utilization of quantum mechanical characteristics. The two major phenomena that constitute quantum computation are superposition and entanglement. The first allows a particle to exist in two or more states, “such as spinning opposite directions,” at once.3 As a result, the “computer’s memory can be in many classical states at the same time,” meaning that a quantum bit (qubit) can hold a “combination” of 0 and 1.2 To provide an example, 5 bits can represent only one of 32 (2^5) possible values at a time, while 5 qubits can potentially represent all 32 values at once. Generally, this exponential increase in information per string of bits allows for greater power overall, although contrary to popular belief, it does not “lead to exponentially faster computation across the board.”4 However, this additional computing power is still enough to make a significant dent in conventional cryptosystems.
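George’s public-key scheme can be made concrete with a toy RSA example in Python. The primes below are textbook-sized choices for illustration only; real RSA uses primes hundreds of digits long, plus randomized padding:

```python
# Toy RSA with tiny, textbook-sized primes (illustration only; insecure).
p, q = 61, 53                # George's secret primes
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120, computable only if you know p and q
e = 17                       # public "encryption exponent"
d = pow(e, -1, phi)          # George's private exponent (Python 3.8+): 2753

m = 65                       # a message encoded as a number smaller than n
c = pow(m, e, n)             # anyone holding (n, e) can encrypt: 2790
assert pow(c, d, n) == m     # only George, holding d, can decrypt
```

Recovering d from the public pair (n, e) alone requires factoring n back into p and q, which is exactly the problem that is easy in one direction and hard in the other.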
For example, as far back as 1994, mathematician Peter Shor developed an algorithm that a quantum computer could use to “quickly foil” RSA encryption, “one of the major safeguards used today.”2 Furthermore, another mathematician, Lov Grover, was able to construct a quantum algorithm for finding a unique input in a database of N values, lowering the number of queries needed from “N to √N queries,” thereby cutting the time needed for a brute-force search by a quadratic factor. While quantum computers will not exceed conventional computers for at least another decade, their capabilities are on the rise. Small devices from tech giants like Google and Intel are set to hit the market within the next five years, and other companies, such as IBM, have already issued cloud quantum computing systems for developing algorithms.7 Such progress has become a beacon of hope for hackers across the globe. In the Dutch General Intelligence and Security Service’s urgent declaration for “quantum-safe encryption” in early 2015, the agency introduced the concept of “intercept now, decrypt later,” where a hacker may stash encrypted keys in hopes of decrypting them once a quantum computer becomes available.2 Other intelligence agencies have realized the potency such technology could have against classical cryptosystems as well. The NSA “revealed its intention to transition to quantum-resistant protocols” in a statement in 2015, and PQCRYPTO, a European consortium of post-quantum cryptography researchers, issued a report recommending “techniques that are resistant to quantum computers.”2 These measures, which may or may not be effective, vary widely in their approach to the problem.

Fig. 1: Security levels of encryption techniques, based on how many operations are necessary for classical and quantum computers (displayed as b, where 2^b is the approximate number of operations). Although quantum computation only modestly decreases the number of operations necessary for symmetric cryptography, where both parties have the same key, public-key encryption is entirely broken by Shor’s algorithm.1

Classical-Based Protocols

As seen in the figure provided, symmetric cryptography, where both the sender and the recipient have the same keys, is less affected by quantum attacks than public-key cryptography is. Given this information, PQCRYPTO has advised systems utilizing symmetric keys to move to ciphers with 256-bit keys so that the time needed to break an encryption remains significant.1 However, cryptographers have also stated that existing public-key cryptosystems which rely on the hardness of factoring can be replaced with systems built on more difficult mathematical problems. A top contender is McEliece encryption, which relies on transforming a simple, solvable linear algebra problem into a rather difficult one and vice versa. As expected, only those with the respective keys can simplify the problem and deduce the solution. Beyond such computational problems, another popular approach being suggested is lattice-based cryptography, where keys are equations for coordinates in a “grid-like collection of points” akin to a 2D or 3D shape. A message could be located at some point in the mathematical construct such that the recipient must find the distance from a certain lattice point to that message. Such a problem is difficult for both classical and quantum computers due to the sheer number of permutations involved, making it a good replacement for protecting current security systems.2

Quantum-Based Protocols

Given the rise of the quantum revolution, it would seem appropriate that while quantum cryptanalysis disarms today’s classical algorithms, quantum cryptography utilizes the same mechanics to create unbreakable systems.
Such research has been done on the basis of entanglement, a phenomenon which allows particles to affect each other’s states despite being spatially apart. From that, quantum computers far away



from each other can be correlated in a way not possible classically.4 Overall, these same characteristics which threaten computer security have led to some interesting new protocols. One of the most practical quantum cryptosystems to have recently emerged is Quantum Key Distribution (QKD), where keys for encrypting and decrypting messages are encoded on an individual-photon basis. Since quantum theory dictates that measuring a photon disturbs its state, any eavesdropper attempting to intercept the photon to decode its message will be detected by the parties involved.8 This is the basic mechanism behind the most famous QKD protocol, BB84, where a bit, either 0 or 1, is represented by “polarizing a photon in either one of two bases.”5 The resultant string of bits acts as the key for encoding and decoding. Because the laws of physics, rather than computational hardness, form the foundation of its security, QKD protocols have gained a tremendous amount of support. Companies like ID Quantique and MagiQ Technologies have started offering commercial solutions, and even the US Defense Advanced Research Projects Agency has been developing quantum networks for large-scale secure communication.8

Hybrid Cryptography

Another measure that has come up in many implementations is a combination of various classical and quantum techniques. For example, one could combine keys distributed using the quantum protocol mentioned above with a public-key algorithm for additional safety. In such a situation, keys could be obtained using non-quantum-safe tools and modified using the entanglement property. With that, the length of time for which the information is secure could be increased with a simple function that updates the keys.4

Overall, despite proper quantum computers being far from the present, their potential has cultivated an urgency in the world of private technology. The protocols mentioned above each have their own benefits and drawbacks.
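The BB84 exchange described above can be mimicked with a short classical simulation in Python. This is a toy model under simplifying assumptions: random bits are “sent” in randomly chosen bases, mismatched-basis measurements come out random, and only positions where sender and receiver happened to pick the same basis are kept; real QKD, of course, requires physical photons and an eavesdropper-detection step:

```python
import random

def bb84_sift(n_photons, seed=0):
    """Toy BB84 sifting with no eavesdropper: returns (alice_key, bob_key)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.choice("+x") for _ in range(n_photons)]  # two polarization bases
    bob_bases   = [rng.choice("+x") for _ in range(n_photons)]

    # Bob measures each photon in his own basis: a matching basis reproduces
    # Alice's bit, while a mismatched basis yields a random outcome.
    bob_bits = [bit if a == b else rng.randint(0, 1)
                for bit, a, b in zip(alice_bits, alice_bases, bob_bases)]

    # Alice and Bob publicly compare bases (never bits) and keep the matches.
    keep = [i for i, (a, b) in enumerate(zip(alice_bases, bob_bases)) if a == b]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

alice_key, bob_key = bb84_sift(32)
assert alice_key == bob_key  # without eavesdropping, the sifted keys agree
```

On average about half the photons survive sifting; an eavesdropper forced to measure in randomly guessed bases would introduce disagreements between the two keys, which the parties can detect by comparing a sample of bits.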
While quantum approaches to protection are unwieldy due to the difficulty of perfectly harnessing their effects, classical methods, while reliable, may prove a dead end in the long run. In truth, what would be best for now, and possibly forever, is a hybrid form: maximizing the protection quantum effects can give us, while minimizing the effort needed to sustain them, in combination with a sure-fire classical key-distribution method. While research into these protocols is still new, I have no doubt that before long, more protocols will arise to face the looming threat of quantum cryptanalysis.

References

1. Bernstein, Daniel J., et al. (2017). “Post-quantum Cryptography.” Nature 549: 188-194.
2. Cesare, Chris. (2015). “Encryption faces quantum foe: researchers urge readiness against attacks from future-generation computers.” Nature 525(7568): 167.
3. Choi, Charles Q. “Quantum Leaps in Quantum Computing.” Scientific American. Accessed March 30, 2018.
4. de Wolf, Ronald. (2017). “The Potential Impact of Quantum Computers on Society.” Ethics and Information Technology 19(4): 271-276.
5. Haitjema, Mart. (2007). “A Survey of the Prominent Quantum Key Distribution Protocols.” Washington University in St. Louis.
6. Hughes, Richard, et al. (1995). “Quantum Cryptography.” Contemporary Physics 36(3): 149-163.
7. Mohseni, Masoud, et al. (2017). “Commercialize Quantum Technologies in Five Years.” Nature 543(7644): 171-174.
8. Rothke, Ben. (2007). “An Overview of Quantum Cryptography.” Information Security Management Handbook, 6th ed., CRC Press: 1045-1057.



Innovation and Medical Ethics

Benjamin Hsu

Technological developments can challenge and change our understanding of what is right or wrong. We can look at a few inventions as examples of how a single invention can usher in change for the entire field of medical ethics. Modern medical ethics began with English physician Thomas Percival, who coined the very term while drafting a code of conduct for physicians in 1794.1 The code of conduct addressed the professional obligations of health practitioners, and was deeply rooted in the interpretation of the Hippocratic Oath: what should a doctor do, and what does it mean to do no harm? Yet this changed and evolved when, in 1928, the mechanical artificial ventilator underwent its first human clinical trial.2 The advent of this technology blurred the lines between life and death. When a patient’s life is sustained by artificial means, when is it, if ever, justified to deprive them of their life support? Is it murder to do so? Then in 1950, this line was further blurred with the first successful vital organ transplant, a kidney.3 The main challenge associated with organ transplantation is that it is critical for the organ to be fresh; the success rate of the transplantation drops drastically even a few hours post mortem. In light of this, what should be the criteria for one to be a donor? The conventional definition of brain death was hence invented, which some would argue is only a convenient way to sidestep the ethical question at heart: is it ever right to kill someone (by taking their organs) in order to save another? Finally, recent technological developments suggest that medical ethics may soon move beyond the individual and on to societal concerns. In 2013, the CRISPR-Cas9 genome editing technology was invented.4 Scientists believe that this technology might pave the way for genome editing in humans, which raises a plethora of ethical questions. One particularly interesting problem is the issue of distributive justice.

If one day genetically altered humans become commonplace, how do we ensure that everyone has equal access to these genetic enhancements? In other words, how do we prevent the wealthy from becoming an exclusive class of superhumans? Additionally, would guaranteeing someone a genetic treatment for an illness (such as cancer) be the same as a genetic enhancement (such as making someone smarter)? Where do we draw the line? With these questions underlying many decisions made in the medical world, it is imperative to understand the ethical issues that technology and medicine may bring.

References

1. Waddington, Ivan. (1975). “The development of medical ethics: a sociological analysis.” Medical History 19(1): 36.
2. Gorham, J. (1979). “A medical triumph: the iron lung.” Respiratory Therapy 9(1): 71-73.
3. Matevossian, Edouard, et al. (2009). “Surgeon Yurii Voronoy (1895-1961), a pioneer in the history of clinical transplantation: in memoriam at the 75th anniversary of the first human kidney transplantation.” Transplant International 22(12): 1132-1139.
4. Cong, Le, et al. (2013). “Multiplex genome engineering using CRISPR/Cas systems.” Science 339(6121): 819-823.



The Impacts of Checkpoint Blockade on the Future of Cancer Research

Maya Pimentel

According to the American Cancer Society, there are 234,030 new cases of lung cancer and 154,050 deaths from lung cancer each year in the United States alone.1 The most common current techniques for cancer therapy are chemotherapies such as paclitaxel, which prevent further tumorigenesis through inhibition of tumor cell growth. However, these treatments are flawed and have limited success in larger, faster-growing tumors. Paclitaxel inhibits cell division by blocking microtubule polymer cleavage. It is administered to minimize tumor growth, and can reverse the detrimental apoptosis inhibition developed through carcinogenic mutation.2 Despite being an effective form of cancer treatment, chemotherapies such as paclitaxel require continuous administration and are not successful against all types of tumor. For example, paclitaxel is an unsuccessful form of anti-tumor therapy in triple-negative breast cancer.3 Thus, a stronger, more advanced form of cancer treatment may be required.

More recently, a new form of treatment, known as “checkpoint blockade,” has been developed. Cancer cells form through mutations which induce several key characteristics that enable high proliferation, increased prevalence of mutations, and immune response inhibition. The last development occurs most commonly through a mutation which causes the cancer cell to display a ligand that kills T lymphocytes upon their detection of the tumor mass. T cells are a type of immune cell that respond to peptides presented on molecules by both antigen-presenting cells (APCs) and normal somatic cells. These peptides must be presented by molecules known as the Major Histocompatibility Complex (MHC): specifically, MHC class I for non-APC cells and MHC class II for APCs (excluding cross-presentation). All cells are constantly presenting peptides, but usually these are self-peptides from normally functioning cells. T cells have receptors, called TCRs, that develop random specificity to different possible peptide sequences, along with select affinity for the MHC complex itself. When a T cell bumps into a presenting cell and its receptor matches the peptide as well as the MHC complex attached, the T cell will try to mount a response. However, this will not happen solely through the TCR-MHC binding. The T cell will only activate when the presenting cell has been damaged or infected and displays a costimulatory signal, or ligand, as well, informing the T cell of the damage within. The T cell then induces specific proliferation and defense measures against all cells presenting that particular peptide. In addition to costimulatory ligands, some cells also have inhibitory ligands that activate one of several pathways in T cells which induce T cell death. One of these inhibitory receptors is PD-1, which leads to apoptosis and is activated by the presence of, and binding to, the ligand PD-L1.
This is a natural process whose purpose is to prevent the autoimmunity that could arise through accidental activation of T cells against self-peptides. However,



tumor cells are known to mutate and develop their own PD-L1 ligand, which protects the cancerous cells by destroying any T cells specific for tumor cell peptides. This allows the cells to proliferate in hiding, blocked from immune system detection. However, if this interaction is inhibited, the T cells will not be driven to apoptosis by the PD-L1 ligand, and will therefore be able to activate and induce a strong immune response, hopefully strongly reducing the tumor and eliminating further development. Thus, many researchers thought to develop a way to inhibit this ligand-receptor connection in order to save these T cells and allow them to attack the cancer cells with greater vigor and success. The latest development is the creation of an antibody known as anti-PD-1, which has a variable region specific to the PD-1 receptor. When introduced into an organism, the antibody binds to the PD-1 receptor of T cells, preventing the binding of PD-L1 from cancer cells to this receptor, saving the T cells from apoptosis and allowing them to activate and proliferate. This is the “checkpoint blockade.” Pembrolizumab is a prominent checkpoint blockade therapy which involves the administration of anti-PD-1 antibodies to patients suffering from cancer. The Phase I trials for this immunotherapy included a range of doses. The progression-free survival rate (PFS) of the patients at 12 months was 35%, meaning that within one year 35% of patients had minimal cancer growth under this checkpoint blockade immunotherapy. However, the PFS does decrease over the course of time, and the PFS at 36 months is 21%. The overall survival rate after 36 months was 21%. This study suggests the possibility of clinical and translational application as a novel form of cancer immunotherapy. Such immunotherapy is promising not only because of the successful survival rates and inhibition of tumor progression, but also because of its long-term effects.

Chemotherapy and other cytotoxic treatments require continuous application to yield strong benefits. However, immunotherapy may induce a prolonged tumor-fighting response, since the tumor cells are now free to be attacked by the immune T cells.4
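The signaling logic described above, in which a T cell fires only with both an antigen match and costimulation, and PD-1 acts as a veto that anti-PD-1 antibodies remove, can be caricatured as a small boolean model in Python. This is a deliberate oversimplification of T cell biology, offered purely to make the mechanism's logic explicit:

```python
# Toy boolean model of T cell activation (a deliberate oversimplification).
def t_cell_activates(tcr_matches_peptide: bool,
                     costimulatory_signal: bool,
                     pd_l1_present: bool,
                     anti_pd1_given: bool) -> bool:
    """Signal 1 (TCR-MHC match) and signal 2 (costimulation) are both
    required; PD-L1 engaging PD-1 vetoes activation unless the checkpoint
    is blocked by an anti-PD-1 antibody."""
    pd1_engaged = pd_l1_present and not anti_pd1_given
    return tcr_matches_peptide and costimulatory_signal and not pd1_engaged

# A tumor cell displaying PD-L1 escapes an otherwise competent T cell...
assert t_cell_activates(True, True, pd_l1_present=True, anti_pd1_given=False) is False
# ...but checkpoint blockade restores the response.
assert t_cell_activates(True, True, pd_l1_present=True, anti_pd1_given=True) is True
```

The same model also shows why blockade risks autoimmunity: remove the veto, and any cell meeting the first two conditions, tumorous or not, becomes a target.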

Another immune checkpoint blockade treatment involves the use of anti-CTLA-4 antibodies. CTLA-4 is another inhibitory checkpoint on T cells; following a mechanism similar to that of PD-1/PD-L1, its engagement yields T cell death through cell cycle arrest, an effect tumors commonly exploit. A therapeutic antibody against CTLA-4 is known as Ipilimumab. In the randomized and controlled Phase III study KEYNOTE-006, pembrolizumab was compared to ipilimumab in terms of treatment success in patients with advanced melanoma. The PFS for pembrolizumab after 6 months was 46.4%, with doses given every 3 weeks, and 26.5% for ipilimumab, with different dosing procedures but the same 3-week interval. The survival rates were 68.4% for pembrolizumab and 58.2% for ipilimumab. Thus, anti-PD-1 appears more effective than the alternative anti-CTLA-4 immunotherapy, as seen by the increased survival and progression-free success rates, and it appears to be the strongest form of immune checkpoint cancer therapy developed to date.4, 5



Despite seeming to be the best approach to cancer treatment, there are a few serious disadvantages to this form of immunotherapy. One of these is the possible threat of autoimmunity. The development of PD-L1 ligands is commonly associated with tumor cells, but PD-L1 is also common on somatic cells, where it prevents autoimmunity in case a T cell has developed a receptor targeting self proteins. The anti-PD-1 antibodies administered are non-specific and block checkpoint signaling from all cells presenting the PD-L1 ligand, cancerous and non-cancerous alike. Therefore, immune cells become able to mount a response against non-tumor somatic cells as well, which may lead to the development of an autoimmune disease. Lupus is a type of systemic autoimmune disease known to lead to kidney failure. In one study of PD-1 knockout mice, two out of four mice showed signs of lupus-like glomerulonephritis, suggesting a strong link between PD-1/PD-L1 co-inhibition and the prevention of autoimmunity.6

Overall, the use of immune checkpoint blockade as a form of cancer treatment shows strong clinical success. It has also been suggested to yield prolonged tumor suppression compared to other forms of defense, like chemotherapy. In particular, anti-PD-1 treatment appears to have yielded the most promising results among the various forms of immune checkpoint therapy, as seen from the clear increase in survival rate and progression-free success. Checkpoint blockade therapy does carry a potential risk of autoimmunity that may prove such immunotherapies too dangerous and risky for patient treatment. Nonetheless, anti-PD-1 treatment is still a powerful defense against cancer, and other, possibly better forms of immunotherapy, such as CAR T-cell therapy, may prove even more successful against tumorigenesis. Within the past several years, interest has shifted from cytotoxic therapies, such as paclitaxel, to more immunological therapies. It is clear now that the possibility of curing cancer may lie specifically within the field of immunology.

References

1. Cancer Facts & Figures 2018. American Cancer Society, 2018.
2. “Paclitaxel.” National Center for Biotechnology Information, PubChem Compound Database, 2018. Accessed May 5, 2018.
3. Oudin, Madeleine J., et al. (2017). “MENA Confers Resistance to Paclitaxel in Triple-Negative Breast Cancer.” Molecular Cancer Therapeutics 16(1).
4. Linck, Rudinei Diogo Marques, Rômulo Leopoldo de Paula Costa, and Bernardo Garicochea. (2017). “Cancer immunology and melanoma immunotherapy.” Anais Brasileiros de Dermatologia 92(6): 830-835.
5. Robert, Caroline, et al. (2015). “Pembrolizumab versus Ipilimumab in Advanced Melanoma.” New England Journal of Medicine 372: 2521-2532.
6. Nishimura, Hiroyuki, et al. (1999). “Development of Lupus-like Autoimmune Diseases by Disruption of the PD-1 Gene Encoding an ITIM Motif-Carrying Immunoreceptor.” Immunity 11: 141-151.



Fighting the Flu: How Epidemic is Avoided Each Year

Annagh Devitt

Every year the United States faces an epidemic: the flu. The 2017-2018 flu season was particularly bad, with doctor and emergency room visit levels rivaling those of the 2009 “swine flu” pandemic. It was also the first flu season ever to have widespread activity levels in all 50 states and Puerto Rico. Yet, with more people than ever getting flu shots, the question arises as to why this season was particularly bad. The answer lies in a number of factors, ranging from the effects of the privatization of vaccine production to the year’s dominant flu strain. While the flu season starts in October in the Northern Hemisphere, preparations begin as early as February. Every year the World Health Organization (WHO) has to predict which strain will predominate in the upcoming season. However, predicting the exact strain 6 months in advance is little more than a guessing game. Although the Southern Hemisphere is a good indicator of what is to come (as it experiences its flu season months earlier), the numerous variations, and even the possibility of a completely new strain, cannot be accounted for.

After the WHO has determined the three or four strains it believes will dominate the season, the Centers for Disease Control and Prevention (CDC) or an affiliated lab grows the specified strains to make Candidate Vaccine Viruses (CVVs). The CVVs are then distributed to private companies for production. So-called “flu factories” inject the CVVs into egg embryos, allowing the companies to multiply the virus. This growth process takes only about a week and is relatively cost free, requiring only eggs and a controlled-temperature environment. After the virus has fully formed, it is removed from the embryo and killed (to be used for an inactivated vaccine). After purification, the Food and Drug Administration (FDA) must approve the resulting vaccine. If the vaccine passes, it is put into dispensing mechanisms, such as syringes, and then sold to recognizable names such as CVS. In all, production constitutes a $4 billion industry which saw $1.61 billion in revenue last year. The vaccine industry is a complex mix of private and public interest. Government-funded facilities provide the needed vaccine viruses, yet private companies produce the vaccine and make it profitable. In some instances this has led to advancements in the field, such as the quadrivalent vaccine, which protects against four different strains rather than the typical three. However, privatization also means companies will take the most cost-effective route, which means the effectiveness of the vaccine could be compromised. Recent research has suggested that growing vaccines



in egg embryos, as all major flu factories do, could lead to mutations in the vaccine that decrease its effectiveness. However, since eggs are so cost effective, it is unlikely that the industry will transition to more effective methods such as cell-based or recombinant vaccines. Egg-based vaccines, due to the makeup of the egg embryo, often cause mutations in specific proteins of the viruses. Therefore the virus that is initially given out by the CDC is not the same virus that is produced by the private companies, hence the need for FDA approval. Specific strains such as H3N2, the dominant 2017-2018 strain, are particularly susceptible to mutation. That is why, in part, the vaccine against H3N2 has an effectiveness rate of approximately 30%, compared to a rate of 50-60% for other strains like H1N1. Mutations occur because the virus adapts to the conditions inside the egg embryo, which means the virus, and therefore the vaccine, is less suited to humans. The alternative (growing the virus in mammalian cells) avoids this issue. Cell-based vaccines are also better for global epidemics because cells can be frozen and stored longer than egg embryos. However, the cell technology needed to produce these vaccines is relatively new and deviates from an almost 70-year practice of using egg embryos. In addition, Madin-Darby Canine Kidney cells (the cells used to produce these viruses) are much more expensive than simple eggs. Change in the vaccine industry is not likely to happen anytime soon. The private industry does not have a sizeable incentive to switch its ways. Egg embryos are cost effective and can still produce effective vaccines. To invest in more technologically advanced methods would just mean a higher price per dose without necessarily increasing the effectiveness of the vaccine. Likewise, the public side of the industry cannot do more than try to guess the right strains and urge the public to get flu shots.
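Where the article quotes effectiveness rates (roughly 30% for H3N2 versus 50-60% for H1N1), those figures come from comparing infection rates between vaccinated and unvaccinated groups. A minimal sketch of that standard calculation, using hypothetical attack rates chosen only to land near the quoted figures:

```python
# Vaccine effectiveness (VE) as relative risk reduction:
#   VE = 1 - (attack rate among vaccinated / attack rate among unvaccinated)
# The attack rates below are hypothetical illustrations, not data from the article.

def vaccine_effectiveness(attack_vaccinated: float, attack_unvaccinated: float) -> float:
    """Return VE as a fraction (0.3 == 30% effective)."""
    return 1 - attack_vaccinated / attack_unvaccinated

# Hypothetical: 7% of vaccinated vs 10% of unvaccinated people infected.
ve_h3n2 = vaccine_effectiveness(0.07, 0.10)  # ~30%, like the H3N2 figure
ve_h1n1 = vaccine_effectiveness(0.04, 0.10)  # ~60%, like the H1N1 figure

print(f"H3N2-like VE: {ve_h3n2:.0%}")
print(f"H1N1-like VE: {ve_h1n1:.0%}")
```

A 30% effectiveness thus does not mean 30% of recipients are protected outright; it means vaccinated people were infected at 70% of the unvaccinated rate.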
The one potential way out is the so-called “universal vaccine,” a would-be cure-all for flu season. Scientists are currently trying to synthesize a vaccine that attacks the stalk of the virus’s surface protein, the part that stays the same, rather than the head, which changes every year. Such a vaccine could solve the issue of the flu season permanently. Yet, this too is years in the making.

References

1. “Influenza (Flu).” Centers for Disease Control and Prevention. Accessed May 5, 2018.
2. World Health Organization.
3. Moyer, Melinda Wenner. “Flu Vaccine ‘Factories’ Create Errors That Reduce Production.” Scientific American, November 6, 2017.
4. Tree, Julia A. et al. (2001). “Comparison of large-scale mammalian cell culture systems with egg culture for the production of influenza virus A vaccine strains.” Vaccine, 19(25-26), 3444-50.
5. Zost, Seth J. et al. (2017). “Contemporary H3N2 influenza viruses have a glycosylation site that alters binding of antibodies elicited by egg-adapted vaccine strains.” Proceedings of the National Academy of Sciences.
6. Robertson, James S. et al. (1995). “Replicative advantage in tissue culture of egg-adapted influenza virus over tissue-culture derived virus: implications for vaccine manufacture.” Vaccine, 13(16), 1583-88.
7. Alymova, I. V. et al. (1998). “Immunogenicity and Protective Efficacy in Mice of Influenza B Virus Vaccines Grown in Mammalian Cells or Embryonated Chicken Eggs.” Journal of Virology, 72(5), 4472-4477.
8. Matthews, James T. “Egg-Based Production of Influenza Vaccine: 30 Years of Commercial Experience.” The Bridge, Fall 2006, 19-24.
9. Branswell, H. “‘The Problem Child of Seasonal Flu’: Beware This Winter’s Virus.” Scientific American, January 9, 2018.
10. “Universal Influenza Vaccine Research.” National Institute of Allergy and Infectious Diseases. Accessed May 5, 2018.
11. Beil, L. “A Universal Flu Shot May Be Nearing Reality.” Science News, October 17, 2017.
12. Waldie, Paul and Robertson, Grant. “How Vaccines Became Big Business.” The Globe and Mail, April 30, 2018.
13. Sagonowsky, E. “Industry’s Favored Flu Shot Production Process Hurt Efficacy Last Year: Study.” FiercePharma, November 7, 2017.
14. Bernstein, A. and Hoffman, Steven. “Fighting the Flu: We Need a New Kind of Intelligence.” The Globe and Mail, January 22, 2018.
15. Fine, P. et al. (2011). “Herd Immunity: A Rough Guide.” Clinical Infectious Diseases, 52(7), 911-6.



The Occasional Glass of Red: Ending the Debate on the Risks and Rewards of Moderate Alcohol Consumption

Vanessa Ma

Of all documented recreational drugs, alcohol is by far the most prevalently used: over 80% of citizens in the United States have used alcohol at some point in their lifetime, outpacing cigarettes, which come in second at around 50%.1 Many organizations have published different recommended “safe” dosage levels, claiming different degrees of association and causation between alcohol and assorted afflictions, such as heart disease, liver failure, and cancer. In everyday conversation and journalism, however, facts and claims contradict each other, and the precise extent of alcohol’s effects is a subject of hot debate. The current alcohol research landscape comprises two main methodologies: biological wet-lab research that examines the response of cellular cultures and human subjects to ethanol, and statistical examination of large datasets to draw correlations between alcohol consumption and health risks. This article hopes to separate causation from correlation by outlining the known biological mechanisms and correlative impacts of alcohol consumption.

Central Nervous System

Alcohol induces drowsiness and a reduction in pain response, manifesting in lowered inhibitions. The biological explanation for this lies in alcohol-induced alterations to neurotransmission, which lower the rate and magnitude of biological responses. These modulations are mainly observed in two types of membrane-bound proteins: ligand-gated ion channels (LGICs) and voltage-dependent calcium channels.2 Both short-term and long-term changes are observed as a result of alcohol consumption. In the short term and at lower doses, alcohol has been observed to act as a depressant in cell culture by altering the activity of LGICs in a net inhibitory manner.3 This is best understood in the activity of GABAA receptors. Short for gamma-aminobutyric acid type A, these receptors are mainly found in postsynaptic regions, and function by lowering the postsynaptic membrane potential via an influx of chloride ions.
Normally, nerves transmit signals to the brain through synaptic firings, which are triggered when changes in membrane potential exceed a certain threshold value. Some biological mechanisms act to raise membrane potential while others act to counter it; the GABA ligand-binding mechanism is one of the latter. The presence of ethanol causes concentration spikes in GABA ligands and receptors, which in turn further suppress membrane potential. Thus, the synapse reaches threshold potential at lower rates, decreasing synaptic firings. In short, the body is less responsive to external stimuli, giving rise to the common phenomenon of “lowered inhibitions” accompanying alcohol use. Predictably, after long-term use, alcohol increases the baseline membrane potential a synapse needs to reach in order to fire. What is surprising, however, is that the



body compensates for this regular ethanol dosage by reducing the sensitivity of GABAA receptors to positive allosteric modulators and increasing affinity for their negative counterparts. This is done through altered transcription of receptor subunits during mRNA production, which changes the conformation of the GABAA receptors, giving rise to the “tolerance increase” that is commonly experienced after prolonged consumption of alcohol. However, there is another side to the coin: experiencing a decrease in the effects of alcohol increases the probability of alcoholism through a combined mechanism of addiction and a lack of self-awareness and control. Acute, large doses of alcohol can directly block voltage-gated calcium channels.4 These channels are master regulators of neurotransmission, as synaptic firings are triggered by LGICs and then sustained by calcium ion potentials. Ethanol specifically inhibits L-type calcium channels, which are responsible for long-lasting synaptic responses, such as sustained adrenaline production in response to a threat. With chronic use, even in small doses, the concentration of L-type channels increases to counter ethanol’s inhibitory effects. Since voltage-dependent calcium channels (VDCCs) are triggered after the membrane is depolarized through LGIC activity, ethanol’s inhibitory effects on VDCCs serve to compound biological unresponsiveness and its associated conditions. To summarize, in the short term, alcohol’s effect on neurotransmission reduces the body’s response rate, reasonably leading to decreased alertness manifesting as lethargy, confusion, amnesia, and loss of sensation, among other conditions. In the long term, major changes to synaptic systems take place to adapt to alcohol’s biologically disruptive effects, which reduce the visible effects of alcohol consumption. However, this increases the chances of addiction, and with it, an array of socio-economic harms and serious accidents resulting from inhibited self-control.
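The threshold mechanism described above can be caricatured in a few lines of code. This is a toy integrate-and-fire sketch, not a biophysical model; all numbers are arbitrary, and the only point it demonstrates is that adding a steady inhibitory current (as enhanced GABAA activity does) makes the threshold get crossed less often:

```python
# Toy "integrate-and-fire" sketch of the threshold idea in the text.
# The values below are arbitrary illustrations, not physiological measurements.

REST_MV = -70.0       # resting membrane potential
THRESHOLD_MV = -55.0  # potential at which the synapse "fires"

def firing_count(steps: int, excitation: float, inhibition: float) -> int:
    """Count firings: the potential climbs by (excitation - inhibition) each
    timestep, and fires then resets to rest whenever it reaches threshold."""
    v, fired = REST_MV, 0
    for _ in range(steps):
        v += excitation - inhibition   # net drive this timestep
        if v >= THRESHOLD_MV:          # threshold crossed: fire and reset
            fired += 1
            v = REST_MV
    return fired

normal = firing_count(1000, excitation=1.0, inhibition=0.2)   # baseline tone
ethanol = firing_count(1000, excitation=1.0, inhibition=0.6)  # stronger GABA-like inhibition
print(normal, ethanol)  # the more strongly inhibited neuron fires less often
```

The same sketch also hints at tolerance: if the resting potential or threshold shifts to offset the inhibition, the firing rate recovers, which is the compensatory adaptation the paragraph describes.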

Brain Function, Cognition and Learning

Learning and cognition arise first and foremost from a functioning memory. Restricted by limited previous biological research on memory development and storage, most research on alcohol’s effects in this area has been grounded upon observation of participants’ phenotypical responses to carefully designed experiments. Alcohol negatively impacts memory through the alteration of sleep states. The frequency and density of Rapid Eye Movement (REM) sleep are reduced as a result of alcohol consumption in the evening, even in small amounts.5 REM sleep is essential for the consolidation of memory, the transfer of information from short-term memory to long-term memory. Fewer and shorter REM sleep cycles are linked with neurodegenerative diseases such as Alzheimer’s and Parkinson’s. It is important to note that alcohol’s effects on sleep were not observed if a small amount of alcohol was ingested during the day.



Organ Function

Alcohol is chemically broken down in the liver and pancreas, which informs the conventional belief that chronic consumption of alcohol increases the drinker’s risk of liver failure and pancreatic diseases. This claim is biologically and statistically backed by extensive research, and thus will not be the focus of this section.6 The elephant in the room is the validity of the World Health Organization’s (WHO) classification of alcohol as a carcinogen. Research has shown statistically significant associative relationships between alcohol and various cancers, though the biological mechanism through which this occurs has yet to be identified. The greatest associative carcinogenic relationships with alcohol consumption are, predictably, those of the upper aerodigestive system, since these tissues come into direct contact with alcohol during ingestion: esophageal, oral cavity, and pharynx cancers showed dose-proportionate risk hikes, alongside breast cancer. These risk hikes also apply to light drinkers.7 Increased risk of liver and colorectal cancers was only found for moderate to heavy drinkers. On a broader note, alcohol negatively impacts biological development and maintenance by interfering with growth hormone secretion, at a rate proportional to dosage (i.e., effects are noticed even for low alcohol consumption).8,9 This is accomplished through the reduction of mRNA transcripts of growth hormone receptors as well as growth hormone concentrations in the blood.10 These effects did not diminish after cessation of alcohol consumption, implicating long-term systemic alterations; whether these changes are reversible at all is unknown. Also known as somatotropin, growth hormone breaks down fat into energy for essential cell reproduction and protein synthesis processes, such as tissue regeneration.11 Logically, this is extremely detrimental for persons undergoing puberty or at younger, developmental ages.
Taking it one step further, these effects actually pertain to all stages of life, since cellular renewal and maintenance are just as important, if not more so, in old age as organ systems start to fail. Alcohol’s effects on cardiovascular function are particularly pronounced due to the myriad of possible mechanisms by which its consumption can affect heart function, some of which have already been discussed above. As with the other aforementioned afflictions, it is hard to deduce a direct biological relationship between alcohol and cardiovascular disorders such as hypertension, atrial fibrillation, and hemorrhagic stroke, and research to date has remained in the realm of statistical analysis for correlative significance. Atrial structural interferences can be observed almost immediately after alcohol consumption of any level. Atrial fibrillation (AF), a condition also known as “heart flutter” or irregular heart rate, exhibits a dose-proportional correlative risk hike with alcohol.12 Moderate drinkers can experience AF within 36 hours of drinking, along with immediately elevated cardiovascular risk that usually attenuates after a day. On a more relatable level, the common “hangover” phenomenon is suspected to simply be an effect of imbalances in bodily systems that have not yet reverted, for example, ionic imbalances caused by potassium loss through vomiting. Though not yet proven, research suggests a few potential biological causes: acute inflammation in heart muscle tissue, which is prevalent even in cardiac resonance imaging conducted on healthy binge drinkers; diuretic effects leading to displacement of electrolyte levels, which could provoke arrhythmic manifestations; and lowered calcium sensitivity and synaptic responsiveness, which leads to irregular low-voltage regions in membranes



and filaments. Atrial structural changes are causally linked to hypertension (high blood pressure) and atrial fibrillation, cardiac arrest, hemorrhagic stroke, and, in severe cases, death.

Immune System

Alcohol’s effects on the immune system can be broadly classified as increased susceptibility to infectious disease, both sexually transmitted and respiratory. It is especially important to note that little biological research has been done on alcohol’s impact on the immune system, even on proxies (e.g. cell cultures, rats, rabbits), and most of the information to follow stems from social statistical analyses. This by no means reduces the credibility of the claims, and in fact calls for greater awareness and alertness, since a lack of understanding of the causal biological mechanism limits cure development and treatment options. In terms of sexually transmitted diseases, the link between alcohol consumption and infections including human immunodeficiency virus (HIV) pathogenesis has been hotly debated. This could be due to social causes, since alcohol consumption also exhibits a correlative relationship with instances of unsafe sex.13 More controversial is the claim that alcohol consumption reduces the effectiveness of anti-retroviral treatment (ART) for HIV-positive patients. Socially, a multitude of research efforts point towards a link between any alcohol consumption and non-adherence to treatment, with explanations for missed doses being lower self-care abilities and the conscious decision to stop doses for social drinking activities.14 Biologically, there is statistical evidence to show that alcohol consumption at all levels reduces clinical outcomes of ART.15 A plausible biological explanation postulates that alcohol molecules compete with viral molecules for binding sites on anti-retroviral complexes, or even change the conformation of anti-retrovirals, thus lowering the effectiveness of the treatment; there has yet to be conclusive research to this end.
In terms of respiratory diseases, alcohol consumption has been linked to increased risk of tuberculosis and pneumonia. Statistical analysis of global datasets reveals a dose-proportional relationship between alcohol consumption and risk of tuberculosis. A better biological understanding exists for the case of alcohol-related risk of pneumonia, which is caused by the commensal bacterium Streptococcus pneumoniae. When the host’s immune system is weakened, the bacterium becomes pathogenic. As previously discussed, alcohol reduces the body’s responses to external stimuli, which also translates to a suppression of defenses against pathogens. As the bacterium enters the respiratory tract, the impaired cough response and mucus generation of an alcohol user allow near-uninhibited entry of the pathogen. Similar patterns can be observed in other areas of the immune system, leading to greater probabilities of pneumonia among alcohol users in a dose-related fashion.16 To conclude, statistical correlation of alcohol with various degenerative afflictions abounds, but the biological causal mechanisms behind many of these claims have not yet been thoroughly researched. This particularly applies to the substance’s effects on neurotransmission and immune system impairment. Granted, many of these gaps will be filled as biological research and understanding of the human body progress, but it is imperative that readers in all fields (policy, research, and academia) remain vigilant against the potential harms of alcohol use, even in small doses.



References

1. National Survey on Drug Use and Health.
2. Davies, M. (2003). “The role of GABAA receptors in mediating the effects of alcohol in the central nervous system.” Journal of Psychiatry and Neuroscience, 28(4), 263-274.
3. Valenzuela, C. F. (1997). “Alcohol and neurotransmitter interactions.” Alcohol Health and Research World, 21(2), 144-8.
4. Walter, H. J. et al. (2000). “Ethanol Regulates Calcium Channel Subunits by Protein Kinase C δ-dependent and -independent Mechanisms.” Journal of Biological Chemistry, 275, 25717-22.
5. Smith, C. and Smith, D. (2003). “Ingestion of ethanol just prior to sleep onset impairs memory for procedural but not declarative tasks.” Sleep, 26(2), 185-91.
6. Pace, A. et al. (2009). “Pancreas and Liver Injury Are Associated in Individuals With Increased Alcohol Consumption.” Clinical Gastroenterology and Hepatology, 7, 1241-6.
7. LoConte, N. K. et al. (2018). “Alcohol and Cancer: A Statement of the American Society of Clinical Oncology.” Journal of Clinical Oncology, 36(1), 83-93.
8. Roehrs, T. and Roth, T. “Sleep, Sleepiness, and Alcohol Use.” Alcohol Research & Health, 25(2), 101-9.
9. Ekman, A. C. et al. “Ethanol decreases nocturnal plasma levels of thyrotropin and growth hormone but not those of thyroid hormones or prolactin in man.” Journal of Clinical Endocrinology and Metabolism, 81, 2627-32.
10. Lang, C. H. et al. (2000). “Acute Effects of Growth Hormone in Alcohol-Fed Rats.” Alcohol and Alcoholism, 35(2), 148-58.
11. Utiger, R. D. “Growth hormone.” Encyclopaedia Britannica.
12. Voskoboinik, A. et al. (2016). “Alcohol and Atrial Fibrillation.” Journal of the American College of Cardiology, 68(23).
13. Rehm, J. et al. (2012). “Alcohol consumption and the intention to engage in unprotected sex: systematic review and meta-analysis of experimental studies.” Addiction, 107(1), 51-9.
14. Watz, V. (2017). “The detrimental effect of alcohol on HIV treatment adherence.”
15. da Frota Santos, V. et al. (2017). “Alcohol effect on HIV-positive individuals: treatment and quality of life.” Acta Paulista de Enfermagem, 30(1).
16. Bhatty, M. et al. (2011). “Alcohol abuse and Streptococcus pneumoniae infections: Consideration of Virulence Factors and Impaired Immune Responses.” Alcohol, 45(6), 523-39.



Moral Implications of the Atomic Bomb

Priya Lingutla

The discovery of the neutron in 1932 initiated the race to unlock and harness the power of the atom; the neutron made nuclear fission a real possibility, and the atomic bomb ceased to be a mere dream. On December 2nd, 1942, Chicago Pile-1 achieved criticality; the first self-sustaining nuclear chain reaction was born. For everyone standing in the room, CP-1 was the answer to their hopes and hard work, yet there was utter silence as everyone’s thoughts drifted immediately ahead to the bomb.2 Ironically, the scientists involved in building the bomb itself had minimal jurisdiction over it. In 1941, it was the United States government that decided to go all out on atomic bomb investigation: President Roosevelt supplied millions of dollars to the Uranium Committee.4 Jurisdiction over the bomb was bought through political funding, raising the question of who had a say in the actual dropping of the bomb. The moral framework for creating the atom bomb itself centered on the need to win the race to the bomb, a race that would define the winner as the most powerful nation in the world. However, the actual use of the bomb in a civilian-dense area was a whole new question altogether. While whether the bomb should have been dropped is a question for the ages, in post-war consideration the actions taken during the last few months of the war seem particularly morally questionable. These include the refusal of a bomb demonstration, which could potentially have compelled the opposition to surrender, and the dropping of the second bomb, among other things. This raises the question of what the prevailing thoughts of the scientists and the political and war-time officials were at the time; given that there were alternatives for exhibiting the bomb, what could the justification for causing unprecedented devastation to entire Japanese cities be, and was the mass killing of thousands of civilians a matter that could be decided by the US government?
Rarely has the world come so close to something so detrimental as the introduction of a weapon even deadlier and more inhumane than poison gas, and analyzing the prevailing thoughts of those involved in the process of building and dropping the bomb, specifically the scientists involved in its creation and the political and war-time officials who had jurisdiction over it, proves vital in understanding the timeline of events as well as why they transpired the way they did. Robert Oppenheimer, war-time head at Los Alamos, once expressed that for scientists, knowledge is a good in itself and that they must constantly learn about the world, how to control it, and how to give back to it.1 However, for the scientists at Los Alamos and the Metallurgical Laboratory, the weight of assessing the moral implications of their work was exceedingly overwhelming. The creation of nuclear weapons during the war imposed upon scientists a new intensity of moral responsibilities; the struggle between maintaining autonomy as a scientist and the moral responsibilities that accompanied the product of their research plagued all the physicists involved. Broadly, there were two positions to take on the bomb: opposing its use or supporting it. Ironically, the physicists with the greatest ties to the political and militaristic realm, i.e. the physicists on the board of the Scientific Advisory Panel, were the ones who recommended the dropping of the bomb, while other prominent physicists involved in the research, such as Leo Szilard and James Franck, heavily opposed the dropping of the bomb in civilian-dense areas. They even crafted reports that laid out why the bomb should not be deployed, along with alternatives that would maintain US nuclear authority and security. While the Scientific Advisory Panel was established partly out of a sincere desire for scientific input, it also served as the ‘token’ that stifled discontent over nuclear policy decisions being made without consultation from the men making the bomb.1 The fact that there was a wartime partnership with the federal government to maintain handsome funding for physics after US victory, coupled with the fact that these scientists were being kept in the dark about war-time plans, signifies a false sense of scientific representation in the political and militaristic realm. The Manhattan Project began as a desperate political race to develop the atom bomb before Germany, but what it paved the way for was the power to deploy an instrument of mass destruction; on June 16th, 1945, a statement from the Scientific Advisory Panel recognized that the weapon should be dropped to “help save American lives” (Oppenheimer), and thus the political doctrine of terror bombing was met with scientific reinforcement. What is ironic is that in the political realm, the question was never whether to drop the bomb, but how.
Obvious as it may be, the enormous amount of time, energy, and funding the government invested meant there had to be a public demonstration of the bomb, and having rejected the idea of a non-combat demonstration, deployment in the manner of strategic bombing was imminent.6 One could even go so far as to attribute the views of the physicists on the Scientific Advisory Panel to the misinformation, secrecy, and lack of communication between the army and the scientists. Interestingly, there is a consistent theme of later moral reconsideration among the scientists who supported the deployment of the bomb. Moral responsibility came to outweigh their decision after the devastation that the bomb caused in Japan was assessed; Truman once unsympathetically recalled that Oppenheimer met with him and spent most of his time wringing his hands, telling him that they had blood on them because of the discovery of atomic energy.8 Oppenheimer realized that hopes for international control of atomic energy proved implausible under the US government; subsequently, he even opposed the H-bomb on moral grounds. While there existed a clear disconnect between the Scientific Advisory Panel and the other prominent physicists in 1945, it is clear that even the scientists affiliated with the political realm were typically kept in the dark regarding war-time plans. While the special responsibilities and burdens that overwhelmed the A-bomb physicists are not mutually exclusive of whether they supported the dropping of the bomb, some physicists, namely James Franck and Leo Szilard, went the extra mile and published the Franck Report, providing clear reasons why the dropping of the bomb would be harmful. The report touches on the nuclear arms race that would ensue and even provides potential, more morally sound alternatives.
Essentially, the Franck Report draws out the reason the atom bomb isn’t simply



an ordinary weapon: it encompasses all the staggering possibilities as a means of political pressure in peace and sudden destruction in war.5 The report extensively elaborates on the long-term consequences of dropping the bomb, especially the instigation of a nuclear arms race. The report significantly mirrors the viewpoints of scientists who advocated for a demonstration of the bomb in an uninhabited area to establish superiority and give Japan the option to surrender, effectively saving thousands of civilian lives and sparing decades of radiation poisoning. Clearly addressing the special responsibility that plagues scientists in the creation of nuclear weaponry, Franck and his co-signers were essentially arguing against the use of the very weapon they had spent half a decade researching and building. While some scientists continued research in hopes that its enormity would lead to world peace through peaceful application, technological imperative motivated others.4 The Franck Report, however, was ineffectual in persuading war-time officials. While the report was written with the long-term goal of international security from arms races and the prevention of cataclysmic nuclear destruction, the army was simply looking for ways to inch closer to victory in the war. The distressed fatalism that underlies the writing of the Franck Report signifies the scientists’ desperation to have their revisions acknowledged, but it proved unsuccessful in swaying the US government. After US victory, there seemed to be a consensus that the bomb had been dropped in order to win the war; however, in the political realm, the bomb was thought of as just another weapon to add to the US arsenal, not as a weapon that would single-handedly win the war. It does not seem that officials deployed it to end the war; rather, they exploited the power of the bomb as a stronger tactic within strategic bombing.
When Truman’s decision to drop the bomb was communicated to General Carl Spaatz of the army air forces, the second point of the order concerned delivering additional bombs to certain targets as soon as they were made ready.6 The fact that the military leaders did not know how many bombs would be sufficient leads one to believe that they both disregarded the social considerations of the bomb and failed to comprehend its intensity and its implications. This is even reflected in the censoring of the Franck Report by Manhattan Project authorities when the power of the bomb was being described. Within the report, the bomb was described as something a million times more powerful than the rocket bomb; the million was crossed out and changed to a thousand. Every week, 20,000 tons of explosives were being dropped on Japan in the course of normal bombing;6 the Little Boy bomb exploded with the energy equivalent of 15,000 tons of TNT, while Fat Man exploded with the equivalent of 20,000 tons of TNT. The atom bomb could cause the same destruction as a week of aerial bombing in a matter of seconds. While in the political and militaristic realm the bomb was added to the arsenal to strengthen the strategic bombing campaign, officials clearly did not understand the intensity of the devastation until after it had been deployed. This is ironic because after the US victory, the victory justified the bomb’s usage; assigning ‘justification’ to the action after the result is not the same as justifying the action itself. Given that the full effect of the bomb,

THE TRIPLE HELIX Spring 2018

© 2018, The Triple Helix, Inc. All rights reserved.


specifically radiation poisoning, was not known for decades after its deployment, it is clear that the US government took the matters of assessing whose lives were worth more into their own hands. This must be kept in mind when considering who is responsible for the deaths of thousand Japanese civilians. War-time officials even exploited the death of the Japanese to inspire post-war security and the peaceful harnessing of the atom’s power, which is particularly questionable given that Eisenhower publicly announced that the bomb could “blackmail mankind into peace.” Whether the building up of nuclear arsenal was an egotistical step or a calculated move for fortifying nuclear-based security, it didn’t prevent other nations from doing the same. The US government undoubtedly set the tide for the development of international nuclear arsenals, and coaxed the media through Eisenhower’s ‘Atoms for Peace’ speech into believing that the US desired nuclear transparency, while actually, the speech served as a cover for the US weapons buildup in the backdrop of the Cold War. The moral implications of dropping the bomb extend not only to thousands of civilian deaths, but also to several enacted policies that have an extremely hazy moral framework. While the burden of moral responsibility weighs heavily on US scientists and should for war-time officials, Japanese civilians paid the ultimate price. Hiroshima actualizes the deaths of thousands of people from mere gruesome statistics to a reality; the real accounts serve to justify the unreal human struggle of those who met Little Boy face-to-face. This unprecedented devastation followed the survivors for decades to come, whether psychologically or physically. Even in Robert Oppenheimer’s later moralizations, he appraised that they, they being the scientists and war-time officials involved in the research and deployment of the bomb, had the pride of thinking they knew what was good for man. 
Combining the shocking reality of what the US bombing did to the civilians of Japan with the emotional accounts of those who survived the first atom bomb raises the messy and unbelievably complex moral question of whether the use of the atomic bomb was inevitable: a question nearly unfathomable in all its complexity.

References

1. Bernstein, B. J. (1988). "Four Physicists and the Bomb: The Early Years 1945-1950." Historical Studies in the Physical and Biological Sciences, 18(2), 231-63.
2. CP-1 Video, "The Day Tomorrow Began."
3. Eisenhower, D. D. "Atoms for Peace." Speech.
4. Farrell, J. J. (1995). "Making (Common) Sense of the Bomb in the First Nuclear War." American Studies, 36(2), 5-41.
5. Franck, J., D. J. Hughes, J. J. Nickson, E. Rabinowitch, G. T. Seaborg, J. C. Stearns, and L. Szilard. "The Franck Report."
6. Gordin, M. D. (2007). Five Days in August: How World War II Became a Nuclear War. Princeton, NJ: Princeton University Press.
7. "Recommendations on the immediate use of nuclear weapons," 16 June 1945, HB, RG 100.
8. Truman to Acheson, 7 May 1946, PSA.

© 2018, The Triple Helix, Inc. All rights reserved.

THE TRIPLE HELIX Spring 2018



The Triple Helix International Leadership

The Triple Helix, Inc. is an undergraduate, student-run organization dedicated to the promotion of interdisciplinary discussion. We encourage critical analysis of legally and socially important issues in science and promote the exchange of ideas. Our flagship publication, the Science in Society Review, and our online blog, The Triple Helix Online, provide research-based perspectives on pertinent scientific issues facing society today. Our students, spread across twenty chapters at some of the most renowned universities in the world, form a diverse, intellectual, and global society. We aim to inspire scientific curiosity and discovery, encouraging undergraduates to explore interdisciplinary careers that push traditional professional boundaries. In doing so, we hope to create global citizen scientists. www.thetriplehelix.uchicago.edu



