Science in Society Review - Winter 2013


Winter 2013 | University of Chicago

ISSN 2164-4314

Nuclear Sovereignty: Radioactive Waste Dumping and Native Lands

LIGHTEN UP! Optogenetics as a medical treatment—and a barrier to metaphysics
The Mysterious Function of the Female Orgasm
On the Origin of Knowledge: an Evolutionary Enquiry into the Structure of Perception



EXECUTIVE MANAGEMENT TEAM
Chief Executive Officer Mridula Nadamuni
Chief Operating Officer, Asia Worapol Ratanapan
Chief Operating Officer, Australia Kristijan Jovanoski
Chief Operating Officer, North America Benjamin Dauber
Chief Marketing Officer Megana Roopreddy
Chief Production Officer Cassie Yeh
Chief Technology Officer Lauren Beck
Executive Editor-In-Chief, Print Publication Dhruba Banerjee
Executive Director, E-Publishing Edgar Pal
Executive Director, High School Outreach Kathryn Scheckel
Executive Director, Internal Affairs Brittany Hsu
Executive Director, Science Policy Yucheng Pan

INTERNATIONAL STAFF
Senior Literary Editors
Harrison Specht, Mary Fei, Michael Graw, Pallavi Basu, Titas Banerjee, Victoria Phan
Senior Production Editors
Felice Chan, Cornell; Judy Chan, Cornell; Andrew Kam, UChicago; Emmy Tsang, Cambridge

Senior E-Publishing Editors
Venkat Boddapati, Fili Bogdanic, Irene Ching, Jae Kwan Jang, Evan Jin, Arthur Jurao, Prathima Radhakrishnan

BOARD OF DIRECTORS
Chairman Erwin Wang
Vice Chairman Kalil Abdullah
Board Members
Manisha Bhattacharya, Jennifer Ong, Zain Pasha, Julia Piper, James Shepherd, Jennifer Yang

TRIPLE HELIX CHAPTERS
North America Chapters
Arizona State University; Brown University; Cornell University; Carnegie Mellon University; Georgia Institute of Technology; George Washington University; Georgetown University; The Harker School; Harvard University; Johns Hopkins University; The Ohio State University; University of California, Berkeley; University of California, Davis; University of California, San Diego; University of Chicago; Yale University
Europe Chapter
Cambridge University
Asia Chapter
National University of Singapore
Australia Chapter
University of Melbourne

THE TRIPLE HELIX A global forum for science in society

The Triple Helix, Inc. is the world's largest completely student-run organization dedicated to taking an interdisciplinary approach toward evaluating the true impact of historical and modern advances in science.

Work with tomorrow's leaders
Our international operations unite talented undergraduates with a drive for excellence at over 25 top universities around the world.

Imagine your readership
Bring fresh perspectives and your own analysis to our academic journal, The Science in Society Review, which publishes International Features across all of our chapters.

Reach our global audience
The E-publishing division showcases the latest in scientific breakthroughs and policy developments through editorials and multimedia presentations.

Catalyze change and shape the future
Our new Science Policy Division will engage students, academic institutions, public leaders, and the community in discussion and debate about the most pressing and complex issues that face our world today.

All of the students involved in The Triple Helix understand that the fast pace of scientific innovation only further underscores the importance of examining the ethical, economic, social, and legal implications of new ideas and technologies — only then can we completely understand how they will change our everyday lives, and perhaps even the norms of our society. Come join us!

The cover for this year's Science in Society Review is designed by Katrina Machado of Brown University. It illustrates a choice, from Navajo folklore, between sustenance and destruction. As the story goes, the Navajo people had to decide between two yellow powders: corn, or a substance from the underworld that would lead to snakes destroying the world. The tale is an allegory of the dilemma between public health and economic health that Native Americans in radioactive waste-contaminated lands face today.




TABLE OF CONTENTS

9 Optogenetics: Using light for healing could be a new treatment for depression
15 The Female Orgasm: Understanding female sexuality to better manage sexual health
22 Perception: How evolution and brain structure shape human perception

Cover Article
6 Nuclear Sovereignty: Radioactive Waste Dumping and Native Lands (Alysse Austin)

Local Articles
9 LIGHTEN UP! Optogenetics as a medical treatment—and a barrier to metaphysics (Aleksandra Augustynowicz)
12 Personal Genomics: A Double-edged Sword (Aleks Penev)
15 The Mysterious Function of the Female Orgasm (Claire Wilson)
18 Molecular Gastronomy: Where Food and Science Collide (Leigh Alon)
22 On the Origin of Knowledge: an Evolutionary Enquiry into the Structure of Perception (Daichi Ueda)
25 How Synthetic Biology Creates Scientific Knowledge (Taylor Coplen)
28 Toward a Behavioral Model of International Trade (Daniel Benner)

Digital Highlights / International Features
32 American Discussion of Climate Change: Is It Possible for Science to Inform Policy? (Taylor A. Murray, ASU)
38 The Potential of Viruses in Medical Treatments (Elizabeth Richardson, Cambridge)
40 Reading the Labels: Off-label Drug Prescription (Hillary Yu, Cornell)

Cover design courtesy of Katrina Machado, Brown University


INSIDE TTH
STAFF AT UCHICAGO
President
Tae Yeon Kim

Message from the Chapter Leadership

President Emeritus Benjamin Dauber

Dear Reader,

Co-Vice Presidents
Jawad Arshad
Missy Cheng

It is our great pleasure to present you with the Fall 2012 issue of The Science in Society Review. We strive to publish articles that uphold the mission of The Triple Helix, Inc. – demonstrating rigorous investigations of particular scientific fields and situating the study within greater societal implications. This issue incorporates a diverse group of perspectives, featuring undergraduate writers who have worked closely with the editorial staff and faculty reviewers to craft pieces that carefully examine the wide array of pressing issues in today’s society. It is our hope that these varied viewpoints will contribute to the ongoing discussion within our campus and foster further inquiry beyond our community. We invite you to join the conversation.

Director of Marketing
Luciana Steinert
Associate Directors of Marketing
Daniel Cheng
Valeria Contreras
Print Editor-in-Chief
Lily Gabaree
Print Managing Editors
Jack Bliamptis
Melissa Cheng
Tamsin Parzen
Luciana Steinert

Lily Gabaree Editor-in-Chief of Print

Tae Yeon Kim President

Benjamin Dauber President Emeritus

Managing Editor, Scientia
Patrick Delaney
Director of Production
Andrew Kam
Associate Director of Production
Christina Chan
Production Editors
Shelby Winans, Charles Pena, Kat Cheng, Emma Cervantes
Writers
Leigh Alon, Aleksandra Augustynowicz, Daniel Benner, Taylor Coplen, Aleks Penev, Daichi Ueda, Claire Wilson
Associate Editors
Doni Bloomfield, Mehnaaz Chowdhury, Amanda Hartman, Da Hei Ku, Tyler Lutz, Sylwia Nowak, Frank Qian, Austen Smith, Varun Suri, Thomas Wagner, Eric Zhang
Faculty Review Board
Eric A. Schwatz, MD; Michael Rust, PhD; Samuel Kortum, PhD; David A. Weitz, PhD; Professor Dario Maestripieri, PhD; Marion Verp, MD; Darrel Waggoner, MD; Brandon Fogel, PhD; Stefano Allesina, PhD
Events Coordinators
Catherine Castro, Evan Jin, Deniz Cem Özensoy, Cecilia Jiang
E-Publishing Editor-in-Chief
Gregor Siegmund

Local News

Dear Reader,

This school year, our chapter has seen a number of exciting developments. Together with the more than 300 active members of our Chicago TTH chapter, we accepted the University of Chicago William J. Michel Award for Best Registered Student Organization. We continue to work closely with an ever-increasing number of faculty members, and have notably acquired the support of the founding Pritzker Director of the brand-new University of Chicago Institute for Molecular Engineering, Matthew Tirrell, and his department. We have expanded our local organization so that now, we can confidently say that there is a place here for each and every one of our fellow college students. We have consciously and dramatically increased the size of our production, marketing and events teams, and have watched our group of talented writers and editors grow to unprecedented levels. In fact, we have further expanded the intellectual diversity of our chapter, with members having declared more than 30 of the University's different majors and minors. The second issue of our journal of original UChicago undergraduate research, Scientia, is currently in production. Finally, we have celebrated the promotion of Chicago members Edgar Pal and Benjamin Dauber to international positions - as Executive Director of E-Publishing and Chief Operating Officer, North America, respectively. All in all, we are proud to be able to say that TTH at the University of Chicago is the place to be now, and in the years to come!

Luciana Steinert Director of Marketing

E-Publishing Managing Editors
Andrew Kam
Deborah Olaleye
Vicki Yang



© 2013, The Triple Helix, Inc. All rights reserved.



INSIDE TTH

Message from the CEO

On the face of it, The Triple Helix International is an undergraduate science-policy journalism organization. Dig a little deeper, and you will find a talented group of students interested in exploring the implications of scientific advancement on society. We share our passions through scholarly reflections printed in our biannual journal, provide timely, in-depth analyses on our e-publishing blog site, host lecture series and invest in high school outreach. Providing all of these services is no small feat. A team of tireless content editors, business managers, production editors, event coordinators, and more are working at every level of the organization to facilitate success. To us, the organization is not just an extracurricular; it is a family. Alumni, your Triple Helix journey need not end with graduation. Keep in touch and let us celebrate your successes with you.

Our past successes have set the stage for greater heights. TTH Online continues to grow, and the next step is going mobile, reaching wider audiences with our thought-provoking content. Ultimately, we aim to create an open forum for learning, and we can only do it with your help. Join us on Facebook, Twitter and LinkedIn. Let us know what you think. Voice your opinions, make yourself heard, and make time to enjoy the fruits of your labors. I, for one, am sincerely looking forward to another great year for TTH.

Mridula Nadamuni Chief Executive Officer

Message from the CPO and EEiC

Welcome to The Science in Society Review, the print publication of The Triple Helix. In the following pages you will find engaging discussions written by talented undergraduate writers, assisted by dedicated editors, and laid out by production editors. Each article is unique in its subject matter, but all weave a common theme of how science, society, and law interact in the world. Our cover article, "Nuclear Sovereignty: Radioactive Waste Dumping and Native Lands" by Alysse Austin of Brown University, for instance, asks us to consider how our need to dispose of nuclear waste has trumped respect for disenfranchised groups of people. The practice of dumping nuclear wastes on tribal lands to help alleviate dire economic conditions is only one of many examples of how socioeconomic health clashes—ironically—with public health in our flawed socio-eco-political system.

We at The Triple Helix encourage you to consider, as you read, the words of Chief Seattle: "All things are connected like the blood that unites us all. Man did not weave the web of life, he is merely a strand in it. Whatever he does to the web, he does to himself." To the next generation of world problem-solvers, remember that complex issues require interdisciplinary solutions. Our name "The Triple Helix" represents that "web" of intertwined disciplines: science, society and law. It offers one way to approach these problems. Consider this, and feel free to join our global forum for science in society at triplehelixblog.com. It is a small first step toward seeking solutions "The Triple Helix" way.

On behalf of the entire International Editorial Board, welcome to our journal. We hope you will find some fascinating reads here, and will return for our next issue in Spring 2013!

Dhruba Banerjee and Cassie Yeh Executive Editor-in-Chief and Chief Production Officer







Nuclear Sovereignty: Radioactive Waste Dumping and Native Lands Alysse Austin

The warning sign that often signals the boundary of a nuclear waste repository is, appropriately, a foreboding one. The fan-like, black-and-yellow trefoil symbol has become a universal archetype for mysterious operations, for eerie desolation, for sinister danger. Many have only rarely, if ever, encountered these signs as thankfully most people are never situated near a long-term nuclear waste site. "Not in my backyard," American citizens adamantly declare. The reaction on the part of these NIMBY campaigners with respect to nuclear repositories is understandable; nobody wants a radioactive waste site situated on the land in which they build their homes and lives. But such waste has to go somewhere, so where do these signs exist? The answer is a dark reflection of one aspect of America's nuclear energy policy. One of the most surreptitious and tragically common methods of disposing of nuclear materials in the United States is to "site," or place, repositories on sovereign or semi-sovereign Native American tribal lands – a practice which bears the gilding of legality but completely defies respect for our fellow citizens and for environmental justice. The practice is rarely questioned but hopefully will be more closely scrutinized as the possibility of a future more reliant on nuclear energy increases.

The entanglement of Native American tribes and hazardous radioactive waste goes back several decades, but the treaties binding the U.S. government to civil treatment of indigenous land can be traced back to well over a century ago. In June of 1855, in what is today Washington State, the Yakama Tribe signed an agreement with the federal government, ceding a plot of desert land along the westward Cascade Mountains to the United States [1]. The plot of land encompassed a piece of the Columbia River, and the treaty enumerated the rights of the Yakama to use and continue to subsist on the ceded land and water resources. Over a century and a half later, the Yakama nation still possesses the rights to fish and use other resources in the area—at their own risk.

Today, the danger of gleaning resources from that particular segment of the Columbia lies in the fact that this land is now home to the most contaminated toxic waste site in North America [2]. It's called the Hanford Site, and it was a fully-operating nuclear production facility from 1944 until 1983 [3]. It housed the plutonium reactors used to manufacture the infamous Fat Man, dropped on Nagasaki in the last days of World War II. The cores of its nine nuclear reactors were cooled by the waters of the Columbia River on which the Yakama people depend, which at the time of the facility's operation was purportedly the most radioactive stream in the world [4]. Declared a Superfund site by the Environmental Protection Agency a year after ceasing operation, today the Hanford Site is still undergoing the cleanup process, with the Department of Energy spending an additional $2 billion a year to restore it to its natural state [5].

The Yakama, along with the nearby Umatilla and Nez Perce tribes, have been active in holding the federal government accountable for the site. In addition to lawsuits, such as the 2003 case urging the federal government to adequately restore the fish runs in this territory (as only the land, not the waters, were deemed eligible for cleanup by the EPA), the Tribal Council has submitted comments on the restoration process and approached officials to remind them of the land treaty to which the United States is still bound [6]. They received less than hopeful responses. In a 2007 speech to Heart of America Northwest, an organization dedicated to the Hanford site cleanup, head of the Yakama tribe's Environmental Restoration and Waste Management Program Russell Jim said, "As a Tribal Councilman in the mid-1970's, I made government-to-government requests to Hanford officials, to inform them of their trust responsibility and of our Treaty rights...I was often met with a response along the lines of, 'What Treaty–what does that have to do with Hanford?'" [7].

Reproduced from [20]



The ongoing account of the Hanford Site is one of various scenarios throughout the United States in which a native tribe has faced or is facing the threat of nuclear waste (imposed by both governmental organizations and the private sector) within its community. Other examples include that of the Goshute tribe and the White Mesa Ute tribe in Utah, the Gila River Indian community in Arizona, the Fort Mojave, Chemehuevi, Quechan, Cocopah, and Colorado River Indian Tribes in Southern California, the Navajo Tribe throughout the southwest, and countless others [8]. The difference between these tribes' struggles and the case of the Hanford Site is that the latter demonstrates indigenous tribes' shared desire to mitigate harm to their communities; their investment in their own environmental safety is unhindered by any conflict of interest. But as in the case of the other aforementioned tribes, there is an insidious facet to the narrative linking Native American lands with nuclear byproducts -- a caveat which pits public health against economic health and which further contorts the state of environmental justice for indigenous groups in the United States. It is the fact that there can be substantial economic incentive attached to storing nuclear waste on tribal lands or to selling lands for the same purpose, which makes the prospect attractive to tribal leaders [11].

While siting is a voluntary process on the part of the site's owners, some activists, like Winona LaDuke, head of indigenous sustainability group Honor the Earth, are referring to the siting process as "radioactive colonialism" [13]. The settlements springing up in the midst of tribal lands in today's nuclear age are not colonized by human beings, but by cancerous materials -- painful demonstrations of the lack of respect for age-old treaties and the disregard for human welfare on the part of the United States. The Navajo tribe's struggle with cancer ever since Cold War-era uranium mining on their lands reminds us of the reality of the risks. The Navajo had previously experienced relatively lower cancer rates than the general population, but after years of inhaling uranium dust, drinking water which had dwelt among uranium pit mines, and eating foods which, while in their living state, had absorbed the radioactive substances, the cancer rate in their community doubled between the 1970s and 1990s (while the nation as a whole experienced a decline in cancer) [14].

This issue is vital to track as the United States explores alternative energy programs and attempts to lessen its dependence on fossil fuels over time, as nuclear energy is a viable future option. Nuclear energy currently supplies twenty percent of American power, and the history of nuclear waste storage in the United States reveals a disproportionate amount of off-site waste residing on Native American reservations [11, 12].

Reproduced from [21]






Nuclear waste storage is heavily regulated in the United States, but as reservations are sovereign or semi-sovereign territories, they are not necessarily bound by the same regulatory laws. This puts their citizens in a vulnerable place, as twice as many Native American families are below the poverty line as the rest of the population and tribes are therefore less likely to reject a source of income, even if it comes at a non-monetary cost: the risk of radioactive exposure and associated health effects [11]. Such is the basis for the conflicting dichotomy of interests at stake, and such is the truly sinister aspect of hazardous nuclear waste siting on indigenous lands.

As early as 1986, the Department of Energy was looking into Yucca Mountain, Nevada, at this time the property of the Western Shoshone Tribe, as a potential nuclear waste repository [15]. The 2004 Western Shoshone Distribution Claims Act allowed the U.S. government to buy 24 million acres of land--including Yucca Mountain, sacred to the Western Shoshone people--from the sovereign tribe for $186 million, distributed by the Bureau of Indian Affairs [16]. Approved in 2002 by Congress, the Yucca Mountain nuclear waste plan would effectively wedge the Western Shoshone people between a high-level radioactive waste repository and the already-existing Nevada Test Site, a nuclear weapons testing range. The proposal was met with widespread disapproval from environmentalists, Nevada citizens, and Native groups alike and was eventually scrapped in 2010. A victory for the Western Shoshone, perhaps, but an indirect one; the Nuclear Regulatory Commission cited "budgetary concerns" over all else as the reason the plan was nixed [17,18]. A 1991 report on the consideration of Yucca Mountain as a waste repository suggests that indigenous interests would have little to do with swaying the decision to build or not to build. "It is doubtful," the report explains, "given the mood of the federal courts on views of the sacred by Native American[s], that they alone will impede the project" [15].

It is time that the United States develops an ethical nuclear waste disposal program that does not prey upon economic disadvantage. A piece of the Navajo creation story describes the Navajo people being given a choice between two yellow powders for sustenance: one is yellow corn powder, and the other is a substance from the ground [19]. They called this substance leetso in ancient times, but it is now recognized as radioactive uranium ore. In the story they choose the yellow corn, which represents life and vitality, and decide that leetso is from the underworld, a substance meant to stay in the earth undisturbed lest it transform into a snake above the earth's surface and wreak havoc. This ancient story is an eerie augury of the problem Native American communities now must grapple with; it is the push and pull between that which will sustain them and that which will cause them to languish. In recent decades, sadly, the separation between the two has been muddled.

Alysse Austin is an Environmental Studies major at Brown University and grew up in Northern Nevada.

References
1. Stevens, Isaac L. Treaty With the Yakama, 1855. Territory of Washington; June 1855. 2. Economic risks to the region - Hanford, the Columbia River and the economy [Internet] 2009 [cited 2012 Mar 12]. Available from: http://www.ecy.wa.gov/features/hanford/hanfordecon.html. 3.
Hanford Overview [Internet]. 2012 Jan 8 [cited 2012 Mar 12]; Available from: http:// www.hanford.gov/page.cfm/HanfordOverview. 4. Alvarez, Robert. Poisoning the Yakama. Counterpunch [Internet]. 2010 Dec 17 [cited 2012 Mar 12]. Available from: http://www.counterpunch.org/2010/12/17/ poisoning-the-yakama/. 5. Frank, Joshua. Hanford’s Toxic Avenger’s. Seattle Weekly [Internet]. 2012 Feb 22 [cited 2012 Mar 12]. Available from: http://www.seattleweekly.com/2012-02-22/news/ hanford-s-toxic-avengers/. 6. Yakama Nation Seeks to Expand Hanford Lawsuit. U.S. Water News Online [Internet]. October 2003 [cited 2012 Mar 12]. Available from: http://www. uswaternews.com/archives/arcrights/3yaknat10.html. 7. Jim, Russell. The Yakama Nation & Restoration of the Hanford Site. Heart of America Northwest [Internet]. 2007 June 4 [cited 2012 Mar 12]. 8. Environmental Health, Justice, and Sovereignty. Greenaction [Internet]. Available from: http://www.greenaction.org/indigenouslands/index.shtml. 9. Native Americans: Uranium Mining/Nuclear Testing/Nuclear Dumping. Friends of the Earth [Internet]. Available from: http://www.motherearth.org/h-rights/america. php. 10. Understanding Radiation: Health Effects. USEPA [Internet]. 2011 July 8 [cited 2012 April 25]. Available from: http://www.epa.gov/radiation/understand/health_ effects.html#q1. 11. Reservations about Toxic Waste: Native American Tribes Encouraged to Turn Down Lucrative Hazardous Disposal Deals. Scientific American [Internet]. 2010 Mar 31 [cited 2012 Mar]. Available from: http://www.scientificamerican.com/article. cfm?id=earth-talk-reservations-about-toxic-waste. 12. United States Environmental Protection Agency. Nuclear Energy [Internet]. 2012


Feb 29 [cited 2012 Mar 14]. Available from: http://www.epa.gov/cleanenergy/energyand-you/affect/nuclear.html. 13. Gowda M.V.R., Easterling D., Nuclear Waste and Native America. Risk, Health, Safety and Environment 1998; 9(3); 248. 14. Fowler, Catherine S. Native Americans and Yucca Mountain: A Revised and Updated Summary Report on Research Undertaken Between 1987 and 1991. State of Nevada Agency for Nuclear Projects. 1991 October [cited 2012 Mar 12]. Available from: http://www.state.nv.us/nucwaste/library/se-039-91v1.pdf. 15. Pasternak, Judy. A Peril that Dwelt Among the Navajos. Los Angeles Times (Los Angeles, California) [Internet]. 2006 Nov 19 [cited 2012 May 21]. Available from: http://www.latimes.com/news/la-na-navajo19nov19,0,89720,full.story. 16. Harding, Adella. BIA updates Western Shoshone on Money. Elko Daily Free Press (Elko, Nevada) [Internet]. 2010 June 21 [cited 2012 Mar 12]. Available from: http:// elkodaily.com/news/local/article_0a5a56c8-7d0a-11df-aaca-001cc4c002e0.html. 17. Environmental Justice Case Study: The Yucca Mountain High-Level Nuclear Waste Repository and the Western Shoshone. University of Michigan School of Natural Resources and Environment [Internet]. 2004 June 18 [cited 2012 Mar 12]. 18. Winter, Michael. NRC Clears Way for Scrapping Yucca Mt. Nuke Dump. USA Today [Internet]. 2011 Sept. 11 [cited 2012 Mar 12]. Available from: http://content. usatoday.com/communities/ondeadline/post/2011/09/nrc-clears-way-for-scrappingyucca-mt-nuke-dump-/1#.T2YUHWJWpJ8. 19. Environmental Justice for the Navajo: Uranium Mining in the Southwest. University of Michigan School of Natural Resources and Environment [Internet]. 2004 June 18 [cited 2012 Mar 12]. Available from: http://www.umich.edu/~snre492/ sdancy.html. 20. Hanford Site. Wikipedia [Internet]. 2005 January [cited 2012 August 11]. Available from: http://en.wikipedia.org/wiki/File:Hanford_Site_sign.jpg 21. Yucca Mountain Nuclear Waste Repository. Nevada Division of Environmental Protection [Internet]. [cited 2012 Aug 11]. Available from: http://ndep.nv.gov/boff/ yucca03.jpg.





LIGHTEN UP! Optogenetics as a medical treatment—and a barrier to metaphysics Aleksandra Augustynowicz

Imagine a healing pinpoint of light, fixed in the center of your forehead, radiating beams of energy throughout your body—a snippet from your local meditation class, or a clairvoyant glimpse of the future. About 121 million people worldwide suffer from depression, and in 2000 the illness ranked fourth in the global burden of disease [1]. It is expected to rise to second place by 2020, affecting both sexes and all ages [1]. Seven out of every thousand adults worldwide have schizophrenia [1]. Millions are struggling with anxiety, while others are battling neurodegenerative diseases. Current treatments are not always effective due to the brain's many still-unresolved mysteries. "If the human brain were so simple that we could understand it, we would be so simple that we couldn't," quipped Ian Stewart, science fiction writer and professor of mathematics

at the University of Warwick. This statement does not take into account the new field of optogenetics, which can now tease apart the workings of the nervous system. Optogenetics depends on transfecting target cells with viral vectors to express protein ion channels that respond to different wavelengths of light. Blue light stimulates proteins taken from the freshwater alga Chlamydomonas reinhardtii, green light affects those from its relative Volvox carteri, and yellow works on those of the archaebacterium Natronomonas pharaonis. Once the proteins are expressed on the target cell membrane, a wire ending in a small light is guided to the desired area of the brain. Turning on the light stimulates the proteins. When activated, they regulate the electrical activity of cells. Pulses of blue, green, or yellow wavelengths can be used to select

Reproduced from [11]






and excite the desired cells without affecting the functions of their neighboring cells [2]. In this way, optogenetics gives a way to modulate specific neuronal pathways, as appropriate stimulations evoke responses like movement, altered behavior, or even memory retrieval. Studying these selective pathways and observing their effects brings us ever closer to understanding brain function. The field links behaviors and moods to the electrical activity of identified cells. Though optogenetics could prove to be a powerful tool in medicine, this new dawn for mental health studies and the unveiled brain could bring on blindingly bright discoveries about ourselves—it could upset the way we think of emotions and interpersonal relations. It debunks ontological questions of identity and existence.

Five years ago Jay Leno joked about an optogenetically controlled fly pestering George W. Bush. However, rather than a novel mode of political activism, optogenetics could be the new treatment therapy for depression or addiction. It provides insight into pleasure-system pathologies and mechanisms of substance abuse by facilitating the study of groups of dopaminergic and cholinergic neurons. Dopamine plays an important role in the control of movement, emotion, and pleasure or pain sensations [3], while cholinergic receptor activation modulates numerous complex and often opposing biological processes [4]. Studying these pathways therefore requires a method that is selective and temporally precise. Ilana Witten and colleagues from Princeton University have identified patterns in reward pathways by pulsing light onto chosen receptors in transfected mice. The neurons selected were sensitive to cocaine, and the study suggests that optogenetics can block cocaine conditioning in mammals. The control of the microcircuit disrupts the effects of drug abuse [4]. The scientific progress is not just a news story: it affects people's personal lives. For example, the same reward pathway might be turned off in a mother with a cocaine addiction, and allow her to focus on those she loves, making optogenetics a potent tool.

This tool was used to probe the mechanism behind depression with a study of the prefrontal cortex neurons in mice; humans have corresponding neurons that are thought to contribute to the disease. The mice were subjected to chronic social defeat stress—a model of bullying—by being exposed to an aggressive mouse. After repeated contact, the stressed mice showed social avoidance—a sign of depression. Neurons in the mouse brain showed gene expression that mimicked tissues from the prefrontal cortex of clinically depressed postmortem human patients. Once the model was generated, a fiber was inserted to light up and stimulate the prefrontal cortex. Upon treatment, the mice fully recovered from depression symptoms—they began to interact with other mice. The optogenetic manipulation of cortical neuron firing highlighted the key activity of the prefrontal cortex in depression-like behavior and showed the possibility of counteracting depression symptoms. Evidence suggested that electrical stimulation can relieve depression that is resistant to other treatments [5].

Anxiety, the most common among psychiatric disorders, has been linked to the amygdala—a region in the brain associated with emotional processing [6]. The neural mechanisms behind the condition, however, were unknown until precise optogenetic stimulation of mouse basolateral amygdala terminals showed a reversal of anxiety symptoms. Conversely, when the cells in this region were optogenetically inhibited through glutamate receptor antagonism, anxious behavior increased [6]. Could the calming effect of waving lighters during a Pink Floyd concert be optogenetics' new color in the band's iconic prism?

Reproduced from [11]




Because of the brain's complexity, chemical treatments are limited for psychiatric disorders that are due to chemical imbalances in the brain. The networks of synapses stretch far beyond local chemical composition. Medications acting to restore the chemical balances affect neuron firing in the whole circuitry of the brain rather than solely targeting the defective link. Treatment effectiveness varies. Imagine doing surgery blindfolded, jabbing tissue without knowing where your scalpel lands. The same problem of generality applies to methods of study such as electric shock or viral strategies. Optogenetic targeting, on the other hand, can yield 98% specific light-sensitive protein expression in desired dopamine neurons of transgenic rats, which then allows for selective control with exceptional temporal precision—the stimulus does not affect nearby cells that are not meant to be excited [7]. "We know the human brain is a device to keep the ears from grating on one another," author Peter De Vries once wrote. Thanks to optogenetics, we can start to hone in on the mechanisms within the mysterious brain. Whereas an electrical shock might simultaneously stimulate various antagonistic activating and deactivating neurons in the medial prefrontal cortex, muddling or cancelling out their effects, the light stimulation allows for quick selection between them.

Optogenetics' potential to treat neurological diseases was shown in mice with autism and schizophrenia, which correspond to defects in neural interactions. An elevated ratio of cortical cellular excitation to inhibition (E/I balance) in cells contributes to disease symptoms. After activation of their prefrontal cortical excitatory neurons with light, autistic mice showed an increased preference for social interaction [8]. This study gave insight into cell and circuit level changes that would not have been possible with just pharmacological manipulation. It leads us closer to understanding the pathophysiology behind social and information-processing dysfunctions.

We've come far in our Odyssey across the nervous system, hacking away, shocking, dousing, and stimulating. We know the general architecture of the brain and the responsibilities of each part. With time, our quest of the overall "How?" reveals more and more intricate details. Instead of just knowing the area of the brain responsible for a reaction, we can now target the specific subgroups of neurons. These can be studied with temporal specificity and spatial selection. The new method provides precision not possible with electrodes or drugs. It provides insight into how neural circuits coordinate reward-driven learning and decision making. The mechanisms optogenetics illuminate within "Cogito" lead us ever closer to understanding "ergo sum."

The promising knowledge we gain through these studies has a price: our current understanding of ourselves. If a mother's addiction could be scientifically probed, studied, and manipulated at the neural level, so could her maternal instincts. We could end up classifying love as just a series of equations and cellular events—a pleasure-reward pathway. Our perception of the feeling would become a collection of electrical signals in determined cells. The knowledge may cause disillusion: what would happen to the magical mystery that moved Petrarch to compose? What makes a human special apart from his physical being is his love, his desire to philosophize, his ability to dream and imagine. Reducing the metaphysical qualities to electrical calculations might upset the classic idea of "man." An optogenetic dissection of the amygdala, hypothalamus and orbitofrontal cortex to discover treatment for mental illnesses could also mean a dehumanization of emotions: chemical and physical processes. Psychopathologies arise with a lack of empathy [9], but their study would expose biological motivations behind the feeling. We have to understand that as the ultimate Homo sapiens, literally the 'knowing man'—having insight into the very organ that allows us to see within ourselves—we gain knowledge that comes with the price of the magic of individualism.

Looking into the future of neuroscience, Francis Crick predicted, "When we finally understand scientifically our perceptions, our thoughts, our emotions and our actions—hopefully sometime in the 21st century—it is more than likely that our view of ourselves, and our place in the universe will be totally transformed" [10]. Our possibilities will have transformed along with our views. Despite our immense progress we still have a long way to go. As of right now, if Francis Crick had read enough Sir Arthur Conan Doyle, perhaps he might have quoted Sherlock Holmes: "I am a brain, Watson. The rest of me is a mere appendix."

Aleksandra Augustynowicz '14 is majoring in Biological Sciences and Spanish Language and Literature at the University of Chicago. She is currently working in Elizabeth McNally's molecular cardiology lab and is aiming at a career in medicine, possibly psychiatry.

References
1. Mental Health: Depression. World Health Organization. [Online]. Available: http://www.who.int/mental_health/management/depression/definition/en/. [2012, July 09]. 2. Deisseroth, K. "Controlling the Brain with Light." Scientific American. November 2010. 49-51. 3. Best, J.A., Nijhout, F.H., Reed, M.C. Homeostatic mechanisms in dopamine synthesis and release: a mathematical model. Theoretical Biology and Medical Modeling. September 2009; 6:21. 4. Witten, IB. et al Cholinergic interneurons control local circuit activity and cocaine conditioning. Science. December 17 2010; 330 (6011):1677-1681. 5. Covington, H et al. Antidepressant effect of optogenetic stimulation of the medial prefrontal cortex. The Journal of Neuroscience, December 1 2010; 30(48):16082–16090 6. Tae KM et al. Amygdala circuitry mediating reversible and bidirectional control of



anxiety. Nature Mar 17 2011; 471: 358-362. 7. Witten, IB et al. Recombinase-driver rat lines: tools, techniques, and optogenetic application to dopamine mediated reinforcement. Neuron, December 8 2011; 72: 721-733. 8. Yizhar O, et al. Neocortical Excitation/Inhibition balance in information processing and social dysfunction. Nature September 08 2011; 477: 171-178. 9. Decety, J. Dissecting the neural mechanisms mediating empathy. Emotion Review 2011; 3:92 10. Crick F. The impact of molecular biology on neuroscience. The Royal Society 1999; 354: 021-025. 11. http://forumblog.org/wp-content/uploads/slide-002-728-1.jpg 12. http://www.nih.gov/researchmatters/march2011/images/anxiety_l.jpg





Personal Genomics: A Double-edged Sword Aleks Penev

In 2003, the completion of the Human Genome Project was heralded as a new dawn in the fields of human genetics and disease. The consensus sequence for the majority of the human genome was now available, annotated, and ready to support countless explorations into human genetic conditions. With this accomplishment, scientists set their sights on an even loftier goal: making cost-efficient genome sequencing available for any species. This would grant genetic studies using any species as a model organism, from different strains of mice to obscure invertebrates, unprecedented flexibility in designing experiments to understand the multitude of molecular interactions that underlie biological life. However, the most potent use for fast and cost-effective whole-genome sequencing would be in the clinical sphere. The potential for preemptively recognizing disease susceptibility and prescribing preventive measures before the pathological signs of disease are apparent is staggering: not only would the quality of life for patients with chronic disease improve drastically, but the overall load on the healthcare system, and the associated costs, would dramatically decrease as well. This past January, Life Technologies announced their system for generating a complete genomic sequence for $1,000, putting the possibilities of clinically relevant personal genomics well within reach. However, this advancement, like most revolutionary tools, comes with its own dangers; the possibility of genetic discrimination and potential for over-reliance on the as-yet simplistic method of sequence interpretation are now becoming real concerns. Therefore, this powerful new tool in preventive medicine should be welcomed not only with enthusiasm, but also pragmatism concerning its consequences.

Decades before the Human Genome Project, the first genome, belonging to a bacterium, was sequenced in the mid-70s; once the polymerase chain reaction (PCR) was developed in the 1980s, other organisms' genomes could be amplified enough to sequence [1]. Though many sequencing methods existed, the most popular was the Sanger method, which uses a clever experimental design, involving the nature of the four primary nucleotides that make up the primary sequence of DNA and the highly specific shapes and chemistry of the chemical bonds that hold them together. The resulting PCR product is a mixture of fragments that terminate at modified nucleotide insertion sites which can be run on a gel side-by-side, and based on where the fragment stops the experimenter can "read" the genetic sequence from one end of the gel to the other [1]. This technique is easy to set up and interpret, without the need for advanced technology or computers, so even small labs can do a limited amount of sequencing. The drawback, however, is that one can only read 50-100 nucleotides at a time, depending on how long the gel is made. For even a small genome - the average bacterial genome has one million base-pairs - this process would take an excruciatingly long amount of time, since the PCR reaction and gel preparation takes the better part of 8 hours [1]. While this method is clever in design and simple to implement, the length of time necessary to sequence the entire human genome (approximately 3.2 billion base pairs) is inordinately long, especially considering that one needs to sequence overlapping fragments to "stitch" the pieces together. Fortunately, newer methods that allowed for faster sequencing came to the forefront around the turn of the century. These methods use machines that can do real-time analysis of nucleotide addition onto a solid substrate by using lasers and fluorescently labeled nucleotides, and greatly improved the speed and mechanization of the sequencing process [1]. The most well-known version of this form of sequencing is called Illumina sequencing, which has been and still is the industry standard for sequencing genomes because it is a much faster and cheaper - but still accurate - alternative to the Sanger-style sequencing. The Human Genome Project cost approximately $3 billion, whereas the sequencing of a similarly sized genome using Illumina cost a fraction of that (approximately $100,000) in the years following the completion of the Project [1].

However, this was still not the quick and affordable sequencing that scientists were after. In 2006, the Archon X Prize was announced, promising a prize of $10 million to any company that could develop a method to sequence 100 human genomes with sufficient accuracy in the span of 10 days for less than $10,000 per genome. This past January, Life Technologies announced their Benchtop Ion Proton Sequencer system, which completely surpasses the goals of the originally stated project. This system is reportedly capable of sequencing an entire human genome in as little as one day, for the astounding cost of $1,000 [2]. The previous industry leader did the same in the timespan of several weeks to months, for approximately 5-10 times the cost. While the technical details of how this method works are still under wraps, it is known that it implements semi-conductor chip-material technology that allows for incredibly rapid sequencing in a bench-top machine slightly larger than the average table-top centrifuge
powerful new tool in preventive medicine should be welcomed However, this was still not the quick and affordable not only with enthusiasm, but also pragmatism concerning sequencing that scientists were after. In 2006, the Archon X its consequences. Prize was announced, promising a prize of $10 million to Decades before the Human Genome Project, the first any company that could develop a method to sequence 100 genome, belonging to a bacterium, was sequenced in the mid- human genomes with sufficient accuracy in the span of 10 70s; once the polymerase chain reaction (PCR) was developed days for less than $10,000 per genome. This past January, Life in the 1980’s, other organisms’ genomes could be amplified Technologies announced their Benchtop Ion Proton Sequencer enough to sequence [1]. Though many sequencing methods system, which completely surpasses the goals of the originally existed, the most popular was the Sanger method, which uses stated project. This system is reportedly capable of sequenca clever experimental design, involving the nature of the four ing an entire human genome in as little as one day, for the primary nucleotides that make up the primary sequence of DNA astounding cost of $1,000 [2]. The previous industry leader and the highly specific shapes and chemistry of the chemical did the same in the timespan of several weeks to months, for bonds that hold them together. The resulting PCR product is approximately 5-10 times the cost. While the technical details a mixture of fragments that terminate at modified nucleotide of how this method works are still under wraps, it is known insertion sites which can be run on a gel side-by-side, and based that it implements semi-conductor chip-material technology on where the fragment stops the experimenter can “read” the that allows for incredibly rapid sequencing in a bench-top genetic sequence from one end of the gel to the other [1]. This machine slightly larger than the average table-top centrifuge





(imagine a large toaster oven) [2]. This technology is much more affordable, which greatly broadens the availability of partial and full sequencing capabilities to relatively small labs. The Ion Proton system has incredibly broad applicability. For example, the technology can be applied to the 1000 Genomes Project, a large, multi-national effort to refine the human consensus sequence. This project aims to compile a collection of human genomes from all the major ethnic populations to achieve a better idea of not only the sequence similarity between these different subgroups, but also the important differences between them that influence their particular susceptibilities to certain genetic and chronic diseases. The application of the Ion Proton system would allow the project to gain a wider expanse of the genome – potentially as close to the full sequence as possible – with a similar level of accuracy and repeatability to the original project. This is an important step forward, as a significant portion of human disease susceptibility is not a product of just the protein-coding exome, but also dependent on the intervening sequences, which account for approximately 98% of the human genome. This new technology will help advance our understanding of these intervening sequences and their influence on disease much more effectively than previously possible.

Despite its effectiveness in the laboratory, the true power of this new sequencing capability lies in the field of preventive medicine. Numerous known mutations and polymorphisms can predispose a patient to a number of chronic diseases, such as Alzheimer's, diabetes, and hypertension. Sequencing can easily be worked into the standard neonatal battery of tests, replacing the current tests that examine key genetic markers. Once this is done, it would be possible to identify any potential trouble areas that the child's family should be aware of to avoid the serious consequences of a more developed chronic disease. Currently, the primary indicator for the potential development of diabetes in patients is family history of the disease. This is hardly concrete evidence and often insufficient to encourage significant caution in the patient throughout life.



Reproduced from [7]

Direct sequence evidence can convince patients of the need to adhere to a prophylactic intervention. Once the pathological signs of diabetes are evident enough to diagnose, they are irreversible and often require daily management (through multiple blood sugar tests daily and even periodic insulin injection) [4]. This necessity for perpetual care of chronic diseases not only reduces the overall quality of life for the patient, but also puts a burden on the current healthcare system. Much of the disconcertingly high cost of the current healthcare system is a consequence of treatment for chronic diseases like type-II diabetes that could have been prevented. Diabetes alone accounted for $178 billion in the US in 2007 through both direct medical care costs and indirect costs from disability and work loss [4]. With complete sequence information and further research regarding the genetic predisposition of patients to all sorts of chronic diseases, a better preventive initiative could take hold and reduce the overall cost and load on the healthcare system, not to mention prevent the distress of countless patients who might otherwise develop pathological symptoms.

With all its promise, however, the notion of complete sequence information also comes with its own ethical perils. The danger of "genetic discrimination" becomes a very real threat in a world where sequence information is available. Despite efforts to keep patients' sequence information private, the risk of unauthorized access will always exist: imagine a hypothetical pair of identical twins, who share the same genomic sequence – what if one consents to sequencing but the other does not? Special cases like this one exemplify the nuances to any regulation of clinical sequencing that need to be considered carefully. Once the sequence is obtained and the critical annotations are incorporated into a patient's medical records, the danger of the information being used against them increases significantly. For example, numerous groups have claimed to have isolated genetic polymorphisms that predispose the owners to taking more risks, or potentially addictive behavior such as gambling or alcohol addiction. It is not hard to imagine that if one's employer (prospective or




otherwise) somehow obtains his or her genomic annotations, this can jeopardize any career prospects, whether or not the individual actually does exhibit the supposed phenotype of riskier or addictive behavior. There are already some regulations in place that prevent the discrimination of potential employees based on medical reasons: the Americans with Disabilities Act (ADA) protects workers as long as they can perform their assigned tasks, and the Family and Medical Leave Act protects jobs for those who need extended medical leave [5]. However, cases of medically related discrimination still occur through loopholes in the existing laws – for example, the ADA does not list disabilities specifically, so this is typically a subjective assessment. These laws are also not strictly applicable to genetic predispositions - for the most part, they protect those with clinically diagnosed disease or a family history of disease. This is an important ethical question that needs to be dealt with carefully as we enter this world of enhanced genomic information – not only should people be protected from conclusions drawn from their genomic sequence, but the markers for genetic predisposition need to be carefully examined before becoming part of the medical canon.

Finally, testing every individual at birth can have its own unintended consequences. Genetic tests on fetuses, both early and later in development, come with the inherent risk of damaging the fetus. However, there is also the risk of "false positives" if benign (or silent) mutations that occur in the genome on a regular basis with little or no functional effect are flagged as potentially deleterious. This could raise concerns for a patient where there is no cause for any, inciting excess caution and even extra costs. An analogous situation is described by the Hygiene Hypothesis, a theory behind the rising rates of allergies and asthma in developed countries, while less developed nations don't seem to have the same trend, despite being exposed to less sanitary conditions and similar consumer products [6]. The Hygiene Hypothesis posits that excessively clean environments during the first years of life actually predispose infants to asthma and allergies later in life by preventing their immune system from interacting with sufficient amounts of foreign matter and consequently hyper-sensitizing the immune system later in life [6]. A similar circumstance could be imagined in the case of genomics – one could live one's life so carefully coddling a particular genetic feature that it might cause some other malady as an unexpected consequence – it might even be completely unrelated to the illness one was trying to avoid in the first place!

It is clear that the dawn of Life Technologies' new system for genomic sequencing will have a tremendous impact on the

Reproduced from [8]

scientific and medical fields. It will allow for more in depth studies into the workings of genetics and the predispositions to human disease, while also allowing physicians to quickly and effectively diagnose existing and potential illnesses, as well as advise proper preventive care that can potentially reduce the load on the American healthcare system significantly. However, these fantastic new prospects come with lurking dangers of their own, which need to be dealt with specifically in the legal and ethical codes as we enter this new era of genomic information. With the proper care and precautions in place, this new technology can easily lead us to new heights of knowledge and treatment. Aleks Penev is a fourth-year undergraduate biology major at the University of Chicago. His current research interests involve induced pluripotent stem cells and their applicability to regenerative medicine and chronic disease treatment.

References 1. Krebs J, Goldstein E, Kilpatrick S. Lewin’s Genes. 10th Edition. Burlington, MA: Jones & Bartlett Publishers; 2009. 2. Life Technologies: Ion Proton Sequencer [Internet]. New York: Life Technologies; 2012 [cited 2012 May]. Available from: http://www.invitrogen.com/site/us/en/home/ Products-and-Services/Applications/Sequencing/Semiconductor-Sequencing/proton. html?CID=fl-proton. 3. Murray P. Singularity Hub: Group Set To Sequence 1000 Genomes By The End Of The Year [Internet]. Columbia, MI: Singularity Hub; 2012 [cited 2012 May]. Available from: http://singularityhub.com/2012/04/04/group-set-to-sequence-1000-genomes-bythe-end-of-the-year/


4. The Economic Impact of Diabetes [Internet]. Brussels: International Diabetes Federation; 2009 [cited 2012 May]. Available from: http://www.idf.org/diabetesatlas/economicimpacts-diabetes. 5. Disability Discrimination [Internet]. USA: U.S. Equal Employment Opportunity Commission; 2010 [cited 2012 May]. Available from: http://www.eeoc.gov/laws/types/ disability.cfm. 6. Bukowski JA, Lewis RJ. Is the Hygiene Hypothesis an Example of Hormesis? Nonlinearity Biol Toxicol Med. 2003 April: 1(2): 155–166. 7. http://commons.wikimedia.org/wiki/File:Karyotype.png 8. http://commons.wikimedia.org/wiki/File:DNA-structure-and-bases.png





The Mysterious Function of the Female Orgasm Claire Wilson

Many contemporary sexologists claim that deeming the female orgasm "mysterious" is the result of cultural reluctance to learn and communicate about it. Indeed, it remains uncertain whether the estimated 5-10% of women who report never experiencing orgasm are truly incapable or simply lack sufficient information and comfort with their sexuality to achieve one [1]. Nevertheless, it is clear that the female orgasm is a more complex process than its male counterpart's "push-button" response. Physical and nonphysical factors combine to create a more nuanced experience in women—one that, unlike the male orgasm, may be so subjective that it cannot be empirically measured. Such complexity leads many to wonder why the female orgasm even exists, as its evolutionary significance is unclear compared with the male orgasm's explicit connection to reproduction. Theories entitled everything from "byproduct" to "cryptic choice" to "sperm upsuck" have been proposed, but only by achieving a thorough understanding of how female sexuality functions may we better comprehend why it has become what it is today—and use this insight to better manage women's sexual health.

Physical and Psychological Mechanisms In the extensive 2004 review Women’s orgasm, five researchers compiled a comprehensive overview of the physical processes at work during arousal, orgasm, and the post-orgasmic “refractory” period. As with the male erection, female sexual arousal is marked by increased blood flow to the genitalia. While the vagina at rest resembles a “collapsed tube,” several factors cause it to lengthen and expand during arousal. The outer third of the vagina, designated the “orgasmic platform,” rises in anticipation of penetration. In a process referred to as “vaginal tenting,” the cervix and uterus draw up and away from the vagina’s rear wall. This creates a sunken, balloonlike distension somewhat isolated from the cervix in order to delay deposited sperm from passing directly through the uterus and into the fallopian tubes [2]. During orgasm, powerful striated muscles that surround the vagina begin producing rhythmic contractions, each about 0.8 seconds apart. Early researchers of orgasmic function in women believed that “the stronger the orgasm, the greater the number of contractions and, thus, indirectly the longer the duration of orgasm”; subsequent comparisons of physiological recordings and subjective reports, however, have not




conclusively found a relation between these contractions and individuals' judgments of intensity. In addition to contractions of the "orgasmic platform," uterine contractions may cause the cervix to lower into the seminal pool, resolving the obstacles against sperm transport posed by vaginal tenting. However, contractions may fall short as an objective indication that an orgasm has been experienced, given some women's reports of orgasm when no contractions were observed [2]. Unlike males, in whom plasma concentration of oxytocin rises steadily throughout sexual activity, females experience a dramatic release of this hormone into the bloodstream during orgasm. Oxytocin plays a physiological role in reproductive functions (childbirth, for one) and may be involved in such behaviors as empathetic responses and pair-bonding. Activation of the cingulate cortex and medial amygdala in the brain has also been observed, perhaps indicating a role for dopamine, a neurotransmitter involved in reward systems like those activated by addictive drugs [1]. Orgasmic releases of the hormone prolactin—associated with lactation and feelings of sexual gratification in general—also double in concentration for women and remain so for up to an hour after climax. Blood flow to the genitals changes dramatically throughout arousal

and reaches its highest rates for about 10-30 seconds directly after orgasm. In combination with late uterine contractions, post-orgasmic releases of prolactin are thus hypothesized to serve as a "terminal event" for female sexual response, much like ejaculation for males [2]. Beyond its physical mechanisms, female orgasm has been shown to be quite variably determined. Non-genital stimulation, dreams, hypnosis, and even mental concentration have all been shown to produce orgasm in certain women, highlighting the critical role of the brain and psychology in female sexual response. In a similar vein, a recent study that collected data on exercise-induced sexual pleasure found that women who responded to the online survey reported orgasms "in absence of sexual fantasy and without clothing-related clitoral pressure or friction" [3]. Coupled with other reports of women who valued their subjective identification of orgasm over any objective observations, it appears that the female orgasm is far more individual and variably experienced than males' more straightforward version [2].

Theories of Evolution A controversial 2005 review, The case of the female orgasm: Bias in the science of evolution, revived the once popular conception of the female orgasm as an evolutionary byproduct of its male counterpart, rather than the result of adaptive necessity. That

men can achieve orgasm so much more reliably than women may mean that female orgasms are just the product of both sexes developing from the same embryological structure, much like how males develop nipples without any gender-specific need for them [4]. Despite its theoretical logic, analyzable evidence for the byproduct theory remains scarce. For example, a 2008 paper examined variability in clitoral length versus that of the penis. The authors explained their rationale that "relatively high variability in traits was evidence of a lack of selection for functionality" [4]. The study did in fact find clitoral length to be much more variable than penile length, seemingly implying that the clitoris has not adapted to evolutionary pressures like the penis; but this conclusion quickly provoked two rebuttals in the same journal. The first primarily refuted the interpretation that the erratic reliability of female orgasm is a sign of nonadaptation, pointing out that ease and frequency of orgasms may not reveal anything about their adaptive value [5]. The second attacked the measure of genital size as an indicator of orgasmic reliability altogether. Its authors even replicated the study measuring volume instead of length, yielding no significant difference between the sexes and revealing just how unreliable reporting variability in size can be [6]. Others have also expressed doubt over the byproduct theory. For example, one researcher criticized the putative link between variability and selectivity, pointing out that "nonadaptive male nipples are less variable in size than their adaptive female counterparts" [7]. Another attacked its conception of evolutionary pressures, as "The current adaptiveness of an adaptation is a separate concern from its evolved purpose..." [8]. But these debates over what exactly constitutes an evolutionary byproduct in the first place are better handled with evidence, like that found in a handful of twin studies conducted during the past decade. The authors explained that if female orgasm is a byproduct of both sexes' shared developmental origins, genetic contributions to orgasmic function should be correlated for biologically related males and females; and according to survey data from a staggering 10,000 subjects, no such significant correlation was found [7]. Alternative theories as to the evolutionary origin of the female orgasm have thus been reasserted. One hypothesis states that females' greater difficulty in achieving orgasm provided



incentive for taking multiple mates among pre-human ancestors, promoting confusion over their offspring's biological sires and consequently entrusting their care to the whole of the society [8]. Others argue that unreliable orgasms may bond females to those males capable of eliciting them, especially given the potential pair-bonding effects of orgasmic releases of oxytocin [7]. Many "cryptic choice" theorists furthermore believe that the inconspicuous nature of the female orgasm may aid in selecting which partner's sperm make it to the egg. For instance, one study found that body symmetry in males—a trait indicative of stable genes—predicted frequency of orgasm in their female partners [8]. The humorously titled "upsuck" theory, moreover, posits that female orgasm facilitates the transport of sperm through to conception [2] or serves to retain deposited sperm from ejaculates that elicited orgasm longer than those that did not [1]. All taken together, byproduct theory must contend with evidence that the female orgasm's infrequency might reflect many less straightforward functions than those of male orgasm: covertly selecting, possibly on the basis of males' genetic caliber, which partners will get called back, which partner's sperm will fertilize an egg, or which partner's sperm will be kept long enough to have a chance.

Implications and Conclusion
Ultimately, any insight into the development and function of the female orgasm holds potential benefits for the management of modern women's sexual health. Increased understanding of the physical and nonphysical processes behind the production of orgasm provides hope for the amelioration of psychological pressures placed on females who have difficulty achieving orgasm, from unfounded expectations to false inferences about the failure to meet them. With respect to its evolutionary significance, the suggestion that the female orgasm evolved as a mechanism for subtle discrimination carries a salient implication for maternal contributions to the human species: in effect, though the male sex drive may have played the major role in ensuring that future generations exist, female psychology may have had a major role in deciding what they are like. On the other hand, the validation of byproduct theory would alleviate much stress surrounding female sexuality, redefining failure to achieve orgasm as an entirely natural baseline from which each woman may proceed as she sees fit. Regardless, whether it facilitates pair-bonding with a single partner, incentivizes sex with many, or serves no specific purpose at all, the female orgasm's existence at least reveals one basic truth about human behavior: we have evolved more complex social relationships than the basic male + female =


offspring model of species perpetuation. Research into female sexual function continues to show how diverse and nuanced it is, illuminating the necessity for more open communication and individualized care for women’s sexual health. Increased awareness of the ways in which sexuality operates for males and females is thus essential to understanding the differences between us and deriving more enjoyment from our intimate interactions. Claire Wilson ‘14 is an English and psychology double major at the University of Chicago. She is an avid follower of research into sexual health and behavior and hopes to pursue a career in sexual psychology.

References
1. Puts DA, Dawood K. The evolution of female orgasm: adaptation or byproduct? Twin Research and Human Genetics, 2006 June;9(3):467-472.
2. Meston CM, Levin RJ, Sipski ML, Hull EM, Heiman JR. Women's orgasm. Annual Review of Sex Research, 2004;15:173-257.
3. Herbenick D, Fortenberry JD. Exercise-induced orgasm and pleasure among women. Sexual and Relationship Therapy, 2011 Nov;26(4):373-388.
4. Wallen K, Lloyd EA. Clitoral variability compared with penile variability supports nonadaptation of female orgasm. Evolution & Development, 2008;10(1):1-2.
5. Hosken DJ. Clitoral variation says nothing about female orgasm. Evolution & Development, 2008;10(4):393-395.
6. Lynch VJ. Clitoral and penile size variability are not significantly different: lack of evidence for the byproduct theory of the female orgasm. Evolution & Development, 2008;10(4):396-397.
7. Zietsch BP, Santtila P. Genetic analysis of orgasmic function in twins and siblings does not support the by-product theory of female orgasm. Animal Behaviour, 2011;82:1097-1101.
8. Thornhill R, Gangestad SW. Human female copulatory orgasm: a human adaptation or phylogenetic holdover. Animal Behaviour, 1996;52(4):853-855.
9. http://www.flickr.com/photos/penumbrapics/2947689682/in/photostream/
10. http://www.flickr.com/photos/lezarderose/1227708362/sizes/l/in/photostream/
11. http://www.flickr.com/photos/nalbertini/7036568793/in/photostream/


Molecular Gastronomy: Where Food and Science Collide Leigh Alon

When it was conceived, the term "molecular gastronomy" was not widely used or known, except by the select few interested in what was then a subfield of chemistry. The birth of the term is easily traced to 1989, when physicist Nicholas Kurti and physical chemist Herve This created a new discipline which they originally named "molecular and physical gastronomy." The pair had set out to understand the chemistry behind why certain cooking methods were so prevalent, and the truth (or lack thereof) behind old wives' tales, such as the notion that women could not prepare mayonnaise while menstruating [1]. They tested such theories in a space that was part kitchen, part lab, and organized the first International Workshop on Molecular and Physical Gastronomy in 1992 [2].

However, molecular gastronomy has recently become an oft-used term in the popular press to describe the trend in restaurants of using newfound techniques to present and prepare food in exciting and unexpected ways. For example, at The Fat Duck in the United Kingdom, one of the tasting menu items, called "Mad Hatter's Tea Party," consists of mock turtle soup, a pocket watch, and a toast sandwich. Many involved in the intersection of food and science, however, have taken issue with the term "molecular gastronomy" and what it encompasses, with differing opinions from chefs, scientists, and critics. John Lanchester of the New Yorker writes that "Colonel Sanders's cooking is just as molecular as Ferran Adria's" [3]; Adria is considered to be the father of the practice of implementing molecular gastronomy in a restaurant setting. His restaurant,



El Bulli, was ranked number one in the world a record four times by Restaurant Magazine [4]. Indeed, This concedes that he and Kurti were not the first and only ones to practice molecular gastronomy. This writes that the first instance of molecular gastronomy may be found in antiquity, when in the second century BC an experiment was conducted to determine whether fermented meat was lighter than fresh meat [1]. He also mentions that the 18th-century chemist Antoine-Laurent de Lavoisier studied stock preparation and used density as a measure of quality [1]. Therefore, This does believe molecular gastronomy applies to experimentation involving food; however, since cooking in general is not about acquiring knowledge, This doesn't consider chefs molecular gastronomists. He ultimately outlines the field of molecular gastronomy by stating the five goals it seeks to achieve: to collect and investigate old wives' tales about cooking, to model and scrutinize existing recipes, to introduce new tools, products, and methods to cooking, to invent new dishes using knowledge from the previous three aims, and to use the appeal of food to promote science [1]. According to This, the key to distinguishing molecular gastronomy from other forms of food preparation is its aim to understand the principles behind the physical processes the food undergoes and to use that knowledge to deliberately and methodically improve the way cooking is done [1]. This also adds that chefs should not be considered molecular gastronomists because "chefs create food, not knowledge" [1]. Yet he writes that it is imperative that a certain degree of "art" and "love" be incorporated in the field, because "the main aim in cooking is to produce good food, which is art and not technique." It is this delicate balance between scientific facts and artistry that is essential, according to This, for a culinary endeavor to be considered molecular gastronomy. Most chefs, however, express far more disdain for their classification as molecular gastronomists than This does. In a 2009 panel discussion, some of the top chefs dubbed molecular gastronomists, Ferran Adria, Heston Blumenthal, and Andoni Luis Aduriz, discussed why they reviled the term used so often by the media to describe what they do. Many great cooks resist being labeled molecular gastronomists because they fear it is off-putting for many people who are


wary of consuming food supposedly prepared by scientists [5]. However, these chefs do not believe they should be excluded from the field, but rather that the term can be more broadly applied. "Today, you've got bakers working to find the best flour, the best yeast, the best oven. This is science, pure and simple. But people still say they're shocked if science participates in cooking," Adria said [6]. Some of the first research projects in the field of molecular gastronomy began with This. He created the "complex disperse system," a formalism he developed to describe the various types of colloidal systems possible in food preparation, some containing all three phases of matter [1]. He used this system to classify and study the development of French sauces over time. In addition, he introduced a system to describe the consistency of food by its firmness, where a gas is rated zero; a liquid, one; an emulsion, two; a jellified emulsion, three; and so forth up to infinity, which he writes may describe chewing gum. He writes, "if we are able to understand why a certain food is tasty and pleasurable, we can describe its preparation scientifically so even inexperienced cooks are able to make a good dinner without having to rely on years of experience or old wives' tales" [1]. Despite the disdain over the label "molecular gastronomy," more and more chefs have begun incorporating science in their kitchens. Recognizing that science has always played a role in cooking, many chefs have begun to explore the principles behind the preparation of their food and exploit them to create food which is prepared more efficiently, more accurately, or simply in an unexpected way. At El Bulli, Ferran Adria used to keep the restaurant closed for half the year, during which he and his team of about 30 food enthusiasts scoured the world for ingredients and experimented with various techniques in their kitchen lab. They painstakingly documented every ingredient and every way in which it was prepared until the right combinations of different preparations were finally determined and compiled together to make up the menu for the upcoming year. An example of a creation that resulted from such determination was a pina colada that consisted of coconut foam, dehydrated pineapple, and a rum gel [6]. At Moto in Chicago, for a patron's birthday, chef Homaro Cantu will serve an edible paper menu that tastes like birthday cake.



In fact, he is in the process of patenting a machine which can make edible paper flavored by whichever ingredient is inserted into it, a technology which may have implications for what astronauts eat on space missions, or ways to feed the third world [7]. Many of the world’s top restaurants have been labeled as outposts of molecular gastronomy, such as Grant Achatz’s Alinea in Chicago, Heston Blumenthal’s The Fat Duck in the UK, Wylie Dufresne’s Wd~50 in New York, and David Chang’s Momofuku restaurants. Although the creations made possible by molecular gastronomy are virtually endless, a few techniques and gadgets are utilized in their preparation. One popular technique is cooking meat sous vide, or under vacuum. Water is poured into a pan and heated to a low temperature, which never exceeds 100 degrees Celsius. Then meat and seasonings are placed into a plastic bag and placed in the hot water bath, cooking the meat slowly so that it retains its moisture. This method also allows for meat to be cooked at very exact temperatures because no oxidation reactions can occur due to the sealing of the meat, allowing for a greater degree of accuracy as for how well the meat is done [2]. In addition, by cooking the meat at low temperatures, water does not evaporate as much,


leaving the meat more moist [8]. Yet another key technique, invented by Ferran Adria, is “spherification,” which creates liquid filled beads that explode upon being bitten. It involves a reaction between calcium chloride

At Moto in Chicago, for a patron’s birthday, chef Homaru Cantu will serve an edible paper menu that tastes like birthday cake

and alginate, a substance extracted from brown seaweed. The desired ingredient is first blended in liquid form with calcium chloride, and then carefully dropped into a mixture of alginate and water that has sat overnight. The calcium chloride ions



cause the alginate polymers to become cross-linked, forming a gel. As the calcium chloride mixture is dropped carefully, this gel forms a bead. Spherification allows for the creation of jelly-like spheres of almost any flavor [2]. Flash freezing is also a very widespread technique in molecular gastronomy. When food is exposed to extremely low temperatures it will be frozen on the surface but liquid in the center [2]. Liquid nitrogen is often used in this process due to its low temperatures. The liquefied gas reacts with food to form solid crystals that don't affect the taste of the food [2]. When liquid nitrogen comes into contact with the warmer food it boils instantaneously, and encompasses the food in nitrogen gas. The liquid in the food crystallizes, thereby solidifying the food [2]. The crystals formed with liquid nitrogen are smaller than the crystals formed when freezing with water due to the rapid freezing process, so they do not affect the taste of

the food when it is solidified [2]. At Moto, for example, egg yolks are flash frozen so that they retain their flavor without freely mixing with the rest of the ingredients. At Morimoto in New York, flash frozen pomelo is mixed with mango puree to produce a unique texture which tastes like ice cream overall. Foaming creates a unique texture, using a mechanical force to trap gas bubbles in a solid or liquid and stabilizing the two states together. This is accomplished by creating new additional surface area through mixing [9]. First, a stabilizer which functions as a surfactant is added [9]. This substance coats the molecules, allowing them to adhere to each other more readily. It does so because it is an amphiphilic molecule that surrounds drops of the liquid suspended in solution. As their hydrophobic ends face towards the dispersed liquid and hydrophilic ends face towards the water the liquid is dispersed


in, the hydrophobic liquid substance drops may mix, forming an emulsion [8]. Without the surfactant, the liquid in the walls of the bubbles moves down and the air trapped inside moves up, causing the bubbles to pop [9]. Lecithin, agar, and gelatin all may serve as stabilizers [9]. Foams can be either sweet or savory, as well as either cold or hot, and allow for the easy delivery of a particular flavor to a dish without drastically changing the physical makeup, by simply applying them to the top of the dish [9]. The airy appearance of foams also serves to create very attractive dish presentations, as well as a unique texture diners may not have experienced previously [9]. A dish served at Moto notable for its compelling presentation, incorporating bright pink and black foams, is cuttlefish topped with beet juice and squid ink foam. Molecular gastronomists have also made use of unorthodox ingredients. Methylcellulose congeals in hot water and becomes liquid again when it cools, making it an ideal gelling agent. Soy lecithin is used as an emulsifier that maintains a uniform dispersion of one liquid in another by acting as a surfactant. Transglutaminase causes protein to stick together by forming amide bonds, making it useful in removing fat from meat [2]. In addition, transglutaminase may act as a "meat glue," in which two cuts of meat may be stuck together using the substance, creating a new cut of meat which may be cooked better or more efficiently [10]. 1-octen-2-ol or benzyl trans-2-methylbutenoate may be added to give a mushroom taste to dishes if mushrooms are not available [9]. Agar is used to thicken liquids, and tapioca maltodextrin is used to create powders out of fatty substances like bacon fat [9]. Molecular gastronomy is a new and burgeoning field. While the popular media, the original founders, and chefs are all still in disagreement over what constitutes molecular gastronomy, more and more cooking professionals are blending science and art to push the limits of what food can do. Although diners may be wary of the new creations, chefs and food scientists remain undeterred in their quest to reinvent techniques which have not been questioned or improved for hundreds of years. Their creations will have great implications for the future, whether it be in a high-end dining room or the inside of a space shuttle. Leigh Alon is a Biology Major at the University of Chicago. She has a blog about food and studies biology, and therefore finds the bridge between science and cooking fascinating.

References
1. This, Herve. "Food for tomorrow?" Nature Publishing Group. Web. 8 May 2012. <http://www.nature.com/embor/journal
2. Harris, William. "How Molecular Gastronomy Works." HowStuffWorks. Web. 8 May 2012. <http://science.howstuffworks.com/innovation/edibleinnovations/molecular-gastronomy4.htm>.
3. Lanchester, John. "The Mad Genius of 'Modernist Cuisine'." The New Yorker. Web. 8 May 2012. <http://www.newyorker.com/arts/critics/atlarge/2011/03/21/110321crat_atlarge_lanchester?currentPage=all>.
4. "The World's 50 Best Restaurants." Restaurant Magazine. Web. 8 May 2012. <http://www.theworlds50best.com/>.
5. Abend, Lisa. "Debating the Merits of Molecular Gastronomy." Time. Web. 8 May 2012. <http://www.time.com/time/arts/article/0,8599,1873579,00.html>.
6. "If the world's greatest chef cooked for a living, he'd starve." The Observer. Web. 8 May 2012. <http://observer.guardian.co.uk/foodmonthly/f
7. Reingold, Jennifer. "Homaro Cantu's Weird Science." Fast Company. Web. 8 May 2012. <http://www.fastcompany.com/magazine/105/open_food-cantu.html?page=0%2C4>.
8. Roca, Juan. "Precision Cooking: Enabling New Textures and Flavors." Harvard University, Cambridge. 12 September 2011.
9. "Molecular Gastronomy Tips and Tricks for Home Cooks." Modern Cooking Made Easy. Web. 22 June 2012. <http://www.modernistcookingmadeeasy.com/>
10. Dufresne, Wylie. "Proteins and Enzymes: Transglutaminase." Harvard University, Cambridge. 24 October 2011.
11. http://www.flickr.com/photos/foam/1220523407/
12. http://www.flickr.com/photos/danieljean/3622833553/
13. http://www.flickr.com/photos/ponderer/402965052/


On the Origin of Knowledge: an Evolutionary Enquiry into the Structure of Perception Taylor Coplen

Humans are terrible observers. If a laboratory instrument introduced as much bias as we do, it would be tossed out immediately. We fabricate significant patterns from meaningless data. We see faces everywhere, from pieces of toast to Mars [1]. And every classic rock song, when played backwards, seems to reveal some hidden, often satanic, message. There is little doubt that the human mind plays an active role in structuring perception, but it is unclear how fundamentally active this role is. The way we receive information from the external world has traditionally been a topic of purely philosophical concern. However, a consideration of the evolutionary process that shaped the human brain offers an explanation for the specific structure of our perception. Empiricists assert that the mind is originally lacking in ideas and accrues them through experience. In this theory of knowledge, the mind is likened to a tabula rasa (blank tablet), or, in the words of John Locke, "white paper, void of all characters, without any ideas" [2]. Other philosophers hold that the mind has innate ideas, which precede the individual's experience. Immanuel Kant famously championed this position. Kant argued that concepts like space, time, and even causality are pure intuitions of the human mind, which we impose upon our experience of the sensible world [3]. According to Kant, these conceptions form a cognitive framework, which makes experience possible by structuring

sensory information. As such, this knowledge must predate the individual's experience. However, the origin of this a priori framework is obscure, and Kant says very little about how we come to possess such concepts. If we consider all mental activity as the product of physical cognitive mechanisms, then it seems that this inquiry into the origin of our cognitive framework falls into the realm of evolutionary biology. An epistemological inquiry, centered upon an evolutionary understanding of the human being, reveals two insights: 1) that we are born with these innate concepts that structure our experience, and 2)


that these concepts are themselves the product of experience in a general sense—the experience of our biological ancestors. By the latter half of the 18th century, the epistemologies of European philosophers could be categorized as either empiricist or rationalist. The rationalists, residing primarily


on the European continent, regarded reason as the primary source of knowledge [4]. Alternatively, the empiricists, most of whom were British, argued that experience was the principal source of knowledge [5]. Most epistemologists of the time fit neatly into one of these two categories, until the writings of the Scottish empiricist David Hume spurred Immanuel Kant to produce his magnum opus, The Critique of Pure Reason, in which he rejected both of the competing schools. Kant suggested reason as a solution to the skeptical arguments proposed by Hume's empiricism. The resulting theory was referred to as "transcendental idealism." Often likened to the Copernican Revolution in philosophy, transcendental idealism argues that causality, space, and time are a priori intuitions, which the human mind imposes upon our perception of the sensible universe [6]. Thus, drastically more significance is placed on the role of the observer. According to transcendental idealism, concepts like space and time can never be learned through experience because the individual must possess these concepts in order for experience to be possible. In order to understand why these concepts must be innate to allow for experience to take place, consider the alternative: a mind devoid of innate ideas, the aforementioned tabula rasa. If a subject were born without an innate idea of space,



one may assume that he or she could derive this idea by identifying objects and considering their physical relation to each other. However, the idea of space must be presupposed in order for the notion of one object's relation to another to even be conceivable. The concepts of time and causality can be considered in a similar hypothetical situation and the conclusion is always the same: the way that we experience the external world is made possible only if these concepts are presupposed. These innate concepts can be thought of as a way for the mind to structure incoming sensory information. Without this structuring framework, the individual would simply receive incoherent stimuli rather than useful, organized information about the external world. Yet despite the obvious necessity of this cognitive framework, its origin remains obscure. One major obstacle that has hindered this investigation is the mystery surrounding the mind. Many philosophers in the past have struggled with the perplexing relationship between the mind and body. Today, however, modern neuroscience has provided considerable evidence that indicates that all mental phenomena are the result of a physical structure [7]. Though the claim that the mind is the product of physical matter is still disputed, there is enough supporting evidence to accept this position as a premise for the following argument. In humans, all sensation, thought, emotion, and desire are the direct product of neurological activity [8]. This discovery is essential to forming a complete understanding of our innate cognitive framework. In order to understand the structure of our perception, we need only understand the origin of the structure of our central nervous system, specifically the brain. The brain, like all the physical mechanisms that comprise the human body, developed gradually over time through the process of natural selection [9]. Thus, as we trace the evolutionary development of the brain we will be simultaneously witnessing the development of the framework that structures our perception. Let us first consider the notion of experience—so intimately related to any theory of knowledge. For Hume and other empiricists, experience is the aggregate of cognitions acquired through the individual's perception of the external world. While experience clearly requires a subject, there is no reason that it should be considered as restricted to an individual. For if I could somehow transmit my observation of an object directly to another being by minutely altering the structure of his or her brain to precisely reconstruct my mental image, there would be no way to identify the "true" observer in any epistemological sense. We would both have acquired the exact same knowledge. Though this example seems like science fiction, the point is that experience can be transferred from one individual to another by replicating physical conditions of the brain.


Experience, in a more general sense, can be transferred from one individual to another, encoded in complex patterns of deoxyribonucleic acid, DNA. When a mutation alters a physical characteristic of an organism, the alteration can either positively or negatively contribute to its fitness. The structure of the organism’s neurological system determines the way it perceives. Physical manifestations of the organism’s genetic structure will either be passed on to the next generation or perish with the organism; the outcome is determined by the organism’s interaction with its environment [10]. If, for instance, an organism is biologically structured in such a way that it is unable to perceive a potential predator, it will most likely not survive to reproduce and thus this perceptual structure will die out. Our specific neurological structure, and thus the structure of our perception, is the product of many evolutionary trials and errors. As information is useful for the survival of an organism,

it is a commodity shaped by the process of natural selection. In the same way that our respiratory system structures the way that we receive oxygen from the environment, our complex



neurological system dictates the way that we receive information. To gain knowledge of the external world, an organism requires some biological structure that can receive stimuli and trigger a reaction in the organism. For instance, an eyespot apparatus is an organelle found in many unicellular organisms [11]. Simple photosensitive proteins react to incoming photons and produce chemical messengers, which allow the organism to determine the direction of the light and move accordingly [12]. Though this is one of the most primitive forms of perception, the organism can now distinguish between two states of the external world: light and dark. This rudimentary process of


perception involves chemical messengers that cause an almost instantaneous response in the organism. Further evolutionary modification in more complex multicellular organisms causes these chemical messengers to make permanent alterations to the organism so that the information can be accessed and utilized at a later time [13]. By making physical alterations in response


to stimuli, organisms become capable of constructing primitive memories [14]. Organisms with the ability to sense change in the external world and record these changes have the rudiments necessary to produce the groundwork for our conception of time. When organisms become capable of storing multiple pieces of information, memories are structured in sequence so that when the organism recalls each memory there is a sense that one memory took place before the other. Thus humans, with extremely complex neurological systems capable of storing countless memories, have an innate temporal intuition, which allows us to structure our experience in chronological order. While this example is particular to the structure of temporal perception, the structure of spatial or causal perception can be considered in a similar way: as a gradual succession of increasingly complex perceptual structures. The process of evolution by natural selection can sufficiently account for the specific structural framework of our perception. Once we accept that our thoughts, sensations, and perceptions are the product of the physical structure of our brain, which is in turn the product of natural selection, this conclusion is inevitable. In a certain sense, the experience of our biological ancestors, i.e. their interaction with the environment, is the determining force that shaped the outcome of our cognitive structure. The bias that our particular type of perception introduces is the vestigial baggage of evolution, which was at one point in our evolutionary history conducive to survival. Taylor Coplen '15 is a Philosophy & History, Philosophy and Social Studies of Science and Medicine (HIPS) double major at the University of Chicago.

References
1. NASA. "Unmasking Mars." http://science.nasa.gov/science-news/science-at-nasa/2001/ast24may_1/
2. Locke. An Essay Concerning Human Understanding. New York: Prometheus; 1995. 2:1:2.
3. Kant, Immanuel. Critique of Pure Reason. Cambridge: Cambridge University Press; 1998.
4. Cottingham, John. Rationalism. London: Paladin Books; 1984.
5. Priest, Stephen. The British Empiricists. 2nd ed. New York: Routledge; 1990.
6. Kant, Immanuel. Critique of Pure Reason. Cambridge: Cambridge University Press; 1998.
7. Eccles, John C. "Evolution of Consciousness." 1992.
8. Armstrong, D.M. A Materialist Theory of the Mind. London: Routledge; 1993.
9. Striedter, Georg. Principles of Brain Evolution. Sunderland: Sinauer; 2005.
10. Williams, George. Adaptation and Natural Selection. Princeton: Princeton University Press; 1966.
11. Kreimer, G. "The green algal eyespot apparatus: a primordial visual system and more?" 2009.
12. Hegemann, P. "Vision in microalgae." 1997.
13. Klein, Stanley B., Leda Cosmides, John Tooby, and Sarah Chance. "Decisions and the Evolution of Memory: Multiple Systems, Multiple Functions." 2005.
14. Fernando, Chrisantha T., Anthony M. Liekens, Lewis E. Bingle, Christian Beck, Thorsten Lenser, Dov J. Stekel, and Jonathan E. Rowe. "Molecular Circuits for Associative Learning in Single-Celled Organisms." 2008.
15. http://www.flickr.com/photos/cblue98/7254347346/
16. http://www.flickr.com/photos/question_everything/2921759515/
17. http://www.flickr.com/photos/truthout/4521676743/


How Synthetic Biology Creates Scientific Knowledge Daniel Benner

When taught in school, the scientific method is generally presented as a well-structured process of analysis and hypothesis testing that scientists use to gather knowledge about the natural world. However, a strict application of this definition leaves synthetic biology in a difficult place. Rather than conduct analysis or test hypotheses, most projects attempt to engineer a biochemical system that does something interesting or useful. The status of synthetic biology as a science raises an interesting question: as science becomes more technical and specialized,


what distinguishes scientists, who create knowledge, from engineers, who create technology? The term "synthetic biology" was coined in 1974 by Waclaw Szybalski, one of the founders of molecular biology, referring to advances in knowledge and biotechnology that would allow scientists to construct living systems with purposefully arranged genetic material. Since then the field has made massive contributions to biotechnology and medicine. For example, much of the food we eat has been genetically modified. An artificially expanded genetic alphabet is used in FDA-approved diagnostic kits for HIV and other diseases [5]. Genetically engineered E. coli bacteria are used to synthesize thousands of useful proteins, such as insulin for the treatment of diabetics. Additionally, synthetic biology has since tried to create entirely artificial living systems. In 2010, a research group led by Hamilton Smith, who won a Nobel Prize for his contributions to genetic engineering technologies, created a cell with an entirely synthetic genome. The cutting edge of research in synthetic biology is represented by the iGEM competition, where teams from universities around the world attempt to apply engineering principles to living systems. In 2011, 165 teams participated, and results included a cell designed to metabolize cellulose into bio-diesel fuel, a method of protein synthesis that uses DNA as a scaffold to join amino acids in a precise order, and a game of microscopic "Rock, Paper, Scissors," where rock, paper, and scissors are represented by different enzymes, and the winner is recorded by an engineered E. coli bacterium [2]. Synthetic biologists are certainly developing interesting toys and technologies, but how does any of this help us understand the natural world? This tension between the pursuit of knowledge and the pursuit of technology has a long history in the sciences. Within the context of Aristotelian natural philosophy, which dominated Western academic culture until the scientific revolution, practical knowledge was the concern of tradesmen and artisans, certainly not of those interested in understanding the world. Not until the



seventeenth century did writers like Francis Bacon begin to argue that the rigorous study of nature should deliver practical benefits. Today, science and technology are inexorably linked in the public mind. In many disciplines, scientists are pressured to justify their use of coveted funding and facilities with the promise that their research will also generate technological advances. Synthetic biology, in this picture, could easily be viewed as the logical conclusion of this trend: a discipline where the pursuit of knowledge has been entirely subsumed by the desire to engineer. However, many synthetic biologists are adamant that their research does create new knowledge [1]. To understand how this knowledge is created, we will look into the philosophical principles of scientific knowledge and how synthetic biologists apply these principles in their research. Modern thought in the philosophy of science is highly skeptical of knowledge claims, occasionally going so far as to doubt the existence of "true knowledge." Logical positivism—the view that scientific empiricism can logically verify propositions about the natural world—has suffered substantial criticism. The most common view is that scientific knowledge (if, indeed, one wants to call it "knowledge") is generated through a process of falsification. A theory can never be verified, but good theories can survive rigorous testing without being falsified [3]. However, in common use, we speak of scientific knowledge all the time. Most people would be comfortable claiming they "know" the earth orbits the sun, or they "know" the chemical composition of water. Why are we so comfortable claiming positive knowledge when philosophy says science cannot give us any? The answer is that, although we can never logically verify scientific knowledge, in practice, we trust it because it is empowering. It increases our ability to predict and control the natural world. In other words, it does what we would expect positive knowledge to do [1]. The ideas of falsification and empowerment make synthetic biology interesting. If we understand physical mechanics, we should be able to send a satellite into orbit. If we understand carbon chemistry, we should be able to manufacture diamonds. And if we understand molecular biology, we should be able to synthesize living chemical systems. If our understanding is good, the synthesis will succeed. If not, failure will force the scientist to critically review the theory his procedure was based on. For example, each strand of a DNA double helix has a backbone that is linked together by phosphate groups, each carrying a negative charge. Scientists hypothesized that the repeating charges on the two backbone strands destabilized the double helix. Therefore, synthetic biologists set out to create an altered form of DNA that did not have these


negative charges. Initially, synthesis went very well. Short sequences of the altered DNA molecule were able to form double helices. However, attempts to synthesize longer strands of the altered DNA failed. The backbone was not stable, and the chains folded in on themselves [1]. Trying to synthesize altered DNA revealed something that, in hindsight, now seems obvious: that the row of repelling charges in the backbone of the DNA molecule prevented the molecule from folding. Conceptually, however, something deeper happened. The unviability of the altered DNA molecule forced scientists to confront the errors in their theory. A part of the DNA molecule that had previously been considered inconsequential actually played a crucial role in making DNA a viable carrier of genetic information. This insight allowed researchers to make a more general claim about the necessary properties of a genetic molecule: in order to support Darwinian evolution, the molecular structure encoding hereditary information must be able to accommodate mutations without its chemical properties being altered. The repeating backbone charges dominate the properties of DNA. Consequentially, even if a DNA nucleotide sequence is altered extensively, the solubility of the polymer, its ability to form double helices, and other properties central to its biological function are preserved. This makes DNA an excellent molecular system to support change, which evolution requires [1]. The important thing to note is that even though this project failed to create a functioning genetic molecule, it provided a new working theory for genetic molecules that can now be applied to other projects. For example, in the search for extra-terrestrial life in our solar system, probes could use the characteristic repeating charge to identify genetic molecules in life forms that are not based on DNA [1]. This is a case where an engineering project created scientific knowledge. In light of this study (and others in the field), it is worth asking what the difference is between a scientist and an engineer, since it clearly is not what they do, at least in synthetic biology.



What characteristic allows an engineering project to become a science project? The key difference between scientists and engineers seems to be their attitudes towards their failures and successes. As engineers, synthetic biologists want their projects to succeed. When a system behaves differently in practice than it should in theory, an engineer reports a failure, whereas a scientist looks for an opportunity to incorporate this finding into current theory; for scientists, success is uninteresting. Success means that the current theory was adequate to the task. Of course, this is not a bad thing. When developing medicines or technologies, we want our theories to be adequate to the task. But when scientists try to push the frontier of human understanding, success means the project was not ambitious enough. Failures are invaluable teaching tools because they reveal the deficiencies in our current understanding, and so

make way for new theoretical advances. "To be consequential in driving discovery and paradigm change, the challenge should be deepened until the theory fails," says Steven Benner, who designed an expanded genetic alphabet [1]. Science lives at the cutting edge, where we have the most to learn, and success, as such, is only as great as the failures that were overcome to achieve it. Even in the synthesis of the first wholly synthetic bacterial genome, which was accomplished in 2010, failure was encountered at first [4]. Currently the holy grail of synthetic biology is the creation of interchangeable parts. In order to make biology better suited for engineering, researchers have been trying to create a bank of standard "biobricks,"


fragments of genetic code that have a well understood function, which would ultimately let us engineer cells. The problem is that genetics does not lend itself easily to deconstruction. The function of most genes is determined by a complex web of interdependencies and triggers that are still only barely understood. The first synthetic cell was a small but valuable step towards the goal of understanding how living systems work. What is essential to the growth of knowledge is that the failures of the prevailing theory be analyzed critically and undone creatively. This sort of critical analysis allows the process of synthesis to serve as a proxy for hypothesis testing. By exploring challenges and pushing the edge of our practical theories, synthetic biology and related fields have the potential to create new knowledge and deliver practical, revolutionary discoveries. Daniel Benner is studying Psychology and the Philosophy of Science at the University of Chicago.

References
1. Benner S, Yang Z, Chen F. Synthetic biology, tinker biology, and artificial biology. What are we learning? C. R. Chimie 2011;14:372-387.
2. Previous iGEM Competitions [Internet]. iGEM [cited 2012 May 5]. Available from: http://igem.org/Previous_iGEM_Competitions
3. Popper K. The Logic of Scientific Discovery. Routledge Classics; 2010.
4. Venter C. First Self-replicating Synthetic Cell [Internet]. J. Craig Venter Institute [published 2010 May 20; cited 2012 July 19]. Available from: http://www.jcvi.org/cms/press/press-releases/full-text/article/first-self-replicating-synthetic-bacterial-cell-constructed-by-j-craig-venter-institute-researcher/
5. Yang Z, Chen F, Chamberlin S, Benner S. Expanded Genetic Alphabets in the Polymerase Chain Reaction. Angewandte Chemie International Edition. 2010 Jun;49(1):177-180.
6. http://www.flickr.com/photos/berkeleylab/3523749314/
7. http://www.flickr.com/photos/bookhling/5116747360/
8. http://upload.wikimedia.org/wikipedia/commons/b/b5/BayStLouisPesticideCabinetsEPA1974.jpg


Toward a Behavioral Model of International Trade Daichi Ueda

In a field as controversial as economics, disagreements are the norm when it comes to policy. Nevertheless, trade has been the one area of economics where experts have reached a rare consensus [1]. According to Milton and Rose Friedman, "ever since Adam Smith there has been virtual unanimity among economists, whatever their ideological position on other issues, that international free trade is in the best interests of trading countries and of the world." "Yet," they continue, "tariffs have been the rule" [2]. Indeed, tariffs and other trade barriers are remarkably prevalent [3]. There is a stark, baffling divergence between what economists unanimously prescribe and what governments actually do. Whence does this conflict between theory and reality arise? The neoclassical answer is that protectionism is almost entirely an outcome of political dysfunction. Drawing on the fields of economics, psychology, and political science, however, elucidates an alternative explanation – that nations derive utility not only from their absolute wealth, but also from relative wealth. While it is largely undisputed that tariffs are an example of institutional failure, one can gain further insight into the phenomenon by considering how nations incur different economic costs of protectionism. It is worth emphasizing here that the point is not to defend tariffs, but to provide a new explanation for their existence. According to standard economic theory, protectionism is "a form of stealing" that transfers wealth from the general population to a small group of producers [4]. This minority, whose interests are very strongly tied with trade policy – for instance, farmers heavily protected by tariffs and import quotas – is able to organize itself effectively and impose the cost of its welfare (that is, the higher domestic prices of its goods) on a larger but less motivated majority, the consumers. Governments have consistently failed to enact socially efficient policies. The implicit assumption of this "institutional failure" theory is that the total social welfare of a nation depends solely on its absolute level of consumption. Nevertheless, some recent economists have instead chosen to model agents with preferences that depend not only on their own absolute levels of consumption, but also on the consumption levels of relevant reference groups. For instance, Clark et al. proposed an extension of the traditional model of utility based on the "concept of income comparisons – both to

others in the relevant reference group… and to oneself in the past" [5]. To account for the role that status (social comparison) and habituation (temporal comparison) play in satisfaction, they introduced a comparison variable in the utility function, in addition to the usual parameters representing consumption and leisure. Their simple model adequately explains a phenomenon that has long puzzled economists, the Easterlin paradox: real income has grown sharply in most Western nations without an accompanying increase in happiness [6, 7]. Economic growth in the West has not made people significantly happier, both because the higher overall wealth level has raised material norms and because people have become habituated to their wealth. At least as applied to individuals, this notion of reference-dependent preferences is hardly revolutionary. More than a century ago, Thorstein Veblen realized that individuals derive utility from being able to display wealth and social status and thus engage in what he calls "conspicuous consumption" – consumption for the sake of display [8]. Veblen goods, such as diamonds and Rolls-Royces, are not valuable in themselves but in their ability to convey status. James Duesenberry later formalized similar ideas in his relative-income hypothesis, arguing that a person determines his or her consumption behavior based on relative rather than absolute consumption level [9]. In other words, consumers do not only seek to attain a certain fixed standard of living, but instead seek to keep their living standards at least as high as those of people around them. Poverty in America and poverty in Sudan signify vastly different standards of living, but people find both undesirable. Both Duesenberry and Veblen recognized something that all social scientists except economists seem to take for granted: human behavior makes little sense when removed from its social setting. However, it is more recently, with the economic study of happiness flourishing, that this nonstandard theory of utility has garnered strong empirical support. Today, there is an abundance of econometric and statistical studies corroborating the relativity of income utility.
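In rough mathematical terms, a reference-dependent utility function of the kind Clark et al. describe might be sketched as follows; the functional form below is an illustrative assumption chosen for exposition, not the exact specification used in [5]:

U_{it} = u(c_{it}, l_{it}) + v(c_{it} / \bar{c}_{t}) + w(c_{it} / c_{i,t-1})

Here c_{it} is individual i's consumption at time t, l_{it} is leisure, \bar{c}_{t} is the average consumption of the relevant reference group (the status, or social-comparison, term), and c_{i,t-1} is the individual's own past consumption (the habituation term). The Easterlin paradox follows naturally from any specification of this shape: if growth raises c_{it}, \bar{c}_{t}, and c_{i,t-1} roughly in step, the comparison terms barely move, and measured happiness stays flat even as absolute income rises.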



Americans exhibit “spiteful egalitarianism,” in which they feel worse when others around them are richer, but also that this effect is economically significant: “an increase in neighbors’ earnings and a similarly sized decrease in own income each have roughly about the same negative effect on well-being” [10]. Likewise, psychologists Chris Boyce and Simon Moore examined a data set of 12,000 adults in Britain and concluded that income rank within a relevant social group (e.g. neighborhoods, age groups) predicts general life satisfaction better than absolute income [11]. As a result, they argue, a rise in income for one person will, ceteris paribus, reduce the utility of others who lose rank. What is more, this reference dependence of utility is not a phenomenon that occurs only in developed nations. Economists John Knight and Lisa Song observed similar results in rural China. They showed that people in rural China, despite their poverty, are often happy because the utility of current income is low compared to the impact of social and temporal comparisons. Over 60% of rural Chinese people are happy, because they do not feel poor relative to their village peers and past selves [12]. Research consistently finds that relative wealth is a major determinant of individual happiness.

Of course, the preceding considerations concerning individual consumers do not directly lend support to the “behavioral” model of trade proposed earlier (nor undermine the neoclassical explanation), because one cannot immediately draw analogous conclusions about states and nations. Yet insofar as individuals constitute the decision-making organs of any political organization, the aggregate of individual decisions made according to the behavioral model should at least approximate actual government behavior. Moreover, political science provides even stronger arguments in favor of the behavioral framework. The reality is that while individuals

gain a psychological benefit from being relatively better off than other individuals, nations gain far more substantial advantages from being relatively stronger than other states [13]. The dynamics of international politics and diplomacy can be seen as a function of relative levels of power in the same way that human happiness is a function of relative wealth. Thus, one has good reason to argue that America, as a nation state, gains utility from being the world power rather than a regional power [14]. In fact, the very notion that great powers seek to maximize, or at least maintain, their relative power for the security of their states is a fundamental tenet of the realist tradition of international relations [15]. From the perspective of realists, the reason that international politics necessarily occurs in a competitive environment is that the persistent structure of the international system is anarchy – there is no world government that supersedes every national government [16]. Thus, maintaining relative power is the only way for a state to ensure its continuity. According to John Mearsheimer, under some reasonable assumptions – such as that states are rational but wary of one another and that survival is their most basic goal – “the international system creates powerful incentives for states to look for opportunities to gain power at the expense of rivals” [17]. Ultimately, a great power seeks to become a hegemon, the only great power in a system, and dominate other states, because hegemony is the condition that provides the highest probability of survival. In light of realist theory, which emphasizes the paramount importance of relative power, it is clear that states too have a non-classical set of preferences over trade: since economic advantages translate readily into military advantages, states undoubtedly have an interest in the economic

Reproduced from [22]



outcomes of other states. World free trade appears uniformly desirable under neoclassical theory only because that theory ignores the zero-sum aspect of international politics. Within the behavioral framework, by contrast, a government can have rational reasons to prefer the protectionist equilibrium over free trade, because the benefits of trade are asymmetrical [18]. In reality, some states do gain relatively less through trade, so states that weigh relative wealth more heavily than absolute wealth can rationally choose to adopt protectionism.

Asymmetry of free trade benefits arises along two important axes: the degree of specialization and the level of institutional infrastructure. That more specialized economies (such as Luxembourg) gain more from trade than more self-sufficient economies (such as the United States) is intuitive: a hypothetical nation that only produces computers but does so extremely efficiently can thrive under free trade but will starve in the absence of trade. Sam Kortum, a trade theorist at the University of Chicago, calculates that gains from trade range from 85.4% in Estonia to 2.8% in Japan [19]. It is clear from a purely economic standpoint that different economies face different implied costs when confronting tariffs overseas.

The political reason that free trade benefits nations unequally is somewhat more subtle. For instance, the ability to access the benefits of frictionless trade depends on modern infrastructure (such as transportation and telecommunications) and good governance, which developing nations often cannot afford. Thus, free trade “delivers benefits that flow mainly to the well-endowed countries, those with wealth and a reliable nexus of political, social, and economic institutions” [20]. Moreover, powerful countries are able to design global trade rules and influence their implementation to their advantage. According to one calculation, about half of anti-dumping actions are against producers in developing countries, which account for only 8% of all exports [21].

In his 2011 State of the Union Address, President Obama captured something fundamental about the American psyche and international politics when he preached the need to “out-innovate, out-educate, and out-build” America’s global competition. Indeed, this competitive outlook on the global market seems to be one of the few areas of common ground between

the Democrats and the Republicans – like Obama, Romney argued during his 2008 presidential primary campaign, “We must be ready to compete... If America fails to act, we will be eclipsed.” Whether one studies America as a state or Americans as individuals, one would have an incomplete understanding without looking at how America does relative to other nations. Human beings feel pleasure in being better,

and states gain security in being stronger. Trade theories must account for this fact. To be clear, these considerations imply neither that neoclassical economics is false and useless nor that free trade is unrealistic and undesirable. Instead, they are meant to illustrate an aspect of the mechanism perpetuating protectionism that cannot be understood through a purely neoclassical lens. The simplifying assumptions of economics have proved extremely useful in making the complexities of real-life situations tractable for analysis. Nevertheless, the ambitious scope of the field necessitates that economists seek help from other social scientists in order to attain the richest possible view.

Daichi Ueda is an undergraduate at the University of Chicago, studying math and economics. He believes that we can understand most phenomena better by taking an integrative approach that does not confine itself to a single discipline.
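For readers who want the comparison idea in symbols, a minimal sketch of a reference-dependent utility function in the spirit of Clark et al. [5] might look as follows (the notation is illustrative only, not the authors' own specification):

U_i = u(c_i, \ell_i) + \alpha \, v(c_i - \bar{c}) + \beta \, v(c_i - c_{i,t-1}), \qquad \alpha, \beta \ge 0,

where u(\cdot) is ordinary utility from own consumption c_i and leisure \ell_i, the term v(c_i - \bar{c}) captures status by comparing own consumption with the reference group's average \bar{c}, and the term v(c_i - c_{i,t-1}) captures habituation by comparing it with one's own past consumption. With \alpha = \beta = 0 the standard absolute-consumption model is recovered; with \alpha, \beta > 0 and comparisons doing much of the work, a rise in everyone's consumption raises \bar{c} and one's own habituation benchmark in step, muting the gain in U_i – the Easterlin pattern described above.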

References
1. Whaples R. Do Economists Agree on Anything? Yes! The Economists’ Voice, 2006.
2. Friedman M, Friedman R. The Case for Free Trade. Hoover Digest. Hoover Press; 1997.
3. There is no major economy in the world without tariffs, according to the World Tariff database.
4. Griswold D. Seven Moral Arguments for Free Trade. The Cato Institute, 2002.
5. Clark A et al. Relative Income, Happiness, and Utility: An Explanation for the Easterlin Paradox and Other Puzzles. Journal of Economic Literature, 2008.
6. Easterlin R. Will Raising the Incomes of All Increase the Happiness of All? Journal of Economic Behavior and Organization, June 1995.
7. To be fair, Betsey Stevenson and Justin Wolfers argue that Easterlin drew flawed conclusions because he used insufficient data. While their work contradicts the most extreme claims of Easterlin – that absolute income plays no role in happiness – it certainly does not deny the importance of relative income.
8. Veblen T. The Theory of the Leisure Class. 1899.
9. Duesenberry J. Income-Consumption Relations and their Implications. 1948.
10. Luttmer EFP. Neighbors As Negatives: Relative Earnings And Well-Being. Quarterly Journal of Economics, 2005.
11. Boyce C et al. Money and Happiness: Rank of Income, not Income, Affects Life Satisfaction. Psychological Science, April 2010.
12. Knight J et al. Subjective Well-Being and Its Determinants in Rural China. China Economic Review, 2009.


13. One way to formalize this “substantial advantage” is to think of international (economic) rank as a public good shared by all the members of a particular nation. Individuals have their own psychological reason to desire higher rank for their nations, but it is at the aggregate, social level that we should analyze the benefit of public goods.
14. America here refers to the collection of its citizens. I envision the government as an agent that seeks to maximize the total social welfare of its citizens.
15. Donnelly J. Realism and International Relations. Cambridge: Cambridge University Press; 2000.
16. Waltz K. Theory of International Politics. Reading, MA: Addison-Wesley; 1979.
17. Mearsheimer J. The Tragedy of Great Power Politics. New York: W.W. Norton & Company; 2001.
18. Basic game theory predicts that a bilateral trade situation in which one party practices free trade and the other party is protectionist is unlikely to persist.
19. Eaton J, Kortum S. Putting Ricardo to Work. Journal of Economic Perspectives, Spring 2012.
20. Yotopoulos P. Asymmetric Globalization: Impact on the Third World. In: The Asymmetries of Globalization. London: Routledge; 2006.
21. Birdsall N. Asymmetric Globalization: Global Markets Require Good Global Politics. Brookings Review, Spring 2003.
22. http://commons.wikimedia.org/wiki/File:Diamonds.jpg



DIGITAL HIGHLIGHTS

Highlights from the Blog
Articles featured here are selected from The Triple Helix Online. To read the full article, browse more subjects, and join the conversation on the web, visit triplehelixblog.com.

Criteria for Habitable Planets and Implications for Earth
by Sunny Parmar, Georgetown University

“The search for extraterrestrial life has always been one of our greatest fascinations with the universe. . . How successful has this search been in actually finding an Earth-like planet? And what can we do when we do find such a planet?” The article goes on to describe criteria for habitability and consider these questions.

Biological and Psychological Effects of Human Space Flight
by Isabelle Boni, University of Chicago

While important innovations have come out of space exploration, technology is also being developed to minimize the various strains associated with space travel. The conclusion affirms that space travel is worth its costs: “With all these possible outcomes in mind, is it still worth it to send astronauts into space? Given the redeeming scientific boons of space exploration and progress made in solving problems related to it, the answer is, increasingly, yes.”

Leaf it to Me: Biomimicry and the Artificial Leaf
by Prathima Radhakrishnan, University of Chicago

A fresh, optimistic look at environmentalism centers on the development of the artificial leaf as an example of the potential of technology to pave the way to a brighter future: “By mimicking nature’s design principles, engineers and scientists are creating products that are perfectly adapted to solve problems we face today. . . we can save nature by paying her the greatest compliment: imitation.”

Using a Virus to Treat Cancer
by Ayush Midha, The Harker School

Although the potential of using viruses against cancer was first discovered two centuries ago, recent research has revived interest in this particular treatment method: “With more research and testing, the oncolytic virus has the chance to revolutionize medicine by not only being the most effective treatment of cancer, but also the most improbable treatment.”

Paramedics: A Danger to Patients?
by Michael West, George Washington University

Some question the effectiveness of “the endotracheal intubation (ETI), a technique in which the paramedic inserts a tube down a patient’s throat in order to use it as an airway adjunct,” while others vouch for its effectiveness amongst paramedics. More training and better instruction may be necessary to ensure that the risks of ETI are far outweighed by its effectiveness.

Nuclear Energy: The Future of Bioremediation
by Monica Kumaran, The Harker School

Considering the difficulties and assets of nuclear energy, the conclusion states that “If the world is to switch more of its energy to nuclear sources, governments and businesses involved in the industry must increase transparency and accountability. Until then the future use of bioremediation will have to become a way for society to clean up nuclear mistakes of the past, not further nuclear energy in the future.”



ASU

The Current State of the American Discussion of Climate Change: Is It Possible for Science to Inform Policy?
Taylor A. Murray

The anthropogenic warming of the atmosphere is now unequivocal. In its latest report, the United Nations Intergovernmental Panel on Climate Change (IPCC), an elite body of thousands of the world’s leading scientists, noted: “there is at least a 90 percent chance that human activities, mainly burning fossil fuels, are to blame for most of the warming in the last 50 years.” In its previous report, in 2001, the IPCC noted that the link was at least 66 percent certain. The growing scientific consensus on human-caused climate change represents the issue’s currency within the scientific community and presumably establishes the bedrock for public policy makers to begin acting. Yet there still remains a vacuum of effective responses to address climate change, or the “world’s most pressing problem,” as United Nations Secretary General Ban Ki-moon once described the issue.

While science gains certainty about the causes of climatic change, the American public has grown less certain. The American public has been shown to lack an understanding of the fundamental elements of the scientific discipline – the public wants proof that climate change is real and that we are the cause [1]. Although there are existing methods that can be used to increase the certainty of science, most philosophers of science agree that there is no ironclad means to prove a scientific theory [2]. Misunderstanding between


science and the public, and between science and policy makers, is causing the capacity for science to function in a “normal” way to grow increasingly dubious – old dichotomies of facts and values, and of knowledge and ignorance, are being transcended [3]. “Normal” science, a term coined by Kuhn [4] and based on skepticism and disinterestedness, is giving way to an era termed “post-normal” science, which is characterized by the breakdown of scientific communication, an anomalous role for scientists in the political sphere, and the increased role of uncertainty in the formation of scientific consensus [3].

Given the breakdown in communication and understanding between science and the public, other sources of information have become increasingly significant in their ability to influence perception. Where science struggles to communicate with society, the general public is often left to rely upon mass media to gain knowledge of scientific findings. Studies have shown that the public assimilates much of its knowledge about climate change science from the media [5,6,7]. This essay examines the role of American media in discussions of climate change and concludes that it is problematic for three main reasons: 1. the media lacks the toolkit necessary to accurately represent multi-faceted scientific issues to the public and to encourage informed debate; 2. the media remains open to influence, which may promote an inaccurate or, at times, biased account of the issue; and 3. the public, in turn, perceives this bias. All of these factors play a noteworthy role in the disconnect between the American public and climate science, which hampers policy.

Reproduced from [44]



Journalism Generalized: The End of Didactic Debate

As mass media gains traction as a public source of climate change knowledge, the characteristic norms of media become increasingly important. Accordingly, the relationship between the public and what the media conveys is “defined, in large part, by the many journalistic norms and values that both affect what is deemed news and influence how that news is framed” [8]. Three principal norms shape how the media disseminates information: political norms – media provides the proper information to citizens that can be used to hold politicians accountable for actions; economic norms – journalists operate in a hostile environment which is mediated by monetary and time-oriented constraints; and journalistic norms – media adheres to principles such as objectivity, fairness and balance. The media generally complies with journalistic norms by providing equal amounts of time to both contrarian and mainstream science in discussions of climate change, even when contrarian science represents a diminutive portion of the scientific community. In this way, balance becomes bias.

It is clear that editors and reporters must navigate through many pressures when reporting news. Often, acting in accordance with journalistic norms proves challenging and can undermine meaningful and accurate coverage of complex issues like climate change. For instance, economic norms may limit the time reporters and editors can spend on a particular story and thus engender pressures from journalistic norms at the expense of accuracy. Creating the impression of “balance” – by giving contrarian and mainstream science equal time in discussions of the issue – can serve as a crutch for reporters when they lack the requisite scientific background or knowledge or are facing time-related constraints [9]. Representing the issue in an equitable fashion requires research and resources that the media may not always have. By attempting to act in accordance with journalistic norms of balance, the media can misrepresent the state of climate science, as there is no real “debate” between contrarian and mainstream science in the scientific community. Giving contrarian and mainstream scientists equal time in discussions of climate change (often called the “dueling scientist technique”) creates the impression that climate science is not settled, when, for all intents and purposes, it is. Boykoff & Boykoff state: “These opposing scientists, who receive ‘roughly equal attention,’ create the appearance of a hot scientific debate between the upper echelons of the science community, which elides the fact that on one ‘side’ there are thousands of the world’s most reputable climate-change scientists who vigorously engage the process of peer review, while on the other side there are only a few dozen naysayers who generally have not had their skeptical assertions published in peer-reviewed publications” [9].

Further, Boykoff & Boykoff investigated the balance norm applied to anthropogenic climate change in four major United States newspapers from 1988 to 2002 [8]. The study found that the majority of articles featured balanced accounts of the problem, and gave “roughly equal attention” to theories depicting anthropogenic and natural causes of climate change. By adding a skeptic to the mainstream scientific message, audiences become less likely to believe that scientists agree on climate change, less certain that the earth is warming, less likely to believe climate change is a very serious issue, and less likely to support government action to deal with the issue [10]. Giving equal time to both skeptic and consensus views effectively belies the state of consensus within the scientific community, while propagating uncertainty that undermines the basis for public concern on the issue. A dichotomized debate also exploits and reinforces the public’s misconceptions of science as a discipline that can guarantee certainty. When contrarians use the inherent uncertainty of science as an argument against climate change, it causes the public to believe science has the ability to be certain. This view is counterproductive to the establishment of a pedagogical debate about what to do about climate change, as it misrepresents the capabilities of science.

Political norms within journalism also affect how the mass media conveys the problem of climate change. Because media is diverse, various media outlets will report stories related to climate change differently. Confirmation bias – a tendency of people to favor information that confirms their beliefs – can lead to bad decision-making and attitude polarization. Sociological studies carried out by the Woods Institute for the Environment at Stanford University and the Center for International and Security Studies have shown that American viewers of certain partisan television news networks are less likely to agree with the anthropogenic causes of climate change and the scientific consensus [11,12]. Such findings are consistent with recent research by the Yale Project on Climate Change and the George Mason University Center for Climate Change Communication that suggests that those holding more extreme views of climate change – on both ends of the spectrum – are most likely to gauge themselves as “very well informed” on the issue [13].

A Week After Tomorrow

Political and journalistic norms can also merge to undermine the state of climate science. Mass media coverage of climate change since the early 1990s has disproportionately focused upon the uncertainty of climate science and scientific controversy [14]. Gavin Schmidt, a climate modeler at NASA’s

Reproduced from [45]



Goddard Institute for Space Studies in New York, classified the current environment in which climate scientists work as “insane” [15]. The contentious atmosphere has materialized in scandals involving climate scientists and their science, the most prominent of which was the massive release of hacked emails from the Climate Research Unit (CRU) at the University of East Anglia in the United Kingdom, also known as “climategate.” In one email, a scientist mentioned that he had just completed a “trick” involving the use of tree ring proxy data and direct temperature measurements. This trick used direct temperature measurements to fill in temperatures for years in which the tree ring data was least reliable [16]. Unfortunately, the CRU scientists did not label which temperatures were proxies and which were direct measurements. This was the only criticism of the scientists in an independent review of the scandal, which found “no evidence” that the data presented in the IPCC report was misleading [16]. The validity of the science, and the conclusions related to it, were not in question. The scandal did, however, provide fuel to the media narrative of a fiery scientific debate about climate change. The example of climategate highlights how media falls victim to dramatization of fact and the public’s desire to be

entertained. Novel images of bodies of conspiring scientists are diverting, and can change public perception. Such examples are also evident in Hollywood thrillers depicting exaggerated effects of climate change, such as “The Day After Tomorrow.” Motion pictures offering an escape from reality succeed in further compounding American misperceptions of the very real problem of climate change. Anchorman Dan Rather once described the film as an exemplar of the “never-ending debate over global warming...” and indicated that “a sci-fi flick is a catalyst for a fight over science facts” [9,17]. Media dramatization and preoccupation with audience stimulation have resulted in the tendency for public perceptions of climate change to become increasingly focused on low-probability risks that receive high media attention [5]. This misperception hampers informed public policy and the societal consensus consistent with its formation.

Made in America

Examining how the American media operates shows its capacity


Reproduced from [46]

to misinform the public. In contrast with international media, and in contrast with scientists’ views, the media in the United States tends to distort the scientific traits of climate change [18]. Facts and analysis are more “freely intermixed” in U.S. news reporting than in other countries [18]. In a multi-national study conducted by Thomas Patterson and Wolfgang Donsbach on the understanding of objectivity, American journalists were shown to disagree over the basic meaning of objectivity in journalism, as they often expressed contrasting definitions of the concept [19].

Even agreed-upon accounts of objectivity are not always safe from the everyday threats American journalists face. Often, market competition in the media plays into the hands of those who provide the media with information. For the media, remuneration for prized intelligence can necessitate pleasing a vested interest and compromising journalistic standards. Howard Kurtz, a reporter for the Washington Post, explains how “a publicist hired by United Airlines and US Airways offered three major newspapers a deal that none of them could refuse. The pitch: We’ll give you the exclusive details of a $5 billion merger if you promise not to call any outsiders for comments.” It was later documented that all three papers – the Washington Post, the New York Times, and the Wall Street Journal – complied [18, 20]. Kurtz’s example highlights how practices that would uphold the journalistic principle of objectivity, such as fact checking and impartiality, can be compromised by contemporary journalistic norms.

The corporate nature of mass media may represent the most formidable challenge to the public’s fair understanding of climate change. The commercial media system is dominated by a small number of powerful, U.S.-based media corporations. This system was largely developed in the 1980s from pressure by the International Monetary Fund, the World Bank, and the U.S. government to deregulate and to privatize mass media [18]. These actors control access to information and have the power to marginalize dissent and to discredit political beliefs that pose a threat to their interests. As climate change can represent a fundamental challenge not only to industry but also to the notion of progress and modernity, control of the mass media becomes vital in maintaining power structures. It is evident that proponents of the status quo have used media to maintain power. It follows that the majority of “news” articles published by some of the most popular sources of media in America, namely the New York Times, the San Francisco Chronicle and the Washington Post, have suffered from pro-corporate bias when reporting on climate change in comparison with mainstream scientific literature [21]. Media



has also been shown to repeatedly cite a small minority of scientists whose views happen to coincide with those of the oil, coal, and petrochemical industries [21]. Consistent with these findings are allegations of fraud, bribery and corruption amongst the world’s leading media conglomerates, which have spawned FBI and independent inquiries on behalf of the American and British governments [22,23].

By becoming largely dependent on advertising for profits, the media can easily become indebted to the corporations that keep them in business. This dependence can undermine objectivity in news. Newspapers now obtain seventy-five percent of their revenue from advertising, whereas broadcast media obtains almost one hundred percent of its revenue from advertising [24]. According to a June 2012 report released by the Interactive Advertising Bureau (a group of 500 Internet and technology firms responsible for selling 86 percent of online advertising in the U.S.), Internet advertising revenues in the first quarter of 2012 totaled a new industry record of $8.4 billion [25]. Companies paying media to advertise their products have the power to dictate the types of stories the media can and cannot produce. For instance, Procter & Gamble, a manufacturing company, once instructed a media business partner: “There will be no material on any of our programs which could in any way further the concept of business as cold, ruthless, and lacking all sentiment or spiritual motivation. If a businessman is cast in the role of villain, it must be made clear that he is not typical but is as much despised by his fellow businessmen as he is by other members of society. Special attention shall be given to any mention, however innocuous, of the grocery and drug business as well as any other group of customers of the company. This includes industrial users of the company’s products, such as bakeries, restaurants, and laundries” [24].

Failure by Design

Public opinion polls in the United States indicate that the media has a major influence on how Americans view climate change. While climate science has grown more confident in the anthropogenic causes of the issue, the American public has shown a decrease in confidence. A 2010 study conducted by the polling organization Gallup indicated that nearly 40% of Americans believe that scientists are unsure about global warming, while 10% of those polled said that most scientists believe global warming is not even occurring. The same study indicated that the percentage of Americans who thought most scientists believed global warming was occurring dropped in 2010 to the lowest level recorded by the organization since 1997 [26]. A June 2012 poll conducted by Stanford University and the Washington Post found that only 18 percent of Americans polled indicated climate change as the biggest environmental problem in the world [27]. Newport [26] also indicated that 48 percent of Americans believe that the seriousness of global warming is exaggerated by the media. The media has diminished Americans’ trust of climate science. The media courier cannot deliver science’s messages.

The American opinion of climate change also differs greatly


Reproduced from [47]

from international opinion. A major factor in this variation is the prevalence of a “debate” in American media on whether climate change is caused by human or natural sources; this debate is largely exclusive to America [28]. According to a 2011 study conducted by Gallup, America has the highest percentage of population of any country (47%) that attributes the effects of global warming to natural causes [29]. It has also been shown that, when compared to the 15 other surveyed populations, Americans worry about global warming the least, with almost 25% of the population saying it is not worried about the problem at all [30]. Lastly, the proportion of Americans who think climate change is a threat to them or their families fell from 2007 to 2010, at a rate faster than the equivalent proportions in any of the other top greenhouse gas emitters – China, Russia, Japan and India [31].

Policy Implications – Management By Objectives

The media plays a strong role in influencing public opinion, which in turn influences political actors [32]. While media is not the only factor influencing public opinion, investigation of the other factors lies outside the scope of this essay. Regardless, Carvalho reported that research has shown the media to be the main source of information for the public and the main factor influencing public views on climate change [6]. Plotting the translation of public opinion into policy is inherently difficult. Nevertheless, correlations between public opinion of climate change and policy formation have been shown to exist. Tjernstrom & Tietenberg showed that individuals’ attitudes toward climate change play a significant role in the establishment of national policies to reduce greenhouse gas emissions; the study also showed that “democratic institutions and structural conditions have important functions in translating attitudes into policy” [33]. When the media repeatedly cites a minority of scientists whose views happen to coincide with those of an industry largely opposed to regulation, it undermines the basis for informed action. Jenkins explains how the media visibility of environmental organizations plays a role in policy formation [34]. From the early 1980s to the mid-1990s, global warming advocates enjoyed more public visibility in the media than contrarians. But in 1997, when the U.S. Senate, opposing international agreements on climate change, passed the Byrd-Hagel Resolution, opponents of climate



change legislation garnered more public visibility from the media than advocates. Also during this period of “advocacy imbalance,” the Bush Administration withdrew from the Kyoto Protocol negotiations and the Senate failed to pass the McCain-Lieberman Act, which would have created a federal cap-and-trade system to regulate greenhouse gas emissions for the first time in the U.S. [34]. Public opinion polls throughout the late 1980s showed that the public preferred immediate action on climate change, and subsequent polls in the 1990s showed that this preference for policy “sharply declined” [35]. Importantly, McCright and Dunlap highlight the ongoing, successful movement by conservative think tanks and fossil fuel corporations, in cooperation with climate change skeptics and first facilitated by the 1994 Republican takeover of the U.S. Congress, to use media to characterize climate change as a “non-problem” [36]. Some of the details of the overarching movement reveal the media as a key player. In the spring of 1998, industry opponents of climate policy assembled at the American Petroleum Institute offices in Washington, D.C. to put together a $600,000 budget that would recruit and train dissident scientists in public relations, with the goal of convincing journalists, politicians, and the public that climate policy was not warranted. The success of the program was to be measured “by tallying the number of news articles that raise questions about mainstream climate science and the number of radio talk show appearances by scientists questioning the prevailing scientific view” [37].

Conclusion

When science and the public experience a normal communicative relationship, it allows the public to be an advocate for environmental concerns. The environmental movement in the United States in the 1960s and 1970s was one of the most successful popular movements in U.S. history. Public engagement in the formation of scientifically informed environmental policy during this period helped enact a mélange of historic environmental laws, or “epic victories” [38]. From the mid-1970s – as anti-environmental corporate interests began to fully engage in campaigns against environmental concerns like climate change – to 2010, every political group except those classifying themselves as liberals lost trust in science [39]. The scientific message is undoubtedly being altered as it is filtered through the airwaves and typewriters of the American media.

The relationship between science and the public has become exceptionally convoluted. The post-normal state of science blurs the lines between what is indeed science and what is an artifice of different media narratives. Science in the post-normal age is uncharacteristically malleable. A recent study published in the journal Nature Climate Change found that as members of the American public display higher science comprehension, they become increasingly skilled at conforming scientific evidence to their own cultural and social values; the same study indicated that scientific literacy among individuals does not predict perceptions of climate change [40]. Such findings complicate the overfamiliar theory that attributes low concern with climate change to limits on the ability of the public to


engage in scientific reasoning. Evidence suggests, rather, that low public salience of climate change in America is not due to a lack of scientific reasoning, but that an entrenched scientific “debate” has fundamentally changed how science is understood and applied.

While social and cultural beliefs will seemingly continue to change the meaning of science for individuals, the scientific community can take steps to safeguard objectivity. This can best be achieved by strengthening the scientific method. By sharing data more freely, climate scientists can bolster the external review and experiment replication processes, which will help eliminate the subjectivity and social constructivism that threaten to undermine science in the post-normal era. Philosopher and medical doctor Ludwik Fleck was one of the first to propose this as an additional requirement of the scientific method, as he saw firsthand the value of the development of scientific truth in his work as a physician [41]. Scientists could form organizations devoted to enhancing the scientific method, which would address the difficulties in rendering “fact” independent of “point of view,” as Fleck envisioned. Such organizations could perhaps increase the margins between the robust machinery of science and the media’s misshapen notion of the contours of the debate.

The American media cannot be a reliable primary source of knowledge on climate change. It has exacerbated the decline in the quality of the “debate” and, worse, capitalized on that process as well. Although it is unclear what role scientists will play in communicating with the general public in a post-normal age, media based on profit, entertainment, and factional bias has not been a satisfactory alternative. A didactic platform for media to represent climate change is crowded out by journalistic norms and corporate affinities. An examination of the effect that media has on the climate discourse makes clear that this is only one symptom of larger systemic problems related to how the media portrays science. The stakes are heightened when policy implications are in play. There remains a need for the public to gain the basic scientific knowledge necessary to frame the issues in a manner that reflects the actual consensus within the global scientific community. Without this knowledge, the public will remain vassals of the interests behind the scenes. Similarly, media must base itself on objectivity and equity, not the competing interests to which it has been proven vulnerable.

The operations and characteristics of American media contrast with other global media sources, which causes American media to represent the problem of climate change in a distinct fashion. Polls showing American opinion as a continuing outlier in comparison to the scientific consensus and to the international public are emblematic of this dissimilarity. The increasing confidence of climate science alone cannot stimulate public confidence in science. It should be noted that the United States has been regarded as a powerful obstacle in facilitating international climate action, becoming the only developed nation not to ratify the Kyoto Protocol in 2004, although it was the world’s largest greenhouse gas emitter at the time. A lack of policy is a by-product of the general apathy towards the issue. Despite an economic downturn, in November



of 2011, the U.S. Department of Energy reported that carbon dioxide emissions had “jumped by the biggest amount on record,” to a level higher than the worst-case scenario modeled by the IPCC [42]. To be sure, not all media is biased or dissonant with the scientific consensus. Former U.S. Vice President Al Gore’s efforts to educate the public on climate change were highlighted in the documentary film “An Inconvenient Truth,” which was released in 2006. Such a production represents the successful facilitation of popular media on the topic of climate change that is largely consistent with mainstream science. According to a 2007 report released by the Environmental Change Institute at the University of Oxford, almost 90 percent of North Americans surveyed said they had become more aware of the problem after seeing the documentary, while 75 percent of North Americans said they were changing some habits after seeing the film [43].

The negative influence of mass media on American discussions of climate change offers no easy solutions. Public broadcasting sources and not-for-profit media outlets may help address some of the issues discussed. It should also be encouraging that individual attitudes have been shown to correlate with the establishment of climate change policies [33]. It is clear that at some point the hard facts of climate change will deliver the message that other voices have resisted. Whether the American public must wait for such a moment may be anticipated by gauging the signs that, perhaps against the odds, society can assimilate the best of its science.

References 1. Miller J. The Conceptualization and Measurement of Civic Scientific Literacy for the Twenty-First Century. In J Meinwald editors. Science and the Educated American: A Core Component of Liberal Education. : American Academy of Arts and Sciences; 2010. p.241-255. 2. Oreskes, N. The Scientific Consensus on Climate Change: How Do We Know We’re Not Wrong? In: Climate Change. Joseph F. DiMento, Pamela Doughman, editors. MIT Press. 2007. p. 66-98. 3. Funtowicz, S., Ravetz, J. Science for the post-normal age. Futures. 1993. 25(7): p. 739-755. 4. Kuhn, T. The structure of scientific revolutions. Chicago. University Of Chicago Press; 1962. 5. Boykoff, M. T. Carbonundrums: The Role of the Media. In S. Schneider, Author. Climate Change Science and Policy. Washington, DC: Island Press. 2010: p. 397-404. 6. Carvalho, A. Media(ted) discourses and climate change: A focus on political subjectivity and (dis)engagement. Wiley Interdisciplinary Reviews: Climate Change. 2010: p. 172-179. 7. Wilson, K. Mass media as a source of global warming knowledge. Mass Communication Review, 22(1&2). 1995: p. 75-89. 8. Boykoff, J., Boykoff, M. Balance as bias: Global warming and the US prestige press. Global Environmental Change (14). 2004: p. 125-136. 9. Boykoff, M., Boykoff, J. Climate change and journalistic norms: A case-study of US mass-media coverage. Geoforum, (38). 2007: p. 1190-1204. 10. Malka, A., Krosnick, J. A., Debell, M., Pasek, J., & Schneider, D. Featuring skeptics in news media stories about global warming reduces public beliefs in the seriousness of global warming (Report). 2009. Woods Institute for the Environment Stanford University. 11. Krosnic, J. A., MacInnis, B. Frequent viewers of Fox News are less likely to accept scientists’ views of global warming. Report. 2010. Woods Institute for the Environment Stanford University. 12. Ramsay, C., Kull, S., Lewis, E., Subias, S. Misinformation and the 2010 election: A study of the US electorate. Report. 2010. WorldPublicOpinion.org, The Program on International Policy Attitude and The Center for International and Security Studies at Maryland. 13. Maibach, E., Roser-Renouf, C., Leiserowitz, A. Global Warming’s Six Americas 2009: An Audience Segmentation Analysis. Report. 2010. Yale Project on Climate Change and the George Mason University Center for Climate Change Communication. 14. McCright, A. M., Shwom, R. L. Newspaper and Television Coverage. In S. H. Schneider, Author. Climate Change Science and Policy. Washington, DC: Island Press. 2010: p. 405-413. 15. Schiermeier, Q. The real holes in climate science. Nature (463). 21 Jan., 2010: p. 284-287. 16. Merali, Z. UK climate data were not tampered with. 2010. Nature News: http:// www.nature.com.ezproxy1.lib.asu.edu/news/2010/100707/full/news.2010.335.html 17. Bowen, J. Global Warming: Movie Debate. 20, May, 2004. CBS Evening News. 18. Dispensa, J., Brulle, R. Media’s social construction of environmental issues: Focus on global warming - a comparative study. 2010. International Journal of Sociology and Social Policy, 23(10). 19. Donsbach, W., Patterson, T. E. Political news journalists: Partisanship, professionalism, and political roles in five countries. Comparing political communication theories, cases, and challenges. 2010. Cambridge University Press. 20. Sexton, J. 08, October 1999. Corporate profiles. FAIR. 21. Nissani, M. Media coverage of the greenhouse effect. 1999. Population and Environment, 21(1).

22. Pilkington, E. FBI to investigate news corporation over 9/11 hacking allegations. 14, July 2011. The Guardian. 23. Stringer, D. UK minister: Press must face tougher penalties for breaching standards after hacking scandal. 12 February 2012. Washington Post. 24. Bagdikian, B. H. The media monopoly (5), 1997. Beacon Press. 25. Interactive Advertising Bureau [Online]. 2012 June 12 [cited 2012 June 25]; Available from: URL:http://www.iab.net/about_the_iab/recent_press_releases/press_release_archive/press_release/pr-061112 26. Newport, F. Americans’ global warming concerns continue to drop. Report. 2010. Washington, DC: Gallup. 27. Stanford University, Washington Post [Online]. 2012 July 2 [cited 2012 July 5]; Available from: URL:http://www.washingtonpost.com/wp-srv/nation/documents/ global-warming-poll.pdf 28. Dorsey, M. K. Climate justice on the road to Durban and beyond: movements below, paralysis above? Lecture presented at The Sustainability Series in Wrigley Hall Room 481, Tempe, Arizona. 23, March 2011. 29. Pugliese, A., Ray, J. Worldwide, Blame for Climate Change Falls on Humans. Report: 2011. Washington, DC: Gallup. 30. Pew Global Attitudes Project. No global warming alarm in the U.S., China. 2006. Washington, D.C.: The Pew Research Center for the People & the Press. 31. Rettig, J. Fewer Americans see climate change a threat, caused by humans. 26, August 2011. US News. 32. Speck, D. L. A hot topic? Climate change mitigation policies, politics, and the media in Australia. Human Ecology Review, 17(2). 2010. 33. Tjernstrom, E., Tietenberg, T. Do differences in attitudes explain differences in national climate change policies? Ecological Economics, 65: p. 315-324. 2008. 34. Jenkins, C. Democratic politics and the long march on global warming: Comments on McCright and Dunlap. The Sociological Quarterly, 52: p. 211-219. 2010. 35. Nisbet, M. C., Myers, T. The polls-trends: Twenty years of public opinion on global warming. 2007. Public Opinion Quarterly 71(3): 2010 36. McCright, A. M., Dunlap, R. E. Defeating Kyoto: The conservative movement’s impact on U.S. climate change policy. Social Problems, (50). 2010: p. 348–373. 37. Cushman, J. H. Industrial group plans to battle climate treaty. 26, April 1998. New York Times. 38. Shellenberger, M., Nordhaus, T. (2004). The Death of Environmentalism. Report, 2004. 39. Gauchat, G. Politicization of Science in the Public Sphere: A Study of Public Trust in the United States, 1974 to 2010. American Sociological Review, 77 (2): 2012, p.167187. 40. D. Kahan, E., Peters, M. Wittlin, P Slovic, L. Ouellette, D. Braman, G. Mandel. The polarizing impact of science literacy and numeracy on perceived climate change risks. 27, May 2012. Nature Climate Change. 41. L. Fleck. The Genesis and Development of a Scientific Fact. 1979. Chicago: University of Chicago Press. 42. Chomsky, N. ‘Losing’ the world: American decline in perspective, part 1. 14, February 2012. The Guardian. 43. Climate Change and Influential Spokespeople. Report. 2007. Nielsen Company & the Environmental Change Institute. 44. Original image by NASA available from: http://www.nasa.gov/images/ content/187737main_AndrewSequence_hg.jpg 45. Original image by USGS available from: http://mrib.usgs.gov/cch/ 46. Original image by NASA available from: http://climate.nasa.gov/news/?FuseActio n=ShowNews&NewsID=745 47. Original image by EPA available from: http://www.epa.gov/climatechange/images/basics/factorysmoke.jpg


Taylor A. Murray ’12 graduated from Arizona State University with degrees in Global Studies and Sustainability. He is an MSc candidate at the University of Oxford who is particularly interested in climate change and the philosophy of science.



CAMBRIDGE

The Potential of Viruses in Medical Treatments
Elizabeth Richardson

Viruses present persistent dangers to human health, and they have caused many of the most terrifying and lethal diseases throughout history, from smallpox to influenza and HIV. They are the smallest pathogens found in nature and among the most mysterious; although several scientists in the nineteenth century speculated on the idea of a pathogen smaller than a bacterium, viruses could not be observed until the invention of electron microscopy in 1931 by Ruska and Knoll. Despite massive advances in our understanding of viruses, they continue to cause some of the worst epidemics known. There is no antibiotic equivalent for the virus: the only weapons we have against the spread of viral disease are vaccinations and drugs that work by slowing, but not preventing, their replication. However, with advances in genetics and the development of sophisticated techniques that allow material on the scale of viruses to be manipulated and modified, viruses are increasingly appearing in a different light. It is becoming probable that one of the world’s deadliest killers may become the unlikely saviour of twenty-first century medicine.

Viruses consist of only a few elements. The genetic material at their core can be either double- or single-stranded RNA or DNA. The genome is usually extremely small, which enables rapid replication; influenza has 9 genes, whereas smallpox has about 150 genes. Protecting this genetic material is a protein coat, which can allow the virus to survive for several hours outside the host cell. Once a virus has infected a cell, it will

hijack the replication machinery of the host and use it to produce more copies of its own genome. This tiny genome is usually less than 50 kilobases long and can be quickly replicated in a host cell. Lacking a metabolism of their own, viruses are entirely dependent on the host cell to replicate, in the purest form of parasitism.

Viruses are best known as destructive and persistent killers. But what is it about a virus that makes it such a difficult pathogen to destroy? Viruses have evolved over time to do two things very well: infect cells and replicate their own genetic material inside them. Once a virus has infected a cell, it is almost impossible to kill the virus without also destroying the cell, and viruses replicate extremely quickly – H1N1, the influenza strain responsible for pandemics from Spanish flu in 1918 to swine flu in 2009, can prove lethal to a newly infected mouse in just 4-6 days [1]. The 1918-20 pandemic resulted in the deaths of 50-100 million people. Fast replication and efficient cell infection make viruses deadly pathogens in every organism, from bacteria to plants and animals.

However, the exact qualities that have made viruses so deadly are those that we can exploit for our own benefit. Since the first advances were made into the study of genetics, viruses have become a common vector for introducing recombinant DNA into a target cell. As viruses require very few genes themselves to survive and replicate – relying mostly on the replication machinery of the host cell – the viral genome can be replaced with another gene sequence that can then be carried into a cell. This has enabled significant advances in the techniques used to study genetics, and today many common lab techniques require

An adipocyte (3T3-L1 cell differentiated in vitro) infected with a virus that encodes the green fluorescent protein. The dark circles are lipid vesicles. Reproduced from [7]



the use of viral vectors. Though the use of viruses in studying genetics is extensive and the origin of many major breakthroughs in our understanding of the genome, the use of viral vectors in genetic engineering has potential applications which extend far beyond academic study.

In medicine, viruses that infect bacteria, called bacteriophage, are being considered as a potential antibiotic treatment. Bacteriophage infection of a bacterial cell will usually result in lysis: once the phage’s replication cycle is complete, the bacterium will rupture and die, releasing many more phage to infect further bacteria. Combined with the high specificity of phage – most species will only infect one bacterial strain – this property could result in an efficient antibiotic treatment with few side effects on our ‘good’ bacteria. There are still some issues with the therapeutic use of phage: occasionally the cell will mutate to be resistant to the bacteriophage. Nonetheless, this is an extremely important area of study, particularly considering the increase in bacterial infections which are resistant to conventional antibiotics [2].

Another use for which viruses are particularly well-adapted is in the treatment of cancer, an area of medicine with potential for massive growth. There are striking similarities between the effects of viruses and carcinogenic mutations on cells, and this means that some viruses will preferentially infect cancerous cells over healthy ones [3]. These viruses, known as oncolytic viruses, can be engineered to further increase the specificity

for tumour cells; for example, by placing a gene necessary for viral replication under the control of a promoter active only in tumour cells [4]. The virus may still infect healthy cells but is unable to replicate without the transcription factors associated with expression of a tumour-specific gene. Though the focus is often on finding viruses which can destroy the target cells, this is not necessarily the only use of viruses in treating cancer. It is possible to engineer a virus to express green fluorescent protein (GFP) in cancer cells, by making replication of the virus and expression of GFP dependent on



the expression of telomerase – an enzyme associated with out-of-control cell replication – in the host cell [4]. GFP causes the infected cells to glow bright green under UV light, so locating and removing all the cancer cells in a patient becomes a relatively simple task. A clinical trial by Breitbach et al., published in Nature, shows one of the first successful applications of this technique using a modified poxvirus, JX-594 [5]. This virus, a relative of smallpox, had been engineered to be oncolytic and was given intravenously to patients with solid tumours. The virus was able to reduce the size of these tumours by lysing the cancerous cells, and showed remarkably little expression in adjacent tissues. This demonstrated both that the virus could selectively target cancerous cells and that it was safe to inject a genetically engineered virus into patients – a vital criterion in the selection of viruses for treatments.

Another striking example of the use of viral vectors in the treatment of cancer occurred in 2011, in a patient suffering from leukaemia. A research group at the University of Pennsylvania modified some of the patient's own T cells – cells in the immune system usually responsible for destroying damaged cells within the body – to kill the cancer cells, causing the patient to go into remission. Ten months later, the patient was still in remission. This is a dramatic improvement on current treatments for leukaemia, which can include a full bone marrow transplant, a dangerous procedure that carries a 20% mortality risk and only a 50% chance of a cure [6]. In this trial, some of the patient's T cells were removed and modified to express a new antibody-like receptor on their cell surface called a chimeric antigen receptor (CAR). This protein reprogrammed the T cells to attack all cells expressing another protein, CD19 – a category that includes normal B cells but also, importantly, all of the tumour cells – so the treatment affects fewer off-target cells than conventional therapies [6]. The modification itself was carried out using HIV, the virus having been engineered to carry the CAR gene and to be harmless to the patient [6]. The modified T cells were then reintroduced to the patient with the capacity for self-replication and the ability to target and destroy the cancer cells, and the leukaemia went into remission.

Though the leukaemia trial was small, involving only three patients, the results now being observed from virotherapy are extremely promising for the further development of these treatments. Ideally these results will be only the first of many, as more virus-based treatments are developed. The potential for medical benefit is enormous now that we can engineer viruses from pathogens into unusual allies in the fight against disease.

Elizabeth Richardson is a second year student studying Biological Natural Sciences at Murray Edwards College.

References
1. Tumpey TM. Characterization of the reconstructed 1918 Spanish influenza pandemic virus. Science 2005;310(5745).
2. Harper DR, Kutter E. Bacteriophage: therapeutic uses. eLS 2008.
3. Kirn D, Martuza RL, Zwiebel J. Replication-selective virotherapy for cancer: biological principles, risk management and future directions. Nature Medicine 2001;7:781-787.
4. Kishimoto H, Zhao M, Hayashi K, et al. In vivo internal tumour illumination by telomerase-dependent adenoviral GFP for precise surgical navigation. PNAS 2009;106:14514-7.
5. Breitbach C, et al. Intravenous delivery of a multi-mechanistic cancer-targeted oncolytic poxvirus in humans. Nature 2011.
6. Porter D, Levine B, et al. Chimeric antigen receptor-modified T cells in chronic lymphoid leukemia. N Engl J Med 2011;365:725-33.
7. Photograph by Pazik Polak. Available from: http://www.flickr.com/photos/pazit/2263174384/ under a CC-BY-2.0 license.




CORNELL

Reading the Labels: Off-label Drug Prescription
Hillary Yu

Despite evidence that the drug produces little change in the quality or duration of sleep, Seroquel® has long been used to treat soldiers with insomnia. This inconsistency in effectiveness is less surprising given that Seroquel's intended use, as labeled and approved by the federal Food and Drug Administration (FDA), is as an antipsychotic drug for the treatment of various mental illnesses. Seroquel® does possess mild sedative properties, but its overall inability to produce significant improvements in a patient's sleep – and its ability to cause a harmful and potentially fatal irregular heartbeat – must be taken into serious account [1]. Although off-label prescribing is fairly common – approximately one in every five drug prescriptions in the United States is issued for off-label use – knowledge of the process behind the practice is not [2]. The term "off-label use" refers to any instance in which a drug is taken for some purpose other than its FDA-approved use. Classes of drugs commonly prescribed for off-label use include anti-seizure drugs for depression or nerve pain, and antidepressants for chronic pain, ADHD, or bipolar disorder [3]. Although the practice of prescribing drugs in ways not approved by the FDA may seem illicit, it is entirely legal. The FDA stops untested and unsafe products from entering the prescription drug market, but it has very little jurisdiction over distribution once those drugs are approved, have entered the market, and are available for sale. The only regulation of off-label prescribing, in a sense, is the condition that drug companies may not promote the off-label uses of any of their drugs to either doctors or consumers. There is an intricate balance in off-label prescription and use. On one hand, the ability to prescribe drugs off-label


affords physicians flexibility in tailoring prescriptions to match patient needs. As Dr. Steve Hughes, Associate Director for Medical Services at Cornell University's Gannett Health Center, explains, when a patient "has not had success with the tested medication," off-label drug prescribing allows a clinician to try a similar medication with "a different side-effect profile," increasing the chances of success [4]. The overall rationale for the legality of off-label prescribing is that, as the people who interact regularly with their patients, doctors should retain the ability to exercise their judgment in determining which drugs should be prescribed to whom. In many cases the relative freedom given to doctors in this area has been beneficial – for example, as scientific studies emerge with results on new drug benefits, physicians can proactively adopt new practices for their patients without having to wait for the FDA's approval process. The concerning news is that ambiguity and misunderstanding often obscure off-label use. A 2006 study in the United States found that, alarmingly, 73% of off-label drug uses had little to no scientific support [5]. For many drugs that are commonly prescribed off-label, physicians also often mistakenly believe that the alleged off-label benefits of the drugs are actually approved by the FDA. A 2009 mail survey by Chen et al. revealed that 41% of surveyed physicians erroneously believed that at least one drug paired with a common off-label treatment was FDA-approved [6]. An example of one such



drug-indication pair is, again, Seroquel®. Not only is Seroquel® prescribed off-label for insomnia, but it is also used off-label to treat dementia with agitation, even though the supporting evidence for its medical effectiveness is inconclusive [6]. In situations where little scientific support for success exists, prescribing drugs off-label is costly, both financially and in terms of health. When drugs are used off-label without any real evidence of their efficacy in treating a medical issue, the placebo effect can mask the ineffectiveness of the medicine, resulting in a substantial financial cost to the patient, who pays for the drug without the benefit of an actual improvement in health. If a drug is acting as a placebo, a patient may also incur unnecessary health costs through any harmful side effects. A study conducted by Dr. Veronica Yank of Stanford University on recombinant factor VIIa (rFVIIa or, more commonly, NovoSeven) reveals these costs of off-label prescribing. NovoSeven, a recombinant procoagulant, was

first approved by the FDA in 1999 to treat patients with hemophilia. By 2008, however, only 3% of in-hospital use of the drug was for hemophilia; the other 97% consisted of off-label use in cases involving cardiovascular surgery and trauma [7]. Published in spring 2011, the findings of Dr. Yank and her colleagues on the widespread off-label use of NovoSeven indicate that there is no evidence the drug actually improves the chances of saving lives when used in an off-label context. Instead, NovoSeven increases the risk of blood clots and strokes [8]. When one considers that investigators estimate the cost of the drug at $10,000 per dose [7], it is clear that financial, as well as health, costs can be high when the off-label uses of drugs fail to undergo



stringent scientific study. Although off-label prescribing is not always detrimental, studies such as the one co-authored by Dr. Yank indicate that for the practice to be truly safe – and patient safety is, or should always be, a primary concern for physicians – extensive research on drugs commonly used off-label must occur on a larger scale. It is also the responsibility of physicians – and, to the best of their ability, even patients themselves – to find out as much as they can from peer-reviewed scientific papers or from the FDA's own website before agreeing to prescribe or take an off-label drug. Furthermore, regarding the ethical implications of off-label prescription specifically, physicians who continue to prescribe ineffectual drugs in the face of emerging studies should be held accountable. Even now, resources exist that allow physicians to check current, available drug information; the Healthcare business of Thomson Reuters offers a resource called DRUGDEX, which not only contains information on over 2,300 drugs but also allows physicians to compare off-label indications (in medicine, an indication is a legal reason allowing a certain test, drug, surgery, or procedure to be used). As science advances, the medical field will only become more reliant on the use of medicine in treating humanity's myriad health issues. The technology exists for appropriate studies to be carried out to test both the effects of new drugs and the efficacy of using current drugs for novel purposes; although testing is often expensive, there is a strong ethical reason for us to discover as much as possible about the costs and benefits of different drug uses. Given the right evidentiary support, off-label drug use will not only be legal, as it technically already is, but also ethical and beneficial.

Hillary Yu is a sophomore pursuing a double major in government and biological sciences with a concentration in ecology and evolutionary biology.

References
1. Levin A. Concern raised over antipsychotic's use for sleep problems. Psychiatric News 2011 Sep;46(17):21.
2. Stafford RS. Regulating off-label drug use – rethinking the role of the FDA. N Engl J Med 2008;358:1427-1429.
3. Shopper's guide to prescription drugs – number 6: "Off-label" drug use. Consumer Reports [Internet]. 2007 [cited 2012 April 27]. Available from: http://www.consumerreports.org/health/resources/pdf/best-buy-drugs/money-saving-guides/english/Off-Label-FINAL.pdf
4. Stone H [citing Hughes S]. 05 Jul 2012. Re: Question for The Triple Helix.
5. Radley DC, Finkelstein SN, Stafford RS. Off-label prescribing among office-based physicians. Arch Intern Med 2006;166:1021-1026.
6. Chen DT, Wynia MK, Moloney RM, Alexander GC. U.S. physician knowledge of the FDA-approved indications and evidence base for commonly prescribed drugs: results of a national survey. Pharm Drug Study 2009;18:1094-1100.
7. Avorn J, Kesselheim A. Editorial: a hemorrhage of off-label use. Ann Intern Med 2011;154:566-567.
8. Yank V, Tuohy CV, Logan AC, Bravata DM, Standenmayer K, Eisenhut R, et al. Systematic review: benefits and harms of in-hospital use of recombinant factor VIIa for off-label indications. Ann Intern Med 2011;154:529-40.




ACKNOWLEDGEMENTS The Triple Helix at the University of Chicago would like to sincerely thank the following groups and individuals for their generous and continued support:

Dr. Matthew Tirrell, founding Pritzker Director of the Institute for Molecular Engineering
The Institute for Molecular Engineering
University of Chicago Annual Allocations
Student Government Finance Committee (SGFC)
Bill Michel, Assistant Vice President for Student Life and Associate Dean of the College
Arthur Lundberg, Assistant Director for the Student Activities Center
Ravi Randhava, Student Activities Resource Advisor
Brandon Kurzweg, Student Activities Advisor
Dr. Jose Quintans, William Rainey Harper Professor and Master of the Biological Sciences Collegiate Division, Advisor to The Triple Helix
All our amazing Faculty Review Board Members
Our Faculty Speakers:
Dr. Harald Uhlig, Professor in Economics and the College (Event: The Eurozone Crisis)
Dr. Lainie Ross, Professor in Pediatrics, Medicine, Surgery, and the College; Associate Director, MacLean Center for Clinical Medical Ethics (Event: Ethical and Policy Issues in Organ Transplantation)
Dr. Matthew Tirrell, Founding Pritzker Director, Institute for Molecular Engineering (Event: First Look: Dr. Matthew Tirrell)
Dr. Michael David, Assistant Professor of Medicine (Event: The Screening of Contagion)

If you are interested in contributing your support to The Triple Helix's mission, whether financially or otherwise, please visit our website at thetriplehelix.org. © 2013 The Triple Helix, Inc. All rights reserved. The Triple Helix at the University of Chicago is an independent chapter of The Triple Helix, Inc., an educational 501(c)3 non-profit corporation. The Science in Society Review at the University of Chicago is published biannually and is available free of charge. Its sponsors, advisors, and the University of Chicago are not responsible for its contents. The views expressed in this journal are solely those of the respective authors.




Literary Divisions

Students who are new to these divisions often start out as Associate Editors or Writers – sometimes they do both!

The Print Division, led by its Editor in Chief, publishes the Science in Society Review (SiSR) and Scientia, our journal of original student research. Print Associate Editors work one-on-one with a writer, under the supervision of a Managing Editor. AEs maintain the quality of their writer's article throughout the literary cycle. SiSR writers compose 2500-word exposés on the impact of science on society. Scientia writers publish findings from their original research projects in a professional format similar to that of respected academic journals. To have your questions about the Print Division answered, please email us at: uchicago.print@thetriplehelix.org

The E-Publishing Division, led by its own Editor in Chief, produces the TTH Online blog, at http://triplehelixblog.com/. E-Publishing AEs work with one or two writers under the supervision of a Managing Editor, helping to shape content, style, and presentation. E-Pub writers compose blog posts of diverse formats that are no longer than 1500 words. Writers explore current events in the world of science from a range of academic perspectives. To have your questions about E-Publishing answered, please email us at: uchicago.epub@thetriplehelix.org

To make sure you're always in the loop, add yourself to our listhost at lists.uchicago.edu. Search for "TheTripleHelix" – no spaces. Subscribe to thetriplehelix@lists.uchicago.edu.

Marketing
Contact: uchicago@thetriplehelix.org
TTH newcomers can apply to be an Associate Director of Marketing. The ADoMs, as we call them, are trained in all areas involved in being the Director of Marketing: recruiting, networking with an eye towards fundraising efforts, and advertising. Each ADoM spearheads a project to expand the chapter, under the Director's supervision.

Events
Contact: uchicago.events@thetriplehelix.org
TTH newcomers can apply to be an Events Coordinator. An Events Coordinator (EC) presents his or her event idea and then works with the rest of the division to plan and host the event. Last year, the division held a film screening, introduced the Director of the new Institute for Molecular Engineering to the undergraduate population, and hosted two lectures: a presentation by the Associate Director of the MacLean Center for Clinical Medical Ethics on the ethics and policy issues involved in organ transplantation, and a talk about the Eurozone Crisis with the Economics Department.

Production
Contact: uchicago@thetriplehelix.org
TTH newcomers can apply to be a Production Editor. Together with their division leaders, Production Editors produce the layout and design of both of our chapter's printed journals. Though some design experience is preferred, all editors are trained to use the appropriate programs. The Production Division also designs all of the chapter's marketing materials, notably posters for TTH events, flyers, and letterheads.





Business and Marketing: Interface with corporate and academic sponsors, negotiate advertising and cross-promotion deals, and help The Triple Helix expand its horizons across the world!

Leadership: Organize, motivate, and work with staff on four continents to build interchapter alliances and hold international conferences, events, and symposia. More than a club.

Innovation: Have a great idea? Something different and groundbreaking? Tell us. With a talented team and a work ethic that values corporate efficiency, anyone can bring big ideas to life within the TTH meritocracy.

Literary and Production: Lend your voice and offer your analysis of today's most pressing issues in science, society, and law. Work with an international community of writers and be published internationally.

A Global Network: Interact with high-achieving students at top universities across the world— engage diverse points of view in intellectual discussion and debate.

Science Policy: Bring your creativity to our newest division, reaching out to students and the community at large with events, workshops, and initiatives that confront today's hardest and most fascinating questions about science in society.

For more information and to apply, visit www.thetriplehelix.org. Come join us.



