innovation
princeton journal of science and technology
Spring 2013
Inside: The Sequester (What does it mean for Princeton Science?)
Featuring: Snapshots of Science; University President Shirley Tilghman; Worms & Mind Control; Organs in a Petri Dish
Sections: engineering, physics + math, health, biology
STAFF
What is Innovation? Innovation Journal is a student-run publication that highlights Princeton's science news.
Editor-in-Chief: Stephen Cognetta
Lead Designer: Eugene Lee
Business Manager: Christine Chien
Webmaster: Lucas Ho
Editors: Matthew Blackburn, Abrar Choudhury, James Evans, Nassim Fedel, Kristen Houston, Samuel Kim, Shreya Nathan, Tayyab Shah, Helen Yao, Eddie Zhou
Senior Writers: Stacey Huang, Julia Metzger, Kiran Vodrahalli, Michael Zhang
Writers: Samuel Chang, Cissy Chen, Eli Chertkov, Evan Chow, Swetha Doppalapudi, Mizzi Gomes, Sarthak Gupta, David Harris, Sahana Jayaraman, Alexandra Junn, Sydney Kersten, Gwen Lee, Rachel Leizman, Mattie Lloyd, Bennett McIntosh, Neil Mehta, Chelsea Parker, Fred Shaykis, Greta Shum, Gina Sun, Eugene Tang, Meredith Wright, Jenny Wu, Ed Xiao, Elizabeth Yang, Kevin Zhang, Jen Zhao, Karena Cai [business]
Designers: Jessie Liu [senior], Angela Zhou [web], Rory Fitzpatrick, Neeta Patel, Erica Tsai, Jessica Vo
Free copies can be found at Frist Campus Center, the E-Quad, and various other locations on campus. Innovation is published once a semester. Contact innov@princeton.edu if you are interested in making a donation or joining our staff.
Featured Interviewees: Christopher Tully (PHY), Gaspar Bakos (AST), Bernard Chazelle (COS), President Shirley Tilghman, Mark Rose (MOL), Sam Wang (MOL), Celeste Nelson (CBE), the DataMi Research Team, Arvind Narayanan (COS), Thomas Gregor (PHY), Zemer Gitai (MOL), Andrew Leifer (Lewis-Sigler)
Visit us online at http://www.innovationmag.org Read our tweets at http://twitter.com/InnovJournal Like our Facebook page at http://facebook.com/InnovJournal
TABLE OF CONTENTS
HEALTH
4  Snapshots of Science: Featuring Guest Photographers
6  Molding Organs
ENGINEERING
8  TUBE: Time Dependent Pricing
10  Internet Privacy: Digital Identities
FEATURE
12  Princeton Science and the Sequester
BIOLOGY
14  Symmetry in Drosophila Melanogaster
16  Optogenetics: Studying brain behavior in C. elegans
PHYSICS + MATH
18  Ptolemy: The search for the oldest relic of the universe
20  Natural Algorithms
22  HATNet: extrasolar planets
SNAPSHOTS OF SCIENCE
layout by Rory Fitzpatrick; selections by Nassim Fedel and Kiran Vodrahalli
Duisburg is Germany's largest steel-producing city. This is a photo of Duisburg's €2 million "Tiger and Turtle" walking roller coaster sculpture: a roller coaster that humans "ride" at their own walking pace. At night, it is lit up for tourists and city-goers. The wind blowing in my face at the height where this photo was taken, combined with the way the sculpture looks like a "stairway to heaven," was breathtaking.
by Kyle Douglas ‘15
The demonstrator in the Freshman Seminar "Chemistry of Magic" who appears to be breathing fire is actually holding a straw filled with cornstarch. When he blows the cornstarch out over the Bunsen burner flame, the particles catch fire and create a giant fireball. by Maylin Meisenheimer '16
A Princeton Communiversity visitor watches a physics demonstration. Students explained the transfer of potential energy to kinetic energy, and visitors practiced the lesson they learned by increasing the electric charge within a closed box. by Irene Burke '16
Students' movements on the dance floor are captured using extended shutter exposures and exclusion blending of multiple photographic layers at the MIMA: Substance event in the Architecture building. by Luke Cheng '14
A gear from a magnetic motion machine model, demonstrating the unique orientation of magnets within a rotor. Built and photographed by Samuel Chang '16
Two Questions in Reverse Engineering Organ Development
Question 1: What cellular environment triggers organ growth?
Extracellular fluid surrounds cells and is important for transporting the organic materials necessary to maintain cellular structure and function. Ingredients: enzymes, salts, sugars, proteins, stem cells.
Different concentrations of these ingredients change the chemical properties of the cellular environment. Certain properties signal cells to replicate and organize into a certain shape.
Question 2: What shape do organs grow in?
Begin with differently shaped hydrogels; place cells into symmetrical and Y-shaped hydrogels.
It was found that cells grow better in Y-shaped (asymmetrical) hydrogels.
However, the specific environment around cells necessary to trigger the formation of tissues is still unknown.
health
Molding Organs
The First Steps of Development
by Sahana Jayaraman, interviewing Professor Celeste Nelson (CBE)
designed by Eugene Lee and Jessica Vo

Could organs be developed outside the body? Professor Nelson, in the Chemical and Biological Engineering department, is looking into this very question. Starting out as a few progenitor stem cells, organs develop into complex yet organized structures. But what continues to fascinate and baffle scientists is how organs are able to develop: what processes turn these few cells into a complex structure that carries out vital chemical and biological functions?

In the human body, cells are surrounded by extracellular fluid. This fluid is mostly water but contains many ions, enzymes, sugars, salts, and other proteins. The chemicals in this fluid are important for transporting the organic materials necessary to maintain cellular structure and function, whether to repair the cell, remove waste from it, or carry communication between cells. When a tissue starts as a few cells, the environment around those cells, including the extracellular fluid, has specific chemical properties that signal the cells to replicate and organize into a certain shape to form a tissue, and eventually an organ. However, the specific environment necessary to trigger the formation of tissues is still unknown.

To study possible environments for tissue formation, Professor Nelson uses structures known as hydrogels. Hydrogels are hydrophilic (water-attracting) polymers and therefore very water absorbent. They can be molded into specific shapes, given the specific chemical, mechanical, and physical properties that cells need to replicate and develop, and placed in water with certain chemicals to mimic the extracellular environment of cells starting to form tissues. Professor Nelson therefore varies the shape and properties of the hydrogels, and the environment each hydrogel sits in, to determine the signals that trigger the development of specific organs.
She used polymers to create molds of different shapes, which became the shapes of the hydrogels, and then placed cells in the hydrogels. A hydrogel shape that allowed more cell growth suggested a shape closer to how organs actually develop in organisms. Professor Nelson started with shapes similar to those found in the early development of living organisms: spheres, squares, an upside-down Y, and small tree-like shapes. She found that the less symmetric molds, the upside-down Y and the small tree-like shapes, caused the most cell growth. In fact, an upside-down Y, or a similar tree-like shape, is the shape from which every organ originates. All the molds Professor Nelson tried did cause cell growth, but the growth in the more symmetric molds, such as spheres, did not resemble in vivo growth.

As she continues the research, Professor Nelson is looking to use slightly more complex structures for her hydrogel model and to bring the in vitro conditions closer to what takes place in chicken and mouse embryos, which are common model organisms in this field. While the research is still in its early stages, and is far from clinical application, which would be at least 20 or 30 years down the line, the information Professor Nelson is gathering
about the conditions necessary for the start of organ development is promising. She says, "If you really understand what is going on [with organ development] you should be able to recapitulate it ex vivo (outside the body)." Her research could eventually lead to growing organs outside the human body, which would help organ transplantation, and such organs could serve as systems for testing therapeutic treatments for congenital diseases. Professor Nelson says, "[For] congenital diseases that have no treatments right now, we know the causes, but there's a hole as to why the cells in those diseases develop the way they do. If we can gain any bit of insight as to how these diseases develop, it's helpful to find therapeutic targets" for those diseases.

Organs created outside the body could also help avoid ethical concerns. Currently, the main way to study living organs is to do testing on humans and fetuses; if living organs can be developed outside the body, studies can be done on those organs instead. While Professor Nelson became interested in this research because of its potential applications, she is also drawn to it because "Organs are beautiful, but what we can build outside of the body right now looks like a tumor, which it should not be." By studying how organs develop in their earliest stages, Professor Nelson's research could give scientists a groundbreaking tool and resource: the ability to mold tissue into functional organs.
engineering
TUBE: Time-dependent Pricing
Treating the sickness of network congestion through smart pricing for mobile data
By Stacey Huang, interviewing the DataMi research team
Designed by Jessie Liu
[Chart: price vs. data usage, showing the minimum price found.]
Most of us have been victims of congested networks, often far more often than we'd like. From the cat video that just won't load on YouTube to the intransigent SCORE website during course registration, slow networks can be a never-ending source of frustration. Yet, away from the boons of student life and free internet, congested networks are more than an annoyance: they also mean a higher bill, and prices are only getting higher. As consumer data consumption doubles every year and bandwidth-hungry devices proliferate in the market, it has become increasingly difficult for Internet Service Providers (ISPs) such as AT&T and Verizon to provide data at a reasonable price while satisfying users' high demands. Current technology simply cannot keep up with the level of user demand. Since 2009, a Princeton research team has been working to combat this problem. Working in EDGE, Professor Mung Chiang's Princeton networking lab, the team, led by Associate Research Scholar Sangtae Ha, takes an innovative approach that involves not only improving existing technology but also harnessing economics. "The solution requires mechanisms beyond technology, such as economics, as ways to address the problem of growing congestion. It is a natural way to address this issue to obtain a win-win across consumers and internet service providers and benefits everyone," described Dr. Soumya Sen, a postdoctoral research associate involved in the project.

The elegant solution? "TUBE," or "Time-dependent Usage-based Broadband price Engineering," which allows users to pay different prices depending not only on how much data they transfer across the Internet but also on when they access it. Simply put, users receive discounts for deflecting usage, such as downloading and streaming, to times when the network is less busy. A test trial of 50 people at Princeton yielded positive results, and the team is now moving to implement the technology on a larger scale.

TUBE Timeline:
Late 2009: The idea of research on time-dependent pricing is first considered by the EDGE Lab
February 2010: Research on time-dependent pricing begins
March 2011: TUBE finishes as a finalist in Vodafone's Wireless Innovation Project Competition
March 2011: TUBE is presented at Princeton's Keller Center Innovation Forum
April 2011 to February 2012: A TUBE trial run is conducted with Princeton faculty, staff, students, and their family members

The Battle with Network Congestion
The system of dynamic prices is nothing new; similar pricing structures were part of voice-call pricing long before the era of the internet. The research team has been working to implement a more nuanced version of the idea for mobile data, which, unsurprisingly, is the sector that has seen the greatest rise in demand in the past few years. The key to time-dependent pricing is adapting prices for low-peak periods, but the question is: by how much? Prices must strike a delicate balance, creating enough incentive to switch to low-traffic periods, but not so much that the low-traffic periods become high-traffic periods. Consumer usage of data turns out to be well suited to this system, which is why TUBE is effective. "If you are lost on the street, you would be using your GPS no matter how expensive it is, whereas if you were planning to watch a movie, instead of watching it at 8 PM, which might be a peak time, you might want to watch it from 9 PM if there is a discount," explained Dr. Sen, indicating that while there will always be data that people need in real time, stock prices and the weather, for example, there are also activities that require no user interaction, such as software downloads and cloud data synchronization, that could be rescheduled for the middle of the night.

TUBE essentially functions as a middleman between the service providers and the consumers, working in a continuous cycle of measuring aggregate user demand and calculating optimized prices based on that demand. Usage data provided by the ISPs is used to develop a model of users' general willingness to shift data usage from time to time. The model is used to predict usage for the next day and calculate prices ahead of time; historic data from previous weeks is included to account for day-to-day changes, such as weekends. All the data is fed back into a control loop in a continuous process.
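The continuous cycle described above, predict demand, post discounts, observe the shifted usage, and feed it back in, can be sketched in a few lines of code. The sketch below is illustrative only: the hourly capacity, the weighted-average predictor, and the linear demand-shift response are invented stand-ins, not the DataMi team's actual optimization.

```python
# Toy TUBE-style day-ahead pricing loop (illustrative only; not the
# DataMi team's algorithm). 24 hourly periods; all numbers invented.

def predict_usage(history):
    """Predict tomorrow's hourly demand as a weighted average of the
    same hour on recent days (most recent day weighted highest)."""
    weights = [0.5, 0.3, 0.2]
    days = history[-3:][::-1]  # most recent day first
    return [sum(w * day[h] for w, day in zip(weights, days))
            for h in range(24)]

def compute_discounts(demand, capacity, sensitivity=0.5):
    """Offer bigger discounts in hours with more spare capacity."""
    return [max(0.0, min(0.5, (capacity - d) / capacity * sensitivity))
            for d in demand]

def shifted_demand(demand, discounts, elasticity=0.3):
    """Model users deflecting some usage toward discounted hours."""
    mean_disc = sum(discounts) / len(discounts)
    return [d * (1 + elasticity * (disc - mean_disc))
            for d, disc in zip(demand, discounts)]

# One turn of the control loop: predict, price, observe, feed back.
history = [[80 + 40 * (12 <= h <= 22) for h in range(24)] for _ in range(3)]
demand = predict_usage(history)
discounts = compute_discounts(demand, capacity=110)
history.append(shifted_demand(demand, discounts))
print(f"8 PM discount: {discounts[20]:.0%}, 4 AM discount: {discounts[4]:.0%}")
```

In this toy run the congested evening hours earn no discount while the idle early-morning hours do, which is the incentive structure the article describes.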
A look inside the DataWiz app:
• The app predicts future data usage.
• Set a monthly cap, and DataWiz tracks data usage, alerting users when they reach the cap.
• Location-based tracking records GPS/Wi-Fi information.
For the future, if ISPs were to implement the price incentive system, an Autopilot mode would let users enter how much they are willing to pay for data and rate how much they are willing to delay the use of each application; Autopilot would then budget accordingly, scheduling downloads for when there is less traffic.
TUBE in your Pocket: DataWiz
On the consumer end, TUBE's technology is now available as an app for Android and iPhone: DataWiz. The app features a friendly interface through which users can track their own wi-fi and cellular data usage to manage their expenses. After a week of usage, the app can predict daily caps that let users optimize their data usage. Monthly caps are set either manually or automatically, as a limit recommended by the app based on a specific person's usage, and the app sends notifications when the monthly cap is almost reached. In addition to these basic functions, the app features a location-based tracker that lets users know how much data they use in each location, and a free Voice Over Internet Protocol (VOIP) system, which allows phone calls over the internet so that users won't exceed their plans.

Despite the wealth of information it provides, the app is also surprisingly nonintrusive. Every user's individual data usage is stored locally on their device, so the app collects less data about users' usage than the ISPs themselves, which must know how much data each user consumes in order to bill them. "We actually developed the model so as to preserve as much privacy as possible. We don't look at which applications you're using—for the algorithm, we don't even need to know each individual's usage," explained Carlee Joe-Wong, a graduate student on the team. "We've tried to make it as minimally invasive as possible."

Future Headings: Taking Technology to the Real World
The project is particularly unusual in that it focuses not only on theory but also on the steps to implementation and commercialization. In addition to the research side of the TUBE project, the team has been working to commercialize its technology; to that end, they founded DataMi, a company that grew out of the dynamic data pricing research. Commercializing the technology, however, is admittedly one of the more difficult parts of the project. Interacting with users and companies, making sure the system works seamlessly, and coordinating the project across departments, from sales and engineering to management, can sometimes be more challenging than the research itself, but integrating these components is also exactly what makes the work done at DataMi so relevant to everyday life. The team has presented its research at academic conferences from here in Princeton all the way to Italy; venture capitalists and academics alike were thrilled with the idea. The technology is flexible and can be tailored to the needs of each provider and its users: pricing could be time dependent, could be applied to other types of data such as text messages, and could even extend to systems such as electrical usage. "There has been a growing momentum that this is the right way to start thinking about the future of broadband sustainability and economic viability of the whole internet ecosystem," said Dr. Sen. "And much of it started from Princeton." The research team consists of about 12 people: undergraduates, graduate students and postdocs, development and research staff, and even professional designers. They welcome new undergraduates from any number of fields, including applied math, computer science, electrical engineering, economics, and design, to take part in the research.

TUBE Timeline, continued:
2012: A trial period begins with Matanuska Telephone Association (MTA), a small ISP in Alaska, and with Reliance Communications, the second-largest ISP in India; talks with AT&T in the U.S. begin for a forthcoming trial
May 2012: DataMi is co-founded
July 30-31, 2012: TUBE is presented during the first Smart Data Pricing Forum (SDP Forum) in the Friend Center, Princeton, with positive feedback
August 2012: The DataWiz app is released
2013: Dynamic pricing models continue to be commercialized
engineering
Privacy, Please: Expanding Identities in the Digital World
By Eugene Tang, interviewing Professor Arvind Narayanan
Designed by Angela Zhou

iPhones, tablets, Facebook. Whether we like it or not, technology has rapidly been occupying a larger and larger role in our lives. From online shopping to instant communication to GPS systems, technology has greatly simplified day-to-day life, and as a result, we have become increasingly reliant on it. One consequence of these technological improvements, however, often lurks in the background: how much do technology companies know about us, and what do they do with that knowledge?
Just how anonymous is the information collected from you over the Internet? Dr. Arvind Narayanan's research on deanonymization strives to answer these questions.

By using their websites (Facebook, Amazon, Netflix, etc.), we allow tech companies to obtain large amounts of information about us: what we buy, what movies we watch, where we are. Naturally, these websites use this information to their advantage, sending their data anonymously to public services, advertising companies, and recommendation services to create more accurate recommendations or better targeted advertisements. Two questions then inevitably arise: just how anonymous is this "anonymous" data? And is it possible that someone could discover your identity solely by analyzing the movies you've watched or the items you've purchased?
chased? Dr. Arvind Narayanan, Professor of Computer Science at Princeton University, strives to answer these questions in his research on deanonymization. At its core, anonymous data is data that, even if publicly available, cannot be traced back to the user. Of course, certain pieces of information are “sensitive,” or key to identifying a person. For example, in 1997 Latanya Sweeny showed that knowing just the gender, zip code, and birth date of a random individual was enough to precisely pinpoint his or her identity. As a result, it was commonly held that as long as such “personally identifiable” information was kept hidden, the data released would be “anonymous.” In his research, however, Professor Narayanan challenges such a notion. Working with scholars at Stanford University and UC Berkeley, Narayanan conducted research on identifying people purely based on their writing styles. You might ask, why study this? Imagine this scenario. You are a political blogger, voicing your discontent at the government’s current policies. However, not wanting to be identified and personally victimized for your beliefs, you use a pseudonym to obscure your identity. Narayanan, however, has shown that unfortunately, you are not as safe as you think. Everybody writes in a slightly different style. For example, given two interchangeable words like “since” and “because,” one person may prefer to use “since” while an-
Designed by Angela Zhou
other may prefer to use “because.” Though just looking at a person’s usage of “since” and “because” is not enough, the combination of thousands of such markers put together enables a computer to compile
Prof. Narayanan was able to develop an algorithm to identify the author of a blog post just by analyzing and comparing the writing style. a thorough database on different people’s styles of writing. In his research, Narayanan was able to develop an algorithm to identify an author of a blog post solely through analyzing the author’s writing style. By feeding a computer 100,000 different blogs, Narayanan then picked a blog written by an author who also had another blog included in the 100,000 blogs. The program was still able to find this second blog among the 100,000 and match it to the correct author, solely based on the author’s writing style. Such research, as Narayanan describes, could have huge implications. If you were that “anonymous” political blogger, someone could potentially scan all of the blogs online and determine who you are purely based on your writing style. Because of the potential gaps in anonymization, Narayanan is now focusing on developing a new way to create websites – privacy-conscious system design. While it is possible for companies to build websites in a way to inherently protect pri-
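The marker-counting idea can be made concrete with a small sketch. What follows is not Narayanan's system, which used far richer features and classifiers over 100,000 blogs; it is a minimal illustration of the principle: match an anonymous text to the known author whose function-word frequencies are most similar. The word list and the toy corpus are invented.

```python
# Minimal stylometry sketch: represent each text as a vector of
# function-word frequencies and match authors by cosine similarity.
# Illustrative only; not the feature set from Narayanan's study.
import math
from collections import Counter

MARKERS = ["since", "because", "while", "whereas", "but", "however",
           "thus", "therefore", "also", "though"]

def style_vector(text):
    """Frequency of each marker word, normalized by text length."""
    words = text.lower().split()
    counts = Counter(words)
    n = max(len(words), 1)
    return [counts[m] / n for m in MARKERS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def most_similar_author(anonymous_text, corpus):
    """Return the known author whose writing style is closest."""
    query = style_vector(anonymous_text)
    return max(corpus, key=lambda a: cosine(query, style_vector(corpus[a])))

# Hypothetical usage with an invented two-author corpus:
corpus = {"alice": "since the data hold, we proceed, since they are sound",
          "bob": "because the data hold, we proceed, because they are sound"}
print(most_similar_author("since this is true since that follows", corpus))
```

With thousands of markers instead of ten, and millions of posts instead of two, the same similarity idea becomes the internet-scale identification risk the article describes.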
Because of these potential gaps in anonymization, Narayanan is now focusing on a new way to build websites: privacy-conscious system design. While it is possible for companies to build websites in a way that inherently protects privacy, Narayanan also recognizes the many challenges such a process faces. Economically, companies currently do not have much incentive to build more privacy-conscious websites; if consumers complain about privacy, companies can compensate by simply adding a few extra privacy features. Additionally, it is very difficult to define what does or does not invade privacy.
It is only through an understanding of how companies use consumer data that we can develop a better means of protecting privacy.

To better understand privacy, Narayanan proposes that it is necessary to reverse-engineer websites and quantifiably measure how they use their consumers' information. Reverse-engineering websites can take many forms, but at its core, it is a way to discover how exactly a website interacts with its users. For example, in other research performed with fellow scholars at Princeton University, Narayanan studied how certain websites stored cookies, small pieces of data used to track information on mobile and desktop devices, and what information the websites obtained about their users from those cookies (some cookies actually stored the user's browsing history). For Narayanan, it is only through such an understanding that we can develop better means of protecting privacy online.

With the growing role of technology in society, from iPhones to tablets to Facebook, new issues arise, especially that of privacy. It is not the act of companies collecting data about us that is worrying; in many ways, it actually improves our lives. Rather, it is what they do with the data that can be a huge concern. There is yet a long road ahead to understanding the relation between data and privacy, but Narayanan is taking great strides toward such an understanding.
Reverse-engineering a website to discover how it interacts with users, in order to arrive at a more privacy-conscious systems design. Left: a typical systems architecture, with insecure interactions and added privacy features. Right: an application secure from the very beginning.
THE SEQUESTER: What Does It Mean for Princeton Science?
Interviewing University President Shirley Tilghman
By Meredith Wright
Edited by Abrar Choudhury, Nassim Fedel, Samuel Kim, and Kiran Vodrahalli
Designed by Eugene Lee
Also featuring Prof. Mark Rose, Prof. Sam Wang, and Dr. Ethan Perlstein
[Infographic: amounts cut from federal agencies that fund Princeton research (NIH, NSF, NASA, FDA, CDC, the Department of Energy, military research, and global health programs), with figures including $206M, $323M, $388M, $433M, $970M, $1.6B, $2.4B, and $6.3B under a 5% overall reduction, and a loss of innovation of approximately $250M, the amount that federal agencies currently give annually to fund Princeton research.]
It is no secret that much of the research conducted at Princeton is funded by federal agencies. According to President Shirley Tilghman, the University receives approximately $250 million annually. These government grants and contracts are a major source of funding for Princeton's cutting-edge research in engineering, applied sciences, natural sciences, mathematics, social sciences, and the humanities. They come from organizations such as the Department of Energy (the sole funding source for the Princeton Plasma Physics Laboratory), the National Institutes of Health, the National Science Foundation, and the Department of Defense. Unfortunately, these sources of funding are now in jeopardy because of the sequester.

Sequestration refers to the creation of budget caps which, if exceeded, result in automatic across-the-board cuts. These caps apply to discretionary domestic spending, which includes federally funded research. Congress implemented sequestration when it passed the Budget Control Act of 2011, its temporary solution to the 2011 debt-ceiling crisis. The act mandated general cuts to discretionary domestic spending if a Congressional Joint Select Committee on Deficit Reduction did not meet its November 2011 deadline for cutting $1.5 trillion from the federal budget over the course of
10 years. Unfortunately, the committee did not meet this deadline. After a few months of political debate concerning the so-called fiscal cliff, the first round of cuts was enacted on March 1, 2013, with approximately $85 billion cut from the discretionary parts of the federal budget for fiscal year 2013 (a 5% reduction). As President Tilghman explains, "Five percent doesn't sound like a large amount, but it's actually a very significant amount." And 5% isn't even the end of it: additional cuts are set to go into effect each fiscal year barring new legislation, with the next round set to occur in October. "What we're all terrified about is what happens October 1st. There's always the worry at times when funds are really tight, that it starts to influence the kind of work that goes on," said Tilghman. "One of the things I'm sure you've already figured out in science is that low-risk science tends to be boring science and not impactful science." Tilghman fears that even the perception of decreasing funding will push researchers toward less impactful "low-risk" work.

The impending October issues aside, it is clear that the current budget cuts will trickle down from the federal agencies to Princeton University. Training grants for graduate students, undergraduate summer programs, and lab material costs could all be affected. Tilghman felt that senior thesis funding would not be in jeopardy due to its relatively low cost; however, she conceded that the lab materials and equipment used by seniors are all ultimately paid for with federal funds. "Our investigators who've made commitments to experiments, to people, would suddenly have less money than they thought they were going to have," said Tilghman.

Tilghman explained how particular agencies' cuts might affect Princeton departments differently. NIH funding is critical for the Molecular Biology department, while Department of Energy funding will shape the future of the PPPL. The severity of the cuts will depend on how these organizations decide to make their 5% reductions. The NIH, for example, could decide to issue no new grants for the next year, reduce all grants by 5%, or carefully analyze which programs can bear deeper or shallower cuts in order to reach the overall 5% goal. The PPPL's future, on the other hand, is already uncertain, as it never received a fiscal year 2013 budget. The lab has been working under fiscal year 2012 guidelines, knowing that cutbacks are sure to come. It has yet to receive instructions from the Department of Energy on how the sequester's impact on the DOE will trickle down to the PPPL.
What else could be cut?
• Lab materials & equipment
• Lab training
• Undergraduate summer programs
Research is funded by priority, which favors "low-risk" work; the result is a shift toward safer, but typically less innovative, research.
Finally, the Andlinger Center for Energy and the Environment will also be affected, because of the large grants it receives from the Department of Energy.

Princeton's investigators are keenly aware of the murky future ahead for their research. Many have reached out and written to their representatives on the issue. Others have already started scaling back expenses. "Although the scientific community has long been subject to insufficient funding for basic research, the sequester will make the problem much more severe," said Professor Mark Rose, Director of Undergraduate Studies for Molecular Biology. "I don't think that people realize that the mere threat of the sequester, together with the delay in passing a federal budget for 2013, has already had a significant negative impact on research." Rose explained that proposals that would have been funded last year may have to wait months for funding decisions from the NIH this year. "Even if grants are eventually funded,
it may be too late to retain the researchers who were paid on that grant. The result is not good for science and not good for the country." According to Sam Wang, Associate Professor of Molecular Biology, a partial remedy for scientists could be found in alternative funding sources such as foundations, corporations, and individual donors. However, it is unclear whether these will be enough for the basic science research that occurs on our campus, especially basic research without clear medical applications or commercial potential. With federal funding unquestionably an issue and traditional alternatives unlikely to fill the gap, researchers will be forced to become more creative in where they seek funding. Crowdfunding, in which researchers advertise a project idea online and raise donations from the general public, could be a temporary fix for the lack of funds. Ethan Perlstein, a former Lewis-Sigler Fellow,
suggests a three-pillar approach for fellow scientists battling budget cuts: online donations from small donors through crowdfunding, larger grants from foundations and advocacy groups, and gifts from individual philanthropic "angels." Like many of Princeton's scientists, Perlstein emphasized that the lesson of the sequester is that researchers can no longer depend on a single source of funding to continue doing exciting work. Right now, Princeton's scientists are in limbo, waiting for the federal agencies to make decisions on budgets and grants, and the outlook undoubtedly looks grim. Hopefully, the agencies will make cuts that do not hold back Princeton research. Otherwise, researchers may find that in order to get funding, they will need to pursue less ambitious projects or become more creative in their endeavors.
[Infographic: The Three-Pillar Approach to Funding, comprising crowdfunding, grants, and angel investors. There's still hope for researchers who lose out from the sequester.]
biology
No Two Snowflakes Are Identical
by Julia Metzger, interviewing Professor Thomas Gregor
designed by Stephen Cognetta
Take a moment and look at your hand. Your right hand. Watch the tendons flex as you stretch out your fingers; examine the blue and purple blood vessels peeking through your skin. Look at the conformation of wrinkles around your knuckles and the lines snaking across your palm. Now look at your left hand as well. Hold them both out, together. Isn't the symmetry remarkable? The corresponding components are nearly identical. Professor Thomas Gregor and his biophysics group, the Laboratory for the Physics of Life, have investigated this very symmetry in the wings of a particular species of fly, Drosophila melanogaster, in specific relation to reproducibility: the level of similarity of a biological component across different members of the same population. While symmetry within an individual is certainly remarkable, it is also ubiquitous in the natural world. We see bilateral symmetry in mammals, radial symmetry in starfish, spherical symmetry in some protozoa, and even near-infinite fractal self-similarity in numerous plants. Reproducibility, on the other hand, is quite rare.
The saying goes, as we all know, that no two snowflakes are ever the same. Accordingly, understanding the biological mechanisms by which organisms generate macroscopically symmetric body parts, as well as macroscopically reproducible individuals within a species, is an area of intense research. The relationship between the two is also of interest: it is one thing for our own two hands to be nearly identical, but imagine holding up your hand and comparing it to the hand of a friend. Logically, we would conclude that the level of symmetry within an individual should exceed reproducibility across a population. So, too, in Professor Gregor's studies of the symmetry and reproducibility of wings in Drosophila melanogaster, the wings of a given individual ought to be more similar to one another than to those of other members of the population. Or so one would think. Professor Gregor's findings indicate that symmetry and reproducibility of wing shape in Drosophila melanogaster are, in fact, essentially equal. In inbred fly lines raised at a constant temperature, the left-right fluctuations in wing shape within individuals are of the same magnitude as the variation in wing shape across the population. This surprising relationship between symmetry and reproducibility emerged from Gregor's investigation of underlying developmental mechanisms, using the wing as an ideal starting point. According to
Gregor, "the wing is easy to measure: you can rip it off, plop it on a microscope, and measure it in two dimensions." Extracting meaningful and statistically significant results requires a vast amount of data, and the structure of the wing lends itself to this type of analysis. Gregor and his lab compiled morphometric measurements of landmark points on the wing across 500 adults of different strains, and from these measurements they were able to glean powerful insight into the specific developmental mechanisms that generate the fly's remarkable symmetry and reproducibility, as well as the fascinating fact that the two are approximately equal.
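The core comparison can be phrased statistically: is the left-right spread within flies smaller than the wing-to-wing spread across flies? The sketch below runs that comparison for a single landmark coordinate. The numbers are invented, and a real morphometric analysis involves many landmarks and careful alignment, so treat this only as a schematic of the idea.

```python
# Schematic symmetry-vs-reproducibility comparison for one landmark
# coordinate (synthetic numbers; not the lab's data or pipeline).
import statistics

# (left wing, right wing) measurements, one pair per fly, arbitrary units
flies = [(10.02, 10.05), (10.11, 10.08), (9.95, 9.98), (10.04, 10.01)]

# Symmetry: spread of left-right differences within individuals
lr_diffs = [left - right for left, right in flies]
symmetry_sd = statistics.stdev(lr_diffs)

# Reproducibility: spread of per-fly means across the population
means = [(left + right) / 2 for left, right in flies]
reproducibility_sd = statistics.stdev(means)

print(f"within-fly (symmetry) spread:      {symmetry_sd:.4f}")
print(f"across-fly (reproducibility) spread: {reproducibility_sd:.4f}")
# Gregor's finding: for real wing data these two spreads come out
# essentially equal, roughly the size of a single cell.
```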
Gregor began with two possible schemes for how biological organisms obtain symmetry and reproducibility through the complex network of Drosophila melanogaster larval development, in which 15 genes generate 6,000 cells in just 3 hours. In the first scenario, a vast number of complicated mechanisms strictly control and regulate shape and size throughout the entire developmental process in order to produce symmetric macroscopic body parts. The second is far simpler, entailing initially symmetric conditions that remain symmetric through exact replication on each side; but it, too, has a limiting factor: its success likely depends on fluctuations in development, environment, and genetics. Gregor was able to relate his measurements of symmetry and reproducibility in the macroscopic wing to microscopic magnitudes during development, and both are approximately the size of a single, individual cell. With this metric, he linked the macroscopic readout of bilateral symmetry and population reproducibility with the underlying molecular details of embryonic development. The power of Professor Gregor's results is that they point to the second scenario as the governing process in the development of Drosophila melanogaster wings, a "beautifully simple and elegant mechanism: as long as seeds on the left and right are similar and the steps are conducted reproducibly, we obtain the same result in the end." In and of itself this research has powerful implications for the developmental processes of Drosophila, but further frontiers exist as well. Gregor hopes to investigate the interplay between symmetry and reproducibility in other organs of the fly, as well as specific changes with inbreeding and temperature. Gregor has certainly challenged our notions of symmetry and reproducibility in the natural world, and we wait eagerly to see what other conclusions his research will draw.
biology
Shedding Light on the Mysteries of the Brain and Behavior
By Neil Mehta, interviewing Dr. Andrew Leifer
Designed by Jessie Liu and Erica Tsai
Ever wonder how our brains are able to grasp complex activities, such as playing an instrument or driving a car? How about riding a bike or turning a doorknob? Such everyday activities, called "motor-sequence behaviors," fascinate neuroscientists around the world because they require precise coordination between many different groups of brain cells, or neurons, and muscles. Here at Princeton, Dr. Andrew Leifer of the Lewis-Sigler Institute for Integrative Genomics is studying this very phenomenon. His work attempts to understand how different connections between neurons are responsible for such coordinated behaviors and habits in organisms. "Currently, we know how individual neurons function, and how large areas in the brain are related to behavior," explains Dr. Leifer. "But what we don't know is how small neural circuits can play a role in behavior. How are small, individual neural circuits relevant to behavior? How can they affect stuff like motor memory and addiction, etc?" Dr. Leifer's goal is to understand how different combinations of neurons produce this coordination of complex movements so fluidly in everyday life. He approaches this research using what biologists call "model organisms," specifically the worm C. elegans. These worms are useful because scientists have fully mapped out the worm's genome and know how every single cell in this worm is
created and dies, and how all 302 of its neurons are connected, giving researchers a large base of knowledge to build on. Dr. Leifer activates or inactivates individual neurons in the worms to see how that affects motor-sequence behaviors, such as turning around when the worm bumps into an object. But how is it possible to activate or deactivate one or two specific neurons in an organism? And how is it possible to understand how all the neurons in C. elegans interact with its approximately 100 muscle cells, both directly and indirectly, to coordinate various motor-sequence behaviors? Dr. Leifer and other neuroscientists have ingeniously implemented a novel technique, called optogenetics, to aid in their research. In optogenetics, scientists force neurons to express special proteins. Depending on which proteins the researchers use, these can either activate or inactivate the neuron or group of neurons expressing them when hit with a specific color of light. For example, the protein channelrhodopsin-2 activates neurons that express it when those neurons are hit with blue light, and the protein halorhodopsin inactivates neurons that express it when those neurons are hit with yellow light. In this way, neuroscientists can ask specifically how changing one neuron can affect the behavior of an entire organism! For example, in C. elegans, optogenetic
activation of the worms' sensory neurons causes the worm to back up, turn around, and continue moving in a random new direction. This reaction is exactly the same as the worm's reaction to bumping into an obstacle. Expressing halorhodopsin in the worms' sensory neurons and shining yellow light on the worms, however, prevents this behavioral response when the worms swim into an obstacle! In some cases, the results are very surprising, and scientists have concluded that certain specific neurons are essential for some very important behaviors of model organisms.

Researchers can use complex genetic engineering to get these proteins expressed in any neuron of their choosing.

Furthermore, researchers can use complex genetic engineering to get these proteins expressed in any neuron of their choosing, hence the name optogenetics (opto for light, genetics for controlling protein expression). Optogenetics is widely used in model organisms, such as rats and mice, to control neural circuits and, in turn, study the function these neurons have in the organisms' behavior.
Lenses and mirrors are required to shine laser light onto the digital micromirror device (right) before it heads into the microscope (not pictured, left). Photo by Elizabeth Kane.
Because the anatomy and genetics of C. elegans are so well known, and because these worms are optically transparent (visible light can shine right through them), they make an ideal model organism for Dr. Leifer's experimentation. Using optogenetics, Dr. Leifer has created a novel experimental design that lets him track the movement of his worms and observe changes in their behavior while he optogenetically stimulates them. A microscope camera follows the worms' movement and visualizes them, and sensors associated with the camera keep the lasers constantly pointed at a single neuron or group of neurons, allowing Dr. Leifer to continuously track changes in worm behavior.
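The closed loop the instrument implements, measure where the worm is, steer the light there, recenter, and repeat, can be simulated in a few lines. Everything below is a stand-in: the worm's motion model, the proportional gain, and the plain variables that replace the real camera, digital micromirror device, and motorized stage are assumptions for illustration, not Dr. Leifer's control software.

```python
# Schematic closed-loop tracking of the kind described above: measure
# the target's position each frame, steer the light there, and move
# the stage to keep the target centered. Pure simulation with invented
# parameters; real hardware is replaced by plain variables.
import random

worm = [0.0, 0.0]    # true worm position (simulated)
stage = [0.0, 0.0]   # where the stage is currently centered
GAIN = 0.8           # proportional gain for recentering

for frame in range(1000):
    # the worm crawls a small random step between frames
    worm[0] += random.uniform(-5, 5)
    worm[1] += random.uniform(-5, 5)

    # "camera": measure the worm's offset from the stage center
    dx, dy = worm[0] - stage[0], worm[1] - stage[1]

    # "DMD": aim the illumination spot at the measured position,
    # so the light stays on the targeted neuron while the worm moves
    spot = (stage[0] + dx, stage[1] + dy)

    # "stage": recenter on the worm, closing the loop
    stage[0] += GAIN * dx
    stage[1] += GAIN * dy

print("residual offset after tracking:",
      round(worm[0] - stage[0], 1), round(worm[1] - stage[1], 1))
```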
Dr. Leifer hopes that this experimental technique will help him and his lab shed light on motor-sequence behaviors using C. elegans as a model organism. Currently, neuroscience research is on the brink of understanding how seemingly simple behaviors arise from extremely intricate and complex neuronal interactions, and the implementation of optogenetics plays a key
role in understanding how these complex interactions form and work. Dr. Leifer's fascinating new methodology of using optogenetics in real time with worms will hopefully shed new light on how the brain coordinates seemingly complex behaviors so flawlessly. Dr. Leifer strongly believes that understanding how the brains of model organisms function will eventually help reveal how the human brain is responsible for such unimaginably amazing feats.
[Figure: Laser-induced behavior in C. elegans. Before: unregulated worm movement. After: an induced omega-turn. A laser beam is shone on the neurons being studied, activating them and inducing specific behaviors.]
physics + math
The Search for the Oldest Relic of the Universe: PTOLEMY at Princeton
by Cissy Chen, interviewing Professor Christopher Tully (PHY)
designed by Rory Fitzpatrick
RELIC NEUTRINOS: “[N]eutral, nearly massless particles... at temperatures colder than deep space...”
As archaeologists dig for the oldest signs of civilization on Earth, physicists here at Princeton have embarked on a different pursuit: the search for the oldest relics of the early universe, which could bring with them a wealth of information about the universe's history. These ancient relics are neutral, nearly massless, nearly frozen particles called relic neutrinos, created just one second after the universe is predicted to have begun. Professor Christopher Tully of the Princeton Physics Department developed an experiment called PTOLEMY (Princeton Tritium Observatory for Light, Early-Universe, Massive-Neutrino Yield) to search for these neutrinos. The motivation behind the experiment, according to Professor Tully, is simply that "modern science is so convinced that these Big Bang neutrinos have to be there that anything other than the right answer would completely change our belief about how the universe began." Theorists have predicted that there are about 330 neutrinos per cubic centimeter everywhere in space; that means that
there are probably millions moving around you right now. Though these predictions are supported by the current body of scientific knowledge, these neutrinos have never been detected. As Prof. Tully says, "it's something so fundamental because most of space is permeated with it, but for physicists, it's just hard to go on without looking." But looking is one thing; finding is another. These neutrinos, as pervasive as they are predicted to be, are extremely difficult to detect because they move around with such low energy that very high-precision tools are needed to measure them. Much of the work in the PTOLEMY experiment therefore involves building energy filters. The key concept is to use an unstable material whose natural decay process is sped up by a neutrino interaction. Tritium, the third isotope of hydrogen, is ideal for this. Thanks to the Princeton Plasma Physics Laboratory, PTOLEMY has access to substantial amounts of tritium, which can be spread very thinly (to a couple of atomic layers) over a large area (the size of a football field). The tritium sheet acts like a thin
sail for neutrino "wind." When tritium decays naturally, electrons are produced with a wide range of energies; but when neutrinos interact, the resulting electrons have a unique and slightly higher energy. Therefore, PTOLEMY must use tools that measure this increase in energy very precisely. The idea of the method is simple: PTOLEMY lets each electron run through multiple energy "filters" to precisely measure its energy. In the first filter, all of the electrons are pulled toward a magnet and forced to climb a very precisely carved energy "hill," like a roller coaster on an upward ascent. Only the most energetic electrons can get over this energy "hill," so the low-energy electrons are filtered out. Some of these higher-energy electrons may have come from neutrino interactions, but not all. So they are then pushed through another, more selective filter by accelerating them through a magnetic field again. These electrons move in an orbit and emit radio waves at a certain frequency, and with a very precise antenna, we can measure these frequencies.
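The frequency such an antenna listens for follows from textbook physics: an electron orbiting in a magnetic field radiates at its relativistic cyclotron frequency, f = eB / (2π γ mₑ). The short calculation below is not from the article; it assumes an illustrative 1 tesla field (the article does not specify PTOLEMY's field strength) and plugs in the tritium beta-decay endpoint energy of about 18.6 keV.

```python
# Relativistic cyclotron frequency of a tritium-endpoint electron.
# The 1 T field strength is an assumed, illustrative value.
import math

E_KEV = 18.6           # kinetic energy near the tritium beta endpoint, keV
ME_KEV = 511.0         # electron rest energy, keV
B_TESLA = 1.0          # magnetic field (assumption)
E_CHARGE = 1.602e-19   # elementary charge, C
ME_KG = 9.109e-31      # electron mass, kg

gamma = 1 + E_KEV / ME_KEV                 # relativistic Lorentz factor
freq = E_CHARGE * B_TESLA / (2 * math.pi * gamma * ME_KG)
print(f"gamma = {gamma:.4f}; cyclotron frequency = {freq / 1e9:.1f} GHz")
# -> about 27 GHz: a microwave signal that a precise antenna can pick up
```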
[Figure: PTOLEMY's filter chain. (1) Electrons climb an energy hill as they are pushed toward a magnet; only electrons with high energy make it through this first filter. (2) Electrons accelerate through a second magnetic field; as they orbit, this more selective filter determines which high-energy electrons will pass through. (3) When an electron leaves the second filter, it hits a detector that measures the energy added by the neutrino interaction.]
When a single electron passes the frequency threshold expected for an electron emitted due to a neutrino interaction, it continues on and hits an extremely sensitive device. This device is balanced precariously between two different states, as though on the tip of a pin. When the electron adds a little bit of energy, the device undergoes such a sharp phase transition (the pin tips over to one side) that it is possible to measure exactly how much energy was added, and thus find the energy of the single electron. By examining this energy, physicists can count the number of neutrinos that interact with the tritium. A fully functional prototype of this experimental model has been constructed, but the PTOLEMY team is still working on improvements to the tools. Once complete, the experiment could yield any of several drastically different observations about the universe. First, detecting too many neutrinos could contradict the Big Bang theory, the long-accepted theory of the history of the universe. In fact, this observation would suggest that there wasn't just a single Big Bang: perhaps these neutrinos came from multiple other Big Bangs.
[Infographic: Consequences of the research. Theories verified (about 330 neutrinos and antineutrinos per cubic centimeter confirmed), rethink the universe, or rethink the neutrino.]

"[Neutrinos are] a tiny ingredient of the universe that makes it all work." — Professor Tully

Or, if no neutrinos are found, we may simply need to reconsider the lifetime of the neutrino; perhaps most of these relic neutrinos have decayed in the past 13 billion years. Furthermore, PTOLEMY has the power to capture a special type of predicted neutrino that cosmologists think may explain dark matter, the mysterious substance that we cannot see or detect, which is thought to make up about 85% of the total mass of the universe (the other 15% comes from normal matter, like the atoms that make up our visible universe). Although still in its beginning stages, this experiment will create a great new observational machine which, over time, will be used to quantify more and more observations about the universe. Currently, the biggest experimental challenge is keeping the experiment, especially the device balanced between two states, at a very low temperature (ten times colder than deep space) to achieve the desired precision. This has never been done before with
such a large experimental setup, but physicists at PTOLEMY are working on achieving these extremely low temperatures. The techniques and tools from PTOLEMY will contribute to many areas of science and to imaging processes that require high-precision energy measurements. Detecting these elusive relic neutrinos could finally verify (or contradict) experimentally a prediction that has long been accepted on theoretical grounds. Understanding these relics from just a second or two after the predicted Big Bang would allow us to probe experimentally the very beginning of matter formation, a crucial time in which the most fundamental ingredients of our universe were formed. And although neutrinos seem so insignificant, as if the universe would behave much the same way without them, it turns out that they are, in the words of Professor Tully, "a tiny ingredient of the universe that makes it all work."
Natural Algorithms: A New Way to Study the Natural World
By Elizabeth Yang, with Professor Chazelle
Professor Bernard Chazelle is a computer science professor specializing in theoretical computer science, namely algorithms. His research explores the new field of natural algorithms, which provides an innovative way to think about science.
"Biology is equal to physics plus history."
Changes in how organisms behave have accumulated over time as complexity has grown.
A single bird is nothing out of the ordinary. But thousands of them, swiftly swooping and swerving in unison, never colliding, creating beautiful, fluid forms, is a very impressive marvel of nature. Likewise, while a single ant is a typical six-legged insect, a giant swarm of ants is a whole other beast. Termites, blind creatures, manage to build massive, functional mounds, collectively coordinating the process without any direct communication among themselves. These phenomena motivate researchers like Professor Bernard Chazelle to study natural algorithms. According to Professor Chazelle, algorithms, the step-by-step procedures used predominantly in computer science to reach an end goal, represent the future of scientific research because of their ability to model complex, dynamic systems. Computer scientists design algorithms tailored to accomplish a specialized task or goal. Algorithms as we know them conventionally are engineered; they are products of human creation. As a result, applying an algorithm to nature may seem counterintuitive: how can a manmade construct be found in the world around us? The answer lies in the fact that most biological systems are far too complex to be explained by differential equations, the primary tools used to study science in the 20th century. Thus far, differential equations have embodied much of the simplicity and elegance of science. Physical phenomena, like heat transfer or particle motion, can often be expressed quite succinctly, with a few operators and symbols. "A differential equation,
which you can write in one line, pretty much gives the entire solution to the problem," says Professor Chazelle. Although the study of physics can become quite complicated, the forces driving it can be simplified and abstracted into a few lines of equations that make predictions about physical behavior. Differential equations, however, are not enough for the natural world. "In the living world, everything is more complicated than traditional physics," remarks Professor Chazelle. A better way to think about natural systems, he says, is that "biology is equal to physics plus history." The millions of years of life that preceded us have played pivotal roles in shaping the wonders we observe today. Professor Chazelle further explains that what we see in biology stems from the occurrences of all eras past: "From natural selection and evolution, and so on, all these changes have accumulated, and complexity has grown, but that's not true in physics. When you drop a pebble, you could have dropped that pebble a million years ago and exactly the same law would have applied." Because of the complexity we inherit from evolutionary history, only algorithms can help us understand how these complex events, birds flocking, ants swarming, termites building mounds, are coordinated. All this complexity also means that the field of natural algorithms lends itself to a multidisciplinary approach. Consequently, Professor Chazelle works with a variety of other experts, in fields ranging from biology to physics, to study these natural algorithms.
Each individual musician understands how to synchronize with the group.
People change their minds in a predictable fashion.
He is currently working on projects involving synchronization, a central theme at the core of many biological and behavioral processes. For instance, an orchestra could very well play without a conductor, because each individual musician understands how to synchronize with the rest of the group. The tens of thousands of electric cells in the heart also manage to completely synchronize to generate a heartbeat. Less helpfully, the neurons of epileptic patients synchronize when they are not meant to, leading to seizures. Clearly, studying synchronization algorithmically would give us a greater understanding of very relevant biological processes. Professor Chazelle is also applying the idea of synchronization to a topic in sociology: generating a consensus. While it is nearly impossible to have a group of people agree completely on an issue, there is usually a moment when a group hits a consensus, an equilibrium point of sorts at which opinions are no longer changing. The process of reaching this consensus, a state in which people's views stabilize, is called convergence. Professor Chazelle investigates key questions related to convergence: under what circumstances do we converge? And under what circumstances do opinions change over time? During his investigations, he hit upon the interesting result that people never change their minds at random. In fact, when people do change their minds, they do so in a very predictable fashion, which we can, to some extent, foresee through the lens of natural algorithms. As we expand our explorations of science and technology, algorithms will become an increasingly important tool for scientists and researchers. "Algorithms give a narrative, and tell a dynamic story." Just as differential equations once led to breakthroughs in science, algorithms can now revolutionize our understanding of the living world and of science in general. Algorithms not only account for the complexity generated by years of history; they can show us the actual processes by which nature creates these fathomlessly complex systems. Natural algorithms are revolutionary, and the development of this significant concept lives right in Princeton's own computer science department.
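One standard toy model gives the flavor of the convergence questions described above: in bounded-confidence ("Hegselmann-Krause") dynamics, each agent repeatedly moves to the average of all opinions within a fixed confidence radius, and the system stabilizes into consensus clusters. The sketch below is this generic textbook model with arbitrary parameters, offered as an illustration rather than as Professor Chazelle's own influence-system framework.

```python
# Bounded-confidence opinion dynamics (Hegselmann-Krause style):
# each agent averages the opinions of neighbors within distance EPS.
# Generic illustration of convergence; parameters are arbitrary.
import random

EPS = 0.25     # confidence radius: only nearby opinions influence you
N = 40         # number of agents

opinions = [random.random() for _ in range(N)]

for round_ in range(50):
    new = []
    for x in opinions:
        neighbors = [y for y in opinions if abs(y - x) <= EPS]
        new.append(sum(neighbors) / len(neighbors))  # move to local mean
    if all(abs(a - b) < 1e-9 for a, b in zip(new, opinions)):
        break  # views have stabilized: convergence
    opinions = new

clusters = len({round(x, 6) for x in opinions})
print(f"{clusters} opinion cluster(s) after {round_ + 1} rounds of averaging")
```

Runs of this kind show exactly the qualitative behavior the article mentions: opinions do not wander at random, but settle predictably into one or a few stable camps.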
An example of natural algorithms at work: colonies of fireflies synchronize their glowing on a macro scale through each individual's micro-scale response to the blinking frequency of neighboring fireflies.
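The firefly behavior in this caption is often modeled with coupled phase oscillators. Below is a minimal Kuramoto-style simulation (coupling strength, frequencies, and step size are illustrative, not fitted to fireflies): each oscillator nudges its phase toward the population's mean phase, and an order parameter near 1 signals that the group has locked together.

import math, random

random.seed(1)
N, K, DT, STEPS = 100, 2.0, 0.05, 1000
freqs = [random.gauss(0.0, 0.1) for _ in range(N)]            # natural frequencies
phases = [random.uniform(0.0, 2 * math.pi) for _ in range(N)]

def mean_field(ph):
    # Complex order parameter r * e^(i*psi): r ~ 0 is incoherent, r ~ 1 is synchronized.
    re = sum(math.cos(p) for p in ph) / len(ph)
    im = sum(math.sin(p) for p in ph) / len(ph)
    return math.hypot(re, im), math.atan2(im, re)

for _ in range(STEPS):
    r, psi = mean_field(phases)
    # Each oscillator is pulled toward the population's mean phase.
    phases = [p + DT * (w + K * r * math.sin(psi - p))
              for p, w in zip(phases, freqs)]

r, _ = mean_field(phases)
print(f"order parameter after {STEPS} steps: {r:.2f}")   # close to 1 means synchrony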
HATnet: The Search for Extrasolar Planets By Michael Zhang interviewing Professor Gaspar Bakos (AST) Designed by Erica Tsai
Our galaxy alone contains 300 billion stars, most of which, we now know, have planets around them. Before 1995, nobody had discovered a single planet outside our Solar System; by 2013, more than 860 planets had been positively identified, with tens of thousands of candidates awaiting confirmation. With this flood of discoveries, many fundamental questions are beginning to be answered. How many planets are there? How did they form? What are they made of, where do they orbit, and how many are suitable for life? Every night, telescopes scan the sky, monitoring the stars to discover even more exoplanets.

Two of the most successful exoplanet search projects have been HATnet and HATsouth, both conceived by Assistant Professor Gaspar Bakos and now managed by Harvard's Center for Astrophysics. Both endeavors consist of completely automated telescopes inside protective domes. HATnet, established in 2003, has six small 11 cm telescopes at two sites, in Arizona and Hawaii. HATsouth, installed in 2009, has six 18 cm telescopes: two in Chile, two in South Africa, and two in Australia.

[Map: locations of the HATnet and HATsouth telescopes]

HAT, like most exoplanet searches, uses the transit method. After the international HAT team selects a target region, the telescopes automatically image it night after night, and astronomers use the photos to monitor the brightness of every star in the region. If a planet's orbit is aligned just right, the planet periodically passes in front of its star, causing an apparent dip in the star's brightness; the bigger the planet, the larger the dip. When more than one transit is observed, astronomers can calculate the planet's orbital period, which leads directly to its distance from its star; the shorter the distance, the shorter the orbital period. The detected signal is then a planetary candidate. To verify that it really is a planet, follow-up observations usually attempt to detect the Doppler shift of its star's light, caused by the planet's gravitational pull as both orbit their common center of mass. If this succeeds, astronomers know the planet's mass as well as its radius and orbital distance.

Every night, HAT's mini-observatories use their sensors to measure wind speed, humidity, cloud cover, precipitation, and nearby lightning strikes. If the weather is just right, the dome opens and the telescope uses its CCD camera to photograph a single 8x8 degree area of the sky from dusk to dawn.
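The arithmetic behind the transit method is compact enough to sketch. Assuming nothing beyond the textbook relations (transit depth = (R_planet / R_star)^2 and, for a roughly Sun-mass star, Kepler's third law in AU-year-solar-mass units), the toy script below shows why a Jupiter-size planet dims a Sun-like star by about 1%, the level HAT is built to detect, while an Earth-size planet's dip is over a hundred times smaller.

# Back-of-envelope transit arithmetic (textbook formulas, illustrative values).
R_SUN_KM = 696_000
R_JUPITER_KM = 71_492
R_EARTH_KM = 6_371

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    # Fraction of the star's disk covered by the planet.
    return (r_planet_km / r_star_km) ** 2

def orbital_distance_au(period_years, star_mass_suns=1.0):
    # Kepler's third law: a^3 = M * P^2 (AU, years, solar masses).
    return (star_mass_suns * period_years ** 2) ** (1 / 3)

print(f"Jupiter-size planet: {transit_depth(R_JUPITER_KM):.1%} dip")   # ~1.1%
print(f"Earth-size planet:   {transit_depth(R_EARTH_KM):.3%} dip")     # ~0.008%
print(f"a 3-day orbit sits at {orbital_distance_au(3 / 365.25):.3f} AU")  # ~0.04 AU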
This area is roughly the size of a fist held at arm's length: an extremely large field of view for an astronomical telescope. When the weather deteriorates or sunrise approaches, the domes close, the telescopes stop tracking their targets, and researchers can access the data through the Internet. The system is entirely automated and built on open-source software. HATnet now monitors more than 100,000 stars every year, for a lifetime total of 700,000; HATsouth monitors 400,000 and is outpacing its predecessor in its rate of discoveries.

These automated observatories were first envisioned by Bakos in 1998. At the time, he was an undergraduate in Hungary, and planned to use them not for planet hunting but to let astronomers point telescopes at gamma-ray bursts as soon as they were discovered. "It seemed to me like a suboptimal procedure that you get a phone call from a guy in Europe, who was woken up by his cellphone, then he calls me up, then I answer the phone, then I put down the phone, then I stop my observation…just very suboptimal in the age of Ethernet and robots. The telescope should just respond to a trigger and point to the right position," said Bakos. His focus quickly shifted from gamma-ray bursts to monitoring the sky for variable stars, and by 2000 Bakos had built a working prototype. Meanwhile, he moved to Harvard's Center for Astrophysics at a time when astronomers there were extremely enthusiastic about looking for planets, convincing him to dedicate his invention to planet hunting.

HAT can reliably measure 1% dips in a star's brightness, and has discovered 50 planets to date. Most of these planets are hot Jupiters: giant planets, often heavier than Jupiter, orbiting so close to their stars that they complete one orbit in a matter of days. As with any transit survey, large planets with short orbital periods are much easier to detect than Earth-like planets that orbit once every 365 days. In reality, smaller planets like the Earth, orbiting their stars at larger distances, are overwhelmingly more common, a trend that holds down to the detection limits of current instruments. HAT has the ability to detect super-Earths: planets a few times Earth's mass, or roughly 100 times less massive than Jupiter. HAT could even detect a super-Earth in the habitable zone of a small star. "We made simulations and calculations and yes, there is a possibility, but it would be extremely surprising," said Bakos.

Since its inception, HAT has detected a wide variety of planetary systems. HAT-P-2b, the first supermassive Jupiter for which scientists measured an accurate radius, is a gas giant the size of Jupiter but nine times its mass. Its orbit is so eccentric that it comes as close as 7.4 million km and as far as 25 million km from its star. HAT-P-32b is eight times Jupiter's volume but around the same mass; its density is so low that scientists are still puzzled as to how it could have formed. HAT-P-11b, only 8% the mass and 42% the radius of Jupiter, was the first transiting "hot Neptune" discovered from the ground. Finally, the star HAT-P-13 has two planets: an inner planet with a 2.9-day orbit and 0.8 Jupiter masses, and an enormous outer planet with 16 Jupiter masses and a 446-day orbit. This was the first system in which a transiting exoplanet was confirmed to be accompanied by a second planet.

HATnet and HATsouth are just two of many ongoing exoplanet search projects. SuperWASP employs a similar methodology and has discovered 80 planets. The Kepler spacecraft has discovered 2,700 planet candidates since its launch in 2009, allowing scientists to estimate that our galaxy contains 2 billion habitable Earth-like planets around Sun-like stars, of which Kepler itself is expected to discover at least one. Recently, spectroscopy of large extrasolar planets has become possible, allowing astronomers to determine the chemical composition of their atmospheres. In several years, the next generation of telescopes will be capable of analyzing the atmospheres of Earth-like planets to search for signs of alien life.
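As a rough sanity check on why HAT-P-32b puzzles scientists: the figures quoted above (about Jupiter's mass spread over eight times its volume) already imply a mean density around an eighth of Jupiter's. The snippet below is just that arithmetic, not a published measurement.

# Rough density estimate for HAT-P-32b from the figures quoted above.
JUPITER_DENSITY_G_CM3 = 1.33          # Jupiter's mean density

mass_ratio, volume_ratio = 1.0, 8.0   # ~Jupiter's mass, ~8x its volume
density = JUPITER_DENSITY_G_CM3 * mass_ratio / volume_ratio
print(f"~{density:.2f} g/cm^3")       # ~0.17, far less dense than water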
HAT-P-2b shown in comparison to Jupiter.
Radius: approximately equal to Jupiter's. Mass: 9x that of Jupiter. Characterized by an eccentric orbit.

HAT-P-11b shown in comparison to Neptune.
Radius: 42% that of Jupiter. Mass: 8% that of Jupiter. Characterized by a highly inclined orbit. Known as a "hot Neptune" (an extrasolar planet in an orbit close to its sun, with a mass similar to that of Uranus or Neptune).
ACKNOWLEDGEMENTS Without our contributors, Innovation would not be possible. A special thanks to the following groups, departments, schools, people, and programs:
MIRTHE
Andlinger Center for the Humanities
PRISM
Princeton Science and Tech Council
Chemical and Biological Engineering
Princeton Writing Program
Keller Center
Professor Saini
Professor Stock
Psychology
Professor Gmachl

If interested in being a sponsor, email innov@princeton.edu for more information.