ISSUE 2 2016
18 Black Hole Collision BY JOHN TOON
32 Cyber Materials BY RICK ROBINSON
42 K-12 STEM Success BY BEN BRUMFIELD
50 Bog Monster BY BEN BRUMFIELD
DATA SCIENCE
Turning Numbers into Knowledge: Powerful computers and sophisticated algorithms change the way research is done. Page 24
The InVenture Prize is one of Georgia Tech’s most successful programs for fostering innovation and entrepreneurship among undergraduates. The annual contest rewards students with cash prizes for inventions that aim to solve society’s biggest problems. This year’s three winning teams created devices that have the potential to protect firefighters, keep athletes safe, and improve water quality.
TRUEPANI developed a water disinfection system for Indian households that won the contest’s $5,000 People’s Choice Award. The system includes a cup with a thin antimicrobial coating that disinfects water by releasing copper ions. The ions disrupt the microbes’ cellular functions and kill them. The cup’s design mimics a shape typically found in rural Indian households. Each cup comes with a similarly coated metal lotus flower, which symbolizes purity. This lotus flower is attached to a chain so it can be placed in the water storage containers popular in homes across India. The main inventors are recent graduates — Samantha Becker, civil engineering, and Shannon Evanchec, environmental engineering — who are participating in Georgia Tech’s Startup Summer program for young entrepreneurs. Photo by Rob Felt
EXHIBIT A
WOBBLE, an automated balance test to assess athletes following concussions, won $10,000 for finishing in second place. This portable device monitors a player’s balance. When someone steps on top of Wobble’s metal platform, the device moves in random directions, keeping the athlete’s brain guessing. Meanwhile, four sensors in the platform are reading pressure on its surface. As the athlete shifts his balance, Wobble analyzes how well he is reacting to the movement. If his balance differs from that of the baseline test taken before the concussion, he is likely not ready to return to play. The team is preparing for a pilot study this fall with several high schools. The inventors are: Hailey Brown, mechanical engineering; Matthew Devlin, biomedical engineering; Ana Gomez del Campo, biomedical engineering; and Garrett Wallace, biomedical engineering. Photo by Rob Felt
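Wobble’s actual analysis method hasn’t been published, so the sketch below only illustrates the general baseline-comparison idea it describes — score how far a post-concussion pressure trace drifts from the athlete’s pre-season baseline. Every name, the sample traces, and the tolerance value are invented for illustration.

```python
import statistics

def balance_deviation(baseline, trial):
    """Mean absolute difference between two equal-length
    center-of-pressure traces (arbitrary units)."""
    return statistics.mean(abs(b - t) for b, t in zip(baseline, trial))

def ready_to_play(baseline, trial, tolerance=0.15):
    """Flag an athlete as ready only if the post-concussion trial
    stays within `tolerance` of the pre-season baseline."""
    return balance_deviation(baseline, trial) <= tolerance

baseline = [0.02, -0.01, 0.03, 0.00, -0.02]
steady   = [0.03,  0.00, 0.02, 0.01, -0.01]   # close to baseline
wobbly   = [0.40, -0.35, 0.50, 0.30, -0.45]   # large sway

print(ready_to_play(baseline, steady))  # True
print(ready_to_play(baseline, wobbly))  # False
```

A real system would compare richer features (reaction latency, sway frequency) rather than a single deviation score, but the go/no-go comparison against a per-athlete baseline is the core idea.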
FIREHUD, a device that helps firefighters track their vital signs while fighting fires, won first place and $20,000. A heads-up display attaches to a firefighter’s mask and measures heart and respiratory rates, blood oxygen level, body temperature, external temperature, and other vital signs. Firefighters view this information through the helmet display so they will know whether they are overexerting themselves, which puts them at risk for cardiac arrest. The device, which is about the size of a cell phone, also transmits data about a firefighter’s health to the incident commander, who can view it on a computer through an app. Inventors Zack Braun, a computer engineering major, and Tyler Sisk, an electrical engineering major, are working on the invention through Startup Summer, a 12-week internship for Georgia Tech students and recent graduates who want to launch startups. They are meeting with firefighters interested in the device. Photo by Rob Felt
1 Exhibit A The InVenture Prize fosters innovation and entrepreneurship.
7 Cross Talk Data engineering and science change how research is done.
56 Glossary Explanations for terminology used in this issue.
FRONT OFFICE
24 Data Driven Research finds knowledge within massive data sets to provide new insights.
32 Cyber Forged Data engineering and science accelerate the development of new materials.
42 Next-Generation Genius Georgia Tech’s K-12 outreach programs encourage a new generation of scientists and engineers.
50 Shaking a Sleeping Bog Monster Research in a Minnesota peat bog could predict how climate change will affect this critical ecosystem.
11 Profile Dionne Nickerson studies the promotion of positive social change.
ADDRESS CORRECTIONS Please send address corrections to John Toon (jtoon@gatech.edu) or 404-894-6986. POSTMASTER Please send address changes to: Research News & Publications Georgia Institute of Technology 177 North Avenue NW Atlanta, Georgia 30332-0181 USA
12 File Georgia Tech contributed to Atlanta’s bid for the 1996 Summer Olympics.
COVER Dana Randall and Srinivas Aluru, co-executive directors of Georgia Tech’s new Institute for Data Engineering and Science. Photo by Rob Felt. Back cover: A Minnesota peat bog, home to a climate change study. Photo by Ben Brumfield.
RESEARCH HORIZONS ISSUE 2 2016
REPRINTS Articles from this magazine may be reprinted with credit to Georgia Tech Research Horizons. Research Horizons magazine is published to communicate the results of research conducted at the Georgia Institute of Technology.
Research Horizons is a member of the University Research Magazine Association (URMA).
Web www.rh.gatech.edu Twitter @gtresearchnews
15 Expertise A new process changes the way complex parts are made.
18 Visualization Helping a general audience envision a black-hole collision.
The folder structure of this issue of Research Horizons magazine. The gray bands are folders, and the width of the yellow arc represents the size of each individual file.
Copyright 2016 by the Georgia Tech Research Corporation. All rights reserved. ISSN #1060-669
FILE: RESEARCH HORIZONS ARCHIVES; EXPERTISE: ROB FELT; VISUALIZATION CODE: GENERATIVE-GESTALTUNG.DE
DEPARTMENTS
CONTENTS
STAFF Editor John Toon Art Director Erica Endicott Writers T.J. Becker, Josh Brown, Ben Brumfield, Laura Diamond, Jason Maderer, Rick Robinson, John Toon Photographer Rob Felt Copy Editor Margaret Tate
CROSS TALK
The Coda building will be a 21-story, 750,000-square-foot mixed-use facility that will house Georgia Tech’s data science and engineering program. It will be located in Technology Square.
SEEING THE BIG DATA PICTURE
DATA SCIENCE AND ENGINEERING CHANGE THE WAY RESEARCH IS DONE
Steve Cross is Georgia Tech’s executive vice president for research.
Georgia Tech is using data science and engineering to help understand and improve the world around us. The ability to extract critical information from vast data sets is helping create new research directions in areas as diverse as drug discovery and application, design and development of new materials, prognostics for complex equipment, and analysis of genetic information.

To bring together the many disciplines that are contributing to this field, we’ve launched a new interdisciplinary research institute — the Institute for Data Engineering and Science (IDEAS) — and are working with an Atlanta developer to construct a landmark 21-story building to house both Georgia Tech researchers and companies working to take advantage of new approaches in data science and engineering. By putting these diverse groups together in the same space — centered on a massive new data center — we’ll create synergies that will make Georgia Tech, Atlanta, and the state of Georgia an international hub for this fast-growing field.

Taking advantage of new knowledge from data science and engineering to advance our economy will require a workforce with skills in science, technology, engineering, and mathematics. Through Georgia Tech’s many STEM-related K-12 efforts, concentrated in the Center for Education Integrating Science, Mathematics, and Computing (CEISMC) and including many other units, we’re helping schools, teachers, and students focus on preparing for next-generation career opportunities.
CODA: COURTESY OF JOHN PORTMAN & ASSOCIATES
Finally, this issue of Research Horizons describes Georgia Tech’s contributions to a research project known as Spruce and Peatland Responses Under Climatic and Environmental Change (SPRUCE). Nearly a third of the world’s carbon is tied up in peat bogs located in cool northern climates. Organized by the Department of Energy and involving the U.S. Forest Service and 19 universities, the project is examining what may happen to that carbon as temperatures and carbon dioxide levels rise with changing climate.

Georgia Tech powers an impressive innovation ecosystem that facilitates transformative opportunities, strengthens collaborative partnerships, and maximizes the economic and societal impact of the Institute’s research. Our goal is to conduct leading-edge research and then transition the results of that research into use. As you read this issue of Research Horizons, you’ll see how we’re leveraging these collaborative partnerships to create game-changing solutions to society’s most challenging problems. We truly are creating the next generation of data science, materials design, technology education, and environmental study.

As always, I welcome your feedback. Enjoy the magazine!

Steve Cross
Executive Vice President for Research
August 2016
FUSE software simplifies the collection and integration of Internet of Things information
Reconfigurable Radar
Unlike traditional mechanical radars, modern phased-array antennas are solid-state devices that aim, transmit, and receive radar beams electronically. But they have a drawback: Each phased array must be built for a single type of application. Researchers at the Georgia Tech Research Institute (GTRI) are developing an innovative phased-array design in which each element contains more than 100 interconnected radio-frequency switches. This allows each radiating element of an entire array to be reconfigured via software to perform a variety of jobs. GTRI’s reconfigurable electromagnetic interface is part of Arrays at Commercial Timescales, a Defense Advanced Research Projects Agency (DARPA) program aimed at speeding development times for phased-array systems.
The GTRI design packs many tiny and digitally addressable switches onto half-inch-square tiles, which in turn can be used to build out large-scale antenna arrays. Backed by innovative gallium arsenide radio-frequency switches from corporate partner BAE Systems, each reconfigurable array element can independently perform essential antenna-related functions including beam steering, frequency tuning, and polarization. — Rick Robinson
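The article doesn’t describe GTRI’s steering math, but electronic beam steering in any phased array rests on the same textbook relation: each element along the array is driven with a progressive phase offset so the wavefronts add up in the desired direction. The sketch below computes those offsets for a uniform linear array; the function name and parameters are illustrative, not part of GTRI’s design.

```python
import math

def steering_phases(n_elements, spacing_m, freq_hz, steer_deg):
    """Per-element phase shifts (degrees, wrapped to 0-360) that steer
    a uniform linear array to `steer_deg` off broadside, using the
    standard relation phi_n = -2*pi*n*d*sin(theta)/lambda."""
    c = 3.0e8                      # speed of light, m/s
    wavelength = c / freq_hz
    k = 2 * math.pi / wavelength   # wavenumber
    theta = math.radians(steer_deg)
    return [math.degrees(-k * n * spacing_m * math.sin(theta)) % 360
            for n in range(n_elements)]

# 4 elements at half-wavelength spacing for 10 GHz, steered 30 degrees:
# each successive element lags the previous one by 90 degrees.
print(steering_phases(4, 0.015, 10e9, 30))
```

In a software-reconfigurable array like GTRI’s, settings of this kind would be written to the elements digitally rather than fixed in hardware, which is what lets one aperture serve many jobs.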
ILLUSTRATION: ISTOCKPHOTO; RADAR: ROB FELT
DATA STREAMLINED
The Internet of Things (IoT) includes millions of sensing devices in buildings, vehicles, and elsewhere that produce reams of data. Yet it involves so many different kinds of data, sources, and communication modes that its myriad information streams can be onerous to acquire and process. Researchers at the Georgia Tech Research Institute (GTRI) have developed flexible, generic data fusion software that simplifies interaction with sensor networks. Known as FUSE, it provides a framework to standardize the diverse IoT world. Its application programming interface (API) lets users capture, store, annotate, and transform any data coming from internet-connected sources. “The Internet of Things has always been something of a Tower of Babel, because it gathers data from everywhere — from the latest smart building microcontrollers and driver-assist vehicles to legacy sensors installed for years,” said Heyward Adams, the GTRI research scientist who is leading the FUSE project. “Traditionally, people wanting to utilize IoT information have had to examine the attributes of each individual sensor and then write custom software on an ad-hoc basis to handle it.” Before FUSE, Adams said, a typical IoT task could require several manual steps. For example, users would acquire data from the internet by manually finding and setting up the proper communication protocols. Then each data value would have to be assigned to a supporting database. Finally, the user would need to process the data, via approaches such as arithmetic manipulation or statistical evaluation, before it could be fed into a decision algorithm. FUSE uses a generic RESTful communications platform that readily integrates data streams from real-world sensors into cohesive, human-readable information.
Using a graphical environment, the FUSE framework facilitates real-time data acquisition by letting users subscribe readily to webpages, APIs, and other streaming-data sources that employ a multitude of protocols and modalities. The unified data stream is then processed, integrated, and formatted according to user specifications. “FUSE lets us take a task that used to involve a week or two and complete it in 10 or 15 minutes,” Adams said. “It provides a standard way of communicating in the unstandardized world of IoT.” — Rick Robinson
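FUSE’s API is not publicly documented, so the Python sketch below only illustrates the kind of normalization step such a framework standardizes: per-source adapters map vendor-specific payloads into one common, timestamped record, replacing the ad-hoc glue code Adams describes. The adapter names and payload shapes are invented for illustration.

```python
import json
from datetime import datetime, timezone

# Hypothetical adapters, one per data source. Each maps a raw,
# vendor-specific payload into the same flat record shape.
def from_legacy_csv(line):
    sensor_id, value = line.split(",")
    return {"source": sensor_id.strip(), "value": float(value)}

def from_smart_building_json(payload):
    doc = json.loads(payload)
    return {"source": doc["deviceId"], "value": doc["reading"]["temp_c"]}

def normalize(raw, adapter):
    """Unify one reading into a common, timestamped record --
    the step a framework like FUSE performs for the user."""
    record = adapter(raw)
    record["captured_at"] = datetime.now(timezone.utc).isoformat()
    return record

# Two very different sources feed one uniform stream.
stream = [
    ("thermo-7, 21.5", from_legacy_csv),
    ('{"deviceId": "hvac-3", "reading": {"temp_c": 19.8}}',
     from_smart_building_json),
]
for raw, adapter in stream:
    print(normalize(raw, adapter))
```

Once every source emits the same record shape, the downstream storage, statistics, and decision logic can be written once, which is the time savings Adams describes.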
Helping Cool Cities
Georgia Tech graduate student Paul Rose and Assistant Professor Anna Erickson are shown with Cherenkov quartz detectors that would be used to find shielded radioactive materials inside cargo containers.
Factoid
ROB FELT
More than 12 million maritime containers enter the U.S. annually from more than 800 global ports, according to Logistics Management magazine. Inspecting them for possible nuclear materials is a huge challenge for homeland security authorities.
DETECTING NUCLEAR STOWAWAYS
Researchers have demonstrated proof-of-concept for a novel monochromatic particle imaging technique based on low-energy nuclear reactions designed to detect the presence of weapons-grade uranium and plutonium in cargo containers arriving at U.S. ports. The method relies on a combination of neutrons and high-energy photons to detect shielded radioactive materials inside the containers. The technique can simultaneously measure the suspected material’s density and atomic number using mono-energetic gamma ray imaging, while confirming the presence of special nuclear materials by observing their unique delayed neutron emission signature. The mono-energetic nature of the novel radiation source could result in a lower radiation dose compared to conventionally employed methods. As a result, the technique could increase detection performance while avoiding harm to electronics and other cargo that may be sensitive to radiation. If the technique can be scaled up and proven under real inspection conditions, it could significantly improve the ability to detect — and prevent — the smuggling of dangerous nuclear materials. Supported by the National Science Foundation and the U.S. Department of Homeland Security, the research was reported in the Nature journal Scientific Reports. Scientists from Georgia Tech, the University of Michigan, and the Pennsylvania State University conducted this research, which is believed to be the first successful effort to identify and image uranium using this approach. “Once heavy shielding is placed around weapons-grade uranium or plutonium, detecting them passively using radiation detectors surrounding a 40-foot cargo container is very difficult,” said Anna Erickson, an assistant professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering. “One way to deal with this challenge is to induce the emission of an intense, penetrating radiation signal in the material, which requires an external source of radiation.” — John Toon
Louisville, Kentucky, could significantly reduce the number of people who die from heat-related causes under a series of recommendations that could also help other cities around the world respond to the growing hazards of extreme heat. The plan calls for planting additional trees and vegetation, cutting energy consumption by cars and buildings, decreasing impervious surface areas such as parking lots, and increasing the reflectivity of roads and rooftops. The recommendations are part of a new study that is the first in the nation to measure the benefits of heat management strategies for reducing urban temperatures and reducing the number of individuals dying from heat-related causes. The strategies were developed by a research team led by Brian Stone, director of Georgia Tech’s Urban Climate Lab and a professor in the School of City and Regional Planning in the College of Design. If the strategies are implemented, Louisville would be the first city in the world to develop an urban heat-adaptation plan, Stone said. The city could then show how changes to a city’s physical surface can alter the impact of the urban heat island effect, which turns cities into cauldrons because of the combined impact of climate change and rising temperatures driven by a predominance of concrete and a shortage of vegetation. “Cities need to think about aggressive action if they want to measurably slow the rate at which they’re warming,” Stone said. “Louisville and this study can point the way for other cities to follow.” Heat is the deadliest natural disaster facing the United States — killing more people than hurricanes, tornadoes, and earthquakes combined. About 650 people die every year because of exposure to excessive heat, according to the U.S. Centers for Disease Control and Prevention. — Laura Diamond

Brian Stone is a professor in Georgia Tech’s School of City and Regional Planning.
FRONT OFFICE
One Atom Makes a Big Difference
Combining experimental investigations and theoretical simulations, researchers have explained why platinum nanoclusters within a specific size range facilitate the hydrogenation reaction used to produce the chemical ethane from ethylene. The research offers new insights into the role of cluster shapes in catalyzing reactions at the nanoscale and could help materials scientists optimize nanocatalysts for a broad class of other reactions. At the macroscale, the conversion of ethylene has long been considered among the reactions insensitive to the structure of the catalyst used. By examining reactions catalyzed by platinum clusters containing between 9 and 15 atoms, however, researchers in Germany and the United States found that, at the nanoscale, this belief no longer holds true. The shape of nanoscale clusters, they found, can dramatically affect reaction efficiency. While the study investigated only platinum nanoclusters and the ethylene reaction, the fundamental principles could apply to other catalysts and reactions, demonstrating how materials at small sizes can provide different properties than the same material in bulk. Supported by the Air Force Office of Scientific Research and the Department of Energy, the research was reported in the journal Nature Communications. “The knowledge gained from our re-examination of the validity of a fundamental concept in catalysis, and the emergent paradigm shift that we uncovered concerning structure sensitivity of reactions catalyzed by nanosize catalysts, may open new vistas for the design and control of catalytic activity in the nanoscale,” said Uzi Landman, a Regents Professor and F.E. Callaway Chair in the Georgia Tech School of Physics. — John Toon

Simulation shows a 10-atom platinum nanocatalyst cluster supported on a magnesia surface. The “bulge” caused by the 10th atom gives the cluster improved catalytic properties.

POSTDOCS FOR CAREER PREP
For doctoral students studying science, technology, engineering, and mathematics, landing a job as a research-oriented university faculty member typically requires having spent time as a postdoctoral researcher. But a new study shows that the research faculty path isn’t the only reason students pursue a postdoc. In a survey of nearly 6,000 doctoral students, more than a third of the students with plans to pursue postdocs said they had more interest in careers outside of academic research. The finding is surprising because it challenges the notion that postdoctoral research is a stepping stone primarily for research faculty positions, said Henry Sauermann, associate professor in Georgia Tech’s Scheller College of Business. “There’s this common belief that Ph.D. students pursue a postdoc because they want to have a faculty career,” Sauermann said. “The answer is much more complex.” The results of the research were published in the journal Science. The work was sponsored by the National Science Foundation and the Ewing Marion Kauffman Foundation. — Josh Brown

MISSION SOFTWARE ACCOMPLISHED
When the U.S. Army updates defensive and offensive software on its UH60M Black Hawk and AH64D Apache Longbow helicopters, the improved systems must be fully tested to make sure they’re working properly. That includes evaluating how information is represented on the multifunction display (MFD) and multipurpose display (MPD), which use symbology to display threats. Until recently, that testing required the use of a real helicopter or costly display components that had to be configured to operate in a laboratory environment. Thanks to an MFD/MPD emulator developed by the Georgia Tech Research Institute (GTRI) in collaboration with the Army Reprogramming Analysis Team (ARAT), the testing can now be done on ordinary laboratory computers anytime it’s needed. The new emulator can help get software updates to deployed Army Aviation forces faster. “This is an exact replica of what’s on the helicopter, so when they’re testing the software upgrades in the laboratory, they see exactly what the pilot is going to see in the helicopter cockpit,” said William Miller, a GTRI principal research scientist who helped lead the project. “When the final software for the electronic warfare system is deployed to the field, it is already tested with the display. That saves money and time.” The project began with observing the operation of a multifunction display in operational helicopters. Next, a development team led by GTRI Research Scientist Heyward Adams developed the emulator in a standard military Windows-based computer, using cards to simulate the sensors that would normally be providing data to the MFD. The emulator is already in use by Army mission software developers in the ARAT laboratories. — John Toon

Heyward Adams is a research scientist in the Georgia Tech Research Institute.

CLUSTER: UZI LANDMAN; HELICOPTER: SOUTH CAROLINA NATIONAL GUARD
PROFILE
THE BUSINESS OF GREEN Dionne Nickerson, a Ph.D. student in marketing in Georgia Tech’s Scheller College of Business, researches the way companies and consumers can promote positive social change. Her current work examines corporate sustainability efforts and how potential customers respond to these programs.
Factoid The Scheller College of Business is internationally recognized as a leader in business education that is grounded in a deep understanding of how advances in technology affect the way business is conducted. The college focuses on equipping students with analytical skills to assess opportunities and apply appropriate technologies for a competitive advantage.
WHERE ARE YOU FROM?
I grew up on the South Side of Chicago and graduated from Brown University in 2000 with a B.A. in engineering. After graduation I joined AmeriCorps and worked as an elementary school reading tutor for a couple of years. Then I moved to France and taught English for a couple of years while taking some business and engineering classes at the Université de Technologie de Troyes. When I moved back I started working for a small technology consulting firm in Providence. That’s where I got exposed to tech transfer and the business side of engineering and science. I really enjoyed it, so I went back to school and earned my M.B.A. in 2013 from Providence College.

YOU HAVE A DIVERSE SKILL SET. WHAT BROUGHT YOU TO GEORGIA TECH?
When I was in my M.B.A. program I did my thesis on the adoption of mobile technology and mobile payment systems in Kenya and Tanzania. I looked at how it helped small entrepreneurs and interviewed microentrepreneurs in various industries in Nairobi and Dar es Salaam. It was amazing. I loved it. My advisors suggested I think about getting a Ph.D. I applied to a ton of schools, but Georgia Tech was the only place that brought me out to meet everyone in person. Also, at the time, my research interests were in mobile technology, so I thought Georgia Tech would be a really good fit.

WHAT ARE YOU RESEARCHING NOW?
It is very important to me that if I’m going to work on research it must be something that betters our lives. When I got to Georgia Tech I wanted to explore something totally different. I thought about how my husband and I recycle and how we interact with the environment but wondered why we don’t buy more things that support sustainability. That got me thinking about how a company can make a business case for sustainability. How does a company make a business case for people to do the right thing and buy the products that will help all of us and the planet we live on?

HOW DO YOU DEFINE SUSTAINABILITY?
It relates to business practices that protect our planet for generations to come. This is more than just an environmental perspective. This also relates to workers’ rights, women’s empowerment, and investing in communities in ways that will benefit all of society. My research will help companies understand what types of sustainability efforts and claims are a good match with their products and targeted customers.

YOU’LL COMPLETE YOUR PH.D. IN 2019. WHERE DO YOU SEE THIS RESEARCH GOING BETWEEN NOW AND THEN?
I’m working with Omar Rodriguez-Vila, assistant professor of marketing, to look at food products, specifically beverage companies, because they publish annual sustainability reports. Some early research shows people will look at certain products more favorably after learning about sustainability efforts, such as supporting youth employment or clean drinking water. Another study will look at whether consumers’ willingness to purchase sustainable products varies depending on whether they are in a public or private setting. The next step is to look at luxury products. While a lot of sustainability research has focused on low-cost consumer goods, there has been some public interest in seeing luxury brands engage in sustainability initiatives. — Laura Diamond

ROB FELT
FILE
A three-screen presentation system provided a preview of the venues that were being planned for the 1996 Olympic Games.
GO FOR THE WHITE AND GOLD
When organizers of Atlanta’s bid for the 1996 Summer Olympics met with then-Georgia Tech President John Patrick Crecine, they were planning to ask for help in creating a 3-D architectural model of the city’s vision for its Olympic venues. Other cities bidding for the games were expected to use such models in what was then the standard way to visualize planned construction. Instead, Crecine offered to help produce something that had never been part of an Olympic bid proposal before: an interactive 3-D simulation that would allow members of the selection committee to “fly through” the proposed venues. In the late 1980s, that kind of graphic presentation was truly cutting edge. According to the Fall 1990 issue of Research Horizons, the program presented a wide-screen view of the proposed Olympic Village using three videodisc players, three computers, computer-composed music, digitized narration, and a unique interaction system that included a computer-animated, touch-sensitive, 3-D model of the Olympic Village on the Georgia Tech campus. A Commodore Amiga computer controlled the presentation. The Georgia Tech research group, which included assistance from Georgia State University and several local companies, saw the presentation as a way to showcase Atlanta’s aspirations — and provide a testbed for developing innovative presentation techniques. “We were going to catch the weary International Olympic Committee (IOC) in the last week of the bid competition, so we wanted to make this more entertaining than the other big presentations they would see,” said development team leader Michael J. Sinclair, quoted in the magazine. “We wanted to tell the audience about transportation, medical facilities, entertainment, training facilities, housing, and dining.” The team relied heavily on computer-generated renderings of buildings proposed for the games. For instance, an architectural firm provided a database of information about the Olympic Dormitory, which now houses Georgia Tech students. Other facilities included the aquatic venue, which is now Tech’s Campus Recreation Center, and the Olympic Stadium, which became Turner Field. The program had its intended effect, and in the summer of 1996, Georgia Tech’s campus became the Olympic Village, housing some 15,000 athletes from around the world. — John Toon

RESEARCH HORIZONS ARCHIVES
Computing Inside Cells Using strands of nucleic acid, scientists have demonstrated basic computing operations inside a living mammalian cell. The research could lead to an artificial sensing system that could control a cell’s behavior in response to such stimuli as the presence of toxins or the development of cancer. The research uses DNA strand displacement, a technology that has been widely used outside of cells for the
Eryn Bernardy, a doctoral candidate in Georgia Tech’s School of Biological Sciences, holds an agar plate on which cholera colonies (yellow) are growing.
CHOLERA: JOHN TOON; COMPUTING: CHIARA ZURLA
A GERM-EAT-GERM WORLD In humans, cholera is among the world’s most deadly diseases. But in aquatic environments far from people, the same bacterium attacks neighboring microbes with a toxic spear — and often steals DNA from other microorganisms to expand its own capabilities. A study of more than 50 samples of Vibrio cholerae isolated from patients and the environment demonstrates the diversity and resourcefulness of the organism. In the environment, the cholera bacterium is commonly found attached to chitin, a material used by aquatic creatures such as crabs to form protective shells. In the wild, most strains of cholera can degrade the shells for food, and the new study showed how the presence of chitin can signal the bacteria — which have receptors for the compound — to produce behaviors very different from those seen in human disease. Among the cholera strains studied, less than a quarter were able to take up DNA from other sources. Almost all of the
samples taken from the environment were able to kill other bacteria — a phenomenon called “bacterial dueling” — but just 14 percent of the bacterial strains isolated from humans could do so. “It’s a dog-eat-dog world out there even for bacteria,” said Brian Hammer, an associate professor in the Georgia Tech School of Biological Sciences. “Bacteria such as Vibrio cholerae sense and respond to their surroundings, and they use that information to turn on and off the genes that benefit them in specific environments.” The research, supported by the National Science Foundation and the Gordon and Betty Moore Foundation, provides information that could lead to development of better therapeutic agents against the disease. The research was conducted with assistance from the Centers for Disease Control and Prevention and was reported in the journal Applied and Environmental Microbiology. — John Toon
Brian Hammer is an associate professor in Georgia Tech’s School of Biological Sciences.
PV CELLS GET SPACE TESTING A novel three-dimensional solar cell design developed at the Georgia Tech Research Institute (GTRI) will soon receive its first space testing aboard the International Space Station (ISS). An experimental module containing 18 test cells was launched to the ISS in July and installed on the exterior of the station. In addition to testing the three-dimensional format, the module will also study a low-cost copper-zinc-tin-sulfide (CZTS) solar cell formulation. Built by coating miniature carbon nanotube “towers” with a photo-absorber that captures sunlight from all angles, the 3-D cells could boost the amount of power obtained from the small surface areas many spacecraft have. “We want to see both the light-trapping performance of our 3-D solar cells and how they are going to respond to the harshness of space,” said Jud Ready, a GTRI principal research engineer and an adjunct professor in Georgia Tech’s School of Materials Science and Engineering. — Rick Robinson
design of molecular circuits, motors, and sensors. Researchers modified the process to provide both “AND” and “OR” logic gates able to operate inside living cells and interact with native messenger RNA. The tools they developed could provide a foundation for bio-computers that are able to sense, analyze, and modulate molecular information at the cellular level.

Supported by the Defense Advanced Research Projects Agency and the National Science Foundation, the research was reported in the journal Nature Nanotechnology.

“The whole idea is to be able to take the logic that is used in computers and port that logic into cells themselves,” said Philip Santangelo, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. “These devices could sense an aberrant RNA, for instance, and then shut down cellular translation or induce cell death.”

Strand displacement reactions are the biological equivalent of the switches or gates that form the foundation for silicon-based computing. They can be programmed to turn on or off in response to an external stimulus such as a molecule. — John Toon
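The Boolean behavior of those gates can be illustrated with a toy model. This sketch mirrors only the logic described above — the strand names are invented for the example, and the real devices are molecular reactions, not software:

```python
# Toy model of "AND"/"OR" gates triggered by the presence of RNA
# input strands. Strand names are hypothetical illustrations.

def and_gate(present: set, inputs: tuple) -> bool:
    """Fires only if every required input strand has been detected."""
    return all(strand in present for strand in inputs)

def or_gate(present: set, inputs: tuple) -> bool:
    """Fires if at least one of the input strands has been detected."""
    return any(strand in present for strand in inputs)

# A hypothetical sensor: respond only when an aberrant mRNA is present
# AND no healthy housekeeping signal is detected.
detected = {"aberrant_mRNA"}
trigger = and_gate(detected, ("aberrant_mRNA",)) and not or_gate(
    detected, ("housekeeping_mRNA",)
)
```

In the cell, the same composition would be implemented by chaining strand displacement reactions rather than function calls.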
FRONTOFFICE
Suman Das with the LAMP System CPT6060, which can be used to build highly complex and demanding ceramic cores and molds.
EXPERTISE
1. The LAMP machine dispenses a ceramic resin onto a platform. Mechanical arms spread the resin in a thin layer over the platform.
2. Using a digital CAD model, high resolution bitmaps are created that represent each layer of the model.
3. These bitmaps are then projected with UV light onto the layer of resin, one layer at a time.
4. Each layer is broken up into many small, seamless images and projected in a scrolling, serpentine pattern across the layer.
5. The areas of resin that receive the UV light become hardened and cure to form a solid layer. The process continues layer by layer until all of the layers are completed.
6. Once the resin is removed from the ceramic molds, they are ready to be fired in a furnace and then directly used in casting.
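The digital side of the process — slicing a model into per-layer bitmaps and scanning each layer in a serpentine path — can be sketched in a few lines. This is an illustrative toy, not DDM Systems’ software; the grid sizes, the voxel test, and the function names are invented for the example:

```python
# Sketch of steps 2-4: rasterize one 100-micron layer of a part into a
# bitmap, then visit the layer's projection tiles in a serpentine
# (back-and-forth) order. All dimensions here are invented.

def layer_bitmap(inside, z, nx, ny):
    """Rasterize one layer: bitmap[y][x] is True where the part's
    geometry occupies voxel (x, y) at layer height z."""
    return [[inside(x, y, z) for x in range(nx)] for y in range(ny)]

def serpentine_tiles(rows, cols):
    """Yield (row, col) tile coordinates in a scrolling serpentine
    path, reversing direction on every other row."""
    for r in range(rows):
        cols_iter = range(cols) if r % 2 == 0 else reversed(range(cols))
        for c in cols_iter:
            yield (r, c)

# Example: one layer of a small cylinder, radius 2 voxels, centered
# in a 5x5 grid.
inside = lambda x, y, z: (x - 2) ** 2 + (y - 2) ** 2 <= 4
layer = layer_bitmap(inside, z=0, nx=5, ny=5)
tile_order = list(serpentine_tiles(2, 3))
```

The serpentine ordering matters in practice because it lets the projector scroll continuously across the platform instead of rewinding after each row.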
THE SHAPE OF THINGS TO COME
A THOUSANDS-OF-YEARS-OLD PROCESS GETS A DIGITAL REBOOT

The stainless steel and glass encasement stands 8 feet tall and at least as wide. Inside, an optical projection system moves in a serpentine pattern across a platform, shooting rays of ultraviolet light into a slurry of photosensitive binder resin and ceramic particles. Dozens of pieces begin to take shape as the projection system makes thousands of passes over the platform.

The machine, which owes its origins to prototypes built by a team of researchers led by Suman Das, a professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering, is different from 3-D printers that have come before. This one can be used to build highly complex and demanding ceramic cores and molds. It’s called the LAMP™ System CPT6060, produced and marketed by Atlanta-based DDM Systems under an exclusive license from Georgia Tech.

The cores and molds are used for investment casting, also known as “lost wax” casting. Traditionally, making the cores and molds has required a time-consuming process that involves injection molding a ceramic core, creating a wax model around the core, using a ceramic slurry to slowly build a shell around the wax, and coating the shell with stucco to thicken it. After melting out the wax and firing the shell, molten metal is poured into the ceramic mold to form a part.

The LAMP system circumvents that process by fabricating the mold and core in one step from a digital design. In addition to the faster turnaround time, the new technique can reduce the cost of manufacture by as much as 65 percent for new component designs.

The LAMP system works by converting a digital design into thousands of high-resolution images that the machine uses to build parts a single 100-micron layer at a time using a slurry mixture. The 3-D printer is large enough to build numerous parts simultaneously during one session. Once the resin is removed from the molds, they are ready to be fired in a furnace and then directly used in casting.

While the first industrial application of the system is printing ceramic cores and molds for casting aircraft turbine engine components, the LAMP system is also in great demand for making precision cast components used in a diverse group of industries, including defense, energy, biomedical, and automotive. Aside from printing ceramic cores and molds for investment casting, the LAMP system is also capable of printing a variety of other complex pieces from the ceramic slurry. — Josh Brown
CURRENT EVENTS
Scientists study flow of glacial meltwater into oceans

Scientists have observed a significant increase in the melting of glacial land ice on Greenland, spurring concerns about global sea level rise. A research team led by the University of Georgia (UGA) and including a Georgia Tech scientist has discovered the fate of much of the fresh water that pours into the surrounding oceans as the Greenland ice sheet melts every summer. The findings were published in the journal Nature Geoscience.

“Understanding the fate of meltwater is important, because research has shown that it can carry a variety of nutrients, which may impact biological production in the ocean,” said Renato Castelao, co-author of the study and associate professor of marine sciences at UGA.

The researchers created a simulation that tracks meltwater runoff under a variety of atmospheric conditions, and they were surprised to discover that most of the meltwater found off the west coast of Greenland actually originated from ice on the east coast. The meltwater is largely deposited into the Labrador Sea, an arm of the Atlantic between Canada’s Labrador Peninsula and the southwest coast of Greenland.

Annalisa Bracco, a professor in Georgia Tech’s School of Earth and Atmospheric Sciences, says the Labrador Sea is a basin of key climate relevance because it’s one of the few places in the world where “deep water” is formed in the ocean through convection.

“In winter, water can mix to depths of 2,000 meters (6,000 ft.) and feed deep ocean currents,” she said. “If the surface stratification of the ocean changes — because so much melted water reaches the central Labrador Sea — convection will be halted, which creates dangerous consequences for the global climate.” — James Hataway, University of Georgia

A research team measures meltwater runoff from the ice sheet margin in Greenland during summer 2013. (Photo: Thomas Mote/UGA)

Annalisa Bracco is a professor in Georgia Tech’s School of Earth and Atmospheric Sciences.
ACID IN THE AIR

When acidic materials are spilled, clean-up involves adding a base chemical to neutralize the acid. Up to a point, the more base added, the more neutral and less toxic the spill becomes. Something similar happens in the atmosphere. Acidic sulfur emissions from power plants have been rapidly declining over the past decade, while the neutralizing base — ammonia, emitted from different sources — hasn’t declined. This has led many atmospheric scientists to assume that the ambient sulfate particles we all breathe are becoming less acidic and therefore less toxic.

But a new study shows this intuitive expectation hasn’t held, at least not in the Southeastern United States, where the remaining sulfate particles appear to be as acidic as ever. Beyond human health, the research has broader implications for atmospheric pollution and global climate change modeling.

Sponsored by the National Science Foundation and the U.S. Environmental Protection Agency, the research was reported in the journal Nature Geoscience. The conclusions are based on observed gas and aerosol composition as well as humidity and temperature data collected at a site in rural Alabama as part of the Southern Oxidant and Aerosol Study.

“Sulfates are the major source of acidity in the atmosphere, and gas-phase ammonia — mostly from agriculture — had been expected to react with the remaining particles to reduce their acidity,” explained Rodney Weber, a professor in Georgia Tech’s School of Earth and Atmospheric Sciences. “But what we found is that the system that forms the sulfate particles isn’t very sensitive to the amounts of ammonia neutralizer. This has implications because the acidity of these particles affects other important atmospheric reactions.” — John Toon

Factoid: The pH of PM2.5 particles can’t be directly measured, so scientists must infer their acidity by studying the distribution of atmospheric species that can be measured and are highly sensitive to the value of the particle pH.

Researchers have found that the pH of atmospheric sulfate particles is lower than expected. Shown are Georgia Tech graduate student Hongyu Guo, Professor Rodney Weber from the School of Earth and Atmospheric Sciences, and Professor Armistead Russell from the School of Civil and Environmental Engineering.

PICTURE OF HEALTH

Pork, mayonnaise, and cookies versus bagels, kale, and hummus. That’s the glaring difference in food choices between two groups of people in the Northeastern United States. The foods on the first list are more exclusive to social media feeds of people living in Northeastern food deserts, a term used by the United States Department of Agriculture (USDA) to describe communities with limited access to grocery stores with fresh food. The second list is more exclusive to non-food deserts.

A new study has identified the food choices and nutritional profiles of people living in both types of communities throughout the United States. It included three million geotagged posts on the social media platform where food is king: Instagram. The researchers found that food posted by people in food deserts is 5 to 17 percent higher in fat, cholesterol, and sugars compared to food postings shared by people in areas with more grocery stores.

“Instagram literally gives us a picture of what people are actually eating in these communities, allowing us to study them in a new way,” said Munmun De Choudhury, an assistant professor in Georgia Tech’s School of Interactive Computing. The research was presented at the ACM Conference on Computer-Supported Cooperative Work and Social Computing.

The breakdown of foods in other regions includes:
Southeast: bacon, potatoes, and grits vs. collard greens, oranges, and peaches.
Midwest: hamburgers, hot dogs, and brisket vs. beans, spinach, and kale.
West: pie, beef, and sausage vs. quinoa, apples, and crab.
Southwest: barbeque, pork, and burritos vs. tomatoes, asparagus, and bananas.
— Jason Maderer
VISUALIZATION
GRAPHIC IMPACT

When astrophysicists saw the squiggly lines recorded by LIGO detectors in Washington and Louisiana in September 2015, they immediately suspected that the gravitational waves they were seeing — the first ever observed — signaled the collision of two massive black holes. For ordinary folks, however, the squiggles — which confirmed another prediction of Albert Einstein’s 1915 general theory of relativity — just didn’t say very much.

But for Karan Jani, a member of the LIGO team and a doctoral candidate in Georgia Tech’s School of Physics, and Matt Kinsey, another doctoral candidate, the waves offered an opportunity to apply simulation and visualization technology to help a broader audience share in the excitement the scientists were experiencing.

Using Einstein’s equations, Jani created a simulation of the collision of black holes recorded by the Laser Interferometer Gravitational-wave Observatory (LIGO) detectors. The event itself lasted just a half second, but simulating it required an estimated 50,000 hours of high-performance computer time — using a Georgia Tech computer cluster and a supercomputer at the Texas Advanced Computing Center.
INSPIRAL Creating the visualization started with 50,000 hours of computer time to solve Einstein’s field equations, then processing the results with graphics software. The colors emanating from the black holes represent the energy lost by the pair: gravitational waves.
Kinsey then used a program known as VisIt to convert the simulations to a 3-D animation; frames from the animation were used to create a poster that was shown by media around the world.

“These graphics are really the only way that people in general could have grasped what the discovery is all about,” Jani said. “To take Einstein’s equations and solve them, you need a very sophisticated numerical infrastructure, and we were fortunate to have had access to this.”

When the black holes collided 1.3 billion years ago, the gravitational waves radiated away from them in a three-dimensional pattern. But since the waves were detected at only two locations on Earth, the full 3-D nature of the event could not have been appreciated without the simulation and visualization.

The two black holes were each roughly 30 times the mass of our sun, but because black holes are the densest objects in the universe, the resulting body would have had an area perhaps no larger than Montana. The peak power output of the collision was about 50 times that of all the stars in the observable universe combined.

“It is truly remarkable that we were here at the right time and with the technology to detect this,” Jani said. — John Toon
MERGER During the final fraction of a second, the two black holes collide at nearly one-half the speed of light and form a single, more massive black hole, converting a portion of the combined black holes’ mass to energy, according to Einstein’s formula E = mc². This energy is emitted as a final strong burst of gravitational waves. This section of the waveform is the moment of greatest energy, during the merger, shown by the size of the gravitational waves detected by LIGO.
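The mass-to-energy conversion can be made concrete with a rough back-of-the-envelope calculation. The published analysis of this event (a figure not given in this article) found that about three solar masses were radiated away as gravitational waves:

```latex
E = \Delta m \, c^{2}
  \approx \left(3 \times 1.99 \times 10^{30}\ \mathrm{kg}\right)
          \left(3.0 \times 10^{8}\ \mathrm{m/s}\right)^{2}
  \approx 5 \times 10^{47}\ \mathrm{J}
```

For scale, that is billions of times the energy the sun will emit over its entire lifetime, released in a fraction of a second.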
RINGDOWN At the end of the merger, the black hole is not a perfect sphere but continues to reverberate, like a bell ringing. The gravitational waves are traveling farther from their source — by the time they reach Earth, they will have traveled 1.3 billion light-years.
ROBOTS LEARN HUMAN ETHICS BY READING STORIES

The rapid advance of artificial intelligence (AI) is raising concerns about robots acting unethically or choosing to harm humans. But how can robots learn ethical behavior if there is no “user manual” for being human? Researchers Mark Riedl and Brent Harrison from Georgia Tech’s School of Interactive Computing believe the answer lies in “Quixote,” a technique that teaches “value alignment” to robots by training them to read stories, learn acceptable sequences of events, and understand successful ways to behave in human societies.

“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels, and other literature,” said Riedl, associate professor and director of the Entertainment Intelligence Lab. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”

Quixote is a technique for aligning an AI’s goals with human values by placing rewards on socially appropriate behavior. It builds upon Riedl’s prior research — the Scheherazade system — which demonstrated how artificial intelligence can gather a correct sequence of actions by crowdsourcing story plots from the Internet. Scheherazade learns what is a normal or “correct” plot graph. It then passes that data structure along to Quixote, which converts it into a “reward signal” that reinforces certain behaviors and punishes others during trial-and-error learning. In essence, Quixote learns that it will be rewarded whenever it acts like the protagonist in a story instead of behaving randomly or like the antagonist.

Presented at the AAAI-16 Conference, the research was supported by the Defense Advanced Research Projects Agency (DARPA) and the Office of Naval Research (ONR). — Tara La Bouff
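The plot-graph-to-reward-signal conversion can be sketched abstractly. This toy reduces a plot graph to a single ordered event list; the event names and reward values are invented for illustration and are not from the Quixote system itself:

```python
# Toy reward shaping in the spirit of Quixote: an agent earns a
# positive reward for taking the next step the story's "protagonist"
# would take, and a penalty for any other action. Event names are
# hypothetical.

PLOT = ["enter_pharmacy", "wait_in_line", "pay_for_medicine", "leave"]

def reward(history, action):
    """+1 if the action matches the next expected plot event given the
    actions taken so far, -1 otherwise (e.g. grabbing the medicine
    and running)."""
    expected = PLOT[len(history)] if len(history) < len(PLOT) else None
    return 1 if action == expected else -1
```

During trial-and-error learning, this signal steers the agent toward protagonist-like sequences rather than antisocial shortcuts.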
BUGS IN THE PIPES

The human microbiome, a collection of microorganisms living inside us and on our skin, has been attracting considerable attention. Now, researchers are discovering that the built environment also has a microbiome, which includes a community of potentially pathogenic bacteria living inside water supply pipes.

A paper published in the journal Applied and Environmental Microbiology described microbial communities found in shower hoses at a major U.S. hospital. The study documented bacteria — and related genes — using cutting-edge metagenomic techniques that allow the characterization of organisms that cannot be detected using traditional culture-based microbiology assays.

Researchers from the U.S. Environmental Protection Agency and Georgia Tech collaborated to study these biofilm communities but can’t say yet whether these bacteria pose a threat to patients. Because some of the genes could indicate pathogenic characteristics — such as resistance to antibiotics — more study is needed.
“We can say confidently that if pathogens are in there, they are not there in very high abundance,” said Kostas Konstantinidis, an associate professor in Georgia Tech’s School of Civil and Environmental Engineering. “But the organisms we detected in these biofilms appear to have characteristics that could be of interest because they are related to bacteria that are opportunistic pathogens that could pose a threat.” Researchers began by culturing bacteria from 40 shower hoses removed from individual hospital rooms. Nucleic acid was extracted from five of the shower hoses and processed using next-generation sequencing technology. The sequencing data was sent to Georgia Tech, where doctoral student Maria Juliana Soto-Girón matched the sequences against known bacteria — and genes that have known effects such as virulence and antibiotic resistance. “If they have a core of genes, they may be receptive to acquiring other genes that will render these microorganisms more
problematic,” said Jorge Santo Domingo, a microbial ecologist with the EPA’s Office of Research and Development in Cincinnati. “These organisms are very good at living in difficult environmental conditions with limited carbon sources, so fighting them could become a challenging proposition. We don’t know if they constitute a problem, but we certainly want to find out.” — John Toon
Image shows the growth of Mycobacterium isolated on a plate of culture medium.
‘Bursting’ for Attention
Researchers have for the first time precisely manipulated bursting activity of cells in the thalamus, tying it to the sense of touch. Shown are Georgia Tech graduate student Clarissa Whitmire and Professor Garrett Stanley.
Researchers have fabricated model blood vessel systems with diameters as narrow as the smallest capillaries in the body. The systems were used to study the activity of white blood cells as they were affected by drugs that tend to make them softer, which facilitates their entry into blood circulation.
LET’S GET PHYSICAL

Wilbur Lam is an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University.
Simple physics may play a larger role than previously thought in helping control key bodily processes, including how the body fights infection.

Using a model blood vessel system built on a polymer microchip, researchers have shown that the relative softness of white blood cells determines whether they remain in a dormant state along vessel walls or enter blood circulation to fight infection. Changes in these cell mechanical properties — from stiff to soft — can be triggered as a side effect of drugs commonly used to fight inflammation or boost blood pressure. Better understanding the role of physics in fine-tuning such biological processes could give researchers new approaches for both diagnosing and treating disease.

The work, believed to be the first to show how biophysical effects can control where white blood cells are located within the blood circulation, was reported in the journal Proceedings of the National Academy of Sciences. The research was supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health, the National Science Foundation, and the American Heart Association.

“We are showing that white blood cells, also known as leukocytes, respond physically to these drugs and that there is a biological consequence to that response,” said Wilbur Lam, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. “This may suggest new ways to treat disease, and new places to look for diagnostic information. There may be physics-based disease biomarkers that we can use in addition to the common biological and biochemical markers we have been using.” — John Toon
As you start across the street, out of the corner of your eye, you spot something moving toward you. Instantly, your brain shifts its focus to assess the potential threat, which you’re relieved to determine is a slow-moving bicycle.

The brain’s ability to quickly focus on life-or-death decisions, then immediately shift to detailed analytical processing, is believed to be the work of the thalamus, a small section of the midbrain through which most sensory inputs from the body flow. When cells in the thalamus detect something that requires urgent attention from the rest of the brain, they begin “bursting” — many cells firing off simultaneous signals to get the attention of the cortex. Once the threat passes, the cells quickly switch back to quieter activity.

Using optogenetics and other technology, researchers have for the first time precisely manipulated this bursting activity of the thalamus, tying it to the sense of touch. The work, done in animal models, was reported in the journal Cell Reports and sponsored by the National Institutes of Health’s National Institute of Neurological Disorders and Stroke.

“If you clap your hands once, that’s loud,” explained Garrett Stanley, a professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. “But if you clap your hands several times in a row, that’s louder. And if you and your friends all clap together and at the same time, that’s even stronger. That is what these cells do, and the idea is that this mechanism produces bursts synchronized across many cells to send out a very strong signal about a stimulus in the outside world.”

Neuroscientists have long believed that such coordinated spikes of activity serve to focus the brain’s attention on issues requiring immediate attention.
Stanley and graduate student Clarissa Whitmire — working with researchers Cornelius Schwarz and Christian Waiblinger from the University of Tübingen in Germany — used optogenetics techniques to study bursting activity in the thalamus regions of rats. — John Toon
WATER WINGS
The sea butterfly (Limacina helicina), a zooplankton snail that lives in cold oceans, truly lives up to its name. Georgia Tech researchers went to the Pacific Ocean to scoop up hundreds of the 3-millimeter marine mollusks and then used high-speed cameras to watch how they move. They found that sea butterflies don’t paddle like most small water animals. Instead, they’re like flying insects, flapping their wings to propel themselves through the water.

“Snails evolutionarily diverged from flying insects 550 million years ago,” said Donald Webster, a professor in Georgia Tech’s School of Civil and Environmental Engineering. “Hence, it is amazing that marine snails are using the same figure-eight wing pattern that is typical of their very distant airborne relatives.”

Another similarity between these pteropods and insects is the use of a clap-and-fling wing motion. Each species claps its wings together, then rapidly flings them apart to generate enhanced lift.

The team did find one major difference between sea butterflies and flying insects: nearly two-thirds of the plankton’s body is its shell. When it’s not moving forward, it sinks toward the ocean floor. To avoid sinking, the pteropod rotates its body up to 60 degrees with each stroke. The rotation puts its wings in the proper position to flap downward during every half-stroke (about 10 times per second), thereby enabling it to move in an upward, zig-zag path through the water.

The research was reported in The Journal of Experimental Biology. — Jason Maderer
The sea butterfly, or Limacina helicina, is a zooplankton snail that lives in cold oceans.
POLLUTED DUST HURTS OCEAN LIFE

Researchers have found yet another worrisome trend impacting the health of the Pacific Ocean. Modeling shows that for decades, air pollution drifting from East Asia out over the world’s largest ocean has kicked off a chain reaction that contributed to oxygen levels falling in tropical waters thousands of miles away.

“There’s a growing awareness that oxygen levels in the ocean may be changing over time,” said Taka Ito, an associate professor in Georgia Tech’s School of Earth and Atmospheric Sciences (EAS). “One reason for that is the warming environment — warm water holds less gas. But in the tropical Pacific, the oxygen level has been falling at a much faster rate than the temperature change can explain.”

In the study, the researchers describe how air pollution from industrial activities has raised levels of iron and nitrogen — key nutrients for marine life — in the ocean off the coast of East Asia. Ocean currents then carry the nutrients to tropical regions, where they promote biological activity, eventually reducing subsurface oxygen. That process plays out all across the Pacific, but the effects are most pronounced in tropical areas, where dissolved oxygen is already low.

Athanasios Nenes, a professor in EAS and the School of Chemical and Biomolecular Engineering, said the research shows how far-reaching the impact of human industrial activity can be. “The scientific community always thought that the impact of air pollution is felt in the vicinity of where it deposits,” Nenes said. “This study shows that the iron can circulate across the ocean and affect ecosystems thousands of kilometers away.”

The study, which was published in the journal Nature Geoscience, was sponsored by the National Science Foundation. — Josh Brown
Bubble Trouble

Volcanic eruptions spew fine ash, sulfur, and crystal-poor magma into the atmosphere. New research suggests how light vapor bubbles migrating and accumulating in parts of shallow volcanic chambers contribute to the effects. Volcanic chambers are a maze of crystal-rich and crystal-poor regions, especially where magma stalls and builds before eruption. The researchers used lab experiments and computer models to focus on how bubbles move to and through these shallow reservoirs, which are three to five miles below the surface.

“We know that bubbles control the style and power of eruptions, but we don’t fully understand how they behave,” said Christian Huber, an assistant professor in Georgia Tech’s School of Earth and Atmospheric Sciences. “It’s probably like opening a soda and watching the bubbles race to the top of the bottle.”

Huber and colleagues from Eidgenössische Technische Hochschule Zurich (ETH) believe these bubbles maneuver their way through crystal-filled magma until they settle in these open-spaced reservoirs — areas without many crystals — and build up the necessary energy for an impending eruption. The team’s experiments indicate that bubbles squeeze through the narrow openings to create finger-like paths. These long paths allow the bubbles to merge and form connected pathways that transport low-density vapor efficiently through the crystal-rich parts of magma chambers.

“Once they reach the end of this crystal-rich area and get more space, the water vapor fingers transform back into their usual, spherical bubble shape,” said Andrea Parmigiani, who led the study during his postdoctoral work at Georgia Tech and ETH. “Once vapor forms these bubbles, the ascent of the light vapor bubbles is slow and bubbles accumulate.”

The findings were reported in the journal Nature. — Jason Maderer
PROTECTING THE GRID BY LISTENING
Device fingerprinting can reveal attacks

Human voices are individually recognizable because they’re generated by the unique components of each person’s voice box, pharynx, and other physical structures. Researchers are using the same principle to identify devices on electrical grid control networks, using their unique electronic “voices” — fingerprints produced by the devices’ individual physical characteristics — to determine which signals are legitimate and which might be from attackers. A similar approach could also be used to protect networked industrial control systems in oil and gas refineries, manufacturing facilities, wastewater treatment plants, and other critical systems. The research, reported at the Network and Distributed System Security Symposium, was supported in part by the National Science Foundation.

“We have developed fingerprinting techniques that work together to protect various operations of the power grid to prevent or minimize spoofing of packets that could be injected to produce false data or false control commands into the system,” said Raheem Beyah, the Motorola Foundation Professor in Georgia Tech’s School of Electrical and Computer Engineering. “This is the first technique that can passively fingerprint different devices that are part of critical infrastructure networks. We believe it can be used to significantly improve the security of the grid and other networks.”

The networked systems controlling the U.S. electrical grid and other industrial systems often lack the ability to run modern encryption and authentication programs, and the legacy equipment connected to them was never designed for networked security. The grid systems are also difficult to update using the “patching” techniques common in computer networks.

Device fingerprinting takes advantage of the unique physical properties of the grid and the consistent types of operations that take place there. For instance, security devices listening to signals traversing the grid’s control systems can differentiate between legitimate devices and signals produced by equipment that’s not part of the system. — John Toon

Georgia Tech researchers are “fingerprinting” devices on the electric grid to improve security. Shown with grid devices and a schematic are graduate student David Formby, Associate Professor Raheem Beyah, and Assistant Professor Jonathan Rogers.

Factoid: In February, CNN reported that U.S. investigators had found evidence of a first-of-its-kind cyberattack on a power grid that caused a 2015 blackout for hundreds of thousands of people in Ukraine.
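The fingerprinting idea can be illustrated with a toy classifier. The device names and timing values below are invented for the example; the actual research models real grid protocol traffic and the physical properties of real equipment:

```python
# Toy passive fingerprinting: each device class has a characteristic
# response time driven by its physical construction, and an observed
# operation is attributed to the known profile with the nearest mean.
# All names and timings here are hypothetical.

KNOWN_PROFILES = {           # mean response time in milliseconds
    "breaker_controller": 40.0,
    "relay_model_A": 8.0,
    "spoofed_pc": 0.5,       # a commodity PC answers far too quickly
}

def classify(observed_ms):
    """Attribute a response to the device profile with nearest mean."""
    return min(KNOWN_PROFILES, key=lambda d: abs(KNOWN_PROFILES[d] - observed_ms))

def is_suspicious(observed_ms, claimed_device):
    """Flag traffic whose timing doesn't match the claimed device."""
    return classify(observed_ms) != claimed_device
```

A real system would build these profiles from observed traffic distributions rather than single means, but the attribution step works the same way.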
RESISTANCE IS FUTILE
STORM: ISTOCKPHOTO; ZHANG: JOHN TOON
Small Failures, Big Impact When Super Storm Sandy struck New York in October 2012, the damage to the state’s electric utility infrastructure was devastating, overwhelming repair and restoration efforts by distribution system operators. A new study shows the extent of the challenge faced by the upstate New York distribution grid and suggests what might be done to make the system more resilient against future storms. The study, which required more than three years to complete, examined power failures that affected more than 650,000 customers from four major service regions. The study showed that failures affecting small numbers of customers accounted for more than half the outage impact, challenging a traditional recovery strategy that prioritizes repairs to substations and other major facilities. The research, published in the journal Nature Energy, is believed to be the largest detailed study of failure reports for distribution grids. “System failures can affect large numbers of customers even if they occur at the distribution level of the grid and do not cascade,” said Chuanyi Ji, an associate professor in Georgia Tech’s School of Electrical and Computer Engineering. “Together, these local failures can have a big non-local impact on customers. These findings are drawn from largescale data analytics — a relevant area for energy infrastructure.” The top 20 percent of distribution grid failures accounted for more than 80 percent of the customers affected. But even failures that each affected relatively small numbers of customers added up. A large portion — 89 percent — of small failures, represented by the bottom 34 percent of customers and commonplace devices, resulted in 56 percent of the total cost of the storm’s 28 million customer-interruption hours. — John Toon
By increasing the level of a specific microRNA (miRNA) molecule, researchers have restored chemotherapy sensitivity in vitro to a line of human pancreatic cancer cells that had developed resistance to a common treatment drug. If the miRNA molecules can be delivered to cells in the human body, the technique might one day be used to battle the chemotherapy resistance that often develops during cancer treatment. A Georgia Tech research team identified the miRNA used in the research with a computer algorithm that compared the ability of different miRNAs to control the more than 500 genes that were up-regulated in drug-resistant cancer cells. The study was reported May 27 in the Nature Publishing Group journal Cancer Gene Therapy. “We were specifically interested in what role miRNAs might play in developing drug resistance in these cancer cells,” said John McDonald, a professor in Georgia Tech’s School of Biological Sciences and director of the Integrated Cancer Research Center. “By increasing the levels of the miRNA governing the suite of genes we identified, we increased the cells’ drug sensitivity back to what the baseline had been, essentially undoing the resistance. This would suggest that for patients developing chemotherapy resistance, we might one day be able to use miRNAs to restore the sensitivity of the cancer cells to the drugs.” MicroRNAs are small non-coding molecules that function in RNA silencing and post-transcriptional regulation of gene expression. The miRNAs operate via base-pairing with complementary sequences within messenger RNA (mRNA) molecules, silencing the mRNA molecules that control the expression of certain proteins. — John Toon

Georgia Tech Graduate Research Assistant Mengnan Zhang moves samples of pancreatic cancer cells into a flask for study. The research examines the role of microRNA molecules in controlling resistance to chemotherapy drugs.
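The selection step can be pictured as a simple ranking: score each candidate miRNA by how many of the up-regulated genes it is predicted to target. The sketch below is much simplified, with invented gene and miRNA names; the published algorithm was more sophisticated.

```python
# Simplified sketch of ranking miRNAs by how many up-regulated genes
# each one is predicted to target. Names and target lists are invented
# for illustration; the actual algorithm was more sophisticated.

def rank_mirnas(up_regulated, predicted_targets):
    """Return (miRNA, coverage) pairs sorted by coverage of the
    up-regulated gene set, best first."""
    up = set(up_regulated)
    scores = {
        mirna: len(up & set(targets))
        for mirna, targets in predicted_targets.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

up_genes = ["GENE_A", "GENE_B", "GENE_C", "GENE_D"]
targets = {
    "miR-1": ["GENE_A", "GENE_B", "GENE_C"],
    "miR-2": ["GENE_B"],
    "miR-3": ["GENE_X", "GENE_Y"],
}
ranking = rank_mirnas(up_genes, targets)
print(ranking[0])   # ('miR-1', 3): covers the most up-regulated genes
```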
MORE THAN PEER PRESSURE

When an assembly of microgel particles includes one particle that’s significantly larger than the rest, that oversized particle spontaneously shrinks to match the size of its smaller neighbors. This self-healing nature of the system allows the microparticles to form defect-free colloidal crystals, an unusual property not seen in systems made up of incompressible particles. Using small-angle X-ray and neutron-scattering techniques, researchers carefully studied the structures formed by dense concentrations of the microparticles. They also used tiny piezoelectric pressure transducers to measure osmotic pressure changes in the system. They found that in dense assemblies of microparticles, counter ions bound to the microgels by electrostatic attraction come to be shared
by multiple particles, increasing the osmotic pressure, which then works to shrink the oversized particle. “When the particles are close enough together, there is a point at which the cloud of ions can no longer be associated with individual particles because they overlap other particles,” said Alberto Fernandez-Nieves, an associate professor in Georgia Tech’s School of Physics. “The ions create an imbalance between osmotic pressure inside and outside the larger particles, pushing them to de-swell — expel solvent to change size — to match the pressure of the system given by these delocalized ions. This is only possible because the microgel particles are compressible.” The research was reported in the journal Proceedings of the National Academy of Sciences. — John Toon
Alberto Fernandez-Nieves is an associate professor in Georgia Tech’s School of Physics.
R E S E A R C H H O R I ZO N S 2 3
How traditional research is being rebooted
A textual analysis of this story orders the letters and then graphs their relationships
The folder structure of this issue of Research Horizons magazine. The gray bands are folders, and the width of the yellow arc represents the size of each individual file.
When it comes to scientific circles, data science may be a new kid on the block, but it’s rapidly become everyone’s best friend. A highly interdisciplinary field that blends statistics, computing, algorithms, applied mathematics, and visualization, data science uses automated methods to gather and extract knowledge from very large or complex sets of data. “Data science is difficult to explain because any way you define it, you’re usually excluding something that is critically important,” said Dana Randall, a professor in Georgia Tech’s School of Computer Science. Granted, people have been collecting and crunching numbers for a long time, but over the past decade several things have changed, noted Charles Isbell, senior associate dean and a professor in Georgia Tech’s College of Computing. “A lot of data became ubiquitous, algorithmic sophistication has increased dramatically, we can construct complicated models to predict things — and we have the machinery to make it happen. Put all this together, and data science suddenly matters.” Indeed, instead of being relegated to some niche fields, “data science is becoming pervasive,” agreed Steve McLaughlin, chair of Georgia Tech’s School of Electrical and Computer Engineering (ECE). “There are very few fields in sciences, engineering, humanities, or business that aren’t being drastically impacted by data.”

Professors Dana Randall and Srinivas Aluru are co-executive directors of Georgia Tech’s new Institute for Data Engineering and Science.
<drug repurposing>
Take Jeffrey Skolnick, who is leveraging big data and high-performance computing to advance drug development. “Today I’m not only working differently than 20 years ago, I’m working differently than five years ago,” said Skolnick, a professor in Georgia Tech’s School of Biological Sciences and director of the Center for the Study of Systems Biology. “The widespread access to extremely large data sets to learn, train, and test on is a sea change. We’ve figured out ways of using predictive structures rather than experimental ones, which can save time and money.” Pharma companies typically spend more than $1 billion and take 10 to 15 years to develop a new drug. Yet only one in 5,000 compounds actually makes it from the lab to the medicine chest. On the brighter side, the U.S. Food and Drug Administration has approved some 1,500 drugs for consumer use, so finding new
uses for existing drugs could dramatically accelerate translational medicine — which is Skolnick’s bailiwick. His research group has built a unique knowledge base by developing algorithms that predict possible structures for 86 percent of human proteins from DNA sequencing. Known as Dr. PRODIS (DRugome, PROteome and DISeasome), this knowledge base can suggest alternative uses for FDA-approved drugs for each protein associated with a disease. “It’s not perfect yet, but the database is very useful for giving you a short list of things to try if existing treatments aren’t working,” Skolnick said. One of their success stories has been to suggest a drug, originally developed to combat nausea, to treat a child suffering from a rare form of chronic fatigue. Drug repurposing is possible because of the fundamental design properties of proteins and the “promiscuity” of drugs, Skolnick said. His researchers have shown there are a limited number (fewer than 500) of ligand-binding pockets, where a drug molecule can form a bond with a human protein. “So even if you design a drug to target a ligand-binding pocket in one protein, unintended interactions with similar pockets in other proteins can occur because the number of pocket choices is small,” Skolnick explained. In addition to drug repurposing, Dr. PRODIS can identify human protein targets for new chemical entities along with possible side effects. “We can help pharma companies by suggesting early in the game if their drug has off-target interactions that could cause it to be withdrawn or fail clinical trials,” Skolnick said, noting that Dr. PRODIS’ success rate is about 44 percent. “Again, this isn’t perfect, but it gives you a good tool.” What’s more, it’s a fast tool. If a pharma company gives Skolnick a drug molecule and wants to know its side effects, he can produce results within an hour — something that five years ago wasn’t possible in any time frame.
Skolnick chalks up this network analysis to the treasure-trove of data now at his disposal. “When you have very large datasets that are diverse, the statistical likelihood of generalizing the data and getting meaningful results is far higher,” Skolnick said. “Then, high performance computing enables you to manipulate the data and learn from it. The goal is to build an algorithm that behaves the same way in the real world as in a controlled environment.”
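Because the vocabulary of binding pockets is small, repurposing can be pictured as an index lookup: file each approved drug under the pocket types it binds, then look up a disease protein’s pockets. The sketch below is a caricature with invented names and string labels; Dr. PRODIS compares predicted 3-D structures, not labels.

```python
# Caricature of pocket-based drug repurposing: approved drugs are
# indexed by the pocket types they bind; a disease protein with a
# matching pocket type yields repurposing candidates. Drug names and
# pocket labels are invented; real systems compare 3-D structures.

from collections import defaultdict

def build_pocket_index(drug_pockets):
    """Map each pocket type to the approved drugs known to bind it."""
    index = defaultdict(list)
    for drug, pockets in drug_pockets.items():
        for pocket in pockets:
            index[pocket].append(drug)
    return index

def repurposing_candidates(protein_pockets, index):
    """Drugs that bind any of the protein's pocket types."""
    hits = []
    for pocket in protein_pockets:
        hits.extend(index.get(pocket, []))
    return sorted(set(hits))

approved = {"drug_A": ["pocket_7"], "drug_B": ["pocket_7", "pocket_12"]}
index = build_pocket_index(approved)
print(repurposing_candidates(["pocket_12"], index))   # ['drug_B']
```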
<drinking from the fire hose>
Professor Jeffrey Skolnick is leveraging big data and high-performance computing to find new uses for existing drug compounds. He is shown in the pharmacy of Georgia Tech’s Stamps Health Services.

With unprecedented amounts of data suddenly on tap, the challenge many researchers face is how to consume it. For example, inexpensive sensor technology has made it easy for power companies to collect data on critical high-value assets such as generators and turbines. Yet analytical technology has lagged behind, inhibiting their ability to make sense out of it, said Nagi Gebraeel, associate professor in Georgia Tech’s School of Industrial and Systems Engineering (ISyE) and associate director of the Strategic Energy Institute. In response, Gebraeel’s research group is developing a new computational platform to provide detection and predictive analytics for the energy industry. This platform remotely assesses the health and performance of equipment in real time and monitors trends to determine such things as:

• The best time to perform maintenance.
• When to order new parts so they don’t linger in inventory, costing money and possibly becoming obsolete.
• How shutting down one piece of equipment will affect the entire network.
“The latter is especially important because any slack caused by shutting down one generator has to be picked up by the rest of the generators,” Gebraeel said. “Now their lifetime has to be re-evaluated because they are working in overload. That’s where optimization and analytics intersect.” By integrating detection, prediction, and optimization capabilities, the new platform could help power companies achieve significant savings. Indeed, a preliminary study shows a 40 to 45 percent reduction in maintenance costs alone. In the past, there’s been a lot of unnecessary preventative maintenance, Gebraeel pointed out. “Companies do it because of safety, which is rational, but they are being too conservative because they don’t have enough visibility into their assets.” Key to creating the computational platform is re-engineering older statistical algorithms that were developed in the context of limited data, Gebraeel said. Today’s algorithms must be executed on processing platforms that can handle terabytes and petabytes of data, deployed across a large number of computer nodes.

<beyond exascale>

Currently, the world’s most powerful supercomputers work at the petascale (millions of billions of floating-point operations per second), and computer engineers expect to achieve exascale (about 1,000 times faster) within the next few years. Yet further increases in performance require radical changes, said Tom Conte, a professor in Georgia Tech’s College of Computing. For years, the computer industry had been fulfilling Moore’s Law, with computers doubling in speed every 18 months, but that ended in 2005, when the power densities of CMOS-based microprocessors made continued scaling of single-thread performance uneconomical, Conte said. “Parallelism and multicore began, but they are limited solutions. We’re at an inflection point where we need a fundamental change in how we build computers.” Earlier this year, Conte launched Georgia Tech’s Center for Research into Novel Computing Hierarchies (CRNCH). “It’s about hierarchies because we think the solution will cut across all levels, from algorithms to programming languages, all the way down to how we build devices,” he said. Among the possible solutions are:

• Quantum computing, which uses properties of quantum mechanics to solve problems in optimization and number theory.
• Neuromorphic computing, which mimics how the human brain operates to inspire new architectures.
• Approximate and stochastic computing, two complementary approaches based on the observation that computers often calculate results that are more accurate than necessary. Power could be saved by computing only as much as needed to get an acceptable answer.
• Novel approaches to microarchitecture (while maintaining compatibility with the existing software base), which might include cryogenic supercomputing as well as adiabatic and reversible computing.
• New transistor technology, which would be less disruptive to the computing stack but also very difficult to create. Two possibilities are tunneling field-effect transistors (FETs) and carbon-nanotube FETs.

Conte’s research team has been modeling some of these new approaches. “Georgia Tech has been a national leader in defining what happens in post-exascale, and now we’re executing on it,” Conte said. “Going beyond exascale isn’t an incremental thing, it’s a fundamental shift.”

<breaking the bottleneck>

Similar analytical challenges exist in the life sciences. Over the past decade, the throughput of DNA sequencing (the rate at which DNA can be read) has increased by a factor of more than 1 million while costs have decreased by a factor of 1 million. “The raw data contains valuable things, yet we don’t know what they are until we analyze it,” said Srinivas Aluru, a professor in Georgia Tech’s School of Computational Science and Engineering (CSE). “The data comes really fast, so you need the ability to analyze it quickly — otherwise it just sits in storage. Analysis is the bottleneck.” With funding from the National Science Foundation (NSF), Aluru’s research group is developing techniques that can leverage high-performance computing to analyze data as rapidly as it is generated. For example, Patrick Flick, one of Aluru’s graduate students, created parallel algorithms for distributed-memory index construction that can index the entire human genome in less than 10 seconds, winning Flick a prestigious “Best Student Paper” award at the 2015 Supercomputing Conference. Another milestone: The researchers have created a method to predict networks at the entire genome scale — a feat not done before — enabling them to explain how different genes work together in a biological process. This can be used for a wide variety of life-science applications, from determining causes of cancer to advances in plant biology, Aluru said. “For example, we’re working on a biological pathway responsible for nutritional content in plants to see if we can manipulate and improve it.” Emerging data science tools and techniques are dramatically changing the scale of problems that researchers can tackle, Aluru observed. “Instead of just looking at a single pathway or a few genes, we can go after whole genome scale.”

<big chemistry>

“There’s no way to pursue any challenging applications of quantum
chemistry unless you have access to high-performance computers,” said David Sherrill, a professor in Georgia Tech’s School of Chemistry and Biochemistry who focuses on intermolecular interactions. He recalls his days as a grad student, when quantum chemistry calculations were extremely hard to do and papers were based on a handful of calculations and data points. “Today it’s a different story,” Sherrill said. “With more sophisticated algorithms, better hardware, and larger clusters of computers, a typical paper is based on hundreds or thousands of quantum calculations, which enables our multiscale models to be more accurate and appropriate.”

Nagi Gebraeel is analyzing large volumes of sensor data from electric power generation equipment to find information that could improve reliability and reduce maintenance costs. He is an associate professor in Georgia Tech’s School of Industrial and Systems Engineering.

Sherrill’s research team is not just a user of high-performance computing; it is also designing the next generation of quantum chemistry software. “In the last five years, we’ve seen a lot of innovation on the hardware side, such as graphics processing units,” he explained. “That’s forcing us to be smarter about writing software so it easily adapts to different kinds of hardware — something we didn’t worry about 10 years ago.” In light of this change, Sherrill is starting to send his postdocs to computing conferences in addition to traditional chemistry convocations. Sherrill’s researchers are part of a multiuniversity team designing Psi4, an open-source suite of quantum chemistry software for high-accuracy simulations of molecular properties. The software has a wide range of applications, from understanding
how drugs bind to proteins to how crystals pack into a solid. In addition, Sherrill is one of six principal investigators developing new paradigms for software interoperability, a $3.6 million project funded by NSF. The goal is to create “reusable” software libraries where new features can be used by many different computational chemistry programs. “In the past, different codes have competed with each other, so if one person added a new feature, then everyone had to,” Sherrill explained. “Yet it’s too hard to operate this way. By creating an interoperable library, you’d have much more impact and avoid reinventing the wheel. One small team could add a feature that quickly gets into a variety of different programs.”
<materials ecosystem>
Data science is also accelerating the development — and deployment — of new materials, which is key to solving challenges in everything from energy and climate change to health care and security. “Almost every technology is dependent on new materials,” pointed out Dave McDowell, a Regents Professor who holds joint appointments in Georgia Tech’s School of Mechanical Engineering and School of Materials Science and Engineering (MSE). Due to an emphasis on empirical methods, it has historically taken an average of 15 to 20 years after discovering a material with interesting properties to commercialize it. “Yet thanks to accelerated modeling and simulation protocols, we’re able to assess candidate materials more rapidly for applications,” McDowell said. He points to fatigue, a major problem in metallic aerospace and automotive structures. New computational techniques can now take a material down to the micron scale, represent its structure digitally, and reproduce the kinds of scatter, variability, and properties seen in laboratory experiments. Even better, this can be done in a couple of weeks versus years in a lab. Georgia Tech’s materials engineers are unique in their focus on hierarchical materials informatics — a special branch of data science that extracts and communicates knowledge at multiple scales as opposed to only considering a material’s chemical composition. “This is important because you can have the same chemical composition, but completely different properties at different scales due to how atoms are arranged,” explained Surya Kalidindi, a professor with joint appointments in CSE and MSE, who has written a new textbook on the subject. “Respecting the details at different scales allows you to understand what arrangements are causing the material to respond in a particular way. By changing the arrangement, you can alter the material, making it stronger or weaker or tweaking electronic, magnetic, and thermal properties.”

David Bader’s research group is pioneering massive-scale graph analytics — technology that can be employed to help prevent disease in human populations, thwart cyberattacks, and bolster the electric power grid. Bader is chair of Georgia Tech’s School of Computational Science and Engineering.

Running analytics fast and ingesting a streaming firehose of edges simultaneously is like having new engines installed on your plane — while you’re flying.

Beyond solving fundamental problems, Georgia Tech is also helping manufacturers make better decisions. Its Institute for Materials (IMat), launched in 2013, is creating a collaborative materials innovation ecosystem among researchers, industry, national labs, and other universities to link basic research with product development, manufacturing scale-up, and process selection. Typically, companies haven’t recorded information related to their material recipes or processing, explained McDowell, IMat’s founding and executive director. “Instead, the information resided with ‘seasoned experts,’ causing it to get lost or reinvented. By helping companies digitally track workflows and incorporate
modern data science tools, we can enable them to determine, for example, if replacing a material in their production line has enough value to offset economic loss due to downtime.”
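Kalidindi’s point, that arrangement matters beyond composition, can be made concrete with the simplest spatial statistic. In this one-dimensional sketch (illustrative only), two “microstructures” have identical 50/50 phase fractions but different two-point correlations.

```python
# 1-D sketch of why arrangement matters beyond composition: two
# "microstructures" with the same phase fraction (half of each phase)
# have different two-point autocorrelations. Purely illustrative.

def two_point(phase, r):
    """Probability that two sites separated by r are both phase 1
    (periodic boundary conditions)."""
    n = len(phase)
    return sum(phase[i] * phase[(i + r) % n] for i in range(n)) / n

clustered   = [1, 1, 1, 1, 0, 0, 0, 0]   # same 50/50 composition...
alternating = [1, 0, 1, 0, 1, 0, 1, 0]   # ...different arrangement

for r in (1, 2):
    print(r, two_point(clustered, r), two_point(alternating, r))
```

Both sequences contain exactly four sites of each phase, yet their correlation functions differ at every separation, which is exactly the kind of information a composition-only description throws away.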
<expanding data footprint>
In the past decade, Georgia Tech has been rapidly establishing itself as a leader in data science on a number of fronts. In 2005, it established the School of Computational Science and Engineering (CSE) to educate students in advanced computing and data analysis combined with other disciplines. Since 2012, Georgia Tech and its collaborators have won more than $15 million in federal awards from the Obama administration’s National Big Data Research and Development Initiative. And last November, Georgia Tech was named one of four NSF Big Data Regional Innovation Hubs in partnership with the University of North Carolina. Led by Aluru at Georgia Tech, the South Big Data Hub will build public-private partnerships across 16 states and the District of Columbia. “The goal is to leverage data science and foster community efforts to tackle regional, national, and societal challenges,” Aluru said. “We’ll begin by focusing on five areas: health care, coastal hazards, industrial big data, materials and manufacturing, and habitat planning.” “The Hub firmly places Georgia Tech in the national spotlight for big data analysis,” said David Bader, chair of CSE. “We have become the go-to place for data science … a place where problems are solved in much broader context than traditional top-tier research universities.” Case in point: Bader’s research group has been pioneering massive-scale graph analytics — technology that can be employed to help prevent disease in human populations, thwart cyberattacks, and bolster the electric power grid, to name a few applications. Graph analytics uncover relationships and extract insights from huge volumes of data, and the CSE researchers have designed parallel algorithms that run extremely fast (while keeping up with edge-arrival rates of 3 million per second), even when graphs have billions and trillions of vertices.
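The incremental idea behind such streaming analytics, updating a quantity per edge arrival rather than recomputing it from scratch, can be sketched with a toy dynamic graph. This is a minimal illustration, not the researchers’ software, which handles billions of vertices in parallel.

```python
# Toy streaming graph: degrees and triangle counts are updated
# incrementally as each edge arrives, rather than recomputed from
# scratch. A minimal illustration, not the researchers' data structure.

from collections import defaultdict

class StreamingGraph:
    def __init__(self):
        self.adj = defaultdict(set)
        self.triangles = 0

    def insert_edge(self, u, v):
        if u == v or v in self.adj[u]:
            return                      # ignore self-loops, duplicates
        # each common neighbor of u and v closes one new triangle
        self.triangles += len(self.adj[u] & self.adj[v])
        self.adj[u].add(v)
        self.adj[v].add(u)

    def degree(self, u):
        return len(self.adj[u])

g = StreamingGraph()
for u, v in [(1, 2), (2, 3), (1, 3), (3, 4)]:   # the edge "stream"
    g.insert_edge(u, v)
print(g.degree(3), g.triangles)   # 3 1
```

The per-edge update touches only the two endpoints’ neighborhoods, which is what makes keeping analytics current under a firehose of arrivals feasible at all.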
With these cutting-edge algorithms, the researchers have developed a collection of open-source software, known as STINGER (Spatio-Temporal Interaction Networks and Graphs, Extensible Representation), which can capture analytics on streaming graphs. “In the past, analysts needed to know the size and range of entities before creating a graph,” Bader explained. “Yet STINGER can track a dynamic graph even when future relationships are not known. Running analytics fast and ingesting a streaming firehose of edges simultaneously is like having new engines installed on your plane — while you’re flying.” Increasing its investment in data science and interdisciplinary research, Georgia Tech will be the anchor tenant in a new 750,000-square-foot, mixed-use property in Midtown Atlanta. Developed by Portman Holdings, the project has been christened “Coda” and will include a 21-story building with 620,000 square feet of office space and 40,000 square feet for retail and restaurants. In addition, an 80,000-square-foot data center will provide advanced cyber infrastructure and national data repositories. Georgia Tech will occupy about half of the office space, bringing faculty from the data sciences together with a cross-section of basic and applied researchers. The other half of the building will be devoted to industry. “Data science brings together multiple areas of expertise to solve big, crucial problems — and the building is meant to reflect that,” said Isbell, explaining that Coda will be organized around areas of interest rather than departments or specific disciplines.
Indeed, CSE will be the only academic department to be entirely relocated to Coda. Many faculty members across campus will relocate to Coda permanently; others will reside there temporarily, depending on the length of projects, and then return to their home unit. “The building will be a living laboratory and provide the largest gathering of data science experts in one place of any university in the country,” said McLaughlin, who served on a committee with Randall and Isbell to determine faculty needs and maximize benefits of the new building. “Midtown is going to be transformed by this building,” Randall said. “Coda will be an outward-looking face for declaring ourselves a mecca for data science.” Skolnick looks forward to the unique collaborations Coda will make possible. “Serendipity is very important in science, and random interactions are the most exciting ones,” he observed. “Most of the important science and engineering discoveries are done at the interface of disciplines. Having people with different abilities and expertise in one place will accelerate that process.”
<new institute>
The new building will also be home for Georgia Tech’s Institute for Data Engineering and Science (IDEAS), a new interdisciplinary research institute led by Randall and Aluru. IDEAS has a two-pronged mission, Aluru explained: improving the foundations of data science, and advancing different fields that use data-science tools, such as health care, energy, materials development, finance, and business analytics. The institute will:

• Enable one-stop shopping for industry. IDEAS will make it easier to connect companies with students and faculty to support short- or long-term collaborative projects.
• Generate excitement among students. Data science will ultimately impact every discipline at Georgia Tech, so even students who don’t specialize in it will need to be educated about it.
• Increase communication and collaboration among faculty. Building an academic community around data science will help with everything from winning funding to sharing equipment and expertise.

Breaking down silos is critical, Randall said. “Take algorithms. Often scientists come to mathematicians with very difficult, specific questions. We might look at a question and say it can’t be solved efficiently — end of story. However, if we know where the question is coming from, the bigger context, we might realize the researcher doesn’t need an exact solution. An approximation would be just as good for their purposes, and could be done efficiently. Yet if you just hand off these encapsulated questions, you miss the crux of where the magic can happen.” For example, breakthroughs in theoretical computer science have been made by looking at problems as a physicist would, Randall added. “And along the way, we’ve introduced techniques from computing perspectives that have solved long-standing physics problems. This is happening more and more. Academics can no longer operate the way they used to, being very domain specific.
You need the instincts, intuitions, and techniques that come from multiple fields to really push the boundaries.”
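Randall’s approximation point can be made concrete with set cover, a textbook problem where the exact answer takes exponential time but a greedy heuristic runs fast and is provably within a logarithmic factor of optimal. The instance below is invented for illustration.

```python
# Greedy set cover: a fast approximation in the spirit of Randall's
# point that an efficiently computed approximate answer is often as
# useful as an exact one. Textbook algorithm, invented instance.

def greedy_set_cover(universe, subsets):
    """Repeatedly pick the subset covering the most uncovered elements."""
    uncovered = set(universe)
    cover = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & set(s)))
        if not uncovered & set(best):
            raise ValueError("instance is not coverable")
        cover.append(best)
        uncovered -= set(best)
    return cover

universe = range(1, 8)
subsets = [[1, 2, 3, 4], [4, 5, 6], [6, 7], [1, 5], [2, 6, 7]]
cover = greedy_set_cover(universe, subsets)
print(len(cover))   # 3 subsets suffice to cover all seven elements
```

Each greedy step is a cheap maximization, whereas the exact optimum would require searching over all subsets of the collection.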
T.J. Becker is a freelance writer based in Michigan. She writes about business and technology issues.
Advanced computer technologies speed development of real-world materials

STORY BY RICK ROBINSON
ILLUSTRATION BY JUSTIN METZ
PHOTOS BY ROB FELT
A specialist in quantum chemistry, Professor David Sherrill is streamlining the materials analysis process by developing improved methods for studying atomic-scale chemistry. He is shown in the School of Chemistry and Biochemistry’s computer cluster.
Materials — natural substances altered by humans to meet specific needs — are critical to technology. Today’s advanced materials make possible rocket engines, smartphones, medical machines, anti-pollution devices, and much more. Traditionally, materials have been developed slowly, by trial and error. Today, 21st century computational techniques, in tandem with cutting-edge experimentation capabilities, allow materials scientists and engineers to work at the atomic scale to design novel materials with increasing speed and effectiveness. The result is that today’s cyber-enabled materials — so-called because of the computer’s pivotal role in their creation — are more likely to move from laboratory to industry in a few years rather than a decade or two. This increased efficiency is helping fulfill the goal of the Materials Genome Initiative, a 2011 White House program aimed at bolstering the economy by shortening development cycles. “Historically, it has taken 15 to 20 years to implement new materials into high-value products, which is simply far too long for industries to compete in the digital age, where design of new products occurs within months or a few years,” said Dave McDowell, executive director of Georgia Tech’s Institute for Materials (IMat). “Our goal is to dramatically accelerate that process.” Multiple teams of Georgia Tech researchers are utilizing cyber techniques to support accelerated materials design. Here are a few of the innovative efforts underway by research teams that include engineers, chemists, physicists, computer scientists, and others.
ADVANCING MOLECULAR MODELING

A major goal for materials science and engineering involves more accurate understanding of material structures and properties and how they influence one another. Such knowledge makes it easier to predict which real-world properties a theoretical material would possess when realized. Currently, researchers use computers to delve into materials structures using two approaches. The first relies on experimental data, derived from examining actual materials using microscopy, spectroscopy, X-rays, and other techniques. This data is plugged into computer models to gain insight on materials behavior. The second approach involves models based on “first principles” methods. Such models are developed by pure computation, utilizing established scientific theory without reference to experimental data. Such “ab initio” or “physical” models are widely regarded as useful, but not necessarily fully accurate, when approximations are made to speed up the computation process. Materials researchers strive to balance and integrate the experiment-based methodology with the theoretical approach. Investigators continually compare one type of result to the other in the drive to obtain accurate insights into materials structures. Professor David Sherrill is a specialist in quantum chemistry in Georgia Tech’s School of Chemistry and Biochemistry. He is working to streamline the materials analysis process by developing improved methods for studying atomic-scale chemistry. “The dream is that if you had truly accurate and predictive models, you would need much less on the experimental side,” Sherrill said. “That would save both time and money.” Sherrill and his research team have made progress toward more definitive physical models. They’ve demonstrated that cutting-edge computing techniques can produce highly accurate physical models of the interior forces at work in a molecule.
With funding from the National Science Foundation, Sherrill’s team studied crystals of benzene, a fundamental organic molecule. In a proof-of-concept effort, they developed advanced analytic software that supports parallel processing on supercomputers — making possible the high level of computational power required for modeling molecules. The team’s efforts culminated in a benzene model that was deemed singularly accurate. That success showed it was indeed possible to compute nearly exact crystal energies — also called lattice energies — for organic molecules.
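The lattice energy the team computed has a simple structure worth seeing: to a first approximation, it is half the sum of a central molecule’s interaction energies with each shell of crystal neighbors, with many-body corrections refining the estimate. A minimal sketch in Python — the dimer energies and neighbor counts below are invented for illustration, not the team’s benzene values:

```python
# Toy estimate of a molecular crystal's lattice energy from pairwise
# dimer interaction energies (kJ/mol). The numbers are invented; in
# real work they come from quantum chemistry calculations.

def lattice_energy(dimer_shells):
    """Sum pair interactions of one molecule with its neighbor shells.

    The factor 1/2 avoids double-counting: each pair interaction is
    shared between two molecules in the crystal.
    """
    return 0.5 * sum(energy * count for energy, count in dimer_shells)

# (interaction energy per dimer, number of symmetry-equivalent neighbors)
shells = [
    (-11.0, 4),   # nearest neighbors, strongly attractive
    (-4.5, 4),    # next-nearest neighbors
    (-1.2, 8),    # distant shell, weakly attractive
]

E_latt = lattice_energy(shells)
print(f"Estimated lattice energy: {E_latt:.1f} kJ/mol")
```

Real calculations add corrections for three-body terms and beyond, which is part of why the benzene result required supercomputer-scale parallelism.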
Professor Richard Neu is developing new materials that can withstand the extreme temperatures in jet engines and gas turbines. Key to his work is understanding how atoms diffuse through materials under varying conditions. He is shown in the intake of a jet engine at the Delta Flight Museum in Atlanta.
MODELING SUPERALLOY PERFORMANCE Richard W. Neu, a professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering, develops materials that can withstand the extreme temperatures in jet aircraft engines and energy-generating gas turbines. With funding from the U.S. Department of Energy and multinational corporations, he investigates fatigue and fracture in metal alloy engine parts that are constantly exposed to heat soaring past 1,400 degrees Celsius, as well as to continuous cycles of heating and cooling. “We want engine parts to withstand ever-higher temperatures, so we must either improve existing materials or devise new ones,” Neu said. “The most effective way to do that is to understand the complex interactions of materials microstructures at the grain level, so we can vary chemical composition and get improved performance.” Each grain is a set of atoms arranged in a crystal structure with the same orientation. The ways in which different grains fit together play a major role in determining an alloy’s properties, including strength and ductility. Key to Neu’s materials development work is understanding how atoms diffuse through materials under various conditions — a
daunting assignment when more than 10 different elements are mixed together in a single alloy. To perform such studies, he turns to advanced cyber techniques that help him understand materials processes at atomic dimensions. Vast computer databases, developed by materials scientists to describe the thermodynamics and mobility of the atoms in simple binary alloys, now offer critical insights into more complex alloy structures, Neu said. These data collections provide information about what’s taking place at both the microscale and mesoscale during long-term, high-temperature exposures. Such databases consist of information developed either experimentally or through calculations based on first principles — basic physical theory. Using models built with this data, Neu and his team can understand how the strengthening phases within these grains change with exposure and can predict their impact on material properties. Neu is using such modeling capabilities to improve nickel-based superalloys — high-performance metals used in the hottest parts of turbine engines. Even small increases in the temperature tolerance of these alloys can result in important performance gains. He’s also investigating more novel materials including refractory metals such as molybdenum. Although molybdenum alone breaks down at high temperatures through oxidation, when combined with silicon and boron it can produce an alloy that may offer heat-tolerance increases of 100 degrees Celsius or more. “Images of the microstructure and other information derived from our models are extremely important in this work,” Neu said. “This data lets us see clearly the link between a microstructure and the areas within it where fractures could occur. We couldn’t do this before we had effective databases and modeling tools.”
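The atomic mobility data in such databases typically enters models through the Arrhenius relation, D = D0·exp(−Q/RT), which captures how sharply diffusion accelerates with temperature. A brief sketch with generic, illustrative values — the D0 and Q below are not taken from any real alloy database:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def diffusivity(d0, q, temp_k):
    """Arrhenius diffusion coefficient: D = D0 * exp(-Q / (R*T))."""
    return d0 * math.exp(-q / (R * temp_k))

# Illustrative values for a substitutional solute in a metal lattice.
D0 = 1.0e-4   # pre-exponential factor, m^2/s (assumed)
Q = 280e3     # activation energy, J/mol (assumed)

# Diffusion speeds up by orders of magnitude over a turbine's
# operating temperature range -- the effect Neu's models must capture.
for T in (1300.0, 1500.0, 1700.0):  # kelvin
    print(f"T = {T:.0f} K  ->  D = {diffusivity(D0, Q, T):.2e} m^2/s")
```

In multicomponent alloys, each element pair carries its own mobility parameters, which is what makes the assembled databases so valuable.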
“The work demonstrates that first-principles methods can be made accurate enough that you can rely on the energies calculated,” Sherrill said. “Based on that data, you can then go on to derive accurate material geometries and properties, which is what we really need to know.” The Sherrill team is part of a multi-institution group developing Psi4, an open-source suite of quantum chemistry programs that exploits first-principles methods to simulate molecular properties. Sherrill’s team used a version of Psi4 in its analysis of benzene. Sherrill, a member of Georgia Tech’s Center for Organic Photonics and Electronics (COPE), believes the modeling capabilities demonstrated in the benzene project will lead to better predictive techniques for other organic molecules. “I’m very excited about this advance in quantum chemistry,” he said. “I believe in a few years we’ll be doing highly accurate physics-based calculations of molecular energetics routinely.”
Accelerating Materials Development
Surya Kalidindi is a professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering.
Historically, it has taken 15 to 20 years to implement new materials, which is simply too long in the digital age, where new product design often occurs within a few years — or sometimes months. The U.S. Materials Genome Initiative (MGI) emphasizes the need to accelerate the discovery and development of materials to maintain industry competitiveness. The MGI aims to more closely couple materials development with advanced manufacturing processes to facilitate next-generation consumer products such as lightweight, fuel-efficient vehicles and energy-dense batteries with enhanced performance at lower cost. Along with the MGI, the industry-led Integrated Computational Materials Engineering (ICME) initiative represents the shared vision of universities and government to introduce new and improved materials into the marketplace by leveraging advances in computational materials science and data science and analytics. The Institute for Materials (IMat) at Georgia Tech, led by Executive Director Dave McDowell, is helping to define the elements necessary to advance the MGI and ICME. In our “innovation ecosystem,” emphasis is placed on connecting computation, experiments, and data science via high-throughput linkages to rapidly identify material solutions that can be incorporated into products. Universities have a core responsibility to prepare the future workforce to operate effectively within this ecosystem. The confluence of high-performance computing and modern data science with historical methods and tools of materials R&D necessitates an integrated systems approach rather than reliance on isolated individual experts, along with cross-disciplinary curricula and degree programs. IMat’s activities to support emerging concepts and methods in materials data science and informatics have focused on:
- Tools for digital representation of material structures.
- Data analytics to explore correlations of material structure with process parameters and/or the properties controlling material performance characteristics.
- E-collaboration protocols to track workflows and communications in the process of developing materials.
Dave McDowell is a Regents Professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering and executive director of the Institute for Materials.
In complex materials R&D, various activities are conducted by different experts, often in different places. Georgia Tech has emerged as a leader in the field of materials data science and informatics by serving as the “glue” that connects all elements of the ecosystem. Since IMat’s founding in 2012, several novel and strategic building blocks have been set in place. First is Georgia Tech’s FLAMEL (From Learning, Analytics and Materials to Entrepreneurship and Leadership), a joint initiative between IMat and the School of Computational Science and Engineering (CSE). Funded through an NSF-IGERT, FLAMEL addresses some key educational gaps related to materials data science and informatics. Teams of students from computer science and either engineering or science are paired together, taking courses that cover fundamentals of materials science, engineering, manufacturing, and computer science, plus two courses aimed at synthesizing and integrating these different disciplines. IMat has also supported the creation and launch of an e-collaboration platform called MATIN. Initially designed to promote peer-learning among students — with focus on designing and
deploying features that allow tracking and curation of data, codes, and discussions — MATIN is being further developed to host open source codes for digital representation of materials; spatial statistics and structure-property correlations; inverse modeling strategies; and other modeling and simulation toolkits developed by Georgia Tech faculty in sponsored research projects. Starting in fall 2014, a new graduate course, Introduction to Materials Informatics, developed by Professor Surya Kalidindi and offered jointly between CSE and ME, has allowed students to employ data science approaches to cross-disciplinary research
problems ranging from packed soils and polymer structures to fiber-reinforced composites to tungsten nanowires and more. This has demonstrated the feasibility of designing and deploying automated and accelerated data science protocols on vastly diverse material datasets to extract new insights. Another component of IMat’s strategy is to spur the incorporation of data science tools in core research areas through seed funding. Examples from the past year include Polymer Composites with Engineered Tensile Properties and Accelerating the Discovery and Development of Nanoporous 2D Materials (N2DMs) and Membranes for Advanced Separations. IMat will continue to fund targeted collaborations through this ongoing initiative. IMat is also preparing the workforce for the coming age of digital materials data by helping to develop two massive open online courses (MOOCs). Starting this year, High Throughput Computation and Experiments and Materials Data Science and Informatics will introduce a global audience to Georgia Tech’s innovation ecosystem for accelerating new materials. Georgia Tech’s support for MGI objectives in the larger academic community includes co-founding the Materials Accelerator Network in 2014 with the University of Wisconsin-Madison and the University of Michigan, and the NSF-sponsored South Big Data Hub, awarded to a team led by Georgia Tech and the University of North Carolina at Chapel Hill. 
Finally, a new data science institute at Georgia Tech called IDEAS for Materials Design, Development and Deployment (IDEAS:MD3), supported with internal funding and set to launch in August 2016, will provide a collaborative, pre-competitive materials data science consortium, embedding member-industry R&D personnel with campus researchers, as well as addressing members’ specific materials needs at higher levels of contract engagement. — Dave McDowell and Surya Kalidindi
Professors Martha Grover and Elsa Reichmanis are using sequential experimental design techniques to develop new ways of printing organic electronics. Both are professors in the School of Chemical & Biomolecular Engineering.
This image shows the alignment of polymer nanofibers that enable the fabrication of high-performance flexible electronic devices. In this processed microscopy image, fibers’ orientations are color coded to help researchers analyze their structure and alignment. The research team includes Professors Martha Grover and Elsa Reichmanis, and graduate research assistants Michael McBride and Nils Persson, all from the School of Chemical and Biomolecular Engineering.

UTILIZING INVERSE DESIGN When a material is manufactured, the necessary processing can change it at the molecular level. So an alloy or other material that once appeared well suited to a given use can be internally flawed by the time it’s a finished product. Professor Hamid Garmestani of Georgia Tech’s School of Materials Science and Engineering is investigating this endemic problem. Working with Professor Steven Liang of the School of Mechanical Engineering and researchers from Novelis Inc., Garmestani is using an approach called inverse process design to pick out better candidate materials. The effort is funded by the Boeing Company. Garmestani’s methodology starts by examining a material’s actual microstructure at the end of the manufacturing cycle. It then employs a reverse engineering approach to see what changes would enable the material to better withstand manufacturing stresses. The Garmestani team analyzes a finished part from the standpoint of its properties. If its post-processing microstructure no longer has the right stuff to perform required tasks, the researchers think in terms of a better starting material. “Our approach is the inverse of what’s done conventionally, where you look for a material with the desired properties and work forward from there,” Garmestani said. “We start from the final microstructure and work back to the initial microstructure. 
If we know what an optimal final microstructure would look like, then we can figure out what the initial microstructure would have to be in order to get there.” To achieve this, Garmestani develops microscale computer representations of the final material microstructure and the initial microstructure. He also considers the process parameters that a material undergoes during manufacturing, plugging in data on forging, machining, coating, and other processes that can affect a material internally. By representing materials at the level of grains — interlocking crystals of atoms — Garmestani can compute the effect of specific processing steps on the microstructure. The distribution of grains — and their interrelationship with myriad tiny defects and other features — is key to determining a material’s properties. Utilizing a mathematical framework, the researchers digitize a material at the micron scale and even at the nanoscale to trace the minute effects of each manufacturing phase. That data helps track the role of every step in altering the distribution of internal features. “As the microstructure changes, we can predict what the properties would be depending on how the constituents and the statistics of the distribution changes,” Garmestani said. “In this way we can optimize the process or the microstructure or both, and find that unique initial microstructure which is best suited to the needs of the designer or the manufacturer.” DEVISING EXPERIMENTATION STRATEGIES Martha Grover, a professor in Georgia Tech’s School of Chemical and Biomolecular Engineering (ChBE), is studying how to structure real-world experiments in ways that best support materials
development. Grover and her team are developing specific experimentation strategies that optimize the relationship between the experimentalists who gather data and the theoretical modelers who process it computationally. “What’s needed is a holistic approach that uses all the information available to help materials development proceed smoothly,” Grover said. “Considering the objectives of both experimentalists and modelers helps everybody.” Grover and her group are using an approach known as sequential experimental design to tackle this issue. Working with Professor Jye-Chyi Lu of Georgia Tech’s H. Milton Stewart School of Industrial and Systems Engineering (ISyE), Grover has developed customized statistical methods that allow collaborating teams to judge how well a given model fits the available experimental data and at the same time create models that can help design subsequent experiments. For instance, in one project, Grover worked with graduate student Paul Wissmann to optimize the surface roughness of yttrium oxide thin films. The work required costly experiments aimed at learning the effects of various process temperatures and material flow rates. After gathering initial experimental data, the researchers built a series of models using an eclectic approach. On the one hand, they created empirical models of the thin-film deposition process using data from the experiments. But they also developed another set of models using algorithms based solely on first principles — science-based physical theory requiring no real-world data. Then, using a computational statistics method, they combined the two modeling approaches. The result was a hybrid model that offered new insights and also let the researchers limit the number of additional experiments needed. “Tailored statistical methods provide us with a systematic type of decision-making,” Grover said. 
“We can use statistics to tell us, given the data that we have, which model is most likely to be true.”
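A minimal way to picture the hybrid-model idea is inverse-variance weighting of the empirical and first-principles predictions, with the next experiment chosen where the two models disagree most. The sketch below is hypothetical — the temperatures, predicted roughness values, and variances are invented, not the yttrium oxide data:

```python
def hybrid(pred_emp, var_emp, pred_phys, var_phys):
    """Inverse-variance weighted combination of two model predictions:
    the more certain model (smaller variance) gets the larger weight."""
    w_emp, w_phys = 1.0 / var_emp, 1.0 / var_phys
    return (w_emp * pred_emp + w_phys * pred_phys) / (w_emp + w_phys)

# Candidate process temperatures (C) mapped to
# (empirical prediction, its variance, physics prediction, its variance),
# all in nm of surface roughness -- fabricated numbers.
candidates = {
    600: (12.0, 4.0, 10.0, 9.0),
    650: (9.0, 4.0, 14.0, 9.0),
    700: (7.5, 4.0, 8.0, 9.0),
}

for temp, (pe, ve, pp, vp) in candidates.items():
    print(f"{temp} C -> combined roughness {hybrid(pe, ve, pp, vp):.2f} nm")

# Sequential-design heuristic: run the next experiment where the two
# models disagree most, since that's where data is most informative.
next_temp = max(candidates,
                key=lambda t: abs(candidates[t][0] - candidates[t][2]))
print(f"Next experiment at {next_temp} C")
```

The disagreement heuristic is one of several selection criteria used in sequential design; the statistical machinery in the actual project was considerably richer.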
Le Song, an assistant professor in Georgia Tech’s School of Computational Science and Engineering, is using machine-learning techniques to investigate organic materials that could replace inorganic materials in solar cell designs. He is shown with a photovoltaic array at Georgia Tech’s Carbon Neutral Energy Solutions Laboratory.
Grover is currently working with ChBE Professor Elsa Reichmanis, an experimentalist studying organic polymers. Their current project involves finding ways for printing organic electronics for potential use in roll-to-roll manufacturing. This technique could provide large numbers of inexpensive flexible polymer devices for applications from food safety and health sensors to sheets of solar cells. The collaborators are using sequential experimental design approaches as they investigate organic polymer fiber structures at both nanoscales and microscales using atomic force microscopy. “Tight coupling of modeling and experimentation is helping us to develop optimal fiber size and arrangement, and to examine the hurdles involved in scaling up production processes,” Grover said.
PINPOINTING MATERIALS CANDIDATES Finding an optimal material from thousands of candidates is a challenging job. It requires a blend of human expertise and computational power to make it happen. David Sholl, who holds the Michael E. Tennenbaum Family Chair in the School of Chemical and Biomolecular Engineering (ChBE), works two sides of the materials selection challenge. He and his team use predictive computer models to find candidate materials for specific applications, and at the same time they focus on continually improving the computational techniques they’re using. Each project requires the team to painstakingly develop a large database of possible materials for a target application, drawing on existing materials information. Then the researchers use computer models to validate this collected data against both experimental findings and first-principles physical analysis. The results are numerical calculations that provide information on materials structures down to the molecular level. 
“It isn’t a case of giving the computer a list of materials and having it do everything,” said Sholl, who is also a Georgia Research Alliance Eminent Scholar. “We have to put together a list of thousands of potential materials, and then we examine and verify the existing data on those materials. Only then can we do a staged series of calculations to look for materials with the key properties needed for a particular application.” Sholl describes this calculation process as a “nested approach.” His team screens large numbers of materials using a simplified set of approximations that are amenable to high-throughput processing on supercomputers. What’s left are candidates of particular interest, which are then evaluated more closely. He and his team are currently working on several materials selection projects sponsored by the Department of Energy. These include finding the best materials to eliminate contaminants from natural gas; developing materials for capturing carbon dioxide from the atmosphere and industrial smokestacks; and making high-performance membranes to separate chemicals. Sholl believes that in the next few years emerging data science methodologies will help his team streamline what is in many ways a big data challenge. More efficient methods could mean less time spent preparing and validating materials data, and more time spent evaluating the likely candidates. “Here at Georgia Tech I’m closely involved with people developing various kinds of materials, who are keenly focused on scaling them and integrating them into actual technologies,” he said. “That relationship is always pushing me to think about how to use my team’s calculations to help develop real-world applications, rather than just producing lots of information.”
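The “nested approach” Sholl describes can be caricatured as a two-stage funnel: score everything with a cheap surrogate, keep a small shortlist, then rerank only the shortlist with an expensive model. In this toy sketch both scoring functions are stand-ins (deterministic pseudo-random numbers), not real property calculations:

```python
import random

candidates = [f"material_{i:04d}" for i in range(5000)]

def cheap_score(name):
    """Fast surrogate screen, standing in for a simplified
    approximation amenable to high-throughput processing."""
    random.seed(name)          # deterministic toy score per material
    return random.random()

def expensive_score(name):
    """High-fidelity model, standing in for a full calculation:
    close to the cheap score, but with its own correction term."""
    base = cheap_score(name)
    random.seed(name + ":hifi")   # independent toy correction
    return base + 0.1 * (random.random() - 0.5)

# Stage 1: high-throughput screen keeps the top 1% by cheap score.
shortlist = sorted(candidates, key=cheap_score, reverse=True)[:50]

# Stage 2: costly reranking of the shortlist only.
best = max(shortlist, key=expensive_score)
print(f"Screened {len(candidates)} -> shortlist of {len(shortlist)} -> best: {best}")
```

The funnel works when the cheap score correlates well enough with the expensive one that true winners survive stage 1 — which is why validating the approximations, as Sholl emphasizes, matters as much as running them.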
SPEEDING BIG DATA ANALYSIS The sheer volume of available data can make it challenging to
find key information. Even supercomputers can take months to mine massive datasets for useful answers. Le Song, an assistant professor in Georgia Tech’s School of Computational Science and Engineering, is tackling big data challenges related to materials development, with support from the National Science Foundation. Advanced processing techniques, he explained, can speed up the screening of thousands of theoretical materials created by computer simulation, decreasing the time between discovery and real-world use. As elsewhere in materials science, Song deals with the complex interplay of computer simulations based on first-principles physics theory versus the computational analysis of data from laboratory experiments. Part of his work involves using machine-learning approaches that delve deep into first-principles models to identify materials candidates. Machine learning, sometimes known as data analytics or predictive analytics, uses advanced algorithms to automate the evaluation of patterns in data. For example, in one project, Song is investigating organic materials that could replace inorganic materials in solar cell designs. He rejected creating complex original models of each hypothetical material using quantum mechanical scientific theories of organic molecules, a lengthy and expensive computational undertaking. Instead, he’s using molecular datasets that are already available, along with custom machine-learning techniques, to create predictive models in just hours of computer time. Despite this simplified approach, these models can accurately link a candidate material’s structure to its potential properties. Song makes use of the parallel computing capabilities of graphics processing units to help reduce the time needed for computation. These relatively inexpensive high-speed devices are suited to demanding computational tasks in fields that include neural networks, modeling, and database operations. 
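At its simplest, the machine-learning shortcut Song uses is regression from a molecular descriptor to a property, fitted on existing data and then applied to unseen candidates. A tiny ordinary-least-squares sketch — the descriptor, the band-gap numbers, and the linear trend are all fabricated for illustration:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ~ a*x + b with one descriptor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Fabricated training set: descriptor = conjugation length of a molecule,
# property = band gap in eV (a rough inverse trend is typical).
conj_len = [2, 4, 6, 8, 10]
band_gap = [3.9, 3.1, 2.6, 2.2, 2.0]

a, b = fit_linear(conj_len, band_gap)
predicted = a * 12 + b  # screen a new, unsynthesized candidate
print(f"slope = {a:.3f} eV/unit, predicted gap at length 12: {predicted:.2f} eV")
```

Real models use many descriptors and nonlinear learners, but the economics are the same: fitting takes hours where simulating each candidate from quantum theory would take far longer.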
In other research, Song avoids first-principles approaches and utilizes hard data from experiments. Working with images of metals for aircraft applications, he’s examining alloys at the grain level — clusters of atoms — to develop information on how a material’s microstructure is linked to specific desirable properties. To support their work, Song and his team use a range of deep learning techniques such as convolutional neural networks. This advanced deep learning architecture emulates the neuron organization found in the visual cortex to help analyze images, video, and other data. The technology can automate the extraction of important features that help guide the materials-analysis process, lessening the need for human involvement. “Deep learning helps to automatically derive material features information from available data such as images and molecular structures,” he said. “That reduces the time needed to find the links between a material’s structure and its properties.”
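The convolutional networks mentioned above are built from one primitive operation: sliding a small filter across an image so that features such as grain boundaries produce strong responses. A hand-rolled version on a toy two-grain “micrograph” — no deep learning library, purely illustrative:

```python
def convolve2d(img, kernel):
    """Valid-mode 2-D convolution (strictly, cross-correlation,
    which is what CNN layers actually compute)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

# Toy 'micrograph': two grains (intensity 0 vs 1) meeting at a
# vertical boundary between columns 2 and 3.
img = [[0, 0, 0, 1, 1, 1] for _ in range(5)]

# Horizontal-gradient (Sobel) kernel: responds only where intensity
# changes left-to-right, i.e., at the grain boundary.
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

edges = convolve2d(img, sobel_x)
print(edges[0])  # strongest responses straddle the boundary columns
```

A CNN stacks many such filters and learns their weights from data, rather than fixing them by hand as done here.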
SIMPLIFYING COMPUTATIONAL MATERIALS DESIGN Andrew Medford is focused on developing materials informatics techniques to find novel materials more quickly. He is working with machine learning, probability theory, and other methods that can locate key information in large datasets and point the way to promising new formulations. Highly accurate materials analysis is now available, thanks to density functional theory, quantum chemistry, and other advanced techniques, he explained. These approaches use first-principles physical methods to calculate the properties of complex systems at the atomic scale.
But these approaches have two drawbacks, said Medford, a postdoctoral researcher in the School of Mechanical Engineering (ME) who works with ME Professor Surya Kalidindi and will join the School of Chemical and Biomolecular Engineering faculty in January. First, they require large amounts of computational time; second, they produce huge datasets where critical information can be hard to find. “So the next step is learning how to fully exploit the data we generate,” Medford said. “Hundreds of thousands of different potential compounds might work for a specific application, and performing complete calculations on more than a select few isn’t possible.” The key, he said, is finding ways to use existing materials-related data, along with novel informatics approaches, to more effectively search “high-dimensional” problems — datasets that contain many different materials attributes. Developing the right data science techniques is critical to this effort, including better ways to generate, store, and analyze datasets with effective big data techniques, and better ways to organize collaborating communities of researchers to help build materials databases. One major issue involves integrating the sheer variety of available technologies, including various types of datasets, computational methods, and data storage systems. Also needed are more effective ways to predict the accuracy of the data being generated; for example, improving data processing techniques, such as uncertainty quantification, used to gauge information dependability. One typical challenge, Medford said, involves a highly important class of materials: catalysts used to process synthetic or bio-derived fuels. 
Investigators must screen fuel compounds and biomass precursors so complex that calculating properties for even a single potential reaction pathway is computationally overwhelming. The good news is that the fundamental chemistry of these systems involves only a few basic organic elements — carbon, hydrogen, and oxygen — which bond in a limited number of ways. This insight could simplify the high-dimensional informatics challenge involved in finding candidate materials. “Novel data-driven approaches can reduce the complexity of these systems into a few key descriptors,” Medford said. “That can provide a route to rapid computational screening of potential materials for synthetic fuel catalysts and help bring more effective processing methods to industry much faster.” One thing is certain: The Materials Genome Initiative is important to U.S. economic development, and cyber-enabled materials have a key role to play in that effort. Georgia Tech research teams will continue to develop ways to reduce the time and cost involved in moving advanced materials from the supercomputer and the laboratory to real-world applications. Rick Robinson is a science and technology writer in Georgia Tech’s Institute Communications. He has been writing about defense, electronics, and other technology for more than 20 years.
NEXT GENERATION GENIUS
Georgia Tech cultivates science and technology education for K-12 students and teachers
Story by Ben Brumfield
A countertop catapult flicks a scratchy Velcro ball onto a fuzzy mat stretching down a 10-foot table. It plunks snugly into place, instantly motionless, and children standing on the sidelines measure how far it flew.
“Write down the distance,” their teacher, Antoinette Richter, reminds them. She teaches engineering at Carver Road Middle School in Griffin, Georgia, using materials provided by the Georgia Institute of Technology.
Sixth-grader Chyna grabs a bicycle pump attached to the catapult, which is made of Erector Set parts, and puts her might and weight into the plunger. “Always pump the air up to the same pressure every time,” Richter tells her. The compressed-air science gadget needs a consistent amount of force behind each launch.

At a GoSTEM Latino science day, a student participates in a science demonstration in Spanish.
Passion and test scores
Chyna is one of thousands of students benefiting every year from a palette of Georgia Tech K-12 outreach programs so numerous they are hard to keep track of. Some researchers dazzle young eyes at weekend sci-tech fests with laser experiments or underwater rescue robots. Others give schoolrooms vistas of nebulas thousands of light-years away. They stir wonder for science and awe for technology and push kids to reach for them. But the main focus is a bit less glamorous and a lot more committed: guiding classes through years of learning to raise grades and standardized test scores. “We want long-term partnerships with schools so we make sure our efforts will actually facilitate change in the classrooms,” said Lizanne DeStefano, who runs a core Georgia Tech K-12 education and outreach unit called CEISMC. “That takes prolonged engagement over time.”
Earth-shaking STEM
CEISMC, pronounced like “seismic,” as in an earthquake, stands for Center for Education Integrating Science, Mathematics, and Computing. Its mission is to raise exposure to STEM education — another acronym — which stands for science, technology, engineering, and math. With the heft of 50 employees, including Ph.D. scientists, designers, engineers, and teachers, and with $9.4 million in annual
funding, CEISMC supports several Georgia school districts, the Boy Scouts, the Girl Scouts, and much more. Its major funders are the National Science Foundation, the Goizueta Foundation, the Blank Foundation, and the Georgia Department of Education. Its broader purpose is to take cutting-edge Georgia Tech research to the people. “We’re a knowledge transfer bridge,” DeStefano said. “We help the public to better understand the importance of science and technology in daily life.” But children are the focal point. CEISMC alone helps educate 11,000 children per year. “Our staff don’t sit in their offices much,” DeStefano said. “They’re out in the community.”
On the road again
CEISMC’s Will Jimerson has driven 50 miles south from Atlanta to help out at Carver Road. He’s instructing some of Richter’s students using a catapult at the table next to hers. “You want to have a list of distances when you’re done, so you can average them later,” he tells them. At Richter’s table, Chyna groans. “This is so hard!” She means the physical strain of the pump, not the mental strain. She’s a STEM success story, engaging with and responding to CEISMC’s teaching methods as hoped. “I want to be a pediatrician,” she said. “They do scientific things, and I like science, actually a lot.”
Breaking the fall
Chyna and her engineering classmates exemplify the mission of most of Georgia Tech’s K-12 outreach, which casts a particular eye on underserved students likely to lose interest in STEM or fall behind, then drop out of it.
FITRAH HAMID
Children learn how to make a simple electric motor in CEISMC’s Horizons program at Drew Charter School Elementary Academy.
The main outreach targets are public schools in areas where parent incomes are especially tight, and children often don’t have opportunities to learn like students elsewhere. These schools also might not be able to afford some nicer equipment and instructional aids on their own. Jimerson gestures to a device in the classroom corner. “That’s a 3-D printer. Our grant funded 3-D printers for all middle schools and high schools in the county school system.” CEISMC also created and donated the teaching texts, which are all over Carver Road’s science classrooms. In fact, Richter has only ever taught engineering from CEISMC books.
Most likely (not) to succeed
A few halls away, in a seventh-grade science lab, students thumb through a CEISMC workbook on oil spills while they form teams for an experiment using tap water and cooking oil. While she works over an aluminum tray with the oil-water mixture, Tiffany says she already knows she wants to be a scientist. “I got interested in sciences in the fifth grade.” Though many of the students in the science lab are white and male, the class also has a good number of minority students and girls, reflecting Carver Road’s overall student body makeup. To CEISMC, that’s progress. One of its aims is to keep minority students and girls going in science, as both groups are especially likely to turn away. “We still see fewer girls interested in science than boys and far fewer African-American and Latino students in science careers,” DeStefano said. Chyna happens to be both female and African-American. She’s also in her middle school years, a phase notorious for shedding math and science students.
The middle school wall
“Middle school is when we lose them,” said Leigh McCook, who coordinates STEM outreach for the Georgia Tech Research Institute (GTRI). GTRI is Georgia Tech’s applied research organization, and it has access to lots of technology that makes kids say “cool!” — like lasers, underwater robots, and nanotechnology. One GTRI program, called Direct-to-Discovery, uses a high-bandwidth teleconferencing system to connect students to megatelescopes halfway around the globe as well as cutting-edge Georgia Tech labs in their own state. Like CEISMC, GTRI meets kids at science festivals and takes GTRI Road Kits to their schools to teach them about math, physics, and engineering. Its K-12 outreach goal matches CEISMC’s: Get children into STEM and point them toward college and a science or technology career. But GTRI also integrates business partners interested in helping with education. GTRI has a dedicated year-long internship program called Project ENGAGES in four Atlanta public high schools, including one predominantly African-American boys school and one predominantly African-American girls school. It brings underserved students into Georgia Tech labs to conduct research throughout the year and apply what they’ve been learning in science and engineering.
Puberty peer pressure
When kids hit middle school, science classes become more challenging, and many students hit a wall and turn away, said Mindy DiSalvo, a former principal who is an educational guide for GTRI. It’s more like three or four walls, for girls in particular. As a principal, DiSalvo watched them turn away from sciences in droves. “First of all, they’re just middle school kids, and they’re more interested in social things. There’s peer pressure.” At one STEM event, only boys turned up, she recalled. “They told me that the girls were not there because they all went to cheerleading practice.”
Also, middle schoolers aren’t in one classroom all day with the same teacher, who knows their weaknesses along with their strengths to try to balance them out. Instead, pupils move from subject class to subject class, and the teachers don’t get a full picture of what’s going on with them. “If we can hold onto them through middle school and engage with the teacher, we can see more of them sticking with sciences,” McCook said. That often means teaching the teacher. Richter, for example, teaches engineering but holds a degree in business management. Georgia Tech has helped her develop her subject-matter skills. “They not only did a great job of explaining the goals of the curriculum, but they gave me the tools I need to teach my students,” Richter said. “Things like 3-D modeling software and how to use a 3-D printer.”
Meet Superteacher
In the past five years, CEISMC has trained around 2,000 schoolteachers. Many have become classroom heroes, but high school science teacher Casey Bethel could rightly wear a Superman cape. He was selected Georgia’s Teacher of the Year for 2017. He’s also a bona fide biochemistry lab researcher at Georgia Tech thanks to a CEISMC program called GIFT, short for Georgia Intern-Fellowship for Teachers. He’s now an expert on 3-D protein crystallography and has co-authored a research paper submitted to the prestigious research journal Nature. That astounds him.
High school science teacher Casey Bethel is a bona fide Georgia Tech researcher in Professor Raquel Lieberman’s lab. He’s also Georgia’s Teacher of the Year for 2017.
“Who would have thought this high school teacher might be published in Nature?” Bethel said. It makes him dream about going for his Ph.D. and researching full time, but for now he’s dedicated to his students. About half the children at New Manchester High School in the Atlanta suburb of Douglasville come from low-income families, he said. “It’s not a Title I school, but it’s also not far from it.” About three quarters of the students are African-American.
Science sidekicks
As with the students they serve, many Georgia Tech programs target educators at underserved schools, and when they come into labs for a summer to work, work they do. “Teachers are paid a living wage. It’s not charity,” DeStefano said. When the school year starts back, Bethel will stride into class a real-life scientist. “The first few years, I had no idea what I was doing as a science teacher. It takes a lot of honesty to say that,” he said. How things have changed for him. At Georgia Tech he co-authored a paper on engaging students in science that was published in the Journal of Chemical Education. Georgia School Superintendent Richard Woods walked into Bethel’s classroom unannounced in May to declare him teacher of the year. Bethel nearly hit the floor, but his students went through the roof. “They were jumping up and down clapping and whooping,” he said.
In the fall, when Bethel returns to his classroom, he’ll have new STEM sidekicks. “I get to bring some students each summer to the labs for five weeks,” he said. “When they get back to school, they become advocates for science careers.”
People just like me
As another component of its STEM education outreach, Georgia Tech brings children from historically underrepresented minorities and ethnicities onto campus to get them accustomed to the idea that a university is a place for them. These visits get Bethel’s students out of their typical surroundings, he said. “They come out of that and see that scientists are people just like them.” With Hispanic students, the language barrier with parents can play a role, so CEISMC offers a Latino STEM day all in Spanish. “What was really powerful was the parents,” DeStefano said. “The parents could easily participate, and the kids didn’t have to translate for them. The parents were so engaged. They asked questions like crazy.” That’s rare; usually, the parents are quiet because of the language barrier. “Now, the children and their parents have experienced campus as a place that they belong,” DeStefano said. These outreach programs are not just about recruiting future students for Georgia Tech. “We take a bus of students from Gwinnett County around to colleges in Georgia and outside of Georgia,” DeStefano said.
At GoSTEM, parents hear in Spanish about the possibilities a science education can offer their children.
BETHEL: ROB FELT; GOSTEM: ATLANTA SCIENCE FESTIVAL
When things go right
There is little doubt about Nick going to college, maybe only whether it will be Georgia Tech, MIT, or an Ivy League school. He’s visiting Georgia Tech to boost his already stellar robotics skills at one of the many outreach opportunities open to all students. Many who attend such publicly available seminars are high achievers. Nick is captain of his school’s robotics team, and with a competition coming up in three weeks, they’re at Georgia Tech’s Institute for Robotics and Intelligent Machines to sharpen their competitive edge. “In the competition, there’s an autonomous vehicle and also a driver challenge,” Nick said. The winner gets a cash prize. Nick’s classmate Colette already has her takeaway from Georgia Tech’s programmers. “They give their robots tiny little commands, and then the robots decided how to use them,” she said. “That’s what we’re kind of trying to do with ours.”
Hot car alarm
FITRAH HAMID
CEISMC’s Sirocus Barnes gives instruction in an extracurricular science and technology class at Drew Charter School.
Georgia Tech also gives budding engineering geniuses a chance to show off their inventions in the K-12 InVenture Challenge. Some innovations are what one would expect from the research and development wing of a major corporation, like the car seat invented by a high school student who saw news reports about children dying in cars parked in the sun. Her seat sets off an alarm and dials 911 as the temperature in the car rises.
“You can see the next generation of scientists and engineers,” DeStefano said. “You can see young students putting their creativity to amazing use and getting excited about their ability to create things and solve problems.” In another CEISMC public outreach, the Kids’ Club, elementary and middle school students are learning about energy-producing technologies in a Saturday on-campus seminar. Most every question the teachers ask is met with a lightning-fast answer. “The challenge is that they know more than you’re expecting,” said one of the teachers. “So, you try to get this line of inquiry going. But they already know everything.” These students are benefiting from a great education, and it shows. They’re clearing the middle school wall like it’s a runner’s hurdle.
The early birds
The bricks to that wall are laid in elementary school, DiSalvo said. “A generation of elementary school teachers say, ‘I don’t do science and math. I really don’t do that,’” DiSalvo said. “Teachers will say, ‘I have never been a biologist; I only teach a little biology.’” Many of their students are then ill prepared for middle school science. At two public elementary schools in Atlanta, CEISMC is planting the STEM seed early with dedicated programs called Horizons. One of the schools is Drew Charter School Elementary Academy. Lea is not quite 3 feet tall and looks about 6 years old, but at Drew, she’s trying hard to be the boss. She heaves herself into the teacher’s chair and pretends to commandeer her schoolmates as they file into a classroom for some extracurricular afterschool STEM. Her squeaky voice is no match for the whooping of two dozen kids fueled by the knowledge that school will let out for summer in just a few days.
‘Clap three times’
A man bellows warmly, “If you hear my voice, clap once; if you hear my voice, clap twice.” The noise dies down, and after “clap three times,” the room is silent. Little faces gaze up at CEISMC’s Sirocus Barnes as he readies them for this week’s lesson. Three Georgia Tech undergraduate students have come with him to help the children learn about electromagnetism by building a simple electric motor. They bend wires into heart shapes and spirals and balance them atop AA batteries perched on magnets.
Then Barnes asks the class, “What’s going to happen with the wire when I let it go?” “It’s going to heat up,” a boy answers. True, but that’s not what Barnes is looking for. Most of the little faces are stumped. Then eyes widen when Barnes lets go of the wire to show them how it rotates around the battery. “What’s it doing?” he asks. No answer, at first. “I’m so confused right now!” moans Gania, one of the smallest girls in the room. She puts on a frolicsome grin, then muses, “I get confused a lot.”
Horizons is working to change that for her and two-thirds of the students in the classroom, who make below-average grades. The extra instruction is designed to boost the performance of the bulk of the students. Georgia Tech is committed to staying with them from first grade to early ninth grade, and plans to extend the program through high school are in the works. During summer, the students come to Georgia Tech for booster courses. They also work in labs and learn how to swim.
The Horizons program is working. “Their achievement gaps are reduced,” DeStefano said. The kids are doing better in school and scoring higher on state-wide standardized tests.
Ignoring pizza
In the classroom at Drew, a few hands shoot up. “The forces are moving the copper coil around and around,” a girl answers. “The electromagnet has forces that combine together to make the wire spin around, and the energy that flows through it is making it spin,” a boy calls out. Now, the kids are getting it, but brows are still furrowed. The new challenge lies not in the dexterity of mind but of hand. The wires are a bit thick for them to bend. But they’re so determined to finish making the motors that they ignore the aroma of pizza that has flooded the classroom for the past 10 minutes. The teachers end the lesson and serve up dinner. Their parents will pick them up soon. On the ride home, the children can tell them all about electric motors. Parents who learned and remember the right-hand rule of electromagnetism in grade school might be able to follow along. If they can’t, they can take satisfaction in seeing their child get a better shot than they may have had at doing well in school — and in life.
A student gets hands-on experience in robotics with the help of a Georgia Tech engineer.
Ben Brumfield is a senior science writer with Georgia Tech’s Institute Communications. He is a former CNN.com editor.
The SPRUCE climate change experiment is warming a bog to study a potential greenhouse gas catastrophe
Story and photos by Ben Brumfield
In a northern Minnesota peat bog, Georgia Tech researchers are studying how microbes metabolize organic carbon. The work is part of a major research project with the goal of understanding how the ecosystem may respond to climate change. Shown are Joel Kostka, a professor in Georgia Tech’s School of Biological Sciences and School of Earth and Atmospheric Sciences, and Max Kolton, a Georgia Tech research scientist.
With a bread knife, Joel Kostka stabs into the watery floor of a peat bog dedicated to scientific study. The serrated blade sounds like it’s sawing through a soaked loofa laced with toothpicks. Kostka is one of dozens of researchers who have come to northern Minnesota’s Marcell Experimental Forest to get a good look at a behemoth made of carbon. It’s buried under the fluffy moss, but in a globally warmed future, scientists postulate, it will transform into greenhouse gases and accelerate climate change.
The U.S. Department of Energy is testing the hypothesis in a wide-ranging experiment called SPRUCE, which stands for Spruce and Peatland Responses Under Climatic and Environmental Change. SPRUCE is artificially warming the whole bog ecosystem in sections, from below the plant roots up to the treetops. “There’s no other large experiment like this of its kind,” said Kostka, a microbiologist and professor in Georgia Tech’s School of Biological Sciences and School of Earth and Atmospheric Sciences. “The idea is to warm the soils from the bottom up.” The research should mimic global warming like nothing before it.
Here, have a slice
Scientists are monitoring the altered ecosystem with particular attention to deep, carbon-rich peat — most of it is millennia old, brown, and muddy. It goes down some 15 feet. Kostka picks up a sopping, moss-covered chunk from the surface and holds it out in his hand. A small, pale, bulbous bug
falls out of it as he points out the layers. “You can see the green material and then the beginnings of the decomposing Sphagnum in the peat.” Sphagnum is the scientific name for the peat moss plant. “The leaves of a peat plant are just one cell thick,” Kostka marvels. Inches under the green sprigs are yellowed ones, then the deep brown goo starts. “Peat moss turns out to be a hugely important plant globally,” Kostka said. “It arguably stores more carbon than any other plant on Earth.”
Legendary bog monster
Folklore has embellished bogs with gooey monsters interred beneath the squishy moss, waiting centuries to resurrect so they may rise up to wreak havoc. Horror movie hooey — but it’s a great metaphor for the worst-case scenario of what lies deep inside cold, or boreal, peat bogs in parts of northernmost North America and spanning northern Europe and Siberia. For some 10,000 years, Sphagnum has been sucking carbon gases out of the atmosphere. Then, as it died and sank, it took megatons of carbon down with it, packing the carbon compounds virtually airtight and refrigerating them at temperatures of around freezing. “We call it a carbon bank, because a lot of the carbon that’s present in terrestrial soils is present in these peats,” Kostka said.
Exploding oil refineries
Though boreal bogs cover just 3 percent of Earth’s land surface, they have banked more than 30 percent of the world’s soil carbon. Rising global temperatures, it is feared, could pry open these refrigerated vaults.
Enclosures built in a Minnesota peat bog allow the SPRUCE experiment to mimic conditions that might arise as the climate changes. Researchers are measuring the resulting impact on the emission of gases such as methane and carbon dioxide.
Joel Kostka and Max Kolton filter cold water from the bog to net microbial DNA and RNA fragments for study.
With warming, oxygen and microbes could make their way into the carbon bank, where they would rot the well-conserved biomass like compost, converting some of it into carbon dioxide and a lot of it into methane, a greenhouse gas 30 to 50 times more potent than CO2. Analogous to the bog monster’s resurrection, the gases would rise into the atmosphere with possibly disastrous consequences.
Research specialists from 19 universities are at Marcell studying multiple facets of this possible chain of events. Kostka is examining the final stages of the process. “We’re understanding the below-ground carbon cycle and the microorganisms that are active in degrading the organic material,” Kostka said. Heightened activity would be a bad sign. “If we found that a lot of that carbon at higher temperatures rose to the atmosphere as carbon dioxide and especially as methane — which traps more heat than carbon dioxide — then that would be a worst-case scenario,” he added.
In the worst case, imagining climate change as a wildfire, boreal peat bogs would be like gas stations or oil refineries in its path. Their particularly rich fuel stores would supercharge global warming once the heat reached them.
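That "30 to 50 times more potent" figure is what emission accounting calls a global warming potential (GWP). As a minimal illustration (the function and sample values here are ours, not part of the SPRUCE analysis), converting a methane release into its CO2-equivalent is a single multiplication:

```python
def co2_equivalent(methane_tons: float, gwp: float) -> float:
    """Tons of CO2 that would trap roughly as much heat as the
    given mass of methane, for a chosen global warming potential."""
    return methane_tons * gwp

# One ton of methane, at the low and high ends of the cited range:
print(co2_equivalent(1.0, 30))  # 30.0
print(co2_equivalent(1.0, 50))  # 50.0
```

The exact GWP depends on the time horizon considered, which is why the article gives a range rather than a single number.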
Space station charm
The SPRUCE experiment, coordinated by the Department of Energy’s Oak Ridge National Laboratory (ORNL) and the Department of Agriculture’s Forest Service, is trying to see what it would take to set the deep carbon off. If it doesn’t exactly erupt into methane, what will happen? In the best case, will the bog just continue to suck up carbon gases from the atmosphere as it always has? With such questions in mind, ORNL designed and built the warming laboratory that sprawls into the bog like a space station — and shares the same sterile charm. Grated boardwalks of gray plastic supported by silvery metal posts branch out at right angles to 17 plots, where they encircle small orchestras of scientific instruments. Football-sized cylinders hang from poles and girders, and bottomless pots and tubs are embedded in the Sphagnum. Most of these devices measure some variation of carbon in gas emissions or in the water.
“This big cylinder is for measuring the greenhouse gases,” Kostka said. “Right now you can see there’s a cover on top of it. And what the cover does is trap the gas inside the cylinder.” But nature is the passenger on this journey, and the researchers take meticulous measures to ensure it is undisturbed, aside from the warming, so it can be studied in an authentic state. Amongst the metal and plastic, rhododendrons dance diagonally over the shaggy moss, and cotton grass shoots straight up a foot or two, bursting into cotton-ball blossoms. Above them, tamarack and spruce tower alongside a weather scaffold.
Make mine hot
Most of the plots at SPRUCE have underground heating elements, and most are also enclosed in semi-transparent greenhouses as tall as the treetops inside them. Above the door to one enclosure hangs a sign reading, “Welcome to a warmer future. +6.75° C, elevated CO2.” (That +6.75° Celsius translates to +12.15° Fahrenheit.) Another reads +2.25° C and no elevated CO2; yet another reads +4.50° C, elevated CO2. The enclosures range from no additional heat or carbon dioxide all the way up to +9° C and added CO2 — a really extreme scenario that probably wouldn’t play out on Earth for at least 100 years.
Vents blast warm air into the enclosures, none of which have roofs, allowing the heat to waft out on purpose. “The open top is really so the gases and the precipitation can exchange within the chamber. We didn’t want a closed-top chamber like an aquarium,” said Randy Kolka, a scientist with the Forest Service. The idea is not to isolate the plots, but to have them participate in the natural weather of their surroundings while riding a few degrees above it. “There comes more rain,” Kostka says, as the precipitation dampens everyone in the enclosure. The natural exchange goes for the water table, too. The enclosure allows natural ebbs and flows, and monitors them for traces of old carbon. Right off the bog are three semi-truck-sized tanks; one holds propane for heat, the others hold CO2.
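Because the chamber signs describe temperature offsets rather than absolute temperatures, converting them from Celsius to Fahrenheit uses only the 9/5 scale factor; the familiar 32-degree offset cancels out. A quick sketch (our own illustration) reproduces the figures on the signs:

```python
def delta_c_to_f(delta_c: float) -> float:
    """Convert a temperature *difference* from Celsius to Fahrenheit.

    Unlike absolute temperatures, a difference needs only the 9/5
    scale factor; the 32-degree offset cancels out.
    """
    return delta_c * 9 / 5

# The SPRUCE enclosure offsets, from ambient up to the most extreme case.
for delta_c in (0.0, 2.25, 4.50, 6.75, 9.0):
    print(f"+{delta_c}° C above ambient = +{delta_c_to_f(delta_c)}° F")
```

Running this reproduces the article's +12.15° F figure for the +6.75° C chamber, and 16.2° F (rounded to 16 in the story) for the +9° C chamber.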
Watching grass grow
John Latimer has been watching grass grow for more than 33 years. And the leaves change color in the fall. And the flowers bloom in spring. All along, he has kept records on the changing seasons. And all along, he has shared his observations with area residents on Tuesday mornings on radio station KAXE, 91.7 FM, Grand Rapids (Minnesota). He’s a local phenologist — someone who studies flora’s seasonal states. He also works for SPRUCE. In the warmest chamber, at +9° Celsius, Latimer pokes a small camera on a long pole down a hole, then pulls it out slowly, snapping pictures of root systems to observe what warming might do to them. “Once a week I come in and put the camera in,” he said. “I get the same pictures each week.” Over time, researchers will compare the shots to assess root development. A hypothesis holds that greater warmth and carbon gases will make some plants grow faster and larger, altering the ecosystem and the water table. That could expose the carbon-rich peat deposits to more oxygen and promote their breakdown. “At the end of the day — one day — I will have taken about 2,250 pictures,” Latimer said, grinning through a neatly groomed white beard. “Roots. Yep.” He points to a plant that looks like a short, sparse fern with long, broad leaves. “Three-leaved false Solomon’s seal is the common name,” he said. It’s no giant, but it puts down the deepest roots in the plot.
Warming whiplash
The underground heating has been going on for a year, the above-ground heating for a few months. But already Latimer has seen a whiplash in the local weather severely damage the tamarack trees in the +9° C chamber. In early March, outdoor temperatures shot up to over 80° F. This heat, plus the additional 9° C (equivalent to 16° F), bumped the heat inside the chamber to nearly 100° F. The plants, including the tamaracks, popped out new buds and sprigs as though spring had arrived. Latimer thought he was seeing his second earliest spring on record, the earliest being 2012. “I had Aspens flowering. I had silver maple flowering. They were early in terms of all the records I have,” he said.
Then in mid-March, winter roared back, and buds dropped to the ground. The tamaracks turned brown, and the seasoned phenologist did a double-take. “Everything stopped moving,” Latimer said. Of all the years he has notes on, he feels recent ones have been more extreme. Previously, the earliest and latest springs he had recorded were nine years apart, but that has changed. “My bookends, my range, went from 1987 to 1996, and now it’s 2012 to 2013, earliest and latest ever. It’s astonishing. How do you get two years like that back-to-back that are absolutely the extremes?”
About that monster
Reports of record-breaking heat have become common with average global temperatures on the rise. February through April of this year in the bogs were the warmest such period on record.
But the big question for SPRUCE remains: Do scientists know yet if the methane catastrophe really will happen? “Maybe, maybe not,” said Paul Hanson from ORNL, which is overseeing the experiment. He is its project coordinator. “The warming effects, depending on how big they are, have the potential to do good things for vegetative systems or to push things over the edge.” It’s too early to tell. Scientists are giving SPRUCE 10 more years to come up with a meaningful glimpse at a warmed future.
Bad, good, bad news
But with the experiment barely fully cranked, there has already been some good news and some bad news. First, the good news. So far, the gases and water coming out of the bog are not infused with carbon from those deep stores. That old carbon can be recognized by its percentage of radioactive carbon 14, which is different from current atmospheric carbon. But the bad news is, Kolka from the Forest Service thinks that will change. “I do think you’re going to see the signal of that old carbon pretty soon,” he said. Furthermore, warmed bog plots are already emitting more methane than usual even without tapping that deep, old peat, and the implications of that go beyond boreal bogs into the wider world. “Just that release of methane supports the hypothesis that in a warming world, microbes will become more active,” Georgia Tech’s Kostka said. “And especially microbes that produce methane will become more active and release more methane.” The microbes are called methanogens — a potential carbon gas monster, with or without any bog.
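The "old carbon" signal works because carbon-14 decays at a known rate (half-life about 5,730 years), so the fraction remaining pins down a sample's age. The sketch below is the textbook decay relation, not the SPRUCE team's actual isotope pipeline:

```python
import math

C14_HALF_LIFE_YEARS = 5730  # commonly cited half-life of carbon-14

def radiocarbon_age(fraction_remaining: float) -> float:
    """Estimate sample age in years from the fraction of C-14 remaining,
    using the decay law N(t) = N0 * (1/2) ** (t / half_life)."""
    if not 0 < fraction_remaining <= 1:
        raise ValueError("fraction must be in (0, 1]")
    return -C14_HALF_LIFE_YEARS * math.log(fraction_remaining) / math.log(2)

# A sample retaining half its original C-14 is one half-life old.
print(round(radiocarbon_age(0.5)))   # 5730
# Peat retaining about 38 percent of its C-14 would be roughly 8,000 years old,
# in line with the oldest peat ages quoted in this story.
print(round(radiocarbon_age(0.38)))
```

In practice, researchers compare a sample's C-14 fraction against the modern atmospheric baseline, which is why fresh emissions and ancient peat carbon are distinguishable at all.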
A huge pulse
There’s more. At Kostka’s lab in Atlanta, when he and his postdoctoral assistant, Max Kolton, warmed up moss samples, methane came streaming out. Kostka thinks he’ll see the same at SPRUCE with the addition of air heating. “We expect this summer we’re going to see a huge pulse,” he said.
About 500 feet from SPRUCE, Kostka and Kolton slosh into open peatland to take core samples of deep peat. Their rubber boots mush a few inches into the moss carpet as water rises to their insteps. But that’s as far as they sink. The bogs offer as much support for walking as a foam mattress. The water is about 39° F; the air temperature, in mid-May, is about 45° F, but mosquitoes are undeterred by the chill and buzz by the dozens around the researchers’ mosquito-net head coverings.
Kostka talks about a counterpart to methanogens in the Sphagnum — methanotrophs. Methanotrophs eat methane and could possibly soften a methane boost. Over the millennia, they have helped the moss build the carbon stores.
Kolton grunts and pulls out a corer full of deep peat. Kostka points to a layer. “You can see some of the ancient peat. It’s at least about 5,000 years old, maybe 8,000 years.” It’s from either the time the city of Troy was founded or the beginning of the Copper Age. The deepest peat formed during the New Stone Age, when the Earth’s human population was about five million. Ten years from now, SPRUCE scientists hope to know more about what the modern age’s place will be in the next layer of peat, and in all those past.
Ben Brumfield is a senior science writer with Georgia Tech’s Institute Communications. He is a former CNN.com editor.
GLOSSARY
In this thermo-mechanical fatigue experiment, the temperature of a nickel-base superalloy specimen was cycled between thermal extremes.
Data Engineering and Science PAGE 24, DATA DRIVEN
Data engineering and science is a highly interdisciplinary field that blends statistics, computing, algorithms, applied mathematics, and visualization. It uses automated methods to gather and extract knowledge from very large or complex data sets. At Georgia Tech, researchers in data engineering and science are pursuing initiatives in machine learning; high-performance computing; algorithms and optimization; health and life sciences; materials and manufacturing; and energy infrastructure.
Next-Gen Hardware PAGE 28, DATA DRIVEN
The world’s most powerful computers work at the petascale — millions of billions of floating-point operations per second. Computer engineers expect to achieve exascale speeds — about a thousand times faster — within the next few years. But further performance increases will require adoption of such radical technologies as quantum computing, neuromorphic computing, approximate and stochastic computing, new microarchitectures, and new transistor technologies.
Superalloy PAGE 36, CYBER FORGED
(soo-pər-a-loi) A superalloy, also known as a high-performance alloy, is a material that exhibits a favorable combination of properties such as maintaining excellent mechanical strength at extreme temperatures, resistance to thermal creep deformation, excellent fracture toughness, good surface stability, and resistance to corrosion or oxidation. Georgia Tech researchers are developing nickel-base superalloys for use in the hottest parts of jet engines and gas turbines.
Peat Bog PAGE 50, SHAKING A SLEEPING BOG MONSTER
A peat bog is a wetland that accumulates a deposit of dead plant material — often mosses, and in most cases, Sphagnum moss. Peat bogs in the northern latitudes are storing as much as 30 percent of the world’s carbon. Georgia Tech researchers are part of a major research initiative, known as SPRUCE, to determine how a warming climate might affect the carbon stored in the bogs, potentially liberating large quantities of methane and carbon dioxide, which could accelerate climate change.
ROB FELT
CONNECT WITH GEORGIA TECH RESEARCH From discovery to development to deployment, Georgia Tech is leading the way and “creating the next” in scientific and technological progress. Let us introduce you to our internationally recognized experts and help you make meaningful connections with our talented students. The people listed on this page are responsive, well-connected, and ready to help you tap into everything Georgia Tech has to offer. Find out what’s new, and what’s next. Contact us today.
ONLINE RESEARCH CONTACT FORM www.research.gatech.edu/contact GENERAL INQUIRIES AND OFFICE OF THE EXECUTIVE VICE PRESIDENT FOR RESEARCH
Get all the Georgia Tech research news as it happens. Visit www.rh.gatech.edu/subscribe and sign up for emails, tweets, and more.
GAIL SPATT Program Manager Office of the Executive VP for Research 404-385-8334 spatt@gatech.edu
INDUSTRY COLLABORATION AND RESEARCH OPPORTUNITIES
DON MCCONNELL Vice President, Industry Collaboration 404-407-6199 donald.mcconnell@gtri.gatech.edu
CORPORATE RELATIONS AND CAMPUS ENGAGEMENT
CAROLINE G. WOOD Senior Director, Corporate Relations 404-894-0762 caroline.wood@dev.gatech.edu
CORE RESEARCH AREA CONTACTS
BIOENGINEERING AND BIOSCIENCE
CYNTHIA L. SUNDELL Director, Life Science Industry Collaborations Parker H. Petit Institute for Bioengineering and Bioscience 770-576-0704 cynthia.sundell@ibb.gatech.edu
ENERGY AND SUSTAINABLE INFRASTRUCTURE
SUZY BRIGGS Director of Business Development Strategic Energy Institute 404-894-5210 suzy.briggs@sustain.gatech.edu
MICHAEL CHANG Deputy Director Brook Byers Institute for Sustainable Systems 404-385-0573 chang@gatech.edu
DATA ENGINEERING AND SCIENCE
SRINIVAS ALURU Co-Director Data Engineering and Science Initiative 404-385-1486 aluru@cc.gatech.edu
TRINA BRENNAN Senior Research Associate Institute for Information Security and Privacy 404-407-8873 trina.brennan@gtri.gatech.edu
ELECTRONICS AND NANOTECHNOLOGY
DEAN SUTTER Associate Director Institute for Electronics and Nanotechnology 404-894-3847 dean.sutter@ien.gatech.edu
RENEWABLE BIOPRODUCTS
NORMAN MARSOLAN Director Renewable Bioproducts Institute 404-894-2082 norman.marsolan@ipst.gatech.edu
PEOPLE AND TECHNOLOGY
RENATA LEDANTEC Assistant Director Institute for People and Technology 404-894-4728 renata@ipat.gatech.edu
MANUFACTURING, TRADE, AND LOGISTICS
TINA GULDBERG Director, Strategic Partnerships Georgia Tech Manufacturing Institute 404-385-4950 tina.guldberg@gatech.edu
PUBLIC SERVICE, LEADERSHIP, AND POLICY
KAYE G. HUSBANDS FEALING Chair and Professor School of Public Policy 404-894-6822 khf@pubpolicy.gatech.edu
MATERIALS
JUD READY Lead Liaison, Innovation Initiatives Institute for Materials 404-407-6036 jud.ready@gatech.edu
ROBOTICS
GARY MCMURRAY Associate Director of Industry Institute for Robotics and Intelligent Machines 404-407-8844 gary.mcmurray@gtri.gatech.edu
NATIONAL SECURITY
MARTY BROADWELL Director, Business Strategy Georgia Tech Research Institute 404-407-6698 marty.broadwell@gtri.gatech.edu
Research News & Publications Office Georgia Institute of Technology 177 North Avenue Atlanta, Georgia 30332-0181 USA
CHANGE SERVICE REQUESTED
www.rh.gatech.edu Volume 33, Number 2 Issue 2, 2016
Nonprofit Organization U.S. Postage PAID Atlanta, GA 30332 Permit No. 587