Convergence Issue 22


FALL 2018

The magazine of engineering and the sciences at UC Santa Barbara

FOCUS ON:

SUSTAINABILITY

Sediments in Space
Combatting GPS Spoofs on the Grid
Securing Software
Julia Speaks: A New Computer Language

College of Engineering


A message from the Deans

Last October the United Nations Intergovernmental Panel on Climate Change (IPCC) released a report detailing the environmental impacts likely to occur if Earth’s average temperature increases by more than 1.5 degrees Celsius over pre-industrial levels.

ROD ALFERNESS Dean and Richard A. Auhll Professor, College of Engineering

PIERRE WILTZIUS Susan & Bruce Worster Dean of Science, College of Letters & Science

The report also indicated that the window for action is closing, with perhaps only a decade remaining to slow carbon emissions enough to prevent the worst impacts of climate change. That news comes amid rapidly increasing global demand for energy — most of which currently is derived from burning fossil fuels, which contribute to warming — making action to support sustainability more vital than ever before.

Governments around the world are aware that new forms of energy are needed as we move to a post-oil economy. But while they are being developed and scaled up, the most important way to slow emissions is to reduce energy demand by increasing energy efficiency — in buildings, electronic devices, cars, ships, planes, batteries, communications systems, agriculture, water-supply systems, and other key infrastructure elements. Here at UCSB, researchers in engineering and the sciences are hard at work to increase energy efficiency, reduce pollution, and develop systems and products that support sustainability by ensuring prosperity both for humans and for Earth’s living systems.

A special section in this issue, titled “FOCUS ON: Sustainability” (see page 13), lays out some of the impressive research being pursued at UCSB to address several grand challenges of sustainability. Elsewhere, we highlight research on computer security from the software designer’s and hacker’s perspectives, share a novel approach to preventing GPS hacking of an autonomous electrical grid, chat with UCSB alumna and donor Pamela Lopker about her leadership vision at the business-software company QAD, note major awards for two young faculty members and the leader of the Technology Management Program, describe an experiment that got a lift from a SpaceX rocket, and recall one of UCSB’s most prominent powerhouses of discovery, Professor Emeritus Jacob Israelachvili.

We wish you enjoyable reading, a fulfilling holiday season, and a promising and peaceful new year.



CONTENTS

3 News Briefs. Awards for scholars, remembering a legend, STEM diversity works, restoring toxic streams, and frogs facing off with the chytrid fungus.
7 Alumna & Donor Pamela Lopker. The practical, service-centered leader of QAD.
9 Sediments in Space. In June, a SpaceX rocket carried a UCSB experiment to the International Space Station.
11 An Unfolding Mission. Understanding why proteins lose function when bonded to artificial surfaces.
13 FOCUS ON: Sustainability. A special section highlighting how College of Engineering researchers are tackling some grand challenges.
15 Major Grants for Grand Challenges
16 Simulating Liquid-Membrane Interaction
19 Mellichamp Cluster on Sustainable Products and Manufacturing
21 Algorithms to Smarten Buildings, Track Oil Spills
23 Centers for Sustainability
25 Toward a Secure Electric Grid. Professor João Hespanha suggests a new way to avoid crippling GPS spoofing attacks.
27 Security Agents. Tevfik Bultan and Giovanni Vigna take complementary approaches to data defense.
31 Julia Speaks. CoE alumni release a powerful new computer language.

COVER IMAGE: Artist’s representation of technology's role in advancing sustainability. Illustration by Brian Long

The Magazine of Engineering and the Sciences at UC Santa Barbara
Issue 22, Fall 2018
Editor in Chief: James Badham
Director of Marketing: Andrew Masuda
Artwork/Design: Brian Long
UCSB Public Affairs Contributors: Julie Cohen, Sonia Fernandez
Photography: Matt Perko, Tony Mastres
convergence.ucsb.edu
Web Design: Robert LeBlanc
Web Graphics: Brian Long



NEWS BRIEFS

Young and Impressive

Assistant Professor William Wang (computer science) and Associate Professor Jonathan Klamkin (electrical and computer engineering) have received the prestigious Young Faculty Award from the U.S. Defense Advanced Research Projects Agency (DARPA). They join 34 others who received 2018 awards, which include substantial funding for recipients’ research.

“We’re extremely proud of William and Jonathan for receiving their DARPA Young Faculty Awards,” said CoE Dean Rod Alferness. “They continue a strong tradition of junior faculty being recognized for outstanding work, ensuring that UCSB engineering is in good hands for the future.”

William Wang

Jonathan Klamkin

Klamkin’s area of focus is photonic integrated circuits (PICs), which transmit information on light waves rather than electrical signals. They are this century’s counterpart to the electronic integrated circuit, which revolutionized computing and electronics in the 20th century. Klamkin, who leads the Integrated Photonics Laboratory at UCSB, is among those working to bring about the photonics revolution by addressing energy-consumption issues associated with lasers, the source of light in PICs. It is a key challenge to overcome if PICs are to become reliable and reach their potential for large-scale integration.

Substantially reducing the power consumption of laser diodes in PICs, said Klamkin, “would make photonic circuits ubiquitous as a mainstay in computers, smartphones, medical instruments, sensors, and automobiles.”

“DARPA is widely recognized for its commitment to cutting-edge research,” he added. “It is an honor to receive this award, which represents a unique opportunity for students and me to investigate truly foundational technology.”

Klamkin, who earned his PhD at UCSB and joined the faculty in 2015, has previously received a NASA Early Career Faculty award.

Wang’s specialty lies in the relatively new and very hot field of natural language processing (NLP), a branch of artificial intelligence concerned with how humans and computers communicate. (Think Siri and Alexa, speech recognition, and machine translation.) In his project, titled “Dynamo: Dynamic Multichannel Modeling of Misinformation,” he and his team are pursuing answers to such questions as how misinformation goes viral and how false information bypasses our usual filters to spread rapidly. Wang's goal is to develop a machine-learning system that can automatically learn explainable patterns of how misinformation spreads.

“I’m very honored,” he said. “This is only the second time in the history of the program that the DARPA Young Faculty Award has been awarded to an NLP researcher.” In his two years at UCSB, Wang has helped to grow the campus’s NLP research lab into one of the world’s most prolific, contributing to UCSB’s 2018 top-three ranking for NLP from CSRankings.org.

TMP Director Kyle Lewis Receives Highest Honor for System Dynamics Research

Kyle Lewis, chair of the Technology Management Program (TMP) at UC Santa Barbara, has received the highest honor for research involving system dynamics. The System Dynamics Society presented the Jay Wright Forrester Award to Lewis and her co-author, Edward G. Anderson, a professor at the University of Texas, Austin, for their paper “A Dynamic Model of Individual and Collective Learning Amid Disruption,” which was published in Organization Science in 2014. The award dates to 1983 but has been given only five times in the past decade.

“It is an honor to be recognized for using system dynamics to make an impact in organizational research,” Lewis said. “The award is a testament to the high-quality, highly innovative research being conducted within TMP and the College of Engineering.”

“System dynamics” refers to a computer-aided approach to modeling phenomena that occur over time, with feedback. Applied to problems that arise in complex social, managerial, economic, or ecological systems, system dynamics incorporates mathematical modeling and emphasizes a “continuous view” in order to identify underlying patterns of policy and behavior.

Kyle Lewis

Lewis and Anderson modeled the effects of disruptive events on learning and productivity in organizations at both individual and collective levels. They simulated the impact on productivity and performance of events such as employee turnover, technological innovation, and reorganization, as well as extreme events such as natural disasters. “By using mathematical reasoning to simulate logic, we were able to identify patterns, supported by data, that would be difficult to observe directly,” said Lewis. She and Anderson found that while collective and individual learning initially complement each other with respect to collective performance, over time, they interact in complex ways. The computational model described how collective knowledge affects individuals’ ability to learn, and how individuals’ knowledge affects the propensity for collective knowledge to develop.

One implication of their findings is that organizations that depend on expertise, knowledge, and know-how as factors of production must consider how and when knowledge becomes embedded in collective structures and not simply focus on developing individual employee learning and knowledge. “The results show further that dismantling organizational structures, such as teams or units, or substantially reorganizing a firm, can dramatically and negatively affect overall performance,” Lewis said, adding, “Once collective structures are impacted, collective learning must begin again before performance can increase.” Lewis and Anderson continue to use learning curves to model organizational phenomena and answer questions about who in a hierarchy must possess more generalized, versus specialized, knowledge and under what conditions.
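For readers curious what a system-dynamics formulation actually looks like, here is a minimal sketch in Python. The two stocks, the feedback terms, and the disruption step are hypothetical illustrations of the general approach; they are not the model Lewis and Anderson published.

```python
# Toy system-dynamics sketch: two coupled "stocks" (individual and collective
# knowledge) with feedback and a one-time disruption. All equations and
# parameter values are hypothetical; this is not the Lewis-Anderson model.

def simulate(steps=200, dt=0.1, disruption_at=100):
    individual = 1.0   # stock: average individual knowledge
    collective = 0.5   # stock: knowledge embedded in teams and routines
    history = []
    for t in range(steps):
        # Feedback: each stock grows faster when the other stock is larger.
        d_ind = dt * (0.05 * collective - 0.01 * individual)
        d_col = dt * (0.04 * individual - 0.02 * collective)
        if t == disruption_at:
            collective *= 0.4  # e.g., a reorganization dismantles team structures
        individual += d_ind
        collective += d_col
        history.append((t * dt, individual, collective))
    return history

for t, ind, col in simulate()[::40]:
    print(f"t={t:5.1f}  individual={ind:.2f}  collective={col:.2f}")
```

Running the loop shows the kind of pattern the paper describes: after the disruption, collective knowledge must rebuild before the combined system recovers.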

Legacy of Discovery: Jacob Israelachvili, 1944-2018

UC Santa Barbara has lost a dear friend. Professor Emeritus Jacob Israelachvili, described by the American Institute of Chemical Engineers as one of the most influential chemical engineers in the past one hundred years, died of cancer on September 20. He was 74. On October 26, The Scholarship Foundation of Santa Barbara announced that it is accepting donations for a new scholarship fund, the Jacob N. Israelachvili Honors Science Scholarship Fund, to support graduate and undergraduate students in chemical engineering, materials, physics, and biochemistry.

In more than thirty years at UCSB, with joint appointments in the Chemical Engineering and Materials Departments, Israelachvili led the Interfacial Sciences Lab, reflecting the focus of his groundbreaking research on interactions between molecules and surfaces, principally in liquid environments.

“Jacob was a legend in his field, a great citizen on campus, and a strong advocate for UCSB and his students,” said Rod Alferness, dean of the College of Engineering. “A countless number of his students, his colleagues, and scientists around the world are where they are today because of him.”

Israelachvili received his PhD in Physics from the Surface Physics Department of the Cavendish Laboratory at the University of Cambridge in England in 1972. After a two-year research fellowship at the University of Stockholm, he moved to Australia, where from 1974 to 1986 he led a laboratory devoted to measuring the forces between surfaces. During that period, he made his first significant impact on the scientific community when he designed a new surface-force apparatus (SFA) to measure force interactions in liquid. In 1985 he published Intermolecular and Surface Forces, still regarded as the definitive textbook in the field.

Jacob Israelachvili

Israelachvili joined the CoE faculty in 1986 and helped elevate the Departments of Chemical Engineering and Materials, and the graduate program in Biomolecular Sciences and Engineering, to be among the most respected programs of their kind in the nation. “Jacob was one of the senior faculty most responsible for the department’s meteoric rise in reputation and prestige in the late 1980s,” said chemical engineering professor Brad Chmelka. “He helped the department establish a foundation built firmly on deep scientific understanding and great technological impact.”

Israelachvili’s UCSB research team received acclaim in 2011 for developing a breakthrough equation that predicted molecular forces in hydrophobic interactions. The equation helps explain why oil and water do not mix, how proteins are structured, and what holds biological membranes together.

Matthew Tirrell, former CoE dean and current dean of the Institute for Molecular Engineering at the University of Chicago, who spent a decade working closely with Israelachvili, described him as being in a class by himself. “He was sui generis in the world of interfacial science,” Tirrell said. “A builder of uniquely powerful tools, the author of broadly influential articles and books, and a mentor to hundreds of younger scientists, Jacob gave the world its first direct look into many aspects of interfacial structures and forces. He had a profound ability and patience to focus on his interests, whether it was the new instrument he was designing or the person he was engaging in discussion.”

Israelachvili embraced the collaborative atmosphere at UCSB and broadened the scope of his research to span multiple fields of study. His group helped to solve several mysteries, including how geckos climb vertical surfaces, the recovery of oil from the ground, the friction and lubrication of mammalian joints, and the physical mechanisms behind autoimmune diseases such as multiple sclerosis.

The author or co-author of nearly five hundred published journal articles, Israelachvili was a Fellow of the Royal Society of London, whose members have included Charles Darwin, Isaac Newton, Albert Einstein, Stephen Hawking, and fellow UCSB faculty member Craig Hawker. He was also a member of the U.S. National Academy of Engineering and the U.S. National Academy of Sciences, and was a Fellow of the American Physical Society.

“A very small number of scientists, through their brilliance and imagination, create new bodies of science apparently out of whole cloth,” said Martin Moskovits, a distinguished professor of chemistry at UCSB. “They are the Columbuses and Galileos, who are driven by a unique conviction that there lies a new continent or a new ocean beyond what is visible on the horizon. Jacob was one of these. And he was more. He was unusually kind and generous, inviting those who wanted to join him in exploring these new scientific territories to do so. This generosity of spirit also made him an incomparably successful teacher and mentor, as the hundreds of students he mentored can attest. He was admired and celebrated literally around the globe.”

Jacob Israelachvili is survived by his wife, four children, two grandchildren, and a sister.

UCSB Works for Diversity

CoE Supports Materials Science at UTEP, Jackson State

A new round of NSF funding is making it possible for UC Santa Barbara to continue for six more years a pair of partnerships supporting diversity and visions of graduate school among materials-science students at two Minority Serving Institutions — the University of Texas, El Paso (UTEP), and Jackson State University (JSU) in Mississippi. This is the third round of funding for Jackson State and the second for UTEP.

The grants are part of the NSF’s Partnerships for Research and Education in Materials (PREM) program, aimed at fostering next-generation materials research by having a team of faculty work with engineering and science students from diverse backgrounds who will, hopefully, go on to earn advanced degrees. PREM collaborations leverage existing NSF-funded centers at supporting institutions, in this case the Materials Research Science and Engineering Center (MRSEC) at UCSB, aka the Materials Research Laboratory. Student exchanges provide opportunities for undergraduate students at UTEP and JSU to spend the summer doing research with students and engineering and science faculty at the MRSEC. Last summer, several UCSB undergraduate students in chemical engineering also spent time studying photovoltaics at UTEP.

“They have a great program in photovoltaics there, and faculty are really used to working with undergraduates, so they’ve been super mentors for our exchange students,” said MRSEC education director Dotti Pak.

“At UCSB,” said Professor Ram Seshadri, who is also director of the MRSEC, “we provide training for undergraduate students, who get to work with graduate students and can access instrumentation to run experiments related to their research at Jackson State or UTEP.”

UCSB grad students also talk to visiting prospective graduate students about why they chose UCSB, which inevitably leads to discussions about the possibilities of collaborative work across disciplines. Seshadri, the PI on both collaborations, is joined by other UCSB faculty: together, the UTEP and JSU groups include 13 faculty — from Chemical Engineering, Materials, Chemistry & Biochemistry, Mechanical Engineering, Physics, and Biomolecular Science and Engineering. Among them is chemistry professor Javier Read de Alaniz, diversity coordinator of the UCSB MRSEC, who said, “This partnership has played a critical role in our sustained effort to increase diversity in STEM at UCSB, as well as at the national level.”

From left: Dotti Pak, Javier Read de Alaniz, and Ram Seshadri.

Jackson State student Natalie White in the MRL.

A Move to Diversify STEM Faculty

In a new program, UC Santa Barbara is collaborating with UC Merced, CSU Fresno, and CSU Channel Islands to create and test a model for training and mentoring underrepresented minority doctoral students in science, technology, engineering, and mathematics (STEM) fields, thus leading to increased diversity among STEM faculty. All four universities are recognized Hispanic Serving Institutions. The National Science Foundation, through its Alliances for Graduate Education and the Professoriate, awarded $2.1 million to fund the project and a research component, in the hope that a model will be developed that can be replicated at other universities nationwide.

“Working with other institutions expands our views of higher education in California and beyond, and gives us a renewed sense of our broader societal mission,” said Carol Genetti, dean of the graduate division at UCSB.

Graduate Division leaders working with co-PIs David Sherman (psychological and brain sciences) and Anne Charity Hudley (linguistics) wrote the project proposal after realizing that, in order to be as prepared as possible, doctoral students who are interested in teaching at colleges and universities need experience and training beyond what they get at their home campuses. Last summer, doctoral students in science, technology, engineering, math, social sciences, linguistics, and psychology met at UCSB for a workshop on how to teach effectively. In fall, the students were paired with CSU faculty members in their disciplines to co-teach classes on those campuses. The UC schools will study the psychology behind what drives young doctoral candidates to make their career decisions, such as choosing teaching-intensive schools over research-heavy institutions.


Undoing Superfund Toxicity

A Bit of Good News for Frogs

Contaminated ponds at the Leviathan Mine Superfund site.

After more than twenty years, a once heavily polluted system of streams in California’s Sierra Nevada mountains is again supporting diverse life. Leviathan, Aspen, and Bryant Creeks were decimated by iron and other acidic heavy-metal pollutants released from the Leviathan mine, a 250-acre open-pit sulfur-extraction site located near Lake Tahoe. The streams are recovering thanks to a collaboration led by a team of UC Santa Barbara scientists. Their compiled research on the Leviathan project was published this fall in the Journal of Environmental Toxicology and Chemistry.

“This is an encouraging story that shows the success of lasting efforts to capture and treat the mine effluent and improve stream water quality and ecological health,” says David Herbst, a freshwater aquatic scientist based at UCSB’s Sierra Nevada Aquatic Research Laboratory in Mammoth Lakes. He oversaw the group that performed the recovery work.

The Leviathan mine opened in 1951, and the sulfur extracted from it was trucked to another mining facility near Nevada, where it was used to dissolve copper ore. Leviathan closed in 1962 and eventually became a U.S. EPA Superfund site. The first step in recovery was containing polluted snowmelt and rainfall, followed by using chemical treatments to remove the toxins from the water in two streams. In the third, the process that formed the acidic soup of metals was chemically reversed, turning them back into minerals that settle in a microbial wetland from which the cleansed water flows back into the stream.

Herbst, who has dedicated much of his career to studying freshwater insect life, monitored the return of many insect species in downstream areas of the creek system and discovered that they began to flourish, moving upstream as they recovered. As the food web becomes reestablished, fish and other aquatic and riparian life can also recover. “The numbers and variety of stream bugs tell us how well the treatments worked to alleviate the toxicity,” said Herbst. “Clearly, this is a case of the ‘canary in the coal mine,’ and I’m glad to report that our canaries are alive and doing well.” While noting that recovery is not complete, he says, “This case demonstrates the importance of using science to inform how we go about doing environmental cleanups and large-scale restoration projects.”

Following years of mass mortality as a result of infection with the deadly Batrachochytrium dendrobatidis fungus, or chytrid, some frog populations in El Copé, Panama, now seem to be co-existing with the pathogen and stabilizing their populations. “Our findings lead us to conclude that the El Copé frog community is stabilizing and is not drifting to extinction,” said Graziella DiRenzo, a postdoctoral researcher in ecology, evolution and marine biology (EEMB) at UC Santa Barbara. “Extinction is a big concern with chytrid worldwide. Before this study, we didn’t know a lot about the communities that remain after an outbreak.”

DiRenzo, a scientist in the lab of EEMB professor Cherie Briggs, outlined her findings in the paper “Eco-Evolutionary Rescue Promotes Host-Pathogen Coexistence,” which appeared in the journal Ecological Applications. She conducted her research while at Michigan State University and the University of Maryland.

The mass die-offs were connected to the chytrid fungus, also known as Bd, at around the turn of the 21st century. In El Copé, Bd took out almost half of the amphibian population within a few years. From 2010 to 2014, DiRenzo and colleagues revisited the El Copé site to sample the frog population for Bd infection. They assumed the population was heading toward extinction but found, upon examination, that it seemed to be relatively stable. Repeated data collection and statistical modeling led to the finding that while some frog species succumb to the disease, others persist.

An El Copé frog, coping.

The researchers attribute this host-pathogen coexistence to “eco-evolutionary rescue,” which occurs when ecological and/or evolutionary mechanisms allow the host and the pathogen to persist. Such mechanisms include amphibians’ immune adaptation, and the reduced density and richness of frog species in El Copé, where fewer frogs equate to fewer opportunities for infection. Of the 74 amphibian species noted in El Copé prior to Bd’s arrival, approximately 32 remain.

It’s still too early to tell whether the El Copé frog population will continue to stabilize in the long term, DiRenzo noted, because climate change, habitat loss, pollution, invasive species, and other stressors present additional challenges to amphibian communities. “Approximately one-third of the world’s seven thousand amphibian species are at risk of extinction,” said DiRenzo, who is now studying how Bd interacts with other pathogens that affect amphibians. “Our results provide a glimmer of hope for amphibian populations persisting in the face of major threats, but it does not mean that amphibian populations will rebound to their abundance prior to the threat, or that the ecosystem structure has not been compromised.”



Alumna and donor:

Pamela Lopker

The practical, service-centered leader of QAD

As the co-founder and president of QAD, Inc., Pamela Lopker has proven herself a brilliant programmer and a formidable entrepreneur. She is also highly practical. As a math major at UCSB, she took computer science courses (before there was a computer science department) with professional life in mind. After graduating, she spent two years at Comtek Research writing software for Raytheon before teaming up with her then-boyfriend and eventual husband, the late Karl Lopker, who had founded Deckers Outdoor, to start QAD, which provides sector-leading integrated business-manufacturing software to companies worldwide. In 2005, the Lopkers were named Distinguished Alumni of UCSB in recognition of their generous giving through the Lopker Family Foundation. Together, they supported an endowed faculty chair and multiple student fellowships in the College of Engineering, and offered programmatic and scholarship support for the Art Department. Convergence visited with Pamela Lopker.



C: What led you to computer software and coding? PL: In my junior year at UCSB, I went to the placement center, where I took some aptitude tests and was told I could be an actuary. And I was thinking, Do I really want to be an actuary? So I started looking around. I had never taken a computer science class but thought I might be good at it. And I could see that it was a growing market, particularly on the West Coast, so it seemed a good fit for my capabilities. I was trying to get a career that would allow me to support myself and have a nice lifestyle.

C: You gave up defense work in the seventies after not being permitted aboard a Navy ship because you were a woman, a fact that, you realized, would diminish your role and impact in your field. What are your thoughts as a female entrepreneur and business leader in the male-dominated tech sector? PL: I love to see women in the new-hire classes in technical positions. In our last round, I think we had something like ten new hires, and four were women, who typically make up only about twenty percent of the work force in computer science. It is still considered a male career, and I don’t really understand that, because it’s a great profession for women. Software developers make a good living. You can often work as an independent contractor and work from home. You don’t have to work full-time if you want to have a family. It’s a very flexible career, which is why I encourage young women to go into the tech industry.

C: You’ve mentioned being asked by students what they should study to make the most money in the least time. Do you see alternatives to that perspective? PL: It seems that the current generation is maybe not as materialistic as our generation was. I see both of my kids wanting to do something to change the world. They really do want to have a small footprint. They don’t want to drive cars. They don’t want to leave a lot of trash in their wake. And they’re very, very conscious about that. I believe that there is new thinking about wanting to tread lightly on the Earth and to improve things and give back. But there are so many different personalities. You can say that about one group of people, and then there’s another group that has a whole different idea.

C: You have said that to be a successful entrepreneur, your passion for something has to be internalized? Can you explain? PL: Maybe it works for certain people, but in my mind, you can’t just be sitting there thinking, How can I do the least and make the most money? You really need to want to do good for your customer. You have to want to create a product that your customer will be really happy with. And you have to personally want to do that, because if you don’t, you might make a sale, but you’re not going to have a customer. You want a customer for the long term, and you want that customer to believe that you will always have their interest in mind and that you actually want to and do deliver superior products. And if you’re not completely involved in and passionate about that, I just don’t think it’s going to work.

C: You are a strong supporter of the UCSB College of Engineering. Why do you contribute? PL: We give back because that’s what we came from; it’s where we’ve succeeded. And the college offers tremendous access for the community. If you look at all the companies in Santa Barbara and Goleta that have been started by UCSB graduates, it’s huge. Universities like Stanford and UC Berkeley are surrounded by companies started by their alumni. I think there is the same entrepreneurial environment around UCSB, and those companies add a lot to the economy.

C: How does the QAD landscape look going forward? What’s next? PL: Everything now is cloud computing. You no longer sell software for somebody to install on the premises. And then the next big challenge is incorporating artificial intelligence into the software and doing it well. If we missed the AI movement, we could have problems. Fortunately, we already have many “techy” people with that capability, but it’s certainly something we look at as we’re hiring new people. The other big area is agility, so that the software we create can change when a customer needs it to, and the customer can make the change. It’s about building tools that allow them the agility to mold and change their businesses and have the software support without always having to come back to us because they need a new field on their screen or they need to calculate their pricing a little differently.


A SpaceX rocket carrying a UCSB experiment to the International Space Station launches from Cape Canaveral. Photo courtesy of NASA.



SEDIMENTS in space

In June, a SpaceX rocket carried a UCSB experiment to the International Space Station

When the SpaceX CRS-15 mission launched from Cape Canaveral on June 29, it carried some 250 experiments bound for the International Space Station. Among them was one designed by UC Santa Barbara mechanical engineers Paolo Luzzatto-Fegiz and co-PI Eckart Meiburg. Recently the two collaborators, both of whom specialize in fluid mechanics, have partnered to advance understanding of sediment, including its compositional characteristics and dynamics. In particular, said Meiburg, “We’re trying to understand some fundamental processes that govern the transport of sediment.”

Sediments can carry pollutants into water bodies, redistribute nutrients in watersheds, and affect land use. When an ocean earthquake occurred near Newfoundland in 1929, enough accumulated sediment moved to cause a tsunami that killed 28 people and devastated transatlantic telecommunications cables.

The phenomena of special interest to Luzzatto-Fegiz are electrostatic forces that cause tiny solid particles to come together in fluid. They are difficult to study on Earth, because the forces driving them are small and largely obscured by gravity. “For example,” Luzzatto-Fegiz says, “rivers send sediment, which is made up of particles of clay, sand, and other matter, to the ocean. As the particles sink, they very slowly come together due to some weak but important electrostatic forces that play a role in nutrient cycling and mineral accumulation in the ocean.”

It is, however, very hard to measure those forces in the lab. Luzzatto-Fegiz explains: “Ideally, we would like to shake a little vial full of disaggregated mud and then observe the clumping process. But because the mud is more than twice the density of water, it sinks very fast, too fast for us to observe the process accurately enough to understand the dynamics involved.” Further, as the clumps sink, friction with the water tends to cause pieces to be torn away from the clump. “If you were to put the clay into the ocean where it is three hundred feet deep, there would be plenty of time to observe the process before the clump reached the seabed, but it’s not practical to do that,” he adds. And the shearing force would still affect the clumping process. To get around that, Luzzatto-Fegiz, working with

Meiburg, postdoctoral researcher Bernard Vowinckel, and Nick Rommelfanger, an undergraduate physics student in UCSB’s College of Creative Studies, developed what may be the first series of sediment experiments to be performed in the Space Station’s microgravity environment. Electrostatic forces that cause particles to cluster exist whether or not gravity is present. The experiment allowed the team to observe that process over a large time scale — one month — in the absence of gravitational settling. Space-station astronaut Dr. Alexander Gerst, who is also a sediment expert, performed the experiments the week of July 23 and provided results as downloadable images.

The team hoped to see multiple important results, including the size of the largest aggregates in space. On Earth they are about one millimeter in diameter — perhaps, Luzzatto-Fegiz suggests, because friction causes the clumps, or “flocs,” to break apart as they fall through the fluid. Clays that have strong surface charge may not aggregate at all in Earth’s oceans. “The main thing we don’t know how to model is these cohesive forces,” he explains, “so we’re going to test it to see if, by removing the gravity element, we get larger and larger clusters. If so, we can then use that to build a model of the cohesion and then combine that with the model of the fluid friction, which is very well established.”

Because many experiments on engineered colloids (liquids with particles in them) have been conducted previously on the space station, the equipment necessary for the work was already in place, so Luzzatto-Fegiz had to send only samples. At press time, the team was still awaiting some results, but Luzzatto-Fegiz noted at least one exciting anomaly: “We can already observe that many clay mixtures that do not aggregate on Earth will readily form clusters in the microgravity of space. This gives us crucial information that we can use to develop our models.”

The researchers hope that the experiment conducted aboard the space station will provide them with a better understanding of the cohesive forces in sediment. The work has several important Earthbound applications, including ecosystem modeling, deep-water hydrocarbon exploration, carbon sequestration, and mobilization of contaminants in water bodies around the world.
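For a rough sense of why the settling is too fast to watch on Earth, here is a back-of-the-envelope sketch using Stokes' law for a small particle at low Reynolds number. The particle size, density, and vial height are hypothetical example values, not figures from the UCSB experiment.

```python
# Illustrative back-of-the-envelope only: Stokes settling velocity for a small
# sediment grain, showing why clay sinks too quickly on Earth to watch it
# aggregate. The size and density below are hypothetical example values.

def stokes_velocity(diameter_m, particle_density, fluid_density=1000.0,
                    viscosity=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s), valid only at low Reynolds number."""
    r = diameter_m / 2.0
    return 2.0 / 9.0 * (particle_density - fluid_density) * g * r**2 / viscosity

# A ~10-micron clay grain a bit more than twice the density of water:
v = stokes_velocity(10e-6, 2200.0)
print(f"settling velocity ~ {v*1e3:.2f} mm/s")
print(f"time to cross a 5 cm vial ~ {0.05 / v / 60:.0f} min")
# Millimeter-scale flocs settle far faster still, which is why a month-long,
# gravity-free observation window on the ISS is so valuable.
```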



AN UNFOLDING MISSION

Understanding why proteins lose function when bonded to artificial surfaces



It seems simple enough: a person takes a medicine at a given dose to treat a specific condition. But dosing is not an exact science. Different people, even people who weigh about the same, metabolize medicines at different rates, and dosing based on body mass does not necessarily capture those metabolic differences between patients. Differences between the biology of men and women can further complicate drug dosing.

For some time now, chemistry professor and director of the UCSB Center for Bioengineering, Kevin Plaxco, has been leading the development of electrochemical sensors to conduct real-time monitoring of drug levels in the body. Coupled to feedback control, that could make it possible to monitor the effective drug concentration in a patient’s blood moment to moment and continuously fine-tune the amount of the drug that is released.

To date, Plaxco’s process has been built around a tiny gold-wire sensor that can fit through a very narrow, 22-gauge, hypodermic needle. The wire is coated with a type of DNA molecule, called an “aptamer,” to which the specific drug of interest can stick, or “bind.” In current experiments, the DNA-coated sensor is then inserted into the body of a rat, where binding of a specific molecule to the aptamer — so far, a half-dozen different drugs, including several “last-line-of-defense antibiotics,” have been administered — generates an easily detectable electronic signal that scales with the concentration of the target molecule. So far, the Plaxco group has adapted multiple sensors to such in-vivo deployment and used them to measure concentrations of multiple drugs in the living body in real time.

Martin Kurnik (left) and Kevin Plaxco at the bench in the Plaxco lab.

A limitation to their approach, however, results from using DNA as the recognition element, because it can recognize only a limited number of molecules. In further experiments, Plaxco and Martin Kurnik, a postdoctoral researcher working in his lab, have sought to adapt protein molecules to the platform, because their much-greater chemical complexity allows them to bind a far greater range of target molecules.

But using proteins presents a problem of its own. Because the gold-plated sensors pick up and provide an electrochemical readout, their surface has to be a charged conductor. “Under the conditions we employ, the surface of our sensor has a negative charge,” Plaxco says. And, like all DNA molecules, aptamers also have a negative charge. That prevents the aptamers from adhering inappropriately to the sensor surface, i.e. in a way that causes the DNA to unfold, losing its structure and function.

But when a protein — which, unlike a DNA molecule, contains both negative and positive charges — is bonded to an artificial surface, such as the sensor, it stops functioning. “Attaching DNA to a surface does not change its folding stability very much,” Plaxco says. “But when we tried proteins, they just unfolded, stuck on the surface, and lost their function. That's what got us thinking about this problem.”

Plaxco notes that several theories explain the mechanisms by which the protein’s attachment interaction with the surface would change the protein’s stability, but it seems to be closely tied to charge. He believes that when the protein attaches to the sensor, its positive charges are drawn to the negative charge of the gold surface, causing the protein to unfold and cease functioning.

That’s the theory. “But until now,” Plaxco says, “no one has been able to measure that experimentally.”

A paper published in August in the Proceedings of the National Academy of Sciences, co-authored by Kurnik and Plaxco, explains a new procedure for understanding the chemistry and physics of the unfolding process. Specifically, the method lets researchers measure by how much the relative amounts of folded and unfolded protein molecules (i.e., the protein’s “stability”) differ when a protein is confined to a surface compared to when it is free in solution, and to systematically explore the physical origins of that effect.

“This allows us to start to dissect what drives proteins to unfold on the surface,” Plaxco says. “Our technique gives us great control over the chemistry and the charge on the surface. We can tune those variables and see how changes in the parameters change a protein’s stability.”

“The novelty in this is that we're performing a quantitative measurement of how the folding physics of proteins are affected by their being confined to a surface,” says Kurnik. “As far as we are aware, there are no prior examples of this. What I hope is that, first, this will allow us to make protein-based biosensors. That could make the unique sensor platform that Kevin has developed applicable to a much wider range of molecules, allowing it to be used as a diagnostic tool for a broader variety of medical conditions.”

The research may also prove valuable in understanding some fundamentals of cell biology. Kurnik explains: “Because our method makes it easier than ever for researchers to perform high-precision measurements of protein-surface interactions, I hope that it will also be used to provide new insights into the physics of how proteins interact with various surfaces, including ones that mimic the properties of biological membranes. Our current understanding of these effects is limited, and we cannot claim to comprehend the molecular biology of the cell without quantitative descriptions of the interplay between a cell’s soluble proteins and its multitude of membrane surfaces.”
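As a rough illustration of how such a folded/unfolded comparison can be turned into a number, here is a short sketch based on the standard two-state relation for folding free energy. The folded fractions below are invented example values, not data from the paper.

```python
# Illustrative sketch only: quantifying a "stability" comparison like the one
# described above. The folded/unfolded populations are made-up example numbers.
import math

R = 8.314  # gas constant, J/(mol*K)
T = 298.0  # temperature, K

def folding_free_energy(frac_folded):
    """Delta-G of folding from the folded/unfolded population ratio."""
    K = frac_folded / (1.0 - frac_folded)   # two-state equilibrium constant
    return -R * T * math.log(K)             # J/mol; more negative = more stable

dG_solution = folding_free_energy(0.99)   # hypothetical: 99% folded free in solution
dG_surface  = folding_free_energy(0.60)   # hypothetical: only 60% folded on the sensor

ddG = dG_surface - dG_solution            # destabilization caused by the surface
print(f"ddG ~ {ddG / 1000:.1f} kJ/mol of destabilization upon surface attachment")
```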




THE SCIENCE AND ENGINEERING OF SUSTAINABILITY

The current geologic period is often referred to as the Anthropocene, indicating that, for the first time in history, human activities are the largest influence on climate and the environment. A world population of some 7.5 billion is putting tremendous pressure on the vast web of natural systems that sustain diverse life on Earth. We derive most of our energy from fossil fuels, with negative environmental consequences. In light of such trends, it is not surprising that “sustainability” has become a watchword of our time. And while its meaning can vary, we think of it as referring, generally, to whether human activities can be maintained at current levels without negative long-term impacts that affect life and prosperity. We have a ways to go.

Many researchers in the sciences at UC Santa Barbara approach sustainability from an ecological perspective, seeking to reduce pollutants, restore fisheries, and mitigate deforestation, habitat loss, and climate change. At the College of Engineering, and in the Departments of Chemistry and Chemical Engineering, energy is a main focus of sustainability research: how to make more of it for less cost and with a smaller environmental footprint, how to use and store it more efficiently, how to conserve it in manufacturing processes, and how to develop alternatives to fossil fuels. In this “FOCUS ON: Sustainability” section, we share some of the important sustainability-related work being pursued by researchers in engineering and the sciences at UCSB.



MAJOR GRANTS FOR GRAND CHALLENGES

UCSB FACULTY WORK TOWARD CLEAN WATER, BETTER BATTERIES, QUANTUM COHERENCE, AND HUMAN-LIKE COMPUTING


The 21st century presents the world with multiple sustainability challenges, especially on the energy front. Energy demand is growing rapidly, but fossil-fuel combustion drives climate change, creating a need to transform how we generate, supply, transmit, store, and use energy. Water is generally required to produce energy, and purifying water requires energy, increasing the urgency of finding energy-efficient ways to purify water and water-efficient ways to produce energy.

Ram Seshadri

While incremental advances to existing technologies are helpful in these areas, such marginal improvements will not be enough to meet future demands. Radical new technologies are needed, and, recognizing that advances at the frontiers of science will be required to produce them, in 2009 the U.S. Department of Energy (DOE) established the Energy Frontier Research Center (EFRC) program. Its goal: to establish the scientific foundation for a fundamentally new U.S. energy economy, one that will decisively enhance U.S. energy security and protect the global environment.

Since beginning the ambitious program, the DOE has established 36 integrated, multi-investigator EFRCs around the nation, involving partnerships among universities, national laboratories, nonprofit organizations, and for-profit firms. Each is conducting fundamental research focused on one or more “grand challenges” related to the shared energy future of the nation.

Now, scientists affiliated with UC Santa Barbara’s California NanoSystems Institute (CNSI) have joined that effort on four newly approved, renewable four-year EFRC projects.

The UCSB team is particularly happy to combine expertise in soft (e.g. polymer) and hard (e.g. oxide) materials to improve the performance of lithium ion batteries.

ON THE WATER FRONT

The $10.75 million Center for Materials for Water and Energy Systems (M-WET) is a joint effort between UCSB ($3.9 million) and the project home, the University of Texas, Austin. Rachel Segalman, chair of the Department of Chemical Engineering, is the project associate director and PI on the UCSB team, which includes Mahdi Abu-Omar (chemistry), Christopher Bates (materials), Michael Doherty (chemical engineering), Glenn Fredrickson (materials, chemical engineering), Songi Han (chemistry, chemical engineering), Craig Hawker (materials, chemistry), Baron Peters (chemical engineering, chemistry), M. Scott Shell (chemical engineering), and Todd Squires (chemical engineering).

Rachel Segalman

The M-WET team will take a materials approach, working at the atomic level to “design and perfect” revolutionary new forms that can be used as membranes for filtering chemically contaminated water for re-use. The researchers will seek to “master energy and information on the nanoscale to create new technologies having capabilities rivaling those of living things.”

Synthetic polymer membranes are widely used to purify water, mainly because they are more energy efficient than competing technologies. But current membranes were not designed for, and are not suitable for, applications involving water that is heavily contaminated with organic and inorganic components resulting from, for instance, oil and gas production. In addition, water’s interactions with, and its structure near, complex heterogeneous surfaces (filtering membranes; see the sidebar below) remain poorly understood, and those gaps in knowledge limit the ability of scientists and engineers to develop materials for energy and water applications. M-WET is intended to fill those gaps, and the UCSB team will bring deep and broad expertise in polymer research to the project.

AIMING SIMULATIONS TOWARD HIGH-PERFORMANCE FILTERS

The dynamics of water near solid surfaces play a critical role in numerous technologies, including water filtration and purification, chromatography, and catalysis. One well-known way to influence those dynamics, which, in turn, affect how water “wets” a surface, is to modify the surface hydrophobicity, or the extent to which the surface appears “oily” and repels water. Such modifications can be achieved by altering the average coverage, or surface density, of hydrophobic chemical groups on the interface.

Now, in a paper published in the Proceedings of the National Academy of Sciences, lead author Jacob Monroe, a fifth-year PhD student in the lab of UCSB chemical engineering professor M. Scott Shell, describes a new perspective on the factors that control water dynamics at interfaces. The findings could have important ramifications for membranes, especially those used in water filtration.

“What we’re seeing is that just changing the patterning alone — the distribution of those hydrophobic and hydrophilic groups, without changing the average surface densities — produces fairly large effects at an interface,” Monroe said. “That’s valuable to know if I want water to flow through a membrane optimally.”

Monroe and Shell found that if they arranged all the hydrophobic groups together and made the surface very patchy, the water moved faster, but if they spread them all apart, the water slowed down. “If the membrane were for water filtration, you might want the water to move quickly across it,” Monroe notes, “but you might also want it to sit at the surface to repel particles that stick to it and foul the membrane.” Monroe's finding about patterning holds immediate relevance for interpreting experiments, because it means that assessing the surface density of hydrophobic groups alone is not enough to characterize the material.

Monroe and Shell discovered the distribution effect by combining simulations of molecular dynamics with a genetic algorithm optimization, which is simply an algorithm that emulates natural evolution — here used to identify surface patterns that either increase or decrease surface-water mobility. “It’s kind of like a breeding program,” Monroe explains. “If you had a pool of dogs and wanted a certain kind of dog, say one that’s bigger or has a shorter tail, you would breed the dogs that have those characteristics. We do the same thing on a computer, but our goal is to design a surface having specific characteristics that allow it to perform how we want it to. You need the fitness metric, and then you can tune the genetic algorithm to optimize specific performance characteristics, for instance, to have water move quickly across a membrane or to adsorb on a surface.

“We run molecular dynamics simulations to assess those properties,” he adds. “We assign a level of fitness to each individual, and then hybridize the most fit individuals spatially and drive the systems toward the desired properties.”

Jacob Monroe (left) and M. Scott Shell

“This work is exciting because it shows for the first time that nanoscale patterning on surfaces is an effective means of engineering materials that give rise to unique water dynamics,” Shell says. “It has long been thought that biological molecules, like proteins, use surface chemical patterning to influence water dynamics toward functional ends, such as accelerating binding events that underlie many biomolecular processes. We have now used a computational optimization algorithm to 'learn' what these patterns should look like in synthetic materials having target performance characteristics. The results suggest a new way to design surfaces to precisely control water dynamics near them, which becomes widely important to chemical separations and catalysis tasks.”

The research will also be useful in a new Energy Frontier Research Center (EFRC) project (see the article above). In that effort, the researchers will be taking a materials approach to “design and perfect” revolutionary new materials that can be used as membranes for filtering chemically contaminated water for re-use.
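The "breeding program" Monroe describes can be sketched in a few lines of code. In this minimal illustration the fitness function is a stand-in for the molecular dynamics simulations the group actually runs, and the pattern size, population, and mutation rate are hypothetical.

```python
# Toy genetic-algorithm sketch of the "breeding program" idea described above.
# The fitness function here is a stand-in; in the real workflow each candidate
# surface pattern would be scored with a molecular dynamics simulation.
import random

PATTERN_SIZE = 64          # hypothetical: 64 surface sites, hydrophobic (1) or hydrophilic (0)
POP, GENERATIONS = 30, 50

def fitness(pattern):
    # Stand-in score that rewards clustering like sites together
    # (the article notes patchy surfaces sped the water up).
    return sum(a == b for a, b in zip(pattern, pattern[1:]))

def crossover(p1, p2):
    cut = random.randrange(1, PATTERN_SIZE)
    return p1[:cut] + p2[cut:]

def mutate(pattern, rate=0.02):
    return [1 - s if random.random() < rate else s for s in pattern]

population = [[random.randint(0, 1) for _ in range(PATTERN_SIZE)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]                     # keep the most fit individuals
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]     # "hybridize" them
    population = parents + children

best = max(population, key=fitness)
print("best score:", fitness(best))
print("best pattern:", "".join(map(str, best)))
```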


POWERING UP: DELIMITING LITHIUM ION BATTERIES

The Synthetic Control Across Length-scales for Advancing Rechargeables (SCALAR) center is a collaboration with UCLA. The group, which includes UCSB PI Ram Seshadri (materials, chemistry) plus Brad Chmelka (chemical engineering), Anton Van Der Ven (materials), and Segalman, will take a materials approach to discovering the science that will enable more-powerful and energy-efficient batteries, which last longer and charge faster. Total funding for the project is $9.75 million, with UCSB researchers receiving just under $2 million for their part.

For more than two decades, lithium-ion batteries (LIBs) have had a profound impact on the development of a vast range of electronics that are important in all aspects of daily life. LIB market share is already enormous and will only grow with the electrification of transportation. In fact, electrified vehicles alone are expected to generate a nearly twenty-fold increase in LIB demand by 2025, and the integration of grid-level energy storage for renewables will further extend market dominance.

The materials used in LIBs work surprisingly well, but new materials are needed for better performance that will enable technological advances, whether it’s making smaller, longer-lasting devices or electrifying larger modes of transportation, such as airplanes. Moreover, the ability to move to truly renewable energy sources — e.g. wind, solar, and hydro — requires improved battery storage capacity to handle the time-dependent availability of the energy generated by those sources.

Recognizing the limitations of LIB technology and the challenges they present, the SCALAR team has received funding for an integrated four-year program to create the enabling science and technology for next-generation electrochemical energy storage. “Energy storage in batteries has become an increasing concern that can enable numerous future technologies, including those associated with renewable and clean-energy technologies,” says Seshadri, who is also director of the NSF-funded Materials Research Science and Engineering Center (aka the Materials Research Laboratory) at UCSB. “UCSB has great depth of expertise in this area, and the EFRC is an opportunity to bring together a number of experts to address these important challenges. The UCSB team is particularly happy to combine expertise in soft (e.g. polymer) and hard (e.g. oxide) materials to improve the performance of lithium ion batteries.”

TOWARD CONTROLLING QUANTUM COHERENCE

Ania Bleszynski Jayich

You need to do those things faster than the quantum bit loses its 'quantumness.'

Physicist Ania Bleszynski Jayich is the sole UCSB researcher on another EFRC project, titled Novel Pathways to Quantum Coherence in Materials and based at the Lawrence Berkeley National Laboratory. She explains that a key goal of her part in the $11.75 million project, $540,000 of which is directed toward her research, is to “dramatically expand our understanding of quantum coherence in solids by building on fundamental materials discoveries.”

Quantum systems, such as atoms, electrons, and photons, are unique in that they can exist in what is called a superposition state. That is, they can be in two places at once. Quantum coherence refers to this fleeting phenomenon, which has no limit to how briefly it may last before dissipating into a classical state in a process called quantum decoherence. Quantum computing, quantum communication, and other quantum applications all take advantage of, and are made possible by, the superposition state.

In recent years, considerable progress has been made in creating systems that exhibit quantum coherence for measurable periods of time, on the order of microseconds and milliseconds. Jayich explains: “That has led to an explosion of research in the field of quantum technologies, where researchers utilize these quirky, non-intuitive phenomena of quantum mechanics to realize functionalities that are not enabled by classical physics.”

For quantum technologies to be functional, quantum coherence has to exist for at least a few nanoseconds, long enough to perform a logic operation or store information in memory. “You need to do those things faster than the quantum bit loses its ‘quantumness,’” Jayich notes. “For instance, if I need to send you some encrypted information, then in the time frame lasting from the moment I send the information to you until you receive it, everything has to be quantum coherent. That's critical to enabling these technologies.”

The “novel pathways” part of the project title refers to developing new methods for maintaining quantum coherence, addressing the fact that coherence is highly susceptible to environmental factors. “If you do it just right, you can understand what the decohering interactions with the environment are and fine-tune the system to stabilize coherence,” she says. “The materials approach means looking for materials that preserve coherence, looking at the defects in them and trying to improve them, kind of cleaning up the materials and characterizing them."
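For a rough feel for what "faster than the quantum bit loses its quantumness" means in numbers, here is a small sketch that assumes a simple exponential dephasing model; the coherence time and gate time used are hypothetical, not measurements from this project.

```python
# Illustrative numbers only: a rough feel for "operations must beat decoherence."
# Assumes a simple exponential dephasing model, C(t) = exp(-t / T2); the T2 and
# gate-time values below are hypothetical, not measurements from this project.
import math

T2 = 100e-6        # coherence time: 100 microseconds (hypothetical)
t_gate = 50e-9     # time for one logic operation: 50 nanoseconds (hypothetical)

ops_within_coherence = T2 / t_gate
coherence_after_1000_ops = math.exp(-(1000 * t_gate) / T2)

print(f"~{ops_within_coherence:.0f} operations fit inside one coherence time")
print(f"coherence remaining after 1,000 operations: {coherence_after_1000_ops:.2f}")
```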

Chandra Krintz and Rich Wolski are using big data and machine learning to help ecologists and other scientists gain better access to, and make better use of, millions of images of wildlife photographed on UCSB’s Sedgwick Ranch Reserve. Krintz and Wolski are also working on well-pump monitoring solutions to increase water conservation in agricultural fields.


GREAT MINDS: THE NEUROMORPHIC COMPUTER

Back in the 1970s, “Moore’s Law” posited a doubling of the number of transistors on an integrated circuit every two years, and a corresponding increase in processing speed. And while it has held true for several decades, experts agree that in the next 15 to 25 years, that cycle will end. With more data available all the time, a new computing paradigm is needed.

Associate Professor Jon Schuller is the one UCSB researcher on an EFRC project — Quantum Materials for Energy-Efficient Neuromorphic Computing (Q-MEEN-C) — directed at identifying, developing, and characterizing quantum materials to enable energy-efficient “neuromorphic” computing. A neuromorphic computing architecture would make it possible to solve problems in a way that replicates how the human brain functions. Total funding for the four-year project is $9.75 million, with $200,000 directed at Schuller’s work.

Based at UC San Diego, the Q-MEEN-C project is meant to establish fundamental research to drive the next information revolution. The grand challenge is to develop a highly energy-efficient and fault-tolerant computational machine “that works like the brain.” Conventional materials, devices, and architecture are not up to that task, because they are comparatively energy-inefficient (the brain uses very little energy), are strongly affected by defects (the brain is very good at accommodating and sorting through defects), and have reached fundamental limits in terms of size and processing speed (the end of Moore’s Law).

Q-MEEN-C researchers will explore and seek to understand and control the novel physical mechanisms of quantum materials that can enable functionalities necessary to develop a neuromorphic machine. “By mimicking aspects of the brain and using them in your computer architecture, you can unlock the ability to process energy efficiently, and new materials are the key to that,” Schuller says.

As an expert in conducting optical spectroscopy of quantum materials and demonstrating their use in reconfigurable devices, Schuller will be a valuable contributor in terms of identifying and characterizing potentially useful materials. He collaborated previously on an internal UC multi-campus grant, working with four UC San Diego colleagues who are also part of Q-MEEN-C.

Jon Schuller

By mimicking aspects of the brain and using them in your computer architecture, you can unlock the ability to process energy efficiently, and new materials are the key to that.

memory. “You need to do those things faster than the quantum bit loses its ‘quantumness,’” Jayich notes. “For instance, if I need to send you some encrypted information, then in the time frame lasting from the moment I send the information to you until you receive it, everything has to be quantum coherent. That's critical to enabling these technologies.” The “novel pathways” part of the project title refers to developing new methods for maintaining quantum coherence, addressing the fact that coherence is highly susceptible to environmental factors. “If you do it just right, you can understand what the decohering interactions with the environment are and fine-tune the system to stabilize coherence,” she says. “The materials approach means looking for materials that preserve coherence, looking at the defects in them and trying to improve them, kind of cleaning up the materials and characterizing them."

Chemical engineer Michelle O’Malley and her team have discovered new enzymes in large-bovine digestive tracts that allow the animals to extract the energy in plant cellulose. Understanding and cultivating the digestive microbes could enable production of clean fuel and chemicals from agricultural waste. CONVERGENCE 18


MELLICHAMP CHAIRS: CLUSTERING FOR SUSTAINABILITY

“Our society is built around stuff,” says chemical-engineering professor Susannah Scott, the first endowed chair in a group intended to total four in the Mellichamp Sustainable Materials and Product Design cluster. “The stuff comes from somewhere in the form of resources, which we transform into products that we use and eventually throw away. That whole process has an impact on the environment.” The sustainability group was established in 2014 as the third Mellichamp research cluster. The program began in 2001, when Professor Emeritus Duncan Mellichamp, a longtime member of the UCSB and UC Academic Senates, and his wife, Suzanne, funded a single chair in process control. Two full clusters followed, in systems biology (2003) and globalization (2008), codifying the format of each cluster having four chairs, its own area of emphasis, a life span of fifteen years, and a mission of solving problems while developing important new areas of expertise at UCSB. The Sustainable Materials and Product Design cluster was established with the goal of developing materials and processes that could reduce the environmental impact of manufacturing. A broader, affiliated goal was to provide more people with technologies that enable a high standard of living without exceeding Earth’s capacity to supply the necessary raw materials and absorb the inevitable byproducts of such activities. Joining Scott as chairs in the cluster are chemistry professor Mahdi Abu-Omar, who came to UCSB in 2016 with expertise in

“green” chemistry, and chemical engineer Phillip Christopher, who arrived at UCSB in January 2018. Both are also experts in catalysis. A search is under way for the final chair, whose appointment will be at the Bren School of Environmental Science & Management and who will focus on resource systems analysis, which incorporates modeling to identify ways to reduce energy and resource consumption for industry. Scott’s goal within the cluster is to improve the efficiency and reduce the environmental impacts of chemically intensive catalytic processes, which are involved in about ninety

percent of manufacturing and, according to Christopher, are responsible for wasting a few percent of all the energy the world uses. Scott also focuses on minimizing the formation of undesired products in catalysis, reducing demand for rare metals by creating catalysts based on materials that are abundant in the earth, and scaling up environmentally friendly methods of catalysis for depolymerizing the lignin in fibrous plants so that the attached sugars, in the form of cellulose, can be used as biofuel. A key challenge facing the cluster re-

One of the valuable aspects about these Mellichamp chairs is that, to make an impact, you really do need to bring in people from different areas.

searchers is the fact that, as Scott says, “We’ve built a society that has been basically dependent on carbon as the principal source of fuel and the vast majority of chemical components used to make fibers, polymers, paints, adhesives, and much more. Getting and using oil is a highly optimized process that has a hundred-year head start on bio-derived chemical fuels and products. It works very nicely.” But, she adds, because of the negative environmental impacts of extracting and using fossil fuels, “In the future, we will have to let go of that idea.” Abu-Omar focuses his research on turning waste products into energy. Like Scott, he works on lignin but comes at it from a different perspective. “Lignin can be used to produce interesting materials like those we make from petroleum today,” he explains. “In my group, we use chemistry to take the lignin polymer apart and form structures that we can then put back together in a way that gives them unique new properties like those of structural plastics.” Abu-Omar’s group has made strides in enhancing the selectivity of that process, so that it yields only the products, or molecules, they want. “We’ve figured out how to get two or three products from lignin instead of a dozen products,” he says. He is also taking a visionary look at reusing plastics beyond recycling them, such as transforming a used plastic bottle into something other than another bottle. “We want to think about it in a non-conventional way, so that the used bottle is now a

Some 140 million chemicals are registered with the Chemical Abstracts Service, and another 25,000 to 30,000 are added every day, too many to test for potential toxicity to humans and ecosystems. The Chemical Life Cycle Collaborative (CLiCC) at UCSB, funded by the U.S. EPA, is a new open-access online service that uses models to enable rapid analysis of chemical life-cycle impacts. CLiCC is directed by Bren professor Sangwon Suh and includes chemical engineering professors Michael Doherty and Susannah Scott. 19 Spring 2018


Clean-chemistry team: Mellichamp sustainability chairs (from left) Susannah Scott, Phillip Christopher, and Mahdi Abu-Omar work to "green" chemical production and product manufacturing.

feedstock,” he explains. “It is a human-made waste product that contains value in its energy and its content. Can we now use the concepts of green chemistry creatively to make from it starting molecules that can then be used to make other economically and environmentally viable materials? “Working with lignin,” he adds, “we learned a lot about how to break carbon-oxygen bonds that are inherent in those materials. If we’re going to take oil-based polymers apart in a meaningful way, we have to learn to manipulate carbon-carbon and carbon-hydrogen bonds. Right now, we’re at that initial stage of designing chemistry to be able to take the long carbon chains and make smaller carbon chains from them with some selectivity.” Providing a glimpse into how the cluster functions, he says, “Once we understand the chemistry, we might say, ‘How can we make this environmentally sustainable?’ So we might then go to Susannah, who can help us come up with catalysts that can perform the process faster and more efficiently. And someone like

Phil can help us to think about what would be missing to make this lab work scale up. What are the challenges and how should we design the reaction? Maybe I was doing the chemistry under certain conditions that would cost so much to scale as to make it impractical. And going beyond the cluster, someone from political science might say that we should be thinking about how the molecules we’re making might be perceived and accepted — how political forces might affect the technology.” Working in that collaborative way, he says, “really helps you to get outside your comfort zone and understand how other scholars view the problem and think about it. I might have a way of approaching the chemistry, and then someone shows me that, down the road, we might have to think about policy or water usage. And all of a sudden you say, ‘Oh wait, can I reformulate my thinking to address that?’ It’s very enriching and leads to better science.” Phillip Christopher works in two main areas of catalysis. One is adapting to the shift from

oil to natural gas as the primary resource for manufacturing commodities. Natural gas is much cleaner than oil in terms of sulfur and heavy-metal content, and it has a lower greenhouse-gas footprint, because the hydrogen content relative to carbon is higher than for oil. Still, Christopher says, “It presents some challenges. Some products — for example, ethylene, propylene, and butene, the major reactants used to produce plastics — used to come from oil, but now we have to make them from natural gas. The difference is that the oil you pull from the ground has big long-chain hydrocarbons, and we can exploit well-understood, long-used processes to break them down into smaller molecules, which are then used to make things. But the natural gas you pull from the ground is made up of the tiniest hydrocarbons, so different chemical conversion processes are required to form critical reactants.” Like Scott, Christopher is also working to develop catalytic processes that require fewer precious metals, especially platinum-group metals, and particularly the amount of such metals used in automobile catalytic converters. “We need catalytic converters to clean automobile exhaust more efficiently and in a way that requires fewer precious metals,” he says. A recent joint grant from the National Science Foundation and the Ford Motor Company further supports his research in that area. Scott adds, “One of the valuable aspects about these Mellichamp chairs is that, to make an impact, you really do need to bring in people from different areas. I could also imagine a sociologist coming in and saying, ‘How do we get people to think more critically? How do we get people to stop thinking that because they’re only one person, their actions don’t matter?’ In that context, trying to bring together people with really different perspectives makes sense.” “Sustainability is something that the whole UCSB campus has a big footprint in,” Scott notes. “People feel strongly about it, and we all have different perspectives. The College of Engineering perspective is that you need solutions that work on a large scale. Our part in that is chemical manufacturing.”

Paolo Luzzatto-Fegiz, a mechanical engineer who specializes in fluid dynamics (see "Sediments in Space" on page 9), is studying ways to enhance the efficiency of wind turbines on wind farms, where the wind “shadow” of nearby turbines can reduce efficiency. CONVERGENCE 20


ALGORITHMS FOR SUSTAINABILITY
IGOR MEZIĆ’S ROLE IN MAKING BUILDINGS SMART AND TRACKING OIL SPILLS

Mechanical engineering professor Igor Mezić comes to sustainability by way of algorithms, particularly as they relate to "smart" buildings and, peripherally, tracking oil spills in open ocean. In an era when energy efficiency is a major topic of concern, buildings account for forty percent of the energy and seventy percent of the electricity used in the United States. According to Mezić, roughly twenty to thirty percent of that energy is wasted, because buildings aren’t smart. Mezić is a world expert on smart buildings — the systems they need to be energy-efficient, and the systems needed to monitor those systems and the building generally and receive data that can inform corrective action. He directs UCSB’s Center for Energy-Efficient Design and leads the Buildings and Design Solutions Group within UCSB’s Institute for Energy Efficiency (IEE). Most importantly, he develops algorithms that are embedded in software that drives a building’s capacity to be smart and informs human managers what is needed for it to run at optimal efficiency. “It’s a problem that has global significance, and it’s not easy,” Mezić says. “Buildings are very dynamic.” A building changes constantly over a 24-hour cycle, depending on the level of occupancy, the temperature and humidity, and

the number of lighting, air-conditioning, heating, and other systems that are running, and are often left running even though they are not in use. While many large, modern buildings are loaded with sensors that deliver data related to every element of a building’s moment-to-moment energy profile, historically, the vast majority of that data has not been monitored or used until a system malfunctions, largely because humans had to sift through the data to recognize useful patterns, and there simply weren't enough people for the job. However, in the past twenty years, Mezić says, “We’ve replaced the level of analysis humans once did with an automated algorithmic layer that didn’t used to exist at a very profound level.” Buildings have wised up. Mezić’s first effort in this area came at the LEED Gold–certified UCSB Student Services Building, just to see what he could discover. He and his lab researchers added many new sensors to the building, hand-positioning them in places where measurements did not exist to avoid fusing the new data with existing measurements. They targeted hundreds of data points, and the data from them was captured every five to ten minutes. Then, Mezić recounts, “We passed those massive sets of data through our algorithms to

Mechanical engineering professor Igor Mezić correctly predicted that oil from the Refugio spill (above) would reach Los Angeles.

Approximately 30 percent of food grown is wasted, often because of spoilage. Apeel, a startup founded by CoE Alumnus James Rogers (PhD ’12), markets an all-natural plant-based coating that dramatically extends the shelf life of produce. 21 Spring 2018


get a global picture of how energy was being used, what was working and what wasn’t, and where you could get savings. We tried to squeeze as much efficiency as we could from the systems that we already had.” The result was impressive: energy efficiency was increased by twenty percent in a building that was already fifty percent better than average. From there, Mezić and his colleagues continued to develop the mathematics. Some of their resulting algorithms have received patent protection over the past ten years, and in 2011, the UCSB Office of Technology

Transfer licensed some of them to a company called Ecorithm, which incorporated them into a web-based application that offers real-time indicators of a building’s performance. While many early adopters of the system are companies that do business in a technology-related space, Mezić expects that to change. “The whole world is moving toward achieving greater energy efficiency and using less energy, and the energy savings pay for the cost of the software quickly, in some cases within a year,” he says. “Adaptation will happen over time.” Sustainability concerns related to energy can easily include purely environmental matters, such as those that arose from the Deepwater Horizon disaster that occurred in April 2010, when the platform exploded, caught fire, and sank, releasing an estimated 5.9 million barrels of crude oil into the Gulf of Mexico. In June, Mezić was watching news about the spill and the spread of the oil. Hearing people talking about the oil and what seemed

I thought my community needed to get involved just from the purely scientific standpoint.

to be the impossibility of predicting where it would go made him think, With so much knowledge available about currents in the ocean, we should be able to say something about this. “I thought my community needed to get involved just from the purely scientific standpoint,” he recalls. Mezić creates algorithms that apply not to a single challenge but, with some adjustments, to multiple challenges, and so, with “a few theories on the shelf,” he ran some calculations and shared them with an attorney friend in Los Angeles, who passed them on

to some government officials. Shortly thereafter, Mezić received a call requesting that he travel to the gulf and, about a week later, found himself boarding a flight that was carrying oil-cleanup business executives to the site of the spill. Once in Louisiana, he started testing algorithmic simulations of oil movement against what people were measuring on the water and shared the results with the Coast Guard and anyone else who was interested. Mezić's simulations turned out to be highly accurate, and his work caught the attention of Jane Lubchenco, then administrator of the National Oceanic and Atmospheric Administration (NOAA), who invited him to write a paper on the topic, which was published in Science. After the spill had been capped, he joined UCSB colleague David Valentine, professor at the UCSB Marine Science Institute and an expert on the microbial ecology of hydrocarbons, to co-author a paper published in the Proceedings of the National Academy of

Professor Igor Mezić's algorithms have wide-ranging application, from energy efficiency to environmental protection.

Sciences. The paper resolved a number of scientific questions related to the spill, especially how water movement in the Gulf of Mexico caused the oil to spread, both on and below the surface, and how the spill had created a bloom of oil-eating bacteria that helped in some small measure to mitigate the disaster. Three years later, when the Refugio oil spill occurred, releasing nearly 143,000 gallons of oil along the Gaviota Coast, Mezić used his existing algorithms again to predict, correctly, that the oil would reach Manhattan Beach in Los Angeles. By accurately simulating reality, Mezić’s model can enable those responding to actual spills in the future to better understand where the oil will go and, thus, direct their cleaning efforts most effectively.
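The transport idea behind such forecasts can be illustrated with a toy calculation: treat the spilled oil as a cloud of passive parcels and step their positions forward through a field of surface currents. The Python sketch below is only a schematic of that kind of trajectory forecasting; the current field, speeds, and release points are invented, and Mezić’s actual spill models are far more sophisticated.

```python
# Toy illustration of trajectory-based transport forecasting (not Mezić's
# published algorithms). Passive "oil parcels" are advected through a made-up
# surface-current field using simple forward-Euler time stepping.
import math

def current(x, y, t):
    """Hypothetical surface-current field (km per hour) at position (x, y), time t (hours)."""
    u = 1.0 + 0.5 * math.sin(0.1 * y + 0.2 * t)   # eastward component
    v = 0.3 * math.cos(0.1 * x)                   # northward component
    return u, v

def advect(parcels, hours, dt=0.5):
    """Advance each (x, y) parcel through the current field for `hours` hours."""
    t = 0.0
    while t < hours:
        moved = []
        for x, y in parcels:
            u, v = current(x, y, t)
            moved.append((x + u * dt, y + v * dt))
        parcels = moved
        t += dt
    return parcels

# Release a small cluster of parcels near the (fictitious) spill site and see
# where the currents carry them after 48 hours.
release = [(0.0, 0.0), (1.0, 0.5), (-0.5, 1.0)]
print(advect(release, hours=48))
```

Real forecasts replace the made-up velocity field with measured or modeled ocean currents and use far more careful numerical integration, but the underlying question of where the parcels end up is the same one Mezić’s simulations answer.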

Professor Emeritus Larry Coldren and Associate Professor Jonathan Klamkin received NASA funding to work on shrinking, by several orders of magnitude, the Size, Weight, and Power consumption (SWaP) of integrated microphotonic circuits for satellite-based Lidar applications. With their reduced SWaP, the circuits can be deployed on smaller spacecraft for less cost, making it possible to generate more scientific measurements of greenhouse gases, such as carbon dioxide and methane. CONVERGENCE 22


CENTERS AND INSTITUTES FOR SUSTAINABILITY

An architectural rendering of Henley Hall. The new home of the Institute for Energy Efficiency is expected to open in fall 2020.


Several centers and institutes at UCSB serve as complementary, often-overlapping points of convergence for faculty, graduate students, and professional and postdoctoral researchers who share an interest in discovering faster, more-efficient, and cheaper ways of creating, moving, and using energy, and in identifying new ways to conserve it. Among them are the Institute for Energy Efficiency (IEE), AIM Photonics (within IEE), the joint UCSB-UCLA California NanoSystems Institute (CNSI), and the Solid State Lighting and Energy Electronics Center (SSLEEC). Through those centers and institutes, researchers — including many world leaders in their fields — collaborate at the level of materials, devices, and systems to uncover the science and develop the solutions to produce low-cost, energy-efficient electronics.

2018 | © KIERANTIMBERLAKE

IEE, AIM Photonics, CNSI, and SSLEEC also have strong records of moving research findings from the lab into the marketplace, where they have the greatest impact. “The environment at UCSB is terrific for someone from industry who wants to come in and get a lot of work done really fast with top-caliber people,” said Ray Beausoleil, who leads the Large-Scale Integrated Photonic research group at Hewlett Packard Labs, which has partnered with UCSB researchers. CNSI is directed by professor of chemical engineering, world-renowned polymer expert, and member of the United Kingdom’s Royal Society Craig Hawker and includes nearly forty affiliated faculty from UCSB engineering and the sciences. CNSI collaborators work in multiple areas of nanoscience, including energy efficiency. The Materials Department

in the College of Engineering is particularly well represented in this area, where nanoscale work is aimed at developing new materials — and devices that incorporate them — to boost energy production, efficiency, and storage. At the Bren School of Environmental Science & Management, researchers at the UC Center for Environmental Impacts of Nanotechnology have done groundbreaking work in determining the fate and transport of engineered nanomaterials on land and in freshwater and saltwater environments, in high-throughput testing of new nanomaterials based on experimental analysis of the effects of specific classes of engineered nanoparticles (ENPs) on plant and animal life, and in leveraging nanoparticles’ unique characteristics in mitigating pollution. The Institute for Energy Efficiency numbers nearly eighty UCSB faculty among its

The copper connections used on cloud server stacks create heat and slow the speed at which data is transmitted compared to how it moves on fiber-optic cable. UCSB researchers Clint Schow, Adel Saleh, Jim Buckwalter, Jonathan Klamkin and Larry Coldren integrated optical switches onto the switch chip, and Facebook is now testing the technology, which is faster and can save abundant energy. 23 Spring 2018


affiliates, including Chancellor Henry T. Yang and Director John Bowers (electrical and computer engineering). The mission of the IEE, which is expected to be housed in its new building, Henley Hall, by fall 2020, is to realize major gains in energy efficiency and production in the following six areas, while advancing worldwide standards of living.
• Lighting: Long-term research is done to increase LED efficiency, an area of considerable overlap with the SSLEEC, where a team of faculty, including co-director Shuji Nakamura, who won a Nobel Prize for inventing the blue LED, works on a variety of projects based on the semiconductor gallium nitride.
• Energy Production & Storage: Research is pursued on leading-edge technologies, including Professor Emeritus Alan Heeger’s work on flexible, cheap organic photovoltaic cells, themselves an extension of the conducting polymers he discovered and for which he won a Nobel Prize. Other faculty members, such as Gui Bazan, materials and chemistry professor and director of the Center for Polymers and Organic Solids at UCSB, continue to refine the technology. Additional topics include a new generation of lithium-ion batteries and batteries inspired by biology.
• Sustainability: The focus here is on integrating new technologies into the marketplace, while also understanding the social, environmental, and resource impact of those technologies. Life-cycle assessment tools are used to understand supply loops and to quantify the full range of impacts of products and services.

• Electronics & Photonics: The goal is to develop new wireless and optical communications technologies for interconnects that are 100 times more efficient. Replacing the electronic components that connect devices with photonic components could cut energy use by 20 to 75 percent.
• Computing: Data centers currently consume some 1.5 percent of global energy, and the volume of data is increasing exponentially. Meanwhile, gains in computing efficiency have slowed to a crawl. IEE researchers, including computer science professor Rich Wolski, inaugural holder of the Duval Family Presidential Chair in Energy Efficiency, are working to address the looming efficiency gap by reducing the energy use associated with cooling loads, inefficient server use, and wasteful computer processes.
• Buildings & Design: Buildings consume 72 percent of electricity in the United States and 40 percent of all energy produced in the U.S. IEE researchers like Igor Mezić are developing smart-building energy-management systems (see article on page 21) to accelerate the transition to zero-net-energy buildings.

Within IEE is AIM Photonics, the West Coast hub for President Barack Obama’s American Institute for Manufacturing Integrated Photonics (AIM Photonics). Directed by professor of electrical and computer engineering and AIM Photonics deputy chief executive officer, John Bowers, the center was created to develop an end-to-end integrated photonics manufacturing system in the U.S. “We are working to develop science

technologies that can save more energy than alternative energy sources can deliver,” says Bowers, a world leader in the field who in 2017 received the Photonics Award, the highest honor given by the Institute of Electrical and Electronics Engineers (IEEE). Demand for high-performance silicon photonics wafers is doubling every year. The wafers integrate multiple functions onto a single wafer, making it possible to overcome a fundamental incompatibility between the photons that carry data on fiber-optic networks and the electrons that carry data in computers. Large tech entities like Google want millions of units per year, but supply is currently insufficient to meet such demand. AIM Photonics is meant to transform what Bowers described in 2017 as a “cottage industry” of small, independent manufacturers that produce individual pieces on ceramics into a nationwide industry that can produce integrated photonics wafers at enormous scale while driving the technology forward to give American manufacturing a key advantage in this important emerging area of technology.

An architectural rendering depicts the first-floor interior of Henley Hall.

UC Santa Barbara has been named a top performer in three categories of the Association for the Advancement of Sustainability in Higher Education (AASHE) 2018 Sustainable Campus Index: The university was cited for its impressive strides in green building, waste reduction and sustainable investing.

CONVERGENCE 24


Toward a Secure Electrical Grid
Professor João Hespanha suggests a way to protect autonomous grids from potentially crippling GPS spoofing attacks.

Not long ago, getting a virus was about the worst thing computer users could expect in terms of system vulnerability. But in our current age of hyperconnectedness and the emerging Internet of Things, when data is collected and transmitted at a rate and in quantities inconceivable a few decades ago, that’s no longer the case. Connectivity has given rise to a new law of universal concern to those who work in the area of systems control, like João Hespanha, professor in the departments of Electrical and Computer Engineering, and Mechanical Engineering at UC Santa Barbara. That law says, essentially, that the more complex and connected a system is, the more susceptible it is to disruptive cyber-attacks. “It is about something much different than your regular computer virus,” Hespanha says. “It is more about cyber physical systems — systems in which computers are connected to physical elements. That could be robots, drones, smart appliances, or infrastructure systems such as those used to distribute energy and water.” Much of Hespanha’s research is focused on developing algorithms to make sensor-rich autonomous systems viable. In one main area, he focuses on developing the algorithms to control autonomous drones. Such vehicles, which would not have a human controlling them moment to moment, have application in a variety of areas, such as traffic monitoring, precision agriculture, and monitoring oil pipelines for leaks or corrosion. Because such sensor-rich systems are both complex and connected — and, thus, vulnerable to attack — Hespanha also invests a great deal of time on security issues. That means “figuring out what happens if the sensors you’re using have been compromised,” he notes. “You have to work simultaneously on the two problems: designing a safe system, but also figuring out how to break the system by identifying its vulnerabilities and how somebody would attack it,” he adds. The same holds true for the above-mentioned infrastructure elements, and in a paper titled “Distributed Estimation of Power System Oscillation Modes under Attacks on GPS Clocks,” published in the journal IEEE Transactions on Instrumentation and Measurement in July, Hespanha and co-author Yongqiang Wang, a former UCSB postdoctoral researcher who is now a faculty member at Clemson University, suggest a new method for protecting the increasingly complex and connected power grid from attack. The question that arises in any system that incorporates

25 Spring 2018

many sensors for monitoring is, what if someone intercepts the communication between two sensors that are trying to assess the health of the system? How does the system know not to believe — and act on — the false information? Hespanha explains: “In the power grid, you have to be able to identify what the voltage and the current are at specific, highly precise points in time” for multiple points along the grid. Knowing the speed at which electricity moves, the distance between sensors, and the time it takes an oscillation to move between sensors, one can determine whether the oscillation is real. Measuring voltage and current precisely has long been possible, but energy travels very fast, and to know exactly when a measurement is made at a sensor requires time resolution of just a few milliseconds. That used to be impossible to achieve, but something called a “phasor measurement unit” (PMU) changed that. PMUs, which are aligned with the atomic clocks used in GPS, can measure with extreme time resolution the voltage and current anywhere on the grid. With that vast interconnected system becoming increasingly distributed, and with photovoltaic cells, electric-vehicle charging, wind turbines, and other elements causing more perturbations along the grid, power providers now have to monitor the system more, and PMUs are among the most important devices for doing so. While PMUs could be used to inform autonomous control systems, so far, they have seen limited use, mostly for monitoring, for one simple reason: they are vulnerable to GPS spoofing attacks. “There is the possibility,” Hespanha says, “that someone will hack the system and cause a catastrophic failure.” The attack could be as simple as someone taking a readily available GPS jammer to a remote power-distribution station in the middle of the desert and tricking the system into providing false measurements. The net effect could be that the system, believing, for example, that there is a shortage of energy flowing into it because of the false readings, could overload power lines by dispatching energy production when and where it is not needed. That could lead to a cascade effect as the false readings ripple through the system and incorrect actions are taken. Since it is virtually impossible to prevent someone from getting close enough to a remote substation to jam its GPS, Hespanha says, “What you need is a control system that can process the information it receives to make good decisions. The system has to continuously hypothesize that what it is reading is not real.”


“The power-supply system is a distributed system, so measurements are being made in many places,” Hespanha explains. “If one of them starts to give erratic or unexpected measurements indicating a sudden current surge or a voltage drop, you should be able to determine whether those measurements make sense.” In the case of an actual fluctuation, such as when many people in a large city like Los Angeles are using their air conditioning on a hot summer day, the result may be a slight drop in the alternating-current frequency in the city. That drop creates a disturbance, which propagates along the entire Western Interconnect, the power grid stretching from western Canada south to Baja California in Mexico and reaching eastward over the Rocky Mountains to the Great Plains. As the disturbance travels through the grid, the power stations that feed electricity to it try to counteract the

disturbance by generating extra power if the frequency is too low or decreasing production if the frequency becomes too high. “You’re going to start by seeing oscillation on the grid,” Hespanha explains. “That’s exactly what the PMUs are looking for. You then compare the precise time you saw the disturbance in Los Angeles to the time you saw it in Bakersfield and then at other sensors as it continues north. And if those readings don’t reflect the physics of how electricity moves, that’s an indication something’s wrong. The PMUs are there to see oscillations and to help dampen them to prevent them from developing.” But if someone fooled an automated system, instead of damping the oscillations, the PMUs could create them. So how would such an attack be recognized and stopped? To illustrate, Hespanha goes to a whiteboard in his office and draws a line representing the electrical line running between Los Angeles and Seattle, with many smaller, ancillary

lines running off to the sides. “If power is going in a certain direction, you should also be able to see any oscillation in the side lines in that direction," he explains. "And you know the physical model of what things should do, so an attacker who changed the measurement on the main line would also have to mess up a lot of other measurements on the side lines along the way. And that would be very difficult to do.” Testing conducted so far suggests that Hespanha’s system would be resistant to attack and remain effective even if one-third of the sensor nodes were compromised. “That would allow for a much more autonomous system; that’s the next big step,” he says. “This is an enabling technology that will be needed to make a lot of this control come online. And it will be needed soon, because the system gets more complex all the time and is therefore more susceptible to attack.”
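The cross-checking Hespanha describes can be boiled down to a simple consistency test: if a disturbance is real, its arrival time at each PMU should match the travel time implied by that sensor’s distance from the event. The short Python sketch below is only a schematic of that idea; the propagation speed, tolerance, and sensor readings are invented, and Hespanha’s actual method is a far more sophisticated distributed estimation scheme. Still, it shows how a spoofed timestamp gives itself away by disagreeing with the physics.

```python
# Schematic plausibility check on PMU timing (not Hespanha's algorithm).
# A disturbance travelling along the grid should reach each sensor at a time
# consistent with its distance from the event; a GPS-spoofed clock breaks that.

def flag_suspect_pmus(distances_km, arrival_times_s, wave_speed_km_s=1000.0, tol_s=0.05):
    """distances_km[i]: distance of PMU i from the disturbance origin.
    arrival_times_s[i]: time at which PMU i reported seeing the oscillation.
    Returns the indices of PMUs whose reported timing deviates from the
    propagation model by more than tol_s seconds. All numbers are illustrative."""
    # Expected arrival = (unknown) origin time + distance / propagation speed.
    # Estimate the origin time with a median, so a few bad sensors cannot
    # drag the estimate along with them.
    offsets = sorted(t - d / wave_speed_km_s
                     for d, t in zip(distances_km, arrival_times_s))
    t0 = offsets[len(offsets) // 2]  # median origin-time estimate
    suspects = []
    for i, (d, t) in enumerate(zip(distances_km, arrival_times_s)):
        expected = t0 + d / wave_speed_km_s
        if abs(t - expected) > tol_s:
            suspects.append(i)
    return suspects

# Four sensors at increasing distances from the event; the third one reports
# an arrival time that no real disturbance could produce.
print(flag_suspect_pmus([0.0, 200.0, 400.0, 600.0],
                        [0.00, 0.20, 1.50, 0.60]))
# -> [2]
```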

How it Can Work

You have to work simultaneously on the two problems, designing a safe system, but also figuring out how to break the system... — João Hespanha

CONVERGENCE 26


27 Spring 2018


SECURITY AGENTS
Tevfik Bultan and Giovanni Vigna take complementary approaches to data defense

Security breaches have become a daily fact of life in the Digital Age, and large-scale attacks can compromise millions of users’ personal information, as has happened at Equifax (143 million accounts), eBay (145 million), Facebook (up to 50 million in September), and Yahoo, where 3 billion accounts were put at risk. The important role of the Internet in national infrastructure, including the power grid, air-traffic-control systems, and financial networks, has made it the target of sophisticated attacks as well. Digital security break-ins cost approximately $109 billion in the United States in 2016, and are estimated to cost between $375 billion and $575 billion per year worldwide. Small wonder that governments and industry are committing enormous resources to gain the upper hand in the data-theft

wars. Often, such entities work with university researchers like Tevfik Bultan and Giovanni Vigna. Part of a formidable group of professors in the Computer Science Department at UC Santa Barbara’s College of Engineering, they take complementary approaches to security work. Bultan’s approach reflects what he describes on his Verification Laboratory (VLab) website as “an ongoing shift in focus from performance to dependability,” reflecting the fact that “the size and complexity of the software systems nowadays inevitably lead to errors during both design and implementation phases” — errors that can cause vulnerabilities that compromise data. He works on two main security fronts: automating the process of finding vulnerabilities in software and trying to develop


hackproof software by proving that it has no vulnerabilities. Vigna comes at security from the opposite perspective, seeking to find and exploit vulnerabilities in applications before they are deployed. Both share the goal of enabling programmers to make their products more secure. “I think Tevfik works more on making sure that a program cannot misbehave, while we look for ways to make a program misbehave,” Vigna says. “He wants to be able to prove that bad things can’t happen. It’s like going back to the origins of software engineering, where you have formal verification and you say, ‘I’m going to prove that when the elevator door is open, there is no way that an elevator car will not be in front of the door.’ We’re more on the hacker side, saying, ‘How can we screw up this elevator?’

Aligned against attack (from left): Tevfik Bultan and Giovanni Vigna

29 Spring 2018

But sometimes Tevfik does what we do, and sometimes we do what he does. The two approaches are closely related.” Much of the work Bultan refers to as “finding exploits” is done manually, but he prefers to automate the time-consuming process of identifying vulnerabilities by taking what is called a formal-methods approach. “We extract a math-based formal-logic representation of the software and then build logic solvers — software that can analyze the logic formulas,” he explains. In application, the importance of his formal-methods work is reflected in a pair of grants he received recently — one from Amazon and another from the National Science Foundation — to address security issues with cloud computing services such as Amazon Web Services, Inc.
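One concrete shape such automated analysis can take is a “differential” comparison of access policies, a theme that returns in the Amazon work described next. The toy Python sketch below reports any request that a revised policy allows but the original denied, the kind of unintended exposure an automated checker is meant to catch. The policy format, rules, and names are invented for illustration; production tools reason symbolically over logic formulas with solvers rather than enumerating every possible request, as this sketch does.

```python
# Illustrative sketch of differential policy checking: the policy format and
# example rules are made up, and production tools work symbolically with
# logic solvers instead of enumerating requests as done here.
from itertools import product

def allows(policy, principal, action, resource):
    """A 'policy' here is just a list of (principal, action, resource) patterns,
    where '*' matches anything. Real policy languages are far richer."""
    return any(p in ('*', principal) and a in ('*', action) and r in ('*', resource)
               for p, a, r in policy)

def newly_allowed(baseline, revised, principals, actions, resources):
    """Return every request the revised policy allows but the baseline denied."""
    return [req for req in product(principals, actions, resources)
            if allows(revised, *req) and not allows(baseline, *req)]

baseline = [('alice', 'read', 'reports')]
# A careless edit meant to help alice ends up granting read access to everyone.
revised = [('alice', 'read', 'reports'), ('*', 'read', 'reports')]

print(newly_allowed(baseline, revised,
                    principals=['alice', 'bob', 'anonymous'],
                    actions=['read', 'write'],
                    resources=['reports', 'payroll']))
# -> [('bob', 'read', 'reports'), ('anonymous', 'read', 'reports')]
```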

One problem is that when companies put their data onto Amazon’s cloud servers, they have to set the access rights appropriately so that other cloud users can’t access their company data. But the rights aren’t always set up correctly, and data gets exposed. To add security, Amazon developed its own language so that companies have to write their access specifications in that language, but people still make errors that can give rise to vulnerabilities. Amazon then used an approach called differential analysis, which Bultan had described in a journal article a decade earlier, to build an automated checker, called Zelkova, to determine whether access rights were set up correctly. Zelkova is good at reasoning about numbers but not as good at reasoning about text, called strings in the computing


field, which are necessary in data-rights policies that, for instance, include people’s names. String analysis is a fledgling area of research, and Bultan has co-authored the first book on the subject, String Analysis for Software Verification and Security (Springer, 2017). The Amazon and NSF research grants he received will enable him to extend his work on string analysis so that tools like Zelkova can become more effective in analyzing access policies. “This is a great example of formal-methods techniques being applied in practice from the security perspective,” Bultan says. “It is nice to see industry build a formal-methods tool that is having a practical impact and was significantly influenced by the research we do here at VLab.” Vigna’s work, too, finds abundant application in the world beyond academia, often for the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research, or the Army Research Office. In a recent $11.7 million DARPA grant, the lead PI, Fish Wang, an alumnus of the UCSB SecLab (run by Vigna and fellow Computer Science professors Christopher Kruegel and Dick Kemmerer), and the five other co-PIs at the collaborating universities — four of whom also earned their PhDs in Vigna’s lab and one of whom worked as a postdoctoral researcher there — are pursuing what Vigna describes as “a new frontier in security and vulnerability analysis.” He says that the basic idea of the project, called Computers and Humans Exploring Software Security (CHESS), is “to create a cyber system that can reason about a program. But it should also recognize when it encounters an obstacle that requires human intervention, and should be able to collect, in an intelligent way, those cognitive tasks that humans do better than machines do. It’s a changing perspective with respect to what can be automated and what can be orchestrated.” According to the CHESS website, “The process requires hundreds, if not thousands, of hours of manual effort per discovered vulnerability.” Further, “Automated program analysis capabilities…cannot address the majority of vulnerabilities. CHESS aims to develop capabilities to discover and address vulnerabilities in a scalable, timely, and consistent manner.”

To summarize, Vigna says, “The programs are not yet smart enough.” The paradigm shift, until that changes, is to bring in humans on an ad hoc basis to bypass the problem and then give control back to the AI. It’s called artificial artificial intelligence — AI with human helpers. Amazon has a simple version of it, called Mechanical Turk. The web-based service allows anyone to post a task that a machine cannot do, and then anybody in the world can choose that task and get paid a small amount for doing it. But while Mechanical Turk simply provides a mechanism for humans to ask other humans to carry out a simple task, says Vigna, “Our system is able to carry out a security analysis, identify a situation in which the analysis is stalling, and ask for targeted help from a human to overcome the obstacle.” It’s a machine asking for human assistance. Vigna, who directs the Center for CyberSecurity at UCSB (the website for which describes the “abysmal” state of security on the Internet), has also worked extensively on smartphone security. “People don’t realize how much trust they put in their phones,” he says. “For instance, we trust the market and assume that when we download something from Google, we won’t be infected, but that’s not always the case. And there are applications that you trust will not grab something from the Internet and put it on your phone. But it gets done all the time. So we studied that, trying to look at the whole smartphone ecosystem to understand what the problems are.” Vigna’s lab has also examined the many aspects of what he calls the “underground economy” of the Internet — people who try to sell users fake antivirus programs that actually infect computers, extortion schemes, laundering money via shipping mules, and the like. Bultan, Vigna, and other computer-security experts know that, while cyberspace can seem like the Wild West, there is one simple step that all of us can take to protect ourselves, and that is to use two-factor authentication on all our devices. That single action, Vigna says, will eliminate more than ninety percent of account compromises — and avoid the need for an expert to restore order in your digital world.

CONVERGENCE 30


ALUMNI RELEASE A POWERFUL NEW COMPUTER LANGUAGE
31 Spring 2018


In 2009, Viral Shah (PhD Computer Science, 2007) and Stefan Karpinski, a UC Santa Barbara computer-science graduate student, were playing catch after an Ultimate Frisbee intramural match. They may have talked some about strategy, but a more important result of their meeting was the discovery that they shared a common frustration with computer languages, what they would come to call “the two-language problem.”

“When I was at UCSB,” Shah recalls, “you typically programmed using two kinds of languages. You had languages like R, MATLAB, and Python, which are easy to use to write very high-level programs. Engineers and scientists like writing programs that are at the level of mathematical abstraction. However, those are not high-performance languages, because they cannot scale to handle big data or do large-scale simulations or big science.” For that to be possible, a computer scientist or a programmer has to take the program written in Python or R and rewrite it in a different language, typically C, C++, or Java. The program can then run at maximum speed on a computer. “To be productive and fast, you had to write the same program twice,” Shah says. “You had this mishmash of two systems. We thought it would be ideal to have a single system that could be used to write programs at a high level of abstraction — the level at which you think about the science — and have that same program run really fast on the best hardware you can lay your hands on. That is what led to the creation of Julia.”

Last August, six and a half years after Julia was first released to the scientific computing community, and with contributions from more than eight hundred programmers, the open-source language Julia 1.0 was released. The eight hundred contributed packages, which are easy for programmers to write in Julia, dramatically expand its effectiveness by providing algorithms and methods for specific disciplines ranging from biology and chemistry to ecology, astronomy, and quantum physics. In February 2019, the three eligible co-creators of the Julia language — Jeff Bezanson, Karpinski, and Shah — will receive the prestigious, quadrennially awarded James H. Wilkinson Prize for Numerical Software from the Society for Industrial and Applied Mathematics (SIAM). Computer science faculty member Linda Petzold was the first UCSB recipient of the Wilkinson Prize, in 1991.

THE COMPANY

To begin, only Shah (now CEO of Julia Computing) and Karpinski (CTO) knew each other, but Shah’s PhD advisor, UCSB professor John Gilbert (who was also on Karpinski’s dissertation committee), had a research collaboration with Alan Edelman, a professor of mathematics at MIT and now Chief Scientist at Julia Computing. Bezanson (CTO) had just started working as a researcher in Edelman’s lab (now the Julia lab). Eventually the four developers met each other, and in 2009, they started collaborating on the new language. “We thought we’d give it a couple of months, but when that time was up, we were having too much fun to stop,” Shah recalls. “It was a very quick and fluid interaction with like-minded people. We had a lot of ideas that look ridiculous now, and no one got upset if you broke something.”

Gilbert, whom Shah credits for “investing the time and effort in me so that I could learn everything about scientific computing that I know today,” sees in Julia’s success clear echoes of the kind of collaborative work that he has always valued as a hallmark of UCSB. “The Julia language grew out of an interdisciplinary mix of computational science, performance computing, and numerical linear algebra, together with twenty-first-century advances in programming languages and compilers,” says Gilbert. “Because of how it combines ease of use with high performance, Julia has taken an important place in data science and machine learning. It’s great to see my students’ success; we can all be proud of UCSB’s unique ethos and history of nurturing this sort of cross-disciplinary research.”

Shah says that starting with four different disciplines helped the team avoid tunnel vision: “Alan is a mathematician at MIT, Stefan is a data scientist, Jeff is a programming languages and computer-science researcher, and I am a computational scientist. All of us believe that computing can solve the world’s toughest problems, and that creating a high-performance programming language that builds bridges between diverse academic communities was the gateway to solving grand challenges. As a result, there was a natural mixing of diverse ideas and viewpoints, even as we all agreed on the big picture.”

It’s great to see my students’ success; we can all be proud of UCSB’s unique ethos and history of nurturing this sort of cross-disciplinary research.

Stefan Karpinski

CONVERGENCE 32


The team leveraged multiple open-source projects that predated Julia, resurrected a number of ideas that dated to before the PC revolution, and used much of that material in new ways while combining it with their own novel ideas. The core principles were established by 2012, when Karpinski wrote a blog post that read in part, “If you are a greedy, demanding, unreasonable programmer, we invite you to give it a try.” At that time, they expected to release version 1.0 in a few months, but that turned into six years. “That’s because in an open-source project you can’t just introduce a new version some years down the line and slap on a new version number,” Shah explains. “We started this to solve the real problem and wanted to be sure that we had a solid foundation for everyone to go out and start using. That’s why it took six years.”

The proof of the team’s success lies in the number of high-profile, high-impact applications in which Julia is being used. In one such Julia project, called Celeste, Julia was used to write the program to catalog images of some 188 million astronomical objects that have been captured since 1998 by telescopes at Arizona’s Apache Point Observatory, as part of the Digital Sky Survey. The images contained 60 terabytes of data. The data was loaded onto the supercomputer Cori at the Lawrence Berkeley National Laboratory, then the world’s fifth-largest computer, consisting of 650,000 computer cores.

“The data analysis program, written in Julia, which makes distributed computing easy, was able to catalogue the 188 million light sources in less than 15 minutes,” Viral says. “On a single laptop, it would have taken I don’t know how many years.”

Because computers are not getting much faster today, large computing tasks may be handled by harnessing the power of millions of “commodity” computers — i.e., standard laptops — in a “distributed system,” the total power of which is similar to that of a supercomputer.

The Federal Aviation Administration is using Julia to develop the next generation of its Airborne Collision Avoidance System. When two aircraft are within a minute of a midair collision, the system gives the pilots of both aircraft warnings and tells them what corrective action to take to avoid colliding. The current system is quite old, and a team at Johns Hopkins University is using Julia to design its new replacement. The project requires computation of an exhaustive search comprising 650 billion decision points. Julia reduced the time required to conduct those computations by several years.

In 2015, economists at the Federal Reserve Bank of New York used Julia to publish the bank’s most comprehensive, most complex macroeconomic models, known as Dynamic Stochastic General Equilibrium. Invenia Technical Computing employs the latest research in machine learning, complex systems, risk analysis, and energy systems to optimize the electrical grid across North America. Its current codebase is written mostly in Python, MATLAB, and C. Now Invenia is looking to scale up its operations, and their language of choice for the experiment is Julia.

Julia is used in energy analytics and optimization, medical modeling of cancer evolution, mapping genetic diversity, optimizing milk output, risk assessment using large-scale Monte Carlo simulations, robotics, space-mission planning, schoolbus route optimization, and much more.

We are confident that Julia will help advance the frontier of scientific discovery for many years to come.


33 Spring 2018

Viral Shah

“When the Julia project got started in 2009, the project’s goal of unifying high performance and high productivity seemed like a far-off dream,” says Keno Fischer, a chief technology officer at Julia Computing, on the Julia website. “In 2017, projects like Celeste show that this dream has become a reality. Scientists can now take the prototypes they have developed on their laptops and run them on the biggest supercomputers without having to switch languages or completely rewrite their code. We are confident that Julia will help advance the frontier of scientific discovery for many years to come.”

Shah concludes, “When we started we were just doing something that the four of us wanted for our own purposes. We were fed up with the way things were being done. We never imagined it would come this far.”


GET THE BEST, KEEP THE BEST The Dean’s Fund enables the College of Engineering to recruit top graduate students and to hire and retain the best faculty in a highly competitive environment.

PLEASE JOIN US IN A COLLABORATION THAT MAKES THE WORLD A BETTER PLACE.

giving.ucsb.edu 805-893-GIVE

CONVERGENCE 34


College of Engineering University of California Santa Barbara Santa Barbara, CA 93106-5130

Nonprofit Organization U.S. Postage PAID Santa Barbara, CA

The University of California, in accordance with applicable Federal and State law and University policy, does not discriminate on the basis of race, color, national origin, religion, sex, gender, gender expression, gender identity, pregnancy, physical or mental disability, medical condition (cancer related or genetic characteristics), ancestry, marital status, age, sexual orientation, citizenship, or service in the uniformed services. The University prohibits sexual harassment. This nondiscrimination policy covers admission, access, and treatment in University programs and activities. Inquiries regarding the University's student-related nondiscrimination policies may be directed to the Office of Equal Opportunity & Sexual Harassment/Title IX Compliance, Telephone: (805) 893-2701.

