CARNEGIE MELLON ENGINEERING
WE’RE SURROUNDED BY AMAZING PEOPLE WHO DON’T THINK ALIKE
Engineers are problem solvers, and if there has ever been a time when society needs engineers, it is now. Each unsettling climate report or calamitous weather pattern underscores the point that climate change is happening now. If we are to avoid the worst effects of climate change, we must keep the Earth’s temperature from rising more than 1.5°C above pre-industrial levels. Engineers have an important role to play in mitigating climate change and helping people adapt to it.
In this magazine, we take a broad look at decarbonization and some of the technical, economic, and policy issues surrounding climate change. We highlight research on emerging technologies that reduce CO2 emissions from the power sector and other industries. An important theme in decarbonization discussions is equity. While decarbonization will benefit people in general, there are no guarantees that these benefits will be evenly distributed. Research is underway in the College to identify potential societal disparities before we build new energy systems. In the College of Engineering, we have many faculty working across the departments, the Scott Institute, and the university in areas related to energy innovation and the environment.
Another focus area that spans our academic departments is artificial intelligence. We are conducting groundbreaking research on design for AI. Often AI is viewed as a tool to add onto an existing system, but we are exploring how to build better systems by integrating AI into the engineering design process from the very beginning. The findings of this work have tremendous implications for industry. Further, we are teaching students how to blend AI and engineering in our AI Engineering master’s degrees, which will make our graduates attractive to companies that want to use AI to move their businesses forward.
Throughout this magazine, you will read about faculty and students who are fulfilling their roles as engineers by creating the best outcomes for humanity’s future. I am proud of their work, and I hope you are, too.
Sincerely,
William H. Sanders
Dr. William D. and Nancy W. Strecker Dean, College of Engineering
Climate change–it’s personal now
A framework to advance green-hydrogen production
Equality and air quality: the decarbonization balancing act
Decarbonizing the grid with flexible buildings
Decarbonizing industry with 3D-printed stacked columns
EDITOR:
Sherry Stokes (DC’07)
DESIGNER:
Tim Kelly (A’05, HNZ’14)
COVER ART:
Karl Huber
CONTRIBUTORS:
Krista Burns
Daniel Carroll
David Cochran (photography)
Monica Cooney
Hannah Diorio-Toth (DC ’17)
Susan Endres
Emily Forney (DC ’12)
Lisa Kulick
Kaitlyn Landram
Ryan Noone
Hope Reveche
Lynn Shea
Lauren Smith
Holly Stokes
Sara Vaccar
Giordana Verrengia
Nicholas Zurawsky
New leader in mechanical engineering
Faculty receive NSF CAREER Awards
Portugal update: Partnerships drive success
Legendary lessons from Silicon Valley
Bill Sanders installed as Dr. William D. and Nancy W. Strecker Dean
Better than a hole in the head
Playing nice: how self-driving and human-driven cars will mingle
How Uber and Lyft are transforming mobility in US cities
Big steps for mini robots
New hope for people living with paralysis after stroke
Mother’s milk may hold key for infant disease therapy
Drones for efficient last-mile deliveries
Consumer-friendly broadband nutrition-style labels
New modeling approach helps advance cryopreservation
Two out of three glaciers could be lost by 2100
How a sandwich design is transforming electronics
All eyes on “forever chemicals”
AI Engineering will change the world around us
DfAI: The missing piece of artificial intelligence engineering
Bringing AI to hard disk drives
U.S. Chamber AI Commission releases new report
Flexible, energy-harvesting robotics
Undergraduate research: A path worth taking
Liz Barre wins pentathlon at NCAA Track and Field Championships
Dowd Fellowship supports “high-risk, high-reward” research
ChemE Cube competition winners
A global approach to tackling grand challenges
Students build elaborate mazes for IDeATe course
ALUMNI
Thank you for your outstanding service
Alumna “pays it forward” with scholarship for mechanical engineers
Innovation in the trading card industry
The evidence is clear: To avoid the worst effects of climate change, we must drastically cut greenhouse-gas emissions and remove carbon dioxide (CO2) from the air. Increasingly common “once-in-a-century” storms and unyielding droughts underscore the consequences of human-driven greenhouse-gas emissions, of which CO2 is the main culprit.
The Intergovernmental Panel on Climate Change (IPCC) says carbon capture deployment is lagging far behind what is needed to meet global mitigation targets. With the stakes so high, why is this the case?
Edward Rubin, a Nobel laureate on the topic, sums it up simply: “Why would anyone spend good money to prevent CO2 from going into the air if there is no incentive or requirement to do so?”
And here begins a lesson on the economics that shape our motivations for generating and reducing greenhouse-gas emissions. Rubin was a coordinating lead author of the special report on the capture and storage of carbon dioxide by the IPCC, which, jointly with Al Gore, received the Nobel Peace Prize in 2007. Today he is a professor emeritus in the Departments of Engineering and Public Policy and Mechanical Engineering, and the Alumni Chair Professor of Environmental Science and Technology Emeritus at Carnegie Mellon University.
“Would you be willing to spend thousands of dollars more for an electric car to keep CO2 out of the atmosphere?” asks Rubin.
Some people with financial means may, but others, and much of big industry, are less inclined.
“If you are a factory and CO2 is emitted as a byproduct of making whatever it is you make, say cement, why would you spend sixty percent more to capture the carbon when the other cement guys aren’t doing anything? What would it take for you to control your air pollutant emissions? In most cases, the answer tends to be a stick in the form of government regulations,” explains Rubin.
Traditionally, government intervention is how we solve environmental problems in the United States. Typically, this happens with a mix of carrots, or incentives, and sticks, or requirements. The biggest improvements in environmental quality have all required sticks.
“We require coal-burning powerplants to install equipment to control major air pollutants; we don’t simply ask them to consider it,” says Rubin.
What makes the carbon dioxide problem different from past environmental challenges is that the sources and activities that give rise to CO2 emissions pervade the entire economy, and its adverse impacts are less obvious. In contrast, we successfully solved water and other air pollution problems that were harmful to human health, especially when the sources of those problems were relatively constrained.
For example, three decades ago in the United States, we targeted sulfur dioxide from coal-burning powerplants to curtail acid rain. That required emission reductions from just a few hundred facilities in one sector of the economy. However, with greenhouse gases and CO2 in particular, “there is not just one place to go to solve the problem,” says Rubin.
Though dealing with the many sources of greenhouse-gas emissions is challenging, Rubin believes there is another factor that explains the slow pace of actions to control these emissions. “There is nothing people care about more than their health,” states Rubin. And this point ties to the initial question as to why we haven’t done more to curtail greenhouse gas emissions.
“While there are important scientific differences between traditional air pollutants and greenhouse gases, in my mind, what distinguishes the global climate problem from other environmental problems is that it hasn’t yet been personal,” says Rubin.
“Politically and socially, there haven’t been the same urgency and personal stakes as there have been for other environmental issues like polluted water and dirty air, where there were direct and visible impacts on health and well-being. But now it is getting a little more personal, and that’s the big change I’ve seen. Climate change impacts are getting more attention because they are unavoidable, and they are happening more rapidly than science anticipated a couple of decades ago.”
Glancing backwards, Rubin explains that a great deal of public understanding (and misunderstanding) about climate change was influenced by media activities supported by the
oil, gas, and utility industries. As a result, there has not been an overwhelming sense of necessity to deal with climate change politically. But times are changing. Now we are experiencing some of the adverse effects of climate change that were predicted by scientists long ago. “Because climate change happens more quickly in some parts of the globe, a lot of those impacts have become visible sooner than people may have anticipated,” says Rubin.
“We’re now seeing the linkages between climate change and other phenomena like forest fires, droughts, heat waves, and floods. If your house burns down in a forest fire, that gets your attention. If your house floods because of a hurricane, that gets your attention. When science links those tragedies to climate change induced by human activities, then the realization kicks in: ‘Hey, this isn’t something far in the future that doesn’t affect me or my family; it’s getting personal now,’” says Rubin.
“I think the realization that climate change is real and can affect us individually, and that it is here now, is helping to catalyze more political action than it has in the past.”
The Inflation Reduction Act, signed last year, pledges about $369 billion in climate investments, including $270 billion in tax credits to encourage carbon reductions. Rubin, like many members of the scientific community, sees the measure as a good first step, but believes much more is needed.
The package of incentives “is a very large set of carrots, which hopefully will speed up investments that people and businesses make in their own economic interests. The basic expectation is that these incentives will drive down the cost of low-carbon technologies along the so-called learning curve. As technologies become cheaper over time, solutions will be more economical,” says Rubin.
The legislation provides incentives for renewable energy sources like wind and solar, as well as measures to boost carbon capture, utilization, and storage. The International Energy Agency and the IPCC concur that carbon capture and storage is imperative if we want to limit average global temperature rise to no more than 1.5°C (2.7°F) to avoid the most dangerous impacts of climate change.
“What it comes down to is: what will it take to avoid more than 1.5°C of warming? The answer is to achieve net-zero emissions globally by 2050. But today the planet is already 1.1°C above pre-industrial levels. So, if we continue on the current trajectory, or reduce emissions by only a little, there is no way we’re going to limit the global average to 1.5°C unless we also figure out a way to take substantial amounts of carbon out of the air,” says Rubin.
A recent literature review of the environmental, economic and social aspects of carbon capture and storage found that Edward Rubin is the most cited and most published person worldwide on the topic.
“When we talk about carbon capture today it can mean different things to different people,” says Rubin.
Up until recently, carbon capture and sequestration (CCS) has headlined discussions on removing carbon dioxide from coal or gas-burning power plants, or from large industrial facilities like cement, steel, and petrochemical plants that also release large amounts of CO2 during production. “Technologies for capturing CO2 in industrial settings have been used for many decades,” Rubin notes. “What’s new is to sequester it deep underground to prevent any release to the atmosphere.”
Yet there are other ways to capture or remove carbon from our atmosphere, such as planting more trees or using direct air capture systems, technology that “is like a big vacuum cleaner for the air.” Air capture technologies are new and not yet used commercially, but they could be essential to mitigating climate change if other measures prove inadequate.
Expanding incentives for carbon capture methods and technologies should speed their development and implementation. With the proper incentives, CCS can make sense for some industries both financially and politically. Direct air capture and sequestration, by contrast, is technically possible but still extremely expensive and unlikely to reach large scale soon.
However, looking broadly across the economic landscape, other factors are coming into play: In particular, the cost of renewables has come down dramatically. “The full cost of a reliable energy system is still much greater than some of the numbers you see bandied around for renewables, but there is a lot of investment now in battery and other energy storage technologies that are needed. I’m fairly confident it will pay off, and costs will come down. The same is true for other technologies. If there is a market for carbon capture systems, their cost will also come down,” says Rubin.
“The industrial sector is where carbon capture and storage is most likely to take hold since there are few other options for deep carbon reductions. But again, what is needed is a strong policy driver, because at the end of the day, CCS is an add-on technology, like scrubbers or water treatment plants, that increases the cost of the product. So, the environmental benefit of carbon capture only makes economic sense if there are broad restrictions on carbon emissions, or a market-based policy such as an emissions tax or a cap-and-trade program,” he says. “What we have today is a basket of carrots and few if any sticks.”
“I’ve increasingly underestimated the political power of vested interests. There is strong resistance to change when it affects their pocketbooks and livelihood. But the world is starting to come around now as climate impacts are getting more personal. It is taking a lot more time than I would like, so we are also going to be doing a lot of adaptation to climate change before we see substantial progress to 1.5°C. But history shows that once we get serious we eventually achieve our environmental goals, often at a much lower cost than anticipated.”
In the early 2000s, Uruguay experienced a crippling energy crisis. Lacking oil or natural gas reserves, the Uruguayan government realized that if the country were to become energy independent, it had to move forward with renewables. Today, more than 95% of Uruguay’s electricity is generated from renewable sources, chiefly wind and hydropower.
However, renewables are intermittent, so ensuring a reliable power supply at peak demand times requires an expensive over-installation of capacity. This, in turn, results in excess energy at times when demand is not at its peak. This excess energy can be sold to neighboring countries or states, but a problem remains: If the neighbors don’t need the energy, then what do you do with it?
This question challenged Ana Torres, an assistant professor in Carnegie Mellon’s Department of Chemical Engineering, who began her academic career in Uruguay.
She published several papers on storing excess renewable energy in batteries in Uruguay, but then “the country saw the opportunity to use the surplus of renewable energy to produce green hydrogen.”
With the Uruguayan electricity sector decarbonized, Torres, who focuses on process systems engineering for clean and sustainable energy, saw opportunities for using green hydrogen to decarbonize transportation, along with other aspects of the Uruguayan economy, including the chemical industry.
According to Torres, the International Renewable Energy Agency posits that “the hydrogen economy is expected to account for 12% of final energy use by 2050 in a below-1.5°C global warming scenario.” But generating large amounts of hydrogen from renewables is not cheap or easy. Green hydrogen results when water goes through a process called electrolysis that is powered by renewable sources. During this process, an electrolyzer splits the water into hydrogen and oxygen. There are different ways to make green hydrogen, but prior to Torres’ research, there wasn’t a systematic way to determine the best methods for commercial production.
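The underlying energy arithmetic helps explain the cost challenge. The figures below are standard textbook values, not numbers from Torres’ paper: splitting water requires at least about 39.4 kWh per kilogram of hydrogen, while practical electrolyzers draw roughly 50 to 55 kWh/kg.

```latex
% Overall electrolysis reaction and its ideal energy requirement
% (higher-heating-value basis; textbook values, not from the article):
\[
  2\,\mathrm{H_2O(l)} \;\longrightarrow\; 2\,\mathrm{H_2(g)} + \mathrm{O_2(g)},
  \qquad
  \Delta H^{\circ} = 285.8\ \mathrm{kJ/mol\ H_2}
  \;\approx\; 39.4\ \mathrm{kWh/kg\ H_2}
\]
```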
Using Uruguay as a test study, Torres spent two years creating an all-encompassing superstructure-based optimization framework that evaluates green-hydrogen production at different scales through the selection of power sources, different types and sizes of electrolyzers, and storage devices.
“This superstructure is a process of all processes,” says Torres. “We have this superstructure, as we call it, that is a master flow sheet that accounts for all the possibilities for producing green hydrogen from different sources of renewable energy. We can write a model from that, and we can solve that model using optimization techniques.”
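As a toy illustration of the idea, the sketch below enumerates a tiny, invented design space and keeps the cheapest configuration that meets a hydrogen target. The actual framework formulates this as a mathematical optimization over the full superstructure, including time-varying renewable supply; every number here is made up for demonstration.

```python
# Minimal sketch of superstructure-style technology selection, with
# invented costs and efficiencies (not data from Torres' study).
from itertools import product

ELECTROLYZERS = {               # capex $/kW, kWh per kg of H2
    "alkaline":    (500, 52),
    "PEM":         (900, 55),
    "solid_oxide": (1400, 42),
}
STORAGE = {"tank_small": 1e5, "tank_large": 3e5}   # capex $
POWER_KW = [2000, 4000, 8000]                      # candidate plant sizes
DEMAND_KG_PER_DAY = 1500                           # production target
CAPACITY_FACTOR = 0.45                             # renewables are intermittent

best = None
for (tech, (capex_kw, kwh_per_kg)), size_kw, (tank, tank_cost) in product(
        ELECTROLYZERS.items(), POWER_KW, STORAGE.items()):
    kg_per_day = size_kw * 24 * CAPACITY_FACTOR / kwh_per_kg
    if kg_per_day < DEMAND_KG_PER_DAY:
        continue                       # this design can't meet demand
    capex = capex_kw * size_kw + tank_cost
    if best is None or capex < best[0]:
        best = (capex, tech, size_kw, tank)

print(best)   # cheapest feasible (capex, electrolyzer, size, storage) design
```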
The framework’s flexibility allows technologies to be evaluated in a way that takes advantage of their different efficiencies and costs. To test the framework, she conducted four test cases that featured small-sized and large-scale hydrogen production plants. Two of the studies evaluated the technologies to find optimal combinations and ran what-if scenarios to identify synergies. The other two studies aimed to satisfy an application by considering only technology devices already on the market.
Through simulations, Torres determined that solid oxide electrolyzers hold future promise, and if we are to choose currently available market alternatives, then alkaline electrolysis is preferred over proton exchange membrane electrolysis. Electrolyzers are key to green-hydrogen production, and the insight from Torres’ research contributes much-needed information on the performance and operating costs of electrolyzer systems.
“In this model, we used data from Uruguay, but the idea now is to use data from the United States to see if the same processes that I observed in Uruguay will also be optimal in the United States,” says Torres. “My guess is that they won’t because here in the United States, it’s a much larger country with more climate options. For example, you have a desert where there is not much wind but more sun. That would change the economics of the process for sure, but it might also change which type of technology is more relevant for that region.”
“From a process design perspective, green-hydrogen production is a much more difficult problem than the problems we have been taught to solve. It’s very difficult because of the intermittences. We cannot design a process for a steady state. We must design processes that work at less than 90% or 100% capacity, and we have to be able to turn the systems on and off as required.”
Ana Torres’ paper, “Coupling time varying power sources to production of green-hydrogen: A superstructure based approach for technology selection and optimal design,” was published in Chemical Engineering Research and Design.
Sustainability is the driving factor behind decarbonization, as it’s widely understood that an energy system rooted in fossil fuels will have disastrous outcomes for human health and our planet. However, when modeling decarbonization, sustainability is far from the only consideration policymakers must make. Potential pathways to decarbonization are often evaluated on a least-cost basis, as many national investments are.
Though decarbonization will provide a net benefit for every American, that doesn’t guarantee that these benefits will be evenly distributed across society. To investigate this potential for continued inequality, Destenie Nock and Teagan Goforth have created a new framework for social impact assessment to identify who may be burdened nationally with higher levels of air pollution during the energy transition.
While the entire country will enjoy the benefits of a fully carbon neutral energy grid, on our current course, the environmental benefits of getting there will likely be unevenly distributed. The new framework, published in Nature Communications, will help policymakers achieve equal distribution in air quality benefits as our energy system decarbonizes.
“I hope that this will help move the decarbonization discussions towards considering the environmental and social lens,” says Nock, an assistant professor of civil and environmental engineering and engineering and public policy. “This forward-looking equality analysis can help identify potential disparities before we build a new energy system.”
Based on national strategies for decarbonization, Nock and Goforth, a graduate student in engineering and public policy, modeled the regional impacts on energy generation and air quality across 134 regions of the United States. They matched this with corresponding census data, overlaying demographic information with the annual distribution of airborne pollutants from energy generation predicted in a given region, based on the decarbonization strategy in question.
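The overlay step itself is simple arithmetic. Here is a toy version with two invented regions; the real framework spans 134 regions, multiple pollutants, and eight scenarios.

```python
# Minimal sketch of overlaying demographics with modeled pollution, using
# invented numbers (not data from Nock and Goforth's study): given each
# region's pollutant concentration and census counts by income group,
# compute each group's population-weighted average exposure.

regions = [
    # (modeled PM2.5 ug/m3 from energy generation, population by group)
    (12.0, {"low_income": 60_000, "high_income": 20_000}),
    (6.0,  {"low_income": 10_000, "high_income": 70_000}),
]

exposure, pop = {}, {}
for conc, groups in regions:
    for group, n in groups.items():
        exposure[group] = exposure.get(group, 0.0) + conc * n
        pop[group] = pop.get(group, 0) + n

for group in exposure:
    print(group, round(exposure[group] / pop[group], 2))
# low_income 11.14 vs. high_income 7.33: a disparity that a purely
# least-cost scenario may never close.
```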
Across the eight decarbonization scenarios evaluated within their framework, they found that none achieved equality in air quality before their mandated year for decarbonization, whether it was 2035 or 2050. Every scenario involving at least some future decarbonization efforts would reduce air pollutants for the lowest income group by at least 20 percent; however, inequalities persisted in every scenario. This held true across every demographic factor they evaluated, including income group, poverty level, and race/ethnic background.
Nock and Goforth concluded from their analysis that any decarbonization scenario based solely on the least-cost paradigm is at risk of allowing local air pollution inequalities to persist. The framework they’ve created to analyze equality in the distribution of air pollution can help policymakers as they work through a difficult balancing act with numerous economic, environmental, technical, and social factors. The authors contend that to prioritize equal distribution in air quality, our pathway to decarbonization must be guided by strict mandates or be consistently driven by a focus on equality throughout the energy transition.
“We have shown that a single objective of minimizing cost doesn’t ensure equitable distribution of benefits in terms of local air pollutants. Air pollution disparities need to be considered throughout the transition,” says Nock. “When crafting public policy for energy transitions, decision-makers can use this work as a source for indicating the need for holistic multiple objective approaches to energy system planning and to ensure an equitable and sustainable future.”
Noting several opportunities for further investigation, Nock and Goforth question how outcomes might differ from an assessment based on optimizing equality, rather than minimizing cost. There’s also room for expanding their analysis to consider potential trade-offs between equal air pollution distribution and other equality, equity, environmental, and cost objectives. A multidisciplinary perspective will be crucial to managing this complicated array of trade-offs.
Elvin Vindel, a Ph.D. student in civil and environmental engineering, led the creation of a model that can cut emissions from buildings and improve the overall efficiency of the grid. The paper was co-authored by his advisors Mario Bergés and Burcu Akinci, both professors of civil and environmental engineering.
In the U.S., buildings represent more than 70 percent of electricity consumption and a large portion of the carbon emitted by the energy sector. The sustainable integration of renewable energy in the electrical grid requires additional sources of operational flexibility. Renewable energy sources like wind and solar provide much cleaner energy than fossil fuels. However, the variability introduced by relying on natural processes for power generation will require a grid more adaptable to changes in the quantity of energy available. The Department of Energy (DOE) has
recognized the potential in reimagining buildings as flexible assets through the implementation of technologies and innovations that increase efficiency and reduce carbon emissions.
The team’s model for demand flexibility in heating, ventilation, and air conditioning (HVAC) systems offers win-wins in energy demand management for building managers and grid operators alike. Demand flexibility is the capacity of a building to actively change its energy consumption—an important property that can help balance energy demands across the power grid.
Existing methods for estimating the flexibility of commercial buildings have focused on thermal flexibility while simplifying the response of the HVAC system. This approach often leads to low accuracy in flexibility predictions, which has so far limited the prospect of using buildings for grid services that improve reliability. Additionally, a fundamental
challenge in developing more accurate models is encompassing the heterogeneity in the building population and in individual comfort preferences.
The new model defines this flexibility as a property of the installed mechanical system by leveraging the data streams generated and stored in modern building automation systems. The proliferation of smart meters and sensors in over 60 percent of commercial buildings is providing these researchers more real-time data from HVAC systems than ever before. Their proposed model provides a more accurate demand flexibility prediction for drops in demand compared to existing approaches while supporting a scalable model acquisition process for widespread use in commercial buildings.
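As a simplified illustration of why HVAC systems hold curtailable power, consider how much fan power a building could shed by dropping airflow to its ventilation minimum. This is a generic fan-affinity-law calculation with invented numbers, not the team’s published model.

```python
# Minimal sketch: fan power grows roughly with the cube of airflow
# (fan affinity laws), so cutting airflow to the minimum that
# ventilation allows frees a predictable number of kilowatts.

def shed_capacity_kw(fan_power_kw: float,
                     airflow_cfm: float,
                     min_airflow_cfm: float) -> float:
    """Curtailable fan power if airflow drops to its ventilation minimum."""
    ratio = min_airflow_cfm / airflow_cfm
    return fan_power_kw * (1 - ratio ** 3)   # affinity-law approximation

# e.g., a 40 kW fan at 50,000 cfm that may fall to 30,000 cfm
print(round(shed_capacity_kw(40, 50_000, 30_000), 1))   # ~31.4 kW
```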
Better models for estimating demand flexibility suggest new ways that building managers and utility operators can coordinate to achieve mutually beneficial reductions in energy demand. If a building can provide the grid with a more accurate and timely picture of its demand flexibility, then grid managers can better balance and actuate energy demands while providing incentives to building managers who are able to lower their demand when needed. The team has already tested the model on simulations of three buildings under varied climates across the US. They’re planning further testing on real-world HVAC systems starting this summer to validate the findings of this latest research.
Carbon dioxide emissions from the combustion of fossil fuels accounted for 73% of total U.S. greenhouse gas emissions in 2020. Researchers at Carnegie Mellon University are exploring ways to reduce emissions from power plants using post-combustion carbon capture via absorption through additively manufactured stacked columns.
Post-combustion capture predominantly uses chemical solvents to separate carbon dioxide out of the exhaust gas from fossil fuel combustion. Traditional packed columns—structures filled with a packing material that allows fluid to move from one end to the other and enables gas to react with a higher proportion of liquid solvent—are being explored for industrial application. To deploy packed columns, their hydrodynamic factors need to be carefully evaluated. This is where Grigorios Panagakos, an assistant research professor of chemical engineering, comes in.
Panagakos’ research group worked through the National Energy Technology Laboratory (NETL) in partnership with the Pacific Northwest National Laboratory (PNNL) and the Lawrence Livermore National Laboratory (LLNL) to design and test the hydrodynamics of an additively manufactured column called the Schwarz-D. Under Panagakos’ lead, the PNNL team conducted 3D multiphase flow simulations to understand the hydrodynamic characteristics of the stacked column, while LLNL 3D-printed packings of the Schwarz-D geometry and conducted experimental studies on those structures.
The Schwarz-D is a triply periodic minimal surface (TPMS) structure that can be used simultaneously for heat and mass transfer. TPMS structures consist of two interpenetrating fluid volume domains separated by a thin wall. TPMS structures exist throughout nature (e.g. biological membranes and rock crystals) and exhibit superior structural strength using a minimum amount of material. TPMS structures also exhibit improved heat transfer and mass transfer properties due to higher specific area and excellent permeability, making them a good choice for heat and mass transfer systems.
“The major driver for any technology is the cost,” said Panagakos. “If we can optimize the column to be smaller and more efficient, we make the technology more viable for industrial use, but in order to do that we have to take into account all of the physics behind it.”
To enable the maximum amount of CO2 to transition from gas to liquid, the group needed to monitor the interfacial area between the two phases. They extensively explored the impacts of the liquid and gas loads, solvent properties, and contact angle—a quantitative measure of the wetting of a solid by a liquid—on the interfacial area and pressure drop in the Schwarz-D packed column. The simulations were also validated against experimental data gathered by the group. Furthermore, the performance of a traditional monoethanolamine (MEA) solvent was compared to that of a state-of-the-art, water-lean solvent called EEMPA (N-(2-ethoxyethyl)-3-morpholinopropan-1-amine).
“We are working with our collaborators on how to take the very detailed information that we gain through our simulations and connect it on a multi-scale level modeling approach with higher level systems.”
Post-combustion technology is relevant to all point sources of CO2 emission and can therefore be an essential tool in decarbonizing not only the energy sector but industries like cement, petrochemical, and steel as well. This research was the first time simulations were used to understand a two-phase system using TPMS geometries. The outcomes advance the understanding of the hydrodynamics of additively manufactured packed columns and provide insight into potential opportunities in chemical process intensification more broadly.
The work was sponsored under the Carbon Capture Simulation for Industry Impact (CCSI2) family of projects, a flagship project on carbon management from the U.S. Department of Energy.
Just as blood pressure informs heart health, intracranial pressure (ICP) helps indicate brain health. ICP sensing is the burgeoning focus of Jana Kainerstorfer’s biomedical optics lab at Carnegie Mellon University. Her team is working to modernize ICP sensing approaches, which historically have been invasive and risky. Their noninvasive alternatives will ease risk of infection, pain, and medical expenses, as well as present new monitoring capabilities for patients with an array of brain injuries and conditions, from stroke to hydrocephalus.
Investigating pressure levels in the brain is a laborious task for health professionals, and methods haven’t progressed much since the 1960s. Current practice involves drilling a hole into a patient’s skull and placing a probe inside for continuous monitoring of ICP levels. The procedure carries risks of infection and of damaging the brain itself, so although the data are valuable, ICP measurement is reserved for only the most critical situations.
“At the core of it, what we’ve done is build a sensor alternative that doesn’t require drilling a hole into the patient’s head,” said Kainerstorfer, an associate professor of biomedical engineering at Carnegie Mellon University. “We published two papers that explore the use of optical sensors on the forehead for noninvasive ICP monitoring, using near-infrared spectroscopy and diffuse correlation spectroscopy. Both approaches represent huge strides in improving the patient experience and providing better tools to monitor pressure levels in the brain, which can be a key variable in both diagnosis and treatment decisions.”
The first body of work, in collaboration with Matthew Smith, an associate professor of biomedical engineering at CMU, was published in Neurophotonics, and demonstrates that noninvasive near-infrared spectroscopy can be used for ICP monitoring. The process involves small, customized sensors that are placed on the head—think a Fitbit for the brain—to measure light that has interacted with the brain. Every time the heart beats, it produces a pulse in the brain, and the sensors measure hemodynamic blood volume/blood flow changes. Then, a machine learning algorithm is applied to determine ICP.
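To give a flavor of that final step, here is a minimal sketch with synthetic data. The features and the regressor are stand-ins chosen for illustration; the published model’s inputs and architecture differ.

```python
# Minimal sketch of learning a pulse-features-to-ICP mapping on synthetic
# data (invented features, not the study's actual pipeline).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend per-heartbeat features: pulse amplitude, width, decay slope
X = rng.normal(size=(500, 3))
# Synthetic "ground truth" ICP from an invasive monitor (mmHg)
icp = 10 + 4 * X[:, 0] - 2 * X[:, 2] + rng.normal(0, 0.5, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:400], icp[:400])             # train on monitored patients
print(model.score(X[400:], icp[400:]))    # accuracy on held-out beats
```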
“Our aim is to create a portable and user-friendly tool to measure pressure changes in the brain,” explained Kainerstorfer. “We don’t believe that fancy hardware is needed to measure blood flow or hemoglobin concentration, and right now, we don’t have a good handle on how much brain pressure even fluctuates. There are other non-invasive methods that are out there, but what sets us apart is that we’re using a sensor that can be made wearable and offers continuous monitoring.”
Kainerstorfer’s group also partnered with UPMC Children’s Hospital of Pittsburgh, in collaboration with Dr. Michael McDowell, to conduct a clinical study utilizing more specialized sensors and hardware to measure ICP via noninvasive diffuse correlation spectroscopy. Fifteen pediatric patients from UPMC’s pediatric intensive care unit participated, and the results were published in the Journal of Neurosurgery.
As part of the study, a probe with optic fibers was placed on the foreheads of pediatric patients who already had invasive ICP monitors, measuring blood flow changes to quantify ICP. The results of the noninvasive model were compared with those of invasive monitoring and found to be very similar.
“In this study, the patients had an invasive sensor placed in their brain, and it served as our ground truth,” elaborated Kainerstorfer. “We worked alongside outstanding collaborators and were encouraged by the results we saw, which indicated that a noninvasive approach could yield accurate ICP measurement. It still requires very specialized hardware and expertise to interpret the data, but it’s much better than drilling a hole in the head.”
Noninvasive intracranial pressure sensing stands to benefit a wide range of people, from critical care and emergency medicine physicians to parents of children with conditions like hydrocephalus. Hydrocephalus is a condition caused by too much brain fluid in the head, which puts pressure on the brain. To treat it, surgeons implant a shunt to drain the fluid; however, a reliable tool to non-invasively measure shunt function does not exist.
“Imagine a six-year-old who was born with hydrocephalus travels to Walt Disney World Resort with her family,” related Kainerstorfer. “If, during their trip, she starts to exhibit symptoms of shunt failure, like headaches or drowsiness, the family currently has no way to quickly measure ICP to see if it has increased. The only solution is to run to the ER and get CT/MRI testing and incur expensive medical bills. Many families live in a constant state of fear regarding shunt failure and being too far from a hospital capable of fixing the problem. Further, even if such a hospital is available, it often does not have access to the past medical records necessary for comparison, making diagnosis of shunt failure more difficult. Our work suggests there is a better way—a user-friendly portable sensor that can accurately measure ICP.”
Next steps with the work include plans for a larger, multi-center clinical study of adult and pediatric patients.
“ICP sensing was a side project for our lab initially,” said Kainerstorfer. “I come from an understanding of what influences blood flow, and how those changes influence neurons. My interest in this topic was piqued from an engineering challenge perspective, but also a personal empathy for patients who live with this uncertainty and don’t have any noninvasive way of evaluating brain pressure changes. Trying to make someone’s life better by an engineering approach is why I’m in this field.”
Akin to when Model Ts traveled alongside horses and buggies, autonomous vehicles (AVs) and human-driven vehicles (HVs) will someday share the road. How to best manage the rise of AVs is the topic of a Carnegie Mellon policy brief, Mixed-Autonomy Era of Transportation: Resilience & Autonomous Fleet Management.
Carlee Joe-Wong, one of the brief’s authors, says, “Once AVs begin to deploy, there’s probably not going to be any going back. So, there is a need to start talking about policies now, to study them thoroughly and get them right by the time AVs arrive.”
Joe-Wong, an associate professor of electrical and computer engineering, and her fellow researchers asked, “What is different when you have AVs in traffic compared to just having HVs? We realized that one of the main differences is that AVs are altruistic and HVs are selfish.”
AVs can anticipate what is going to happen and reroute themselves, such as in the event of road construction or
an accident. Programmed to operate safely and follow rules, AVs can take altruistic actions that benefit other vehicles and not just themselves. Humans in a hurry may not be so generous with their time.
The price of selfish driving becomes evident when examining traffic flow. As selfish-behaving cars move in and out of a traffic system, the system eventually reaches equilibrium, a balanced state, but traffic may not be flowing efficiently. For example, equilibrium can be reached when traffic snarls along bumper-tobumper. “Sometimes equilibrium is far from optimum,” says Joe-Wong.
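To see how a selfish equilibrium can underperform, consider Pigou’s classic two-road example, an illustration of the phenomenon rather than a model from the brief, sketched below.

```python
# Pigou's two-road network: road A always takes 1 hour; road B takes
# x hours when a fraction x of drivers use it.

def avg_time(x_on_b: float) -> float:
    """Average travel time when a fraction x_on_b of drivers take road B."""
    return (1 - x_on_b) * 1.0 + x_on_b * x_on_b

# Selfish equilibrium: drivers pile onto B until it is as slow as A (x = 1).
print(avg_time(1.0))   # 1.0 hour on average
# Optimum: a coordinator sends only half the drivers to B.
print(avg_time(0.5))   # 0.75 hours; coordinated routing saves 25%
```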
Altruism could improve traffic flow by avoiding suboptimal equilibria, and not everybody has to be nice. In simulations, altruistic states occur when AVs make up 20% to 50% of the vehicles on the road.
All these AVs have the capacity to work in sync, yet centrally controlling thousands of AVs will lead to computation and communication concerns. The researchers want to strike a balance between centralized and decentralized policies using reinforcement learning, a machine learning training method.
“Under some conditions, you really need reinforcement learning intelligence, but in other conditions, that reinforcement learning is just telling you to do what you probably would have done anyways,” says Joe-Wong.
The team suggests that fleet operators train models to manage AV fleets locally. If new traffic patterns occur, then the models are updated, especially to direct people away from incidents. However, if traffic flows unabated, then fewer updates are needed, which reduces the communications between AVs on the road and AVs reporting back to a centralized server.
Operating at optimal equilibrium, applying reinforcement learning, and having a higher proportion of collaborative AVs will reduce congestion. However, to address cascading failures that could cripple a transportation network, the researchers factored in other modes of urban transportation. They added bus, subway, railway, and bike-sharing systems to their models and demonstrated that redistributing passengers among different modes of transportation would maximize the use of the entire network and prevent it from overloading and failing.
Based on these findings, the team recommends that, when planning agencies create traffic flow redistribution policies for AVs, they consider incorporating multiple interdependent transportation systems to keep people moving.
In the era of mixed autonomy, altruistic AVs could act as coordinators that keep traffic flowing by eliciting positive actions from HVs. Although it will take time before AVs outnumber human-driven vehicles, all drivers will notice improved traffic flows with even a partial adoption of AVs.
This brief received support from the National University Transportation Center for Improving the Mobility of People and Goods (Mobility21) at CMU.
Over the last decade, the meteoric rise of ridesourcing services like Uber and Lyft has transformed the urban landscape, affecting travel patterns, car ownership, and congestion, and more broadly, the economy, the environment, and equity.
The ways in which Uber and Lyft are redefining mobility is the focus of a policy brief series, “Uber and Lyft in U.S. Cities: Findings and Recommendations from Carnegie Mellon University Research on Transportation Network Companies (TNCs).”
The brief series, a compilation of studies conducted by Jeremy Michalek, the lead author, and other Carnegie Mellon Engineering researchers, delves into the implications and opportunities that TNCs present.
“It is important for us to understand whether TNCs are providing social benefits and whether cities should have a friendly or more skeptical posture toward them,” says Michalek, professor of engineering and public policy and mechanical engineering.
On the plus side, the researchers found that TNCs have increased economic growth, employment, and wages for intermittent jobs in U.S. cities.
“However, Uber and Lyft affect different kinds of cities differently, and that is important to understanding their impact,” explains Michalek.
For example, TNCs are not a reliable way to reduce car ownership. When TNCs entered U.S. cities, car ownership increased in car-dependent and slow-growth cities, and TNCs displaced transit ridership most in cities with high income and fewer children.
The costs that Uber and Lyft impose on cities are not clear-cut, either. The research reveals that Uber and Lyft can clean the air but clog the streets. Taking an Uber instead of a personal vehicle can reduce air pollution costs by 9 to 13 cents, but the extra driving to and from passengers increases costs from congestion, crash risk, and climate change by about 45 cents. “You create lower external costs to society when you drive your personal vehicle, on average,” says Michalek.
Responding to environmental concerns and public interest, Uber and Lyft have committed to electrify more vehicles in the future, but policy interventions to encourage electrification of TNC fleets may still be warranted. The researchers conducted simulations and found that when TNCs are faced with the air emissions costs that their fleets impose on society, they electrify more of their fleets. This was shown to reduce air emission costs by 10% in New York City and up to 22% in Los Angeles.
The researchers note that the best fleet is typically a mix of electric and gas-powered vehicles.
Finally, the brief considers the capacity for Uber and Lyft to reveal and potentially address societal inequities. In one study, the researchers examined TNC ridership during heatwaves in New York City in 2019 and found that ridership increased more in high-income neighborhoods than in low-income neighborhoods, suggesting that low-income riders are left to endure more extreme heat and humidity.
“We’ve been doing a lot of work on Uber and Lyft over the past six years, and this brief series provides a compact summary of what we have learned and what we recommend, both for cities and for travelers wishing to reduce the negative effects of their travel choices on society,” says Michalek.
This brief received support from the National University Transportation Center for Improving the Mobility of People and Goods (Mobility21) at Carnegie Mellon University.
Sometimes walking isn’t as simple as one foot in front of the other, especially when building robots that walk on two legs. Aaron Johnson’s Robomechanics Lab, which aims to create robots that are suitable for field work, recently experimented with several foot designs to give a 15-centimeter biped the steady gait it would need to perform complex tasks. Their robot combines the mobility advantages of passive dynamics and the minimal hardware of actuators to form an easily constructed device that, despite its miniature build, could make a big difference in the future of areas like natural disaster recovery.
With legs that are 15 centimeters long, this biped is on the smaller side for a walking robot. However, by understanding the passive dynamics and using simple designs, the team showed that a robot this small can actually walk. The simplicity of these designs—in this case, one torso, two upper legs with an actuated joint in each, and a passive hip joint—makes quasi-passive walkers an ideal candidate for conducting field work.
“The free walking is fully power autonomous, unsupported by structure or offboard electronics,” says Justin Yim, a postdoctoral researcher in the Robomechanics Lab who worked on the project with Johnson, an associate professor of mechanical engineering. “This robot carries its own battery and processor board and doesn’t need to be attached to anything else to walk. It just needs flat ground.”
One of the most challenging parts of the design is the feet. Johnson and his team tested multiple spherical designs to determine which one would let the robot maintain its balance and stability. The team’s first choice would have been concentric feet with a gap, which look like a sphere with a wedge cut out of the middle. But the biped landed on the edge of the foot instead of the surface as it walked, and its gait was unstable. A second option, concentric feet without a gap, comprised almost a full sphere. The team observed the biped roll from one foot to another while rocking from side to side; however, instability was still an issue. Non-concentric feet wound up being the most practical fit. In this design, the feet have a wider stance with a gap between them, and they land on the face instead of the edge as they walk.
Yim compares this experiment to trying on shoes for a robot. Because the team stuck to spherical designs, it’s less of a debate over heels and flats, and more like a decision between different pairs of tennis shoes to see which ones offer enough support in the right places. A perfect fit for bipeds allows them to teeter as they walk, without losing their balance.
Choosing non-concentric gap feet as the best design marks a baby step towards understanding how smaller bipeds can be applied to more difficult tasks. Logical possibilities for miniature, legged robots include environmental monitoring or natural disaster recovery, particularly for investigating hard-to-reach crannies and crevices. The idea behind monitoring how bipeds walk with different types of feet is to establish what movements are easy and difficult to make. Understanding the physics of walking can shed light on how bipeds would perform in a variety of situations, and how feasibly their designs could be modified to accommodate payloads like sensors or cameras.
“This robot isn’t the most extreme in anything,” says Yim. “But I think it’s in a unique space. There have been other walkers that use fewer motors, but they weren’t freely walking, they were attached to a boom. Ours isn’t the smallest walker ever, but it does use these passive dynamic-inspired walking techniques that are not as common in the smallest scales.”
Globally, one in four adults over the age of 25 will suffer a stroke in their lifetime, and 75% of those people will have lasting deficits in fine motor control. Until now, no treatments have been effective against paralysis in the so-called chronic stage, which begins six months after the stroke.
Technology developed by Douglas Weber in collaboration with the University of Pittsburgh is offering new hope for people living with impairments that would otherwise be considered permanent.
The team discovered that muscles respond directly to electrical stimulation of specific spinal cord regions, enabling patients to regain mobility of their arm and hand.
Spinal cord stimulation technology uses a set of electrodes placed on the surface of the spinal cord to deliver pulses of electricity that activate nerve cells inside the spinal cord. Research groups around the world have shown that this stimulation can be used to restore movement to the legs, but the complexity of the neural signals controlling the unique dexterity of the human hand and arm adds a significantly higher set of challenges.
By engaging intact neural circuits located below the lesion, a pair of thin metal electrodes resembling strands of spaghetti implanted along the neck allow stroke patients to fully open and close their fist, lift their arm above their head or use a fork and a knife to cut a piece of steak for the first time in years.
“The sensory nerves from the arm and hand send signals to motor neurons in the spinal cord that control the muscles of the limb,” says Weber, a professor of mechanical engineering. “By stimulating these sensory nerves, we can amplify the activity of muscles that have been weakened by stroke. Importantly, the effect only strengthens muscle activation when the patients are trying to move, providing assistance only as needed so the patient retains full control of their movement.”
Nikhil Verma, a Ph.D. student in Weber’s lab, has been monitoring muscle activity to have a direct measure of the effects of stimulation on the spinal cord.
“To make this an effective therapy, we need to develop a robust method for controlling the pattern and timing of electrical stimulation applied to the spinal cord,” said Verma.
“By measuring activation levels of muscles, we can detect when the patient is trying to move and then deliver targeted stimulation to assist the movement.”
Verma's research used electromyography (EMG) sensors to measure the effects of stimulation on muscle recruitment and to detect movement intentions, generating a control signal for patterning spinal stimulation to facilitate the intended movement.
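As an illustration of this kind of controller, here is a minimal sketch. The signals, window length, and threshold are invented for demonstration and are not the trial’s actual control algorithm.

```python
# Minimal sketch of EMG-gated assistance: rectify and smooth the EMG,
# then enable stimulation only while the patient's own effort exceeds
# a threshold (all parameters invented for illustration).
import numpy as np

def stimulation_gate(emg: np.ndarray, fs: int, threshold: float) -> np.ndarray:
    """Return a boolean array: True where stimulation should be on."""
    win = fs // 10                                   # ~100 ms moving average
    envelope = np.convolve(np.abs(emg), np.ones(win) / win, mode="same")
    return envelope > threshold

fs = 1000                                            # 1 kHz EMG sampling
t = np.arange(0, 2, 1 / fs)
emg = 0.05 * np.random.randn(len(t))                 # resting noise
emg[500:1500] += 0.4 * np.random.randn(1000)         # attempted movement
gate = stimulation_gate(emg, fs, threshold=0.1)
print(gate[:500].mean(), gate[500:1500].mean())      # ~0 at rest, ~1 during effort
```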
Early clinical trial results hint at a promising future for spinal cord stimulation as a neurorehabilitation strategy for a sizeable proportion of stroke survivors. Unexpectedly, the effects of stimulation were longer-lasting than scientists originally thought and persisted even after the device was removed, suggesting it could be used both as an assistive and a restorative method for upper limb recovery.
Moving forward, researchers continue to enroll additional trial participants to understand how many stroke patients can benefit from this therapy and how to optimize stimulation protocols for different severity levels.
“Thanks to years of preclinical research building up to this point, we have developed a practical, easy-to-use stimulation protocol that could be easily translated to the hospital, especially since we can use and adapt existing FDA-approved clinical technologies,” said Marco Capogrosso, an assistant professor of neurological surgery at the University of Pittsburgh.
Breastmilk has long been considered an ideal way to feed a growing baby. Chock-full of nutrients that are easily digested, it supports brain growth, eyesight, and the development of an infant’s gastrointestinal, nervous, and immune systems.
Now, researchers at Carnegie Mellon University are exploring ways to utilize the unique properties of breast milk to develop a novel approach to infant disease therapy.
After giving birth to her daughter in 2016, Kathryn Whitehead grew curious about the properties of breast milk. As she began researching the topic, she learned of the millions of living cells present in every milliliter.
“When a child consumes breastmilk, they are also consuming their mother’s cells,” said Whitehead, a professor of chemical engineering.
Although human cells are one of the least studied components of breast milk, previous work has demonstrated their remarkable properties. These properties include traveling out of the gastrointestinal tract and integrating into an infant’s tissue, where the cells can proliferate and remain into adulthood.
“This is an amazing phenomenon,” says Whitehead. “As a drug delivery research group, we spend a lot of time thinking about how to transport proteins and other therapeutic molecules out of the gastrointestinal tract. These cells may inspire us as to how to do that.”
Whitehead believes that if her lab can figure out how these cells achieve transport through the body, they could potentially be used as drug delivery vehicles for therapeutic purposes, something that has never been done before.
Her group’s first paper on the topic, now published in the journal Science Advances, takes the first important steps toward this goal. Here, Whitehead’s group characterizes the cells in mature stage breastmilk at the protein, gene, and transcriptome levels, providing a genetic fingerprint through which to track maternal cells in the infant’s body.
These results will pave the way for Whitehead’s long-term vision of genetically engineering the cells to orally treat infants with medicines and vaccines or to tolerize them to allergens.
“Our approach will be non-invasive both for the mother and the baby,” said Whitehead.
Now that Whitehead’s lab has identified the maternal cellular components of milk, they will focus
on tracking the different types of maternal cells as they move through the infant. It is possible that only certain cell types (e.g., lactocytes or immune cells) will have the transport and medicine production abilities to achieve Whitehead’s long-term vision. Once these optimal maternal cells are identified, the team will then engineer those cells for therapeutic purposes.
Whitehead has drawn plenty of support for the project. In 2018 she received the prestigious National Institutes of Health (NIH) Director’s New Innovator Award, given to exceptionally creative scientists proposing high-risk, high-impact research. She also received funding for preliminary research from CMU Chemical Engineering alumnus Jon Saxe and his wife, Myrna Marshall.
Consumers are used to quick deliveries. Order a product one day and it is delivered the next. But the logistics behind this massive movement of goods and the environmental impact mean that better solutions are needed to balance consumer demand and the energy consumption of “last-mile” deliveries.
To address this, Costa Samaras and Thiago Rodrigues researched how to accommodate “an increase in the demand for last-mile delivery while trying to reduce the environmental impacts of the transportation sector.” Their work was published in the August issue of Patterns.
Rodrigues, a Ph.D. candidate in civil and environmental engineering, explains that companies are evaluating autonomous vehicles for last-mile delivery. “We focused on understanding the impacts on the energy consumption and greenhouse gas (GHG) emissions of this transformation in how we deliver packages,” he says. The research was funded by the Department of Energy and includes authors from three departments within Carnegie Mellon University.
The experiments defined the principles that influenced an aircraft’s overall energy consumption. “Most of the literature focuses on helicopters,” Rodrigues says. “So, we adapted the physical model and created an experimental protocol to replicate the flight conditions expected during last-mile delivery.”
The team learned that payload mass and total flight duration were the main contributors to the drone’s overall energy consumption. “For small packages with high value, such as medical supplies and electronics, the quadcopter drones showed a considerably low energy consumption per mile traveled compared to other transportation modes.”
A comparison of last-mile delivery modes by energy consumption and CO2 emissions showed that very small quadcopter drones and e-cargo bikes are most efficient.
Surprisingly, drone speed and wind speeds had little impact on the drone’s overall energy consumption. Testing took place with the drone flying at speeds between four and 12 meters per second. During that time, the wind conditions varied from two to 16 knots.
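The resulting energy model is simple: power grows with payload, and energy is power multiplied by time aloft. The coefficients below are invented placeholders; the study fits such parameters from flight experiments.

```python
# Minimal sketch of the relationship the team measured (invented
# coefficients): drone energy per delivery scales with payload mass
# and time aloft, while cruise speed matters comparatively little.

def delivery_energy_wh(payload_kg: float, flight_min: float,
                       base_w: float = 180.0, w_per_kg: float = 70.0) -> float:
    """Energy (Wh) = (hover/avionics power + payload-lift power) x time."""
    power_w = base_w + w_per_kg * payload_kg
    return power_w * flight_min / 60.0

print(delivery_energy_wh(0.5, 10))   # ~36 Wh for a small parcel
print(delivery_energy_wh(2.0, 10))   # heavier payload, same route: ~53 Wh
```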
For last-mile deliveries, adopting quadcopter drones for small package
deliveries could result in substantial energy savings while lessening GHG emissions. “Drones can have up to 94% lower energy consumption per package than other vehicles.” The overall amount of emissions reduction depends on the intensity of the electricity grid in an area. “Regions with cleaner electricity would benefit more from adopting drones to transport small packages,” he says.
While the researchers determined multiple benefits in utilizing small drones, Rodrigues admits that there are operational and regulatory challenges that need to be addressed. “However, a few drone delivery operations are already being implemented, with medical supplies and even groceries being safely delivered by drones. These operations are leading the way in expanding the use of drones in all last-mile delivery sectors.”
There’s also the issue of larger packages, which cannot be delivered by small drones. Rodrigues suggests that electric cargo bicycles and ground autonomous delivery robots could be energy-efficient options for those deliveries.
This study builds on previous research led by Samaras and Sean Qian, both professors of civil and environmental engineering, which analyzed how drones, autonomous vehicles, robots, and intelligently managed infrastructure may improve the cost and energy efficiency of the first and last mile of goods transportation.
For nearly 20 years, researchers at Carnegie Mellon’s CyLab Security and Privacy Institute have been advocating for consumers, gathering insights and developing tools to better inform users about the technology they use and bring into their homes. From finding innovative ways to convey privacy information to users visiting websites or using mobile apps, to creating security disclosures for IoT devices, CyLab has answered the call.
So, it’s no surprise that Lorrie Cranor, CyLab Director and professor in CMU’s Software and Societal Systems (S3D) and Engineering and Public Policy (EPP) departments, was invited to speak at an April 2022 Federal Communications Commission virtual public hearing on broadband consumer labels. The FCC issued a Notice of Proposed Rulemaking (NPRM) in January 2022 to move forward with a proposal that would require broadband internet providers to display easy-to-understand labels, enabling consumers to comparison shop for broadband services.
“We have a long history of doing privacy research that involves consumers,” said Cranor. “So, when the FCC staff asked me to comment on what would make broadband internet labels useful, I told them that it was important to survey consumers to find out.”
The FCC has been tinkering with the idea of broadband labels since 2009. In 2016, it created templates for broadband providers to use. The industry, however, was hesitant to adopt the new labels, and without legislation, the initiative fell by the wayside. However, the Infrastructure Investment and Jobs Act now requires the FCC to mandate that providers create consumer-friendly labels with information about their broadband services.
In response to the FCC’s request for comment, CyLab conducted a large-scale user study of more than 2,500 participants, uncovering the information most important to consumers shopping for broadband internet service and determining what terminology and presentation formats make this information most understandable and useful.
In the first phase of the study, experts Cranor, Jon Peha, a professor in EPP who served as the FCC’s Chief Technologist back when broadband labels were first proposed, and Christopher Choy, a master’s student in S3D, led a team that examined the FCC’s proposed 2016 broadband consumer labels to determine what information users found helpful and what was confusing.
Findings revealed that participants strongly supported the idea of broadband labels and that consumers were most concerned with cost, speed, and reliability when making purchasing decisions. Participants were also interested in measures of performance beyond downstream speed. Moreover, they cared about broadband performance both during typical periods and when service is much worse than normal, rather than under the best possible conditions typically touted in today’s advertisements. Consumers said they struggled to compute total service cost over a several-year span using the FCC’s version of the labels, and many were unfamiliar with the technical terms used.
With these insights, CyLab’s researchers went to work, developing and proposing their own set of broadband labels. Students Ellie Young and Megan Li iterated on label design, compiling the information consumers were most interested in, clearly and concisely, in a tabular format.
Next, the team surveyed participants to compare the effectiveness of their new label design with the FCC’s 2016 labels.
Overall, CyLab’s proposed labels performed better than the FCC’s. Consumers appreciated having access to more detailed information and found the new labels easier to use and comprehend.
During the comparison, participants further expressed their desire to know the total cost of their internet plan without any ambiguity and were happy with the way CyLab’s labels incorporate taxes, fees, and optional monthly charges and discounts. Results also revealed that consumers are interested in knowing how much the listed performance metrics could drop during peak times, along with explanations for any technical terms.
The CyLab researchers made further improvements to their label designs based on their survey results, introducing a layered design with a summary and full version to help address the needs of a wider range of consumers.
“Broadband labels should balance the needs of consumers who value simplicity and conciseness with those who value detailed information,” explained Choy.
CyLab’s researchers also believe the FCC should require broadband service providers to deposit detailed plan information in a standardized computer-readable form. They say this information should be housed in a publicly accessible database, enabling third parties to generate customized labels for consumers, and offer comparison shopping tools, quality of experience or suitability ratings, and other value-added services.
“We would love to see a website where consumers enter their zip code, some information about the number of people in their household, and the types of internet services they use, and in turn get a plan comparison tailored to their needs,” said Cranor. “A database of broadband label information would help make that happen.”
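As a rough illustration of what such a standardized, computer-readable plan record might contain, consider the sketch below. The field names and the total-cost calculation are hypothetical, invented for illustration rather than taken from the FCC’s or CyLab’s materials.

```python
# A hypothetical machine-readable broadband plan record of the kind the
# researchers envision providers depositing in a public database.
import json

plan = {
    "provider": "ExampleISP",            # all values invented
    "plan_name": "Home 300",
    "monthly_price_usd": 49.99,
    "taxes_and_fees_usd": 8.50,
    "promo_price_usd": 39.99,
    "promo_months": 12,
    "typical_download_mbps": 300,
    "typical_upload_mbps": 20,
    "peak_hour_download_mbps": 180,      # how far speed can drop at peak
    "typical_latency_ms": 18,
    "data_cap_gb": None,                 # None = unlimited
}

def total_cost(plan, months=24):
    """Total cost over a term -- the figure consumers said they struggled
    to compute from the FCC's 2016 labels."""
    promo = min(months, plan["promo_months"])
    full = months - promo
    extras = plan["taxes_and_fees_usd"]
    return (promo * (plan["promo_price_usd"] + extras)
            + full * (plan["monthly_price_usd"] + extras))

print(json.dumps(plan, indent=2))
print(f"24-month total: ${total_cost(plan):,.2f}")
```

A record like this is exactly what would let a third-party website compute and compare total costs across providers automatically.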
The researchers hope their proposed label designs and research report will help the FCC staff better account for consumer needs and preferences as they draft broadband label requirements.
Prem Solanki and Yoed Rabin of Carnegie Mellon University’s Biothermal Technology Laboratory have developed a new way of modeling the solidification of fluids in confined volumes, with applications to organ and tissue preservation.
Cryogenics is the study, production, and use of cold temperatures. Cryopreservation is the study and practice of storing biomaterials at low temperatures. The field is almost 70 years old. Practical applications of cryopreservation include the preservation of stem cells, sperm cells, embryos, corneas, pancreatic islets, and other biomaterials. Cryopreservation is also used for biobanking plant seeds and tissues.
The potential of cryopreservation is enormous. Today, research on cryopreservation focuses on preserving whole organs for long-term storage and viability, the success of which would be a major medical advancement. Currently, organs like the heart, liver, and pancreas can survive for a few hours outside the body. Cryopreservation has the potential to extend this time limit to several days or even longer. “Ideally, we would like to preserve the organs for an indefinite period of time so that we have an organ bank from which we can solve a number of problems, including having a constant availability of organs and tissues on demand for transplantation,” says Solanki, a postdoctoral researcher in mechanical engineering.
The field of cryopreservation moves at a glacial pace as the process comes with numerous challenges. When organic tissue freezes in uncontrolled conditions, it tends to die. The frozen region of the tissue stops receiving oxygen-rich blood, which quickly deteriorates tissue due to ischemic injury, an injury caused by lack of blood flow. At the microscopic level, ice crystal formation can burst apart cells, and at the macroscopic level, thermal stress can break blood vessels, both of which make the tissue unviable when thawed. To counter this, scientists replace the water in the tissues with a solution similar to antifreeze, known as “cryoprotective agents” (CPAs), allowing the tissue to be cooled below freezing without ice crystals forming. However, the ingredients in CPAs are often toxic to the organic tissue, requiring very particular concentrations of these ingredients in the solutions.
An alternative to the problem of ice formation is isochoric cryopreservation, freezing the tissue while maintaining a constant volume inside the chamber. “The idea of isochoric preservation is that you allow some ice formation such that the tissue specimen is preserved in the liquid part of the solution,” says Solanki. Ice formation occurs near the walls of the container, with the delicate tissue safely stored in the center. The solution expands in volume, but the rigid walls of the preservation chamber stop it. The only other option for the solution is to increase pressure, which reduces the temperature at which the solution freezes. This allows the tissue to be stored at below-freezing temperatures while still suspended in a liquid solution.
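The physics can be sketched with a simple approximation. For ordinary ice (ice Ih), the melting point falls from 0° C at atmospheric pressure to about -22° C near 210 MPa; the linear fit below is a simplification for illustration, not the Biothermal Technology Laboratory’s model.

```python
# Back-of-the-envelope sketch of why isochoric chambers permit liquid
# storage below 0 C. The linear fit to the ice-Ih melting curve is an
# illustrative simplification, not the lab's model.

def melting_point_c(pressure_mpa):
    """Approximate ice-Ih melting temperature at a given pressure,
    interpolated between 0 C (0.1 MPa) and -22 C (~210 MPa)."""
    if not 0.1 <= pressure_mpa <= 210:
        raise ValueError("linear fit only valid up to ~210 MPa")
    return -22.0 * (pressure_mpa / 210.0)

# As ice forms in the rigid chamber, pressure climbs, and the remaining
# liquid (holding the tissue) stays liquid at ever-lower temperatures:
for p in (0.1, 50, 100, 150, 210):
    print(f"{p:6.1f} MPa -> liquid stable down to {melting_point_c(p):6.1f} C")
```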
Isochoric preservation is what Solanki and Rabin, a professor of mechanical engineering, are studying. While isochoric preservation looks promising for the future of cryopreservation, it is currently extremely difficult to model mathematically. Ice behaves very differently from liquid water, and it is difficult to determine where ice will form in a system and how those “freezing fronts” interact with liquid water. The team’s latest research proposes a new method to model the isochoric cooling process that treats the frozen and non-frozen parts as one continuous pseudo-viscoelastic fluid.
This work focuses entirely on calculating these fluid behaviors in a system of pure water, which is very useful as a proof of concept. However, cryopreservation requires CPAs that behave radically differently from water, especially when cooled below freezing temperatures. Some of these CPAs are unique to the field of cryopreservation, so their material properties have yet to be experimentally measured for use in computer models.
“We are in the process of getting a lot of data from our collaborators,” explains Solanki. “We are one of the very few engineering labs working in this domain. So, one of the big tasks we have in our lab is finding the material properties of these chemicals.” The team has already measured some of these material properties and published a paper supplying a database for the temperature-dependent densities and thermal expansion coefficients of some of these CPAs.
For now, cryopreservation has only been successful on very small tissues, such as embryos and stem cells. “When it comes to bigger specimens, there are still some challenges to face,” says Solanki. The largest pieces of tissue successfully preserved so far have been human blood vessels, and research is underway on preserving entire human ovaries. Solanki notes that extensive research is still needed before entire limbs and organs can be preserved, but he is optimistic about the future of the field.
Density of water within the range of pressures and temperatures that the researchers investigated.
Source: Biothermal Technology Laboratory, Carnegie Mellon University
David Rounce has led an international effort to produce new projections of glacier mass loss through the century under different emissions scenarios. The projections were aggregated into global temperature change scenarios to support adaptation and mitigation discussions, such as those at the United Nations Conference of Parties (COP 27). His work showed that the world could lose as much as 41 percent of its total glacier mass this century—or as little as 26 percent—depending on today’s climate change mitigation efforts.
The most recent IPCC report for policymakers brought together thousands of internationally recognized climate experts in an urgent plea to citizens and their governments to fight for drastic and immediate reductions to greenhouse gas emissions. The report warned that policymakers have less than three years to act to avert catastrophic and irreversible changes to our climate. The shared socioeconomic pathways, or SSPs, they used to model future scenarios for climate change are based on factors like population, economic growth, education, urbanization, and innovation. These new pathways illustrate a more complete picture of socioeconomic trends that could impact future greenhouse gas emissions.
Only recently have researchers been able to produce global predictions for total glacial mass change using the new SSPs. Rounce’s work aggregates these future climate scenarios based on their increase in global mean temperature to evaluate the corresponding impacts associated with temperature change scenarios ranging from +1.5° C to +4° C. His model is also calibrated with an unprecedented amount of data, including individual mass change observations for every glacier, and uses state-of-the-art calibration methods that require the use of supercomputers.
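The aggregation idea itself can be sketched in a few lines: group model runs by their end-of-century warming, then summarize glacier mass loss within each temperature bin. The numbers below are invented placeholders for illustration; they are not Rounce’s results.

```python
# Sketch of aggregating climate-scenario runs by global mean temperature
# rise. The (warming, mass-loss) pairs are made-up illustrative data.
from collections import defaultdict

# (warming in C by 2100, projected % glacier mass loss) per model run
runs = [(1.4, 25), (1.6, 27), (2.1, 31), (2.4, 33),
        (2.8, 36), (3.1, 38), (3.9, 41), (4.2, 43)]

bins = defaultdict(list)
for warming, loss in runs:
    # round each run to the nearest 0.5 C scenario bin (+1.5, +2.0, ...)
    bins[round(warming * 2) / 2].append(loss)

for scenario in sorted(bins):
    losses = bins[scenario]
    print(f"+{scenario:.1f} C: mean mass loss {sum(losses)/len(losses):.1f}%")
```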
Rounce, an assistant professor of civil and environmental engineering, and his team found that in the SSP with continued investment in fossil fuels, more than 40 percent of the glacial mass will be gone within the century, and more than 80 percent of glaciers by number could well disappear. Even in a best-case, low-emissions scenario, where the increase in global mean temperature is limited to +1.5° C relative to pre-industrial levels, more than 25 percent of glacial mass will be gone and nearly 50 percent of glaciers by number are projected to disappear. A majority of these lost glaciers are small (less than one km2) by glacial standards, but their loss can negatively affect local hydrology, tourism, glacier hazards, and cultural values.
Many processes govern how glaciers lose mass, and Rounce is working to advance how models account for different types of glaciers, including tidewater and debris-covered glaciers. Tidewater glaciers terminate in the ocean, where they lose a great deal of mass at the ice-ocean interface. Debris-covered glaciers are blanketed by sand, rocks, and boulders. Prior work by Rounce has shown that the thickness and distribution of debris cover can have a positive or negative effect on glacial melt rates across an entire region, depending on the debris thickness. In this newest work, he found that accounting for these processes had relatively little impact on the global glacier projections, but substantial differences in mass loss were found when analyzing individual glaciers.
His work provides better context for regional glacier modeling, and he hopes it will spur climate policymakers to pursue more ambitious targets than the 2.7° C of warming that pledges from COP 26 are projected to produce. Smaller glacial regions like Central Europe, low-latitude regions like the Andes, and the upper areas of North America will be disproportionately affected by temperatures rising more than 2° C. At a 3° C rise, these glacial regions almost disappear completely.
Rounce noted that glaciers respond slowly to changes in climate; he describes them as extremely slow-moving rivers. Cutting emissions today will not remove previously emitted greenhouse gases, nor will it instantly halt their contribution to warming, meaning even a complete halt to emissions would take between 30 and 100 years to be reflected in glacier mass loss rates.
As devices get smaller and more powerful, the risk of them overheating and burning out increases substantially. Despite advancements in cooling solutions, the interface between an electronic chip and its cooling system has remained a barrier for thermal transport due to the materials’ intrinsic roughness.
Sheng Shen, a professor of mechanical engineering, has fabricated a flexible, powerful, and highly reliable material to efficiently fill the gap.
“At first glance, our solution looks like any ordinary copper film, but under a microscope the novelty of our material becomes clear,” explained Lin Jing, a Ph.D. student.
The material, composed of two thin copper films with a graphene-coated copper nanowire array sandwiched between them, is extremely user-friendly. “Other nanowires need to be grown in situ where the heat is designed to be dissipated, so their application threshold and cost are high,” said Rui Cheng, a postdoctoral researcher in Shen’s lab. “Our film isn’t dependent on any substrate; it is a free-standing film that can be cut to any size or shape to fill the gap between various electrical components.”
The “sandwich” builds on Shen’s “supersolder,” a thermal interface material (TIM) that can be used similarly to conventional solders but has twice the thermal conductance of current state-of-the-art TIMs.
By coating the “supersolder” in graphene, Shen’s team enhanced its thermal transport capabilities and prevented the risk of oxidation, ensuring a longer service life. Compared to the thermal pastes and adhesives currently on the market, the “sandwich” can reduce thermal resistance by more than 90% at the same thickness.
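The arithmetic behind that comparison is simple: for a uniform layer, areal thermal resistance is thickness divided by thermal conductivity, so at a fixed thickness a more conductive filler wins proportionally. The conductivity values below are assumed ballpark figures, not measurements from the paper.

```python
# Illustrative only: areal thermal resistance R'' = t / k for a uniform
# interface layer. Conductivities are assumed ballpark values, not data
# from Shen's ACS Nano paper.

def thermal_resistance(thickness_m, conductivity_w_mk):
    """Areal thermal resistance in m^2*K/W of a uniform layer."""
    return thickness_m / conductivity_w_mk

t = 100e-6  # a 100-micrometer interface layer for both materials

paste = thermal_resistance(t, 5.0)       # typical thermal paste, ~3-8 W/m-K
composite = thermal_resistance(t, 60.0)  # assumed value for the composite

print(f"paste:     {paste:.2e} m^2*K/W")
print(f"composite: {composite:.2e} m^2*K/W")
print(f"reduction: {(1 - composite / paste):.0%}")  # ~92% at these values
```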
Thanks to its ultra-high mechanical flexibility, the “sandwich” can enable a wide range of applications in flexible electronics and microelectronics, including flexible LEDs and lasers for lighting and display, wearable sensors for communication, implantable electronics for monitoring health and imaging, and soft robotics.
Moving forward, Shen’s team will explore ways to scale the material at an industrial level and lower its cost, while continuing to seek ways to improve it.
“We are very excited about this material’s potential,” Shen shared. “We believe that a wide variety of electronic systems can benefit from it by allowing them to operate at a lower temperature with higher performance.”
This research was first published in ACS Nano and is a collaborative effort with Tzahi Cohen-Karni, a professor of biomedical engineering and materials science and engineering, and Xu Zhang, a professor of electrical and computer engineering, both of Carnegie Mellon University.
From regulators to researchers and most industries in between, all eyes are on PFAS. PFAS, per- and polyfluoroalkyl substances, are a class of highly fluorinated human-made compounds that have been used for decades in everything from nonstick cookware and personal care products to fire-fighting foams and school uniforms. Their prevalence and extreme resistance to environmental degradation have made them ubiquitous in ground water, soil, and, worst of all, humans. Linked to a slew of health risks including liver toxicity, bladder cancer, and decreased immune response to vaccinations, exposure to PFAS is concerning. So, how can we eliminate these “forever chemicals”?
Historically, PFAS have only been characterized in water and soil, but these compounds are also released into the air during chemical manufacturing, use, and disposal. Ryan Sullivan, a professor of mechanical engineering and chemistry at Carnegie Mellon University, has been developing new methods to measure PFAS in both the atmosphere and in aerosol particles to answer outstanding questions about the atmospheric pathways that lead to human exposure. His group is also developing new approaches to destroy forever chemicals that are not removed by conventional water treatment plants.
“In remediation, our end goal is what we call full mineralization, where all fluorine is removed from the molecule. Historically, researchers have had some success with mineralization, but there has always been a percentage of the fluorine unaccounted for,” said Sullivan. “Sometimes researchers get stuck at partially fluorinated products that are still PFAS. Our work uses a non-targeted approach so that we can better quantify these missing molecules and figure out just how close we are to full PFAS remediation.”
One promising technology for PFAS remediation is reduction via hydrated electrons. The process involves shining UV light on a sulfite salt solution to break electrons free from the sulfite. PFAS molecules are highly electron-hungry, so when a hydrated electron attacks, the very stable carbon–fluorine bond can finally be broken, releasing a harmless fluoride ion.
By using a non-targeted analysis during this process, Sullivan’s group identified novel PFAS molecules that had not previously been found during UV/sulfite reduction treatment.
“By having a complete understanding of this complex chemistry, we can optimize the engineered treatment conditions, fill holes in the known chemical mechanisms, and push closer to achieving full mineralization.”
To better understand these contaminants, Sullivan has also developed a way to measure PFAS directly in the gas phase and in aerosol particles. This avoids the existing need to collect large samples of air and extract the PFAS from them before analysis.
“If the government were to issue regulations around PFAS emissions from manufacturing plants, it would be crucial to measure their concentration going into the atmosphere to determine if they are compliant with the allowed emissions,” he explained. Moving forward, Sullivan’s team will explore the chemistry and transport of PFAS in aerosols suspended in the atmosphere.
This research was published in two papers in Environmental Science: Processes & Impacts.
Many of the most important real-world impacts will come from figuring out how to integrate artificial intelligence (AI) algorithms into physical systems. This integration challenge is immense, and the innovative systems we design must prove to be resilient, trustworthy, and secure.
Carnegie Mellon University’s College of Engineering is equipping next-generation engineers to integrate AI within the constraints of an engineering problem and to view the challenge from a new perspective, by way of seven new master’s degrees in AI Engineering and one additional degree at CMU-Africa.
“CMU is well positioned to offer these programs because not only are we a world-class engineering college, but we have a vast wealth of expertise amongst faculty using AI and computing within their engineering research,” explained Shelley Anna, associate dean for faculty and graduate affairs and strategic initiatives. “This enables students to work with award-winning faculty who understand how to blend AI and engineering together. We encourage our students to take advantage of those collaborative approaches that are taking place across campus.”
Experts from our biomedical, chemical, civil and environmental, electrical and computer, materials science, and mechanical engineering departments have come together with CMU-Africa and the Information Networking Institute to create the new master’s degree programs. The inaugural class enrolled in fall 2022.
Minkwon Choi decided to pursue a Master of Science in Artificial Intelligence Engineering - Electrical and Computer Engineering degree because he recognized that AI is shaping tomorrow. “AI is all around us. I want to bring my engineering knowledge into AI so that I can do something amazing with it.”
Students in the Master of Science in Artificial Intelligence Engineering programs are immersed in courses that blend engineering domain knowledge and the fundamentals of AI and machine learning.
The programs share four core courses across disciplines: Systems & Tool Chains for AI Engineers, Introduction to Machine Learning for Engineers, Introduction to Deep Learning for Engineers, and Trustworthy AI Engineering. Together, these give students a foundational understanding of AI Engineering.
Each discipline also has its own integration approach, whether a required research project or a project-oriented course that ties the two fields together. The combination of domain knowledge and AI fundamentals helps students discover enhanced and breakthrough solutions across the entire engineering process.
“You can’t open the newspaper without seeing something about how AI is embedding itself within the design of all sorts of physical systems. Even the phone in your pocket is running on AI,” Anna shared.
“Our students will end up working for companies that want to use AI to move their business forward. Whether that be developing autonomous systems for self-driving cars or enabling a chemical plant to make decisions to optimize operations, our students will be prepared with a deep expertise in both AI and engineering. We are creating an opportunity for our students to define a unique career.”
Considering how quickly engineering design and manufacturing have advanced alongside computational developments, it may surprise you that very few engineers are trained in both engineering system design and artificial intelligence. There are countless opportunities for breakthrough improvements in how we develop new technology using AI in engineering design, but to succeed in these challenging areas, engineers must understand a new specialty: design for artificial intelligence.
Chris McComb and Glen Williams have developed a Design for Artificial Intelligence (DfAI) framework in collaboration with researchers at Penn State University to educate and encourage the academic and industrial engineering community to adopt AI engineering design.
“Most of the time, we view AI as a tool to add onto an existing system, but to develop better systems, we need to integrate AI into the engineering design process from the very beginning,” explained McComb, an associate professor of mechanical engineering.
A core challenge is motivating institutions to invest in the long-term potential of AI technologies. Since engineering is product-driven and the incentives in design and manufacturing prioritize short-term excellence, budgeting resources for long-term research and development is challenging but worthwhile.
Williams, previously McComb’s student and now principal scientist at Re:Build Manufacturing, illustrated the importance by describing two hypothetical companies mass-manufacturing electric aircraft. For up-front development, Company A chooses a manual manufacturing path to quickly hit the market and reach profitability. Company B, on the other hand, builds a data-rich process that captures intelligence throughout the lifecycle of the design. Within the next 10 years, Company B is able to drastically reduce its operating cost by utilizing data-driven design that can both optimize the production of its aircraft and create better products. Company A can no longer keep up.
Because design and manufacturing don’t happen in silos, DfAI applies to the broader aspects of the engineering design process. Williams suggests that foundationally, advancing DfAI can be addressed through 1) raising AI literacy in industry, 2) redesigning engineering systems to better integrate with AI, and 3) enhancing the engineering AI development process.
“Engineering data is complex and not always relatable to the wider community,” McComb explains as one reason other fields may be innovating with AI more quickly. “The number of experts able to interpret this data is small, so DfAI will require individuals to have specific expertise. Academia and industry need to work together to support long-term innovation in this area.”
The team outlines three personas as necessities for DfAI: engineering designers, design repository curators, and AI developers. An engineering designer may be a person or team responsible for developing the specifications of a new project; they are the problem solvers who understand both the engineering constraints and the AI algorithms. Design repository curators take the role of a database maintainer one step further, applying engineering design and manufacturing knowledge to give design engineers data management tools that meet current workflow demands and extend to future ones. Finally, AI developers must be able to ideate, develop, market, and continuously improve AI software products that help the design engineers.
“We can’t think of AI development as an afterthought in our core operations,” McComb summarizes. “Unless we augment design engineers with next-generation design and manufacturing software based on deeply-integrated AI, our ability to design novel and useful technology will fall short of the potential of these new manufacturing techniques.”
Per Williams, several industries may have an easier time adopting DfAI principles. Fields built on digitally driven techniques like additive manufacturing naturally have a complex cyber-physical pathway and employ personnel well suited to adopting and driving DfAI principles. Regulated industries, such as aerospace and medical devices, are accustomed to rigorous procedures and reliable data storage for very complex engineered systems, so they likely have the resources to begin the DfAI adoption process. We may also see internet of things (IoT) and smart device product designers adopting DfAI principles sooner rather than later. These designers stand to benefit not only from data during the design process, but also from the vast amounts of data gathered from their devices in testing and in practice. Harnessing this valuable product data from the field could yield tremendous benefits for AI tools that enhance the quality, performance, sustainability, and profitability of future products.
“Since there is so much variety between engineering applications, industries, technologies, and scales of operation, establishing general frameworks, common terminology, and written principles is vital to growing an interconnected community of AI engineers who can collaborate,” Williams elaborated. “Our DfAI framework provides the high-level starting point for these critical discussions.”
This research was published in the ASME Journal of Computing and Information Science in Engineering.
Personas in the DfAI process
Disk drives are the workhorses of today’s technology. Everything from facial recognition to the Alexa in your house relies on information stored in the cloud, which is supported by disk drives. To keep up with the exponential rate of data created every day, disk drives must evolve to read and recover data more efficiently.
Jimmy Zhu, ABB Professor of Electrical and Computer Engineering, and his students have developed the first machine learning-based data detection channel chip for hard disk drives. The team presented their work at the VLSI Symposium 2022 last June in Hawaii.
“We are integrating machine learning into hard disk drives to make them faster and more efficient,” says Zhu. “This is the first chip to use machine learning for data detection in disk drives. The performance of the machine learning-based chips surpasses the regular chips manufactured today.”
In present hard disk drives, detecting the data bits recorded in the magnetic layer on a disk surface relies on conventional signal processing technology. By using machine learning, the detection process can be optimized for significant performance improvements, which translate into capacity for storing more data.
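As a conceptual sketch of what machine learning-based detection means (and not a description of Zhu’s chip), the toy example below trains a tiny neural network to recover bits from a simulated readback waveform corrupted by inter-symbol interference and noise:

```python
# Conceptual toy example, not Zhu's chip design: a small neural network
# learns to detect recorded bits from a noisy readback signal in which
# each bit leaks into its neighbors (inter-symbol interference).
import numpy as np

rng = np.random.default_rng(0)

def readback(bits, isi=(0.3, 1.0, 0.3), noise=0.3):
    """Simulate a readback waveform: ISI plus additive noise."""
    sig = np.convolve(2 * bits - 1, isi, mode="same")
    return sig + noise * rng.standard_normal(len(bits))

# training data: 5-sample windows of the waveform -> the center bit
bits = rng.integers(0, 2, 20000)
sig = readback(bits)
X = np.lib.stride_tricks.sliding_window_view(sig, 5)
y = bits[2:-2]

# one hidden layer, trained by full-batch gradient descent on logistic loss
W1 = 0.1 * rng.standard_normal((5, 8)); b1 = np.zeros(8)
w2 = 0.1 * rng.standard_normal(8); b2 = 0.0
for _ in range(300):
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    p = 1 / (1 + np.exp(-(h @ w2 + b2)))      # P(bit == 1)
    g = (p - y) / len(y)                      # dLoss/dlogit, averaged
    w2 -= 0.5 * h.T @ g; b2 -= 0.5 * g.sum()
    gh = np.outer(g, w2) * (1 - h**2)         # backprop through tanh
    W1 -= 0.5 * X.T @ gh; b1 -= 0.5 * gh.sum(0)

print("training bit error rate:", np.mean((p > 0.5) != y))
```

A production detection channel is far more sophisticated, but the principle is the same: a learned model can exploit signal structure that a fixed, hand-designed detector leaves on the table.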
The Department of Electrical and Computer Engineering at Carnegie Mellon University is in a unique position to develop such an innovative chip. By offering courses like Advanced Digital Integrated Circuit Design, students can design computer chips and have them fabricated at the end of the semester.
“My students take this class to render the neural network chip,” says Zhu. “We design the chip throughout the year and can test a physical prototype at the end of the semester. This is a unique experience offered at Carnegie Mellon.”
Industry partners agree. Zhu states that disk drive companies are surprised that a university can handle this level of development and fabrication.
“We have taken the lead in machine learning technology,” says Zhu. “This research will impact any real time data detection which will change hard disk drives and the future of technology as we know it.”
This pioneering research work has been funded by the Data Storage Systems Center and its industrial sponsors. It is also part of a collaboration between Apple and the Department of Electrical and Computer Engineering at Carnegie Mellon University.
The U.S. Chamber of Commerce’s Commission on Artificial Intelligence Competitiveness, Inclusion, and Innovation released a new report aimed at providing insight for policymakers managing the impact of AI on our economy and society. Conrad Tucker, professor of mechanical engineering at Carnegie Mellon University, served as a commissioner.
“The report is timely, given the profound impact that AI is having across multiple facets of society,” explained Tucker. “The U.S. is uniquely positioned to lead in the AI space and serve as a guide to how AI can positively impact people and society.”
The report outlines four major findings:
The development of AI and introduction of AI-based systems are growing exponentially. Over the next 10-20 years, virtually every business and government agency will be using AI. This will have a profound impact upon society, the economy, and national security.
Future advances in customer services and productivity gains—as well as the emergence of new security threats—will be through AI, and, therefore, failure to smartly regulate AI will harm the economy, potentially diminish individual rights, and constrain the development and introduction of beneficial technologies.
The U.S., through its technological advantages, well-developed system of individual rights, advanced legal system, and interlocking alliances with democracies worldwide, is uniquely situated to lead this effort. The U.S. must act to ensure future economic growth, provide for a competitive workforce, maintain a competitive position in a global economy and address future national security needs.
Policies and initiatives to both enforce existing and craft new laws and rules for the development of responsible AI and its ethical deployment must be a top priority for this and future administrations and congresses.
“The AI report is a culmination of more than a year’s worth of hard work engaging with multiple stakeholders to ensure that diverse perspectives were taken into consideration. I am grateful to have served with such esteemed colleagues and thought leaders,” Tucker said.
Beginning in January 2022, the commission, co-chaired by former Reps. John Delaney (D-MD) and Mike Ferguson (R-NJ), focused on researching and recommending AI policies related to regulation, international research and development competitiveness, and future jobs. To reach these recommendations, the commission received input from more than 87 expert witnesses and stakeholders in both the United States and abroad.
TOMORROW HUMAN TOUCH WILL EXPAND BEYOND OUR OWN SKIN.
The sensitivity of human touch is remarkable and central to our experiences. In the future, our acute sense of touch can be augmented through finely tuned systems to permit tactile discrimination through pressure, vibration and temperature.
In Softbotics research, we are enhancing and expanding the body’s haptic sensing function. We do this with soft, ultrathin transducers, based on microsystems and novel materials, that sense, transmit signals, and induce sensations of touch and feel.
In the future, we will enhance interactive experiences in virtual environments, medical care, surgical operations, rehabilitation, long-distance communications and workplaces.
Learn more at engineering.cmu.edu/softbotics
The shape memory polymers known as liquid crystal elastomers (LCEs) are increasingly popular for uses in softbotics, haptics, and wearable computing. Functioning as actuators, they can allow materials to contract, expand, change shape, and perform like biological muscles do.
Because the action is controlled through heating and passive cooling, efforts to speed up these processes and increase energy efficiency are critical to advancing the work.
A multidisciplinary team of researchers from Carnegie Mellon University’s Department of Mechanical Engineering, Human-Computer Interaction Institute, and Robotics Institute sought to tackle this challenge by combining LCEs with a thermoelectric device (TED).
The collaborators developed a soft, flexible mechanism capable of electrically controlled actuation, active cooling, and thermal-to-electrical energy conversion. The findings were published in the journal Advanced Materials.
The LCE-TED mechanism functions as a transducer—an electrical device that converts one form of energy into another. A thin layer of TED contains semiconductors embedded within a 3D printed elastomer matrix. It is wired with eutectic gallium-indium (EGaIn) liquid metal connections.
TEDs operate with dual functionality: as heaters and coolers in one mode and as energy harvesters in the other. Because the layer is covered with LCEs on both sides, the TED can alternately heat and cool the LCE layers. In addition, it can harvest energy from the changes in temperature.
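The two modes follow textbook thermoelectric relations: driving a current pumps Peltier heat from one face to the other, while a sustained temperature difference generates a Seebeck voltage that can be harvested. The sketch below uses assumed device parameters for illustration; they are not values from the Advanced Materials paper.

```python
# Idealized sketch of a thermoelectric device's two modes, using
# standard textbook relations with assumed (not measured) parameters.

S = 0.05   # device-level Seebeck coefficient, V/K (assumed)
R = 2.0    # internal electrical resistance, ohms (assumed)
K = 0.02   # thermal conductance between faces, W/K (assumed)

def peltier_heat_pumped(current_a, cold_side_temp_k, dT_k):
    """Mode 1: drive a current to pump heat across the device (reversing
    the current direction swaps heating and cooling of the LCE layer)."""
    return (S * current_a * cold_side_temp_k
            - 0.5 * current_a**2 * R   # Joule heating penalty
            - K * dT_k)                # heat leaking back through device

def seebeck_power(dT_k, load_ohms=2.0):
    """Mode 2: harvest power from a temperature difference (maximum when
    the load resistance matches the internal resistance)."""
    v = S * dT_k
    i = v / (R + load_ohms)
    return i**2 * load_ohms

print(f"heat pumped at 0.5 A, dT=10 K: {peltier_heat_pumped(0.5, 300, 10):.2f} W")
print(f"harvested at dT=10 K:          {seebeck_power(10) * 1000:.2f} mW")
```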
“The ability to recover energy from residual heat and thermal gradients could contribute to improved energy efficiency and longevity of the host electronic device or robotic system,” said Carmel Majidi, a professor of mechanical engineering who directs the Soft Machines Lab.
Softbotic demonstrations revealed advantages of the LCE-TED transducers for use in practical applications: fast and accurate tracking due to active cooling; autonomous, “smart” maneuvering of a two-limbed walker toward a heat source; and regenerative energy harvesting.
“This demonstrates the potential for creating softbotic systems that can harvest some of the electrical power they need from energy in the environment around them,” said Mason Zadan, a Ph.D. student and the lead author of the study.
Additional research will seek to integrate the transducers into the limbs of soft robots to more fully realize LCE-TED’s potential. Another aspect of the work will aim to further develop the energy harvesting and controls capabilities using untethered soft robotic platforms.
Jon Cagan, the George Tallman and Florence Barrett Ladd Professor in Engineering, was appointed the head of the Department of Mechanical Engineering (MechE) at Carnegie Mellon University, effective November 1, 2022. After a rigorous and extensive international search, Cagan succeeds Allen Robinson, director of CMU-Africa and the Raymond J. Lane Distinguished University Professor.
Since joining the faculty in 1990, Cagan has made a tremendous impact in research and education with expertise in engineering design automation and methods. This includes merging AI, machine learning, and optimization methods with cognitive science problem solving. He has also successfully transferred his research to industrial practice. Cagan embodies advanced collaboration, having worked with engineers, psychologists, neuroscientists, marketers, designers, and architects in his work.
In the classroom, Cagan invites his students to think outside the box and push traditional boundaries through new courses like Integrated Product Development and product innovation courses that tackle grand challenges and emerging Fourth Industrial Revolution technologies. He aims to prepare the next generations of mechanical engineers to solve problems through innovative teamwork and multidisciplinary approaches. The university has recognized his contributions by honoring him with the Robert E. Doherty Award for Sustained Contributions to Excellence in Education.
Cagan has demonstrated strong leadership skills by serving the College of Engineering in many roles: as its interim dean, the associate dean for graduate and faculty affairs, and the associate dean for strategic initiatives where he focused in part on enhancing Carnegie Mellon’s Silicon Valley campus. In 2013, he co-led the College’s strategic planning process to set the stage for new initiatives. He also co-founded the Integrated Innovation Institute.
Among other efforts, Cagan co-led the first strategic plan for Equity, Diversity, and Inclusion for the college and its implementation, including the formation of the new dual Ph.D. partnership with Howard University. As a mentor and advocate, he has played an integral role in the revision of core faculty and student policies to better serve and support our community. He also aided the university in academic planning through the COVID pandemic.
A Fellow of the American Society of Mechanical Engineers, Cagan has earned the organization’s Design Theory and Methodology, Design Automation, and Ruth and Joel Spira Outstanding Design Educator awards. He has authored several books and more than 300 publications and is an inventor on multiple patents.
He earned his bachelor’s and master’s degrees from the University of Rochester and received his Ph.D. from the University of California, Berkeley.
“This is an exciting time to guide and support the next generation of innovative education and research in Mechanical Engineering,” Cagan said. “I’m honored to accept this role and work with an incredible community of faculty, staff, and students.”
College of Engineering’s Eni Halilaj and Xi Ren have received National Science Foundation (NSF) CAREER Awards for their research with Carnegie Mellon Engineering’s mechanical engineering and biomedical engineering programs. The NSF Faculty Early Career Development Program awards grants to “early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization.”
Eni Halilaj, an assistant professor of mechanical engineering and director of the Musculoskeletal Biomechanics Lab, has been awarded a five-year grant to study musculoskeletal modeling with wearable sensors and smartphone cameras. Halilaj’s research with the NSF grant will seek to make gait analysis more accessible for rehabilitation research and therapy monitoring. Her work will merge the complementary strengths of artificial intelligence and physics-based modeling into new motion-tracking systems that are low-cost, dynamically robust, and equitably accurate across human demographics and abilities. The project also plans on employing youth outreach programs to educate and inspire the next generation of biomechanical researchers and engineers.
Charlie (Xi) Ren, an assistant professor of biomedical engineering, has been awarded a five-year grant for research into the role of the extracellular environment in lung tissue maturation during embryonic development and lung tissue regeneration after surgical procedures. The goal of this research is to aid in the engineering of functional lung cells from human induced pluripotent stem cells (hiPSCs), lab-made stem cells used in tissue regeneration and repair and in combating diseases and injuries. Ren seeks to employ a new method of tracking the production of proteins in the extracellular matrix that are key to the regeneration of lung tissue. This new method will provide crucial data in the field of lung regenerative medicine and will address a current bottleneck in the progress of this research.
The Carnegie Mellon Portugal Program (CMU Portugal) was launched in 2006 as a platform for uniting Carnegie Mellon researchers and students with counterparts from Portuguese universities and industry for the purpose of steering Portugal to the forefront of information and communication technologies (ICT) development.
The tight coupling of cutting-edge research, world-class graduate education, and highly innovative companies has proven successful. Some 42 Portuguese institutions and 132 industry partners are associated with the program. Five of Portugal’s seven unicorn companies (startup companies valued over $1 billion) partner with CMU Portugal, and 12 startups have been created. These startups, which secured venture capital from the United States, Portugal, and other financial markets, have created over 1,000 highly skilled jobs.
Some companies associated with the program are longstanding partners because of the program’s perceived value: Partners have access to a global network of industrial and academic contacts that they can collaborate with to advance innovation.
CMU Portugal industry partners help support education and research programs, and this pays off in various ways. Companies can increase their competitiveness by investing in R&D and by supporting advanced training. This supplies companies with the skilled workers that they need.
In 2020, the program announced an ambitious call for research projects, and 12 Large Scale Collaborative Research Projects were selected. Portuguese ICT companies are leading the projects, collaborating with other industry partners, Portuguese universities, and research centers, along with eight CMU departments.
The awarded projects cover the application of data science and engineering, artificial intelligence and machine learning, and design and engineering to address social problems in the health sector, forest fire prevention, mobility, energy, and language technologies. Projects were selected for a variety of reasons, including their capacity to drive innovation and international competitiveness while providing research opportunities for Ph.D. students.
Through these projects, CMU Portugal was able to strengthen research connections and commitments with companies within its innovation ecosystem and establish new partnerships with companies that have a strong presence in new lines of business, including online privacy and the health, energy, and environment sectors.
An example is the iFetch project, led by FarFetch, one of Portugal’s unicorns. Multimodal chatbots are a growing area of research, and the iFetch project proposes a new generation of conversational agents that interact with users through verbal and visual information. This technology requires very sophisticated AI, machine learning, and computer vision and image processing techniques.
The Large Scale Collaborative Research Projects represent an investment in talent. More than 443 researchers are involved in this work. To date, the projects have delivered at least 191 peer-reviewed publications and five patents, and 100 master’s and 54 Ph.D. theses are ongoing or concluded.
To expand student research opportunities, CMU Portugal launched the Affiliated Ph.D. Programs Initiative in 2021. In this program, Portuguese universities host students who conduct research at CMU for up to one year as visiting scholars. Upon completion of their Ph.D., candidates are awarded a degree by the Portuguese host university.
Many doctoral graduates now populate high tech companies in Portugal, which is a noticeable change from 10 years ago. Foreign and multinational companies are establishing research and development centers in Portugal because now they can find the talent they need.
In the early 1970s, Alvy Ray Smith was working at the Xerox Palo Alto Research Center. His coworker, Carnegie Mellon alumnus Richard Shoup, knew his friend Alvy was a painter, so he invited him to try out the new project he was working on for Xerox called SuperPaint.
Smith was so captivated by the early graphics program and framebuffer computer system that 16 hours would go by before he could tear himself away from it.
“Here it was! The two things I loved— computers and art in one place together,” exclaimed Smith.
In November, the author, artist, and computer graphics pioneer was at Carnegie Mellon Silicon Valley to talk about his new book, A Biography of the Pixel. But the hundreds of students who attended in person and online were just as eager to hear a biography of the author who famously co-founded Pixar Animation Studios, the producer of Toy Story, the first full-length computer animated film.
Sheryl Root brought Smith to the Silicon Valley campus. Root is an Associate Professor of the Practice at the Integrated Innovation Institute, which offers master’s degrees that combine studies in design, business, and engineering disciplines. The former head of strategy for Hewlett Packard, who began teaching at Carnegie Mellon in Silicon Valley in 2006, says that she, Smith, and her students all share a passion for innovation and technology.
“This environment just makes you want to do something,” she said.
SuperPaint sparked Smith’s dream to make the first computer generated movie. What he learned in the two decades it took for computing power to accommodate that ambition holds valuable lessons for ambitious young innovators at Carnegie Mellon who are striving to create new products, services, or companies.
AUTHOR, ARTIST, AND PIXAR COFOUNDER ALVY RAY SMITH INSPIRES CARNEGIE MELLON STUDENTS AND ASPIRING ENTREPRENEURS.
Smith subscribes to a generalized Moore’s Law, which he states as “Anything good about computers gets better by an order of magnitude every five years.” His advice: “If you don’t have the computer power you need today, keep at it because in five years, you’ll have 10 times more.”
In 1975, Xerox opted to forgo color in its graphics, prompting Smith to find another computer powerful enough to handle the new technology. He landed at the New York Institute of Technology, where a wealthy Long Island businessman, Alexander Schure, had a team of 100 cell animators and a belief that computer animation could save time and money.
Schure also had one of the few 8-bit framebuffers Smith wanted. And while it still was not as much power as Smith ultimately needed, he and the team that would eventually form Pixar wisely used the time to not only improve their graphics skills, but to also learn how the old-style animators worked.
“At Pixar we did not allow that,” declared Smith, who explained that they always valued both artistic and technical creativity.
In 1979, Smith and the computer graphics team left New York to work for Lucasfilm, where they assumed they would work on the follow-up to George Lucas’ wildly successful Star Wars movie. When they didn’t get to create graphics for The Empire Strikes Back, Smith grumbled, “George thinks we’re a bunch of technoids who don’t understand what artists do.”
But it wasn’t long before Paramount Pictures hired Industrial Light and Magic, a division of Lucasfilm, to create computer graphics for Star Trek II: The Wrath of Khan, giving Smith and the Graphics Group the opportunity to create the movie’s one-minute genesis sequence.
Smith was determined to make it in a way that not only made perfect narrative sense but would, more importantly, show George Lucas what they could do artistically. It worked, and the team proceeded to create computer graphics for Return of the Jedi and many other movies.
“I like being around that entrepreneurial energy. It’s scary, but it pulls out the best in you,” said Smith who says he had to learn the job, how to deal with people, handle finances, and more in order to eventually succeed.
Smith and the Graphics Group left to form a hardware company with the prototype they had built at Lucasfilm, but they needed to sustain their 40-member team until there was enough computing power to make the full-length film they always envisioned. After 45 unsuccessful pitches, they decided to ask Steve Jobs to fund the hardware venture.
In 1986, Jobs, who had recently parted ways with Apple, funded Pixar. Jobs had to refinance the operation so many times that he eventually acquired full equity in the company. But Smith admits he never really cared about the money. He just wanted to “make that movie.”
And then in 1991, Disney, who knew the Pixar team because they had helped digitize their cell animation process, called to say, “Let’s make that movie you always wanted to make—we’ll pay for it.”
Toy Story, Pixar’s first full-length feature, was released in 1995, at a time when many students in the recent Silicon Valley audience were Disney’s target audience for animated movies.
Sarvesh Karkhanis, a Silicon Valley student who attended Smith’s presentation, gushed at how inspiring Smith’s story was and how exciting it was to meet a pioneer of the Silicon Valley ecosystem.
“It was amazing—because he is the founder of Pixar, and he knew from the start that he had something that was the future,” said Karkhanis, who is working to establish a medical device company.
Aisya Aziz, another Silicon Valley student, was equally impressed. She read Smith’s book and was especially interested in his surprising explanation of a pixel, which he adamantly claims is not a little square.
“It’s actually something different than what I thought it was, and his detailed description of a pixel really spoke to my nerdy side,” said Aziz.
Both Aziz and Karkhanis were recently named James R. Swartz Entrepreneurial Fellows. The highly selective program develops entrepreneurial potential and leadership skills through hands-on experiences, networking, mentoring, and courses in entrepreneurship. It can fast-track the careers of CMU first-year graduate students who are passionate about entrepreneurship in the technology arena.
Root is proud, but not surprised, that two of the 12 recent Swartz fellows are her students because she knows they excel at applying technology to solving real world problems. For example, they are finding new ways to help premature babies, improve K-12 education, and reduce waste in the environment.
“Like Alvy, their creativity, passion, and determination are a thrill to be around,” said Root, who added, “Alvy’s open and honest talks excite and inspire these students who also have the true entrepreneurial spirit.”
A long-awaited ceremony to formally install William H. Sanders as the 15th dean of the College of Engineering, and to celebrate the extraordinary gift establishing the Dr. William D. and Nancy W. Strecker Dean’s Chair that Sanders now holds, was held on November 3, 2022. Sanders joined Carnegie Mellon in January 2020, only months after the dean’s chair was created and just weeks before the campus transitioned to remote instruction in response to the COVID-19 outbreak, causing the university to postpone the ceremony. At the event, Nancy Strecker, a Harvard graduate who along with her husband Bill made the extraordinary $15 million gift to endow the dean’s chair, professed herself to have the zeal of the converted.
“This is a truly great engineering school where you practice what you preach. Collaboration is the heart and soul of Carnegie Mellon, and no one is more collaborative than Bill Sanders,” said Strecker.
Bill Strecker is a triple alumnus of Carnegie Mellon’s electrical engineering program, earning bachelor’s, master’s, and doctoral degrees in 1966, 1967, and 1971. For nearly three decades, he worked at Digital Equipment Corporation, where he served as consulting engineer, senior vice president of engineering, chief technology officer, and senior vice president of corporate strategy. He earned 16 patents for his breakthrough engineering designs and led the development of the highly successful VAX computer architecture.
Nancy Strecker also had an extensive career with Digital Equipment Corporation, where she and Bill met. She built and led an award-winning worldwide corporate sales team and DEC’s global pharmaceutical industry business unit before retiring as vice president of global customer programs.
Sanders, who thanked the Streckers, emphasized that his accomplishments were only possible because of the many talented people who contributed to his successes. “We do work that matters to shape the world for real and enduring good,” said Sanders.
Sanders previously served as a tenured professor and held the Herman M. Dieckamp Endowed Chair in Engineering at the University of Illinois at Urbana-Champaign. His research interests included secure and dependable computing and security, as well as resiliency metrics and evaluation, with a focus on critical infrastructures. He has published more than 300 papers and has also directed work at the forefront of national efforts to make the United States’ power grid smart and resilient.
He was the founding director of the University of Illinois’ Information Trust Institute, served as director of the Coordinated Science Laboratory, and was head of the university’s Department of Electrical and Computer Engineering from 2014 to 2018. He co-founded the Advanced Digital Sciences Center in Singapore in 2009, which was Illinois’ first international research facility.
As the inaugural recipient of the Strecker Dean’s Chair, Sanders will have critical funding that will empower faculty and students and enable the College of Engineering to focus on transformative results that drive the intellectual and economic vitality of our community, nation, and world.
In his opening remarks, Provost James Garrett said, “Under Dean Sanders’ leadership, the College of Engineering is a place where excellence and innovation thrive. It is a place that molds creative and technically strong engineers who pioneer solutions to global challenges. And this goes beyond the programs here in Pittsburgh. Dean Sanders has been instrumental in growing engineering programming at CMU-Africa that is flourishing.”
• 72% of engineering students claim to be introverts, but 85% are active with a student organization.
• 88% use social media every day, but only 27% say they’re on TikTok daily.
• 40% make their bed every day.
• 254 belong to the Society of Women Engineers.
• 60 joined Carnegie Mellon Racing/Society of Automotive Engineers.
• 60 are members of the National Society of Black Engineers.
• 12% are vegan or vegetarian.
• 13% have a tattoo.
• 73% exercise every week.
• 66% speak more than one language.
• 45% say they’re sports fans.
• 62% play a musical instrument.
Thanks to the 579 of our 1,823 engineering undergrads who took our one-minute online survey!
Lauren Janicke’s work studying and analyzing inefficiencies in electrical transmission and distribution networks paved the way for her success.
Not long after Lauren Janicke attended one of Destenie Nock’s guest lectures, she reached out to ask if she could work on the civil and environmental engineering and engineering and public policy professor’s research team.
Luckily, the answer was yes. Nock’s broad research agenda, which uses mathematical modeling tools to address societal problems related to sustainability and energy, was a good match for Janicke’s interests and inspired her decisions to major in civil and environmental engineering and to minor in environmental and sustainability studies. It also contributed to her decision to pursue a second major in statistics and data science.
The pairing has led to some remarkable opportunities for Janicke, including leading a study that was recently published in Energy, an international, multidisciplinary journal of energy engineering and research. Janicke and the research team estimated the air pollution generated through inefficiencies in transmission and distribution (T&D) networks, examined opportunities for reducing emissions through regulation at the multinational and sub-national scales, and compared the cost of these potential emissions reductions to the cost of investment in renewable energy. The study found that such investment could significantly reduce air pollution.
Janicke says that it was not a problem to make time for the research project, which ultimately identified investment in electricity T&D systems as a significant opportunity for reducing air pollution.
“It’s been easy because I like the research work even more than the class work,” admits Janicke.
Her initial assignment, to create a scenario-based model quantifying the carbon dioxide, atmospheric pollutant, and particulate matter emissions associated with the T&D of electricity in the continental United States, was later expanded to a global analysis covering 146 countries. The work gave her the opportunity to learn ArcGIS, software that can analyze and transform data into interactive web maps.
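For readers curious about the mechanics, the core accounting idea behind such a model can be sketched in a few lines of Python. This is purely illustrative: the emission factors and scenario figures below are made-up placeholders, not values from the published study.

```python
# A toy sketch of the accounting idea behind a scenario-based T&D-loss
# emissions model. Illustrative only: these emission factors and scenario
# figures are placeholders, not values from the published study.

# Assumed fleet-average emission factors, in kg of pollutant per kWh generated.
EMISSION_FACTORS = {"CO2": 0.45, "SO2": 4e-4, "NOx": 3e-4}

def td_loss_emissions(generation_kwh, loss_fraction):
    """Emissions attributable to electricity lost in transmission and
    distribution, i.e., the extra generation needed to cover the losses."""
    lost_kwh = generation_kwh * loss_fraction
    return {p: f * lost_kwh for p, f in EMISSION_FACTORS.items()}

# Example scenario: 4 trillion kWh generated with 5% T&D losses.
print(td_loss_emissions(4e12, 0.05))
```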
She later worked on a project where she applied her budding statistical skills to create visualizations in an analysis of how snowfall affects Uber and Lyft usage in Chicago. She spent the summer after her sophomore year building her data science skills working on research at the National Renewable Energy Laboratory (NREL) in Colorado. And during the summer of 2022, she had a 10-week, full-time paid internship with the National Oceanic and Atmospheric Administration through her Ernest F. Hollings scholarship, which provides up to $9,500 for each of two years of study.
Lauren Janicke’s undergraduate research stint led to a major scientific journal publication, an internship with a federal agency, and a prestigious scholarship. In April 2023, she received the Judith Resnik Award, which recognizes an exceptional senior woman who plans to pursue graduate or professional training in a technical field.
Janicke says that Nock was a great mentor and the opportunities she had to conduct research helped her determine the type of work she wants to pursue. After she graduates in the spring, she plans to pursue master’s and Ph.D. degrees.
Kelby Kramer said that interacting with members of Gerald Wang’s research team was one of the best aspects of his undergraduate research experience.
Kelby Kramer, who graduated in December, has always known that his ultimate career ambition is to work as a coastal engineer. Kramer, who has worked as a lifeguard and is an avid surfer, wants to work on policies related to sea level rise and help ensure there is a protective infrastructure for coastlines.
As a student he was involved in three research projects. He constructed
a wave tank to study coastal erosion protection methods. He worked on a project to validate different satellite remote sensing techniques for glacier mapping. And he spent two years working with Gerald Wang on a pandemic-related project.
Wang, an assistant professor in civil and environmental engineering, applied his background in small-scale particle interactions to a study of pedestrian social distancing. Using a program developed to study particle interactions, such as how fluids move through nanotubes in a desalination plant, Kramer ran simulations of how the desire for social distancing influences pedestrian movement.
“It was a cool idea to use molecular modeling software for pedestrian dynamics,” said Kramer.
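The article doesn’t identify the specific simulation package the team used, but the underlying idea of treating pedestrians as interacting particles can be sketched briefly. In this toy version, the desire for social distancing acts as a short-range repulsive force; every parameter value is an illustrative assumption.

```python
# A minimal toy sketch (not the actual research code) of pedestrians modeled
# as interacting particles: each walker is attracted toward a goal and
# repelled from neighbors closer than a preferred social distance.
import numpy as np

N = 50                          # number of pedestrians (assumed)
GOAL = np.array([50.0, 0.0])    # everyone walks toward the same exit
DT = 0.05                       # time step, seconds (assumed)
R_SOCIAL = 2.0                  # preferred social distance, meters (assumed)
K_REPEL = 1.5                   # strength of the distancing "force" (assumed)

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, size=(N, 2))
vel = np.zeros((N, 2))

def step(pos, vel):
    # Unit-vector drive toward the goal.
    to_goal = GOAL - pos
    drive = to_goal / np.linalg.norm(to_goal, axis=1, keepdims=True)
    # Pairwise repulsion whenever two walkers are inside R_SOCIAL.
    diff = pos[:, None, :] - pos[None, :, :]           # (N, N, 2)
    dist = np.linalg.norm(diff, axis=2) + np.eye(N)    # avoid self div-by-zero
    push = K_REPEL * np.maximum(R_SOCIAL - dist, 0.0) / dist
    repel = (diff * push[:, :, None]).sum(axis=1)
    vel = 0.9 * vel + DT * (drive + repel)             # damped velocity update
    return pos + DT * vel, vel

for _ in range(1000):
    pos, vel = step(pos, vel)

# Distancing emerges: report the average nearest-neighbor spacing.
d = np.linalg.norm(pos[:, None] - pos[None, :], axis=2)
np.fill_diagonal(d, np.inf)
print("mean nearest-neighbor spacing:", round(d.min(axis=1).mean(), 2))
```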
Although it was less related to his career goals, Kramer says he gained a lot from being involved in the project. He was introduced to computing that he wouldn’t have been exposed to otherwise. He says the parallel computing methods he used are helping him in the work he is doing now with the Army Corps of Engineers. He says he is also far more aware of the power of computers to solve problems. In coastal engineering, for example, satellites are used for observation, but it’s computers that must be used to analyze all of that data.
Kramer, who intends to pursue a master’s degree and possibly a doctorate, says the experience gave him the opportunity to be part of a project through the entire research process, from idea to final published paper.
But he says what he liked the best was being a part of a team.
“I really enjoyed the community aspect,” said Kramer, who stays in touch with some of the research team members and follows the work they’re doing now.
Malaika Alphons always believed that studying chemical and biomedical engineering would lead to a job as an engineer in industry, but her experience last summer working in Robert Tilton’s lab has added a research and development job to her list of options.
She reached out to Tilton, a chemical engineering professor, after having been a student in his thermodynamics class. She was brought onto the team and received tuition-free elective credit for the work through the Summer Undergraduate Research Apprenticeship (SURA) program.
"It was a really good experience. It was different from working in classroom labs and there was a lot of critical thinking and problem solving required,” said Alphons.
Her work in the lab was to experimentally determine the kinetics of particle adhesion to a surface at varied concentrations of polymers and surfactants, ingredients commonly used in consumer goods like shaving cream and styling mousse.
She would run several mixtures through microfluidic channels, examine them under the microscope, and take several pictures of each experiment over a timescale of a few hours. She also analyzed the images and developed methods to measure adhesion in particles per unit area. Her findings showed how much polymer must be added to a surfactant solution to inhibit particle adhesion, and that polymer alone, without surfactant, did not cause adhesion.
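The kind of analysis Alphons describes, converting micrographs into particle counts per unit area, can be illustrated with a short sketch. This is an assumption-laden stand-in for the lab’s actual pipeline: the thresholding method, the calibration constant, and the file names are all placeholders.

```python
# A minimal sketch (not the lab's actual pipeline) of counting adhered
# particles in a micrograph and converting counts to particles per unit
# area over time. The calibration value below is an assumed placeholder.
import numpy as np
from skimage import io, filters, measure

UM_PER_PIXEL = 0.5  # assumed microscope calibration, micrometers per pixel

def particles_per_area(image_path):
    img = io.imread(image_path, as_gray=True)
    # Otsu thresholding separates dark particles from the bright background.
    binary = img < filters.threshold_otsu(img)
    labels = measure.label(binary)          # connected components = particles
    count = labels.max()
    area_um2 = img.shape[0] * img.shape[1] * UM_PER_PIXEL**2
    return count / area_um2

# Adhesion kinetics: particle density at each time point (hypothetical files).
# densities = [particles_per_area(f"frame_{t}min.tif") for t in range(0, 180, 15)]
```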
She said that both Tilton and her mentor, doctoral student Angela Yang, encouraged her to take ownership of which strategies to use in the analysis and documentation.
“They provided guidance but also let me try to figure it out on my own, which helped me realize what I was capable of,” said Alphons.
She plans to secure an internship this coming summer working in industry to help further determine what route to take when she graduates in 2024.
Liz Barre, a senior in mechanical engineering, won the pentathlon at the 2023 NCAA Division III Indoor Track and Field Championships on March 10, 2023. That win made her Carnegie Mellon's first NCAA national indoor track and field champion and set a school record of 3,873 points. The next day, Barre placed fourth in the high jump and earned her second All-American honor. She is the first Tartan to earn two All-American honors at the indoor championships in the same year.
Each year, thanks to a generous gift from alumnus Philip Dowd (B.S. MSE ’63) and his wife Marsha, the Dowd Engineering Seed Fund for Graduate Student Fellowships supports a year of doctoral expenses for multiple Ph.D. students.
The fellowship was created in 2001 to fund cutting-edge doctoral research in engineering at Carnegie Mellon. The projects it supports are so new that insufficient intellectual property exists for them to be funded by government agencies and foundations. Philip and Marsha Dowd established this fellowship to incubate “high-risk, high-reward” projects.
(Left to right) Spencer Matonis, Weitao Wang, Sandhya Ramachandran, Abhishek Anand
Development and application of low-cost techniques to determine speciated PM2.5 concentrations in Sub-Saharan Africa
Exposure to air pollution poses a severe risk to global public health. Among air pollutants, PM2.5 (particulate matter with an aerodynamic diameter of 2.5 μm or smaller) alone is responsible for nearly 3.3 million premature deaths annually. While air quality is regulated in the United States by the Environmental Protection Agency, there is a gap in the availability of air quality information across the globe, particularly in Sub-Saharan Africa (SSA) and Latin America.
Abhishek Anand, a Ph.D. student in mechanical engineering, is working with Associate Research Professor Albert Presto to develop low-cost methods to improve air quality monitoring. Using image processing, Anand can quantify atmospheric black carbon concentrations from cell phone images of the filter tapes in existing PM2.5 monitors established by U.S. embassies around the world. Black carbon is a component of PM2.5 associated with greater health impacts than PM2.5 as a whole. Anand has successfully tested the method by measuring black carbon from monitors operating in Pittsburgh, PA, and he plans to apply the method to black carbon monitoring in SSA.
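Anand’s published method isn’t detailed here, but the core idea, inferring black carbon loading from how dark the sampled filter-tape spot appears in a photo, can be sketched as follows. The attenuation cross-section and region inputs are placeholder assumptions; a real method would be calibrated against reference monitors.

```python
# An illustrative sketch (not Anand's published method) of estimating black
# carbon from the darkness of a filter-tape spot in a phone photo. The
# calibration constant below is an assumed placeholder.
import numpy as np
from skimage import io

SIGMA_ATN = 12.5  # assumed attenuation cross-section, m^2 per gram of BC

def black_carbon_ug_m3(photo_path, spot, blank, air_volume_m3, spot_area_m2):
    img = io.imread(photo_path, as_gray=True).astype(float)
    # Mean brightness of the sampled spot vs. an unexposed (blank) region.
    i_spot = img[spot].mean()
    i_blank = img[blank].mean()
    atn = -np.log(i_spot / i_blank)             # optical attenuation
    bc_mass_g = atn * spot_area_m2 / SIGMA_ATN  # BC mass loaded on the spot
    return bc_mass_g * 1e6 / air_volume_m3      # micrograms per cubic meter

# Hypothetical usage: regions given as numpy slices into the photo.
# bc = black_carbon_ug_m3("tape.jpg", (slice(100, 200), slice(100, 200)),
#                         (slice(0, 50), slice(0, 50)), 14.4, 7.9e-5)
```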
Investigation of neuronal correlation in the somatosensory cortex induced by transcranial focused ultrasound neuromodulation
Brain disorders and diseases cost the U.S. more than $1.5 trillion annually. Existing pharmacological treatments can be ineffective and carry side effects such as drug dependence and risk of overdose. Noninvasive neuromodulation has been identified as an alternative with limited side effects, but it is only effective if precise stimulation can be accomplished.
Sandhya Ramachandran, a biomedical engineering Ph.D. student in Bin He’s research group, is working to understand how one neuromodulation technique, low-intensity transcranial focused ultrasound (tFUS), can modulate how neurons interact with each other. tFUS uses a low-intensity ultrasound wave to stimulate the brain with high precision. It has high spatial resolution and the ability to noninvasively target deep brain regions.
The researchers aim to demonstrate that tFUS can be used noninvasively to modulate neural connectivity in a long-lasting and parameter-dependent manner. Long term, this research may lead to therapies that ease the burden of neurological disorders.
Bioresorbable drug delivery patch for localized treatment of Crohn's disease strictures
Crohn's disease, an incurable autoimmune disease, is characterized by blockages in the gut made up of inflamed and fibrotic tissue, known as strictures. These strictures can lead to a multitude of problems, including intestinal blockage, inability to digest food, and heightened risk of bowel perforation. Current treatment includes injections, infusions, and oral drugs; however, these approaches are expensive, often ineffective, and can lead to unwanted side effects.
Spencer Matonis, a materials science and engineering Ph.D. student in Chris Bettinger’s research group, is working to improve treatment for this issue through a tube-shaped structure that is embedded at the site of the narrowing, where it eventually degrades. The structure allows for the controlled release of anti-inflammatory drugs directly to the symptomatic tissue. The project is advancing to proof-of-concept testing. Matonis and his collaborators hope to eventually perform testing in human subjects through clinical trials under the supervision of the FDA.
A DNA nanoshell for encapsulation and protection of endothelial cells
According to the Health Resources & Services Administration, every day 17 people in the U.S. die waiting for an organ transplant. With biomedical advancements, 3D-printed organs are becoming a reality, but keeping the organs’ cells safe during the engineering process poses challenges.
Weitao Wang, a Ph.D. student in mechanical engineering, has developed a modular synthetic cell armor to protect cells during therapeutic and bioengineering applications. Working with Rebecca Taylor, associate professor of mechanical engineering, and Charlie Ren, assistant professor of biomedical engineering, Wang has encapsulated cells with a DNA origami-based nanoshell that excels in its programmability, stability, biocompatibility, and biodegradability. His findings open new opportunities to fine-tune the biophysical properties of the cell membrane by exploring changes in cell behavior and function.
Carnegie Mellon chemical engineering students took home first place in the 2022 RAPID Manufacturing Institute’s ChemE Cube competition for their invention, the “Water-Mellon.”
The competition was held during the 2022 American Institute of Chemical Engineers (AIChE) annual meeting. Teams were tasked with creating a “modular on-demand water purification” system in a one cubic-foot plant.
Designed to affordably and sustainably remove contaminants from water, the “Water-Mellon” uses an activated carbon filter sourced from coconuts to attract organic and other non-polar compounds; the filter’s large surface area provides a high adsorption capacity. The filter can be reactivated, allowing it to be recycled and ultimately reducing waste.
Once the water has made its way through the filter, it’s treated with sodium hypochlorite, which destroys bacteria and viruses by oxidatively denaturing microorganisms’ proteins.
A second activated carbon unit then removes the excess sodium hypochlorite and any remaining contaminants while also reducing odor.
Additionally, the “Water-Mellon” has the potential for parallel purification. Developers say that by connecting multiple units to one inlet and outlet stream, it can effectively meet the water demands of developing nations.
Members of Carnegie Mellon’s 2022 ChemE Cube team include:
• Seniors: Yerim Lee, Lance Miller, Shivank Joshi, Alayna Mikush
• Juniors: Gabe Mendez-Sanders, Davina Jain, Claire Su, Justin Croyle
Aspiring engineers come to Carnegie Mellon University to learn in a way that goes well beyond principles in a textbook. Practical, hands-on work gives students real-world experience to help them excel in their careers. CMU-Africa Adjunct Professor Ganesh Mani puts this philosophy into practice in his course, Grand Challenges in AI: Past, Present, and Future, through a project-based curriculum offered to students on three continents. Graduate students at CMU-Africa and in the Heinz College of Information Systems and Public Policy, and undergraduate students at CMU-Qatar, come together in a hybrid course that focuses on critically evaluating the progress of artificial intelligence (AI).
“What I tell my students is that this course represents an important aspect of their future careers: the ability to work globally. On day one of a job, you’re very likely to be asked to work with a global team. You’re going to have to coordinate talking to different people across time zones as well as have your perspectives and work
interpreted by colleagues of different cultures,” says Mani.
The course was inspired by Mani’s reflection on a 1988 talk and article from his colleague and mentor, Raj Reddy, the Moza Bint Nasser University Professor of Computer Science and Robotics. Reddy had laid out the grand challenges that he felt the field of artificial intelligence would see over the next three decades. When Mani decided to write a follow-up article in AI Magazine in 2021, he turned the topic into a course that would allow students to get involved in this critical conversation.
The course includes guest lectures from thought leaders such as the University of Maryland’s Catherine Nakalembe, who talks to students about remote sensing for agriculture in Africa, and the Bill and Melinda Gates Foundation’s Kanwaljit Singh, who discusses the role of digital public goods in the field of AI.
Students are asked to work together in small groups to choose a topic that could serve as a building block for, or inform key issues around, a grand challenge in AI. Each group is made up of students from all three CMU locations. Mani explains that projects can be surveys, critical evaluations of existing systems, scaled-down pilot implementations, or bold, well-thought-out proposals.
“One of the interesting projects was titled ‘Understanding Gender-based Violence in Africa,’” says Akintayo Jabar (’21), who served as a teaching assistant for the course after obtaining his master’s degree in information technology from CMU-Africa. “This was particularly interesting because
the subject is a sensitive one. The information obtained from the project could be instrumental in policies aimed at reducing and ultimately preventing gender-based violence cases.” Other project topics have ranged from crop sensing to chat bots in African banking.
The majority of the course is taught in hybrid fashion from CMU-Pittsburgh; however, Mani spends a week teaching in person at both CMU-Africa and CMU-Qatar in order to meet his students. During the semester, students are encouraged to share their perspectives on the course topics, which often differ because of
the diverse cultural experiences of the class participants.
“Multi-location courses usually foster a diversity of experiences, and this wasn't any different. The students across all locations contributed the peculiarities of their experiences during discussions, and how AI has impacted, is impacting, and could still impact them,” says Jabar.
The course will be offered again in the fall of 2023, with some modifications and the new title of “AI and Emerging Economies.” In the future, Mani hopes to expand this class to include students from other disciplines within Carnegie Mellon.
STUDENTS USE SKILLS IN ART, TECHNOLOGY, PROBLEM SOLVING, PROJECT MANAGEMENT, USER INTERFACE, AND COLLABORATION TO CREATE PHYSICAL AND VIRTUAL MAZES.
When Nestor Gomez was looking for a topic that combined art and technology for a new course in Carnegie Mellon’s Integrative Design, Arts and Technology Network (IDeATe) program, he found his way to a surprising idea—mazes.
Mazes and labyrinths have a rich history in art and culture dating back to the ancient Greek myth of Theseus slaying the Minotaur, which was imprisoned in a labyrinth of complex passages. Ancient Romans later incorporated playful labyrinth designs into their tile floors, and the Celts used them in daring games and athletic contests. Today there are elaborate life-size corn mazes in rural fields across America.
But constructing mazes in the physical world requires engineering know-how, and designing them in the virtual world requires computer programming skills.
Engineering, computer science, architecture, and design students have enrolled in the Special Topics: Mazes course in nearly equal numbers. Their skillsets vary, but they all had to apply creativity and problem solving to building mazes for the class projects.
Belle Blanchard, a mechanical engineering student, says she has always liked mazes and was interested in the IDeATe courses because they incorporate the artistic with the technical. After having attended a webinar that Gomez, an assistant teaching professor of mechanical engineering, hosted last year, she also knew him to be an especially fun and enthusiastic instructor.
Working with the Arduinos—the microcontroller kits that many of the engineering students used to add sensors, gates, and lights to their mazes—was a good challenge for her.
“I didn’t have a lot of programming experience, so it was really fun to try it out in such a low stakes environment, and it was inspiring to see the cool ways the other students programmed the electronic functionality into their designs,” said Blanchard.
Gomez strives to inspire creativity with lessons that show possibilities and assignments that are made with minimal constraints. Early on he gave the students a simple piece of wire to create a unicursal labyrinth, to demonstrate that, unlike mazes, which have multiple paths that branch off and do not necessarily lead to a center end point, labyrinths have only a single continuous path to the center.
He constructed a 3 x 3-foot wooden tilt table with magnetic walls that the students used to design mazes that a golf ball rolled through. And Gomez taught the students how to generate virtual mazes in Microsoft Excel.
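The article doesn’t specify which algorithm Gomez taught for generating virtual mazes, but one classic choice, the recursive backtracker, gives a feel for the programming involved. A minimal sketch in Python:

```python
# An illustrative maze generator using the recursive-backtracker algorithm
# (a common choice; not necessarily the method taught in the course).
import random

def generate_maze(w, h, seed=None):
    random.seed(seed)
    # The grid is (2h+1) x (2w+1): "room" cells sit at odd indices, with
    # wall cells between and around them. True means a wall is present.
    walls = [[True] * (2 * w + 1) for _ in range(2 * h + 1)]
    stack = [(0, 0)]
    visited = {(0, 0)}
    walls[1][1] = False                      # open the starting room
    while stack:
        x, y = stack[-1]
        neighbors = [(x + dx, y + dy)
                     for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= x + dx < w and 0 <= y + dy < h
                     and (x + dx, y + dy) not in visited]
        if not neighbors:
            stack.pop()                      # dead end: backtrack
            continue
        nx, ny = random.choice(neighbors)
        walls[y + ny + 1][x + nx + 1] = False  # knock out the shared wall
        walls[2 * ny + 1][2 * nx + 1] = False  # open the neighboring room
        visited.add((nx, ny))
        stack.append((nx, ny))
    return walls

# Render a 10 x 6 maze as ASCII art.
for row in generate_maze(10, 6, seed=42):
    print("".join("#" if cell else " " for cell in row))
```

Because the backtracker carves a spanning tree over the rooms, the result has exactly one path between any two cells, which is what makes the maze solvable yet full of dead ends.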
But for their final projects, they were given the freedom to build either physical or virtual mazes, and Gomez says he was blown away by their ingenuity.
“The mazes that the students designed and built were mind-boggling to me,” said Gomez.
Tanner Aikens, a mechanical engineering major, was inspired by the modular design of the tilt table to design a maze with electronic sensors that opened magnetic gates. The maze could only be completed once all four gates were opened.
“Designing the first one was challenging, but each of the three following mazes were easier to design,” said Aikens who added, “I got to exercise skills I don’t usually use, and, in the end, it was cool that I was able to construct such reliable systems.”
Another group incorporated sensors that played notes to “Twinkle Twinkle Little Star” if the rolling ball moved through the maze properly. A tri-level maze, inspired by Dante’s Inferno, took users through artistic representations of heaven, hell, and the forests of Earth. Another group made a mini-golf maze that could be rearranged to create
different courses for playing the game.
Beyond creativity, students learned some important project management lessons. One group that wanted to build a very small-scale three-dimensional maze found that they weren’t able to achieve the precision they needed when laser cutting its parts. Another group spent too much time trying to incorporate a cable car into their design, and the rest of the maze suffered as a result.
The students also had opportunities to collaborate and observe how users interfaced with their designs. Gomez gave each student a small book of mazes that showed the difference between navigating a maze that you could see in its entirety versus those that didn’t reveal what was next. The students were tasked with figuring out the mazes themselves and then using the book to observe others working through the mazes. Gomez said it was a good exercise for engineers, computer programmers, and artists to observe how others use and adapt to what they create.
Aikens and Blanchard formed an unexpected collaboration for their final project. Aikens and his partner wanted to further explore the idea of not being able to see the entire maze and were designing a maze that the users would walk through. Meanwhile Blanchard and her partner were interested in incorporating a tactile experience into their maze. The two pairs teamed up to create a 6 x 6-foot maze that users walked through blindfolded, using only the various surfaces as clues to guide them through the maze.
Aikens was pleased that he accurately predicted that all of the users believed they were working through a much larger space. And Blanchard said that even though she is not typically a fan of group work, this collaboration, like all of the work for this course, was a really good experience.
Gomez has taught the class during the last two fall semesters and looks forward to teaching it again in the fall of 2023.
Carnegie Mellon University awarded the Founders Medal for Outstanding Service and Exceptional Achievement to Claire and John Bertucci on October 28, 2022, during its 72nd annual CMU Alumni Awards ceremony as part of the university’s Homecoming celebrations.
The distinguished alumni couple have displayed outstanding dedication and service to the university and extraordinary accomplishments in their lives.
“Claire and John Bertucci’s lifelong partnership with CMU is an inspiration to all of us,” said Farnam Jahanian, president of Carnegie Mellon. “Claire and John have demonstrated a deep commitment to access and affordability through their support for fellowships and scholarships, and their philanthropy has also helped to create a state-of-the-art environment where students, faculty and researchers can advance cutting-edge technological breakthroughs. Our entire community is grateful for their enduring commitment to furthering Carnegie Mellon’s mission to transform society through research and education.”
Claire and John Bertucci have a passion for Carnegie Mellon University and for making the world a better place.
They met as CMU students in the 1960s. John completed a bachelor’s degree in metallurgical engineering and a master’s degree in industrial administration, while Claire graduated with a bachelor’s degree in business. After graduation and career experience in semiconductor manufacturing and management consulting, John joined MKS Instruments, a small process control instrumentation company, in 1970. He purchased MKS in 1974 and went on to guide its growth and international expansion as CEO and chairman. MKS became a public company in 1999 (MKSI). John retired as chairman in 2020 and now serves as chairman emeritus.
That success fueled the couple’s passion for giving back. At CMU, they funded the Claire and John Bertucci Nanotechnology Laboratory, which performs more than $10 million a year in cutting-edge research. In the College of Engineering, they established the Claire and John Bertucci Fellowship in Engineering, which has supported 184 fellows since 2008, and the John and Claire Bertucci Distinguished Professorship in Engineering. They also started the Claire Ruge Bertucci and John R. Bertucci Endowed Presidential Scholarship, a four-year scholarship received by three students since 2017.
Gianluca Piazza, director of the Claire and John Bertucci Nanotechnology Laboratory, said the couple’s impact is amplified by the hundreds of people—from academia, national labs, and industry—who use the lab to innovate every year.
John is an emeritus trustee on the CMU Board of Trustees and a long-time member of the College of Engineering’s Dean’s Advocacy Council. Outside of Carnegie Mellon, Claire and John support many organizations in their Massachusetts community.
“Our education here really taught us how to think and how to continue to learn on our own, and for that we’re truly thankful,” John said. “We’ve been pleased to be able to help CMU continue to be a leading academic and research university.”
Fellow College of Engineering alumnus and emeritus trustee Philip Dowd (’63) said the Bertuccis helped Carnegie Mellon become the “great university that it is today.”
“There are people who make things happen, there are people who watch things happen,” Dowd said, “and I think that John and Claire are clearly in the former group.”
Alumna Lauren (Milisits) Gonzalez (BS MechE ’13, BS CS ’14) still remembers getting the letter in the mail that meant she could attend Carnegie Mellon. It wasn’t her acceptance letter; that had already arrived.
It was a scholarship award notice. With it, combined with federal grants, she could afford to accept her CMU admission.
“I remember telling my mom once, ‘I hope one day I’m able to give back as much as I got in scholarships so that somebody else could do the same,’ sort of as a pay-it-forward gesture,” Gonzalez said.
She recently set up an endowed scholarship for the Department of Mechanical Engineering to do just that with the help of her employer’s match program. By endowing the scholarship, Gonzalez ensured it will partially support at least one engineering student each year in perpetuity.
“If I’m able to make what I do go further, then I might as well,” she said, noting that many companies match their employees’ charitable donations—doubling their impact—like Shell does for her.
Gonzalez hopes the scholarship will encourage undergraduate students to pursue engineering, particularly women, given their persistent underrepresentation in science, technology, engineering, and math (STEM) fields.
Growing up in Pittsburgh’s Squirrel Hill with parents who encouraged her interest in STEM, Gonzalez said she didn’t realize there was a gender
imbalance in STEM until college. Over the years, her parents had enrolled her in as many university outreach programs as they could.
“It was through all of those free programs that I kept my interest in science and engineering. And I never once doubted that I could do it,” she said. “So then, when I actually was going to school, that’s where I started to hear the stereotypes of, ‘Oh, you’re a woman. Are you sure you’re going into engineering?’ or ‘Oh, you’re in CS? You don’t look like a CS major.’”
Those responses irritated her, driving her to help more women get into STEM.
After her undergraduate years, Gonzalez moved to Texas where she worked as a test automation engineer
at a start-up and enrolled at the University of Texas at Austin to earn her master’s in mechanical engineering. She briefly interned for Shell while still a graduate student. The company offered her a full-time position, leading to an opportunity to work offshore on one of the company’s largest floating platforms, Appomattox, in the Gulf of Mexico. She continued to climb, becoming a project manager and now, a business advisor to the vice president of information and digital engineering in Houston.
COVID-19 added challenges, but Gonzalez said she’s proud of her and her team’s resilience throughout the pandemic. They delivered a multimillion-dollar project on time, on
budget, and on target “from our pajamas at home,” she joked. And she delivered a baby in the middle of it.
Her experience in the Department of Mechanical Engineering helped equip her for such challenges by exposing her to different people, projects, and work ethics, she said. She found that she likes “big problems that challenge the world, that have major impact.”
“We worked our butts off in school,” Gonzalez said. “And knowing how difficult that can be, it helps build the mindset that when something’s not easy, you don’t just quit right away and give up, because we wouldn’t be tackling big problems, and we wouldn’t be changing things if as soon as you hit a roadblock, you quit.”
The College of Engineering also provided some of her favorite memories while at CMU—trips to Peru and China with educational and industry components. Opportunities she never would have had if not for scholarships.
Education changed her life, she said, and motivates her philanthropy. Gonzalez encourages other alumni to consider how they can change someone’s life, even with small donations.
“Because I’m telling you, every thousand dollars, every hundred dollars I got towards the scholarship completely changed my path,” she said.
The trading card industry is thriving, and cards are selling for record-setting prices. In August 2022, a 1952 Mickey Mantle card sold for a whopping $12.6 million, while in 2021, a Pikachu Pokémon card fetched $5.275 million. Along with high sales prices for individual cards, the trading card industry was valued at $4.7 billion as of 2019 and is projected to reach $62 billion by 2027. Despite the soaring projections, the technology serving the industry hasn’t kept pace.
Trading cards have historically been sold and traded at local, independently owned stores. With small staffs and millions of cards to sort and catalogue, these businesses struggle to run efficiently and profitably. With ecommerce creating an increasingly competitive environment, owners face the dilemma of how to stay afloat with limited resources.
Carnegie Mellon University alumni Kevin Lipkin (Mech E’08, Tepper ’09) and Cornell Wright (ECE ’07, CS ‘08) believe they have the answer with the Roca Sorter, a trading card sorting and cataloguing robot.
Serial entrepreneurs Lipkin and Wright spent 15 years founding and growing companies in the healthcare, wastewater, education, and agricultural industries. Although they had little background in trading cards, they began their foray into the industry in late 2019 after meeting with some early customers of the Roca Sorter.
“We could just hear and see what a game changer the product was for them, helping these small businesses grow, be more efficient, and really do things that they otherwise wouldn’t be able to,” said Lipkin.
They recognized an opportunity with the product and moved forward with acquiring the Roca Sorter product line to further commercialize the technology.
The Roca Sorter works in a two-step process: first identifying a card, then sorting it. The machine’s input tray is loaded with 1,000 cards at a time. A camera takes an image of the top card, which is analyzed via an image-hashing algorithm and compared to the cards of a known “universe,” such as “Magic: The Gathering” or “Pokémon.”
Once the card is identified and mapped to textual information, it’s sorted mechanically. The top card in the input tray is picked up by a vacuum cylinder, which moves along an X/Y gantry to lay the card in one of forty-five bins in the machine. The next card in the input tray is then analyzed and placed in a bin, and the process continues until the machine has sorted all 1,000 cards. The machine then recompiles the cards and places them back into the input tray in the order that the customer wants, typically alphabetical or by market price.
The customer also receives a CSV file of all the cards that were sorted, allowing the customer to build a “library” of their inventory. This CSV file can easily be loaded into an ecommerce inventory account, so customers who may sell thousands of cards per day can list their cards for sale online without having to individually go through their bulk collections. Lipkin explains, “By having the cards cataloged and alphabetized, it’s like the Dewey Decimal System. It becomes very easy and much faster to pull the exact cards that got ordered that day and get them shipped out.”
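The article names an image-hashing algorithm without specifying which one. As a rough illustration of the identification step, here is a minimal average-hash sketch in Python; the card “universe” and file paths are hypothetical stand-ins.

```python
# An illustrative sketch of the identification step only. Uses a simple
# average hash; the Roca Sorter's actual algorithm is not public, and the
# reference "universe" below is a hypothetical stand-in.
from PIL import Image

def average_hash(path, size=16):
    """Downscale to a size x size grayscale image; each bit records whether
    that pixel is brighter than the image's mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def identify(photo_path, universe):
    """Return the name of the closest-matching card in a {name: hash} dict."""
    h = average_hash(photo_path)
    return min(universe, key=lambda name: hamming(h, universe[name]))

# Hypothetical usage: hashes precomputed from reference scans of a card set.
# universe = {"Pikachu (Base Set 58/102)": average_hash("refs/pikachu.jpg")}
# print(identify("camera/frame_0001.jpg", universe))
```

A hash-and-compare design like this is fast enough to keep up with a mechanical feeder because matching reduces to cheap bitwise operations rather than full image comparison.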
Once Lipkin and Wright acquired the Roca Sorter, they had to support existing customers and set up new warehousing and manufacturing facilities for the robots. “The initial focus was ‘How do we keep selling these and have a product that makes our customers happy and be able to support them?’” said Wright. Guided by that focus, Wright and Lipkin worked on the Roca Sorter and its software to improve reliability and manufacturability. They also implemented a support plan for their customers, which provides replacement parts and software updates.
Since 2019, Lipkin and Wright have sold upwards of 200 machines and grown to a nine-person team. They were acquired by TCGplayer in November 2021, which allowed them to continue improving their product and growing. In 2022, their robots sorted more than 100 million trading cards for their customers, many of whom run their machines 24/7. Late in 2022, eBay acquired TCGplayer, making Roca Sorter part of an 11,000-person company. Lipkin said of the acquisition, “I think this is a pretty unique opportunity to take everything we’ve done to the next level.”
Forbes recently named two Carnegie Mellon Engineering alumni to its 30 Under 30 in Energy list for their innovative developments. Shashank Sripad and Olivia Dippo both saw needs in their respective fields and sought to fill these gaps. From this, Sripad’s “stealth-mode” electric aircraft company and Dippo’s Limelight Steel were born.
Shashank Sripad attended CMU for both his master’s in energy science (’16) and his Ph.D. in mechanical engineering (’22). His career in the energy sector began in the lab of Venkat Viswanathan, an associate professor of mechanical engineering, where he started with research on electric semi-trucks. With companies such as Tesla and Rivian releasing electric cars, Viswanathan and Sripad decided to take this one step further and focus on vehicles that emit even more pollution. Their work gained attention after one of their publications correctly predicted the production timeline of these trucks.
In 2018, Sripad began shifting his focus to electric aircraft manufacturing after being approached by the companies Airbus and Zunum Aero. He and his colleagues began applying the same principles used in building electric car batteries to aviation. Their main challenge is maximizing the energy stored while keeping the battery light enough to fly.
The decarbonization of aviation is the main goal of Sripad’s company. He calculated that the emissions saved by about 400,000 electric vehicles would equate to only about 10 to 15 electric aircraft. “The impact you can have on decarbonization is really significant. And that is the core motivating factor,” he states.
Looking to the future, one of the biggest challenges Sripad believes he and his colleagues will face is planning on such a large scale and remaining patient throughout the battery development process. Working in aviation requires certifications and strict safety measures, which can lead to these advancements taking five to 10 years to complete. Still, he says, “there is no room for error in terms of safety.”
Olivia Dippo (BME/MSE ’15) loved how her undergraduate Introduction to Materials Science class at CMU combined “hands-on lab experience with the theory to understand how atomic-scale phenomena affected human-scale materials behavior.” After gaining an initial interest in steel and metallurgy from her professors and Pittsburgh’s rich history, she joined the lab of Anthony Rollett, a professor of materials science and engineering and co-director of the Next Manufacturing Center. She credits him with being an encouraging mentor who inspired her to follow in his footsteps and secure a job at Los Alamos National Laboratory.
Dippo and her co-founder and friend, Andy Zhao, discussed their initial ideas for Limelight Steel at weekly meetups over coffee. Together, their expertise in their respective fields allowed them to realize they had a solution for lowering CO2 emissions in the iron and steel industry by targeting the blast furnace that is currently used for steelmaking. The current method is responsible for 8-10% of global CO2 emissions, a number Dippo hopes to reduce to zero in the future.
“Steel demand continues to grow as the world continues to industrialize, so it’s imperative that we decouple the steel that’s used for infrastructure, automobiles, and so many more applications in our daily lives, from CO2 emissions,” Dippo says. Limelight Steel’s laser furnace technology uses approximately 40% less energy than current methods and does not use fossil fuels, creating almost no emissions at all. Dippo believes that through her company’s efforts, they will contribute to keeping the planet a clean and habitable place.
Sullivan discusses danger of air fresheners
The Washington Post
Ryan Sullivan explains how air fresheners can actually have serious adverse effects on consumers. He recommends naturally sourced essential oils as a way to combat household odors.
Bergbreiter discusses insect-inspired jumping robots
Popular Science
Sarah Bergbreiter discusses how jumping insects inspired the mechanics of her team’s design for a bouncing robot that controls energy transfers between surface and device using a launch mechanism.
Morgan discusses power grid security concerns
CBS 60 Minutes
Granger Morgan discusses the challenges of securing power facilities as physical attacks by domestic extremists increase.
Presto talks about the chemical effects of the East Palestine train derailment
Fortune, NBC News, and CNN
Albert Presto talked about the environmental effects of the East Palestine train derailment and the chemicals’ effects on the town’s residents.
CMU Engineering joined forces with the Pittsburgh Penguins, Covestro, and Bauer Hockey to make ice hockey safer. The students developed shot-blocking equipment to protect players’ lower legs and ankles.