
The current study was based on laboratory experiments using an upper-limit dietary exposure of streptomycin to bumblebees. It is not known whether wild bumblebees are affected by agricultural spraying of streptomycin, or whether they are exposed to the tested concentration in the field.

Funded by a US Department of Agriculture grant, the researchers will now conduct field studies where streptomycin is sprayed on fruit orchards. If a detrimental impact on bumblebees is found, the researchers hope to provide evidence to support recommendations for methods and policies that may better serve farmers.

Based on established evidence, the researchers hypothesize that the negative impact of streptomycin on bumblebees seen in the lab experiments may be due to disruption of the insects’ microbiome.

“We know that antibiotics can deplete beneficial microbes, along with pathogens,” Avila says. “That’s true whether the consumers of the antibiotics are people, other animals, or insects.”—Carol Clark

A NEW METHOD FOR FIGHTING CANCER

Innovative approach in the quest to develop cancer vaccines nets Emory chemist Rong Ma a Michelson Prize.

Emory chemist Rong Ma 21G received a $150,000 Michelson Prize for her proposal to harness the mechanical processes of cells as a new approach in the long-running quest to develop cancer vaccines. Ma, who received a PhD from Emory in 2021, is a postdoctoral fellow in the lab of Khalid Salaita, Emory professor of chemistry.

The Michelson Prizes: Next Generation Grants are annual awards to support young investigators who are “using disruptive concepts and inventive processes to significantly advance human immunology and vaccine and immunotherapy discovery research for major global diseases,” according to the Michelson Medical Research Foundation and the Human Vaccine Project, the organizations administering the awards.

Ma was one of three scientists selected through a rigorous global competition to receive a 2021 Michelson Prize for immunotherapy research.

“We need disruptive thinkers and doers who dare to change the trajectory of the world for the better,” says Gary Michelson, founder and co-chair of the Michelson Medical Research Foundation. “Yet promising young researchers too often lack the opportunities, resources, and freedom to explore their bold ideas. The pandemic has created additional roadblocks for many of them. With the Michelson Prizes, we aim to provide early-career investigators a vital boost for their forward-thinking approaches.”

“Rong Ma is a spectacular, highly motivated scientist,” Salaita says. “Sometimes I will tell her that a goal she sets is too lofty or difficult to pull off, but she will look back at me and say, ‘I want to do really big, difficult things.’ ”

“To find specific antigens on cancer cells for cancer vaccine development is extremely challenging, partly because of the ambiguity in predicting what antigens the body’s immune cells can recognize,” Ma says. “Many researchers are focused on using genetic sequencing techniques to find genetic mutations and predict tumor-specific antigens to achieve this goal.”

Ma’s proposal, however, is to use the mechanical forces transmitted by immune cells to antigens as a marker to identify and evaluate whether an antigen can trigger a potent immune response. If the method works in a mouse-model system, Ma explains, the long-range vision would be to isolate the immune cells that are mechanically active when recognizing cancer-specific antigens. The identified antigens and isolated immune cells could then be used to train the body to defend against cancer cells.—Carol Clark

A Trio Of Goldwater Scholars

Three juniors in Emory College of Arts and Sciences have been named Goldwater Scholars for 2022, the fourth consecutive year that multiple students have won the nation’s top scholarship for undergraduates studying math, natural sciences, and engineering.

Anish “Max” Bagga 23C (mathematics and computer science), Noah Okada 23C (computer science and neurobiology), and Yena Woo 23C (chemistry) are among the 417 recipients chosen from more than 1,240 nominees from universities across the country. Emory has produced forty-five Goldwater Scholars since Congress established the program in 1986 to honor the work of the late Sen. Barry Goldwater. Each Goldwater Scholar will receive up to $7,500 per year for their studies, until they earn their undergraduate degrees.

MAKING MORE ‘GOOD TROUBLE’

Professor Darren Lenard Hutchinson was selected to lead the School of Law’s new Center for Civil Rights and Social Justice. The center will enhance the law school’s already rich focus on issues of civil rights, human rights, and social justice and will serve as a hub for interdisciplinary scholarship, research, teaching, evidence-based policy reform, and community outreach. The center was established in September, thanks to a transformative gift of $7 million from the Southern Company Foundation. Hutchinson is the law school’s inaugural John Lewis Chair for Civil Rights and Social Justice, which serves as a lasting tribute to the legacy of “good trouble” advocated by the late congressman.

Seeking Truths For Racial Crimes

Professor Hank Klibanoff and Gabrielle Dudley, instruction archivist in Emory’s Stuart A. Rose Manuscript, Archives, and Rare Book Library, were confirmed by the United States Senate to serve on the Civil Rights Cold Case Records Review Board. Dudley and Klibanoff were nominated by President Joseph R. Biden in June 2021. The review board will examine records of unpunished, racially motivated murders of Black Americans from 1940 to 1980. Dudley is a founding member of the Atlanta Black Archives Alliance and has been working with civil rights collections for more than a decade. Klibanoff is the director of the Georgia Civil Rights Cold Cases Project at Emory and the creator and host of the Buried Truths podcast, which delves into the stories of unpunished racially motivated killings.

Hospitals Reach New Heights

Four Emory Healthcare hospitals have been named top Georgia and US hospitals, and one has been named a top global hospital, in Newsweek’s lists of the World’s Best Hospitals 2022. Emory hospitals took the top four spots in Georgia. Emory University Hospital was listed as the No. 1 hospital in Georgia, and it was the only Georgia hospital named to the top 250 global list, coming in at No. 135 in the world. Emory Saint Joseph’s Hospital was listed as the No. 2 hospital in the state. Emory Johns Creek Hospital took third place, while Emory University Hospital Midtown ranked fourth. All four hospitals placed among the top 300 hospitals in the country.

BY TONY REHAGEN

ILLUSTRATION BY CHARLES CHAISSON

PHOTOGRAPHY BY STEPHEN NOWLAND

Our fear of artificial intelligence long predates AI’s actual existence. People have a natural apprehension in the face of any technology designed to replace us in some capacity. And ever since we created the computer—a box of chips and circuits that almost seemed to think on its own (hello, HAL-9000)—the collective countdown to the robot apocalypse has been steadily ticking away.

But while we’ve been bracing to resist our automaton overlords in some winner-take-all technological sci-fi battle, a funny thing happened: The smart machines quietly took over our lives without our really noticing. The invasion didn’t come from the labs of Terminators from Skynet or Agents from The Matrix; it took place in our pockets, in the grocery checkout line, on our roads, in our hospitals, and at the bank.

“We are in the middle of an AI revolution,” says Ravi V. Bellamkonda, Emory University’s provost and executive vice president for academic affairs. “We have a sense of it. But we’re not yet fully comprehending what it’s doing to us.”

Bellamkonda and his colleagues at Emory are among the first in higher education to dedicate themselves, across disciplines, to figuring out precisely the impact the rapid spread of AI is having on us—and how we can better harness its power.

For the most part, this technology comes in peace. It exists to help us and make our lives easier, whether it’s ensuring more precise diagnoses of diseases, driving us safely from place to place, monitoring the weather, entertaining us, or connecting us with each other. In fact, the real problem with AI isn’t the technology itself—it’s the human element. Because while true, autonomous artificial intelligence hasn’t been achieved (yet), the models of machine learning that have snuck into every facet of our lives are essentially algorithms created by humans, trained on datasets compiled and curated by humans, employed at the whims of humans, that produce results interpreted by humans. That means the use of AI is rife with human bias, greed, expectation, negligence, and opaqueness, and its output is subject to our reaction.

In fact, the emergence of AI presents an unprecedented test of our ethics and principles as a society. “Ethics is intrinsic to AI,” says Paul Root Wolpe, bioethicist and director of Emory’s Center for Ethics.

“If you think about the ethics of most things, the ethics are in how you use that thing. For instance, the ethics of organ transplantation is in asking ‘Should we perform the procedure?’ or ‘How should we go about it?’ Those are questions for the doctor—the person who develops the technology of organ transplantation may never have to ask that question,” Wolpe says.

“But AI makes decisions, and because decisions have ethical implications, you can’t build algorithms without thinking about ethical outcomes.”

Of course, just because the scientists and engineers realize the implications of their creations doesn’t mean they are equipped to make those momentous decisions on their own, especially when some of their models will literally have life-or-death implications. These algorithms will do things like decide whether a spot on an X-ray is a benign growth or a potentially life-threatening tumor, use facial recognition to identify potential suspects in a crime, or use machine learning to determine who should qualify for a mortgage. Is it really better for society to have engineers working for private companies deciding what datasets most accurately represent the population? Is it even fair to place that burden on them? What is the alternative?

The answer might be the very thing we’ve already identified as AI’s key flaw—humanity. If the big-data technology is going to continue to take on more and more responsibility for making decisions in our lives—if AI is truly the cold, calculating brain of the future—then it’s up to us to provide the heart. And Bellamkonda and Wolpe are among the forward-thinking leaders who believe we can do that by incorporating the humanities at every step of the process. One way to accomplish that is with existing ethics infrastructure. Ethics has long been a concern in medical science, for example, and there are many existing bioethics centers that are already handling AI-related questions in medicine.

At Emory, the Center for Ethics boasts a world-class bioethics program, but also includes ethicists with decades of experience tackling issues that extend far beyond medicine alone, such as business, law, and social justice—all realms that are being impacted by the emergence of machine learning.

“I’m a proponent of prophylactic ethics,” says Wolpe. “We need to

Continued on Page 19
