Mudd Center for Ethics: Annual Report 2020 - The Ethics of Technology


Mudd Center for Ethics

Annual Report 2020

The Ethics of Technology


Welcome

On behalf of the Roger Mudd Center for Ethics at Washington and Lee University, I am pleased to provide this report, summarizing highlights of the center’s recent programming.

It has been a busy and exciting time. Guiding all of the Mudd Center’s work has been its original goal of “fostering serious inquiry into, and thoughtful conversation about, important ethical issues in public and professional life.” In 2010, founding director Angela Smith envisioned a center that would encourage a “multidisciplinary perspective on ethics informed by both theory and practice.” That vision holds true today, as I hope the following pages attest. Thanks to Roger Mudd’s generous gift to Washington and Lee, the center is now celebrating a decade of supporting in-depth analysis of contemporary ethical issues.

In keeping with its model of selecting an overarching theme for each academic year, the center chose “the ethics of technology” for its speaker series in 2019-20. As this report shows, our speakers explored a range of topics under that theme. Bioethicist Josephine Johnston kicked off the series in September with a compelling lecture on gene editing (i.e., CRISPR) technologies. Johnston imagined parents of the future grappling with whether to use such technology to advantage their own children. Later fall guests included Jeffrey Smith, a professor of cyberspace studies at Virginia Military Institute, who took a broad view of technology’s effects on human history; Ron Arkin, a Georgia Tech engineer who described developments in robotics and robo-ethics; Helen Nissenbaum, the country’s leading theorist of privacy; and Virginia Eubanks, who presented social justice issues plaguing automated government services.

In the midst of these talks, the Mudd Center’s postdoctoral fellow, Jeremy Weissman, taught undergraduate seminars on the same topic and drew upon the center’s lineup of guests. The center’s faculty fellows also gathered in advance of each speaker’s visit to discuss the speaker’s work and its broader implications.

The Winter Term schedule proved no less fascinating. Franklin Foer, correspondent for The Atlantic and author of “World Without Mind,” lectured on the rise and domination of the big tech firms. What impact have they had on culture? Should they be regulated, and if so, how? Next to speak were Washington and Lee’s own Karla Murdock and Wythe Whiting, past and current chairs of the university’s Department of Cognitive and Behavioral Science. To a standing-room-only audience, the two scholars examined the psychological impacts of cellphone use in the lives of young people.

Although several other presentations were canceled due to the Covid-19 pandemic, we were grateful for the year’s eight speakers and for their memorably rich reflections on today’s world and its shaping, for better or worse, by technology. In the following pages, discover more about these talks and about other programming, past and future, of the Mudd Center.

Brian Murchison, Director


Technological change is everywhere, sparking basic questions: Where are these changes taking us? What values should inform society’s choices? In 2019-2020, the Mudd Center engaged a host of questions about “the ethics of technology.” Those included: How should we think about developments in CRISPR technologies for editing human DNA, what ethical guidelines should apply, and what vision of humanity should the guidelines reflect? In the realm of robotics and artificial intelligence, what ethical considerations should inform the practices of designers, builders and users? As governments and businesses increasingly use digital decision-making systems, what duties arise to prevent unfair treatment of citizens? As surveillance technologies become ever more pervasive and effective, how should privacy be understood, and what rules should protect it from invasion by the state or private entities? Should the practices of big tech companies — including unprecedented gathering and selling of information about individuals — be regulated, and if so, how? What benefits and risks are associated with Big Data algorithms, and how can the technological revolution be harnessed to contribute to the public good? The aim of the series was to explore those and related questions.

The Mudd Center was endowed by Roger Mudd, a 1950 graduate of W&L, award-winning journalist, and member of the advisory committee for W&L’s department of journalism and mass communications. For over thirty years, he served as a Washington correspondent for CBS News, NBC News, and the “MacNeil/Lehrer Newshour” on PBS. His honors include the George Foster Peabody Award, the Joan S. Barone Award for Distinguished Washington Reporting, and five Emmys. In 2010, he said, “Given the state of ethics in our current culture, this seems a fitting time to endow a center for the study of ethics, and my university is its fitting home.”




The Good Parent in an Age of Gene Editing: How Novel Genetic Technologies Challenge Parental Responsibility
JOSEPHINE JOHNSTON | SEPTEMBER 26, 2019

Josephine Johnston, a bioethicist and lawyer at the Hastings Center, an independent bioethics research institute in Garrison, New York, came to Washington and Lee University on Sept. 26 and delivered a talk entitled “The Good Parent in an Age of Gene Editing: How Novel Genetic Technologies Challenge Parental Responsibility.” Stackhouse Theater was standing-room-only for the event.

Johnston’s talk focused on “clustered regularly interspaced short palindromic repeats,” or CRISPR-Cas9. This technology, she explained, can alter any DNA, including DNA in embryos, and changes can be passed on to offspring. With CRISPR, it may be possible to eradicate certain diseases and “fix” certain behavioral and educational disabilities. “We can now engineer the human race,” Johnston said.

Johnston explored some of the ramifications of creating children with the aid of CRISPR. She particularly asked what it will mean to be a “responsible parent” when the technology becomes more widely available. She posited that whole-genome sequencing will become part of the standard newborn screening process in the coming years. Johnston predicts that many parents will choose to make maximal use of genome technology for the perceived benefit of their children, but she worries that the opportunity to use the technology will morph into an obligation if seen as a responsibility of parenting.

Johnston’s ethical concerns relate to CRISPR’s potential burdens on users. An immediate concern is financial: The cost of having a child will rise drastically if parents feel obligated to have a child through in vitro fertilization. The problem of costs raises a host of equality considerations. Another burden relates to the complexity of deciding what to edit and what not to edit. Further concerns relate to identity and values: Is gene editing an extreme form of parental control over the child? Cultural pressure on parents to use genome technology could cause parents to act in ways that deeply distress them by interfering with their own values. Such issues likely will have disproportionate impact on women, the poor, the disabled or those with particular religious, moral or political convictions.

“WE CAN NOW ENGINEER THE HUMAN RACE.”

She closed with a concern that society tends to prioritize the best interests of the child and ignore the flourishing of parents. In her view, parents’ own interests belong in the calculus of ethical parenting. At the end of the talk, an audience member asked, “If, a hundred years from now, we as a society support CRISPR technology, if we are sure that an editing procedure will not have negative collateral effects, and if we are confident that everyone can afford the technology regardless of socioeconomic status, would a parent have an ethical obligation to prevent something like autism in a child?” Johnston answered that if concerns about collateral effects of editing and socioeconomic influence were alleviated, then there might be an obligation to edit disabilities and diseases. However, she cautioned that our culture has a regrettable tendency to think negatively of disabilities, and that some conditions are not so readily categorized as “disabilities” in the first place. She concluded, “We should be encouraged to think hard about what are the ‘disabilities’ we should edit out.”

Josephine Johnston is a bioethicist and lawyer at the Hastings Center, an independent bioethics research institute in Garrison, New York. Her expertise involves the ethical, legal and policy implications of biomedical technologies. Publications include (with Erik Parens) “Human Flourishing in an Age of Gene Editing” (Oxford: 2019). She is a member of Columbia University Medical Center’s Center for Excellence in Ethical, Legal and Social Implications.




An Ethical Framework for a God-like Intellect
JEFFREY SMITH | OCTOBER 3, 2019

Jeffrey Smith, brigadier general (retired) and professor of cyberspace studies at Virginia Military Institute, delivered the second talk in the series. An audience composed of Washington and Lee undergraduates and law students, faculty, staff, VMI cadets and Lexington community members assembled in Hillel House to hear Smith discuss “An Ethical Framework for a God-Like Intellect.”

Human ambition, according to Smith, is “soaring, global, universal, God-like.” Taking the audience back to the origins of human history, Smith noted that nature’s only criterion for achievement was that humans survive to child-bearing age. Humans surpassed that end, creating what Smith termed mankind’s greatest technological achievement — culture. Human culture is characterized by language and the ability to examine human origins through myth. Today, we have advanced to the point where we have such things as artificial intelligence and gene modification. What, he asked, is an ethical framework that is flexible enough for our culture with its advanced technology?

For Smith, the mission is to build a society with an ethical framework that resolves in our favor the problem sets associated with technology, such as modified genetics, artificial intelligence and human vulnerability to threats like nuclear war. Smith characterized society as “going full speed ahead,” making technological advances that cannot be turned around. Ethics, an inherent part of every culture, serves as a kind of societal pacemaker. The trick will be to “go full speed ahead, but do it slowly.”

“WE ARE THE SCRIBES AND WE’RE INTERESTED IN TELLING THE STORY OF NATURE—SCIENTISTS DO IT, ETHICISTS DO IT, LITERARY WRITERS DO IT.”

After pointing out that ethics and morality in Western culture are linked historically to the Ten Commandments, Smith noted that contemporary thinker Yuval Noah Harari has examined a number of belief systems but discarded them all as insufficiently flexible for today’s society. So where are we to look for the flexible ethical framework that can address seemingly intractable quandaries posed by today’s technology? Perhaps humans can look to their vast technology and use it to model ethical solutions. Smith believes in the hope that he says is inherent in humanity: “We [humans] are the scribes and we’re interested in telling the story of nature—scientists do it, ethicists do it, literary writers do it.” One audience member asked, “Since our ethical frameworks are inherently selfish, how do we model them to be less selfish?” Smith answered, “In a world where technology is a partner in our crimes, we haven’t created unselfish ethical frameworks very well, so how can our technology offer something different? If our world has enough space to account for sufferings, our technology can learn from that. But if we can’t account for suffering, our tech won’t.”

Brigadier General (ret.) Jeffrey G. Smith Jr. had a 33-year military career that included multiple staff and command roles in Europe, Bosnia-Herzegovina, Iraq and Afghanistan. For four years, he was dean of the faculty and deputy superintendent for academics at Virginia Military Institute. Now teaching cyberspace studies, he offers a course, “History of Information Technology (Past, Present, and Future),” examining the origins, evolution and ethical implications of today’s technological environment.



Robots That Need to Mislead: Biologically-Inspired Machine Deception
RONALD ARKIN | OCTOBER 21, 2019

Professor Arkin’s lecture, entitled “Robots That Need to Mislead: Biologically-Inspired Machine Deception,” was part of his larger project on human-robot interaction. Relationships between people and machines can exist in diverse settings such as warfare, childcare and eldercare. Arkin and his team at Georgia Tech have spent years working on robot decision-making about when and how to trust and deceive. Their work necessarily includes a variety of ethical questions, and, as Arkin states, “everyone is a stakeholder in this discussion.”

Arkin opened his talk by asking: “Why would we want to teach robots to deceive?” Part of the answer resides in the fact that scientists have observed deception in use throughout nature and the human world — by primates, in medical theory, in the military, even in sporting events, such as when a point guard in basketball “fakes” to get a defender out of the way for a clean shot at the basket. People use deception all the time: It is a part of human culture. Arkin gave an example from pop culture. In the film “Interstellar,” scientists had to dial back the truth parameters of a robot to 90% because absolute honesty is not always the safest or most diplomatic form of communication among emotional human beings.

Arkin’s work on deception was biologically inspired in the sense of relying on studies of deceit practiced by animals. Studies of squirrel hoarding showed squirrels misleading others about the location of stored acorns. Studies of bird mobbing showed Arabian babblers faking strength to ward off predators. Arkin sought to use such behavioral patterns in developing algorithms for small mobile robots. His working definition of deception was “a false communication that tends to benefit the communicator.”

Arkin’s experimentation involved enabling robots to model, generate and cope with misdirection in various situations. He ultimately could teach robots to identify the kind of situation warranting deceptive behavior and to act accordingly by using deception signals. While Time magazine hailed Arkin’s work on deceptive robots as one of the top 50 inventions of 2010, New York magazine called it “a stunning display of hubris.”

“IN GENERAL, DECEPTION IS ACCEPTABLE IN AN AFFECTIVE AGENT WHEN IT IS USED FOR THE BENEFIT OF THE PERSON BEING DECEIVED, NOT FOR THE AGENT ITSELF. FOR EXAMPLE, DECEPTION MIGHT BE NECESSARY IN SEARCH AND RESCUE OPERATIONS, ELDER- OR CHILD-CARE.”

After explaining his research and its reception, Arkin considered robo-ethics. He asked whether deception is ever acceptable in humans and whether we err in introducing that capability in machines. He cited two possible approaches: first, the Kantian position that deceptive behavior and lies are morally unacceptable, and second, the utilitarian theory that if deception increases happiness and total benefit, it is acceptable. He noted that “in calming down a panicking individual in a search and rescue operation or in the management of patients with dementia, with the goal of enhancing that individual’s survival,” deception can have societal value. “In this case, even from a rights-based approach, the intention is good, let alone from a utilitarian or consequentialist formulation,” so that arguably deception is warranted. At the same time, the same technology conceivably could be used for nefarious purposes. The true ethical question then becomes: How do we ensure robot deception is only used in appropriate contexts?



The Institute of Electrical and Electronics Engineers (the world’s largest technical professional organization dedicated to advancing technology for the benefit of humanity) has guidelines in place. One recommendation states: “In general, deception is acceptable in an affective agent when it is used for the benefit of the person being deceived, not for the agent itself. For example, deception might be necessary in search and rescue operations, elder- or child-care.” A second one states: “For deception to be used under any circumstance, a logical and reasonable justification must be provided by the designer, and this rationale must be approved by an external authority.” However, the IEEE guidelines are currently just that: guidelines.

Legislation does not exist, and little professional conversation addresses when it is acceptable to use this technology. Following the talk, an audience member asked, “How do we deal with or control companies and foreign or domestic actors using these deceptive mechanisms?” The answer, responded Arkin, resides in laws, regulations, whistleblowers and IEEE guidelines that provide a basis for thinking about ways to proceed. IEEE has lobbying capabilities in Washington as well as clout within companies; its mission is to inculcate ethical behavior in individuals. “But there will always be rogues,” said Arkin. “Ultimately, you have to try to find ways to discover and penalize unethical behavior.”
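The two IEEE recommendations lend themselves to being read as a simple decision rule. The sketch below is purely illustrative (our restatement, not code from Arkin’s laboratory or from any IEEE standard), and every name in it is hypothetical: deception passes the check only if it benefits the person being deceived and the designer’s rationale has been externally approved.

```python
# Illustrative sketch only: the two IEEE recommendations above restated as a
# pre-action check. All names are hypothetical; this is not code from Arkin's
# lab or from any IEEE standard.

from dataclasses import dataclass


@dataclass
class DeceptionRequest:
    beneficiary: str             # who the deception is meant to help
    target: str                  # who would be deceived
    justification: str           # the designer's documented rationale
    approved_by_authority: bool  # external review, per the second guideline


def deception_permitted(req: DeceptionRequest) -> bool:
    """Permit deception only when it serves the person being deceived and
    an external authority has approved the designer's rationale."""
    benefits_the_deceived = req.beneficiary == req.target
    has_approved_rationale = bool(req.justification) and req.approved_by_authority
    return benefits_the_deceived and has_approved_rationale


# Example: calming a panicking person during a search-and-rescue operation.
request = DeceptionRequest(
    beneficiary="survivor",
    target="survivor",
    justification="Reassurance improves compliance and survival odds.",
    approved_by_authority=True,
)
assert deception_permitted(request)
```

The point of the sketch is the structure of the rule, not the code: both conditions must hold at once, which is exactly what makes the guidelines demanding in practice.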

Ronald Arkin is a roboticist at Georgia Tech and founder of the Mobile Robot Laboratory. His work has taken him around the world: to the Royal Institute of Technology in Stockholm, the Sony Intelligence Dynamics Lab in Tokyo, and the School of Electrical and Computer Science in Queensland. He is the author of over 170 publications and is known for his book “Behavior-based Robotics.”


Privacy as Contextual Integrity: Confronting the Great Regulatory Dodge
HELEN NISSENBAUM | OCTOBER 28, 2019

Professor Helen Nissenbaum gave a talk on “the great regulatory dodge,” the problem of online providers of health, educational or other services escaping careful oversight. She opened with an explanation of privacy regulation as it began and changed over time. She then sketched her theory of privacy as “contextual integrity.”

The story begins almost 50 years ago with a 1973 report, entitled “Records, Computers, and the Rights of Citizens,” written by an advisory committee on automated personal data systems and submitted to the Department of Health, Education, and Welfare. This report’s backdrop was a growing concern about individuals’ loss of privacy at the hands of large government agencies and private-sector entities with the power to gather, maintain and use individuals’ information. The report’s recommendations sought to balance the benefits of computerized record-keeping systems with the rights of individual data subjects. The report produced a Code of Fair Information Practices containing five basic principles: Secret databases should be prohibited; reuse of data for purposes other than those stated at the time of collection should be prohibited; data security must be adequate; data subjects must be allowed to inspect their records; and data subjects must be allowed to correct their records.

Nissenbaum said that the report’s concept of privacy was the individual’s right to control his or her information. Protecting privacy would be a function of procedural rules that would level the playing field between individuals and large information collectors. Today, the approach is known as “notice and choice” ‒ the user’s right to notice that data has been collected and stored, and the right to choose whether the data should be collected in the first place. Typically, an online provider recognizes “choice” by allowing users to opt in or opt out of the collection. If a user does not affirm a choice, the default rule is that a user has opted in. As for “notice,” Nissenbaum asked the audience to picture the long digital privacy notices provided by Google or Facebook. She pointed out that these notices are largely unreadable. They subject users to asymmetric knowledge and power, along with take-it-or-leave-it conditions. Under this system, large data companies can avoid honoring their own policies.

“WHAT HAPPENS WHEN EXPECTATIONS OF PRIVACY CHANGE? MILLENNIALS, FOR EXAMPLE, HAVE NO EXPECTATIONS OF PRIVACY.”

The “great regulatory dodge” involves this standard practice of “notice and choice” for regulating privacy for online apps, devices and platforms. In the U.S., we have a patchwork of privacy regulations such as HIPAA for the health sector and FERPA for the educational sector, but online platforms that provide comparable services to their real-world counterparts and collect similarly sensitive personal information claim to be exempt from these existing regulations and instead governed by the arguably corrupt “notice and choice” apparatus. For example, while an institution like Washington and Lee University must protect students’ educational information under FERPA, an online educational service such as Coursera, which collects intensive data on student educational activities, is regulated only by “notice and choice.” The service thereby “dodges” relevant existing privacy regulations and is largely unregulated in its collection and use of student information.


Nissenbaum’s answer to this state of affairs is her own framework: “contextual integrity.” This approach makes four claims. First, “privacy” is not a function of secrecy, containment or minimization of information; rather, privacy amounts to appropriate flows of information. Second, “appropriate” information flows are those that conform to contextual informational norms, based on the idea that social life contains different contexts and different applicable norms. Third, contextual informational norms stem from a number of independent parameters (data subject, sender, recipient, information type and transmission principle). Fourth, conceptions of privacy derive from ethical concerns that evolve over time.

Nissenbaum argued that we can “stymie the dodge” by applying her framework to online apps, devices and platforms. For example, by using “notice and choice” as their privacy framework, online educational services address users as students but treat them as if they are consumers. However, the norms of appropriate flows of information are notably different and value-laden depending on how we view the ends of education. If we view education as merely concerned with job training, for example, then we may be more comfortable with simply shipping off student information to third parties as records of their various achievements. But if we instead view education as a haven for learning, then such data-sharing practices undermine educational integrity and must be properly regulated.

One audience member asked, “What happens when expectations of privacy change? Millennials, for example, have no expectations of privacy.” Nissenbaum answered that she does not agree that millennials lack privacy expectations; rather, they have different expectations. Her recommendation from contextual integrity is to study millennial privacy expectations in earnest and regulate with a deeper understanding of consequences.
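For readers who think in code, the five parameters of a contextual informational norm map naturally onto a record type. The toy sketch below is ours, not Nissenbaum’s; the norm, the helper function and all the values are invented. It shows only how a single education-context norm could distinguish an appropriate flow of grades from the kind of flow the “dodge” permits.

```python
# Toy encoding of Nissenbaum's five parameters; the norm, helper and values
# are invented for illustration and imply no real implementation of the theory.

from typing import NamedTuple


class Flow(NamedTuple):
    data_subject: str
    sender: str
    recipient: str
    information_type: str
    transmission_principle: str


# One hypothetical norm for the education context: grades may flow from
# instructor to registrar, with notice to the student.
EDUCATION_NORMS = [
    Flow("student", "instructor", "registrar", "grades", "with notice"),
]


def conforms(flow: Flow, norms: list) -> bool:
    """A flow is appropriate iff it matches a norm on all five parameters."""
    return flow in norms


appropriate = Flow("student", "instructor", "registrar", "grades", "with notice")
dodge = Flow("student", "online course platform", "advertiser",
             "grades", "notice and choice")

assert conforms(appropriate, EDUCATION_NORMS)
assert not conforms(dodge, EDUCATION_NORMS)  # violates the education context's norms
```

In the theory itself, norms are rich social facts rather than lookup tables; the sketch’s only point is that “appropriateness” is evaluated against all five parameters at once, which “notice and choice” never does.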

Helen Nissenbaum is professor of information science at Cornell Tech and author or co-author of eight books. She previously directed the Information Law Institute at New York University. Earlier, she served as associate director of the Center for Human Values at Princeton. In 2019, she was named a distinguished fellow of the Stanford Institute for Human-Centered Artificial Intelligence. Her research takes an ethical perspective on policy, law, science and engineering, and she is known for the concept of privacy as “contextual integrity.”




The Shakedown State: Digital Debt, Economic Inequality and Automation in Public Services
VIRGINIA EUBANKS | NOVEMBER 14, 2019

How has algorithmically based technology affected social services in the United States? Professor Virginia Eubanks of the University at Albany, SUNY, author of a highly praised new book on that topic, gave a public lecture in November 2019. She forcefully described what happens to the poor when bureaucracies adopt impenetrable electronic systems that do little more than track and stigmatize those they are supposed to be helping.

Eubanks recalled for the audience the 19th-century establishment of “poorhouses,” institutions set up by state or local authorities to separate poor people from the rest of society. These prison-like houses were often overcrowded, unsanitary and unaccountable. Some had mortality rates as high as 30%. Ostensibly built for the benefit of “paupers,” they constituted “the nation’s primary method of regulating poverty,” allowing society to isolate the disadvantaged and avoid providing real financial assistance. These places “inspired terror among poor and working class people” across the country.

The thesis of Eubanks’s talk and her book, “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor,” was that the poorhouse is not obsolete but alive and well. Technology is having the effect of sentencing the poor to a “digital poorhouse” that might be harder to escape than the work farm of earlier times. Automated decision-making has created a new regime of surveillance and control over those in need.

Eubanks walked the audience through a trio of chilling case studies involving people stymied by technologies in their efforts to cope with poverty. One study looked at Indiana’s reliance on fallible automation to apply and enforce strict welfare eligibility rules. A second study looked at homelessness in Los Angeles, particularly the city’s use of automation to match homeless people with available housing by gathering extensive data about the homeless. Unfortunately, the matching system ignored a whole segment of the population and allowed police to have access to the data. A third study looked at Pennsylvania’s computerized approach to child protection services: The state used algorithms to predict which kids were at risk for neglect or abuse. However, since defining “neglect” can be highly subjective, the predictions were often wrong, with devastating impact on parents.

Eubanks asserted that these technology systems, billed as removing bias and error from social services administration, actually had disastrous results: They removed humanity, empathy and human accountability from those systems and allowed real need to go unaddressed. The solution? In Eubanks’s view, society’s entire understanding of poverty, its causes and solutions needs to change, as does the inclination of politicians and voters to outsource moral responsibility for the poor to algorithms and machines. Eubanks advocates political action and greater citizen involvement in political movements and campaigns. Eubanks also proposes a “Hippocratic oath” of sorts for data scientists, systems engineers and others creating or wielding technology affecting the poor. The oath amounts to an ethical pledge to “integrate systems for the needs of people, not data,” “to use my skills and resources to create bridges for human potential, not barriers,” and to refrain from designing any “data-based system that overturns an established legal right of the poor.”
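Eubanks’s point about the subjectivity of such predictions can be made concrete with a toy calculation. Nothing below reflects Pennsylvania’s actual model; the features, weights and threshold are all invented, purely to show that the value judgments live in the choice of features and weights, not in the arithmetic.

```python
# Invented numbers throughout: a toy risk score showing that subjective
# choices (which features count, and how much) drive who gets flagged.
# This is not Pennsylvania's model.

FAMILY = {
    "prior_welfare_contacts": 3,  # simply using public services raises this count
    "missed_appointments": 1,
    "reported_injuries": 0,
}

# Two weightings that encode different definitions of "neglect":
# A treats reliance on public assistance as a risk factor; B does not.
WEIGHTS_A = {"prior_welfare_contacts": 2.0, "missed_appointments": 1.0, "reported_injuries": 5.0}
WEIGHTS_B = {"prior_welfare_contacts": 0.0, "missed_appointments": 1.0, "reported_injuries": 5.0}

THRESHOLD = 5.0


def risk_score(family, weights):
    """Weighted sum of the family's recorded features."""
    return sum(weights[feature] * value for feature, value in family.items())


print(risk_score(FAMILY, WEIGHTS_A) >= THRESHOLD)  # True: the family is flagged
print(risk_score(FAMILY, WEIGHTS_B) >= THRESHOLD)  # False: the same family is not
```

Under one weighting the family is flagged for investigation; under the other it is not, even though the underlying facts are identical. Whoever chooses the weights has, in effect, defined “neglect.”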

Virginia Eubanks is associate professor of political science at the University at Albany, SUNY. She obtained her Ph.D. in science and technology studies at Rensselaer Polytechnic Institute. Eubanks came to her research on technology, poverty and women’s citizenship through a history of activism in community media and technology center movements. A founding member of the Our Data Bodies Project, she was a 2016-2017 fellow at New America.



World Without Mind
FRANKLIN FOER | JANUARY 30, 2020

The faculty fellows of the Mudd Center have a custom of preparing for a speaker’s visit by gathering in advance and discussing the speaker’s work. As journalist Franklin Foer’s visit approached, the fellows read and talked about “World Without Mind,” Foer’s stirring critique of the “big tech” companies. The book’s subtitle captures its theme: “The Existential Threat of Big Tech.” Foer wants Americans to wake up to “the inseparable perils” posed by corporate knowledge merchants: monopoly and conformism. Foer defines monopoly as “the danger that a powerful firm will use its dominance to squash the diversity of competition,” and conformism as “the danger that one of these monopolistic firms, intentionally or inadvertently, will use its dominance to squash diversity of opinion and taste.” In Foer’s view, America has abandoned the perspective of the humanities (“the human path”) and is allowing itself to be dazzled and controlled in diverse ways by what Europeans call GAFA: Google, Apple, Facebook and Amazon.

Engaging informally in Mattingly House with law students and undergraduates a few hours before his public lecture, Foer called himself a “skeptic of technology” who nevertheless appreciates its benefits. In that setting as well as in his lecture, Foer explained his position that big tech companies have become the “most powerful gatekeepers the world has ever known,” sorting the news we read, creating hierarchies of information and using market power to influence our judgments. Reform is long overdue, he argued, including antitrust regulation.

Foer told his Stackhouse Theater audience that one impediment to reform is the fact that modern America has yet to grasp the tech industry and its threats – that the sheer breadth of what these companies do and sell makes it difficult to comprehend the scope of their influence. He reiterated his belief that consumers are beginning to “merge” with their machines, be it a smart watch constantly attached to the wrist or an Alexa-enabled smart home that knows when users make their first cup of coffee in the morning and when they pull a final snack out of the fridge at night. “We’ve all become a bit cyborg,” according to Foer. In his view, “we’re not just merging with machines, but with the companies that run the machines.”

Questions from the audience pursued the need for solutions. The faculty fellows had discussed Senator Mark Warner’s white paper, “Potential Policy Proposals for Regulation of Social Media and Technology Firms.” They had also discussed proposals from academia, such as Professor Jack Balkin’s call in 2016 for ethical and legal treatment of big tech firms as “information fiduciaries.” This idea would apply to “many online service providers and cloud companies who collect, analyze, use, sell and distribute personal information.” As fiduciaries, these entities would have “special duties to act in ways that do not harm the interests of the people” whose information they use. Foer himself called for creation of a data protection agency that would enable citizens to “own their own data,” and would have authority to impose “constraints” on tech companies about “what can be collected and what can be exploited.”

At a dinner hosted by President Dudley in Lee House following the lecture, Foer and faculty members discussed a number of related topics, including the impact of technology on the current political scene. Foer’s themes and proposals reverberated after his visit, with students and faculty considering in greater depth the ethics of technology.


Franklin Foer is national correspondent for The Atlantic. A 1996 graduate of Columbia University, he began his journalism career at Slate, then owned by Microsoft. He was editor of The New Republic magazine from 2006 to 2010 and again from 2012 to 2014. His popular book “How Soccer Explains the World” has made its way onto syllabi in philosophy courses. In a 2017 interview, Foer said, “Questions about technology are fundamentally spiritual questions.”


The New Appendage: Cellphones in Cognitive and Behavioral Context
KARLA MURDOCK & WYTHE WHITING | FEBRUARY 6, 2020

Arriving at Stackhouse Theater on Feb. 6 for the next Roger Mudd Center event on “the ethics of technology,” attendees encountered a curious sign at the door: “Welcome! We would like to offer you the opportunity to stow your phone during today’s lecture. If you are interested in this, we will show you where to leave your phone and give you a number so that it can be easily located after the lecture.” Almost all attendees were willing to part with their phones for an hour, although more than a few expressed discomfort (whether real or feigned) at parting with that all-purpose computer for even a few minutes.

The presenters were two of Washington and Lee’s best-known faculty members, Professor Karla Murdock and Professor Wythe Whiting. Murdock is a former chair of the Department of Cognitive and Behavioral Science (formerly the Department of Psychology), and Whiting is the current chair. Their topic had immediate appeal to the campus community: the psychological effects of cellphone use and its ethical implications. Or, as Murdock and Whiting named their lecture, “The New Appendage: Cellphones in Cognitive and Behavioral Context.”

Murdock opened the talk for the pair with a quote from the U.S. Supreme Court in Riley v. California: “Modern cell phones… are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.” Murdock introduced a number of facts from Pew Research on rates of cellphone usage by emerging adults and their parents, culminating in the statistic that increased screen usage by children and teens increases parental worry by 65% and, according to Murdock, sends parents into an “anxiety vortex.” Her talk focused on the population known as the “internet generation,” or “iGen,” defined as those born between 1995 and 2012. U.S. cellphone saturation reached 50% in 2012, accompanied by dramatic increases in psychological distress for the iGen. It appears that cellphones affect psychological functioning through three mechanisms: compromised sleep, displacement of alternative activities and social comparison. Murdock underscored the third: teens’ cellphone use as a means of comparing their lives to whatever their peers are doing.

“MODERN CELL PHONES… ARE NOW SUCH A PERVASIVE AND INSISTENT PART OF DAILY LIFE THAT THE PROVERBIAL VISITOR FROM MARS MIGHT CONCLUDE THEY WERE AN IMPORTANT FEATURE OF HUMAN ANATOMY.”

Whiting referenced his work on cellphone distraction in the contexts of driving and classroom learning. The costs are heaviest when the mind has to reload thoughts onto the task at hand. This is particularly dangerous when driving, where driver reaction time is 146% slower after checking a text (roughly two and a half times the baseline reaction time). One audience member asked, “You have already mentioned the possible roles of parents and policymakers, but what about the ethical duties of Instagram, Facebook or iPhone?” Both speakers answered that those corporations have important responsibilities. Instagram, for example, has considered removing “visible likes” so that users can see their own likes but not the likes of other users. The big tech firms should be more involved in acknowledging and addressing these issues. As the audience filed out, no phones were left behind.


Karla Klein Murdock is professor of cognitive and behavioral science at W&L. She earned a Ph.D. in clinical psychology from the University of Georgia and completed a clinical internship at Western Psychiatric Institute and Clinic (University of Pittsburgh Medical Center). Her research is guided by a developmental psychopathology theoretical framework emphasizing interactions between risk and protective factors in the emergence of psychological symptoms and strengths. Her recent studies have investigated links between cellphone use and indicators of health, well-being and cognitive performance.

Wythe Whiting is a professor of cognitive and behavioral science at W&L. He obtained his Ph.D. in cognitive psychology at the Georgia Institute of Technology before completing a postdoctoral internship at Duke University in cognitive neuroscience. He joined the faculty at W&L in 2003. His work has focused on the application of cognitive science to a variety of areas in the field of psychology and behavior. His recent work has focused on the physiological and behavioral consequences of a digital environment on our ability to sustain attention.


2019-2020 Events

Course on Ethics of Technology

Aligning with the Mudd Center’s annual theme on the ethics of technology, Mudd Postdoctoral Fellow Jeremy Weissman taught a seminar entitled Ethics and Emerging Technologies during the fall, winter and spring terms. In this seminar, students took a critical look at a number of cutting-edge technologies that are still largely on the horizon, attempting to decipher the ethical issues they present and how such problems might be mitigated. Emerging technologies they critically analyzed included artificial intelligence, human enhancement technologies, brain-computer interfaces, augmented reality, virtual reality, facial recognition and other surveillance technologies, synthetic biology, nanotechnology, self-driving cars and military robots. In many cases, Weissman was able to line up course units to coincide with visiting speakers in the Mudd Center’s lecture series, including Josephine Johnston, Ronald Arkin, Helen Nissenbaum and Virginia Eubanks. This gave students an invaluable opportunity to deepen their understanding of the course material and, in a couple of cases, to meet in a small-group setting with the very authors they had read in class.

~ Jeremy Weissman

The Ethics of Technology 2019–2020
A Year-long Series on Multiple Issues Posed by Technological Change
Mudd Center for Ethics

The Good Parent in an Age of Gene Editing: How Novel Genetic Technologies Challenge Parental Responsibility
Josephine Johnston, Director of Research and Bioethicist at The Hastings Center; Scholar of Ethical, Legal, and Policy Implications of Biomedical Technologies; and author of “Human Flourishing in an Age of Gene Editing”
Thursday, Sept. 26, 2019, 5–6:15 p.m. • Stackhouse Theater, Elrod Commons • Open to the Public • Book signing following lecture

An Ethical Framework for a God-like Intellect
Jeffrey G. Smith Jr., Professor of Cyberspace Studies, Virginia Military Institute; Retired Brigadier General; Deputy Commanding General of the Army’s Cyber Command; Scholar of Origins, Evolution and Ethical Implications of Today’s Technological Environment
Thursday, Oct. 3, 2019, 5–6:15 p.m. • Hillel House Multipurpose Room • Open to the Public

Robots That Need to Mislead: Biologically-Inspired Machine Deception
Ronald Arkin, Regents Professor, Georgia Tech; Founder, Mobile Robot Laboratory; Path-breaking scholar of robot navigation, artificial intelligence and robo-ethics; author of “Behavior-based Robotics”
Monday, October 21, 2019, 5 p.m. • Stackhouse Theater • Open to the Public


Privacy as Contextual Integrity: Thwarting the Great Regulatory Dodge
Helen Nissenbaum, Acclaimed Theorist of Privacy; Prize-winning Author; Professor of Information Science, Cornell Tech
Monday, October 28, 2019, 5–6:15 p.m. • Stackhouse Theater, Elrod Commons • Open to the Public

The Shakedown State: Digital Debt, Economic Inequality, and Automation in Public Services
Virginia Eubanks, Associate Professor of Political Science & Women’s Studies, University at Albany, SUNY; Co-founder of a grass-roots antipoverty and welfare rights organization; Scholar of Technology and Social Justice; Author of “Automating Inequality: How High-Tech Tools Profile, Police & Punish the Poor”
Thursday, November 14, 2019, 5 p.m. • Stackhouse Theater, Elrod Commons • Open to the Public

Mudd Center Movie Night: Steven Spielberg’s “A.I.”
Wednesday, January 15, 2020, 7 p.m. • Stackhouse Theater • Discussion to Follow • Open to the Public

World Without Mind
Franklin Foer, National Correspondent for The Atlantic; acclaimed critic of big-tech impact on privacy, intellectual property, and contemporary values; author of “World Without Mind,” named one of the year’s best by the N.Y. Times, L.A. Times, and NPR
Thursday, January 30, 2020, 5 p.m. • Stackhouse Theater, Elrod Commons • Open to the Public

The New Appendage: Cellphones in Cognitive and Behavioral Context
Karla Murdock, Professor of Cognitive and Behavioral Science, and Wythe Whiting, Professor of Cognitive and Behavioral Science, Department Head
Thursday, February 6, 2020, 5 p.m. • Stackhouse Theater, Elrod Commons • Open to the Public

Due to Covid-19, some events were canceled: a conference on cybersecurity and civil rights; Anne Washburn, playwright; and Tracy K. Smith, U.S. Poet Laureate 2017-19.


2020 Undergraduate Ethics Conference

About This Conference: The Conference of The Mudd Journal of Ethics is a national undergraduate research conference devoted solely to the study of ethics. Sponsored by both the Roger Mudd Center for Ethics and the Phi Sigma Tau Honor Society in Philosophy, this conference was created with the goal of recognizing, studying and discussing exceptional academic work in ethics produced at the undergraduate level. This fifth annual conference features students from three colleges and universities in two states. Each paper presented will be published in the fifth edition of The Mudd Journal of Ethics this spring.

DAY 1: SATURDAY, MARCH 7 • 2-6 P.M. • HILLEL MULTIPURPOSE ROOM

2-2:40 p.m. “From GHG to GCG: The Individual Moral Obligation to Act on Climate Change,” by Ilana Cohen, Harvard University
2:40-2:50 p.m. Break
2:50-3:30 p.m. “Meritocracy and its Effect on Distributive Justice,” by Charlotte Radcliffe, Washington and Lee University
3:30-3:40 p.m. Break
3:40-4:20 p.m. “Shall We Not Revenge?,” by Jake Beardsley, College of William and Mary
4:20-4:30 p.m. Break
4:30-6:00 p.m. Keynote Address: “Adopt a Theory, Obey a Code, Follow your Gut, or Ask the Right Questions? A ‘Decision Science’ Approach to Ethical Decision Making,” by Dr. Bill Hawk, Professor of Philosophy at James Madison University

DAY 2: SUNDAY, MARCH 8 • 9-10:30 A.M. • HILLEL MULTIPURPOSE ROOM

9-9:40 a.m. “A Duty to Live: Kant on Suicide,” by Woojin Lim, Harvard University
9:40-9:50 a.m. Break
9:50-10:30 a.m. “Stoic Eudaimonia: Can Mental Health Rightfully be Considered an ‘Indifferent’?,” by Brooklyne Oliveira, Washington and Lee University


Keynote Speaker Bill Hawk joined JMU as head of the Department of Philosophy and Religion in 2001. He also served as the General Education program coordinator of Cluster Two: Arts and Humanities shortly before being named chair of the Madison Collaborative in 2013. Hawk continues to enjoy teaching and was one of the first faculty members to integrate the “Eight Key Questions” into his courses. Before coming to JMU, he served for three years as a vice president and academic dean at Eastern Mennonite University, where he was also a professor of philosophy. As chair, Hawk works directly with faculty, staff, students and administrators to build the conceptual and practical framework for ethical reasoning at JMU and beyond.

The Mudd Journal of Ethics

Editors:
Parker Robertson ’20 (Editor-in-Chief)
Clare Perry ’21 (Assistant Editor)
Stanton Geyer ’20 (Editor)
Sierra Terrana ’20 (Editor)
Chad Thomas ’21 (Editor)
Kushali Kumar ’22 (Editor)
Max Gebauer ’22 (Editor)
Tyler Bernard ’23 (Editor)
Anna Hurst ’23 (Editor)
Ben Hess ’23 (Editor)



The Ethics of Identity 2018–2019: A Year-Long Series on Multiple Understandings of Self and Society

KWAME ANTHONY APPIAH, Professor of Philosophy and Law at NYU, Keynote Speaker “THE ETHICS OF IDENTITY: THE INJURIES OF CLASS” | SEPT. 27, 2018

ANITA FOEMAN, Professor of Communication Studies at West Chester University of Pennsylvania “DNA AND IDENTITY: CHANGING THE CONVERSATION ABOUT WHO WE ARE” | OCT. 18, 2018

PAULA VOGEL, Acclaimed Playwright “THE ART OF TOLERANCE” | OCT. 30, 2018

REBECCA JORDAN-YOUNG, Sociomedical Scientist and Chair of the Department of Women’s, Gender and Sexuality Studies, Barnard College “THINKING BIOCULTURALLY ABOUT IDENTITY AND ETHICS” | NOV. 29, 2018

JOY HARJO, Award-Winning Poet, Writer, Musician and Member of the Mvskoke/Creek Nation “EXILE IN MEMORY” | FEB. 11, 2019

DAVID LUBAN, Professor of Law and Philosophy, Georgetown University Law Center “THE ETHICS OF PROFESSIONAL IDENTITIES IN LAW AND WAR” | FEB. 28, 2019

JONATHAN LEAR, John U. Nef Distinguished Service Professor at the Committee on Social Thought and in the Department of Philosophy at the University of Chicago “WHAT WOULD IT BE TO MOURN GETTYSBURG?” | MAR. 14, 2019

RALPH CALDRONEY, M.D., Retired Local Physician, Hospice “PROFESSIONAL IDENTITY IN HEALTHCARE” | APR. 26, 2019

THE HONORABLE MARY GRACE O’BRIEN, Judge, Court of Appeals of Virginia “JUDICIAL IDENTITY” | SEPT. 12, 2019



List of Speakers for 2020-2021:

*Keynote, Sept. 24, 2020: The Hon. Reuben E. Brigety, Vice-Chancellor and President of Sewanee: The University of the South, former Dean of the Elliott School of International Affairs, George Washington University, and former U.S. Ambassador to the African Union. Talk Title: “Black Lives Matter - An International Moment.”

*Oct. 15, 2020: Dr. Anne-Marie Slaughter, CEO, New America. Director, International Legal Studies Program, Harvard University (1994-2002). Author, The Chessboard and the Web: Strategies of Connection in a Networked World (2017). Talk Title: “Renewing the Promise of America: Looking Back to Move Forward.”

*Oct. 29, 2020: Elizabeth Kolbert, Staff Writer, The New Yorker. Author, The Sixth Extinction: An Unnatural History (2014), winner of the Pulitzer Prize. Talk Title: “Climate Change and Its Impact on the World Order” (moderated conversation).

Jan. 28, 2021: Dr. Jonathan Wortham, Centers for Disease Control and Prevention, Atlanta, GA. Dr. Wortham is a W&L alumnus. Talk Title: “Ethical Problems in Public Health Practice.”

Feb. 2, 2021: Professor Erin Taylor, Department of Philosophy, Washington and Lee, and Dr. Ralph Caldroney, Physician. Talk Title: “Ethical Issues in the Context of Covid-19.”

Feb. 15, 2021: Professor Valerie Hudson, University Distinguished Professor and George H.W. Bush Chair & Professor of International Affairs, Texas A&M. Author, The First Political Order (Columbia 2020). Talk Title: “The First Political Order: How Sex Shapes Governance and National Security Worldwide.”

Mar. 15, 2021: Professor Felix Kwame Yeboah, Michigan State University. Principal author, Youth for Growth: Transforming Economies Through Agriculture, a report sponsored by the Chicago Council on Global Affairs (2018). John M. Gunn Exchange Scholar, W&L. Talk Title: “Africa’s Youth and Agrifood System: Pathways for Job Creation and Economic Transformation.”

Mar. 25, 2021: Professor Katrina Forrester, Dept. of Government and Social Studies, Harvard University. Author, In the Shadow of Justice: Postwar Liberalism and the Remaking of Political Philosophy (Princeton 2019). Talk Title: “Feminist Internationalism Revisited.”

(Pending) Apr. 5, 2021: Lauren Yee, playwright. Author, The Great Leap; Cambodian Rock Band; The Song of Summer. Talk Title: TBA.

(Pending) May 4-5, 2021: Professor Jason De León, UCLA. Director, Undocumented Migration Project; organizer of Hostile Terrain 94 (global participatory exhibition). Author, The Land of Open Graves: Living and Dying on the Sonoran Migrant Trail (Cal. 2015). Talk Title: TBA.

*These speaking engagements will be virtual.


Mudd Center for Ethics

