
Filmmaker alleges sexual assault by former rowing coach, criticizes his legacy at Penn

Fox calls on Penn to remove Nash’s name from the Ted A. Nash Land Rowing Center

HALEY SON Staff Reporter

A filmmaker who says she was sexually assaulted by a Penn rowing coach is calling on the University to remove his name from a center dedicated in his honor.

The filmmaker, Jennifer Fox, alleges that she was sexually abused by Ted Nash — who died in 2021 — when she was 13 years old. Fox gave Nash a pseudonym in her Emmy-nominated film detailing the incident, "The Tale," but publicly named Nash as her abuser for the first time on March 20 in a New York Times article.

A legend in the rowing community, Nash coached both women’s and men’s rowing at Penn from 1965 to 1983 after winning two Olympic medals. In 2014, Penn honored him with the dedication of its indoor rowing center, the Coach Ted A. Nash Land Rowing Center. Fox and several Penn community members told The Daily Pennsylvanian that they want Penn to change the name of the center to address his legacy.

“My goal is to have Nash’s name taken off of everything, not just at Penn, but everywhere,” Fox said.

Fox added that she hopes Penn community members will stand up against Nash’s name continuing to be memorialized on campus, saying that Penn’s reaction could send a larger message.

“I think hurting his legacy is a big blow to Ted Nash and all the abusers out there,” she said.


Former Wharton vice dean resigns as Temple University president

Former Vice Dean of Executive Education at the Wharton School and 2000 GSE Ph.D. graduate Jason Wingard resigned as Temple University president on March 28.

Mitchell Morgan, chair of the board of trustees, announced the resignation to the Temple community, stating that Wingard’s resignation will become effective on March 31.

Wingard’s resignation comes after a series of recent events sparked outrage from members of the Temple community, including a Temple University Graduate Students’ Association strike that lasted over a month and the shooting death of an on-duty police sergeant in February.

“Given the urgent matters now facing the University, particularly campus safety, the Board and the administration will ensure the highest level of focus on these serious issues,” the statement said.

“We understand that a concerted and sustained effort must be undertaken as we attempt to solve these problems.”

On March 21, the faculty union planned to hold a vote of no confidence in Wingard, as well as in Morgan and Provost Gregory Mandel.

The resignation follows the formation of a Special Committee of the Board of Trustees on March 23, created “to apply more rigorous attention to the urgent matters facing the University,” according to a message sent to Temple community members.

Wingard served as the senior director at the Wharton School from 1999 to 2004 before serving as the vice dean of executive education at Wharton from 2010 to 2013.

Prior to his role at Temple, he served as dean emeritus and professor of human capital management at Columbia University's School of Professional Studies. Wingard was appointed the 12th president of the university in 2021, making him Temple’s first Black president. The announcement states that Temple's board will appoint a small group of senior administrators to temporarily lead the university.

“This group will have many years of experience at Temple and devotion to its mission,” the statement said. “Each will have discrete responsibilities for the university’s essential functions and provide a stable foundation for us as we look toward the search for our next president.”

In the absence of a University-wide policy for ChatGPT, Penn professors are creating a patchwork of approaches regarding the use of artificial intelligence in their classes.

The viral chatbot, which OpenAI launched in November, can generate human-sounding responses to a multitude of prompts and is proficient at writing, synthesizing text, and coding. Its abilities rival those of Penn students: A Wharton professor recently found that ChatGPT would pass a Wharton MBA exam, and the newly released GPT-4 scored in the 93rd percentile on a simulated SAT exam.

Unlike some of its peer institutions, Penn has not published a dedicated policy governing students’ use of artificial intelligence. Six Penn professors and administrators spoke with The Daily Pennsylvanian about how they are handling the use of ChatGPT, from banning it to mandating it.

Community Standards and Accountability

Danielle Crowl, associate director of Penn’s Center for Community Standards and Accountability, wrote that students who use ChatGPT without explicit permission from professors are considered to have violated University policies.

Crowl wrote that the handling of ChatGPT use depends “on the facts of the allegation” and cited sections A and B of the Code of Academic Integrity: Cheating and Plagiarism.

“Fortunately, ChatGPT as it stands now is not good at citation and therefore the CSA has been able to detect plagiarism violations when students are using this AI,” Crowl wrote.

Still, some professors have taken additional steps to prevent students from leaning too heavily on ChatGPT to write code or assignments. Stephen Pettigrew, the director of data science at Penn’s Program on Opinion Research and Election Studies, wrote in his PSCI 3800: “Applied Data Science” syllabus that, while using online resources can be helpful for solving problems, taking code directly from online sources is unacceptable.

“As somebody who has taught this stuff for a while, I get a good sense of what are the common mistakes that students make,” Pettigrew said. “And if a student were to turn in ChatGPT-written code, and there’s weird mistakes in it, it’s probably going to set off alarm bells in my head.”

Critical Writing Program Director Matthew Osborn said that the program was allowing students to “explore and experiment” with ChatGPT, with a caveat — “as long as they’re aware of and understand what appear to be some substantial limitations and some inaccuracies in the content that they generate.”

“You should note that foundation models have a tendency to hallucinate, make up incorrect facts and fake citations, and produce inaccurate outputs,” a draft policy for the Critical Writing Program reads. “You will be responsible for any inaccurate, biased, offensive, or otherwise unethical content you submit regardless of whether it originally comes from you or from one of these foundation models.”

The policy also emphasizes the need for students to appropriately cite “foundation models” like ChatGPT.

Wharton professor Ethan Mollick is taking a bolder approach. He supports the use of ChatGPT in his classes, mandating that students use it for several assignments. Mollick said that ChatGPT and similar AI-based systems are useful tools that can improve education.

“We have to recognize and learn how to use these tools and not fight against them,” Mollick said. “I also think that we can accomplish things educationally that we could not accomplish before by using tools in this way.”

Mollick said that his embrace of ChatGPT cannot be copied by professors in all other subjects. In English composition courses, he said, professors will want to assign blue book tests or have students use computers that are disconnected from the internet.

Mollick is not alone in his approach. Design professor Sebastien Derenoncourt also requires his students to use ChatGPT. Derenoncourt said that he assigned his students to “specifically” use AI tools in their midterm work, including a text generator to help them with their papers.

“[M]y emphasis has been that ChatGPT and similar tools are there to help them expand and outline their abilities,” Derenoncourt said.

Pettigrew, Osborn, Mollick, and Derenoncourt’s approaches exemplify what Bruce Lenthall, executive director of the Center for Teaching and Learning, described as a wide variety of ChatGPT policies across the University. However, he said that student ChatGPT usage is limited by flaws in its outputs, as users have to know the subject matter “pretty well” to identify mistakes.

“[T]here’s a real risk associated with it — even if you’re not going to be caught — if you say things that are nonsense,” Lenthall said. “If you’re a student, you should absolutely know that you are using it at your own risk.”

Beyond ChatGPT’s current abilities, professors raised questions about the chatbot’s future in education, ranging from its increasing abilities to concerns about equity. Pettigrew said that he will need to change his assignments to prevent students from easily cheating on them using AI.

“Part of the thing that ChatGPT is going to force, at least for me, is being more diligent about keeping my materials fresh and new, so that a chatbot cannot regurgitate an answer from online back to me,” Pettigrew said.

Osborn said that he is confident the Critical Writing Program’s current curriculum would continue to be as helpful for students in the future, albeit with minor changes. By contrast, Lenthall said that priorities in education may need to be reevaluated as AI improves and exceeds human capabilities.

“If ChatGPT gets to the point where it can write a better paper than you can, what’s the point of you learning that skill?” Lenthall asked. “[I]f we don’t have an answer for that, then the fact that students can cheat is irrelevant, as getting the degree won’t be worth anything. Because if ChatGPT is better at it than a college student, then we don’t need that college student.”

Pettigrew said that the onus is on students to “take their own learning into their own hands” as AI potentially makes it easier to cut corners on assignments.

“It’s important that human beings know how to do certain things, because ultimately, ChatGPT is never going to cure cancer, and never come up with that creative solution to a problem that human beings have worked on for a long time,” Pettigrew said.

Lenthall said that instructors need to implement barriers to cheating to prevent students from feeling pressure to do so from other students, because students are most likely to cheat when they see everyone else doing so. He also said that ChatGPT will likely be monetized in the future, potentially creating an “in-class divide” between those who can afford it and those who cannot.

As AI advances at a rapid pace, Derenoncourt said that ChatGPT’s global proliferation will change his baseline assumptions about his students. He said that AI could increase equity, noting that most of his students are non-native English speakers who benefit from a tool that lets them express themselves in grammatically correct ways.

“I’m going to have higher expectations,” Derenoncourt said. “I’m going to be more willing to tell students that they can do better than this, because they have the tools to be able to do this work, [and] I’ll also be able to without
