
Depictions of drug addiction ignore systemic causes

Shows like “Euphoria” misrepresent addiction as a personal flaw.

Julie Ha Opinions Columnist


One of my responsibilities as an older sister is the drug talk. While I’ve taken it upon myself to pass down a voice of reason, there is a noticeable discrepancy between the drug education I received and my personal reality, shaped both by being part of Gen Z and by having attended an elitist high school rife with academic pressure, that hinders my ability to articulate the reality of drug use and addiction.

Growing up, a walk across 125th St. in East Harlem, an area historically saturated with opioids, and sights of locals half-asleep on the sidewalks would be accompanied by harmful rhetoric, such as, “This is what happens when you do drugs,” or “This is why you shouldn’t do drugs.” Not to mention, candid and unwarranted photos depicting Lindsay Lohan’s “rise and fall” were consistently plastered on the front pages of magazines I skimmed through in supermarkets. The truth is, though, I knew more drug addicts in high school than I could count, as all sorts of substances served as a means to grapple with academic pressure. Although the problem at hand extended beyond study-drug abuse, my peers’ seemingly successful social and academic lives, and most importantly their appearances, hardly measured up to the warnings issued by both my parents and the tabloids.

In light of this discrepancy, “Euphoria,” a show illustrating the inner lives of several teens, has drawn criticism for its unrealistic portrayal and romanticization of drug use. Specifically, critics fear that the show’s depictions of drug use will promote unhealthy coping mechanisms among an impressionable audience. However, while critics claim both that clips from “Euphoria” falsely portray addiction as glamorous and that their intention is to end addiction, the “harsh reality” they favor instead (images of half-asleep addicts and celebrities after benders) is neither realistic nor productive to that goal.

For one, images of the “harsh reality” frame addiction as a personal flaw — a deficit in the moral character and willpower of users — when empirically that is not the case. The prescription opioid epidemic is a prominent example. It stemmed from a mid-1980s study that shifted cultural attitudes about opioids to consider them a low-risk, non-addictive cure for pain and about pain itself as a condition that necessitated treatment.

Marketing of prescription opioids as a “miracle drug” continued well into the 21st century, as doctors in West Virginia and Kentucky, specifically, became the targets of sales pitches for newer, supposedly less addictive opioid products. As of now, one in four participants in long-term prescription opioid therapy has become addicted.

If the epidemic has taught us one thing, it’s that despite the claims of for-profit studies, opioids are indeed addictive. Addiction often operates outside the scope of personal willpower, taking root in patients who most likely had no intention of participating in the epidemic. Now, doctors prescribing opioids ask patients about their family and personal history with addiction, sexual abuse and psychological issues, all of which research shows can heighten one’s susceptibility to addiction.

Here’s where “Euphoria” gets addiction right: the show frames addiction within narratives about the mental health crisis. In other words, personal issues are at fault only insofar as they underscore broader systemic issues that have also empirically fueled drug addiction in certain communities. The crack epidemic of the 1980s and 1990s, for example, which unfolded amid the war on drugs and primarily affected predominantly Black communities in urban areas, was cyclically fueled by systemic issues, such as racism and mass incarceration, which failed to get crack off the streets, as well as by a lack of holistic treatments, such as mental health and addiction counseling.

The intention behind amplifying media representations of addiction with “harsher” depictions, and dubbing them reality in order to end addiction, may be noble, but these images diverge from the evidence and history of drug addiction as systemic by placing the burden of sheer willpower onto those who are struggling. While the “harsh reality” may ease anxiety among “Euphoria” critics, it is counterproductive both to efforts to destigmatize and humanize addiction and, as the war on drugs has shown, to centering treatment in addiction discourse.

Simply put, “Euphoria’s” casual images of drug use and the tabloids’ exploitative photographs of Lindsay Lohan are two sides of the same coin, and we cannot replace one form of Hollywood misrepresentation with another distorted, subliminal one, as representation and education are inextricably linked within addiction discourse. While the media must do its part by neither sensationalizing nor dehumanizing addiction, we must also do ours by recognizing our own prejudices within drug education.

Julie Ha is a sophomore double-majoring in English and comparative literature.

Educators should work with ChatGPT

Teachers can use ChatGPT to teach about technology and develop curriculum.

Samantha Rigante Opinions Columnist

The development of ChatGPT, an artificial intelligence (AI) chat system created by OpenAI researchers, has been one of the most widely discussed topics in recent news and media. Released in November 2022, ChatGPT is able to mimic many different types of writing, producing text that is, in many cases, nearly impossible to distinguish from writing completed by students or other humans. Its purported use, according to OpenAI, is to interact with users in a “dialogue format,” which “makes it possible for ChatGPT to answer follow-up questions, admit its mistakes, challenge incorrect premises and reject inappropriate requests.”

The problem is that, while ChatGPT is improving at scholarly tasks and writing (passing law exams at the University of Minnesota and business exams at the University of Pennsylvania’s Wharton School of Business), it poses a significant risk to the legitimacy and originality of the assignments students complete. Students’ ability to use ChatGPT to cheat on both multiple-choice and written exams has driven concern among professors and teachers throughout all levels of the educational system, forcing educators to decide whether to begin working with ChatGPT or attempt to work around it.

So, what solutions are possible to this distinctly 21st-century issue of AI within the education system? Despite the draconian responses some have proposed, such as New York City’s ban on ChatGPT on school computers, the answer may lie elsewhere. Professors should rethink their assessments and perhaps work in tandem with new developments in AI, as these tools will only become more prevalent. For example, Cherie Shields, a high school English teacher in Oregon, allowed students to use ChatGPT to help them produce essay outlines of different texts. Afterward, she had them write their essays by hand. To make sure that students are not plagiarizing, teachers can use GPTZero, a tool that identifies writing produced by AI programs, including ChatGPT.

By combining classic education techniques, such as requiring handwritten essays, with available modern technology, educators may be able to address this issue. This method teaches students effective ways to use the technology available to them in a supervised setting while also allowing them the opportunity to draw inspiration and write their own essays creatively afterward. As Shields says, “The process … had not only deepened students’ understanding of the stories. It had also taught them about interacting with A.I. models, and how to coax a helpful response out of one.”

Teachers can also use ChatGPT to their advantage. Since COVID-19, there has been a widely reported shortage of professionals within the academic field, and many are tired and overworked. Teachers and professors can treat ChatGPT as a subject of study in classrooms and use it to generate creative lesson plans. Simply inputting “Create an eighth grade lesson plan about the Civil War” into ChatGPT produces an interesting, detailed lesson plan highlighting the most important aspects and lessons of the Civil War. ChatGPT could also help create personalized learning tools for teachers to give students, such as study guides or quizzes, offering individualized learning while easing teachers’ workload on assignments and activities.

ChatGPT could also be used effectively by having students assess its responses and identify their flaws. Educating students on how to use systems like ChatGPT, and on their potential dangers, teaches them that such tools are not always reliable. Especially since many professions may begin to implement various AI programs, educating students on both the harm and the usefulness of these systems may prove valuable once they enter the workforce. Jobs in many fields, including data analytics, software engineering and even sales, are beginning to use AI programs as a way to maximize profits.

The initial reaction of some may be to outright ban software like ChatGPT in academic settings or severely limit students’ access to it. However, such decisions appear rash and ultimately unhelpful. Banning the software within schools would be ineffective, as students could still use it outside of school or university premises. Additionally, intensively monitoring students to ensure they are not cheating, for example, by requiring them to write papers only in class or to explain every edit they make, as some professors are doing, is an overstep of instructors’ authority in academic settings and an inefficient use of time that could be spent learning.

Learning how to engage with and operate new technology is never an easy feat, and the simplest response to the overwhelming issue of AI within education may seem to be rewriting academic honesty codes and banning the use of any such systems. However, AI and programs such as ChatGPT are here to stay, and it would be to professors’, students’ and teachers’ best advantage to learn how to work with them instead of against them.

Samantha Rigante is a freshman majoring in philosophy, politics and law.
