PRESENTER: Francis Gregg, Texas A&M University
Artificial Intelligence and Therapy
I. Introduction

Suicide deaths have risen "25 percent since 1999 across most ethnic and age groups" (Routledge 2018), which points to an obvious crisis in our modern society. According to Dr. Clay Routledge, a professor of psychology at North Dakota State University, this rise may stem in part from "a desperate search, common to all lost souls, to find meaning" (Routledge 2018). One attempt to decrease suicide rates is the use of AI chatbots as replacements for human therapists. Chatbots, by design, are woefully unable to treat people suffering from existential depression, the despair that results from an unfulfilled human yearning for meaning. In fact, a person who suffers from this form of depression and turns to chatbots as a therapeutic medium may actually be pushed deeper into despair. One framework that better explains both the cause of and the remedy for this issue is Logotherapy, a theory and practice that must be administered by a human mental health professional. My critique of chatbots as an instrument for helping humans overcome a lack of meaning will shed light on some of the strengths and pitfalls of AI, the distinctiveness of humanity's yearning for meaning, and how the two can symbiotically coexist.

II. Strengths and Weaknesses of AI

The computer scientist John McCarthy defined AI as "machines that are capable of performing tasks that we define as requiring intelligence, such as reasoning, learning, planning, and problem-solving" (Luxton 2015, 2). The specific type of AI used in advanced chatbots is referred to as weak AI, meaning the program attempts to accomplish specific tasks within specific parameters. Advanced chatbots learn through machine learning processes built on artificial neural networks. In layman's terms, the chatbot can not only recognize patterns but also respond to those patterns differently depending on how often connections are made.
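For readers unfamiliar with how "connections" in such a network strengthen, the following is a minimal illustrative sketch, not the implementation of any actual chatbot: each connection carries a numeric weight, and a weight grows a little each time its connection is activated, so frequently used connections come to dominate the system's responses. The `strengthen` function and learning rate below are hypothetical simplifications.

```python
def strengthen(weights, pattern, rate=0.1):
    """Increase the weight of each connection activated by the input pattern.

    weights: current connection strengths
    pattern: 1 where a connection is used by this input, 0 where it is not
    rate:    how much each use strengthens a connection (assumed value)
    """
    return [w + rate * x for w, x in zip(weights, pattern)]

# Three connections, all initially unstrengthened.
weights = [0.0, 0.0, 0.0]

# Seeing the same input pattern repeatedly strengthens the same connections,
# loosely mirroring "connections are made and strengthened with repeated use".
for _ in range(5):
    weights = strengthen(weights, [1, 0, 1])

print(weights)  # connections 0 and 2 have grown; connection 1 is unchanged
```

Real neural networks adjust weights with far more sophisticated procedures, but the core idea, repetition reshaping numeric connection strengths, is the same.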
Neural networks are designed to mimic "biological neurons," in which "connections are made and strengthened with repeated use" (Luxton 2015, 3). The system receives data from the user's questions and answers, along with guidance from psychologists who