The Future is Here
by Deval (Reshma) Paranjpe, MD, MBA, FACS
Unfortunately, this may become a Hobson’s choice sooner rather than later. Artificial intelligence has limits and is not yet sentient, at least according to Silicon Valley insiders. However, it is about to change the way we live, much as the introduction of the fax machine, cellular telephones, and computers did in decades past. AI is like Eli Whitney’s cotton gin: it will power a new industrial revolution.
A recent report by Europol (the EU counterpart of Interpol) warns that up to 90 percent of internet content could be generated or edited by AI by 2026. Think about what that means. Content websites will depend on an admittedly imperfect AI system to generate nearly everything you read on the internet. The tone, accuracy, and motive behind that content will be moderated by humans, who have not been doing a great job of late, as anyone who has witnessed the devolution of editorial standards in television, radio, web, and print news can attest. Editorializing, previously the ultimate faux pas, now seems de rigueur in broadcast journalism.
More troubling still, AI may become the sole or predominant editor of AI-generated content. We have seen lately that AI can be problematic: it has no EQ and no inherent moral compass. The New York Times story of the writer whose Microsoft chatbot made a pass at him and asked him to leave his wife is but one example. Aggressively racist and sexist behavior from other chatbots has also been well documented. “It’s all in how you train the algorithm,” people may say, along the lines of “guns don’t kill people; people kill people.” Perhaps, but both guns and AI make it a lot easier to achieve nefarious ends on a grand scale.
AI is sweeping the television industry as well. An Indian news channel has employed an AI news presenter named Lisa, who can speak multiple languages flawlessly while maintaining the studied cadence of a BBC reporter. She is attractive, well rated by the audience, and could be programmed to tell you the weather report or that nuclear destruction is imminent and mass suicide is a reasonable option. If you watch her, she has none of the classic AI “tells,” such as six fingers; she looks like an attractive, well-spoken young woman in her twenties. The article covering her debut interviewed viewers, most of whom did not mind an AI presenter and enthusiastically found her pretty; however, one cynical taxi driver was rightly concerned that he could now REALLY not trust anything he was told on the news.
AI is already in the hands of cybercriminals, who are running a scam in which they grab a snippet of speech from a real person and use AI to extrapolate it into a scam phone call. Typically, grandparents or parents are called by an AI chatbot purporting to be the grandchild, who is in jail after a car accident or other mishap and needs bail money. Think this is far-fetched? It happened to the parents of a good friend in Pittsburgh last year; they nearly fell for it because the voice sounded exactly like their grandchild’s and responded to them conversationally.
Actors are currently on strike, and one of their grievances is their lack of rights and protection against AI hijacking of their image, voice, and work. But the privacy and ownership issues extend beyond actors and celebrities to the average citizen. If you cannot protect and maintain ownership of the rights to your own image and voice, and if anyone can make video or audio of you that cannot be distinguished from the real thing and that shows you doing or saying terrible things, what and whom can you really trust anymore? And how can you clear your name of allegations that you did or said something wrong? Look for the legal system to change as well.
Think of the implications for politics: if you thought you couldn’t trust politicians before, now they can blame AI fakery for genuine misbehavior.
Good people could be falsely accused of heinous things, and bad people could be falsely exonerated. Whoever controls the media will have an incredibly powerful tool: if you thought there was “fake news” before, sit back, because the real “fake news” will now be possible. Imagine the chaos and discord that foreign governments and enemy agents will be able to sow using AI-generated fake video and audio on social media. The American people (indeed any people; it’s human nature) have already proved in the last few years to be emotionally labile, intellectually lazy, and liable to turn quickly on one another along manufactured divisions under stress.
AI will cause upheaval in the labor market and force job retraining. Where retraining is not possible, this will in itself cause a fundamental change in the economy. Many white-collar jobs will be eliminated; writers, editors, lower and middle management, and even some upper management positions will no longer be necessary. Businesses will become more efficient, but large swaths of the populace will become unemployed and discontented, leading to upheaval.
Skilled trades may not be so easily replaced until AI-powered robots advance further and become more accessible.
AI will affect the practice of medicine profoundly. Health insurance companies and hospital systems will undergo the same slash-and-burn of lower and middle management seen in other industries, no doubt to the delight of upper management and clinicians alike. AI will reach into diagnostic, therapeutic, and genetic realms at the expense of patient privacy, particularly if that information is integrated with health insurance companies’ systems. AI will also change clinical medicine for the better by outsourcing scutwork, streamlining care, improving efficiency, helping patients with preventive medicine, and aiding procedures with AI-powered robots. Can AI replace physicians? Not yet. But our job descriptions may change profoundly.
Our job as humans and physicians is to ensure that the AI set in place to help us operates with a moral compass and has EQ and heart. Otherwise, it is a fine line between heaven and hell.
Buckle up; the future is here.