
Inherent Oppression and Bias in the TikTok Algorithm

With over one billion people using the app every month (a), TikTok has become one of the world's most popular social media and video-sharing apps. While there are many reasons for the app's popularity, the best known is its astoundingly efficient and controversial algorithm for video curation. Each user has their own "For You Page," on which the algorithm chooses videos based on the user's interactions with previous videos. Each user can also allow their data and activity to be tracked across other apps they use, much as Facebook does. TikTok has its own Community Guidelines that all user videos must follow, as well as "technology that identifies and flags potential policy violations, such as adult nudity and violent and graphic content" (b) to keep these rules in check. However, with this technology come a lot of biases that users have noticed.

The phenomenon of the "echo chamber" is one that TikTok users are familiar with. It describes the way the algorithm serves each user videos that perpetuate the type of content the user has already expressed interest in. If a user repeatedly likes, comments on, and/or shares videos about knitting, they will get even more knitting videos on their For You Page. This is why TikTok is so popular: it rewards higher levels of activity with more content the user will enjoy, thus creating an echo chamber.
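To make that feedback loop concrete, here is a minimal sketch in Python of how an engagement-weighted feed could behave. The topics, starting weights, and boost factor are all invented for illustration; TikTok has not published how its For You Page actually scores videos.

```python
import random
from collections import Counter

# Toy model of an engagement-driven feed: every interaction boosts the
# weight of that video's topic, and the next video is drawn in
# proportion to those weights. (All values here are assumptions.)
topics = ["knitting", "cooking", "politics", "comedy"]
weights = {t: 1.0 for t in topics}  # everyone starts with a neutral feed

def next_video():
    """Sample the next For You Page video proportionally to topic weights."""
    return random.choices(topics, [weights[t] for t in topics])[0]

def interact(topic, boost=2.0):
    """A like/comment/share multiplies that topic's weight."""
    weights[topic] *= boost

# Simulate a user who only ever engages with knitting videos.
shown = Counter()
for _ in range(200):
    video = next_video()
    shown[video] += 1
    if video == "knitting":
        interact("knitting")

print(shown)  # knitting quickly dominates the feed: an echo chamber in miniature
```

Run a few times and the counter is almost entirely knitting after the first dozen videos: the more you engage, the narrower the feed becomes.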


While this is great for users who enjoy knitting, there is a dark side to this echo chamber: it can perpetuate hateful ideas and videos. One would think that the same technology that flags adult nudity and other visual content violations would also be able to flag these hateful ideologies as dangerous, but there is no proof of any such technology. Of course, TikTok has human reviewers who go through each flag, but there are still many cases of hateful ideologies not being flagged or punished in any way. One user reported that his profile bio was flagged for using the word "black" in any capacity, from "I am a black man" to "Black Lives Matter" (c). However, the phrases "white supremacy" and "I am a neo-Nazi" did not prompt the same flags for violations. The same algorithm that supposedly brings "new content" into the user's feed is the same one that narrows the scope of what a user actually consumes. In a country as politically, economically, and socially polarized as the United States, the creation of barriers between users with different viewpoints creates tension between a user and anyone who disagrees with them. This raises an important question: are these biases simple errors in the algorithm, or were they intentional?
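One plausible, and purely hypothetical, explanation for that asymmetry is a crude keyword blocklist that flags terms with no understanding of context. The sketch below is not TikTok's actual system; the blocklist contents and the is_flagged helper are assumptions chosen only to show how such a design would reproduce the reported behavior.

```python
# Hypothetical illustration (not TikTok's actual moderation code): a
# naive blocklist flags any bio containing a listed word, regardless of
# context. If "black" is on the list but phrases like "white supremacy"
# are not, the exact asymmetry users reported falls out automatically.
BLOCKLIST = {"black"}  # assumed contents, for illustration only

def is_flagged(bio: str) -> bool:
    """Flag a bio if any blocklisted word appears, with no context check."""
    words = bio.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

for bio in ["I am a black man", "Black Lives Matter",
            "supporting white supremacy", "I am a neo-Nazi"]:
    print(f"{bio!r:35} flagged={is_flagged(bio)}")
```

The point of the sketch is that a context-blind rule punishes benign uses of a word while missing multi-word hateful phrases entirely.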

This echo chamber ideology is perpetuated in other algorithms on TikTok, specifically the algorithm that suggests new accounts to follow. In an informal experiment, Twitter user Marc Faddoul created a new TikTok account for a single purpose: to follow random accounts and see which accounts would appear in the suggested-people bar. Faddoul found that the physical features of the original account he followed would determine the suggestions for other accounts to follow. If Faddoul followed Person A, a white woman with blonde hair, then Person B and Person C, two other white women with blonde hair, would be recommended for him to follow. This algorithm transcends race and gender: Faddoul found that hair dye, body type, age, and visible disability were also factors. He concluded that the algorithm assumes that if one follows, say, an Asian teen with dyed hair, it is because the user is an Asian teen with dyed hair, and not for their humor, videos, or other aspects of their content. Faddoul remarked that while it is not unusual for social media algorithms to create "bubbles" for things like political opinions, TikTok seems to be the "first major platform to create such clear physiognomic bubbles" (d).
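Faddoul's findings are consistent with a suggestion system that matches accounts on appearance. A hypothetical sketch of that mechanism follows; the profiles, the attribute set, and the similarity and suggest functions are all invented, since TikTok has not disclosed how its recommendations actually work.

```python
# Hypothetical sketch of how "physiognomic bubbles" could arise: if a
# follow-suggestion system represents creators by appearance attributes
# and recommends the nearest neighbors of whoever you just followed,
# lookalikes dominate the suggestions. This only illustrates the
# mechanism Faddoul's experiment implies, not TikTok's real method.
profiles = {
    "Person A": {"white": 1, "blonde": 1, "teen": 0, "dyed_hair": 0},
    "Person B": {"white": 1, "blonde": 1, "teen": 0, "dyed_hair": 0},
    "Person C": {"white": 1, "blonde": 1, "teen": 1, "dyed_hair": 0},
    "Person D": {"white": 0, "blonde": 0, "teen": 1, "dyed_hair": 1},
}

def similarity(a, b):
    """Count how many appearance attributes two profiles share."""
    return sum(1 for k in a if a[k] == b[k])

def suggest(followed, k=2):
    """Recommend the k profiles most similar in appearance to `followed`."""
    others = [(name, similarity(profiles[followed], feats))
              for name, feats in profiles.items() if name != followed]
    return [name for name, _ in sorted(others, key=lambda x: -x[1])[:k]]

print(suggest("Person A"))  # ['Person B', 'Person C']: the lookalikes come first
```

Notice that nothing in the sketch looks at the content creators make; the bubble forms entirely from who looks like whom.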

These clearly biased algorithms have an even wider range of impact. While they affect video and profile curation for users, they also affect content creators in a perhaps even more devastating way. When videos and profiles are so easily flagged for content violations, and gaining more followers and a larger audience depends on the "physiognomic bubbles," creators in minority groups are automatically disadvantaged. Not only are they less likely to gain fame, but they also fall prey to plagiarism and outright theft of their content. The New York Times wrote an article in 2020 telling the story of the original creator of the popular "Renegade" dance on TikTok. This creator, Jalaiah Harmon, saw her dance replicated by some of the most popular white creators on TikTok, while her own videos gained no more popularity and she received no credit for her dance (e).

This story is one that BIPOC creators on TikTok know well, as they continue to experience algorithmic biases in this way. These same algorithms, meant to be fair and unbiased, have been the cause of the segregation and silencing of BIPOC creators and users on TikTok. While it would be easy to blame these problems on "glitches in the code," it is important to recognize that biased algorithms are created by biased humans. This is why it is important to conduct more formal research on racial biases within the TikTok algorithm in order to understand why this is happening. Furthermore, TikTok is a perfect example of why it is important that coding become more diverse. A diverse array of people creating the algorithms would reduce the chance of biases like the ones on TikTok. A more inclusive and safer Internet for everyone – one where diverse perspectives are cherished and users are encouraged to question things and grow – starts with diversity at the software level.

Sources:
(a) https://newsroom.tiktok.com/en-us/1-billion-people-on-tiktok
(b) https://www.tiktok.com/community-guidelines?lang=en#38
(c) https://www.nbcnews.com/news/us-news/tiktok-algorithm-prevents-user-declaring-support-black-lives-matter-n1273413
(d) https://twitter.com/MarcFaddoul/status/1232014908536938498
