
Conflict in the Digital Age: How Can We Preserve Human Rights?

By: Alonna Williams

Abstract


Technology moves at an exponential pace. As it advances, algorithms will become increasingly complex and their use increasingly widespread. Social media algorithms are already escalating fragile conditions in some countries and destabilizing democracies in others. In warfare, the digital sphere has become a new front, and the rules of war are changing. If human rights are not considered, their protection could be at stake. It is essential that a respected international body, such as the United Nations (UN), take charge. As a multinational organization with vast reach and resources, the UN is poised to help tackle this global challenge by promoting accountability via impact assessments for algorithms, deploying country-level risk mitigation strategies, and updating international law to include rules on how to conduct algorithmic warfare.

Background

Social Media Algorithms

Stop the Steal. The Arab Spring. Ethnic conflict in Myanmar. Civil war in Ethiopia. No part of the world has been spared the negative effects of social media algorithms. Left unchecked, these algorithms have wreaked havoc on us offline.

Social media sites are akin to a battleground for some conflicts and have played a role in escalating others. The conflict in Myanmar illustrates the first case. Myanmar is a country with an ethnically and religiously diverse population. In 2011, the country began its political transition to democracy after 50 years of repressive military dictatorship. The transition was hardly democratic, as the military regime retained a presence in positions of authority across all levels of government (Rio 2020). The regime stratified Myanmar's society into eight major ethnic races encompassing 135 recognized "national races." Muslim minority groups such as the Rohingya, however, were excluded from the list. The Rohingya have long struggled to obtain citizenship rights within Myanmar, and the "democratic transition" brought this issue to the fore again. By 2017, tensions between the Rohingya and the ethnic Rakhine came to a head. In August of that year, the Arakan Rohingya Salvation Army (ARSA), a Rohingya militant group, launched terrorist attacks on police and military posts throughout Rakhine State (Edroos 2017). The aftermath of these attacks led to a humanitarian crisis in which over one million Rohingya fled to neighboring Bangladesh.

The conflict in Rakhine fed into already-present tensions between the Buddhist and Muslim communities. Anti-Muslim sentiment led to the creation of a Buddhist nationalist movement, the Ma Ba Tha, and to an escalation of violence against the Muslim minority within Myanmar. According to the BBC, one of the Ma Ba Tha's leaders has been linked to violence against Muslims (British Broadcasting Corporation 2015). As the political power of the Ma Ba Tha rose, access to mobile data became cheaper, bringing significantly more people online. Facebook became hugely popular, almost ubiquitous in daily life in Myanmar. Leaders of the movement used Facebook to spread their message, and the military used it to promote and monitor content while also "trolling critics" (Rio 2020).

Facebook's algorithm made it easy to spread propaganda, hate, and disinformation quickly. For example, after the ARSA attacks, digital researcher Raymond Serrato found a 200% increase in activity in an anti-Rohingya Facebook group of 55,000 people (Safi and Hogan 2018). The Ma Ba Tha chased virality to engage its audience, using tactics like clickbait and tagging many accounts on posts supportive of its cause. Like the military, the Ma Ba Tha also used Facebook ads to target audiences; Facebook's ad features allow nuanced targeting by demographic factors such as age, gender, and interests. The platform gave the government and political leaders a megaphone for hate against the minority group, which played a significant role in intensifying anti-Muslim sentiment and in the rise of violence against the Rohingya. According to the Council on Foreign Relations, 1.3 million Rohingya have been displaced, and the conflict continues to worsen (Council on Foreign Relations 2022).

Moreover, social media algorithms play a role in conflict escalation not only in developing countries but in developed ones as well. Facebook's algorithm was a focal point in the United States' 2020 presidential election. While Facebook was not another warfront as in Myanmar, the site did play a role in escalating existing tensions within American society. Facebook's Groups feature has been accused of being a breeding ground for violence. In 2017, Mark Zuckerberg made Facebook Groups a strategic priority for the company in an effort to promote more "meaningful connections" (Paul 2020). The promotion of Groups is not an issue in itself, but in a politically tense environment, the algorithm's ability to connect like-minded people with violent tendencies can lead to violent conflict offline.

Civil rights groups like the Center for American Progress and the Southern Poverty Law Center warned Facebook that its Groups feature had become a breeding ground for hate, political extremism, and misinformation (Tobin 2019). Unfortunately, the company did not act until after people had been radicalized and the threat of violence was already present. Facebook uses artificial intelligence (AI) to monitor violent language in Groups, but the AI has fallen short, as illustrated by the link between individuals who connected via Facebook Groups and violent acts committed offline. In October 2020, for example, six men used Facebook to plan the kidnapping of the Governor of Michigan. They were found to be part of the "boogaloo" movement, a far-right, anti-government extremist movement in the United States whose members organize through Facebook Groups. The Tech Transparency Project, an advocacy group, found at least 125 active boogaloo groups on Facebook (Boigon 2020). Recently, Facebook's parent company Meta was hit with a wrongful death lawsuit accusing it of promoting groups that advocate violence. Facebook's Groups algorithm is front and center in the suit, which alleges that Facebook gave rise to the far-right extremist boogaloo movement and thereby allowed the killers to connect. The lawsuit claims that the site's algorithms "are weighted to favor untrue, inflammatory, and divisive content that will grab and keep users' attention" (Kurup 2022).
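The dynamic the lawsuit describes can be illustrated with a toy model. The sketch below is hypothetical and is not Facebook's actual system; the weights and post data are invented for illustration. It shows how a feed that ranks posts purely by predicted engagement will surface inflammatory content whenever outrage drives reactions.

```python
# Toy illustration of engagement-based feed ranking (hypothetical;
# not Facebook's actual algorithm). Posts are scored only by engagement,
# so content that provokes reactions rises to the top of the feed.

def rank_feed(posts):
    """Sort posts by a simple engagement score, highest first."""
    def score(post):
        # Comments and shares are weighted more heavily than likes,
        # reflecting the common claim that rankers reward "active"
        # engagement; the 3x and 5x weights are invented for this sketch.
        return post["likes"] + 3 * post["comments"] + 5 * post["shares"]
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "local-news",   "likes": 120, "comments": 4,  "shares": 2},
    {"id": "inflammatory", "likes": 40,  "comments": 60, "shares": 30},
    {"id": "family-photo", "likes": 90,  "comments": 10, "shares": 1},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])
# The inflammatory post ranks first despite having the fewest likes,
# because it drew the most comments and shares.
```

Nothing in the scoring function looks at whether a post is true or divisive; it optimizes only for attention, which is the core of the criticism quoted above.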

The Stop the Steal movement is another example of Facebook's Groups algorithm connecting individuals who went on to incite violence offline. In this case, those connections culminated in an unprecedented violent attack on the U.S. Capitol. Meta representatives have had to testify before Congress about the site's role in spreading misinformation about the U.S. election and in connecting thousands of people ahead of the insurrection. From a wrongful death lawsuit to the attempted kidnapping of a government official to a violent attack on Congress, Facebook's AI has connected people to violent groups while failing to monitor that violence well enough to prevent it.

A New Type of Warfare

Not only has artificial intelligence played a role in conflict escalation; it has also enabled a new type of warfare: "algorithmic warfare." Algorithmic warfare involves using artificial intelligence to increase military power. The nations that best collect and make sense of data through AI will increase their military power and enjoy long-term competitive advantages (Jensen, Whyte, and Cuomo 2019). As Vladimir Putin put it, "whoever becomes the leader in the AI sphere, will become the ruler of the world" (The Associated Press 2017). This is a chilling statement given the ongoing war in Ukraine that Putin started.

Militaries will be able to use AI for surveillance and to analyze behavioral trends by identifying people's attitudes, behaviors, intentions, and locations. In the wrong hands, this data can be used by governments and militaries to target subgroups and track dissenters. AI can also be used to manipulate human behavior through misinformation campaigns, as when Russia interfered in the 2016 U.S. election with disinformation campaigns on Facebook and Twitter.

The concept and use of algorithmic warfare raise many concerns. One major concern is that artificial intelligence accelerates decision-making in warfare. While speed can be valuable in high-stakes contexts, a system built on biased code can be destabilizing. This leads to the ethical concerns surrounding this type of warfare. What would the new rules of war look like? How do you protect civilians' human rights when using these new military technologies? These questions need to be addressed. The digital sphere has become a new front, and the rules of warfare are changing. Human rights could be at stake if they are not considered. It is essential that a respected international body, like the United Nations (UN), take charge. As a multinational organization with vast reach and resources, the UN is poised to help tackle this global challenge.

Recommendations

1. Promote accountability through algorithmic impact assessments (Ivanhoe 2021)

A supranational organization like the UN could create a framework for assessing the impact of social media algorithms in conflict settings. The assessments would ensure that the perspectives of local communities are taken into account and that human rights are not being encroached upon.

2. Deploy country-level risk mitigation strategies (Rio 2020)

One of the biggest factors that allowed the conflict in Myanmar to fester was the lack of content moderators for the local languages. Moderation staffing in conflict-fragile areas needs to be scaled up, and the UN can help. The UN could create a special digital office to address the shortage of moderators in conflict-fragile places, recruiting local individuals who speak the targeted languages to help protect groups being attacked online.

3. Create a human rights framework for the use of algorithms in warfare

In non-digital warfare, human rights are protected through international law. The Geneva Conventions and their protocols establish rules for protecting civilians, prisoners of war, and the wounded and sick (International Committee of the Red Cross 2010). These protocols have been updated over time and set out how war may be conducted on land and which types of weapons are acceptable to use. It is time for another update, one that covers algorithmic warfare and new military technologies in order to protect human rights.

References

Boigon, Molly. 2020. "Facebook Failing to Contain Content from Far-Right 'Boogaloo' Movement, Experts Say." The Forward. June 25, 2020. https://forward.com/news/449531/facebook-failing-to-contain-content-from-far-right-boogaloo-movement/.

British Broadcasting Corporation. 2015. "The Political Sway of Myanmar's Monks." BBC News. https://www.bbc.com/news/av/world-asia-34472023.

Council on Foreign Relations. 2022. "Rohingya Crisis in Myanmar." Global Conflict Tracker. March 7, 2022. https://www.cfr.org/global-conflict-tracker/conflict/rohingya-crisis-myanmar.

Edroos, Faisal. 2017. "ARSA: Who Are the Arakan Rohingya Salvation Army?" Al Jazeera. September 13, 2017. https://www.aljazeera.com/news/2017/9/13/arsa-who-are-the-arakan-rohingya-salvation-army.

Ivanhoe, Hana. 2021. "Peacebuilding, Extremism and Social Media, Part 3: Algorithms." JustPeace Labs. July 22, 2021. https://justpeacelabs.org/peacebuilding-extremism-and-social-media-part-3-algorithms/.

Jensen, Benjamin M., Christopher Whyte, and Scott Cuomo. 2019. "Algorithms at War: The Promise, Peril, and Limits of Artificial Intelligence." International Studies Review, June. https://doi.org/10.1093/isr/viz025.

Kurup, Rohini. 2022. "Facebook Sued over Killing Tied to Boogaloo Movement." Lawfare. January 11, 2022. https://www.lawfareblog.com/facebook-sued-over-killing-tied-boogaloo-movement.

Paul, Katie. 2020. "Thousands of Facebook Groups Buzzed with Calls for Violence ahead of U.S. Election." Reuters, November 6, 2020, sec. Media Industry. https://www.reuters.com/article/us-usa-election-facebook-focus-idUSKBN27M2UN.

Pauwels, Eleonore. 2020. "Artificial Intelligence and Data Capture Technologies in Violence and Conflict Prevention." Global Center on Cooperative Security. https://www.globalcenter.org/wp-content/uploads/2020/10/GCCS_AIData_PB_H.pdf.

Riley, Tonya. 2020. "Analysis | The Technology 202: Facebook Struggles to Keep Violent 'Boogaloo' Content off Its Platform." Washington Post, June 24, 2020. https://www.washingtonpost.com/news/powerpost/paloma/the-technology-202/2020/06/24/the-technology-202-facebook-struggles-to-keep-violent-boogalo-content-off-its-platform/5ef23ddb88e0fa32f82410c4/.

Rio, Victoire. 2020. "The Role of Social Media in Fomenting Violence: Myanmar." Toda Peace Institute. https://toda.org/assets/files/resources/policy-briefs/78-rio-myanmar-v2.pdf.

Safi, Michael, and Libby Hogan. 2018. "Revealed: Facebook Hate Speech Exploded in Myanmar during Rohingya Crisis." The Guardian, April 3, 2018, sec. World news. https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-in-myanmar-during-rohingya-crisis.

The Associated Press. 2017. "Putin: Leader in Artificial Intelligence Will Rule World." CNBC. September 4, 2017. https://www.cnbc.com/2017/09/04/putin-leader-in-artificial-intelligence-will-rule-world.html.

Tobin, Ariana. 2019. "Civil Rights Groups Have Been Warning Facebook about Hate Speech in Secret Groups for Years." ProPublica. July 2, 2019. https://www.propublica.org/article/civil-rights-groups-have-been-warning-facebook-about-hate-speech-in-secret-groups-for-years.
