
The Current Strategy Is Not Working

Firstly, a lot of effort has been focused on U.S. Congressional action. However, Congress is not well positioned to address granular digital technology problems, at least not with the approach we have now.

Between 2019 and 2023, around 60 unique bills related to digital platform governance were put on the floor in Congress, specifically focused on declining information integrity, extremism, incitement to violence online, and diminishing press freedoms.2 None of the bills passed. In fact, only about six percent made it out of committee.3

When considering the processes of Congress, the complexity of the issues at hand, and the hyper-political framing of social media governance, the lack of movement on this legislation is unsurprising. Congress is simply not well positioned to directly address the intricate problems of digital platforms. It has hundreds of competing priorities and serves as a generalist body rather than a group of subject matter experts. It is not built to move fast or to pass one-off legislation addressing the myriad risks created by social media products, nor does it have the measurements and information needed to make informed policy choices.

Ultimately, three factors drive the lack of movement: money, politics, and firm secrecy. Large technology companies spent a record-breaking $69 million lobbying the federal government in 2022, more than either the defense or the pharmaceutical industry.4 Compounding that, technology policy has become entangled with partisan politics. As a result, it is hard to see how a split 118th Congress could build the consensus needed to pass the numerous pieces of legislation required to address broader societal concerns, such as the mental health crisis, radicalization, and consumer privacy. Finally, platform companies have virtually no industry-specific disclosure practices. This makes it hard for external actors, including the government, to measure and assess the steps industry has already taken to protect consumer and communal welfare.

Despite these challenges, we still look to Congress to act, and rightly so, particularly as consumer safety concerns continue to mount. For Congress to do this effectively, it is important to recognize that it has the tools and position to create macro-level change. Its primary responsibilities and strengths lie in funding government functions and programs, holding informative hearings to shape the legislative process, and conducting oversight of the executive branch. Therefore, we must consider how we can best leverage these strengths to achieve our goals in platform governance.

Secondly, private sector solutions have been fragmented, ad hoc, and platform-specific, an approach that ignores the fundamental interrelation between digital spaces and the ways harms propagate across online experiences.

Nearly all social media companies have tested solutions to address the harms of social media on users. Of the 73 proposals identified in DIGI’s DGDP Index, 65 have been tested by the industry. Even President Donald Trump’s Truth Social has content moderation policies embedded in its Terms of Service that appear to limit harmful content across the ideological spectrum.5

While almost 90 percent of proposed solutions have been tested across different platforms, only 7 percent have been fully implemented industry-wide.6 And although platform-specific responses are necessary, there are spillover effects from one platform to another: the average social media user interacts with 6.6 social media accounts.7 It is easy for activity on Instagram to influence activity on Reddit, Twitter, Facebook, and so on. Additionally, even if Reddit, for example, has safe practices, any single consumer is still subject to risks on other, seemingly comparable platforms. Without best practices in place across platforms, consumers have no guarantee that their online experience carries uniform safety considerations.

Moreover, it is difficult for external stakeholders to know how effective the consumer safety features that have been tested actually are because, again, companies have little to no disclosure practices. Some platform companies have recently participated in third-party audits to validate their products,8 but even these are difficult to trust because auditors are beholden to the data the platform companies provide. We see this playing out in the algorithmic auditing space, for example, where companies participating in “collaborative audits” end up compromising the integrity of independent review.9

Finally, “responsible” innovation teams at platform companies, meant to champion societally conscious product and policy choices, are often underfunded and, in some cases, deprioritized entirely. This makes it nearly impossible for the groups tasked with speaking on behalf of the consumer to penetrate the full operations of a company, and that is assuming the teams exist at all. Toward the end of 2022, Meta chose to dissolve its Responsible Innovation Team.10 In March 2023, Microsoft announced it was shutting down its Ethics & Society team.11

Lastly, legislation and private sector governance have primarily focused on the harms caused by the technology of today, with little consideration of where digital services are headed.

We are only now grappling with consumer harms caused by platforms founded nearly two decades ago. But the digital space is constantly evolving, and so is the harm landscape. The Metaverse, web3, and consumer-facing applications of large language models are already hitting the market, whether you believe in their viability or not, and yet the policy world has not even scratched the surface of mitigating their forecasted risks.

Good consumer welfare practices require processes and systems that allow governance to keep pace with technology. Right now, we do not have those processes in place. Experts and pundits have called for global governance bodies for a range of digital services, from social media to general-purpose AI. Yet these recommendations lack the specificity, precedent, and incentives to catalyze real change. So we continue to rely on ad hoc, reactionary proposals from governments and underfunded, cagey, piecemeal solutions from industry.

We do not need to reinvent the wheel with new conceptions of global governance bodies, though. Consumer and societal risks emerge with any new technology or innovation, and, historically, we have somewhat methodically created standards systems at both the domestic and international levels. Those systems are industry-led and therefore better equipped to move at a speed close to that of innovation itself.
