By Barak Richman and Francis Fukuyama 

Twitter's decision to kick Donald Trump off its platform was welcomed by many liberals as a necessary response to a president who had incited the January 6 storming of Congress. But many of those not particularly friendly to Mr. Trump were nonetheless troubled that Twitter had grown so powerful that it could silence a U.S. president. The Russian anticorruption activist Alexei Navalny, in a series of tweets a few days before his recent arrest, expressed similar worries about Twitter using its power against champions of democracy.

The prevailing approach to protecting political speech on the major digital platforms has been to pressure the companies to self-regulate by creating, for instance, oversight boards of outside experts. But leaving these momentous decisions in the hands of private companies is not a long-term solution; they have neither the legitimacy nor the capacity to make such decisions in the public interest.

The core issue is the oversize power of Twitter, Facebook and Google in controlling political discourse. One measure with some congressional support is to repeal or change Section 230 of the Communications Decency Act, which limits platforms' liability for the content they carry. But such reform could hurt small companies more than the established giants, which can more easily develop algorithms and hire personnel to filter out problematic content.

The question is how to reduce the power of today's digital monopolists without losing the substantial social value they offer. We believe that a structural solution is possible: requiring the dominant platforms to let users choose how algorithms select and order the content they see. This would depend on the creation of a new kind of "middleware." The term normally refers to software that connects operating systems with applications, but we envision software that inserts a filtering layer between the big platforms and their users. It would give consumers a simple, powerful tool to manage the content that reaches them.
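To make the idea concrete, here is a minimal sketch, written in TypeScript, of what such a middleware contract could look like. Every name in it (FeedItem, UserPreferences, Middleware, the curate function) is a hypothetical illustration of the filtering layer we describe, not an interface any platform actually exposes today.

```typescript
// A minimal, hypothetical sketch of a middleware contract: a filtering layer
// that sits between a platform's raw feed and the user's client. All names
// here are illustrative assumptions, not an actual platform API.

interface FeedItem {
  id: string;
  author: string;          // account handle
  text: string;
  postedAt: number;        // Unix timestamp, milliseconds
  engagementScore: number; // platform-supplied virality/popularity signal
}

interface UserPreferences {
  mutedAuthors: string[];   // accounts the user chooses not to see
  engagementWeight: number; // 0..1: how much virality may influence ranking
}

// A middleware module receives the platform's candidate items plus the
// user's stated preferences and returns the items it would actually show,
// in the order it would show them.
interface Middleware {
  name: string;
  curate(items: FeedItem[], prefs: UserPreferences): FeedItem[];
}
```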

The problem with today's digital monopolists is not that they carry fake news or conspiracy theories; such content is unfortunately widespread, and the First Amendment protects Americans' right to express what they want. The real harm comes from those few companies' ability to amplify certain voices while excluding others. This power is a product of scale economies, in which one large platform generates more attention, participation and value than multiple smaller ones, and of a business model that relies on capturing user attention for advertisers. These companies squeeze out alternative platforms while fueling virality, thereby advancing the loudest, most provocative voices.

Their ability to amplify or silence speech has reached a scale large enough to potentially sway elections and shape policy outcomes. It constitutes a form of concentrated political power, akin to a loaded gun left on the table. We may believe that the person sitting opposite us today will not pick up the gun and shoot us with it, but trusting the good intentions of individual actors is not a sustainable approach for any democracy.

The power of the digital platforms extends far beyond their public decisions to suspend certain accounts. The voices people hear on these platforms are the product of hidden algorithms that the platforms can tweak at any time without public notice. Twitter's algorithms determine which tweets users see, Facebook's algorithms decide which posts are disseminated in news feeds, and Google's algorithms sequence news, advertisements and other search results. These algorithms are entirely opaque; they quietly strip users of much of their control over what appears in their feeds.

Some have argued that banning Mr. Trump from Twitter violates the First Amendment and amounts to censorship. This is wrong in a technical sense, since the First Amendment constrains only government action; it actually protects the right of the platforms to carry what they want. Nonetheless, there is some truth to the censorship charge. These platforms are so central to political speech that they have become, in effect, the new American public square, comparable to the three dominant television networks in the 1950s and '60s.

Others have suggested using the antitrust laws to curtail the power of the major platforms. But these laws were designed to correct economic harms and cure monopolistic abuses. The major platforms may be guilty of such abuses, which is why Facebook and Google are being taken to court by the Federal Trade Commission, Justice Department and many state attorneys general, to say nothing of their legal problems in the European Union. (Twitter has not faced the same scrutiny.)

But none of the remedies currently being sought would address the potential political harms of the platforms' control over political speech, except perhaps a full AT&T-style breakup. That outcome is not only politically and legally unlikely, but it would take years and likely prove ineffective because other dominant digital companies would rise to take the place of today's giants.

The great advantage of the "middleware" solution that we propose is that it does not rely on an unlikely revolution in today's digital landscape and could be implemented relatively quickly. How would it work? A host of third-party companies would create and operate software to curate and order the content that users see on their digital platforms, according to the users' preferences. Users could install their preferred middleware as plug-ins to the platforms and thus choose their own trusted intermediary to sort their news, rank their searches and order their feed of tweets.
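Under the hypothetical contract sketched earlier, a single third-party curator might be as small as the example below: it drops accounts the user has muted and limits how much raw virality can shape the ordering of the feed. The field names and weights are illustrative assumptions, not a description of any existing product.

```typescript
// A hypothetical third-party plug-in built against the sketched interface
// above: it removes muted accounts and blends recency with a capped amount
// of engagement-driven ranking.
const calmFeed: Middleware = {
  name: "calm-feed-example",
  curate(items: FeedItem[], prefs: UserPreferences): FeedItem[] {
    const visible = items.filter((i) => !prefs.mutedAuthors.includes(i.author));

    // Min-max normalize a value into [0, 1] so recency and engagement
    // can be combined on a common scale.
    const norm = (v: number, min: number, max: number) =>
      max > min ? (v - min) / (max - min) : 0;

    const times = visible.map((i) => i.postedAt);
    const scores = visible.map((i) => i.engagementScore);
    const [tMin, tMax] = [Math.min(...times), Math.max(...times)];
    const [sMin, sMax] = [Math.min(...scores), Math.max(...scores)];

    // Mostly chronological, with virality contributing at most
    // `engagementWeight` of the total ranking score.
    const rank = (i: FeedItem) =>
      (1 - prefs.engagementWeight) * norm(i.postedAt, tMin, tMax) +
      prefs.engagementWeight * norm(i.engagementScore, sMin, sMax);

    return visible.sort((a, b) => rank(b) - rank(a));
  },
};

// Example: a user who mutes one (hypothetical) account and largely
// discounts virality when the feed is ordered.
const quietPrefs: UserPreferences = {
  mutedAuthors: ["@loudest_voice"],
  engagementWeight: 0.2,
};
// const ranked = calmFeed.curate(platformSuppliedItems, quietPrefs);
```

A competing provider, whether a company, a nonprofit or a civic group, could offer a different ranking simply by publishing its own curate function; that is the competitive middle layer the proposal envisions.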

Were Donald Trump allowed back onto Twitter, middleware would allow users to choose how much to hear from him -- potentially dampening his virality through their choices and, most significantly, reducing the impact of any decision made by Twitter executives. Surveys suggest that many users would prefer a calmer, less divisive internet; this plan would put that to the test.

Middleware would outsource content curation from the major platforms to a new, competitive layer of smaller companies. Once platforms open themselves to middleware, little technical expertise is required to craft a new algorithm, and even nonprofit entities or local civic organizations could sponsor their own. Middleware need not alter the platforms' general appearance or workings. Instead, by filtering the platforms' algorithms according to consumers' choices, it would give users greater control over the information they see. The internet would return in some respects to being the decentralized place it was designed to be, rather than a sphere dominated by a few large companies.

This solution would require the cooperation of the major platforms and new powers for federal regulators, but it is less intrusive than many of the other remedies currently receiving bipartisan support. In an earnings call this week, Twitter CEO Jack Dorsey endorsed "giving more people choice around what relevance algorithms they're using. You can imagine a more market-driven and marketplace approach to algorithms." Regulators would have to ensure that the platforms allow middleware to function smoothly and may need to establish revenue-sharing models to make the middle-layer firms viable. But such a shift would allow the platforms to maintain their scope and preserve their business models. Moreover, middleware would allow Twitter, Facebook and Google to continue delivering their core services to users while shedding the duty of policing political speech.

Could middleware further fragment the internet and reinforce the filter bubbles that have fed the polarization of American politics? Perhaps, but the aim of public policy should not be to stamp out political extremism and conspiracy theories. Rather, it should reduce the outsize power of the large platforms and return to users the power to determine what they see and hear on them.

Middleware offers a structural fix and avoids collateral damage. It will not solve the problems of polarization and disinformation, but it will remove an essential megaphone that has fueled their spread. And it will enhance individual autonomy, leveling the playing field in what should be a democratic marketplace of ideas.

--Dr. Richman is a professor of law and business at Duke University. Dr. Fukuyama is a senior fellow at Stanford University's Freeman Spogli Institute.


(END) Dow Jones Newswires

February 12, 2021 17:24 ET (22:24 GMT)

Copyright (c) 2021 Dow Jones & Company, Inc.