Regulation, Moderation, and Social Media Decentralization

“Do you remember the internet in ’96?” a silent television display asks in Facebook’s signature Klavika font during an ad break. The shriek of a dial-up modem blares from the television, capturing the attention of casual viewers who have drifted elsewhere, or to social media, during the intermission.

At a blistering 2021 pace, the screen shifts from archaic interfaces to modern emojis, never giving the viewer so much as a second to focus, before clearing the display for text that reads, “It’s been 25 years since comprehensive internet regulations were passed. It’s time for an update.” The spot, entitled “Born in ’96,” is part of Facebook’s larger pro-regulatory “It’s Time” campaign.

As advertisements go, this one is remarkably effective, if a little overbearing. Would you expect anything less from a company that dominates nearly every corner of the online advertising market?

The ecosystem created for the internet in ’96 by the Communications Decency Act (CDA) has clearly not impeded Facebook’s success. Founded in 2004, Facebook was born into a post-CDA world and has played a leading role in establishing that world’s bounds. Nevertheless, the company’s anxiety over ambiguities in the ancient digital legislation is understandable.

Rather than letting Facebook’s executives design the social media market of the future, what if there were free competition? Not the kind of competition that Twitter and even Parler provide Facebook, but a type of decentralized competition that challenges the very structure the Silicon Valley giants are built on. In other words, how about a polycentric organization of competition that makes the CDA obsolete and breaks up vertical monopolies on user-generated content and usage data? Thanks to an unexpected source, that competition may not be far away.

To Moderate or Not to Moderate Is NOT the Question

Chances are, if you are made uneasy by a social media giant lobbying to change the rules that govern it and its competitors, that unease comes from a general distrust of Facebook itself. Facebook may or may not have lost your trust after Russia used the platform to target Americans with divisive advertisements during the 2016 election, or after CEO Mark Zuckerberg was summoned to Congress in 2018 to testify about the site’s alleged internal content moderation bias against conservatives. Even without those negative associations, however, new regulations on established markets create barriers to entry and disincentivize competition. In this case, new regulations would mandate that social media companies practice internal content moderation, known simply as moderation, a task that has strained Facebook’s capabilities until recently.

At the root of Facebook’s legal issues is the CDA. Although passed as part of the broader Telecommunications Act of 1996, the CDA became one of the most foundational regulations for the burgeoning internet. Insofar as the internet is concerned, the CDA mandates that a site may not publish certain indecent, and often independently criminal, content. It also delegates the enforcement of these rules to a regulatory agency, the Federal Communications Commission (FCC), instead of leaving the justice system to sort out victims and perpetrators. Notably, the CDA also creates a distinction between “publishers,” standard websites that curate or create content, and “platforms” like social media sites, which allow anyone to post and merely aggregate and serve content to consumers.

Distorting the justice system by inserting executive agencies between victim and perpetrator creates a topsy-turvy system. As it stands, proving a social media company’s guilt as a facilitator is far easier than proving that any crime outside the scope of the CDA was committed in the first place.

Under the CDA, there are two systems of online content production. Publishers are obligated to internally moderate content so that it remains within the bounds of what the law considers acceptable speech. Platforms, on the other hand, are not held to this standard and are, by the nature of the distinction, barred from behaving as publishers. Although this clause, better known simply as “Section 230,” has been touted by many as the CDA’s saving grace from the perspective of free speech, it is also the wedge that causes Facebook to take flak from both the left and the right.

Facebook has been scrutinized for not moderating content strictly enough in 2016 and for being too politically restrictive ever since. New regulations would certainly clear up Facebook’s role, especially if Facebook’s on-staff legal team had a say in the wording of any proposed bill, which it likely would. Either way, moderation of user-generated media, and therefore free speech online, will be centralized under a federal agency or distributed among a few massive companies that themselves have nothing to do with content production.

The free market has provided several alternatives to Facebook and Twitter, but few gained traction in the face of such established competitors and steep regulatory obligations. One of these start-ups, Parler, managed to gain a healthy following when President Donald Trump was controversially removed from almost every other online platform following the storming of the Capitol on January 6, 2021. After Parler gained millions of users overnight, its web-hosting provider, Amazon Web Services, decided to sever ties with the company over its moderation policy. By refusing to do business with Parler, Amazon effectively moderated the entire website off of the internet.

Many were attracted to Parler’s moderation policies, or lack thereof, and, had it not been shut down, the site would have posed a competitive threat to Facebook for a portion of its disaffected user base. Although it provided a place for truly unfettered conversation, Parler segmented the discourse and would never be a comfortable place for the majority of social media users, who prefer some community standards beyond the legal bare minimum.

Besides, having a second, slightly edgier public square just outside the first is no substitute for effective public discourse. The French Third Estate’s self-separation from the Estates-General did not, after all, create a healthier political dialogue for the French people at the beginning of the French Revolution.

Parler was shuttered for two months while the site’s founders procured alternative web-hosting services. Although currently functional, Parler is not a long-term competitive solution to the problem of legally obligatory moderation because it frames moderation, in and of itself, as a bad thing. The same could be said for President Donald Trump’s new media outlet if it is ever opened up to public contribution.

Moderation, when done offline, is a daily practice for most. Whether by choosing the members of your inner circle or choosing to only have two slices of pizza, people self-moderate their lives all the time. Centrally planned moderation, however, is called prohibition and often causes more harm than good.

Enter the Decentralized Social Media (DSM) model, a polycentric model of online interaction recently proposed on Medium by Ross Ulbricht, the currently imprisoned founder of Silk Road, the infamous illicit online marketplace that jump-started the popularity of Bitcoin in the early 2010s.

The Innovation of Decentralized Social Media

Moderating something as big as social media is incredibly difficult and would take a massive amount of manpower if done entirely manually. Among Facebook’s most closely guarded secrets are the algorithms the company has developed for ad targeting and for facilitating moderation. The CDA both disallows this moderation and requires it, depending on which side of Section 230 a site falls on.

To oversimplify, Ulbricht’s DSM model would remove those automatic and manual moderation tools from under the hood of a social media platform’s servers and place the same processes on the user’s device, under the control of separate companies that stand to profit from providing moderation and aggregation services at the device owner’s discretion. Users could access any or all of the web’s available social media content feeds at once, be fed only content within their own acceptable parameters, and retain ownership of their user data. All of this would run on the technology that underpins Bitcoin: the blockchain.
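As a rough sketch of how such client-side moderation might work, consider the toy code below. Every name in it, from the Post type to the family_friendly policy, is a hypothetical illustration of the model; Ulbricht’s post does not prescribe an implementation.

```python
# Illustrative sketch only: a client that aggregates several feeds and
# applies a user-chosen moderation filter locally, on the user's device.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Post:
    author: str
    text: str
    network: str  # which social network the post came from


# From the client's view, a moderation provider is a predicate the user opts into.
ModerationPolicy = Callable[[Post], bool]


def family_friendly(post: Post) -> bool:
    """Example policy: drop posts containing words this user has blocked."""
    blocked = {"slur", "spam"}
    return not any(word in post.text.lower() for word in blocked)


def aggregate(feeds: Iterable[Iterable[Post]], policy: ModerationPolicy) -> list[Post]:
    """Merge any number of network feeds, filtering on the client side."""
    return [post for feed in feeds for post in feed if policy(post)]


# The user's device, not the platform, decides which policy runs.
feed_a = [Post("alice", "hello world", "network-a")]
feed_b = [Post("bob", "buy spam now", "network-b")]
print(aggregate([feed_a, feed_b], policy=family_friendly))
```

The design point is that the filtering predicate runs on the device and can be swapped at will for one supplied by any competing provider.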

In practice, the seemingly small distinction between where these algorithms are processed and who owns those functions resolves several of the questions raised and created by the CDA without having to create or pass any new laws.

Where the CDA consolidates and centralizes the responsibility for moderation under the content aggregator’s purview, start-up companies following Ulbricht’s model would compete to moderate and aggregate both user content and advertisements from social media platforms, thereby creating a market where there previously was only mandate.

Users would simply open an app in which content from across social media is aggregated. They could pick and choose which moderation and aggregation providers to reward with a portion of the advertisement revenue their engagement latently generates, and freely switch between providers.
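To make that revenue flow concrete, here is a minimal sketch of how an engagement-generated ad payment might be divided among a user’s chosen providers. The provider names and percentages are invented for the example.

```python
# Toy example: split the ad revenue generated by a user's engagement among
# the service providers that user has chosen. All shares are invented.
def split_revenue(revenue: float, shares: dict[str, float]) -> dict[str, float]:
    """Divide ad revenue proportionally among chosen providers."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return {provider: round(revenue * share, 6) for provider, share in shares.items()}


# Hypothetical split: the user's moderator and aggregator each take a cut
# of a 5-cent ad impression, and the user keeps the rest.
print(split_revenue(0.05, {"moderator": 0.3, "aggregator": 0.2, "user": 0.5}))
```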

Rather than having aggregation and moderation centralized by legal obligation in a few social media companies, levels of moderation, advertisement service, and content prioritization would all be separate, overlapping markets that independently compete to provide superior service and control for negative externalities.

Algorithmic moderation is a powerful tool that, along with manual moderation, can create comfortable digital environments. If moderation and aggregation were divorced from the social media platform and made into a competitive marketplace, users would be free to use whatever network or combination of networks they preferred and have the content they are served moderated however they see fit. If users controlled the moderation they facilitate and are subject to, the arrangement would be considerably more consensual. Social media companies under a decentralized model would not be held responsible for users misusing their digital infrastructure, as content regulation would fall to the client-side moderation algorithm and the companies that compete to provide those services most effectively. If a user were ever dissatisfied with the moderation they were provided, they would have market remedies.

Should someone use social media as a means to harass or threaten another, there would be no intermediate party at fault, freeing the judicial system to bring justice to guilty and affected parties alike.

Side Effects May Include

Besides sidestepping the CDA’s ineffectual regulations, a DSM would protect users’ privacy by encrypting the user-generated data that moderation algorithms rely on and keeping a function of that unique data as the user’s encryption key or proof of identity.
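One way to read that claim is that a key derived deterministically from the user’s own data could serve as both an encryption key and a proof of identity. The sketch below uses a standard key-derivation function to illustrate the idea; the data format and salt handling are assumptions, not details from Ulbricht’s proposal.

```python
# Sketch: derive an encryption key / identity token from user data with a
# standard KDF. The data format and salt are invented for illustration.
import hashlib
import json


def derive_user_key(user_data: dict, salt: bytes) -> bytes:
    """Deterministically derive a 32-byte key from the user's own data."""
    serialized = json.dumps(user_data, sort_keys=True).encode()
    return hashlib.pbkdf2_hmac("sha256", serialized, salt, iterations=100_000)


# Same data and salt always yield the same key, so the key can double as
# proof of identity while the underlying data stays encrypted.
key = derive_user_key({"user": "alice", "history": ["post1", "post2"]}, salt=b"app-salt")
print(key.hex())
```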

The value of the advertising market and the size of Facebook’s share of it are both due to the incredible amount of data that Google, Facebook, and other companies collect on every person who uses their services. This information is the company’s to sell, use to target ads, or train algorithms with. Under a DSM model, that information would be yours to sell and distribute among service providers.

This information is worth much more than the emotional value of privacy. Companies like Facebook make much of their money, in one way or another, from the accuracy and scope of their user data collection. If that information became yours by using a DSM, so too would the money it generates.

As it stands, Facebook and Google data-mine users in exchange for a service. If forced to adapt to a DSM model, such companies would need to shift to more traditional models in which payment is offered directly for a service rendered. Ulbricht’s model would accommodate liquid payment between service providers, such as web hosts or advertisers, and users, so that users remain in control of how much of their data they share and how much usability they pay for by receiving ads.
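As a toy illustration of that liquid-payment idea, the sketch below models an account where ad views credit the user, provider fees debit the user, and a user-controlled data-sharing setting scales the incoming ad revenue. All numbers and names are invented.

```python
# Toy model of "liquid payment" between a user and service providers.
class UserAccount:
    def __init__(self, data_sharing: float):
        # 0.0 = share no data (ads pay nothing), 1.0 = share everything
        self.data_sharing = data_sharing
        self.balance = 0.0

    def view_ad(self, base_payout: float) -> None:
        """Better-targeted ads (more shared data) pay the user more."""
        self.balance += base_payout * self.data_sharing

    def pay_provider(self, fee: float) -> None:
        """Pay a web host, moderator, or aggregator out of the earned balance."""
        self.balance -= fee


account = UserAccount(data_sharing=0.5)
account.view_ad(base_payout=0.02)   # user earns half of a 2-cent impression
account.pay_provider(fee=0.005)     # and spends part of it on hosting
print(f"balance: {account.balance:.4f}")   # balance: 0.0050
```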

Innovation Always Trumps Regulation

Rather than offering a prescription for what ails the social media marketplace, Ulbricht’s paper is a prediction from a prison cell. The unstoppable march of innovation is sure to further segment the digital marketplace for social software into intricately specialized niches. The decentralized social media model is merely a description of how those businesses and technologies would need to operate.

Because Ulbricht was not granted the clemency he had hoped for from the Trump administration, the infamous programmer will not be the one to found the moderation or content aggregation start-ups he describes. Public figures such as Jordan Peterson, Dave Rubin, and Tim Pool have all claimed to be creating platforms that in some way aggregate social media, beginning the process of decentralizing, or polycentrizing, the social media market. It remains to be seen whether these or any start-ups will truly realize Ulbricht’s ideas, but if the CDA is not updated soon, it will likely be circumvented.

Just as 3D printers have shown several gun laws to be archaic, if not entirely obsolete, the best way to counter a bad set of laws or regulations is to create a technology or idea that renders them pointless. Ulbricht may not be the one to lead the charge, but his simple Medium post certainly opened a door.

Gavin Hanson (born in ’96) is the Editor-in-Chief of Catalyst

Published with permission from Catalyst. Read the original article here.
