And the results so far? Well, let's just say Facebook's legal troubles have cost it a significant amount of money. The company owes $5 billion under an FTC settlement, for example, though even that figure is a drop in the bucket compared to Facebook's actual worth. Tap or click here to see how much Facebook owes.

After so much trouble in court, Facebook is looking for any way it can to cover itself against further repercussions. And now, it's issuing a bizarre warning to every user: the content you post could be subject to removal at Facebook's discretion. Is Facebook going to start restricting free speech? As it turns out, the truth is far more complicated.
Facebook pushes an eerie warning about content being posted
If you've been on Facebook at all during the last several days, you may have gotten a strange alert informing you that content you post could be deleted to "mitigate adverse legal or regulatory impacts." This word salad of legalese may seem spooky, but what it actually signals is a much broader change in the way Facebook moderates content.

Until recently, Facebook (and other social networks) enjoyed broad protection under Section 230 of the Communications Decency Act. This provision says Facebook isn't liable for what its users post, and that the company is free to moderate and remove content as it sees fit. But all that changed for Facebook with an executive order signed by President Trump on May 28. Now, per the order, Facebook can lose its Section 230 protections if it's found to discriminate or show bias against users and the material they post.

And Facebook's response to this order? It's changing its own rules to protect itself. While it's not 100% confirmed that Facebook is responding to the President's executive order, it's a reasonable assumption given the timing of the announcement. Facebook, most likely, doesn't want to land in any more legal hot water if it can help it.

Facebook now argues that it can delete or restrict access to posts whenever the content could lead to negative legal repercussions for the company. In other words, anything that threatens Facebook's business is no longer allowed.

So what kind of content can't be posted? Strangely, Facebook isn't saying how these decisions will be made, only that the rules take effect for good on October 1. On one hand, this may be good news for people who have asked Facebook to take a more aggressive stance against sexually inappropriate and violent content making its way to the site (all of which can get the company into legal trouble). Others, however, fear a chilling effect on free speech and a chance for Facebook to discriminate against users even further. Either way, Facebook is going out of its way to cover its behind, particularly as the presidential election draws ever closer.
Putting up even more guardrails
These aren't the only moderation changes taking place at Facebook, either. Just this week, Facebook announced a change to help fight the spread of viral fake news and other misinformation in Facebook Messenger. Under the new rules, users can only forward messages to a maximum of five people or groups at a time. If you try to go beyond this limit, you'll get an alert saying "Forwarding limit reached." The change is rolling out market by market and should be fully global by September 24.

In addition, Facebook announced that it will no longer accept new political ads on the platform in the final week leading up to the election. It also plans to remove any posts or ads that claim citizens can catch COVID-19 at the voting booth.

When put into practice, the changes above will actually cost Facebook more money than doing nothing at all; turning away new political ads means turning away paying advertisers. So why make the changes? While we don't know for sure, it's fairly safe to assume that Facebook doesn't want a repeat of Cambridge Analytica and the congressional scrutiny that scandal triggered. Tap or click here to see if your data was used during the scandal.

By trying to become a more neutral arbiter, is Facebook actually making its platform less open and free? As bad as this might sound, it's really more of a return to form than anything else. In the early days of the web, most discussion forums were privately moderated, and the people in charge could delete or edit posts at their whim. Could this old-fashioned way of moderating content be the future of social media?

Whatever happens from here, there's no doubt that online speech is changing quickly. In all likelihood, the web of tomorrow will look nothing like it does today. Curious about a Facebook alternative? Tap or click here to see if Parler, the "free speech" social network, is worth joining.