Responsibility: User vs. Platform Part III

droplet
Jul 1, 2021


Image by Prettysleepy from Pixabay

In the third and final installment of our responsibility series, we’ll explore one of the most controversial and subjective conversations around social media: should content be moderated, and to what degree? We’ll say up front that we don’t claim to have the final answer to this question; in fact, we’d go a step further and say there may never be one right answer. So how do we address a question that can’t be answered?

On one side of the argument is the case for free speech; on the other, a more complex set of concerns, most of which ultimately come down to safety. Advocates of stricter platform moderation cite problems such as bullying, the growth of hate groups and terrorist organizations, and misinformation that can lead to real-world harm, including acts of violence. While free speech advocates will often concede that these are the dangers of unchecked speech, they are quick to point out the danger of powerful authority figures (particularly unelected ones) being able to control what others can and cannot say.

Politicians from both ends of the spectrum have become increasingly invested in cracking down on social media platforms, but it’s still unclear what solution would appease all interested parties. Making companies legally responsible for what is posted on their platforms would push them toward stricter content moderation, while making them legally responsible for restricting users’ speech would penalize that same moderation, which makes it seemingly impossible for politicians across the aisle to agree on a workable solution.

It is likely that, at least in the United States, platforms will continue to set their own policies concerning speech for some time. There are obvious pitfalls to this, as the concerns on both sides of the argument continue to go unaddressed, but there are also benefits to having a variety of spaces, each offering a different balance of freedom and safety to accommodate a diverse public. So rather than count on a universally agreed-upon set of guidelines that will likely never come, social media platforms need to create clearer policies of their own.

Facebook seems woefully out of touch with its own policies: its oversight board overturned four of the five moderation decisions in its first rulings and called the indefinite suspension of former President Trump “standardless” and not in accordance with the company’s rules. Twitter’s CEO, Jack Dorsey, publicly acknowledged the platform’s shortcomings in content moderation and policy when he tweeted after Trump’s ban that it was “a failure of ours ultimately to promote healthy conversation.” And as you may remember from a previous post, YouTube pulled our video from its platform earlier this year, giving us firsthand experience of flawed algorithms undermining whatever policies the platform may be striving for.

So while the debate over how much content moderation is appropriate or necessary will continue, one thing is already clear: platforms need to take responsibility for whatever policies they choose to enact. Such clarity would likely help the platforms weather criticism with more dignity and a clearer conscience, as well as help users find the platforms where they wish to spend their time.

Enjoying learning something new? Sign up for our e-mail newsletter and get a bite-sized piece of inspiration and information every week, as well as a free Bill of Rights download as a welcome gift!

Click here to sign up

You can also support us by sharing this article, following us, and applauding our posts.
