
Speech, Consequences and The MAP

It’s not easy to limit speech wisely–you need a LAAP that fits your venue and goals. But it’s even harder to impose consequences for “crebit” (speech that violates the LAAP) in a moral way (as laid out in The MAP), because while we limit speech, we punish people. Today’s example in the news:

Facebook and Trump are at a turning point in their long, tortured relationship

On Jan. 6, as an angry mob stormed the U.S. Capitol, President Donald Trump posted on Facebook that his supporters should “remember this day forever.”

“These are the things and events that happen when a sacred landslide election victory is so unceremoniously & viciously stripped away from great patriots who have been badly & unfairly treated for so long,” he said in a post.

In response, Facebook did something it had resisted for years: banned Trump’s account indefinitely for inciting violence. Twitter, YouTube and others followed suit. The ban is the culmination of a long-running and tortured relationship between the politician and the social media company, one that will hit a new inflection point on Wednesday. That’s when a Facebook-funded panel of experts will announce whether Facebook must reinstate Trump’s account. The impending decision by the Oversight Board, a less than one-year-old body that describes itself as an “experiment” in the regulation of online speech, could be the most consequential decision ever regarding free speech on social media, according to experts. It could also alter the way that social media companies treat public figures going forward.

I’m not going to say anything about Facebook’s LAAP–its policies regarding what can be said.  Instead, I’ll talk only about the consequences of speech.

This particular case involves one of the most extreme consequences available to the host of an online venue: the banhammer. But it extends easily to more traditional venues.  When is it appropriate to ban a colleague from future business meetings?  To ban a student from a classroom?  To ban someone from holding their own meeting in your venue (like calling a meeting or giving a public talk)?

Let’s walk through a few principles from the MAP. The banhammer doesn’t seem that effective in online settings–online platforms ban speakers all the time, and new accounts and venues pop up to replace them. It’s far more effective in traditional settings, because not everyone is invited to speak in the first place. And the consequences are far more severe. Banning a colleague from meetings might mean they can’t do their job, so you’re basically firing them. Banning a student from a classroom might mean they can’t pass their classes, so you’re basically expelling them. They must have violated some very important obligations for such severe consequences to be proportional.

Consequences for speech also risk running afoul of the Entity Principle, which requires holding entities accountable for all they do as a steward or governor, but nothing else. Banning a speaker, however, may impose consequences on that person’s listeners, who have done nothing wrong.

I don’t have any easy answers to the question of what makes banning the right approach.  I’ll leave that for a team to discuss in class, or for a student to address in a Canvas discussion forum. Instead, I’ll close with this advice:  head off LAAP-violating crebitry before it occurs, rather than waiting to hold people accountable after the fact.

UPDATE: The Board has rendered its decision. Their concerns seem similar to mine. Here’s the summary:

The Board has upheld Facebook’s decision on January 7, 2021, to restrict then-President Donald Trump’s access to posting content on his Facebook page and Instagram account.

However, it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension. Facebook’s normal penalties include removing the violating content, imposing a time-bound period of suspension, or permanently disabling the page and account.

The Board insists that Facebook review this matter to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform. Facebook must complete its review of this matter within six months of the date of this decision. The Board also made policy recommendations for Facebook to implement in developing clear, necessary, and proportionate policies that promote public safety and respect freedom of expression.

They also note the impact of banning one person on the welfare of others:

Facebook’s decision to suspend Mr. Trump’s Facebook page and Instagram account has freedom of expression implications not only for Mr. Trump but also for the rights of people to hear from political leaders, whether they support them or not. Although political figures do not have a greater right to freedom of expression than other people, restricting their speech can harm the rights of other people to be informed and participate in political affairs. However, international human rights standards expect state actors to condemn violence (Rabat Plan of Action), and to provide accurate information to the public on matters of public interest, while also correcting misinformation (2020 Joint Statement of international freedom of expression monitors on COVID-19).

I found it particularly interesting that the Oversight Board relied heavily on international standards.  Here’s their summary:

International law allows for expression to be limited when certain conditions are met. Any restrictions must meet three requirements – rules must be clear and accessible, they must be designed for a legitimate aim, and they must be necessary and proportionate to the risk of harm. The Board uses this three-part test to analyze Facebook’s actions when it restricts content or accounts. First Amendment principles under U.S. law also insist that restrictions on freedom of speech imposed through state action may not be vague, must be for important governmental reasons and must be narrowly tailored to the risk of harm.

Clear and accessible rules would go a long way toward fulfilling my advice of heading off inappropriate speech before it occurs.  If people don’t know the limits of the LAAP, how can they stick to them?

The “legitimate aim” requirement ties closely into a key assumption of any LAAP–that there is a purpose to the local conversation.  That purpose guides the agenda, which in turn limits speech by casting some topics as irrelevant.

Necessity is the most interesting one to me.  The Board says:

The requirement of necessity and proportionality means that any restriction on expression must, among other things, be the least intrusive way to achieve a legitimate aim (General Comment No. 34, para. 34).

I find this interesting because lately I’ve been wondering if the MAP is incomplete–it does not include any specific requirement about governing in the least intrusive way.  I’ve toyed with a principle like “Governance should impose the least possible cost on the governed and on society.”  This would fit well with a longstanding principle of “Efficiency”, which auditors commonly pair with Effectiveness:  You want to do a good job of auditing financial statements, but you don’t want to impose too much cost on your client.
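To make the three-part test concrete, here’s a minimal sketch in Python. Everything in it is my own modeling assumption: the Restriction type, the integer intrusiveness ranking, and the mitigates_harm flags are invented for illustration, and the menu of measures simply paraphrases the “normal penalties” the Board lists above. The point is the necessity step: the governor walks the menu from least to most intrusive and stops at the first measure that actually addresses the harm.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Restriction:
    """A candidate moderation measure (a hypothetical model, not Facebook's machinery)."""
    name: str
    intrusiveness: int    # higher = more intrusive on the speaker and their listeners
    mitigates_harm: bool  # would this measure actually address the risk of harm?

def least_intrusive_measure(rule_is_clear: bool,
                            aim_is_legitimate: bool,
                            menu: List[Restriction]) -> Optional[Restriction]:
    """The Board's three-part test: (1) clear and accessible rules,
    (2) a legitimate aim, (3) necessity and proportionality, modeled here
    as picking the least intrusive measure that still works."""
    if not (rule_is_clear and aim_is_legitimate):
        return None  # the restriction fails before any measure is weighed
    for measure in sorted(menu, key=lambda m: m.intrusiveness):
        if measure.mitigates_harm:
            return measure
    return None  # no listed measure suffices; the menu itself is inadequate

# The menu paraphrases the "normal penalties" quoted above. Note that an
# indefinite, standardless suspension never appears on it.
menu = [
    Restriction("remove the violating content", 1, mitigates_harm=False),
    Restriction("time-bound suspension", 2, mitigates_harm=True),
    Restriction("permanently disable the page and account", 3, mitigates_harm=True),
]
print(least_intrusive_measure(rule_is_clear=True, aim_is_legitimate=True, menu=menu))
```

Notice what falls out of this framing: an “indefinite suspension” can never be selected, because it was never a defined measure on the menu in the first place. That is essentially the Board’s objection.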

I’ll close by noting that Facebook’s operations are complex enough that they have lots of governance options short of banning.  While the following paragraph doesn’t say so explicitly, I read it as an indirect way of saying “hey guys, you could have just tweaked your algorithms and other tools to limit the reach and impact of his posts.”

Facebook stated to the Board that it considered Mr. Trump’s “repeated use of Facebook and other platforms to undermine confidence in the integrity of the election (necessitating repeated application by Facebook of authoritative labels correcting the misinformation) represented an extraordinary abuse of the platform.” The Board sought clarification from Facebook about the extent to which the platform’s design decisions, including algorithms, policies, procedures and technical features, amplified Mr. Trump’s posts after the election and whether Facebook had conducted any internal analysis of whether such design decisions may have contributed to the events of January 6. Facebook declined to answer these questions. This makes it difficult for the Board to assess whether less severe measures, taken earlier, may have been sufficient to protect the rights of others.
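Here’s an equally toy sketch of what those “less severe measures” might look like. The states and multipliers below are entirely hypothetical–Facebook declined to describe its design decisions to the Board–and the sketch only illustrates that reach is a dial, not a switch.

```python
def distribution_score(predicted_engagement: float, state: str) -> float:
    """Toy reach-limiting model. The states and multipliers are invented for
    illustration; the Board's decision does not reveal Facebook's actual
    ranking internals."""
    multipliers = {
        "normal": 1.0,      # full algorithmic amplification
        "labeled": 0.5,     # corrective label applied, distribution damped
        "restricted": 0.1,  # visible to followers, excluded from recommendations
        "banned": 0.0,      # the banhammer: no distribution at all
    }
    return predicted_engagement * multipliers[state]

# A post predicted to earn 1,000 engagement units under normal distribution:
for state in ("normal", "labeled", "restricted", "banned"):
    print(f"{state:>10}: {distribution_score(1000.0, state):g}")
```

Even this cartoon version makes the point: there’s a lot of room between full amplification and the banhammer, and a governor who never explores that room can hardly claim to have used the least intrusive means.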
