Why Facebook Doesn’t Ban Drug Cartels From Using Their Site
Back in January, a Facebook investigator learned that a Mexican drug cartel was using Facebook to recruit, train, and pay hitmen. The appropriate response might seem obvious to some, but the situation becomes far more ambiguous and complex when business is on the line, and Facebook treated it that way. Even though this shocking behavior directly violated Facebook’s own rules, the company did not stop the cartel from posting on its site. Why is that?
Similar scenarios have surfaced throughout Facebook’s rise in popularity: Middle Eastern sex traffickers looking to lure women into abusive situations, armed militia groups in Ethiopia distributing hateful material against ethnic minorities, illegal solicitation of organs, pornography, sex work, and even government propaganda meant to stifle political dissent. Facebook employees knew of these violations long before any action, if any, was taken against the posters. Facebook blamed language barriers for its difficulty handling content from foreign countries, along with a need to placate authoritarian governments in order to keep operating within their borders, but investigators also concluded that Facebook placed heavy emphasis on protecting its business interests at large. What, then, were Facebook’s options, and what could have happened instead?
Applying the game theory we’ve learned thus far in the course, it’s possible to break down Facebook’s decision-making process and perhaps shed some light on its actions, or in this case, inaction. Facebook, like any company of its kind, is a business, and as such it makes informed business decisions that have the potential to affect the status of the company. In oversimplified terms akin to those brought up in course material, Facebook can either choose to remove all posts that violate its community guidelines, or it can choose to wholly ignore them. In these mock payoff matrices, Facebook is Player A, while Player B changes depending on the region in question.

For U.S. users, banning content that is already broadly unpopular in the U.S. poses no threat to Facebook’s relations with U.S.-based corporations and business interests. As U.S. Facebook user activity continues to dwindle with the rise of other, more fashionable forms of social media, Facebook focuses its attention on the bulk of its user base: Asia and other non-U.S., non-European countries.

In a second mock payoff matrix, Player B is one of the largest controlling Asian powers, China. Because Asia accounts for almost 43% of Facebook’s monthly user count, Facebook has far more on the line in removing the content of, for example, the Chinese government engaging in anti-Hong Kong suppression propaganda.

In a third matrix, Player B is an arbitrary small country where we assume human trafficking might be more prevalent. In this scenario Facebook has no business ties to the country and would have no reason not to police accounts; however, flagging violations of Facebook’s rules also takes considerable time and resources. Further, since this country has less relative business importance, and thus less possibility of media backlash, Facebook has much less incentive to enforce its rules.
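The three mock matrices above can be sketched as a simple best-response comparison. Here is a minimal Python sketch of that reasoning; every payoff number is a purely illustrative assumption (not real data), chosen only to reflect the qualitative trade-offs described above: enforcement is cheap and backlash is costly in the U.S., enforcement risks a huge user base in China, and in a small country with no business ties, enforcement costs resources while ignoring carries little backlash risk.

```python
# Hypothetical payoffs to Facebook (Player A) for each action, given who
# Player B is. All numbers are illustrative assumptions; higher = better
# for Facebook.
payoffs = {
    # Player B: U.S. public/advertisers -- content is already unpopular,
    # so removing it is cheap and ignoring it invites scrutiny.
    "United States": {"enforce": 8, "ignore": 2},
    # Player B: China -- enforcement risks a large share of the user base.
    "China": {"enforce": 1, "ignore": 6},
    # Player B: arbitrary small country -- no business ties, but
    # enforcement still costs time and resources, and backlash risk is low.
    "small country": {"enforce": -1, "ignore": 3},
}

def best_response(player_b: str) -> str:
    """Return the action maximizing Facebook's assumed payoff."""
    options = payoffs[player_b]
    return max(options, key=options.get)

for region in payoffs:
    print(region, "->", best_response(region))
```

Under these assumed numbers, the model reproduces the behavior described in the investigation: enforce in the U.S., ignore in China and in the small country, even though the rules are the same everywhere.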
Instead, Facebook chooses to direct its policing resources elsewhere, toward Europe and the U.S., where it would face more scrutiny, leaving the drug cartels of small, arbitrary country B to slip away.