Artificial Intelligence and Socially Optimal Allocation
With the advent of artificial intelligence (AI) looming, many people are coming to terms with the possibility of losing their jobs. Well-known scientists like Stephen Hawking warn that “AI could spell the end of the human race” (1), while entrepreneurs like Mark Zuckerberg remain quite optimistic. Feelings on the topic are decidedly mixed.
As with many things in life, an objective view of this issue can provide some much-needed clarity. Computing a socially optimal allocation can help quantify the “cost” or impact of AI on humanity: estimate what humanity stands to gain in the absence of AI and subtract that from what it stands to gain in the presence of AI. To attempt this, the table below treats AI and humans as the advertisers in a matching market. Repetitive tasks such as transportation, paralegal work, and house cleaning make up one slot, while “big thinking” tasks that require a human touch, such as managing a corporation, mentoring students, or serving as president, make up the other. Although these values are hard to quantify precisely, the idea is to build a matching market by averaging the estimates provided by pwcartificialintelligence.com.
| Advertiser | Slot assigned | Repetitive tasks | Big thinking |
|---|---|---|---|
| AI | Repetitive tasks | **80** | 55 |
| Humans | Big thinking | 75 | **80** |
The socially optimal allocation is represented by bold text. That is, AI should take repetitive tasks, while humans should take tasks that involve big thinking.
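To make the computation concrete, here is a minimal Python sketch (not part of the original analysis) that brute-forces the socially optimal allocation for this 2×2 matching market, using the rough valuations assumed in the table above.

```python
from itertools import permutations

# Hypothetical valuations, taken as the rough averages in the table above:
# each (advertiser, slot) pair maps to the advertiser's value for that slot.
advertisers = ["AI", "Humans"]
slots = ["Repetitive tasks", "Big thinking"]
values = {
    ("AI", "Repetitive tasks"): 80, ("AI", "Big thinking"): 55,
    ("Humans", "Repetitive tasks"): 75, ("Humans", "Big thinking"): 80,
}

def social_value(assignment):
    """Total valuation of an assignment mapping advertiser -> slot."""
    return sum(values[(adv, slot)] for adv, slot in assignment.items())

# Brute-force search over every way of matching advertisers to slots.
best = max(
    (dict(zip(advertisers, perm)) for perm in permutations(slots)),
    key=social_value,
)
print(best, "total value:", social_value(best))
# {'AI': 'Repetitive tasks', 'Humans': 'Big thinking'} total value: 160
```

With these numbers, assigning AI to repetitive tasks and humans to big thinking yields a total value of 160, versus 130 for the reverse assignment, which is why that allocation appears in bold.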
These values, though rough estimates, provide a useful vantage point from which to analyze the impact of AI. Boon or bane, given more accurate statistics, humanity can use socially optimal allocation and the VCG mechanism to objectively assess, to a large extent, the consequences of AI before things get out of hand.
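The VCG mechanism can likewise be sketched for this toy market. The snippet below, again only an illustrative sketch built on the same assumed valuations, computes each party's VCG payment as the harm it imposes on the other.

```python
from itertools import permutations

# Same hypothetical 2x2 valuations as in the sketch above.
advertisers = ["AI", "Humans"]
slots = ["Repetitive tasks", "Big thinking"]
values = {("AI", "Repetitive tasks"): 80, ("AI", "Big thinking"): 55,
          ("Humans", "Repetitive tasks"): 75, ("Humans", "Big thinking"): 80}

def best_total(agents):
    """Best achievable total value when only `agents` are present."""
    return max(
        sum(values[(a, s)] for a, s in zip(agents, perm))
        for perm in permutations(slots, len(agents))
    )

# Socially optimal allocation found earlier.
optimal = {"AI": "Repetitive tasks", "Humans": "Big thinking"}

for agent in advertisers:
    others = [a for a in advertisers if a != agent]
    # VCG payment = harm the agent imposes on everyone else:
    # (others' best value without the agent) minus
    # (others' value in the chosen allocation).
    payment = best_total(others) - sum(values[(a, optimal[a])] for a in others)
    print(f"{agent} pays {payment}")
```

In this example both payments come out to zero, since each party most values a different slot and neither crowds the other out; with more competitive valuations, the payments would quantify the externality each side imposes on the other.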
Works Cited
1. Rawlinson, Kevin. “Microsoft’s Bill Gates Insists AI Is a Threat.” BBC News, BBC, 29 Jan. 2015, www.bbc.com/news/31047780.