Aliens and Bayes’ Rule
http://www.usatoday.com/story/opinion/2016/11/17/aliens-space-fermi-paradox-intelligent-life-glenn-reynolds-column/93976586/
Humans have always been fascinated with the idea of aliens. For centuries, we have asked, and worked toward answering, the question, “Are we alone in the universe?” Looking at this question through a statistical lens suggests that it is highly unlikely for humans to be the only life in the universe, and improving technology reveals more Earth-like planets every year. Most theories offered to explain our apparent isolation seem like a stretch: are we the first intelligent species, or are we simply not advanced enough to detect other, far more advanced ones?
This article references a paper by James Miller and D. Felton titled “The Fermi Paradox, Bayes’ Rule, and Existential Risk Management,” which offers a new way of thinking about this question. Bayes’ Rule (or Bayes’ Theorem) is a result in statistics that describes how to update the probability of a hypothesis based on prior knowledge and new evidence. The paper builds on the idea of the Great Filter: the hypothesis that there is a stage at which a species becomes advanced enough to destroy itself, whether through advanced weaponry, powerful artificial intelligence, failed experiments, or any number of other possibilities. The paper argues that instead of searching only for thriving civilizations, we should apply Bayes’ Rule and also look for the aftermath of destroyed planets and civilizations, so that we can study them and try to keep ourselves from ever reaching that filter. Whether or not there is a “Great Filter,” human technology is advancing rapidly, and before long we may grow restless on Earth and venture out into other parts of the galaxy. Only then will we start finding real answers.
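To make the Bayes’ Rule idea concrete, here is a minimal sketch in Python. The hypotheses, priors, and likelihoods below are purely illustrative numbers chosen to show the mechanics of the update; they are not values from the paper or the article.

```python
# Bayes' Rule: P(H | E) = P(E | H) * P(H) / P(E)
# All numbers below are hypothetical, used only to illustrate the update.

# H: "a Great Filter still lies ahead of us"
prior_filter_ahead = 0.5           # P(H): belief before considering the evidence
prior_filter_behind = 0.5          # P(not H)

# E: "we observe no surviving alien civilizations"
# Likelihoods: how probable that silence is under each hypothesis (made up).
p_silence_if_ahead = 0.9           # P(E | H)
p_silence_if_behind = 0.4          # P(E | not H)

# Total probability of the evidence, P(E), via the law of total probability.
p_silence = (p_silence_if_ahead * prior_filter_ahead
             + p_silence_if_behind * prior_filter_behind)

# Posterior belief after seeing the evidence, P(H | E).
posterior_filter_ahead = p_silence_if_ahead * prior_filter_ahead / p_silence

print(f"P(filter ahead | silence) = {posterior_filter_ahead:.2f}")  # ~0.69
```

Under these made-up numbers, observing a silent sky shifts the belief that a filter lies ahead from 0.50 to about 0.69, which is the kind of updating the paper applies to evidence such as the remains of destroyed civilizations.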