I was saying to D last night that one thing I never tire of is reading descriptions of just how big the universe is. I'd just read another good one before I said that.
The Fermi Paradox underestimates just how large the universe is. Trillions of galaxies. That scale isn't something we're really capable of comprehending.
The Dark Forest lays out a fantastic reason why any civilization that wants to persist would be well served to STFU and lie low. That's another pretty practical resolution of TFP.
As soon as I saw this thread, I thought of that theory. I love that despite coming directly from a sci-fi novel, it's actually gained a lot of traction in the scientific community. And that makes sense when you combine its two basic tenets (every civilization seeks to survive, and resources are finite). It doesn't get much simpler than that - under those constraints, there may very reasonably be no scenario in which peaceful interaction between civilizations is the rational choice. So it's a pretty compelling argument to, indeed, STFU.
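To make that concrete, here's a minimal expected-value sketch of the choice the theory implies. Nothing here is from the thread or the novel; every probability and payoff number is invented purely for illustration:

```python
# Toy expected-value calculation for "broadcast vs. hide" under the
# theory's two tenets. All numbers below are made up for illustration.
P_HOSTILE_LISTENER = 0.01   # chance a hostile civilization hears us
GAIN_FROM_CONTACT = 10      # benefit of a friendly exchange
LOSS_FROM_STRIKE = 1e9      # annihilation dwarfs any possible gain

ev_broadcast = ((1 - P_HOSTILE_LISTENER) * GAIN_FROM_CONTACT
                - P_HOSTILE_LISTENER * LOSS_FROM_STRIKE)
ev_hide = 0.0               # stay silent: nothing ventured, nothing lost

print(f"EV(broadcast) = {ev_broadcast:,.1f}")   # about -10,000,000
print(f"EV(hide)      = {ev_hide:,.1f}")
# As long as annihilation is a possible outcome, even a tiny chance of
# a hostile listener makes hiding the rational choice.
```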
Overall, there are a lot of big ideas behind any discussion of this kind of topic. You could look at it from a more existential POV: we live in a more or less cozy, enclosed world where we treat other humans with different outlooks and perspectives as the greatest threat to our way of life or what we believe, but that's really just the kind of bias Nassim Taleb's Black Swan theory describes, isn't it? You could argue that a single encounter, however rare, with any lifeform from someplace beyond - intelligent or otherwise - is a far greater existential threat than the day-to-day threats posed by economic and "superpower" rivalries. Just one such encounter could wipe us all out in the time it takes to understand the term "exponential growth."

What happens if we encounter some single-celled organism we've never seen before, whether on a journey into space or on a chunk of rock that crashes through our atmosphere? That could play out much like when Europeans, who had already cohabited with animals for centuries, encountered isolated villages of natives in the New World: wholesale elimination of civilizations by diseases that were novel to those populations but that the "invaders" weren't even aware they carried.

And that's just one example of something we're already aware of - what if the "life" we encounter doesn't fit any of our definitions? I think the best line from that 60 Minutes story was when one of the guests said we have to be prepared to confront the fact that our understanding of the very term "life" is probably pretty limited. Our "rules" don't mean anything once you get beyond our own biases.
If you're a data scientist, you can play out the question from a game theory perspective and design your own fairly straightforward simulations (like, say, this example - BTW, for anyone who is a data scientist or even just interested in the world of so-called "big data" analytics, Towards Data Science is a great site, with lots of coding tips, methodology discussions, and tons of examples).
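In that spirit, here's a minimal agent-based sketch of my own (not taken from that article; every parameter is made up) that plays out "loud vs. quiet" strategies in a toy galaxy:

```python
import random

random.seed(42)

N_CIVS = 1000        # civilizations in our toy galaxy
P_HOSTILE = 0.10     # fraction that destroy anything they detect
P_LOUD = 0.50        # fraction that broadcast their existence
P_DETECT = 1e-4      # per-step chance one hostile hears one broadcast
STEPS = 100

civs = [{"loud": random.random() < P_LOUD,
         "hostile": random.random() < P_HOSTILE,
         "alive": True}
        for _ in range(N_CIVS)]

for _ in range(STEPS):
    hostiles = [c for c in civs if c["alive"] and c["hostile"]]
    for c in civs:
        if not (c["alive"] and c["loud"]):
            continue  # quiet or dead civilizations emit nothing
        # Each surviving hostile gets an independent chance to hear us.
        if any(random.random() < P_DETECT for _ in hostiles):
            c["alive"] = False  # detected broadcasters are eliminated

loud_total = sum(1 for c in civs if c["loud"])
loud_alive = sum(c["alive"] for c in civs if c["loud"])
quiet_alive = sum(c["alive"] for c in civs if not c["loud"])
print(f"loud civs surviving:  {loud_alive} / {loud_total}")
print(f"quiet civs surviving: {quiet_alive} / {N_CIVS - loud_total}")
# Quiet civilizations are untouched while the loud population steadily
# thins out - the Dark Forest logic in miniature.
```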
All in all, I think the more relevant thing to consider than merely whether other intelligent life exists in the universe is that humanity is kind of like a bunch of five-year-olds trying to walk a tightrope fifty feet above the ground: we only recently learned to walk, we're confident in our ability to do so, and we're perhaps just intelligent enough to start asking how to walk that rope, but we're in no way prepared for what happens when we get a few feet out and realize the rules of simply walking aren't enough to keep us from falling. Our ability to ask the question in no way means we're ready to answer it. Even if the Dark Forest theory is totally wrong, which is certainly a possibility, you could argue that deliberately beaming messages into space (what's usually called METI or Active SETI, as opposed to SETI's passive listening) is objectively stupid: having the ability to send a message doesn't make it wise to do so when you haven't a clue what kind of response it might elicit.