“The human race’s prospects of survival were considerably better when we were defenseless against tigers than they are today when we have become defenseless against ourselves” – Arnold Joseph Toynbee
Everybody loves the Drake Equation, what with its fast and loose prediction of the number of communicating civilizations in the Milky Way galaxy (N). The original estimates went like this: 1 star formed per year, on the average, over the life of the galaxy; one fifth to one half of all stars having planets; each star with planets hosting between 1 and 5 planets capable of developing life; 100% of those planets eventually developing intelligent life; 10–20% of intelligent species able to communicate; and, most pernicious of all the variables, “L” (the length of time for which such civilizations release detectable signals into space), generously estimated at somewhere between 1,000 and 100,000,000 years. Multiply it all through and you get somewhere between 20 and 50,000,000 civilizations out there.
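The arithmetic is easy enough to check for yourself. Here is a minimal sketch in Python (the `drake` function and its parameter names are just conventional Drake-equation notation, nothing official), multiplying the endpoints of the ranges quoted above:

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: expected number of detectable civilizations.

    N = R* x fp x ne x fl x fi x fc x L
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Pessimistic endpoint of each range quoted above
low = drake(r_star=1, f_p=0.2, n_e=1, f_l=1.0, f_i=1.0, f_c=0.1,
            lifetime=1_000)
# Optimistic endpoint of each range
high = drake(r_star=1, f_p=0.5, n_e=5, f_l=1.0, f_i=1.0, f_c=0.2,
             lifetime=100_000_000)

print(f"N somewhere between {low:,.0f} and {high:,.0f}")
```

Notice that six of the seven orders of magnitude in that spread come from “L” alone, which is why it is the pernicious one.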
At the first scientific meeting regarding the search for extraterrestrial intelligence (SETI), astrophysicist Frank Drake posed the equation as a probabilistic thought piece, and the conclusion of the meeting attendees was the rather ambiguous statement that N was roughly equal to L. This of course leads to the Fermi Paradox, that is, the offhand comment of Enrico Fermi to Edward Teller, Herbert York and Emil Konopinski on their way to lunch, pointing out that given the high probability of intelligent life out there in the universe, why haven’t we heard from anybody? By our estimates the universe is about 13.8 billion years old, thus scale and probability seem to favor intelligent life being common in the universe, and yet we have a total lack of evidence of it. What gives?
Such speculation has given rise to a lot of attempts to assuage our cosmic loneliness with hypotheses that account for our apparent universal solitude. Maybe life is actually rare and Earth is an aberration. Perhaps life isn’t rare, but intelligent life is, or doesn’t advance very far technologically as a rule. Extinction-level events may be more common than we think. Then of course there are a whole range of sociological and conspiracy theories that end up with everybody isolating themselves or lurking about trying not to be noticed. Sadly, after a recent re-read of British historian Arnold Joseph Toynbee (1889–1975), I’m starting to think it’s that pesky “L” variable. Granted, we only have the one sample of our own species, but an analysis of the rise and fall of civilizations seems to indicate that intelligent life sucks at “L”.
Toynbee wrote a twelve-volume treatise on the study of history where he examined twenty civilizations, most of them related as parent or offspring to one or more of the others: namely the Western, the Orthodox, the Iranic, the Arabic, the Hindu, the Far Eastern, the Hellenic, the Syriac, the Indic, the Sinic, the Minoan, the Indus Culture, the Sumeric, the Hittite, the Babylonic, the Egyptiac, the Andean, the Mexic, the Yucatec and the Mayan, and commented, “If we take the antiquity of Man to be something like 300,000 years, then the antiquity of civilizations, so far from being coeval with human history, will be found to cover less than 2 percent of its present span: less than 6,000 years out of 300,000. On this time-scale, the lives of our twenty-one civilizations-distributed over not more than three generations of societies and concentrated within less than one-fiftieth part of the lifetime of Mankind – must be regarded, on a philosophic view, as contemporary with one another.”
In short, Toynbee took the long view and was telling us to get over ourselves. You may have an iPhone, but you’re not that far removed from digging canals between the Tigris and Euphrates in the grand scheme of history. Now we have a lot of apocalyptic scenarios for the end of civilization, and these days we tend to focus on nuclear destruction, incurable plague, runaway climate change, and invasion from the unwashed, yet unspecified barbarian hordes (well, often specified, but the identification varies based on who’s doing the specifying). Toynbee, after decades of studying the breakdown of civilizations, came to the conclusion that we can screw up the environment, we can screw up the social order, and we can face barbarian invasions, but that isn’t what ultimately destroys civilizations. He argued that the real cause of the fall of civilization was that societies that develop great expertise in solving old problems overdevelop their structures for doing so, and thereby become incapable of solving new ones.
We’ve come a long, evolutionary way with our science and technology in a relatively short period of time, and have a perhaps misplaced confidence that we can “science” our way out of our ultimate doom, but the utter silence of the universe seems to argue against it. What we call civilization tends to be a flash in the pan. The trend of civilization seems to be a rapid rise, fall, and disintegration into a dark age. We landed on the moon. We send out an occasional space probe, or Martian lander. We beam signals out into the void and organize marches on Area 51, confident that one day little green men will send us a love note. And then we go about the more mundane business of killing each other, wrecking the planet, and applying the same old solutions to increasingly complex problems. Intelligent life may very well be abundant out there in the Milky Way, but civilization as we understand it, like life, is more likely to be, as Thomas Hobbes said, “nasty, brutish, and short”.
One might argue that civilization has a self-destructive impulse, an expression of the Freudian “death instinct”: as individual organisms we fight for our own survival, but direct our destructive inclinations toward the external world, or as aptly phrased by Jean-Paul Sartre, “One could only damage oneself through the harm one did to others. One could never get directly at oneself”. We know that thus far the universe is silent. We know that the lifespan of our own civilizations is ludicrously short when viewed against cosmic timescales, yet our ego tells us we must be the pinnacle of something. As Werner Herzog observed, “Civilization is like a thin layer of ice upon a deep ocean of chaos and darkness”. But at least it’s quiet.
Assuming the universe started with one event, the “Big Bang,” maybe every sentient being is progressing at the same rate. I guess my idea depends on whether Earthlings really have been visited by space vehicles, which would indicate that some civilizations started before ours, or that they’re smarter.
Esoterx, your post is a bit melancholy. Toynbee can have that effect. Although his massive comparative history fell out of favor and his theory about the rise and fall of civilizations was found wanting by historians, I think there is merit in Toynbee’s appraisal of the limits of technology to solve human problems.
The “industrial system” (as Toynbee liked to call applied science/technology) lends itself to treating people (all life for that matter) as things. This epistemological error facilitates a moral error–which is fine if a species or culture places no great value on survival (“live fast, die young, leave a good-looking corpse”).
It may be that Toynbee would have endorsed the great filter answer to the cosmic where-is-everybody question, extrapolating from what he knew about our civilizations and our species. But what rationale justifies this extrapolation? I’m willing to bet that some extraterrestrial civilizations have successfully made it past the great filter. I assume they have good reasons for being aloof.
The times in which we live make it hard to be optimistic about our chances of getting past the great filter. On the other hand, I think that sociocultural systems can and do evolve, and that (following Gregory Bateson) there is a natural selection of ideas by means of which epistemological errors can be corrected. I know there are wise people in the world. Unfortunately, too few of them seem to have any real power.
But even if our current civilization collapses, our species may yet be granted another shot at getting past the great filter, assuming that among the surviving pockets of human communities, one or more of them find a way to tame greed, hate, and fear, and discover a way of making knowledge subservient to wisdom.