Pessimism is permeating our national atmosphere. Newsweek may as well publish an article titled “We Are All Declinists Now,” for if there is one thing the left and the right, from Occupy Wall Street to the Tea Party, seem to agree on, it is that things are not going well in this country and appear to be getting worse.
This crisis of confidence is not unique to America, of course — across what was once quaintly called “Western Civilization,” the economies of entire nations are either mired in stagnation or teetering on the brink of collapse. Falling or stagnant living standards and birthrates portend a considerably older and poorer West in the not-so-distant future.
The reasons for this unfortunate state of affairs are many and varied, and not yet fully understood. But let us put aside the question of causation for a moment and ask instead: Can we in the West — and in the United States in particular — turn this boat around? Must we accept decline as inevitable?