A New Wrinkle on Public Choice Theory


In my business ethics classes, I typically discuss public choice theory (PCT) right after I survey ethical egoism. I explain that by taking egoism seriously, economists have been able to understand many issues more profoundly than philosophers, who currently tend to dismiss egoism from the realm of serious ethical theory. (Other important phenomena that are generally beyond the appreciation of academic philosophers, uninterested in the profound role of self-interest in real life, include moral hazard, the principal-agent problem, regulatory capture, and especially “rent-seeking.”) In short, many philosophers buy into the Hegelian notion that the state (which they equate with the government) is the realm of disinterested charity — unlike business, which they take to be the realm of pure self-interest. Most economists are not Hegelians, which needless to say is part of their charm.

PCT asserts three propositions. First, everyone in the political process — voters, politicians, government bureaucrats, special interest groups, and the lobbyists who represent them — is motivated primarily (if not entirely) by self-interest. That is, egoism governs political reality. Second, there is an asymmetry in what participants in the process have at stake: special interests often stand to gain a lot, while the average taxpayer stands to lose only a little. And there is a concomitant asymmetry of knowledge. Third, since politicians are not spending their own money, but are using other people’s money (OPM), they have no incentive to use resources for the general good.


The classic use of PCT is to explain why pork-barrel spending is so prevalent among politicians of every party (including those accurately characterized as libertarians, such as Ron Paul), and so hard to control. Suppose I am Congressman Jason (a jarring thought, I grant you), who represents a district dominated by a university. Suppose further that I am approached by a group of people who want me to build a “senior center” in my district, which, being a university-dominated area, has a lower concentration of old people than many other districts. They approach me, pleading their case, and reminding me that they made a large donation to my campaign in the last election. This group will typically include the folks who have the most to gain, such as the old people who will benefit from the project without spending a nickel more than the taxpayers who will never use it, and the construction firm that will pocket millions of bucks from it. But the group will give itself a virtuous-sounding name, such as “Citizens for the Elderly” or “Seniors in Solidarity.”

PCT predicts that I, the politician, will reason as follows: “If I put this project in some grand bill, say, a defense spending bill, and it passes, I will get tens of thousands of campaign dollars for my next election. And, hey, money is the mother’s milk of politics. Moreover, voters in my district — especially elderly ones — will see my name on this new center and give me credit for it, even though it was OPM that financed it. Of course, the populace as a whole would be better off if this senior center were built in a district with a higher concentration of old people, but it isn’t my money, so I don’t care.”

PCT also predicts that since some of the old people in my district stand to gain a lot, not to mention the construction company, they will follow the progress of the legislation very closely, write letters on its behalf to my colleagues, call other congresspersons, and so forth. On the other hand, since the average voter only stands to lose perhaps a dollar on this boondoggle — and has other pressing matters to worry about — that voter will have no incentive to follow the legislation. He will be “rationally ignorant,” in the snappy patter of the economists.
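To see the arithmetic behind that asymmetry, consider a deliberately simple sketch. The numbers here are purely hypothetical, chosen round for easy division rather than taken from any actual earmark:

\[
\text{stake per taxpayer} = \frac{\text{project cost}}{\text{number of taxpayers}} = \frac{\$150{,}000{,}000}{150{,}000{,}000} = \$1,
\qquad
\text{stake of the construction firm} \approx \text{millions of dollars.}
\]

At a dollar a head, even one hour spent tracking the bill is a losing investment for the ordinary voter, while for the concentrated interests a modest lobbying budget can return its cost many times over. Rational ignorance on one side, rational vigilance on the other.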

So, when we think of politicians acting for their self-interest, as predicted under PCT, it is self-interest at arm’s length, so to speak. We think of pols who decide to push suboptimal taxpayer-funded projects to directly help favored supporters or their constituents as a whole, so that they can indirectly benefit by harvesting more votes. But a recent article in the Washington Post suggests that when they put through pork-barrel projects, many pols receive a much more directly personal payoff.

The newspaper compared public records about the property owned by all 535 members of Congress, and correlated the information with the earmarks pushed by these people over the past four years. It turns out that 33 of these solons pushed projects (costing taxpayers over $300 million) that were within two miles of their own properties. Moreover, 16 of them pushed subsidies for companies or programs in which their immediate family members (children, parents, or spouses) were employed.

The report notes that under the rules of Congress, this is all perfectly legal, and the members so benefiting were under no obligation to disclose it.

When confronted, the legislators naturally explained away their conflicts of interest by remarking on how necessary the projects were for their local economies, and claiming that the personal benefits they received were unimportant or merely coincidental.

Here are some of the juicier examples. Notice that members of both major parties are well represented.

  • Rep. Bennie Thompson (D-MS) got $900,000 in funding to resurface some roads back home. One stretch of resurfaced road just happened to be the one on which he and his daughter own two homes.
  • Rep. Roscoe Bartlett (R-MD) arranged for $4.5 million in taxpayer cash to improve a freeway interchange at a junction near his 104-acre farm and a bunch of his rental properties.
  • Rep. Ruben Hinojosa (D-TX) got an earmark to widen a major road in his district that just happened to be 600 feet away from a property that was owned and being developed by his family.
  • Rep. Jack Kingston (R-GA) used a $6.3 million earmark to restore the beaches on little Tybee Island. By sheer coincidence, he owns a home on the island, about 900 feet from the beach.
  • Rep. John Olver (D-MA) got $5.1 million to realign part of a highway. The project starts at a part of the road that is only about 200 feet from his 15-acre home, as well as some adjoining properties owned by his family.

The capper is Rep. Doc Hastings (R-WA). He was serving on the House Ethics Committee when it defined a congressperson’s financial interest as one having “a direct and foreseeable effect” on his or her assets. But the committee added that, as the Post put it, “‘remote, inconsequential or speculative interests’ do not count.” Two years after writing this, however, Hastings himself got a $750,000 earmark for a new overpass — on a site only three blocks from a business he formerly owned and ran that is now operated by his brother, on land he still owns.

So when politicians spend OPM, they not only use it to buy votes, they often use it more crudely, to feather their own nests. This hardly supports the Hegelian concept of the state as the realm of disinterested charity.


Why do politicians think they can get away with such obvious use of taxpayer money for their own direct benefit? Well, to begin with, in every case they think they can successfully rationalize away their behavior to the voters. So, for example, Rep. Kingston, when questioned by the reporters, said brazenly, “It’s absurd to suggest that this benefits me. The beach doesn’t improve the real estate of a house, unless it’s on the beach. . . . The only thing that changes in value is the beachfront property. It does have an economic impact on the beach and the community.” One has to suppose that Kingston thinks the average voter is a fool — which may or may not be a plausible view, depending on the depth of your cynicism.

But one should also remember that politicians are rarely caught. Few reporters ever do the sort of research needed to discover cases of such directly beneficial pork projects. Indeed, the research that the Post reporters did seems to be the first of its kind.

Perhaps the Post will now do an exposé showing that Santa Claus doesn’t really exist. Its management, which is highly favorable to expensive government, may be similarly surprised.


Words of Auld Lang Syne


I don’t enjoy the start of a new year. With the exception of one occasion, when I was 12 years old and discovered that if I borrowed my brother’s shortwave set I could listen to January 1 arrive at one place after another around the globe, until it got all the way to Michigan, I have greeted the great event with surly cheer. The appearance of a new year simply makes me aware of all the things that went wrong during the last year, and that still aren’t going right.

This is particularly evident in the case of words. Every year is pregnant with a host of locutions that no intelligent person could ever have engendered, unless disgustingly drunk. But the ugly brats are born, and many of them grow up into big, ugly, popular clichés, monsters that continue stalking the landscape even as the next twelvemonth begins.

One way of hastening their end is to adopt the tactic of the aboriginal tribesman, who recites the names of his gods in order to get rid of them. Another tactic, similar to the first, is that of the modern corporation, which celebrates someone as Manager of the Year in hopes that he will retire.

Inspired by such examples, I now present my list of the Ten Most Gruesome Expressions of 2011, the ten phrases that have most clearly outlived their usefulness, if any. All these terms have lately displayed their full nastiness, though none of them actually originated in 2011 — a year oddly barren of brand-new tripe. Several of them, indeed, are already well stricken with dementia. But let’s not be clinical. Let’s just try to imagine what the world would be like without them, and pray God that they will soon be taken from us.

I’ve ranked our gruesome friends from 1 to 10, according to the danger I think they pose to the republic’s mental health — in other words, according to their tendency to make me sick. To preserve suspense, I’ll save the most sickening expression for last. Don’t peek. The worst is coming.

So here goes, starting with Number . . .

10. “Sweet” — as in the following conversation.

“Hello, Mrs. Smith. This is Dr. Jones. Your tests are back, and they show that your cancer may not be terminal.”

“Sweet!”


Preposterous. But hardly impossible. The Saccharine Salute now appears in conversations everywhere. It started with 16-year-old thrashers and druggies, but it has spread inexorably to older, more sensible types. Remember that I said “er” and “more,” and that we’re dealing with baby boomers here. As you know, we boomers were never as bright as Newsweek thought we were, and our mental age has not advanced as rapidly as our physical age.

9. “Epic.” Another thrasher term, as in “Dude! That is a seriously, seriously epic board,” as in “skateboard.” Since few publicly educated people know what an epic is, the word has easily passed from boarders to radio hosts to TV hosts to half the other people in the known universe. What next? Will “sonnet” become the universal contrastive term? “Dude! I got this gross little sonnet thing stuck on my sneaks, dude!” Ask yourself, what would Milton say?

8. “Due diligence.” This is a legalism, with an actual meaning. Please look it up, the next time you’re tempted to tell your son that you hope he’ll do his due diligence in school today. Until recently, the phrase was confined to legal circles. Then it got into politics, as Republicans and Democrats tried to blame each other for the depression (sorry! I mean the “downturn”) of 2008 and following. The other party had caused the mess by its failure to exercise due diligence. Well, to use the Valley Girl lingo of 30 years ago, “Duh! Yeah! Maybe some people, like all of you, might’ve screwed up. Yuh think?” Notwithstanding this obvious reflection, “due diligence” proved useful for scoring points in the great game of “which political party is better at running the country” (another nasty expression, toward which I will exercise due diligence in another Word Watch). The ultimate winner of this game is the person who can show that nobody in his party ever smokes weed, watches porn, or texts during office hours. “Due diligence” is an intensely conservative phrase, but its conservatism isn’t a philosophy, or anything that makes sense; it’s just a high-church way of covering your ass.

7. “Got your back.” I don’t know where this started, or who kicked it into popularity. It means, of course, that while you run out and try to shoot the enemy, I’ll stay here and discourage people from shooting you in the ass. I wonder: Which of us has the tougher job? The real purpose of “got your back” is to glorify the speaker, not to improve life for the listener. It’s a militant upgrade of the useless “I’m here for you.” I suppose the next upgrade will be “If you go down, believe me, buddy, I’ll give you the coup de grace.”

6. “Icon.” OK, I’ll admit it. In 2009 I published a book, The Big House: Image and Reality of the American Prison, which is part of a line of books offered by Yale University Press (and available for sale on amazon.com), and this line of books is called the Icons of America Series. So now I’ve advertised my book, and also prevented you from using the “icon” thing against me, since I already brought it up. But what “icon” means, in the context of that series of books, is “something that everyone can picture, and everyone thinks he understands, except that he doesn’t.” That’s a useful concept. There’s another meaning, which is even more useful: “a literal or literary picture that represents concepts of fundamental importance to the people who make and view it.” Thus, the lilies that adorn a picture of Mary and the Christ child illustrate her purity; the baby’s trusting look reveals his innocence; the cruciform gesture with which he stretches forth his arms foretells his redemptive death. There’s a scene in Homer’s Odyssey in which Odysseus arrives at Ithaca and is greeted by his patroness Athena. They sit under the sacred olive tree and plot the ruin of the suitors. This scene is also an icon. It presents a vision of the ideal god and hero — similar in character, equal in virtue, and equally disposed to plotting and enjoying their plots. But notice: none of this adds up to “Kate Voted 2011’s Top Beauty Icon,” “Patti Smith at 65: From Rebel to Icon,” or “Hotel an Icon in Red Hook for 164 Years.” If “icon” means “celebrity,” call Kate a celebrity. If to be “an icon” means to be famous, say that Red Hook has a famous hotel. I don’t know what you do with the Patti Smith headline. Find some other meaningless word, I guess.

5. “Double down.” This phrase first became popular in an innocent way. It conveyed the stubborn fecklessness of President Obama, a bad gambler who somehow considers himself a good one. Then it became a synonym for “continuing one’s course” or simply “being consistent.” And that is wrong, very wrong. Obama is not doubling down every time he repeats the same campaign speech he’s been using for the past three years. He’s not a risky, heroic figure. He’s not Bret Maverick. Let’s ban this particular chip from the casino.

4. “Dead on arrival.” Here is the Democrats’ new favorite, and they would die without it. Of course, they still have trademark rights to “our children,” “the folks on disability,” “American workers that are out of jobs,” “people that are most in need,” and the all-purpose suffix “in this country” (as in “we need to do better for our children, the folks on disability, workers that are out of jobs, and people that are most in need, in this country”). All these terms have been useful in maintaining the Democratic base in its chronic condition of insanity. But what the Democratic politicians needed was a phrase that would gratify the base while menacing the opposition. Ideally it would be a phrase that expressed both their habitual arrogance and their frustrated spite about their massive losses in November 2010. So they picked up “dead on arrival.” Harry Reid is its biggest fan. When he finds the Republicans in their usual state of legislative dithering, he taunts them by asking where their bill is. When he finds that they may actually have a bill, he announces that the bill will be “dead on arrival.” It doesn’t occur to him that a man who looks like an undertaker shouldn’t be pushing images of dead bodies. It doesn’t occur to the mainstream media either. That’s why this repulsive expression is now appearing everywhere there.

3. “Kitchen table.” Here’s a homey phrase that is useful whenever a “news correspondent” accidentally asks a politician to comment on an important issue. Thus: “Do you think it’s a problem that in a time when other people have less and less money, the salaries and benefits of government employees keep going up?” That’s a real question, for a change. The real answer is simple: “Yes.” The phony answer takes more work. “Well, Marcie, I just think that when the American people sit down at the kitchen table to work out their family budgets, I just don’t think when they’re sitting there at the table, they’re really wondering what other people take home in their paychecks, or what benefits their public servants may have earned. I think what the American people are thinking about when they sit down there at the kitchen table to really think things out, they’re thinking about the really important issues. Will we have economic justice in this country? Will our public workers be getting a living wage? Will we take care of our seniors on Social Security and our young people in our public schools? Is there life on other planets?” “Kitchen table” is this year’s substitute for the first half of the favorite cliché of 2008, “Main Street versus Wall Street.” It’s a slimy attempt to convince you that Pennsylvania Avenue is not the problem. It’s an attempt to fool you into thinking that when you sit there at the kitchen table and stack up your pathetic statements of profit and loss (mostly loss) and try to figure out how you’re going to pay your ridiculous federal income tax, you are feeling exactly what some politician feels when he reclines in his limousine and tries to figure out how to make you pay still more. I’m surprised that I ranked this one as only No. 3.

2. “Up for grabs,” as in “the Iowa caucus is now up for grabs.” Nothing unusual about this one — just the awful certainty that for the next 11 months we’ll be told that “South Carolina is up for grabs,” “Florida is really up for grabs right now,” “there are over 400 House seats, and they’re all up for grabs,” and yes, “the White House itself is up for grabs.” I suggest that this metaphor be replaced by something similar but more explicit. Let’s try “the Senate is up for sale,” “the House is up for sale,” and “the White House is up for sale.” Those expressions would acknowledge the fact that if you tell the voters you are not going to pay them off, you are not going to increase Social Security benefits, increase veterans’ benefits, increase students’ benefits, increase almost everyone’s benefits, while decreasing almost everyone’s taxes, you will not be elected. Or so we are told.

Now bring the drums and trumpets! The end of the procession is in sight.

As Pogo said, we have met the enemy, and he is us. We are what our president calls the

1. “Folks.” All right, this is just another Obama-ism. But does that make it innocent? Certainly not. Yet its origins are sad. The f-word first gained control of Obama’s mind when the polls showed conclusively that he had lost the “folks.” So he obsessively created stories about various kinds of “folks” — “folks sittin’ around the kitchen table” (see no. 3, above), “folks that are just tryin’ to balance their checkbooks,” “folks that are hurtin’,” “folks that we’re helpin’” — an enormous crowd of folks to surround and comfort him. I reckon I’m one a them folks, cause I’m really hurtin’ when I hear crap like this. It worked for Huey Long, but, sorry, it doesn’t work for a guy who ran for president on his credentials as a Harvard grad. If there are folks in this world, President Obama is a non-folk. Like most partisan words, however, “folks” has wanderlust. It doesn’t care which side of the aisle it’s on. And why shouldn’t the Republicans have their crack at it, too? I’m sorry, very sorry to say this, but 2012 is likely to be the Year of the Folks. That’s what makes No. 1 so dangerous.

Now, that’s sort of a downer, isn’t it? You see what I meant about New Year’s. We’ve come to the end. The awards have been given. Nobody’s happy. It’s time to leave the auditorium.

I’m sure you’ve noted, however, that most of the phrases on this year’s roll of shame are political. I put it at six out of ten. This is not an accident or a product of my own whim. There is a law at work here, a law of linguistic devolution: the larger the government, the more it talks, and the more influence it has on everyone else’s discourse. That can’t be good.

But just remember: Liberty’s got your back.


Face Time


I was not an early adopter of Facebook. And I joined for commercial reasons. For a short time a few years ago, all the smart people in book publishing were saying that social media was the future of book promotion. Of course, at that point, the smart people in every industry were saying that social media was the future of promoting any product or service. Some of those smart people may have been in the employ of Zuckerberg & Co.

That conventional wisdom, like most such, turned out to be an exaggeration of a minor observation. My firm’s efforts at promotion through Facebook have yielded modest results. (The well-worn triad of direct mail, author spots on local talk radio, and carefully chosen display ads remains the most effective way to promote books.)

Despite this, I still use Facebook. And may use it more than ever. It’s a pleasant diversion, a low-maintenance way to stay in touch with family, friends and a group of “Facebook friends” — acquaintances from high school, college and other points in my life. It offers the interactivity of a chat room with the promise of enough vetting to keep out the most egregious cretins and child-molesters.

It’s also an interesting laboratory for measuring people’s attitudes about sports, politics, pop culture and the news.

One thing that I’ve learned is how presumptuous — and erroneously presumptuous — people are about the means and motives of online entertainment. Many of my acquaintances presume that there’s some system of consumer-protection law that applies to their dealings on Facebook. This applies especially to matters of “privacy.”

Facebook is, like Google, an advertising company at heart. The business model is to create an online space that people will visit regularly — and then to sell access to those people. Many of the activities on Facebook are designed to capture information about users’ likes and dislikes, so that Facebook can create detailed consumer profiles and sell precisely calibrated access to advertisers. Yet multitudes of Facebook users rage childishly when this or that detail comes to light about how the site collects information.

Another lesson (and the real reason for this Reflection): the politics and beliefs of most Americans are so ill-formed and erratic that it’s difficult to engage them in a meaningful way.

Recently, several of my Facebook friends posted approving comments about Warren Buffett’s “integrity” and “bravery” in calling for higher taxes on the wealthy. I pointed out — as I have in this space — that there’s no integrity or bravery in Buffett. At least on this issue. He’s acting in self-interest, and being cagey about it. His company’s holdings include several life insurance companies that sell annuities and other tax-avoidance mechanisms. The higher the federal tax rates, the more his products sell. He’s like an arsonist who owns the fire-extinguisher shop across the street from a theater that he sets afire during a sold-out performance of La Bohème.

Despite the ugly truth, some of my Facebook friends insisted that Buffett looks out for the working man. So, I pointed out that he is also a large shareholder in the Washington Post Company — whose highly profitable Kaplan Education unit destroys the lives of working-class idiots by selling them worthless degrees financed by costly student loans that aren’t dischargeable in bankruptcy.

At this point, a friend of one of my Facebook friends — who could read the comment thread through his connection to my friend (such is the nature of a social network) — commented that my use of the term “working-class idiots” was offensive. And that he knew better than I how predatory Kaplan Education is because he had borrowed tens of thousands of dollars to get a useless certificate in 3D animation from that very company. And that, several years later, he remains unemployed. But he wasn’t as angry at Kaplan or Buffett as he was at me for describing his ilk unkindly.

The What’s the Matter with Kansas wing of the American Left argues that presumably right-leaning corporate interests brainwash the middle class into voting against its own interests. But that brainwashing isn’t a Right/Left phenomenon. The same argument could be made of the presumably left-leaning Warren Buffett and the unemployed friend of my Facebook friend.

We who value liberty have a long way to go in explaining our case to the American masses. We have to assume our fellow citizens know nothing. Or, worse, we have to assume that most of what they know is affirmatively false. And we have to do it nicely.

I use Facebook as a tool to sharpen my skills in this effort.


The Passing Paradigm


The latest much-ado-about-nothing crisis passed, with a result that should seem familiar. In 2008, Americans were told that if the TARP bill (a $700 billion taxpayer-funded welfare handout to large banking institutions) wasn’t passed, the stock market would crash and massive unemployment would follow. After an unsuccessful first attempt to pass the bill amid angry opposition from constituents, the bill passed on a second vote. Subsequently, there was a stock market crash followed by massive unemployment.

This time, our political-media cabal told us that if Congress was unable to pass a bill to raise the debt ceiling, the government would not be able to meet its short-term obligations, including rolling over short-term bonds with new debt. US debt would be downgraded from its AAA status, and a default would be imminent. After the melodrama, Congress passed the bill raising the debt ceiling. Standard & Poor’s subsequently downgraded US Treasury debt anyway, and deep down everyone knows that a default is coming as well, one way or another.

We are seeing the end of a paradigm. Thomas Kuhn argued in The Structure of Scientific Revolutions (1962) that anomalies eventually lead to revolutions in scientific paradigms. His argument holds equally true for political paradigms.

A paradigm is a framework on which a society bases its beliefs. For example, people at one time believed that the forces of nature were the work of a pantheon of gods. Sunlight came from one god, rain from another. The earth was a god, as was the moon. With nothing to disprove the premises of the paradigm, it persisted. People went on believing that sunlight and rain were the work of sunlight and rain gods because there was no compelling reason for them to believe otherwise.

However, within any paradigm there are anomalies. Anomalies are contradictions — phenomena that cannot be explained within the framework of the paradigm. People have a startling capacity to ignore or rationalize away these anomalies. While it may defy logic to continue to believe that rain comes from a rain god even after evaporation and condensation have been discovered and proven, people would rather ignore the anomalies and cling to the paradigm than face the fact that the paradigm is false.


But once there are too many anomalies, the paradigm fails, and a new one must take its place. This new paradigm renders the old one absurd, even crazy. At some point in the future, people will look back on the political paradigm of the 20th and early 21st centuries. There is at least one thing that will be quite obvious to them: centralized government is insane.

Consider the premises upon which this present paradigm relies: all facets of society must be planned and managed by experts. The judgment of the experts trumps the rights or choices of any individual. The choices made by the experts will result in a more orderly society and greater happiness for the individuals who compose it. There will be better results from one small group of experts controlling everyone than multiple groups of experts controlling smaller subgroups of society.

Of course, libertarians reject every one of these assumptions on its face. A free society does not tolerate “planning” or “management” by anyone. All choices are left to the individual, as any attempt to plan or manage his affairs amounts to either violation of his liberty, looting of his property, or both. However, let’s assume that the first three assumptions of the present paradigm are valid and merely examine the last. Even that does not hold up to scrutiny.

Suppose an entrepreneur starts a business. At first, his market is local. He opens retail outlets that are overseen by store managers. The entrepreneur is the CEO of the company and manages the store managers. Even at this point, the CEO must trust day-to-day decisions to his managers. He has no time to make everyday decisions as he tries to expand his business. The managers do this for him and he concentrates on strategic goals.

His business is successful and soon he begins opening outlets outside of the original market. He now has a need for regional managers to manage the store managers. He manages the regional managers and leaves the details of how they operate within their regions to them.

The business continues to expand. With retail outlets in every state, there are now too many regions for the CEO to manage directly. The CEO appoints executive directors to manage larger regions, each composed of several smaller ones. There is an executive director for the West Coast, another for the Midwest, and another for the East Coast. Of course, the CEO has the assistance of his corporate vice presidents who manage sales, operations, human resources, and other company-wide functions from the corporate office.

Now, suppose that one day the CEO decides to fire the executive directors, the regional managers, and the store managers. He will now have the salespeople, stock clerks, and cashiers for thousands of retail outlets report directly to him and his corporate vice presidents. Would anyone view this decision as anything but insane?

As silly as this proposition sounds, this is a perfect analogy for how we have chosen to organize society for the past century. The paradigm rests on the assumption that every social problem can be better solved if the CEO and his corporate staff manage the cashiers and the salespeople directly. As in all failed paradigms, anomalies are piling up that refute its basic assumptions.

This paradigm assumes that centralized government can provide a comfortable retirement with medical benefits for average Americans, yet Social Security and Medicare are bankrupt. It assumes that a central bank can ensure full employment and a stable currency, yet the value of the dollar is plummeting and unemployment approaches record highs (especially when the same measuring stick is used as when the old records were set). It assumes that the national government’s military establishment can police the world, yet the most powerful military in history cannot even defeat guerrilla fighters in third-world nations. It assumes that the central government can win a war on drugs, yet drug use is higher than at any time in history. It assumes that experts in Washington can regulate commerce, medicine, and industry, yet we get Bernie Madoff, drug recalls, and massive oil spills.

Hundreds of years ago, the prevailing medical science paradigm assumed that illnesses were caused by “bad humors” in the blood. Operating with that assumption, doctors practiced the now-discredited procedure known as “bleeding.” They would cut open a patient’s vein in an attempt to bleed out the bad humors. As we now know, this treatment often killed the patient. Most rational people today view the practice of bleeding as nothing short of lunacy.

Ironically, this is a perfect analogy for the paradigm of centralized government. The very act of a small group of experts attempting to manage all of society drains its lifeblood. It is the uncoerced decisions of millions of individuals that create all the blessings of civilized society. It is the attempt by a small group of people to override those decisions that is killing society before our very eyes. Someday, people will look back on our foolishness and laugh as we do now at the misguided physicians who bled their patients to death. The present paradigm is dying. The revolution has begun.


The End Is Nigh


In an article in the December 2010 issue of Liberty, I alerted readers to the fact that a leading network of Christian radio stations was predicting that the end of the world was absolutely, positively going to happen in 2011. According to Family Radio, which broadcasts in many countries, and which probably has a station near you, Judgment Day will begin on May 21 with the Rapture of the true believers and will conclude on October 21 with the total destruction of the physical universe. In the process, almost all the inhabitants of the earth will perish.

This is the message of Family Radio’s “Bible teacher,” a retired businessman named Harold Camping. His interpretations of Scripture are explained — as well as I, or probably anyone else, could explain them — in the article just mentioned, “An Experiment in Apocalypse.” You can download it here. For a less critical perspective, see Family Radio’s own website, which offers a list of stations where you can hear the apocalyptic message for yourself.

For students of human nature, especially American human nature, this particular religious prediction is a matter of great importance, interest, and (let’s face it) fun. It is, for them, what a transit of Venus is to astronomers, what the sighting of an ivory-billed woodpecker is to ornithologists, what an eruption on the scale of Mt. Saint Helens is to volcanologists. It’s the kind of thing that happens much less than once in a generation.

Of course, experts, religious and secular, make predictions all the time, and other people believe them. Generals predict that if they are given appropriate resources, they will be able to accomplish their mission. Scientists predict that if their policy advice goes unheeded, the environment will be subject to further degradation. Politicians predict that if you vote for them, they will initiate a new age of prosperity for the American people, and if you don’t, you will be visited by spiraling unemployment and a continuing decline in the American standard of living. Preachers say they are confident that the signs of the times point to an early return of our Lord Jesus Christ. Economists assure us that if trends continue, we can expect to see whatever they themselves have been trained to expect.


All these modes of prophecy are potent. They have effects. They get people’s attention. They lead some people to do things that otherwise they would not do — vote, go to war, buy a house, pledge more money to the church. But they are all escapable and forgettable. They never include a date on which something definite will happen. What you see, when you look at the words and figure out what they really mean, is just a suggestion that something that is capable of almost any definition (“prosperity,” “job creation,” “depression,” “desertification,” “a world in which our children will have more (or less) opportunity than we do,” “the fulfillment of God’s plan”) will manifest itself at some time that is really not a time: “very soon,” “in our generation,” “by the end of the century,” “earlier than predicted,” “much earlier than anyone would have thought,” “with a speed that is startling even the experts,” “at the end of the day,” “eventually.”

Of course, the less definite a prediction is, the less meaning it has; but the more definite it is, the less likely it is to come true. Real economists and real theologians can tell you why. A real economist will show you that human events result from individual human choices, their motives unresolvable into quantifiable data, their possible sequences multiplying from the original motives in fantastic variety. Real theologians will tell you, in the words of the old hymn, that the Deity is not a writer of op-ed forecasts: “God moves in a mysterious way, his wonders to perform; / He plants his footsteps in the sea, and rides upon the storm.”

Nevertheless, it is impossible that someone’s announcement that “California is more vulnerable than ever to a catastrophic earthquake,” or that “this administration will meet the problem of the deficit, and solve it” could ever be completely disconfirmed. If the big one doesn’t destroy San Francisco during your lifetime, as you thought had been predicted, don’t use your dying breath to complain. You’ll just be told that California is even more vulnerable “today” than it was “before,” because more years have passed since the last prediction. If the politician you helped elect disappoints you by not having “solved the problem,” whatever the problem is, you’ll be told that “our plan for the new America ensures that appropriate solutions will be implemented, as soon as Congress enacts them into law.”

How can you disconfirm the ineffably smarmy language of the exhibits in the California Academy of Sciences? Once an intellectually respectable museum, it now adorns its walls with oracles like this: “If we don’t change our actions, we could condemn half of all species of life on earth to extinction in a hundred years. That adds up to almost a million types of plants and animals that could disappear.” Should you decide to argue, you can’t say much more than, “If you don’t stop purveying nonsense like that, your museum has seen the last of me, and my $29.95 admission fees, too.”

Thirty years ago there was a considerable emphasis among mainstream evangelical Christians on the prospect of Christ’s imminent return. There was a popular religious song, urging people to keep their “eyes upon the eastern skies.” Less mainstream religionists said that all indications point to the probability that the present system of things will end in 1975. Meanwhile, a very popular book, Famine 1975! America’s Decision: Who Will Survive? (1967), predicted that the world would soon run out of food; and scientists worried the world with predictions that “global cooling” would soon be upon us.


Among these predictions were a few that, wonder of wonders, actually could be disconfirmed, and were. Though no one said that the Egyptians (often thought to be especially “vulnerable”) would begin starving to death precisely on May 21, 1975, some people came close enough to saying that; and events showed they were wrong. Yet there are escape routes available for all predictors, even those proven to be wrong. Two escape routes, really: memory and interest.

Jesus knew about this. He told his followers that no man knows the day or the hour of his Return, but that people would always be running around predicting it (Mark 13:32, Luke 21:8–9). The failed predictions, which seemed so snazzy before they failed, wouldn’t really be perceived as failures, because the failures wouldn’t be remembered, or considered interesting enough to be remembered. Watch out for predictions, he said.

There are two things going on here. One is that people die, and their enthusiasms die with them — often to be revived by the next generation, before being forgotten again. Quack medical treatments, as someone has pointed out, have a generational life, and so do quack economic and religious treatments. The Bates Eye Method, a way of using exercise to improve your eyesight, doesn’t work, and when people find that it doesn’t, they abandon it. They eventually die, and another group of people “discovers” the great idea, wants to believe it, and makes a big deal out of it, temporarily. The phony (and I mean “phony” not in the sense of “mistaken” but in the sense of “created to make money and impress people”) religious ideas of the I Am Movement have had a similar life cycle among New Age types. And when it comes to economics, where would we be without such recurrent notions as the idea that unions are responsible for “the prosperity of the middle class,” the idea that the minimum wage lifts people out of poverty, and the idea that general wellbeing can be created by forcing the price of commodities up by means of tariff regulations?

The gullible people who endorse such ideas often die with their convictions intact, although they may not succeed in passing them along to others, at least right away. In April we witnessed the death of the oldest man in the world, a gentleman named Walter Breuning. Before his death at the age of 114, Mr. Breuning gave the nation the benefit of his more than a century of experience and learning — his belief that America’s greatest achievement was . . . Social Security! Yes, if you retire at 66, as Mr. Breuning did, and collect benefits for the next 48 years, I suppose you might say that. But it’s an idea that’s likely to be ignored by people who are 30 years old and actually understand what Social Security is.

And that’s the additional factor: lack of interest. Failed ideas, and failed predictions, aren’t always forgotten — many of them have a second, third, or fourth advent. But they may be ignored. They may not be interesting, even as objects of ridicule. I suspect that most young people would say that Social Security is “good,” but it’s not as important to them as it was to Mr. Breuning. The same can be said of mainstream Christians, who agree that Christ will return, but pay little or no attention to any current predictions.

Right or wrong, as soon as an idea reveals even a vulnerability to disconfirmation, it often starts to dwindle in the public’s mind. Global cooling is a perfect example. Once, cooling was fairly big in the nation’s consciousness; then it didn’t seem to be happening, right now anyway; then it began to seem unimportant; then it disappeared, without anyone really noticing its absence.

This is what tends to happen with political and economic predictions. The smart people, and the political fanatics (such as me), go back to what Roosevelt said or Kennedy said or Obama said, and notice how wildly their promises and predictions varied from the accomplished facts; but the people in the street go their way, unshocked and unaffected. They may not have expected specific accuracy from their leaders’ forecasts, but if they did, they forgot about it. Initially, they were foolish enough to be inspired, or frightened, but they were canny enough to realize that other forecasts — equally inspiring, frightening, and vulnerable to failure — would succeed the present ones.


It’s like Jurassic Park, where the dinosaurs seem certain to devour the heroes, and almost manage to do so — about 1100 times. After the first few hundred near-death experiences, you realize that the only logical reason this won’t go on forever is that the theater has to be cleared for another showing of Jurassic Park. Expectation diminishes, long before the show is over — although you may be willing to see it again, in a few years, once the specific memory wears off. That’s the way the language of prediction often works, or fails to work.

As long as the idea of socialism has existed, its priests have predicted the downfall of the capitalist system. When each seemingly fatal contingency proved not to be fatal, another contingency was identified; when that failed to produce the climax, a third came into view . . . and so on. Some people were willing to return for Downfall of Capitalism 2, 3, and 4; but others sensed that the plot had no logical ending, after all, and sought something different.

So new performances began in the Theater of Prognostication. Followers of Malthus demonstrated that civilization would perish through “over-breeding.” Eugenicists showed that it would end by the over-breeding of the “unfit.” For many generations, journalists computed the size of “the world’s proven fuel resources” and demonstrated that unless alternative sources of energy were found, the engines of the world would stop. In 1918, the world was assured that it was about to be made safe for “democracy.” Then it was assured that it was on the brink of unimaginable prosperity, to be produced by “technocracy.” After that, it learned it was about to be completely destroyed by a new world war. When the war came, but did not succeed in destroying the world, optimists prophesied an imminent “era of the common man,” while pessimists prophesied annihilation by the atom bomb. For generations, the “doomsday clock” of The Bulletin of the Atomic Scientists stood at a few minutes till midnight. It still does — because now it registers not only the purported danger of atomic war, but also the purported likelihood of destruction by “climate change.” In other words, another movie has hit the theater.

The subject changes; the language does not. It’s always apocalypse on the installment plan. You buy it in little doses. First, “evidence seems to show”; then, “all the evidence shows”; after that, “experts are all agreed.” The only thing lacking is clear language about exactly how and exactly when the event will happen.

In the early 20th century, many millions of words were spilled about (A) the world’s inevitable political and social progress; (B) the world’s inevitable destruction in a great war. But when the first great war began, there was nothing of inevitability about it. If Russia, France, Germany, and Austria-Hungary had decided, as they might easily have decided, not to bluff one another, or to call one another’s bluffs, about the insignificant matter of Serbian nationalism, there would have been no World War I. In the 1930s, world war was regarded as inevitable by people terrorized by new types of weapons and by the traditional bogeys of “monopoly capitalism” and “western imperialism.” When war came, it wasn’t ignited by any of those things, but by the least predictable of world historical factors: the paranoid nationalism of the Shinto empire, and the weird appeal of Nazism, embodied in the unlikely figure of Adolf Hitler.


There’s nothing much to the predictive power of human intelligence. But if you refuse to be gulled by apocalyptic lingo, what will happen to you? Here’s what. You’ll be told that you are “in denial.” Even more repulsively, you will be told that “denial is not just a river in Egypt.” (Isn’t that clever?) You will be accused of not believing in Science, not respecting the Environment, not caring about History, and so forth. You will be accused of all the characteristics exemplified by the prophets of doom themselves: ignorance, arrogance, and not-listening-to-others.

But let’s see how Harold Camping, Family Radio’s leader and prophet, compares with the other leaders and prophets we’ve considered. In one way he is exactly similar — his use of what Freud called “projection.” In every broadcast Camping warns his audience against arrogance, unloving attitudes toward other people, impulsive and subjective interpretations of the Bible, and submission to the mere authority of so-called Bible teachers. And in every broadcast he denounces members of ordinary churches for failing to heed his prophecies; rejoices in the pain they will suffer on May 21, when they realize that he was right and they were wrong; and suggests that anyone who disagrees with him is denying the authority of the Bible itself. On April 28, his radio Bible talk concerned the dear people in the churches, whom he reluctantly compared with the 450 priests of Baal who were slaughtered by Elijah because they trusted in their own ideas and wouldn’t listen to the true prophet — a prophet who, like Camping, was favored by God because he was absolutely certain that what he said was true.

But — and this is the most important thing — Camping has a dignity and intellectual integrity denied to most other predictors, including those most esteemed in our society. He doesn’t speak in generalities. He predicts that Judgment Day will come, without any doubt or question or problem of definition, on a particular day: May 21, 2011. In all likelihood it will begin with a great earthquake, which will devastate New Zealand and the Fiji Islands, at sundown, local time. After that, the wave of destruction will circle the globe, with the setting sun. By May 22, the Rapture will have been concluded; “it will all be over!”, and everyone will know that it is; the whole thing is “completely locked in.” In making his prophecies, Camping is actually risking something. He is actually saying something, not just uttering fortune cookie oracles.

I use that phrase advisedly. The other night, Mehmet Karayel and I dined at the Mandarin, and as always we were interested in seeing what our fortune cookies had to say. Mehmet’s fortune was a typically New Age, ostensibly precise manipulation of words. It said, “You may attend a party where strange customs prevail.” Yeah, right. Or he may not attend such a party, or the customs may be “strange” only to ignorant people, or the customs may be present but not prevail, etc.

Mehmet is a real intellectual, not a person who plays one on TV, so he was not taken in by this exciting forecast. Then came the unveiling of my fortune. It was, believe it or not, “Your future looks bright.” Can you imagine a feebler thing to bake into a cookie? But Mehmet is aware of Mr. Camping’s prophecies, so he knew how to strengthen the message: “It should say, ‘Your May 22 looks bright.’”

So Mehmet and Harold Camping, though disagreeing firmly on specifics, stand together in demanding that they be produced. Mehmet is certain that my May 22 can be bright; Camping is certain that it can’t. But they both know that only one of them can be right about a proposition like this, and that we’ll soon find out which one it is. How refreshing.

I must admit that not everybody in Camping’s outfit is up to his high intellectual standard. On April 26 I received a plea for contributions to Family Radio (which, by the way, has a good deal of wealth and doesn’t really need many contributions — but why not ask?). The plea came in two parts. One was a brief letter from Camping, requesting my “continued involvement” in FR’s ministry, because “Time is running out! The end is so very near, with May 21, 2011, rapidly approaching.” The second was a pledge card, where I could check the amount of money I planned to give “each month.” As I said in my December article, there is evidence that some people at FR are biding their time, trying to keep the organization together so it can continue — under their leadership — after the failure of May 21. My speculation is that the pledge card is their product, and they don’t mind contradicting Camping’s message in the letter. They may even be alerting “supporters” like me to keep the faith: my May 22 does indeed look bright.

But that’s a digression. Harold Camping is not a politician or a professor of environmentalism, whose prophecies can never be proven wrong because they’re ridiculously non-specific. No, he has said exactly what he means by the end of the world, and he has said exactly when the end of the world will happen. You can check it. I hope you do. Go to Family Radio’s website, find out where its nearest radio station is, and tune in during the evening of May 20 (US time), when, Camping believes, Judgment Day will begin in the Fiji Islands. Then listen through the next few days, as Family Radio responds to the disconfirmation of its prophecies. Or does not respond — until it figures out how to do so (and that should be interesting also).

As I’ve said before, this is a once-in-a-lifetime opportunity. It will be much more interesting than listening to the constant din of the secular prophets — politicians, historians, economists, and environmentalists — whose intellectual stature, compared to that of Harold Camping, addlepated prophet of the End Time, is as nothing, a drop in a bucket, and the small dust that is wiped from the balance.



Where Do We Stand Today?


Major events concentrate the mind on major issues.

At this moment, we are all trying to analyze the results of the great American election of 2010. We are also celebrating the beginning of Liberty magazine’s online edition — proof of the continuity of libertarian ideas across all movements and events of history.

Much has happened in American politics since Liberty first went to press, back in the summer of 1987. This is a good time to ask how well liberty itself has fared during the past quarter century.

It’s sad to realize that the history of these years can most readily be divided into periods, not by great new inventions or movements, but by presidential personalities — the age of Reagan, the age of Bush the First, the age of Clinton, and so on. Let’s start by looking at the major features of the world in which Liberty was born, the age of Reagan (second administration).

The most prominent political feature of that world has passed away — the threat of nuclear annihilation of the West by the empire of communism. That threat had overshadowed a generation of Americans, sometimes manifesting itself as a chronic anxiety, sometimes rising to a pitch of hysteria, but always costing mightily in emotion and in wealth. For anyone who came to conscious life after, say, 1991, the effects of a threat like this are probably impossible to understand. I hope they remain that way. Nevertheless, the danger went away. The grand threat of communism was replaced by the nasty threats of Muslim fanaticism and creeping nuclear proliferation; but while these are worth worrying about, they are not quite comparable.


Why did the communist empire fall? An event of this kind has many causes, and you are free to emphasize one or another of them, depending on your politics. One was probably President Reagan’s determination to out-spend and out-invent the communist military machine. Another was Reagan’s use of essentially libertarian arguments about the benefits of individual freedom to inspire the West with a new determination to resist the propaganda of defeatism. (A well-advertised determination to resist is itself a powerful counter-threat, and in this case it seems to have had a major effect on collectivist morale, everywhere.) Perhaps the most powerful cause of the collapse of communism was the burden of its own inefficiency, a flaw that libertarian thinkers had never ceased to emphasize, even as their arguments were laughed to scorn by “progressive” thinkers in the West, and even as conservative American leaders worried that the communists were about to “catch up” with us. They didn’t; they fell on the track, victims of the astonishing skill and inventiveness of individual enterprise.

Those of us who were politically conscious (or in my case, semi-conscious) in 1989–1991 recognized the communist collapse as a tremendous victory for libertarian ideas. If this be “triumphalism,” make the most of it; the echoes of our triumph are still heard, most recently in America’s general revulsion against the idea of a “single-payer” (that is, communized) national insurance scheme. Contrast the favorable reception of a single-payer retirement system — Social Security — two generations before.

Though not all libertarians would agree, real progress was also made by Reagan’s forthright defense of what is now called American “exceptionalism.” There is indeed something exceptional about America, and Reagan didn’t say that the exceptional thing was religion, ethnic diversity, immigration, community spirit, or anything else that is considered politically correct on either the Left or the Right. He said it was freedom, free enterprise — and he was correct.

Yet by summer 1987 it had become obvious that Reagan’s own legacy was more conservative than libertarian. He simplified the tax brackets, which had been designed to extract the maximum possible out of every nickel added to your income, and in so doing he reduced the tax rates; but this, as anticipated, actually raised total government income. Then his administration proceeded to spend much more than its income. That was not a libertarian thing to do.

One thing that limits state power in America is the individual states, which in the federal system are supposed to check and balance the overweening might of Washington. Reagan believed in federalism — but only when it fitted his own purposes. He was responsible for nationalizing the drinking age at an absurd 21, by using federal highway funds as a bludgeon against states that, very rationally, didn’t want to go along. And while he was a mighty foe of regulation, he was also a friend of the ridiculous war on unregulated recreational drugs. You can say the same thing about every other president, except the current one; but we might expect more from a conservative president who once told Reason magazine that “the very heart and soul of conservatism” was libertarianism.

In the case of both Presidents Bush, “conservative” should be placed in quotes. Ideological labels don’t stick very well to sheer incompetence.

Reagan was also to blame for some serious sins of omission. He intended to abolish the Department of Education, but he paid little attention to the person he appointed as secretary of that department; and when the appointee turned out to be a public foe of abolition, Reagan let the project drop. The result: three decades of enormous and destructive educational spending and meddling by the national government. In addition, Reagan, a man of deep personal loyalty (a good thing), permitted his vice president, George Bush, to be anointed as his successor (a very bad thing).

I don’t have to connect all the dots that outline the political profile of George H.W. Bush, although each of them contributed to the success of Liberty in its attempt to distinguish grassroots libertarians from conservatives in power. But probably, in the case of both Presidents Bush, “conservative” should be placed in quotes. Ideological labels don’t stick very well to sheer incompetence.

The first President Bush nominated Clarence Thomas, a firm and deep libertarian, to the Supreme Court, and stuck by him when he refused to yield to the most violent opposition that any Court nominee has ever endured. That was a good thing — indeed, a great thing — but it didn’t respond to any interest in judicial philosophy on the part of the “conservative” president. It responded, again, to a sense of personal loyalty, which is not to be blamed but can hardly be depended upon as a means of advancing liberty. Bush’s other appointee was David Souter, who was one of the most anti-libertarian, and certainly one of the stupidest and least qualified, people ever to roost on the Supreme Court bench. A political crony vouched for Souter, so Bush nominated him, as Eisenhower had nominated the outrageous judicial activist William Brennan.

Someone, someday, will write a book called “The Mystery of George H.W. Bush.” It will attempt to answer the question, “How can a hero of World War II, and subsequently an observer of every seamy transaction in the wars of American politics, emerge as such a sap?” Bush won office by promising that he would veto any attempted tax increase: “Read my lips: no new taxes!” He then agreed to raise taxes, on the promise of his political opponents to lower government expenditures, something that they had no intention of doing. It’s hard to think of any other president who would have been foolish enough to make such an agreement, and it very appropriately cost Bush his presidency.

Bush showed great ability at persuading foreigners to unite with the United States in asserting Kuwait’s independence after the oil-rich emirate had been forcibly annexed by Iraq. He also showed great fortitude. I well recall watching news coverage of the buildup to the first Gulf War. I was in the company of other libertarians, all very bright people. Their reaction, as they saw the troops walking onto the ships: “Poor kids! They’ll never come back alive.” And that was a possibility. Yet Bush took the risk and fought a successful war in the Gulf, a war that actually came to a conclusion.

He also fought a successful, though much less honorable, war against Manuel Noriega, dictator of Panama — allegedly for his involvement with drug trafficking, actually for his general antagonism to the United States. Bush had Noriega snatched from Panama and tried in the United States, where he was convicted of crimes against the laws of a country that was not his own. You don’t have to be sympathetic to Noriega to sense that President Bush wasn’t a deep thinker about international law. Nor do you need extraordinary intelligence to perceive, in the wars of the first Bush administration, the seeds of wars in the second.

If ideas count — and they do — a modern liberal Democrat president had admitted that libertarians, the foes of every idea he endorsed, had won the argument.

It is hard to find a libertarian feature of the first Bush regime, and harder still to find one in the early regime of Bill Clinton. The only good thing about it was the failure of Hillary Clinton’s elephantine attempt to socialize the nation’s healthcare system. The attempt failed, not principally because of the Republican Party’s opposition, but because of the people’s response to a well-calculated television ad campaign, supported by some intelligent and interested people on the Right. To the political pros, the campaign seemed to have “come out of nowhere,” yet it re-energized the forces of limited government. A year later, these forces, joined by new ones within the Republican Party, ousted the president’s party from its leadership of the House of Representatives for the first time in 40 years. Clinton’s response was to proclaim that “the era of big government is over.”

Of course, that was a lie. As several people commented at the time, all he meant was that the era of lots and lots of little government was continuing. Nevertheless, if ideas count — and they do — a modern liberal Democrat president had admitted that libertarians, the foes of every idea he endorsed, had won the argument.

Unfortunately, his remarks were a warning, for those who would listen, that future government aggressions would be finessed, not announced. In future, modern liberals like Al Gore would pretend that they had a “lock box” in which to put Social Security taxes, and that the box would never be raided for the use of any other government project — a mythological concession to the people’s desire to limit the state’s depredations. And in future, modern socialists like Barack Obama would claim that even their most Ozymandian schemes, such as Obama’s healthcare “reform,” would “pay for themselves” or even “reduce government expenditures.”

La Rochefoucauld said that hypocrisy is the tribute that vice pays to virtue. If so, we can do without any more tributes. The truth of libertarian ideas is admitted, in principle. Still, it’s the false ideas that get put into practice.

Clinton commissioned countless military adventures abroad, in Eastern Europe and in the Near East; they seldom amounted to much, although they asserted the kind of bellicosity that his own party now wants to run away from. But Clinton did two good things for the cause of limited government: he made an alliance with the Republicans for a sweeping, and successful, reform of welfare; and by his scandals he so diminished the prestige of the presidency as to make people significantly less likely to believe and trust elected officials. Bad news for government is usually good news for liberty.

What shall we say about the second President Bush? Unlike Bill Clinton, he wouldn't be a bad person to have as a neighbor — unless, of course, he decided that you might be secreting a weapon of mass destruction. R.W. Bradford, the founder of Liberty, once published an article in which he tried to account for Bush's invasion of Iraq. After a close review of the evidence, he concluded that Bush really believed his own account of the dangers that Saddam Hussein posed to the world. I found Bradford's reasoning persuasive. Bush was not an evil man; he was a gullible man, and he was usually gullible in the ways in which modern liberals are gullible. Until recently, they too believed in solving problems, real or perceived, by projecting military power abroad; indeed, more leading Democrats initially supported the second Gulf War than had supported the first one.

If Bush had happened to be a member of the Democratic Party (which, except for his family identification, he could easily have been, given his general political ideas), no one on that side would have quibbled about his vast government expenditures and vast government indebtedness, or his blithe disregard for any limitations on the power of the federal government. It was Bush who engineered one of the greatest federal takeovers in history, Washington’s massive intervention in local education, under the title of "No Child Left Behind."

In 2008, George Bush the modern liberal was succeeded by Barack Obama, another modern liberal — but a much more self-conscious and socialistic one. People in the eighteenth century used to analyze character by reference to a “ruling passion,” the thing to which a person is willing to sacrifice everything else. President Obama’s ruling passion is intellectual arrogance, the kind of arrogance that finds its equal, among American presidents, only in the disastrous mentality of President Wilson. Wilson never understood why he was deserted by the people over the issue of the League of Nations; after all, his ideas were correct. For Obama, as for Wilson, “correct” means “progressive,” and “progressive” means “maintaining unquestioned faith in the uninformed notions of the leader.”

No hypocrisy here: Obama believes sincerely in the ideas he enunciates. He believes implicitly in Keynesianism, minus Keynes’s qualifications of his theories; in the most naïve form of dirigisme, minus the glitter of Louis Napoleon and Baron Haussmann; in the managerial economics of Henry Wallace, minus Wallace’s wonderful goofiness (though Obama followed the Reverend Mr. Wright as Wallace followed his “guru”). In short, Obama is not an intellectual, no, not by a mile. He simply makes the mistake of believing that he somehow arrived at his naïve ideas through a long process of thought and experience, and that his inspiring “philosophy” is his ticket to success.

For Obama, as for Woodrow Wilson, “correct” means “progressive,” and “progressive” means “maintaining unquestioned faith in the uninformed notions of the leader.”

Clearly, it’s not. If “success” means “being elected,” right now he couldn’t be nominated as an alderman in Chicago. If it means “moral superiority,” then why are you laughing? Obama’s administration demonstrates the truth of an important libertarian idea, developed by Friedrich Hayek in the chapter of “The Road to Serfdom” that he called “Why the Worst Get on Top.”

I’ll summarize the argument in this way: socialism attracts people for many reasons. One is a desire for unearned material rewards. Another is a lust for the power that socialized economies automatically convey to an elite. But yet another is the humanitarian idealism that is felt by some of the world’s morally “best” people. They enlist in the socialist cause because they think it will make a better world. These are the “hopey-changey” people. But when a socialist regime comes into power, it inevitably demonstrates, as Obama’s regime has demonstrated, that its promises cannot be fulfilled, especially in the terms originally proposed. At this point, the best of the hopey-changey people hop off the train; the worst stay on, making their way toward the front by means of lies and intimidation.

Consider the leading personnel of the Obama regime — the Nancy Pelosis and Harry Reids, the David Axelrods and Rahm Emanuels — and you will see the principle in action. Because our tradition of limited government has been preserved in many important respects, the “worst” in America are not allowed to be as bad as the “worst” could get in, say, Stalinist Bulgaria; but they are as bad as bad can be, in American terms. They are living demonstrations of the intellectual and moral vacancy of socialism, American style, and so is their boss, Obama.

So where do we stand today?

We stand at the end of a quarter century of confirmations of libertarian ideas. We stand in the midst of an enormous popular rebellion against the state, a rebellion conducted almost entirely in libertarian terms. The Tea Party movement is not the only example. In every state, in virtually every county, ordinary intelligent Americans are calling for retrenchments of government. Sometimes their protests are united with demands that run contrary to libertarian ideas, demands motivated by conservative religious dogmas or opposition to international trade or the “outsourcing” of industry. But these notions are not the rudder on the ship. In most cases, they are scarcely heard.

If nothing else, the elections of 2010 showed that the American people are tremendously dissatisfied with the performance of Obama and his party, and on thoroughly libertarian grounds. The results of the election indicate a massive revolt against both the arrogance and the enabling ideas of the modern state.

The vehicle of this revolt has not been the Libertarian Party, which is no longer the most conspicuous political manifestation of the freedom movement. The main vehicle is now the venerable Republican Party. Despite the absence of a self-described libertarian president, despite the presence of time-serving apparatchiks as leaders of the congressional Republicans, the G.O.P. is as much infused with libertarian ideas as the Democratic Party is infused with socialist ones — and that’s saying something.

This remarkable development was made possible by three other developments, two of them quite unexpected.

One was the internet revolution, a supreme technological application of the principle of spontaneous order that libertarians have always advocated. The internet’s creation of a new kind of spontaneous order ended the hegemony of the government-authorized radio and television networks, which in 1987 allowed barely a hint of libertarianism to surface in the national discourse. Their dominance has been utterly destroyed. Now, anyone who has a good idea, an idea that works — and libertarian ideas do work — can reach out to other people and give the idea a potent political expression. Add to this the growth of cable TV, hungry for real ideas that will interest real people.

Another unexpected development was the growth in influence of libertarian journals, think tanks, and other voluntary organizations, making their way in the new channels of the internet and cable TV. The Cato Institute, the Mises Institute, Reason magazine, Liberty magazine, FreedomFest . . . these are only a few purveyors of libertarian ideas reaching out to a broad audience of Americans and giving them intellectual ammunition to continue the war against the coercive state. Gone are the days when the New York Times quoted someone from the Ford Foundation, and CBS quoted the New York Times, and “public opinion” resulted. Now libertarian ideas and libertarian research and the evidence of a successful libertarian society, as embodied in the internet itself, are as close as anyone’s keyboard, where they compete very successfully, thank you, with the ideas of the closed society.

What’s the third development? It’s simply the persistence and continual confirmation of essential libertarian ideas. The arguments of Locke and Madison, Friedman and Mises, Paterson and Hayek, haven’t changed during the past 23 years; but they have been ratified by more, and more conclusive, events and understood by more, and more informed, people. This is not the moment for regret or despair; this is the moment for confidence in the future, in our country, and in ourselves.
