How Much Ruin, Exactly?


“There is a great deal of ruin in a nation.” It’s a remark Adam Smith made to a young colleague, John Sinclair, who worried that the cost of quelling the American rebels might lead to the downfall of Great Britain. It’s also a remark Liberty’s founder, Bill Bradford, quoted back to me on several occasions, whenever I was doom-and-glooming about how some country or another was on the road to ruin.

It’s a remark that comes to mind often still, whenever I’m agitated about governmental stupidity or malfeasance. Foolish wars in the Middle East have not ruined the US, and neither have decades of profligacy, proliferations of acronymic agencies, or a succession of villains in our highest offices. Communism did not ruin Russia, and has not ruined China; even Nazism could not permanently ruin Germany, though it did succeed in splitting it for a while.

Nonetheless on occasion I read of some insane diktat in one or another corner of the globe and wonder just how far that corner’s leaders are prepared to stretch the maxim. Sovereign debt will likely not ruin Spain, or Portugal, or even Greece, though the EU seems intent on testing that out a while longer yet. Debt (again) and a shrinking population will probably not ruin Japan, but its prime minister Shinzo Abe, with his “Abenomics”—a reheated and desperate Keynesianism—is trying his hardest to make things worse. Unemployment and labor unrest will certainly not ruin France, but French president François Hollande, meanwhile, has yet to pass up a chance to kill off jobs and push companies abroad.

And then there’s Venezuela.

Venezuela, of course, was one of the great experiments: Hugo Chávez’s “Bolivarian Revolution” was supposed to prove the superiority of socialism (economic, of course; its moral superiority was assumed long ago), provided only that said socialism is backed up by seemingly inexhaustible national resources. Chávez, wasting no time after his election in 1998, set about “redistributing income” through land grabs and price fixes, threatening hesitant businesses with expropriation and then often following through on that threat. Under “Chavismo,” Venezuela assumed ownership of much of the nation’s construction, telecommunications, utilities, and food production industries, insisting at each step of the way that the takeovers were necessary to combat the predation of profiteering capitalists.

This sets up a feedback loop familiar to anyone who’s paid even the slightest attention to modern government, where every gain (however temporary) is attributed to the extraordinary wisdom and foresight of the government agents, while every loss (all too often permanent) is attributed to the greed of speculators and other enemies of the people. Naturally, the more that’s seized, the worse the economy gets, but on the other hand, the worse the economy gets, the more can be seized. It’s brilliant, really—at least until the shortages of basic goods become too great for anyone but an ideologue to ignore.

Say this in Chávez’s favor: his policies—and those of his successor, Nicolás Maduro—have encouraged innovation in the Venezuelan people; for instance, consider the smartphone app created to help them find toilet paper, in perpetually short supply thanks to price controls. But, as with the more traditional example of broken windows, this innovation isn’t going toward the sorts of things that would convince anyone of Chavismo’s superiority. And as other nations, especially the United States, Canada, and Brazil, have become more energy-independent, Venezuela and President Maduro are finding fewer buyers for their one undoubted asset, while the state-owned oil industry has become ever more wasteful and unprofitable.

With revenues plummeting and prices held artificially low, inflation has, inevitably, kicked in. And here’s where the “ruin” starts coming in: Maduro’s response (other than continuing to threaten or outright seize businesses) was to devalue the currency, and impose controls on currency exchange. As account holders desperately tried to get their money out of the country ahead of impending hyperinflation, Maduro doubled down by devaluing further, attempting to cut off foreign travel. Finally, he enacted a “Law on Fair Prices,” prohibiting profit margins of over 30%—which is to say, no profit, for anyone running an import business—while at the same time enacting long jail terms to punish “hoarders,” or, less insanely, anyone refusing to sell at a loss.

Now, I’m no expert on Venezuela. I’ve never been there, I don’t know anyone from there, and I can’t get more than the barest sense of any articles written in Latin American Spanish. But I can’t imagine any experience of the place that would convince me that those Venezuelans who protest Chavismo are just, in the words of professional useful idiot Oliver Stone, “sore losers”—though they certainly aren’t winners, either, not while they’re getting gunned down for demonstrating against the ongoing depredation and repression. And so long as the government is willing to arrest the opposition leader, or expel consular officials for so much as meeting with protesting students, things don’t seem likely to improve.

It’s impossible to know where it will all end, or whether it could be enough to ruin Venezuela. I suspect not: prior to Chávez, Venezuela was no more or less stable than any other Latin American nation since the time of Bolívar himself. Oddly enough, in this era of globalization, the idea of a nation may be more susceptible to ruin than individual nations themselves. Those that are nearest ruin are those that were highly unstable and unwanted to begin with: Somalia, Iraq, Yugoslavia—lines drawn on a map as a convenience to colonial invaders or international do-gooders (if you can tell those apart). Yet even those fictions hold up longer than one might expect—just look at Zimbabwe.

So yes, there is likely a great deal of ruin still in Venezuela. But it is a shame, and likely will be a tragedy, to see the depths its rulers are willing to plumb before they hit bottom.

The High and the Mighty


Two thousand thirteen was a hard year for this column. As soon as things seemed to be settling down, another threat or evil tendency always intruded itself. You know what happens when you finally get the floor washed and waxed: along comes your neighbor, or the guy who’s replacing the sink, or your friend who just happened to be driving past, and suddenly the place is filthy again. Word Watch is still trying to clean up the mess of 2013, and now 2014 is making its own kinds of mess.

The verbal polluters hail from the strangest places. In December, Word Watch was informed, through the majesty of CNN, that someone absurdly styling himself William, Duke of Cambridge, Earl of Strathearn, Baron Carrickfergus, Royal Knight Companion of the Most Noble Order of the Garter, Knight of the Most Ancient and Most Noble Order of the Thistle, and heir to the throne of the United Kingdom of Great Britain and Northern Ireland, not to mention the British Dominions beyond the Seas, had made himself the centerpiece of a video documentary — a work so repulsive that it has to be noticed, and warned against, despite the strangeness of its apparition.

The thing is called “Prince William’s Passion,” but don’t get the wrong idea. What he’s passionate about is animal conservation in Zimbabwe. Oh fine. But the offspring of his passion — a documentary of unbearable length, whatever time it literally consumes — can be viewed for only a few minutes before one’s sanity is endangered. I stopped watching as soon as I suspected I was going crazy, so I didn’t get to see it all. It seems improbable, however, that the epos includes any reference to the fact that Zimbabwe harbors not only animals but one of the most rapacious tyrannies on earth. Prince William’s passion is conservation of wild life, not of human life.

Well, we all have our passions. Job candidates are tortured to reveal what they are passionate about. The people one meets at parties disconcertingly confess their lifelong passion for rubber baby buggy bumpers. Dead people are praised for having consistently succumbed to — I mean followed — their passions (note to file: check the Hitler obits). Angry people call radio advice shows to complain that their spouses are frustrating their passions — and again, sex is not the problem. The passion always turns out to be something like writing children’s books or running a home for ferrets.

So the prince has passions; so what? What most alarmed this viewer was the documentary’s sad evidence of the deficient education that royal persons now receive. As the grand summary of his weltanschauung — or is it only his gestalt? — His Highness uttered these memorable words:

Conservation is so key.

For years we have observed the ugly progress of key from a normal, though uninteresting, modifier (“That was a key consideration”) to an ungainly predicate adjective (“That consideration is key”). So what’s wrong with that? Two things.

1. Key naturally evokes images of a physical object, an object that exists not for itself but as a means of opening or entering something else. The original setting of key was in sentences such as, “That consideration is the key to our success.” Thrusting key onto the stage alone is contrary to established and useful idiom and associations; it needs, at least, a noun immediately following it (“key consideration”).

2. Used in the new, naked way, key usurps the place of more useful and exact modifiers. There’s a big difference between an important consideration and a crucial consideration, a helpful consideration, and so on. Key obliterates the alternatives and ends the possibility of precision.

Maybe that’s why it has become a cliché — that is, an easy substitute for thought. Much worse, however, is the elevation of key to the status of a metaphysical quality that cannot be qualified but can only be intensified. How key is conservation, Your Highness? Is it really key? Sort of key? Very key? He can’t say. All he can say — with passion — is that it is so key. Dude.

Another dude is Christopher James (“Chris”) Christie, governor of New Jersey. Unlike Prince William, Christie is a dude but not a twit. He earned a lot of praise when, on January 9, he held a long, colloquial, and seemingly frank press conference to deny that he had anything to do with the artificial bottleneck that his aides created at the entrance to the George Washington Bridge, in order to wreak vengeance on political foes. What has been forgotten was Christie’s first response to the bridge news (Jan. 8). Here is the entirety of his statement:

What I've seen today for the first time is unacceptable. I am outraged and deeply saddened to learn that not only was I misled by a member of my staff, but this completely inappropriate and unsanctioned conduct was made without my knowledge. One thing is clear: this type of behavior is unacceptable and I will not tolerate it because the people of New Jersey deserve better. This behavior is not representative of me or my Administration in any way, and people will be held responsible for their actions.

Notice that second sentence. It declares that the governor is outraged that wrong conduct happened without his knowledge — literally meaning that he wouldn’t have been outraged, had he known about it. He came close to a similar blunder in the first sentence, which damns whatever it was that he saw “today for the first time” but leaves open the possibility that he wouldn’t regret any bad behavior he’d seen for the second or third time.

Almost everything about the statement is odd. When have you ever heard of conduct being made? And when, in normal life, have you heard an apology that says nothing whatever about what is being apologized for? Unfortunately, however, such abnormalities have become normal in our political life. Politicians and their highly specialized, highly paid, highly communicative aides are constantly losing control of basic English, and apologies are constantly being wrenched into things like pretzels — all twist and no center.

Following the practice of his friend, President Obama, Christie originally reacted to criticism by sneering at it. He spent a long time denying that the bridge episode had happened. He ridiculed the very idea. He found nothing exceptional or exceptionable in the long, gross imposition of force that someone perpetrated on the public by restricting rush-hour access to a bridge in order to conduct a “traffic study.” If Christie had any interest in what words mean, he would have said, “What the hell kind of study!” and brought the matter — whether it was a traffic study or an intentional persecution of innocent drivers — to an immediate end. Of course he didn’t. Then, like Obama on the IRS scandal, he became outraged. Aren’t you tired of that word? Aren’t you saddened by it?

One thing that everyone continues to be tired of and saddened by is the president’s folksy fakery. There are millions of examples, but here’s one from an interview he gave on Nov. 14:

I’m just gonna keep workin’ as hard as I can around [he emphasized that word] the priorities the American people have set for me.

If you want proof of how out of touch Obama is, try that remark. Nobody thinks that by dropping his g’s he’s bein’ anythin’ other than phony, yet he jus’ keeps on doin’ it. As for workin’, it didn’t take very long for people to find out that Obama doesn’t work very hard at anything but golf. After seein’ his popularity fall from very high to very low during the first few months of his presidency, he started playin’ these verbal tricks. Result? Nothin’. His popularity is now at an all-time low, and according to all available polling, a major cause is people’s growing conviction that he is a phony, pure and simple. But he jus’ keeps on pretendin’ that he’s nothin’ but a workin’ man, sweatin’ away on the job site jus’ like ever’body else.

Another thing that hasn’t changed is the president’s curious conviction that he lives in the 1970s. I doubt that there’s a political program he’s offered that wasn’t one of the American people’s priorities, as identified by Jimmy Carter — with the sole exception of amnesty for illegal aliens, whom ’70s Democrats generally perceived as inimical to the cause of high union wages. (They still are inimical to union wages, but the unions of today are down so low that their only hope is to assemble enough naïve voters to help them retain their political power and subsidies.)

The tipoff is that word around. Nobody but leftists, embedded in the ideas of the ’70s like rocks in a glacier, uses around in that (frankly) idiotic way. Asked what they’re doing with their lives, kids who have been coopted into what are now old-leftist pressure groups can be depended upon to say, “I’m working around issues of income inequality”; “We’re working around questions of peace”; “I’m interested in working around issues like, uh, climate change.” In the same way, Obama keeps working around priorities.

Just try to picture this working around. Imagine an issue, or a question, or, for God’s sake, a priority. Never mind whether you think that the American people set that priority. Just try to picture the thing itself. Now picture somebody working around it. What’s he doing? Is he fencing it off? Laying tile to keep the ground water out? Or is he evading it, as people do when they try to get around an obstacle?

However you picture it, around in ’70s speak has the same rhetorical function that it has in such sentences as, “I think there are around a hundred fallacies in the president’s argument.” Its only use is to impart vagueness. Yet in the stale old “radical” tradition from which Obama has not escaped, people assume that around imparts some kind of solemnity. It doesn’t, and the fact that they think it does is sad evidence of their inability to reflect on what they’re saying.

The president’s addiction to folksiness is closely linked to his passion for clichés. Almost anything he says is a cliché, but I was especially impressed by the phoniness of the double cliché he emitted when speaking on January 7 of people’s supposed entitlement to be paid despite the fact that they don’t have a job. He was speaking in favor of the dozenth extension of unemployment benefits since mid-2008. How could anyone be in favor of that? Because it helps people survive until they get back to work? But economic studies, with which Obama is presumably familiar, indicate that people tend strongly to get back to work when their unemployment payments are about to cease.

Obama stated his reasons, and they had more to do with metaphysics than they did with economics: “We’re all in this together,” he opined. “There but for the grace of God go I.”

May I suggest that it wasn’t the grace of God that kept Barack Obama from poverty? It was a banker grandmother, elite private schools, an indulgent Harvard Law School, adoption by a political machine, and fat contributions from wealthy people. In return for these favors, he now spends his days ladling out clichés like we’re all in this together. And he talks of God.

And speaking of God: the deity’s friends and purported friends — holy men and hirelings, true shepherds and false — have performed more service for the English language than anyone but that skeptic, Shakespeare. Consider the King James Version of the Bible. Consider the Book of Common Prayer. Consider the Anglican manner of the KJV and BCP, as echoed by Jefferson and Lincoln — or, if you want libertarians, Paterson and Rand. But that was then; this is now. A news item of January 4 reports that “the Archbishop of Canterbury Justin Welby, the leader of the world's 80 million Anglicans,” is supporting yet another revision of the Book of Common Prayer, which his cohorts have already revised within an inch of its life. This revision eliminates all mention of sin from the baptismal service, thereby eliminating a good deal of its seriousness and almost all of its purpose. If you’re not a sinner, why do you need to be baptized? Why do you need a church, and a baptismal rite to let you into it?

Well, not to panic. Welby is far from the real leader of the world’s fourth-largest group of Christians. Outside of Britain, which is the only place affected by this latest theological-linguistic plague, he is generally regarded as a fifth wheel. And the anti-intellectual, or at least the anti-theological, tendency of the current mania for revision is sharply opposed by other religious potentates. According to AFP and the Mail:

Former Bishop of Rochester Michael Nazir-Ali said the move, which is being trialed until Easter in around 1,000 parishes, was part of a "constant dumbing down of Christian teaching".

"Instead of explaining what baptism means and what the various parts of the service signify, its solution is to do away with key elements of the service altogether."

Amen. But look at what Britons call this process of dumbing down. The “move,” they say, “is being trialed.” Lord save us — this locution may invade America. Beware the first symptoms:

“Are Jim and Susan living together?” “Yes, they’re trialing their relationship.”

“The administration continues trialing its newest version of what happened at Benghazi.”

“I was once a conservative, but I was only trialing.”

“I trialed writing English, but it was just too tough for me.”

Obama Reveals Sudden Emergence of Racism


To someone from the New Yorker, President Obama has now repeated what his allies have said many times before: his popularity suffers because of his race: “There’s no doubt that there’s some folks who just really dislike me because they don’t like the idea of a black president.”

The president’s sentiment is even more pathetic than his grammar and diction (“there’s some folks”), and it reflects as poorly as anything could reflect on his analytical power and knowledge of history — even, in this case, his own political history.

According to the Rasmussen poll (to cite just one of many concurring polls), on inauguration day, 2009, 67% of Americans approved of the president whom they had recently elected, and 32% disapproved. Only 16% “strongly” disapproved. According to the same outfit, five years later, on Jan. 20, 2014, 49% approved and 50% disapproved, four-fifths of them heartily disapproving.

At what point did the president’s race change?

Pigs R Us


Responding to President Obama’s January 17 speech about intelligence gathering (i.e., spying on people), some anti-NSA activists opined: "Rather than dismantling the NSA's unconstitutional mass surveillance programs, or even substantially restraining them, President Obama today has issued his endorsement of them. . . . The speech today was 'historic' in the worst sense. It represents a historic failure by a president to rein in mass government illegality and violations of fundamental rights." The madcap Julian Assange commented: "I think it's embarrassing for a head of state like that to go on for almost 45 minutes and say almost nothing.”

For once I agree with the supposed progressives (although Assange could have made the same remark about any of Obama’s speeches). The president has no interest in restraining any aspect of government. In this he resembles his immediate predecessor, and the resemblance is becoming uncanny. From government stimulus of “the economy” (i.e., state employees, welfare recipients, and phony capitalists) to government interference with education to government intervention in foreign wars, Obama has been enthusiastically devoted to Bush’s causes and Bush’s ways of working. The difference is that he has been less “transparent” about how he carries on his work.

While listening to Obama’s monotonous, empty speeches, one often feels one’s mind wandering, just as one felt one’s mind wandering while one tried to listen to Bush. You find yourself doing things you seldom do. You dust that odd place behind the DVDs. You inspect the carpet to see if the edges need repair. You see if you’ve got enough cards to send next Christmas. Sometimes you lapse into fantasy. In recent weeks, I’ve been picturing myself on the last page of Animal Farm, where Clover wonders why everything seems “to be melting and changing.” How is it that when you look at the purported animals and the purported men, it’s impossible to say which is which?

An Unforeseen Development?


On NPR this morning, I heard that 525,000 people had left the American labor force in December. I couldn’t find the number on the NPR website, so I looked on the Labor Department’s. My “find” function came up empty there as well. It’s probably there, but I think you have to add and subtract a little from the relevant columns of figures to come up with it. Having wasted precious minutes, I grew impatient. I baited my Google hook with the raw number (525k) and cast it into the data sea. The number was reported on many suspect blogs, tagged with red doughnuts warning me away. Then: Voilà. An article from Economics Analytics Research, “Unemployment Rate Plunges to 6.7% in Dec. As Labor Force Shrinks; Payrolls Up Disappointing 74K”:

The drop in the unemployment rate came as a result not of new jobs, but a sharp increase in the number of persons not in the labor force — 525,000 — to 91,808,000, an increase of 2,969,000 in the last year. In 2012, the number of persons not in the labor force increased 2,199,000.

Why are people dropping out of the labor force? Some retire. Some grow weary of a fruitless job search and move in with their parents. Others migrate to the underground economy. But why the “sharp” increase at the end of 2013?

At least part of the reason may be this: before January 1, 2014, when you left the labor force early, not only did you lose any possibility of unemployment benefits but you were also probably tossed into the healthcare jungle of uninsurable pre-existing conditions, crowded emergency rooms, and lousy medical treatment.

Let us say that you are a 60ish empty nester who has been downsized. You have been looking for work for a year. Your unemployment benefits have run out and all your job leads have led nowhere. While you have a modest nest egg, Social Security won’t kick in for a few years and Medicare a few years after that. Your company-sponsored health insurance has run out and you are on the verge of applying for jobs for which you are ridiculously overqualified just to get the insurance.

But not so fast. Beginning on January 1, 2014, if you don’t have a job or more than a modest income, you are eligible for Medicaid — healthcare provided at no cost to you as a result of the Affordable Care Act. Please note: non-income assets don’t count against eligibility, and, under the new law, the allowable income ceiling has been raised (eligibility requirements have been relaxed) to allow millions more to enjoy this benefit, including the boomer described above.

Let’s face it, there are people who will choose to glide into Social Security and Medicare on the wings of Obamacare. They will choose not to take a big step down the career ladder in order to secure a benefit that is available for the asking. There is a facet of human nature that shrugs, “Why not?”

It has to be asked: was this incentive to hang it up early an intended part of the new law, or was this “sharp” shrinking of the labor force an unforeseen development?

In either case: heck of a job, guys.

Why I Worry about Global Warming


When I was in college, Margaret Mead came by and told me I wasn’t getting enough sex. Not that I needed an important scientist to point out anything so obvious, but it was nice to have official validation. And in the how-much-sex-I-should-be-having department, nobody could validate like Margaret Mead.

Margaret Mead had been in Samoa, watching from behind bushes as the improving hands of unfettered sex turned would-be hoodlums into loving, productive members of society. In Samoa, there was almost no interpersonal violence, very little crime, and no juvenile delinquency. The only reason juvenile delinquency happened in America was that juvenile Americans weren’t getting enough sex. Who could argue with Margaret Mead about something like that?

Mead had credentials. She was curator of Ethnology at the American Museum of Natural History, chair of the Division of Social Sciences at Fordham, fellow of the American Academy of Arts and Sciences, chair and also president of the executive committee of the board of directors of the American Anthropological Association.

With that one speech, Margaret Mead transmogrified a whole auditorium-load of us randy college guys into future productive members of society, every one of us on the prowl to spread peace and love all over whichever girl we ran into next. And when we ran into girls who clung to patriarchal values linking sex to marriage or, for that matter, to guys who turned them on, we had Margaret Mead and those fine-sounding credentials to corral her into the sack with.

The first glimmer that there might be more to the laid-back life in Samoa than Margaret Mead had led us to believe came years later when I occupied the office next to Chuck Habbernigg’s. Chuck had been attorney general for American Samoa, which meant he was on a first-name basis with just about everybody in the Pago Pago prison. And the people he was on a first-name basis with . . . well, not to put too fine a point on it, but Chuck’s jailbirds didn’t sound like the peaceful, sexually contented bonobos Ms. Mead had made them out to be.

The second inkling that something might be wrong came when a New Zealand anthropologist named Derek Freeman did what none of my classmates had ever done, or anybody else, apparently. He went to Samoa, checked out la Mead’s research, and discovered that she hadn’t been as rigorous as she let on. Hard as it was to imagine how such a thing could even be possible, it turned out that young Samoans got even less sex than young Americans, because Samoan parents made a bigger deal out of virginity than our parents had. And as for things like crime and social discontent . . . murder, juvenile delinquency, sexual violence, and suicide were higher over there than here. In the case of murder, much higher. The rate in Samoa was twice that of some of our inner cities.

For decades people had swallowed what Margaret Mead ladled out because nobody had the chops to call bullshit. It would have been worth the career of any anthropologist to claim that somebody as powerful as Margaret Mead, with all her chairs and important committees, was spectacularly, laughably wrong, especially an anthropologist who hadn’t gone to Samoa and done the fieldwork himself. And who’d want to do that? She had already done that fieldwork. If you wanted to go study a tribe, you’d go somewhere that hadn’t already been studied. So Freeman did the obvious thing: he waited until Mead had shuffled off to that great steering committee in the sky before he published.

Mead wasn’t the only famous scientist to hitch herself to a cartload of half-baked science, sink her teeth into the bit, and take off running. And to get millions of otherwise sensible people galloping along behind. The year after I graduated from college, Paul Ehrlich came out with a book called The Population Bomb. It was a scary book that explained in a scary, scientific way how there were so many people in the world that entire societies were on the brink of being torn apart by food riots, hundreds of millions of us were going to die, and it was too late to do anything about it.

“The battle to feed all of humanity is over,” announced Mr. Ehrlich in his most scientific way. “In the 1970’s hundreds of millions of people will starve to death in spite of any crash program embarked upon now. At this late date, nothing can prevent a substantial increase in the world death rate . . .”

Ehrlich wrote this in 1968, and his credentials were positively Meadian, so impressive were they. During his long, destructive career, he’s been president of the Center for Conservation Biology at Stanford and fellow of the American Association for the Advancement of Science, the United States National Academy of Sciences, the American Academy of Arts and Sciences, and the American Philosophical Society. Credentialwise, there’s no doing better than Paul Ehrlich.

By way of illustrating how serious the population thing had become, he included a hockey-stick graph proving just how far down the broad highway to destruction we already were. Hockey-stick graphs have become de rigueur lately with the scare-you community, and they’re pretty much all the same: a horizontal line running from the Pleistocene to the Industrial Revolution indicating not much going on until, along about your great-grandparents’ day, the line shoots upward and, voilà, the planet is pucked, Armageddon is upon us, we’re all going to die and it’s your fault.

If what you actually remember from the ’70s has less to do with food riots in the Imperial Valley and more to do with the Green Revolution and hundreds of millions of Chinese and Indians and Africans lifted out of starvation, bear in mind that the Green Revolution wasn’t something that got talked about a lot at the time. At the time, socially aware people who considered themselves scientifically literate . . . along with 58 academies of science that considered themselves socially-aware . . . became so alarmed over the fact that the rest of us weren’t willing to strangle our own children in order to save the planet that they began to think it was their duty to do something about us. Paul Ehrlich said so himself:

We must have population control at home, hopefully through a system of incentives and penalties, but by compulsion if voluntary methods fail. We must use our political power to push other countries . . .

I don’t know whether Deng Xiaoping read The Population Bomb, but the Paramount Leader wasn’t some wimpy university professor who could only rant about saving people from themselves. Deng Xiaoping was Paul Ehrlich with an army, and he had the power to see that pretty much anything he came up with happened. What he came up with was China’s one-child policy . . . and all the forced abortions, sorrow, and murder of girl babies that haunt the Chinese to this day.

In the ’30s, the issue du jour wasn’t that we had too many people in the world. In the ’30s, the issue was that we had too many of the wrong sort of people. Eugenics is the scientific name for doing something about too many of the wrong sort of people; and millions of the right sort, millions of concerned, socially-aware people, people with only the purest of motives, people who considered themselves scientifically-literate, jumped on the eugenics bandwagon. In our country, this led to anti-miscegenation laws and forced castration. In more socially-committed places, politicians used their political power in ways that sound positively Ehrlichian . . . and ensured healthy genes with gas chambers and murder squads.

In the ’70s, scientifically literate people discovered that if the rest of us — meaning me and you — didn’t clean up our industrial ways, and soon, glaciers were going to come down and scrape Manhattan off the map. Before we even had the chance to decide whether this was something we might want, famous scientist Carl Sagan — who’d spent part of his career on television and part of his career figuring out the way things are on other planets — jumped in on the side of the glaciers. Sagan was a lot smarter than you and me put together, and the debate about the glaciers was over: they were on the march, and the time had come to head down to the community college and sign up for adult-education classes in blubber chewing and igloo making.

All of those people who kept telling us Something Has To Be Done thought of themselves as scientifically literate, but none of them were. Not even Carl Sagan. Sagan was scientifically literate about television shows and atmospheric chemistry and dust storms on Mars, and the physics of particles bumping together in the rings of Saturn, but he didn’t know squat about glaciers. There aren’t any glaciers on other planets, at least not any of which the news has reached our planet. Or large, metropolitan areas waiting to be scraped away, for that matter. On most scientific matters, only three or four people in the world have enough actual knowledge to be scientifically literate.

Or not.

For the 40 years between the time young Margaret Mead returned to New York and started gathering up all those chairs, and the time Derek Freeman set out for Samoa, Margaret Mead was the only scientist in the world qualified to have an opinion about sex in Samoa. And her science was so botched, she wasn’t qualified either.

Whatever Paul Ehrlich may actually be qualified to talk about, telling people that the world is going to starve to death just as the Green Revolution was kicking into high gear wasn’t it.

No geneticist in the ’30s, a quarter century before the structure of DNA was discovered, could possibly have been qualified to say that entire groups of people should be flushed out of the gene pool. And those guys, and Paul Ehrlich, and Margaret Mead weren’t alone. They were just noisier than most. Here are some other things that socially aware, scientifically literate people have told us:

  • Tomatoes aren’t really tomatoes, they’re love apples and they will kill you.
  • Poinsettias will kill you, too, so keep poinsettias away from kids.
  • If you swim after a meal you’ll catch stomach cramps and drown.
  • If you hide under your fourth-grade desk, atom bombs can’t hurt you.
  • Go easy on the spaghetti because spaghetti is the kind of trash food that makes poor people fat. This advice was replaced by:
  • Eat lots of spaghetti because spaghetti contains complex carbohydrates, which was replaced by:
  • Don’t eat spaghetti because spaghetti is nothing more than empty calories, which was replaced by:
  • Eat lots of spaghetti because spaghetti is part of a Mediterranean diet, and Mediterranean people live to very old ages.
  • A glass of wine with dinner is good for the nerves, which was replaced by:
  • A single sip of alcohol leaves whole mountainsides of clear-cut brain cells in its wake, so never drink anything alcoholic, which was replaced by:
  • In spite of scarfing down unplucked songbirds, and sheep pancreases, and things even the Chinese won’t eat, French people drink lots of red wine, and they live longer than you do, so drink red wine, but not because you enjoy it, which was replaced by:
  • It’s not the alcohol that makes the French live a long time, it’s the grapes their wine is made out of. So drink grape juice, instead, which was replaced by:
  • It’s not the grapes, it’s the alcohol. Alcohol clears your arteries. Skip the red wine, chug down the hard stuff, and you can live as long as a Frenchman without the sulfites.
  • An apple a day keeps the doctor away, which was replaced by:
  • Modern-day factory farmed apples come coated with Alar. Alar is the most potent cancer-causing agent in our food supply, so don’t even think about touching an apple unless you are wearing a hazmat suit, which was replaced by:
  • Alar is nothing more than an apple growth-regulating hormone and doesn’t have anything to do with people, so go on, eat apples.
  • Bumblebees can’t fly.

But that one was based on a faulty mathematical model, which brings us to mathematical models in general. In general, researchers fall back on mathematical models when whatever they’re trying to figure out is too complicated for them to understand.

Nowadays scientists run their mathematical models through computers when they want to figure out something that’s too complicated to understand. Sometimes the computer models are so complicated, nobody understands them, either . . . especially where weather and supercomputers are involved. Which, now that global warming is à la mode, leads to questions nobody has answers to.

When people mention that we just had the hottest summer in half a century, they never say what happened 51 years ago to make things even hotter, because the computer wasn’t programmed to tell them.

When you ask why, if the oceans are beginning to boil away, is there so much more sea ice around Antarctica than there used to be, all they can answer is that the science is complicated, and they’re right. The science is complicated. It’s too complicated for the scientists who do that kind of science to understand. It’s way too complicated for scientists who do other kinds of science to understand. And as for the people who don’t do any science at all, such as the ones trying to persuade you that the whole thing is too complicated for you to understand, they don’t understand it any better than you do.

The very best that anybody can do with questions like these is to compare what the computer spits out with what seems to be going on in the real world. That’s easy with bumblebees. When your model tells you bumblebees can’t fly, you know something’s wrong with the model. When the model tells you summers should have been heating up for the past 15 years, and they haven’t been, maybe the computer hit a patch of short-term bad luck involving natural variations in weather patterns, and things really will heat up when the computer’s luck changes and the weather gets back on track.

Or, maybe, the sun ran out of spots for a while, the way it did in the Little Ice Age. And global warming is the only thing between us and freezing to death.

Or an increase in forest litter in the tropics is soaking up the carbon dioxide.

Or, maybe, all the sulfur compounds that Chinese coal-burning plants have been dumping into the air are shielding us from the solar gain we’d be getting if the Chinese were running their factories on natural gas.

Or the sudden, rapid growth of trees in the Siberian and Canadian sub-Arctics is swallowing up carbon dioxide as fast as the Chinese can generate it.

Or calcium in the ocean is turning carbon dioxide into limestone.

Or it’s all part of some long-term cycle having to do with Ice Ages. Carl Sagan was right, and the glaciers are coming for New York after all.

Or . . .

Or . . .

Or, could be, something is wrong with the model.

My money says we’re having a Margaret Mead moment: the science isn’t good enough, and nobody knows what’s going on. Not the computer programs. Not the people who write the computer programs. Not the scientists who study global warming. Certainly not the scientists who don’t study global warming. Or the hordes of socially aware laymen who consider themselves scientifically literate. And, most of all, not the politicians, pundits, and public intellectuals who built their careers on global warming.

The difference between me and these folks is, I know I’m scientifically illiterate. I am to science what a student at a madrassa is to the imam. All I can do is rely upon him to repeat the sacred texts to me. But with all the nonsense that’s been spoon-fed to us in the past, I’m going to ask questions before I get stampeded into doing something that doesn’t agree with the way the world looks to me. So, when someone who fancies himself scientifically literate tells me bumblebees can’t fly . . . and I look out my window at a whole gardenful of bumblebees buzzing around, I’ll need an explanation I can understand before I start claiming those bees aren’t flying.

Could be the global warm-mongers are right. Could be that God really does have an emerald palace all fitted out with rivers of non-alcoholic wine and six dozen amnesiac maidens waiting to be deflowered just by me so they can forget about it the next morning and start over again as virgins . . . if I’m righteous about not running the air conditioner. But I’d need more than the word of somebody who hasn’t been any closer to Paradise than I have before I turn off the AC on a summer’s day.

When the kid sitting cross-legged on the mat next to mine stops bobbing his head as he memorizes yet one more sura, and tells me that if I don’t quit driving my car the ocean will swell up and wash away Denver, I’m going to want to know what happened in Colorado a thousand years ago when the weather was so warm that Vikings were homesteading in Greenland.

Could be there’s an explanation for that, but I’d need to hear it before I start passing laws to force people to raise their children in squalor because the things they need to do in order to lead decent lives are too wasteful and antisocial for the rest of us to countenance.

I’d need better proof than Margaret-Mead-knows-best before I recommit to the silly personal values of the ’60s. And I’d need a lot better proof before I start castrating people I don’t think are as smart as I am, or forcing young mothers to have abortions, or condemning entire populations to gas chambers . . . or millions of people here, and billions in other places, to lifelong poverty because I don’t think they should burn coal or gasoline or nuclear energy.

Crisis Communism


No law has drawn more ire from libertarians and conservatives than Obamacare. The idea of the government using its power to punish people for making a free and informed decision not to purchase health insurance, justified by the noblest-sounding idealism of "lowering costs" and "increasing access," is obvious pavement for the road to socialism. If the government has the right to impose economic decisions on us, then capitalism is finished.

My own view is that, contrary to conventional libertarian wisdom, Obamacare gets some things right. I have a history of health problems and the end of exclusions for preexisting conditions benefits me greatly; without it I probably would not have health insurance. I also like the Obamacare health insurance exchanges, because they enable plans to compete for buyers, and competition is the engine that lowers cost and improves quality. In terms of preexisting conditions, and the lack of competition among plans, I think the old system was broken and the new system is better.

But my point is that these good things could have been achieved through deregulation. The flaws in the old system were caused by government control, not by the free market or the greed of insurance companies. In fact, greed is a main motive of Obamacare's insurance-company backers, who love a law that forces people to buy their products and pay them more money.

Here I posit a theory that I call Crisis Communism: when the government interferes in the free market it causes a crisis, which the socialists then use as an excuse for greater government interference, justified by the need to end the crisis. Thus regulation produces a downward spiral toward Marxism. One good example is the Great Depression. The Federal Reserve caused it; then the New Deal was offered as a solution — which made it worse.

In the field of health insurance, two regulations precipitated the crisis "solved" by Obamacare. First, the complex of laws and codes known as ERISA (associated with the Employee Retirement Income Security Act of 1974) tended to force health insurance to come through a worker's employer, so that the employer chose the plan, which killed competition for plans among individual consumers. Second, the state insurance commissioners issued detailed regulations dictating what a health insurance plan was allowed to cover and what benefits it could offer. The advocates of Obamacare blame the free market for a bad system, when the fault really lay with state socialism.

I want Obamacare repealed. But if we are to repeal Obamacare, then we must also repeal ERISA and all state health insurance regulations, so that free market competition can force health insurers to make plans available at prices that people want to pay for the benefits they want and freely choose to purchase.

Progress and Poverty


I remember R.W. Bradford, founder of this journal, testing a new keyboard by typing out, “Good news — the depression is over, and the banks are filling with money.” Anyone else would have written, “The quick brown fox jumps over the lazy dog.” But Bill liked news, and when he could find it, good news.

So I want to begin with some good news. The year now ending witnessed significant reductions in the rates of certain linguistic crimes. And since “law enforcement agencies” (a.k.a. cops) always take credit for any accidental lowering of a crime rate, this column gladly takes credit for these reductions. Congratulations, Word Watch.

After years of pointing out that “begging the question” doesn’t mean what you might too hastily assume it means (the prompting of an inquiry) — that it means, instead, a species of logical fallacy (arguing in a circle, using a proposition to prove itself) — I am happy to find that many public speakers now realize where the trap door is hidden, and do their best to avoid it. The people on Fox News practically break their necks getting to the other side. They used to put “that begs the question” in every other sentence, and always in the wrong way. No more. Now, just when you see that they’re dying to say it, there’s a pause, a deep breath, and a slow rephrasing: “That . . . uh . . . poses the question”; “That . . . leads to the question”; “That . . . makes me want to ask you . . .” Somebody obviously told them to read Liberty.

After years of hammering away at the ridiculous idea that President Obama is a great, or even a good, writer and speaker (a hammering that could be heard as recently as last month’s Word Watch), I am gratified by some faint signs that conservatives don’t always feel obliged to begin their denunciations of an Obama utterance by saying, “Despite his soaring rhetoric,” or “The president’s actions are not as inspiring as his words.” They should be saying, “Despite his bathetic attempts at rhetoric” and “not as insipid as his words,” but that may come later, when pundits learn the existence of “bathetic” and “insipid” — in short, when they read Word Watch more often.

And after years of insisting that celebrity is not the same as significance, or even fame, I find curious indications that Word Watch may be exerting some influence on the crude but candid (i.e., free) media. I refer, for instance, to the reader comments that appeared on TMZ, following the death of Paul Walker. Walker was an action film star. He liked fast cars. On November 30, he was killed in a speeding car that went out of control and hit a light pole. It was a horrible accident, and the reader comments on TMZ were appropriately sympathetic. But they were more. They were self-dramatizing in a way that has become predictable after every death of anyone who might conceivably be regarded as a public figure. Hundreds of readers proclaimed themselves devastated with grief on behalf of Walker, his family, and his friends — people with whom these readers had no acquaintance whatever. Finally, someone had had enough. “Sorry,” he wrote, “RIP, our prayers are with the family, etc.....who is he?”

It’s a good thing that TMZ, like Word Watch, exists in cyberspace, or there would have been mob violence. But somebody had to point out that heartfelt feelings are often nothing but words.

Celebrity is fleeting, and even authentic feelings pass away, but some things never leave us. Word Watch can’t do anything about them. For God’s sake, even the second George Bush is back. He is daily proclaimed “more popular than President Obama.” When you think of it, this isn’t saying much. But now he is being cited as a film authority — and in the most gruesomely authoritative way. In late November, ads appeared for a movie called The Book Thief, and these ads said, “The critics are raving . . . . And President George Bush raves, ‘It’s a truly wonderful movie.’” He certainly put a lot of energy into that one. Not only wonderful but truly wonderful. But what truly conveys the feeling of the perpetual, the eternal, the Egyptian pyramidal, is that word “raves.” Raves. The expression has screamed at me from every film ad I have ever had to sit through. The critics are raving. Even a former president is raving. And as always, the New York Times raves. They’ve all gone crazy together.

Well, let them. We’re used to it. But must we get used to the steady seep of ignorance into the foundations and concrete basements of our language? I know you have your own examples; here are three of mine:

1. The effort to make “which” a universal connective: “I bought a new place in Vista Hills, which I didn’t realize the taxes were so high.”

2. The loss or mangling of strong verbs, and the creation of dumb replacements for them. It’s bad enough to hear that “the suspect spit,” not spat, “at the arresting officer”; but must we hear “spitted at him”? And why can’t people realize that the past tense of “fit” is “fitted,” but the past tense of “shit” is “shat”?

3. The growing movement to ignore the rules about comparatives and superlatives, whenever their use requires a split second of thought. Example: a journalist on Greta Van Susteren’s show, commenting (December 10) on the latest Quinnipiac poll about Obama: “It’s on healthcare that people are ranking him the most low.” Most low? The superlative of “low” is “lowest.” Is that too hard? Yes, if you can’t figure out what to do when an adjective gets two words away from its noun.

“Most low” exemplifies a general problem — people’s increasingly evident inability to keep track of their sentences. Leland Yeager, a friend and expert advisor of this column, has collected many instances of the problem, including offerings by such respectable journals as The Economist and the Wall Street Journal. Try these exhibits from the Yeager museum of unnatural history:

“A key benefit to [sic] offshore wind power is the lower rate of wind turbulence at sea vs. on land” (WSJ, June 19, 2008). As Yeager suggests, why not just write, “A key advantage of offshore wind power is less wind turbulence at sea than on land”? But here is early documentation of an illiteracy that continues to spread: the use of “versus” (“vs.”) to mean “than.” What next — “My kid is smarter vs. your kid”?

Commentators “take great pride in emphasising how much more sophisticated civilization was in Japan in the 11th century compared with Europe at that time” (Economist, Dec. 20, 2008). It doesn’t take much to compete with the medieval West. But what exactly is being “compared” — “the 11th century” and “Europe”? No, it’s supposed to be . . . let’s see . . . it must be levels of sophistication in Japanese and European civilizations in the 11th century. Commentators apparently like to emphasize the idea that in the 11th century Japan was more sophisticated than Europe.

That’s one way of reforming the sentence, and you can easily think of many others — none of which occurred to the writer. But there are sentences that just make you want to give up and head for the bar. If you have any interest in economics, you’ve seen too many sentences like this one, which Yeager recovered from the Federal Reserve Bank of St. Louis Review (Sept.-Oct. 2008):

But the embedded leverage in these products meant that end-investors were often buying assets with much greater risk characteristics compared with the underlying pool of mortgages, credit card debts, or loans than they might suppose.

Do scholarly journals still have editors?

Still, the great producers, the great fecund sows, of deformed prose are politics and bureaucracy, and that queen of all sows, political bureaucracy: always ignorant, always talking, always striving to influence, always striving, simultaneously, to obscure the truth. The Obamacare fiasco has borne teeming litter after teeming litter of repulsive words. Any example will do, but let’s look at a little missive by the irrepressible Julie Bataille, director of communications, Centers for Medicare and Medicaid Services (November 22, 2013). Remember, as you read, that she is a director of communications.

“Today,” she begins, “Jeff Zients [the wizard that Obama appointed to clean up the mess he had made of the merry old land of Oz] offered an update on our efforts to improve; data on key metrics on site performance, the progress made this week and the view looking forward.”

Already you know you’re in trouble. You know that Bataille has no intention of rushing forward with any facts. If she did, she would say up front what’s wrong with the site, instead of tucking “site performance” into a box called “metrics,” tucking that box into one called “data,” and tucking that one into an “update” that was “offered” by somebody else. How about just giving us the data? We know that an update on “progress” assumes that progress has been made — but that’s the topic of debate, isn’t it? Could Bataille be begging the question? Clearly, she is a very bad writer. She’s going to give us nothing but happy talk, and the happy talk will consist of slick-sounding clichés, such as the progressive “view looking forward.” Turning worse into worst, she will mangle those clichés. To her, a “view” looks.

“In late October,” she continues, “we appointed QSSI as the general contractor to deploy their expertise in technology and program management to lead this project forward.”

So. Since late October, when the nation, as distinguished from Ms. Bataille, realized that Obamacare was a hideous disaster, something called QSSI has been leading the project forward. (There’s that word again.) But how is that leading accomplished? What’s been happening? Oh, it’s all very technical. Let’s just say that the company (singular), here regarded as they (plural), deploy their expertise. Expertise, one gathers, is like an army. Division 1: Attack that defective code! Division 2: You’re in reserve; wait behind the hill. Division 3: Lift the siege of Fort Obama!

“The team from QSSI continues to work with people from CMS [can’t have enough acronyms] and other contractors around the clock [can’t have enough clichés, either] to troubleshoot the system, prioritize fixes, and provide real-time management decision making.”

So you can “troubleshoot” a “system,” can you? I suppose, then, you can “troubleshoot” almost anything. “Hey, honey, I just wanta troubleshoot ya.” OK. But I draw the line at prioritizing fixes. It just sounds so gruesome. As for “real-time management decision making,” does that mean that some management decision making is performed in unreal time? Maybe that’s what went wrong with Obama . . .

We haven’t reached the end of Bataille’s memo — that’s a very long way off — but we have reached the climax, which she has cleverly deployed in the middle. And this is it:

“Thanks to this team effort, we have made measurable progress.”

Measurable progress. Let’s consider how such phrases might work in real time.

Automobile passenger: “Hey, what’s the speed limit, anyway? Seems like we’re going awful slow.”
Automobile driver: “No, we are making measurable progress.”

Airline seat holder: “How long before we get to Cleveland?”
Airline attendant: “We are making measurable progress, sir.”

Employer: “When do you expect to get that project done?”
Employee: “I am making measurable progress.”
Employer: “You’re fired.”

Bataille’s communication, horrible as it seems, is a fair sample of the words oozing out of Washington. If you’re like me, you’ve often wondered: do people who write this kind of prose actually think the way they write? Are they just prowling across their keyboard, trying to find enough words to bamboozle everybody else, or does it all come spontaneously and sincerely to them? When their car breaks down, do they look for expertise that can be deployed? When the guy from Triple A arrives, do they reflect that measurable progress is now being made? Which alternative is more terrible to contemplate — that kind of cunning or that kind of sincerity?

One Flapper Escapes the Trap


America’s glorious War on Drugs is viewed with increasing skepticism. Because people keep proposing different variations of it, we never stop talking about it. But we keep talking about it in the same way. Public debate almost always dwells on the superficial aspects, rarely touching those closest to the heart of the matter.

The argument that addiction to, or abuse of, certain substances is of greater concern to “society” than it is to us as individuals is the basis of every form of prohibition. It claims that we belong to others more than we do to ourselves. But to prohibit certain substances because people might abuse them is a violation of human dignity. If our lives are “society’s” more than they are our own, then we are something less than entirely human.

I’ve never used illegal drugs. Even though I was a teenager during the seventies, when supposedly “everybody did it.” Was that because drugs were against the law? I don’t think so.

I didn’t hang around with people who had access to anything stronger than marijuana. And I had plenty of opportunity to see how that affected them. It made them stupid, and it made them stink. I didn’t want to be stupid, and I didn’t want to stink.

As an adult, I became addicted to an entirely legal substance: alcohol. Would I have used it if it had been illegal? As illegality wasn’t what deterred me from smoking weed, it probably would have had little to do with keeping me from drinking. I liked the taste of booze, and it made me feel powerful and utterly brilliant. It was fetishized (by the “society” to which I supposedly belong) as a rite of passage to all things grown-up and glamorous, and those were exactly the things I wanted to be.

Had I been a flapper in the speakeasy days, I’d have been swilling gin and dancing the Charleston right along with the rest of them.

Perhaps sensing the utilitarian coldness of the “society owns us” line, many prohibitionists appeal to our Inner Five-Year-Old. They simply care about us — more than we may care about ourselves. But why does their concern for us take precedence over our own? It comes around, no less than the other argument, to claiming that somebody else is more important than we are.

Their concern purportedly trumps ours. But I’ve known many alcoholics and other addicts who are valiantly battling their addiction. And not one of us got clean or sober because anybody else wanted us to. Any recovery program will tell you that is never enough. If we live and recover instead of giving up and dying, it can only be because we value ourselves enough to believe that our lives are worthwhile.

No one else can make you value yourself. Nor is it likely to add to your estimation of yourself to be told that somebody else’s interest in you is more important than your own. None of the people who have overcome an addiction to illegal drugs did so because of such an appeal. That wouldn’t appeal to anybody. Which is probably why — since it is the argument so often used — so many people are hooked on illegal drugs.

The drive to illegalize booze got traction during the industrial revolution. The saloon became the place to be counted, herded, and manipulated into voting as the powerful desired. Might this not have been because people had already begun to feel more like sheep than like human beings? Could not the desire to intoxicate oneself into oblivion have something to do with the abuse of alcohol (and drugs) in the first place?

How, then, will playing upon the sense that somebody else owns us — that we are not people in our own right in any meaningful sense — make us want to drink or use drugs any less?

Within every individual is that spark of humanity that gives us our identity. That recognition of our own worth. It goes beyond the mere survival instinct found in animals, driving each of us not only to exist, but to live. To strive for wisdom and achievement. To be free not simply from some trap (the highest aspiration of an animal), but to pursue a higher purpose.

I got sober — and stay sober — because I want to live the fullest life possible. The more “society” permits the liberty for human beings to reach their potential, the less attractive an escape into intoxication will be. Then prohibition schemes of every sort will be as dead as the flappers and bootleggers of our past.


© Copyright 2018 Liberty Foundation. All rights reserved.

Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.