Unfair Competition from Robotland


This campaign season brings many complaints about “shipping jobs overseas.” Candidates promise to crack down on the offending corporations. American workers and the United States as a whole must compete on a slanted playing field against foreigners paid well below a dollar an hour. Moreover, the foreigners manipulate their currencies. They buy less from us than we from them, putting the US into a trade deficit (more exactly, a current-account deficit) costing us many billions of dollars a year. China, Japan, and Mexico count among the worst offenders. Free trade is fine, but only when it is fair.

In a similar but imaginary scenario, technology has advanced so far that “Robots” (in a stretched sense of the word) displace American workers at costs equivalent to Robot wages of 50 cents an hour. What is the difference between shipping jobs to Bangladesh and shipping jobs to Robotland? Well, Robotland does not have a balance of payments, so it cannot be accused of buying less from us than we from it, fleecing us of the difference. In the real world, automatic market mechanisms, if allowed to operate, forestall worrisome trade deficits and surpluses; and if the foreigners do make unbalanced sales to us, what can they do with the money? They acquire American bank accounts, securities, and properties, so supplying us with financial capital on advantageous terms.

What sense does the notion of one country competing with others have? Does it mean that international trade is a zero-sum game, with countries squabbling over shares in a fixed total of gains? On the contrary, international trade and advanced technology are alike in making desired goods more abundant. One country’s relatively low standard of living would trace to technological and entrepreneurial backwardness and perhaps to bad government. It would be absurd to blame its relative poverty on incompetent trade-policy negotiators.

One country’s relatively low standard of living would trace to technological and entrepreneurial backwardness and perhaps to bad government.

In the real world, conceivably, Robotland technology might displace many American workers, inviting Luddite arguments. I do not want to get into that issue here. I merely ask what the difference is between the scenarios of foreign competition and robots.

I wish that today’s vapid political debates could give way to ones with candidates testing one another’s policy-relevant understanding by posing questions like the one about robots. Other questions might be: How do your trade-policy proposals square with the principle of comparative advantage? What light might the absorption approach to balance-of-payments analysis shed on a connection between a trade deficit and a government budget deficit? In what sense is the Social Security trust fund a reassuring reality and in what sense a deceptive farce?

Unfortunately, such questions would not faze Donald Trump. He would respond with vicious personal insults and with reiterations of his own excellence. Anyway, allowing such questions could be entertaining. They might even enlighten some voters.






Poor Little Me


According to Madeleine Albright, I’m going to hell. As is every woman who isn’t voting for Hillary Clinton. All I can say is that heaven won’t be much of a paradise if it’s populated with the fools who are.

But if a lot of other self-proclaimed leftist smarties are right, I can’t go to hell, because I don’t even exist. After all, I’m a female libertarian. Further complicating matters is that now the progressive Left has decreed that gender does not exist. So not only am I going to hell (though I don’t exist because I’m a libertarian woman, and hell doesn’t exist because these people don’t believe in it), but I can’t be a woman because gender is nonexistent. Color me confused.

I don’t think I can even call myself a left-libertarian anymore. I want nothing whatsoever more to do with the Left. I’m glad that in 2016, a woman can run for president and be taken seriously, but the possibility that Hillary Clinton is not only running, but just might win, makes my blood run cold. I guess progressives still want me to be a woman so I can be a good little victim and vote for her. These people are so crazy, they make me want to run for my life.

If one out of every two people on the planet were helpless against the other, our species would have died out hundreds of thousands of years ago.

As a woman, I am expected, by the so-called progressives who have taken out a copyright on feminism, to sit around crying, “Poor little me!” I refuse to do that, not because I hate every woman in the world, or fail to care about our rights, but because I’m not an idiot. If I am not very much mistaken, we have been half of the human race since the Garden of Eden. Which means that over the millennia, we’ve had every bit as much to do with how things have turned out as men have. If we haven’t, then we’ve all been idiots.

According to the sort of history I’ve been taught since I was a girl, men have always been awful brutes — while women have been just sitting there and taking it. That doesn’t correspond to the history of my life, or the lives of most of the women I’ve ever known. I don’t even think that most of us could possibly believe it. If we were such ineffectual feathers in the wind, we’d never muster the will to get up in the morning.

My philosophy of politics and history is one in which every individual will has an influence on the whole. Events unfold as they do because of the interaction of multitudes. This was one of the aspects of libertarianism that attracted me from the start: everybody counts. The human drama is far too unruly to be centrally planned or collectively organized. If one out of every two people on the planet were helpless against the other, our species would have died out hundreds of thousands of years ago.

Certainly the rules by which we’ve played haven’t always been fair. It’s appalling to me that my grandmothers — each of whom had as much sense as any man I’ve ever met — couldn’t vote until 1920. But that arrangement was OK with most of the women in this country until it wasn’t anymore, after which it was changed. Women do as much (if not more) to keep each other down as men have ever done to oppress them.

What we dearly need is not an amendment to the Constitution, but an adjustment of attitude.

A crucial reason why women have lacked the power wielded by men is that men tend to be loyal to one another, and women do not. We compete with one another so fiercely and viciously that men shudder to think of it. They may kill each other in wars, but the rest of the time they manage to cooperate pretty nicely. We undermine and sabotage each other nearly every day of our lives from nursery school to nursing home.

Although I’m gay, I never liked playing with little girls when I was a kid. They made me nervous. One day they’d be friendly, the next they’d get mad — for no apparent reason — and the day after that, they’d be sugar and spice once again. I rarely trusted them. Most of my friends were boys, because they were temperamentally pretty much the same, day in and day out. I usually knew what to expect.

In my adult life, most of the really treacherous things ever done to me have been done by women. A lot of women have been kind and supportive, too, and it would be unfair for me to overlook them. But all along the way, I’ve benefitted from the support, encouragement, and mentorship of a variety of men. As has every other woman who has ever succeeded at much of anything in life — whether she’ll admit it or not.

I regard it as highly offensive when I’m informed that I should vote for Hillary Clinton because she’s a woman. It’s utter nonsense to suggest that this is any less sexist than the notion that a guy ought to vote for Donald Trump, Ted Cruz, or Bernie Sanders because they’re men. It will be “our turn” to be president when the majority of men and women determine that a female candidate is worthy of the office.

Women finally got the vote because enough women thought that every other woman deserved the franchise. When we get over the inferiority complex that tells us that men’s opinions of us carry more weight than our own of ourselves and one another, that’s when we’ll finally “achieve equality.” As long as we allow the political left to convince us that we’re helpless and victimized little nitwits, that’s exactly how we’ll behave. What we dearly need is not an amendment to the Constitution, but an adjustment of attitude. We’ve got vastly more power than we think.






Free Speech — A Losing Candidate?


Consider these two arrangements of the same story:

  1. March 11 — Violence caused the cancellation of a Donald Trump rally in Chicago after Trump denounced people opposed to his candidacy. People who came to protest against Trump were fought by Trump supporters.
  2. March 11 — Violence caused the cancellation of a Donald Trump rally in Chicago after protesters entered the hall and fought with Trump supporters. Trump had previously denounced protesters who appeared inside his rallies.

Both versions are true. But the first of them is a piece of propaganda, designed to get people to vote against Trump.

It’s easy to write such things. Try your own hand at it — maybe you can get a job with one of those big media outlets that are spending a lot of time blaming Trump for the violence of their own allies, the ’60s refugees and college clones who want to make sure that only one Great Thought gets heard in America.

But before I share any more of my own great thoughts, please take this brief version of the Minnesota Multiphasic Personality Inventory. Have you already concluded that I who am writing this am a supporter or a detractor of Donald Trump?

Others spoke of their campaign for “compassion and understanding,” thus making theirs the first riot ever staged for compassionate purposes.

If your answer is Yes, you have jumped to a conclusion, and you will interpret all subsequent sentences as further proof of your opinion. You will also conclude — or you have already concluded — that I am either a racist, a sexist, a xenophobe, or an American patriot who just wants to see something done about the mess in Washington. I’m sure your ability to divine these things will be gratifying to your self-esteem.

But if your answer is No, then you are qualified to read what follows. Reading involves, among other things, the ability to identify what a piece of writing is about. This piece of writing is not about Donald Trump or my opinion of Donald Trump. It’s about a massive default from the principle of free speech.

Trump’s rally on March 11 was shut down by a mob of leftists, many of them carrying Bernie Sanders signs. Sanders was not behind the action, but his political faction was massively involved. In the for-once-apt words of a police union spokesman, “it was a planned event with professional protestors.” A week or weeks in advance they had planned what they were going to do and how they were going to do it. Once in occupation of the hall where Trump was supposed to speak, they alternately shrieked in well-rehearsed apoplexy and danced and giggled with delight. When attempts were made to interview them about what they were trying to do, most refused to admit knowing any motive. Others spoke of their campaign for “compassion and understanding,” thus making theirs the first riot ever staged for compassionate purposes.

In short, this was the gross and obvious use of a mob to deny free speech and assembly to one’s political opponents.

That being, as I said, obvious, I confidently awaited the outrage that must surely follow, even from the American political establishment. But I was disappointed. Every online headline I saw made it sound as if Trump had attacked his own rally; every article arranged the story so as to picture “violence” being spontaneously ignited by the presence of people who support Donald Trump, or actually and solely begun by them. The worst headline I saw — but there were probably even worse — was this from the Washington Post:

‘Get ’em out!’ Racial tensions explode at Donald Trump’s rallies

In truth, the only connection with “race” was the presence of Black Lives Matter activists and other people screaming about Trump being a “white supremacist,” which of course he is not. Trump is a jackass who happens to be white. Other people are jackasses who happen to be black. In neither case does race matter. But if you want to claim that someone is a white supremacist, just go to the Washington Post, and they’ll give you a headline. That headline is your license to destroy the right of free speech that allows the Post to enjoy its own ridiculous life.

Similar events continue. When protesters disrupted a Trump rally in Arizona on March 19, after trying to prevent Trump from even reaching the venue, the CBS News headline was “Violence Erupts at Donald Trump Rally in Tucson.” Clever, very clever. Omit the human agents — the people who want to shut Trump up — and make it appear as if Trump were some dangerous natural phenomenon that may “erupt” at any time. The message? Get away from Trump.

if you want to claim that someone is a white supremacist, just go to the Washington Post, and they’ll give you a headline.

This is shameful dishonesty. But silly me, I was half expecting leading Democrats to be embarrassed by the mob behavior of some of their supporters. Had it been a Republican mob that attacked a rival political campaign, we would never have heard the end of the Democrats’ outraged demands that all Republicans immediately repudiate such fascist tactics. For the Democrat establishment, however, Trump was the fascist. Sanders showed not a hint of shame about his followers, and no questioner tried to get him to. Mrs. Clinton lost no time in denouncing “the ugly, divisive rhetoric of Donald Trump, and the encouragement he has given to violence. . . .” “If you play with matches,” she said, after exhaustive research in America’s vast storehouse of domestic clichés, “you can start a fire you cannot control.” Well, that much was to be expected from such an implacable proponent of objective law as Hillary Clinton.

But let us return to the problem of fouling the well you drink from, which is the giddy enterprise of the Washington Post and other journals that value free speech mainly because it’s good for people who agree with them. Consider Trump’s Republican rivals. Long victims of media slanders about their party’s supposed alliance with the supposed racists and violence-mongers of the Tea Party, Republicans might be expected to insist on free speech and fair play for everyone, but especially for themselves. Well, don’t expect anything like that. When push came to shove at the Trump rallies, they preferred to blame the victim, a fellow Republican, and try for a cheap political advantage.

Ted Cruz asserted that if you talk as Trump does, “you’re creating an environment that only encourages” violence. This from the man who has been mightily, and unfairly, blamed for inciting the wrath of other Republican senators — by refusing to give up his right to free speech.

John Kasich repeated, like a mantra, “Donald Trump has created a toxic environment . . . Donald is creating a very toxic environment, and it’s dividing people.” Note to Kasich: what is a “toxic environment”? Another note to Kasich: Aren’t you “dividing people” whenever you disagree with somebody? A third note to Kasich: why are your clichés of a higher intellectual quality than Donald Trump’s?

Republicans might be expected to insist on free speech and fair play for everyone, but especially for themselves. Well, don’t expect anything like that.

But it was left to Marco Rubio — who as I maintained last month is not a bad talker, so long as he’s talking one-on-one and about something specific, instead of standing on the balcony to deliver the papal blessing — it was left to Rubio to deliver the most inane remark of this supremely inane political season:

Presidents and presidential candidates cannot just say whatever they want.

I guess not. And I guess that’s what makes their sayings so profound, so probing, so candid, and so trustworthy.

Republican operatives were singing from the same page as the candidates, or vice versa. To cite one of many examples, Guy Benson, political editor of Town Hall, an outlet for conservative and sometimes radical conservative ideas, and attempts at ideas, used an interview with Fox on March 17 to accuse Trump of “fomenting violence.” To cite another, Doug Heye, a “Republican strategist and advisor,” lamented to Fox’s eager ears that attention had been stolen from Rubio’s campaign by the riot in Chicago, while his interviewer, Shep Smith, noted that some people thought the riot was actually contrived by Trump. To be fair, Heye then said that although Trump used “bigoted” language, he was “not a bigot,” and he himself would vote for Trump if he were nominated.

Let’s pause for a moment, and meditate upon these samples of the Republican mind at work.

If you’ve ever suspected that the political leaders of our nation are just not that bright, here is new evidence. Trump’s political appeal is known to result very largely from his warfare against the politically correct Left, an ideological formation that is feared and despised by almost everyone in the country who doesn’t have a Ph.D., work for a Human Resources department, or hold office in a safe Democrat district. In fact, it’s hated and despised by many people who do fit those descriptions; they’re just afraid to admit it. And as Trump’s Republican opponents have good reason to understand, this aspect of his political appeal is very strong. They also know that Americans traditionally resent blatant attempts to shut people up. They may try to shut people up themselves — specific people on specific occasions — but in the abstract, at least, they dislike the process. They have a feeling that it’s unfair, undemocratic, counterproductive. That feeling also is very strong.

In these circumstances, what would any Republican politician with brains more powerful than a bowl of jello have to say about the politically correct attacks on Trump’s rallies? He would say, “As you know, I am opposed to Donald Trump’s nomination on the Republican ticket. Nevertheless, I believe that all Americans should condemn the dastardly attempt of political radicals and supporters of the Democratic Party to do something that has never been done in American politics — prevent a candidate from running for the high office of president of the United States,” etc., etc., etc. Anyone could write that speech, which would appeal to virtually everyone in the country and position the speaker as morally superior not just to the Democrat mob but also to Donald Trump, whose own protests might be written off as merely self-interested.

Even a child might pity the obvious phoniness and insensate self-interest of Clinton's attempt to escape from being criticized.

But that’s not what happened. It was one of those moments when the fortunes of the Grand Old Party were magically aligned with those of high principle and popular sentiment, and the GOP not only missed the moment but disgraced it. Its candidates and spokesmen actually thought that their own self-interest was involved, not with the assertion of ideas that almost everyone holds, but with the petty advantage to be sought by suggesting that their chief opponent deserved whatever bad things happened to him. In the process, they gratified the politically correct people who can barely force themselves to vote for Hillary Clinton, let alone some low-life Republican, and they morally outraged the legions of Trump supporters whose assistance they themselves require for victory.

It seems very childish to point this out. But our politics (not without the help of Donald Trump) have become so childish that anyone who knows that C-A-T spells “cat” is operating with an enormous intellectual advantage over the other kids.

I should have reached this conclusion about the prevalence of baby talk and infantile tantrums when Hillary Clinton went before the Benghazi committee and screamed, with well-rehearsed outrage, “What difference does it make?” Even a child might pity the obvious phoniness and insensate self-interest of her attempt to escape from being criticized. Yet the august organs of public opinion hailed it as an unanswerable defense of her actions. Only later did they sense that there might have been some slippage in the public relations department: everybody but them considered Clinton’s tantrum the worst performance ever presented on TV. So why should I be surprised by the need to suggest that America’s deep political thinkers may have missed a few other things — things that even some non-pundits understand?

Among those things are the following reflections:

  1. It’s wrong to blame the victim, whether the victim is sensible or not, likable or not, or any other not. A woman who is robbed while walking down a dark street is not responsible for being robbed, even though “she should have known better” than to walk that way. A man who ventures into “a bad neighborhood” with an expensive watch — ditto. A person who makes rude remarks from a public stage is not to blame if someone organizes a mob to kick him off the stage. Even a blowhard who goes around saying, “If anybody tries to kick me off this stage, I’ll hit him in the face” is not to blame if, yes, somebody tries to kick him off the stage. We are not living in the old Soviet Union, which had such tender feelings that any rude remark became a provocation. Weighing rights on the scale with provocations is an excellent means of getting rid of rights, and that’s why it is the consistent practice of dictatorships.
  2. Whether Donald Trump was being jocular or not when he suggested to his listeners that if somebody caused trouble, people in the audience would be justified in taking physical action against that person, those remarks had nothing to do with the invasions of his rallies. If talking offhandedly about violence actually incited violence, then half the stand-up comics and three-quarters of the leftwing demonstrations in this country would be guilty of inciting violence. If Trump had said absolutely nothing about any kind of violence, the people who turned out to “protest” his alleged racism and sexism would still have turned out to “protest” his alleged racism and sexism. That’s what their signs said they were doing. Logically, anyone sincerely moved to protest Trump’s rude bellowings would want to do so by exhibiting the opposite behavior. But that’s not what makes a mob. For that you need bullhorns, filthy slogans, and, yes, actual violence. When other Republicans maintained that Trump was getting what was coming to him, they were siding with people who would cheerfully raise the same kind of mobs against them.
  3. When it comes to free speech and free assembly, it makes no difference whether someone is pleasant or unpleasant, or even whether he is a “racist” or some other offensive something. Free speech isn’t about allowing your sweet old grandmother to discuss how much she’s always admired Mother Teresa. Neither she nor her admiration requires protection. It’s unpopular views and unpopular people that require protection, and they are guaranteed protection by our national charter.

So much for my review of ideas that should have occurred to everyone, but obviously have not, although there is nothing more important in the realm of words than everybody’s right to use them freely.







Cuba, Obama, and Change


Although Republicans and, no doubt, the Castro brothers perceive President Obama’s visit to Cuba on March 20 as American kowtowing, the perception on the Cuban street is entirely different.

To Cubans, the visit is an honor ranking right up there with the Pope’s visit, and one not vouchsafed to the island since President Coolidge’s visit in 1928. El mulato, as he’s informally referred to in the Cuban fashion of conferring nicknames on everyone, and his historic visit, bring the promise of hope and change to the island more concretely than any pronouncement ever made by the Castros.

I know. Three days ago I returned from a 30-day bike journey across the island, from Baracoa in the east to Havana. American flags were everywhere — in cars, taxis, horse- and pedal-drawn taxis, even clothing — even before the visit was announced. Warned by guidebooks and savants to minimize exchanges with uniformed personnel, and never to photograph any, I found the admonition accurate. These were all serious, unfriendly, incorruptible, suspicious, and averse to any sign of curiosity. But once the visit was announced, I decided to test the premises. Passing soldiers, policemen and God-knows-what functionaries, I’d yell, “We’re not enemies anymore!” and I’d add some typical Cuban sassy wordplay non-sequitur as a true native would. I managed to get a few smiles and even some playful responses. Things are changing.

The hustle, bustle, entrepreneurship, and raw energy that permeated every person was a far cry from the typical listless socialist citizen.

Back in March 2012, in a Liberty article entitled “The Metamorphosis,” I outlined the changes to the Cuban economy legislated by the Castro government (see also “Cuba: Change We Can Count On?”, Liberty, December 2010). The changes attempted to drop one-fifth of the workers from government jobs and make them self-employed — this in a country where everyone is employed by the government at pay scales of $1–2 per day. But the fine print indicated internal ideological conflicts. While dozens of job categories were authorized — from transportation to food, to lodging, to construction, to personal grooming (and many more) — permits, taxes, limits on employees, and much red tape don’t make the goal easy to achieve.

Nonetheless, the hustle, bustle, entrepreneurship, and raw energy that permeated every person pursuing his hopes and dreams along miles of city streets and rural roads was a far cry from the typical listless socialist citizen. Ironically, even the poorest — those whom socialism is touted to help the most — were selling homemade sweets, cucuruchos, in the Sierra Maestra Mountains without permits! One told me he’d be fined $3,000 if caught. To a poor guajiro unable to pay such a fine, jail would await.

Seemingly everyone is trying to become independent of the government and develop self-employed income. One university economics grad student whose psychologist wife still works for the state now runs a B&B in Las Tunas, where I overnighted between Bayamo and Camagüey. Next year he plans on studying Milton Friedman and Friedrich Hayek. I asked whether that was possible; he said definitely, in advanced academia. His study plan had already been approved.

Three blocks from the capitolio in Havana, along the Prado, I spotted a sandwich board advertising real estate. A university economics professor tended the spot. He had no office other than his board, his clipboard, and the built-in bench on the promenade. Though we’d seen many “For Sale” signs on many buildings, including the humblest of abodes, we saw no real estate offices. I excitedly elbowed my wife Tina, a realtor in Arizona, to engage her interest.

Bad move.

A middleman agent of finance — the epitome of freewheeling capitalism — just didn’t fit into her perception of a socialist economy. Either the man was deluded or he was a scammer (an unlikely scenario: the police are ruthless with physical and financial crimes). I insisted that we engage the man. Immediately she blurted out, “How can you own property in Cuba when there are no property rights and the state can confiscate your property at any time for any reason?”

Seemingly everyone is trying to become independent of the government and develop self-employed income.

The poor man, without a vestige of the ingeniousness of an American used-car salesman, took on a pained and thoughtful look. He didn’t know where to start, but he understood that Tina had zeroed in on the heart of the matter. Translating his response was an exercise in empathy. He told us of a building across the street from the capitolio whose residents had just been told by the government to move out: the government needed the space. He didn’t know whether compensation, alternative housing, or even a grace period had been granted. He was the first to admit that Cuba has no property rights and no judicial system to enforce them. Nonetheless, what was he to do? New laws, albeit extremely constricted, allowed for the buying and selling of cars and property. No mortgages are available; only cash transactions. Interest is still illegal. But someone was paying him four times the salary he’d made at the university. He had nothing but hope and an optimistic outlook: “This time the people will not let the changes be reversed.”

I reminded him of the roadside billboards that read, “The changes in progress are for MORE SOCIALISM” — a sure sign, he counterintuitively agreed, along with Obama’s visit, that the changes now have a better chance of sticking than any previous promises. Or as one informant put it, “Castro educated us; now we know what he’s up to.”






Cat and Mouse, Red Herring, and a Whiff of Gingerbread


I’m not a blood-and-guts kind of viewer, but I love a good horror flick, the kind that keeps the viewer constantly off balance with neat little plot twists and hair-raising anticipation of terror. Skillful pacing is essential to the horror genre; we need to be confused, soothed, startled, thrown off course, cajoled, fooled, and soothed some more until we are terrorized by the tantalizing anticipation of the monstrously unthinkable event — even if that event never occurs. Maybe especially if it never occurs.

Too many horror films rely on blood and guts to elicit screams, but a gifted director can deliver the shivers within a PG-13 rating. In 10 Cloverfield Lane, first-time director Dan Trachtenberg does all of this brilliantly.

As the film opens, Michelle (Mary Elizabeth Winstead) is packing hastily, tossing belongings into a bag and grabbing necessities with a deft hand. The scene is filmed as a series of close, panicky shots that create suspense even where there is none; we learn that she has simply decided to leave her fiancé Ben (Bradley Cooper). The last thing we see in the apartment is a close-up of her keys and her engagement ring, and then she drives away into the night. Misdirection. In a horror film, it gets you every time.

We need to be confused, soothed, startled, thrown off course, cajoled, fooled, and soothed some more until we are terrorized by the tantalizing anticipation of the monstrously unthinkable event.

It happens again at a dark, secluded filling station. Is someone lurking in the shadows? Is someone following her? I won’t tell. But the tension heightens merely from the anticipation that someone is lurking in the shadows. Somehow (I won’t tell you that either) Michelle wakes up in a strange room with an IV needle in her arm, a bloody scrape on her forehead, a brace supporting her injured knee — and a chain attaching her leg to the wall. It’s Misery all over again, we think, only Michelle is the “writer,” and Howard (John Goodman) is the good Samaritan arriving with a plate of scrambled eggs, a fresh bandage, and a petulant, “You need to show me some appreciation!” à la Kathy Bates. Sometimes borrowed creepiness is even creepier.

Howard tells Michelle that Armageddon has occurred, but they are safely secured in his underground survival bunker. He explains that he rescued her from an accident just before the blast happened. But then, why is she chained to the wall? And why does he keep locking the door? And why won’t he let her go to the bathroom without him in the room?

Michelle isn’t the only young visitor in this strange menagerie. Emmett (John Gallagher, Jr.) — yes, Emmett! Could any name be spookier in a horror movie? — sports a broken arm and a scraggly beard that suggests he may have been down here for a while — or it could just be a fashion statement. We don’t know. But Emmett seems to believe Howard’s story.

The set is closed and claustrophobic, just three people locked in a bunker playing a mutual game of cat-and-mouse as they wait out the fallout up above, while also waiting out each other’s mistrust down below. Adding to the creepiness is the cheeriness of Howard’s bunker, with its 1950s furniture in the living area, pine cabinets in the kitchen area, fake sunflowers on the table, jukebox in the corner, and board games on the shelf. The vivid colors create a bizarre fairytale effect, almost like the gingerbread house that trapped Hansel and Gretel by baiting them with food.

If you’re feeling claustrophobic from watching too many weeks of that creepy, freaky bully suffering from a perennially bad hair day, roaring epithets at his uninvited critics, then turn off the television, leave the campaign news behind, and go see John Goodman as a creepy, freaky bully suffering from a perennially bad hair day roaring epithets at his unhappy guests. 10 Cloverfield Lane is a winner.


Editor's Note: Review of "10 Cloverfield Lane," directed by Dan Trachtenberg. Bad Robot, 2016, 103 minutes.





Fantasy Politics


I’ve become convinced that here in the US, voters read too many comic books. They want super-powers to do super-duper things. Because the government wields such awesome might, they feel small and vulnerable. Only through their favorite political candidate do they believe they can live out their grand fantasies. If “their” guy or gal wins, together they can rule the world!

Politics are an even more intoxicating stupidity potion than team sports. Superman and Batman are much more fun. People don’t think that if their team wins the championship, their lives will be happier for any longer than a couple of weeks. But they’re sure that if their candidate wins the election, he will vanquish every evildoer on earth, transform the country into paradise, and guarantee a fabulously prosperous future. He says he will, and — against all reason, and despite every past disappointment — they believe him.

Hillary Clinton wants us to think she’s Wonder Woman. For a long time, many people did. The mental picture of her in short-shorts, a star-spangled brassiere, and a tiara is so traumatizing that imagining it makes me want to drink bleach. She has, however, survived not only invisible Bosnian bullets but more scandals than a stray dog has fleas. We’ll have to buy the next issue of the comic to see if she can dodge indictment for having compromised national security as Super Secretary of State.

Vastly more entertaining is The Donald. That’s a superhero nickname, if I ever heard one. Singlehandedly, he’s going to Make America Great Again. He declares that once elected, he will build a second Berlin Wall along our southern border, transport millions of people out of the country with a sweep of his scepter, and make Vladimir Putin cry like a little girl.

The fact that no president has the power to work such wonders doesn’t daunt his devotees. Never before has a president been The Donald! Or Tremendous Ted. Or the Magnificent Marco. Any one of whom can do all things — because he says so.

What worries me is that people who think like gullible children also vote like them. They do their deepest reading by flashlight in a blanket fort. Because their fondest wish is to be taken care of by Mommy and Daddy, forever and ever, amen, an awful lot of them can be bribed with free goodies. We’re just liable to end up electing not Superman but the Tooth Fairy, in the unlikely form of Tinkerbell Sanders. That’s a prospect that should make all libertarians reach for the arsenic.






Blast Radius







The Olympics and Liberty


I’m often asked what makes a film “libertarian.” Does it need to be set in a dystopian totalitarian future? Must the protagonist be fighting a government bureaucracy or authority? Many libertarian films do contain those features. But my favorites are those in which a protagonist achieves a goal or overcomes obstacles without turning to the government to fix things.

Two such films are in theaters now, and both are based on true stories about Olympic athletes who achieved their goals in spite of government interference, not because of government aid. Race tells the Jesse Owens story, and Eddie the Eagle tells the Michael Edwards story. Both are worth seeing.

Race is the perfect title for this film that focuses on both racing and racism. Owens was one of the most famous athletes of the 20th century. Historian Richard Crepeau (who spoke at FreedomFest last year) called Owens’ performance at the 1935 college track meet at Ann Arbor, where in the space of 45 minutes he set three world records and tied a fourth, “the most impressive athletic achievement since 1850.” Nevertheless, Owens (Stephan James) is not welcome at the 1936 Olympics in Berlin. Adolf Hitler (Adrian Zwicker) intends to use “his” Olympics as a propaganda piece to highlight the physical superiority of the Aryan race, and he does not want any blacks or Jews to spoil his plan. He hires filmmaker Leni Riefenstahl (Carice van Houten) to document the glorious event.

The film reveals the backstage negotiations between Olympic Committee representative Avery Brundage (Jeremy Irons) and the German organizing committee at which Brundage insisted on assurances that Jews and blacks would be allowed to compete. Brundage’s insistence is somewhat hypocritical, considering the treatment Owens and other black athletes were enduring at home, but he was successful in forestalling a threatened American boycott of the Games.

Owens faces similar pressure from the NAACP, as he is warned that he ought to boycott the Games to protest racism in Germany. Owens feels the weight of his race as he considers the conflict, but in the end he delivers the most resounding protest of all, winning four gold medals and derailing Hitler’s plan in short order. This is as it should be. What good would it have done if Owens had stayed home to protest German policy? Would it have made any difference? Would anyone even have noticed? I felt the same way when President Carter made the opposite decision in 1980 and forced American athletes to boycott the Games in Moscow to protest the Soviet invasion of Afghanistan. What good did it do to destroy the dreams of hundreds of American athletes who had trained their whole lives to compete in a tournament that comes along only once every four years? Did it help anyone in Afghanistan? Did it hurt all those Russian athletes who took home more medals because the Americans weren’t there? I think not.

In the movie, Owens also faces the pressure of athletic celebrity, and Stephan James skillfully portrays the ambition and the temptations of a small-town boy chasing big-time dreams. He is anchored in his pursuits by his college coach Larry Snyder (Jason Sudeikis) and his girlfriend Ruth (Shanice Banton), who became his wife and remained his partner until the day he died in 1980. As with most sports films, the outcome of the contest is known from the beginning. The real story is how the hero gets there, and how he conducts himself along the way. Owens was a hero worthy of the title.

Eddie the Eagle tells the story of an Olympic hero of a different sort — one who is remembered for his tenacity rather than his innate skill. Michael Edwards (played by Taron Egerton as an adult and by brothers Tommy Costello, Jr. at 10 years old and Jack Costello at 15) simply dreams of being an Olympian; he doesn’t care what sport. His mother (Jo Hartley) nurtures that dream, giving him a special biscuit tin to hold his medals and praising his accomplishments, even if it’s just holding his breath for 58 seconds. Ironically, Eddie is motivated by a picture of Jesse Owens in a book about the Olympics. By contrast, Eddie’s father (Keith Allen) is a pragmatist, encouraging Eddie to give up his silly dreams and become a plasterer like him. His father isn’t a bad man; he just wants to protect his son from disappointment and financial waste. Fortunately for Eddie, he has the kind of optimistic personality that simply doesn’t hear criticism.

Eddie settles on skiing as his sport and manages to qualify for the British Olympic team, but the Committee cuts him because he “isn’t Olympic material.” Read: you don’t dress well or look right and you’re rather clumsy. Undaunted, Eddie turns to ski jumping because — well, because no one else in Britain competes in ski jumping. If he can compete in an international event and land on his feet, he can qualify for the Calgary Olympics. This is the same year that the Jamaican bobsled team slipped through the same loophole — a loophole that was quickly closed before the following season. Now athletes must compete internationally and place in the top 30% of finishers in order to qualify. But in 1988, if you could find a sport that few people in your country competed in, you could literally “make the team.”

With his father’s van and his mother’s savings, Eddie takes off for the training slopes in Germany. There he tackles the jumps, crashes on his landings, and tackles the jumps again. When he lands the 15-meter jump successfully, he moves on to the 40 and the 70, crashing more than he lands. Low camera angles during the jumps create the sensation of height and speed, providing a rush of adrenaline for the audience. Frequent shots of Eddie tumbling after a crash emphasize just how risky this beautiful sport is. We admire Eddie’s persistence, even as we cringe at his crashes. He believes in himself, no matter what.

Eventually he meets up with Bronson Peary (Hugh Jackman), a chain-smoking, hard-drinking slope groomer who looks incredibly lean and buff for an alcoholic. Peary turns out to be a former ski jumper who lost his chance for Olympic glory by not taking his training seriously. This, of course, sets us up for the perfect sports metaphor movie: unskilled amateur with indomitable heart meets innately talented athlete who threw it all away, and both find redemption while training for the Games.

It’s a great story about overcoming obstacles, sticking with a goal, and ignoring the naysayers. It demonstrates the power of a mother’s encouragement, and the possibility that even a poor, farsighted boy from a working-class neighborhood can achieve his dreams — if he doesn’t kill himself practicing for it.

All this allows us to forgive the fact that the movie mostly isn’t true. Yes, Michael Edwards did compete in the Calgary Olympics. He did set a British record for ski jumping, despite coming in dead last in both events, simply because, as the only British jumper, his was the only British record. His exuberance and joy just for participating in the Olympics led to his being the only individual athlete referred to specifically in the closing speech that year (“some of us even soared like an eagle”). But Bronson Peary never existed, and Michael Edwards actually trained with US coaches at Lake Placid, albeit with limited funds that caused him to use ill-fitting equipment. But that wouldn’t have given us such a feel-good story.


Editor's Note: Reviews of "Race," directed by Stephen Hopkins. Forecast Pictures, 2016, 134 minutes; and "Eddie the Eagle," directed by Dexter Fletcher. Pinewood Studios, 2016, 106 minutes.





Only Nostrums Need Apply


The "Great Depression" began with the stock market crash of 1929. In all previous depressions, there was little, if any, federal government intervention to extricate America from economic travail. It was held that the federal government possessed neither the knowledge nor the constitutional authority to meddle with the free-market capitalist economy that had propelled America from a fledgling agricultural enclave to a global industrial power in less than 150 years.

Everything was about to change. The 1920s experienced at once the reckless expansion of credit by the Federal Reserve and economic thought by the liberal elite. The former produced an enormous margin-driven stock market bubble that burst in October 1929; the latter produced a remedy that burst any chance of recovery from economic distress. Unlike all previous economic downturns, the calamity in 1929 invoked intense federal government intervention; it also invoked the longest depression in American history. The days of limited government — so expressively and resolutely defined by the Constitution — would be gone for good.

Then-president Herbert Hoover transformed the ensuing mild recession (from which the economy was already recovering by June 1930) into a depression, which, in 1932, was delivered to newly elected president Franklin Delano Roosevelt. With his Brain Trust of lawyers, journalists, and college professors and the freshly minted ideas of Keynesian scholars, he concocted an unprecedented grab-bag of nostrums known as the "New Deal" and parlayed Hoover's depression into a prolonged Great Depression. The American economy would not return to its pre-crash prosperity until 1946.

In fairness, the bungling of both presidents was enhanced by the Federal Reserve System. The primary function of the Fed was to ensure that US banks could withstand "runs" by depositors attempting, en masse, to withdraw their assets during financial downturns. The Fed was established in 1913 to act as the lender of last resort (LLR) for banks. It replaced the private clearinghouse system that had successfully provided LLR services prior to 1913. But between 1930 and 1933, when stressed banks were desperate for liquidity, the Fed followed a tight money policy. This inexplicable neglect of its primary function contributed to the collapse of more than 10,000 banks. Then, in 1936 and 1937, its insistence on raising bank-reserve requirements (once again shrinking the money supply) contributed to the severe recession of 1937–38 — the recession within the Depression.

Unlike President Harding, who did not intervene in the depression of 1920, Hoover believed that not intervening in 1929 "would have been utter ruin." Accordingly, he increased federal spending 42% by 1932, boasting that his administration had embarked on "the most gigantic program of economic defense and counterattack ever evolved in the history of the Republic." Hoover was then excoriated by FDR for extravagant spending and excessive taxing, for entertaining the idea "that we ought to center control of everything in Washington as rapidly as possible,” and for “leading the country down the path of socialism.”

FDR's public objection to Hoover's intervention was, however, merely a ploy to win the election of 1932. Privately, he believed that Hoover's most gigantic program was not gigantic enough. Roosevelt’s New Deal would put Hoover's reckless extravagance to shame. And while Hoover's intervention was to be temporary and limited, FDR's would become permanent and unlimited. FDR radically transformed government's role in the economy to "center control of everything in Washington as rapidly as possible" and "lead the country down the path of socialism."

By all accounts, the intentions of the New Deal were noble and praiseworthy. To an objective observer, little else can be said that is favorable. Although Democrats hail the welfare and regulatory state that FDR created, the establishment of a welfare and regulatory state was not a New Deal objective. Its objective was economic recovery — which was never achieved under New Deal programs. Unable to restore the American economy, the charismatic FDR gave only ironic hope to a nation in despair: the hope that it could endure the seemingly endless hardship that his policies inflicted.

If not for World War II, FDR's intervention "would have been utter ruin" for the nation. As economist Larry Summers, former director of President Obama's National Economic Council, admonished: “Never forget, never forget, and I think it’s very important for Democrats especially to remember this, that if Hitler had not come along, Franklin Roosevelt would have left office in 1941 with an unemployment rate in excess of 15% and an economic recovery strategy that had basically failed.”

The New Deal was the paragon of nostrums: a political fantasy whose probability of success was inversely proportional to the conceit of its exaggerated claims. Blaming both capitalism and his predecessor for the nation's economic plight, FDR felt compelled to rely on untested Keynesian concepts for stimulating the economy. What emerged was a haphazard torrent of elixirs, boondoggles, and utopian schemes ("a saturnalia of expropriation and waste," in H.L. Mencken's words) whose only centering force was a frenetic shove to expand federal power. Brain Trust professor Raymond Moley, a close FDR advisor who eventually became critical of the New Deal, found FDR a rank amateur in such matters. Said Moley in 1939, “To look upon these programs as the result of a unified plan, was to believe that the accumulation of stuffed snakes, baseball pictures, school flags, old tennis shoes, carpenter’s tools, geometry books, and chemistry sets in a boy’s bedroom could have been put there by an interior decorator.”

Brain Trust professor Alvin Hansen (aka the "American Keynes"), who favored "highly centralized collectivism" as the optimal method to "command and direct the productive resources," also became frustrated. According to Hansen, "Every attempt at a solution involves it in a maze of contradictions. Every artificial stimulant saps its inner strength. Every new measure conjures out of the ground a hundred new problems."

FDR set the precedent for government by nostrum, and demonstrated that the only thing worse than a liberal nostrum is a well-funded liberal nostrum. Said FDR's Treasury Secretary Henry Morgenthau: "We are spending more than we have ever spent before and it does not work. I say after eight years of this administration we have just as much unemployment as when we started . . . And an enormous debt to boot!"

The spending continued until after WWII. Keynesian economists such as Hansen were beside themselves with fear that postwar budget cuts would drastically harm the New Deal goal of pre-crash prosperity. If the spending did not continue, warned Paul Samuelson, America would experience “the greatest period of unemployment and industrial dislocation which any economy has ever faced.”

To their intellectual dismay, once tax rates were cut and price controls removed, the private sector (i.e., capitalism) took over, and the American economy soared. According to economist Stephen Moore, "personal consumption grew by 6.2% in 1945 and 12.4% in 1946 even as government spending crashed. At the same time, private investment spending grew by 28.6% and 139.6%." Unemployment dropped to 4% in 1946 and "stayed that low for the better part of a decade . . . during the biggest reduction in government spending in American history."

The Great Depression ended only when the spending stopped. It was not the New Deal that ended the depression. Nor was it WWII. It was the curtailment of the New Deal that ended the depression, 17 years after it started.

What has America learned from this tragic ordeal? Libertarians and conservatives have learned that there is no better argument for limited government than the New Deal. Prior to 1929, the federal government did not intervene in times of economic distress. Recovery was left to the forces of capitalism; individuals and businesses were left to fend for themselves, receiving relief primarily from private charities, occasionally from state and local governments. During that long history, the federal government nostrum was peddled only by snake-oil salesmen, and recovery from economic downturns averaged four years. It often occurred in two years or less. Capitalism did not produce depressions, and less intrusive means of intervention, including no federal intervention, produced far superior results.

After the Panic of 1893, President Grover Cleveland did virtually nothing, except to arrange the repeal of interventionist laws; the ensuing depression, which, according to many, was every bit as devastating as the Great Depression, ended in about four years. (On the contrast between the two depressions, see an essay by Stephen Cox in Edward Younkins, Capitalism and Commerce in Imaginative Literature.) After the Crash of 1920, in which the stock market fell further than it would in 1929, President Harding did less than nothing interventionist. He cut taxes for all income groups, cut the federal budget by almost 50%, and reduced the national debt by 33%. The ensuing depression ended in less than two years and was followed by eight years of unprecedented prosperity, the "Roaring Twenties." Harding succeeded where FDR failed. "Wobbly" Warren Harding!

From this evidence, libertarians and conservatives conclude that nostrums should be avoided at all costs. Chances are that, without nostrums, the Great Depression would have ended in four years, instead of 17. With its return to prosperity, America would have had more than enough money to finance all the roads, schools, parks, and bridges that were built under FDR's make-work programs. But it is critically important to understand that the nostrums did not merely take 17 years to work. They never worked. The idea that the New Deal succeeded is a myth. The Great Depression did not start until after politicians intervened and did not end until their intervention finally stopped, after subjecting the nation to more than 17 years of want and despair.

But liberals, who live in a world in which ideology trumps evidence, missed the tragically abysmal failure that was the New Deal. To them the lesson of the Great Depression was the power of the nostrum: once established, nostrums never go away; they stay and breed more nostrums. In the hands of liberals, a nostrum is a ratchet. While libertarians and conservatives are appalled by the failure of FDR's economic assistance programs, liberals are enraptured by the welfare state that they established: the vibrant, lucrative poverty industry and the languid, needy underclass that it services, both intractable, both agitating for more and bolder nostrums.

This is why the New Deal consumes liberal thought, and why a nostrum is modern liberalism's only thought. The New Deal spawned the "Great Society," Lyndon Johnson's New Deal. And today America endures Barack Obama's "saturnalia of expropriation and waste." Today's liberal can conceive of neither a problem that does not require government intervention nor a solution that does not require a nostrum. Liberals do not care that their nostrums do not work (if one did, it wouldn't be a nostrum). A nostrum's main value lies in its ratcheting effect. As noted historian and FDR worshiper Arthur Schlesinger, Jr. put it, "There seems no inherent obstacle to the gradual advance of socialism in the United States through a series of New Deals.”

When Republicans took power in 1953, even President Eisenhower, the architect of our victory in World War II, was afraid to scale back New Deal legislation. In 1969, President Nixon was afraid to cut back already failing Great Society legislation. He was warned by the friendly, and sincere, advice of Democrat Senator Daniel Patrick Moynihan: “All the Great Society activist constituencies are lying out there in wait, poised to get you if you try to come after them, the professional welfarists, the urban planners, the day-carers, the social workers, the public housers. . . . Just take [the] Model Cities [program], the urban ghettos will go up in flames if you cut it out.”

FDR gave us Social Security, the largest and most popular program of his legacy — the “most successful government program in the history of the world,” as Democrat Senator Harry Reid exclaimed. Johnson gave us Medicare, an even larger program. In retirement, all but the wealthiest among us depend on the benefits paid out by these two programs. But both are colossal Ponzi schemes, on track to go broke by 2034. This is not to say that a "social safety net" is unimportant or unnecessary. But, despite their laudable intentions, such entitlement programs, as they have been formulated by pandering politicians, are nostrums that have created an unfunded liability of $90 trillion and threaten to bankrupt the nation.

Let’s review the history of intervention in another way. When the market crashed in 1920, unemployment increased from 4% to 12%. By August 1921, the economy began its recovery. When the market crashed in 1929, unemployment increased from 4% to 9%, where it lingered for one month, before gradually decreasing to 6.3% in June 1930. The economy was recovering on its own from a mild recession. But that June, Republican President Hoover and a Democrat Congress enacted the Smoot-Hawley tariffs, the first in a long series of heavy-handed federal interventions. Unemployment soared to 16% in 1931. Massive federal spending, debt financing, tax increases, the denial of liquidity (by the Federal Reserve) to failing banks, and numerous other forms of federal tinkering crushed US GDP growth for the rest of the decade. Throughout the 1930s, the median annual unemployment rate was 17.2%. Unemployment did not fall below 14% until the early 1940s, when 12 million Americans were hired by the military.

In June 1930, had the federal government pursued the limited-government policies of the Harding-Coolidge administrations, the depression would have been over by 1932. But the nostrums that were pursued instead prolonged the depression, and, in the process, writes Robert Higgs (in “The Mythology of Roosevelt and the New Deal”), revolutionized "the institutions of American political and economic life," changed "the country’s dominant ideology," and created a leviathan that is "still hampering the successful operation of the market economy and diminishing individual liberties." The New Deal agencies, whose supreme ineptitude caused America to suffer more than ten years longer than it would have under limited-government policies, remain today. Notes Higgs:

One need look no further than an organization chart of the federal government. There one finds such agencies as the Export-Import Bank, the Farm Credit Administration, the Rural Development Administration (formerly the Farmers Home Administration), the Federal Deposit Insurance Corporation, the Federal Housing Administration, the National Labor Relations Board, the Rural Utility Service (formerly the Rural Electrification Administration), the Securities and Exchange Commission, the Social Security Administration, and the Tennessee Valley Authority — all of them the offspring of the New Deal. Each in its own fashion interferes with the effective operation of the free market. By subsidizing, financing, insuring, regulating, and thereby diverting resources from the uses most valued by consumers, each renders the economy less productive than it could be — and all in the service of one special interest or another.

Yet the myth — the pernicious myth — of the New Deal lives on. Today, as FDR blamed his predecessor and capitalism for America's economic plight, Mr. Obama, who won election waving the New Deal banner against the "Great Recession" of 2008, blames capitalism and George W. Bush. Obama even blamed Bush for adding $4.9 trillion to the national debt (money borrowed during eight years of Bush's tenure to finance establishment-Republican nostrums), calling it "irresponsible" and "unpatriotic" — just before he [Obama] went on to borrow $10.6 trillion for his nostrums, running up the national debt to an unprecedented $19 trillion.

With almost one year left for Mr. Obama to enlarge this staggering arrearage, both Democrat candidates for the 2016 presidential election propose the only thing that liberalism allows them to offer: more nostrums.

Hillary Clinton and Bernie Sanders, ever ready to put taxpayer money where their mouths are, clamor for a new New Deal — no doubt to build on the success of Obama's New Deal. A new New Deal is needed, they say, because "for too long,” as one activist put it, “the federal government has tolerated and perpetuated practices of racial and gender discrimination, allowed rampant pollution to contaminate our water and air, sent millions to prison instead of colleges and permitted Wall Street and CEOs to rig all the rules." Correcting the deficiencies of existing big government nostrums calls for a new New Deal with bigger Big Government.

Sanders has a new New Deal for $19.6 trillion (paid for with a 47% tax increase). Clinton even has one for "communities of color" — perhaps to lock in the votes of Americans, who, with Job-like patience, have been waiting more than 50 years for the ruthlessly inept Great Society programs to eliminate poverty and racial injustice, reconcile immigration, and improve public education.

After almost eight years of suffering Mr. Obama's nostrums (the Wall Street bailout, the auto industry bailout, the Stimulus, Quantitative Easing, the Green Economy, Dodd-Frank, Obamacare, Middle Class economics, the profligate regulatory morass . . . ), all of America waits, its economy in chronic stagnation — for bigger, better nostrums. We might as well wait for the Treasury Secretary finally to admit, "We are spending more than we have ever spent before and it does not work. I say after eight years of this administration we have just as much unemployment as when we started . . . And an enormous [$19 trillion] debt to boot!"

Willfully oblivious to the evidence, we resign ourselves to a stifling federal patrimony, where no problem escapes a New Deal style nostrum and "every attempt at a solution involves it in a maze of contradictions. Every artificial stimulant saps its inner strength. Every new measure conjures out of the ground a hundred new problems."

* * *

Further Reading

Articles

1920: The Great Depression That Wasn't by C.J. Maloney
The Depression You've Never Heard Of: 1920–1921 by Robert P. Murphy
Great Myths of the Great Depression by Lawrence W. Reed
The Great Depression by Hans F. Sennholz
The Mythology of Roosevelt and the New Deal by Robert Higgs

Books

FDR's Folly: How Roosevelt and His New Deal Prolonged the Great Depression, by Jim Powell
The Forgotten Man: A New History of the Great Depression, by Amity Shlaes
New Deal or Raw Deal?: How FDR's Economic Legacy Has Damaged America, by Burton Folsom
A Monetary History of the United States, 1867–1960, by Milton Friedman and Anna Schwartz

Pandora’s Book


What would you do if you were told that something you believe is not true? It would depend on who was telling you, I guess. It would also depend on how important the belief was to you, and on the strength of the evidence offered, wouldn’t it?

Suppose the belief in question had shaped your career and your view of how the world works. What if you were offered strong evidence that this fundamental belief was just plain wrong? What if you were offered proof?

Would you look at it?

In his 2014 book, A Troublesome Inheritance: Genes, Race and Human History, Nicholas Wade takes the position that “human evolution has been recent, copious, and regional.” Put that way, it sounds rather harmless, doesn’t it? In fact, the book has caused quite a ruckus.

The following is not a review of Wade’s book. It is, instead, more a look at how the book was received and why. There are six parts: a story about Galileo, a summary of what I was taught about evolution in college, a sharper-edged rendering of the book’s hypothesis, an overview of some of the reviews, an irreverent comment on the controversy over Wade’s choice of a word, and, finally, an upbeat suggestion to those engaged in the ongoing nurture vs. nature debate.

1. It is the winter of 1609. In a courtyard of the University of Padua, Galileo Galilei closes one eye and peers at the moon through his recently improved telescope. As he observes the play of light and shadow on its surface, there comes a moment when he realizes that he is looking at the rising and setting of the sun across the mountains and valleys of another world. He is stunned.

Galileo hurries to tell his friend and colleague, Cesare Cremonini, then drags him to the courtyard, urging him to view this wonder. Cesare puts his eye to the scope for just a moment, then pulls his head back, pauses, frowns, and says, “I do not wish to approve of claims about which I do not have any knowledge, and about things which I have not seen . . . and then to observe through those glasses gives me a headache. Enough! I do not want to hear anything more about this.”

What a thing to say.

A little context might help. Cesare taught the philosophy of Aristotle at Padua. Aristotle held that the moon was not a world but a perfect sphere: no mountains, no valleys. Furthermore, the Inquisition was underway, and a tenured professor of philosophy who started rhapsodizing about “another world” would have been well advised to restrict his comments to the Celestial Kingdom. The Pope, you see, agreed with Aristotle. To him, and, therefore, to the Roman Catholic Church, the only “world” was the earth, the immobile center of the universe around which everything else moved. Any other view was taboo. Poor Cesare! Not only did he not want to look through the telescope; he did not want there to be mountains on the moon at all.

The question in the present drama is this: who is playing the role of Cremonini?

It would get worse. Soon Galileo would point his scope at Jupiter and discover its moons, heavenly bodies that clearly weren’t orbiting the earth. Then he would observe and record the astonishing fact that Venus went through phases as it orbited not the earth but the sun. So: Ptolemy was wrong, Copernicus was right, and Cesare Cremonini would go down in history as the epitome of willful ignorance. Galileo, of course, fell into the clutches of the Inquisition and became a hero of the Renaissance.

To be fair to Cesare, the story has been retrospectively streamlined into a sort of scientific morality tale. While the part about Galileo’s discovery is probably more or less right, Cremonini’s remark wasn’t made directly to Galileo. It was reported to him later in a letter from a mutual friend, Paolo Gualdo. The text of that letter is included in Galileo’s work, Opere II. And while those jagged borders of light and dark on the moon, imperfectly magnified, were certainly thought-provoking, to say that the case against Ptolemy was closed on the spot, that night in Padua, would be too neat.

It makes a good story, though, and a nice lens for viewing reactions to scientific breakthroughs. Changing our focus now from the moons of Jupiter to the molecular Rubik’s cube we call the human genome, the question in the present drama is this: who is playing the role of Cremonini?

2. In an undergraduate course, taken decades ago, I was taught that human evolution had more or less stopped when the glaciers retreated about 10,000 years ago. Evolution had been driven primarily by natural selection in response to a changing environment; and, as such changes had, for the time being at least, halted, so too had the evolution of man.

I was taught that races exist only as social constructs, not as meaningful biological categories, and that these constructs are only skin deep. They told me that the social behavior of an individual is not genetic, that behavioral and cognitive propensities just aren’t in our genes.

I was taught that the differences among social organizations are unrelated to the genetic differences of the populations that comprise those various organizations, and that social environments have no influence on human evolution.

3. To show how Wade’s book stirred things up, I will present his central hypothesis with an emphasis on the controversial parts. I’ll avoid scientific jargon, in an effort to make the meaning clearer to my fellow nonscientists.

Wade believes that humanity has been evolving rapidly during the past 30,000 years and continues to evolve rapidly today. It is not just our physical characteristics that continue to evolve. The genes that influence our behavior also evolve. (Yes, that’s what the book says, that our behavior is influenced by our genes.)

He also believes that humanity has evolved differently in different locations, most markedly on the different continents, where the major races evolved. (Yes, the book calls them races.)

These separately evolved genetic differences include those that influence behavior. (Yes, the book says that race is deeper than the skin.)

Furthermore, these genetic differences in behavioral propensities have contributed to the diversity of civilizations. The characteristics of any given civilization, in turn, influence the direction of evolution of the humans who compose it.

Oh, my.

We now know that the earth goes around the sun. But is humanity rapidly evolving? Is there such a thing as race in biological terms? Does the particular set of alleles in an individual’s genome influence how that person behaves? Does the particular frequency of alleles in the collective genetic material of the people who compose a civilization influence the characteristics of that civilization? Do the characteristics of a civilization influence the direction of the evolution of the humans that compose it? Nicholas Wade believes that the answer to all these questions is “yes.” While he does not claim that all of this has been proven, he is saying, in effect, that what I learned in college is not true. Am I now to be cast as Cremonini?

4. There are those who disagree with Wade.

In fact, lots of people didn’t like A Troublesome Inheritance at all. I’ve read about 20 reviews, few of them favorable. Even Charles Murray, writing in the Wall Street Journal, seemed skeptical of some of Wade’s arguments. Most of the others were simply unfavorable, among them reviews in the Washington Post, the New York Review of Books, Scientific American, the New York Times, The New Republic, and even Reason. Slate and The Huffington Post piled on. While Brian Bethune’s review in Maclean’s was gentler than most, it was gently dismissive.

The reactions run from disdain to anger to mockery. Nathaniel Comfort’s satirical review “Hail Britannia!,” in his blog Genotopia, is the funniest. Donning the persona of a beef-fed, red-faced, pukka sahib at the height of the Raj, he praises Wade’s book as a self-evident explanation of the superiority of the West in general and the British in particular. (I once saw a retired British officer of the Indian Army being told by an Indian government official that he had to move his trailer to a remote area of a tiger preserve to ensure the security of a visiting head of state. He expressed his reluctance with the words, “I’m trying to be reasonable, damn it, but I’m not a reasonable man!”)

There’s some pretty heated language in these reviews, too. That the reviewers are upset is understandable. After all, they have been told that what they believe is not true. And the fellow doing the telling isn’t even a scientist. Sure, Nicholas Wade was a science writer and editor for the New York Times for three decades, but that doesn’t make him a scientist. Several of the reviews charge that Wade relies on so many historical anecdotes, broad-brush impressions, and hastily formed conclusions that it’s a stretch to say the book is based on science at all.

Of course they’re angry. Some of these guys are professors who teach, do research, and write books on the very subject areas that Wade rampages through. If he’s right, then they’re wrong, and their life’s work has been, if not wasted, at the very least misguided.

The consensus is that Wade has made a complete hash of the scientific evidence that he cites to make his case: cherry-picking, mischaracterizing, over-generalizing, quoting out of context, that kind of thing.

Another common complaint is that, wittingly or not, Wade is providing aid and comfort to racists. In fact, the animosity conveyed in some of the reviews may spring primarily from this accusation. In his review in the New York Times, David Dobbs called the book “dangerous.” Whoa. As I said, they don’t like A Troublesome Inheritance at all.

So, is Nicholas Wade just plain wrong, or are his learned critics just so many Cremoninis?

5. While the intricacies of most of the disagreements between Wade and his critics are over my head, one of the criticisms is fairly clear. It is that Wade uses the term “race” inappropriately.

The nub of the race question is that biologists want the word “race,” as it applies to humans, to be the equivalent of the word “subspecies” as it applies to animals. Because the genetic differences among individual humans, and among the various human populations, are so small, and the boundaries between the populations so indistinct, biologists conclude that there are no races. We are all Homo sapiens sapiens. We are one.

Just south of Flathead Lake in Montana is an 8,000-acre buffalo preserve. One summer day in the mid-’70s, I walked into its visitors center with my wife and father-in-law and asked the woman behind the counter, “Where are the buffalo?” She did not hesitate before hissing, “They’re bison.” Ah, yes: the bison-headed nickel, Bison Bill Cody, and, “Oh, give me a home where the bison roam . . .” You know the critter.

Put it this way: to a National Park Ranger, a buffalo is a bison; to a biological anthropologist, race is a social construct. That doesn’t mean there’s no such thing as a buffalo.

I don’t mean to make light of it. I’ve read the explanations. I’ve studied the printouts that graph and color-code populations according to genetic variation. I’ve studied the maps and charts that show the differences in allele frequencies among the groups. I’ve squinted at the blurry edges of the clusters. I get all that, but this much is clear: the great clusters of genetic variation that correspond to the thousands of years of relative isolation on the various continents that followed the trek out of Africa are real, and because they are genetic, they are biological. In any case, we are not in a biology class; we are in the world, where most people don’t talk about human “subspecies” very often, if ever. They talk about human “races.” To criticize Wade’s use of the term “race” seems pedantic. Whether to call the clusters “races” or “populations” or “groups” is a semantic dispute.

Put it another way: If you put on your “there is no such thing as race” costume for Halloween, you’ll be out trick-or-treating in your birthday suit, unless you stay on campus.

Besides, use any word you want, it won’t affect the reality that the symbol represents. The various “populations” either have slightly different genetic mixes that nudge behavior differently, or they don’t. I mean, are we seeking the truth here or just trying to win an argument?

6. While Wade offers no conclusive proof that genes create behavioral predispositions, he does examine some gene-behavior associations that point in that direction and seem particularly open to further testing. Among them are the MAOA gene and its influence on aggression and violence, and the OXTR gene and its influence on empathy and sensitivity. (The symbols link to recent research results.)

What these have in common is that the biochemical chain from the variation of the gene to the behavior is at least partly understood. The chemical agents of the genes in question are monoamine oxidase A and oxytocin, respectively. Because of this, testing would not be restricted to a simple correlation of alleles with overt behaviors in millions of people, though that is a sound way to proceed as well. The thing about the intermediate chemical triggers is that they could probably be measured, manipulated, and controlled for.

The difficult task of controlling for epigenetic, developmental, and environmental variables would also be required, but in the end it should be possible to determine whether the alleles in question actually influence behavior.

If they do, the next step would be to determine the frequency of the relevant allele patterns in various populations. If the frequency varies significantly, then the discussion about how these genetic differences in behavioral propensities may have contributed to the diverse characteristics of civilizations could be conducted on firmer ground.

If the alleles are proven not to influence behavior, then Wade’s hypothesis would remain unproven, and lots of textbooks wouldn’t have to be tossed out.

Of course, it’s not so simple. The dance between the genome and the environment has been going on since life began. At this point, it might be said that everything in the environment potentially influences the evolution of man, making it very difficult to identify which parts of human behavior, if any, are influenced by our genes. Like Cremonini, I have no wish to approve of claims about which I do not have knowledge.

But the hypothesis that Wade lays out will surely be tested and retested. The technology that makes the testing possible is relatively new, but improving all the time. We can crunch huge numbers now, and measure genetic differences one molecule at a time. It is inevitable that possible links between genes and behavior will be examined more and more closely as the technology improves. Associations among groups of alleles, for example, and predispositions of trust, cooperation, conformity, and obedience will be examined, as will the even more controversial possible associations with intelligence. That is to say, the telescope will become more powerful. And then, one evening, we will be able to peer through the newly improved instrument, and we shall see.

That is, of course, if we choose to look.


© Copyright 2016 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.