What's in a Birth Certificate?


So, Barack Obama released his “long form” birth certificate and bought himself some temporary tactical advantage against Donald Trump and millions of angry conspiracy mongers out there. The document confirms, as much as any such item can, that Barack Hussein Obama II was born at 7:24 p.m. on August 4, 1961, at Kapiolani Maternity and Gynecological Hospital in Honolulu.

The lingering question of whether the man meets the constitutional requirements to be President of the United States — specifically, that he is a natural-born citizen — is answered. He does. And, specifically, he is.

But, if you look a little more deeply into the hard typeface of the Certificate of Live Birth, you can see some evidence of the background that drives a person like Obama. Indulge me in a little armchair Freud.

According to the document, the president’s mother — Stanley Ann Dunham — was 18 years old when she delivered him. That’s very young by any reasonable standard, young enough to make the baby’s arrival seem reckless and ill-advised. A heavy burden to bear, especially when you were the baby in question.

And there’s the bureaucratic judgment in the Certificate’s explication that the mother’s name did not match the name of the child’s father. The 18-year-old girl signed the Certificate “Ann Dunham Obama,” a small act of revolt against the officious document’s implication of illegitimacy.

Good for her.

Many ask, now that Obama has released this Certificate, why he didn’t do it sooner. I have an idea. He’s been protecting his mother from the harsh judgment of petty tyrants.

How many bureaucratic sneers did she suffer, bearing and raising a mixed-race child in the early 1960s? How many dirty looks, when she carried him on buses or airplanes? Brought him to campus at the several universities she attended? Applied for passports? Applied for food stamps?

And how soon did little Barack II realize that authority figures judged his mother harshly? That official forms made unfriendly assumptions about her marital status? How quickly did other kids say unkind things about his . . . unconventional . . . mom? Kids in Hawaii. Kids in Kansas. Kids in Indonesia.

The facts around Stanley Ann Dunham’s life are hazy and are likely to remain so. This is the haze created by family members protecting a loved one they know needs the help. A foolish daughter. An eccentric mother.

There are different versions of when or even whether Stanley Ann and the elder Obama married. Apparently, they never lived together as husband and wife; and the President’s own wife has said that his mother was “very single when she had him.”

There’s a hard edge in that last bit, even from his wife. Young Michelle Robinson — from an intact, upright, churchgoing family headed by a father who was a civil servant — had plenty of occasions to judge foolish teenage mothers on food stamps. You can practically hear it in the “very” that she uses to modify “single.” Michelle wasn’t going to bounce around a bunch of motley state schools with a baby on her hip; she was going to Princeton.

For most of his 50 years, Barack Obama has been protecting his mother from judgments and slights. Since she passed away in the ‘90s, he’s been protecting her ghost. He’s the archetypal high-achiever from a dysfunctional family. That’ll never change.

And that archetypal sort is precisely who seeks the presidency in this dysfunctional age.

Trump and Obama’s other antagonists have already moved on to press the President to make public his transcripts from college and law school, as previous presidents have. Their assumption, which Obama himself has tacitly acknowledged in his memoirs, is that he was a mediocre student who advanced through elite academia on affirmative action preferences.

Here’s a prediction: Obama will never release those transcripts. His birth certificate — his entrance into this world — is a testament to what the Babbitts deemed his foolish mother’s recklessness and immaturity. But his Ivy League degrees are the armor he built to protect her and himself from those judgments. And slights. And dirty looks.

He’s not going to lower that.

And, in this narrow Freudian context, good for him. It isn’t a good idea, tactically — and it’s poor form, personally — to mess with a man’s coping mechanisms.






The Healthy Society


British Prime Minister David Cameron, speaking on February 5, deplored “state multiculturalism,” a failed doctrine of “encourag[ing] different cultures to live separate lives, apart from each other and the mainstream.” Immigrants should feel, rather, that they were living in an inclusive society, sharing a national identity, culture, curriculum, and language. Lacking such a sense of belonging and experiencing, instead, segregation and separatism, some young Muslims in Britain had turned to extreme Islamism. Cameron cited the “horror” of forced marriage in some immigrant communities. He proposed “a two-month programme to show sixteen-year-olds from different backgrounds how to live and work together.”

Earlier, on October 16, Chancellor Angela Merkel had expressed similar worries for Germany, home to some four million Muslims. The idea of people from different cultural backgrounds living happily "side by side" without a common culture did not work, she indicated: "This [multicultural] approach has failed, utterly failed." French President Sarkozy has said similar things.

Such remarks might be code words for anti-Islamism, but I am not cynical enough to think so. The worry of Cameron and others seems plausible, but it requires nuancing. It is perhaps soundest for a situation with two aspects. First, only one substantial immigrant culture, rather than several, confronts the local culture. Second, the authorities work to preserve the distinction — as, for instance, through year-by-year, not just transitional, public-school instruction in the immigrant language.

But this scenario does not necessarily recommend the opposite: different cultures melted into a homogeneous dominant one. A good society, true enough, does require consensus on ethical norms such as treating other people honestly and honorably, respecting their rights and property — not cheating, stealing, or committing aggression. Such consensus can accommodate differences in details of etiquette and lifestyles (arguably extending to same-sex and even polygamous marriages). Furthermore, a good society requires acceptance of a common legal system, without special privileges or burdens for particular groups. Consensus on the political system also rules out seeking change by violence, but it admits advocating even radical change by constitutional means.

While rejecting militaristic and imperialistic nationalism, Ludwig von Mises welcomed liberal nationalism, including movements for liberation and unity of populations speaking a common language (Nation, State, and Economy, 1919/1983). Liberal nationalism can be a bulwark of peace. Different nations should be able to respect and — to interpret a bit — even share in each one’s pride in its own culture and history. By extension, such mutual respect and celebration can extend to members of different national heritages within a single country. A healthy multiculturalism welcomes a diversity of interests and heritages without official favor or disfavor for any.

A diversity of national heritages can enrich a country’s overall culture. Quasi-native speakers of heritage languages, especially with their own publications and broadcasts, can promote language-learning and can be useful in diplomacy and in war. Diversity even of national restaurants and foods — Chinese, Mexican, Greek, German, French, and so on — multiplies options for work and for leisure. Cultural diversity can bolster a general awareness of history.

Diverse national heritages can scarcely offer benefits as great as those of the occupational division of labor and of domestic and international trade. They can, however, multiply the variety of niches in life in which a person or a family can feel comfortable and important. They can help avoid the dismal opposite, a society in which individuals must feel superior or inferior in competition on a single scale of overriding significance (money being the most obvious metric). A diverse society includes all sorts of (decent) persons, including, yes, entrepreneurs and investors obsessed with creating wealth and making money. Few people, however, can realistically expect outstanding success on the monetary scale. Pursuing an unattainable material equality would foster attitudes and politics incompatible with a quasi-equality of a more humane and more nearly attainable type.

A healthy society — to continue my amateur psychologizing — comprises many “noncomparing groups” (so called by analogy with the noncompeting groups recognized by the 19th-century economist John Elliott Cairnes). People should not be ranked according to the fields in which their accomplishments lie. Each person should have a chance to excel in something, whether craftsmanship, business, scholarship, athletics, a hobby such as collecting classic cars or rare coins, a religious group, travel and adventure, conviviality, or self-effacing service to mankind.

And, yes, cultivating a national heritage. Many kinds of excellence should be as respectable as the amassing of fortunes. A teacher could continue associating without embarrassment with former colleagues or students who had become business tycoons, not because progressive taxation had lopped off their huge incomes but because scholarly values and monetary values were regarded as incommensurate yet of equal dignity. While the approach to equality sought by left-liberal egalitarians implies measurement, true liberals need to follow Herbert W. Schneider (Three Dimensions of Public Morality, 1956, p. 97; cf. pp. 100, 118) in emphasizing “the incommensurability of human beings.”






The Invisible Tribe


People who say the Constitution is “living” or “invisible” usually don’t like what it says and don’t have the patience or the votes to amend it.

In February, the New York Times ran a piece by Laurence Tribe entitled "On Health Care, Justice Will Prevail," in which he argues that not only is the individual mandate in the new healthcare law constitutional, but it is both necessary and good. He also delivers what sounds like a series of short pregame pep-talks to the more conservative Supreme Court justices, seemingly trying to finagle them into joining the majority that he confidently predicts will uphold the constitutionality of the mandate.

Laurence Tribe is a professor of constitutional law at Harvard Law School and the Carl M. Loeb University Professor at Harvard. Widely considered to be the foremost current authority on the laws and Constitution of the United States, he has argued 34 cases before the Supreme Court, including Bush v. Gore, arguing for Mr. Gore. Among Tribe's former students and research assistants are Supreme Court Justice Elena Kagan, Chief Justice John Roberts, and President Barack Obama, whom he called “the best student I ever had” (The Concord Monitor, November 14, 2007).

Here I will examine four specific points in Tribe's essay in the Times: (1) an error in word choice, (2) a sentence that misleads through lack of clarity, (3) a conclusion built on a false premise, and (4) an answer that begs a question. The constitutionality of the mandate and its fate in the Court will be addressed only when they touch on these smaller points. The pep-talks will not be examined. The elusiveness of objectivity will be the subject of the conclusion.

The first and second points are in paragraph four:

"Many new provisions in the law, like the ban on discrimination based on pre-existing conditions, are also undeniably permissible. But they would be undermined if healthy or risk-prone individuals could opt out of insurance, which could lead to unacceptably high premiums for those remaining in the pool. For the system to work, all individuals — healthy and sick, risk-prone and risk-averse — must participate to the extent of their economic ability."

The first point is in the second sentence, where Tribe asserts that, if “risk-prone individuals could opt out of insurance,” those remaining could be saddled with “unacceptably high premiums.” Using the same formula: If all drunks who ride motorcycles opted out of a health insurance pool, the premium paid by the people who remained would go sky-high.


This is nonsense. Only when risk-averse, not risk-prone, individuals drop out en masse do the premiums for those remaining rise. The wrong word was used. Using the corrected formula, the point that Tribe may have been trying to make can be illustrated with the same hypothetical: all teetotalers who do not ride motorcycles must be made to pay health insurance premiums high enough to cover not just their own modest health-related expenses, but also the astronomical medical bills of drunks on Harleys.

Now it makes sense; that is to say, the revised formula’s effect on the price of premiums makes sense, not the system that doesn’t permit sober people to opt out.
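The arithmetic behind this adverse-selection point can be made concrete with a small, purely hypothetical pool — the dollar figures below are illustrative assumptions, not drawn from Tribe’s essay or from any actuarial table:

```python
# Illustrative adverse-selection arithmetic (hypothetical numbers).
# A break-even insurer charges each member the pool's average expected cost.

def break_even_premium(expected_costs):
    """Average expected annual claims cost across the remaining pool."""
    return sum(expected_costs) / len(expected_costs)

# Suppose nine low-risk members each expect $100/yr in claims,
# and one high-risk member expects $10,000/yr.
pool = [100] * 9 + [10_000]

full = break_even_premium(pool)                      # everyone participates
without_risk_prone = break_even_premium([100] * 9)   # the high-risk member opts out
without_risk_averse = break_even_premium([10_000])   # the nine low-risk members opt out

print(f"Full pool premium:             ${full:,.0f}")                 # $1,090
print(f"Risk-prone opt out, premium:   ${without_risk_prone:,.0f}")   # falls to $100
print(f"Risk-averse opt out, premium:  ${without_risk_averse:,.0f}")  # soars to $10,000
```

The point is simply that a break-even premium tracks the average expected cost of whoever remains in the pool, so it rises only when the low-cost (risk-averse) members leave, not when the high-cost (risk-prone) ones do.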

The second point is the wording of the final phrase, “for the system to work, all individuals . . . must participate to the extent of their economic ability.” This is the sort of vague language bureaucrats use to camouflage authoritarian unpleasantness.

If clarity had been the goal, it might have said: “Each person must be compelled to buy health insurance and to pay a price directly proportional to the amount of money he has so that medical care can be provided to each person according to his medical needs.” Sorry, that’s less clear, and too wordy. Try this: “From each according to his ability, to each according to his needs” (Marx, Karl, Critique of the Gotha Program).

If the reader pauses now, and dispassionately compares Tribe’s meaning with Marx’s, he will, if he is honest with himself, conclude that the two are, in fact, the same. It is as though Laurence Tribe has stood on a soapbox in Harvard Yard and shouted, albeit in bureaucratese, “The individual mandate is Marxist,” a term employed here not in any pejorative sense, but in an effort accurately to convey the meaning of the carefully crafted phrase “must participate to the extent of their economic ability.”

To make matters even more jarringly redistributive, equivalent amounts of medical care will not be provided under this steeply progressive pricing scheme. Because good health and wealth are positively correlated, “for the system to work,” those with the most modest medical bills will pay the most for the insurance and those with the most expensive bills will pay little or nothing. Tribe was probably trying to state this truth as plainly as he could without triggering the howls of the anticommunists among us.

Given his endorsement of the compulsory and redistributive nature of the mandate, however, it is unlikely that Tribe would deny the accuracy of the label “Marxist” or, for that matter, be offended by it. To expect either outcome would insult his intellectual honesty and integrity.

The annoying aspect of this second point is that, by cloaking the mandate’s naked Marxist core in vague language, Tribe may have been trying to strengthen his argument. A strong argument does not need camouflage.

The third point is in the last part of paragraph six, where Tribe neatly summarizes a set of premises and conclusions that is widely held to be true, but isn’t:

"Individuals who don’t purchase insurance they can afford have made a choice to take a free ride on the health care system. They know that if they need emergency-room care that they can’t pay for, the public will pick up the tab. This conscious choice carries serious economic consequences for the national health care market, which makes it a proper subject for federal regulation."

Consider the case of Mr. A, who has studied the actuarial tables and discovered that the only insurance he would be allowed to buy is priced according to the risk factors of people who almost certainly will have medical expenses many times as costly as his. He has saved enough money so he could afford to buy that insurance, if he wanted to, but he realized that actuarially it would be cheaper for him to pay his own medical bills out of pocket. In fact, he has saved enough money so he could afford to pay even catastrophic medical bills, if it came to that. Mr. A has chosen not to buy the insurance offered because, for him, it is not a good deal. He has chosen to self-insure.

So, while Mr. A did not purchase insurance he could afford, he has not “made a choice to take a free ride on the health care system.” Mr. A can and does pay for emergency room visits in full upon receipt of the bill. Unlike people covered by Medicaid, who really are taking a “free ride,” he has never asked the public or anyone else to pick up the tab, and never will. Mr. A’s “conscious choice” to self-insure carries “serious consequences for the national health care market” only to the extent that the government, having spent itself into a yawning sinkhole of debt, and finding voters reluctant to pay higher taxes, has passed a law that would strong-arm Mr. A into picking up the tab not only for himself, as he has been all along, but for others as well.

That Tribe did not take into account those who choose to responsibly self-insure is odd. Surely he knows people who are successful, self-reliant, and self-insured. But whatever his reason, half-truths were used to reach a conclusion that is, as a result, at best 50% nonsense.

Fourth, and finally, in the seventh paragraph, Tribe tries to demonstrate that the constitutionality of the mandate does not depend on the commerce clause:

"Even if the interstate commerce clause did not suffice to uphold mandatory insurance, the even broader power of Congress to impose taxes would surely do so. After all, the individual mandate is enforced through taxation, even if supporters have been reluctant to point that out."

Let’s see. If the commerce clause, even in its broadest interpretation, fails to persuade a majority of the Court that the mandate is constitutional, the fact that the new healthcare law levies a fine on those who fail to comply with the mandate creates an opening. Because the law names the IRS as the collection agent for the fine, that fine takes on the coloration of taxation. Therefore (if therefore is not too strong a word in this line of reasoning), the Court can conclude that the individual mandate is constitutional because Congress is simply exercising its power to tax.


Note that Tribe does not argue that the individual mandate itself is a form of taxation, but that it is “enforced through taxation.” He cannot claim that the mandate is a tax because the money is passed directly from the hands of the private citizen into the maw of the private insurance corporation. The government only oversees this unfunded individual mandate. So Tribe must be content to say that the fine itself is a tax.

Is it really that simple? If all Congress has to do to make a law constitutional is to impose a fine for failure to comply with that law, and make the IRS the bill collector so that the fine can be called a tax, then Congress can constitutionally make anyone do anything it wants by tacking a fine onto not doing it. That can’t be right.

Let us say that a law is passed that compels all obese people to lose a certain portion of their weight annually until a desirable target weight is achieved. Even though the massive weight-loss industry crosses state lines, and eliminating the scourge of obesity would benefit the national economy, an unimaginative Supreme Court stubbornly maintains that the government does not have the power to force people to lose weight. But then it is brought to the Court’s attention that the law imposes a small fine, payable to the IRS, on obese citizens who fail to meet their federally mandated weight-loss targets. Does the Court have no choice but meekly to acquiesce and uphold the law as constitutional?

Is there a legal philosopher’s stone that transmutes fines into taxes and through this magic transforms otherwise unconstitutional laws into models of constitutional compliance? Or is this an example of a more subtle proposition: “When I use a word, it means just what I choose it to mean, neither more nor less” (Dumpty, Humpty, Through the Looking Glass)?

Even if either the mandate or the fine were a tax, the resulting legal axiom still wouldn’t pass the sniff test: “It is taxed, therefore, it is constitutional.” Deep in the penumbra of the Constitution that may not look like nonsense, but here on the sunny side, it certainly does. It sounds like the Red Queen channeling Descartes and Tribe simultaneously.

The fact that the Congress has the constitutional power to tax surely cannot mean that anything it chooses to tax is, as a result, constitutional. The argument begs the question. The fact that a fine is attached to the mandate does not make it constitutional.

The Times piece leaves the impression that Tribe has not given the problem of the individual mandate his full attention, which is understandable, given that his specialty is the law, not medicine, insurance, or economics. Besides, as a tenured professor at Harvard, he probably has essentially free healthcare for life as an untaxed benefit, making any concern that he has about the unfunded mandate entirely academic.

To say that “justice will prevail” if the Court upholds the mandate is easy for Tribe. He will not be asked to sacrifice anything at all, while others, many of more modest means, will be compelled to pay thousands of after-tax dollars per year to cover someone else’s higher risks.

To put this in another way, Laurence Tribe will not be picking up the tab when the Harley goes skittering across the freeway; others will.

And where is the justice in that?

***

In his 2008 book, The Invisible Constitution, Tribe explains the futility of relying on the text of the Constitution to resolve constitutional questions. He tells of what he calls the “dark matter” in the “shadow constitution” and the “ocean of ideas, propositions, recovered memories, and imagined experiences” that comprises the real mass of the “invisible” Constitution, which dwarfs the mere document. (One good review is Dahlia Lithwick, “The Dark Matter of Our Cherished Document: What You See in the Constitution Isn’t What You Get,” Slate, Nov. 17, 2008.)

If the written text of the Constitution, and its accompanying case law, which everyone can read and compare notes on, is but the tip of the iceberg, and the real mass is hidden below, in the ocean of collective consciousness, imagination, memory, or even the collective unconscious, then its truths can only be accurately interpreted by initiates specially trained to dive beneath the surface like cormorants to fathom and retrieve its complex meanings. Or, to switch metaphors, perhaps this Constitution is an ethereal entity whose cryptic messages can be divined only by oracles who breathe the heady air found in the realm above the clouds of partisanship and bring down to us the purity of its truths without relying on an old scrap of parchment.

On May 4, 2009, Laurence Tribe wrote a letter to his star pupil, assessing potential nominees to the Supreme Court. In it, he sized up Sonia Sotomayor, then a Court of Appeals judge, advising President Obama that, “Bluntly put, she’s not nearly as smart as she seems to think she is, and her reputation for being something of a bully could well make her liberal impulses backfire and simply add to the fire power of the Roberts/Alito/Scalia/Thomas wing of the Court . . .” Another of his former students, Ed Whelan, posted the letter on the website of the Ethics and Public Policy Center.

 


To interpret this letter, Sonia Sotomayor, who studied law at Yale, not Harvard, might want to take a page from The Invisible Constitution and acknowledge the futility of relying on the rows of tiny symbols strung haphazardly together that constitute the actual text. There is so much more dark matter between the lines and in the murky ocean upon which such a letter floats.

The now Associate Justice Sotomayor may be comforted if she peers into the dark waters and discerns the outline of a psychological defense mechanism, first proposed by Freud, and called projection. A person who uses this tool unconsciously denies his own negative attributes and projects them onto others. This reduces his anxiety by allowing the expression of unconscious fears and desires without letting the conscious mind recognize them as his own.

Some of the people who say that the Constitution is “living” or “invisible” become judges so they can creatively distort the parts they disagree with from the bench. They are judicial activists posing as unbiased judges. Others work from the sidelines to bend and twist those parts so that the Constitution may be forged into a weapon that adds fire power to liberal impulses in the ongoing ideological battle. These are not legal analysts; they are merely political actors striking unconvincing poses of objectivity.






Individualism in Real Life


Bethany Hamilton is one of those perfect libertarian heroes. When she wants something, she goes after it. When things go wrong, she looks for a way to make it right. She doesn't whine or complain, even when the thing that goes wrong is the horror of a shark taking off her arm. She relies on herself, her family, and her God.

The movie about Bethany, Soul Surfer, has its predictably maudlin moments, fueled by Marco Beltrami's heavily emotional musical score, but don't let that put you off. If you are looking for a film to demonstrate libertarian principles to your friends, take them to Soul Surfer.

The film is based on the true story of Bethany, a competitive surfer with corporate sponsorship who was on her way to professional status when a huge shark bit off her arm. She returned to competitive surfing within a matter of months, and is now a professional surfer. She also seems to be a really nice girl. I learned that not only from the film, but also from the articles I have read about her.

And the Hamiltons seem to be a model libertarian family. They ignored traditional middle-class expectations in order to follow the dreams they made for themselves. All of them, parents and children alike, live to surf. When Bethany showed a great aptitude for surfing, her parents opted out of the government school system and educated her at home so she could take advantage of daytime surfing. After her injury, they did for her only the things she absolutely could not do for herself, encouraging her quickly to find new ways of managing her "ADLs" (activities of daily living).

The film portrays the Hamiltons (Dennis Quaid and Helen Hunt, parents) as a close-knit family within a close-knit community of surfers who understand the true nature of competition. True competition isn't cutthroat or unfair. In fact, unfettered competition has the power to make every person and every product better. Even Bethany's surfing nemesis, Malina Birch (Sonya Balmores), is portrayed as competitively solid. After she paddles full out toward a wave during a competition instead of kindly slowing down to allow for Bethany's injury, Bethany (AnnaSophia Robb) thanks Malina for treating her as an equal and pushing her to be her best. It's a great example of the good that competition can accomplish.

Instead of turning to government support groups to help her deal with her injury, Bethany turns to three free-market sources: family, business, and religion. When she returns to surfing, she rejects the judges' offer to give her a head start paddling past the surf. Instead, her father designs a handle to help her "deck dive" under the waves. When finances are a problem, a news magazine offers to provide her with a prosthetic arm in exchange for a story, and a surfboard manufacturer sponsors her with equipment and clothing. The youth leader at her church (Carrie Underwood) gives her a fuller perspective on her life by taking her on a service mission to Thailand after the tsunami that hit in 2004. There she learns the joy of serving others — a kind of work that earns her psychic benefits rather than monetary rewards. She isn't "giving back"; she is "taking" happiness.

These examples of self-reliance and nongovernmental solutions to problems raise the level of this emotionally predictable film to one that is philosophically satisfying — and well worth seeing.


Editor's Note: Review of "Soul Surfer," directed by Sean McNamara. Sony Pictures Entertainment, 2011, 106 minutes.





True for Me


I’ve reflected before on the unintended comedy that flows from people in positions of authority or influence in our society denying the existence of objective reality. In the far future, enlightened people will look back on this era of American history and marvel at the fact that the children of a system built on reason could behave so irrationally.

A recent example: the slightly past-her-prime movie actress Ashley Judd (née Ciminella) recently made the rounds of television talk shows to promote a memoir. As expected of such a project, Judd included salacious tales of incestuous sexual abuse suffered when she was a young girl and edgy sex when she was a young woman. Also, occasional bouts of manic depression.

In the book, Judd recalls when she was in middle school and her mother — the country music singer Naomi Judd — started dating her second (and current) husband:

"Mom and pop were wildly sexually inappropriate in front of my sister and me ... a horrific reality for me was that when pop was around I would have to listen to a lot of loud sex in a house with thin walls. . . . I now know this situation is called covert sexual abuse."

It’s too bad that Judd has been reduced to this. She made some pretty good films in the 1990s and early 2000s — including my personal favorite, the surreal 1999 whodunit Eye of the Beholder.

But her deepest self-abasement doesn’t appear in her book. Asked on the Today show what her family thought of the book, Judd said:

"You know, the book is very honest [but] it’s not necessarily accurate, because everyone in my family has their own perspective and their own experiences. But it’s very true for me."

Ugh. Beware “true for me” memoirists.

Of course, some people go for this situational twaddle. One of the half-wit columnists at the website Salon.com wrote:

"Judd’s admission that her memoir is “true for me” allows for an acknowledgment of the real trauma she’s experienced while also making room in the narrative for other versions of events. Memory might not hold up in a court of law, but that doesn’t matter much to a scarred heart. One that’s suffering depression and a host of other hurts. And, by admitting that, Judd’s telling others that if it feels like abuse to you, it was abuse. And that’s good enough."

No, it’s not. As James Frey, Greg Mortenson, and a growing list of other fabulists and swindlers will attest, “true for me” memoirists are a sleazy lot. Often full of sanctimonious, politically correct hypocrisy. Usually tripped up by undeserved self-regard.

Sadly, these same faults apply to the younger Ms. Judd. Less than three years ago, she appeared in a series of videos produced by a statist political advocacy group called Defenders Action Fund; in those videos, she castigated Sarah Palin for supporting the sport killing of wolves from helicopters. To wit: “Now back in Alaska, Palin is again casting aside science and championing the slaughter of wildlife.”

So, a woman who doesn’t hold herself to a standard of factual accuracy in her salacious memoir damns another woman for “casting aside science” when dealing with wildlife management. This selective embrace of objective reality is part of the reason that American culture is on the decline.

Statists thrive when people doubt objective reality and use terms like “true for me.”

Ashley Judd’s loud and libidinous mother probably summed up the real ethics of such people when — asked by the Today show for a response to her daughter’s stories — she said: “I love my daughter. I hope her book does well.” Cha-ching.






Atlas at Last


The John Galt Line has finally pulled out of the station, and is barreling across the country picking up hitchhikers who may be wondering what all the fuss is about. After a spectacular pre-release advertising campaign that included multiple premieres in major cities and pre-purchased ticket sales that encouraged nearly 300 screen owners to give the film a chance, Atlas Shrugged Part 1 opened on April 15 (Tax Day) as the third-ranked film of the weekend (by per-screen average, not total sales).

"Mixed" is an understatement when describing the reviews. Professional critics on Rotten Tomatoes give it a 6% approval rating, perhaps the lowest rating I have ever seen for a film. Meanwhile, audiences gave it an unbelievable 85%.

In fact, the film doesn’t deserve either rating. It belongs somewhere in the low middle.

It is not as good as the 85% would indicate; audiences who saw it on opening weekend are rabid fans, bent on a mission to have everyone in America see the film and learn Ayn Rand's philosophy: free markets, free thinking, and self-reliance.

But it doesn't deserve the godawful 6% usually reserved for low-budget slasher flicks, either. It is not as bad as its low budget and relatively unknown cast of actors and producers would cause one to expect. It is respectable.

The cinematic quality is quite good, especially the outdoor scenes of Colorado and the special effects used to create the train and the bridge. The acting isn't bad, but it isn't great. Often I was painfully aware of Taylor Schilling being painfully aware of where Dagny should place her arm, or how Dagny should turn her head; I never felt that she embodied Dagny. Similarly, the background cast at the Reardens' anniversary party appeared to be made up of friends and family of the cast and crew (someone needed to teach them how NOT to mug for the camera).

For fans of Ayn Rand and Atlas Shrugged Part 1, the brightest compliment for this film is that it stays true to the first third of the book. (Parts 2 and 3 are expected to follow.) For fans of filmmaking, however, the biggest problem is that it stays true to the book. The film is dialogue-heavy, with very little action.

I’m not a Hollywood film reviewer; but I’m a watcher and a reader. I know that books and films are two different genres, and their stories have to be presented in two different ways. Books are primarily cerebral; films are primarily visual. Books can focus on philosophy and conversation; films must focus on action. Books can take days or weeks to read; films must tell their story in a couple of hours. When adapting a book to film, streamlining is essential. Unfortunately, the words in this film are so dense that the ideas become lost.

Atlas Shrugged Part 1 contains some great quotations, but it is not a film that will convince anyone but the Rand faithful of the supremacy of the free market. It makes the same mistake that most libertarians do when espousing philosophy: it assumes that everyone already sees the problems in the way libertarians do. It does not sufficiently engage the non-business person in seeing the long-term effects for everyone when government intervenes in the market. I can hear my middle-class neighbors and colleagues saying "So what?" when Rearden (Grant Bowler) is forced to sell all but one of his businesses. "How is that going to hurt me?" they might wonder.

Even the conflict between Dagny's pure free-market economics and her brother James's (Matthew Marsden) collusion with government is insufficiently portrayed; Dagny seems to be simply getting around the stockholders when she takes over the John Galt Line. Moreover, she and Rearden can hardly be seen as icons of virtue when they violate a freely made and morally binding contract (his marriage vows) by jumping into bed together. Even more damning is Ellis Wyatt's decision to burn his oil fields rather than let anyone else enjoy the fruits of his labor. My middle-class neighbors would howl with outrage at this decision. In short, I don't see how this film will convince anyone that returning to free-market principles will improve our economy and our way of life. It seems like everyone in the film is cutting moral corners somewhere.

"Not bad" is faint praise for a movie that has been 50 years in the waiting. Unfortunately, business pressures caused it to be rushed through with only five weeks in the writing, and another five weeks in the filming. Business is often an exercise in compromise, and this film's production is a classic example. I think, however, that if The Fountainhead's Howard Roark had been the architect of this film, it would have been burned along with Ellis Wyatt's oil fields. It's good, but not good enough.


Editor's Note: Review of "Atlas Shrugged Part 1" (2011), directed by Paul Johansson. The Strike Productions, 2011, 97 minutes.





A Surprise Hit


Last week I attended a screening of Atlas Shrugged, Part I, and was amazed to see that the theater was literally sold out and packed full. I cannot ever recall seeing that before. After all, this is not your usual fluffy romantic farce, comic book superhero movie, or action flick. It is an honest effort to put Ayn Rand’s extremely long novel into movie format.

Producers Harmon Kaslow and John Aglialoro are expanding the release from the initial 299 theaters to 425 by this weekend, and as many as 1,000 by the end of April. This is stunning, considering that the marketing plan was considered lame by Hollywood insiders, because it used the internet rather than more traditional venues, such as TV and radio, for running ads.

Not only are ticket sales doing well, but film-related merchandise — including replicas of the bracelet Dagny Taggart (Taylor Schilling) wears in the movie (made out of “Rearden metal”) — is flying off the shelves.

Aglialoro, a businessman who put $10 million of his personal capital into the flick, as well as co-writing and co-producing it, attributes its success in great measure to fortunate timing. I think that in this he is absolutely right. Obama’s leftist regime, with its bash-the-rich and blame-the-businesses rhetoric, massive new regulations, crony capitalism, and redistributionist mindset, has created a ready market for the movie.

Ironically, Obama may prove to be the cause of a whole new wave of Rand mania. That is well worth a chuckle or two, no?






A Whiff of Smoke


I have oft reflected, in this place, on the fiscal fires advancing toward us.

The United States is running massive deficits, even as it creates massive new entitlement programs (e.g., Obamacare). Our national debt is about to hit $14.3 trillion. Our three main existing entitlement programs (Social Security, Medicare, and Medicaid) are underfunded by more than $100 trillion. And consider such under-the-radar unfunded liabilities as the Pension Benefit Guaranty Corporation, which "guarantees" the pensions of millions of private industry workers and is underfunded in the tens of billions. That's now. Later, as more workers retire . . . you can guess what will happen. Meanwhile, the deficits of those Twin Towers of moral hazard, Freddie Mac and Fannie Mae, continue rising.

All these are at the federal level. The states are another matter.

The states are currently running a deficit of around $125 billion, which may not sound like much in this context. But at their level there is also more than $3 trillion in outstanding bond debt (counting municipal as well as state bonds), and what we can only estimate at about $3 trillion in unfunded liabilities.

The alarming rise in this ocean of red ink has caused Standard & Poor’s to cut its outlook on the US credit rating from “stable” to “negative,” meaning that in the credit agency’s view, we have about a one-third chance of having our AAA credit rating lowered during the next two years. Should that happen, the cost of our staggering amount of borrowing will explode.

The agency’s decision reflects more than its worries about our current deficit of $1.6 trillion (about 10.8% of current GDP) and our total debt of $14.3 trillion (over 91% of our GDP). It also registers the agency's skepticism about whether the wise solons in Washington will be able to staunch the flow of red ink anytime soon.

A number of predictable events have quickly followed. First, the Obama regime immediately pooh-poohed the embarrassing announcement. Spokesman Jay Carney said, “The political process will outperform S&P’s expectations. . . . The fact is where the issues are important, history shows that both sides can come together and get things done.” Obama’s unserious budget proposals belie these words.

Also predictable was the immediate drop in the stock markets. The Dow fell by 1.1% (140 points) that day, and London’s FTSE 100 fell by 2% (126). Equally predictable was the immediate weakening of the dollar, and gold's breaking the important psychological barrier of $1,500 an ounce. (It wasn't very long ago that it broke the psychological barriers of $500, then $1,000.)

The continuing housing bust masks our increasing inflation. While the official American inflation rate remains about 3%, prices have been rising rapidly in food and energy. Commodity prices are at or near their historic highs.

The public is now receiving a small whiff of smoke from the myriad fires that endanger us. People haven’t yet seen the flames, but when they do, the reaction will be severe. This is only the first year in which baby boomers are starting to retire. When all 78 million of them are on Social Security, Medicare, SSDI, Obamacare, and so on, you can expect a fiscal firestorm — probably hyperinflation or bond defaults.

The good news is that in those fiscal flames the hubristic and vicious (in the sense of vice producing) progressive-liberal state will likely be consumed. One hopes that will mitigate the pain.






Is Greed the Problem with Capitalism?


With the opening last fall of Money Never Sleeps, the sequel to Wall Street, Americans were again subjected to Hollywood’s version of how the economic system works: big business is evil, and greed is at the heart of our economic problems. The original Wall Street movie was released during the Reagan administration — a period that initiated significant economic expansion. Nonetheless, the movie offered a stern warning about what to expect from greed run rampant. The villain of the story was Gordon Gekko (Michael Douglas), the powerful head of a mergers and acquisitions firm. Toward the end of the movie Gekko makes a now-well-known speech about why “greed is good,” a speech that is meant to highlight the pro-capitalist arguments often made by businesses and free market advocates. Gekko tells a group of shareholders of a company he is trying to acquire that,

“Greed, for lack of a better word, is good. Greed is right, greed works. Greed clarifies, cuts through, and captures the essence of the evolutionary spirit. Greed, in all of its forms; greed for life, for money, for love, [for] knowledge has marked the upward surge of mankind.”

Of course, true to Hollywood form, in the end Gekko winds up implicated in a major fraud, thus revealing the true moral of the story, which is of course, that “greed is bad!”

Ironically, the current economic crisis has been an opportune time for Hollywood to make money by highlighting the immorality of greed (as in Money Never Sleeps). For many observers it was greed by the managers of financial institutions that led to easy loans with little to no down payments; greed by homeowners that led to purchases of houses they couldn’t afford; greed on Wall Street that led to the creation of such clever new financial instruments as mortgage-backed securities and credit default swaps; greed by CEOs that led to corporate extravagances and ridiculously high executive compensation packages; and greed by consumers that led to excessive use of credit cards to buy things now, rather than waiting till they earned the money to pay for it.

Tom D’Antoni in the Huffington Post declared that “the concept that ‘Greed is Good,’ is dead. It rose to its despicable zenith in tandem with the rise of Reagan, and has been the guiding principle of industry, finance and government ever since. . . . Greed brought us to this place . . . unregulated, untrammeled, vicious greed. Greed has no morals or ethics. Greed has no regard for others. Greed feeds only the greedy and feeds on every thing and everyone within grasping distance.” John Steele Gordon, author of a book on financial history, wrote, “There is no doubt at all about how we got into this mess. … Greed, as it periodically does when traders and bankers forget the lessons of the past, clouded judgments.”

 

Religion and greed

The world’s religions almost unanimously contend that greed is wrong. Although not explicitly proscribed in the Ten Commandments, greed is implicated in their command not to covet one’s neighbor’s property or spouse. The Bible contends that “the love of money is the root of all evil” (1 Timothy 6:10). In the year 590, Pope Gregory declared greed to be one of the seven deadly sins, along with lust, pride, gluttony, sloth, envy, and wrath. Among the seven, greed is often considered one of the worst, if not the worst, mostly because greed can inspire many of the other sins.

In almost every major religious tradition, greed is condemned unequivocally. The Qur’an states, “Anyone who is protected from his own stinginess, these are the successful ones” (64:16). The Tao Te Ching states, “When there is no desire, all things are at peace” (Chapter 37). In the Bhagavad Gita, Lord Krishna declares, “There are three gates leading to this hell — lust, anger and greed. Every sane man should give these up, for they lead to the degradation of the soul” (16:21). Sulak Sivaraksa, a leading Buddhist writer, states that “Buddhism condemns greed, which can easily lead to aggression and hatred.”


Reacting to the recent economic crisis, Dr. John Sentamu, Archbishop of York, attacked exploitative moneylenders who pursued "ruthless gain"; he urged banks not to "enrich themselves at their poor neighbours' expense." Pope Benedict, in his 2008 Christmas message, said, “If people look only to their own interests, our world will certainly fall apart.” The Dalai Lama asked, “What is the real cause of this sort of economic crisis?” His answer: “Too much speculation and ultimately greed.”

 

Greed as a necessity

Greed is an easy target. It is not hard to convince most people that greed is the primary source of many of our economic woes. But is it really?

Steven Pearlstein points out what many economists believe. He writes, “In a capitalist economy like ours, the basic premise is that everyone is motivated by a healthy dose of economic self-interest. . . . Without some measure of greed and the tension it brings to most economic transactions, capitalism wouldn't be as good as it is in allocating resources and spurring innovation.”

This is the central idea behind Adam Smith’s oft-quoted line about the butcher, the brewer, and the baker in The Wealth of Nations:

“It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities but of their advantages.” (Wealth of Nations, Book 1, Chapter 2)

Smith is arguing that the economic system provides for our wants and needs because, first and foremost, people are trying to help themselves, and they do so by producing and selling meat, beer, and bread to others. These market outcomes are not achieved because of charity. We do not appeal to other people’s humanity when we seek our sustenance, but rather to their self-interest, or in this case their greed. Nonetheless, the modern economist’s view of greed as a positive force in society has not been readily accepted, given centuries of moral teachings to the contrary.

 

Seeking a middle way

Is there a resolution to the greed paradox? Is greed evil? Is it a necessary evil? Is greed something that humankind should seek to eliminate, perhaps replacing it with altruism? Or is greed something so ingrained in the human psyche that there is no hope of eliminating it?

Perhaps we simply need to learn how to live with greed. Perhaps there is a middle way, a method of channeling greed in good rather than bad ways.

Aristotle argued that “virtue is concerned with passions and actions, in which excess is a form of failure, and so is defect, while the intermediate is praised and is a form of success” (Nicomachean Ethics, Book 2, Chapter 6). It is the middle way that is the goal. Indeed, dictionary definitions of greed highlight not only self-interest but an “intense, selfish desire” (New Oxford American) or “an excessive desire to acquire or possess more than one needs or deserves” (American Heritage). Greed is usually not implicated if someone’s desires are average or if one achieves a moderate standard of living.

Religious writings sometimes take account of this. Thus, one hadith, or saying of the prophet Muhammad, states, "Eat what you want and dress up as you desire, as long as extravagance and pride do not mislead you” (Hadith as reported by Abd’allah ibn Abbas, 1:645). In Judaism too, one Midrashic interpretation asserts, “Were it not for the yetzer hara [the evil urge], a man would not build a house, take a wife, beget children, or engage in commerce” (Bereishis Rabbah 9:7).


Returning to the issue of our current financial crisis, some observers recognize that greed cannot be eliminated. Michael Lewis and David Einhorn, writing in the New York Times, state, “ ‘Greed’ doesn’t cut it as a satisfying explanation for the current financial crisis. Greed was necessary but insufficient; in any case, we are as likely to eliminate greed from our national character, as we are lust and envy.” Robert Skidelsky writes that John Maynard Keynes “believed that material well being is a necessary condition of the good life, but that beyond a certain standard of comfort, its pursuit can produce corruption, both for the individual and for society.”

Steven Pearlstein suggests a different perspective, that greed may not be about the degree of desire, or how much one acquires, but about how one acquires wealth: “Even before they decided to give away most of their money, nobody seemed to begrudge Bill Gates or Warren Buffett their billions or criticize them for their ‘unbridled’ greed. That seems to have a lot to do with the fact that Gates and Buffett made their money on the basis of their own ingenuity, skill and hard work.”

 

Methods of satisfying greed

The American economist Henry George (1839–1897) is mostly famous for his theory of the Single Tax, but in his book Protection or Free Trade there is a passage that can help resolve some of the tension about greed and profit seeking: “Is it not true, as has been said, that the three great orders of society are ‘working-men, beggar-men, and thieves’?” (pp. 21–22).

Prima facie this passage may seem unremarkable, or at worst confusing: after all, what exactly is an “order of society”? But if we think about it carefully in light of the current discussion, it actually provides the seeds, or kernels, for understanding “greed.”

First, let’s recognize that what George has in mind are three primary ways in which people obtain benefits for themselves, or in other words “profit.” As I’ll argue, how a person ultimately judges profit-seeking activities and whether he views greed as good or bad will depend largely on which one of George’s “great orders” he believes to be most prominent in society. But first let’s discuss each of these profit acquisition methods, beginning with the last one, “thief.”

 

Thieves: involuntary transfers

One of the simplest methods a person can use to satisfy his greed for food, clothing, automobiles, cameras, computers, or the money to acquire these things is simply to take them away from someone else. Theft has been a part of life since the beginning of society, and it is likely to remain a part of society for a long time to come.

When the rightful owner of something has it stolen by another, the thief clearly benefits. He is now in possession of the valued item. The victim suffers a loss, since he does not possess and can no longer receive benefits from the product. The victim will surely feel that an injustice has occurred and will demand the return of the stolen property and the punishment of the perpetrator, if those responses are at all possible. But regardless of what happens afterward, theft involves a transfer of an item from a legitimate possessor to an illegitimate possessor, and the transfer always occurs involuntarily. Thus the term “involuntary transfer” offers a better moniker, especially because in many situations the transfers may technically not be considered theft, but will have similar characteristics.

Around the world societies evaluate outright theft in similar ways. It is generally considered bad, or wrong, or perhaps evil, with only a few tolerated exceptions. These exceptions are rare, and societies have put into place elaborate systems of laws that prohibit theft in a variety of situations while providing penalties for those found guilty of violating them. Suffice it to say that the acquisition of benefits by means of theft is unacceptable in all societies around the world. Although this is a seemingly obvious point, it forms the basis for most outcries about injustice around the world. In brief, people claim an injustice whenever they perceive that someone is getting “ripped off” in some way.

 

Beggar-men: voluntary transfers

The second of George’s orders of society — that is, another major way in which people acquire the goods and services their greed desires — consists of those who are given the items voluntarily by someone else. A beggar stands on the street corner and solicits money from passersby. The money they give him represents a transfer of purchasing power from the giver to the recipient. Although the giver loses, the money obtained by the beggar is not ill-gotten, in a traditional legal sense, because it has been given to him willingly; it is a voluntary transfer.

From the giver’s perspective this action is called charity, and the action is held in high esteem in most societies in the world. Charity is not self-serving; it is in the service of others. It is not considered harmful, but helpful. Charity is encouraged and promoted in all of the major religions. Some people, such as Mother Teresa, who have spent their lives giving to needy people, are respected or even beatified by their religious groups.

 

Working-men: voluntary exchange

The third order of society that George mentions is “working-men.” This is another method an individual can use to acquire the goods and services that his greed may inspire. Work generates an income that can be used to purchase consumption goods, but it is important to recognize the underlying process. Work in a commercial society is an activity devoted to producing a good or service that someone else will wish to purchase; in other words, a desirable product. Through the free, voluntary exchange of the product for money in the marketplace, a business generates the revenue that is used to pay its workers. That money, or income, is then used by the workers to purchase other goods and services produced by other workers. In the end, when you strip away the money part of the transactions, what is really taking place in market activity is the voluntary exchange of one good for another. And since both parties to a trade exchange their goods voluntarily, it must be that both benefit from the transaction; if not, why trade?

Voluntary exchange is the cornerstone of the world’s economic prosperity. The very first lesson in Smith’s Wealth of Nations is the principle of the division of labor: productivity can increase as the production process becomes more specialized; that is, as labor or workers are divided into more specialized tasks. But the only way to take advantage of these benefits afterwards is through exchange. If you cannot exchange, there is no incentive for specialization.


Based in part on this fundamental principle, economists have long supported the free market, which essentially means allowing free and voluntary exchanges, without social or governmental impediments. Indeed, societies everywhere generally accept and promote trade both within and beyond their borders. There is no community or society in the world that fails to benefit from the voluntary exchanges and market activities that occur in abundance in everyday life. To summarize: if greed inspires work that in turn inspires voluntary exchanges in the marketplace, then the outcome is mostly good for everyone involved.

 

Distinguishing “good” greed from “bad” greed

Greed can generate either good or bad outcomes, depending on which great order of society, or in other words which method, is involved in its satisfaction. If greed inspires a person to work long hours in a business providing valuable goods and services to others in order to satisfy the needs of himself and his family, then greed should be perfectly acceptable on pragmatic grounds. If greed inspires a person to innovate and create new products that others will desire in the market, then greed is good. In each of these cases greed is satisfied through voluntary exchange.

However, if greed inspires a person to acquire what he desires by taking the rightful possessions of another person without that person’s consent, then greed is not good. In this case greed does not encourage useful behavior in the marketplace, but rather fear that one’s marketable goods will be appropriated by others. For similar reasons, greed is also wrong when it inspires someone to put roadblocks in the way of others who are trying to sell their products in the marketplace. In both these cases greed is satisfied by means of involuntary transfers and is rightly condemned.

Yet if greed urges one to beg for food and clothing, or to seek the charitable contributions of others, and if those items are given voluntarily, then greed is satisfied in an acceptable manner; that is, a manner that has no deleterious effects on other people’s ability to benefit themselves by means of free exchange. The compassion of charitable people, helping those less fortunate, engaging in voluntary transfers, is clearly unobjectionable.

We should never portray greed in general as good or bad, right or wrong, but as something that can be satisfied in either acceptable or unacceptable ways. The distinguishing feature isn’t the presence of greed itself or even the intensity of the greed, but the way in which greed is satisfied. Following the suggestion of George’s great orders, the greed satisfied by a working man is commendable, the greed of a beggar-man is unfortunate but acceptable when necessary, and the greed satisfied by thievery is the primary source of injustice in the world.

 

Greed and the economic crisis

Many criticisms about greed’s role in the current economic crisis are really complaints about involuntary transfers. Hollywood and liberal Democrats look at the crisis and see injustice in the high salaries of CEOs, the comparatively low wages paid to average workers, the excessive loans made to people who could not afford the homes they were buying, and the political clout of business insiders who got rules written on their behalf. But the reason people see injustice is mostly because they believe that someone is getting ripped off. It may be the consumer or the taxpayer or the low-paid worker at the company, but in any case, the perception is that one group is receiving less because someone else is receiving more.

Frequently these complaints are correct. Big business does sometimes engage in fraud. Consider the recent scandal involving Bernie Madoff. Madoff offered investors better than average returns largely by fabricating them in financial statements and by using the principal deposited by new investors to pay the returns of investors lucky enough to get out early. His setup was a classic Ponzi scheme that inevitably collapsed when too many people demanded their money back at the same time. Clearly Madoff was greedy — as were the investors who were looking for better returns than they had any reason to expect from an honest investment scheme. However, this case is a clear example of greed fueling involuntary transfers rather than valuable production and trade. The investors were led to believe that their money was wisely invested in companies making above average profit when in reality new investor money was transferred to exiting investors as needed. As long as deposits exceeded withdrawals the Ponzi scheme could continue.

Many other prominent examples of insider trading, accounting scams, and other shady dealings have been uncovered over the years and have resulted in prosecution and jail sentences. Yes, businesses may be exploitative. Cries of injustice by the general public rang out when huge bonuses were announced for executives at the financial firms that were bailed out by the government. After the billions of TARP dollars were transferred to these failing institutions, many of the banks were quickly out of trouble and the systemic crisis was averted. However, announcements that these same companies would pay millions of dollars in overdue bonuses to executives touched off a wave of indignation.

Many companies profit both by selling desirable goods and by taking advantage of involuntary transfers.

The source of the anger is obvious. In the midst of the crisis these institutions laid off a large portion of their work forces. Meanwhile, their overextended positions on loans, with effects multiplied by their own enormous size, contributed to the crisis. Since bonuses are typically meant to reward good performance, it seemed inappropriate for executives who were implicated in the crisis and were “saved” by a taxpayer-financed bailout to be able to walk away a few months later with hefty bonuses. Reward appeared to be disconnected from achievement. Most observers would contend that these companies were restored to profitability, not by the skill and hard work of executives producing a superior product for their customers, but by involuntary transfers from taxpayers. So again, there is a sense that involuntary transfers helped to satisfy the greed of a few individuals.

For high salaries to be viewed as equitable, or just, they must be deserved. As mentioned earlier, relatively few people seem to begrudge the high salaries and enormous wealth of Bill Gates, or of popular figures in sports. Their earnings are generally recognized as the result of the voluntary exchange process. These people earn money by providing valuable goods and services to others around the world.

Basketball stars seldom lobby to advance their interests, but big business often does, and this is a large source of complaints about greed. Robert Reich goes so far as to describe lobbying as political corruption:

“If we define political corruption as actions causing the public to lose confidence that politicians make decisions in the public's interest rather than in the special interest of those who give them financial support, the biggest corruption of our political process is entirely legal. It comes in the form of campaign contributions that would not be made were it not for implicit quid pro quos by politicians, bestowing favors of one sort or another on the contributors.”

But what sort of favors does he mean? He continues:

“The fights that actually preoccupy Congress day by day, which consume weeks or months of congressional staffers' time and which are often the most hotly contested by squadrons of Washington lobbyists and public-relations professionals, are typically contests between competing companies or competing sectors of an industry or, occasionally, competing industries. . . . Many of these battles (e.g. over health care reform) continue but have moved into the regulatory process, where different companies, sectors, and industries are seeking rules that advantage them and disadvantage their competitors.”

Reich is arguing that the business of government has become the provision of rules and regulations that favor some over others. In other words, he is describing a completely legal, but involuntary, transfer process promulgated by government. The winners are those who have the most clout among legislators. Often they are the ones (big business and big labor) who can offer the most in campaign contributions. The losers are either the less influential competitors, or the taxpayers who must provide funding for the subsidies provided, or the consumers who pay higher prices produced by taxes or regulations.

The same process of involuntary transfer appears in connection with financial sector reform. Again, greed is said to be the source of corruption, but it is just a smokescreen. People demand that something be done; they demand that government prevent financial crises, such as occurred in 2008. Unfortunately, no one quite knows how to do that. Nevertheless, lack of knowledge won’t prevent changes from being made. That’s because there are plenty of influential organizations standing in the wings with suggestions. While all of these suggestions will be presented as important to the national interest, the changes will be particularly helpful to the organizations themselves.

Even more likely is that good ideas for regulatory reform will be paired with ideas that serve particular corporate interests. This is one of the reasons that so many pieces of legislation are thousands of pages long these days: to buy political support, commonsensical reforms must be combined with favors for powerful interests. It is no wonder that, after decades of rule writing like this, our regulatory system is a twisted web that requires companies to hire huge teams of experts and consultants simply to untangle it.

 

So is greed the problem with capitalism?

Liberal Democrats and conservative Republicans ought to find these examples of involuntary transfers equally objectionable. Confusion arises because of the focus on greed as the culprit. Critics of business and free markets see the greed that is satisfied through fraud and other involuntary transfers, and therefore condemn all efficient profit-seeking activities. But what about businesses that are making money and paying high salaries to their executives because of the desirable goods and services they are selling to their customers — doesn’t greed inspire their activities? And if we could stamp out greed from our psyches, wouldn’t we also be eliminating a motive that makes the modern economy work? Of course the answer is yes to those questions, which is why supporters of free markets are quick to condemn the “greed” arguments made by the Left.

One apparent problem is that many companies profit both by selling desirable goods and by taking advantage of involuntary transfers. The two activities are often confounded within the same business. For instance, executives at Enron perpetrated an accounting scam that prevented shareholders from knowing that the company was sinking deeply into the red, but at the same time the company provided valuable energy services to its customers. Although some portion of the riches made by Enron executives was fraudulent, some other portion was not. Similarly, some companies that use political influence to gain favorable regulatory treatment — treatment that effectively transfers money in their direction — simultaneously produce and sell legitimate products in the marketplace. Their high salaries and profits, no doubt sought and achieved by greed, are partly due to acceptable voluntary exchanges and partly due to objectionable involuntary transfers.

This confounding effect leads to many problems of interpretation. For example, high CEO salaries are often explained by using marginal productivity theory, according to which competition in the CEO market drives the prices for those positions to the levels observed. Under this interpretation, CEO salaries are the deserved share of production in a voluntary exchange market system and thus are acceptable. On the other hand, one could interpret high CEO salaries as the consequence of an exploitative process, in which CEOs are rewarded in the competitive market because they have effectively increased their companies’ shares of wealth by means of involuntary transfers from taxpayers or consumers. Since it is very difficult to measure which portion of a large company’s income is attributable to which kind of process, different interpretations are possible. However, here the disagreement is not about principle but about the interpretation of data.

 

Conclusion

Unfortunately, the right lessons about greed and capitalism are unlikely to be found either in recent Hollywood productions (which indiscriminately condemn all products of greed) or in recent economic theory. For theory, it may be best to revert to the old classics: read Smith’s Wealth of Nations and Theory of Moral Sentiments; read Frédéric Bastiat’s The Law; read Friedrich Hayek’s The Road to Serfdom; read Henry George’s Protection or Free Trade.

In movies, the classics are also best. I mean, for example, a movie from 1954 entitled Executive Suite (starring William Holden and Fredric March). The film explores two different approaches to business: one based on reverence for the bottom line, no matter what methods are used to achieve it; the other based on hard work, innovation, and the production of superior goods that the workers themselves can be proud of. By the end of the movie, the moral superiority of one over the other is obvious. The greed that inspires work, innovation, and pride (voluntary exchange) wins out over the greed that inspires fraud, blackmail, and accounting tricks (involuntary transfers).

We need to resurrect this understanding of business. We need to remember how aspiration, inspiration, and greed, appropriately directed, can create a workplace filled with well-treated, well-motivated workers striving to produce a superior product for their customers. Indeed, Hollywood can show us a way out of the current economic crisis; only it is not today’s Hollywood.

 

Works Cited

D’Antoni, Tom, “Finally the Death of Greed,” online at The Huffington Post, Dec. 11, 2008.

George, Henry (1949), Protection or Free Trade, Robert Schalkenbach Foundation, New York.

Gordon, John Steele, “Greed, Stupidity, Delusion — and Some More Greed,” online at the New York Times, Sept. 22, 2008.

Lewis, Michael, and David Einhorn, “The End of the Financial World as We Know It,” New York Times, Jan. 3, 2009.

Pearlstein, Steven, “Greed Is Fine. It's Stupidity That Hurts,” Washington Post, Oct. 2, 2008.

Reich, Robert, “Everyday Corruption,” The American Prospect, June 21, 2010.

Sivaraksa, Sulak, “Buddhism Nationalism and Ethnic Conflict,” an interview conducted in July 1993, published in the Tamil Times. Online at http://federalidea.com/focus/archives/112.



The News About the News


When I was a child, we subscribed to two newspapers a day. The Los Angeles Times arrived early in the morning, and the Herald-Examiner plopped onto our doorstep in the late afternoon, usually thrown by my friend Dennis Miller, who had a paper route. (Back then, moms felt safe letting their young boys ride their bikes by themselves every day and knock on doors asking for money once a month.) I always liked the Examiner better, because the photos were a little larger, the stories a little racier, the features a little more entertaining. I didn't realize back then that it was intentional: morning papers contained cold gray news for people in a hurry; evening papers provided lighter fare and racier storytelling for readers who wanted to relax and unwind after a hard day.

With the advent of television news, then cable news, then electronic news, print news has become less and less profitable. Newspapers around the country are cutting back on stories, letting staffers go, and just plain folding up. When documentarian Andrew Rossi received permission to hang out with a video camera at the New York Times offices for a whole year, he didn't know that the demise of print journalism would become the focus of the story; no good documentarian ever knows exactly where the film will end up. But that's where Page One: Inside the New York Times went, and the result is a sometimes lively, sometimes somber, mostly interesting story about the past, present, and future of journalism.

Page One is a bit character heavy in the beginning as it introduces several side stories at once. The character who shines with the most luster is David Carr, the eccentric Monday columnist for the Business section of the Times who focuses primarily on media issues. One of the ironies pointed out in the film is the fact that the Times found it necessary to open a desk in 2008 to cover the demise of the media, and Carr does it in this film with a protective vengeance.

What's cool about Carr is that he lived first, and became a respected journalist second. A self-described cokehead in his youth, he spent some time in jail before becoming a respected writer. He wrote for a number of alternative publications before joining the Times when he was approaching 50. As a result, his voice, both written and spoken, is often unfiltered and unabashed, providing most of the humor in what is often a gray documentary.

But what is killing print journalism? First is the need for profits. Subscription rates will never be able to cover the costs of writing, printing, and delivering the news. Advertising revenue is the true source of support for newspapers, and ad revenue in print media is down everywhere. As a result, coverage is down, and serious coverage is down even more. Who's going to cover city hall when readers only want to know what Lindsay Lohan is up to? And since readership determines advertising rates, more fluff is passing for news these days.

Second is the need for speed. People used to be willing to wait for the scheduled newspapers, with an occasional "Extra" in which to "read all about it" when breaking news called for the editor to "Stop the presses!" Today's tech-savvy consumers, by contrast, are constantly in touch with breaking news, through texting, Twitter, Facebook, and other instant news feeds. They expect to know what's going on, moments after it happens.

On the other hand, the blogosphere's post-now, check-facts-later mentality gives print media the edge in accuracy and credibility. Carr wryly disparages the "caco-phony" of Twitter, even though he grudgingly admits that Twitter is a "wired collective voice" that gives him a sense of what people are talking about. One important scene in the documentary demonstrates a typical 10 a.m. meeting at the Times, where several editors and reporters sit around a table discussing stories currently in progress. There is an air of calm as they take the time to check facts, discuss context, consider reader interest, and check facts again.

Nevertheless, the documentary pulls no punches in reporting on the Times' gross mistakes, including the Jayson Blair scandal and Judith Miller's 2002 articles reporting weapons of mass destruction in Iraq that turned out not to exist. Miller defends herself by saying, "If your sources are wrong, you're going to be wrong." Blair was simply lying. I'm not sure which is worse — being naively hoodwinked or being deliberately devious.

One of the most shocking revelations in the film is the "end of the war" in Iraq that was neatly choreographed by NBC execs to coincide with the 6:30 news. The documentary claims that NBC simply wanted to give viewers a "mission accomplished" closure to the story. So they filmed their reporter accompanying "the final combat troops leaving Iraq" and broadcast it live on the evening news, even though the Pentagon had made no such announcement. It reminded me of the ending of Ray Bradbury's Fahrenheit 451, when protagonist Guy Montag watches an innocent pedestrian being chased down, caught, and killed in his stead, just to give viewers the satisfaction of "closure" on the evening news.

The film touches on dozens of areas affecting journalism today. All of them are interesting and important, but the film's own cacophony of information prevents it from having a strong central storyline. In a way, this presentation is more real and honest than a neatly tied story with a beginning, middle, and end. Life doesn't always have a climax on page 72. Nevertheless, it's a fascinating film, well worth viewing.


Editor's Note: Review of "Page One: Inside the New York Times," directed by Andrew Rossi. Magnolia Pictures, 88 minutes.





Why Are We Suddenly Winning?


There was a recent miracle in Wisconsin, and it is the kind of miracle that merits some reflection.

After facing the fury of public employees for weeks, Scott Walker, Republican governor of Wisconsin, was able to sign into law restrictions on the collective bargaining rights of state employees. This was the bill that the Republicans in the state legislature had managed to pass even while the Democrats in the state senate were hiding out of state, attempting to deny them a quorum.

It was miraculous enough that the Governor and the Republican legislators hung tough in the face of the union-orchestrated onslaught of confrontational — nay, hysterical — demonstrations by spoiled teachers, duped school kids, poseur leftist college students, and union rent-a-goons (all apparently unchecked by sympathetic cops), not to mention a massive ad blitz portraying the Republicans as subhuman brutes bereft of all compassion.

But after the bill was signed, the unions went to the Left’s time-tested Plan B: use the court system to get what you want. And fortuitously for the unions, there was an election for a seat on the state Supreme Court, in which fairly conservative Justice David Prosser was up for reelection. Unions plowed an astounding $3.5 million into defeating Prosser and electing their chosen tool, JoAnne Kloppenburg, who made it clear that if elected she would rule the troublesome law unconstitutional. It was an egregious display of a complete lack of judicial temperament, not to mention intellectual integrity.

The race promised to be close — remember, another of the time-tested tools of leftist unions is to use their organizational power and endless dues money to win off-year elections, relying on the fact that the rationally ignorant taxpayer is usually too preoccupied with other things — such as actually having to work for a living — to vote in minor elections.

Initial reports showed that the union stooge had squeaked out a victory — she even went on TV to crow about it. But then another miracle occurred: during the certification process, County Clerk Kathy Nickolaus discovered that the City of Brookfield’s votes had not been counted into the statewide total. When they were, it turned out that Prosser won going away, with a margin large enough that the usual Democratic election-stealing tactics (finding a few mysterious votes that appeared to be cancelled ballots, for example) won’t work to reverse it.

This is a huge setback for organized labor in particular, and the Left in general. Be crystal clear on this: if organized labor can’t count on winning an off-year election in a blue state after spending millions of dollars and rallying its myrmidons to march, scream, and vote in lockstep, it is in very profound trouble.

Why? What accounts for all these developments?

Several recent stories help us understand why rightist Republicans, usually pathetically lacking in anything remotely resembling electoral savvy and political street-smarts, are beginning to achieve some measure of success in enacting their agenda. The stories have to do with the looming approach of the public choice tipping point  — the point at which rational ignorance ceases to be rational.

The first is a nice article by libertarian economist and all-around BFB (brilliant French babe) Veronique de Rugy. She brings to our notice the growth of that glorious illustration of the law of unintended consequences, the AMT (Alternative Minimum Tax).

The AMT was passed by a hysterical Congress back in 1969, when it was discovered that about 150 wealthy Americans were — gasp! — using legal deductions to avoid paying taxes. (Ironically, most were elderly people who had put their money in tax-free municipal bonds, bonds that, of course, fund government.)  Under the AMT, certain higher-income taxpayers must file two tax forms, the regular form, and the AMT form. If the latter shows that they owe more than they would if they used the regular form, they have to cough up the difference.

The cruel joke on the taxpayer is that the AMT was never indexed for inflation, so as each year passes it molests more and more taxpayers. In fact, the number of hapless filers hit by the AMT rose from a couple of hundred in 1970 to about a half-million in 1979 to 4.5 million in 2009.  In that year, Congress had to pass a short-term patch, so that in 2010 the number of taxpayers eligible to be AMT’d wouldn’t rise to 27 million! No doubt the taxpayers who would be targeted were made aware of this by their accountants, and pressured Congress to stop it.

The second article is a superb piece by the SAG (smart American guy) Stephen Moore. Moore contends (rightly, in my view) that “we’ve become a nation of takers, not makers.” He mentions a number of statistics that illustrate this.

There are now nearly two government workers for every worker in manufacturing (22.5 million versus 11.5 million). He notes that back in 1960, there were 15 million workers in manufacturing versus only 8.7 million in government. To put this in another way, we have more people working for government “than work in construction, farming, fishing, forestry, manufacturing, mining, and utilities combined.” Almost half of the combined $2.2 trillion that state and municipal governments cost taxpayers every year is spent on public employee pay and benefits, including pensions. (Moore doesn’t note the fact that this outlay will soon explode, as the Baby Boomers retire.)

California now has twice as many government workers as workers in manufacturing. New Jersey has two and a half times as many. Florida and New York stand at about three to one, government to manufacturing. Even more amazing, Iowa and Nebraska — classic farm states — now have five times as many government workers as farmers.

Pointing to recent surveys of college grads, which show that these people would rather work in government than in private industry, Moore makes the devastating point: “When 23-year-olds aren’t willing to take career risks, we have a real problem on our hands. Sadly, we could end up with a generation of Americans who want to work at the Department of Motor Vehicles.”

A third article illustrates the reason for the attractions of government work. This New York Times piece reports on a growing practice of government workers — retiring on full pension and immediately going back to work; in effect, double dipping. In true Times fashion, it builds the story around a human sample case, one Carlos Bejarano, who is the superintendent of an Arizona school district of about 7,000 students in grades K through 8.

In the middle of a fiscal crisis, in which the district is planning to lay off staff and close two schools because of economic hard times, Bejarano will retire this year on a pension of $100,000 per annum — but will continue at his existing job. Oh, by the way, his job pays a modest $130,000 a year, plus health and other benefits, which bring the total to about $150,000 yearly. At one hearing about the school closings, an outraged citizen sputtered out, “How dare you?”

Cases like this are legion, and the populace is beginning to know it. Try explaining to the average worker why he is paid a fraction of Mr. Bejarano’s wage, and has a 401k instead of a defined-benefit pension, or is even unemployed, while his taxes go to pay some public school administrator what Bejarano makes. Good luck.

So there you have it. These stories illustrate what I think is driving the increasing boldness of the Republicans. People are becoming ever more aware of the possible increase in their taxes, of how vast government has become, and of how often these multitudes of government workers are ridiculously overcompensated. It is this growing awareness that is changing the rules of the game — turning it from a game that favors “progressive” liberals into one that just might favor classical liberals.






Appreciation of Depreciation


Everything depreciates; even the IRS realizes this truism. They realize that your rental property is falling apart as time assaults its clapboard shingles. Even our bodies lose value, though the taxpeople grant no deductions for our annual, inarguable loss of strength, stamina, beauty, or the mental assets whereby we make our living.

Wouldn’t it be a political masterstroke for the Obama healthcare crazies to add that earmark to their bill — bodily depreciation? Rental houses decline, and so do structures of bone and flesh. You don’t believe me? Take a good look at Nancy Pelosi.

Consider your shiny new car — drive it a year and it has lost 30% of its value. And how ’bout your new $40 sweater? At the garage sale you’ll be lucky to get five bucks. And let’s not even talk about underwear. One day’s use and it’s worthless. Oh, maybe five cents as a rag.

But of all the products of our civilization, nothing loses monetary value like books. I learned this lesson in economics at our Charlotte branch library. Books initially listed at 25 bucks could be purchased for a mere quarter of a dollar. What a remarkable decline! And think of it. Unlike the car, TV, sweater, or underwear, the books are functionally brand new — just as useful as the day they were printed. Sometimes they're even more so, with helpful notes that some previous reader scribbled in the margin. And better yet, if you're really lucky, sweet little notes and inscriptions: “Christmas 1945: Rob, Hope you return by June." "This book — see page 6 — expressed my feeling for you! All my love, Betty.” I often wonder why Betty didn’t save $19.95 and enhance her poetic reputation by simply copying page 6, changing a few names, and sending Rob a handwritten note.

You never know what you’ll find. I had an overnight guest who searched my library for bedtime reading and randomly picked the “bank” where we kept stray greenbacks for a household emergency. He didn’t get to finish the book, he said — could he borrow it? We switched him to a safer book.

Everything depreciates, including — it hurts me to say this — love, though nobody has calculated a percentage acceptable to the IRS. Too many variables. The only exceptions may be wine and super-aged antiques. I guarantee you that my iMac, 50 years hence, will gain additional value (the patina of age, you know), as compared with the items in my kitchen pantry: green cheese, curdled milk, and purple veal chops.

And then there are cars. First, a giant decline in value, beginning with the drive out of the showroom; then a continuing decline as you pile up miles. But somewhere in its automotive lifecycle, a car becomes a nonpareil gem, worth more than you paid for it. Buy and hold.






The Thin Blue Line


Maricopa County (AZ) sheriff Joe Arpaio has done it again.

If you’re familiar with the sheriff’s methods, that “it” has you uneasy even before you click the link. What did he do this time? Is the “it” an old favorite, such as forcing an illegal immigrant to give birth while shackled, hunting illegals in the desert from behind the turret of a .50-caliber machine gun, or defending his prison guards after their fatal battering of a mentally retarded man accused of misdemeanor loitering? Or is this “it” a newly devised offense against both Constitution and decency?

Unfortunately for everyone under his jurisdiction, it’s the latter. In late March Sheriff Joe mobilized his tank, his armored troop carriers, his SWAT platoon, and his bomb-defusing robot, in order to serve a warrant on Jesus Llovera, a man with no felony convictions and no history of owning or even displaying any weapons. Llovera’s alleged crime? Breeding birds for cockfighting.

Now, Llovera does have one prior on his record: a misdemeanor for attending a cockfight. And he did have 115 of the birds on his property (all of which, rather needlessly, were put to sleep during the raid) that were more than likely intended for similarly cruel spectacles. So this is a bit more focused than when Arpaio raids retail stores on the off chance that some of their employees lack visas. But even with probable cause and a legally obtained warrant to serve, what could possibly justify such an ostentatious show of force?

For Arpaio, the answer is always simple — and certainly has nothing to do with any book of local, state, or federal law. No, what Sheriff Joe craves is the spotlight, and he will do anything — any “it” — to get himself in front of the cameras. In this case, the “it” he did was to organize this whole raid in order that C-list celebrity Steven Seagal could ride along and look tough in a tank for his reality show Lawman.

Remember: SWAT teams were originally instituted to deal with extreme threats to the public safety, such as the ex-Marine Charles Whitman, who barricaded himself in the University of Texas clock tower and rained bullets onto the crowd below. Now we have entire departments going out in full riot gear to arrest a single unarmed man and euthanize his chickens.

Whether that makes for gripping TV drama is a question I leave to Seagal’s dwindling audience to decide. I am sure, however, that it doesn’t make for good policing — and just as sure that Arpaio doesn’t give a damn.





School Choice News


Among a number of other bits of good news lately, there has been a favorable Supreme Court ruling regarding school choice.

A closely divided Court decided (5–4, in Arizona Christian School Tuition Organization v. Winn) to uphold an Arizona law meant to facilitate school choice. The law allows people who donate to organizations that fund scholarships at private schools, including religious ones, to claim a credit for those donations on their state income taxes. Opponents of the law — including, naturally, teachers’ unions and public school administrations — argued that the tax credit amounted to establishment of religion, and was thus unconstitutional. They pointed to the fact that many of the schools supported by the tax credit required students to be of a particular faith. The opponents were trying to get around the landmark 2002 Supreme Court ruling Zelman v. Simmons-Harris, which held that voucher programs comply with the establishment clause, even when the vouchers are used to send kids to religious schools.

The opponents’ suit was based on a 1968 Supreme Court ruling that allows people who are not harmed by a religious subsidy to have standing to sue, because otherwise enforcement of the establishment clause would be difficult. But the majority of the current Court held that the exemption was meant only to apply to actual government payments to support religion, and a tax credit is not a government payment; it is just funds never collected to begin with.

This ruling will permit more states to allow tax breaks enabling parents whose children are being cheated out of a decent education by the state monopolistic school systems to send their kids to religious schools instead (or private secular schools, for that matter). Robert Enlow, head of the estimable Foundation for Educational Choice, hailed the verdict, saying, “Every state that is considering a tax-credit program can rest easy.” As a religious agnostic, I also hail the ruling. If you want to send your kids to a religious school, it seems obvious that you should have that right — it doesn’t harm me in the least.

Predictably, educrat Francisco Negron, head lawyer for the National School Boards Association, the major organization representing state public school systems, condemned the ruling, rightly viewing it as another blow to the public school monopoly. Indeed, yes sir, it is a blow — to those disgusting swamps of governmental failure, which deserve all the efforts we can make to drain them, since they are destroying the lives of hundreds of thousands of children, every year. Negron’s specific complaint, that allowing tax deductions for private schools lowers the resources available for public schools, is specious. Yes, allowing tax credits reduces funds available to the public schools, but it also reduces the number of their students, hence their costs.

Those who find little difference between the political parties should note that all of Bush’s Court appointees voted for the ruling, and all of Obama’s and Clinton’s voted against it. The Obama administration supported the law officially, but the people whom Obama put on the Court voted against it. Justice Kagan — Obama’s most recent pick for the Court — wrote the dissenting opinion. This is a classic progressive liberal trick: feign support for popular initiatives, but pack the courts with judges who will rule them unconstitutional.








Behind the NPR Fiasco


“You who think that you’re so great! You who judge humanity to be so small! You who wish to reform everything! Why don’t you reform yourselves?”

— Frédéric Bastiat

When a fundraising scandal recently ensnared National Public Radio, some opinion-makers rushed to revel in the arrogant organization’s woes. I resisted, not because I’m a spoilsport but because, living all of my adult life on the West Coast, I’ve absorbed enough Zen philosophy to give wide berth to schadenfreude. It can be a prologue to one’s own misfortunes.

Some weeks have passed, now, and it’s safe to consider what happened. And why it matters.

The facts of the scandal are fairly straightforward. James O’Keefe — the right-wing media provocateur — had a couple of colleagues pose as members of a U.S. Islamic group interested in donating money to NPR. These representatives of the “Muslim Education Action Center” were put in touch with Ron Schiller, NPR’s head of fundraising, who arranged a lunch meeting. The fake Muslims came to the meeting equipped with hidden audio and video devices.

Over the course of the two-hour lunch, they said outrageous things about Israel, Republicans, and the Tea Party. Schiller, practiced in the obsequious manners of big-dollar fundraising toadyism, agreed and agreed. He agreed to some things that implied NPR has an anti-Israel bias and other things that indicated he and his colleagues are insecure strivers with naught but contempt for middle America.

For all his sucking up, Schiller didn’t get the $5 million check. The fake Muslims said they had a few things to think over first. And they hustled out with their material.

In a telephone call recorded after the lunch, the fake Muslims asked Schiller’s lieutenant (NPR’s “senior director of institutional giving”) whether she could have the $5 million donation treated as anonymous. The fake Muslims claimed that they were concerned about being audited by the government; she replied that this was possible and that she would do everything she could to obscure the gift’s provenance.

O’Keefe whittled the two hours of video into an 11-minute excerpt. And he released his excerpt to the internet and television news outlets, which repeated snippets of the NPR fundraiser sucking up to the Muslim Brotherhood and calling the Tea Party a collection of ignorant bigots.

Outrage — some genuine, some clearly manufactured — followed. And a couple of obsequious fundraisers weren’t going to satisfy the establishment Right’s partisan bloodlust. Besides, before his lunch with the fake Muslims, Schiller had already announced that he was leaving NPR to take a similar post with the left-leaning Aspen Institute. So, the Right turned its attention to a bigger target: NPR’s chief executive, a woman named Vivian Schiller (who is, as noted repeatedly, no relation to Ron Schiller).

Schiller’s boast that NPR didn’t need the government money that it normally receives played into the hands of NPR’s political adversaries.

Vivian Schiller had been an executive at the New York Times Company before moving to NPR and had been on the radar of establishment Republicans for some time. She rose to the top of their hit lists after firing NPR and Fox News commentator Juan Williams for . . . well, for splitting his time between NPR and Fox News.

On March 9, NPR released this statement from its Board of Directors’ Chairman Dave Edwards:

“It is with deep regret that I tell you that the NPR Board of Directors has accepted the resignation of Vivian Schiller as President and CEO of NPR, effective immediately. The Board accepted her resignation with understanding, genuine regret, and great respect for her leadership of NPR these past two years.”

Edwards said the decision to part ways with Vivian Schiller proved the Board’s “commitment to NPR’s standards.”

While the organizational elite talked about standards, NPR’s trench diggers made like the Ministry of Truth — rewriting history to justify throwing Vivian Schiller under the bus. According to NPR’s own media correspondent, David Folkenflik: “some at NPR found Vivian Schiller’s leadership under fire wanting.” And he quoted one longtime network employee saying “we have not been well served by recent management. Many of our managers are talented and solid, but others have not been and have exposed us to some terrible, terrible hits.”

All of this was petty distraction. The big issue looming behind the quibbles over O’Keefe’s video antics — one Schiller’s embarrassing comments and the other Schiller’s shaky management — was, of course, money. When the New York Times reported on Schiller and Schiller’s fumbling pas de deux, it tried to set the frame:

“In the midst of a brutal battle with Republican critics in Congress over federal subsidies, NPR has lost its chief executive after yet another politically charged embarrassment.”

One of Ron Schiller’s most embarrassing comments on the O’Keefe video was a boast that NPR didn’t need the government money that it normally receives. This played into the hands of NPR’s political adversaries.

For years, establishment Republicans have been calling for cuts in the federal funds given to NPR and its parent entity, the Corporation for Public Broadcasting. These calls have grown louder since control of the House of Representatives changed hands in 2010. And they’ve changed from “cut the government funding” to “eliminate the government funding.” The day that Vivian Schiller resigned, House Majority Leader Eric Cantor released the following statement:

“Our concern is not about any one person at NPR, rather it’s about millions of taxpayers. NPR has admitted that they don’t need taxpayer subsidies to thrive, and at a time when the government is borrowing 40 cents of every dollar that it spends, we certainly agree with them.”

NPR has long played Enron-like accounting games when explaining how much government money it receives each year.

Like most flimflam artists, its executives prefer to talk in percentages rather than in absolute dollars. They say that NPR only gets 2 or 3% . . . or maybe 5% . . . of its operating budget in the form of direct government assistance. Well, sort of direct; the money goes to the Corporation for Public Broadcasting first and then to NPR. Strictly speaking, this explanation is true. But it’s also incomplete.

NPR counts more heavily on programming fees collected from its “member stations,” in most cases low-on-the-dial FM operations affiliated in some manner with colleges or universities around the country. These fees — which account for somewhere between 15 and 25% of NPR’s operating budget — are usually paid with federal government grants made to the local stations.

It’s easy to rationalize an earnest, middlebrow radio network as, well, maybe not the worst waste of $40 or $50 million in taxpayer money.

If establishment Republicans like Rep. Cantor have their way and eliminate the federal subsidies of NPR and its member stations, the network will lose as much as a third of its revenue base. Despite the tough talk, that would make a major dent in its business model. In absolute dollars, NPR’s annual budget has ranged between $150 and $170 million in recent years; so the cut would be something like $40 or $50 million from that.

Establishment Republicans have an intricately-wrought animosity to National Public Radio.

As several sharp media observers (most notably, Timothy Noah of the online magazine Slate.com) have pointed out, Republicans have been calling for cuts to NPR’s government subsidies for decades. The cuts rarely take effect. Instead, the politicians and the network dance a kind of statist tango in which the two sides exchange insults, realize their mutual utility, and then decide to coexist rather than take action against one another.

To be sure, NPR has a left-wing bias. This bias is most evident in the network’s framing of topics in the news — the production-booth decisions about establishing the terms of debate on a particular topic, defining the parameters of coverage, formulating the questions asked of interview subjects. And, perhaps most important, determining which topics aren’t covered at all.

And NPR’s coverage of the present administration is a study in euphemism, rationalization, and justification. Every failure or setback is “unexpected,” any modest success is “profound” and “important.”

Despite this corporatism and institutional arrogance, NPR produces some good work. Its overall tone is generally earnest rather than partisan. And it puts on some damn good shows — including its weekend programming staples Car Talk and the documentary series This American Life. When you’ve listened to one of these — or a set of Ella Fitzgerald’s best work — it’s easy to rationalize an earnest, middlebrow radio network as, well, maybe not the worst waste of $40 or $50 million in taxpayer money.

Perhaps most important to establishment Republicans, the government subsidies give them influence with NPR. And its earnest, middlebrow listeners. As long as the network relies on government funds, it has to be “fair” to both establishment political parties. And, in this context, “fair” means perpetuating topics and coverage that serve the interests of the establishment parties.

So, why the difference this year? Why the executive resignations instead of another statist tango? Was the difference James O’Keefe? Or forces beyond his media trickery?

Probably the latter.

An NPR employee in a position to know told me that the organization’s elite worries that establishment Republicans aren’t interested in the tango this time around. Influenced by Tea Party activists, particularly in the House of Representatives, the GOP may actually cut NPR’s allowance significantly, if not completely. That’s why Ron Schiller’s boast about not needing government money and his obsequious remarks about ignorant Tea Partiers were such a double-whammy. And why NPR’s Board wanted more than just the head of a middle-level executive who was already halfway out the door.

NPR’s institutional elite may still be as earnest and dedicated as the network itself; but it breeds monsters.

NPR fired Vivian Schiller to show true believers in the Congress that it’s still willing to dance the statist tango. Now, it waits to see if they’re impressed. We’ll find out this summer, when Speaker Boehner assembles his first budget.

One last point to consider, with regard to the arrogance of institutions like NPR.

Here in the States, public radio is like your uncle, the charming communist who teaches sociology at the local community college. Earnest. Dedicated. Credentialed. Green. Reform-minded. Smart in a million minor ways. So, why do many of its employees make bone-headed decisions in the things they say and do?

Ron Schiller isn’t the only one who’s done something stupid. Last year, it came out that the publicity director for one of NPR’s larger member stations had posted to the left-wing Internet user group JournoList that she would “Laugh loudly like a maniac and watch his eyes bug out” if right-wing radio personality Rush Limbaugh were dying in front of her.

The publicity director, a woman with the Dickensian name Sarah Spitz, later issued a watery apology:

“I made poorly considered remarks about Rush Limbaugh to what I believed was a private email discussion group from my personal email account. …I apologize to anyone I may have offended and I regret these comments greatly; they do not reflect the values by which I conduct my life.”

That common weasel phrase “may have offended,” so fatal to the spirit of apology.

NPR took great pains to distance itself from Ms. Spitz. It emphasized that she had never been an employee of the network — although it had run a few pieces she’d submitted from her occasional on-air work at the local station where she was an employee.

The term “cognitive dissonance” applies here. Some small minds don’t like the confusion caused by holding conflicting or inconsistent ideas, so they flee to orthodoxy. Structure and agreement. Arrogant institutions offer these things; but decadent institutions (which can also be arrogant) aren’t able to manage their orthodoxy and structure. Counter-intuitively, they become more orthodox because they are institutionally decadent. So it is with NPR. Its institutional elite may still be as earnest and dedicated as the network itself; but it breeds monsters. Small minds that seek agreement instead of wisdom, tormented by insecurities that they barely perceive.

They can’t imagine anyone disagreeing with the institution’s orthodoxy. Just as they can’t imagine anyone voting for McCain. Or Barr. Or anyone other than the overwhelmed mediocrity now occupying the White House. This lack of imagination becomes a kind of mental defect; and the people become fruit ripe for plucking by someone like James O’Keefe.

I’d turn back to O’Keefe and tell him that such ripe fruit is also low fruit. But who am I to get between a man and his livelihood?

As for NPR, if it loses its government subsidies, the good programs it produces will find value in the open market. And value eventually finds a home.






How to Unblock Your Writing

 | 

Wouldn't it be great to have limitless access to all the cells in your brain? To have a Google feature of sorts that would allow you to immediately call up just the right fact or memory that you need at any given moment, and the ability to synthesize and analyze all that information instantly?

That's what the drug NZT does for characters in the film Limitless, a mystery-thriller in theaters now. Eddie Morra (Bradley Cooper) is a sci-fi writer with a contract for a book, but a debilitating writer's block has prevented him from writing a single word. His life is a mess, his apartment is a mess, he's a mess, and his girlfriend Lindy (Abbie Cornish) has just broken up with him because of it.

Then he meets an old friend, Vernon Gant (Johnny Whitworth), who gives him a tab of NZT. Within minutes Eddie experiences a rush of insight and intelligence. He can remember books he thumbed through in college, television shows he saw in childhood, and math equations he learned in high school. Within hours he has cleaned his apartment, resolved his back rent, and written 50 scintillating pages of his novel. But the next day, he is back to being the same sloppy Eddie who can't write a single word. More NZT is in order.

If this story line sounds familiar, it is. Daniel Keyes explored this idea of artificially stimulated intelligence in his "Flowers for Algernon," which was later made into the movie Charly starring Cliff Robertson. Phenomenon, a John Travolta film, uses the same premise. Even the spy spoof television show "Chuck" uses a similar premise when the title character is able to access information stored in "the intersect," as though his brain were a remote computer. What makes this film stand out, however, is its jazzy musical score, witty script, engaging mystery, and skillful cast, not to mention its unexpected ending.

The film begins with the resounding thump of a sledgehammer against a metal door that jars the audience out of its seat. The throbbing musical score (by Paul Leonard-Morgan) drives the story forward, lifting the audience into a feel-good mood.

Eddie's brain on NZT is simulated artfully for the audience through special camera effects that make Eddie's consciousness seem to speed not just down the road but also through cars, down sidewalks, into buildings, and out of them again at dizzying roller-coaster speeds. When he begins to write, letters appear from his ceiling and drop like rain into his room. Later, when he starts using his newfound skill to make money in the stock market, his ceiling tiles become a ticker tape, composing themselves into the stock symbols that he should buy. Intensified color is also used to portray his intensified awareness; even Cooper's intensely clear blue eyes add to his character's altered sense of reality. These techniques are very effective in creating a sense of what Eddie is experiencing.

The story's suspense is driven by Eddie's shady relationships with a drug dealer (Whitworth), a loan shark (Andrew Howard), a stalker (Tomas Arana), and an investment banker (Robert De Niro). Eddie cleverly draws on all the memories stored in his brain to thwart the bad guys, but when he unwittingly comes across a dead body, he reacts in the way a normal person would — completely terrified, knocking over a chair as he collapses, then hiding in a corner with a golf club for protection as he calls the police. It's refreshing to see a character react as we probably would, instead of displaying unrealistic aplomb.

Limitless is a surprisingly entertaining film, with its fast pace, skilled cast, creative camera work, and interesting plot. Well worth the price of a ticket.


Editor's Note: Review of "Limitless," directed by Neil Burger. Many Rivers Productions, 2011, 105 minutes.





Much More Than Moore

 | 

The hardest part of making a film is not writing the script, hiring the cast, choosing the locations, planning the shots, or editing the footage down to a moving and entertaining feature that tells the story in under two hours. The hardest part of filmmaking is finding the funding. It takes money to make a movie. Lots of money.

Ideally, the consumers (moviegoers) should pay for the product (the movie on the screen). And ultimately, they do, $10 at a time. But filmmakers need money upfront to make the product. Piles and piles of money. This is just Capitalism 101 for libertarians, and it makes me stare in disbelief when Americans glibly criticize the capitalist system for being corrupt and selfish. What could be less selfish than deciding to forego current consumption in order to invest in someone else's dream?

From the earliest days of filmmaking, films have been financed in several ways: using personal funds, either from one's own pocket or that of a rich friend or relative; applying for business loans; studio investment; and selling product placement. In recent years, product placement has become increasingly important as a way to fund the burgeoning cost of producing a movie, where a million dollars can be considered little more than chump change.

Morgan Spurlock, the new darling of the political-agenda documentary, exposes the process of selling embedded advertising in his new film, The Greatest Movie Ever Sold, which opens later this month. But, as I said, product placement is nothing new. From the start, radio programs and TV shows were "brought to you by" a particular sponsor; product placement was simply a way of getting the product into the show itself. Today product placement is a multibillion-dollar undertaking. Also called "co-promotion" and "brand partnering," this marriage of convenience provides money for the movie makers and brand recognition for the product. According to the documentary, businesses spent $412 billion last year on product placement ads, from the Coke glasses placed in front of the judges on American Idol, to the Chrysler 300s driven by Jack Bauer on 24 (after Ford withdrew its F-150s), to the kind of phones that Charlie's Angels carry.

The film is informative, intelligent, and laugh-out-loud funny, largely because of Spurlock's dry, self-deprecating humor as he goes about looking for sponsors for his film, which is simply a movie about Spurlock looking for sponsors for his film. Where Michael Moore made his mark in documentaries by humiliating his subjects through ambush journalism, Spurlock is gleefully upfront about what he is doing, treating his subjects with seriocomic respect and appreciation.

We all know we're being had, but he does it so openly that he makes us enjoy being had.

Spurlock doesn't just walk into business meetings unprepared and beg for money. He does his homework, as good filmmakers (or any salesperson) should. He begins with a psychological evaluation to determine his "Brand Personality," which helps him identify what kinds of products would be a good fit for his film. Not surprisingly, his brand personality is "mindful/playful," so he looks for products whose makers think of themselves as appealing to consumers who are mindful and playful. He arrives at meetings with high-quality storyboards and mockups to make his pitch. He listens carefully to the producers and accommodates their concerns. After all, if their needs aren't met, they won't fund the film. They are his consumers as much as the ticket buyers at the multiplex will be.

The film is liberally peppered with products, all of them described, worn, eaten, or presented with Spurlock's respectful glee. We all know we're being had, but he does it so openly that he makes us enjoy being had. Even his attorney is a product placed in the movie; after discussing a contract, Spurlock asks how much the consultation will cost him, and the attorney replies, "I charge $770 an hour. But the bigger question is, how much is it going to cost me to be in your movie?" (I wrote the attorney's name in my notes, but I'm not repeating it here. He hasn't paid Liberty anything to be mentioned in our magazine . . .)

Spurlock likens his movie to a NASCAR racer, and accordingly wears a suit covered in his sponsors' logos for interviews. The official poster shows his naked body tattooed with the logos, with a poster board of the film's title strategically placed across his crotch.  (Nudity sells, but I guess his manhood didn't pay for product placement.)

The film is funny but also informative. Despite Spurlock's gleeful presentation, he offers many serious ideas about product placement in movies and about advertising in general. For example, he discusses the potential loss of artistic control when the sponsoring company wants things done a certain way. This isn't new; Philip Morris reportedly told Lucy and Desi they had to be seen smoking more frequently on "I Love Lucy," the most popular show of the 1950s, and they complied. A filmmaker has to weigh the money against the control, and decide how much to compromise.

Truth in advertising is also discussed. Spurlock visits São Paulo, Brazil, where outdoor advertising has been completely banned by a new "Clean City Law." Now store owners rely more heavily on word-of-mouth referrals for new customers, which may indeed be a more honest form of testimonial, but it is highly inefficient — and inefficiency is generally passed along to consumers in the form of higher prices. In the film, local Brazilians glowingly praise their ability to "see nature" now that the billboards are gone, as Spurlock's cameras pan to the high-rise buildings that overpower the few shrubs and trees in the downtown area and block the view of the sky. Subtle, and effective.

Spurlock also interviews several people to get their opinions of truth in advertising. Ironically, one of the interviewees has bright magenta hair taken from a bottle, another has the unmistakable ridge of breast augmentation, another is wearing a sandwich board advertising a nearby store, while a fourth is pumping gas at the chain that has made a brand-partnering deal with Spurlock. Once again Spurlock is making gentle fun of his subjects, and we laugh gleefully along with him. (But I'm still not willing to reveal the name of the gas station until they pony up with some advertising money for Liberty.)

The Greatest Movie Ever Sold may not be the greatest documentary ever made, but it is mindful and playful, like its maker. If it comes to your town, don't miss it.


Editor's Note: Review of "The Greatest Movie Ever Sold," directed by Morgan Spurlock. Snoot Entertainment/Warrior Poet, 2011, 90 minutes.





Losing the Battle, Spinning the War

 | 

March was a time of judgment on the American official language — the language spoken by the people considered most qualified to sling words around: politicians, media operatives, public educators of all kinds. The official language was weighed in the balance, and found wanting. It proved grossly unequal to the challenge of such mighty events as the Japanese earthquake, labor unrest in Wisconsin, and the political embarrassments of government radio. And then along came Libya.

As usual, the commander in chief led the nation into linguistic battle on most of the fronts available; and as usual, he was beaten in every skirmish. About Wisconsin he did what he ordinarily does; he tried to get into the fight, while also trying to stay out of it. A violent proponent of unions, and an eager recipient of union funds, he still hopes to win the electoral votes of all those states that are in financial turmoil because of the demands of public employee unions. So he acknowledged the states’ budget problems, and then he said, “It is wrong to use those budget cuts to vilify workers.” A little later, when asked to state Obama’s position on the continuing turmoil in Wisconsin, his press agent repeated that inane remark.

Of course, nobody was vilifying workers, even if you are crazy enough to equate workers with government employees. What some people were doing — and suddenly, such a lot of people — was trying to keep the unions that represent people employed by state and local governments from bankrupting their employers. Obama’s feckless verbal feint would have turned into a factual rout if some White House correspondent had asked the obvious question: “What vilification are you referring to?” But nobody seemed able to do that.

The commander in chief led the nation into linguistic battle on most of the fronts available; and as usual, he was beaten in every skirmish.

Meanwhile, union shock troops were occupying the capitol of Wisconsin, trying to prevent its legislature from voting. These vilified workers caused over seven million dollars of damage. Yet even Fox News’ Megyn Kelly, a right-wing personality on a right-wing channel, was willing to call the Wisconsin actions “peaceful.” You see what I mean about the official language not being adequate to the crisis? Suppose I came over to your house with a few thousand friends chanting obscene slogans against you, and we camped in your living room for weeks, attempting to force you to do what we wanted you to do — would you call that peaceful? Of course not, but only one person in the media, a volunteer bloggist whom Yahoo! News, in a fit of common sense, allowed on its site, made a point like that. Congratulations, bloggist. You have linguistic qualifications that none of the media professionals can equal. But they’re the ones who are getting paid.

Among this country’s most influential purveyors of the American official language is National Public Radio. I’m calling it that because it is currently attempting to deny its identity as government radio by calling itself by a set of non-referential initials: it just wants to be known as good ol’ “NPR.” Well, sorry, alphabetical agency: we all know the smell of a government medium. It comes from the money it tries to cadge from the taxpayers.

In early March a highly paid government-radio official was caught on video telling some “Muslim” potential donors that “NPR” would actually be better off without government help, presumably because it would no longer have to pay any attention to the majority of the American people, whom he suggested were ignorant and stupid and susceptible to the racist propaganda of people who actually, believe it or not, would like a smaller government. He identified the tragedy of America as the fact that its educated elite (clearly typified by himself) was so small and uninfluential. Those were the views that Mr. Ron Schiller, senior vice president of National Public Radio, expressed concerning the citizens of the United States, who (perhaps tragically) put the “N” in “NPR.”

Suppose I came over to your house with a few thousand friends chanting obscene slogans against you, and we camped in your living room for weeks, attempting to force you to do what we wanted you to do — would you call that peaceful?

Schiller was forced to resign immediately. His brief public statement assesses his behavior in this way: “While the meeting I participated in turned out to be a ruse, I made statements during the course of the meeting that are counter to NPR’s values and also not reflective of my own beliefs. I offer my sincere apology to those I offended.”

Again we see the limitations of the official language, which proved utterly incapable of specifying what went wrong with Mr. Schiller, who might be offended by his remarks, or why anybody might be offended. In short, the official language was incapable of answering any question that anyone who read his statement would probably ask. And it created new and damaging questions: Why did you make statements that were not reflective of your own beliefs — that is, lie? By the way, what are your beliefs? Do you actually believe that other Americans are smart but you are dumb, yet for some reason you keep maintaining the opposite? If so, how does that happen? What were you thinking, anyway? But no one in the high-class media found the words to ask such simple questions.

Now we come to the terrible events in Japan. Again, Obama was in the vanguard of our linguistic forces. And again . . . Here’s what he said about the earthquake and tsunami, on March 11 — in prepared remarks, presumably edited by numerous White House word wizards, who were struggling to get exactly the right tone. “This,” Obama said, “is a potentially catastrophic disaster.”

Gosh, this thing is so bad, something really bad may happen.

When the president is attacked and captured by his own language, what can we expect of his assistant priests, the writers and readers of the “news” media? The answer is, Even worse. And we got it.

Particularly impressive was the horror-movie approach, with the Japanese cast as Godzilla: “Operators at the Fukushima Daiichi plant's Unit 1 scrambled ferociously to tamp down heat and pressure inside the reactor” (AP report, March 11). I have trouble picturing anyone tamping down heat or pressure, but it’s even harder for me to picture someone doing it ferociously, unless that someone is a monster trying to rescue its offspring from the accursed humans’ nuclear experiments.

But maybe the ferocious beings were actually the talking heads of American TV. On the selfsame day, March 11, Fox News’ late-night guy was calling the earthquake and tsunami “one of the worst natural disasters in the history of mankind.” Fox News’ Rick Folbaum called it “the fifth worst earthquake in the history of earthquakes, folks.” Yet again, the official language just isn’t up to the task. It ought to be able to distinguish between “the hundred years since earthquake records have been scientifically kept” and “the history of earthquakes” or “the history of mankind,” but evidently it can’t. Under communism, hundreds of thousands of people in China lost their lives in natural disasters — but we have no words to speak of them, do we? Or maybe, just maybe, we never read a book, so we don’t know nothin’ ‘bout things like that. In either case, the problem lies with words. We can’t use them, and we can’t read them either.

After Folbaum made his immortal declaration, his colleague, Marianne Rafferty, consoled the audience by promising, “We will keep everyone up-to-dated.” Would anyone who had ever read a book—I mean a real book, with real words—say a thing like that? What would you have to be paid to make such a statement before an audience of educated people, or even just people?

The worst thing is that words are related, in certain ways, to thoughts; so if you don’t have thoughts . . . Some examples:

“Is Japan getting the assistance it needs?” That’s the question that Wolf Blitzer asked the Japanese ambassador to the United States (March 12, CNN). I thought it was a little strange that Japan, one of the richest and most technologically advanced nations on the planet, should be the object of that question. But never mind. In reply, the ambassador noted, somewhat vaguely, that his prime minister had ordered one-fourth of the nation’s armed forces to help the people currently starving a moderate distance north of Tokyo. That apparently satisfied Blitzer. He didn’t say what you would have said: “What! Why isn’t he mobilizing the entire army?” He didn’t say what you would have expected him to say: “Wait a minute! What’s your God damned army for, anyhow? We can get our correspondents into the disaster zone — why can’t you get your army in? And if you can’t, why don’t you air-drop supplies? In short, Mr. Ambassador, what the hell are you talking about?” But I guess Blitzer couldn’t think of those questions. After all, he’s merely one of America’s most famous interviewers.

Marianne Rafferty consoled the audience by promising, “We will keep everyone up-to-dated.” Would anyone who had ever read a book—I mean a real book, with real words—say a thing like that?

“There’s the sense that they’re in this together, and they’re just trying to get along as best they can.” That’s what CNN’s Anderson Cooper said on March 14, describing Japanese people waiting hours for government water, only to have an official tell them that the government had run out of water and they would have to wait an undetermined number of additional hours in line. He liked the way the victims remained stolidly in that line. He thought it was good that they didn’t complain. Yes, in subsequent days of reporting, he did begin doing what any normal information-processor would have done right away: he criticized the Japanese government for its lies and incompetence, at least about the lurking threat that we all fear, nuclear reactors. But he never questioned his favorable view of the people’s passivity (the media word was “calm”). It just wasn’t in him to make the connection between the people’s passivity and the government’s incompetence. Again, he didn’t have the words. I assume that he didn’t have the thoughts, either.

Here’s another instance. “You wonder how any government could deal with such a thing,” intoned Shepard Smith, a Fox News figure momentarily stationed in Japan, on the evening of March 15. He was referring to the combination of the nuclear issue and the disaster relief issue, both of which the Japanese government was supposed to “deal with.” Personally, I didn’t “wonder” about that. I suspect that you didn’t either. Any responsible government could find out how to deal with such problems. There are known procedures for addressing dangers in nuclear power plants, and disaster relief is not an unknown science. This wasn’t World War II. But maybe the Japanese official class is like our own — so tied up in its own linguistic incapacities that it can’t formulate an efficient thought.

Now to Libya. I’ve recently written two reflections about Libya for this journal, so I can hit the ground running. What everyone with a brain is still laughing about is President Obama’s address to the nation on March 28. Generally, watchers identified the most risible part of the speech as Obama’s denial that he intended to get rid of Qaddafi. Admittedly, he wanted Qaddafi gone; yet, he said, “broadening our mission to include regime change would be a mistake.” He couldn’t find the words to say “ousting Qaddafi,” so he said “regime change.” If you’ve got the magic decoder, you’ll understand this. But you still may not understand his policy.

By denying his lust for regime change, he costumed himself as a dove. Unfortunately, that made the hawks wonder whether he really, truly, wanted Qaddafi out. (They’d heard double-talk before.) So on the next day, he went back on TV to express his satisfaction that the members of Qaddafi’s inner circle supposedly “understand that the noose is tightening.” Ah! Now we are the executioner with the noose. So both the hawks and the doves are happy, right? Well, maybe not.

You can tell when somebody is really dumb, or is really desperate for the attention of people in Washington: that person is eager to go on TV and defend nonsense like this, which nobody else could possibly defend. Thus Bill Richardson, once Clinton’s ambassador to Monica Lewinsky, then governor of New Mexico, now television expert on constitutional law, informing CNN that Obama was acting purely in order “to avert a humanitarian disaster” when he started bombing Libya. Asked whether the president shouldn’t have consulted with someone in Congress before going to war, Richardson said there was no need: “This is not a war powers situation.”

You see! You see! There it is again. The vocabulary is missing. The official language has no words for “war,” “making war,” or anything else that Obama was obviously doing. So we are forced to watch this strange, slow shifting of vehicles around the used car lot, as political salesmen try to find some piece of junk that the suckers will buy: “this is not a war powers situation.”

Imagine Libyan planes and rockets bombarding the New Jersey coast. Would that be a war powers situation? Would it turn into one if its goal were regime change? Or would it still be a mission to avert a humanitarian disaster, and therefore immune from legislative review?

But here’s the real stuff. In his address to the nation on March 28, President Obama tried to calculate the scale of the humanitarian disaster he was trying to avert, without the help of long (or even short) consultations with Congress: “We knew that if we waited one more day, Benghazi — a city nearly the size of Charlotte — could suffer a massacre that would have reverberated across the region and stained the conscience of the world.”

I know, I know — you can’t resist the unintentional humor of “a city nearly the size of Charlotte,” as if anybody knew, or cared, how large Charlotte (North Carolina?) might be. The desired impression was: Whoa! That big, dude? Then I guess we gotta go to war! The real impression was: Not!

But there are so many other things to notice:

The image that simply makes no sense: try to picture a massacre that reverberates.

The modesty that presidents get whenever they know they’re in trouble, and “I” just naturally transforms itself to “we.” (Were YOU waiting? Did YOU know?)

The Victorian prissiness of “suffer a massacre.”

The pathological specificity of “one more day” and “nearly the size.”

The moral stupidity of “stained the conscience of the world,” which literally means that if some bad thing happens, everyone in the world becomes guilty of it. (All right; you think I’m just being sarcastic. Then tell me what the phrase actually means.)

And finally, the breakdown in thought and grammar evident in the goofy progression of verbs: “If we waited . . . Benghazi could suffer a massacre that would have reverberated.” To see what’s happened here, insert some normal words into the various grammatical slots. Like this: “I knew that if I waited, you could write me a check that would have made me rich.” Huh?

Anyone who knew grammar would have fixed that one up, but as we know, Obama, the world-famous author, has no knowledge of grammar, never having mastered even the like-as distinction, let alone verb progression. But examine his inability to distinguish the meanings of “could” and “would.” The president was forced to admit that he had made a decision, that what he did wasn’t inevitable, and that he wasn’t, like Yahweh, absolutely certain about the future. That’s how “could” got into that abominable sentence. Yet at the same time, he wanted to imply that he was certain about the future: why else could, or might, “we” have made the decision we made? So he put in “would.”

And that solved his problem. So far as he could tell.

Don’t blame him. He speaks only the official language.



Libya and 2012

 | 

By intervening in the Libyan civil war and dragging the United States into war with Libya, President Obama has effectively signed his resignation papers. There is no way he will win in 2012.

Republicans already hate Obama, but more and more the modern-liberal Democrats are turning on him. Some say he hasn’t done enough to fight global warming (although he’s done too much already); some say he should implement the end of Don’t Ask Don’t Tell more quickly (which is quite true); some say that he makes too many compromises (although he doesn’t make enough of them). But all these things aside, Obama was elected first and foremost on an anti-war platform. He has failed to stabilize the situation in Iraq, he has failed to get us out of Afghanistan, and now he has started a war with Libya, a war that he cannot blame on Bush.

And why did we get involved in Libya? Either because of its oil reserves, or because we are now the world’s policeman. Neither reason befits an anti-war politician. The anti-war people are soon going to start abandoning Obama in droves, and he will find that the modern-liberal wing of the Democratic Party is not going to worship him as it did back in 2008. Obama’s “change” has been revealed for what it always was: more of the same old politics. He is vulnerable in 2012, especially if the Republican Party nominates an anti-war, libertarian-leaning candidate.


© Copyright 2013 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.