Good as Gold

James Grant is the best-spoken and most accomplished hard-money man in Wall Street. Plenty of people can inveigh against the Fed; the publisher of Grant’s Interest Rate Observer can put it in the context of today’s financial markets and also in the context of history.

Grant is also a historian. He wrote a biography of John Adams, and another of financier Bernard Baruch, and several other histories, including my favorite among his works, The Trouble with Prosperity: The Loss of Fear, the Rise of Speculation and the Risk to American Savings (1996). His new book, “Mr. Speaker! Mr. Speaker!” should be of particular interest to readers of Liberty.

That’s not because of any deep interest I have in his nominal subject, Thomas B. Reed (1839–1902). Reed was a moderately interesting figure. He was fat, he ate fish balls for breakfast, and he spoke French. He was a congressman from Maine, first elected in the centennial year, 1876. He was a man of his time, a Republican stalwart who supported the tariff and gold and opposed foreign wars. Though he was for woman suffrage, he was no progressive, at least in the sense that today’s progressives are. He had no desire to teach other people how to live. “Pure in his personal conduct,” Grant writes, “he had no interest in instructing the impure.”

During the 1890s, Reed was speaker of the House. His claim to fame is how he changed the House rules to give himself, and the majority, much more power to get things done. Whether that was good or bad depends on your point of view.

Parts of the book are about Reed’s parliamentary maneuverings and also about his wit. These parts are well-written, but I was not much interested in them. The greater part of the book, however, is about the financial events and national political battles from 1870 or so to 1899, which I found very interesting.

Grant covers several such things, from the collapse of the Freedmen’s Bank to the stolen election of 1876. He has a particular interest in the currency. One story he tells is of Lincoln’s printing of greenbacks during the Civil War, and the struggle afterward over restoring the link to gold. In the ideology of the day, restoration was necessary. It was part of honorable dealing. But people dragged their feet about doing it, because a national debt had been piled up during the war, and what was the need to repay investors in gold if they had bought bonds with greenbacks? And besides, there was a boom on, and shrinking the money supply would spoil it.

The boom went bust in 1873. For the rest of the decade the country was arguing about the commitment to return to gold. The way in which this argument was conducted was much different from the way it would be today. Now gold is the money of the distant past. In the 1870s, Grant observes, “Gold was the money of the future”:

“A 21st century student of monetary affairs may stare in wonder at the nature of the monetary debate in mid 1870s America. A business depression was in progress. From the south, west and parts of the east arose a demand for a cheaper and more abundant currency. Yet such appeals were met with, and finally subdued by, a countervailing demand for a constant, objective and honest standard of value.”

“Resumption day” was Jan. 1, 1879. There followed “a mighty boom.” In the 1880s, prices fell 4.2% and wages rose.

This was the era of laissez-faire. “The government delivered the mails, maintained the federal courts, battled the Indians, examined the nationally chartered banks, fielded the army, floated the navy and coined the currency.” The big social program was Civil War pensions — for the Union side, not the Confederates. By the standards of the day, the Democrats were the small-government party and the Republicans were the big-government party, but policy differences were at the margin only.

The federal government was funded by the tariff, which the Republicans, and Reed, thought was good for American labor. The Democrats quoted the free-trade arguments of Frédéric Bastiat. “The Republicans,” Grant writes, “having no ready answer for Bastiat’s arguments, were reduced to pointing out that he was, indeed, French.”

The ideological atmosphere started changing in the 1890s. The Panic of 1893 brought on a severe depression, and another political battle over the currency. This time the supporters of gold battled it out with the supporters of silver, with bimetallists arguing for both at once.

Grant provides the best explanation of the gold-versus-silver battle I have read, especially the futility of having two metallic standards at the same time. Here he is not so proud of Thomas Reed, who was by then speaker: “One senses, reading him, that he was not quite sure what the gold standard was all about.” Reed’s position “lacked clarity, or, one might even say, courage. [President Grover] Cleveland had one unshakable conviction, which was gold. Reed had many convictions, only one of which — no free silver — was strictly nonnegotiable.”

Reed’s final battle was over war with Spain. He was against it. The new president, William McKinley, seemed to be against it, but lacked the courage really to oppose it. Ex-President Cleveland was against it, as was William Jennings Bryan, the presidential candidate whom McKinley had beaten in 1896. But the Congress was for it, and even with his self-enhanced power as speaker, Reed couldn’t block the war resolution of 1898. It passed the House 310 to 6.

A year later, Reed resigned. He wrote to a friend, “You have little idea what a swarming back to savagery has taken place in this land of liberty.” Publicly he said nothing. Three years later he was dead.

And so Grant ends his book. It is a fine book. I recommend it. Don’t be put off by any lack of interest you may have in Thomas B. Reed. I wasn’t much interested in him, either. You don’t have to be.


Editor's Note: Review of "'Mr. Speaker! Mr. Speaker!' The Life and Times of Thomas B. Reed, the Man Who Broke the Filibuster," by James Grant. Simon & Schuster, 2011, 448 pages.





Appleby's Revolution

One thing that has always struck me about the Communist Manifesto is that Karl Marx and Friedrich Engels held capitalism in awe. “The bourgeoisie, during its rule of scarce one hundred years, has created more massive and more colossal productive forces than have all preceding generations together,” they wrote in 1848. They listed the wondrous accomplishments one by one, from steam navigation to canal building, and “whole populations conjured out of the ground.” Of course, their praise preceded a call for workers to overthrow the great capitalist conjuror.

Joyce Appleby’s book, The Relentless Revolution: A History of Capitalism, conveys the same sense of wonder. She praises capitalism, saying that its “distinctive characteristic . . . has been its amazing wealth-generating capacities. The power of that wealth transformed traditional societies and continues to enable human societies to do remarkable things.”

Appleby certainly doesn’t recommend overthrowing the system. But her appreciation of capitalism diminishes as the book goes on, just as Marx’s and Engels’ did in the manifesto. At the start, The Relentless Revolution is like an exhilarating train ride, full of insights and a historian’s gold mine of information, but it loses steam and slows to a crawl once the Rubicon of the Industrial Revolution has been crossed. At the end, Appleby is praising the US government for trying to rein in the capitalist beast.

Appleby, who taught history at UCLA, describes herself as a “left-leaning liberal with strong . . . libertarian strains.” Given today’s academic history departments and her own attitudes, she deserves admiration for looking at capitalism as objectively as she can. “In this book,” she writes, “I would like to shake free of the presentation of the history of capitalism as a morality play, peopled with those wearing either white or black hats.” For half the book, she achieves that goal.

The Relentless Revolution fits into the recent parade of big books trying to explain the rise of the industrial West — books such as Jared Diamond’s Guns, Germs, and Steel, Kenneth Pomeranz’s The Great Divergence, and Ian Morris’ Why the West Rules — For Now. These and other authors are trying to answer a couple of giant questions: why did the West (not China, India, or another part of the world) make the transition from subsistence living to a world of constantly increasing productivity? And how exactly did it happen? As I shall explain, Appleby offers two major responses that I found valuable.

The pivotal moment in world history is normally called the Industrial Revolution (although perhaps that’s a misnomer, given its slow gestation). Many factors contributed to this revolution, and part of the “game” motivating the big books is to offer new factors. It’s generally agreed that the process flowered first in Great Britain, but the reasons may stretch back to such things as the European practice of primogeniture, the separation of powers between the Catholic Church and the state, and the system of rights spawned by feudalism under changing population pressures.

Appleby focuses on 17th and 18th-century England (why she gives short shrift to Scotland is a puzzle). She points to two reasons why the Industrial Revolution occurred there: England had higher wages than the Continent, and also cheap sources of coal. Higher wages provided the incentive to increase productivity, and coal provided the means.

The idea that England’s wages were higher is actually new to me and undoubtedly has a complex history of its own; Appleby merely says that England’s population growth had leveled off in the 17th century. As for coal accessibility, she agrees with Pomeranz, who says that China fell behind Europe partly because its coal deposits were harder to reach.

Whatever the precursors, actual inventions — especially of the steam engine — required talent and knowledge, and herein lies the first of Appleby’s distinctive insights. In a way that I haven’t seen before, Appleby integrates two related forces: the British tendency toward scientific theorizing (or “natural philosophy”), and the British tendency to tinker. “Technology met science and formed a permanent union. At the level of biography, Galileo met Bacon,” she writes.

She elaborates: “Because of the open character of English public life, knowledge moved from the esoteric investigations of natural philosophers to a broader community of the scientifically curious. The fascination with air pressure, vacuums, and pumps became part of a broadly shared scientific culture that reached out to craftsmen and manufacturers in addition to those of leisure who cultivated knowledge.”

The Royal Society illustrates this integration. Created in 1662, it was both practical and scientific. Its members studied that New World import, the potato, but it also “brought together in the same room the people who were most engaged in physical, mechanical, and mathematical problems.”

Appleby’s second valuable contribution is to take a cold-eyed look at the role of New World slavery in creating the markets that nurtured the Industrial Revolution. Indeed, like Pomeranz, she sees slavery as a major factor behind the engine of progress.

In The Great Divergence, Pomeranz says that the vast expansion of agricultural cultivation in the New World enabled Europe to avoid the “land constraint” that kept the Chinese from developing an industrial economy. And slave labor played an enormous role in the cultivation of sugar in the Caribbean and cotton in the United States. Thus, Pomeranz says, Europe overcame the limitations of land by colonizing the New World, which made agriculture possible to an extent unlikely in tiny Holland or England.

Appleby’s take is a little different. She emphasizes more the three-way trade that we learned about in middle school (slaves from Africa were taken to the New World, where sugar was purchased and taken to Great Britain, where finished clothing and other products were bought, to be sold in Africa). Pomeranz and Appleby are not arguing that slavery was profitable, or that it was “necessary,” but, rather, that it was a significant element in the system of trade that led to the Industrial Revolution. Enthusiasts for capitalism such as myself tend to focus on the “trade”; critics of capitalism — along with Appleby — stress the “slave” part of the trade.

Furthermore, Appleby considers American plantations and British inventions as two sides of the same coin. “These two phenomena — American slave-worked plantations and mechanical wizardry for pumping water, smelting metals, and powering textile factories — may seem unconnected. Certainly we have been loath to link slavery to the contributions of a free enterprise system, but they [the phenomena] must be recognized as twin responses to the capitalist genie that had escaped the lamp of tradition during the seventeenth century.”

Whether she is right about this, I don’t know, but she brings to public attention the periodic debate by economic historians over the role of slavery, a debate that Pomeranz reawakened with The Great Divergence.

Unfortunately, after these interesting observations, The Relentless Revolution begins to wind down. As the book moves on, Appleby tends to equate anything that takes industrial might — such as the imperialist armies that took possession of Africa in the late 19th century — with capitalism. “Commercial avarice, heightened by the rivalries within Europe, had changed the world,” she writes in explaining the European adventures in Africa. I dispute that. While Cecil Rhodes and Henry Morton Stanley may have been private “investors,” for the most part Africa was overcome by government power, not by individuals or corporations.

Even greater blindness sets in as Appleby’s 494-page volume moves into the recent past and she becomes influenced by modern prejudices. In Appleby’s view, the financial crash in 2008 was caused by financiers; the only contribution by public officials was to “dismantle the regulatory system that had monitored financial firms.” No word about the role of the Federal Reserve, the Community Reinvestment Act, or Fannie Mae and Freddie Mac.

Indeed, capitalism becomes the same as exploitation, in spite of her more balanced initial definition of capitalism as “a cultural system rooted in economic practices that rotate around the imperative of private investors to turn a profit.”

In her last chapter, Appleby writes, “The prospect of getting rich unleashed a rapacity rarely seen before in human society.” This extravagant statement does not jibe with archaeological (not to mention historical) evidence of the human past. In Why the West Rules — For Now, Morris reports on an archaeological excavation at Anyang village in China. In 1300 BC, this home of the Shang kings was the site of massive deaths. Morris estimates that, in a period of about 150 years, the kings may have killed 250,000 people  — an average of 4 or 5 a day, but probably concentrated in large, horrific rituals — “great orgies of hacking, screaming, and dying.” This was the world before capitalism.

To be fair to Appleby, she is saying that the productivity unleashed by capitalism extended the breadth of human rapacity, and to some degree the example of slavery supports that notion. Eighteenth-century British gentlemen could think high-minded thoughts while enjoying their tea with sugar grown by often brutally treated slave labor — which they may have invested in. “This is the ugly face of capitalism,” Appleby writes, “made uglier by the facile justifications that Europeans offered for using men until they literally dropped dead.”

True. At the same time, it was the British who abolished the slave trade, at high cost in patrolling the seas. Before that, Africans were enslaved only because other Africans captured and sold them to the Europeans. (Malaria and other diseases kept Europeans out of the interior of Africa until late into the 19th century.) There is a lot of blame to go around.

So this book is a mixed bag. Even though Appleby increasingly departs from objectivity as she proceeds into modern times, I respect her project and appreciate her insights. I hope that her “left-leaning” historian colleagues read her book and expand their understanding of the past — just as I have.


Editor's Note: Review of "The Relentless Revolution: A History of Capitalism," by Joyce Appleby. Norton, 2010, 495 pages.





Enforcers of Health

Libertarians have uneasy thoughts about the enforcers of public health. In my state, the public health authorities banned tobacco billboards — an act that some thought a violation of the First Amendment. This year, they pushed in the legislature to ban the sale of flavored cigars and pipe tobacco. In both cases, they have said that their aim is to make tobacco less attractive to children.

A century ago the fight was much more immediately vital. Then people of libertarian mind were fighting with the public health enforcers over the control of smallpox.

That disease was eradicated in the 20th century by campaigns of vaccination, many of them not entirely voluntary. In the United States, a turning point came in the epidemic of 1899 to 1902, when outbreaks of the disease prompted a political battle over compulsory vaccination. The story is told in legal writer Michael Willrich’s new book, Pox.

The push for compulsory vaccination grew out of the facts of the disease itself. It was a killer. Of people infected by the main strain of the disease, between 20 and 30% died. Smallpox was highly contagious, so that an infected person was a threat to everyone around him who was unvaccinated. First symptoms appeared more than a week after exposure, so fighting the epidemic by treating sick people was a strategy of being perpetually behind. The disease could, however, be stamped out by vaccinating the healthy.

For the progressives, the model was the Kaiser’s Germany. As Willrich says, “German law required that every child be vaccinated in the first year of life, again during school, and yet again (for the men) upon entering military service.” Germany had “the world’s most vaccinated population and the one most free from smallpox.”

In the constitutionalist America of the 1890s, vaccination was a state, local, and individual responsibility. Americans, writes Willrich, were “the least vaccinated” people of any leading country. The federal government had a bureau, the Marine-Hospital Service, to tend to illnesses of sailors; it had been started in the 1790s under President John Adams. The Service had experts in smallpox control, but they were advisory only and expected local authorities to pay for local work.

In a smallpox outbreak, the public health enforcers would set up a “pesthouse,” usually at the edge of town. They would scour the community for the sick, paying particular attention to the shacks and tenements of blacks, immigrants, and the rest of the lower classes, where the disease tended to appear first. People in these groups were subjected, Willrich says, to “a level of intrusion and coercion that American governments did not dare ask of their better-off citizens.”

Public-health doctors, accompanied by police, would surround tenement blocks and search people’s rooms. The sick would be taken to the pesthouse under force of law, and everyone else in the building would be vaccinated.

Often a community would have a vaccination ordinance that on its face was not 100% compulsory. It would declare that no child could be admitted to public school without a vaccination mark. For adults, the choice might be vaccination or jail — and all prisoners were, of course, vaccinated.

The procedure involved jabbing the patient with a needle or an ivory point and inserting matter from an infected animal under the skin. Typically it was done on the upper arm; often that arm was sore for a week or two, so that people in manual trades couldn’t work. Sometimes, vaccination made people so sick it killed them — though not nearly as often as the disease did.

Such was the background for the outbreaks of 1899–1902. At that time, two new factors emerged. First, America had just had a war with Spain, and had conquered the Philippines, Cuba, and Puerto Rico. All these places were rife with disease — and their status as conquered possessions, Willrich writes, provided “unparalleled opportunities for the exercise of American health authority.”

There the authorities had no worries about people’s rights: “The army’s sanitary campaigns far exceeded the normal bounds of the police power, which by a long American constitutional tradition had always been assumed to originate in sovereign communities of free people.” And coercive methods worked. They worked dramatically.

The second new thing in 1899 was that most of the disease spreading in the United States was a less-virulent form that killed only 0.5 to 2% of those infected. That meant that many Americans denied the disease was smallpox, and didn’t cooperate. The public health people recognized the disease for what it was, and they went after it with the usual disregard of personal rights — maybe even more disregard, because of “the efficiency of compulsion” overseas. And they stirred up, Willrich writes, “one of the most important civil liberties struggles of the 20th century.”

Their fight was with the antivaccinationists, whom Willrich calls “personal liberty fundamentalists” who “challenged the expansion of the American state at the very point at which state power penetrated the skin.”

He continues:

“Many antivaccinationists had close intellectual and personal ties to a largely forgotten American tradition and subculture of libertarian radicalism. . . . The same men and women who joined antivaccination leagues tended to throw themselves into other maligned causes of their era, including anti-imperialism, women’s rights, antivivisection, vegetarianism, Henry George’s single tax, the fight against government censorship of ‘obscene’ materials and opposition to state eugenics.”

Often, antivaccinationists denied that vaccination worked. Many were followers of chiropractic or other non-standard medicine; this was also the time of “a battle over state medical licensing and the increasing dominance of ‘regular,’ allopathic medicine.” Of course, the antivaccinationists were wrong about vaccination not working, but they were not wrong when they said it was dangerous. In 1902, the city of Camden, NJ, required all children in the public schools to be vaccinated. Suddenly children started coming down with lockjaw. They were a small fraction of those who had been vaccinated, but all of them fell ill about 21 days after vaccination. Several died, and the press made a scandal of it.

Under the common law of the day, the vaccine maker — the H.K. Mulford Co. — was not liable to the parents, because it had no contract with them. Nor was the government liable, though the government had required the treatment. Thus, writes Willrich, “The arm of the state was protected; the arm of the citizen was not.”

The medical and public health establishment initially denied that the lockjaw had been caused by the vaccines. This was admitted only after a consultant to a rival vaccine maker, Parke-Davis, made an epidemiological case for it. Congress responded by quickly passing the Biologics Control Act of 1902, which ordered the inspection and licensing of vaccine producers, and required them to put an expiration date on their products. It was one of the first steps in the creation of the regulatory state.

The act calmed the public and drove some smaller companies out of business. The first two licensees were Parke-Davis (later absorbed by Pfizer) and Mulford (absorbed by Merck). And the purity of vaccines, which had been improving already, improved dramatically in the following decade.

The antivaccinationists turned to the courts in a fight about constitutional principles. The argument on the state’s side was that it had a duty to protect citizens from deadly invasion, which might be launched by either an alien army or an alien army of microbes. The argument on the other side was the individual’s right to control what pathogens were poked into his body, and, as the antivaccinationists’ lawsuit contended, his right to “take his chance” by going bare in a time of contagion.

They brought their case to the Supreme Judicial Court of Massachusetts, and lost. Citing the power of government to quarantine the sick and conscript soldiers in war, the court said, “The rights of individuals must yield, if necessary, when the welfare of the whole community is at stake.” Then they appealed to the US Supreme Court. In the same year in which the libertarian side won a landmark economic-liberty ruling in Lochner v. New York (1905), it lost, 7–2, in the vaccination case, Jacobson v. Massachusetts. Writing for the court, Justice John Harlan compared the case to defense against foreign invasion, and wrote, “There are manifold restraints to which every person is necessarily subject for the common good.”

It is an unlibertarian answer. The author of Pox thinks it was the right answer — and so do I, in this case. Smallpox was indeed a kind of invasion. In the 20th century it killed 300 million of Earth's people, and disfigured millions more. And though public health people can make similar arguments, and they do, about tobacco as a mass killer, I do not support their efforts to stamp out its use by ever more state intrusion. The difference is that with smallpox the individual’s refusal to be vaccinated put others in imminent danger of death. Collective action — compelled collective action — was the only way to defeat the invader. None of this is the case with tobacco.

Naturally, the progressives wanted to apply their wonderfully practical idea to all sorts of things. Referring to epidemics and the germ theory of disease, Willrich writes, “Progressives took up the germ theory as a powerful political metaphor. From the cities to the statehouses to Washington, the reformers decried prostitution, sweatshops, and poverty as ‘social ills.’ A stronger state, they said, held the ‘cure.’ ”

The antivaccinationists argued that compulsory vaccination would lead to other bad things. They made some fanciful statements; one wrote, “Why not chase people and circumcise them? Why not catch the people and give them a compulsory bath?” But they were right. Four years later, in 1906, Indiana enacted America’s first compulsory sterilization law. This was done in pursuit of another scientific, society-improving cause that excited progressives: eugenics. In 1927, in Buck v. Bell, the Supreme Court approved Virginia’s law of compulsory sterilization of the “feeble-minded.” That was the case in which Justice Oliver Wendell Holmes — supposedly the great champion of individual rights — pontificated: “The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes. Jacobson v. Massachusetts, 197 U.S. 11. Three generations of imbeciles are enough.”

Jacobson has been cited in many Supreme Court decisions, usually to approve state power. And the applications have not been confined to medicine. In 2004, Justice Clarence Thomas cited Jacobson in his dissent in Hamdi v. Rumsfeld, which was about a US citizen captured in Afghanistan. Thomas, often regarded as the most libertarian of the current justices, was arguing that Yaser Esam Hamdi had no right of habeas corpus and could be held on US soil indefinitely without charge.

The smallpox epidemic of 1899–1902 was a pivotal event institutionally as well as legally. The Biologics Act created more work for the Marine-Hospital Service, which a few years later became the US Public Health Service. The service’s Hygienic Laboratory, which was empowered to check the purity of commercial vaccines, later grew into the National Institutes of Health.

From the story told in Pox, you can construct rival political narratives. The progressives’ “we’re all in it together” formula about the need for science, federal authority, officials, regulation, and compulsion is there. And in the case of smallpox, progressivism put its strongest foot forward: it was warding off imminent mass death. Smallpox is one of the least promising subjects for the libertarian formula of individual rights and personal responsibility. Yet here you can also find the bad things that libertarian theory predicts: state power used arbitrarily and unevenly, collateral deaths, official denial of obvious truths, a precedent for worse things later on, and even the favoring of large companies over small ones.

I don’t like the progressive formula. But in the case of smallpox it fits, and I accept it, looking immediately for the limits on it.

Fortunately for liberty as well as health, few other diseases are as contagious and deadly as smallpox. Even Ebola and SARS — two contagious and deadly diseases of recent decades — were mostly fought by quickly isolating the sick. State power was necessary — some people were forcibly quarantined — but there were no mass vaccinations of the healthy.

Still, vaccination surfaces as an issue from time to time. It is, on its face, a violation of the rights of the person. So is quarantine. And yet it is prudent to allow both of them in some situations.

That is an admission that the libertarian formula doesn’t work all the time. Liberty is for rational adults in a minimally normal world. That is a limitation, but not such a big one. Surely it is better to design a society for people who can think, make decisions, and take responsibility, while in a few cases having to break those rules, than to live in a world designed for people defined as children of the state.


Editor's Note: Review of "Pox: An American History," by Michael Willrich. Penguin, 2011, 400 pages.





Individualism in Real Life

Bethany Hamilton is one of those perfect libertarian heroes. When she wants something, she goes after it. When things go wrong, she looks for a way to make it right. She doesn't whine or complain, even when the thing that goes wrong is the horror of a shark taking off her arm. She relies on herself, her family, and her God.

The movie about Bethany, Soul Surfer, has its predictably maudlin moments, fueled by Marco Beltrami's heavily emotional musical score, but don't let that put you off. If you are looking for a film to demonstrate libertarian principles to your friends, take them to Soul Surfer.

The film is based on the true story of Bethany, a competitive surfer with corporate sponsorship who was on her way to professional status when a huge shark bit off her arm. She returned to competitive surfing within a matter of months, and is now a professional surfer. She also seems to be a really nice girl. I learned that not only from the film, but also from the articles I have read about her.

And the Hamiltons seem to be a model libertarian family. They ignored traditional middle-class expectations in order to follow the dreams they made for themselves. All of them, parents and children alike, live to surf. When Bethany showed a great aptitude for surfing, her parents opted out of the government school system and educated her at home so she could take advantage of daytime surfing. After her injury, they did for her only the things she absolutely could not do for herself, encouraging her quickly to find new ways of managing her "ADLs" (activities of daily living).

The film portrays the Hamiltons (Dennis Quaid and Helen Hunt, parents) as a close-knit family within a close-knit community of surfers who understand the true nature of competition. True competition isn't cutthroat or unfair. In fact, unfettered competition has the power to make every person and every product better. Even Bethany's surfing nemesis, Malina Birch (Sonya Balmores), is portrayed as competitively solid. After she paddles full out toward a wave during a competition instead of kindly slowing down to allow for Bethany's injury, Bethany (AnnaSophia Robb) thanks Malina for treating her as an equal and pushing her to be her best. It's a great example of the good that competition can accomplish.

Instead of turning to government support groups to help her deal with her injury, Bethany turns to three free-market sources: family, business, and religion. When she returns to surfing, she rejects the judges' offer to give her a head start paddling past the surf. Instead, her father designs a handle to help her "deck dive" under the waves. When finances are a problem, a news magazine offers to provide her with a prosthetic arm in exchange for a story, and a surfboard manufacturer sponsors her with equipment and clothing. The youth leader at her church (Carrie Underwood) gives her a fuller perspective on her life by taking her on a service mission to Thailand after the tsunami that hit in 2004. There she learns the joy of serving others — a kind of work that earns her psychic benefits rather than monetary rewards. She isn't "giving back"; she is "taking" happiness.

These examples of self-reliance and nongovernmental solutions to problems raise the level of this emotionally predictable film to one that is philosophically satisfying — and well worth seeing.


Editor's Note: Review of "Soul Surfer," directed by Sean McNamara. Sony Pictures Entertainment, 2011, 106 minutes.





Atlas at Last


The John Galt Line has finally pulled out of the station, and is barreling across the country picking up hitchhikers who may be wondering what all the fuss is about. After a spectacular pre-release advertising campaign that included multiple premieres in major cities and pre-purchased ticket sales that encouraged nearly 300 screen owners to give the film a chance, Atlas Shrugged Part 1 opened on April 15 (Tax Day) as the third highest-grossing film of the weekend (looking only at screen averages, not total sales).

"Mixed" is an understatement when describing the reviews. Professional critics on Rotten Tomatoes gave it a 6% approval rating, perhaps the lowest rating I have ever seen for a film. Meanwhile, audiences gave it an unbelievable 85%.

In fact, the film doesn’t deserve either rating. It belongs somewhere in the low middle.

It is not as good as the 85% would indicate; audiences who saw it on opening weekend are rabid fans, bent on a mission to have everyone in America see the film and learn Ayn Rand's philosophy: free markets, free thinking, and self-reliance.

But it doesn't deserve the godawful 6% usually reserved for low-budget slasher flicks, either. It is not as bad as its low budget and relatively unknown cast of actors and producers would cause one to expect. It is respectable.

The cinematic quality is quite good, especially the outdoor scenes of Colorado and the special effects used to create the train and the bridge. The acting isn't bad, but it isn't great. Often I was painfully aware of Taylor Schilling being painfully aware of where Dagny should place her arm, or how Dagny should turn her head; I never felt that she embodied Dagny. Similarly, the background cast at the Reardens' anniversary party appeared to be made up of friends and family of the cast and crew (someone needed to teach them how NOT to mug for the camera).

For fans of Ayn Rand and Atlas Shrugged Part 1, the brightest compliment for this film is that it stays true to the first third of the book. (Parts 2 and 3 are expected to follow.) For fans of filmmaking, however, the biggest problem is that it stays true to the book. The film is dialogue heavy, with very little action.

I’m not a Hollywood film reviewer; but I’m a watcher and a reader. I know that books and films are two different genres, and their stories have to be presented in two different ways. Books are primarily cerebral; films are primarily visual. Books can focus on philosophy and conversation; films must focus on action. Books can take days or weeks to read; films must tell their story in a couple of hours. When adapting a book to film, streamlining is essential. Unfortunately, the words in this film are so dense that the ideas become lost.

Atlas Shrugged Part 1 contains some great quotations, but it is not a film that will convince anyone but the Rand faithful of the supremacy of the free market. It makes the same mistake that most libertarians do when espousing philosophy: it assumes that everyone already sees the problems in the way libertarians do. It does not sufficiently engage the non-business person in seeing the long-term effects for everyone when government intervenes in the market. I can hear my middle-class neighbors and colleagues saying "So what?" when Rearden (Grant Bowler) is forced to sell all but one of his businesses. "How is that going to hurt me?" they might wonder.

Even the conflict between Dagny's pure free-market economics and her brother James's (Matthew Marsden) collusion with government is insufficiently portrayed; Dagny seems to be simply getting around the stockholders when she takes over the John Galt Line. Moreover, she and Rearden can hardly be seen as icons of virtue when they violate a freely made and morally binding contract (his marriage vows) by jumping into bed together. Even more damning is Ellis Wyatt's decision to burn his oil fields rather than let anyone else enjoy the fruits of his labor. My middle-class neighbors would howl with outrage at this decision. In short, I don't see how this film will convince anyone that returning to free-market principles will improve our economy and our way of life. It seems like everyone in the film is cutting moral corners somewhere.

"Not bad" is faint praise for a movie that has been 50 years in the waiting. Unfortunately, business pressures caused it to be rushed through with only five weeks in the writing, and another five weeks in the filming. Business is often an exercise in compromise, and this film's production is a classic example. I think, however, that if The Fountainhead's Howard Roark had been the architect of this film, it would have been burned along with Ellis Wyatt's oil fields. It's good, but not good enough.


Editor's Note: Review of "Atlas Shrugged Part 1" (2011), directed by Paul Johansson. The Strike Productions, 2011, 97 minutes.





How to Unblock Your Writing


Wouldn't it be great to have limitless access to all the cells in your brain? To have a Google feature of sorts that would allow you to immediately call up just the right fact or memory that you need at any given moment, and the ability to synthesize and analyze all that information instantly?

That's what the drug NZT does for characters in the film Limitless, a mystery-thriller in theaters now. Eddie Morra (Bradley Cooper) is a sci-fi writer with a contract for a book, but a debilitating writer's block has prevented him from writing a single word. His life is a mess, his apartment is a mess, he's a mess, and his girlfriend Lindy (Abbie Cornish) has just broken up with him because of it.

Then he meets an old friend, Vernon Gant (Johnny Whitworth), who gives him a tab of NZT. Within minutes Eddie experiences a rush of insight and intelligence. He can remember books he thumbed through in college, television shows he saw in childhood, and math equations he learned in high school. Within hours he has cleaned his apartment, resolved his back rent, and written 50 scintillating pages of his novel. But the next day, he is back to being the same sloppy Eddie who can't write a single word. More NZT is in order.

If this story line sounds familiar, it is. Daniel Keyes explored this idea of artificially stimulated intelligence in his "Flowers for Algernon," which was later made into the movie Charly, starring Cliff Robertson. Phenomenon, a John Travolta film, uses the same premise. Even the spy spoof television show "Chuck" uses a similar premise when the title character is able to access information stored in "the intersect," as though his brain were a remote computer. What makes this film stand out, however, is its jazzy musical score, witty script, engaging mystery, and skillful cast, not to mention its unexpected ending.

The film begins with the resounding thump of a sledgehammer against a metal door that jars the audience out of its seat. The throbbing musical score (by Paul Leonard-Morgan) drives the story forward, lifting the audience into a feel-good mood.

Eddie's brain on NZT is simulated artfully for the audience through special camera effects that make Eddie's consciousness seem to speed not just down the road but also through cars, down sidewalks, into buildings, and out of them again at dizzying roller-coaster speeds. When he begins to write, letters appear from his ceiling and drop like rain into his room. Later, when he starts using his newfound skill to make money in the stock market, his ceiling tiles become a ticker tape, composing themselves into the stock symbols that he should buy. Intensified color is also used to portray his intensified awareness; even Cooper's intensely clear blue eyes add to his character's altered sense of reality. These techniques are very effective in creating a sense of what Eddie is experiencing.

The story's suspense is driven by Eddie's shady relationships with a drug dealer (Whitworth), a loan shark (Andrew Howard), a stalker (Tomas Arana), and an investment banker (Robert De Niro). Eddie cleverly draws on all the memories stored in his brain to thwart the bad guys, but when he unwittingly comes across a dead body, he reacts in the way a normal person would — completely terrified, knocking over a chair as he collapses, then hiding in a corner with a golf club for protection as he calls the police. It's refreshing to see a character react as we probably would, instead of displaying unrealistic aplomb.

Limitless is a surprisingly entertaining film, with its fast pace, skilled cast, creative camera work, and interesting plot. Well worth the price of a ticket.


Editor's Note: Review of "Limitless," directed by Neil Burger. Many Rivers Productions, 2011, 105 minutes.





Much More Than Moore


The hardest part of making a film is not writing the script, hiring the cast, choosing the locations, planning the shots, or editing the footage down to a moving and entertaining feature that tells the story in under two hours. The hardest part of filmmaking is finding the funding. It takes money to make a movie. Lots of money.

Ideally, the consumers (moviegoers) should pay for the product (the movie on the screen). And ultimately, they do, $10 at a time. But filmmakers need money upfront to make the product. Piles and piles of money. This is just Capitalism 101 for libertarians, and it makes me stare in disbelief when Americans glibly criticize the capitalist system for being corrupt and selfish. What could be less selfish than deciding to forego current consumption in order to invest in someone else's dream?

From the earliest days of filmmaking, films have been financed in several ways: using personal funds, either from one's own pocket or that of a rich friend or relative; applying for business loans; studio investment; and selling product placement. In recent years, product placement has become increasingly important as a way to fund the burgeoning cost of producing a movie, where a million dollars can be considered little more than chump change.

Morgan Spurlock, the new darling of the political-agenda documentary, exposes the process of selling embedded advertising in his new film, The Greatest Movie Ever Sold, which opens later this month. But, as I said, product placement is nothing new. From the start, radio programs and TV shows were "brought to you by" a particular sponsor; product placement was simply a way of getting the product into the show itself. Today product placement is a multibillion-dollar undertaking. Also called "co-promotion" and "brand partnering," this marriage of convenience provides money for the movie makers and brand recognition for the product. According to the documentary, businesses spent $412 billion last year on product placement ads, from the Coke glasses placed in front of the judges on American Idol, to the Chrysler 300s driven by Jack Bauer on 24 (after Ford withdrew its F-150s), to the kind of phones that Charlie's Angels carry.

The film is informative, intelligent, and laugh-out-loud funny, largely because of Spurlock's dry, self-deprecating humor as he goes about looking for sponsors for his film, which is simply a movie about Spurlock looking for sponsors for his film. Where Michael Moore made his mark in documentaries by humiliating his subjects through ambush journalism, Spurlock is gleefully upfront about what he is doing, treating his subjects with seriocomic respect and appreciation.


Spurlock doesn't just walk into business meetings unprepared, and beg for money. He does his homework, as good filmmakers (or any salesperson) should. He begins with a psychological evaluation to determine his "Brand Personality," which helps him identify what kinds of products would be a good fit for his film. Not surprisingly, his brand personality is "mindful/playful," so he looks for products whose makers think of themselves as appealing to consumers who are mindful and playful. He arrives at meetings with high-quality storyboards and mockups to make his pitch. He listens carefully to the producers and accommodates their concerns. After all, if their needs aren't met, they won't fund the film. They are his consumers as much as the ticket buyers at the multiplex will be.

The film is liberally peppered with products, all of them described, worn, eaten, or presented with Spurlock's respectful glee. We all know we're being had, but he does it so openly that he makes us enjoy being had. Even his attorney is a product placed in the movie; after discussing a contract, Spurlock asks how much the consultation will cost him, and the attorney replies, "I charge $770 an hour. But the bigger question is, how much is it going to cost me to be in your movie?" (I wrote the attorney's name in my notes, but I'm not repeating it here. He hasn't paid Liberty anything to be mentioned in our magazine . . .)

Spurlock likens his movie to a NASCAR racer, and accordingly wears a suit covered in his sponsors' logos for interviews. The official poster shows his naked body tattooed with the logos, with a poster board of the film's title strategically placed across his crotch.  (Nudity sells, but I guess his manhood didn't pay for product placement.)

The film is funny but also informative. Despite Spurlock's gleeful presentation, he offers many serious ideas about product placement in movies and about advertising in general. For example, he discusses the potential loss of artistic control when the sponsoring company wants things done a certain way. This isn't new; Philip Morris reportedly told Lucy and Desi they had to be seen smoking more frequently on "I Love Lucy," the most popular show of the 1950s, and they complied. A filmmaker has to weigh the money against the control, and decide how much to compromise.

Truth in advertising is also discussed. Spurlock visits São Paulo, Brazil, where outdoor advertising has been completely banned by a new "Clean City Law." Now store owners rely more heavily on word-of-mouth referrals for new customers, which may indeed be a more honest form of testimonial, but highly inefficient — and inefficiency is generally passed along to consumers in the form of higher prices. In the film, local Brazilians glowingly praise their ability to "see nature" now that the billboards are gone, as Spurlock's cameras pan to the high-rise buildings that overpower the few shrubs and trees in the downtown area and block the view of the sky. Subtle, and effective.

Spurlock also interviews several people to get their opinions of truth in advertising. Ironically, one of the interviewees has bright magenta hair taken from a bottle, another has the unmistakable ridge of breast augmentation, another is wearing a sandwich board advertising a nearby store, while a fourth is pumping gas at the chain that has made a brand-partnering deal with Spurlock. Once again Spurlock is making gentle fun of his subjects, and we laugh gleefully along with him. (But I'm still not willing to reveal the name of the gas station until they pony up with some advertising money for Liberty.)

The Greatest Movie Ever Sold may not be the greatest documentary ever made, but it is mindful and playful, like its maker. If it comes to your town, don't miss it.


Editor's Note: Review of "The Greatest Movie Ever Sold," directed by Morgan Spurlock. Snoot Entertainment/Warrior Poet, 2011, 90 minutes.





Bones Crunch!


Liam Neeson fairly burst onto the big screen in 1993 with his compelling performance as Oskar Schindler, the man who saved over a thousand Jews from Nazi execution, in the Oscar-winning Schindler's List. It wasn't his first film by any means, but it was his first big film, and it garnered him an Oscar nomination for best actor. From there his career turned in the direction one would expect for an Irish-born, classically trained actor with a resonant voice and proclivity for accents. He played characters with stature: Rob Roy, Michael Collins, Alfred Kinsey, Jean Valjean, the god Zeus. He was the voice of Aslan. He also had fatherly, mentoring roles in such films as Batman Begins and Star Wars Episode I.

So how did this stately-but-slightly-sagging, now-middle-aged man suddenly morph into an action figure? A figure who has become a number one draw at the box office?

It started with Taken (2008), a film in which he plays a father determined to rescue his kidnapped daughter. Not an unlikely reach for a man his age — except that his character, Bryan Mills, is a retired CIA agent who is highly trained in combat and espionage. Taken was 10% distraught father's angst and 90% thrill ride, with enough hand-to-hand combat, gunfights, dead bodies, and car chases in a 93-minute thriller to satisfy the most avid video game player. (And that's what many of these new thrillers have become: video games without the controllers.)

From there Neeson has voiced a character in an actual video game (Fallout 3), and fought the bad guys with The A-Team. Now he is taking on a horde of assassins in the new psychological conspiracy thriller, Unknown.

With a more engaging plot than most action movies, Unknown offers a satisfying evening's entertainment. The story has numerous unexpected twists and subtle clues, with enough red herrings to keep even this staid reviewer off balance. Neeson plays Martin Harris, a biochemist arriving in Berlin with his beautiful wife (January Jones) to present a paper at a scientific conference. When his briefcase is accidentally left behind at the airport, he grabs a cab to retrieve it and ends up in the river when the driver swerves to avoid some falling debris. By the time Harris returns to the hotel, after spending four days in a coma, another man has taken his place, and his wife does not recognize him. Creepy men start following him. Cars race through traffic with guns blazing. Bodies crash through walls in the throes of battle. Bones crunch. Bullets fly. Blood flows. Necks snap. And in the middle of it all, Liam Neeson, looking more like Al Bundy than James Bond, is the unlikely action hero.

I don't get it. But I like it.


Editor's Note: Review of "Unknown," directed by Jaume Collet-Serra. Warner Brothers, 2011, 113 minutes.





More Than Just a Pretty Film


The Illusionist is a lovely animated movie by French filmmaker Sylvain Chomet — a movie that, despite its beauty, has a disturbing message.

Its leading characters are a kindly vaudeville magician and the young working girl whom he befriends. The story is sweet and full of pathos, as the older gentleman sacrifices his own comfort and well-being to please the girl. Appropriately, the film is drawn in the soft-edged, old-school style that predates Pixar. Its French pedigree is obvious, from its watercolor backgrounds and exaggerated, non-realistic faces to its impressionistic musical score. The characters communicate with each other through a combination of mime and an odd pseudo-language reminiscent of the way adults speak in the old "Peanuts" TV specials. This adds to the dreamlike quality of the story, although it can be off-putting to those who aren't fans of French animation.

Based on a story by Jacques Tati (1907–1982), the famous French filmmaker, The Illusionist is intended to show the deep father-daughter connection between a lonely old man and an equally lonely young girl. Metaphorically, however, the film offers a powerful, though certainly unintentional, warning look at the relationship between the working class and the welfare class. The magician's relationship with the young cleaning girl begins innocently and sweetly. When her bar of soap slips away from her while she is cleaning the floors, he picks it up and "magically" turns it into a fancy box of perfumed hand soap, offering it to her with a flourish. She is thrilled. The next day she washes his shirt to show her appreciation, and he "magically" produces a coin from her ear to thank her — the way kindly uncles do when they visit little nieces and nephews. Noticing that the sole of her shabby shoe is flapping wildly as she walks, he buys her a pair of bright red shoes.

Before long the magician's gig at the local vaudeville theater ends, and he must move on to the next town. Without being invited, the girl follows him. When the conductor asks for her ticket, she points to the old man, miming her expectation that he will produce a ticket for her out of thin air. Not wanting to disappoint her, the poor man complies, again with a magical flourish. Throughout the rest of the film the girl stays with the man, pointing to new goodies that she wants — a new coat, high-heeled shoes, a new dress, and a coin from her ear every time they part. The man takes on extra jobs to pay for her increasing demands. He sleeps on the couch so she can have the single bedroom in his tiny apartment. Sadly, the girl never catches on to what is happening to the man. You can probably guess where this leads. Small-time magicians, like golden geese, eventually give out.

The film offers a powerful demonstration of what has happened to a whole generation of people who have grown up under the welfare state. They have no idea where money comes from, or how to earn it. They turn to the government for housing, food stamps, education, medical care, and even entertainment in the form of parks and recreation. They seem to think that money can appear out of thin air, and that people who work owe them all the goodies they want. Like the man in the film, tax-paying Americans are becoming threadbare and exhausted. The demands on them are too many, and they're tired of not being appreciated for meeting those demands. At some point they are going to stop working — also like the man in the film. What then?

A friend who teaches middle school in the Bronx asked her students to write an essay about what they want to be when they grow up — pretty standard fare for a middle-school essay. One young man wrote about going to college, becoming a lawyer, and representing clients in court. "I'll make a lot of money, and I'll wear nice suits and carry a briefcase," he dreamed. But he ended his essay with this chilling observation: "If I do that, I'll probably earn too much money and I'll lose my housing and food stamps. So maybe that's not such a good idea." What a self-defeating decision! Yet I see that idea in practice every day as I work with people from Yonkers and the Bronx. They are so afraid of losing their tiny apartments in crumbling buildings on potholed streets in seedy neighborhoods that they won't even consider moving to a different state with a lower cost of living, where they could get a job and provide for their families themselves.

How surprising, that the demise of the American dream would be so skillfully and artistically presented in the form of a French animated film. It is well worth sharing with friends as a cautionary tale of pending disaster.


Editor's Note: Review of "The Illusionist," directed by Sylvain Chomet. Pathé-Django, 2010, 90 minutes.





Assessing the Bailouts


Here are some libertarian warnings about the government rescue of General Motors:

“The current restructuring plan calls for the U.S. Treasury Department to have controlling interest in General Motors, which amounts to absolute nationalization. In GM's headquarters in Detroit there is a cluster of bureaucrats from the government's task force telling GM how to run its business. . . . Furthermore, the White House fired General Motors Chairman and CEO Rick Wagoner. When the executive branch intervenes in a private business and ousts management, bailout or not, it is a staggering violation of the American ideal of free enterprise. This sets a precedent for unlimited government trampling over the private sector. On March 30th, Obama said, ‘Let me be clear. The United States government has no interest in running GM. We have no intention of running GM.’ If that's the case — and we know it's not — then why scoop up majority ownership?” (Karen DeCoster, LewRockwell.com, May 23, 2009)

And:

“Will GM be run as profitably and efficiently as Amtrak? Will GM be paid not to produce, like the agricultural sector? Will it feed into an economic bubble like Fannie Mae and Freddie Mac? Will it boast the negligible oversight and waste of the so-called ‘stimulus’ package? Will it feature the fiscal irresponsibility of Social Security? Or will we see the runaway costs of Medicaid?” So many options. (David Harsanyi, Reason.com, June 3, 2009)

And:

“GM’s bankruptcy announcement today . . . might well be remembered as the company’s last act of capitalism. If GM emerges from bankruptcy organized and governed by the plan created by the Obama administration, it is impossible to see how free markets will have anything to do with the U.S. auto industry. With taxpayers on the hook for $50 billion (at a minimum), the administration will do whatever it has to — including tilting the playing field with policies that induce consumers to buy GM or hamstring GM’s competition or subsidize its costs — in order for GM to succeed.” (Daniel Ikenson, Cato@Liberty, June 1, 2009)

And:

“The more the Treasury lends, the more GM will come to be Government Motors. In Obama's America, that will mean becoming a development project for electric cars, plug-in hybrids, or whatever it is the politicians want. The result might be wonderful, but the history of state-controlled companies suggests boondoggles are more likely. The best thing is to let GM do what failing companies have always done: reorganize if possible, liquidate if necessary. . . . Let it go, and let investors put their money where the odds are better.” (Bruce Ramsey, The Seattle Times, April 1, 2009)

That last excerpt is, of course, by me. I didn’t condemn the firing of Rick Wagoner, because I would have fired him, too. But on the main point — that so much government money risked turning GM into a money-losing government pet — I agreed.

It hasn’t happened that way. As I write, the new GM has had three quarters in the black, the latest one earning five and a half cents of operating profit on each dollar of sales. That’s quite respectable. In November 2010, GM became a public company again with a stock offering. GM planned to offer the stock at $26 to $29, depending on the market, but demand was so strong that it sold the shares at $33; and as I write, the stock is trading above that. The stock offering reduced the Treasury’s stake in GM from 60.8% to one-third. If the US government can sell its remaining stock at $53, it will break even.

Chrysler hasn’t done as well, but it is also in the black, and looks like it will make it.

Of course, disaster could yet strike and turn the pessimists into prophets. But let’s be honest: so far, the affair has turned out much better than we feared. We ought to know why, especially if we expect to oppose the next bailout.

One place to start is Steven Rattner’s book, Overhaul, subtitled An Insider’s Account of the Obama Administration’s Emergency Rescue of the Auto Industry. Rattner, 58, was President Obama’s “car czar” on the GM and Chrysler rescues. Rattner is a Wall Street guy; he was a founder and partner of the Quadrangle Group, a leveraged buyout firm. Before that he was a reporter for the New York Times, which explains why his book reads so well.

Of course, you are free not to believe his account. I haven’t seen it contradicted, so I take it at face value. Reasons may come along to change that. So far they have not.

I don’t share Rattner’s politics. He is an Obama man. He has raised money for Democrats, and his wife Maureen is a former national finance chairman of the Democratic Party. He says in the book that he is a Democrat because “the Republicans had favored the rich at a time of growing income inequality, abandoned fiscal responsibility, and held unfortunate positions on social issues such as a woman’s right to choose abortion.”


As “car czar,” Rattner was leader of Team Auto, a group of people, mostly from outside government, whom he recruited. His bosses were Larry Summers, President Obama’s chief economic adviser; and Tim Geithner, Obama’s treasury secretary. Rattner says his task “was not designed to further a particular economic theory,” but was strictly to save those parts of GM and Chrysler that could be made economically viable. He says that in his meeting with Obama on March 26, 2009, the president told him, “I want you to be tough and I want you to be commercial.”

Rattner focused on GM. When his work began, in January 2009, the company already had been given billions by the Bush administration and was burning through it at a bonfire rate. The economy was in panic. The Dow Jones Industrial Average was shriveling to 6,600, the figure it would touch in early March. GM was no longer a viable enterprise. On his judgment of GM, Rattner sounds much like the libertarian accountant Karen DeCoster, who had been saying on LewRockwell.com for several years that GM was an ongoing industrial disaster.

Rattner is no partisan of unions; he recalls his days at the New York Times when the Newspaper Guild defended featherbedding and incompetents. But he does respect unions, and he doesn’t blame them for what happened at GM. It’s management that runs a company. As a leveraged buyout guy, he had expertise in firing, hiring, and evaluating managers — and at GM he was appalled by what he saw and heard.

“From Wagoner on down,” he writes, “GM seemed to be living in a fantasy.” To executives, the problem was entirely the recession, not the GM product line or the way the company was run. GM management’s idea, Rattner says, “seemed to be to trim only what was easily achieved and absolutely necessary, and use taxpayer dollars to ride out the recession.”

But GM had been bleeding market share continually for 33 years. After rebates, its cars sold for thousands of dollars less than their European and Japanese counterparts — a market measure of the people’s esteem for the products. For management not to see that was a kind of sickness. GM had a culture of excuses, of institutionalized failure.

Rattner’s killing comment: “No one at GM thought like an owner.”

Team Auto’s idea was that government money would come only with big changes to make GM viable, including sacrifices by management, labor, bondholders and creditors, and with a major mind readjustment at the top. The place to begin was the CEO.

“After nearly a decade of experience as a private equity manager, I believe in a bedrock principle of that business: put money behind only a bankable management team. To my mind, no private equity firm on the planet would have backed Rick Wagoner or GM’s current board. . . . A CEO who leads a company into a state in which its only recourse is a government bailout shouldn’t be in his job.”

And that’s right. There was a hullabaloo in the press because the government fired Wagoner. So it had; but the government was the investor, and an investor may put conditions on his investment. He who pays the jukebox calls the tune.

The critics wanted GM to be put into bankruptcy — either Chapter 11, in which creditors are partly stiffed and the company survives, or Chapter 7, in which the creditors are given the carcass. The critics tended to gloss over the fact that each of these procedures involves the government, in the form of a federal bankruptcy court. Each involves the breaking of private contracts. The difference is that in a bankruptcy the decisions are made by a judge (appointed by other judges) rather than by a politician or direct political appointee. The other difference is that a bankruptcy judge does not have a pipeline to public money.

Those are important differences, but it’s still the government.

An ordinary Chapter 11 would have taken perhaps 18 months, and Rattner says there was no way GM could get debtor-in-possession financing from a bank for such a period. Wagoner had waited too long, and GM was too weak. Wagoner told Team Auto that once the company was in bankruptcy, he believed nobody would buy its cars, because customers would be worried about warranty coverage. Because of that, Wagoner said, he had done no preparation for a Chapter 11. The warranty problem was real, but this was also a self-excusing argument from a guy lining up for a bailout.

There was a different kind of Chapter 11 called a Section 363 sale. It had never been tried on an industrial company the size of GM or Chrysler. A Section 363 allowed Team Auto, the new investor, to fashion a new GM from the assets of the old, leaving creditors to fight over the bones of Old GM. In that way, Team Auto brought out what Rattner calls “Shiny New GM” in a little more than a month.

Maybe that tells something about bankruptcy. Critics write as if bankruptcy were a fixed thing, like the Ten Commandments. But it is a human institution, and may be changed so that it works better or worse.

In the GM case, the stockholders would get nothing, which is what they would have gotten anyway. The bondholders would get a share of New GM worth about 35 cents on the dollar — an amount Rattner says was more than they would have received in a straight liquidation and therefore too much, because the government wasn’t intervening on account of them. The workers’ medical benefit trust would get a piece of the new company in exchange for releasing New GM from all of its retiree medical obligations, which, unlike pensions, are not guaranteed by the Pension Benefit Guaranty Corp. The United Auto Workers would get a seat on the GM board. The Canadian government, also investing capital, would get a 12% stake.

And the US Treasury would get almost 61%. That was a problem. Contrary to all the muttering about Obama being a socialist, the president didn’t want the government to own GM, nor did any substantial constituency in the Democratic Party. And yet, when the decision came about taking shares of stock, the choice, as Rattner puts it, was this: “We can either get nothing for something or we can get something for something.”

Given that choice, taking the equity was the right decision. If you’re going to pay the money, you may as well get something for it. The Treasury did, and the result of that decision a year and a half later is that the taxpayers have been made one-third whole, and have a chance of coming out entirely whole.

And compared with things two years ago, that is an economic success.

Rattner wants to say there were no politics involved in these decisions, but he can’t quite do it. The choice to undertake the project in the first place was political. And always there was what Rattner calls “the Washington Post Test”: don’t do anything that you don’t want to see written about in the Washington Post.

But the bounds of the politically possible left a large space, and within this space Team Auto had discretion. Rattner writes, “No one in the Obama administration ever asked us to favor labor for political reasons.” Indeed, one of his chapters is called, “F**k the UAW,” a quotation he attributes to White House Chief of Staff Rahm Emanuel.

Organized labor took its lumps in the rescue — at GM it lost almost one-third of its North American jobs. But it came out ahead of where it would have been with no rescue, and it preserved its pensions and high wages for older workers by pushing pay far down for new workers.

With GM, Team Auto assumed from the start that there was a commercially viable core. This wasn’t obvious with Chrysler. It was smaller than GM, and Rattner says the economic case for saving it was much weaker. Obama’s economist, Austan Goolsbee, argued that Chrysler ought to be liquidated: the net loss of US jobs wouldn’t be so bad, because many of Chrysler’s customers would buy their cars from GM and Ford. Better to have two strong companies than three weaker ones. At one point Team Auto considered a plan to transfer Chrysler’s top brands — Dodge Ram, Jeep, and Chrysler minivans — to GM, and shut down the rest.

Team Auto was divided on whether to subsidize a deal with Fiat to save Chrysler. Rattner voted to do it, but he sounds as if he wished he hadn’t. The matter finally went to Obama, who said to do it. That is when Obama told Rattner to be “tough” and “commercial” — advice that Obama himself was not exactly following.

The main theme of Rattner’s book is that the GM deal — the big one — was done much as a private investor would, if the investor had decided already to do something. Team Auto’s aim was to make the company profitable. That meant a new CEO, a new chief financial officer, a new chairman, and several new members of the board. It meant shedding Pontiac, Saturn, Hummer, and Saab, and narrowing the product line to Chevrolet, Buick, Cadillac, and GMC Trucks. It meant shedding almost 30,000 jobs. And it meant cutting out 1,124 GM dealers.

The political interference after the deal was announced came from Congress, on behalf of dealers. Each member had imperiled dealers in his district. Sen. Kay Bailey Hutchison, Republican, held up a war-funding bill on behalf of Texas car dealers. House Majority Leader Steny Hoyer, Democrat of Maryland, went to bat for a dealer who had sold only three Chryslers in all of 2008 — and later told Rattner he’d let the companies keep too many dealers alive. Or so Rattner says. Rep. Gabrielle Giffords, Democrat of Arizona, went to bat for a Chrysler dealer, Rattner writes, and “repeated her talking points over and over.”

The intervention by Congress, Rattner writes, was “an enormous, pointless distraction for the two companies at a critical time. Its interference left me wondering what in the auto rescue Congress might like to micromanage next — choosing factory locations or deciding which executives and workers stayed and which had to go?”

That’s politics. The incentives facing elected officials are alien to the imperatives of business. Of course, members of Congress would not have had the ability to intervene so easily if there had been no intervention by the executive branch.

Which gets back to the original question. Obama is an elected official. Why did he let this be done in a mostly businesslike way?

Rattner doesn’t ask the question, but the answer that floats to the surface is that Obama is responsible for the whole country, not just congressional districts with GM and Chrysler assembly plants.

But it was more than that. Probably the strongest reason why Obama didn’t politicize the bailouts to the extent that libertarians feared — and a reason not in Rattner’s book — is that the American people, Republicans and Democrats included, hated the bailout. The polls were clear about that. The first demonstrations of the Tea Party showed it. So did the rise in the reputation of Ford, which became America’s most popular car company because it never took a bailout. There simply was little political gain for Obama in creating Government Motors, and much political hazard.

So he didn’t do it. By the standards of capitalism — the standards of the market — the bailouts have turned out well. The new General Motors and the new Chrysler are viable companies. Libertarians have to admit that.

When they do so, they should also point out that much of this was due to luck, and not anchored in the nature of things. This time it worked out; but once you allow politicians to run about fixing broken industries with public money, all sorts of bad incentives are created. There is the bad incentive to be a Rick Wagoner, and wait to be saved. There is the bad incentive for the bailers-out not to be tough and commercial, but to be political instead, in order to get votes. In the spring of 2009 Obama wasn’t over a barrel for the electoral votes of Michigan and Ohio, but a future president might be. Rattner alludes to these problems, but his book is about the specific case, not a general case.

A car company rescue also gives the politician a chance to be tinkerer-in-chief. Obama isn’t a car guy, but I can think of politicians who would use the opportunity to push the Chevy Volt, or to go beyond it. Rattner is blunt about the Volt: it costs $40,000 to make, and it competes with cars that cost half as much. He writes, “There is no scenario under which the Volt, estimable as it may be, will make any material contribution to GM’s fortunes for many years.”

In his epilogue, Rattner compares the auto rescues, which resulted in two viable companies, with Obama’s “stimulus” package, which cost 10 times as much and spent the money “without anything like the rigor that private equity or venture capital investors apply.” He predicts that Americans “will be disappointed by how little lasting benefit we got for those dollars.”

That is so. It is also a low standard for measuring government spending.

Libertarians still have strong reasons to oppose corporate rescues generally. But each case has its own facts, and sometimes a bad idea delivers good results.

I’m glad it came out well — I guess.


Editor's Note: Review of "Overhaul: An Insider's Account of the Obama Administration's Emergency Rescue of the Auto Industry," by Steven Rattner. Houghton Mifflin Harcourt, 2010, 320 pages.



