Ain’t That a Shame?


“You’d be like heaven to touch, I want to hold you so much.” Is there a more perfect lyric in the world? one reviewer asks. The lyrics of the Four Seasons expressed all the yearning of unrequited love. I can still remember the party where my adolescent heart was stirred while that song played in my mind. “Can’t take my eyes off of you,” I hummed softly, but his eyes adored someone else. Oh what a night — the music of our youth stays with us and has the power to evoke long-dormant memories and emotions.

That’s one reason that Jersey Boys (like Mamma Mia) has had such a long and successful run on Broadway, playing to people who often sing along (to the annoyance of the person in the next seat). The Four Seasons were the “other” ’60s sound — not rock and roll and not Motown but simple, true lyrics sung in clear, clean harmonies with that strong countertenor of Frankie Valli set in just the right key for female teenyboppers. I learned how to sing harmony with the Four Seasons. They were a sound you could play in front of your parents.


Their personal lives were another story, however — normalized at the time but recently placed in another light by the Broadway musical and now the film. As represented by the movie, the boys from Jersey — Tommy, Nicky, Joe, and Frankie (Bob was from a nicer background) — were little more than hoodlums, knocking over delivery trucks and breaking into jewelry stores when they were supposed to be in the library. They knew the beat cops by name, and for some of them the local detention facility was like a revolving door, as the characters gleefully admit in the film. Of course, this is the way it’s remembered by Frankie Valli and Bob Gaudio, executive producers of the film; Tommy, Nicky, and Joey might remember it quite differently.

“There were three ways out of the neighborhood,” Tommy DeVito (Vincent Piazza) tells the audience. “Join the army, join the mob, or become famous.” The first two could get you killed, so singing was the ticket out. Sinatra, another Frank who made it out of Jersey through his glorious voice, is next to the Pope in this story — quite literally. Their photos are set in a double frame and stand like a shrine of hope on the living room shelf of Frankie’s childhood home.

The first half of the film focuses on the boys’ backgrounds and their slow rise to fame through seedy nightclubs and bowling alley bars. Waiting over an hour for the first familiar song to appear in this film heightens the drama at its unveiling. I was tapping my foot impatiently. But when it finally arrives it reminds us of how sublime their harmonies were, and how simple their lyrics: “She-e-e-rry, Sherry baby, She-e-erry, Sherry baby. She-eh-eh-eh-eh-erry baby. Sherry baby. Sherry, won’t you come out tonight?” Sheesh! How did that ever make it to the radio? Yet it topped the charts and was followed by hit after hit that told our stories in song.


The lyrics of the songs tell the story in the film too, although it all works better in the stage musical, where the production numbers are showcased. Instead of using the lyrics to carry the story forward as most musicals do, Eastwood inserts them almost like a sidebar to the story he prefers to tell. In the film the songs often play in the background, frequently while the characters are speaking, so the effect is lessened.

The huge theater where I saw the movie held exactly four viewers at the 7:15 show on opening night. Four Fans for the Four Seasons. Sigh. With the popularity of the Broadway musical (and Clint Eastwood as the producer and director) the film had a disappointing turnout for its opening day. But there’s the rub: Clint Eastwood. Who would have thought this talented octogenarian director known for his spare direction and raw drama would turn to the Broadway musical genre this late in his career? Oh wait — he already did, and it was a disaster. Eastwood starred as the singing prospector who shares a wife (Jean Seberg) with his partner (Lee Marvin, who has purchased her from a polygamous Mormon) in Lerner and Loewe’s Paint Your Wagon (1969), a movie based very loosely on the 1951 play that ran for only 289 performances. Eastwood was ridiculous in that film, and he brings no genuine experience to the filming of this musical. He also uses actors with no genuine experience on screen, intensifying the problem.

One of Eastwood’s biggest mistakes was the decision to bring several original cast members and other virtual unknowns from the Broadway stage to the sound stage. With only one familiar face — Christopher Walken as mob boss Gyp DeCarlo, who acts as a kindly godfather to the Jersey boys — there is no name other than Eastwood’s to attract film audiences. The four who play the Seasons are actually pretty good (Vincent Piazza as Tommy DeVito, Michael Lomenda as Nick Massi, Erich Bergen as composer Bob Gaudio, and Tony Award winner John Lloyd Young as Frankie Valli), but they aren’t, well, they aren’t seasoned. Renee Marino, who plays Frankie’s wife Mary onstage and in the film, is simply annoying with her exaggerated movements and wild outbursts of emotion. I actually went home and looked up her background, expecting to learn that she is Eastwood’s newest girlfriend, but she isn’t. (Remember those godawful movies from the ’70s and ’80s when Sondra Locke was his main squeeze? They were every which way but right.) The most interesting actor is Joseph Russo, also a newcomer, and only because he plays Joe Pesci. Yes, that Joe Pesci. He’s credited in the movie with bringing Bob Gaudio into the group, back when Pesci was just another kid from New Jersey. Eventually Tommy DeVito went to work with Pesci, and Pesci took Tommy’s name for his character in Goodfellas.

The problem is that acting for the screen is quite different from acting for a live audience. A movie screen is 70 feet wide, making the actor much larger than life. The flick of an eyebrow or twitch of a finger can relay emotion and communicate thoughts. Stage actors, on the other hand, must play to the balcony. Their actions are broad, even in tender moments. When Mary leans across a diner table with her butt in the air and her lips pouting forward as a come-on to the inexperienced Frankie, it works for the stage but is comical and unrealistic for the screen. And Eastwood should know, because he is the master of unspoken communication. In interviews Marino gushes about how relaxed and easy-going Eastwood was on set, but she needed direction. Desperately. “I need you, baby, to warm the lonely nights” can be said without words and bring tears to the eyes. Keep it simple, and keep it real. As Frankie says to Bob Gaudio about the arrangement of a new song, “If you goose it up too much it gets cheesy.”


Overall Jersey Boys is a good film that provides interesting background about the music industry. Touring and recording aren’t all glitz and glamour; they’re mostly packing and repacking, eating in diners, staying in nondescript hotel rooms where, in the middle of the night, you aren’t sure which way the bathroom is, missing family events, and in the end getting screwed over by unscrupulous money managers. It’s tough. But the film doesn’t give us much perspective about the Four Seasons and the time period in which they wrote. They were the clean-cut lounge singers who made hit after hit side by side with the Beatles, the Beach Boys, and the Rolling Stones. They held their own during the tumultuous ’60s, just singing about love: “Who loves you? Who loves you, pretty baby?” They paved the way for a whole new sound in the ’70s when they added a brass orchestra.

Despite the hardships of the touring life, that wonderful music makes it all worthwhile. When asked to describe the best part of being the Four Seasons, Frankie responds simply, “When it was just us four guys singing under a street light.” Anyone who sings knows that feeling. It’s the joy of making music together.

That joy comes through in the closing credits of the film, when the cast members dance through the streets to a medley of songs reminiscent of the curtain-call encore at the end of the Broadway musical. Wisely Eastwood used the recordings of the original Four Seasons for the closing credits instead of the voices of the actors who play them in the movie. The difference is profound. Valli had such a glorious bell-like quality to his falsetto, while Young’s is simply false. He tries hard, but the effort shows. In the first hour of the film, when people react to his voice as he is “discovered,” it’s almost puzzling. What’s so great about this nasal voice with the slight rasp that makes you want to clear your throat? In the closing minutes of this film, listening to the original Four Seasons, it all makes sense.


Editor's Note: Review of "Jersey Boys," directed by Clint Eastwood. Warner Brothers, 2014, 134 foot-tapping minutes.





Action Plus Gravitas


Tight shot on the face of a man sleeping. His eye snaps open, and it is yesterday morning — again. He rises, and the day unfolds exactly as it did the day before. No one else knows that the day is being repeated, but he remembers, and he reacts. Each time he learns the best way to react in order to get where he wants to be. With eternity to learn and an infinite number of do-overs until he gets it right, the man develops skills, enhances relationships, and eventually gets the girl.

Groundhog Day (1993) is one of my favorite movies, but that’s not the film I am reviewing here. Edge of Tomorrow relies on the same premise of a never-ending loop in which a man wakes up day after day in the same place, facing the same dilemma, surrounded by the same people doing and saying the same things. But he changes and grows with each repeated day.

As the film opens, an alien force has invaded Europe, burrowed underground, and started spreading across the continent toward England, China, and Russia. Enter Major William Cage (Tom Cruise), a media specialist with the Army who started in ROTC and rose to the rank of Major through office successes; he has never trained for combat, and he has no intention of going to war. When commanded to go to the front lines of a beach invasion in Normandy, he bolts. When next we see him he is handcuffed, stripped of his rank, and forced to join J Squad on the day they are going to invade France. He has no training with weaponry, doesn’t even know how to disengage the safety, and buckles under the weight of his heavy armor.

It is an unusual treat to see Cruise playing a terrified coward who doesn’t know how to fight, since he usually plays the tough guy who is cool as a cucumber under pressure. Of course, before long he is using his repetition of days to build up his skills and learn how to fight so that he can save the world. It’s an impossible mission, but someone has to do it. Helping him is Lt. Rita Vrataski (Emily Blunt), a war hero known as the Angel of Verdun because she almost single-handedly vanquished the alien enemy in a previous battle. That’s because Rita has also experienced repetition of days and used her repeated experience to anticipate the enemy’s moves. Together she and Cage fight to reach the source of the alien force and destroy it.

The story line is reminiscent of a video game in which the player adopts a character on the screen and fights through several different levels to accomplish a goal. Each time the player “dies” he has to start over, and each time he plays, he gets a little further in the game by remembering where the booby traps are. Often players work together, telling each other which tunnel or path is safe and which one has a lurking danger. Cage and Rita work together in this way, remembering what happened the “previous day” and moving further each time toward their goal. When Cage says to Rita at one point, “We’ve never made it this far before,” it sounds exactly like my munchkins playing Mario together.


This video-game reference does not trivialize the film; it simply gives the viewer something more to ponder about metaphysics, the nature of life, and what you might do if you could see into the future and learn from your mistakes. A do-over once in a while could make all the difference.

Santayana said, “Those who cannot remember the past are condemned to repeat it.” Director Doug Liman has remembered and learned from the past. While Edge of Tomorrow borrows heavily from the concept of Groundhog Day, it is not doomed in any way. Moreover, Liman brings to this project a strong history in action films from his work on the Bourne series. Edge of Tomorrow is fresh, exciting, and compelling. The references to the storming of Normandy give it a sense of gravitas missing from most modern action films (it was even released on June 6, to coincide with the anniversary of the invasion). The threat of a lurking menace that spreads unseen and underground until it has become unstoppable and can enter one’s mind gives the audience a sense of personal investment while suggesting that the enemy is a thought or philosophy, not an army. Even the solution for stopping the enemy — that is, getting inside the enemy’s mind and understanding his perspective — is a powerful lesson for modern warfare. Edge of Tomorrow works on every level.


Editor's Note: Review of "Edge of Tomorrow," directed by Doug Liman. Warner Brothers, 2014, 113 minutes.





Another Perspective on Piketty


Someone acquainted only secondhand with Thomas Piketty’s book translated as Capital in the Twenty-First Century or who has only skimmed it might well dismiss it as a mere leftist, redistributionist tract. That would be a mistake and injustice — and thus counterproductive. Libertarian critics should try to answer Piketty’s findings, attitudes, and recommendations respectfully and seriously (unless, of course, they find themselves converted away from their own doctrine).

His tome of viii + 685 pages, full of tables, charts, and citations, is an impressive work of resourceful scholarship. A massive and detailed web site supplements it. I have neither the time and energy nor the competence to verify his voluminous statistics. Pieced together, as some of them are, from fragmentary sources (such as tax and probate records) of decades and even centuries ago, they must incorporate some elements of interpolation and educated guessing. Still, no reason is apparent for questioning his and his collaborators’ diligence and honesty.

Piketty avoids the pretensions of so much academic economics — decorative mathematics and dubious econometrics. (“[M]athematical models ... are frequently no more than an excuse for occupying the terrain and masking the vacuity of the content,” p. 574.) His book employs, and sparingly, only the simplest algebra; but I did find a few symbols and their definitions bothersome.


For example, Piketty makes much of the inequality r>g as the condition of growth of the ratio of capital (wealth) to national income, g being the growth rate of the denominator. The condition would be trivially obvious if r, the numerator, were the growth rate of the capital stock; but Piketty usually, and misleadingly, calls it the “rate of return on capital.” That description would apply if all and only the earnings on capital were saved and reinvested. Expenditure of some capital earnings on consumption instead would reduce the growth of the capital stock and the capital-income ratio, as Piketty occasionally mentions; and saving or dissaving from labor income would also affect the ratio’s growth (or shrinkage).
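To make that point concrete, here is a minimal sketch of the arithmetic, using symbols that are mine rather than Piketty’s or the review’s (β for the capital-income ratio and s for the fraction of capital income actually reinvested). With capital stock K and national income Y growing at rate g, the ratio β = K/Y grows at

\[ \frac{\dot{\beta}}{\beta} = \frac{\dot{K}}{K} - g. \]

If all and only the earnings on capital, rK, are reinvested, then \dot{K} = rK and the ratio grows at rate r - g, so r > g is exactly the condition for growth. If only a fraction s of those earnings is reinvested, the growth rate becomes sr - g, which can be negative even when r > g; that is the qualification the review makes.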

Nevertheless, Piketty’s compilation of long-term statistics for several countries suggests a trend to him. Only occasionally does he mention that most of his income figures are of income before taxes and before supplementation by government redistribution. Anyway, the long-term trend of the capital-income ratio seems to have been upward, exacerbating the inequality of both wealth and income. The chief historical exception is the period 1914–1945, when wars and depression destroyed so much wealth.

Piketty gives particular attention to the concentration of income and wealth in the top 1%, or even the top tenth or hundredth of 1% of their distributions. He seems particularly concerned about great inherited fortunes and the lavish leisured lifestyles that they make possible (as in novels by Jane Austen and Honoré de Balzac, mentioned as a welcome change of pace from dense argument).

His remedy for great inequality would be not only highly progressive income and inheritance taxes but progressive annual taxes on total wealth itself. He recognizes the political unlikelihood of getting his wealth taxes enacted and enforced, however, because implausibly close international collusion of governments would be required. He draws on the literature of Public Choice little if at all. He supplements his arguments with page after page of the history of taxation in different countries.

Nowhere, as far as I noticed, and to his credit, does Piketty blame inequality for economic crises and depressions or commit the crude Keynesianism of recommending redistribution to raise the propensity to consume. He does not maintain that the apparent trend toward greater inequality will continue without limit. He does not maintain that the extreme wealth of only a few thousand families will give those few tyrannical power over their fellow citizens — far from few enough, actually, to be a coherent oligarchy. Nor does he (or his translator) toss about words like “unfair” and “unjust,” although he does occasionally aspire to more “social justice” and “democracy” in the inexpediently and popularly stretched sense of the latter word.

One might expect concern about inequality to include concern about further concentration of resources and power in the state. However, Piketty does not expect his more drastic and broad-based progressive taxes to raise much more revenue. Nor, perhaps inconsistently with not expecting this, does he worry about damaging incentives to work and innovate. Possibly he agrees with John Stuart Mill in thinking that the distribution of wealth can be separated from its production. Possibly, like José Ortega y Gasset’s Mass Man (The Revolt of the Masses, 1930), he regards the wonders of modern industrial civilization as automatically existing, like facts of nature. Although an avowed socialist in the loose European sense of the term, he does not want to destroy capitalism. He even welcomes considerable privatization: government agencies and employees need not themselves provide all the services that tax money pays for.


For Piketty, reducing inequality is a goal in its own right. I agree so far as reducing it means undoing government measures that actually foster it. These include aspects of crony capitalism: subsidies, tax privileges, protection from both domestic and foreign competition, and most of what makes highly paid lobbying worthwhile. Also at others’ expense, arguably, a policy of artificially low interest rates benefits Wall Street operators and wealthy stock investors and traders.

As I finished reading his book, I realized that Piketty’s case for reforms is not mainly an economic argument but a sustained appeal to the reader’s intuition, although not explicitly to envy. Intuition presumably carries more weight if the reader comes to share it himself without having actually been told what to think. If so, Piketty’s economic language and massive quantities of ingeniously gathered statistics amount to what I call a Murray Rothbard or Alan Reynolds style of argument: deploy such an array of facts and figures, dates, places, mini-biographies, and even personality sketches that, even if they scarcely add up to a coherent argument, you come across to your reader or audience as a consummate expert whose judgments command respect. But saying so may exaggerate; for Piketty’s tables, charts, and sketches of characters in novels may usefully jog the intuition. Anyway, one should not disparage Piketty’s impressive research and methods and their likely application in projects beyond his own.

As for an intuition against extreme inequality, I confess to one of my own, although it does not mean welcoming heavier and more progressive taxes. We should worry about undermining respect for private property as a human right and essential pillar of any functioning economic system. Wealth is not something that belongs to the government, which it may leave to its producers or redistribute as the country’s rulers see fit.

Still, the intuition persists, as it did with Henry Simons, that saint of the Chicago School of economics in its early days, who found inequality “unlovely,” and as it persisted with Nobelist James Buchanan, prominent libertarian, who advocated stiff inheritance taxes. Somehow, I am uneasy about the pay of executives said to be 600 times as great as the pay of their ordinary workers, even though they may well contribute more than that much to their companies’ revenues. I am uneasy about lifestyles of opulent leisure permitted by great inherited wealth, rare though they may be. I cannot justify or explain my intuition, which, anyway, is not crass envy.

I don’t call on public policy to heed that intuition, any more than I share the apparently spreading expectation that some authority take action against whatever offends somebody, whether the lifestyle, the behavior, the speech, or the suspected thought of someone else. I wouldn’t want an egalitarian intuition implemented in anything like Piketty’s ways. Government measures to alleviate or avoid actual poverty, even beyond the “safety net,” are something quite different.

An intuitive dislike of extreme inequality does not rule out unease at Piketty’s line of thinking. Again, however, I warn libertarians: don’t risk a boomerang effect by unfairly dismissing his work as a mere ideological tract. It is indeed a work of genuine scholarship. Dealing with its challenging ideas can strengthen the libertarian case.


Editor's Note: Review of "Capital in the Twenty-First Century," by Thomas Piketty, translated by Arthur Goldhammer. Belknap Press, 2014.





Mind the Gap


“Capitalism automatically generates arbitrary and unsustainable inequalities that radically undermine democratic societies.” — Thomas Piketty, Capital in the 21st Century

French professor Thomas Piketty’s new book — ranked #1 on Amazon and the New York Times — is a thick volume with the same title as Karl Marx’s 1867 magnum opus, Capital. Many commentators have noted the Marxist tone — the author cites Marx more than any other economist — but that’s a distraction.

The author discusses capital and economic growth, and recommends a levy on capital, but the primary focus of the book is inequality. In mind-numbing minutiae of data from Europe and the United States, Piketty details how inequality of income and wealth has ebbed and flowed over the past 200 years before increasing at an “alarming” rate in the 21st century. Because of his demonstrated expertise, his scholarship and policy recommendations (sharply higher progressive taxes and a universal wealth tax) will be taken seriously by academics and government officials. Critics would be wise to address the issues he raises rather than simply to dismiss him as a French polemicist or the “new Marx.”

According to his research, inequality grows naturally under unfettered capitalism except during times of war and depression. “To a large extent, it was the chaos of war, with its attendant economic and political shocks, that reduced inequality in the twentieth century” (p. 275, cf. 471). Otherwise, he contends, there is a natural tendency for market-friendly economies to experience an increasing concentration of wealth. His research shows that, with the exception of 1914–45, the rate of return on property and investments has consistently been higher than the rate of economic growth. He predicts that, barring another war or depression, wealth will continue to concentrate into the top brackets, and inherited wealth will grow faster with an aging population and inevitable slower growth rates, which he regards as “potentially terrifying” and socially “destabilizing.”


His proposal? Investing in education and technical training will help, but won’t be enough to counter growing inequality. The “right solution” is a progressive income tax up to 80% and a wealth tax up to 10%. He is convinced that these confiscatory rates won’t kill the motor of economic growth.

One of the biggest challenges for egalitarians like Piketty is to define what they mean by an “ideal” distribution of income and wealth. Is there a “natural” equilibrium of income distribution? This is an age-old question that has yet to be resolved. I raised it in a chapter of Economics on Trial (1991), where I quoted Paul Samuelson’s famous textbook: “The most efficient economy in the world may produce a distribution of wages and property that would offend even the staunchest defender of free markets.”

But by what measure does one determine whether a nation’s income distribution is “offensive” or “terrifying”? In the past, the Gini ratio or coefficient has been used. It is a single number that varies between 0 and 1. If 0, it means that everyone earns the same amount; if 1, it means that one person earns all the income and the rest earn nothing. Neither one is ideal. Suppose everyone earns the same wage or salary. Perfect equality sounds wonderful until you realize that no economy could function efficiently that way. How could you hire anyone else to work for you if you had to pay them the same amount you earn?
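
For readers who want the formula behind those endpoints, one standard definition of the Gini coefficient (not spelled out in this review) uses the Lorenz curve L(p), the share of total income received by the poorest fraction p of the population:

\[ G = 1 - 2\int_0^1 L(p)\, dp. \]

With perfect equality, L(p) = p and G = 0; if a single person received all the income, L(p) would be essentially zero for every p < 1 and G would approach 1.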


Even social democrats William Baumol and Alan Blinder warned in their popular economics textbook, “What would happen if we tried to achieve perfect equality by putting a 100% income tax on all workers and then divide the receipts equally among the population? No one would have any incentive to work, to invest, to take risks, or to do anything else to earn money, because the rewards for all such activities would disappear.”

So if a Gini ratio of 0 is bad, why is a movement toward 0 (via a progressive income tax) good? It makes no sense.

Piketty wisely avoids the use of Gini ratios in his work. Instead he divides income earners into three general categories, the wealthy (the top 10% of income earners), the middle class (the next 40%), and the rest (the bottom 50%), and tracks how they fare over the long term.

But what is the ideal income distribution? It’s a chimera. The best Piketty and his egalitarian levelers can do is complain that inequality is getting worse, that the distribution of income is unfair and often unrelated to productivity or merit (pp. 334–5), and therefore should be taxed away. But they can’t point to any ideal or natural distribution, other than perhaps some vague Belle Époque of equality and opportunity (celebrated in France between 1890 and 1914).

Piketty names Simon Kuznets, the 20th century Russian-American economist who invented national income statistics like GDP, as his primary antagonist. He credits Kuznets with the pro-market stance that capitalist development tends to reduce income inequality over time. But actually it was Adam Smith who advocated this concept two centuries earlier. In the Wealth of Nations, Smith contended that his “system of natural liberty” would result in “universal opulence which extends itself to the lowest ranks of the people.”

Not only would the rich get richer under unfettered enterprise, but so would the poor. In fact, according to Smith and his followers, the poor catch up to the rich, and inequality is sharply reduced under a liberal economic system without a progressive tax or welfare state. The empirical work of Stanley Lebergott, and later Michael Cox, demonstrates that through the competitive efforts of entrepreneurs, workers, and capitalists, virtually all American consumers have been able to change an uncertain and often cruel world into a more pleasant and convenient place to live and work. A typical homestead in 1900 had no central heating, electricity, refrigeration, flush toilets, or even running water. But by 1970, before the welfare state really got started, a large majority of poor people benefited from these goods and services. The rich had all these things at first — cars, electricity, indoor plumbing, air conditioning — but now even the poor enjoy these benefits and have thus risen out of poverty.

Piketty and other egalitarians make their case that inequality of income has been growing since the Great Recession, and they may well be correct. But what if goods and services, what money can buy, become the criterion for inequality? The results might be quite different. Today even my poor neighbors in Yonkers have smartphones, just like the rich. Every spring the 1% attend the Milken Institute Conference in LA, which costs $7,000 or more to attend; the 99% can watch the entire proceedings on video on the Internet a few days later — for free. The 1% can go to the Super Bowl for entertainment; the 99% gather around with their buddies and watch it on a widescreen HD television. Who is better entertained?


Piketty & Co. claim that only the elite can go to the top schools in the country, but ignore the incredible revolution in online education, where anyone from anywhere in the world can take a course in engineering, physics, or literature from Stanford, MIT, or Harvard for a few thousand dollars, or in some cases, for absolutely nothing.

How do income statistics measure that kind of equal access? They can’t. Andrew Carnegie said it best, “Capitalism is about turning luxuries into necessities.” If that’s what capital and capitalism does, we need to tax it less, not more.

A certain amount of inequality is a natural outcome of the marketplace. As John Maynard Keynes himself wrote in the Economic Consequences of the Peace (1920), “In fact, it was precisely the inequality of the distribution of wealth which made possible those vast accumulations of fixed wealth and of capital improvements which distinguished that age [the 19th century] from all others.”

A better measure of wellbeing is the change in the absolute real level of income for the poor and middle classes. If the average working poor saw their real income (after inflation) double or triple in the United States, that would mean lifting themselves out of poverty. That would mean a lot more to them than the fortunes of the 1%. Even John Kenneth Galbraith recognized that higher real growth for the working class was what really mattered when he said in The Affluent Society (1959), “It is the increase in output in recent decades, not the redistribution of income, which has brought the great material increase, the well-being of the average man.”

Political philosopher John Rawls argued in his Theory of Justice (1971) that the most important measure of social welfare is not the distribution of income but how the lowest 10% perform. James Gwartney and other authors of the annual Economic Freedom Index have shown that the poorest 10% of the world’s population earn more income when they adopt institutions favoring economic freedom. Economic freedom also reduces infant mortality, the incidence of child labor, black markets, and corruption by public officials, while increasing adult literacy, life expectancy, and civil liberties. If market-generated inequality is the price we pay to eliminate poverty, I’m all in favor.

I have reservations about Piketty’s claim that “Once a fortune is established, the capital grows according to a dynamic of its own, and it can continue to grow at a rapid pace for decades simply because of its size.” To prove his point, he selects members of the Forbes billionaires list to show that wealth always grows faster than the average income earner. He repeatedly refers to the growing fortunes of Bill Gates in the United States and Liliane Bettencourt, heiress of L’Oreal, the cosmetics firm.

Come again?

I guess he hasn’t heard of the dozens of wealthy people who lost their fortunes, like the Vanderbilts, or to use a recent example, Eike Batista, the Brazilian businessman who just two years ago was the 7th wealthiest man in the world, worth $30 billion, and now is almost bankrupt.

Piketty conveniently ignores the fact that most high-performing mutual funds eventually stop beating the market and even underperform. Take a look at the Forbes “Honor Roll” of outstanding mutual funds. Today’s list is almost entirely different from the list of 15 or 20 years ago. In our business we call it “reversion to the mean,” and it happens all the time.

Prof. Piketty seems to have forgotten a major theme of Marx and later Joseph Schumpeter, that capitalism is a dynamic model of creative destruction. Today’s winners are often tomorrow’s losers.

IBM used to dominate the computer business; now Apple does. Citibank used to be the country’s largest bank. Now it’s Chase. Sears Roebuck used to be the largest retail store. Now it’s Wal-Mart. GM used to be the biggest car manufacturer. Now it’s Toyota. And the Rockefellers used to be the wealthiest family. Now it’s the Waltons, who a generation ago were dirt poor.

Piketty is no communist and is certainly not as radical as Marx in his predictions or policy recommendations. Many call him “Marx Lite.” He doesn’t advocate abolishing money and the traditional family, confiscating all private property, or nationalizing all the industries. But he’s plenty radical in his soak-the-rich schemes: a punitive 80% tax on incomes above $500,000 or so, and a progressive global tax on capital with an annual levy between 0.1% and 10% on the greatest fortunes.

There are three major drawbacks to Piketty’s proposed tax on wealth or capital.

First, it violates the most fundamental principle of taxation, the benefit principle. Also known as the accountability or “user pay” principle, it holds that taxation is justified as a payment for benefits or services rendered. The basic idea is that if you buy a good or use a service, you should pay for it. This approach encourages efficiency and accountability. In the case of taxes, if you benefit from a government service (police, infrastructure, utilities, defense, etc.), you should pay for it. The more you benefit, the more you pay. In general, most economists agree that wealthier people and big businesses benefit more from government services (protection of their property) and should therefore pay more. A flat personal or corporate income tax would fit the bill. But a tax on capital (or even a progressive income tax) is not necessarily connected to benefits from government services — it’s just a way to forcibly redistribute funds from rich to poor and in that sense is an example of legal theft and tyranny of the majority.

Second, a wealth tax destroys a fundamental sacred right of mankind — financial privacy and the right to be left alone. An income tax is bad enough. But a wealth tax is worse. It requires every citizen to list all their assets, which means no secret stash of gold and silver coins, diamonds, art work, or bearer bonds. Suddenly financial privacy as guaranteed by the Fourth Amendment becomes illegal and an underground black market activity.

Third, a wealth tax is a tax on capital, the key to economic growth. The worst crime of Piketty’s vulgar capitalism is his failure to understand the positive role of capital in advancing the standard of living in all the world.

To create new products and services and raise economic performance, a nation needs capital, lots of it. Contrary to Piketty’s claim, it’s good that capital grows faster than income, because that means people are increasing their savings rate. The only time capital declines is during war and depression, when capital is destroyed.

He blames the increase in inequality on low growth rates, when, he says, the economic growth rate falls below the return on capital. The solution isn’t to tax capital, but to increase economic growth via tax cuts, deregulation, better training and education, higher productivity, and free trade.

Even Keynes understood the value of capital investment, and the need to keep it growing. In his Economic Consequences of the Peace, Keynes compared capital to a cake that should never be eaten. “The virtue of the cake was that it was never to be consumed, neither by you nor by your children after you.”


If the capital “cake” is the source of economic growth and a higher standard of living, we want to do everything we can to encourage capital accumulation. Make the cake bigger and there will be plenty to go around for everyone. This is why increasing corporate profits is good — it means more money to pay workers. Studies show that companies with higher profit margins tend to pay their workers more. Remember the Henry Ford $5 a day story of 1914?

If anything, we should reduce taxes on capital gains, interest, and dividends, and encourage people to save more and thus increase the pool of available capital and entrepreneurial activity. A progressive tax on high-income earners is a tax on capital. An inheritance tax is a tax on capital. A tax on interest, dividends, and capital gains is a tax on capital. By overtaxing capital, estates, and the income of our wealthiest people, including heirs to fortunes, we are selling our country and our nation short. There’s no telling how high our standard of living could be if we adopted a low-tax policy. What country has advanced the most since World War II? Hong Kong, which has no tax on interest, dividends, or capital.

Hopefully Mr. Piketty will see the error of his ways and write a sequel called “The Wealth of Nations for the 21st Century,” quoting Adam Smith instead of Karl Marx. The great Scottish economist once said, “Little else is required to carry a state from the lowest barbarism to the highest degree of opulence but peace, easy taxes, and a tolerable administration of justice.” Or perhaps he will quote this passage: “To prohibit a great people ... from making all that they can of every part of their own produce, or from employing their stock and industry in the way that they judge most advantageous to themselves, is a manifest violation of the most sacred rights of mankind.”


Editor's Note: Review of "Capital in the Twenty-First Century," by Thomas Piketty, translated by Arthur Goldhammer. Belknap Press, 2014.





They Didn’t Want a War


Margaret MacMillan’s The War that Ended Peace gives a fascinating description of the background, stretching back to around 1900, of what she, like people at the time, calls the “Great War.” She relates how the Bosnian crisis of 1908, the Moroccan crises of 1905 and 1911, the crises arising from wars among the Balkan countries in 1912 and 1913, and various minor incidents were successfully muddled through without war among the great powers. The most general source of tension seems to have been fear of being attacked first and concern to make and maintain alliances.

Leading statesmen optimistically expected that tension between Austria-Hungary and Serbia, exacerbated by the assassination of Archduke Franz Ferdinand on 28 June 1914, would somehow be resolved like the earlier crises. Even after Austria-Hungary rejected Serbia’s compliant but not total acceptance of its ultimatum and declared war, hope lingered that the war could be kept contained.

Few policymakers had wanted war (the main exception perhaps being Franz Conrad von Hötzendorf, Austro-Hungarian Chief of Staff). The German Kaiser was no exception, although he was addicted to impulsive speeches and interviews, liked to strut in military uniform, and even enjoyed fiddling with the detailed design of uniforms (as did his fellow emperors Franz Joseph and Nicholas II).


As those examples suggest, MacMillan goes into revealing detail not only about demographic, economic, political, diplomatic, and military situations and events but also about people — royalty, politicians, foreign ministers, diplomats, generals and admirals, journalists, and influential or well connected socialites — together with their backgrounds, illnesses, deaths, and strengths or quirks of personality.

Much of this is relevant to the role of sheer and even trivial accident in momentous history. MacMillan herself notes several examples. The Russian monk Rasputin, whatever his faults, strongly advocated peace and had great influence with the Imperial family; but he had been stabbed by a madwoman on the very day of the Austrian Archduke’s assassination and was recovering slowly, far from St. Petersburg. The Archduke himself had long realized that Austria-Hungary was too weak to risk an aggressive foreign policy. Alfred von Kiderlen-Wächter, German Foreign Minister and in MacMillan’s opinion a force for peace, had died in December 1912. Joseph Caillaux, France’s peace-minded Prime Minister, had had to resign in January 1912, partly in connection with his second wife’s shooting of an editor who had threatened to publish some indiscreet love letters that Caillaux had sent to her while she was still married to someone else. Although MacMillan does not explicitly raise the question, I was set to wondering how events would have evolved if Otto von Bismarck, a realist who was satisfied with Germany’s international position achieved by 1871, had been alive and in office in 1914. Or what if Gavrilo Princip’s bullet had missed the Archduke?

MacMillan ends her book, apart from a 13-page epilogue, with the outbreak of war in July–August 1914. That is fine with a reader more interested in the consequences of particular wars and in how the wars might have been avoided (as many potential wars no doubt were barely avoided) than in the details of the actual fighting. World War I was a momentous and enduring tragedy. Germany, for one, had everything to gain from continuing peace, including its growing leadership in science and industry. MacMillan writes a gripping story. She conveys a feel of the suspense that must have prevailed during the final crisis. My opinion of her book is overwhelmingly favorable.

Or it would be except for one minor but pervasive and annoying defect. The book is erratically punctuated, mainly but not everywhere underpunctuated. Even independent clauses, often even ones with their own internal punctuation, go unseparated by a comma or semicolon. Restrictive and nonrestrictive phrases and clauses are not distinguished, as clarity requires, by absence or presence of punctuation. Such erratic and erroneous punctuation delays understanding, if usually only for a second. Even so, it distracted me from the book’s fascinating story.

Above all, it distracted me with sustained wonder about how so untypically mispunctuated a book could emerge from a major publishing house. Could the copyeditor have given up in the face of a daunting and tedious task? Could an incompetent editor have imposed the damage, which the author then passively left standing? Could the author have committed the errors herself and then, perhaps out of bad experience with previous copyeditors, have insisted on none of their tampering this time? None of these hypotheses seems plausible, but I can’t think of a better one. The author’s including her copyeditor in her long list of Acknowledgments adds to the mystery.

I’d be grateful if someone could relieve my curiosity with the true story.


Editor's Note: Review of "The War that Ended Peace," by Margaret MacMillan. Random House, 2013, 784 pages.





Memories of War


Last month I visited the Kamikaze Peace Museum in Chiran, Japan, a small town characterized by cherry-lined streets and what remains of a centuries-old Samurai village. The museum is a moving tribute to the 1,000 or so young men who were ordered to give their lives for god and country (the emperor was considered divine) by flying their planes directly into American targets in the Pacific during the final months of World War II. Chiran was the departure point for most of those flights.

The museum contains photographs of all the men, along with the letters many of them wrote to their families on the eve of their death. These pilots were little more than boys, most of them aged 17–28, some of them photographed playing with puppies as they posed, smiling, in front of their planes. In their letters they urged their mothers to be proud, their sisters to be comforted, their girlfriends to move on without them, and their children to be brave. One man wrote, “I am sorry that Papa will not be able to play horsey with you any more.” Another’s girlfriend leapt from a bridge to her death after she read his letter, and yet another’s wife drowned herself and her children before his flight so he could die without regret. Several of these young pilots were Koreans conscripted into the service against their will. None felt he had a choice; living with the loss of honor would be much more painful than any fear of death. I felt nothing but sadness for these young boys.

Two weeks later I was in Oahu, where over 200 Japanese planes attacked Pearl Harbor in the early morning of December 7, 1941, killing 2,400 Americans, wounding another thousand, and crippling the American fleet. The attack brought America into war in the Pacific. One cannot visit the Pearl Harbor Memorial without feeling profound sadness for the loss of life that day and in the four years that were to come. Yet, having just visited the Kamikaze Peace Museum, I could not hate the men who flew the bombers into Pearl Harbor. The words of Edwin Starr resonated in my mind: “War: What Is It Good For?”

Perhaps it is good for peace. But at what price? I thought of this as I watched The Railway Man, based on the memoirs of a British soldier, Eric Lomax (Colin Firth and Jeremy Irvine), who was captured by the Japanese during World War II, forced to help build the railway through Thailand that was immortalized by the Oscar-winning film The Bridge on the River Kwai (1957), and tortured by his captors when he built a radio receiver inside their prison. The title of the film has dual meanings; not only does Lomax help build the railroad through Thailand, but from his youth he has had an obsession with trains and has always memorized details about train schedules, train depots, and the towns that surround train stations. In context, the title also suggests a metaphor for the bridges that are eventually built, through arduous effort, between Lomax and others, including his wife Patti.


As the film opens, Lomax (Firth) is a middle-aged man who meets a pretty nurse, Patti (Nicole Kidman), on a train. He immediately falls in love with her. (The film implies that this is a first marriage for the shy and socially inept Lomax, but the real Eric Lomax was already married at the time he met Patti. He married Agnes just three weeks after returning from the war, and then divorced her just a few months after meeting Patti on the train. This, and the rest of the story, suggests to me that he returned from the war safe, but not sound.) Eric notes morosely, “Wherever there are men, there’s been war,” and Patti replies with a gentle smile, “And wherever there’s been a war, there’s been a nurse like me to put them back together.”

This introduces the key to their relationship. The war officially ended 35 years earlier, but it still rages in Lomax’s mind. He will need the kind and patient wisdom of a nurse to help put him back together again. His struggle with post-traumatic stress disorder is skillfully portrayed when ordinary events trigger painful memories that transport him immediately to his jungle experiences as a POW. For example, the sound of the shower triggers terrifying memories of the water torture he endured at the hands of his brutal captors. The unexpected intrusion of these scenes demonstrates the unending aftermath of war and the difficulty of controlling its horrifying memories.

Wise casting adds to the pathos of this fine film. Much of what I know about World War II has been shaped by the films I’ve seen, and most of those were populated by actors well into their 30s and 40s. But in this film Young Eric (Jeremy Irvine) and his comrades are played by slender boys in their early 20s who can’t even grow a stubble of beard after four days aboard a prison train. They are closer to the tender ages of the soldiers they are portraying, and this increases the pathos of the story and our admiration for the strength and resolve of these boys who are thrust into manhood, much like the kamikaze pilots, before they even know what war is.

The Railway Man is a character-driven film that demonstrates the choices we have, even when it seems we have no choices at all. Jesus demonstrated the power of choice when he said, “If a man requires of you your coat, give him your cloak also” and, “If a man smites you, turn the other cheek.” He wasn’t telling his followers to give up and give in, but to take charge and move on, by invoking the right to choose one’s attitude when it seems that the freedom to choose one’s actions is gone. This film demonstrates that same transformative power of choice.


Editor's Note: Review of "The Railway Man," directed by Jonathan Teplitzky. Weinstein Company, 2014, 116 minutes.





The Apple of Knowledge and the Golden Rule


Russell Hasan is an author who has contributed a good deal to Liberty. Now he makes a contribution to liberty itself, in the form of two extensive monographs: The Apple of Knowledge: Introducing the Philosophical Scientific Method and Pure Empirical Essential Reasoning, and Golden Rule Libertarianism: A Defense of Freedom in Social, Economic, and Legal Policy. Both works are available online, at the addresses that follow at the end of this review. And both are very interesting.

I’ll start with The Apple of Knowledge, which itself starts with an account of the author’s quest for wisdom. He did not find it in the lessons of professional (i.e., academic) philosophers, who venerated the counterintuitive claims of earlier professional philosophers, often echoing their conviction that objective truth could not be found. The author turned to the objectivist philosophy of Ayn Rand, but found that “it was truly a political philosophy, and not a rigorously reasoned system of metaphysics and epistemology. Rand’s ideas seemed clever and useful, but they contained contradictions and holes and gaps.”

So, as an intellectual entrepreneur, Hasan decided to see whether he could solve crucial philosophical problems himself. That’s the spirit of liberty.

He states his agenda in this way:

The six problems that this book will solve are: 1. Induction, 2. Consciousness, 3. Knowledge, 4. The Scientific Method, 5. Objectivity, and 6. Things in Themselves.

Hasan believes that these problems can be solved by his versions of “(1) the philosophical scientific method, and (2) pure empirical essential reasoning.”

What does this mean in practice? It means a rejection of dualism and radical skepticism, a reasoned acceptance of the world as empirically discovered by the healthy consciousness. An example:

When you look at this book and say “I am looking at this book, I am reading this book, I am aware of the experience of this book,” and you wonder about what it means to be conscious and to have an experience and to see this book, the only things in the picture are two physical objects, (1) this book, which physically exists and is the thing that you are experiencing and are conscious of, and (2) your brain, which is the consciousness that experiences and is aware of the book by means of the perceptions and concepts in your brain. Similarly, when you see a red apple, the red apple itself is the red that you see, and your brain is the subject which perceives that object and is aware of that object. Nowhere in this picture is there a need to deny that consciousness exists. We need not deny that you really see a red color. We need not deny that you are aware of an apple. And there is also no need to believe in ghosts or non-physical souls as an explanation for your being aware of an apple and seeing its red color.

As this example suggests, Hasan has an admirably clear style throughout. His clarity may also suggest, erroneously, that the problems he addresses are easy to solve, or that he deems them easy to solve. They aren’t, and he doesn’t. For every statement he makes there are time-honored quibbles, evasions, and yes, real challenges. The enjoyment of reading through this fairly long book comes from following Hasan’s own path among the challenges, assessing his arguments, and finding out how many of those arguments one wants to buy.

To this process, a statement of my own ideas can add no particular enjoyment. For what it’s worth — and it isn’t directly relevant to Hasan’s essential concerns — his grasp of Christian and biblical theology could be much stronger. Here’s where the dualism that he rejects asserts itself despite his efforts; he tends to see Christian ideas (as if there were just one set of them) as dualistically opposite to his own: Christians are against the world, the flesh, and the devil, while he is heartily in favor of the first two, at least. But it’s not as simple as that. “World” and “flesh” can mean a lot of things, as a concordance search through St. Paul’s epistles will illustrate. You don’t need to believe in God to recognize the complexity of Christian thought. (And, to digress a bit further, “666” didn’t come “from ancient confusion between the Latin word ‘sextus’ which means six and the Latin word ‘sexus’ which means sex.” No, it originated in the biblical book of Revelation [13:18], and it’s a code term, probably for “Nero.”)


About the philosophical problems that Hasan treats I can say that he usually appears to make good sense — very good sense. His education in the objectivist tradition is evident; his respect for the real world — which is, after all, the world that all philosophy is trying to explain — is still more evident. Both are valuable, and essential to his project. Indeed, Apple of Knowledge can be viewed as a particularly interesting and valuable addition to the objectivist tradition of philosophy that begins with Ayn Rand.

Golden Rule Libertarianism is an exposition and defense of a variety of radical libertarian ideas — about victimless crimes, war and peace, government intervention in the economy, and so on. Few libertarians will be surprised by the results of Hasan’s inquiries in these areas — but what does “Golden Rule Libertarianism” mean?

This represents what I take to be a new approach, though one that is nascent in the libertarian concept of the great “negative right,” the right to be left alone. From this point of view, it makes no difference whether you’re smarter or richer than I am, because it requires the same effort — that is, none — for both of us to leave each other alone. The Golden Rule, most famously enunciated by Jesus but, as Hasan points out, hardly foreign to other religious and ethical teachers, yields a more “positive” approach. “Do unto others what you would have them do unto you.” Yet nobody wants his neighbor to do certain things — to prohibit him from speaking or publishing his views or sleeping with whomever he wants, even on the pretense of helping him. In this sense, the Golden Rule turns out to be just as “negative” as libertarians could wish. As Hasan says in one exemplification of his theory:

if you let me be free to make economic decisions, including what wage to pay and at what price to buy services from other people, then I will give you the same freedom to make your own choices instead of me making your choices for you.

There is a pragmatic dimension to this. In case you are wondering whether letting everyone be free to make his or her own decisions would leave the poor in the lurch, or, to vary the metaphor, in the clutches of an exploitative capitalism that the poor are not capable of turning to their own advantage, Hasan adds:

The best thing you can do for me is to get the government out of my way and let me be free, because capitalism helps the poor more than socialism.

Libertarians understand this, and Hasan provides plenty of reasons for everyone else to understand it too. His book will be valuable to nonlibertarians, because there is something in it for every interest or problem they may have. As he says, in another exemplary passage:

The liberal concern for civil liberties, e.g. my freedom to write atheist books, and the conservative concern for freedom from regulation, e.g. my freedom to buy and sell what I want on my terms, is really two sides of the same libertarian coin, because if the government claims the right to be the boss of your beliefs then it will soon usurp the power to be the boss of your place in the economy and take total control over you, and if the government is the boss of the economy then it will inevitably take over the realm of ideas in order to suppress dissent and stifle criticism of the economic planners.

I believe that Hasan is right to pay particular attention to what he calls “the coercion argument,” which is one of the strongest ripostes to libertarian thought. It is an attempt to argue against libertarian ideas on libertarian grounds. The notion is that if I leave you alone I permit someone else to coerce you. As Hasan says,

Some version of the coercion argument underscores a great deal of anti-libertarian sentiment: poor people will be coerced into selling their organs and body parts, which justifies denying them the right to do so. Poor people are coerced into accepting dangerous, low-paying jobs such as coal mining, or are coerced into working long hours for wages that are lower than what they want. They are coerced into buying cheap high-fat fast food, or are coerced into buying cheap meat, packed at rat-infested plants, and so on. The coercion argument is a thorn in the side of laissez-faire politics, because socialists argue that poor people aren’t really free in a capitalist system where they face economic coercion.

Hasan’s insight into the legal history and ramifications of the coercion argument is enlightening:

An example of the grave seriousness of the coercion myth is legal scholar Robert Lee Hale’s famous law review article “Coercion and Distribution in a Supposedly Non-Coercive State” (1923). Hale brainwashed generations of law students with his argument that capitalist employers exert coercion upon workers, and socialism would not produce more coercion or less freedom than capitalism.

This is a powerful myth, but Hasan has little trouble refuting it. Others are yet more formidable; I would be surprised, however, if even a hostile reader could emerge from a serious consideration of Hasan’s arguments without admitting serious damage to his or her own assumptions.

For libertarian readers, the fun is in seeing what Hasan will do with various popular topics of libertarian discourse — natural rights versus utilitarianism, racial discrimination, gay marriage, an interventionist versus a non-interventionist foreign policy, privatization of education and banking, disparity of wealth, etc. Even well-informed libertarians will be surprised by, and probably grateful for, many of the arguments that Hasan adduces.

Hasan is one of the few questioning occupational licensing, which exacts immense costs from society, and especially from the poor, who must pay dearly for even the simplest services.

I was such a reader — and for me, the book gained in stature because I did not always agree with it. For me, libertarianism is more a matter of experience and less a matter of moral logic than it is for Hasan; but even within the large area of our philosophical, or at least temperamental, disagreement, I found myself admiring his intrepid and intricate, yet clear and cogent, progression of thought. I suspect that anyone who shares my feeling for the great chess match of political economy will share my feeling about this book.

Not all of Hasan’s many topics can possibly be of intense interest to everyone, but that’s just another way of saying that the book is rich in topics. My heart rejoiced to see a chapter on the evils of occupational licensing — a practice that virtually no one questions but that exacts immense costs from society, and especially from the poor, who must pay dearly for even the simplest services of licensed individuals. And I was very pleased to see Hasan take on many of the most sacred cows of my fellow academics.

One is game theory. Readers who are familiar with game theory and with the part of it that involves the so-called prisoner’s dilemma know that for more than two decades these things have been the favorite pastime, or waste of time, among thousands of social scientists. (If you ask, How can there be thousands of social scientists? or, Why don’t they find something better to do?, see above, under “occupational licensing.”) The tendency of game theory is to deal with people as objects of power, not subjects of their own agency. Its effect has often been to emphasize the statist implications of human action. Hasan cuts through the haze:

The specific refutation of Game Theory and the “prisoner’s dilemma” is that the solution is not for the group to impose a group-beneficial choice onto each individual, it is for each individual to freely choose the right choice that benefits the group. If the benefits of the supposedly right, good-for-the-group decision are really so great, then each individual can be persuaded to freely choose the right, optimal, efficient choice.

My advice is to get the book, which, like the other book, is available at a scandalously low price, read the introductory discussion, then proceed to whatever topics interest you most. You may not agree with the arguments you find, but you will certainly be stimulated by the reading.


Editor's Note: Review of "The Apple of Knowledge: Introducing the Philosophical Scientific Method and Pure Empirical Essential Reasoning," and "Golden Rule Libertarianism: A Defense of Freedom in Social, Economic, and Legal Policy," by Russell Hasan. 2014.





It’s Smart, It’s Exciting, It’s Fun

 | 

 

The specific details of a superhero movie plot seldom really matter; all we usually need to know is that an evil superpower, sporting a foreign accent, is out to destroy the world as we know it, and it is up to the superhero not only to protect the community from destruction but also to preserve our way of life. Dozens of superheroes have been created in comic-book land, and all of them have been sharing time on the silver screen for the past decade or more, with half a dozen of their adventures released this year alone. So far audiences are flocking to theaters with the same enthusiasm that kept our grandfathers heading to the local cinema every Saturday afternoon to see the latest installment of Buck Rogers.

These films tend to reflect the fears and values of the current culture, which is one of the reasons for their lasting popularity. We see our worst fears in the threats posed by the enemies, and our hopes and values in the characters of the heroes. But lately those heroes have been somewhat reluctant and unsure of their roles as heroes, and the people they have sworn to protect have been less trusting and appreciative — they complain about collateral damage and even question the heroes’ loyalty. In an era of relativism and situational ethics, a full-on hero with overwhelming power seems hard to support.

The Avengers share conversations praising freedom and choice, and they reject blind obedience in favor of making their own decisions.

This month it’s Captain America’s turn to save the day. Created by Jack Kirby and Joe Simon in 1941, Captain America (alter ego: Steve Rogers) is a WWII soldier who is transformed from a 5’4” wimp to a 6’2” muscle man through a scientific experiment intended to create an army of super warriors. He ends up being cryogenically frozen and is thawed out in modern times. Part of his appeal is his guileless naiveté, especially as he reacts to modern technology and mores. He uses his virtually indestructible shield to fight for truth, justice, and the American way (okay, that’s the other superhero, but their morals are virtually the same). I like Captain America’s shield — it signifies that his stance is defensive, not aggressive.

As The Winter Soldier opens, nothing is going right for the Avenger team led by Nick Fury (Samuel L. Jackson) and Captain America (Chris Evans). Police, government agencies, and even agents of SHIELD (Strategic Homeland Intervention, Enforcement and Logistics Division, the organization that oversees and deploys the superheroes) are attacking them and treating them as national enemies. The Captain and former Russian spy Natasha (Scarlett Johansson), aka the Black Widow, have become Public Enemies number 1 and 2, but they don’t know why. They spend the rest of the movie trying to clear their names and save the world, without any help from the government they have sworn to uphold.

While the specific plot isn’t particularly important in these movies, motivation usually is. Why do the characters do what they do? Meaningful dialogue inserted between the action scenes reveals the values of both good guys and bad guys, and away we go, rooting for the guy who is going to save us once again.

I’m happy to report that Captain America: The Winter Soldier lives up to its potential. As a libertarian, I can agree with most of the values it projects. First, politicians, government agencies, and the military-industrial complex are the untrustworthy bad guys in this film, and for once there isn’t an evil businessperson or industrialist in sight. Additionally, the Avengers share conversations praising freedom and choice, and they reject blind obedience in favor of making their own decisions. For example, the Falcon (Anthony Mackie), aka Sam Wilson, tells Steve about his buddy being shot down in the war, and then says, “I had a real hard time finding a reason for being over there after that.” Captain America admits, “I want to do what’s right, but I’m not sure what that is anymore.” Like Montag in Bradbury’s Fahrenheit 451, he is ready to think for himself and determine his own morality. (Compare that philosophy to Peter Parker [Spider-Man] being told by his wise Uncle Ben that responsibility is more important than individual choice in the first Spider-Man film, followed by Uncle Ben’s death when Peter chooses “selfishness” over responsibility.)

Meanwhile, the senior official overseeing SHIELD (Robert Redford — yes, Robert Redford! He said his grandchildren like the franchise, so he wanted to do the film for them) says cynically of a particular problem, “It’s nothing some earmarks can’t fix.”

The mastermind behind the assault on freedom (I won’t tell you who it is, except that it’s someone involved in government) justifies his destructive plan by saying, “To build a better world sometimes means tearing down the old one” and opining that “humanity cannot be trusted with its own freedom. If you take it from them, they will resist, so they have to be given a reason to give it up willingly.” Another one adds, “Humanity is finally ready to sacrifice its freedom for security,” echoing Ben Franklin’s warning. These power-hungry leaders boast of having manufactured crises to create conditions in which people willingly give up freedom. This isn’t new, of course. Such tactics are as old as Machiavelli. Yet nothing could feel more current. I’m happy to see young audiences eating this up.

Captain America first appeared on film in 1944, at the height of WWII. He has never been as popular as Superman, Batman, or Spider-Man. A made-for-TV movie aired in 1979, and a dismal version (with a 3.2 rating) was made in 1990. However, the latest incarnation, with Chris Evans as the wimp-turned-military powerhouse, has been highly successful, with three films released in the past four years: two self-titled films (Captain America: The First Avenger in 2011, and this one) as well as one ensemble outing (The Avengers, 2012).

These power-hungry leaders boast of having manufactured crises to create conditions in which people willingly give up freedom. This isn’t new, of course.

One of the things I like about the Avengers is that they aren’t born with innate super powers à la Superman or X-Men; for the most part their powers come from innovation, technology, and physical training. They’re gritty and real, and they bruise and bleed. Directors Anthony and Joe Russo were determined to make this movie as real as possible too, so they returned to live action stunts whenever they could instead of relying on CGI and green screen projection. Yes, they use stunt doubles when necessary, but, as Anthony Mackie (the Falcon) reported in praise of the Russos, “if they could build it [a set piece], they built it. If we [the actors] could do it [a difficult maneuver], we did it. . . . That’s why the movie looks so great.” Many of the action scenes are beautifully choreographed and often look more like dancing than fighting, especially when Captain America’s shield is ricocheting between him and a gigantic fighter plane.

Of course, the film has its share of corniness too. When you’re a hero named Captain America, you’re expected to be a rah-rah, apple-pie American, and Captain America is. He even drives a Chevy, the all-American car. So does Nick Fury (Samuel L. Jackson), who brags about his SUV with a straight face as though it’s a high-end luxury vehicle. In fact, all the SHIELD operatives drive Chevys, as do many of the ordinary commuters on the street. That’s because another concept that’s as American as apple pie is advertising. Product placement permeates the film, but most of the time it’s subtly and artfully done. Captain America wears an Under Armour t-shirt (which is pretty ironic when you think about it — under armor beneath a super-hero uniform), and the Falcon, whose superpower is a set of mechanized wings that let him fly, sports a small and subtle Nike swoosh on his after-hours attire. (Nike — the winged goddess, get it?)

Captain America is a hit, and for all the right reasons. The dialogue is intelligent, the humor is ironic, the action sequences are exciting, and the heroes are fighting for individual freedom. It even contains a theme of redemption. And for once, the bad guys aren’t businessmen. Ya gotta love it.

Editor's Note: Review of "Captain America: The Winter Soldier," directed by Anthony Russo and Joe Russo. Marvel Studios, 2014, 136 minutes.






The Road to Potential

 | 

How I hate the word “potential”! While acknowledging innate abilities with faint praise, it reeks of withering disappointment, talents wasted, opportunities lost.

Transcendence is a film with tremendous potential.

It begins with a talented cast of discriminating actors that includes Johnny Depp, Rebecca Hall, Cillian Murphy, Morgan Freeman, and Paul Bettany. (OK, scratch Morgan Freeman from the list of “discriminating actors.” Freeman is a fine actor, but he has become as ubiquitous as Michael Caine.) Add a postapocalyptic setting, an army of zombified robotic post-humans, a love story that transcends death, and the time-honored “collision between mankind and technology.” Mix in some dialogue about challenging authority and questioning the meaning of life, and create a metaphor suggesting the thirst for power in the form of the World Wide Web. It seems like the perfect formula for an exciting sci-fi thriller.

Yet Transcendence barely gets off the ground. The story, about terrorists who attack the world’s computer hubs simultaneously in order to stop the Internet, should be powerfully engaging, but it lacks any building of tension or suspense. Instead it is a dull, slow-moving behemoth emitting loud, unexpected bursts of explosive violence.

Will Caster (Johnny Depp) is a computer programmer working on ways to heal the planet through nanotechnology. When terrorists attack his computer lab and poison him with a lethal dose of radiation, his wife Evelyn (Rebecca Hall) and his research partner Max (Paul Bettany) convince him to let them upload his consciousness onto a hard drive that will allow his sentient self to continue to live within the machine. It’s an interesting concept that ought to cause one to reflect on what makes us human: is it the physical body of flesh and bones that can move and act? Or is it the non-physical collection of memories, thoughts, and personality? Many people have had their heads cryogenically frozen after death, in hopes that someday their minds can be restored within a synthetic body and they can regain life. But that isn’t what this movie is about.

“Transcendence” is a dull, slow-moving behemoth emitting loud, unexpected bursts of explosive violence.

The plan works, and Will speaks from the computer screen after his death. However, Max immediately and inexplicably regrets having uploaded Will to the machine, so he joins forces with the terrorists (who also join forces with the government — it’s hard to keep all the factions and their motivations straight) to stop Will from doing what he was uploaded to do. Meanwhile, Evelyn joins forces with Will and together they build a solar power grid in an isolated Nevada desert to give Will enough juice to mingle with every scintilla of the Internet.

Yes, this makes Will omniscient, omnipresent, and all-powerful. And that’s a definition of God, right? Will is treated like God’s evil twin, set on sucking up all the power in the universe (there’s that metaphor of the power grid). But he doesn’t do anything evil. He doesn’t steal power from others; he creates his own from the sun — and he pays the Nevada residents a good wage to work for him. He doesn’t kill anyone, destroy anything, or even growl menacingly. In fact, he uses his power to refine his work in nanotechnology, and soon he is able to heal the blind, make the lame walk, and restore life to those who have been killed. (In case you hadn’t noticed, this is godlike too.) As they are healed, his new followers become like him — imbued with the Internet and able to communicate with Will through his spirit — that is, the Cloud.

This storyline has the potential for being eerie and scary, à la Invasion of the Body Snatchers; but Will’s followers don’t do anything wrong either, and they aren’t at all zombie-like. They are just humans who once were disabled and now can see, walk, lift, run, hear, and produce. How is that different from living with a pacemaker or a titanium hip, and carrying around a smart phone that keeps one constantly connected to the Internet? Nevertheless, the government is determined to take them out, simply because they are now created in Will’s image and have the ability to communicate worldwide.

All of this has the potential for philosophical discussion, but I had to use all my creativity as a literature professor to reach the potential message I’ve described here. The message is beyond subtle — so subtle, in fact, that I think it went over the director’s own head. I’m not sure what he intended to suggest, except possibly that the collision between mankind and technology is usually good for a studio green light. I doubt that he was even aware of the potential metaphor or deeper meaning of the film he created.

Ah. There’s that word again. “Potential.” A film that had transcendent potential is instead fraught with withering disappointment, wasted talent, and lost opportunities. Insomniacs will love it.


Editor's Note: Review of "Transcendence," directed by Wally Pfister. Alcon Entertainment, 2014, 119 minutes.





Noah Sails Where No Rock Ever Sailed Before

 | 

Myth has been defined as “a story, often employing superhuman figures and supernatural events, that attempts to explain something about the world and that expresses some important value or belief of a culture or society” (Howard Canaan, Tales of Magic from around the World). Myths have simple plots with few specific details; their meaning can evolve over time to represent changing cultural values. This is what director Darren Aronofsky has done with his epic new film, Noah — he has created just such a myth.

Audiences who want to see the biblical story of Noah will be disappointed. Judeo-Christian believers, indeed, will be offended and outraged by the movie’s inaccuracies, which are laughable from the biblical point of view. (Believers will know they’re in trouble when they see the film begin with the words, “In the beginning there was nothing.”) Aronofsky has written a new myth for modern times. It is no longer the story of a prophet who was chosen by God to build an ark and repopulate the earth after everyone else drowns. The conflict is no longer between God and Satan but between humans and Rock People (representing the earth — more on them below). The Rock People communicate with a Creator, but humans do not communicate with God. The purpose of religion is not to forge a relationship between God and people but to protect the earth and the animals. “If anything happens to one of these creatures, it will be lost forever,” Noah warns his sons, but he has no similar concern for humanity. As Noah walks among the wicked community that is about to be destroyed by the flood, he observes many gruesome acts, but the pinnacle of their depravity is presented as the cleaving of animal carcasses for cooking. Methuselah explains, “Men will be punished for what they have done to this world,” not for what they have done to one another. Noah is a modern myth that represents the hegemonic values of today.

Aronofsky fractures the Bible, combining snippets from several biblical stories and pretending that they all happened to Noah.

Maybe it was because I had just seen the new Mr. Peabody and Sherman movie, but Rocky and Bullwinkle’s Fractured Fairy Tales kept coming to mind while I was watching Noah. Aronofsky fractures the Bible, combining snippets from several biblical stories and pretending that they all happened to Noah, including Eve’s attraction to the serpent, Cain’s murder of his brother Abel, Elisha’s army of angels, and Abraham’s near-sacrifice of his son Isaac. It’s the most bizarre concoction, yet I’m sure that many gullible filmgoers went home saying, “Wow! I didn’t know Noah almost killed his granddaughters!” You see, in order to understand Rocky’s Fractured Fairy Tales, you had to know that Sleeping Beauty did not eat a poisoned apple.

And here’s another thing you probably wouldn’t know was in the Bible if you didn’t see this movie: God did not create humans — some crazy giant Rock People did. These Rock People look like piles of boulders until they pull themselves together, Transformer-like, and start stomping around the earth. They have multiple arms and glowing eyes and thundering voices à la Optimus Prime and are a whole lot more exciting than the voice of God. They create an eerie static hum whenever they’re close by, and they strike fear into the hearts of men. Except the hearts of the ones they like.

According to the Book of Aronofsky, these Rock People came from outer space as meteors of light and got stuck in the muck of primordial creation. They made humans out of the dust of the earth, and as the film opens they’re really mad at themselves for doing it because humans really suck. But you already know that, if you’ve been reading the newspapers lately.

See, it turns out that animals are “the innocents” and “man broke the world.” Eve’s real treachery wasn’t curiosity or disobedience or a desire for wisdom; it was that she brought children to the earth and allowed her descendants to wreak havoc there. Now Noah’s wife wants to do the same! But Father Noah Knows Best. He is determined to put all the animal pairs onto the ark and save only his three sons, his post-menopausal wife (Jennifer Connelly), and one barren girl (Emma Watson) so that humans cannot repopulate the earth and ruin it again. His job is simply to keep the animals safe until the flood subsides, and then quietly let humans become extinct.

It probably doesn’t surprise you that in this movie, Noah never communicates with God, or vice versa. So where does he get the idea of building the ark? From the dregs of a psychotropic tea given to him by Methuselah (Anthony Hopkins). Methuselah, by the way, has supernatural powers, but when he uses them to cause a wonderful and necessary miracle, Noah gets pretty ticked off and starts grabbing for daggers. Actually, there is very little to set Noah apart from the wickedness around him. He wields an ax and a bludgeon with the best of them, and he can be pretty heartless.

Despite the assertion that “in the beginning there was nothing,” there are deists in the movie. They just aren’t the prophets. The Rock People talk directly to a Creator, and so does Tubal-cain (Ray Winstone), the leader of the wicked nomads and a descendant of Cain, who killed his brother Abel and was forced to bear a mark on his forehead as the first murderer. Tubal-cain doesn’t actually appear in the biblical story of Noah, but Aronofsky throws him in, probably because he became a blacksmith (Genesis 4:22) and is credited by many scriptorians with inventing weapons of war. While Noah is drinking the psychedelic Kool-Aid, Tubal-cain is calling out to God, “Speak to me! I am a man, made in your image. I am like you — I give life and I take it away. Why will you not converse with me?” Meanwhile, the priesthood that has been passed from Adam to Methuselah to Noah is embodied in the skin of the serpent that tempted Eve in the Garden. This is no mere fracture of the tale; Aronofsky delights in making the godly evil and evil godly.

The Rock People have multiple arms and glowing eyes and thundering voices à la Optimus Prime and are a whole lot more exciting than the voice of God.

Sure, I get it: Hollywood doesn’t like references to God (approving ones, anyway). And yes, I know the difference between Sunday School and the Cineplex; I wasn’t expecting a sermon. But why make a movie about Noah if you are going to leave out the driving force behind the story? It’s like making Clash of the Titans without Zeus or Poseidon. Why not just make a movie about Rock Giants that duke it out with brutal nomads while one family escapes in a boat with a bunch of animals? Let those of us who know how to read leave the theater saying, “Wow, did you catch those references to Noah?” instead of “Man, did he ever get that wrong!”

What drew Aronofsky to the story of Noah in the first place? I suspect it was the same characteristics that have kept myths alive for centuries. Archetypal characters, iconic conflicts, and simple truths about human nature resonate with us. One does not have to be religious to experience the resonance of biblical stories, nor should religious people be offended that I categorize biblical stories as myth. Contrary to popular opinion, “myth” does not mean “a lie,” or “a story that is not true.” Myths express “not historical or factual truth, but inner or spiritual truth. They are the shared stories that express insights about human nature, human society, or the natural world” (Canaan). Myths are so profound that they transcend the need to be factual. In fact, they can even transcend Hollywood’s need to be cynical.

Despite my criticism of the first two hours of this film, I found the conclusion profoundly satisfying. After all the fracturing and twisting and pushing away from humanity, Aronofsky ends with a cathartic moment of transformation and hope. It probably isn’t worth the two-hour journey to get there, and it’s totally out of whack with the source material. But Aronofsky creates a lovely scene of redemption at last.


Editor's Note: Review of "Noah," directed by Darren Aronofsky. Paramount Pictures, 2014, 138 minutes.



