Apes Unlimited


In 1968, at the height of the civil rights movement and the anti-war movement, a film emerged reflecting on the necessity for both. Although it was nominated for two Academy Awards (and earned John Chambers an honorary Oscar for outstanding makeup achievement, a category that did not become an official competitive award until 1981), the original Planet of the Apes is often dismissed as a campy sci-fi costume flick. Yet it addressed important issues about war, technology, and what it means to be human.

Most people know the plot: after being cryogenically frozen and suspended for centuries, three astronauts crash-land on a planet that is remarkably compatible with human life; it has the right atmosphere, temperature, water, and food. The big difference is that on this planet the apes are civilized scientists while the humans are, like Jonathan Swift’s “yahoos” in the fourth part of Gulliver’s Travels, uncivilized brutes. No one who has seen the film can forget the shocking sight of the torch of liberty projecting from the sand in the final scene. The message was clear: this will be our future if we do not change our course.

POTA was followed by four sequels in rapid succession (1970, ’71, ’72, ’73) and a TV series in 1974. Now, nearly 50 years later, the message is just as timely: wars erupt as cultures clash around the globe. An African-American is in the White House, but government-promoted racism continues to flourish. Laboratory experiments change our food into something not-quite-natural, while genetically changed strains of viruses and biowarfare threaten our DNA. It’s not surprising that a new set of prequels should emerge that imagine a prelude to the 1968 POTA and offer a similarly cautionary message about war and civil rights. The newest film is not only just as timely, but even more sinister.

No one who has seen the original Planet of the Apes can forget the shocking sight of the torch of liberty projecting from the sand in the final scene.

As Rise of the Planet of the Apes (2011) ends, an experimental cure for Alzheimer’s disease has mutated into a deadly virus that has led to the near demise of the human race while causing the apes (on whom the drugs were tested) to develop language and technological skills. (See my review in Liberty.) Now we have Dawn of the Planet of the Apes, and it is a surprisingly satisfying addition to the franchise, despite being a bit slow in the first half.

As Dawn opens, apes now populate the woodland north of San Francisco, and they use weapons, ride horses, and plan strategies as they hunt deer (yes, these apes apparently have become carnivorous). The opening shot, looking up from the floor of the forest at dozens of apes swinging from treetop to treetop, is both eerie and beautiful. But the humans have not become extinct. A few were able to survive the “simian flu” and are now living in isolated camps in San Francisco (and possibly in other pockets around the world). Inevitably, the world of the apes and the world of the humans collide when the humans enter the forest to look for a way to repair a dam that could provide hydroelectric power to the city.

The film makes a strong case for the idea that reactions to actions, not the actions themselves, lead to war, and that appropriate reactions can avert it. Refreshingly, the film does not imply, as one might expect, that humans (especially white humans) are always bad, and animals (especially black animals) are always good. Instead, there are good and bad characters in both groups. Carver (Kirk Acevedo) is a trigger-happy human who shoots when scared. His foolish action could lead to either retaliation (war) or conciliation (patrolled borders) from the apes. Koba (Toby Kebbell) is a bitter ape who fears humans and wants war. He seems to have read Saul Alinsky’s playbook about how to use deception to influence public opinion. Under Koba’s leadership, the apes lock up the humans and their own peaceful dissenters, and steal weapons from the human arsenal. In a subtle nod to George Orwell’s Animal Farm, apes can be seen in the background painting a list of rules on the wall of their dwelling, a list that begins with “Apes do not kill apes” and that takes on bitter irony after Koba usurps Caesar’s role as leader.

On the other hand, Caesar (Andy Serkis) and Malcolm (Jason Clarke) try to lead the apes and humans, respectively, toward a negotiated peace. Caesar calms his followers by reminding them, “If we go to war, we could lose all we’ve built — Home. Family. Future.” He then turns to a solution reminiscent of Robert Frost’s “Mending Wall” (“good fences make good neighbors”) by delineating boundaries between the two groups. Cross this line, and we fight. Malcolm recommends similar restraint with the humans. I like this suggestion that we judge others by their actions, not by their pedigree. Of course, Koba prevails, and the second half of the film is a tense, action-packed battle between humans and apes as Caesar and Malcolm try to restore détente.

The original POTA introduced a then-groundbreaking prosthetic technique that allowed actors playing apes to move their cheeks and lips and show emotion on their faces. It was so innovative that designer John Chambers won the Oscar for his achievement. Now the apes are made to move and talk in a completely different way. That’s Andy Serkis, king of the motion-capture creatures (Smeagol-Gollum, King Kong, Caesar in the previous film) playing Caesar the Ape, but he isn’t wearing a hairy body suit or a prosthetic mask; he’s wearing a tight-fitting body suit studded with sensors that record his movements. Those are his expressive eyes we see on the screen, but his ape’s body is drawn through computer-generated “motion capture” techniques using the patterns recorded by the sensors attached to his body. The ape is then drawn over the movements, complete with fur, scars, and expressions.

Man fades into the shadows while apes run toward the sunlight, signaling the rise of the apes and the end of humankind’s reign on the earth.

Consider that there might be dozens of apes or other CG animals in each scene, that each ape has to be drawn individually on each frame, that there are 24 frames per second in this 130-minute film, that for much of the film the apes are communicating in an intricate form of sign language, and that it all looks so real that you forget it’s animation, and you begin to appreciate what a work of art this film is.

Film uses a language of its own to create metaphors. In this one, a fiery backdrop during a battle scene reminds us visually that “war is hell.” Similarly, as the film ends, a man fades into the shadows while apes run toward the sunlight, signaling the rise of the apes and the end of humankind’s reign on the earth. In this version, the apes use sign language when communicating with one another; I expect that in the next, the humans will have devolved to the yahoos that Taylor (Charlton Heston) found when he “crash-landed to earth” nearly 50 years ago. If that film is anything like this one, it will be well worth watching.


Editor's Note: Review of "Dawn of the Planet of the Apes," directed by Matt Reeves. 20th Century Fox, 2014, 130 minutes.





Point Counterpoint


Dinesh D’Souza is a debater beyond compare. I have watched him debate at least a dozen times, and he is simply brilliant in the way he sets up his opponent, recognizes the opponent’s position, and then systematically takes it apart and refutes it. Once when he was debating Christopher Hitchens on the value of religion, Hitchens called D’Souza’s bluff by not making his own case, thereby giving D’Souza nothing to tear apart. Undaunted, D’Souza first told the audience what Hitchens should have said about the bad things that have happened in the name of religion, and then went ahead with his own side of the debate, never missing a beat and managing to stay within his time limit to boot.

I thought about those debating skills while watching D’Souza’s new movie, America: Imagine a World Without Her. The film begins with an imagined reenactment of a Revolutionary War battle in which Washington dies and America never comes into existence. What might the world look like without the American philosophy? He then plays devil’s advocate, listing five significant areas in which Americans should feel deep shame:

  1. Theft of lands from Native Americans, and genocide against them
  2. Theft of the American Southwest from Mexico
  3. Theft of life and labor from African-Americans
  4. Theft of resources from around the world through war and expansionism
  5. Theft of profits from consumers through capitalism (“You didn’t create that business — someone else built those roads, educated those employees, etc.”)

Watching this part of the film, especially as the first three points were elaborated, I nodded my head in agreement and disgust. These were terrible events that blot our nation’s history. How, I wondered, would D’Souza debate his way out of this one?

D’Souza then steps back to give context and historical background to these situations. He does not denigrate or trivialize the suffering of the people involved, but he widens the story to give a broader perspective. By the time he is finished we feel humbled by the bad things, but no longer shamed by our history. In fact, our pride is restored for the good that we have accomplished, despite our slowness sometimes in getting there. Quoting both Martin Luther King and Abraham Lincoln, he calls the equal rights promised in the Declaration of Independence a “promissory note” that took decades — nay, two centuries — to pay off, and indeed is still a promissory note in some instances.

By the time D’Souza is finished we feel humbled by the bad things, but no longer shamed by our history.

I was especially pleased that D’Souza included a segment on Madam C.J. Walker, the first black American woman to become a millionaire. Walker made her million manufacturing and selling cosmetics and pomades for African-Americans. She started as a cotton picker, worked her way up to cook, and saved her money to start her business. She is a true entrepreneurial hero who is often overlooked in the history books, I think, because she doesn’t fit the cult of victimhood ascribed to blacks and women, and because she made it on her own through entrepreneurship, not through political activism. I only know about her because her mansion is a mile from my house. (It survived the Roosevelt wealth tax devastation by serving as a tax-exempt old folks home for several decades, but is now a private residence again.) Now, thanks to D’Souza’s movie, others will know about this American entrepreneurial hero.

I would have been happy if the film had ended there, but then D’Souza turns to his opponents in this debate, such people as Boston University professor Howard Zinn, whose 1980 book A People’s History of the United States 1492–Present has influenced many political activists; and Saul Alinsky, whose Rules for Radicals heavily influenced such politicians and “community organizers” as Hillary Clinton and Barack Obama. Like a good debater, D’Souza defuses the ammunition his detractors might use against him, the business about his recent run-in with the law, by addressing it head-on instead of giving his opponents an opportunity to whisper about it or suggest that he is hiding something. He admits that what he did was wrong (he reimbursed two friends who donated to another friend’s campaign in order to circumvent campaign contribution limits established by law — a law, by the way, that many people consider a violation of the First Amendment right to free speech). D’Souza frames his admission within the context of selective prosecution (some would call it political persecution) in retaliation for his previous film, 2016: Obama’s America.

America: Imagine a World Without Her opened this week to coincide with the Fourth of July. It is an impressive piece of filmmaking, not only for its well-structured arguments but for its production values. Producer Gerald Molen, who won an Oscar as producer of Schindler’s List, is the man behind the magic. The film is also a featured selection at the Anthem Libertarian Film Festival as part of FreedomFest at Planet Hollywood in Las Vegas next week (information about FilmLovers Passes is at anthemfilmfestival.com).


Editor's Note: Review of "America: Imagine a World Without Her," directed by Dinesh D’Souza and John Sullivan. Lionsgate, 2014, 103 minutes.





Ain’t That a Shame?


“You’d be like heaven to touch, I want to hold you so much.” Is there, one reviewer asks, a more perfect lyric in the world? The lyrics of the Four Seasons expressed all the yearning of unrequited love. I can still remember the party where my adolescent heart was stirred while that song played in my mind. “Can’t take my eyes off of you,” I hummed softly, but his eyes adored someone else. Oh what a night — the music of our youth stays with us and has the power to evoke long-dormant memories and emotions.

That’s one reason that Jersey Boys (like Mamma Mia) has had such a long and successful run on Broadway, playing to people who often sing along (to the annoyance of the person in the next seat). The Four Seasons were the “other” ’60s sound — not rock and roll and not Motown but simple, true lyrics sung in clear, clean harmonies with that strong countertenor of Frankie Valli set in just the right key for female teenyboppers. I learned how to sing harmony with the Four Seasons. Theirs was a sound you could play in front of your parents.

Sinatra, another Frank who made it out of Jersey through his glorious voice, is next to the Pope in this story — quite literally.

Their personal lives were another story, however — normalized at the time but recently placed in another light by the Broadway musical and now the film. As represented by the movie, the boys from Jersey — Tommy, Nicky, Joe, and Frankie (Bob was from a nicer background) — were little more than hoodlums, knocking over delivery trucks and breaking into jewelry stores when they were supposed to be in the library. They knew the beat cops by name, and for some of them the local detention facility was like a revolving door, as the characters gleefully admit in the film. Of course, this is the way it’s remembered by Frankie Valli and Bob Gaudio, executive producers of the film; Tommy, Nicky, and Joey might remember it quite differently.

“There were three ways out of the neighborhood,” Tommy DeVito (Vincent Piazza) tells the audience. “Join the army, join the mob, or become famous.” The first two could get you killed, so singing was the ticket out. Sinatra, another Frank who made it out of Jersey through his glorious voice, is next to the Pope in this story — quite literally. Their photos are set in a double frame and stand like a shrine of hope on the living room shelf of Frankie’s childhood home.

The first half of the film focuses on the boys’ backgrounds and their slow rise to fame through seedy nightclubs and bowling alley bars. Waiting over an hour for the first familiar song to appear in this film heightens the drama at its unveiling. I was tapping my foot impatiently. But when it finally arrives it reminds us of how sublime their harmonies were, and how simple their lyrics: “She-e-e-rry, Sherry baby, She-e-erry, Sherry baby. She-eh-eh-eh-eh-erry baby. Sherry baby. Sherry, won’t you come out tonight?” Sheesh! How did that ever make it to the radio? Yet it topped the charts and was followed by hit after hit that told our stories in song.

One of Eastwood’s biggest mistakes was the decision to bring several original cast members and other virtual unknowns from the Broadway stage to the sound stage.

The lyrics of the songs tell the story in the film too, although it all works better in the stage musical, where the production numbers are showcased. Instead of using the lyrics to carry the story forward as most musicals do, Eastwood inserts them almost like a sidebar to the story he prefers to tell. In the film the songs often play in the background, frequently while the characters are speaking, so the effect is lessened.

The huge theater where I saw the movie held exactly four viewers at the 7:15 show on opening night. Four Fans for the Four Seasons. Sigh. Despite the popularity of the Broadway musical (and Clint Eastwood as producer and director), the film had a disappointing turnout for its opening day. But there’s the rub: Clint Eastwood. Who would have thought this talented octogenarian director known for his spare direction and raw drama would turn to the Broadway musical genre this late in his career? Oh wait — he already did, and it was a disaster. Eastwood starred as the singing prospector who shares a wife (Jean Seberg) with his partner (Lee Marvin, who has purchased her from a polygamous Mormon) in Lerner and Loewe’s Paint Your Wagon (1969), a movie based very loosely on the 1951 play that ran for only 289 performances. Eastwood was ridiculous in that film, and he brings no genuine experience to the filming of this musical. He also uses actors with no genuine experience on screen, intensifying the problem.

One of Eastwood’s biggest mistakes was the decision to bring several original cast members and other virtual unknowns from the Broadway stage to the sound stage. With only one familiar face — Christopher Walken as mob boss Gyp DeCarlo, who acts as a kindly godfather to the Jersey boys — there is no name other than Eastwood’s to attract film audiences. The four who play the Seasons are actually pretty good (Vincent Piazza as Tommy DeVito, Michael Lomenda as Nick Massi, Erich Bergen as composer Bob Gaudio, and Tony Award winner John Lloyd Young as Frankie Valli), but they aren’t, well, they aren’t seasoned. Renee Marino, who plays Frankie’s wife Mary onstage and in the film, is simply annoying with her exaggerated movements and wild outbursts of emotion. I actually went home and looked up her background, expecting to learn that she is Eastwood’s newest girlfriend, but she isn’t. (Remember those godawful movies from the ’70s and ’80s when Sondra Locke was his main squeeze? They were every which way but right.) The most interesting actor is Joseph Russo, also a newcomer, and only because he plays Joe Pesci. Yes, that Joe Pesci. He’s credited in the movie with bringing Bob Gaudio into the group, back when Pesci was just another kid from New Jersey. Eventually Tommy DeVito went to work with Pesci, and Pesci took Tommy’s name for his character in Goodfellas.

The problem is that acting for the screen is quite different from acting for a live audience. A movie screen is 70 feet wide, making the actor much larger than life. The flick of an eyebrow or twitch of a finger can relay emotion and communicate thoughts. Stage actors, on the other hand, must play to the balcony. Their actions are broad, even in tender moments. When Mary leans across a diner table with her butt in the air and her lips pouting forward as a come-on to the inexperienced Frankie, it works for the stage but is comical and unrealistic for the screen. And Eastwood should know, because he is the master of unspoken communication. In interviews Marino gushes about how relaxed and easy-going Eastwood was on set, but she needed direction. Desperately. “I need you, baby, to warm the lonely nights” can be said without words and bring tears to the eyes. Keep it simple, and keep it real. As Frankie says to Bob Gaudio about the arrangement of a new song, “If you goose it up too much it gets cheesy.”

That joy comes through in the closing credits of the film, when the cast members dance through the streets to a medley of songs reminiscent of the curtain-call encore at the end of the Broadway musical.

Overall, Jersey Boys is a good film that provides interesting background about the music industry. Touring and recording aren’t all glitz and glamour; they’re mostly packing and repacking, eating in diners, staying in nondescript hotel rooms where you aren’t sure which way the bathroom is in the middle of the night, missing family events, and in the end getting screwed over by unscrupulous money managers. It’s tough. But the film doesn’t give us much perspective about the Four Seasons and the time period in which they wrote. They were the clean-cut lounge singers who made hit after hit side by side with the Beatles, the Beach Boys, and the Rolling Stones. They held their own during the tumultuous ’60s, just singing about love: “Who loves you? Who loves you pretty baby?” They paved the way for a whole new sound in the ’70s when they added a brass orchestra.

Despite the hardships of the touring life, that wonderful music makes it all worthwhile. When asked to describe the best part of being the Four Seasons, Frankie responds simply, “When it was just us four guys singing under a street light.” Anyone who sings knows that feeling. It’s the joy of making music together.

That joy comes through in the closing credits of the film, when the cast members dance through the streets to a medley of songs reminiscent of the curtain-call encore at the end of the Broadway musical. Wisely, Eastwood used the recordings of the original Four Seasons for the closing credits instead of the voices of the actors who play them in the movie. The difference is profound. Valli had such a glorious bell-like quality to his falsetto, while Young’s is simply false. He tries hard, but the effort shows. In the first hour of the film, when people react to his voice as he is “discovered,” it’s almost puzzling. What’s so great about this nasal voice with the slight rasp that makes you want to clear your throat? In the closing minutes of this film, listening to the original Four Seasons, it all makes sense.


Editor's Note: Review of "Jersey Boys," directed by Clint Eastwood. Warner Brothers, 2014, 134 foot-tapping minutes.





Action Plus Gravitas


Tight shot on the face of a man sleeping. His eye snaps open, and it is yesterday morning — again. He rises, and the day unfolds exactly as it did the day before. No one else knows that the day is being repeated, but he remembers, and he reacts. Each time he learns the best way to react in order to get where he wants to be. With eternity to learn and an infinite number of do-overs until he gets it right, the man develops skills, enhances relationships, and eventually gets the girl.

Groundhog Day (1993) is one of my favorite movies, but that’s not the film I am reviewing here. Edge of Tomorrow relies on the same premise of a never-ending loop in which a man wakes up day after day in the same place, facing the same dilemma, surrounded by the same people doing and saying the same things. But he changes and grows with each repeated day.

As the film opens, an alien force has invaded Europe, burrowed underground, and started spreading across the continent toward England, China, and Russia. Enter Major William Cage (Tom Cruise), a media specialist with the Army who started in ROTC and rose to the rank of Major through office successes; he has never trained for combat, and he has no intention of going to war. When commanded to go to the front lines of a beach invasion in Normandy, he bolts. When next we see him he is handcuffed, stripped of his rank, and forced to join J Squad on the day they are going to invade France. He has no training with weaponry, doesn’t even know how to disengage the safety, and buckles under the weight of his heavy armor.

It is an unusual treat to see Cruise playing a terrified coward who doesn’t know how to fight, since he usually plays the tough guy who is cool as a cucumber under pressure. Of course, before long he is using his repetition of days to build up his skills and learn how to fight so that he can save the world. It’s an impossible mission, but someone has to do it. Helping him is Lt. Rita Vrataski (Emily Blunt), a war hero known as the Angel of Verdun because she almost single-handedly vanquished the alien enemy in a previous battle. That’s because Rita has also experienced the repetition of days and has used that experience to anticipate the enemy’s moves. Together she and Cage fight to reach the source of the alien force and destroy it.

The story line is reminiscent of a video game in which the player adopts a character on the screen and fights through several different levels to accomplish a goal. Each time the player “dies” he has to start over, and each time he plays, he gets a little further in the game by remembering where the booby traps are. Often players work together, telling each other which tunnel or path is safe and which one has a lurking danger. Cage and Rita work together in this way, remembering what happened the “previous day” and moving further each time toward their goal. When Cage says to Rita at one point, “We’ve never made it this far before,” it sounds exactly like my munchkins playing Mario together.

It is an unusual treat to see Cruise playing a terrified coward who doesn’t know how to fight.

This video-game reference does not trivialize the film; it simply gives the viewer something more to ponder about metaphysics, the nature of life, and what you might do if you could see into the future and learn from your mistakes. A do-over once in a while could make all the difference.

Santayana said, “Those who cannot remember the past are condemned to repeat it.” Director Doug Liman has remembered and learned from the past. While Edge of Tomorrow borrows heavily from the concept of Groundhog Day, it is not doomed in any way. Moreover, Liman brings to this project a strong history in action films from his work on the Bourne series. Edge of Tomorrow is fresh, exciting, and compelling. The references to the storming of Normandy give it a sense of gravitas missing from most modern action films (it was even released on June 6, to coincide with the anniversary of the invasion). The threat of a lurking menace that spreads unseen and underground until it has become unstoppable and can enter one’s mind gives the audience a sense of personal investment while suggesting that the enemy is a thought or philosophy, not an army. Even the solution for stopping the enemy — that is, getting inside the enemy’s mind and understanding his perspective — is a powerful lesson for modern warfare. Edge of Tomorrow works on every level.


Editor's Note: Review of "Edge of Tomorrow," directed by Doug Liman. Warner Brothers, 2014, 113 minutes.





Another Perspective on Piketty


Someone acquainted only secondhand with Thomas Piketty’s book translated as Capital in the Twenty-First Century or who has only skimmed it might well dismiss it as a mere leftist, redistributionist tract. That would be a mistake and injustice — and thus counterproductive. Libertarian critics should try to answer Piketty’s findings, attitudes, and recommendations respectfully and seriously (unless, of course, they find themselves converted away from their own doctrine).

His tome of viii + 685 pages, full of tables, charts, and citations, is an impressive work of resourceful scholarship. A massive and detailed web site supplements it. I have neither the time and energy nor the competence to verify his voluminous statistics. Pieced together, as some of them are, from fragmentary sources (such as tax and probate records) of decades and even centuries ago, they must incorporate some elements of interpolation and educated guessing. Still, no reason is apparent for questioning his and his collaborators’ diligence and honesty.

Piketty avoids the pretensions of so much academic economics — decorative mathematics and dubious econometrics. (“[M]athematical models ... are frequently no more than an excuse for occupying the terrain and masking the vacuity of the content,” p. 574.) His book employs only the simplest algebra, and that sparingly; but I did find a few symbols and their definitions bothersome.

Piketty’s case for reforms is not mainly an economic argument but a sustained appeal to the reader’s intuition against extreme inequality.

For example, Piketty makes much of the inequality r>g as the condition of growth of the ratio of capital (wealth) to national income, g being the growth rate of the denominator. The condition would be trivially obvious if r were the growth rate of the numerator, the capital stock; but Piketty usually, and misleadingly, calls it the “rate of return on capital.” That description would apply if all and only the earnings on capital were saved and reinvested. Expenditure of some capital earnings on consumption instead would reduce the growth of the capital stock and the capital-income ratio, as Piketty occasionally mentions; and saving or dissaving from labor income would also affect the ratio’s growth (or shrinkage).
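
To make that point concrete, here is a minimal sketch in my own notation rather than Piketty’s: write K for the capital stock, Y for national income, and β = K/Y for their ratio, and suppose only a fraction s of the return rK is saved and reinvested (saving out of labor income, which the paragraph above also mentions, would add a further term).

```latex
% A sketch under the stated assumption: a fraction s of capital earnings
% is reinvested, and saving out of labor income is ignored.
\[
  \frac{\dot{K}}{K} = s\,r, \qquad
  \frac{\dot{Y}}{Y} = g, \qquad
  \frac{\dot{\beta}}{\beta} = \frac{\dot{K}}{K} - \frac{\dot{Y}}{Y} = s\,r - g .
\]
```

So the capital-income ratio rises only when sr > g; the bare condition r > g is the special case s = 1, in which all and only the earnings on capital are plowed back.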

Nevertheless, Piketty’s compilation of long-term statistics for several countries suggests to him a trend. Only occasionally does he mention that most of his income figures are of income before taxes and before supplementation by government redistribution. Anyway, the long-term trend of the capital-income ratio seems to have been upward, exacerbating the inequality of both wealth and income. The chief historical exception is the period 1914–1945, when wars and depression destroyed so much wealth.

Piketty gives particular attention to the concentration of income and wealth in the top 1%, or even the top tenth or hundredth of 1% of their distributions. He seems particularly concerned about great inherited fortunes and the lavish leisured lifestyles that they make possible (as in novels by Jane Austen and Honoré de Balzac, mentioned as a welcome change of pace from dense argument).

His remedy for great inequality would be not only highly progressive income and inheritance taxes but progressive annual taxes on total wealth itself. He recognizes the political unlikelihood of getting his wealth taxes enacted and enforced, however, because implausibly close international collusion of governments would be required. He draws on the literature of Public Choice little if at all. He supplements his arguments with page after page of the history of taxation in different countries.

Nowhere, as far as I noticed, and to his credit, does Piketty blame inequality for economic crises and depressions or commit the crude Keynesianism of recommending redistribution to raise the propensity to consume. He does not maintain that the apparent trend toward greater inequality will continue without limit. He does not maintain that the extreme wealth of only a few thousand families will give those few tyrannical power over their fellow citizens — far from few enough, actually, to be a coherent oligarchy. Nor does he (or his translator) toss about words like “unfair” and “unjust,” although he does occasionally aspire to more “social justice” and “democracy” in the inexpediently and popularly stretched sense of the latter word.

One might expect concern about inequality to include concern about further concentration of resources and power in the state. However, Piketty does not expect his more drastic and broad-based progressive taxes to raise much more revenue. Nor, perhaps inconsistently with that expectation, does he worry about damaging incentives to work and innovate. Possibly he agrees with John Stuart Mill in thinking that the distribution of wealth can be separated from its production. Possibly, like José Ortega y Gasset’s Mass Man (The Revolt of the Masses, 1930), he regards the wonders of modern industrial civilization as automatically existing, like facts of nature. Although an avowed socialist in the loose European sense of the term, he does not want to destroy capitalism. He even welcomes considerable privatization: government agencies and employees need not themselves provide all the services that tax money pays for.

Wealth is not something that belongs to the government, which it may leave to its producers or redistribute as the country’s rulers see fit.

For Piketty, reducing inequality is a goal in its own right. I agree so far as reducing it means undoing government measures that actually foster it. These include aspects of crony capitalism: subsidies, tax privileges, protection from both domestic and foreign competition, and most of what makes highly paid lobbying worthwhile. Arguably also at others’ expense, a policy of artificially low interest rates benefits Wall Street operators and wealthy stock investors and traders.

As I finished reading his book, I realized that Piketty’s case for reforms is not mainly an economic argument but a sustained appeal to the reader’s intuition, although not explicitly to envy. Intuition presumably carries more weight if the reader comes to share it himself without having actually been told what to think. If so, Piketty’s economic language and massive quantities of ingeniously gathered statistics amount to what I call a Murray Rothbard or Alan Reynolds style of argument: deploy such an array of facts and figures, dates, places, mini-biographies, and even personality sketches that, even if they scarcely add up to a coherent argument, you come across to your reader or audience as a consummate expert whose judgments command respect. But saying so may exaggerate; for Piketty’s tables, charts, and sketches of characters in novels may usefully jog the intuition. Anyway, one should not disparage Piketty’s impressive research and methods and their likely application in projects beyond his own.

As for an intuition against extreme inequality, I confess to one of my own, although it does not mean welcoming heavier and more progressive taxes. We should worry about undermining respect for private property as a human right and essential pillar of any functioning economic system. Wealth is not something that belongs to the government, which it may leave to its producers or redistribute as the country’s rulers see fit.

Still, the intuition persists, as it did with Henry Simons, that saint of the Chicago School of economics in its early days, who found inequality “unlovely,” and as it persisted with Nobelist James Buchanan, prominent libertarian, who advocated stiff inheritance taxes. Somehow, I am uneasy about the pay of executives said to be 600 times as great as the pay of their ordinary workers, even though they may well contribute more than that much to their companies’ revenues. I am uneasy about lifestyles of opulent leisure permitted by great inherited wealth, rare though they may be. I cannot justify or explain my intuition, which, anyway, is not crass envy.

I don’t call on public policy to heed that intuition, any more than I share the apparently spreading expectation that some authority take action against whatever offends somebody, whether the lifestyle, the behavior, the speech, or the suspected thought of someone else. I wouldn’t want an egalitarian intuition implemented in anything like Piketty’s ways. Government measures to alleviate or avoid actual poverty, even beyond the “safety net,” are something quite different.

An intuitive dislike of extreme inequality does not rule out unease at Piketty’s line of thinking. Again, however, I warn libertarians: don’t risk a boomerang effect by unfairly dismissing his work as a mere ideological tract. It is indeed a work of genuine scholarship. Dealing with its challenging ideas can strengthen the libertarian case.


Editor's Note: Review of "Capital in the Twenty-First Century," by Thomas Piketty, translated by Arthur Goldhammer. Belknap Press, 2014.





Mind the Gap


“Capitalism automatically generates arbitrary and unsustainable inequalities that radically undermine democratic societies.” — Thomas Piketty, Capital in the 21st Century

French economist Thomas Piketty’s new book — ranked #1 on Amazon and the New York Times bestseller list — is a thick volume with the same title as Karl Marx’s 1867 magnum opus, Capital. Many commentators have noted the Marxist tone — the author cites Marx more than any other economist — but that’s a distraction.

The author discusses capital and economic growth, and recommends a levy on capital, but the primary focus of the book is inequality. In mind-numbing minutiae of data from Europe and the United States, Piketty details how inequality of income and wealth has ebbed and flowed over the past 200 years before increasing at an “alarming” rate in the 21st century. Because of his demonstrated expertise, his scholarship and policy recommendations (sharply higher progressive taxes and a universal wealth tax) will be taken seriously by academics and government officials. Critics would be wise to address the issues he raises rather than simply to dismiss him as a French polemicist or the “new Marx.”

According to his research, inequality grows naturally under unfettered capitalism except during times of war and depression. “To a large extent, it was the chaos of war, with its attendant economic and political shocks, that reduced inequality in the twentieth century” (p. 275, cf. 471). Otherwise, he contends, there is a natural tendency for market-friendly economies to experience an increasing concentration of wealth. His research shows that, with the exception of 1914–45, the rate of return on property and investments has consistently been higher than the rate of economic growth. He predicts that, barring another war or depression, wealth will continue to concentrate into the top brackets, and inherited wealth will grow faster with an aging population and inevitable slower growth rates, which he regards as “potentially terrifying” and socially “destabilizing.”

If market-generated inequality is the price we pay to eliminate poverty, I’m all in favor.

His proposal? Investing in education and technical training will help, but won’t be enough to counter growing inequality. The “right solution” is a progressive income tax up to 80% and a wealth tax up to 10%. He is convinced that these confiscatory rates won’t kill the motor of economic growth.

One of the biggest challenges for egalitarians like Piketty is to define what they mean by an “ideal” distribution of income and wealth. Is there a “natural” equilibrium of income distribution? This is an age-old question that has yet to be resolved. I raised it in a chapter in “Economics on Trial” in 1991, where I quoted Paul Samuelson in his famous textbook, “The most efficient economy in the world may produce a distribution of wages and property that would offend even the staunchest defender of free markets.”

But by what measure does one determine whether a nation’s income distribution is “offensive” or “terrifying”? In the past, the Gini ratio or coefficient has been used. It is a single number that varies between 0 and 1. If 0, it means that everyone earns the same amount; if 1, it means that one person earns all the income and the rest earn nothing. Neither one is ideal. Suppose everyone earns the same wage or salary. Perfect equality sounds wonderful until you realize that no economy could function efficiently that way. How could you hire anyone else to work for you if you had to pay them the same amount you earn?
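
To make that scale concrete, here is a small Python sketch (with made-up incomes, not data from Piketty) of the standard Gini calculation: the mean absolute difference between every pair of incomes, divided by twice the mean.

```python
# A minimal sketch of the Gini coefficient: 0 means everyone earns the same,
# values near 1 mean one person earns nearly everything. Incomes are invented.

def gini(incomes):
    """Mean absolute difference between all pairs, normalized by twice the mean."""
    n = len(incomes)
    mean = sum(incomes) / n
    if mean == 0:
        return 0.0
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

print(gini([50_000] * 5))                               # 0.0  (perfect equality)
print(gini([0, 0, 0, 0, 250_000]))                      # 0.8  (one earner takes it all)
print(gini([20_000, 35_000, 50_000, 80_000, 150_000]))  # ~0.36
```

Note that with only five earners the “one person takes all” case tops out at 0.8 rather than 1; the maximum approaches 1 only as the population grows.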

A wealth tax destroys a fundamental sacred right of mankind — the right to be left alone.

Even social democrats William Baumol and Alan Blinder warned in their popular economics textbook, “What would happen if we tried to achieve perfect equality by putting a 100% income tax on all workers and then divide the receipts equally among the population? No one would have any incentive to work, to invest, to take risks, or to do anything else to earn money, because the rewards for all such activities would disappear.”

So if a Gini ratio of 0 is bad, why is a movement toward 0 (via a progressive income tax) good? It makes no sense.

Piketty wisely avoids the use of Gini ratios in his work. Instead he divides income earners into three general categories, tracking how they fare over the long term: the wealthy (the top 10% of income earners), the middle class (the next 40%), and the rest (the bottom 50%).

But what is the ideal income distribution? It’s a chimera. The best Piketty and his egalitarian levelers can do is complain that inequality is getting worse, that the distribution of income is unfair and often unrelated to productivity or merit (pp. 334–5), and that it should therefore be taxed away. But they can’t point to any ideal or natural distribution, other than perhaps some vague Belle Époque of equality and opportunity (celebrated in France between 1890 and 1914).

Piketty names Simon Kuznets, the 20th century Russian-American economist who invented national income statistics like GDP, as his primary antagonist. He credits Kuznets with the pro-market stance that capitalist development tends to reduce income inequality over time. But actually it was Adam Smith who advocated this concept two centuries earlier. In the Wealth of Nations, Smith contended that his “system of natural liberty” would result in “universal opulence which extends itself to the lowest ranks of the people.”

Not only would the rich get richer under unfettered enterprise, but so would the poor. In fact, according to Smith and his followers, the poor catch up to the rich, and inequality is sharply reduced under a liberal economic system without a progressive tax or welfare state. The empirical work of Stanley Lebergott, and later Michael Cox, demonstrates that through the competitive efforts of entrepreneurs, workers, and capitalists, virtually all American consumers have been able to change an uncertain and often cruel world into a more pleasant and convenient place to live and work. A typical homestead in 1900 had no central heating, electricity, refrigeration, flush toilets, or even running water. But by 1970, before the welfare state really got started, a large majority of poor people benefited from these goods and services. The rich had all these things at first — cars, electricity, indoor plumbing, air conditioning — but now even the poor enjoy these benefits and have thus risen out of poverty.

Piketty and other egalitarians make their case that inequality of income has been growing since the Great Recession, and they may well be correct. But what if goods and services, what money can buy, became the criterion for measuring inequality? The results might be quite different. Today even my poor neighbors in Yonkers have smartphones, just like the rich. Every spring the 1% attend the Milken Institute Conference in LA, which costs $7,000 or more to attend; the 99% can watch the entire proceedings on video on the Internet a few days later — for free. The 1% can go to the Super Bowl for entertainment; the 99% gather around with their buddies and watch it on a widescreen HD television. Who is better entertained?

Contrary to Piketty’s claim, it’s good that capital grows faster than income, because that means people are increasing their savings rate.

Piketty & Co. claim that only the elite can go to the top schools in the country, but ignore the incredible revolution in online education, where anyone from anywhere in the world can take a course in engineering, physics, or literature from Stanford, MIT, or Harvard for a few thousand dollars, or in some cases, for absolutely nothing.

How do income statistics measure that kind of equal access? They can’t. Andrew Carnegie said it best: “Capitalism is about turning luxuries into necessities.” If that’s what capital and capitalism do, we need to tax it less, not more.

A certain amount of inequality is a natural outcome of the marketplace. As John Maynard Keynes himself wrote in the Economic Consequences of the Peace (1920), “In fact, it was precisely the inequality of the distribution of wealth which made possible those vast accumulations of fixed wealth and of capital improvements which distinguished that age [the 19th century] from all others.”

A better measure of wellbeing is the change in the absolute real level of income for the poor and middle classes. If the working poor in the United States saw their real income (after inflation) double or triple, that would lift them out of poverty. It would mean a lot more to them than the fortunes of the 1%. Even John Kenneth Galbraith recognized that higher real growth for the working class was what really mattered when he said in The Affluent Society (1958), “It is the increase in output in recent decades, not the redistribution of income, which has brought the great material increase in the well-being of the average man.”

Political philosopher John Rawls argued in A Theory of Justice (1971) that the most important measure of social welfare is not the distribution of income but how the lowest 10% perform. James Gwartney and other authors of the annual Economic Freedom Index have shown that the poorest 10% of the world’s population earn more income when they adopt institutions favoring economic freedom. Economic freedom also reduces infant mortality, the incidence of child labor, black markets, and corruption by public officials, while increasing adult literacy, life expectancy, and civil liberties. If market-generated inequality is the price we pay to eliminate poverty, I’m all in favor.

I have reservations about Piketty’s claim that “Once a fortune is established, the capital grows according to a dynamic of its own, and it can continue to grow at a rapid pace for decades simply because of its size.” To prove his point, he selects members of the Forbes billionaires list to show that great wealth always grows faster than the income of the average earner. He repeatedly refers to the growing fortunes of Bill Gates in the United States and Liliane Bettencourt, heiress of L’Oreal, the cosmetics firm.

Come again?

I guess he hasn’t heard of the dozens of wealthy people who lost their fortunes, like the Vanderbilts or, to use a recent example, Eike Batista, the Brazilian businessman who just two years ago was the 7th wealthiest man in the world, worth $30 billion, and now is almost bankrupt.

Piketty conveniently ignores the fact that most high-performing mutual funds eventually stop beating the market and even underperform. Take a look at the Forbes “Honor Roll” of outstanding mutual funds. Today’s list is almost entirely different from the list of 15 or 20 years ago. In our business we call it “reversion to the mean,” and it happens all the time.

Prof. Piketty seems to have forgotten a major theme of Marx and, later, Joseph Schumpeter: that capitalism is a dynamic process of creative destruction. Today’s winners are often tomorrow’s losers.

IBM used to dominate the computer business; now Apple does. Citibank used to be the country’s largest bank. Now it’s Chase. Sears Roebuck used to be the largest retail store. Now it’s Wal-Mart. GM used to be the biggest car manufacturer. Now it’s Toyota. And the Rockefellers used to be the wealthiest family. Now it’s the Waltons, who a generation ago were dirt poor.

Piketty is no communist and is certainly not as radical as Marx in his predictions or policy recommendations. Many call him “Marx Lite.” He doesn’t advocate abolishing money and the traditional family, confiscating all private property, or nationalizing all industries. But he’s plenty radical in his soak-the-rich schemes: a punitive 80% tax on incomes above $500,000 or so, and a progressive global tax on capital with an annual levy between 0.1% and 10% on the greatest fortunes.

There are three major drawbacks to Piketty’s proposed tax on wealth or capital.

First, it violates the most fundamental principle of taxation, the benefit principle. Also known as the accountability or “user pay” principle, it holds that taxation is justified as a payment for benefits or services rendered. The basic idea is that if you buy a good or use a service, you should pay for it. This approach encourages efficiency and accountability. In the case of taxes, if you benefit from a government service (police, infrastructure, utilities, defense, etc.), you should pay for it. The more you benefit, the more you pay. In general, most economists agree that wealthier people and big businesses benefit more from government services (protection of their property) and should therefore pay more. A flat personal or corporate income tax would fit the bill. But a tax on capital (or even a progressive income tax) is not necessarily connected to benefits from government services — it’s just a way to forcibly redistribute funds from rich to poor and in that sense is an example of legal theft and tyranny of the majority.

Second, a wealth tax destroys a fundamental sacred right of mankind — financial privacy and the right to be left alone. An income tax is bad enough. But a wealth tax is worse. It requires every citizen to list all their assets, which means no secret stash of gold and silver coins, diamonds, artwork, or bearer bonds. Suddenly financial privacy, supposedly guaranteed by the Fourth Amendment, becomes an illegal, underground black-market activity.

Third, a wealth tax is a tax on capital, the key to economic growth. The worst crime of Piketty’s vulgar treatment of capitalism is his failure to understand the positive role of capital in advancing the standard of living throughout the world.

To create new products and services and raise economic performance, a nation needs capital, lots of it. Contrary to Piketty’s claim, it’s good that capital grows faster than income, because that means people are increasing their savings rate. The only time capital declines is during war and depression, when capital is destroyed.
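
To see why an annual levy on the stock of capital bites harder than its headline rate suggests, here is a rough back-of-the-envelope sketch in Python, using hypothetical figures (a 5% return and a 2% levy) rather than anything taken from Piketty or from this review.

```python
# Hypothetical figures only: a 5% annual return on capital and a 2% annual
# wealth levy, compounded over 30 years. Because the levy applies to the
# whole stock every year, it acts like a permanent drag on the return itself.

def grow(capital, years, annual_return, wealth_levy):
    for _ in range(years):
        capital *= 1 + annual_return   # the year's return on capital
        capital *= 1 - wealth_levy     # the annual levy on the whole stock
    return capital

start = 1_000_000
print(round(grow(start, 30, 0.05, 0.00)))  # ~4,320,000 with no levy
print(round(grow(start, 30, 0.05, 0.02)))  # ~2,360,000 with a 2% levy
```

Under these assumptions the levy behaves roughly like knocking two percentage points off the return, cutting the accumulated stock by nearly half over a generation.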

He blames the increase in inequality on low growth rates: inequality rises, he says, when the economic growth rate falls below the return on capital. The solution isn’t to tax capital, but to increase economic growth via tax cuts, deregulation, better training and education, higher productivity, and free trade.

Even Keynes understood the value of capital investment, and the need to keep it growing. In his Economic Consequences of the Peace, Keynes compared capital to a cake that should never be eaten. “The virtue of the cake was that it was never to be consumed, neither by you nor by your children after you.”

What country has advanced the most since World War II? Hong Kong, which has no tax on interest, dividends, or capital.

 

If the capital “cake” is the source of economic growth and a higher standard of living, we want to do everything we can to encourage capital accumulation. Make the cake bigger and there will be plenty to go around for everyone. This is why increasing corporate profits is good — it means more money to pay workers. Studies show that companies with higher profit margins tend to pay their workers more. Remember the Henry Ford $5 a day story of 1914?

If anything, we should reduce taxes on capital gains, interest, and dividends, and encourage people to save more and thus increase the pool of available capital and entrepreneurial activity. A progressive tax on high-income earners is a tax on capital. An inheritance tax is a tax on capital. A tax on interest, dividends, and capital gains is a tax on capital. By overtaxing capital, estates, and the income of our wealthiest people, including heirs to fortunes, we are selling our country short. There’s no telling how high our standard of living could be if we adopted a low-tax policy. What country has advanced the most since World War II? Hong Kong, which has no tax on interest, dividends, or capital.

Hopefully Mr. Piketty will see the error of his ways and write a sequel called “The Wealth of Nations for the 21st Century,” and will quote Adam Smith instead of Karl Marx. The great Scottish economist once said, “Little else is required to carry a state from the lowest barbarism to the highest degree of opulence but peace, easy taxes, and a tolerable administration of justice.” Or perhaps he will quote this passage: “To prohibit a great people … from making all that they can of every part of their own produce, or from employing their stock and industry in the way that they judge most advantageous to themselves, is a manifest violation of the most sacred rights of mankind.”


Editor's Note: Review of "Capital in the Twenty-First Century," by Thomas Piketty, translated by Arthur Goldhammer. Belknap Press, 2014.





They Didn’t Want a War


Margaret MacMillan’s The War that Ended Peace gives a fascinating description of the background, stretching back to around 1900, of what she, like people at the time, calls the “Great War.” She relates how the Bosnian crisis of 1908, the Moroccan crises of 1905 and 1911, the crises arising from wars among the Balkan countries in 1912 and 1913, and various minor incidents were successfully muddled through without war among the great powers. The most general source of tension seems to have been fear of being attacked first and concern to make and maintain alliances.

Leading statesmen optimistically expected that tension between Austria-Hungary and Serbia, exacerbated by the assassination of Archduke Franz Ferdinand on 28 June 1914, would somehow be resolved like the earlier crises. Even after Austria-Hungary rejected Serbia’s compliant but not total acceptance of its ultimatum and declared war, hope lingered that the war could be contained.

Few policymakers had wanted war (the main exception perhaps being Franz Conrad von Hötzendorf, Austro-Hungarian Chief of Staff). The German Kaiser was no exception, although he was addicted to impulsive speeches and interviews, liked to strut in military uniform, and even enjoyed fiddling with the detailed design of uniforms (as did his fellow emperors Franz Joseph and Nicholas II).

World War I was a momentous and enduring tragedy. Germany, for one, had everything to gain from continuing peace.

As those examples suggest, MacMillan goes into revealing detail not only about demographic, economic, political, diplomatic, and military situations and events but also about people — royalty, politicians, foreign ministers, diplomats, generals and admirals, journalists, and influential or well connected socialites — together with their backgrounds, illnesses, deaths, and strengths or quirks of personality.

Much of this is relevant to the role of sheer and even trivial accident in momentous history. MacMillan herself notes several examples. The Russian monk Rasputin, whatever his faults, strongly advocated peace and had great influence with the Imperial family; but he had been stabbed by a madwoman on the very day of the Austrian Archduke’s assassination and was recovering slowly, far from St. Petersburg. The Archduke himself had long realized that Austria-Hungary was too weak to risk an aggressive foreign policy. Alfred von Kiderlen-Wächter, German Foreign Minister and in MacMillan’s opinion a force for peace, had died in December 1912. Joseph Caillaux, France’s peace-minded former prime minister, had had to resign as finance minister in March 1914, partly in connection with his second wife’s shooting of an editor who had threatened to publish some indiscreet love letters that Caillaux had sent to her while she was still married to someone else. Although MacMillan does not explicitly raise the question, I was set to wondering how events would have evolved if Otto von Bismarck, a realist who was satisfied with Germany’s international position achieved by 1871, had been alive and in office in 1914. Or what if Gavrilo Princip’s bullet had missed the Archduke?

MacMillan ends her book, apart from a 13-page epilogue, with the outbreak of war in July–August 1914. That is fine with a reader more interested in the consequences of particular wars and in how the wars might have been avoided (as many potential wars no doubt were barely avoided) than with the details of the actual fighting. World War I was a momentous and enduring tragedy. Germany, for one, had everything to gain from continuing peace, including its growing leadership in science and industry. MacMillan writes a gripping story. She conveys a feel of the suspense that must have prevailed during the final crisis. My opinion of her book is overwhelmingly favorable.

Or it would be except for one minor but pervasive and annoying defect. The book is erratically punctuated, mainly but not everywhere underpunctuated. Even independent clauses, often even ones with their own internal punctuation, go unseparated by a comma or semicolon. Restrictive and nonrestrictive phrases and clauses are not distinguished, as clarity requires, by absence or presence of punctuation. Such erratic and erroneous punctuation delays understanding, if usually only for a second. Even so, it distracted me from the book’s fascinating story.

Above all, it distracted me with sustained wonder about how so untypically mispunctuated a book could emerge from a major publishing house. Could the copyeditor have given up in the face of a daunting and tedious task? Could an incompetent editor have imposed the damage, which the author then passively left standing? Could the author have committed the errors herself and then, perhaps out of bad experience with previous copyeditors, have insisted on none of their tampering this time? None of these hypotheses seems plausible, but I can’t think of a better one. The author’s including her copyeditor in her long list of Acknowledgments adds to the mystery.

I’d be grateful if someone could relieve my curiosity with the true story.


Editor's Note: Review of "The War That Ended Peace," by Margaret MacMillan. Random House, 2013, 784 pages.





Memories of War

 | 

Last month I visited the Kamikaze Peace Museum in Chiran, Japan, a small town characterized by cherry-lined streets and what remains of a centuries-old Samurai village. The museum is a moving tribute to the 1,000 or so young men who were ordered to give their lives for god and country (the emperor was considered divine) by flying their planes directly into American targets in the Pacific during the final months of World War II. Chiran was the departure point for most of those flights.

The museum contains photographs of all the men, along with the letters many of them wrote to their families on the eve of their death. These pilots were little more than boys, most of them aged 17–28, some of them photographed playing with puppies as they posed, smiling, in front of their planes. In their letters they urged their mothers to be proud, their sisters to be comforted, their girlfriends to move on without them, and their children to be brave. One man wrote, “I am sorry that Papa will not be able to play horsey with you any more.” Another’s girlfriend leapt from a bridge to her death after she read his letter, and yet another’s wife drowned herself and her children before his flight so he could die without regret. Several of these young pilots were Koreans conscripted into the service against their will. None felt he had a choice; living with the loss of honor would be much more painful than any fear of death. I felt nothing but sadness for these young boys.

Two weeks later I was in Oahu, where over 200 Japanese planes attacked Pearl Harbor in the early morning of December 7, 1941, killing 2,400 Americans, wounding another thousand, and crippling the American fleet. The attack brought America into war in the Pacific. One cannot visit the Pearl Harbor Memorial without feeling profound sadness for the loss of life that day and in the four years that were to come. Yet, having just visited the Kamikaze Peace Museum, I could not hate the men who flew the bombers into Pearl Harbor. The words of Edwin Starr resonated in my mind: “War: What Is It Good For?”

Perhaps it is good for peace. But at what price? I thought of this as I watched The Railway Man, based on the memoir of a British soldier, Eric Lomax (Colin Firth and Jeremy Irvine), who was captured by the Japanese during World War II, forced to help build the railway through Thailand that was immortalized by the Oscar-winning film The Bridge on the River Kwai (1957), and tortured by his captors when he built a radio receiver inside their prison. The title of the film has dual meanings; not only does Lomax help build the railroad through Thailand, but from his youth he has had an obsession with trains and has always memorized details about train schedules, train depots, and the towns that surround train stations. In context, the title also suggests a metaphor for the bridges that are eventually built, through arduous effort, between Lomax and others, including his wife Patti.

None felt he had a choice; living with the loss of honor would be much more painful than any fear of death.

As the film opens, Lomax (Firth) is a middle-aged man who meets a pretty nurse, Patti (Nicole Kidman), on a train. He immediately falls in love with her. (The film implies that this is a first marriage for the shy and socially inept Lomax, but the real Eric Lomax was already married at the time he met Patti. He married Agnes just three weeks after returning from the war, and then divorced her just a few months after meeting Patti on the train. This, and the rest of the story, suggests to me that he returned from the war safe, but not sound.) Eric notes morosely, “Wherever there are men, there’s been war,” and Patti replies with a gentle smile, “And wherever there’s been a war, there’s been a nurse like me to put them back together.”

This introduces the key to their relationship. The war officially ended 35 years earlier, but it still rages in Lomax’s mind. He will need the kind and patient wisdom of a nurse to help put him back together again. His struggle with post-traumatic stress disorder is skillfully portrayed when ordinary events trigger painful memories that transport him immediately to his jungle experiences as a POW. For example, the sound of the shower summons terrifying memories of the water torture he endured at the hands of his brutal captors. The unexpected intrusion of these scenes demonstrates the unending aftermath of war and the difficulty of controlling its horrifying memories.

Wise casting adds to the pathos of this fine film. Much of what I know about World War II has been shaped by the films I’ve seen, and most of those were populated by actors well into their 30s and 40s. But in this film Young Eric (Jeremy Irvine) and his comrades are played by slender boys in their early 20s who can’t even grow a stubble of beard after four days aboard a prison train. They are closer to the tender ages of the soldiers they are portraying, and this increases the pathos of the story and our admiration for the strength and resolve of these boys who are thrust into manhood, much like the kamikaze pilots, before they even know what war is.

The Railway Man is a character-driven film that demonstrates the choices we have, even when it seems we have no choices at all. Jesus demonstrated the power of choice when he said, “If a man requires of you his coat, give him your cloak also” and, “If a man smites you, turn the other cheek.” He wasn’t telling his followers to give up and give in, but to take charge and move on, by invoking the right to choose one’s attitude when it seems that the freedom to choose one’s actions is gone. This film demonstrates that same transformative power of choice.


Editor's Note: Review of "The Railway Man," directed by Jonathan Teplitzky. Weinstein Company, 2014, 116 minutes.





The Apple of Knowledge and the Golden Rule

 | 

Russell Hasan is an author who has contributed a good deal to Liberty. Now he makes a contribution to liberty itself, in the form of two extensive monographs: The Apple of Knowledge: Introducing the Philosophical Scientific Method and Pure Empirical Essential Reasoning, and Golden Rule Libertarianism: A Defense of Freedom in Social, Economic, and Legal Policy. Both works are available online, at the addresses given at the end of this review. And both are very interesting.

I’ll start with The Apple of Knowledge, which itself starts with an account of the author’s quest for wisdom. He did not find it in the lessons of professional (i.e., academic) philosophers, who venerated the counterintuitive claims of earlier professional philosophers, often echoing their conviction that objective truth could not be found. The author turned to the objectivist philosophy of Ayn Rand, but found that “it was truly a political philosophy, and not a rigorously reasoned system of metaphysics and epistemology. Rand’s ideas seemed clever and useful, but they contained contradictions and holes and gaps.”

So, as an intellectual entrepreneur, Hasan decided to see whether he could solve crucial philosophical problems himself. That’s the spirit of liberty.

He states his agenda in this way:

The six problems that this book will solve are: 1. Induction, 2. Consciousness, 3. Knowledge, 4. The Scientific Method, 5. Objectivity, and 6. Things in Themselves.

Hasan believes that these problems can be solved by his versions of “(1) the philosophical scientific method, and (2) pure empirical essential reasoning.”

What does this mean in practice? It means a rejection of dualism and radical skepticism, a reasoned acceptance of the world as empirically discovered by the healthy consciousness. An example:

When you look at this book and say “I am looking at this book, I am reading this book, I am aware of the experience of this book,” and you wonder about what it means to be conscious and to have an experience and to see this book, the only things in the picture are two physical objects, (1) this book, which physically exists and is the thing that you are experiencing and are conscious of, and (2) your brain, which is the consciousness that experiences and is aware of the book by means of the perceptions and concepts in your brain. Similarly, when you see a red apple, the red apple itself is the red that you see, and your brain is the subject which perceives that object and is aware of that object. Nowhere in this picture is there a need to deny that consciousness exists. We need not deny that you really see a red color. We need not deny that you are aware of an apple. And there is also no need to believe in ghosts or non-physical souls as an explanation for your being aware of an apple and seeing its red color.

As this example suggests, Hasan has an admirably clear style throughout. His clarity may also suggest, erroneously, that the problems he addresses are easy to solve, or that he deems them easy to solve. They aren’t, and he doesn’t. For every statement he makes there are time-honored quibbles, evasions, and yes, real challenges. The enjoyment of reading through this fairly long book comes from following Hasan’s own path among the challenges, assessing his arguments, and finding out how many of those arguments one wants to buy.

To this process, a statement of my own ideas can add no particular enjoyment. For what it’s worth — and it isn’t directly relevant to Hasan’s essential concerns — his grasp of Christian and biblical theology could be much stronger. Here’s where the dualism that he rejects asserts itself despite his efforts; he tends to see Christian ideas (as if there were just one set of them) as dualistically opposite to his own: Christians are against the world, the flesh, and the devil, while he is heartily in favor of the first two, at least. But it’s not as simple as that. “World” and “flesh” can mean a lot of things, as a concordance search through St. Paul’s epistles will illustrate. You don’t need to believe in God to recognize the complexity of Christian thought. (And, to digress a bit further, “666” didn’t come “from ancient confusion between the Latin word ‘sextus’ which means six and the Latin word ‘sexus’ which means sex.” No, it originated in the biblical book of Revelation [13:18], and it’s a code term, probably for “Nero.”)

It makes no difference whether you’re smarter or richer than I am, because it requires the same effort — that is, none — for both of us to leave each other alone.

About the philosophical problems that Hasan treats I can say that he usually appears to make good sense — very good sense. His education in the objectivist tradition is evident; his respect for the real world — which is, after all, the world that all philosophy is trying to explain — is still more evident. Both are valuable, and essential to his project. Indeed, Apple of Knowledge can be viewed as a particularly interesting and valuable addition to the objectivist tradition of philosophy that begins with Ayn Rand.

Golden Rule Libertarianism is an exposition and defense of a variety of radical libertarian ideas — about victimless crimes, war and peace, government intervention in the economy, and so on. Few libertarians will be surprised by the results of Hasan’s inquiries in these areas — but what does “Golden Rule Libertarianism” mean?

This represents what I take to be a new approach, though one that is nascent in the libertarian concept of the great “negative right,” the right to be left alone. From this point of view, it makes no difference whether you’re smarter or richer than I am, because it requires the same effort — that is, none — for both of us to leave each other alone. The Golden Rule, most famously enunciated by Jesus but, as Hasan points out, hardly foreign to other religious and ethical teachers, yields a more “positive” approach. “Do unto others what you would have them do unto you.” Yet nobody wants his neighbor to do certain things — to prohibit him from speaking or publishing his views or sleeping with whomever he wants, even on the pretense of helping him. In this sense, the Golden Rule turns out to be just as “negative” as libertarians could wish. As Hasan says in one exemplification of his theory:

if you let me be free to make economic decisions, including what wage to pay and at what price to buy services from other people, then I will give you the same freedom to make your own choices instead of me making your choices for you.

There is a pragmatic dimension to this. In case you are wondering whether letting everyone be free to make his or her own decisions would leave the poor in the lurch, or, to vary the metaphor, in the clutches of an exploitative capitalism that the poor are not capable of turning to their own advantage, Hasan adds:

The best thing you can do for me is to get the government out of my way and let me be free, because capitalism helps the poor more than socialism.

Libertarians understand this, and Hasan provides plenty of reasons for everyone else to understand it too. His book will be valuable to nonlibertarians, because there is something in it for every interest or problem they may have. As he says, in another exemplary passage:

The liberal concern for civil liberties, e.g. my freedom to write atheist books, and the conservative concern for freedom from regulation, e.g. my freedom to buy and sell what I want on my terms, is really two sides of the same libertarian coin, because if the government claims the right to be the boss of your beliefs then it will soon usurp the power to be the boss of your place in the economy and take total control over you, and if the government is the boss of the economy then it will inevitably take over the realm of ideas in order to suppress dissent and stifle criticism of the economic planners.

I believe that Hasan is right to pay particular attention to what he calls “the coercion argument,” which is one of the strongest ripostes to libertarian thought. It is an attempt to argue against libertarian ideas on libertarian grounds. The notion is that if I leave you alone I permit someone else to coerce you. As Hasan says,

Some version of the coercion argument underscores a great deal of anti-libertarian sentiment: poor people will be coerced into selling their organs and body parts, which justifies denying them the right to do so. Poor people are coerced into accepting dangerous, low-paying jobs such as coal mining, or are coerced into working long hours for wages that are lower than what they want. They are coerced into buying cheap high-fat fast food, or are coerced into buying cheap meat, packed at rat-infested plants, and so on. The coercion argument is a thorn in the side of laissez-faire politics, because socialists argue that poor people aren’t really free in a capitalist system where they face economic coercion.

Hasan’s insight into the legal history and ramifications of the coercion argument is enlightening:

An example of the grave seriousness of the coercion myth is legal scholar Robert Lee Hale’s famous law review article “Coercion and Distribution in a Supposedly Non-Coercive State” (1923). Hale brainwashed generations of law students with his argument that capitalist employers exert coercion upon workers, and socialism would not produce more coercion or less freedom than capitalism.

This is a powerful myth, but Hasan has little trouble refuting it. Others are yet more formidable; I would be surprised, however, if even a hostile reader could emerge from a serious consideration of Hasan’s arguments without admitting serious damage to his or her own assumptions.

For libertarian readers, the fun is in seeing what Hasan will do with various popular topics of libertarian discourse — natural rights versus utilitarianism, racial discrimination, gay marriage, an interventionist versus a non-interventionist foreign policy, privatization of education and banking, disparity of wealth, etc. Even well-informed libertarians will be surprised by, and probably grateful for, many of the arguments that Hasan adduces.

Hasan is one of the few questioning occupational licensing, which exacts immense costs from society, and especially from the poor, who must pay dearly for even the simplest services.

I was such a reader — and for me, the book gained in stature because I did not always agree with it. For me, libertarianism is more a matter of experience and less a matter of moral logic than it is for Hasan; but even within the large area of our philosophical, or at least temperamental, disagreement, I found myself admiring his intrepid and intricate, yet nevertheless clear and cogent progression of thought. I suspect that anyone who shares my feeling for the great chess match of political economy will share my feeling about this book.

Not all of Hasan’s many topics can possibly be of intense interest to everyone, but that’s just another way of saying that the book is rich in topics. My heart rejoiced to see a chapter on the evils of occupational licensing — a practice that virtually no one questions but that exacts immense costs from society, and especially from the poor, who must pay dearly for even the simplest services of licensed individuals. And I was very pleased to see Hasan take on many of the most sacred cows of my fellow academics.

One is game theory. Readers who are familiar with game theory and with the part of it that involves the so-called prisoner’s dilemma know that for more than two decades these things have been the favorite pastime, or waste of time, among thousands of social scientists. (If you ask, How can there be thousands of social scientists? or, Why don’t they find something better to do?, see above, under “occupational licensing.”) The tendency of game theory is to deal with people as objects of power, not subjects of their own agency. Its effect has often been to emphasize the statist implications of human action. Hasan cuts through the haze:

The specific refutation of Game Theory and the “prisoner’s dilemma” is that the solution is not for the group to impose a group-beneficial choice onto each individual, it is for each individual to freely choose the right choice that benefits the group. If the benefits of the supposedly right, good-for-the-group decision are really so great, then each individual can be persuaded to freely choose the right, optimal, efficient choice.

My advice is to get the book, which, like the other book, is available at a scandalously low price, read the introductory discussion, then proceed to whatever topics interest you most. You may not agree with the arguments you find, but you will certainly be stimulated by the reading.


Editor's Note: Review of "The Apple of Knowledge: Introducing the Philosophical Scientific Method and Pure Empirical Essential Reasoning," and "Golden Rule Libertarianism: A Defense of Freedom in Social, Economic, and Legal Policy," by Russell Hasan. 2014.





It’s Smart, It’s Exciting, It’s Fun

 | 

 

The specific details of a superhero movie plot seldom really matter; all we usually need to know is that an evil superpower, sporting a foreign accent, is out to destroy the world as we know it, and it is up to the superhero not only to protect the community from destruction but also to preserve our way of life. Dozens of superheroes have been created in comic-book land, and all of them have been sharing time on the silver screen for the past decade or more, with half a dozen of their adventures released this year alone. So far audiences are flocking to theaters with the same enthusiasm that kept our grandfathers heading to the local cinema every Saturday afternoon to see the latest installment of Buck Rogers.

These films tend to reflect the fears and values of whatever may be the current culture, which is one of the reasons for their lasting popularity. We see our worst fears in the threats posed by the enemies, and our hopes and fears in the characters of the heroes. But lately those heroes have been somewhat reluctant and unsure of their roles as heroes, and the people they have sworn to protect have been less trusting and appreciative — they complain about collateral damage and even question the heroes’ loyalty. In an era of relativism and situational ethics, a full-on hero with overwhelming power seems hard to support.

The Avengers share conversations praising freedom and choice, and they reject blind obedience in favor of making their own decisions.

This month it’s Captain America’s turn to save the day. Created by Jack Kirby and Joe Simon in 1941, Captain America (alter ego: Steve Rogers) is a WWII soldier who is transformed from a 5’4” wimp to a 6’2” muscle man through a scientific experiment intended to create an army of super warriors. He ends up being cryogenically frozen and is thawed out in modern times. Part of his appeal is his guileless naiveté, especially as he reacts to modern technology and mores. He uses his virtually indestructible shield to fight for truth, justice, and the American way (okay, that’s the other superhero, but their morals are virtually the same). I like Captain America’s shield — it signifies that his stance is defensive, not aggressive.

As The Winter Soldier opens, nothing is going right for the Avenger team led by Nick Fury (Samuel L. Jackson) and Captain America (Chris Evans). Police, government agencies, and even agents of SHIELD (Strategic Homeland Intervention, Enforcement and Logistics Division, the organization that oversees and deploys the superheroes) are attacking them and treating them as national enemies. The Captain and former Russian spy Natasha (Scarlett Johansson), aka the Black Widow, have become Public Enemies number 1 and 2, but they don’t know why. They spend the rest of the movie trying to clear their names and save the world, without any help from the government they have sworn to uphold.

While the specific plot isn’t particularly important in these movies, motivation usually is. Why do the characters do what they do? Meaningful dialogue inserted between the action scenes reveals the values of both good guys and bad guys, and away we go, rooting for the guy who is going to save us once again.

I’m happy to report that Captain America: The Winter Soldier lives up to its potential. As a libertarian, I can agree with most of the values it projects. First, politicians, government agencies, and the military-industrial complex are the untrustworthy bad guys in this film, and for once there isn’t an evil businessperson or industrialist in sight. Additionally, the Avengers share conversations praising freedom and choice, and they reject blind obedience in favor of making their own decisions. For example, the Falcon (Anthony Mackie), aka Sam Wilson, tells Steve about his buddy being shot down in the war, and then says, “I had a real hard time finding a reason for being over there after that.” Captain America admits, “I want to do what’s right, but I’m not sure what that is anymore.” Like Montag in Bradbury’s Fahrenheit 451, he is ready to think for himself and determine his own morality. (Compare that philosophy to Peter Parker [Spider-Man] being told by his wise Uncle Ben that responsibility is more important than individual choice in Spider-Man 2, followed by Uncle Ben’s death when Peter chooses “selfishness” over responsibility.)

Meanwhile, the Secretary of State (Robert Redford — yes, Robert Redford! He said his grandchildren like the franchise, so he wanted to do the film for them) says cynically of a particular problem, “It’s nothing some earmarks can’t fix.”

The mastermind behind the assault on freedom (I won’t tell you who it is, except that it’s someone involved in government) justifies his destructive plan by saying, “To build a better world sometimes means tearing down the old one” and opining that “humanity cannot be trusted with its own freedom. If you take it from them, they will resist, so they have to be given a reason to give it up willingly.” Another one adds, “Humanity is finally ready to sacrifice its freedom for security,” echoing Ben Franklin’s warning. These power-hungry leaders boast of having manufactured crises to create conditions in which people willingly give up freedom. This isn’t new, of course. Such tactics are as old as Machiavelli. Yet nothing could feel more current. I’m happy to see young audiences eating this up.

Captain America first appeared on film in 1944, at the height of WWII. He has never been as popular as Superman, Batman, or Spider-Man. A made-for-TV movie aired in 1979, and a dismal version (with a 3.2 rating) was made in 1990. However, the latest incarnation, with Chris Evans as the wimp-turned-military powerhouse, has been highly successful, with three films released in the past four years: two self-titled films (Captain America: The First Avenger in 2011, and this one) as well as one ensemble outing (The Avengers, 2012).

These power-hungry leaders boast of having manufactured crises to create conditions in which people willingly give up freedom. This isn’t new, of course.

One of the things I like about the Avengers is that they aren’t born with innate superpowers à la Superman or X-Men; for the most part their powers come from innovation, technology, and physical training. They’re gritty and real, and they bruise and bleed. Directors Anthony and Joe Russo were determined to make this movie as real as possible too, so they returned to live-action stunts whenever they could instead of relying on CGI and green-screen projection. Yes, they use stunt doubles when necessary, but, as Anthony Mackie (the Falcon) reported in praise of the Russos, “if they could build it [a set piece], they built it. If we [the actors] could do it [a difficult maneuver], we did it. . . . That’s why the movie looks so great.” Many of the action scenes are beautifully choreographed and often look more like dancing than fighting, especially when Captain America’s shield is ricocheting between him and a gigantic fighter plane.

Of course, the film has its share of corniness too. When you’re a hero named Captain America, you’re expected to be a rah-rah, apple-pie American, and Captain America is. He even drives a Chevy, the all-American car. So does Nick Fury (Samuel L. Jackson), who brags about his SUV with a straight face as though it’s a high-end luxury vehicle. In fact, all the SHIELD operatives drive Chevys, as do many of the ordinary commuters on the street. That’s because another concept that’s as American as apple pie is advertising. Product placement permeates the film, but most of the time it’s subtly and artfully done. Captain America wears an Under Armour t-shirt (which is pretty ironic when you think about it — under armor beneath a super-hero uniform), and the Falcon, whose superpower is a set of mechanized wings that let him fly, sports a small and subtle Nike swoosh on his after-hours attire. (Nike — the winged goddess, get it?)

Captain America is a hit, and for all the right reasons. The dialogue is intelligent, the humor is ironic, the action sequences are exciting, and the heroes are fighting for individual freedom. It even contains a theme of redemption. And for once, the bad guys aren’t businessmen. Ya gotta love it.

Editor's Note: Review of "Captain America: The Winter Soldier," directed by Anthony Russo and Joe Russo. Walt Disney Studios, 2014, 136 minutes.




