Dickie Eklund's Punch-Out!!


Do we really need another rags-to-riches movie about boxing? Probably not. But filmmakers keep making them, and we keep watching them. Whether you like boxing or not, there is something cathartic about the hero's struggle itself. Like the best boxing movies, the latest one is more about the fighter than the fight, more about the family duking it out outside the ring than the boxing going on inside it. We can always use another film about family dynamics and the will to overcome obstacles, and The Fighter is one of those.

Dickie Eklund (Christian Bale) is the classic small-town hero, still basking in the glory of a quasi-victory 14 years earlier, in a bout where he knocked down Sugar Ray Leonard. Not knocked out, mind you, but knocked down. And some say that Sugar Ray actually tripped. Nevertheless, Dickie is called “The Pride of Lowell,” and as this film begins he is swaggering down the street in that Massachusetts town with an HBO film crew in tow, documenting his “comeback” as a trainer for his younger half-brother, Micky Ward (Mark Wahlberg).

Micky’s manager is his mother Alice (Melissa Leo), a hard-driving, chain-smoking, no-nonsense matriarch in tight pants and high heels. Leo is over-the-top perfect in this role, from the moment she prances into the gym, clipboard in hand, and asks the film crew, “Did you get that? Do you need me to do it again?”

Alice is the ultimate stage mother: pushy, strong, manipulative, and naively confident in her ability to manage her sons’ careers. “You gonna let her talk to me like that?” she rages at Micky when his girlfriend Charlene (Amy Adams) stands up to the rude and domineering matriarch. “I have done everything for you!” she screeches. She is also a classic enabler. Like many mothers who know how to give affection but don’t know how to parent, Alice sees no wrong in Dickie, and her constant sympathy and approval contribute to his sense of entitlement and its disastrous consequences.


The story of fraternal conflict is as old as Cain and Abel, Ishmael and Isaac, Esau and Jacob. In this case, two brothers vie for their parents’ love and attention, while trying to work out their own relationship. Dickie is clearly Alice’s favorite, but since he is virtually washed up as a boxer, Micky becomes the family’s new great hope — even the seven sleazy sisters bask in the reflected light of their brothers’ moments of fame. Dickie is the son mired in past glory, and Micky is the son trying to break away and rise above his toxic roots. But Micky is constantly pulled back by his love for his crazy family, and especially by his childlike love for his older brother.

As with many small-town heroes, adulthood has not been good to Dickie. He hangs out in bars and crack houses when he should be in the ring training and sparring with Micky. He shows up several hours late for training — while the HBO cameras keep rolling. He dives out the back window of his girlfriend’s house when he hears his mother coming, afraid of her disapproval. Nevertheless, throughout the first half of the film, Dickie is high on life, hopped up, and wide eyed. His backstreet swagger oozes confidence and joy.

Partway through the film, however, we realize that the documentary isn’t going to be about Dickie’s comeback as a fighter and trainer; it’s going to be HBO’s High on Crack Street: Lost Lives in Lowell (1995). Richard Farrell, who directed the HBO doc, plays a character much like himself in a cameo as a cameraman in this film. Of course, Dickie and his family don’t know the true topic of the documentary, and their moment of realization is devastating, performed by each actor with understated, pitch-perfect emotion. Farrell captured the tragedy of crack addiction in the real documentary, and Russell does it again with this film. (Note to self: never trust anyone with a movie camera, no matter what the person tells you is being filmed.)

Boxing movies are never really about the fights; they’re about the fighter, so this film is aptly titled. It’s not about one fighter, though, but about several — Dickie, the has-been boxer fighting to regain his former glory; Micky, the stronger brother fighting to break out of the other’s shadow; Charlene, the girlfriend fighting for respect; and Alice, the mother fighting for her family’s success. All of this takes place in a setting that has seen more rags than riches over the years, a place where boxing can be a pathway to money and status, but more often leads to broken hearts and broken bones.

The Fighter is a film about choice — about choosing to work hard, or not; choosing to be self-interested, or not; choosing the right friends, or not. Dickie’s choices land him in prison; Micky’s choices (when unencumbered by Dickie’s and Alice’s management) land him on a path to the welterweight championship. The scenes juxtaposing Micky’s training in the gym with Dickie’s sparring in the prison yard (and Alice’s chasing her husband with a frying pan) say a lot about choice and consequence in this film about fighting — it’s not just about beating someone up, but about fighting to survive.

This is also a film about love, and how to express it when the person you love is toxic; here, true love is expressed by knowing when someone is hurting, and reaching out to carry the load. This is a film about breaking away, but also about hanging on. How Dickie and Micky manage to do both makes The Fighter well worth watching, even though it might be called “just another boxing film.”


Editor's Note: Review of "The Fighter," directed by David O. Russell. Paramount Pictures, 2010, 115 minutes.





The Capital Gang


For me “I’m a libertarian and I support the Washington Redskins” is right up there with “I’m from the government and I am here to help.” It makes my shoulders twitch and I feel creepy-crawlies run up and down my spine.

It all started in the run-up to Super Bowl XVIII, played at Tampa Stadium on January 22, 1984. The highly backed, patrician, ruling-class Redskins faced the underdog, blue-collar, working-class Raiders. Their respective QBs had some history, as they had both competed for the highly prestigious Heisman Trophy back in 1970. Redskin QB Joe Theismann (then with Notre Dame) changed the pronunciation of his name from Thees-man to Thighs-man to make it rhyme with the name of the vaunted trophy, in order to garner more votes. When Raider QB Jim Plunkett (then at Stanford) convincingly blew away Joe and also famous father Archie Manning (2,229 to 1,410 to 849), the Thighs-man camp infamously said that Jim had only won it because both his parents were blind. Please. What a classless act.

Happily the Raiders smashed the Redskins, leading 21–3 after just one quarter and scoring on special teams, defense, and offense. The final score was 38–9, and the record books had to be rewritten. Poetic justice?

One additional happy result of that total whipping was that the distinguished MVP scholar Charles Murray renamed his book of the mid-’80s, the book that shot him to stardom. As he recounts on pages xiii and xiv of the tenth anniversary edition, the working title had been F****** Over The Poor — The Missionary Position. Then it became Sliding Backward, but while he was watching the sad sack ‘Skins go nowhere late that Sunday, the title Losing Ground was born. Some TV commentator probably said something such as “the ‘Skins lose yet more ground to the Raiders,” and a light went on in Murray’s head.


Eighteen months later I moved from California to northern Virginia and wall-to-wall, front-to-back, ceiling-to-floor ‘Skins fandom. There was no soccer (DC United) and no baseball (Nationals), and the basketball (Bullets) and ice hockey (Capitals) barely registered on the local sports radar screen. All that these rent-seeking, tax-guzzling federal employees and their hangers-on cared about was the Redskins. Forget the country. They were totally nuts, completely besotted. There was a 30-year wait for season tickets, and probably still is. People had to die before you could advance up the list. And it was all so PC that when the gun death rate in DC hit record levels, the Bullets had to be renamed and chose to become the Wizards.

In defense of all the other pro sports teams named Washington or DC, at least they all play there. The so-called Washington Redskins play in Landover, MD, and train in Ashburn, VA. One wonders how many of the players and staff live in DC and how many in the suburbs or even further out.

I am curious as to why all five major sports leagues have to have a DC area franchise. Surely this cannot be connected to the special status that sports leagues enjoy under federal regulations.

There are large echoes here of the equally despised British soccer team Manchester United (fondly known in Manchester itself as “the scum”), which regularly sits atop the English Premier League. It plays in a town called Stretford, and its players live in the next-door, very tony county of Cheshire rather than in more downmarket Lancashire.

Hence the joke, How many soccer clubs are there in Manchester? Two: Manchester City and Manchester City Reserves. And hence the sign at the Manchester city line when Carlos Tevez signed to leave United for City: Welcome to Manchester.


Common sense surely dictates that just as Manchester United should be renamed Stretford United so the Washington Redskins should become the Landover Redskins or perhaps the Landover Looters, to reflect the dominant local industry. It is simply dishonest to trade the way they do. They are living a lie.

But why is the team called the Redskins in the first place? What has the swamp of Washington got to do with Native Americans other than as a source of subsidy and special treatment? The answer is that the franchise started in Boston, Mass., as in the place where white patriots dressed up as Native Americans and chucked all that tea overboard. So the name refers to a criminal act of destruction of private property, deception, and sleight of hand; the name commemorates an attempt to point the finger of a crime falsely at a minority, an attempt to unleash the might of the British Army on peaceful natives. It really is disgraceful.

Speaking of minorities, these ‘Skins so beloved by Federal bureaucrats were the very last team in the NFL to integrate, and they did so with great reluctance and in a pretty surly, bad-tempered way. The suspicion is that they did so only because the Department of the Interior owned their then stadium (typical) and the Kennedy administration was not impressed at seeing a non-integrated team in the nation’s capital — not really Camelot!

There are sports bars in the DC region with affiliations other than the ‘Skins, but they are nearly as rare as hens’ teeth. I used to frequent a Steelers bar with my friend Father Jack out toward Dulles on fall Sunday afternoons, until the BATF hit it. “Hands on the table — don’t reach for anything, not even your cutlery — don’t make our day.” I am sure the BATF agents were all ‘Skins fans.

The result is a cloying, all-pervading, overarching pro-Redskins atmosphere that is not healthy. I recall taking elder son Miles to pre-K one Monday morning in, say, 1986; he was proudly wearing his brand-new Dallas Cowboys shirt, a gift from Uncle Leonard. A female teacher stopped us in the corridor:

Teacher, somewhat condescendingly and pointing at said shirt: “Mr. Blundell, don’t you know this is Redskins country?”

Blundell in his best posh British accent: “Oh I am terribly sorry. I thought the Cowboys were America’s Team!”

If this were a comic strip, the next panel would show a woman with a screwed up face looking at the heavens, elbows stuck firmly into her ribs and clenched fists raised by her jaw, with a thought bubble reading “Argh! *&#%@?+^#*&.”

So as the population of the Swamp changes every election cycle, waves of well-meaning (I am being charitable) men and women, true sports fans who support good honest teams that play in privately owned stadiums, sweep in and are corrupted into supporting the Redskins. You can’t chat at the water cooler or over coffee or at lunch unless you are in the Skindom. It is so sad, but then Washington believes in monopolies such as currency issuance, taxation, and regulation.

When good internationally proven liberty-minded folk such as me confront these so-called libertarian Redskins we receive really mealy-mouthed responses, typical of which is “Oh, when I think of Washington I think of the person not the place!” Right! These people are confused and confusing, embarrassed and embarrassing, and not to be trusted until they go through therapy.

There is only one good reason for the continued existence of the Landover Looters, and it is simply this: every single time they lose, which is well over 50% recently, absenteeism within the Federal government soars the following Monday. This can only be a good thing.

But there is a solution for the Landover Looters problem. The team should move to Syracuse in upstate New York and become the Syracuse “Washington’s Redskins” with the nickname of the “Waistcoats.” Let me explain. George Washington signed a treaty with the Oneida Nation in that area to fight the Brits. So to the extent that Washington Redskins exist free of deceit, capture, and vainglory they are in the Syracuse-Finger Lakes region.

Why Waistcoats? Because Washington wore them and it’s probably a better nickname than “the big girls’ blouses,” which is what I call “libertarians” who support the Landover Looters.






Your 401k Is a Sitting Duck


In Liberty some time back (“Pension Peril,” March 2009), I reflected on President Kirchner of Argentina, who helped fund her country’s failing public pension system by simply stealing money from the private pension savings accounts that many of her countrymen had managed to accumulate. Her government expropriated (“nationalized”) the $24 billion private pension fund industry in order to save the public system, forcing citizens to trade their savings for Argentinean Treasury bills of dubious creditworthiness. I suggested then that such a thing might happen in the US, where Americans have many billions put aside in various retirement vehicles — a tempting target for any cash-starved government.

I think that dark day is growing closer. My feeling is confirmed by some troubling news, recently reported by the Adam Smith Institute’s wonderful website. The author of the report, economist Jan Iwanik, notes that a number of European countries are shoring up their tottering public pension plans by the Peronista tactic of stealing from those who have prudently put aside some extra money for their retirement.

Bulgaria, for example, has put forward a plan to confiscate $300 million from the private savings accounts of its already impoverished citizens and put those funds into the public social security system. Fortunately, organized protest has cut the amount transferred to “only” $60 million — for now, at least. And Poland has crafted a scheme to divert one-third of all future contributions that are made to private retirement savings accounts, so that the money flows instead to the public social security scheme. This will amount to $2.3 billion a year stolen from frugal people to shore up the improvident public system.

The most egregious case is that of Hungary. This state, which has been teetering on the verge of insolvency for years, has taken a drastic punitive step. Under a new law, all citizens who have saved for their retirement face a Hobson’s choice: either they turn over their entire retirement accounts to the government for the funding of the public system, or they lose the right ever to collect a state pension, even though they have paid and must continue paying contributions to the state system. The Hungarian government thus hopes to pocket all of the $14.2 billion that the hapless Hungarians have managed to squirrel away.

As our own national insolvency draws nigh, it is just a matter of time before the feds take a swing at the enormous pot of private retirement savings held by Americans. If you think you’ve heard nothing but class-warfare rhetoric from this administration, just wait till it feels the need to take your 401ks, IRAs, and so on. The demonization of the productive and the prudent will be loud and shrill.






Turn Out the Lights, the Party's Over


With a budget of $65 million, Spider-Man: Turn Off the Dark is touted as the most lavish musical ever mounted on Broadway. Much of the money has been invested in mechanical lifts and flying machines, high-tech costumes, and, unfortunately, medical bills. Already one performer has broken both wrists, another has broken both feet, another has fractured his ribs and injured his back, and the leading actress has suffered a concussion that took her out of the show for a while. And Spider-Man hasn't even officially opened yet. (It's still in previews, and the official opening date, when the show will be set in stone and critics are invited to write their reviews, keeps being pushed back.)

You know you're in trouble when the stage manager has to make an announcement before the first act assuring the audience that OSHA representatives are on hand backstage to make sure the stunts are in full compliance with safety requirements, and that the state Department of Labor has okayed the production, despite the numerous injuries. (The continued injury rate gives you a lot of confidence in OSHA and the Department of Labor, doesn't it?) Going to a performance of this new musical feels eerily like going to a hockey game or a stock car race — you hate to admit it, but you're almost hoping to see blood. Look at all the laughs Conan O'Brien has milked from the show's growing injury list.

Let’s be frank: accidents aside, the show was doomed from the beginning. All the stunts and technical tricks in the world can't make up for a bad script, and this one is a snoozer. It gained the potential for an interesting plot by introducing an unexpected new character: Arachne, who in Greek mythology was transformed into a spider for boasting that she was a better weaver than Athena, patron goddess of weaving. Two characters from different eras, cursed with spidery traits and struggling to become human again, could have produced a dynamic new story.


But instead of focusing on this new character development and trusting the audience to know the story of how Peter Parker became Spider-Man (which any possible audience is certain to know already), the show's producers decided to leave Arachne dangling (literally) for most of the show and concentrate on retelling the core story.

The production is framed by four punk teens who seem to be writing a script or filming a video (it isn't clear what they are doing) in front of the stage. They tell each other the story, and then their story comes to life as the actors perform it, almost action-for-action and word-for-word the way we have already seen it in comic books, on film, and in amusement parks. First we hear it, then we see it — yet we already know it. Talk about overkill! I was ready to pull out the industrial-strength Raid before the first act was finished.

Even then . . . The show could have survived a weak storyline if director Julie Taymor had delivered what she is known for: a montage of splashy, whimsical, creative production numbers that wow the audience with unexpected visual delights. This is what she did in her film Across the Universe and Broadway's phenomenal The Lion King. In both those shows, the story is just a vehicle for delivering breathtaking musical productions — and it works. Who can forget the spectacular parade of lifelike animals or the dancing grasses and rivers in The Lion King? The sets, the costumes, the choreography, and the thrilling music are simply magnificent, despite the silliness of some of the main characters.

Unfortunately, Taymor's vision for Spider-Man falls as short as the safety harness that was supposed to catch Spidey's stand-in during his unintentionally death-defying drop into the orchestra pit. Yes, Arachne's spider costume is pretty cool as she hangs and twists in the air while her legs and abdomen grow. But we saw something quite similar at the end of Act One in Wicked. The dance of the golden spiders as they swing from 40-foot golden curtains is lovely as well, but we've seen that in every Cirque du Soleil show of the past 20 years. The fights between Spidey and Green Goblin as they fly above the audience and land in the balconies are probably the most unexpected and technically difficult, but only about half the audience can actually see them, since the fights take place high at the back of the theater.

In short, even if the production crew of Spider-Man: Turn Off the Dark can get its acts together and fix the technical problems, the show will still have artistic problems that may be insurmountable. It isn't as showy as Cirque du Soleil, or as campy as Spamalot, or as interesting as Wicked. It simply isn't very good, and it certainly isn't worth risking people's lives for. My advice: turn out the lights; the party's over.


Editor's Note: "Spider-Man: Turn Off the Dark" is currently in previews at the Foxwoods Theatre on 42nd Street.





Artists in the Movies: The Ten Best Films


I’m not sure why I consider artists so fascinating. Perhaps it is the especially acute way they see the world — vision being for me only a weak sensory modality. Perhaps it is the fact that they use more of the right side of the brain than I typically use in my own work. But whatever the reason, apparently I am not alone in my fascination, since movies about artists are fairly numerous in the history of cinema. In this essay I want to review ten of the best such movies ever made.

I will confine myself to artists in the narrow sense of painters, as opposed to creative writers, photographers, or musicians. I will even put aside sculptors, even though that rules out reviewing such interesting films as Camille Claudel (1988), the good if depressing bioflick about the sculptress who worked with and was the mistress of Auguste Rodin.

I will also confine myself to standard movies, as opposed to documentaries. There are many fine documentaries about individual artists and artistic movements. One particularly worth noting is My Kid Could Paint That (2007), a film that honestly explores the brief career of four-year-old Marla Olmstead, whose abstract paintings caused a sensation when they caught the attention of the media and the public and began selling for many thousands of dollars each. After an exposé on CBS News, the public began to wonder whether she had really produced her own work. That is the fascinating question the film investigates, but in the background is another, equally fascinating question — whether abstract art has any intrinsic quality, or whether it is all a matter of the perception of the critics.

But to return. One other restriction I will adopt is to consider feature films only, as opposed to TV series. This causes me some grief, since one of my favorite portrayals of painters on screen will have to be skipped — the delightful three-part BBC miniseries The Impressionists (2006). This series is TV at its finest: a historically accurate portrayal of the French impressionist school of painters (Manet, Monet, Renoir, Bazille, Degas, and Cézanne) and compelling, entertaining storytelling. It is structured as a series of memory flashbacks that occur to Claude Monet as he is interviewed late in his life by a journalist about the artistic movement he and his circle created.


But what does a good movie about an artist include? Such a film can take many forms. It can be a straight bioflick recounting a person’s life and achievements — as in Lust for Life, The Agony and the Ecstasy, and Seraphine. It can explore a controversy, such as the merit of abstract art (Local Color). It can explore some of the ways artists interact with other artists — competition or romantic involvement (Modigliani, Frida, and Lust for Life again). It can examine the interaction between artists and mentors (Local Color), or patrons or art critics (The Agony and the Ecstasy, Girl with a Pearl Earring), or other intellectuals (Little Ashes). It can dramatize relationships between artists and family members (Lust for Life, Moulin Rouge). It can try to meaningfully convey the inspiration for or the process of artistic creation (The Agony and the Ecstasy, Rembrandt, Girl with a Pearl Earring). Finally, it can analyze the personality of an artist (The Moon and Sixpence, Moulin Rouge, Seraphine).

My criteria for ranking these movies are not much different from those I use to judge any other movies: quality of ideas, story, acting, dialogue, and cinematography. In theory, it shouldn’t be any more difficult to produce a decent movie about a painter than about any other subject, but in practice, there are pitfalls that can ensnare you.

In particular, it seems that many directors, in trying to make a movie about art, try to make the movie artsy. One thinks of the disastrously bad film Klimt (2007), an internationally produced bioflick about the Viennese artist Gustav Klimt (1862–1918), played by John Malkovich. The flick is tedious and hard to follow, with numerous hallucinatory scenes interspersed in the action. Malkovich gives a listless performance, portraying the artist as bereft of any charm. The result is risible.

I expect art, not artsiness. And I will mention one other thing I look for in movies about painters: if it accords with the story line, I like to see the artist’s work displayed. If a person is supposed to be great at doing something, one naturally wants to see the evidence.

To build suspense, I’ll present the movies that made my top ten in reverse order of my judgments of their importance and quality.

Number ten on the list is The Agony and the Ecstasy (1965). This lavishly produced film is based on Irving Stone’s best seller of the same title, though it focuses on just part of the story — Michelangelo (1475–1564, portrayed by Charlton Heston) painting the ceiling of the Sistine Chapel at the prodding of his patron, Pope Julius II (Rex Harrison). The eminent Carol Reed directed the movie, and it was nominated for five Academy Awards, including those for cinematography, art direction, and score. In each of those areas the film is indeed excellent. Especially effective is the scene in which Michelangelo gets the key inspiration for his ceiling mural from observing the beauty of the clouds. The interesting idea explored in the movie is the way in which the influence of a patron can help even a highly individual artist elevate the artistic level of his work. The pope insisted that Michelangelo do the job, even though he initially demurred, viewing himself primarily as a sculptor.


The acting in this film isn’t as good as one would expect of the two leads. Heston and Harrison, both recipients of the Oscar for best actor in other movies, seem somehow miscast in their roles. But the movie transcends this weakness; the glory of Michelangelo’s art is on full display in a beautiful color production.

Number nine is Frida (2002), directed by Julie Taymor and starring Salma Hayek (who also coproduced the movie). This is an unvarnished look at the life of Frida Kahlo (1907–1954), focusing on the accident that made her a semi-invalid and caused her lifelong pain, and on her tempestuous marriage to the painter Diego Rivera. Rivera was already famous when they met, and her career grew alongside his. His numerous adulterous affairs are not hidden, nor are her affairs with other women (as well as Leon Trotsky). Both Rivera and Kahlo were devout socialists, as the movie emphasizes.

Salma Hayek’s performance is extraordinary — it is obvious she was completely devoted to the project. She convincingly conveys the physical suffering Kahlo endured, along with the mental anguish caused by Rivera’s endless philandering. She was nominated for an Oscar for her performance. Alfred Molina is excellent as Diego Rivera, and Edward Norton gives a nice performance as Nelson Rockefeller (who, ironically, commissioned Rivera to do a mural for him), as does Antonio Banderas (playing the painter David Alfaro Siqueiros). The cinematography is also excellent, and we get to see quite a few of the artist’s paintings. Taymor does a good job of integrating the history of the times with the story line.

Number eight is Local Color (2006). Written and directed by George Gallo, it is a fictionalized account of his friendship with the landscape painter George Cherepov (1909–1987), an artist he met while still hoping to pursue painting himself, before turning in his twenties to screenwriting and directing. Gallo’s earliest success was writing the screenplay for Midnight Run.

In the movie, the Gallo figure John Talia Jr. (Trevor Morgan) is thinking about what to do after high school. His father (perfectly played by Ray Liotta) hopes he will get a regular job, but young John wants to be a painter. He manages to gain the friendship of a crusty, profane, but gifted older Russian painter, here called Nicoli Seroff (played brilliantly by Armin Mueller-Stahl). Seroff invites John to spend the summer at his house, much to the worry of John’s father, who is concerned that Seroff is gay and will “take advantage” of his son. After some tension between the two, Seroff finally breaks down and shows John how to paint.

Besides being a nice meditation on the role a mentor can play in an artist’s life, the movie has as a subtext an exploration of two related and important questions about contemporary art: is there great artistic merit in abstract art, and should art divide the elites from the ordinary public? This subtext plays out in the exchanges between the prickly Seroff and a pompous local art critic Curtis Sunday (played exquisitely by Ron Perlman, of Hellboy and Beauty and the Beast). Their dispute culminates in a hilarious scene in which Seroff shows Sunday a painting produced by an emotionally disturbed child with whom Seroff has worked. Seroff shows Sunday the painting without revealing who made it, and asks for Sunday’s opinion about the artist. Sunday then begins to talk earnestly about the virtues of the artist, thinking he must be a contemporary painter. When Seroff tells Sunday the truth, Sunday storms off to the howls of Seroff’s laughter. The movie has excellent cinematography — which gathers interest from the fact that all the oil paintings shown in the film were painted by Gallo himself.


Number seven on my list will be a surprise. It is The Moon and Sixpence (1942). The movie is a superb adaptation of W. Somerset Maugham’s brilliant short novel of the same name. The story is about a fictional painter, Charles Strickland, and is loosely based on the life of Paul Gauguin (1848–1903). Strickland (played by the underrated George Sanders, who excelled at playing cads) is a stockbroker who suddenly and unexpectedly leaves his wife and family in midlife to pursue his vision of beauty — his painting. He is followed by a family friend, Geoffrey Wolfe (a character I suspect Maugham based on himself, beautifully portrayed by Herbert Marshall), who narrates as he tries to make sense of Strickland’s ethical worldview.

What we see is a man who is an egotist to the core, but we realize that this is an egotism driven by a desire to create. A key scene in this regard is the one in which Strickland explains to Wolfe that he doesn’t choose to paint, but has to paint. Maugham doesn’t make it easy on us by giving Strickland some further motive to flee civilization, such as a bad marriage or an unhappy family. In fact, the title seems to indicate that in the end Maugham himself fails to appreciate Strickland’s choice: it comes from a Cockney phrase about somebody so struck by the moon that he steps over sixpence. By focusing on something abstract, such as artistic beauty, one misses out on something that may be more worthwhile, such as rich human relationships.

But what makes this film powerful is its exploration of the idea that a person can be an egoist—even immoral by conventional standards — but still be a creative genius. Indeed, I recommend this film in my ethical theory classes as an example of Nietzsche’s brand of egoism.

Number six is Girl with a Pearl Earring (2003), a tale from the historical novel by Tracy Chevalier about the life of Johannes Vermeer (1632–1675). It imagines the story of a young woman, Griet, who comes to the Vermeer household as a maid. Griet’s father was a painter, but went blind, forcing her to support herself by working as a domestic servant. The Vermeer household is dominated by his all-too-fecund and extremely jealous (not to say shrewish) wife Catharina, along with her mother Maria Thins.

Griet is fascinated by Vermeer’s work, the colors and composition. Noticing her interest, Vermeer befriends her, letting her mix his paints in the morning. The viewer suspects that this friendship involves a romantic interest, at least on his part. He is careful to keep the friendship from Catharina’s notice. While shopping with the chief maid, Griet meets the butcher’s son Pieter, who is very attracted to her, and we suspect that the feeling is mutual.

As if this incipient romantic triangle weren’t enough excitement for poor Griet, Vermeer’s concupiscent patron Van Ruijven sees her and pushes Vermeer to let her work in his house. Faced with Vermeer’s refusal, Van Ruijven commissions him to paint her, which Vermeer agrees to do. Van Ruijven, obviously, isn’t motivated by art so much as by lust; he even attempts to rape Griet. All this culminates, however, in her becoming the model for Vermeer’s most famous masterpiece, “Girl with a Pearl Earring.” The earring, one of a pair borrowed from Catharina (to her extreme jealousy), goes with Griet when she leaves Vermeer’s household, an interesting memento of her adventure.

What we see is a man who is an egotist to the core, but we realize that this is an egotism driven by a desire to create.

The art direction is superb. It is executed in colors reminiscent of the painter’s method (dark background with vivid tones in the key objects). Appropriately, the film received Oscar nominations for both best art direction and best cinematography. The acting is almost entirely excellent, with Essie Davis playing a very irascible Catharina, Tom Wilkinson a randy Van Ruijven, Judy Parfitt a practical Maria, and Cillian Murphy a supportive Pieter. Especially outstanding is Scarlett Johansson as a very self-contained Griet. She bears an uncanny resemblance to the girl in the actual painting. The sole disappointment is Colin Firth as Vermeer. He plays the role in a very inexpressive way — more constipated than contemplative, to put it bluntly.

Number five is a film about the life and work of a controversial modern artist, Pollock (2000).

Jackson Pollock (1912–1956) was a major figure in the abstract art scene in post-WWII America. He grew up in Arizona and California, was expelled from a couple of high schools in the 1920s, and studied in the early 1930s at the Art Students League of New York. From 1935 to 1943 he did work for the WPA Federal Art Project. During this period, as throughout his life, he was also battling alcoholism.

He received favorable notice in the early 1940s, and in 1945 he married another abstract artist, Lee Krasner. Using money lent to them by Peggy Guggenheim, they bought what is now called the Pollock-Krasner House in Springs (Long Island), New York. Pollock turned a nearby barn into his studio and started a period of painting that lasted 11 years. It was here that he developed his technique of letting paint drip onto the canvas. As he put it, “I continue to get further away from the usual painters’ tools such as easel, palette, brushes, etc. I prefer sticks, trowels, knives, and dripping fluid paint or heavy impasto with sand, broken glass or other foreign matter added.” He would typically lay the canvas on the floor and walk around it, dripping or flicking paint.

In 1950, Pollock allowed the photographer Hans Namuth to photograph him at work. In the same year he was the subject of a four-page article in Life, making him a celebrity. During the peak of his popularity (1950–1955), buyers pressed him for more paintings, making demands that may have intensified his alcoholism.

He stopped painting in 1956, and his marriage broke up (he was running around with a younger girlfriend, Ruth Kligman). On August 11, 1956, driving drunk, he crashed his car, killing himself and a friend of Ruth’s and severely injuring Ruth. After his death, Krasner managed his estate and worked to promote his art. The two are buried side by side in Springs.

Critics have been divided over Pollock’s work. Clement Greenberg praised it as the ultimate phase in the evolution of art, moving from painting full of historical content to pure form. But Craig Brown said that he was astonished that “decorative wallpaper” could gain a place in art history. However one might view Pollock’s work, it has commanded high prices. In 2006, one of his paintings sold for $140 million.

The movie tracks the history fairly closely, starting in the early 1940s, when Pollock attracted the attention of Krasner and Guggenheim, and moving through his marriage to Krasner, his pinnacle as the center of the abstract art world, and the unraveling of his personal life. Throughout, we see him angry, sullen, and inarticulate, whether drunk or sober.

Ed Harris, who directed the film and played the lead, is fascinating (if depressing) to watch. He plays a generally narcissistic Pollock, with the problem of alcoholism featured prominently. He was nominated for a best actor Oscar for this role. Especially good is Marcia Gay Harden as Lee Krasner, who won a best supporting actress Oscar for her performance. The main defect of the movie is that it gives us no idea why Pollock was angry and alcoholic. Was it lack of respect for his own work? Did he feel it wasn’t really worthy of the praise it received? We get no clue.

Number four is a piece of classic British cinema, Rembrandt (1936), meaning, of course, Rembrandt van Rijn (1606–1669), who is generally held to be the greatest painter of the Dutch Golden Age (an era that included Vermeer, his younger contemporary). Rembrandt achieved great success fairly early in life with his portrait painting, then expanded to self-portraits, paintings of important contemporaries, and paintings of Biblical stories. In the latter, his work was informed by a profound knowledge of the Bible.

But his mature years were characterized by personal tragedy: a first marriage in which three of his four children died young, followed by the death of his wife Saskia. A second relationship, with his housekeeper Geertje, ended bitterly; and a third, a common-law marriage to his greatest love, Hendrickje Stoffels, ended with her death. Finally, Titus, his only child to have reached adulthood, died. Despite his early success, Rembrandt’s later years were characterized by economic hardship, including a bankruptcy in which he was forced to sell his house and most of his paintings. The cause appears to have been his imprudence in investing in collectables, including other artists’ work.

Rembrandt’s painting was more lively and his subjects more varied than was common at the time, when it was customary to paint extremely flattering portraits of successful people. One of his pieces proved especially provocative: “The Militia Company of Captain Frans Banning Cocq,” often called “The Night Watch,” an unconventional rendition of a civic militia, showing its members preparing for action rather than standing elegantly in a formal lineup. Later stories had it that the men who commissioned the piece felt themselves to have been pictured disrespectfully, though these stories appear to be apocryphal.

The main defect of Pollock is that it gives us no idea why he was angry and alcoholic. Was it lack of respect for his own work? Did he feel it wasn’t really worthy of the praise it received? We get no clue.

The movie is fairly faithful to historical reality, except that it doesn’t explore Rembrandt’s financial imprudence, attributing his later poverty instead to his painting of “The Night Watch” as an exercise in truth-telling. The movie shows him painting these middle-class poseurs for what they were, their outrage then leading to a cessation of high-priced commissions from other burghers. The direction is excellent, as one would expect from Alexander Korda, one of the finest directors Britain ever had. The supporting acting is first rate, especially Elsa Lanchester as Rembrandt’s last wife Hendrickje, Gertrude Lawrence as the scheming housekeeper and lover Geertje, and John Bryning as the son Titus. Charles Laughton’s performance as Rembrandt is masterful. There are great actors, and then there are legends, and he was both. Unlike his loud and dominating performances in such classics as The Hunchback of Notre Dame and Mutiny on the Bounty, his role in this film is that of a wise and decent man, devoted to his art and his family, and he makes it every bit as interesting. The main flaw in the film is that we don’t see much of the artist painting, or much of his paintings, but that is a comparatively minor flaw in an otherwise great film.

Number three is the great Moulin Rouge (1952), based on the life of Henri Marie de Toulouse-Lautrec-Monfa (1864–1901). Toulouse-Lautrec was born into an aristocratic family. At an early age he lived in Paris with his mother, and showed promise as an artist. But also early in his life he showed an infirmity. At ages 13 and 14 he broke first one leg, then the other, and both failed to heal properly. As an adult, he had the torso of a man and the legs of a boy, and he was barely 5 feet tall. A lonely and deformed adolescent, he threw himself into art.

He spent most of his adult life in the Montmartre area of Paris, during a time when it was a center of bohemian and artistic life. He moved there in the early 1880s to study art, meeting van Gogh and Emile Bernard at about this time. After his studies ended in 1887, he started exhibiting in Paris and elsewhere (with one exhibition featuring his works along with van Gogh’s).

He focused on painting Paris on the wild side, including portraits of prostitutes and cabaret dancers. In the late 1880s, the famous cabaret Moulin Rouge opened (it is still in Montmartre to this day), and commissioned Toulouse-Lautrec to do its posters. These brought him to public attention and notoriety, and won him a reserved table at the cabaret, which also displayed his paintings prominently. Many of his best known paintings are of the entertainers he met there, such as Jane Avril and La Goulue (who created the can-can).

By the 1890s, however, his alcoholism was taking its toll on him, as was, apparently, syphilis (not unknown among artists of the time). He died before his 37th birthday, at his parents’ estate. In a brief 20 years, he had created an enormous amount of art — over seven hundred canvases and five thousand drawings.

The movie, directed (and co-written) by John Huston, was a lavish production fairly true to history. It should not be confused with the grotesque 2001 musical of the same name. The cinematography and art direction are superb, showing us scenes of the Moulin Rouge in particular and Paris in general, as captured by the artist. The film won the Oscar for best art direction and best costume design.

The directing and acting are tremendous. (Huston was nominated for best director, and his film for best picture.) Zsa Zsa Gabor is great as Jane Avril, as are Katherine Kath as La Goulue and Claude Nollier as Toulouse-Lautrec’s mother. And Colette Marchand is perfect as Marie Charlet, the prostitute with whom Toulouse-Lautrec becomes involved. She was nominated for an Academy Award as best supporting actress, and won the Golden Globe, for her performance. But most amazing is the work of the lead, José Ferrer, who plays both Toulouse-Lautrec père and fils. Playing Toulouse-Lautrec the artist required Ferrer (always a compelling actor) to stand on his knees. His was a bravura performance, making the artist both admirable and pitiable. Ferrer was nominated for an award as best actor, but unfortunately did not win.

If there is one flaw in the film, it is an unneeded sentimentality, well illustrated by the final scene. With Henri on his deathbed, his father cries to him that he finally appreciates his art, while figures from Henri’s memory bid him goodbye. Huston, one of the greatest directors in the history of film, especially adept at coldly realistic film noir (e.g., “The Maltese Falcon”), should have toned this down. My suspicion is that the studio wanted something emotionally “epic,” and Huston obliged.

Number two on my list is a small independent flick, Modigliani (2004). The story explores the art scene in Paris just after WWI, with artists such as Pablo Picasso, Amedeo Modigliani, Diego Rivera, Jean Cocteau, Juan Gris, Max Jacob, Chaim Soutine, Henri Matisse, Marie Vorobyev-Stebeslka, and Maurice Utrillo living in the Montparnasse district.

We meet Modigliani as he enters the café Rotonde, stepping from tabletop to tabletop as the patrons applaud.

Amedeo Modigliani (1884–1920) was born into a poor Jewish family in Italy. He grew up sickly, contracting tuberculosis when he was 16. He showed interest and talent in art at an early age, and went to art school, first in his hometown of Livorno, then later in Florence and Venice. He was fairly well read, especially in the writings of Nietzsche. He moved to Paris in 1906, settling in Montmartre. Here he met Picasso, and spent a lot of time with Utrillo and Soutine. He also rapidly became an alcoholic and drug addict, especially fond of absinthe and hashish (beloved by many artists then). He adapted rapidly to the Bohemian lifestyle, indulging in numerous affairs and spending many wild nights at local bars. Yet he managed to work a lot, sketching constantly. He was influenced by Toulouse-Lautrec and Cézanne but soon developed his own style (including his distinctive figures with very elongated heads). After a brief return home to Italy in 1909 for rest, he returned to Paris, this time moving to Montparnasse. He is said to have had a brief affair with the Russian poetess Anna Akhmatova in 1910, and worked in sculpture until the outbreak of WWI. He then focused on painting, among other things producing portraits of many of his fellow artists.

In 1916, he was introduced to a beautiful young art student, Jeanne Hebuterne. They fell in love, and she moved in with him, much to the anger of her parents, conservative Catholics who were not fond of the fact that their daughter was involved with a poor, drunken, struggling Jewish artist.

And struggle they did. While Modigliani sold a fair number of pieces, the prices he got were very low. He often traded paintings for meals just to survive. In January 1920, a neighbor found Modigliani delirious, clutching the pregnant Jeanne. He died from tubercular meningitis, no doubt exacerbated by alcoholism, overwork, and poor nourishment.

His funeral attracted a gathering of artists from Paris’ two centers of art (Montmartre and Montparnasse). It was all very Nietzschean: brilliant young man does art his way, defies all slave moral conventions, and dies in poverty. The genius is spurned by hoi polloi too addled by slave morality to appreciate the works of the übermensch. Jeanne died two days later by throwing herself out a window at her parents’ house, killing herself and her unborn child. It was only in 1930 that the family allowed her to be reburied by his side.

The movie takes place in the pivotal year 1919. We meet Modigliani as he enters the café Rotonde, stepping from tabletop to tabletop as the patrons applaud. He winds up at Picasso’s table, where he kisses Picasso. This bravura entry invites us to focus where we should — on the relationship between these two artists, both important in a new era of art. The relationship is complex. On the one hand, they are obviously friends — and friends of a sort that Aristotle would have approved: their friendship is based on appreciation of each other’s intellectual virtue, their art. But there is a darker side to it: they are also rivals, competitors for the crown of king of the new artists.

Modigliani and Jeanne are struggling to pay rent, and Jeanne’s father has sent their little girl away to a convent to be raised. Modigliani sees a chance to get his child back, and provide for Jeanne. He will enter a work in the annual Paris art competition, one that, so far, he and his circle have scorned. Picasso, feeling challenged, enters also, with other members of the circle joining him. Modigliani knows that the competition will be tough, so by force of will he works to produce a masterpiece. This challenges Picasso as well, and we see the artists vying to see who will win.

But the denouement is tragic. Modigliani finishes, and asks Picasso to take his piece to the show. He goes to City Hall to marry Jeanne. Leaving late, he stops at a bar for a drink. And that foolish act leads to an ending whose bittersweet drama I don’t want to spoil.

The acting is excellent throughout. Hippolyte Girardot is notable as Maurice Utrillo, and Elsa Zylberstein is superb as Jeanne. But the best support is given by Omid Djalili as a smoldering, intense Picasso, who succeeds where Modigliani fails, but understands that his success did not result from greater genius. The amazing Andy Garcia gives a fabulous performance as Modigliani.

It was all very Nietzschean: brilliant young man does art his way, defies all slave moral conventions, and dies in poverty. The genius is spurned by hoi polloi too addled by slave morality to appreciate the works of the übermensch.

The critics panned this movie mercilessly, and it has some flaws — most importantly, you cannot appreciate the film without a knowledge of Modigliani’s biography, because it focuses only on his last year. (It also takes some liberties with the facts.) But the film powerfully conveys a unique friendship and rivalry, and explores an artist’s self-destructiveness. Very instructive, though not the makings of a box office bonanza.

And now — drum roll please! — coming in as number one on my list is a classic that holds up well, more than a generation after its making: Lust for Life (1956).

Like The Agony and the Ecstasy, this movie was based on a best-selling novel by Irving Stone. The director was the brilliant Vincente Minnelli. The film tells the tragic story of the life of Vincent van Gogh (1853–1890), with lavish attention to the man’s magnificent art. It follows the life of van Gogh from his youth, during which he struggled to find a place as a missionary, to his mature years, during which he struggled to find a place as an artist. (The van Gogh family lineage was full of both artists and ministers.) Van Gogh is usually categorized with Gauguin, Cézanne, and Toulouse-Lautrec as the great post-impressionists.

Van Gogh is played by Kirk Douglas, who was primarily known as an action lead, adept at playing the tough-guy outlaw or soldier (helped by his buff physique and chiseled, handsome face). This was a casting gamble, but it paid off, with Douglas giving one of the best performances of his career, if not the best. He was nominated for a Best Actor Oscar and won the Golden Globe for playing the mentally tormented van Gogh with credibility.

The supporting acting just doesn’t get any better. Most notable is Anthony Quinn as a young, egoistic, and arrogant Paul Gauguin, who for a brief time was van Gogh’s roommate, but couldn’t handle van Gogh’s emotional intensity and instability. Quinn rightly received the Oscar for best supporting actor. Also excellent is James Donald as van Gogh’s loyal brother Theo.

The story development and dialogue are first rate (the screenwriter Norman Corwin was nominated for an Oscar), as is the art direction (also nominated for an Oscar).

The movie showcases many of van Gogh’s paintings. It also explores the crucial role his brother played in keeping him painting, supporting him financially as well as emotionally. If it were not for Theo van Gogh, the world would likely have never known Vincent. The contrast with Moulin Rouge is stark: Toulouse-Lautrec never got the support of his father until it was too late.

Five films that did not make my list deserve honorable mention. The first must be a picture I reviewed for Liberty (October 2009), Seraphine (2009). It is a wonderfully filmed, historically accurate bioflick of the French “Naïve” painter Seraphine Louis (1864–1942). She was discovered by an art critic and flourished for a brief period after World War I, but with the Depression her career ended, and she was eventually confined to an asylum. The relatively unknown actress Yolande Moreau is simply wonderful in the lead role.

The second honorable mention is Convicts 4 (1962), which tells the true story of artist John Resko. Resko was condemned to death after he robbed and unintentionally killed a pawnshop owner while attempting to steal a stuffed toy as a Christmas gift for his daughter. He was given a reprieve shortly before his scheduled execution, and with the help of some fellow inmates adapted to prison life. In prison he learned to paint, and came to the notice of art critic Carl Calmer, who fought for Resko’s release and eventually, with the help of the family of the man Resko had killed, won it. Ben Gazzara is outstanding as Resko, and Vincent Price (in real life an art critic and a major art collector) is convincing as Calmer.

If the producers had wished to fictionalize the story, they should have done so, and changed the names.

Third honorable mention goes to the television movie Georgia O’Keeffe (2009). Again, since I recently reviewed this movie for Liberty (August 2010), I will be brief. The film gives a nice account of one of the first American artists to win international acclaim, Georgia O’Keeffe (1887–1986). It focuses on her most important romantic and professional relationship, the one with Alfred Stieglitz, the famous photographer and art impresario. O’Keeffe (played superlatively by Joan Allen) was hurt by Stieglitz’s philandering, but they remained mutually supportive professionally, even after a painful parting. Jeremy Irons is superb as Stieglitz.

As I noted in my review, at the end of the movie one is left to wonder why Stieglitz was so callous in his treatment of O’Keeffe (flaunting his adultery, and in one scene bragging to O’Keeffe that his new paramour was bearing his child, after he had angrily dismissed the idea of having children with O’Keeffe herself). Was this merely the blindness of narcissism, or was there an undercurrent of profound envy at his wife’s success as an artist — a success greater than his?

Fourth honorable mention is Artemisia (1998), based on the life of Artemisia Gentileschi (1593–1656). She was one of the earliest women painters to win widespread acclaim, being the first woman artist accepted into Florence’s Accademia di Arte del Disegno. The film is gorgeously produced, with a first-rate performance by Valentina Cervi as Artemisia and Miki Manojlovic as Agostino Tassi. Its major flaw is its historical inaccuracy, portraying Tassi as Artemisia’s chosen lover, while in fact he was her rapist. If the producers had wished to fictionalize the story, they should have done so, and changed the names. Stretching or selectively omitting history in a bioflick can make sense, but a total inversion of a pivotal event is a major flaw.

Watching a large number of movies about artists over a short period of time can be a recipe for depression.

The fifth film receiving honorable mention is Basquiat (1996), a good movie about the sad life of Jean-Michel Basquiat (1960–1988), who was one of the earliest African-Americans to become an internationally known artist. He was born in Brooklyn, and despite his aptitude for art and evident intelligence (including fluency in several languages and widespread reading in poetry and history), he dropped out of high school. He survived on the street by selling t-shirts and postcards, and got his earliest notice as a graffiti artist using the moniker “SAMO.” In the late 1970s, he was a member of the band Gray. In the early 1980s his paintings began to attract notice, especially when he became part of Andy Warhol’s circle. In the mid-1980s, he became extremely successful, but also got more caught up in drugs, which led to his early demise from a heroin overdose. Jeffrey Wright is superb as Basquiat, as are David Bowie as Andy Warhol and Dennis Hopper as the international art dealer and gallerist Bruno Bischofberger. Also compelling is Gary Oldman as artist Albert Milo, a fictionalized version of the director Julian Schnabel.

Watching a large number of movies about artists over a short period of time can be a recipe for depression, given the amount of tragedy and pain on display. Often this pain was caused by a lack of public and critical recognition or support, leading great painters to experience genuine deprivation and what must have been the torment of self-doubt. Worse, the pain was sometimes self-inflicted or inflicted on others, because of the narcissism or lack of self-control that made such messy lives for so many artists.

But watching these films is intellectually as well as visually rewarding. You see the triumph of creative will over unfavorable conditions and outright opposition — and the beauty that unique individuals have contributed to the world.






Snow White and Mayor Dork


On Sunday December 26, 2010, the blizzard of 2010 hit the northeastern United States. I, for one, enjoyed watching the snow fall. If we can’t have a white Christmas, a white day-after-Christmas is the next best thing.

But in New York City things were not so merry. Upwards of two feet of snow fell in New York. Clearing the roads after a snowstorm seems a relatively simple challenge, one for which Mayor Michael Bloomberg should have had ample time to prepare. The mayor’s absolute failure reveals him as an absolute incompetent.

For years Bloomberg has opposed libertarian freedoms in New York City, from gun rights to the right to smoke cigarettes in bars. (This was a pet peeve of mine, back when I used to smoke and drink.) But he has at least tended to handle emergencies well; one always saw him on the evening news at the scene of the disaster, once the mess had been cleared up. But not this time.

I spoke with my father two days after the blizzard. He lives in eastern Queens, and he was still snowed in, with the roads outside his house unplowed, the piles of snow too high to get past, and bus and subway lines in his area not running. His fate was shared by most people in Queens and Brooklyn.

I am spending my winter vacation at my mother’s home in southwestern Connecticut, and here I get New York TV news channels, which showed that the city was in a state of devastation. It was reported that the day after the snowstorm it took eight hours for ambulances to respond to 911 calls because of the condition of the roads. The next day, the news said that the mayor blamed his inability to plow the roads on drivers who had irresponsibly abandoned their cars in the middle of the street. TV reporters are consistent in saying that New Yorkers are outraged. The City Council plans to respond to this emergency by… holding a hearing.

What New York City needs is men of action, not windbag politicians. If the city is too incompetent to clear the roads after a snowstorm, it is only because politicians and bureaucrats have no accountability and suffer no monetary loss from the failure of state-owned infrastructure. Needless to say, two feet of snow is not the worst crisis that the city may face in the future. The only way to prevent a future disaster is to stick our hand into our magical bag of libertarian wisdom and pull out an idea whose time has come: privatize the roads.

If the streets of New York City were under private ownership, the owners would make certain that snow removal happened efficiently; if they failed then they would go bankrupt and someone else would buy the roads and operate them to the satisfaction of consumers. One TV news story showed a Brooklyn family with a newborn baby. With an oil truck trapped in piles of snow just a few streets away, their heat had gone out for lack of oil, and ambulances had trouble reaching them. Their baby’s death should weigh on the conscience of every statist who fights against allowing free market competition to improve upon the nightmare of state-owned infrastructure.






Hat Trick







Your Recovery Dollars at Work


About three months ago, a curious sign appeared at one end of my street. It reads, “Putting America to Work. Project Funded by the American Recovery and Reinvestment Act.” It depicts a hard-hat-wearing stick figure digging into a pile of dirt — as if this jaunty cartoon of a “shovel-ready” project would soothe my anger at the wealth confiscation that funds such ridiculous endeavors.

Not much goes on in my small, East Coast rural enclave. The acquisition of “city” sewer service by the nearest two towns was a big deal around here. So the government sign was the talk of our street. There was no explanation of why the sign appeared, no explanation of what project was in the offing. This was strange.

Then, roughly two weeks after the sign was erected, road crews appeared on both ends of our street and started tearing up the asphalt. The re-paving project was completed two weeks later.

Some neighbors speculated that the project was inflicted on us to predispose us to vote Democratic in the upcoming local election. But elections here are the smallest of small potatoes. It wasn't logical that federal funds would be spent to influence local voting. One neighbor speculated that the road was being prepared for a utility development set to occur in the next few years; but another road is slated to be built specifically for that purpose, at the opposite end of the nearest big town. None of us could come up with a reasonable answer. I suppose I could have attended a township meeting to divine the reasons behind this project, but I don’t have the time to waste and it’s highly unlikely that the simple folk, and by that I mean simpletons, who make up the township committee would have a credible answer.

As I said, this is a rural area. Roads need only be passable; pickup trucks and tractors do just fine. Given that my street is only one section of a decently long through road, this paving project does not qualify as a “road to nowhere”; but it is very strange that the project was limited to one section of the road. Even stranger, there was nothing wrong with this part of the road in the first place. Nary a pothole! There is no meaningful difference between the street in its pre-recovery-dollars condition and the street in its post-recovery-dollars state. The road is now black. It used to be gray.

In short, the project was a colossal waste of money. The dollars devoted to it should not have been printed, let alone spent. The workers involved in it did not achieve sustainable employment; they simply received unemployment subsidies by another name. No one was “put to work” in the sense that the designers of the Recovery Act intended the populace to believe.

Increased employment results from increased demand for goods and services. Allowing taxpayers to keep the majority of their dollars is the best option for “Recovery and Reinvestment” in all areas. Greater disposable income spurs demand and mitigates the risk of investment in small ventures. A person can spend his or her own dollars on any number of goods or endeavors that would contribute to sustained economic activity. More dollars in the hands of the citizenry will “put more people to work” than dollars in the hands of government ever will.

The first step to an actual recovery is limiting government spending. How do we achieve this?

We can apply my friend’s sound advice on dealing with young children: give them only very limited options. For example, instead of asking, “Where would you like to go for your birthday dinner?” ask, “For your birthday dinner, would you like to go to Friendly’s or McDonald’s?” Young children are ill-equipped to handle unlimited discretion. Governments are too.

With the country in its present mood, severely limiting government’s spending discretion is an attractive and realizable goal. We already have the set of tools necessary to do this. It’s called the Constitution.






Well, at Least That’s Over


Happy New Year! It gives me pleasure to report that we survived 2010 with fewer devastating hits to the language than we’ve seen in recent years.

If you’re inclined to whine about 2010, please remember “the audacity of hope” and its sad but well-merited fate in the year just past. Of course, there is usually an easy passage from pomposity to farce, but the passage of “audacity of hope” was particularly easy, and particularly gratifying to observe. Every friend of the English language shuddered on election day 2008, expecting that Obama’s stilted, painfully self-conscious phrase would be enshrined forever in America’s pantheon of quotations, alongside “The only thing we have to fear is fear itself,” “Fourscore and seven years ago,” and “Th-th-th-th-th! That’s all, folks!” But now it’s merely a subject for sardonic humor.

So much of interest might have been said in 2010, but wasn’t.

I’m sorry, however, that I can’t welcome the new year as ecstatically as Addison DeWitt once greeted the debut of Eve Harrington. I am not available for shouting from the housetops or dancing in the streets. It isn’t simply that a lot of muddy snow remains to be shoveled off America’s pavements; it’s that so much of interest might have been said in 2010, but wasn’t.

In 2010 we experienced comparatively little linguistic terror or catastrophe, but we didn’t experience many linguistic delights, either. Washington — Mordor on the Potomac — was more vulnerable to solemn sneers and glorious jests than it had been for many years, and that’s saying something, but its opponents were seldom equal to the occasion. The most eloquent and resonant sound of opposition was “Don’t touch my junk.” That saying will last, and deserves to last. Its four modest monosyllables combine a trenchant protest against authority with a wry parody of enforced sensitivity: if you nice people won’t let me say “penis” or “testicles,” I’ll just call them “junk”; now how do you like that?

But try to think of some equally generous gift to the language, received from 2010. Tell me if you do. I’ll be interested.

The year did afford its share of linguistic monstrosities. It promoted, for example, the further growth of the Great Blob “We.” You know what I mean. Your nurse says, “How are we doing today, Mr. Johnson?” Your boss says, “I think that we [meaning you] had better get that report out right away.” Yesterday a waitperson asked me (I was dining alone), “And how did we like our salad?” I was tempted to reply, “I don’t know; I haven’t had time to poll the rest of us”; but friends have told me that waiters do sometimes spit in your food, so I took refuge in a haughty silence.

All politicians now use “we” to describe themselves. Newt Gingrich was just obeying this professional ethic when, in December, he was interviewed by Fox News about whether he intended to run for president. He replied that “we” were considering it. This makes me wonder how many people may actually be lurking on my ballot, underneath the name of any single candidate that “we” might vote for. It also reminds me irresistibly of those cartoons in which a three-headed monster keeps talking to itself.

But it was Oprah Winfrey who, in 2010, broke all records for “we.” It happened in an interview with Barbara Walters. Barbara asked Oprah about rumors that she was gay, and Oprah responded, “We have said, ‘We are not gay,’ a number of times.” Well, I have never said that, not even once. Have you? But then we weren’t being given the third degree by Barbara Walters.

Nevertheless, “moving forward,” as politicians often said in 2010: the past year not only failed to come up with any colorful new phrases; it was churlish about using old ones that might still have some value. I was astonished by the neglect of a number of venerable expressions that should have seemed perfectly natural, indeed unavoidable, in the context of the year’s political events. These locutions may never have been star players, but their absence from the team made the game a lot less fun to watch.

Who are the war-speakers now? Who claims to be besieged, subverted, held hostage by today’s forces of evil? Why, it’s our pacifist president and his friends, that’s who.

While following the controversy over the tax bill, I was shocked to hear not one satirical reference to the fact that Democrats like to “soak the rich.” And amid the outpouring of sympathy for people who have missed their mortgage payments, I heard not one mention of “giving a hand” to “the deserving poor.” “The poor” no longer exist in our national vocabulary. In this respect, the president is fully representative of leaders left, right, and center: he never talks about “the poor”; he talks exclusively about “the middle class,” or at most about “working families.” (I thought that child labor had been outlawed — except on farms, because farm states have two senators each — but I must have been wrong.) No one ever thinks of po’ folks now.

This is disappointing to me, because I grew up around po’ folks, and a lotta folks I know are still po’. I can’t see why they should be omitted from the glossary, but in 2010 even the professional friends of the working man did exactly that. Obama used the word “folks” with fanatical phoniness, but he didn’t call the poor folks “poor.” I suppose that’s because he and his friends had discovered that really poor people don’t vote, and therefore shouldn’t be noticed, and that relatively poor people always insist that they are middle class.

Relatively rich people do that too. Have you ever met an American who referred to himself as “rich”? There’s no point in debating the question of whether to “soak” the rich. They’re linguistically extinct — except when the Democrats want to increase their taxes. Then, as we discovered in 2010, they become the “super-rich” (i.e., people who make more than $250,000 a year).

That is what the Republicans call “class warfare,” a phrase I am heartily sick of, despite its fair degree of accuracy. The reason I regard it as fairly accurate is that Obama’s leading supporters and administrative fixtures are virtually all super-rich themselves — and I’m not talking about people who make only $250K. I doubt that Obama knows anyone who makes as little as that, or has known anyone who makes as little as that during his own past years of political “service.” But some kind of warfare is going on. The most famous remark that Obama made in 2010 was his crack about Republican congressmen holding “hostages” (i.e., refusing, out of principle, to vote for his legislation). That’s war talk, that is.

If Obama came back, where did he come back from? From his dismally low popularity? From the 9.6% unemployment fostered and protected by his economic policies?

And it’s interesting: starting in the 1960s, “right wing” people were violently attacked by college professors and other kindly, mild-mannered folk for “militarizing” the language — you know, insisting on prosecuting a “cold war” against an “evil empire,” and calling communists “traitors” when they were merely plotting to set up a Stalinist dictatorship. The attack revived after 9/11, when a concerted attempt was made to ban the word “evil” as an aggressive, contemptuous piece of hate speech, reminiscent of . . . er . . . uh . . . Nazis or something. (Gosh, I almost said “radical Islamicists.”) But who are the war-speakers now? Who claims to be besieged, subverted, held hostage by today’s forces of evil? Why, it’s our pacifist president and his friends, that’s who.

The truth is less ideological and more rhetorical. Obama was desperate when he made that statement. He would have said anything if he’d thought it would help. To rescue his political career, he needed to make a deal with the Republicans, but he also needed to conciliate the many members of his party who hate Republicans. He decided that the best way to do it was to show that he, too, hated Republicans. That wasn’t hard, because it was true. He does hate them. So he charged that the Republicans had, in effect, manned up (another ridiculous 2010 expression) and were negotiating with him at the point of a legislative gun. Oh, the humanity! But he had to go along with them, for the sake of the republic.

If you can’t see through this stuff, you’re even more naïve than the New York Times.

But speaking of naïve journalism, this is the time for Word Watch to make its fearless forecast for 2011. Here goes.

During 2011, I envision a more complex linguistic situation than prevailed in 2010. I predict that the nation will be annoyed and harassed, not just by the usual guff, but by three rival political dialects.

1. Conservaspeak

This is a language in which I am well educated, a language that has come pretty naturally to me since I stopped being a leftist several generations ago; but I have to concede that it’s lacking in charm. The Republican leadership, which is not very charming to begin with, will speak continually of “balancing the budget,” “ensuring fiscal responsibility,” “setting the nation’s house in order,” “getting America back to work,” and so on and so on. Sound words, if sincerely spoken — which ordinarily they won’t be. But don’t go to John Boehner or Mitch McConnell for inspiring words. They’re too busy running across the fields, with the Tea Party chasing after them.

2. Progressish

Until 2010, “progressives” were old fogies who believed in everything that appeared in the Socialist Party platform of 1912. They went down to the community center on Friday night and listened to speakers (whom no one but other speakers had ever heard of) explain how Big Oil runs the government and will stop at nothing until it poisons the earth and destroys all its people. Outside of that, they had no life. They all voted enthusiastically for Obama but were then horrified to discover that he wasn’t prepared to outlaw capitalism the very next day. One or two of these advanced thinkers happened to be billionaires and thus managed to get themselves taken semi-seriously, so long as they doled out cash; but that was it.

Then came 2010, and by the time it was over, the most leftward people in the Democratic Party had all declared themselves “progressives” out of frustration with Obama. For one thing, he was a total loser. For another thing, they wouldn’t admit to themselves that the specific reason he had lost the November election was that he had followed their advice and “doubled down” on his least popular policy initiatives. To differentiate their wing of the party from the die-hard Obamaites, they needed their own special word for themselves — and lo! “progressive” was found and seized upon. Suddenly, like some animal species that was thought to be extinct until it blundered into a neighborhood where the garbage wasn’t always picked up on time, “progressives” propagated themselves everywhere. Congress and the old-fashioned media filled up with them, overnight.

The current “progressive” ideology isn’t much worth talking about; it consists largely of the idea that government should always expand exponentially, which it would be doing if the president would only ignore the wishes of nine-tenths of the American populace. The progressish dialect isn’t much fun, either; but it will be very prominent in 2011. Expect to hear much more about “empowerment,” “workers’ rights,” “corporate control,” “masters of war,” “the military-industrial complex,” and other standard shibboleths of the distant Left, as leftists try to hold Obama’s renomination hostage in the temple of their idolatries.

3. Obamablab

This is the worst one.

Obama’s popularity ratings have been in the swamp since mid-2009. His amateurish performance as president resulted in his opponents’ overwhelming victory in the election of 2010. Since that election, his biggest accomplishment has been rounding up enough Democrats to vote for the continuation of the Republican tax cuts he had campaigned against.

Strangely, in response to his questionable achievements a chorus of cheers is now being heard from the loftiest heights of the established media — cheers rendered in a barbaric, virtually untranslatable tongue, full of terms that have no plausible equivalent in normal English. Thus, Obama is complimented for his “thoughtful,” even “deeply intellectual and reflective” leadership, for his “moderation,” his “conciliatory approach,” and his “reaching-across-the-aisle method of government.” He is said to have “emerged victorious” and to have “surprised the pundits” as he “turned the corner” on his “struggle to lead America out of its financial doldrums.” Obama is, in short, “the comeback kid.”

Only an expert on mental illness could comprehend what all this means, but its chief characteristic is clearly its gross dishonesty. If Obama came back, where did he come back from? From his dismally low popularity? From the 9.6% unemployment fostered and protected by his economic policies? From the total disarray of his own party? From any other conditions that were just as evident on November 2 as they are today?

The president is not a kid, and the only way in which he has come back is by means of this hideously contrived and shopworn language. We’ve had a comeback kid before: his name was Bill Clinton. And there has never been a moment when modern liberals were not relabeled, when necessary, as conciliatory “moderates” of a “bipartisan spirit,” “pragmatists” who “govern from the center,” etc. Some years ago, the New York Times declared, in a lead editorial, that Walter Mondale was “a man of principle, who has always had the courage to compromise.”

To conclude. These three dialects are the linguistic survivors of 2010. We’ll have to put up with them. But I can think of a good thing about last year: it appears to have jettisoned one considerable wad of smarm: “transparency.” We used to hear a lot about the cellophane-like “transparency” of the Obama administration. Now it appears, what with the healthcare deals and the taxation deals and the stimulus deals and the immigration deals and the security deals and all the other kinds of deals, that “the era of transparency” was over before it started, banished by the era of obvious lies. And that’s the real come-back kid.






The Simple Life


Remember calculators? How simple. Even my threescore-and-ten-year-old brain could use a calculator without the benefit of a 12-year-old associate offering advice from the sidelines. Naturally, this was B.C. (Before Computers). Then the computer came along, and with much difficulty — much cursing — much advice from mocking 12-year-olds who had found an activity they loved, besides obnoxiousness and noisemaking — my stressed brain learned to operate the device. Or so I thought.

Then “they,” the strange pointy-headed people who lived in the woods and emerged to design software, somehow discovered that even I could use 30% of the functions on the computer. No good. They changed it.

Why, oh why, are they obsessed with change? No sooner do I learn X than they change it to Y.

Highly intelligent but aged minds hate change. “Leave it alone,” says the home page of my 15-year-old Mac, to those people who live in the woods.

It all reminds me of the mania to modify a product just to make it different — to stimulate sales, not efficiency. “Hey look, I’ve got the new whatchamacallit — newest model, makes popcorn, too. Bet your iPad or Raspberry can’t make popcorn.”

Thank goodness, for the moment, we still live in a capitalist society. Companies like profits, and change is often the engine of profit. That’s OK, just give me a choice. If I don’t need to track the number of passengers with green shirts flying out of Kennedy, don’t build it into the “M” key on my keyboard. And don’t ring bells and flash green naked women on my screen so I remember to upgrade to this bizarre requirement.

Because of those technical wood nymphs, change becomes a religion. It doesn’t always bring improvement, but it does always bring complication. There ought to be two streams of development. The first would be like your car. You bought a 2010 Ford; it remains a 2010 Ford. The accelerator never moves from its floorboard position. The instrument panel still indicates miles per hour, not feet per second. My kind of device. The second would be a test of your mental flexibility. Here, everything changes. The accelerator is now the brake. This is for users who like puzzles and are intrigued by how the device operates, not by what it does.

But in the computer world, even if you stick with the same computer, it’s always bugging you to update this or that. And it has clever little tricks. While you’re playing tennis, it swaps out your operating system so you have to call that smart aleck 12-year-old just to send an email. This is a world that worships change — for better or worse.

My pet remembrance of the “fix it even if it ain’t broke” philosophy is the battery-powered watch. Yep, I’m convinced that’s when it all started — a pivotal date in the history of uselessness. Now, I’m not a watchmaker, but batteries cost money and add an item to your “to do” list. And I swear they’re dying sooner and sooner. How long will it be before it’s a daily ritual? And few stores will change a battery.

How hard was it in the old days to give that little stem a few twists? Free twists, I might add. Think about it.

Gotta go now — my computer is groaning, which means that if I don’t install the popcorn app, it’ll erase my files of all stories that contain the word “popcorn.”





© Copyright 2013 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.