Escape from Dannemora


I’m going to say something that many libertarians don’t want to hear: prisons need discipline, and plenty of it.

I’m reflecting on the big news item of the past three weeks, the escape of two convicts from the maximum-security prison in Dannemora, New York, an institution that used to be feared as “Siberia.” They escaped because they were allowed to live in an “honor” block, work with and have sex with civilians, cook their own meals, wear civilian clothes, and enjoy a level of control and discipline that permitted them to acquire power tools and use them to cut holes in their cells and escape. Power tools. Used by men sent to prison for vicious murders, including, in one instance, the dismemberment of the murder victim. Tools freely used, and undetected.

What’s that noise? Is that a guy cutting his way out of prison, or is that just a guy cutting up some other prisoner? Whatever. Have a nice night.

It is one thing to debate about whom to send to prison. It is another thing to screw around with the lives of the people we decide to send there. Because, make no mistake about it, the first victims of convicts who are not controlled are other convicts. If you want the rapes, murders, tortures, and gang aggressions that happen routinely in American prisons to continue to happen, all you need to do is let the bad guys act in whatever way they want. If that’s your “libertarian” philosophy, it will have a big impact, because just one of those bad guys in a prison unit can be enough to ensure the victimization of everybody else.

When there’s a good reason to send somebody to prison — and sometimes there is — you don’t have to give him a life sentence, but you do have to keep him safely inside.

The late Nathan Kantrowitz made this point very powerfully in his exacting study of prison life, Close Control. I followed Nathan’s lead in my own book, The Big House: Image and Reality of the American Prison. I added that, in my judgment, the sorry state of American penology is the result of a vicious convergence of modern liberal and modern conservative ideas. The conservatives want to lock people up, and do it on the cheap. And it’s true, you can get a lot of non-discipline and non-control, very cheaply indeed. The liberals believe that convicts are somehow rehabilitated by being allowed to wear their own clothes, cook their own meals, and wander about the joint, victimizing anyone who’s weaker than they are. (Unnoticed by the conservatives, the liberals also ordained that those who run the prison system would get paid enough to give them 15 degrees at Harvard. They’re unionized, after all, and they’re the biggest sinkhole in many state budgets.)

Libertarians ought to be smarter. When there’s a good reason to send somebody to prison — and sometimes there is — you don’t have to give him a life sentence, but you do have to keep him safely inside, and safe from victimizing or being victimized by other prisoners.

In the short run, whoever is running the New York prison system needs to be fired, immediately. (I’m afraid that the fix is already in, and this will never happen.) In the middle run, real investigations of penology — not ideological declarations about penology, from any vantage point — need to be conducted, so that people can learn what the few good scholars, such as Nathan Kantrowitz, have already established. In the long run, Americans should stop making savage jokes about rape in prison and start considering the steps that are necessary to keep rape and murder, and by the same token, escape, from happening in their supposedly secure institutions.






The Lady and the Tigers


Tippi Hedren is the actress whose intelligence illuminated Alfred Hitchcock’s The Birds. Now, 52 years later, Hedren is still an illuminating person — as shown by her powerful performance at a hearing managed by the board of asses who are in charge of California’s “bullet train.”

The train — which does not exist and may never exist — is the biggest (putative) construction project in American history, and perhaps the biggest boondoggle, a reductio ad absurdum of “planning for the environment,” “planning for energy conservation,” and all the rest of it. Its cost estimates are 600% higher than the voters thought they were mandating, and this is one reason the majority of voters now wish they hadn’t listened to propaganda for the project. They agreed to build a railroad that would deliver passengers from Los Angeles to San Francisco in a time substantially less than three hours. It’s now clear that there’s no possibility this can happen, or anything close to it, no matter where the rail line is put. But no one knows where it will be put. The managers of the project, the California High Speed Rail Authority, insensately determined to carry on despite the many kinds of fools they are making of themselves, are still deciding which communities they’re going to unleash their bulldozers on.

Another assumption is that it’s efficient to destroy a series of towns in the pursuit of what is in fact slow-speed rail.

They have to hold public hearings about this. Unfortunately, they don’t have to listen to what they hear at them, and they don’t. The latest hearing involved outraged residents of several Southern California towns that may be devastated by the train. One of them is Acton, where Hedren operates an animal-rescue preserve that caters to big cats. So Hedren showed up at the hearing.

Dan Richard, chairman of the Authority, used the occasion to pontificate: “What we’re building here, by the way, in high-speed rail, is the most efficient way to deal with our transportation needs of the future.” “By the way?” The rhetoric is almost as condescending as the statement itself, which assumes that its audience is stupid enough to believe it’s efficient to spend at least $100 billion to propel a few hundred people a day to a destination they could have reached more quickly and cheaply by air. Another assumption is that it’s efficient to destroy a series of towns in the pursuit of what is in fact slow-speed rail.

Hedren, 85, identified the problem with people who make statements like that: "You don't listen, you don't care. . . . You are going to take this beautiful little town of Acton . . . and you are going to destroy it with this train." Then she mentioned the lions and tigers she cares for (but has no illusions about). "I am more afraid of you," she told the planners.






Fakers and Enablers


Last month, a UCLA graduate student in political science named Michael LaCour was caught faking reports of his research — research that in December 2014 had been published, with much fanfare, in Science, one of the two most prestigious venues for “hard” (experimental and quantifiable) scientific work. Because of his ostensible research, he had been offered, again with much fanfare, a teaching position at prestigious Princeton University. I don’t want to overuse the word “prestigious,” but LaCour’s senior collaborator, a professor at prestigious Columbia University, a person whom he had enlisted to enhance the prestige of his purported findings, is considered one of the most prestigious number-crunchers in all of poli sci. LaCour’s dissertation advisor at UCLA is also believed by some people to be prestigious. LaCour’s work was critiqued by presumably prestigious (though anonymous) peer reviewers for Science, and recommended for publication by them. What went wrong with all this prestigiousness?

Initial comments about the LaCour scandal often emphasized the idea that there’s nothing really wrong with the peer review system. The New Republic was especially touchy on this point. The rush to defend peer review is somewhat difficult to explain, except as the product of fears that many other scientific articles (about, for instance, global warming?) might be suspected of being more pseudo than science; despite reviewers’ heavy stamps of approval, they may not be “settled science.” The idea in these defenses was that we must see l’affaire LaCour as a “singular” episode, not as the tin can that’s poking through the grass because there’s a ton of garbage underneath it. More recently, suspicions that Mt. Trashmore may be as high as Mt. Rushmore have appeared even in the New York Times, which on scientific matters is usually more establishment than the establishment.

I am an academic who shares those suspicions. LaCour’s offense was remarkably flagrant and stupid, so stupid that it was discovered at the first serious attempt to replicate his results. But the conditions that put LaCour on the road to great, though temporary, success must operate, with similar effect, in many other situations. If the results are not so flagrantly wrong, they may not be detected for a long time, if ever. They will remain in place in the (pseudo-) scientific literature — permanent impediments to human knowledge. This is a problem.

But what conditions create the problem? Here are five.

1. A politically correct, or at least fashionably sympathetic, topic of research. The LaCour episode is a perfect example. He was purportedly investigating gay activists’ ability to garner support for gay marriage. And his conclusion was one that politically correct people, especially donors to activist organizations, would like to see: he “found” that person-to-person activism works amazingly well. It is noteworthy that Science published his article about how to garner support for gay marriage without objecting to the politically loaded title: “When contact changes minds: An experiment on transmission of support for gay equality.” You may think that recognition of gay marriage is equivalent to recognition of gay equality, and I may agree, but anyone with even a whiff of the scientific mentality should notice that “equality” is a term with many definitions, and that the equation of “equality” with “gay marriage” is an end-run around any kind of debate, scientific or otherwise. Who stands up and says, “I do not support equality”?

The idea in these defenses was that we must see l’affaire LaCour as a “singular” episode, not as the tin can that’s poking through the grass because there’s a ton of garbage underneath it.

2. The habit of reasoning from academic authority. LaCour’s chosen collaborator, Donald Green, is highly respected in his field. That may be what made Science and its peer reviewers pay especially serious attention to LaCour’s research, despite its many curious features, some of which were obvious. A leading academic researcher had the following reaction when an interviewer asked him about the LaCour-Green contribution to the world’s wisdom:

“Gee,” he replied, “that's very surprising and doesn't fit with a huge literature of evidence. It doesn't sound plausible to me.” A few clicks later, [he] had pulled up the paper on his computer. “Ah,” he [said], “I see Don Green is an author. I trust him completely, so I'm no longer doubtful.”

3. The prevalence of the kind of academic courtesy that is indistinguishable from laziness or lack of curiosity. LaCour’s results were counterintuitive; his data were highly exceptional; his funding (which turned out to be bogus) was vastly greater than anything one would expect a graduate student to garner. That alone should have inspired many curious questions. But, Green says, he didn’t want to be rude to LaCour; he didn’t want to ask probing questions. Jesse Singal, a good reporter on the LaCour scandal, has this to say:

Some people I spoke to about this case argued that Green, whose name is, after all, on the paper, had failed in his supervisory role. I emailed him to ask whether he thought this was a fair assessment. “Entirely fair,” he responded. “I am deeply embarrassed that I did not suspect and discover the fabrication of the survey data and grateful to the team of researchers who brought it to my attention.” He declined to comment further for this story.

Green later announced that he wouldn’t say anything more to anyone, pending the results of a UCLA investigation. Lynn Vavreck, LaCour’s dissertation advisor at UCLA, had already made a similar statement. They are being very circumspect.

4. The existence of an academic elite that hasn’t got time for its real job. LaCour asked Green, a virtually total stranger, to sign onto his project: why? Because Green was prestigious. And why is Green prestigious? Partly for signing onto a lot of collaborative projects. In his relationship with LaCour, there appears to have been little time for Green to do what professors have traditionally done with students: sit down with them, discuss their work, exclaim over the difficulty of getting the data, laugh about the silly things that happen when you’re working with colleagues, share invidious stories about university administrators and academic competitors, and finally ask, “So, how in the world did you get those results? Let’s look at your raw data.” Or just, “How did you find the time to do all of this?”

LaCour’s results were counterintuitive; his data were highly exceptional; his funding was vastly greater than anything one would expect a graduate student to garner.

It has been observed — by Nicholas Steneck of the University of Michigan — that Green put his name on a paper reporting costly research (research that was supposed to have cost over $1 million), without ever asking the obvious questions about where the money came from, and how a grad student got it.

“You have to know the funding sources,” Steneck said. “How else can you report conflicts of interest?” A good point. Besides — as a scientist, aren’t you curious? Scientists’ lack of curiosity about the simplest realities of the world they are supposedly examining has often been noted. It is a major reason why the scientists of the past generation — every past generation — are usually forgotten, soon after their deaths. It’s sad to say, but may I predict that the same fate will befall the incurious Professor Green?

As a substitute for curiosity, guild courtesy may be invoked. According to the New York Times, Green said that he “could have asked about” LaCour’s claim to have “hundreds of thousands in grant money.” “But,” he continued, “it’s a delicate matter to ask another scholar the exact method through which they’re paying for their work.”

There are several eyebrow-raisers there. One is the barbarous transition from “scholar” (singular) to “they” (plural). Another is the strange notion that it is somehow impolite to ask one’s colleagues — or collaborators! — where the money’s coming from. This is called, in the technical language of the professoriate, cowshit.

The fact that ordinary-professional, or even ordinary-people, conversations seem never to have taken place between Green and LaCour indicates clearly enough that nobody made time to have them. As for Professor Vavreck, LaCour’s dissertation director and his collaborator on two other papers, her vita shows a person who is very busy, very busy indeed, a very busy bee — giving invited lectures, writing newspaper columns, moderating something bearing the unlikely name of the “Luskin Lecture on Thought Leadership with Hillary Rodham Clinton,” and, of course, doing peer reviews. Did she have time to look closely at her own grad student’s work? The best answer, from her point of view, would be No; because if she did have the time, and still ignored the anomalies in the work, a still less favorable view would have to be entertained.

This is called, in the technical language of the professoriate, cowshit.

Oddly, The New Republic praised the “social cohesiveness” represented by the Green-LaCour relationship, although it mentioned that “in this particular case . . . trust was misplaced but some level of collegial confidence is the necessary lubricant to allow research to take place.” Of course, that’s a false alternative — full social cohesiveness vs. no confidence at all. “It’s important to realize,” opines TNR’s Jeet Heer, “that the implicit trust Green placed in LaCour was perfectly normal and rational.” Rational, no. Normal, yes — alas.

Now, I don’t know these people. Some of what I say is conjecture. You can make your own conjectures, on the same evidence, and see whether they are similar to mine.

5. A peer review system that is goofy, to say the least.

It is goofiest in the arts and humanities and the “soft” (non-mathematical) social sciences. It’s in this, the goofiest, part of the peer-reviewed world that I myself participate, as reviewer and reviewee. Here is a world in which people honestly believe that their own ideological priorities count as evidence, often as the determining evidence. Being highly verbal, they are able to convince themselves and others that saying “The author has not come to grips with postcolonialist theory” is on the same analytical level as saying, “The author has not investigated the much larger data-set presented by Smith (1997).”

My own history of being reviewed — by and large, a very successful history — has given me many more examples of the first kind of “peer reviewing” than of the second kind. Whether favorable or unfavorable, reviewers have more often responded to my work on the level of “This study vindicates historically important views of the text” or “This study remains strangely unconvinced by historically important views of the episode,” than on the level of, “The documented facts do not support [or, fully support] the author’s interpretation of the sequence of events.” In fact, I have never received a response that questioned my facts. The closest I’ve gotten is (A) notes on the absence of any reference to the peer reviewer’s work; (B) notes on the need for more emphasis on the peer reviewer’s favorite areas of study.

This does not mean that my work has been free from factual errors or deficiencies in the consultation of documentary sources; those are unavoidable, and it would be good for someone to point them out as soon as possible. But reviewers are seldom interested in that possibility. Which is disturbing.

I freely admit that some of the critiques I have received have done me good; they have informed me of other people’s points of view; they have shown me where I needed to make my arguments more persuasive; they have improved my work. But reviewers’ interest in emphases and ideological orientations rather than facts and the sources of facts gives me a very funny feeling. And you can see by the printed products of the review system that nobody pays much attention to the way in which academic contributions are written, even in the humanities. I have been informed that my writing is “clear” or even “sometimes witty,” but I have never been called to account for the passages in which I am not clear, and not witty. No one seems to care.

But here’s the worst thing. When I act as a reviewer, I catch myself falling into some of the same habits. True, I write comments about the candidates’ style, and when I see a factual error or notice the absence of facts, I mention it. But it’s easy to lapse into guild language. It’s easy to find words showing that I share the standard (or momentary) intellectual “concerns” and emphases of my profession, words testifying that the author under review shares them also. I’m not being dishonest when I write in this way. I really do share the “concerns” I mention. But that’s a problem. That’s why peer reviewing is often just a matter of reporting that “Jones’ work will be regarded as an important study by all who wish to find more evidence that what we all thought was important actually is important.”

You can see by the printed products of the review system that nobody pays much attention to the way in which academic contributions are written, even in the humanities.

Indeed, peer reviewing is one of the most conservative things one can do. If there’s no demand that facts and choices be checked and assessed, if there’s a “delicacy” about identifying intellectual sleight of hand or words-in-place-of-ideas, if consistency with current opinion is accepted as a value in itself, if what you get is really just a check on whether something is basically OK according to current notions of OKness, then how much more conservative can the process be?

On May 29, when LaCour tried to answer the complaints against him, he severely criticized the grad students who had discovered, not only that they couldn’t replicate his results, but that the survey company he had purportedly used had never heard of him. He denounced them for having gone off on their own, doing their own investigation, without submitting their work to peer review, as he had done! Their “decision to . . . by-pass the peer-review process” was “unethical.” What mattered wasn’t the new evidence they had found but the fact that they hadn’t validated it by the same means with which his own “evidence” had been validated.

In medicine and in some of the natural sciences, unsupported guild authority does not impinge so greatly on the assessment of evidence as it does in the humanities and the social sciences. Even there, however, you need to be careful. If you are suspected of being a “climate change denier” or a weirdo about some medical treatment, the maintainers of the status quo will give you the bum’s rush. That will be the end of you. And there’s another thing. It’s true: when you submit your research about the liver, people will spend much more time scrutinizing your stats than pontificating about how important the liver is or how important it is to all Americans, black or white, gay or straight, that we all have livers and enjoy liver equality. But the professional competence of these peer reviewers will then be used, by The New Republic and other conservative supporters of the status quo in our credentialed, regulated, highly professional society, as evidence that there is very little, very very very little, actual flim-flam in academic publication. But that’s not true.







Confused on the Concept


Sen. Dianne Feinstein (D-CA) is one of those “liberals” who cannot resist the temptation to invent new rights (for government) and destroy old ones (for people). In response to the latest round of terror conspiracy charges, she has issued a public statement, which reads as follows:

I am particularly struck that the alleged bombers made use of online bombmaking guides like the Anarchist Cookbook and Inspire Magazine. These documents are not, in my view, protected by the First Amendment and should be removed from the Internet.

I am particularly struck by the senator’s inability to distinguish reading about something from doing it. Perhaps she believes that no one should know the chemical composition of dynamite, because such knowledge might be used to destroy a public building. Perhaps she believes that Hitchcock’s movies should be banned, because they show how to kill people with knives, scissors, and birds. Perhaps she is accustomed to rushing on stage to keep Macbeth from killing the king.

Or perhaps she is merely a typical American politician, busy about her work of ruining concepts she is incapable of understanding.






Pulp


When I was in grade school, a neighbor had an unfinished basement room, all studs and drywall, filled with paperback science fiction books and magazines. I was given free rein to browse and borrow. It was a treasure trove.

Among the things I read was the 1951 short story “The Marching Morons,” by Cyril M. Kornbluth. It takes place in a distant future where, because of adverse genetic selection, the average IQ has fallen to 45.

A detail of the story that has stayed with me was the marketing of cars in that imaginary distant future. The cars weren’t very fast or powerful, so they were fitted out with electronic sound effects that made them sound like rolling thunder.


Reading the Washington Post the other day, I stumbled upon this:

For the 2015 Mustang EcoBoost, Ford sound engineers and developers worked on an “Active Noise Control” system that amplifies the engine’s purr through the car speakers . . .

Ford said in a statement that the vintage V-8 engine boom “has long been considered the mating call of Mustang,” but added that the newly processed pony-car sound is “athletic and youthful,” “a more refined growl” with “a low-frequency sense of powerfulness.”


Welcome to the future.






The Battle of the Resumes


Maureen Dowd’s new column about Hillary Clinton convinces me that I am not the only one who smells something peculiarly sick and rotten in presidential politics.

On one side, we have Hillary Clinton, who presents a resume for high office with these major bullet points:

  1. Partnership in a radically dysfunctional marriage with a discredited former president, specializing in cheating and sleazing.
  2. Female gender.
  3. A long string of jobs — partner in a provincial law firm, power behind a throne, United States senator, secretary of state — which she survived, innocent of credit for any specific accomplishment.
  4. Proven ability to cadge money from Near Eastern religious fanatics, one-dimensional feminists, crony capitalists, and other people with hands out for favors.
  5. Proven ability, acquired from her husband (see 1, above), to operate (with the help of 4, above) a political mafia.
  6. Proven ability to tell nothing but lies.
  7. Proven ability to deliver any desired quantity of self-righteous statements about other people’s duties.

On the opposite side, we have John Ellis (“Jeb”) Bush, whose resume emphasizes these points:

  1. Membership in a family that includes two abjectly unsuccessful presidents.
  2. Modest success as governor of Florida.
  3. Proven ability to cadge money from “moderate” (i.e., non) Republicans and crony capitalists devoted to cheap labor, open immigration, and votes for Dems.
  4. Proven ability to lose votes from anyone to the right of Anderson Cooper.
  5. Proven ability to look stupid on any public occasion.
  6. Proven ability to deliver any desired quantity of self-righteous statements about other people’s duties.

It’s remarkable that everyone who has any knowledge of politics has read these resumes, understands them, and talks about them as if they were plastic disks in a checkers game.

Well, almost everyone. Dowd, for all her leftist craziness, is a respectable person.

But let’s see . . . Who has the longer resume? Jeb or Hillary?

It’s Hillary! She wins!

Can it be that in today’s America, or any other country, this is how things happen?






Obama's ISIS Strategy: Death by Flatulence


The more the Obama administration talks about the war on terrorism, the less we know. What are we fighting? Is it violent extremism or radical Islam? OK, it's actually radical Islam (we only need to kill jihadists, not all Muslims); the term "violent extremism" is less offensive to violent Islamists and no one cares about its repugnance to non-Muslim violent extremists — a subset in the Venn diagram of terrorism that is imperceptible to all but a handful of White House officials.

But is it Sunni radical Muslims or Shiite radical Muslims that are the problem? Or both? (And who are we to make such judgments — after the Crusades and all?) Do we need to worry about Iran, with its expanding regional hegemony, soon to be bolstered by nuclear weapons? Or Iraq, which, having been abandoned by the US in 2010, has descended into barbaric chaos with the Sunni Islamic State of Iraq and al-Sham (ISIS) running amok throughout its north, and equally vicious Iranian militia groups running amok everywhere else? Or both?

And what about the original Syrian rebels, valiantly fighting Bashar Assad? When, in 2011, the civilian death toll from Assad's brutal regime had reached 2,000, a horrified Mr. Obama declared that Assad must step aside. Yet, after drawing his famous red line, it was Obama who stepped aside, allowing both ISIS and Iranian thugs to trespass into Syria. What are we to make of Obama's silence today, when the Syrian death toll exceeds 200,000? And, as Hezbollah fighters and Iran’s Revolutionary Guard Corps (IRGC) creep into the Golan Heights and Hamas wages war in Gaza, why has Mr. Obama become displeased with Israeli prime minister Netanyahu? Is it time to abandon Israel?

When it comes to facing ISIS on the ground, those with the most to lose have the greatest aversion to do so.

Some experts believe that if we (Western infidels) knew what radical Muslims wanted, then a reasonably peaceful coexistence agreement could be reached. But, as President Obama is discovering in his negotiations with Iran, even when we know what radical Muslims want, compromise is a charade, with reason playing, at best, a bit part to concession.

Despite his Herculean appeasement efforts, Obama has been unable to convince Iran to abandon its nuclear weapons program. His support for President Nouri al-Maliki (a puppet of Iran) and his (Maliki's) violent purge of Sunni participation in Iraqi government affairs; his hasty withdrawal of American military forces — just when the Bush-Petraeus surge had stabilized the country and Vice President Biden was gleefully declaring that Iraq was "going to be one of the great achievements of this administration"; his refusal to help the Kurds fight ISIS militants; his blind eye to the spread of Shiite terrorism in Syria, Iraq, Lebanon, Yemen, and Gaza — all has been for naught.

In 2012, Obama issued a crystal clear promise to "do whatever it takes to prevent Iran from producing an atomic bomb." That promise became nebulous with a November 2013 agreement to forge, within six months, a treaty to freeze or reverse progress at all of Iran’s major nuclear facilities. Today, as the delays (and the relaxation of economic sanctions against Iran) continue, Obama's promise is idle. The mullahs, who have been playing him for a sucker all along, will get their bomb. Obama can only hope for a toothless treaty that postpones Iran's acquisition of a functioning ICBM system — until after he leaves office, when nuclear proliferation in the Middle East will become his successor's problem.

As al Qaeda continues to be a grave threat, Mr. Obama has convinced himself that for ISIS — the now much larger threat — we can pretend that everything's going to be OK.

We also know what Sunni Muslim radical organizations such as ISIS want. They tell us, loudly and unequivocally: 7th-century Islam, a caliphate, with sharia law, and remorseless death to all who interfere. That they are pathologically indifferent to diplomacy, negotiation, or compromise is demonstrated in a relentless parade of choreographed atrocities: decapitation, crucifixion, immolation, torture, rape, slavery, and mass murder, to name a few. In his brilliant and disturbing exposé, “What ISIS Really Wants,” Graeme Wood elucidates,

We can gather that their state [ISIS] rejects peace as a matter of principle; that it hungers for genocide; that its religious views make it constitutionally incapable of certain types of change, even if that change might ensure its survival; and that it considers itself a harbinger of — and headline player in — the imminent end of the world.

Wood suspects that, in the past year, President Obama's confusion over the nature of ISIS "may have contributed to significant strategic errors." The confusion extends much further back. As ISIS marauded into Iraq in late 2013, Obama may have believed that he could reason with Abu Bakr al-Baghdadi, the leader of what Obama perceived to be the al Qaeda JV team. However, already embroiled in the war against terrorism and fully aware of ISIS's fanatical designs on Iraq, he might instead have followed the advice of Benjamin Franklin, arguably the finest diplomat in US history, who knew that sometimes "force shites on the back of reason." Had Obama chosen this path, any time before January 3, 2014, the day when Fallujah fell to al-Baghdadi's brutal thugs, would have been a fine time for overwhelming military force to shit on the back of ISIS.

No such force came. Unchallenged, ISIS continued its rapid expansion, conquering most of northern Iraq by early June, when it captured the city of Mosul. It wasn't until August, when American journalist James Foley was beheaded, that Obama sprang into action — in a press briefing, where the president announced, to the dismay of our allies in the Middle East and Europe, that he had no strategy.

By the following week, however, he had hastily cobbled together a plan to "degrade and ultimately defeat" ISIS. Enlisting the aid of allies (nine, initially), it would involve air strikes against ISIS targets in Iraq and no American "boots on the ground" anywhere. With Syria but a tattered impression in his entangled memory, Secretary of State John Kerry spouted, "Obviously I think that's a red line for everybody here." ISIS poses no existential threat to the US, yet. The immediate threat is to Iraq, to the oil-producing monarchies of the Arabian Peninsula, and, to a lesser extent, to Europe. When it comes to facing ISIS on the ground, those with the most to lose have the greatest aversion to doing so.

Only the Kurds have been willing to face ISIS. Apart from Israel, they are our only true ally in the region. They struggle alone, except for sporadic US air support. Their weapons are obsolete; the ISIS attackers wield vastly superior American weapons, stolen from the Iraqi military. Kurdish pleas for such weapons have met with nothing but Obama's shameless refusal.

Our other Middle East allies meekly stand by, partly because of their reluctance to face any grueling warfare, but also, perhaps more significantly, because of their suspicions about Obama. They are Sunnis, who, while appreciating Obama's dilemma in Syria (where he can't bomb ISIS without helping Assad), are deeply troubled by his concessions to Iran — a Shiite juggernaut feared more than ISIS. Why should they follow a leader whose ultimate sympathies lie with their ultimate enemy?

President Obama entered office vowing to deliver on his campaign pledge to improve America's image in the Middle East. Apologizing for America's arrogance (including the War in Iraq, torture, Gitmo, and more), he did his best to ingratiate himself with the Muslim world. He did, however, warn that "al Qaeda is still a threat and that we cannot pretend somehow that because Barack Hussein Obama got elected as president suddenly everything's going to be OK."

But ending the Iraq War did not win the favor of Islam. Indeed, Obama's hasty withdrawal from Iraq (against the wishes of his military advisors) thrust that country into a violent chaos that destroyed what he himself had called "a sovereign, stable and self-reliant Iraq" and touted as "an extraordinary achievement." It allowed ISIS to be created — reconstituted from the remnants of al Qaeda in Iraq (AQI) that had been defeated by the Bush-Petraeus surge. Given Obama's pre-announced 2016 exit, Afghanistan is likely to follow the same trajectory. And we were kicked out of Libya, Yemen, and Syria by Sunni Muslim terrorists, Shiite Muslim terrorists, and Vladimir Putin, respectively. So much for America's image.

As al Qaeda continues to be a grave threat, Mr. Obama has convinced himself that for ISIS — the now much larger threat — we can pretend that everything's going to be OK. In his recent Vox interview, he asserted that the media exaggerate terrorism and that climate change and epidemic disease may be more important issues. He conceded that it is legitimate for Americans to be concerned "when you've got a bunch of violent, vicious zealots who behead people or randomly shoot a bunch of folks in a deli in Paris," fastidiously avoiding, of course, any association with radical Islam. We should not be alarmed by the organization that he once dismissed as a JV team, and now dismisses as a caliphate, believing that it will collapse under its own weight. Says Obama, "It [ISIS] can talk about setting up the new caliphate but nobody is under any illusions that they can actually, you know, sustain or feed people or educate people or organize a society that would work."

Nevertheless, with the gruesome ISIS murders, in early February, of a Japanese journalist (beheaded), a Jordanian pilot (burned alive in a cage), and 21 Egyptian Christians (beheaded), Obama was spurred to action. He convened a global summit in Washington, DC, where leaders from 60 countries came to combat "violent extremism" — by the surprising method of "empowering local communities" that can provide "economic, educational and entrepreneurial development so people have hope for a life of dignity." Said the president, "We can help Muslim entrepreneurs and youths work with the private sector to develop social media tools to counter extremist narratives on the Internet." To that end, the State Department promptly opened 350 Twitter accounts (designed, apparently, to deluge the violent extremists with clever anti-barbarism tweets) and a new website: "The Solution to Violent Extremism Begins in Your Community."

Strangely, they are serious. Violent extremism, says John Kerry, is "the defining fight of our generation." Back in the real world, however, it is quite astonishing that Obama has been unable to convince countries such as Saudi Arabia, Turkey, and the Gulf states to join the fight against ISIS. These Sunni Muslim nations, having the most to lose, should be the most willing to put their own boots on the ground. Nothing would please America more than to see Arab Muslim soldiers at the forefront of Obama's campaign to "degrade and ultimately defeat" ISIS. Should this happen, I am sure that Christians, Jews, and those of other faiths would march together with Muslim Americans through the streets of America, cheering for our president and praising his inspired leadership.

But it's not likely. Obama's goal may be to defeat ISIS, but his strategy is based on constraint: can't bomb Syria, can't cross Kerry's red line, can't jeopardize negotiations with Iran, can't offend Islam, can't capture terrorists, and so forth. Such a strategy, together with his indecisiveness and distaste for military force, crowds out the possibility of victory. Besides, even if ISIS is defeated, al Qaeda and numerous other radical Muslim organizations remain — not to mention Iran, an immensely virulent terrorist state already in existence, on the fast track to nuclear weapons.

President Obama, therefore, has retreated to his community organizer roots, where he finds, as chief weapons against Islamic terrorism: political rhetoric, social media, and hope — hope that ISIS self-destructs, that budding terrorists find jobs, that Iran abandons its nuclear ambitions, that pithy tweets will curb terrorist atrocities and stymie terrorist recruitment, and that the media stops exaggerating the barbarous acts committed, as Obama is careful to insist, by "individuals from various religions."

Hope could work. It has worked very well for Obama in the past. After all, it's how he was elected president. On the other hand, in Poor Richard’s Almanack, Franklin also warned, "He that lives upon Hope, dies farting."






If Ever, Oh Ever, a Wiz There Was


Entering his capital in triumph after a desperately hard campaign, Frederick the Great rode with his eyes forward, refusing to acknowledge the cheers of the crowd. An aide said, “Sire, the people are applauding you.” Frederick, eyes still resolutely on the road, replied, “If you put a monkey on this horse, the people would applaud him.”

That is my idea about news “anchors” such as Brian Williams.

It’s not everybody’s idea. On Feb. 8, in one of the last columns he wrote before his untimely death, David Carr said:

For some time now, there have been two versions of Brian Williams. One is an Emmy-winning, sober, talented anchor on the “NBC Nightly News” and the other is a funny, urbane celebrity who hosts “Saturday Night Live,” slow-jams the news with Jimmy Fallon and crushes it in every speech and public appearance he makes.

Each of those personas benefited the other, and his fame and appeal grew accordingly, past the anchor chair he occupied every weeknight and into a realm of celebrity that reaches all demographics and platforms. Even young people who wouldn’t be caught dead watching the evening news know who Mr. Williams is.

I’m citing this as a good example of the strange idea that there was a before and after to the Williams story — a before in which Williams was not just a celebrity but a funny, urbane, talented, appealing celebrity, and an after in which he was a dope and a blowhard, always telling ridiculous stories about himself.

As the add-on adjectives attest, one can be a celebrity without having any attractive qualities at all. That has been obvious for some time, but it's still interesting to know. Unfortunately, it's also evident that one doesn't need to do much in order to be regarded as funny, urbane, talented, and appealing. To me, and to hundreds of millions of other people, there was never anything remarkable about Brian Williams. I don't regard slow-jamming the news with Jimmy Fallon as something that requires a lot of talent. Williams never crushed it with me.

Williams’ talent, such as it was, consisted merely of being a news anchor who did things that are usually not associated with being a news anchor. Lots of people are celebrities for reasons like that. Preachers and politicians get loud laughs when they tell a joke, but only because people think it’s amusing that someone with such a dull job can tell any joke at all. The animals on YouTube are considered amusing for doing things that any dull, ordinary person does every day; their talent is merely being animals that are trying to do those things. But if you found out that the dog wasn’t really a dog, or the cat wasn’t really a cat, or the news anchor wasn’t anchoring much of anything, no one would want to watch any of the supposedly amusing antics. And being a news anchor requires a lot less than being a dog or cat.

I am old enough to have been a victim of the Age of Cronkite, an age now deeply venerated by a lot of people who believe that at some time in the past there really was a Wizard of Oz. I say “victim” because in those days there was no national electronic news except the offerings of the three government-licensed networks. Cable TV — always called, suspiciously, “pay TV” — did not exist. Basically, it was illegal. So, for lack of competition, a complacent man of modern-liberal ideas who was capable of reading a few minutes of typescript, crying when Democrats were hurt or killed, and, essentially, reprocessing news releases from the White House (or, in times of Republican administrations, from opponents of the White House) was worshiped as a god. At the time, however, he was worshiped by nobody except people whose own intellectual equipment was so faulty that their fondest hope was to be like him.

I know of one “news anchor” who was smart and knowledgeable and a good writer of his own books. That was David Brinkley. There used to be a cable anchor who was even better than Brinkley, Brit Hume of Fox News. But Brinkley is dead, and Hume is retired. Compared with these respectable figures, Walter Cronkite was the little mouse you see in the diorama of North American mammals, nibbling seeds at a fearful distance from the lordly elk. Brian Williams is down the hall, in the insect diorama.

This kind of comparison is actually too good to waste on such a lowly subject as Williams. It would be more appropriate for creatures with real significance — dictators, kings, and presidents. In the presidential diorama, the elk herd would feature such important fauna as Washington, Jefferson, Jackson, Cleveland (who commanded his aides to “tell the truth,” and meant it); the mice would be Monroe, Benjamin Harrison, Hoover, and so on; and the insects would be Tyler, Carter, Clinton, Bush (the second), and Obama. Yes, I know, we may need to bring more animals into the metaphor. But the curious — or curiously predictable — thing is that Williams actually aspired to be one of the celebrity insects, who in our times are happiest scurrying about in their hard little bodies, irritating everybody else into noticing them.

In a documentary filmed in 2006 about Hurricane Katrina, which in 2005 flooded low-lying parts of New Orleans, and which with a lot of help from Williams enraged the nation at the inability of Republican administrations to overrule acts of God, Williams boasted: “People say we found our voice on this story, after some long, cold years of one Bush term and some change.” What? What did he mean by that?

He provided part of an explanation in a speech at Temple University last October. He was there, as Tim Graham put it in NewsBusters on Feb. 10, to pick up “an award for ‘excellence.’” They give each other awards, these excellent people. And if they have to lie, well what the hell? "I have seen thousands of dead people in different places," Williams claimed, erroneously. Then he demanded the reward of sympathy for his own imaginary suffering. Speaking of himself, he said, "You have to find a place to put that [his erroneous memories] or else you can't get up in the morning." Mental image of Williams, looking for a cupboard in which to store imaginary deaths.

After that outburst, he turned to his reason for hearing a mighty significance in the “voice” he “found” in New Orleans — in the tale he told (with suitable adjustments, over the years, such as seeing dead bodies floating around the streets, or nearly dying, himself, of dysentery, or being threatened by gangs that busted into his hotel) about the New Orleans hurricane. I apologize for the syntax of the following quotation from Williams. You have to realize that this is how talented network news anchors (pay rate: $10 million a year) talk when they’re off-script:

For what it meant to our society. For what it still means. The issues. Race. The environment. Energy. Justice. The lack of it. It's all still there.

Now really, what can you make of that? Beneath the total incoherence appears to lurk some claim that by reporting (falsely) on a flood, Williams was somehow addressing issues of race (granted, most of the people who were flooded out by Katrina were African-Americans), energy (huh?), justice (it’s unjust to be flooded by a hurricane?), and “the environment.” The only way in which that last phrase makes sense is to assume that Williams is indicting Mother Nature for being, as Tennyson called her, “red in tooth and claw.” But I’m sure that’s not what he meant.

When something is really bad, it’s impossible to parody. Its literary effect cannot be enhanced, no matter what you do. But the political and social interest of Williams’ bizarre statements has not been fully developed. The big question is, why didn’t somebody at NBC stop him from saying all this crap? Everybody knew he was saying it, over and over again, for years. And I’m not just depending on anonymous sources to make that allegation. Given the nature of television broadcasting, it has to be true.

To me, one of the most amazing things in the world is that people give some kind of credence to the word “reality” in connection with what they see on television. Consider the term “reality TV.” Twenty feet away from those morons who face the camera and pour out their hearts about how lonely and helpless they’re feeling is a crowd of photographers, directors, producers, make-up artists, best boys, gophers, and people whose jobs cannot be described. In the same way, whenever Brian Williams had himself photographed in some bold act of “reporting,” he was surrounded by network tenders, every one of whom knew what he was doing, and knew it was crap. When he dribbled out his life story to interviewers on other media, hundreds of people back at NBC were following the publicity it gave him, and them. They knew, all right. The social and political question is, why did they let it go on, for more than a decade?

One answer is that they were lazy. But that’s the wrong answer. People who have responsible positions with TV networks aren’t sleepy little puppies; they’re sleepless sharks, required to compete with other sleepless sharks. OK, try this: nothing was done about Williams because he was being paid ten million dollars a year, and you don’t mess with that kind of investment. If you do, the investment will make like a shark and mess with you. That’s a better answer. And maybe it’s a sufficient one, although it doesn’t account for the behavior of the top predators, the ones who were doling out the money and should have been more risk-averse.

A third answer, which may be true, or partially true, is that Williams was protected by his dopey, inarticulate, yet constant political correctness. Here is the guy who interviewed the last President Bush, long after he had left office, and expressed astonishment that his recent reading matter could actually have included a Camus novel and three of Shakespeare’s plays. Astonishment. To the man’s face. Anyone not a dopey leftist would have said, “Oh, Mr. President, what impressed you most about those works?” But Williams is just dopey enough to believe his own dopey propaganda. He believed that Bush was dumb, and he didn’t know how to deal with the counter-evidence. (Or, probably, with the literary conversation that might have ensued.)

I hope you won’t write in to accuse me of being a partisan of George Bush, either one of them. Don’t worry about that. But everybody with the least curiosity has always known that Bush (regnal years 2001–2009) is a huge reader of books. Whether they do him any good — that’s another question. But what books has Williams ever read? Certainly none that would reveal to him the individuality of human life, its constant war with social stereotypes (e.g., “Republicans have no culture”). So naturally he aspired to become a stereotype — in just the best and brightest way. He cast himself as a thoughtful advocate for such stereotyped issues as, oh, “the environment,” “justice,” and the like. No one could possibly fear that he would ever harbor a critical thought about such things.

And that, I believe, is why “liberal” commentators have been so anxious to defend him, regretting that he quit, being confident that his offenses didn’t rise to the level of lying, bringing in psychiatry to remind us that people easily and innocently confuse their memories, and doing all the other stuff they’re paid big bucks to do. I guess they don’t want to lose their own license to lie.

NBC’s official approach is different. Network pooh-bahs are taking the line that presidential spokesman Josh Earnest recently took, in response to questions about Obama’s decision not to join the Charlie Hebdo protest against terrorism, or to send anyone important to sub for him. The basic policy is that responsible officials accept responsibility only for successes. Failures are no one’s responsibility. They happened. Well, sort of. But now we can move on.

As Julie Pace of the AP informed her readers, Earnest explained his boss’s absence from the Paris demonstration in this way:

Earnest said the White House took the blame but that Obama himself was not personally involved in the decision. Earnest would not say who was responsible for deciding the administration's participation in the event.

In other words, it is now possible to get on Air Force One and travel to Paris, or not to get on Air Force One and travel to Paris, and still have nothing whatever to do with the decision, personally. It wasn’t the decision of anyone who lives in the White House; the White House itself made the decision, or at least took the blame. Personal now means impersonal, and responsibility means freedom from responsibility.

Good. Very good.






Of Love and Violence


Two films opened during the Valentine’s weekend with hopes of becoming the box office blockbuster of choice, but neither is a traditional date-night romance. One feeds into typical male fantasies, while the other is based on a series of books that has had women swooning for three years. Which won at the box office opening weekend? And more importantly, which is the better film? We decided to switch things up and invite a man to review Fifty Shades of Grey while our entertainment editor, a woman, reviews Kingsman.

First up is the film that met with the most pre-release outrage. Reviews of Fifty have been published with titles such as “Fifty Shades of Smut,” “Fifty Shades of Shame,” and even “Fifty Shades of Dull.” In fact, Fifty Shades of Grey has met with so much uproar that Kingsman: The Secret Service slipped right under the radar of the morality police. The authors of these reviews have good reason to be concerned about the long-term effects of pornography, especially pornography that focuses on violence. But does Fifty Shades of Grey, edited to receive an R rating rather than NC-17, really fit the definition? We asked film historian Steven DeRosa for his review.

* * *

Fifty Shades of Grey

How does one review the cinematic qualities of a cultural phenomenon? A good rule of thumb is to forget the phenomenon and judge the film on its own merits. In that regard, Fifty Shades of Grey succeeds on a certain level, but suffers under the restraints — no pun intended — of Sam Taylor-Johnson's direction and Kelly Marcel's screenplay. As a movie, Fifty Shades is entertaining to a degree, titillating to an extent, but falls short of the mark in terms of its aspirations. No, Fifty Shades was not aiming to be serious art, but in the spirit of its Valentine's Day weekend opening, this should have been a fun, sexy romp.

At the outset, allow me to disclose that I have not read E.L. James's novel. I should also state that I teach cinema studies at a liberal arts college and include in my curriculum the Steven Shainberg film Secretary (2002). The reason I bring this up is that the character portrayed by James Spader in that film bears the name E. Edward Grey. I am often asked by students if there is a correlation between Spader's Grey and the Grey of Fifty Shades, to which there is no easy answer. Was E.L. James inspired by Secretary?

Decades ago, Hollywood churned out weepy melodramas known as "women's pictures." While scarcer, they are still made, and are now referred to as chick flicks. Fifty Shades fits into this category in that it expects its predominantly female audience to identify with the protagonist, Anastasia Steele, whose aim is not so much to attain the unattainable as to tame the untamable. On its most basic level, Fifty Shades succeeds in doing that, yet the film has significant failings, caused largely by several faults of dramatic structure and partly by a lack of chemistry between the two leading characters, as portrayed by Dakota Johnson and Jamie Dornan.

The film opens on clumsy, doe-eyed Anastasia Steele, an English major substituting for her friend, journalism major Kate, who was to interview 27-year-old billionaire Christian Grey for their school newspaper. Anastasia literally stumbles into Grey's office, and for whatever reason he feels compelled to take pity on her and help her conduct the interview. Grey is somehow so charmed by Anastasia's naiveté, awkwardness, and lip biting that he later stalks her and shows up at the small-town hardware store where she works. Here she helps him with his shopping list of serial killer supplies — two sizes of duct tape, a package of zip ties, and rope. Rather than being alarmed by this, Ana is intrigued.

The odd stalker-like behavior continues when Christian sends Ana a rare edition of Tess of the D'Urbervilles and shows up to "rescue" her one night when she drunk-dials him from a club. All of this is leading to Christian's deflowering of Ana, which comes far too soon. Some of the most romantic movies ever made succeeded simply by keeping the lovers at a distance until it was almost excruciating — think of James Stewart kissing and then losing and losing again Kim Novak in Alfred Hitchcock's Vertigo, or Daniel Day-Lewis unbuttoning Michelle Pfeiffer's glove to kiss her exposed wrist in Martin Scorsese's The Age of Innocence.

Even Secretary had the good sense to concentrate on small, intimate details of the characters. At the end of that film's first spanking scene, there is a closeup of the dominant's hand brushing against the submissive's, and she responds by interlocking her pinky with his. This attention to character detail is absent from Fifty Shades, in favor of scenes showing off Grey's toys, and not the ones in his "Red Room of Pain." The scenes involve a more conventional helicopter and glider, piloted by him. Grey beds Steele so early in Fifty Shades that, again, there is no tension — dramatic, sexual, or otherwise.

If Ana Steele's goal is to domesticate Christian Grey and turn him into boyfriend material — someone who will take her out to dinner and a movie, cuddle up with her on the couch, and spoon with her on a cold winter's night — he reveals to her too soon that all of this is a distinct possibility. "If you agree to be my submissive, I'll be devoted to you," says Grey. There simply is no tension built up to suggest otherwise. After all, he sleeps in the same bed with her that first night, in spite of protestations that he never does that. If Ana plays along, she'll be able to top from the bottom for the rest of her days with Grey.

Even after the relationship has already been consummated, this bizarre courtship continues with Grey presenting a contract to Ana so they can solidify terms such as safe words, sleeping arrangements, and which activities and toys she will allow Grey to subject her to or use on her. Oddly, the contract negotiation scene is both funny and sexy and one of the few memorable scenes in the movie. The sex and domination scenes do little to connect the audience with either character, so those scenes fall flat.

Perhaps the most serious flaw in Fifty Shades is that it barely scratches the surface of its Christian Grey. At one point in the story, Grey confesses to Ana details about "the woman who gave birth to him." It is a moment in the movie that is quickly glossed over, but is supposed to begin to explain something of the character's backstory. "I had a rough start in life. That's all you need to know," he says. And that's all we get to know. The vulnerability caused by this void is an element not fully explored, at least not in this installment, which is obviously a setup for two sequels to come.

Was Fifty Shades of Grey going to be the movie that put BDSM in the mainstream? No. Were sales of wrist restraints and riding crops going to skyrocket overnight? Probably not. Fifty Shades of Grey misses the opportunity to be a very talked about movie for the simple reason that it is so antiseptic and watered down that it could never live up to the imaginations of readers who devoured E. L. James's books. — Steven DeRosa

* * *

Kingsman: The Secret Service

Who needs Mr. Grey when you can have Mr. Darcy? Jane Austen’s Pride and Prejudice is one of the most romantic stories ever written, and Colin Firth, who played the dashing and noble Mr. Darcy in the 1995 made-for-TV miniseries, stars as Harry Hart in this homage to James Bond.

Hart is certainly dashing in his impeccable Savile Row suits, and he’s noble too — quite often he sets his umbrella gun to “stun” instead of “AK-47” mode when he’s engaged in battle.

Firth, who won an Oscar for his portrayal of King George VI (The King’s Speech, 2010) is usually cast in more dignified roles, but he is surprisingly perfect as Harry Hart: he is elegant and edgy, unintentionally funny, and sports a newly trimmed-down physique that makes his action sequences — 80% of which he did himself — believable. (Well, as believable as 200 corpses in a single fight can be.)

Hart is one of an elite group of British spies trained in spectacular martial arts whose purpose is to save the world from dastardly masterminds who would rather see it destroyed. In this story, their nemesis is Richmond Valentine (Samuel L. Jackson). Hart? Valentine? Now you understand why the film opened this particular weekend.

Kingsman contains all the ingredients of a James Bond film: the evil mastermind who has a physical deformity (Valentine speaks with a lisp); the sultry villainess who has a deadly physical specialty (Valentine’s sidekick, Gazelle [Sofia Boutella], has blades instead of feet and slices her opponents with the accuracy of a delicatessen chef); the spectacular opening scene that is actually the end to a previous episode; multiple exotic settings around the globe; cartoonish fights and chase scenes; and an evil plan that will destroy the world if the master villain isn’t stopped in time.

Writer-director Matthew Vaughn (Kick-Ass, X-Men: First Class, Snatch) adds a twist to the James Bond homage by focusing this plot on the recruitment of a new crop of Kingsmen — sort of X-Men: First Class Goes to Spy School. Hart sponsors a smart but troubled teenager named Eggsy (Taron Egerton) as his protégé, and Eggsy is soon part of a group of wise-ass teenagers competing against one another in deadly tasks for the honor of becoming a Kingsman.

Meanwhile, the official Kingsmen are engaged in trying to thwart Valentine’s evil plan to dominate the world, and soon the two groups (what’s left of them) join forces. I should probably give you a warning: V may be for Valentine, but it’s also for Violence. Vaughn is the director of Kick-Ass, after all. He goes for edgy. The violence is so over the top that it’s cartoonish rather than gruesome, but still — I was looking for my “safe word.” In addition to sliced limbs and spurting blood, you’ll find 50 shades of grey matter exploding in this film, as well as a fireworks display you aren’t likely to forget. And that church scene? It’s all done in a single take. Now that’s impressive.

So who wins the Valentine’s Day contest? Rotten Tomatoes gives Kingsman: The Secret Service a 71% critics’ rating, while Fifty Shades of Grey earned a mere 26%. Splat. But the box office tells a different story. Kingsman earned $35 million during opening weekend, while Fifty Shades brought in more than twice that much, $81 million — and Kingsman had an extra day, opening on Thursday instead of Friday. It will be interesting to see which film has more staying power in the theaters; I suspect that everyone who was panting to see Mr. Grey has already had enough. — Jo Ann Skousen


Editor's Note: Review of "Kingsman: The Secret Service," directed by Matthew Vaughn. Twentieth Century Fox, 2015, 129 minutes; and "Fifty Shades of Grey," directed by Sam Taylor-Johnson. Focus Features, 2015, 125 minutes (14 minutes and 17 seconds of which are sex scenes).





Stevie, Dictator of Togo


I was a student at the Université du Bénin in Togo in 1983. With typical and, I think, admirable American disrespect for authority, my fellow exchange students and I enjoyed calling the president of Togo “Stevie,” because he had changed his name from Etienne (French for “Steven”) to Gnassingbé, to sound more African. Our Togolese friends did not find it funny. It wasn’t that they were offended. They were afraid when they heard us talking like that and told us of ditches where the tortured corpses of the president’s critics appeared overnight.

According to my sources, the legends about Eyadéma Gnassingbé were officially encouraged. One, the story of the plane crash, was the subject of an entire comic book that I read when I was in Togo. In the comic, the president of Togo figured as a superhero with metaphysical powers. It was meant to be taken literally.

It’s true that Eyadéma survived a plane crash in 1974. It’s also true that he credited his survival to his own mystical powers. In the comic book, the plane was sabotaged, and his survival was definitely the miraculous result of his personal magic. In a national monument built to commemorate the incident, Eyadéma’s statue towers over images of the heroic officials who apparently didn’t have enough magic of their own and died in the crash.
It’s also true that Eyadéma was a leader of the coup that unseated Sylvanus Olympio, the first president of Togo. At the time of the coup, Eyadéma was called Etienne Eyadéma, and the legend is that he personally machine-gunned Olympio at the gates of the American embassy in Lomé, where the then-president was seeking asylum. By the way, that coup followed a common pattern in sub-Saharan, post-colonial Africa: colonial powers establish trading relations with a coastal tribe (in Togo’s case, the Ewe). Colonial powers assert administrative control over a large inland area, making the coastal elite a minority within the colonial borders. At the time of independence, the coastal elite takes over. (Sylvanus Olympio was Ewe.) The army is dominated, numerically, by inland tribes. (In Togo’s case, they included the Kabye.) The soldiers get fed up and stage a coup. (Eyadéma was Kabye.)

One day, I was walking through the market with a Togolese friend when he told me another story about Stevie. I had pointed out to him a very pretty girl selling chocolate bars. The girl was about 13. She balanced an enameled tin platter on her head. The platter bore a perfect pyramid of scores of identical chocolate bars in white and red paper wrappers. And the grace note was the girl’s matching white and red dress. She had made herself into a lovely advertisement for dark chocolate. Clever and pretty. But it only reminded my friend of the legends about Eyadéma’s sexual powers. He said that a vast black Mercedes limousine trolled the market streets of Lomé scooping up pretty teenaged girls for the president’s use, and that they usually ended up dead, not because of any abuse beyond presidential rape, but as a mere side effect of the great girth of his manhood.

Stevie died in office. At the time of his death in 2005, he was the longest-serving head of state in all of Africa. His son, Faure Gnassingbé, took over and is still president.


© Copyright 2017 Liberty Foundation. All rights reserved.
