The Greek Mystique


I’m not an economist. I may have gotten my figures wrong. I may have gotten my economic history wrong. But it seems to me that Greece, population 11 million, has defaulted on about $100 billion worth of emergency loans that were made to cover its inability to pay off even larger loans. It also seems to me that the money that was loaned went to sustain a pension system that enabled people — almost half of them government employees — to retire at an absurdly early age, and at a still more absurd age if they worked in any of the hundreds of occupations classified as “hazardous,” such as beautician and radio announcer. And it now appears that while taking emergency loans to enable it to get through a “tough” period of “austerity” mandated by its fiendish creditors, Greece actually added 70,000 workers to the government payroll.

In response to the awful suffering imposed on them from beyond, Greeks went to the polls on Sunday and passed a referendum encouraging their government to demand yet more money from their creditors, with the stipulation that Greeks themselves would do nothing “further” to economize. The referendum won by a landslide. The human pebbles who slid down the electoral hill apparently believed that the people who loaned them money were exploiting them by expecting them to honor some part of their agreements.

The Greek government will now demand that a large portion of its debt be “written down”; in other words, that Greece be licensed simply to keep the money it was loaned and now refuses to pay back. In support of this idealistic notion, many of the pebbles took to the streets, indignantly proclaiming that “Greeks are not beggars!” They are right; there are other words for what they are — or, more properly, for how they’re acting. It’s a fine illustration of the way in which normal, decent people turn into ne’er-do-wells and conmen at the polls. The first victims of the conmen are themselves. They convince themselves that they are acting decently — indeed, that they are impelled by a righteous cause.

While taking emergency loans to enable it to get through a “tough” period of “austerity” mandated by its fiendish creditors, Greece actually added 70,000 workers to the government payroll.

We’ll see whether Greece will continue to find European financial agencies that are silly enough to provide more money, on the Greeks’ own terms. Maybe it will. In Europe, there are two suckers born every minute.

Others besides me have commented on these matters, and I’ve read a lot of their comments. But so far I haven’t encountered a certain kind of comment. It seems to me an obvious one to make, but it isn’t being made. So I’ll make it.

When we talk about “European” loans to “Greece,” we must remember that we are talking about money that governments and government-sponsored banks have arranged to cover the debts of Greek official institutions. No private individual would make loans like this, unless he was figuring on some government covering his ass. In Greece itself, no private individual would do that. It’s like the California “bullet train”: it’s supposed to be a wonderful investment, but somehow, not a penny of private money has ever been invested in it.

If there is a better argument against centralized economic decisions, I can’t think of one. Here we have enormously ridiculous, enormously expensive losses, engendered by a class of government-sponsored experts who thought they knew better than every other individual on the planet. And by the way, these experts were working with other people’s money, with money that is taken, not requested. That kind of money is always easy to spend. And here is the financial system that is supposed to give the world security.

No private individual would make loans like this, unless he was figuring on some government covering his ass.

The Greeks aren’t the only people who think that “investment” means extracting money from productive individuals and giving it to the government to spend on projects that can’t possibly turn a profit. That’s the modern system of political economy. As for the ability of the United States, or the now-sainted China, to stimulate its economy by increasing its debts, the comment of Ray Gaines in Monday’s Wall Street Journal says it all: the system is not working. Meanwhile, the culture of entitlement that is inseparably linked to borrowing without repaying spreads inexorably from the seminar room to the legislative chamber to the chamber of commerce and the welfare mob. Too confused to argue, it asserts its positions; too proud to beg, it demands.






The Ron Paul Un-Revolution


A mere ten years back, if I told Americans and Canadians that I held libertarian views, many responded — recognizing that I was not a native English speaker — that “libertarian” was not a word. They thought I wanted to say “liberal.”

Today, “Don’t tread on me” flags, Ron Paul posters, and other advertisements for libertarian ideas grace houses and yards, even in remote places of the USA. Libertarianism is no longer an obscure concept. And a huge credit for making libertarianism mainstream goes to Ron Paul.

I am a big fan of Ron. He is, in my view, one of the finest human beings alive, despite the fact that I could never understand how, as a congressman, he could interact on a daily basis with sociopathic politicians and their sepoys. How could he not feel repulsion and frustration, operating in such an environment?

Politics by its very nature establishes a mindset of expediency and political activism, which are always in direct conflict with deeper understanding of principles.

Ron fought for a paradigm shift in the way the US government works. He voted against new laws. He wanted the US military for defense only, wanted removal of American forces from hundreds of bases around the world, and saw no reason why the US should be involved in Iraq, Afghanistan, Libya, etc. Quite rightly he saw no reason for the US to be still in Japan, Korea, and Europe, even if the bases there were maintained by invitation. He asked why the US should be supporting the dictatorial regime in Saudi Arabia. He wanted a significant reduction in welfare payments. He wanted to audit and end the Federal Reserve. He wanted an end to the War on Drugs. He wanted the US to be out of the UN and NATO. He fought vehemently against NSA surveillance, and for the right to bear arms. He wanted government to be out of the medical business.

In short, he wanted the government to govern — to provide law and order, and defense — and to get out of virtually everything else. He wanted the US to follow its Constitution.

What Ron said was well-reasoned and extremely well-conveyed in his speeches, with passion and a breath of fresh air for those who had grown tired of the political process. Most libertarian organizations promoted him, and Ron got a massive reception at many university campuses around the US. He set records of sorts for money raised in his campaign for the US presidency in 2012. Earlier seen, by some, as convocations of old white men, libertarian meetings started getting more people of other races, more young people, and an increased number of women. I cannot remember how many times I have been told by people that they saw the reason and value of liberty after listening to Ron.

Many libertarians saw this as the start of a snowballing of the libertarian movement. After a few beers, the dreamy ones, those with a passion for spreading their message, could imagine an exponential increase in libertarian views. In their opinion, it was only a matter of time before the whole world would accept liberty. “Truth and reason win in the end,” they would say.

Alas, this was not the sign that the movement was gaining speed, but a sign of its sickness. Ron, having chosen a wrong means to spread his message — politics — had implanted a virus among his audience. Ron’s charisma glorified the political process. Unfortunately, politics by its very nature establishes a mindset of expediency and political activism, which are always in direct conflict with deeper understanding of principles.

The golden ring of politics corrupts everyone, slowly and subtly, without their recognizing it, corrupting their souls, ossifying their principles into facades that fall apart at the slightest pressure.

The virus of politicized libertarianism eventually mutated. In libertarian circles, it became very important to increase the number of one’s adherents. Many libertarian organizations got very well-funded. Students were flying around the world, attending conferences, one after another. Free-market organizations were being set up everywhere, all well-financed.

Many of the politicized libertarians ran to the lap of the government, determined to join the fight against the real or imagined enemy. In one strike they had forgotten that war is the health of the state.

Given the financial encouragement, all sorts of people, even if they were not principally libertarian, joined. My guess is that some who in the course of time would have become principled libertarians accepted and repeated libertarian mantras, as beliefs taken on faith, without fully understanding the reasoning behind them. This had to lead to ossification of the mental process.

There was an emphasis on getting more women into the movement. Some, who were market savvy, realized that it was going to be far easier to get attention in a women-deficit environment. It was ignored that the sexual objectification of women was demeaning to them and a huge step back for the libertarian philosophy. There was also an emphasis on ideological inclusiveness. Boundaries should be made a bit fuzzy, to allow a bit of compromise, to make libertarianism more inviting, less radical. One well-known anarchist, in an attempt to be inclusive, started calling the core values of libertarianism “brutalism.” Soon there were left-libertarians, thick-libertarians, thin-libertarians, bleeding-heart-libertarians, etc.

Last year, I went to a speech by a bleeding-heart-libertarian in Delhi and could not hold myself back from asking in what way the things he advocated were any different from radical socialism.

When two small terrorist incidents happened in Ottawa, many of the politicized libertarians ran to the lap of the government, determined to join the fight against the real or imagined enemy. In one strike they had forgotten that war is the health of the state. They suddenly had no problem imposing restrictions on certain people who lived and dressed differently. Uninterested in collateral damage, they had no problems blowing the Middle East out of existence. They had forgotten that the state is a much worse enemy. Islam and all its flaws would have been better controlled in a stateless environment. They lost their sense of balance — better the enemy they knew than the one they didn’t — for they were not moored in principles.

Libertarians of East European heritage — unconsciously driven by indoctrinated hatred for Russia, not by philosophy — wanted the US to embargo Russia. Coming full circle, this mutant movement even opposed Ron Paul, for he opposes US involvement in foreign lands. Meanwhile, drug-peddlers and prostitutes were seen as embodying libertarianism. Many young people were encouraged to look for issues with the police. Going over the speed limit, driving under the influence, or jumping red lights were not only condoned but seen as expressions of liberty.

Libertarianism does not try to prevent people from selling their bodies or consuming drugs, but it is a logical fallacy to assume that this means that libertarianism encourages these activities. Even in an anarchist world, to stay civilized, there would still be rules against driving under the influence or jumping red lights.

Politics is a virus that implants in the brain the top-down approach to social change. A real change can only happen from the bottom up.

The meaning of libertarianism was being detached from its principles. Once you lose your moorings, you lose direction. It is an error — though one that a superficial understanding of the philosophy invites — to think that libertarianism means no rules or system.

Politics is a virus that implants in the brain the top-down approach to social change. A real change can only happen from the bottom up. The thinking of the politically minded is not based on principles but on political organization. It is doomed to fail. Did Ron not see this?

Principles are principles and hence unchangeable. Any philosophy must be radically based on principles, if it is not to lose its moorings. Do I foresee a world where there will be no dishonesty or violence? No. But that does not mean I should become more inclusive, to bring in more people by starting to practise partial honesty or partial violence. Just because the state might never cease to exist does not mean that I accept its legitimacy to make my values more inclusive.

Radicalism gives meaning and passion to carry on when the seas are frothy and uncertain. There is something, indeed a lot, behind the Christian concept of the remnant. The remnant stay on their course even in a turbulent world.

Without radicalism, without a solid grasp of principles, the superstructure has nothing to hold itself in place and must fall apart eventually.

But hasn’t the libertarian movement grown by leaps and bounds? Alas, this is a myth of those who hold irrational, romantic opinions, living secluded lives among others with similar ideas. In reality it is statism that is in the ascendant, not only in the West but even more in the non-Western countries.

Despite the fact that Ron made a huge contribution to making “libertarianism” known to the mainstream, by being in politics — which might on the surface look like a small issue — he made a major compromise with his principles. He politicized libertarianism. This seemingly simple compromise will end up as his legacy and possibly as a permanent confusion of the concept of libertarianism, not unlike the way in which the meaning of “liberal” mutated in North America.

You cannot make someone a libertarian. It cannot be a result of groupthink or politics. The change can only happen through self-reflection, meditation, contemplation, reason, and a passion for the truth. A libertarian society can emerge only as the end result of character-building, mostly through working on the self, from the bottom up.






Escape from Dannemora


I’m going to say something that many libertarians don’t want to hear: prisons need discipline, and plenty of it.

I’m reflecting on the big news item of the past three weeks, the escape of two convicts from the maximum-security prison in Dannemora, New York, an institution that used to be feared as “Siberia.” They escaped because they were allowed to live in an “honor” block, work with and have sex with civilians, cook their own meals, wear civilian clothes, and enjoy a level of control and discipline that permitted them to acquire power tools and use them to cut holes in their cells and escape. Power tools. Used by men sent to prison for vicious murders, including, in one instance, the dismemberment of the murder victim. Tools freely used, and undetected.

What’s that noise? Is that a guy cutting his way out of prison, or is that just a guy cutting up some other prisoner? Whatever. Have a nice night.

It is one thing to debate about whom to send to prison. It is another thing to screw around with the lives of the people we decide to send there. Because, make no mistake about it, the first victims of convicts who are not controlled are other convicts. If you want the rapes, murders, tortures, and gang aggressions that happen routinely in American prisons to continue to happen, all you need to do is let the bad guys act in whatever way they want. If that’s your “libertarian” philosophy, it will have a big impact, because just one of those bad guys in a prison unit can be enough to ensure the victimization of everybody else.

When there’s a good reason to send somebody to prison — and sometimes there is — you don’t have to give him a life sentence, but you do have to keep him safely inside.

The late Nathan Kantrowitz made this point very powerfully in his exacting study of prison life, Close Control. I followed Nathan’s lead in my own book, The Big House: Image and Reality of the American Prison. I added that, in my judgment, the sorry state of American penology is the result of a vicious convergence of modern liberal and modern conservative ideas. The conservatives want to lock people up, and do it on the cheap. And it’s true, you can get a lot of non-discipline and non-control, very cheaply indeed. The liberals believe that convicts are somehow rehabilitated by being allowed to wear their own clothes, cook their own meals, and wander about the joint, victimizing anyone who’s weaker than they are. (Unnoticed by the conservatives, the liberals also ordained that those who run the prison system would get paid enough to give them 15 degrees at Harvard. They’re unionized, after all, and they’re the biggest sinkhole in many state budgets.)

Libertarians ought to be smarter. When there’s a good reason to send somebody to prison — and sometimes there is — you don’t have to give him a life sentence, but you do have to keep him safely inside, and safe from victimizing or being victimized by other prisoners.

In the short run, whoever is running the New York prison system needs to be fired, immediately. (I’m afraid that the fix is already in, and this will never happen.) In the middle run, real investigations of penology — not ideological declarations about penology, from any vantage point — need to be conducted, so that people can learn what the few good scholars, such as Nathan Kantrowitz, have already established. In the long run, Americans should stop making savage jokes about rape in prison and start considering the steps that are necessary to keep rape and murder, and by the same token, escape, from happening in their supposedly secure institutions.






The Lady and the Tigers


Tippi Hedren is the actress whose intelligence illuminated Alfred Hitchcock’s The Birds. Now, 52 years later, Hedren is still an illuminating person — as shown by her powerful performance at a hearing managed by the board of asses who are in charge of California’s “bullet train.”

The train — which does not exist and may never exist — is the biggest (putative) construction project in American history, and perhaps the biggest boondoggle, a reductio ad absurdum of “planning for the environment,” “planning for energy conservation,” and all the rest of it. Its cost estimates are 600% higher than what the voters thought they were mandating, and this is one reason the majority of voters now wish they hadn’t listened to propaganda for the project. They agreed to build a railroad that would deliver passengers from Los Angeles to San Francisco in a time substantially less than three hours. It’s now clear that there’s no possibility this can happen, or anything close to it, no matter where the rail line is put. But no one knows where it will be put. The managers of the project, the California High Speed Rail Authority, insensately determined to carry on despite the many kinds of fools they are making of themselves, are still deciding which communities they’re going to unleash their bulldozers on.

Another assumption is that it’s efficient to destroy a series of towns in the pursuit of what is in fact slow-speed rail.

They have to hold public hearings about this. Unfortunately, they don’t have to listen to what they hear at them, and they don’t. The latest hearing involved outraged residents of several Southern California towns that may be devastated by the train. One of them is Acton, where Hedren operates an animal-rescue preserve that caters to big cats. So Hedren showed up at the hearing.

Dan Richard, chairman of the Authority, used the occasion to pontificate: “What we’re building here, by the way, in high-speed rail, is the most efficient way to deal with our transportation needs of the future.” “By the way?” The rhetoric is almost as condescending as the statement itself, which assumes that its audience is stupid enough to believe it’s efficient to spend at least $100 billion to propel a few hundred people a day to a destination they could have reached more quickly and cheaply by air. Another assumption is that it’s efficient to destroy a series of towns in the pursuit of what is in fact slow-speed rail.

Hedren, 85, identified the problem with people who make statements like that: “You don’t listen, you don’t care. . . . You are going to take this beautiful little town of Acton . . . and you are going to destroy it with this train.” Then she mentioned the lions and tigers she cares for (but has no illusions about). “I am more afraid of you,” she told the planners.






Fakers and Enablers


Last month, a UCLA graduate student in political science named Michael LaCour was caught faking reports of his research — research that in December 2014 had been published, with much fanfare, in Science, one of the two most prestigious venues for “hard” (experimental and quantifiable) scientific work. Because of his ostensible research, he had been offered, again with much fanfare, a teaching position at prestigious Princeton University. I don’t want to overuse the word “prestigious,” but LaCour’s senior collaborator, a professor at prestigious Columbia University, a person whom he had enlisted to enhance the prestige of his purported findings, is considered one of the most prestigious number-crunchers in all of poli sci. LaCour’s dissertation advisor at UCLA is also believed by some people to be prestigious. LaCour’s work was critiqued by presumably prestigious (though anonymous) peer reviewers for Science, and recommended for publication by them. What went wrong with all this prestigiousness?

Initial comments about the LaCour scandal often emphasized the idea that there’s nothing really wrong with the peer review system. The New Republic was especially touchy on this point. The rush to defend peer review is somewhat difficult to explain, except as the product of fears that many other scientific articles (about, for instance, global warming?) might be suspected of being more pseudo than science; despite reviewers’ heavy stamps of approval, they may not be “settled science.” The idea in these defenses was that we must see l’affaire LaCour as a “singular” episode, not as the tin can that’s poking through the grass because there’s a ton of garbage underneath it. More recently, suspicions that Mt. Trashmore may be as high as Mt. Rushmore have appeared even in the New York Times, which on scientific matters is usually more establishment than the establishment.

I am an academic who shares those suspicions. LaCour’s offense was remarkably flagrant and stupid, so stupid that it was discovered at the first serious attempt to replicate his results. But the conditions that put LaCour on the road to great, though temporary, success must operate, with similar effect, in many other situations. If the results are not so flagrantly wrong, they may not be detected for a long time, if ever. They will remain in place in the (pseudo-) scientific literature — permanent impediments to human knowledge. This is a problem.

But what conditions create the problem? Here are five.

1. A politically correct, or at least fashionably sympathetic, topic of research. The LaCour episode is a perfect example. He was purportedly investigating gay activists’ ability to garner support for gay marriage. And his conclusion was one that politically correct people, especially donors to activist organizations, would like to see: he “found” that person-to-person activism works amazingly well. It is noteworthy that Science published his article about how to garner support for gay marriage without objecting to the politically loaded title: “When contact changes minds: An experiment on transmission of support for gay equality.” You may think that recognition of gay marriage is equivalent to recognition of gay equality, and I may agree, but anyone with even a whiff of the scientific mentality should notice that “equality” is a term with many definitions, and that the equation of “equality” with “gay marriage” is an end-run around any kind of debate, scientific or otherwise. Who stands up and says, “I do not support equality”?

The idea in these defenses was that we must see l’affaire LaCour as a “singular” episode, not as the tin can that’s poking through the grass because there’s a ton of garbage underneath it.

2. The habit of reasoning from academic authority. LaCour’s chosen collaborator, Donald Green, is highly respected in his field. That may be what made Science and its peer reviewers pay especially serious attention to LaCour’s research, despite its many curious features, some of which were obvious. A leading academic researcher had the following reaction when an interviewer asked him about the LaCour-Green contribution to the world’s wisdom:

“Gee,” he replied, “that's very surprising and doesn't fit with a huge literature of evidence. It doesn't sound plausible to me.” A few clicks later, [he] had pulled up the paper on his computer. “Ah,” he [said], “I see Don Green is an author. I trust him completely, so I'm no longer doubtful.”

3. The prevalence of the kind of academic courtesy that is indistinguishable from laziness or lack of curiosity. LaCour’s results were counterintuitive; his data were highly exceptional; his funding (which turned out to be bogus) was vastly greater than anything one would expect a graduate student to garner. That alone should have inspired many curious questions. But, Green says, he didn’t want to be rude to LaCour; he didn’t want to ask probing questions. Jesse Singal, a good reporter on the LaCour scandal, has this to say:

Some people I spoke to about this case argued that Green, whose name is, after all, on the paper, had failed in his supervisory role. I emailed him to ask whether he thought this was a fair assessment. “Entirely fair,” he responded. “I am deeply embarrassed that I did not suspect and discover the fabrication of the survey data and grateful to the team of researchers who brought it to my attention.” He declined to comment further for this story.

Green later announced that he wouldn’t say anything more to anyone, pending the results of a UCLA investigation. Lynn Vavreck, LaCour’s dissertation advisor at UCLA, had already made a similar statement. They are being very circumspect.

4. The existence of an academic elite that hasn’t got time for its real job. LaCour asked Green, a virtually total stranger, to sign onto his project: why? Because Green was prestigious. And why is Green prestigious? Partly for signing onto a lot of collaborative projects. In his relationship with LaCour, there appears to have been little time for Green to do what professors have traditionally done with students: sit down with them, discuss their work, exclaim over the difficulty of getting the data, laugh about the silly things that happen when you’re working with colleagues, share invidious stories about university administrators and academic competitors, and finally ask, “So, how in the world did you get those results? Let’s look at your raw data.” Or just, “How did you find the time to do all of this?”

LaCour’s results were counterintuitive; his data were highly exceptional; his funding was vastly greater than anything one would expect a graduate student to garner.

It has been observed — by Nicholas Steneck of the University of Michigan — that Green put his name on a paper reporting costly research (research that was supposed to have cost over $1 million), without ever asking the obvious questions about where the money came from, and how a grad student got it.

“You have to know the funding sources,” Steneck said. “How else can you report conflicts of interest?” A good point. Besides — as a scientist, aren’t you curious? Scientists’ lack of curiosity about the simplest realities of the world they are supposedly examining has often been noted. It is a major reason why the scientists of the past generation — every past generation — are usually forgotten, soon after their deaths. It’s sad to say, but may I predict that the same fate will befall the incurious Professor Green?

As a substitute for curiosity, guild courtesy may be invoked. According to the New York Times, Green said that he “could have asked about” LaCour’s claim to have “hundreds of thousands in grant money.” “But,” he continued, “it’s a delicate matter to ask another scholar the exact method through which they’re paying for their work.”

There are several eyebrow-raisers there. One is the barbarous transition from “scholar” (singular) to “they” (plural). Another is the strange notion that it is somehow impolite to ask one’s colleagues — or collaborators! — where the money’s coming from. This is called, in the technical language of the professoriate, cowshit.

The fact that ordinary-professional, or even ordinary-people, conversations seem never to have taken place between Green and LaCour indicates clearly enough that nobody made time to have them. As for Professor Vavreck, LaCour’s dissertation director and his collaborator on two other papers, her vita shows a person who is very busy, very busy indeed, a very busy bee — giving invited lectures, writing newspaper columns, moderating something bearing the unlikely name of the “Luskin Lecture on Thought Leadership with Hillary Rodham Clinton,” and, of course, doing peer reviews. Did she have time to look closely at her own grad student’s work? The best answer, from her point of view, would be No; because if she did have the time, and still ignored the anomalies in the work, a still less favorable view would have to be entertained.

This is called, in the technical language of the professoriate, cowshit.

Oddly, The New Republic praised the “social cohesiveness” represented by the Green-LaCour relationship, although it mentioned that “in this particular case . . . trust was misplaced but some level of collegial confidence is the necessary lubricant to allow research to take place.” Of course, that’s a false alternative — full social cohesiveness vs. no confidence at all. “It’s important to realize,” opines TNR’s Jeet Heer, “that the implicit trust Green placed in LaCour was perfectly normal and rational.” Rational, no. Normal, yes — alas.

Now, I don’t know these people. Some of what I say is conjecture. You can make your own conjectures, on the same evidence, and see whether they are similar to mine.

5. A peer review system that is goofy, to say the least.

It is goofiest in the arts and humanities and the “soft” (non-mathematical) social sciences. It’s in this, the goofiest, part of the peer-reviewed world that I myself participate, as reviewer and reviewee. Here is a world in which people honestly believe that their own ideological priorities count as evidence, often as the determining evidence. Being highly verbal, they are able to convince themselves and others that saying “The author has not come to grips with postcolonialist theory” is on the same analytical level as saying, “The author has not investigated the much larger data-set presented by Smith (1997).”

My own history of being reviewed — by and large, a very successful history — has given me many more examples of the first kind of “peer reviewing” than of the second kind. Whether favorable or unfavorable, reviewers have more often responded to my work on the level of “This study vindicates historically important views of the text” or “This study remains strangely unconvinced by historically important views of the episode,” than on the level of, “The documented facts do not support [or, fully support] the author’s interpretation of the sequence of events.” In fact, I have never received a response that questioned my facts. The closest I’ve gotten is (A) notes on the absence of any reference to the peer reviewer’s work; (B) notes on the need for more emphasis on the peer reviewer’s favorite areas of study.

This does not mean that my work has been free from factual errors or deficiencies in the consultation of documentary sources; those are unavoidable, and it would be good for someone to point them out as soon as possible. But reviewers are seldom interested in that possibility. Which is disturbing.

I freely admit that some of the critiques I have received have done me good; they have informed me of other people’s points of view; they have shown me where I needed to make my arguments more persuasive; they have improved my work. But reviewers’ interest in emphases and ideological orientations rather than facts and the sources of facts gives me a very funny feeling. And you can see by the printed products of the review system that nobody pays much attention to the way in which academic contributions are written, even in the humanities. I have been informed that my writing is “clear” or even “sometimes witty,” but I have never been called to account for the passages in which I am not clear, and not witty. No one seems to care.

But here’s the worst thing. When I act as a reviewer, I catch myself falling into some of the same habits. True, I write comments about the candidates’ style, and when I see a factual error or notice the absence of facts, I mention it. But it’s easy to lapse into guild language. It’s easy to find words showing that I share the standard (or momentary) intellectual “concerns” and emphases of my profession, words testifying that the author under review shares them also. I’m not being dishonest when I write in this way. I really do share the “concerns” I mention. But that’s a problem. That’s why peer reviewing is often just a matter of reporting that “Jones’ work will be regarded as an important study by all who wish to find more evidence that what we all thought was important actually is important.”

You can see by the printed products of the review system that nobody pays much attention to the way in which academic contributions are written, even in the humanities.

Indeed, peer reviewing is one of the most conservative things one can do. If there’s no demand that facts and choices be checked and assessed, if there’s a “delicacy” about identifying intellectual sleight of hand or words-in-place-of-ideas, if consistency with current opinion is accepted as a value in itself, if what you get is really just a check on whether something is basically OK according to current notions of OKness, then how much more conservative can the process be?

On May 29, when LaCour tried to answer the complaints against him, he severely criticized the grad students who had discovered, not only that they couldn’t replicate his results, but that the survey company he had purportedly used had never heard of him. He denounced them for having gone off on their own, doing their own investigation, without submitting their work to peer review, as he had done! Their “decision to . . . by-pass the peer-review process” was “unethical.” What mattered wasn’t the new evidence they had found but the fact that they hadn’t validated it by the same means with which his own “evidence” had been validated.

In medicine and in some of the natural sciences, unsupported guild authority does not impinge so greatly on the assessment of evidence as it does in the humanities and the social sciences. Even there, however, you need to be careful. If you are suspected of being a “climate change denier” or a weirdo about some medical treatment, the maintainers of the status quo will give you the bum’s rush. That will be the end of you. And there’s another thing. It’s true: when you submit your research about the liver, people will spend much more time scrutinizing your stats than pontificating about how important the liver is or how important it is to all Americans, black or white, gay or straight, that we all have livers and enjoy liver equality. But the professional competence of these peer reviewers will then be used, by The New Republic and other conservative supporters of the status quo in our credentialed, regulated, highly professional society, as evidence that there is very little, very very very little, actual flim-flam in academic publication. But that’s not true.

Their “decision to . . . by-pass the peer-review process” was “unethical.”






Confused on the Concept


Sen. Dianne Feinstein (D-CA) is one of those “liberals” who cannot resist the temptation to invent new rights (for government) and destroy old ones (for people). In response to the latest round of terror conspiracy charges, she has issued a public statement, which reads as follows:

I am particularly struck that the alleged bombers made use of online bombmaking guides like the Anarchist Cookbook and Inspire Magazine. These documents are not, in my view, protected by the First Amendment and should be removed from the Internet.

I am particularly struck by the senator’s inability to distinguish reading about something from doing it. Perhaps she believes that no one should know the chemical composition of dynamite, because such knowledge might be used to destroy a public building. Perhaps she believes that Hitchcock’s movies should be banned, because they show how to kill people with knives, scissors, and birds. Perhaps she is accustomed to rushing on stage to keep Macbeth from killing the king.

Or perhaps she is merely a typical American politician, busy about her work of ruining concepts she is incapable of understanding.






Pulp


When I was in grade school, a neighbor had an unfinished basement room, all studs and drywall, filled with paperback science fiction books and magazines. I was given free rein to browse and borrow. It was a treasure trove.

Among the things I read was the 1951 short story, The Marching Morons, by Cyril M. Kornbluth. It takes place in a distant future where, because of adverse genetic selection, the average IQ has fallen to 45.

A detail of the story that has stayed with me was the marketing of cars in that imaginary distant future. The cars weren’t very fast or powerful, so they were fitted out with electronic sound effects that made them sound like rolling thunder.

Here's the short story.

Reading the Washington Post the other day, I stumbled upon this:

For the 2015 Mustang EcoBoost, Ford sound engineers and developers worked on an “Active Noise Control” system that amplifies the engine’s purr through the car speakers . . .

Ford said in a statement that the vintage V-8 engine boom “has long been considered the mating call of Mustang,” but added that the newly processed pony-car sound is “athletic and youthful,” “a more refined growl” with “a low-frequency sense of powerfulness.”

Here's the link to the piece.

Welcome to the future.






The Battle of the Resumes


Maureen Dowd’s new column about Hillary Clinton convinces me that I am not the only one who smells something peculiarly sick and rotten in presidential politics.

On one side, we have Hillary Clinton, who presents a resume for high office with these major bullet points:

  1. Partnership in a radically dysfunctional marriage with a discredited former president, specializing in cheating and sleazing.
  2. Female gender.
  3. A long string of jobs — partner in a provincial law firm, power behind a throne, United States senator, secretary of state — which she survived, innocent of credit for any specific accomplishment.
  4. Proven ability to cadge money from Near Eastern religious fanatics, one-dimensional feminists, crony capitalists, and other people with hands out for favors.
  5. Proven ability, acquired from her husband (see 1, above), to operate (with the help of 4, above) a political mafia.
  6. Proven ability to tell nothing but lies.
  7. Proven ability to deliver any desired quantity of self-righteous statements about other people’s duties.

On the opposite side, we have John Ellis (“Jeb”) Bush, whose resume emphasizes these points:

  1. Membership in a family that includes two abjectly unsuccessful presidents.
  2. Modest success as governor of Florida.
  3. Proven ability to cadge money from “moderate” (i.e., non) Republicans and crony capitalists devoted to cheap labor, open immigration, and votes for Dems.
  4. Proven ability to lose votes from anyone to the right of Anderson Cooper.
  5. Proven ability to look stupid on any public occasion.
  6. Proven ability to deliver any desired quantity of self-righteous statements about other people’s duties.

It’s remarkable that everyone who has any knowledge of politics has read these resumes, understands them, and talks about them as if they were plastic disks in a checkers game.

Well, almost everyone. Dowd, for all her leftist craziness, is a respectable person.

But let’s see . . . Who has the longer resume? Jeb or Hillary?

It’s Hillary! She wins!

Can it be that in today’s America, or any other country, this is how things happen?






Obama's ISIS Strategy: Death by Flatulence


The more the Obama administration talks about the war on terrorism, the less we know. What are we fighting? Is it violent extremism or radical Islam? OK, it's actually radical Islam (we only need to kill jihadists, not all Muslims); the term "violent extremism" is less offensive to violent Islamists and no one cares about its repugnance to non-Muslim violent extremists — a subset in the Venn diagram of terrorism that is imperceptible to all but a handful of White House officials.

But is it Sunni radical Muslims or Shiite radical Muslims that are the problem? Or both? (And who are we to make such judgments — after the Crusades and all?) Do we need to worry about Iran, with its expanding regional hegemony, soon to be bolstered by nuclear weapons? Or Iraq, which, having been abandoned by the US in 2010, has descended into barbaric chaos with the Sunni Islamic State of Iraq and al-Sham (ISIS) running amok throughout its north, and equally vicious Iranian militia groups running amok everywhere else? Or both?

And what about the original Syrian rebels, valiantly fighting Bashar Assad? When, in 2011, the civilian death toll from Assad's brutal regime had reached 2,000, a horrified Mr. Obama declared that Assad must step aside. Yet, after drawing his famous red line, it was Obama who stepped aside, allowing both ISIS and Iranian thugs to trespass into Syria. What are we to make of Obama's silence today, when the Syrian death toll exceeds 200,000? And, as Hezbollah fighters and Iran’s Revolutionary Guard Corps (IRGC) creep into the Golan Heights and Hamas wages war in Gaza, why has Mr. Obama become displeased with the Israeli prime minister, Benjamin Netanyahu? Is it time to abandon Israel?

When it comes to facing ISIS on the ground, those with the most to lose have the greatest aversion to doing so.

Some experts believe that if we (Western infidels) knew what radical Muslims wanted, then a reasonably peaceful coexistence agreement could be reached. But, as President Obama is discovering in his negotiations with Iran, even when we know what radical Muslims want, compromise is a charade, with reason playing, at best, a bit part to concession.

Despite his Herculean appeasement efforts, Obama has been unable to convince Iran to abandon its nuclear weapons program. His support for President Nouri al-Maliki (a puppet of Iran) and his (Maliki's) violent purge of Sunni participation in Iraqi government affairs; his hasty withdrawal of American military forces — just when the Bush-Petraeus surge had stabilized the country and Vice President Biden was gleefully declaring that Iraq was "going to be one of the great achievements of this administration"; his refusal to help the Kurds fight ISIS militants; his blind eye to the spread of Shiite terrorism in Syria, Iraq, Lebanon, Yemen, and Gaza — all has been for naught.

In 2012, Obama issued a crystal clear promise to "do whatever it takes to prevent Iran from producing an atomic bomb." That promise became nebulous with a November 2013 agreement to forge, within six months, a treaty to freeze or reverse progress at all of Iran’s major nuclear facilities. Today, as the delays (and the relaxation of economic sanctions against Iran) continue, Obama's promise is idle. The mullahs, who have been playing him for a sucker all along, will get their bomb. Obama can only hope for a toothless treaty that postpones Iran's acquisition of a functioning ICBM system — until after he leaves office, when nuclear proliferation in the Middle East will become his successor's problem.

As al Qaeda continues to be a grave threat, Mr. Obama has convinced himself that for ISIS — the now much larger threat — we can pretend that everything's going to be OK.

We also know what Sunni Muslim radical organizations such as ISIS want. They tell us, loudly and unequivocally: 7th-century Islam, a caliphate, with sharia law, and remorseless death to all who interfere. That they are pathologically indifferent to diplomacy, negotiation, or compromise is demonstrated in a relentless parade of choreographed atrocities: decapitation, crucifixion, immolation, torture, rape, slavery, and mass murder, to name a few. In his brilliant and disturbing exposé, What ISIS Really Wants, Graeme Wood elucidates,

We can gather that their state [ISIS] rejects peace as a matter of principle; that it hungers for genocide; that its religious views make it constitutionally incapable of certain types of change, even if that change might ensure its survival; and that it considers itself a harbinger of — and headline player in — the imminent end of the world.

Wood suspects that, in the past year, President Obama's confusion over the nature of ISIS "may have contributed to significant strategic errors." The confusion extends much further back. As ISIS marauded into Iraq in late 2013, Obama may have believed that he could reason with Abu Bakr al-Baghdadi, the leader of what Obama perceived to be the al Qaeda JV team. However, already embroiled in the war against terrorism and fully aware of ISIS's fanatical designs on Iraq, he might have followed the advice of Benjamin Franklin, arguably the finest diplomat in US history, who knew that sometimes "force shites on the back of reason." Had Obama chosen this path, any time before January 3, 2014, the day when Fallujah fell to al-Baghdadi's brutal thugs, would have been a fine time for overwhelming military force to shit on the back of ISIS.

It did not. Unchallenged, ISIS continued its rapid expansion, conquering most of northern Iraq by early June, when it captured the city of Mosul. It wasn't until August, when American journalist James Foley was beheaded, that Obama sprang into action — in a press briefing, where the president announced, to the dismay of our allies in the Middle East and Europe, that he had no strategy.

By the following week, however, he had hastily cobbled together a plan to "degrade and ultimately defeat" ISIS. The plan, enlisting the aid of allies (nine, initially), would involve air strikes against ISIS targets in Iraq and no American "boots on the ground" anywhere. With Syria but a tattered impression in his entangled memory, Secretary of State John Kerry spouted, “Obviously I think that’s a red line for everybody here.” ISIS poses no existential threat to the US, yet. The immediate threat is to Iraq, the oil-producing monarchies in the Arabian Peninsula, and, to a lesser extent, Europe. When it comes to facing ISIS on the ground, those with the most to lose have the greatest aversion to doing so.

Obama's goal may be to defeat ISIS, but his strategy is based on constraint.

Only the Kurds have been willing to face ISIS. Apart from Israel, they are our only true ally in the region. They struggle alone, except for sporadic US air support. Their weapons are obsolete. The ISIS attackers wield vastly superior American weapons, stolen from the Iraqi military. Kurdish pleas for such weapons have found nothing but Obama's shameless denial.

Our other Middle East allies meekly stand by, partly because of their reluctance to face any grueling warfare, but also, perhaps more significantly, because of their suspicions about Obama. They are Sunnis, who, while appreciating Obama's dilemma in Syria (where he can't bomb ISIS without helping Assad), are deeply troubled by his concessions to Iran — a Shiite juggernaut feared more than ISIS. Why should they follow a leader whose ultimate sympathies lie with their ultimate enemy?

President Obama entered office vowing to deliver on his campaign pledge to improve America's image in the Middle East. Apologizing for America's arrogance (including the War in Iraq, torture, Gitmo, and more), he did his best to ingratiate himself with the Muslim world. He did, however, warn that "al Qaeda is still a threat and that we cannot pretend somehow that because Barack Hussein Obama got elected as president suddenly everything's going to be OK."

But ending the Iraq War did not win the favor of Islam. Indeed, Obama's hasty withdrawal from Iraq (against the wishes of his military advisors) thrust that country into a violent chaos that destroyed what he himself called “a sovereign, stable and self-reliant Iraq" and touted as "an extraordinary achievement." It allowed ISIS to be created — reconstituted from the remnants of al Qaeda in Iraq (AQI) that had been defeated by the Bush-Petraeus surge. With his pre-announced 2016 exit, Afghanistan is likely to follow the same trajectory. And we were kicked out of Libya, Yemen, and Syria by Sunni Muslim terrorists, Shiite Muslim terrorists, and Vladimir Putin, respectively. So much for America's image.

As al Qaeda continues to be a grave threat, Mr. Obama has convinced himself that for ISIS — the now much larger threat — we can pretend that everything's going to be OK. In his recent Vox interview, he asserted that the media exaggerates terrorism and that climate change and epidemic disease may be more important issues. He concedes that it is legitimate for Americans to be concerned "when you've got a bunch of violent, vicious zealots who behead people or randomly shoot a bunch of folks in a deli in Paris," fastidiously avoiding, of course, any association with radical Islam. We should not be alarmed by the organization that he once dismissed as a JV team, and now dismisses as a caliphate, believing that it will collapse under its own weight. Says Obama, "It [ISIS] can talk about setting up the new caliphate but nobody is under any illusions that they can actually, you know, sustain or feed people or educate people or organize a society that would work."

Nevertheless, with the gruesome ISIS murders, in early February, of a Japanese journalist (beheaded), a Jordanian pilot (burned alive in a cage), and 21 Egyptian Christians (beheaded), Obama was spurred to action. He convened a global summit, in Washington DC, where leaders from 60 countries came to combat "violent extremism" — by the surprising method of "empowering local communities" that can provide "economic, educational and entrepreneurial development so people have hope for a life of dignity." Said the president, "We can help Muslim entrepreneurs and youths work with the private sector to develop social media tools to counter extremist narratives on the Internet." To that end, the State Department promptly opened 350 Twitter accounts (designed, apparently, to deluge the violent extremists with clever anti-barbarism tweets) and a new web site: "The Solution to Violent Extremism Begins in your Community."

Strangely, they are serious. Violent extremism, says John Kerry, is "the defining fight of our generation." Back in the real world, however, it is quite astonishing that Obama has been unable to convince countries such as Saudi Arabia, Turkey, and the Gulf states to join the fight against ISIS. These Sunni Muslim nations, having the most to lose, should be the most willing to put their own boots on the ground against ISIS. Nothing would please America more than to see Arab Muslim soldiers at the forefront of Obama's campaign to "degrade and ultimately defeat" ISIS. Should this happen, I am sure that Christians, Jews, and those of other faiths would march together with Muslim Americans through the streets of America cheering for our president and praising his inspired leadership.

Hope could work. It has worked very well for Obama in the past. After all, it's how he was elected president.

But it's not likely. Obama's goal may be to defeat ISIS, but his strategy is based on constraint: can't bomb Syria, can't cross Kerry's red line, can't jeopardize negotiations with Iran, can't offend Islam, can't capture terrorists, and so forth. Such a strategy, together with his indecisiveness and distaste for military force, crowds out the possibility of victory. Besides, even if ISIS is defeated, al Qaeda and numerous other radical Muslim organizations remain — not to mention Iran, an immensely virulent, existing terrorist organization, on the fast track to obtaining nuclear weapons.

President Obama, therefore, has retreated to his community organizer roots, where he finds, as chief weapons against Islamic terrorism: political rhetoric, social media, and hope — hope that ISIS self-destructs, that budding terrorists find jobs, that Iran abandons its nuclear ambitions, that pithy tweets will curb terrorist atrocities and stymie terrorist recruitment, and that the media stops exaggerating the barbarous acts committed, as Obama is careful to insist, by "individuals from various religions."

Hope could work. It has worked very well for Obama in the past. After all, it's how he was elected president. On the other hand, in Poor Richard’s Almanack, Franklin also warned, "He that lives upon Hope, dies farting."






If Ever, Oh Ever, a Wiz There Was


Entering his capital in triumph after a desperately hard campaign, Frederick the Great rode with his eyes forward, refusing to acknowledge the cheers of the crowd. An aide said, “Sire, the people are applauding you.” Frederick, eyes still resolutely on the road, replied, “If you put a monkey on this horse, the people would applaud him.”

That is my idea about news “anchors” such as Brian Williams.

It’s not everybody’s idea. On Feb. 8, in one of the last columns he wrote before his untimely death, David Carr said:

For some time now, there have been two versions of Brian Williams. One is an Emmy-winning, sober, talented anchor on the “NBC Nightly News” and the other is a funny, urbane celebrity who hosts “Saturday Night Live,” slow-jams the news with Jimmy Fallon and crushes it in every speech and public appearance he makes.

Each of those personas benefited the other, and his fame and appeal grew accordingly, past the anchor chair he occupied every weeknight and into a realm of celebrity that reaches all demographics and platforms. Even young people who wouldn’t be caught dead watching the evening news know who Mr. Williams is.

I’m citing this as a good example of the strange idea that there was a before and after to the Williams story — a before in which Williams was not just a celebrity but a funny, urbane, talented, appealing celebrity, and an after in which he was a dope and a blowhard, always telling ridiculous stories about himself.

As witnessed by the add-on adjectives, one can be a celebrity without having any attractive qualities at all. That has been obvious for some time, but it’s still interesting to know. Unfortunately, it’s also evident that one doesn’t need to do much in order to be regarded as funny, urbane, talented, and appealing. To me, and to hundreds of millions of other people, there was never anything remarkable about Brian Williams. I don’t regard slow-jamming the news with Jimmy Fallon as something that requires a lot of talent. Williams never crushed it with me.

Williams’ talent, such as it was, consisted merely of being a news anchor who did things that are usually not associated with being a news anchor. Lots of people are celebrities for reasons like that. Preachers and politicians get loud laughs when they tell a joke, but only because people think it’s amusing that someone with such a dull job can tell any joke at all. The animals on YouTube are considered amusing for doing things that any dull, ordinary person does every day; their talent is merely being animals that are trying to do those things. But if you found out that the dog wasn’t really a dog, or the cat wasn’t really a cat, or the news anchor wasn’t anchoring much of anything, no one would want to watch any of the supposedly amusing antics. And being a news anchor requires a lot less than being a dog or cat.


I am old enough to have been a victim of the Age of Cronkite, an age now deeply venerated by a lot of people who believe that at some time in the past there really was a Wizard of Oz. I say “victim” because in those days there was no national electronic news except the offerings of the three government-licensed networks. Cable TV — always called, suspiciously, “pay TV” — did not exist. Basically, it was illegal. So, for lack of competition, a complacent man of modern-liberal ideas who was capable of reading a few minutes of typescript, crying when Democrats were hurt or killed, and, essentially, reprocessing news releases from the White House (or, in times of Republican administrations, from opponents of the White House) was worshiped as a god. At the time, however, he was worshiped by nobody except people whose own intellectual equipment was so faulty that their fondest hope was to be like him.

I know of one “news anchor” who was smart and knowledgeable and a good writer of his own books. That was David Brinkley. There used to be a cable anchor who was even better than Brinkley, Brit Hume of Fox News. But Brinkley is dead, and Hume is retired. Compared with these respectable figures, Walter Cronkite was the little mouse you see in the diorama of North American mammals, nibbling seeds at a fearful distance from the lordly elk. Brian Williams is down the hall, in the insect diorama.

This kind of comparison is actually too good to waste on such a lowly subject as Williams. It would be more appropriate for creatures with real significance — dictators, kings, and presidents. In the presidential diorama, the elk herd would feature such important fauna as Washington, Jefferson, Jackson, Cleveland (who commanded his aides to “tell the truth,” and meant it); the mice would be Monroe, Benjamin Harrison, Hoover, and so on; and the insects would be Tyler, Carter, Clinton, Bush (the second), and Obama. Yes, I know, we may need to bring more animals into the metaphor. But the curious — or curiously predictable — thing is that Williams actually aspired to be one of the celebrity insects, who in our times are happiest scurrying about in their hard little bodies, irritating everybody else into noticing them.


In a documentary filmed in 2006 about Hurricane Katrina, which in 2005 flooded low-lying parts of New Orleans, and which with a lot of help from Williams enraged the nation at the inability of Republican administrations to overrule acts of God, Williams boasted: “People say we found our voice on this story, after some long, cold years of one Bush term and some change.” What? What did he mean by that?

He provided part of an explanation in a speech at Temple University last October. He was there, as Tim Graham put it in NewsBusters on Feb. 10, to pick up “an award for ‘excellence.’” They give each other awards, these excellent people. And if they have to lie, well, what the hell? "I have seen thousands of dead people in different places," Williams claimed, erroneously. Then he demanded the reward of sympathy for his own imaginary suffering. Speaking of himself, he said, "You have to find a place to put that [his erroneous memories] or else you can't get up in the morning." Mental image of Williams, looking for a cupboard in which to store imaginary deaths.

After that outburst, he turned to his reason for hearing a mighty significance in the “voice” he “found” in New Orleans — in the tale he told (with suitable adjustments, over the years, such as seeing dead bodies floating around the streets, or nearly dying, himself, of dysentery, or being threatened by gangs that busted into his hotel) about the New Orleans hurricane. I apologize for the syntax of the following quotation from Williams. You have to realize that this is how talented network news anchors (pay rate: $10 million a year) talk when they’re off-script:

For what it meant to our society. For what it still means. The issues. Race. The environment. Energy. Justice. The lack of it. It's all still there.

Now really, what can you make of that? Beneath the total incoherence appears to lurk some claim that by reporting (falsely) on a flood, Williams was somehow addressing issues of race (granted, most of the people who were flooded out by Katrina were African-Americans), energy (huh?), justice (it’s unjust to be flooded by a hurricane?), and “the environment.” The only way in which that last phrase makes sense is to assume that Williams is indicting Mother Nature for being, as Tennyson called her, “red in tooth and claw.” But I’m sure that’s not what he meant.


When something is really bad, it’s impossible to parody. Its literary effect cannot be enhanced, no matter what you do. But the political and social interest of Williams’ bizarre statements has not been fully developed. The big question is, why didn’t somebody at NBC stop him from saying all this crap? Everybody knew he was saying it, over and over again, for years. And I’m not just depending on anonymous sources to make that allegation. Given the nature of television broadcasting, it has to be true.

To me, one of the most amazing things in the world is that people give some kind of credence to the word “reality” in connection with what they see on television. Consider the term “reality TV.” Twenty feet away from those morons who face the camera and pour out their hearts about how lonely and helpless they’re feeling is a crowd of photographers, directors, producers, make-up artists, best boys, gofers, and people whose jobs cannot be described. In the same way, whenever Brian Williams had himself photographed in some bold act of “reporting,” he was surrounded by network tenders, every one of whom knew what he was doing, and knew it was crap. When he dribbled out his life story to interviewers on other media, hundreds of people back at NBC were following the publicity it gave him, and them. They knew, all right. The social and political question is, why did they let it go on, for more than a decade?

One answer is that they were lazy. But that’s the wrong answer. People who have responsible positions with TV networks aren’t sleepy little puppies; they’re sleepless sharks, required to compete with other sleepless sharks. OK, try this: nothing was done about Williams because he was being paid ten million dollars a year, and you don’t mess with that kind of investment. If you do, the investment will make like a shark and mess with you. That’s a better answer. And maybe it’s a sufficient one, although it doesn’t account for the behavior of the top predators, the ones who were doling out the money and should have been more risk-averse.

A third answer, which may be true, or partially true, is that Williams was protected by his dopey, inarticulate, yet constant political correctness. Here is the guy who interviewed the last President Bush, long after he had left office, and expressed astonishment that his recent reading matter could actually have included a Camus novel and three of Shakespeare’s plays. Astonishment. To the man’s face. Anyone not a dopey leftist would have said, “Oh, Mr. President, what impressed you most about those works?” But Williams is just dopey enough to believe his own dopey propaganda. He believed that Bush was dumb, and he didn’t know how to deal with the counter-evidence. (Or, probably, with the literary conversation that might have ensued.)

I hope you won’t write in to accuse me of being a partisan of George Bush, either one of them. Don’t worry about that. But everybody with the least curiosity has always known that Bush (regnal years 2001–2009) is a huge reader of books. Whether they do him any good — that’s another question. But what books has Williams ever read? Certainly none that would reveal to him the individuality of human life, its constant war with social stereotypes (e.g., “Republicans have no culture”). So naturally he aspired to become a stereotype — in just the best and brightest way. He cast himself as a thoughtful advocate for such stereotyped issues as, oh, “the environment,” “justice,” and the like. No one could possibly fear that he would ever harbor a critical thought about such things.


And that, I believe, is why “liberal” commentators have been so anxious to defend him, regretting that he quit, being confident that his offenses didn’t rise to the level of lying, bringing in psychiatry to remind us that people easily and innocently confuse their memories, and doing all the other stuff they’re paid big bucks to do. I guess they don’t want to lose their own license to lie.

NBC’s official approach is different. Network pooh-bahs are taking the line that presidential spokesman Josh Earnest recently took, in response to questions about Obama’s decision not to join the Charlie Hebdo protest against terrorism, or to send anyone important to sub for him. The basic policy is that responsible officials accept responsibility only for successes. Failures are no one’s responsibility. They happened. Well, sort of. But now we can move on.

As Julie Pace of the AP informed her readers, Earnest explained his boss’s absence from the Paris demonstration in this way:

Earnest said the White House took the blame but that Obama himself was not personally involved in the decision. Earnest would not say who was responsible for deciding the administration's participation in the event.

In other words, it is now possible to get on Air Force One and travel to Paris, or not to get on Air Force One and travel to Paris, and still have nothing whatever to do with the decision, personally. It wasn’t the decision of anyone who lives in the White House; the White House itself made the decision, or at least took the blame. Personal now means impersonal, and responsibility means freedom from responsibility.

Good. Very good.



