Fools and Their Folly


Ralph Northam, governor of Virginia — perhaps soon-to-be ex-governor of Virginia — is a fool. On that we can all agree.

But until a few days ago, he was not a fool.

He was not a fool when he was running for governor and some of his followers ran an ad suggesting that his opponent was a violent racist, an ad that he first defended, while implicitly disavowing, and then disavowed, while implicitly defending. A few associates of his opponent’s party remember that, but nobody really cares.

And he was not a fool when, on January 30 of this year, he commented on a bill advanced by his party in the legislature that seemed to allow abortions during normal-term birth, with the option of infanticide, by saying:

So in this particular example if a mother is in labor, I can tell you exactly what would happen, the infant would be delivered. The infant would be kept comfortable. The infant would be resuscitated if that’s what the mother and the family desired, and then a discussion would ensue between the physicians and the mother.

Conservatives pounced on this, asserting that Northam was a baby killer, although it was easier to show that his comments about “exactly what would happen” were more like the maunderings of a fool than any declaration of specific intent. But few people called him a fool.

Then, in early February, someone finally publicized what must have been known to many, a page from Northam’s med-school yearbook glorifying alcoholic beverages and illustrating their glory by showing a man in blackface and a Klansman drinking happily from their cans of (presumably) brew. That’s exactly what you want in your med-school yearbook, right? If you do, you’re a fool.

Northam then proceeded to prove, and overprove, that you cannot part a fool from his folly. He confessed that he was one of the men in the picture, though he didn’t say which one, and apologized for the harmful effects of what he had done. A day later he decided that he was not one of the men in the picture and had, in fact, nothing to do with the picture — although, he added, he had once done a blackface imitation of Michael Jackson’s moonwalk routine. It is said that Northam’s wife had to prevent him from showing the press that he could still do the moonwalk.

Instantaneously, cries arose from every quarter, including Northam’s own party, that he must resign forthwith. There were even cries, from outside his party, against the allegedly culpable inaction of his lieutenant governor, an African-American who, perhaps, did not wish to be seen staging a coup d’etat. Northam was now everything vile and vicious, and the whole nation appeared to agree.

But the root of this vileness was not identified. The root cause of racism and all its ridiculous symbols and tokens — symbols and tokens that may sometimes exist without any particularly racist thought, or any thought at all — is folly, mindlessness, sheer stupidity, the conviction that you are thinking when you’re not, the conviction that you can get through the world without any mental activity, and that nobody else will notice.

Apparently, the odds on doing so are pretty good, because Northam did get through 59 years in this world without anyone noticing what a dope he is. It’s only the “racism” that was finally observed. And I suppose that this is the way the republic needs to continue, because where could political leaders be found if every fool were identified as such, and driven from public office?






L’Amour, L’Amour!


I’m sure that at some time in your life you’ve had a friend who made you his confidant about the details of a troubled romance. He claimed to want your advice, but advice was hard to give, because he kept painting different pictures of his special person. One day she was an angel; the next day, a devil; the third day, some woman he could barely remember — a minor mistake from which he was moving forward. But the cycle began all over again, and you wondered whether he was talking about the same person, or any person, or just a strange projection of himself.

I thought of this when I watched the behavior of the alleged news media on the weekend of January 18, when they fell in love with a story provided by an oft-discredited reporter for the oft-discredited BuzzFeed. The story, which involved “evidence” that Donald Trump had told one of his attorneys, the oft-discredited Michael Cohen, to lie to Congress about a hotel deal in Russia, was unlikely on the face of it. But beauty is in the eye of the beholder, and the media took to it like trout to Acme’s Amazing Fly. Then it was proven false, and became, like a discarded love affair, a sad betrayal of ardent feelings, closely followed by, “Oh, that! Do you still care about that?”

I’m doubtless being too judgmental, but ye who have watched a friend go through this cycle again and again, whisper now to me: after a while, don’t you begin to wonder whether your lovelorn buddy is actually very bright? You don’t care whether he’s a college professor or an expert on something scientific, or even a talking head on TV. You wonder: maybe this guy’s just not very smart.

I say that, because the media have gone through all this many times before, and we know they’ll go through it many times again. Curiously unable to make a cogent argument against the Trump regime, the $300K-per-year hacks of big media are always dying to romance another story with flashy makeup and fishnet stockings, protecting themselves from consequences not with a condom but with a magic incantation: “If this is true . . . .”

On January 18, the phrase, “If this is true, then President Trump will be impeached” was repeated so often that, a couple of days later, during the wake-up-with-a-hangover-but-without-your-wallet period of the news cycle, I heard a pundit on MSM-TV (don’t ask me which; they all look alike to me) exclaim: “‘If true’ — the most important words in Washington today!”

Here’s the game, and any fanatic, or newsroom partisan, or idiot with an axe to grind, knows how to play it: in a well-wired nation of hundreds of millions of people, you can source any kind of story you want to run. If you want to suggest that plants can talk, or microwaves cause cancer, or marijuana has no medical value, or minimum wage laws create jobs, or immigration increases average household income, or crime is out of control, you can refer to a study or report that makes that claim, broadcast it, and add, “If true, this calls for . . .” some kind of action.

You can do the same with any well-known person. You can find someone who accuses him or her of something, present some version of testimony or senior officials’ anonymous comments or the cleaning staff’s careful review of discarded notes, add the “If true,” and make your own suggestions about impeachment, hanging, drawing and quartering, or merely (because you are full of mercy) firing, shaming, and reeducating.

Intelligent people can usually see through this. Unintelligent people assume that nobody will. It is with this in mind that I present the comment of Congressman Jim Clyburn (D-SC) regarding the “if true” debacle of the weekend of January 18: “I don’t think that my Democratic friends are in any way rushing to judgment because they qualified right up front [by saying], ‘If this is true.’ When you preface your statement with ‘If this is true,’ that, to me, gives you all the cover you need.”

So if some rightwing screed should claim, with no evidence except its say-so, that Jim Clyburn told an election official in his district to pack the ballot box, the whole establishment media as well as House Republicans would be justified in saying to the nation, in tones of solemn righteousness, “If this is true, Clyburn will be thrown out of Congress”? Well, if you say so. People have been hanged on less evidence.

But let’s return to the wording of Representative Clyburn’s statement, the part about “if this is true” giving “you all the cover you need.” Cover, used in this sense, has interesting connotations. It originated in the argot of criminals — “Yeah, I’m a bank teller; that’s my cover, till we git through with lootin’ the joint” — and it has never shed its associations with shady dealing. To cover yourself means to obscure a wrongful or equivocal deed. No one says cover myself without meaning cover up. If Clyburn doesn’t know this, he’s illiterate. If he does know it, he’s bragging about his colleagues’ shadiness.

Aaron Blake, senior political reporter for the Washington Post (what titles they have!), reviewed the issues about BuzzFeed’s fake news and its, ahem, coverage in a long series of tweets, going back and forth over the ethical problems like a cow searching helplessly for that last blade of grass (“I honestly don’t know what the answer is here”), and munching such deep thoughts as: “Each piece that’s written about something that may turn out to be untrue is counter-productive, at best. Even with extensive caveating (which I included), it furthers a story the [sic] erodes trust in the media.” He preceded this observation with a muddled commentary on the supposed responsibility of you and me, his audience (if any): “Media consumers aren’t as savvy as we’d like them to be, and just because something is technically accurate and qualified doesn’t make it good. People skip right over those caveats, and if they want to believe these reports, they treat them like gospel.”

Well, isn’t that smart! It’s almost as smart as thinking that caveating is a word, and very hip and cool, indeed. It’s almost as smart as telling your audience (media consumers) how dumb you think they are. But wait! Maybe that means that you yourself aren’t very smart. If that is true . . .

It’s hard to think about Washington, the place where words and phrases go to die, without thinking of that great eviscerator of meanings, the Washington Post, which recognized and continues to encourage the talent of Mr. Blake. On the night of January 18, the Post ran a story, as it had to do, about Special Counsel Robert Mueller’s contemptuous dismissal of the BuzzFeed report. At the end of that story appeared the words, in bold type: “Reporting the facts for over 140 years” — a bizarre reference to the Post itself. This claim was followed by a list of articles that “The Post Recommends.” The first two ridiculed President Trump. The third was headlined in this way:

Five big takeaways from the stunning report that Trump told Cohen to lie

If Trump told Michael Cohen to commit perjury, this could break the dam.

For God’s sake, couldn’t they drop the “recommends” at the moment when they themselves were debunking the stunning report?

Intelligent? No.

But as if to verify a lack of intelligence, the liberal media, and some noteworthy conservative media, immediately fell head over heels in love with a new story — a story about the supposed attack on an “ancient,” “frail,” American Indian “elder” and “Vietnam War veteran” who was “surrounded” and “harassed” and “threatened” by teenagers from a Catholic school in Covington, Kentucky, who had come to Washington to participate in a church-sponsored anti-abortion rally.

By this time, I don’t need to tell you what happened on January 18 at the Lincoln Memorial. My own version, which I believe is now the generally accepted one, is that the teenagers were waiting for a bus when they were attacked with violent words by a nutball group of “black Israelites” who called them crackers, faggots, and incest children, and called their black members a word that sounds like Negro, but is not. Rather than respond with violence, the students continued to wait, with placid, dopey high-school expressions on their faces. Then, out of nowhere, an American Indian from Ypsilanti, Michigan, came forward to beat a drum in their faces. I mean in their faces. Through all these things, the students responded with goofy good humor, chanting inane school cheers, jumping along with the rhythm of the drum, etc. That’s it. Here is Robby Soave’s account of the story, from Reason. And here are videos, of various political tendencies. You are welcome to disagree with Soave’s interpretation, or mine.

In any event, the “elder’s” entourage bore cameras, and by means of a Twitter source that even Twitter has now banished for misrepresenting itself, an invidiously edited video of the proceedings was made available to established “news” organizations, which immediately, without waiting a second, retailed the incident as a prize example of white racism.

This new spasm of national outrage included, in short order and with no pretense of investigation, fervent denunciations of the students not only by the usual suspects but also by the March for Life, the students’ Catholic diocese, the neighboring Catholic diocese, their school, and that august conservative journal, supporter of the right to life, and scourge of political correctness, National Review. NR published an article alleging of the students that “they might as well have just spit on the cross and got it over with.”

I’m not a Catholic, but I’m willing to confess: when I see “spit” being used instead of the real form of the verb in question, which is “spat,” my thought goes to, “You’re pretty dumb, aren’t you?” Especially if you’re a religious person, supposedly steeped in Scripture, and think that the Kentucky students are like the Roman soldiers who nailed Jesus to the cross. That’s the comparison that NR’s author made. My liveliest feeling was disgust at this combination of ignorance (why can’t you bother to investigate, at least, before you accuse people of being Christ-killers?), lack of perspective (even if the kids had been guilty of something, they’re effing kids, man), and inquisitorial thinking (by this point in history, I don’t need to explain what I mean by that). National Review — is that the journal William F. Buckley once edited?

Eventually NR apologized for the frantic article by its deputy managing editor, but with some curious excuses. The author, it said, was “operating off the best version of events he had” — an excuse that can be made for any failure to exercise a modicum of skepticism — and he was “writing as a faithful Catholic and pro-lifer who has the highest expectations of his compatriots, not as a social-justice activist.” Wait a minute — did I get that right? Are readers of NR supposed to be reassured that a writer of trash is one of their own?

Within a few days, and after a few threats of lawsuits, many prominent people who had said literally thoughtless things about the Kentucky high-school students — such as the suggestion that there were never more punchable faces than theirs (a desire for physical brutality is ordinarily a sign of intelligence, correct?) — were deleting their posts and tweets and declarations and journal articles (such as the NR article), sometimes in cowardly silence, but sometimes with sickeningly stupid attempts at explanation.

Example: one Jack Morrissey, a figure in Hollywood, has a Twitter account, on which he said, “#MAGAkids go screaming, hats first, into the woodchipper.” He followed that evocative phrase with a famous image from the movie Fargo, in which a dead body is fed into a woodchipper. Be it noted that the Kentucky kids were, some of them, wearing MAGA hats, which seems to have been the real reason why they were harassed, first in person, and then in the media, it being fair to attack kids as faggots and incest children and words that sound like Negro but are not and people who have stolen your land, so long as they appear to be supporters of the opposite political party. Very well. Mr. Morrissey dumped his tweet, and apologized. He said, “Yesterday I tweeted an image based on FARGO that was meant to be satirical — as always — but I see now that it was in bad taste.”

Well, good. But wait a minute. Morrissey also said, “I have no issue whatsoever with taking responsibility, but also completely apologizing that I clearly intended it to be seen as satire. That was clearly not recorded that way by many who saw it.”

Oh, I see. It’s we the readers who were dumb enough to miss the point that Morrissey clearly intended to be seen as satire. I’m very sorry! I completely apologize (as opposed to partially apologizing). But tell me, what was it a satire of? If Morrissey would give me a clue, even in his afterthoughts, that it might conceivably be a satire of people who rush to judgment and persecute other people and, in effect, feed kids into a woodchipper, alive and screaming, hats first, then perhaps I will understand. Otherwise, I will conclude that it was a satire of the students, and it was a kind of satire suggesting that something atrociously bad should happen to its objects.

I don’t think that smart people join mobs.

And I don’t think that smart people, apologizing for writing something that appears to be a vile attack on others, will abdicate their responsibility to discover, at long last, the relevant facts of the situation they wrote about. But maybe I’m wrong. Maybe Morrissey is as smart as he’s paid to be. Maybe it isn’t dumb for him to have added: “I have seen tweets from both sides feeling disappointed that the mainstream media went his [sic] way or that way. But I haven’t had the headspace to take the time to watch all the videos.”

Isn’t that precious? He doesn’t have the headspace. And I’ll bet he’s right. He doesn’t.






Glorious Beale Street


“Every black person born in America was born on Beale Street, born in the black neighborhood of some American city,” James Baldwin wrote in the 1974 novel on which Barry Jenkins’ film If Beale Street Could Talk is based. Beale Street itself is an area of Memphis important to African-Americans, designated by an act of Congress as “the Home of the Blues.”

In the 1860s black traveling musicians began performing there; they eventually developed a genre known as Memphis Blues, led by such legends as B.B. King, Louis Armstrong, Muddy Waters, Rosco Gordon, Memphis Minnie, Albert King, and Rufus Thomas. B.B. King was once billed as “the Beale Street Blues Boy.”

An astute real estate developer, Robert Church, became the first black millionaire in the South after he bought land along Beale Street following a devastating yellow fever epidemic. The famous Church Park, a cultural and recreational center where blues musicians gathered, is named for him, not for a religious organization.

In 1869 a congregation of freed slaves began building the Beale Street Baptist Church. Besides the congregation, it housed the newspaper offices of civil rights journalist Ida B. Wells. Such notables as Ulysses S. Grant and Teddy Roosevelt spoke there, while Booker T. Washington, Woodrow Wilson, and FDR spoke at the 2,000-seat auditorium in Church Park.

However, by the 1960s Beale Street had fallen on hard times. Many businesses had closed, and a disastrous urban renewal program had torn down many of the historic buildings and the neighborhoods surrounding it instead of renewing them. In April 1968 Martin Luther King, Jr. was assassinated not far from Beale Street.

Eventually the neighborhood was restored by the racially diverse Beale Street Development Corporation, and the area is now a popular tourist destination featuring the Beale Street Music Festival in early May each year. Beale Street’s development is tightly controlled by the city of Memphis, the BSDC, and a management company.

In so many ways, the story of Beale Street is an apt metaphor for the African-American experience — artistically gifted, entrepreneurially astute, politically active, brought down by neglect and resentment, and then restored by a consortium of well-intentioned but often misguided do-gooders who have changed the essence of what it once was.

Beale Street is also an apt metaphor for the characters in Jenkins’ movie If Beale Street Could Talk. A love story at heart, the film uses flashbacks to show how Tish (KiKi Layne) and Fonny (Stephan James) grew up as childhood friends, fell in love as teenagers, planned a future that included marriage and family, and saw their plans destroyed when Fonny was falsely accused of a heinous crime.

Although the movie takes place in Harlem, the characters represent different aspects of the Beale Street story. Fonny is an artist with big dreams. In one particularly beautiful scene, the smoke from his cigarette swirls around a sculpture and jazz music swirls around the scene as he coaxes the wood into submission to his art. Tish’s mother, Sharon (Regina King), works tirelessly against injustice, and Tish’s sister Ernestine (Teyonah Parris) is a rising activist who tells Tish at one point, “Unbow your head, sister, and do not be ashamed.”

Tish and Fonny’s fathers (Colman Domingo and Michael Beach) are both hardworking entrepreneurs. (Well, OK, they aren’t entirely legal, but they justify their black-market business by saying, “I never met a white man who didn’t lie and steal.” And in truth, Fonny is in jail because false witnesses have been suborned against him.) Fonny’s mother (Aunjanue Ellis) represents the church in the black community — moral and austere. And of course the urban renewal board is represented by the overzealous justice system that intends to clean up Harlem by putting the young black men in jail — whether they’re guilty or not.

Jazz and the blues also play central roles in this film. The soundtrack, mostly performed as string adagios, is bluesy, haunting, and full of despair, an emotion created by the close, discordant, unresolved harmonies and the deep, slow vibration of the bow across the bass strings. At the end, the credits roll to the sound of Billy Preston and Joseph Green’s slow, jazzy, plaintive “My Country ’tis of Thee.” If ever there was a time for singing the Beale Street blues, this is it.

Although Baldwin describes the young lovers in his novel as plain and unattractive, Jenkins chose to cast his Tish and Fonny with two astonishingly beautiful young actors. KiKi Layne radiates wide-eyed innocence mingled with tough determination, and Stephan James is not only handsome but also blessed with kind eyes and a warm smile. Who wouldn’t be drawn to them? Studies show that we trust and like attractive people more readily than ugly people, and clearly Jenkins was not going to take any chance that the audience might not sympathize with his protagonists. Mind you, I’m not complaining about the casting; it was a pleasure watching these two fall in love on screen.

We don’t learn the nature of the crime with which Fonny has been charged until 45 minutes into the movie, although we learn in the first five minutes that he is in jail. Jenkins also softens the scene of the first sexual encounter between the two by having Fonny gently cover Tish’s naked breasts with a blanket in a gesture that is both protective and romantic. It subtly tells us that Fonny could not have done what he is charged with; he just isn’t that kind of guy.

Sadly, under our flawed, overcrowded injustice system, it doesn’t much matter whether a person is guilty or innocent, especially if the person is poor or black. Most never go to trial. In fact, according to legal scholar William J. Stuntz, an astounding 94% of state felony convictions and 97% of federal convictions stem from plea bargains. If you can’t afford bail, you’ll sit in jail, waiting for your day in court, often for months and sometimes for years. So you take the deal and the record, just to get out of jail and back to your life. As Tish says to the audience in voiceover narration, “I hope that no one has to talk to anyone they love through a glass.”

Moreover, plea bargains have now become the safer bet in a legal system where freedom hangs on how a jury interprets the evidence and the defendant. Faced with the prospect of 30-to-life for a trial conviction versus 8-to-10 for a plea deal, even an innocent person is likely to take the deal. The deadly “to life” tacked on to many sentences today is especially chilling for the innocent; how can you convince the parole board of your remorse for a crime you did not commit?

The routine indeterminate sentencing of “to life,” which is bad for many reasons, was created three generations ago by liberal reformers. Its heyday is long past, and it needs to be eliminated, along with mandatory sentencing and three-strikes rules, to allow judges to judge and prisoners to have hope.

If Beale Street Could Talk presents a powerful story of love, loss, and loyalty. Baldwin’s 1974 portrayal of the injustice of our court system is just as true today. Barry Jenkins’ film version is not completely true to the novel, nor should it be — film is a visual and aural genre and needs to be adapted accordingly. The film is beautiful to watch, even though it is heartbreaking to comprehend.


Editor's Note: Review of "If Beale Street Could Talk," directed by Barry Jenkins. Annapurna, 2018, 119 minutes.





No White Saviors Need Apply


Witty, ironic, meaningful, and delightfully entertaining, Green Book is quite possibly the best movie released in 2018.

It’s based on the true story of African-American pianist Don Shirley (Mahershala Ali) and the unlikely friendship he developed with Tony Vallelonga (Viggo Mortensen), a Copacabana bouncer and self-proclaimed bullshitter. In fall 1962 Shirley hired Tony to be his driver and bodyguard during a concert tour through the South by the Don Shirley Trio, which consisted of Shirley and two white string musicians. What follows is a new twist on the old buddy genre as two opposites, one black, suave, educated, and sophisticated, and the other white, uncouth, ill-spoken, and street smart, learn to like each other. The two could not be more different, or more written against stereotype.

The name of the movie comes from a guidebook published by the Negro Tourists Bureau from the 1930s to the mid-1960s called The Negro Motorist Green Book. As you can guess, it identified restaurants, hotels, and public buildings that travelers of African descent could patronize. It was demeaning, and the Don Shirley Trio could make three times as much money doing gigs in New York, where they were more accepted and could move more freely. But, like Jackie Robinson before him, Don Shirley was out to make a point and blaze trails. He chose the southern circuit on purpose. Oleg (Dimiter D. Marinov), the bass player, understands. “Genius is not enough,” he explains to Tony. “It takes courage to change people’s hearts.”

We also realize that genius is not enough to bring happiness, any more than money is. Shirley is educated, talented, and rich, but he drinks alone. He knows the white European masters of music, but he doesn’t recognize Little Richard, Aretha Franklin, or Sam Cooke. He is a gourmand, but has never tasted fried chicken. He isn’t welcome in the hotel where his companions are staying, but when some men staying at the Green Book hotel invite him to join them for a game of horseshoes, he doesn’t know what to do. “If I’m not black enough, or white enough, or man enough, then tell me — what am I?” he asks Tony in anguish. He is an isolated individualist — certainly not defined by his race, but confined by Jim Crow nonetheless.

The acting throughout the film is superb. Ali won a Golden Globe for his role as the regal, impeccable Shirley; his comedic timing for noncomic dialog is perfect, and wait till you see him play the piano! In fact, the music in this film is stunning. Mortensen packed on the pounds and embraced his inner slob to play the lovable, slovenly, totally unself-aware Tony Lip. Linda Cardellini as Tony’s wife Dolores is so perfect that Nick Vallelonga, Tony’s real-life son and the author of the book on which the screenplay is based, said that he was in tears whenever Cardellini was on camera, because she is so much like his mother. Cardellini is one of those quietly unsung actors who is marvelous in everything she chooses to do. In addition, many of the people in the family scenes are not actors but members of the Vallelonga extended family, and there is an authentic vibrancy as they interact with one another around the table.

Unfortunately, following the film’s initial praise from critics when it opened and its three wins at the Golden Globes (for Best Screenplay, Best Supporting Actor, and Best Picture), the reputation of this fine film was maligned. Critics recently charged that it’s “racist” and another “white savior story.” Either these people haven’t seen the film, or they don’t understand the “white savior” genre, or they’re terrified to speak out against the progressivist hegemony.

Well, I’m not afraid to speak out. The person who is saved in this film is not the cultured, wealthy, talented black pianist, who hires the bodyguard, pays the bills, and calls the shots. It’s the gauche, ignorant, uncouth, bigoted white restaurant bouncer who takes the job and the orders. And anyone who suggests that any film with a black star and a white star necessarily creates a hierarchy with the black man at the bottom is being, well, just plain racist.

At the beginning of the movie Tony is comfortable in his bigotry. He’s a product of his environment, and his environment has been racist. He’s an equal opportunity bigot, however; he warns Shirley to “watch out for them Krauts and Cuban bastards.” Tony agrees to be Shirley’s driver and manage his itinerary, but he flatly refuses to launder his employer’s clothes or shine his shoes. He needs the money the job will provide, but he’s a little embarrassed by the relationship; when someone questions him about it he responds, “He ain’t my boss — I work for the record company!” At his home, when two black repairmen finish a heavy job, Dolores gives them each a glass of water. Seeing this, Tony fishes the glasses out of the sink and drops them into the trash. He will not be putting his lips where black lips have been. Does this sound like someone with a “white savior complex” to you? I think it’s no coincidence that during his concerts Shirley often plays a jazzy medley of songs from South Pacific, one of which bears the lyric, “You’ve got to be taught to be afraid / Of people whose eyes are oddly made, / And people whose skin is a diff’rent shade. / You’ve got to be carefully taught.” This film shows that you can be untaught as well.

The reversal of stereotypes continues when the well-spoken Dr. Shirley offers to correct Tony’s diction and make him more presentable in fine society, a witty race reversal. He even tries to change Tony’s last name to Valley, “something more pronounceable,” an ironic nod to the insidious, emasculating practice of renaming slaves at purchase for the convenience of their owners. While Tony is the star of the movie, Shirley holds the power in the relationship. He even owns a throne.

In every way, Dr. Don Shirley is superior to Tony. He is wealthier, more educated, more refined. He lives in a beautifully appointed apartment above Carnegie Hall and wears immaculately tailored suits, while Tony lives in a small apartment in the Bronx and wears ill-fitting bowling shirts. Although Tony does rescue Shirley from a couple of beatings, which he is paid to do, Shirley rescues Tony from jail. Tony is the protagonist on this journey, the one who changes, the one who is saved from his own bigotry to discover a friendship that would last until the end of his life. Not willing to share a glass with a black man? By the end of the film he is walking around in his undershorts and sleeping in the same room.

No, the real concern about this film — the true progressivist fear that’s whitewashed by accusations of white saviorism — is that it does not fit the current narrative of blacks as victims who need saving (ironically, at the behest of our mostly white legislators). The hypocrisy is so blatant it’s maddening. Don Shirley’s “sin” is that he achieved success through hard work and talent — yes, by his bootstraps — and that he espoused a philosophy of peaceful resistance. “You don’t win with violence,” he tells Tony. “You only win with dignity.” Try touting that philosophy with activists today.

In one particularly poignant scene, Shirley and Tony happen to stop near a field of black laborers to check something in the car. Camera filters intensify the lighting of the scene, mimicking the muted colors of a mid-century painting. No words are spoken, and none are necessary. The laborers stand in the fields picking cotton, dressed in headscarves and calico, while Shirley sits in the backseat of a Cadillac DeVille picking lint from his tailored suit with his soft manicured hands — for one reason: Shirley was given not only a talent for music but also a mother who could recognize it, nurture it and sell it. The key to his success is hinted at in the unsung lyrics of “Happy Talk,” also from the South Pacific medley: “You got to have a dream. / If you don’t have a dream / How you gonna have a dream come true?” Yes, Shirley, as well as the fieldworkers, faced racism and Jim Crow laws. Shirley had to live by the Green Book when he traveled, and he hated it. But he wasn’t victimized by it. He had a dream, and he made it happen.

In sum, Don Shirley’s story does not fit the political narrative of black suppression and victimhood that can only be righted through in-your-face activism and hatred toward whites. We are allowed to admire Black Panther as a strong leader and role model without disturbing the political narrative because he comes from Africa and has not been “tainted” by American sins. But we mustn’t tolerate the example of strong African-American characters without the backdrop of white racism. Thus a central theme of Hidden Figures — a film about the remarkable black women mathematicians who worked in NASA’s space program — deals with the women having to leave the building where they worked to use the colored bathrooms in a distant building, despite the fact that NASA already provided integrated bathrooms at that time. Talk about “demeaning.” Hollywood put them in the colored stalls, not NASA.

Similarly, the story of Don Shirley’s remarkable achievement must be sullied through unfair and untrue criticism of the powerful, witty, uplifting movie based on his life, simply because it doesn’t fit the acceptable stereotype. Indeed, I was soundly criticized for praising this film. But I won’t be cowed. And don’t you be fooled: Green Book is quite possibly the best movie you’ll see this year.


Editor's Note: "Green Book," directed by Peter Farrelly. Participant Media, 2018, 130 minutes.

Ominous Parallels?

A congressman wrote to a friend about an argument on the floor of the House of Representatives:

I never said a word to anybody, but quietly cocked my revolver in my pocket and took my position in the midst of the mob, and as coolly as I write it to you now, I had made up my mind to sell out my blood at the highest possible price.

An historian described the atmosphere in the Capitol in this way:

Recurrently, speakers lashed out in passages that threatened to precipitate a general affray. . . . Practically all members were now armed with deadly weapons. In both chambers, Senator Hammond said, “the only persons who do not have a revolver and a knife are those who have two revolvers.” For a time a New England Representative, a former clergyman, came unarmed, but finally he too bought a pistol. A Louisiana Congressman threatened to fetch his double-barrelled shotgun into the House. Supporters of both parties in the galleries also bore lethal weapons, and were ready to use them.

I quote from Allan Nevins’ The Emergence of Lincoln (New York, 1950; 2.121, 124), the best study I know of American politics in the late 1850s. The passages I cite refer to events of early 1860. In the middle of 1861, such events and the emotions that accompanied them produced their final effect — civil war.

Daniel Webster (and many others) had warned that factional disputes, intensified without limit, could result only in catastrophe:

Sir, he who sees these states, now revolving in harmony around a common centre, and expects to see them quit their places and fly off without convulsion, may look the next hour to see the heavenly bodies rush from their spheres, and jostle against each other in the realms of space, without producing the crush of the universe. (Speech in the Senate, March 7, 1850)

The warnings were heard and understood; yet, as Lincoln was to say in his second inaugural address, “the war came.”

What produced this awful effect, this war in which a million people perished, and more were dreadfully wounded? What produced this war of limbs hacked off without anesthetic, of towns put to the torch, of economic and psychological devastation on an enormous scale? What produced this expansion of political and military force, much of it permanent, though unimaginable in earlier American history? And what produced the peace that followed the war, a peace in which black people, the objects of the victors’ alleged solicitude, languished in poverty and systematic humiliation, generation after generation? And this sorry peace was inseparable from the war itself.

In the second inaugural Lincoln identified what he considered the causes of the conflict:

Both parties deprecated war, but one of them would make war rather than let the nation survive, and the other would accept war rather than let it perish.

Lincoln’s words impute to the major actors more conscious choice and final purpose than most of them felt. Jostling one another in the pursuit of immediate ends, leaders on both sides employed political methods that were not intended to produce a war, yet turned out to be the best means of doing so.

Let me put it in this way. Suppose you want to effect a violent disruption of human life. Here are some things you can do.

1. Convince yourself that you and your friends are right, entirely, and no one else is right, at all, about anything, thereby creating as many political divisions as possible. Reject any speculation that other people, though wrong, may have serious reasons for being that way.

2. Try to make sure that the political field is cleared of everyone but deadly enemies.

It is often said, and this is true, that before the 1830s Southerners were in general agreement that slavery was an evil, and many Southerners were more than amenable to limiting and eventually getting rid of it. There is also general agreement that the great majority of Northerners were happy enough to endorse ideas for the gradual abolition of slavery; indeed, every Northern state that started with slavery had successfully ended it. Even in the slave states, there were large numbers of free black people — by 1860, 250,000 of them.

Yet 30 years of being labeled enemies by both the partisans of slavery and the partisans of abolition progressively immobilized the ordinary, mildly well-intentioned middle range of public opinion. A flood of propaganda, emanating from each camp of zealots, spread the idea that no one who disagreed with the latest version of partisan orthodoxy could possibly have any but immoral reasons for doing so. Of the thousands of low points in this supposed dialogue, I will mention one — the political emasculation of Webster, formerly the North’s most admired public figure, at the hands of his fellow New England intellectuals, for the crime of supporting the Compromise of 1850. Thus Whittier, the supposedly gentle Quaker poet, depicting Webster as Satan in hell and Noah in his drunkenness:

Of all we loved and honored, naught
Save power remains;
A fallen angel’s pride of thought,
Still strong in chains.

From those great eyes
The soul has fled:
When faith is lost, when honor dies,
The man is dead!

Then, pay the reverence of old days
To his dead fame;
Walk backward, with averted gaze,
And hide the shame! (Whittier, “Ichabod”)

Note the instructive tone, the ecclesiastical certainty (“the soul has fled”), the moralistic comments and commands. These methods, though repulsive to almost everyone, are necessary to your purpose. You cannot be too self-confident when affixing the mark of Cain. Guard yourself: you must never become conscious of the irony involved in damning people while pretending that they are only worth ignoring.

3. Once you’ve converted potential collaborators into scorned opponents, and multiplied those opponents, do your best either to silence or to enrage them. Southerners were better at this than Northerners. In the South, the mails were censored to prevent dissemination of anti-slavery opinion, and mobs were formed to rid communities of people who gave signs of being anti-slavery; in ten Southern states, the Republican Party wasn’t even on the ballot. But in the North as well, jurists, writers, and teachers were targets of political correctness. Mobs were raised against “agents of the South,” non-abolitionists were purged from Protestant clergies, and politically active people were hounded into choosing between an official Democratic Party, directed by an incompetent president, which insisted that the Kansas-Nebraska Act be renounced and reviled, and a rising Republican Party, which insisted, for opposite reasons, that the Kansas-Nebraska Act be renounced and reviled.

4. Turn marginal positions into moral and political tests. The great issue of the 1850s was the question of whether slavery should be permitted in the Western territories, where no one but wild fanatics had ever believed that slavery could subsist. The North nonetheless demanded that it be banned by act of Congress, and the South nonetheless demanded that it be promoted by act of Congress. Sectional moralists indignantly rejected the Kansas-Nebraska idea, once favored by the South, that the question be left up to the people of the territories. Here was an issue of no practical importance, but it became the test of political viability. Emphasizing politically marginal questions makes it certain that marginal politicians will rise to the top; and if trouble is what you want, these people will give it to you.

5. Try to win, not by debate, but by definition; this is what “principled” people do. To the South and its friends, Republicans were always Black Republicans; that’s what they were. To radical Northerners, all proposals from south of the Mason-Dixon line were by definition products of the Slave Power, which was attempting to spread chattel slavery throughout the North, and ultimately to rule the Western hemisphere. It followed that useful proposals, such as gradual emancipation, which had attracted great sympathy on both sides of the Ohio, were by definition entering wedges of the opposition’s Satanic schemes, to be rejected out of hand.

6. Do your best to promote identity politics — the quest for power considered as a right derived from group membership. Southern partisans applauded the Supreme Court’s bizarre decision in the Dred Scott case, asserting that the Constitution governed everyone but protected only persons of non-African descent, while the cultural leaders of the North assumed that the Constitution was of no effect whenever it contradicted the will of God, which was effectively the will of Northern clergymen.

7. Render yourself blind to your own hypocrisy. The goal of hardcore abolitionists was (hold on to your hat) the secession of the North from the South, an act that would relieve the North of any possible association with slavery. To say that this idea expressed maximal concern for the tender consciences of abolitionists and minimal concern for the welfare of the slaves would be a pathetic understatement. As documented by such historians as Edward Renehan (The Secret Six, 1997), few abolitionists (John Brown was an exception) had any respect for actual, living African-Americans. Distinguished leaders of the abolition movement spoke of them in terms I do not wish to quote. Most hardcore abolitionists were also pacifists, advocates of “non-resistance.” Yet when secession happened, they became fervent advocates of violence as a means of crushing the other section’s suddenly illegal and immoral rupture of the union. Southern publicists cultivated a similarly gross hypocrisy — a growing emphasis on the Christianizing and civilizing effects of slavery, amid increasing attempts to criminalize the education of black people and curtail their practice of religion.

8. The fact that you can’t perceive your hypocrisy doesn’t mean that other people can’t; to prevent its public disclosure, you must therefore remove from positions of influence everyone who sees you as you are. Any pretext will do. You can follow the example of the religious proponents of slavery who removed honest preachers from the pulpit, as punishment for being divisive. Or you can take your cue from the religious opponents of slavery, who attacked all who differed with them as foes of Christian love.

9. Flirt with, encourage, and finally idealize violence. In 1856, Charles Sumner, Republican of Massachusetts, delivered a speech in the Senate that was so insulting to a Southern senator, a person who had aided and befriended him, that Stephen Douglas, listening, muttered to himself, “That damn fool will get himself killed by some other damn fool.” The candidate for the other damn fool was Congressman Preston Brooks, Democrat of South Carolina. He didn’t try to kill Sumner, only to humiliate him, but he went to the Senate chamber and assaulted him with a cane. Once he had started, he became more enthusiastic and wounded him so badly that he might have died. The response of Southern partisans was to celebrate Brooks’ achievement, often with souvenir model canes, as if caning your political foes were an act of Arthurian virtue. In 1859, John Brown’s attempt to abolish slavery by inciting a servile insurrection — a campaign in which the first enemy slain was Heyward Shepherd, a free black man — sent Emerson, Thoreau, and other Best People of the North into paroxysms of idolatry. Their celebrations of Brown were immediately followed by a wave of Southern lynchings of people erroneously suspected of being in league with him. The participants seem never to have regretted their mistakes; it was all in a good cause.

When things have gone that far, what’s left but war? It’s true, few people, North or South, black or white, wanted a civil war; comparatively few people in the South actually wanted secession, and none of them would have wanted it if they’d had enough sense to visualize its consequences. But when zealots who hold political power cannot stand to be in the same room with one another, except when they are armed — physically or rhetorically — with weapons of destruction, the only choice remaining is the choice between peaceful dissolution and civil war. And few people of that kind will settle for peaceful dissolution.

So much for the events and feelings of the mid-19th century. Do they have anything to say to us, about our own time?

You can answer that question as well as I can. The idea of “ominous parallels” is basically a joke — nothing is really parallel in history, and the most ominous thing about purported parallels is probably the strength of people’s belief in them. But alleged parallels can suggest real similarities, however distant — and important dissimilarities, too.

When I compare 1860–61 with 2018–19, one dissimilarity seems especially important: the difference in intellectual culture, historical knowledge, and capacity for complex political thought between the leaders of then and the leaders of now. Seward, Lincoln, Crittenden, Davis, Benjamin, Douglas, Stephens, Houston, and immediately before them, Webster, Benton, Clay . . . We can discuss their delusions, their false perspectives, their sacrifices of long-term to short-term benefits, their strange errors of judgment. But please show me a list of equally intelligent, capable, knowledgeable, or even personally interesting political leaders in America today.

You can’t? That’s what I call ominous.

Elizabeth Warren’s Comedy Act

I thought that American politics couldn’t get any funnier, but of course I was wrong. And right now, the funniest politician is actually the sour, self-righteous Elizabeth Warren.

Long ridiculed by President Trump, and millions of other people, for claiming to be an American Indian, Warren has now triumphantly released a study of her DNA. According to the Stanford professor who analyzed the data, “the facts suggest that [she] absolutely [has] Native American ancestry in [her] pedigree.”

“Pedigree”? Oh well. The unwary reader may conclude, as Warren appears to have concluded, that her Native American “heritage” has now been authenticated. But that’s ridiculous — for two reasons.

One is that the purported percentage of her Indian ancestry is a whopping 1/1024th. That’s right — one part in a thousand.

“Would you like a cup of coffee?”

“Yes, please!”

“Fill it up?”

“Not quite. Just make it 1/1024th full.”

All right, I distorted the hard, scientific “data.” Warren could have as much as 1/64th Indian ancestry. So just make that cup 1/64th full.

The second ridiculosity is the whole notion of “heritage” based on genes. Culture has nothing to do with your body. But suppose it did. If you need to have your DNA analyzed to find out whether you’ve inherited some cultural characteristic, then you haven’t.

Even the TV actors who burble about “discovering their Swedish heritage” by taking a DNA test and learning that they’re 40% Swedish aren’t as absurd as this US Senator. But given her total lack of self-awareness (which is nothing unusual, given her occupation), I suppose it won’t take her long to appear on television to inform the other hundred million Americans who are at least 1/1000th Indian that now, because of the wonders of science, they too can discover who they really are — and prove it, by ending their long night of discrimination and electing one of their own (guess who?) as president.

We are all Indians now.

Racism

In 1979, undercover Colorado Springs police officer Ron Stallworth noticed a phone number in a local newspaper in a small ad seeking members to begin a new chapter of the Ku Klux Klan. He called the number and pretended to be a white supremacist, hoping to infiltrate the organization in order to thwart the rising violence against black residents in general and the black student union at the college in particular. Soon the KKK leader suggested that they meet in person. The only hitch? Ron Stallworth was black.

Spike Lee’s BlacKkKlansman tells the tale, and it’s a gripping, suspenseful, often humorous, and often troubling one. As the film narrates the story, when KKK leader Walter Breachway (played in the movie by Ryan Eggold) eventually asks for a face-to-face meeting, Stallworth (John David Washington, Denzel’s son) arranges for a white undercover narcotics cop, Flip Zimmerman (Adam Driver), to stand in for him. Yes, a black man and a Jew both manage to infiltrate the hateful KKK by posing as the same white supremacist. Stallworth continues to talk with Walter by phone while Zimmerman continues to meet with Klan members in person, necessitating that their stories and even their voices match. Walter’s second in command, Felix (Jasper Paakkonen), grows suspicious, or perhaps jealous, and as his sadistic streak surfaces we worry for Zimmerman’s life.

During the course of his investigation Stallworth contacts David Duke himself (Topher Grace), then the Grand Wizard of the Ku Klux Klan and future Louisiana State Representative. The boyish Grace, best known for the TV series That ’70s Show, plays Duke with perfect oblivion to his bigotry. Lee is a bit heavyhanded, however, in his determination to connect Duke’s rhetoric with Trump’s “Make America Great Again” rhetoric.

Adam Driver provides a nuanced performance as the lapsed, nonchalant Jew forced to confront his feelings about his heritage when he is threatened simply because of his genetic stew. Corey Hawkins is fiery as Kwame Ture (aka Stokely Carmichael), and Laura Harrier channels Angela Davis luminously with her big round glasses and bigger round afro as Patrice Dumas, president of the black student union. Harry Belafonte is a standout as Jerome Turner, carrying with him the weary weight of his own decades in the civil rights movement. Director Lee chooses caricature rather than character with some of his KKK subjects, particularly the slack-jawed near-imbecile Ivanhoe (Paul Walter Hauser) and Walter’s perky, overweight, frilly aproned wife Connie (Ashlie Atkinson). But after watching decades of black caricature on film, I can forgive him this hamhandedness.

While the plot of BlacKkKlansman covers just nine months in the 1970s, the story spans more than a century. It opens with a scene from Gone with the Wind, presents upsetting clips from Birth of a Nation, which celebrated the KKK, and ends with footage from the deadly riot in Charlottesville last year. And Harry Belafonte, as Jerome Turner, provides a soft-spoken, emotional, and tender account of the horrifying 1916 lynching and burning of Jesse Washington in Waco, Texas.

I’m always a little uncomfortable and defensive when I see films like this; it’s important to be aware of black history, and I’m glad these stories are being recorded on film. But it feels as though I’m intruding somehow, as though all whites are being accused of the same ignorant, bigoted mindset that we see on the screen. In reality, of course, white supremacists represent a tiny minority of the population, while white voters, white activists, white teachers, and white politicians have worked vigorously in the cause of civil rights.

If there is one underlying truth about racism, it is this: government is the Grand Wizard of bigotry. Government legalized slavery and enforced the Fugitive Slave Law. Government institutionalized segregation through neighborhood-based public schools and “separate but equal” policies, and governments outlawed miscegenation. Government imposed poll taxes and voting questionnaires. Government grants and welfare in the 1960s were well-intentioned, but they incentivized single motherhood, established barriers to work through public assistance programs that were difficult to relinquish for an entry-level job, and created a dragnet rather than a safety net that virtually destroyed the black family in urban neighborhoods.

Meanwhile, activists — black and white, male and female — exercising their rights to free speech and open dialogue were the catalyst for change and inclusion. Freedom of speech is the most important right we have. It’s the foundation for all other rights. Yet too many activists today are turning to government to establish hate-speech laws that limit free speech. These films seldom acknowledge the friendship and genuine concern felt by so many white Americans, or the fact that the discovery of truth is a process. Lee gives a welcome nod to this idea at the end of the film, but it takes a long time to get there. Still, BlacKkKlansman is well made and well worth seeing.


Editor's Note: Review of "BlacKkKlansman," directed by Spike Lee. Focus Features and Legendary World, 2018, 135 minutes.

The Return of Malthusian Equilibrium

After the Europeans’ departure from their colonies at the end of World War II, the Third World rapidly became tyrannical, and its economies began a long decline. That institutional collapse has continued ever since, except that over the past two decades, from an extremely low base, Third World economies have improved. This economic growth did not happen because the Third World liberalized its economies or adopted any fundamental cultural change in its societies. What enabled synchronous economic progress in the Third World over the past two decades was the internet and the emergence of China.

Cheap telephony and the internet came into existence in the late ’80s. The internet provided pipelines for the transfer of technology and enabled wage arbitrage to be exploited. Also, many countries — particularly in Latin America and sub-Saharan Africa — benefited from the export of resources to gluttonous-for-resources China, the only emerging market I know of, and to the developed world, which, contrary to propaganda, is economically still by far the fastest growing part of the world.

It is hard to believe, but many countries in the Middle East and North Africa peaked economically in the 1970s. Their competitive advantage was oil, not human resources. The per capita real GDPs of Saudi Arabia and the UAE, despite the fact that they have had a relatively peaceful existence, are about half as large as they were in the ’70s. The situation is similar in Venezuela and to a large extent in Nigeria. Except for the personal use of cellphones, the information technology revolution has simply bypassed these and many other countries.

According to the propaganda — steeped in political correctness — of the international organizations, all the fastest growing economies are in the Third World. But simple primary school mathematics helps cut through this propaganda. Ethiopia is claimed to be among the fastest growing large economies. This is quite a lie. An 8.5% growth rate of Ethiopia on GDP per capita of US$846 translates into growth of a mere US$72 per capita per year. The US economy, with GDP per capita of US$62,152, is 73 times larger, and despite its growth at a seemingly low rate of 2.2%, it currently adds US$1,367 to its per capita GDP — 19 times more than Ethiopia. The situation looks even more unfavorable for Ethiopia if its population explosion of 2.5% per year is considered.

Cherry-picking countries of subsistence farmers and cattle-herders for propaganda purposes tells you nothing about the sustainability of their growth, and certainly does not in any way enable comparison with the developed world.
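As a sanity check, the per-capita arithmetic above is easy to reproduce. The following minimal Python sketch uses only the figures quoted in the text (GDP per capita of US$846 and US$62,152; headline growth rates of 8.5% and 2.2%); the helper function name is merely illustrative:

```python
def per_capita_gain(gdp_per_capita: float, growth_rate: float) -> float:
    """Annual per-capita GDP increase implied by a headline growth rate."""
    return gdp_per_capita * growth_rate

# Figures as quoted in the text above.
ethiopia = per_capita_gain(846, 0.085)      # ~US$72 per person per year
us = per_capita_gain(62_152, 0.022)         # ~US$1,367 per person per year

print(round(ethiopia))        # 72
print(round(us))              # 1367
print(round(us / ethiopia))   # 19  -> the US adds ~19x more per person
print(round(62_152 / 846))    # 73  -> the US base is ~73x larger
```

The point of the exercise: a large percentage on a tiny base is a small absolute gain, which is the comparison the headline growth rates obscure.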

The developed world is growing much, much faster than the Third World. The only exception is China.

Over the past two decades, the low-hanging fruit of economic growth has been plucked in the Third World. South Asia, Southeast Asia, West Asia, Africa, and Latin America are now starting to stagnate. As the tide of economic growth recedes, institutional collapse will become more visible. It will be seen on the streets, as civic strife. What is happening in Venezuela, Syria, Turkey, Nicaragua, Honduras, Pakistan, Congo, and South Africa — where institutions are collapsing, the social fabric is falling apart, and tyranny is raising its ugly head — is not a set of isolated events but part of an evolving Third World pattern. Once these institutions have been destroyed, there will be no going back. They simply cannot be rebuilt.

On a simplistic organizational chart, institutions in the Third World may look the same as they looked when European colonizers departed, but without reliance on the rule of law, respect for individual rights, and a rational approach to problem solving — all foundational concepts propagated by the West. They have been swamped by tribalism, magical thinking, and arbitrary dogmas and rituals.

Without the foundation of rational, critical thinking, formal education merely burdens the mind. The result is that stress among the so-called educated people of the Third World is growing, and no wonder: formal education, unassimilated, can work only in narrow areas, where all you want is cogs that can do repetitive jobs in corner cubicles, without encouragement or reward for creativity. This is not a future-oriented environment but a merely pleasure-centric one, in which people become easy victims of cultural Marxism. Democratic politics has devolved into anti-meritocratic mass rule, destroying any institutions of true self-government.

During my recent visit to Port Moresby in Papua New Guinea, a young Western girl working for a Western embassy told me that she once went out without her security detail. The police stopped her car, and she was fortunate that her security arrived before the police could take her away. The negotiation between police and security was about how much it would take not to rape her. Rape is common in Papua New Guinea, as it is in the rest of the Third World; but because this was a girl working for an embassy, her rapists would have had their bones broken the day after. The trouble is that “the day after” was too far in the future to be of much concern.

When one looks at the world map, one realizes that all colonized countries were created in European boardrooms. There was no country of South Africa, Zimbabwe, Congo, or even India before the arrival of Europeans. The people who now run these countries simply do not have the ability or the impetus to manage such large societies. They have tribal mentalities, unable to process information outside the visible space. The rulers of modern tribes continuously increase the size of their bureaucracies, but this merely creates overcentralization, ossification of institutions, and massive, though unseen, systemic risks. Of course, tribalism is irrational, and internecine rivalry is a fact of existence, experienced only on a moment-to-moment basis.

Before the arrival of the Europeans, most of sub-Saharan Africa had no written language and few tools, contrary to the popular perception of a pre-colonial utopia. Warfare was the order of the day. Eating the flesh and brains of an enemy killed in conflict was practiced from Papua New Guinea to Africa to the Americas. Cannibalism is not unknown even today. Contrary to politically correct versions of history, 19th-century colonization was a massive, sudden improvement for many colonized peoples, and a paradigm-shifting event for the Third World.

Europeans of the 1940s clearly knew that if they left the Third World, entropy would rapidly ensue, the locals would fail to run their countries, and those countries would implode into tribal units. These wouldn’t be self-managed societies that libertarians dream of, but tribal ones afflicted with internecine warfare. That is indeed where the Third World is heading, and much of it has arrived.

Without institutions of liberty and protection of private property, financial and intellectual capital does not accumulate. Indeed, the Third World actively destroys or dissipates any material benefits that accrue to it. This happens through war, overconsumption, expansion of the exploiting (ordinarily the governing) class, and the active destruction of capital that one sees in the crime, vandalism, rioting, and other means of destroying property that characterize the Third World. Despite their extreme possessiveness, people who destroy the capital of other people fail to maintain their own. In many Third World cities, when there is a day of celebration it is easy to assume it is the day when employees got their salaries — which disappear by the next morning, drunk away. Capital fails to be protected or accumulated; the rational structure of a productive, thrifty, and prudent culture is not there.

While people in the West are blamed for being materialistic, Third World people are often much more focused on their possessions. The great fleet of servants in India, who are destined forever to remain servants, may earn a mere $100 or less a month, but must have the latest smartphone. For me it is impossible to comprehend how they pay their rent, buy food, and still have money left to buy a phone; but I remind myself that they actually take out loans to buy smartphones and are forever in debt.

And now — the population problem is becoming worse.

Consider Africa alone. Africa’s population in 1950 represented a mere 10% of the world population. By the end of this century Africa, the poorest continent, is predicted to have at least 40% of the world’s people. Africa’s population is growing at a faster rate now than it was in 1950. And since that rate now operates on a much higher base, Africa adds six times more people each year than it did in 1950.

More important: in Third World countries, population control has mostly happened within the relatively more educated, intellectually sophisticated part of society. In Northern India, to cite another example, the unstable, uneducated, chaotic, and backward part of the population is exploding in size. Southern India, which is relatively stable and better off, is seeing its population fall.

With ease of mobility, segregation is picking up pace. The most economically capable people of the Third World find it much easier to emigrate than to stay home and fight to make society better, or even to maintain it in its current state. In 2017 alone, 12% of Turkish millionaires and 16% of Venezuelan millionaires emigrated. So great has been the emigration from India that it is virtually impossible to find a decent plumber or electrician. Forget about finding a good doctor: in one survey, only 30% of Indian doctors could diagnose a simple ailment. Everywhere, educated people move to the cities, while the rest stay on in rural places. Segregation is real, leaving the underclass with a huge deficit of leaders.

There is also segregation by sector of the economy. As the private sector has evolved in the Third World, government institutions have increasingly become brain-dead, for the best brains now want to work for high salaries in the private sector, leaving state power in the hands of the worst. Naturally, people have become very stressed and unsure. As an emotional escape, superstitious rituals and religious nationalism are increasing exponentially, contributing to the elevation of exploitative, sociopathic elements to positions of power.

It is possible that some parts of the Third World simply cannot be “governed.” A couple of years back I undertook what I consider the most dangerous trip of my life: I went to Goma in the Democratic Republic of Congo (DRC) on my own. Even by DRC standards, Goma is lawless. The Swedish police I was staying with told me one day that a pregnant woman had been raped, her fetus removed, cooked, and fed to other women of the tribe, who had all been raped. Listening to the stories of celebration of such brutalities in the Congo and elsewhere in Africa, I could not help imagining what I would do if I were forced to run the DRC. I could not imagine ever being able to bring it back to relative sanity without imposing the tyranny — for fear is the only restraint available in the absence of reason — for which Leopold II of Belgium is infamous.

This brings us to the terrible predicament of the Third World. Except for China, the countries of the Third World have failed to develop inner competencies and hence internal reasons to accumulate financial and intellectual capital. They have failed to maintain their institutions, which have continued to decay after the departure of European colonizers. The crumbs of economic benefits — the gifts of western technology — have been dissipated. What can be done? How would you deal with the predicament?

There is no hope unless the vast size of the underclass — those who are statistically unable to participate economically, particularly in the age of AI — is reduced. Perhaps payments to people for having children must stop; instead, people should get money not to have children. Even this first step can happen only if Third World institutions are changed and rational leaders are imposed. But who will impose them?

The end result is obvious. With time — slowly, and then rapidly — the Third World will continue to fall apart institutionally. The Third World will implode. These two-thirds of the world’s population will fall into tribes that, being irrational, will have no way to resolve disputes. They will enter a phase of never-ending warfare, with other tribes and within their own. If there is any surplus left, it will be dissipated through population growth and overconsumption. Ahead there is only entropy and a Malthusian future, mimicking the sad Malthusian equilibrium that existed before the colonizers came.






The Grief of the Aggrieved

Diversity — more precisely, the ideology of diversity — has become the most dominant force in America’s institutions of higher learning. It is a massive project, developed over several decades, designed to provide America’s marginalized minorities with educational opportunities previously denied to them by an oppressive white America. Applying diversity principles such as social justice, fairness, and inclusion, along with disparate admission standards and curricula, will — pedagogical elites assert — enrich the education of all students (including the white majority) by preparing them to be better global citizens in an increasingly multicultural world. During four years of embracing one another’s “race, ethnicity, gender, religion, sexuality, language, ability/disability, class or age,” marginalized minority students will achieve academic success; white majority students will reject bigotry; all will learn that what people have in common is more important than their differences. Diversity, therefore, will produce both educational and social benefits.

And grief. Mostly grief, and vast quantities of it. On America’s campuses, the most notable products of diversity doctrine are the diversity czars, who preside over what historian Arthur Schlesinger, in his 1992 book The Disuniting of America, prophetically called “a quarrelsome spatter of enclaves, ghettos and tribes.”

Princeton student groups recently issued a statement condemning “racism, white supremacy, Nazism, anti-Semitism, Islamophobia, ableism, misogyny, homophobia, transphobia, transmisogyny, transmisogynoir, xenophobia, and any oppression of historically marginalized communities” that plagues America and their “white-serving and male-serving institution.” Such behavior, they say, exposes its underserved “students of color, LGBT and non-binary students, women, undocumented students, students with disabilities, and low-income students” to horrific grief.

Princeton is not the only campus to witness such expressions of universal grief. The promotion of diversity has achieved no harmony. Instead, it has perpetuated what Mr. Schlesinger found — in 1992! Aggrieved factions huddle in safe zones and cringe behind Orwellian speech codes, trigger warnings, and behavior intervention teams that protect them from offensive language or the grief of microaggression.

The University of Michigan’s Inclusive Language Campaign includes “insane,” “retarded,” “gay,” “ghetto” and “illegal alien” as offensive terms, since they “offend the mentally ill, the disabled, gays, poor minorities and illegal immigrants, respectively.” “Kinky” is an example of a term that only offends black students. “America is the land of opportunity” is an example of a phrase that offends all students. The phrase “I want to die” is proposed for banning. It offends a new campus identity group (one whose rapid growth in recent years has perhaps been propelled by Diversity’s milieu of depression and anxiety): Suicidal-American students.

But no aspect of American education has experienced more grief than intellectual diversity. Diversity proponents reject intellectual diversity, especially the conservative and libertarian variety. Conservatives and libertarians are virtually absent from administrations and faculties, ensuring that students are not exposed to ideas that might challenge the dogma of social justice. Protests, often violent protests, are reflexively launched against speakers from outside diversity’s intellectual bubble.

Alas, grief has even spread to the bowels of Diversity. According to a recently published study, diversity educators are victims of burnout, compassion fatigue, and racial battle fatigue, inflicted by “the emotional weight” of their jobs. Their “consistent exposure to various microaggressions,” no doubt “from unruly students” aggrieved by juvenile, overbearing diversity policies, is considered to be a form “of assault and torture” — ironically, and deservedly, so.

Imagine a beleaguered diversity educator taking shelter in a campus safe house from a heavy rainstorm. He takes off his jacket as he passes the coloring book and Play-Doh area, and lies down on a nearby couch to relax. He thinks about his officious day of soothing the aggrieved, censoring speech, sniffing out bias, and, in general, carrying out the morass of rules designed to ensure intellectual and social conformity at his institution. “Compassion fatigue” brings sleep, and dreams of his pompous job, of what Tocqueville would have called “soft despotism” — the effort, as he said, to enforce “a network of small, complicated, painstaking, uniform rules through which the most original minds and the most vigorous souls cannot clear a way to surpass the crowd.” He wakes abruptly, snapping upright, quivering in a cold sweat, having mistaken a bolt of thunder for the clash of ideas, and the rush of rain for his dignity swirling around the drain.






Japan: A Love Song

For the past few decades, Japan has been known for its stagnant economy, falling stock market, and, most importantly, its terrible demographics.

For almost three decades, Japan’s GDP growth has mostly been less than 2%; it has been negative in several of those years and has often hovered close to zero. The net result is that its GDP is almost the same as it was 25 years ago.

The stock market index (Nikkei 225), which at the beginning of 1990 stood near its all-time peak of almost 39,000, is now less than half that, despite a 27-year gap. Malinvestments in infrastructure, cross-holding of shares among companies, and the resulting crony capitalism get a lot of the blame for draining away Japan’s competitiveness. Confucian culture is blamed for a lack of creativity and for an environment in which wrongs done by senior officials go unchallenged.

But the real problem of Japan is supposed to be its demographic meltdown. The population is falling and the proportion of old people is increasing. The median age is 46.9 years and increasing, and the elderly dependency ratio is 42.7%. By 2050, Japan’s population is expected to fall to 109 million from the current 127 million, while the dependency ratio will continue to increase.

Major media publish regular reports about the Japanese refusing to have sex, and the large number of people in their forties who are still virgins. The “vagaries” of Japanese sexual life amuse outsiders. Manga (comics) and anime (animation) cater to fantasy by creating virtual worlds. People play pachinko (an arcade game like pinball, also used for gambling) for 18 hours a day. Girls in cute uniforms entice customers into maid-cafes, or perhaps to date joshi kosei (high school) girls. You can pay money to lie on a bed with a girl who does no more than hold your hand. There are vending machines that dispense used panties.

The unemployment rate is a mere 3%, and during my recent visit to Japan most companies told me how extremely difficult it has become for them to find recruits. Japan refuses to admit refugees or migrants, which in today’s world is seen as extremely closed-minded, perhaps even bigoted.

All the above appear in the international media as something very unfavorable about Japan. International organizations beg Japan to listen to tearjerking stories about Syria and Libya, and to show compassion. The Japanese are constantly reminded that if they want their old and infirm people to be looked after, they must allow immigration. While the population of Canada is 21% first-generation immigrant, and Australia 26%, Japan is still 98.5% ethnically Japanese. The two largest ethnic minorities — Korean and Chinese — make up less than 1%. Japan simply does not want outsiders.

When I was doing my MBA in the early 1990s, people looked up to Japan. In retrospect we can see that the country’s economic growth and stock index were peaking. Opinion pieces on the outrageous price of real estate were common. At one point, the assessed value of Tokyo’s Imperial Palace grounds was higher than that of the entire state of California.

In my MBA classes we heard lectures on Kaizen and other Japanese practices, terms that hardly find mention in the media these days. We were constantly reminded of how well the Japanese work in groups, and how this should be implemented in the West.

So which is true? The romanticized portrayal of the ’90s, when Japan was seen as the solution to the world’s problems, or today’s dismal caricature, in which Japan is part laughingstock and part rapidly declining society headed toward self-destruction?

In both cases, in my view, the world has looked for mere rationalizations, rather than dissecting the underlying issues.

I am a huge fan of Japan. In Japan I see the future of humanity. Perhaps Korea and China should be included in that vision of the future. South Koreans and Chinese — who might superficially dislike Japan — have eagerly copied Japanese ways. Japanese products are sold in abundance in East and Southeast Asia. All the way to Malaysia and Singapore, people look to Japan for models, and now increasingly to South Korea, which copied its economic miracle from Japan.

Blaming the Japanese for not being innovative is a distortion of reality. An American geologist with whom I recently spent a couple of days in Japan called the young Japanese “young Einsteins” while showing me an innovative product that a large Japanese company has developed. From factory floors to homes, robots have made huge inroads into Japanese society. They might even nullify the risk that the country will lack workers.

Japan has produced a mind-boggling array of international brands: Toyota, Sony, Citizen, Canon, Hitachi, Komatsu, Nikon, Panasonic, Toshiba, Honda, Seiko . . . The quality, perfection, passion, devotion, and mindfulness that these brands embody are hard to beat. And it’s not just the brands. Quality, cleanliness, and attention to detail are everywhere in Japan. Only a very few countries in Europe enjoy similar levels of devotion to excellence.

Politeness is one of the major pillars of any civilization. It shows respect for other individuals, and it reflects how people live, work, and engage with one another. And Japan is among the politest societies in the world. There are seven possible conjugations for most verbs, depending on how polite the speaker wants to be. I have traveled a lot on Japanese trains, and not once did the person sitting in front of me fail to ask my permission before reclining his seat. They ask despite the ample legroom provided in these trains. When they arrive at their destinations, they always set their seats straight and leave the magazines organized as they found them.

I cannot remember when my train was ever late, even by a minute. In Japan, South Korea, Taiwan, Singapore, Hong Kong, and increasingly in China, even in crowded subways, people mostly do not use the seats at the entrance of the compartments, so that they are always available for pregnant women and the elderly. The seats remain empty because travelers don’t want to embarrass any pregnant women or old people who may arrive later, by vacating the seat in their presence. No one talks on his phone or plays music using a speakerphone. Mostly people don’t even talk. They are at peace even on the subways, their ears unviolated by the noise of others.

I try my best to be polite, but the Japanese beat me every single time. One must try to understand the mind and heart that they put into their work, and how they respect their clients. By presenting this kind of model, Japan has exported, free of charge, its civilizing culture to any society that is prepared to learn from it.

Japan was almost completely destroyed in World War II, and rose from the ashes through sheer willpower. It is a country whose heartfelt honesty, respect, and integrity I am in love with.

A few months after the tsunami of 2011, I visited the area around the town of Sendai, which had been devastated. There had been no — zero — rioting or robbery. People hadn’t begged the government for help; within months they had fixed up the place themselves. Piles and piles of crushed cars stood in neat heaps. The places where houses once stood had been cleaned up. Roads had been constructed so that a new city could grow up around them. Only someone without a heart could have kept from crying to see what a group of proud people can achieve.

Throughout the world, many groups complain about the historical injustices that “they” (actually their ancestors) faced. In 1945, Japan stood extremely humiliated and virtually destroyed. But ask Japanese about their sufferings of those days, and you will very likely get a blank stare. Proud people do not blame their past for their present.

Japan is still 98.5% Japanese. Is that inward-looking and racist? Maybe that is the wrong question. Multi-ethnic societies have worked virtually nowhere in the world. People who arrived in Europe as long as 1,400 years ago — Romani gypsies — are still a separate community. As a group, they are not only unassimilated; they haven’t integrated with the mainstream ways of life. People tend to get ghettoized on racial, religious, or linguistic lines. That has been the history of North America, Europe, and other parts of the world. Japan has avoided all of the associated social problems — including that of crime and terrorism — that today afflict the developed world.

Crime is virtually unknown in Japan. No one locks his bicycle, and people often leave their belongings — including purses — unattended. Late at night, young women can walk the streets alone, unaccosted, even in the areas controlled by Yakuza (Japanese mafia). Six-year-old kids can be seen crossing the road all alone.

Japanese bureaucracy is believed to be slow and an impediment to innovation. It is hard to measure how much more bureaucratic Japan is than other developed nations, but the Economist’s crony-capitalism index ranks Japan — again, quite contrary to popular belief — better than the USA and the UK.

Is it at all possible that a counterfactual narrative was constructed by the leftist social justice warriors who control the media, to pressure Japan into doing the bidding of pro-multicultural, pro-diversity international organizations?

An outsider does react with shock to some of the images of anime and manga, and the idea of buying used schoolgirls' panties in vending machines. But the reality is that sexual perversion is not unique to Japan. In the West the law is so strict that a lot of perversion remains hidden. But one does get a glimpse of what so many western men look for when they go to Thailand and surrounding countries, and to Latin America.

What I find impressive is that what Japan does is right in your face — Japan is like the Amsterdam of Asia.

Forty-two percent of men and 44.2% of women between the ages of 18 and 34 are said to be virgins — a statistic one often reads in the international media. But this statistic pools together a broad band of ages. There is nothing unusual — or even wrong — about 18-year-olds being virgins.

Another often-quoted number is that one out of four Japanese over the age of 30 is still a virgin. This is wrong, for the data apply only to unmarried people, yet the word “unmarried” is often left out. Eighty-six percent of men and 89% of women eventually marry, so only about 14% of men (and 11% of women) remain unmarried; one-quarter of that unmarried fraction comes to roughly 3.5%. The correct estimate of virgin Japanese over the age of 30, then, is less than 4% — far less than the media would have you believe.

Are single mothers and promiscuity really the metric of a better society? Western media seem to suggest that they are. There is indeed a correlation between conscientiousness and shyness in sexual matters. Only 2% of Japanese children are born outside marriage, compared with 40% in the UK and the US. This is to be celebrated, not ridiculed.

There is really not much about Japan’s demographics that is abnormal. The country’s native birth rate compares well with that of other wealthy economies. There is indeed a problem in that the Japanese live longer, surviving further into their unproductive years than people elsewhere — hence the high and growing dependency ratio. But this is a problem of success, not of failure.

I cannot but wonder whether Japan is demonized for refusing to promote immigration or promiscuity. In my view it is perhaps the best large country in the developed world — for exactly the reasons that, ironically, it is demonized. My Japanese friends tell me about the inhibitions that kids develop under a very strict social structure, but for me as an outsider — a gaijin, literally “not one of us” — it is hard to understand Japan’s social dynamics completely. Japan indeed has its problems, but they are far outweighed by the great goodness of the place. It is one of humanity’s finest accomplishments, which should be celebrated not just by the Japanese but by everyone.





© Copyright 2019 Liberty Foundation. All rights reserved.



Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.