The Last Cargo


The launch last month of “The 1619 Project” by The New York Times unleashed a barrage of partisan volleys and countervolleys consisting mostly of debatable claims, finger-pointing, and innuendo. It’s predictable for the polarized times we live in. Some of the Democratic presidential candidates are calling for reparations for the nation’s original sin of slavery (were the 360,000 Union deaths not enough?); and we’ve elected a president whom many consider racist — an accusation resorted to glibly and promiscuously, the word itself so broadly defined and overused that it’s become as meaningless as the word “love.”

The NYT presents the project as an appropriate bookend to the 400th anniversary of the White Lion, the first slave ship to arrive in the continental US. But perhaps a more apt bookend might be the arrival of the last slave ship, the schooner Clotilda, in 1859 (or 1860, according to some sources), 240 years after the White Lion docked at Point Comfort in Virginia.

Although the transatlantic slave trade had begun in the early 1500s, with destinations in the Caribbean and Brazil, the 20-odd Angolans aboard the White Lion — taken against their will — were classified as indentured servants, and under that classification some of them later acquired their freedom. Children of those Africans who did end up as slaves were born free — according to the laws of that time.

It’s taken quite a while for the Clotilda’s story to air. Zora Neale Hurston (1891–1960), whose four grandparents had all been slaves, interviewed Cudjo Kossola Lewis, the second-to-last survivor of the Clotilda, in 1927. She’d trained as an anthropologist under the tutelage of Franz Boas, considered the “Father of American Anthropology,” and this was her first serious project.

Boas had introduced and firmly established the concept of cultural relativism as an investigative axiom: “a person's beliefs, values, and practices should be understood based on that person's own culture, rather than be judged against the criteria of another.” As a field tool, the concept allowed Hurston to present Kossola’s narrative more objectively, through his eyes. She’d attended Howard, Barnard, and Columbia with classmates Ruth Benedict and Margaret Mead (who got a bit creatively carried away with the cultural relativism bit in the South Pacific).

Barracoon: The Story of the Last “Black Cargo,” the book that resulted from Hurston’s interviews with Kossola, remained unpublished until May 2018. Back in 1931, Viking Press rejected it. They would only accept the manuscript if Hurston rewrote Kossola’s vernacular into standard English. They had a point (though not the one I want to make right here). Hurston’s transcription of Kossola’s dialect is inconsistent, slaloming between her efforts at accurate transcription (which she thought essential) and reversion to more conventional English when the task became overwhelming. Additionally, according to novelist Alice Walker’s foreword in the book, “There was concern among ‘black intellectuals and political leaders’ that the book laid uncomfortably bare Africans’ involvement in the slave trade.”

And that’s not the only inconvenient truth buried in Barracoon. Hurston, a black female anthropologist, was an independent thinker. She opposed school integration and programs that guaranteed blacks a right to work. In 1955 she claimed that "adequate Negro schools" already existed. (See John M. Eriksen, Brevard County, Florida: A Short History to 1955, chapter 13; and "Negro Writer Opposes Court Ruling,” Titusville Star Advocate, September 30, 1955, p. 2.) And she was a Republican during the New Deal.

Although John McWhorter, a linguist and Associate Professor of English and Comparative Literature at Columbia, has called Hurston "America's favorite black conservative," she’s been more properly characterized as a libertarian by David and Linda Beito ("Isabel Paterson, Rose Wilder Lane, and Zora Neale Hurston on War, Race, the State, and Liberty," Independent Review 12, Spring 2008). She was no social conservative and, in foreign policy, was a noninterventionist. And then there are the controversial watermelons.

Kossola lived life to his own rhythms, tending his gardens and staying active in his church. To ingratiate herself with him and unlock a volubility concealed behind an apparent reticence, Hurston would bring peaches, hams, and watermelons as gifts. Once they shared an entire iced watermelon, gnawed down to the rind, taking up all their allotted interview time but unlocking a trust and warmth that sealed a lasting friendship. Whether the stereotype associating blacks with a taste for watermelon already existed is a question best left to pop historians. But Alice Walker in her foreword to Barracoon again picks up a racial trope: “Imagine how many generations of black people would never admit to eating watermelon!”

* * *

Kossola, an Isha Yoruba, was captured in a slave raid by the army of King Glélé of Dahomey (in present-day Benin), which consisted of about 7,000 male and 5,000 female warriors — the renowned Dahomey Amazons. He was 19 and engaged to be married. His village was stormed by the Amazons, all belted with the dangling heads of opponents killed in battle. Kossola reported that they were the equal of any man. The old and infirm were decapitated on the spot (so much for the nurturing nature of the gentler sex). Meanwhile, the male Dahomey warriors were stationed at the gate posts to ambush and capture the fleeing villagers, Kossola among them.

After about four weeks in transit, three spent in a barracoon — a holding cell for newly-captured slaves — the captives were treated to a big feast by their captors: “the people of Dahomey come bring us lot of grub for us to eatee ’cause dey say we goin’ leave dere. We eatee de big feast,” recalled Kossola.

Captain William Foster, owner, builder, and skipper of the Clotilda — which was anchored outside the surf zone (the port of Whydah lacking any docking facilities) — purchased 130 of the captives. He chose equal numbers of males and females. Although offered, he “preemptorily” [sic] forbade their branding. Each captive cost $50 to $60 on the coast but could be sold in Alabama for about $800 apiece (nearly $23,000 in today’s money).

Foster’s fellow investors in this slaving venture consisted of the brothers Jim, Tim, and Burns Meaher from Maine. The Captain carried $9,000 in gold. The Clotilda was manned by a crew of 12 (all Yankees), including him. Although the importation of slaves had been illegal since 1808, the Meahers, who owned a mill and shipyard, built swift vessels for blockade running and “filibustering expeditions.” At the time, smuggling slaves from Cuba was common practice.

Transporting and loading the captives into the Clotilda through the heavy Atlantic surf required the services of skilled men of the Kroo tribe, an ethnic group of independent operators who specialized in negotiating breakers with sleek surf boats. Their skills as mariners were so expert that the Royal Navy enlisted many of them from 1820 to as late as 1924. But Kroo canoes had limited capacities. Kossola, naked and terrified, thought he’d breathed his last. He was the last captive loaded onto the Clotilda.

After 116 of the slaves had been brought on board, Foster became aware of possible treachery: the Dahomans were planning to recapture the cargo and hold him hostage. He immediately gave orders to abandon the cargo not already on board “and to sail away with all speed.”

The Clotilda got away, but the next day was chased by an English cruiser on the lookout for slavers. Foster escaped by pressing sail. The slaves down in the hold were in cramped conditions (although they had much more space — five feet of headroom — than many slaves in previous Middle Passage transports). After being kept below decks for 12 days, mainly because of real or false alarms, they were brought on deck so they might limber up. The captain ordered the crew to help them walk and exercise. For the rest of the passage — except for the twentieth day, when another British cruiser was spotted, and near the end, on the approach to Mobile — they spent most of their time on deck: 116 slaves to 12 crew. Only two died. The crossing took 70 days.

According to Hurston, the Clotilda arrived in Mobile Bay under cover of darkness in August 1859 (other sources say July 9, 1860). Fear of discovery and prosecution, and the fact that blacks illegally brought in could not be enslaved, made their sale problematic. In fact, Foster and the Meahers were later tried in federal court in Mobile, though not convicted, for lack of evidence: the Clotilda and its manifest had been burned and sunk, the black captives well hidden. Other sources say they were found guilty and charged heavy fines, which were never paid. The outbreak of the Civil War prevented further pursuit of the case.

Forty-eight of the slaves were secretly sold. The remaining 60 (according to one of the discrepant sources) were divvied up among the principals: James Meaher took 32 (16 couples), Burns Meaher took ten, Tim Meaher eight, and Captain Foster ten. Kossola went to Jim Meaher, where he acquired the name Cudjo Lewis. “Cudjo” was a name given by the Akan people of Ghana to children born on a Monday, while “Lewis” is probably a corruption of Kossola’s father’s name, Oluale, which Meaher had difficulty pronouncing.

Cudjo reported that they were not immediately put to work; first they were trained by American slaves, who ridiculed the Africans for their ignorance and “savage” ways. He became a stevedore loading wood for Jim Meaher’s cargo boats on the Mobile to Montgomery run. He was worked hard, but praised his master for taking good care of his slaves:

Cap’n Jim, he a good man. He not lak his brother, Cap’n Tim. He doan want his folks knock and beat all de time. He see my shoes gittee ragedy, you know, and he say, ‘Cudjo, if dat de best shoes you got, I gittee you some mo’!’ Now das right. I no tellee lies.

“Cap’n” Tim’s brother Burns was also cruel, but his cruelty seemed to have its limits. Cudjo reported that their slaves worked the brothers’ plantation fields:

Dey got overseer wid de whip. One man try whippee one my country women and dey all jump on him and takee de whip ‘way from him and lashee him wid it. He doan never try whip African women no mo’.

This astonishing account of slave resistance without repercussions is reminiscent of a similar incident reported by Frederick Douglass in his autobiography, Narrative of the Life of Frederick Douglass, an American Slave.

* * *

Douglass was a proud, headstrong man. Like my brother John, who was drafted for the Korean War — and looking forward to proudly serving his country, only to be discharged for being “temperamentally unsuited to taking orders” — Douglass was not cut out for servitude, though instead of “proudly serving his master,” he just complied with performing his duties . . . as long as he was treated with respect.

As a boy, Douglass had not only been treated well, he’d been taught to read and write. But he ended up with a master with whom he had “quite a number of differences.” During the nine months he spent with Master Thomas, “he had given me a number of severe whippings, all to no good purpose.” So Thomas decided to send Douglass to Edward Covey, a man who, for a price, specialized in breaking recalcitrant slaves.

Covey set about the task with alacrity, putting Douglass to brutal work in the fields: “During the first six months, scarce a week passed without his whipping me.” He was treated so brutally that he admitted that at one point “Mr. Covey succeeded in breaking me.” But this was not to be permanent.

After a series of particularly brutal beatings, Douglass decided — to his own surprise — to fight back . . . come what may. During a two-hour tussle, Douglass “drew blood” and got, by far, the better of the encounter. Covey retreated. “The whole six months afterward, he never laid the weight of his finger upon me in anger. This battle with Mr. Covey was the turning point in my career as a slave.” From then on, civility — or what passes for civility in a master-slave relationship — was the order of the day. Douglass performed his duties; Covey let him be.

Douglass’ analysis of the event demonstrates the intelligence and wisdom of this young man. Covey could only go so far. Slaves were extremely valuable property: to render one unfit for service was financial suicide — not to mention that Douglass wasn’t his slave (or that Douglass had seriously beaten Covey).

Mr. Covey enjoyed the most unbounded reputation for being a first rate . . . negro-breaker. It was of considerable importance to him. That reputation was at stake; and had he sent me — a boy about sixteen years old — to the public whipping post, his reputation would have been lost; so, to save his reputation, he suffered me to go unpunished . . . I was nothing before; I was a MAN NOW.

The incident was so pivotal to his life that Douglass filled 11 pages of his first book on it, and 32 in the subsequent autobiography. In contrast (and I digress here somewhat), David W. Blight in his Pulitzer Prize-winning biography, Frederick Douglass: Prophet of Freedom, gives it only two pages and ignores Douglass’ analysis and insights.

I suspect an ideological bias, such as surfaces in his introduction:

[Douglass was] a proponent of classic nineteenth-century political liberalism . . . he strongly believed in self-reliance . . . but fundamentally was not a self-made man.

Let’s take a closer look at these assertions. Inserting “nineteenth-century” between “classic” and “liberalism” implies, to these libertarian sensibilities, that classic liberalism was an outdated, even discarded philosophy. But nothing could be further from the truth. Classic liberalism is alive and thriving today. And to say that Frederick Douglass, the epitome of a self-made man, was not a self-made man is to contradict all the evidence contained in Blight’s flawed tome. In at least a dozen instances in the book — instances of Douglass solving problems, escaping bondage, rising to the occasion, creating opportunities, helping others — Blight is unambiguously forced to aver that Douglass was in fact a “self-made man,” using those exact same words (p. xv).

Douglass’ examination of Mr. Covey’s behavior is a classic liberal analysis of conduct based on economic self-interest, a perspective that Blight either refuses to acknowledge or completely ignores. It does not fit his worldview, and he refuses to give it air time — in spite of the fact that Douglass’ analysis of the event was a formative experience in his life.

Blight reveals his biases more artlessly whenever he mentions Republicans — never mind that, for abolitionists, Republicans were the only game in town. About the 2013 unveiling of a statue of Douglass in Washington DC, Blight’s introduction condescendingly observes:

Congressional Republicans walked around proudly sporting large buttons that read FREDERICK DOUGLASS WAS A REPUBLICAN. Douglass descendants present, as well as some of us scholars with, shall we say, different training and research, smiled and endured.

Yes . . . that was in 2013. But Blight can’t help projecting modern biases into the past, through subtle wording and innuendo throughout the book, especially when Douglass becomes active in Republican Party politics. This is but one reason why the book was a chore to get through. Blight is no Ron Chernow or Robert Caro.

* * *

But back to Cudjo Kossola Lewis. The Africans were unaware of the start of the Civil War, but when the Union blockade and the surrounding fighting made food scarce, “Cap’n Jim Meaher send word he doan want us to starve, you unnerstand me, so he tell us to kill hogs. He say de hogs dey his and we his, and he doan wantee no dead folks.”

On April 12, 1865, only three days after Robert E. Lee’s surrender at Appomattox, and after five or six (some sources say four) years of Cudjo’s life as a slave in America, Union soldiers told him he was free. The Africans celebrated by making drums and beating them “lak in de Affica soil.” Their first inclination was to return to Africa: “dey [the Meahers and Foster] ought take us back home.”

When they discovered the cost of such an improbable venture, they nonetheless worked hard and saved their money. But finally deciding that going back to Africa was unrealistic, they deputized Cudjo to approach the Meahers for land to settle on. Tim, the meaner of the Meahers, jumped to his feet and responded, “Fool, do you think I goin’ give you property on top of property? I tookee good keer my slaves in slavery and derefo’ I doan owe dem nothing. You doan belong to me now, why must I give you my lan’?”

Notwithstanding Tim’s rebuff, James, the kinder and gentler Meaher, might have helped finalize the deal. The Africans bought Meaher land three miles north of Mobile at Magazine Point, establishing a settlement they called Africatown — now known as Plateau — in 1866 (the date Hurston provides; according to Sylviane A. Diouf, in the Encyclopedia of Alabama, Cudjo bought two acres on September 30, 1872 for $100 — or about $2,000 today).

Cudjo became a naturalized American citizen, married, had six children, and became sexton of his church. In 1902, while driving his buggy over train tracks, he was hit by a train and injured. A sympathetic white lady who saw the accident ensured he was well taken care of and told him he had a case against the railroad. Cudjo knew nothing about American law. The lady hooked him up with a lawyer who took his case against the Louisville and Nashville Railroad on contingency. Cudjo won and was awarded $650.

But he never collected. Cudjo reported that after the verdict, a yellow fever epidemic hit Mobile. The lawyer and his family headed north to safety, but on the way the lawyer died. Yet another source (Encyclopedia of Alabama) says that the verdict was overturned on appeal.

Cudjo Kossola Lewis died on July 17, 1935.

* * *

In these times of “fake news,” the publication of Barracoon — finally — should be a breath of fresh air. I say this notwithstanding the fact that while writing this review I discovered so many discrepancies in the account that I’m left wondering how to account for them: a year’s difference in the arrival of the Clotilda in Mobile; the number of years Cudjo spent in bondage; the resolution of Cudjo’s lawsuit; Hurston’s purported plagiarism from earlier sources; and other, more minor controversies. They seem to be endemic to the genre.

Whatever the causes of the discrepancies in Kossola’s story, at least they don’t seem to be rooted in ideological manipulation — a shortcoming that has bedeviled American slave narratives since at least the times of William Lloyd Garrison. Antebellum abolitionists resorted to widespread hyperbole concerning the horrors of slavery in order to convince an ill-informed and often indifferent public.

Yes, I know, you’re thinking, How can one overstate the evils of slavery? It’s like exaggerating the fires of hell. I don’t know about you, but accuracy works best to convince me about anything. When people resort to lies, or just don’t check their facts well enough, I lose trust, no matter how well-intentioned the narrative may be.

The altering of facts continues to this day. The movie 12 Years a Slave, based on the book by the same title (and reviewed in these pages by Jo Ann Skousen, “A Slave Narrative, and More,” November 10, 2013), contains at least four falsifications, all of which are ideologically based. Jo Ann points one out:

Some of the vignettes simply don’t ring true, as when the lecherous and sadistic slave owner, Edwin Epps (Michael Fassbender) whips Patsey (Lupita Nyong’o) almost to death because she has spoken back to him. Patsey is his most productive slave. She picks twice as much cotton every day as any of the men do. She is a valuable, unblemished piece of property, even if he doesn’t acknowledge her humanity. It does not make sense that he would destroy such a valuable capital good in a fit of pique.

The movie depicts the slave owner William Ford (played by Benedict Cumberbatch) in quite another light from the way Northup, the slave (played by Chiwetel Ejiofor), described him in his book: “There never was a more kind, noble, candid, Christian man than William Ford.”

Falsifications like the one Skousen points out are particularly egregious. Not only do they go against basic economic theory but they paint human nature in the worst light possible.

This, from The Atlantic:

In the film version, shortly after Northup is kidnapped, he is on a ship bound south. A sailor enters the hold and is about to rape one of the slave women when a male slave intervenes. The sailor unhesitatingly stabs and kills him. This seems unlikely on its face — slaves are valuable, and the sailor is not the owner. And, sure enough, the scene is not in the book. A slave did die on the trip south, but from smallpox, rather than from stabbing.

But the worst one, which I haven’t seen referenced, was a passage in the book where Northup is sent on an errand that requires crossing a gator-infested bayou. Along the way, he encounters an alligator, and sweats bullets. In the movie the scene is changed. Instead of an alligator, he encounters two rednecks whooping it up hanging a black.


The heyday of lynching blacks was after the Civil War, not the 1840s, when slaves were worth about $23,000, on average, in today’s money. And though crackers were the foot soldiers of the Ku Klux Klan during and after Reconstruction, only the well-off could afford to own slaves before the war, and they weren’t likely to burn $23,000 for fun.

Barracoon discloses some inconvenient truths, and in doing so, to my mind, enhances the credibility of the horrors of slavery by revealing not just its inhumanity but the glimpses of humanity that at times appeared. Caricatures and satire only succeed with the ignorant and the convinced.

And instances of behavior that temper the conventional narrative of slave societies run through many slave biographies. Besides Douglass’ and Northup’s (dictated, despite Northup’s literacy, to David Wilson), check out The Life of Olaudah Equiano; Prince Among Slaves; Incidents in the Life of a Slave Girl; The History of Mary Prince; and The Barber of Natchez (a free black in 1830s Mississippi).

Perhaps it’s our knowledge of the Holocaust that makes some of us project its atrocities back onto our slavery era. I don’t know. But for now, let’s keep the two separate and not make too many generalizations about universal human behavior. Truth is the best antidote to propaganda, however well-intentioned.

Editor's Note: Review of “Barracoon: The Story of the Last ‘Black Cargo,’” by Zora Neale Hurston. Amistad, 2018, 171 pages.

Joker: Nothing But Scary PR?


In 1960 Alfred Hitchcock created a trailer for Psycho unlike any other. Instead of editing together a composite of actual scenes from the movie, he took audiences on a six-minute tour of the Bates Motel and mansion, telling them where certain murders would take place — including the famous “ba-a-a-athroom” — without revealing who would be killed, or by whom. He also warned audiences that they would not be admitted to the theater after the film had begun, a revolutionary concept in an era when it was common to enter a theater whenever you happened to arrive and then stay through until it had looped back to your personal starting point — when you would get up and leave, often uttering the phrase, “This is where I came in.” Some theaters took Hitch’s kitschy trailer one step further, assuring audiences that medical personnel would be on the premises to treat the fainthearted.

I was reminded of this innovative marketing plan during the week before Joker opened, when somber-faced newscasters offered advice to those who planned to see it: “Look around for people who might be in the theater alone”; “Have a plan if the theater is attacked”; and “Always know where two exits are located.” Despite their somber faces I had to wonder — what’s their motive here? Was it just helpful advice? Or was there more? Did I detect a tinge of hope that a big news story was on the horizon, a shooting of hurricane proportions? Or was the hype part of the marketing scheme, focused more on helping the advertisers than the viewers? Certainly I sensed a bit of hypocrisy from an industry that calls for gun control in its political posturing while producing films full of violence.

The news hype led me to wonder whether I should risk the copycats and wannabes who might be goaded into taking a gun into a packed theater. Did I feel lucky? Well, did I, punk? I also expected a film full of torture and gore, based on the warnings, which made me wary. In addition, fans carped about the audacity of creating a sympathetic backstory for Batman’s most famous nemesis, a psychologically twisted character steeped in pure evil. Meanwhile, the New York Times wondered in its review what all the fuss was about. I decided to find out for myself.

It’s a terrific movie, and would be so whether or not it was a backstory for a Batman character; the Batman references that weave in and out of the story are surprising and satisfying but not necessary. The movie could stand on its own as a film tracing the dark psychological journey of a man struggling to find happiness and acceptance while dealing with psychiatric issues stemming from a tortured childhood. It’s the sympathetic victimization of the Joker that troubles most fans and many critics, yet it’s what makes the character so fascinating. In his journal the Joker writes, “The worst part about having a mental illness is that people expect you to behave as if you don’t.” That’s probably true, and most of us are guilty of having had that expectation.

The film is set in 1970s Manhattan — er, I mean Gotham — when Times Square was home to derelicts, gangs, and XXX peepshows. A Giuliani-esque voice excoriating the filth and promising to clean it up is heard on the radio as the film opens. Little does he know the filth that is building up in one of his citizens.

Before becoming the Joker, Arthur Fleck (Joaquin Phoenix) is a wannabe comedian who works as a clown-for-hire, writes potential jokes in the journal his psychiatrist has prescribed as therapy, and lives with his mother, Penny Fleck (Frances Conroy) — dotes on her, really. He washes her hair while she’s taking a bath, smiles when she tells him to “put on a happy face,” and crawls into bed with her to watch “The Murray Franklin Show” — a program starring Robert De Niro and based not-so-loosely on “The Tonight Show with Johnny Carson.” The distinctive Ed McMahon chuckle of Murray’s sidekick on the couch is a subtle contrast to Arthur’s uncontrollable stress-induced laughter, a malady akin to Tourette’s Syndrome. In a later scene he curls up beside his mother’s pillow à la “A Rose for Emily” (a scene also alluded to in the Psycho trailer). Something is not quite right in the Fleck home. The Psycho connection I noted before seeing the movie turns out to be fairly apt, and not just for the marketing scheme. A bit of Marnie enters into this story as well.

Less bloody than any Tarantino flick, Joker has a way of creating suspense that is more akin to Hitchcock than to a slasher or gangster film. In fact, much of the killing takes place off screen, leaving the viewer to wonder what actually happened in an eerie, “surely he didn’t . . . ?” kind of hopefulness. The soundtrack, heavy on deep bass bowing and street percussion, and the lighting, full of flickering shadows, are also reminiscent of Hitchcock’s style. When it does happen, the onscreen violence is shockingly quick and bloody, but not gruesome. Very crazy, and very effective.

What makes this film work is its star. Joaquin Phoenix reaches deep into the quirks and self-deceptions of a man both victimized and victimizer, a man who laughs because he’s crying. He deliberately avoided portraying the symptoms of any single disability because he didn’t want audiences to smugly diagnose the Joker and thus think they understand him. Phoenix said in an interview, “I was never certain what was motivating him. I have my own opinion. I think I know what it is for me. But I wouldn't want to impose on anyone who hasn't seen the movie." This makes his character utterly unpredictable and devastatingly dangerous.

Artie’s killing sprees are often followed by an oddly erotic celebratory dance made more macabre by Phoenix’s 52-pound weight loss to prepare for the role. Phoenix said of his extreme dieting, “What I didn't anticipate was this feeling of kind of fluidity that I felt physically. I felt like I could move my body in ways that I hadn't been able to before. And I think that really lent itself to some of the physical movement that started to emerge as an important part of the character." That fluidity of motion extends to a fluidity of character, moving between pathos and demonic psychosis.

So what is Joker’s own backstory? I won’t reveal too much, but I will say that a couple of moments made me gasp with surprise. Like many kids with tortured backgrounds, Artie is isolated, lonely, and frustrated. He’s picked on, bullied, beaten, and laughed at. His dream of becoming a standup comedian is thwarted by his handicap of uncontrollable laughter that worsens with the adrenaline that comes from facing an audience.

Moreover, his therapist is pretty useless. She provides pills when he asks for them and recommends he keep a journal. She asks him questions, but seems not to listen to the answers. He wants help, but he isn’t getting much of it. And what little he is getting comes to an end when funding is cut by the mayor (yes, we can blame the government for creating the Joker). This too feels like a warning about the dangers lurking in school hallways today, where troubled kids are allowed to fester without help until they erupt with a cascade of gunfire. Zazie Beetz, who plays Artie’s love interest, rejected the idea of the Joker as a sympathetic character but told an interviewer, “It’s kind of an empathy toward isolation, and an empathy towards what is our duty as a society to address people who slip through the cracks in a way. There is a lot of culture of that right now. So is it empathy for that or just an observation on personalities who struggle?”

Good question. Should we empathize with a school shooter, if we discover that he had a tortured youth? Does victimhood give the criminal a pass? We are definitely made to feel sorry for Artie, and thus to understand his motivation for killing, even as we are horrified by what he does. For that matter, is it fair to see Bruce Wayne as a hero and Artie Fleck as a villain, when both are driven by a desire for justice and revenge? This is where Arthur Fleck departs from Batman’s Joker. Joker is amoral, detached, cold, and brilliant. Arthur is all emotion, his tormented laughter coming from a place of deep personal pain. The Joker is heartless; Artie is all heart. The result is a fascinating case study of a psycho.

Editor's Note: Review of "Joker," directed by Todd Phillips. Warner Brothers, 2019, 122 minutes.


After-Death Experience


Judy Garland appeared in 40 movies, earned a Juvenile Academy Award, two additional Oscar nominations, a Golden Globe, a Special Tony Award, and was the first woman to win a Grammy for Album of the Year. She was the youngest entertainer and first woman to receive the Cecil B. DeMille Award for lifetime achievement in the film industry — when she was just 39 years old. She was such a professional that, during her heyday, she could deliver dialogue, lyrics, and choreography in just one take. She starred regularly opposite Gene Kelly, Fred Astaire, and, of course, Mickey Rooney. She is remembered as one of the greatest entertainers of the 20th century; in fact, the American Film Institute lists her eighth among the greatest female stars of the Golden Age of cinema. Camille Paglia wrote in 1998 that Garland “makes our current crop of pop stars look lightweight and evanescent.” James Mason, her costar in A Star is Born, said in his eulogy at her funeral, “Judy’s great gift was that she could wring tears out of hearts of rock.”

None of this illustrious career appears in Judy, the new film based loosely on Peter Quilter’s play “End of the Rainbow” and even more loosely on Garland’s life. Just as the producers of The Iron Lady chose to focus on the sad end of Margaret Thatcher’s life after dementia had set in, director Rupert Goold focuses on the humiliating end of Garland’s career. Moreover, Judy takes more liberties than a drunken sailor in playing fast and loose with the facts. Events are combined and reordered, characters are condensed or eliminated, and her final husband, Mickey Deans (Finn Wittrock), leaves her several months before her death, when in fact they had been married only three months when she died, and he was the one who found her that day.

James Mason, her costar in A Star is Born, said in his eulogy at her funeral, “Judy’s great gift was that she could wring tears out of hearts of rock.”

The result is more an artistic impression of Garland than a documentary of the end of her life. It’s far from accurate, and far from complete. Liza Minnelli posted on her Facebook page that she has never met Renee Zellweger, who plays Garland, and that she did not sanction the film.

Nevertheless, it is a terrific movie that finds its theme and its purpose in the end, and Zellweger is surprisingly good in the role, from her sideswept haircut to her sideways smile to her fragile insecurities. Occasionally she even achieves that searching depth of Garland’s luminous brown eyes. And her voice is good — not Garland good, but good enough that I searched the credits to see who had dubbed her voice. (Zellweger actually did her own singing, after a year of training.)

The picture opens with Garland hawking her young children Lorna and Joe Luft onstage, much as her own mother had hawked her as a child actress. She is penniless, homeless (thanks to the IRS), fragile, and humiliated. Desperate to earn enough money to buy a home where she can maintain custody of her children, she accepts a contract to perform in London, where her insecurity about her voice often leads to disaster. Slugging down pills with booze, she staggers to the microphone, performs (sometimes well, sometimes not so well), and staggers back off. As she pops pills or eschews food or experiences insecurity, the scene flashes back to the young Judy (Darci Shaw) filming The Wizard of Oz, with Louis B. Mayer (Richard Cordery) browbeating her and groping her, and with her mother Ethel Gumm (Natasha Powell) urging her to swallow amphetamines “to take the edge off” her appetite, followed by barbiturates to help her sleep. And Judy complies, because that’s her job. We get it: once she became Judy Garland, little Frances Gumm never had a chance at happiness. According to the movie, the booze and pills were not her fault.

What they do next is a hint of the life that Judy will later have, living on as an icon of gay culture.

Jessie Buckley is terrific as Rosalyn Wilder, the stoic, inventive, and eventually sympathetic young theater publicist assigned to make sure Judy is happy, sober, dressed, and ready for each performance. Rufus Sewell is heartless but pragmatic as ex-husband Syd Luft, bent on providing a stable home for their children, even if it means breaking Judy’s heart and spirit. And while Zellweger doesn’t quite channel Garland (as though anyone could) she delivers a fine performance that could well garner an Oscar nod come January. Kudos as well to the set and costume designers for providing well-coordinated splashes of color throughout the film, a deliberate reminder perhaps that Garland was known as the queen of Technicolor (or perhaps just a happy serendipity).

Garland died of an overdose in 1969, unaware of the remarkable life after death that awaited her. Despite the sadness and tragedy of her final years, the film ends on a hopeful, almost joyful note, hinting at her rebirth. Two fans befriend her at the stage door one night and invite her to their flat for supper. They’re gay, and one of them has spent six months in jail for it. Little is spoken, but Judy expresses her empathy and support with gesture and song as they sing together at the piano. It is very intimate, and very touching. The two men are in the audience again at her final concert when her voice falters while she sings “Rainbow.” What they do next is a hint of the life that she will later have, living on as an icon of gay culture. The unity is simply astonishing, and brought many in the movie audience to tears. If establishing this future connection was Goold’s intent in making the movie, he succeeded admirably.

On a recent cruise I joined the nightly standing-room throng to hear Perry, the ship’s wonderfully irrepressible, sequin-clad piano lounge singer. His between-song patter was delightful and often began with an exaggerated, “Who here doesn’t remember 1947 when Judy . . .” or “Who here doesn’t think Judy should have played the role of . . .” or “Who here doesn’t listen daily to Judy’s double album . . .” etc. If, as Abe says in Simon Stephens and Nick Payne’s “Sea Wall / A Life,” we live until the last person utters our name, Judy will never die.

Editor's Note: Review of "Judy," directed by Rupert Goold. BBC Films, 2019, 118 minutes.


Divulged and Then Forgotten


You remember the Katharine Gun story, right? The British “Ed Snowden” who leaked a damning National Security Agency email that urged wiretaps and extortion in order to influence the UN vote in favor of invading Iraq, back in 2003? And you remember the British “Neil Sheehan,” Martin Bright, who got hold of the document and published it on the front page of the Observer in early March of that year? Surely Gun went to prison and Bright won a Pulitzer, right? Together they prevented the war in Iraq? No? You don’t remember?

Well, that’s because only the first half of the above scenario actually happened. Gun did leak the document, and the Observer did run Bright’s story on its front page, on March 2, 2003. All hell should have broken loose, and support for the war, already shaky in some quarters, should have ended. Nevertheless, three weeks later George Bush began the Shock and Awe bombing of Iraq. The article, though itself shocking and awful, had little effect, for reasons that are made clear in the movie Official Secrets (and would spoil the experience for you if I revealed them here).

Full disclosure: I wasn’t entirely against the war when it started. I was living in New York when the Towers were hit. I listened as emergency vehicles screamed their way down Broadway that day. Comforted my daughter when she woke up with nightmares that week. Had nightmares myself when the Metro North trains chugging by at night entered my dreams as thundering military planes. Later, I feared the weapons of mass destruction whose existence Colin Powell confirmed to the UN in calm, measured, insistent tones. Yes, as much as I hate war, I was manipulated by the hype, the news, and my fears. And by that gaping hole in downtown Manhattan, which looked like an abscessed cavity among the skyscrapers as I flew into LaGuardia a month after the attack. But mostly by those weapons of mass destruction.

All hell should have broken loose, and support for the war, already shaky in some quarters, should have ended.

I say this as a reminder that public opinion mattered immensely in the runup to the war. Bush did not want to be seen as the aggressor but as the moral defender. Therefore, he needed the support and approval of the media, the world at large, and the UN in particular. The leaked document could have influenced all three. Indeed, the editorial board of the Observer had supported the war, until its members were convinced that the document was real and they decided to publish the article.

Official Secrets tells this story skillfully, suspensefully, and with reasonable accuracy; Gun was a consultant on the film and spent many hours with director and writer Gavin Hood to help him understand the motivation for what she did, and her experience after she was caught. But as in all films, the story is streamlined and enhanced for dramatic effect. In particular, Gun is married to a Turkish Muslim, Yasar Gun (Adam Bakri), which casts some underexplored suspicion on her motivation. Moreover, whenever a film is based on a true story, the filmmaker has to package it for presentation in a two-hour block with a rising conflict and satisfactory resolution. That requires streamlining events and enhancing or creating certain characters to make it work. But Official Secrets feels like an honest presentation, whether or not it is entirely factual.

Two character lines drive the film: that of the whistleblower Kat Gun (Keira Knightley) — do I dare say she pulls the trigger on the NSA? — and that of the reporters Martin Bright (Matt Smith) and Peter Beaumont (Matthew Goode), who investigate and write the story. All face the same dilemma: how to reveal confidential information without facing jail time.

Yes, as much as I hate war, I was manipulated by the hype, the news, and my fears.

As a low-level translator for the British Government Communications Headquarters, Gun is basically hired to spy, eavesdropping on private conversations and alerting her supervisor if something seems “suspicious.” She is bound by the Official Secrets Act of 1989 not to reveal or even talk about anything she sees or hears or experiences at work. (This becomes particularly onerous when she tries to communicate with her lawyer.) But when reminded that she works for the British government, she counters, “I work for the British people.” Good stuff.

Bright and Beaumont are mostly concerned with authenticity: Is the document real? Can they confirm its source without revealing their own sources? Their efforts to verify create a more suspenseful and compelling storyline than Gun’s relationship with her husband and her fears about their personal risks. To me it’s the best part of the film. Ralph Fiennes as her principled attorney also provides some fine libertarian talking points.

When reminded that she works for the British government, Gun counters, “I work for the British people.”

Both of these topics — spying and ethical journalism — are highly relevant today. My inbox is full of speech-chilling articles about Apple and Google using Siri and Alexa to listen in on private conversations, and even more chilling articles about the draconian “social credit” system arising in China. And as I write this review, the New York Times is trying to justify its decision to run a front-page story dredging up a sexual abuse allegation against Justice Brett Kavanaugh and “accidentally” leaving out a sentence indicating that the presumed victim says that she doesn’t remember the incident. In a particularly dramatic scene, a similar “accident” happens in Official Secrets.

Moreover, the “quick” war in Iraq became part of a wider conflict that, with the war in Afghanistan and operations across the Middle East, has dragged on for 18 years with no end in sight. Over 5,000 US troops have been killed and tens of thousands have been injured. And American goodwill around the world is at an all-time low.

We really showed them, didn’t we?

Editor's Note: Review of "Official Secrets," directed by Gavin Hood. Entertainment One, 2019, 112 minutes.


Strong on the Individual


With its intellectual A-list cast sporting two comediennes, multiple Oscar nominees, and a writer-director known for his careful character development founded in realism, Where’d You Go, Bernadette promises to be the perfect choice for filmgoers with discriminating taste (if “discriminating” can still be used in a favorable context). I was a little concerned by the 46% critics’ rating, but the 77% viewer approval (and that A-list cast) gave me hope that the critics had missed the point on this one.

Bernadette (Cate Blanchett) lives with her husband Elgie (Billy Crudup) and teenaged daughter Bee (Emma Nelson) in Seattle, where Elgie works as a software developer for Microsoft and Bernadette is a stay-at-home mom whose eccentric nonconformity drives the other mothers mad at Bee’s tony private school. As the film opens they are planning a family trip to Antarctica, a reward for Bee’s perfect report card, and Bernadette is smiling her agreement while plotting how to avoid going.

The film’s title refers literally to Bernadette’s unexplained absence from home midway through the film, but it also refers figuratively to her lost sense of identity. She is an award-winning architect who has lived for nearly two decades in a peeling, unfinished fixer-upper and uses a virtual assistant in India so she doesn’t have to deal face-to-face with people; a gregarious and affectionate wife who can’t open up to her husband; a homeowner who creates devious acts of microaggression against her neighbors; and a mother who . . . well, mothering seems to be the one thing she does joyously and well.

As the film opens they are planning a family trip to Antarctica and Bernadette is smiling her agreement while plotting how to avoid going.

Slowly, gradually, we begin to understand what has made Bernadette so withdrawn, and our sympathy for her deepens. We get it. We even discover that she has a bit in common with Ayn Rand’s architect Howard Roark. But the development is too slow and too gradual to be more than moderately engaging, let alone comedic. Yes, it has deeply ironic moments that cause us to guffaw knowingly, but the pacing is simply too slow, the movie too long, and the delivery too deliberate for most audiences.

Bernadette seems to be delusional or agoraphobic or clinically depressed — or perhaps all three. Blanchett plays them all to perfection, manically discoursing nonstop one moment, retreating into isolation the next, growing loving and affectionate in other scenes. We may not understand her, and we wouldn’t want to live next door to her, but we like her all the same. In sum, she is suffering an identity crisis of enormous proportions.

The film itself suffers from a similar identity crisis. Like its heroine, it pretends to be what others want it to be instead of what it is — a surprisingly upbeat but slow-paced drama about a woman struggling with mental illness and her relationships with the people who love her. Yet if you’ve seen any trailers for the film, you would expect it to be a fast-paced, rollicking, laugh-out-loud comedy. But it’s not, which is probably why critics have panned it. It’s as though the marketing team decided to promote what they wanted the film to be, instead of what it is. And that’s a lot of Bernadette’s problem too: she has chosen to suppress what makes her special, including the tragedy in her life, in order to be accepted. And it’s driving her crazy.

Slowly we begin to understand what has made Bernadette so withdrawn, and our sympathy for her deepens. We even discover that she has a bit in common with Ayn Rand’s architect Howard Roark.

Director Richard Linklater deliberately cast against type in order to defy audience expectations and emphasize the message of the film: We can’t really be ourselves when we are trying to satisfy someone else’s expectations. Judy Greer, Kristen Wiig, and Megan Mullally, all titans of smart and sassy comedy, are excellent, but they’re given serious roles, one as a clinical psychologist, one as a former acquaintance, and one as a tight-assed controlling neighbor. And because we expect them to be rollickingly funny, we can’t quite accept them as they are — we make them funny, even when they aren’t. It’s a brilliantly subtle directing decision, but it doesn’t quite work, especially in the face of a marketing campaign that didn’t trust the strategy.

Bernadette failed with the critics for largely the same reason Bernadette fails with her neighbors — it tries to present itself as something it’s not. And it simply isn’t necessary. It doesn’t have to fit a genre. The film has strong characters, strong acting, and a strong message. It may be slow, but that’s OK. My friends don’t have to keep me in stitches all the time, and neither do my movies. Where’d You Go, Bernadette is slow, but it’s worth seeing, and maybe even worth seeing again.

Blinded by the Light is another independent film about discovering and maintaining one’s true identity when others are trying to tell you who you ought to be. Set in the 1980s and inspired by the youthful experiences of co-writer Sarfraz Manzoor as a Pakistani growing up in Luton, England, its themes of immigration, racism, culture, and fitting in are as current as yesterday’s Facebook post.

In Springsteen’s music Javed finds his own story: the working-class neighborhood, unemployed father, blue-collar expectations with white-collar dreams, and the raw talent to make them happen.

Javed (Viveik Kalra) lives in a working-class neighborhood in England with his ultratraditional Pakistani family. His father Malik (Kulvinder Ghir) left home against his own parents’ wishes in search of a better life for his family in England, but he is extremely traditional in his expectations about family and culture. He controls the money, the social life, and the future plans of his wife and his children. “You will not become British!” he shouts at Javed when Javed asks to attend a neighborhood party. As the film opens they are preparing for their oldest daughter’s marriage to a man she has not yet met. Malik pockets the meager earnings from his hard-working wife and children, commanding respect and control even though he is unemployed himself.

And yet, isn’t “becoming British” – or American, or Texan, or suburban — precisely why immigrants bring their families to a new land? Isn’t it because they like what they have observed and want to take advantage of its success? Yet I see newcomers time and again setting about to change the very culture that attracted them.

At school Javed is picked on and shunned. In the neighborhood he is chased and spat upon. At home he feels isolated and unhappy.

And then he discovers Bruce.

Springsteen may not be the Boss of me, but he certainly became the Boss of his own life, without ever losing sight of his hometown roots.

In Springsteen’s music Javed finds his own story: the working-class neighborhood, unemployed father, blue-collar expectations with white-collar dreams, and the raw talent to make them happen. I’ve never been a big fan of Bruce Springsteen; most of his music is too loud and his raspy voice is too harsh for me. But as The Boss’s lyrics became the soundtrack to Javed’s young life, with key phrases popping onto the screen during key moments, I gained new insights and appreciation for Springsteen as a poet. He may not be the Boss of me, but he certainly became the Boss of his own life, without ever losing sight of his hometown roots.

That’s the key to Javed’s identity too — he discovers how to stand up for himself, follow his own dreams, and choose his own path without rejecting his foundation as a Pakistani growing up in working-class England. Through the inspiration of Springsteen’s music Javed finds his own voice and a way to embrace both his future and his past.

Editor's Note: Review of "Where’d You Go, Bernadette," directed by Richard Linklater. Annapurna Pictures, 2019, 130 minutes; and "Blinded by the Light," directed by Gurinder Chadha. Bend It Films, Ingenious Media, Levantine Films, Rakija Films, New Line Cinema, Warner Brothers, and a slew of others (Chadha was persistent in getting this film distributed!), 2019, 118 minutes.


Nuclear Power: Again, Why Not?


“Those who believe that manmade climate change threatens civilization, and even nature itself, with imminent death should be clamoring for more nuclear energy production, which releases no greenhouse gases.” I wrote this recently as part of a longish essay about my nonexpert citizen’s skepticism regarding climate change. I speculated that one reason the ranks have not closed around nuclear energy is its reputation for danger, especially the possibility of radiation leaks. I also argued that, in spite of this ill fame, it’s difficult to find evidence anywhere of much health damage caused by radiation.

This information scarcity makes it difficult to assess the reasonableness of the widespread avoidance of nuclear energy production. Almost everyone who expresses an opinion dislikes it or is mistrustful of it. Voicing the objection that these unfavorable attitudes are not rooted in evidence either raises the eyebrows of disbelief or triggers the silent charge that you have not looked hard enough (which may, of course, be true).

Soon after writing the essay I just mentioned, I read a new book overflowing with anti-nuclear evidence that moved the hand on my clock some — only a little, but enough — to be worth discussing. The book is Kate Brown’s Manual for Survival: A Chernobyl Guide to the Future. In any case, if I don’t discuss it, others will, from a predictably laudatory angle, I would bet.

Brown is an engaging writer, though one with baffling lapses, in several languages.

Brown’s thesis is that the 1986 Chernobyl nuclear accident released more radioactivity, in more places, for longer than the official sources allow, and that many more people’s health was affected, and much more severely, as well as more lastingly, than is openly acknowledged.

Brown is an engaging writer, though one with baffling lapses, in several languages (“tribunal” for “tribune,” “amass” for “mass,” “bales of hay” for “bales of wool,” to “curate” for to “examine,” “Judenrein” for “Judenfrei”). Although she is a Soviet expert and a reader of Russian, for several pages (p. 184 and on) she confuses perestroika (“deep societal reform” or “restructuring”) with glasnost (informational “opening,” in the sense of increased transparency). And despite the fact that Brown is a tenacious, assiduous, even formidable researcher in the service of her thesis (more on this below), several major defects detract from the persuasiveness of her book.

First, Brown is not a neutral investigator, or even a journalist, but unambiguously an activist who hates nuclear anything. Perhaps as a result, she pretty much treats every dissenting voice as part of a long-lasting conspiracy involving Soviet authorities (national and local — no surprise), major UN agencies, and several US federal agencies, to conceal the scale of the ill effects of radiation exposure, as well as the intensity and duration of such exposure. This is not completely unbelievable. I myself think that the deadly climate change narrative is supported at once by local, national, and international government actors, although not within the context of a conspiracy but of a passively shared perception.

But Brown’s overreliance on a conspiracy explanation ends up undermining the credibility she earns by good archival digging. When she adds the International Red Cross to her already rich mix of plotters, my willingness to suspend disbelief vacillates. I don’t see how it cannot. As she continues, it crashes. The main international organization that cites large numbers of radiation victims following the accident is Greenpeace. But Brown herself honestly describes Greenpeace’s attempt to collect data in the Soviet Union as a fiasco.

Second, and possibly fatally for her, Brown straightforwardly imputes increases in mortality and morbidity in the vicinity of Chernobyl to a rise in radioactivity in the region following the reactor’s meltdown. She does this without benefit of baseline estimates regarding either radioactivity or health conditions before the accident. This is a major defect, of course: you may not impute a rise in Y to a rise in X if you cannot demonstrate a rise in X. In a roundabout way, she admits in several places that she cannot demonstrate a rise in radioactivity around Chernobyl or, in fact, anywhere at all, following the accident. She argues — persuasively if you disregard the rest of her book — that hundreds of nuclear weapons tests in the ’50s and ’60s had so overwhelmingly loaded the atmosphere with radioactivity that it may be difficult or impossible to isolate the comparatively modest radioactive emissions from Chernobyl specifically. (See, for example, pp. 244–245, for the radioactive saturation following American tests.)

Brown’s overreliance on a conspiracy explanation ends up undermining the credibility she earns by good archival digging.

But it seems to me that if you are unable to measure a rise in X, there is no point in trying to blame it for a rise in Y. And if you believe that X causes Y and you cannot link an actual increase in Y even to an identified increase in X, you may not claim much about anything. Brown thus finds herself in the impossible situation of trying to demonstrate rigorously that something that she argues cannot be assessed (increase in radioactivity) is the cause of something that she loosely measures or that remains unmeasured (rises in illness and in mortality reasonably traceable to radiation exposure).

Third, like many writers with a cause, Brown devotes no attention to possible negative evidence, to evidence against her thesis. She has nothing to say about situations where there should be an excess of pathologies and of mortality — according to her implicit model that enough exposure to radiation must result in noticeable excess morbidity — but none appears. The book is written a little like a scholarly article in which contradiction is pretty much expected, even guaranteed, from other knowledgeable sources, as part of a social process. But for this book, it’s not, for reasons I develop below.

Some of the book’s tangible findings seem defective as soon as you perform a little comparison around them. For example, Brown deals abundantly with rates, including accident rates, of course, before and after the Chernobyl disaster. But she seldom provides absolute numbers, which alone can tell us how much the rates matter. (If I read that the annual rate of suicide among churchgoing southern black grandmothers of five has increased by 100% in one year, I should ask whether it’s gone from 10,000 to 20,000, or from two to four, or even from one to two.) When she does give real numbers, absolute numbers, the effect tends to be underwhelming in ways she does not seem to understand: “eighty new thyroid cancers among 2.5 million Belarusian children . . .” (p. 250). Obviously, that’s hundreds of tragedies for parents and relatives and for the sick children themselves, but it’s not a massive epidemic. Perhaps it’s no more than a counting error.

Here is an indirect but reasonable comparison of orders of magnitude: in 1990, the rate of child mortality for Russia was 2.15% (“Child Mortality,” Max Roser, Our World in Data, May 10, 2019). Applying this rate to Brown’s 2.5 million children gives us a raw number of children’s deaths from all causes of about 54,000. I am sorry, but 80 out of 54,000 is not a big surplus. This comparison assumes that Belarusian children's death rate from all causes after Chernobyl is similar to Russia’s in 1990. This does not seem farfetched. Furthermore, the 80 thyroid cancers Brown reports probably did not all result in deaths, which makes the raw number of 80, blamed on accidental exposure to radioactivity, look even smaller.

When she does give real numbers, absolute numbers, the effect tends to be underwhelming in ways she does not seem to understand.

More strangely, the author treats what I believe is her best causal evidence with near indifference. She mentions in passing two large studies of nuclear plant workers conducted in Europe, in relative openness and under favorable European conditions — studies that convincingly link exposure to radiation to several pathologies (p. 294). To reach skeptics like me, that research should have been presented at the beginning of the book, rather than near the very end. Since it was not, I have to wonder what’s wrong with this research. (It’s probably nothing, but am I expected to confirm the research myself?)

Even the best evidence that Brown collects herself seems relegated to a near afterthought. She reports that the health statistics for areas affected by radiation remain normal-looking until shortly after the dissolution of the Soviet Union in the ’90s, when higher figures begin to show up. It’s as if a lid had been removed that constrained the truth. However, there are other possible explanations for this sudden change and its coincidence with the end of Soviet power; Brown spends little time discussing them.

In spite of its serious structural defects, in spite of its inadequate treatment of intriguing data, I am not able to dismiss Manual for Survival . . . not entirely. The main reason is that I am one of those who secretly suspect (against orthodoxy) that a large enough accumulation of anecdotal evidence ceases to be merely anecdotal.

If nothing else, the secretive habits of the Soviet informational first responders at the time pretty much guaranteed that some data were locked up, and others simply not collected.

Brown has performed huge amounts of both field research and archival research, spread over what seems to be 25 or even 30 years. Her perseverance is exceptional. Thanks to the quality of her narrative, the reader easily gathers that much of her work was done under adverse conditions, physically and politically. (The thought crossed my mind that I would want Brown on my side in any bar fight.) But, as I said in an opening paragraph, Brown is an activist. She seems to understand the scientific endeavor, but science isn’t her main business. It’s difficult to imagine anyone checking her numerous sources, except perhaps a scholar funded by the nuclear industry. The hire would pretty much have to be a Ukrainian with a very good command of English, because a high proportion of the book references are in Russian, or even in Ukrainian. It’s not going to happen. No one is likely ever to check all her sources — not even a principled sample of her sources, not even a handful.

Yet, I am thinking, a portion of her alarming assertions is probably true, and official sources probably underestimate the health damage caused by the Chernobyl accident. If nothing else, the secretive habits of the Soviet informational first responders at the time pretty much guaranteed that some data were locked up, and others simply not collected. The health lessons that Chernobyl has for today are still necessarily limited. The accident happened in connection with a primitive nuclear technology and under the rule of a political system that was routinely both criminal and mendacious. Since Chernobyl, the nuclear reactor core at Fukushima melted down in the worst possible context of a natural catastrophe. Dissimulation of the health consequences of the Fukushima disaster was unlikely in the relatively open Japanese society. Radiation leaks took place in the midst of a population especially sensitive to such dangers. And yet, not much appears to have happened to anyone’s health that can be linked to radiation.

So, Manual for Survival has moved the needle a little for me — not much, but some. The best way I can express it is this: I would still advocate the replacement of nearly all coal-fueled energy production plants with nuclear plants. I would probably refrain from the same recommendation in connection with relatively clean natural gas plants. That’s because there is no doubt that coal burning pollutes in several ways, irrespective of the reality of the climate change narrative. Natural gas is so clean by contrast that it may not be worth it to take even the slight and poorly demonstrated health risk that may be associated with accidental radiation exposure in order to avoid burning gas.

Her monumental work is largely irrelevant for rationalists, except from a historical viewpoint. It may stand well as another chapter in the sorry history of the Soviet Union.

Of course, if someone whom I thought qualified reviewed Brown’s multiple sources and pronounced them mostly adequate, I would revise my judgment again about the safety of nuclear energy production.

In the end, her monumental work is largely irrelevant for rationalists, except from a historical viewpoint. It may stand well as another chapter in the sorry history of the Soviet Union. The primitive technologies and the incompetent and weak sociopolitical controls of Chernobyl are gone for good. There is a segment in Steven Pinker’s well-documented Enlightenment Now (2018) about both the disadvantages and the overwhelming advantages of nuclear power (pp. 144–150). Why, even climate-change-fixated National Geographic shows a few signs of coming around! Though not many signs: see the short feature on nuclear engineer Leslie Dewan, in the March 2019 issue.

Editor's Note: Review of "Manual for Survival: A Chernobyl Guide to the Future," by Kate Brown. W. W. Norton, 2019, 432 pages.


Despised, But Not Resisted


After reviewing Quentin Tarantino’s The Hateful Eight (2015), I swore I had seen my last QT film. The acting was stagy, the bloody violence gratuitous, the storyline beyond unbelievable. He hadn’t just “jumped the shark”; he had catapulted the cow jumping over the moon. I was done.

But something about his latest offering, Once Upon a Time . . . in Hollywood, drew me back. The stellar cast, led by Leonardo DiCaprio and Brad Pitt, promised committed, unexpected performances. The setting, 1960s southern California, was where and when I grew up, and I was drawn to the nostalgia I would certainly experience. And the story, leading up to the Manson murders, was not only tragic but also somehow romantic in the classical sense — a story that captured the interest of the nation when it occurred. Many say the ’60s died that day, along with Sharon Tate and her friends. Yes, I assumed there would be blood (and there is), but at least it wouldn’t be gratuitous this time. And in fact, it doesn’t show up until late in the film. I was willing to give QT another look.

Similar to Tarantino’s breakout Pulp Fiction (1994), Once Upon a Time presents multiple disconnected storylines while foreshadowing an explosive climax. Rick Dalton (DiCaprio) is a TV western star nearing the end of his career. Dalton is based not-so-loosely on Clint Eastwood in “Rawhide” or Steve McQueen in “Wanted Dead or Alive.” Like Eastwood, Dalton is encouraged to move to Europe to make spaghetti westerns with a director named Sergio. And like McQueen, he is a bounty hunter in his TV series. Also like McQueen, Dalton carelessly knocks a young girl onto the floor in a movie scene; McQueen is reported to have knocked an actress across the room during a “method acting” improvisation in class. After the scene cuts, the little girl tells Dalton, “That’s the best piece of acting I’ve ever seen.” McQueen’s acting teacher is said to have told him the same after he smacked the young starlet.

Many say the ’60s died that day, along with Sharon Tate and her friends.

McQueen shows up in the film by the way — played by Damian Lewis, who utterly nails McQueen’s piercing eyes and brooding mouth. The film is full of homages and allusions such as this, and one could enjoy it just looking for the Easter eggs. Tarantino knows his Hollywood trivia! But there is much more to this movie than homage.

Another storyline focuses on Cliff Booth (Brad Pitt), who works as Dalton’s stunt double and gofer. He drives Dalton around town, grabs him a beer when he’s thirsty, runs his errands, fixes his antenna, listens when he’s despondent, and does it all with that winning Brad Pitt smile. Audiences at the premiere in Cannes whistled and clapped when Pitt stripped off his shirt to work on said antenna. At 55, Pitt is still plenty buff. Dalton might be the protagonist, but Booth is clearly the star. Even the way he side-clicks his tongue, signaling to his dog that it’s time for dinner, is gobsmacking.

Dalton lives next door to Roman Polanski, whom he hopes will notice him and cast him in a movie. Meanwhile, Sharon Tate (Margot Robbie), Polanski’s pregnant wife, is luminously happy about her breakout role in a Dean Martin movie, The Wrecking Crew. Having started her career in TV shows like “Mister Ed” and “The Beverly Hillbillies,” she is understandably ecstatic to see her name and image on a movie poster. Robbie plays her shy excitement just right — almost embarrassed to look at the poster in the movie theater lobby, wanting to be recognized, finally having to say who she is, then basking in the recognition she has created and positively glowing as she listens to the crowd reacting to her scenes. You can’t help feeling empathy for this pretty girl whose life was cut so gruesomely short, back in 1969.

Tarantino knows his Hollywood trivia! But there is much more to this movie than homage.

And then there’s Charlie Manson (Damon Herriman), who makes a brief appearance at 10050 Cielo Drive in Benedict Canyon, looking for its previous resident, Terry Melcher. We see him almost as a shadow, a ghost that hovers and lingers without really touching down. His “family” — Squeaky Fromme (Dakota Fanning, all grown up and sporting a potty mouth), Froggie (Harley Quinn Smith), Pussycat (Margaret Qualley), and Tex (Austin Butler) — provide a constant simmering background to the film and an ongoing foreshadowing of the climax we know is going to come. They dive into dumpsters, thumb rides on street corners, maraud through the town like bandits, and preen like sirens. They are spooky and scary, even without blood. Take a note, QT.

As expected, the stories eventually come together, but in unexpected ways. And that’s all I’ll say about that.

Despite its length and somewhat slow development, this is Tarantino’s best work since Inglourious Basterds. I will probably see it again, next time to focus more on the Hollywood allusions and Easter eggs. And, reluctantly perhaps, I will continue to see and review Tarantino’s movies. He is the most maddening and brilliant of directors. I despise him — but I can’t resist him. Ironically, I think that’s what the “family” said about Manson.

Editor's Note: Review of "Once Upon a Time . . . in Hollywood," directed by Quentin Tarantino. Sony Pictures, 2019, 161 minutes.


Stories, Good and Bad


Steve Almond’s latest book, Bad Stories: What the Hell Just Happened to Our Country, is the author’s attempt to sort through America’s flaws, as he perceives them, in order to explain the ascendancy of President Donald Trump. The rock that gave credibility to his account — the Trump-Putin collusion — has now eroded to mere talus, and Almond’s book stands before the world as just another silly piece of business from the academic left.

Almond is enough of a socialist to recommend the nationalization of the football industry (and that’s going pretty far). In an interview in The Sun (September 2015) he was questioned about his recently published book, Against Football. At one point, the interviewer, David Cook, asks: “Is there anything that would make football worth watching again for you?” In his reply, Almond describes his “dream” of public ownership of the “football industry.” Thus: “The football industry could benefit our communities rather than billionaire owners and sponsors. What would it be like if the teams were publicly owned and the profits were funneled into the public coffers?” And he asks, “Is that such a crazy idea: that this game might help the people who need help the most?”

Almond’s book stands before the world as just another silly piece of business from the academic left.

Bad Stories is premised on the determinist idea that individual minds develop according to the stories to which they’re exposed. Furthermore, bad stories, the ones he sees as flawed or distorted, function to keep the good stories out of circulation. All this comes very close to the traditional socialist preference for a deterministic theory of human character and conduct. As Robert Owen put it, “A man’s character is created for him.” Owenite socialist experiments in America collapsed, one by one, and yet he kept on preaching the socialist faith, and many others have since joined him in resisting reality.

The Great Enlightenment of the 18th century, with its discovery that man, through reason and experiment, could identify chemical elements and synthesize molecules, led some thinkers to believe that they could, through reason, design an ideal society. Breathing this atmosphere, Owen concluded that the social order was all wrong, as did Fourier and such other extremists as Marx and Sorel. The main obstacle between them and the realization of their collectivist dreams was human experience, which included 35,000 years of trial and error. One product of the centuries of ebb and flow was the rise of Western civilization and its slow and costly march toward personal and economic freedom. And thus emerged free-market capitalism with its built-in pricing system, a wondrous instrument that adjusts production and distribution, according to demand. Under socialism, the free-moving pricing system is replaced by a governing bureaucracy — a nonaccountable realm, as James Q. Wilson described it. However much the socialist may rant about a just distribution of wealth, the pricing system carries with it a justice that sensible humans can truly understand — rewards based on the satisfaction of consumer demand.

It’s this form of justice that the socialist cannot abide, for he cannot control it — cannot channel its rewards to those that he, by heaven, believes are the most deserving. Hayek has warned us that personal freedom has never existed without economic freedom. “Fear not,” the socialist might say, “human nature is malleable. People will adjust to the imposed system.” But have they ever done so? And one might ask — if people are all that malleable, how did the socialists dream up socialism?

Owenite socialist experiments in America collapsed, one by one, and yet he kept on preaching the socialist faith, and many others have since joined him in resisting reality.

But here we have Steve Almond, feeling impelled to explain the rise of Donald Trump. He reveals the many reasons he divines, foremost among them being those “bad stories” that “distort our belief system.” Why does he neglect the individual’s reasoning process and the skepticism it can produce — especially in reflecting on political issues? Are individual Americans mere bots, driven now here, now there, by media prompts? The reasons he deduces for victorious Trumpism often involve a mass response to external influences, and these often trace back to capitalism. Consider this partial list: voter suppression, elections held on Tuesday, a disaffected electorate, hostile feelings for the other party, passions of a small minority, white privilege and the petty bourgeoisie, dehumanization, conservative paranoia, lying Rush Limbaugh and talk radio, thirst for entertainment, the sports brain, racism, opposition to social change, Trump’s unbridled aggression, television, the internet, Russian bots, lack of a Fairness Doctrine, the media’s right-wing conspiracy (a laugher), Vladimir Putin, Albanian hackers, and, of course, the Electoral College.

Almond criticizes the right and the alt-right for their paranoia, never showing, through example, how they demonstrate that quality. But his own apocalyptic predictions evince the paranoia he attributes to Trump supporters. One of his preferred literary works when seeking an analogy for Trumpism is Moby-Dick. The novel reeks with Significance. So great is the reek that I’ve wondered whether Melville was only kidding. One of my teachers in college, Professor E.H. Rosenberry, had similar thoughts — consider his book, Melville and the Comic Spirit.

But to Almond, the story of Captain Ahab is “a parable about our national destiny in which the only bulwark against self-inflicted tyranny is the telling of a story.” (A bulwark? — how about the Constitution?) But Trump is more than an inchoate Ahab — he also resembles Kurtz, the antihero of Joseph Conrad’s Heart of Darkness. Conrad himself was a conservative, loyal to his adopted country, and, according to biographer Jeffrey Meyers, congenitally pessimistic. In Kurtz, we discover a smooth-talking reformer who seeks to civilize a primitive African tribe. He dreams of a heroic reputation, but is drawn into the very culture he hopes to uplift. “The horror! The horror!” are his last words. Is it Almond, not Trump, who resembles Kurtz? But Almond ventures onward — Trump is also the golden calf prayed to by the Israelites. Indeed, he sees Trump as a packhorse for all the tyrannical vices — a bully, a bullshitter, a hollow entertainer who avoids the issues.

Why does he neglect the individual’s reasoning process and the skepticism it can produce — especially in reflecting on political issues?

What issues? Almond hardly mentions intellectual issues. He never details those Russian-hacker smears he complains of, though he spends a great deal of space on Hillary’s emails. When he does address policy issues, he bases his judgments on the immediate interests of his close acquaintances and a woman he found online. He mentions the benefits they derived from Obamacare, for example, or from the Obama “stimulus package.” He overlooks the long-term effects of these “benefits” on the country. He believes the voters should worry more about their personal vulnerabilities than about their grievances. Were this a sound principle, the Pilgrims might have stayed home, along with others who sailed the seas to America. Aggrieved by oppression, they risked a long period of vulnerability at sea — for the promise of freedom.

As Almond sees it, Vladimir Putin is still fighting the Cold War. That we won that war is, to Almond, a bad story, even though the ultimate collapse of the Soviet empire is an established fact, and America can rightly claim to have carried the day. The author disparages Ronald Reagan as “the star of Bedtime for Bonzo napping in the Oval Office.” And yet, crucial steps toward Cold War victory were taken during the Reagan presidency. The Reagan-inspired arms race, Reagan’s frankly anti-Soviet stance, and his close alliance with the Thatcher government, all contributed to the economic failure of the Soviets and the eventual decision by Gorbachev to release the captive nations. At the time, George Kennan and his containment policy got much of the credit, though the policy had led to the squandering of blood and treasure in winless wars. But Reagan’s contribution was enormous. What was his approach to the Cold War? “We win, they lose,” he said — and we won.

Will Trump initiate a period of American decline? Putin hopes so, according to Almond. So far, Trump’s tax cuts have brought a period of prosperity to America and provided a clear lesson: as an economic stimulus, a tax cut beats the federal printing press every time. Creating inflation to stimulate the economy is the equivalent of a lie — an economic falsehood. But Almond, the socialist, may well see prosperity as decline — as “late-model capitalism.” And consider this claim by the author: “When our president fumes about NFL player protests, or Confederate monuments, or gun rights, he isn’t just ‘shoring up his base.’ He’s doing Putin’s bidding.” This is fanciful nonsense.

Almond, the socialist, may well see prosperity as decline — as “late-model capitalism.”

Donald Trump is proving of little danger to America. And he will likely alert our people to a true danger — the swing among Democrats toward socialism, a system that has created a procession of disasters, especially during the 20th century. Its popularity on campuses is a monument to the corruption bred by left-academics, and the ignorance they cultivate. An extensive literature of liberty exists in the archives of America and the Western world. But one would hardly suspect its existence, if judged by the state of knowledge of the average college student of today. When colleges and universities accept an enormous tuition, yet keep students in ignorance to preserve their leftist sympathies, they perpetrate a fraud.

Almond’s political ideas are accompanied by a fashionable anti-patriotism. That America is a representative democracy is, to him, a “bad story.” But members of both houses of Congress are elected by the populations of entire states, or of districts within each state — hence we have representative democracy in the legislative branch. The president of the United States is, at present, chosen by electors in each state, the number of each state’s electors being the same as the number of members in its Congressional delegation. The Constitution doesn’t require a direct popular vote for these electors. The system was meant to protect the interests of the less populated states and to produce a careful choice for an office of great potential power. George Washington recognized this power and, after his second term in office, established a precedent by bidding it farewell.

There is something malodorous in the words, something that smacks of mere name-calling.

And yet, Almond views America as “borne [sic] of high ideals and low behaviors, the land of all men are created equal and slave labor. We’ve been engaged in a pitched battle ever since, between greed and generosity, between the comforts of ignorance and the burden of moral knowledge.” This quotation is taken from the final chapter of Bad Stories, one that particularly reveals Almond’s affliction with Trump Derangement. Here, anti-Trumpism reaches the brink of hysteria — environmental protection, civil rights, free trade, public education, and health care are “all heading for bankruptcy.” On the other hand, “the markets for white supremacy, mass shootings, corporate profiteering, and nuclear cataclysm are booming.” And worse, “[Trump’s] aides and allies are mortified by his cognitive deterioration, his inability to read, or concentrate. It becomes more and more obvious that he’s unfit for the office. And yet, the office belongs to him.”

Do Almond’s own words represent a bad story as he defines it — one flawed and distorted? There is something malodorous in the words, something that smacks of mere name-calling. He subjects the man who is now president to an incompetent psychological evaluation. Trump, it seems, “never experienced a sense of being unconditionally loved, what psychologists call attachment. The best he could hope for within his family of origin was to please his domineering father through aggression. Because he never developed an intrinsic sense of self-worth, he can’t protect himself from feelings of inadequacy.” Thus Trump “proved especially captivating to disaffected Americans.” Does this last follow? Or was his appeal simply his departure from the Republican nice-guy-loser pattern — the one pioneered by Tom Dewey and emulated by Messrs. Goldwater, Bush, Dole, McCain, and Romney?

Experience clearly indicates that free-market capitalism is a far more effective system than its rivals. It’s no contest.

Chapter 15 in Bad Stories is entitled “Give Us Your Tired, Your Poor, Your Huddled Masses.” This fragment Almond sees as a “bad story.” It’s taken from a sonnet, “The New Colossus,” by the poet Emma Lazarus (1849–1887), a gifted woman. Her poem has been used and abused in debates over immigration, in particular those involving the Hart-Celler Act of 1965. Mencken wrote that the poet doesn’t deal in truth, but rather “conjures up entrancing impossibilities.” Perhaps the Lazarus poem conjures up a beautiful dream, the kind that guides and inspires the living. It bears a touch of early feminism, which ought to fascinate the present-day advocates. But it also contains not only an invitation to our shores but also the inspiration to seek freedom wherever it’s dreamed of.

Alas, Almond is an apostle of equal outcomes, rather than freedom. I’m amazed that socialism ended up, not as one of Almond’s bad stories, but as a resuscitated grand scheme, or better, a secular faith. The fall of the Soviets after decades of blood and slave labor — despite the cheerleading from Western intellectuals — the rejection of British socialism with its Control of Engagements Order, the move of Scandinavian countries toward a free-market economy, and, much earlier, the failure of virtually all socialist communities in America, all taken together, suggest that maybe socialism isn’t such a good idea. The mass murders in Cuba, the desperate exits of its citizens, and the impoverishment of Venezuela offer more evidence of its futility.

Have you ever seen a more hysterical sentence?

Still, Almond sees capitalism, personified by Donald Trump, as the great evil. It stands in the way of our solving “crises that are beyond empirical doubt.” In his view, these include “climate change, resource depletion, and inequalities of wealth and opportunity, all of which are triggering mass immigrations, political unrest, and violent extremism.” It follows — to him — that we must reject bad stories (which ones?) and place our faith in “reason and empiricism.” Empiricism? But empiricism is the doctrine that the source of all knowledge is experience. And experience clearly indicates that free-market capitalism is a far more effective system than its rivals. It’s no contest.

Resistance to Almond’s collectivist agenda is reflected not only in “a ruthless free market theology” but also in other reactionary attitudes — a “make believe retreat from globalism, a nostalgia for white hegemony.” Worse yet, “our culture lurches about within the shadow of its own extinction, yet lacks the moral imagination to change its moral destiny.” Still worse, “our style of capitalism has acted as a financial centrifuge, perhaps the most brutal aggregator of wealth in human history, built on a foundation of slave labor and fortified by plunder, imperial warfare, the decimation of the labor movement, the predation of Wall Street, [and] the steady subjugation of oversight to private gain.” Have you ever seen a more hysterical sentence? Or one requiring more determined efforts to trace empirical facts about the world’s greatest multiethnic middle-class society back to its supposed causes in plunder and brutality? All this “resistance” Almond conjures up with little regard for the reason and empiricism he urges on the rest of us.

It seems fair to remind Almond and his admirers that in its two-and-a-half centuries of existence, the United States of America has freed more people from slavery than any country in history. It has taken in tens of millions of immigrants seeking a better life and, through trade and the simple giving of gifts, has spread its bounty around the world. It has lost hundreds of thousands of its bravest sons in battles against the forces of tyranny — and often restored the war-damaged lands.

It’s the very risks, the adventure, involved in living in a free society that fascinate those whose heroic qualities remain unsuppressed.

Perhaps a touch of the sportsmanship in a “sports-brain” would make Almond a better, more reflective writer, as would a closer look at that “ruthless free-market theology.” A “brutal aggregator of wealth” must operate in a system of voluntary exchange. He doesn’t hide his money in the bathtub. He adds it to the river of capital that flows into financing and investments — creating new businesses, new products, new homes, more jobs, and more opportunities for improving the human condition. Is everyone happy in a free-market economy? No, of course not. Would there be greater happiness under any other economic regime? No again. A system that rewards the takers by plundering the producers will only spread misery in the long term. Frédéric Bastiat understood this, and his plain words might well be put in the hands of college students everywhere. He spoke of the “instinctive struggle of all people toward liberty.” Yes, and the struggle reflects those heroic qualities, which, taken together, represent a wonder of this earth — it’s called the human spirit.

It’s the very risks, the adventure, involved in living in a free society that fascinate those whose heroic qualities remain unsuppressed. But will their spirit roam free, or will it suffocate under a system that forces them to abide by a bureaucratic plan? An enforced equality can only be achieved at the expense of the best — the most creative, the most productive, the most in need of liberty. I see nothing in this that shows “moral imagination,” or that will lead to an improved “moral destiny.” And it may place all of us nearer to extinction.

With the exception of Almond’s personal experiences, which might make a decent book, Bad Stories is of little value as a source of truth. But as an example of the fashionable nonsense that passes for truth among the academic Left, it may be very useful indeed — it may lead some heroic individuals to educate young minds properly.


Editor's Note: Review of "Bad Stories: What the Hell Just Happened to Our Country," by Steve Almond. Red Hen Press, 2018, 257 pages.


Only Yesterday


On July 6, 1957, John Lennon and his skiffle band, the Quarrymen, performed at the Crowning of the Rose Queen parade at Woolton Parish Church in Liverpool. Paul McCartney met Lennon backstage between sets and played a few songs, including a medley of Little Richard songs, on his own guitar. Two weeks later John invited Paul to join his band, and in 1960 they changed their name to the Beatles. Ten years after that the band dissolved. It had been the most prolific and profound decade of music writing, producing over 200 original songs on 13 albums and covering nearly 100 songs written by other artists.

I bring this up because today, as I write, is July 6, the anniversary of Paul’s meeting John, and today I happened to see Yesterday, a whimsical film based on their music.

What if Paul hadn’t attended that Rose Queen parade? What if John had become an artist instead of a musician? What if the Beatles had never existed as a musical group? Would the music still exist somehow? Was it their music, or does it belong to the universe?

Ten years after that the band dissolved. It had been the most prolific and profound decade of music writing.

That’s the theme of Yesterday, Danny Boyle’s delightful story of Jack Malik (Himesh Patel), a mediocre singer-songwriter who wakes up one day after an unexplained global power outage to discover that no one has heard of the Beatles, and none of their songs exist — except in his own memory. A Google search for the Beatles produces nothing but a big black stinkbug. A search for John, Paul, George, and Ringo brings up a wiki article on Pope John Paul. Coca-Cola doesn’t exist either, along with several other modern products we take for granted. Jack’s reactions to the loss of these products, and his friends’ reactions to his requests for them, lead to several delightful moments in the movie.

So what would you do if you discovered you were the only repository of some of the greatest music of the 20th century? Jack is like Guy Montag and the educated tramps at the end of Bradbury’s Fahrenheit 451 — he becomes a living songbook, trying to remember and recreate every song in the Beatles canon.

Well, OK — he also performs the songs and accepts the accolades for having written them. He isn’t completely motivated by altruism. Or even a little bit. It’s somewhat galling to hear that these new songs are the best he’s ever written, but he gets over it easily enough. Soon he abandons his lifelong manager and almost girlfriend Ellie (Lily James) for a high-powered LA agent played by a deliciously wicked Kate McKinnon (Saturday Night Live’s Hillary Clinton), who openly admits to Jack that he will do all the work while she takes all the money — a clear but subtle reference to “Taxman.”

A Google search for John, Paul, George, and Ringo brings up a wiki article on Pope John Paul.

As familiar as he is with the music, Jack struggles to remember the lyrics. “Yesterday” is easy enough. But what about more complex works, such as “Eleanor Rigby” or “Back in the USSR”? We struggle along with him, wryly learning that we can sing along, but we can’t really sing alone. “When does she do that knitting?” he murmurs to himself as he works on recreating “Eleanor Rigby.” “And where does the rice come in?”

Eventually he pulls it together. (Father McKenzie, not Eleanor, does the darning, not knitting. And it’s “a” wedding, not “her” wedding — a significant difference for a song about loneliness. Such is the weakness of memory.)

So while Jack doesn’t actually write the songs he claims as his own, he does exert tremendous effort and labor to recreate the songs. And even then, he doesn’t quite get it right. As he accompanies himself on guitar or keyboard, some of the melodies and harmonies are a little off — intentionally so, I’m sure, to remind us how memory works — or doesn’t. Jack performs the songs as a solo act, and while I liked some of his modernized arrangements, especially his passionate version of “Help,” I missed the harmonies. All of this emphasizes how complex the Beatles’ recordings were, how deceptively simple their harmonies seemed, and how tight they really were, often trading off the melody as they sang.

It’s somewhat galling to hear that these new songs are the best he’s ever written, but he gets over it easily enough.

Boyle’s admiration for the music is apparent in subtle ways — a familiar flute riff in the background in one scene, a whimsical tuba in another, the tornadic crescendo rising to a climax from “A Day in the Life” as Jack flies into the air during an accident that occurs when the lights go out. Jack is “the lucky man who made the grade” — the chosen one who will “fill the holes” in the music.

So here are the questions I came away with after watching this well-written and solidly acted film:

  • Were the Beatles great because of their music, or was their music great because of the Beatles?
  • Would the Beatles have been so profoundly influential if their later, more experimental works had come first, the way Jack presents them, before their bouncy love songs had cemented their popularity?
  • In short, was the phenomenon of the Beatles as important to their lasting influence as the music itself, or does the music stand on its own?

Another question is whether the Beatles would have been as successful today. The music industry is profoundly different from what it was 50 years ago. Music is sold (or stolen) piecemeal on iTunes, YouTube, and other sites. Musicians make their money on concert tickets and merchandise sales, not music royalties.

Could the Beatles have written more than 200 songs and produced 13 albums if they had been on the road month in and month out? How would the piecemeal nature of today’s music sales have affected their creativity?

There really wasn’t a “B side” to a Beatles record; nearly all their songs were hits. Their albums were carefully curated so that each track complemented the others. Rubber Soul has a personality distinct from Revolver, and Sgt. Pepper’s was organized and performed as though it were an actual concert. Fans waited months for the next album to emerge and bought all of them eagerly, listening to each track in the intended order. By contrast, Jack feeds his audience all the songs at once, as he remembers them, almost the way we binge on Netflix series, glutting ourselves greedily and then looking around for more, sad that we didn’t savor what we had. As a viewer I couldn’t help but wonder what would happen when Jack ran out of songs. The ’60s were a magical time for music, and Boyle suggests through this film that there was something mystical about it as well.

Yesterday asks us to consider these questions, but only in passing. The story stands on its own, with a delightful cast of main and supporting characters (Jack’s parents are a hoot!), a sweet love story, and a nostalgic soundtrack. Skip Spider-Man: Homecoming this week and enjoy a little homecoming of your own, reminiscing with the music of Yesterday.

Editor's Note: Review of "Yesterday," directed by Danny Boyle. Universal Pictures, 2019, 116 minutes.

Endgame: The Biggest Superhero Show


Avengers: Endgame may not be the best movie of 2019, but it certainly is the biggest. Clocking in at 3 hours and 1 minute, it’s the longest studio release since the epics of the 1960s. And grossing an astounding $1.2 billion worldwide in its first weekend, it is the biggest financial success in Hollywood history, breaking six box office records so far. At the cineplex where I saw the film with my grandson, it was being shown every half hour, 35 screenings in a single day, beginning at 8:30 on a Sunday morning. We felt lucky to snag three seats together in the neck-craning third row — and we bought them in advance. My grandson was already seeing it for the second time. I might see it again too.

So what’s the attraction of Endgame? It’s just another superhero movie in a long line of superhero movies, right? Is it really that special?

Well, yes and no. Several factors make this film quite special, while others caused me to cringe in disbelief. I lost interest long ago in the superhero genre, yet I felt duty bound as a movie reviewer to see this one, and found elements that gave it a spiritual and literary gravitas I wasn’t expecting. Since there are hundreds of traditional movie reviews praising this film, I want to step away from the traditional and focus on my experience and reaction watching it, and I can’t do that without talking about significant elements of the plot. So if you want to see the film without the spoilers, you’d better stop reading and save this review for after you’ve seen the movie.

First, the basic plot. Endgame is the culmination of 22 separate superhero films based on Stan Lee’s Marvel Comics, featuring Iron Man (Robert Downey, Jr.), Captain America (Chris Evans), the Hulk (Mark Ruffalo), Thor (Chris Hemsworth), Spider-Man (played by numerous actors over the years, and in this one by Tom Holland), Black Panther (Chadwick Boseman), Ant-Man (Paul Rudd), Wasp (Evangeline Lilly), the Black Widow (Scarlett Johansson), Dr. Strange (Benedict Cumberbatch), and the entire Guardians of the Galaxy contingent, along with numerous side characters from each domain.

The characters have been crossing into one another’s universes over the past several installments, finally culminating in the ultimate Showdown with Evil in the two-part finale that includes Infinity War (2018) and Endgame, the current film. For many, these two films marked the movie event of the decade. Super fans rewatched a dozen films or more in preparation for the release of both films. While listening to a call-in radio show last week I heard a woman ask whether she could honorably get out of going to a hospice retreat with a friend who is dying of cancer, because she already had tickets to see this movie with her family. (The radio host wisely advised the woman to skip the movie and assist the friend.) Some theaters scheduled marathon showings for fans who wanted to watch the earlier films in the series together. I didn’t bother to “prepare,” yet I still enjoyed the two films by accepting the fact that I didn’t know every backstory or reference, and that was OK.

In Episode One of the finale (Infinity War), super villain Thanos (Josh Brolin) wants to gain possession of six powerful stones that will allow him to vaporize half the population of the universe with the snap of his gauntleted fingers. He thinks this will make the universe a better place by reducing overpopulation. (Here is the first of many conundrums: what kind of villain is motivated by the desire to make the world a better place? But that’s the way it is.) His plan shares nothing with the parlor game of imagining a lifeboat with too many people onboard and arguing about who deserves to stay in the lifeboat and who needs to be thrown overboard so the others will have enough food and water to survive; his plan is a seemingly fair, unbiased, randomly selected annihilation. And he succeeds, in a particularly moving ending to Episode One, as half our favorite characters are vaporized into tiny particles of flame.

Episode Two (Endgame) begins with the vaporization of a family on a picnic, reminding us that it isn’t just the Avengers who have lost half their numbers; nearly everyone in the universe has suffered the loss of a loved one. Now the remaining Avengers must wage another battle to regain the stones so they can reverse the process and bring back the people who have been annihilated. This is pretty iconic good-versus-evil, science-versus-nature stuff, with mechanized robot-warriors on the evil side, and flesh-and-blood (or wood-and-sap) superheroes on the other.

What I like about this movie is how it taps into deep mythological archetypes and also introduces a new concept about time travel. One of the accepted rules of the time-travel genre is that anything you do while traveling in the past will change the future. Thus Marty McFly in Back to the Future returns to his original present and discovers that his family has changed. His father is successful, his mother is slender, his siblings are happy, and his old nemesis is washing Marty’s car. Only Marty remembers the former timeline, because only he has lived the “new past.” This occurs in most time-travel movies; only the person who went into the past remembers the old present. It always makes me a little sad that those who remained behind don’t realize the dreariness or danger they’ve escaped. In Endgame it’s explained that they can’t change the past because what they are doing actually occurs in the present. What they’re changing is the future, which is true of all our choices. Thanos will still have vaporized everyone, but now everyone will be able to come back. I like that concept because all of them realize the fate they’ve escaped and appreciate the sacrifice of those who fought for them.

What I like even more are the biblical and mythological allusions in this story. At the end of Infinity War, two biblical allusions appear in the sudden and random vaporizing of half the population. The first is a reference to the rapture at the end of the world, when, according to Matthew 24:40, “Two men will be working together in the field; one will be taken, the other left.” That’s exactly how the vaporizing feels. The other is a reference to the destroying angel taking the firstborn of every household in Egypt, the event that marked the beginning of the Israelite Exodus into the wilderness. And another reference to the Exodus is seen when Thanos observes, “As long as there are those who remember what was, there will be those who resist,” echoing God’s decision to make all the Israelites who remembered Egypt wander in the wilderness until they died before the others were allowed into the Promised Land. In yet another scene, Dr. Strange holds back a towering flood of water, just as Moses held back the water in The Ten Commandments. (Okay, that was Charlton Heston. But he was portraying Moses.)

Thanos succeeds initially in Infinity War, but he is killed in an early battle of Endgame. For five years, the remaining residents of the earth enjoy a fairly idyllic life. They marry, start families, work the fields, study, produce, and live in peace. Then Thanos returns and another epic battle ensues — just as, according to the book of Revelation, Satan will be bound for 1,000 years of peace and then unleashed to gather his armies for a final epic battle. And when the final battle occurs in Endgame, the Avengers are joined by the resurrected beings who had been vaporized, just as in the battle of Armageddon — according to some interpretations — Christ will be joined by the resurrected dead. It’s a powerful scene in the movie, met with thunderous cheers from the audience, and made more powerful by the archetypal allusion.

Other references to redemption occur as well. Bruce Banner learns to embrace his inner Hulk, who now lives peacefully on the outside. We see a virtual resurrection of the late Stan Lee, who created the Marvel universe and died in 2018, de-aged and in his prime for his final cameo appearance in a Marvel movie. “Hey, man! Make love, not war!” he calls, as he drives out of sight. Robert Downey, Jr., who plays Iron Man (my favorite of the Avengers because he is an inventor, a businessman, and a reluctant superhero) has also experienced a kind of redemption in his life, having overcome his nearly debilitating addictions 20 years ago to become one of the most bankable stars in Hollywood. His career was dead, and it has roared back to life.

Perhaps it was the serendipity of Endgame’s opening just a few days after Easter that caused me to see Christian archetypes in the film. The ultimate hero in the story willingly sacrifices his own life to reverse the deaths of those who were vaporized, saying, “If we don’t take that stone, billions of people stay dead.” He willingly trades his life for the lives of his friends — and everyone else. Like Jesus, he dies when his heart literally breaks. His last words are “I am . . . [his name],” I AM being a name of God, applied by Christ to himself. By contrast, the character’s father tells him, “The greater good has seldom outweighed my self-interest,” but I like to think that self-interest and concern for others are not mutually exclusive. In fact, each of us acting in our own self-interest while respecting the rights and property of others is probably the fastest way to the greater good. So I like this too.

My biggest beef with Avengers: Endgame is that the Black Panther universe is shoved to the back of the bus. Its citizens don’t show up until the third hour of the film; they literally stand at the back of the Avengers group in a significant funeral scene; and the token non-BP black Avengers, War Machine (Don Cheadle) and Falcon (Anthony Mackie), play secondary roles throughout the battles. Even Okoye (Danai Gurira), who survives the Thanos vaporization in Infinity War and thus ought to be fighting alongside the Avengers throughout the movie, makes only a token appearance in the first two-thirds of Endgame (to report on an earthquake under the sea). This is an astonishing affront after the brilliant success of Black Panther in 2018, and serves to highlight the self-righteous hypocrisy of Hollywood liberals. What were they thinking?

My second beef is actually a chortle at Hollywood do-gooders who couldn’t quite figure out what was evil in Thanos’s plan, since its goal is to remove one of their pet peeves — the overpopulation that is supposedly destroying the planet. One would almost expect them to see this save-the-planet-by-eliminating-the-humans tactic as a good thing. (Indeed, many historians argue that the plague produced an economic boon in the Middle Ages by creating a labor shortage that drove up wages.) And in fact, Captain America comments to Black Widow at one point, “I saw a pod of whales when I was coming in, over the bridge . . . Fewer ships, cleaner water.” What a good guy that Thanos is!

Nevertheless, the film gives us only a glimpse of life after near-annihilation, and it is glaringly inconsistent with the “liberals’” worldview. The reduction in population seems to have resulted in a dystopian future; five years later, cars are still abandoned where they were left by their drivers when they were vaporized, and our heroes are living an agrarian life in the woods. Okay, that makes sense. If we lose the people who run the factories, drive the trucks, service the power grid, and pump the oil, life is going to become pretty bleak, I think. At least back-to-basics. I wanted to see more of how the loss of half the population of earth affected trade, manufacturing and production, but everything was presented in a very hodge-podge way.

An example: despite their agrarian existence, Pepper Potts (Gwyneth Paltrow) is making Tony Stark and their daughter peanut butter sandwiches on something that looks suspiciously like factory-produced Wonder Bread. Who’s manufacturing that soft, squishy, sliced bread with its pure white center and thin tan crust? For that matter, who’s manufacturing the peanut butter? Who’s making their daughter’s machine-knit sweater, her jersey-knit leggings, and her cute little pink tennis shoes? Evidently the Gen-Xers and Millennials who run Hollywood these days don’t understand where products come from, besides the store (or Amazon). Meanwhile, the Internet works, the computer and communications systems work, kids are taking selfies on their cell phones, and somehow food supplies are getting to the diner everyone patronizes. I wanted to give everyone in Hollywood a copy of “I, Pencil.”

Despite such inconsistencies in the setting and plot, not to mention the ideas, and despite my not having watched at least half of the films leading up to this denouement, I have to admit I was moved by the story. I think it was largely because I watched it with my own set of tropes and understandings about good and evil, sacrifice and redemption, resurrection and restoration. The relationships are well portrayed, and I bought into the battle, although I would have liked a clearer philosophical conflict than “Please don’t kill everyone.” I appreciated the fact that some characters changed and chose their own paths. Like my grandson, I will probably see Endgame again.

Editor's Note: Review of "Avengers: Endgame," directed by Joe and Anthony Russo. Marvel Studios, 2019, 181 minutes.

© Copyright 2019 Liberty Foundation. All rights reserved.

Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.