The Last Cargo


The launch last month of “The 1619 Project” by The New York Times unleashed a barrage of partisan volleys and countervolleys consisting mostly of debatable claims, finger-pointing, and innuendo. It’s predictable for the polarized times we live in. Some of the Democratic presidential candidates are calling for reparations for the nation’s original sin of slavery (were the 360,000 Union deaths not enough?), and we’ve elected a president whom many consider racist — an accusation made glibly and promiscuously, with the word so broadly defined and overused that it has become as meaningless as “love.”

The NYT presents the project as an appropriate bookend to the 400th anniversary of the arrival at Jamestown of the White Lion, the first slave ship to reach the continental US. But perhaps a more apt bookend is the arrival of the last slave ship, the schooner Clotilda, in 1859 (or 1860, according to some sources), 240 years after the White Lion docked at Point Comfort in Virginia.


Although the transatlantic slave trade had begun in the early 1500s, with destinations in the Caribbean and Brazil, the 20-odd Angolans aboard the White Lion — taken against their will — were classified as indentured servants, and some of them later acquired their freedom, consistent with that status. Even the children of those Africans who did end up as slaves were born free — according to the laws of that time.

It’s taken quite a while for the Clotilda’s story to air. Zora Neale Hurston (1891–1960), whose four grandparents had all been slaves, interviewed Cudjo Kossola Lewis, the second-to-last survivor of the Clotilda, in 1927. She’d trained as an anthropologist under the tutelage of Franz Boas, considered the “Father of American Anthropology,” and this was her first serious project.

Boas had introduced and firmly established the concept of cultural relativism as an investigative axiom: “a person's beliefs, values, and practices should be understood based on that person's own culture, rather than be judged against the criteria of another.” As a field tool, the concept allowed Hurston to present Kossola’s narrative more objectively, through his own eyes. She’d attended Howard, Barnard, and Columbia; her classmates included Ruth Benedict and Margaret Mead (who got a bit creatively carried away with the cultural relativism bit in the South Pacific).


Barracoon: The Story of the Last “Black Cargo,” the book that resulted from Hurston’s interviews with Kossola, remained unpublished until May 2018. Back in 1931, Viking Press rejected it, saying it would accept the manuscript only if Hurston rewrote Kossola’s vernacular into standard English. They had a point (though not the one I want to make right here). Hurston’s transcription of Kossola’s dialect is inconsistent, slaloming between her efforts at accurate transcription (which she thought essential) and reversion to more conventional English when the task became overwhelming. Additionally, according to novelist Alice Walker’s foreword to the book, “There was concern among ‘black intellectuals and political leaders’ that the book laid uncomfortably bare Africans’ involvement in the slave trade.”

And that’s not the only inconvenient truth buried in Barracoon. Hurston, a black female anthropologist, was an independent thinker. She opposed school integration and programs that guaranteed blacks a right to work. In 1955 she claimed that "adequate Negro schools" already existed. (See John M. Eriksen, Brevard County, Florida: A Short History to 1955, chapter 13; and "Negro Writer Opposes Court Ruling,” Titusville Star Advocate, September 30, 1955, p. 2.) And she was a Republican during the New Deal.

Although John McWhorter, a linguist and Associate Professor of English and Comparative Literature at Columbia, has called Hurston "America's favorite black conservative," she’s been more properly characterized as a libertarian by David and Linda Beito ("Isabel Paterson, Rose Wilder Lane, and Zora Neale Hurston on War, Race, the State, and Liberty," Independent Review 12, Spring 2008). She was no social conservative and, in foreign policy, was a noninterventionist. And then there are the controversial watermelons.


Kossola lived life to his own rhythms, tending his gardens and staying active in his church. To ingratiate herself with him and unlock a volubility concealed behind an apparent reticence, Hurston would bring peaches, hams, and watermelons as gifts. Once they shared an entire iced watermelon, gnawed down to the rind, taking up all their allotted interview time but unlocking a trust and warmth that sealed a lasting friendship. Whether the stereotype associating blacks with a taste for watermelon already existed is a question best left to pop historians. But Alice Walker, in her foreword to Barracoon, again picks up the racial trope: “Imagine how many generations of black people would never admit to eating watermelon!”

* * *

Kossola, an Isha Yoruba, was captured in a slave raid by the army of King Glélé of Dahomey (in present-day Benin), which consisted of about 7,000 male warriors and 5,000 female warriors — the renowned Dahomey Amazons. He was 19 and engaged to be married. His village was stormed by the Amazons, all belted with the dangling heads of opponents killed in battle. Kossola reported that they were the equal of any man. The old and infirm were decapitated on the spot (so much for the nurturing nature of the gentler sex). Meanwhile, the male Dahomey warriors were stationed at the gateposts to ambush and capture the fleeing villagers, Kossola among them.

After about four weeks in transit, three of them spent in a barracoon — a holding cell for newly captured slaves — the captives were treated to a big feast by their captors: “the people of Dahomey come bring us lot of grub for us to eatee ’cause dey say we goin’ leave dere. We eatee de big feast,” recalled Kossola.

Captain William Foster, owner, builder, and skipper of the Clotilda — which was anchored outside the surf zone (the port of Whydah lacking any docking facilities) — purchased 130 of the captives, choosing equal numbers of males and females. Although branding was offered, he “preemptorily” [sic] forbade it. Each captive cost $50 to $60 on the coast but could be sold in Alabama for about $800 apiece (nearly $23,000 in today’s money).


Foster’s fellow investors in this slaving venture were the brothers Jim, Tim, and Burns Meaher, from Maine. The Captain carried $9,000 in gold. The Clotilda was manned by a crew of 12 (all Yankees), himself included. The importation of slaves had been illegal since 1808, but smuggling them in from Cuba was common practice at the time, and the Meahers, who owned a mill and shipyard, built swift vessels for blockade running and “filibustering expeditions.”

Transporting and loading the captives onto the Clotilda through the heavy Atlantic surf required the services of skilled men of the Kroo tribe, an ethnic group of independent operators who specialized in negotiating breakers with sleek surf boats. Their skills as mariners were so expert that the Royal Navy enlisted many of them from 1820 to as late as 1924. But Kroo canoes had limited capacities. Kossola, naked and terrified, thought he’d breathed his last. He was the last captive loaded onto the Clotilda.

After 116 of the slaves had been brought on board, Foster became aware of possible treachery: the Dahomeans were planning to recapture the cargo and hold him hostage. He immediately gave orders to abandon the cargo not already on board “and to sail away with all speed.”


The Clotilda got away, but the next day was chased by an English cruiser on the lookout for slavers. Foster escaped by pressing sail. The slaves down in the hold were in cramped conditions (although they had much more space — five feet of headroom — than many slaves in previous Middle Passage transports). After being kept below decks for 12 days, mainly because of alarms real and false, they were brought on deck so they might limber up. The captain ordered the crew to help them walk and exercise. For the rest of the passage, except for the twentieth day, when another British cruiser was spotted, and near the end, on the approach to Mobile, they spent most of their time on deck — 116 slaves to 12 crew. Only two died. The crossing took 70 days.

According to Hurston, the Clotilda arrived in Mobile Bay under cover of darkness in August 1859 (other sources say July 9, 1860). Fear of discovery and prosecution, and the fact that blacks illegally brought in could not be enslaved, made their sale problematic. In fact, Foster and the Meahers were later tried in federal court in Mobile but not convicted, for lack of evidence: the Clotilda had been burned and sunk, its manifest destroyed, and the black captives well hidden. Other sources say they were found guilty and assessed heavy fines, which were never paid. The outbreak of the Civil War prevented further pursuit of the case.

Forty-eight of the slaves were secretly sold. The remaining 60 (according to one of the discrepant sources) were divvied up among the principals: James Meaher took 32 (16 couples), Burns Meaher took ten, Tim Meaher eight, and Captain Foster ten. Kossola went to Jim Meaher, under whom he acquired the name Cudjo Lewis. “Cudjo” was a name given by the Akan people of Ghana to children born on a Monday, while “Lewis” is probably a corruption of Kossola’s father’s name, Oluale, which Meaher had difficulty pronouncing.


Cudjo reported that they were not immediately put to work; first they were trained by American slaves, who ridiculed the Africans for their ignorance and “savage” ways. He became a stevedore loading wood for Jim Meaher’s cargo boats on the Mobile to Montgomery run. He was worked hard, but praised his master for taking good care of his slaves:

Cap’n Jim, he a good man. He not lak his brother, Cap’n Tim. He doan want his folks knock and beat all de time. He see my shoes gittee ragedy, you know, and he say, ‘Cudjo, if dat de best shoes you got, I gittee you some mo’!’ Now das right. I no tellee lies.

“Cap’n” Tim’s brother Burns was also cruel, but the cruelty seemed to have its limits. Cudjo reported on the slaves who worked the brothers’ plantation fields:

Dey got overseer wid de whip. One man try whippee one my country women and dey all jump on him and takee de whip ‘way from him and lashee him wid it. He doan never try whip African women no mo’.

This astonishing account of slave resistance without repercussions is reminiscent of a similar incident reported by Frederick Douglass in his autobiography, Narrative of the Life of Frederick Douglass, an American Slave.

* * *

Douglass was a proud, headstrong man. Like my brother John — who was drafted for the Korean War and looked forward to proudly serving his country, only to be discharged for being “temperamentally unsuited to taking orders” — Douglass was not cut out for servitude. Instead of “proudly serving his master,” he simply performed his duties . . . as long as he was treated with respect.

As a boy, Douglass had not only been treated well, he’d been taught to read and write. But he ended up with a master who had “quite a number of differences” with him. During the nine months Douglass spent with Master Thomas, “he had given me a number of severe whippings, all to no good purpose.” So Thomas decided to send Douglass to Edward Covey, a man who, for a price, specialized in breaking recalcitrant slaves.


Covey set about the task with alacrity, putting Douglass to brutal work in the fields: “During the first six months, scarce a week passed without his whipping me.” He was treated so brutally that he admitted that at one point “Mr. Covey succeeded in breaking me.” But this was not to be permanent.

After a series of particularly brutal beatings, Douglass decided — to his own surprise — to fight back . . . come what may. During a two-hour tussle, Douglass “drew blood” and got, by far, the better of the encounter. Covey retreated. “The whole six months afterward, he never laid the weight of his finger upon me in anger. This battle with Mr. Covey was the turning point in my career as a slave.” From then on, civility — or what passes for civility in a master-slave relationship — was the order of the day. Douglass performed his duties; Covey let him be.

Douglass’ analysis of the event demonstrates the intelligence and wisdom of this young man. Covey could only go so far. Slaves were extremely valuable property: to render one unfit for service was financial suicide — not to mention that Douglass wasn’t his slave (or that Douglass had seriously beaten Covey).

Mr. Covey enjoyed the most unbounded reputation for being a first rate . . . negro-breaker. It was of considerable importance to him. That reputation was at stake; and had he sent me — a boy about sixteen years old — to the public whipping post, his reputation would have been lost; so, to save his reputation, he suffered me to go unpunished . . . I was nothing before; I was a MAN NOW.

The incident was so pivotal to his life that Douglass devoted 11 pages of his first book to it, and 32 in his subsequent autobiography. In contrast (and I digress here somewhat), David W. Blight, in his Pulitzer Prize-winning biography, Frederick Douglass: Prophet of Freedom, gives it only two pages and ignores Douglass’ analysis and insights.


I suspect an ideological bias, one that surfaces in his introduction:

[Douglass was] a proponent of classic nineteenth-century political liberalism . . . he strongly believed in self-reliance . . . but fundamentally was not a self-made man.

Let’s take a closer look at these assertions. Inserting “nineteenth-century” between “classic” and “liberalism” implies, to these libertarian sensibilities, that classic liberalism was an outdated, even discarded philosophy. But nothing could be further from the truth. Classic liberalism is alive and thriving today. And to say that Frederick Douglass, the epitome of a self-made man, was not a self-made man is to contradict all the evidence contained in Blight’s flawed tome. In at least a dozen instances in the book — instances of Douglass solving problems, escaping bondage, rising to the occasion, creating opportunities, helping others — Blight is forced, unambiguously, to aver that Douglass was in fact a “self-made man,” using those exact words (p. xv).


Douglass’ examination of Mr. Covey’s behavior is a classic liberal analysis of conduct based on economic self-interest, a perspective that Blight either refuses to acknowledge or simply ignores. It does not fit his worldview, and he refuses to give it airtime — in spite of the fact that the event, and Douglass’ analysis of it, was formative in Douglass’ life.

Blight reveals his biases more artlessly whenever he mentions Republicans — never mind that, for abolitionists, Republicans were the only game in town. About the 2013 unveiling of a statue of Douglass in Washington DC, Blight’s introduction condescendingly observes:

Congressional Republicans walked around proudly sporting large buttons that read FREDERICK DOUGLASS WAS A REPUBLICAN. Douglass descendants present, as well as some of us scholars with, shall we say, different training and research, smiled and endured.

Yes . . . that was in 2013. But Blight can’t help projecting modern biases into the past, through subtle wording and innuendo throughout the book, especially when Douglass becomes active in Republican Party politics. This is but one reason why the book was a chore to get through. Blight is no Ron Chernow or Robert Caro.

* * *

But back to Cudjo Kossola Lewis. The Africans were unaware of the start of the Civil War, but when the Union blockade and the surrounding fighting made food scarce, “Cap’n Jim Meaher send word he doan want us to starve, you unnerstand me, so he tell us to kill hogs. He say de hogs dey his and we his, and he doan wantee no dead folks.”

On April 12, 1865, only three days after Robert E. Lee’s surrender at Appomattox but after five or six (some sources say four) years of Cudjo’s life as a slave in America, Union soldiers told him he was free. The Africans celebrated by making drums and beating them “lak in de Affica soil.” Their first inclination was to return to Africa: “dey [the Meahers and Foster] ought take us back home.”

Even when they discovered the cost of such an improbable venture, they worked hard and saved their money. But finally, deciding that going back to Africa was unrealistic, they deputized Cudjo to approach the Meahers for land to settle on. Tim, the meaner of the Meahers, jumped to his feet and responded, “Fool, do you think I goin’ give you property on top of property? I tookee good keer my slaves in slavery and derefo’ I doan owe dem nothing? You doan belong to me now, why must I give you my lan’?”


Notwithstanding Tim’s rebuff, James, the kinder and gentler Meaher, might have helped finalize the deal. The Africans bought Meaher land three miles north of Mobile at Magazine Point, establishing a settlement they called Africatown — now known as Plateau — in 1866 (the date Hurston provides; according to Sylviane A. Diouf, in the Encyclopedia of Alabama, Cudjo bought two acres on September 30, 1872, for $100 — about $2,000 today).

Cudjo became a naturalized American citizen, married, had six children, and was made sexton of his church. In 1902, while driving his buggy over train tracks, he was hit by a train and injured. A sympathetic white lady who saw the accident ensured he was well taken care of and told him he had a case against the railroad. Cudjo knew nothing about American law, so the lady hooked him up with a lawyer who took his case against the Louisville and Nashville Railroad on contingency. Cudjo won and was awarded $650.

But he never collected. Cudjo reported that after the verdict, a yellow fever epidemic hit Mobile. The lawyer and his family headed north to safety, but on the way the lawyer died. Yet another source (Encyclopedia of Alabama) says that the verdict was overturned on appeal.

Cudjo Kossola Lewis died on July 17, 1935.

* * *

In these times of “fake news,” the publication of Barracoon — finally — should be a breath of fresh air. I say this notwithstanding the fact that while writing this review I discovered so many discrepancies in the story that I’m left wondering how to account for them: a year’s difference in the arrival of the Clotilda in Mobile; the number of years Cudjo spent in bondage; the resolution of Cudjo’s lawsuit; Hurston’s purported plagiarism from earlier sources; and other, more minor controversies. They seem to be endemic to the genre.

Whatever the causes of the discrepancies in Kossola’s story, at least they don’t seem to be rooted in ideological manipulation — a shortcoming that has bedeviled American slave narratives since at least the times of William Lloyd Garrison. Antebellum abolitionists resorted to widespread hyperbole concerning the horrors of slavery in order to convince an ill-informed and often indifferent public.


Yes, I know, you’re thinking, How can one overstate the evils of slavery? It’s like exaggerating the fires of hell. I don’t know about you, but accuracy works best to convince me about anything. When people resort to lies, or just don’t check their facts well enough, I lose trust, no matter how well-intentioned the narrative may be.

The altering of facts continues to this day. The movie 12 Years a Slave, based on the book of the same title (and reviewed in these pages by Jo Ann Skousen, “A Slave Narrative, and More,” November 10, 2013), contains at least four falsifications, all of them ideologically based. Jo Ann points out one of them:

Some of the vignettes simply don’t ring true, as when the lecherous and sadistic slave owner, Edwin Epps (Michael Fassbender) whips Patsey (Lupita Nyong’o) almost to death because she has spoken back to him. Patsey is his most productive slave. She picks twice as much cotton every day as any of the men do. She is a valuable, unblemished piece of property, even if he doesn’t acknowledge her humanity. It does not make sense that he would destroy such a valuable capital good in a fit of pique.

The movie also depicts the slave owner William Ford (played by Benedict Cumberbatch) in quite another light than the one in which Northup, the slave (played by Chiwetel Ejiofor), described him in his book: “There never was a more kind, noble, candid, Christian man than William Ford.”


Falsifications like the one Skousen points out are particularly egregious. Not only do they go against basic economic theory, but they also paint human nature in the worst light possible.

This, from The Atlantic:

In the film version, shortly after Northup is kidnapped, he is on a ship bound south. A sailor enters the hold and is about to rape one of the slave women when a male slave intervenes. The sailor unhesitatingly stabs and kills him. This seems unlikely on its face — slaves are valuable, and the sailor is not the owner. And, sure enough, the scene is not in the book. A slave did die on the trip south, but from smallpox, rather than from stabbing.

But the worst one, which I haven’t seen referenced elsewhere, involves a passage in the book where Northup is sent on an errand that requires crossing a gator-infested bayou. Along the way, he encounters an alligator and sweats bullets. In the movie the scene is changed: instead of an alligator, he encounters two rednecks whooping it up as they hang a black man.

Please!

The heyday of lynching blacks came after the Civil War, not in the 1840s, when slaves were worth, on average, about $23,000 in today’s money. And though crackers were the foot soldiers of the Ku Klux Klan during and after Reconstruction, only the well-off could afford to own slaves before the war, and they weren’t likely to burn $23,000 for fun.

Barracoon discloses some inconvenient truths, and in doing so, to my mind, enhances the credibility of accounts of the horrors of slavery by revealing not just the institution’s inhumanity but the glimpses of humanity that at times appeared. Caricatures and satire succeed only with the ignorant and the convinced.


And instances of behavior that temper the conventional narrative of slave societies run through many slave biographies. Besides Douglass’ and Northup’s (dictated, despite Northup’s literacy, to David Wilson), check out The Life of Olaudah Equiano; Prince Among Slaves; Incidents in the Life of a Slave Girl; The History of Mary Prince; and The Barber of Natchez (about a free black in 1830s Mississippi).

Perhaps it’s our knowledge of the Holocaust that makes some of us project its atrocities back onto our slavery era. I don’t know. But for now, let’s keep the two separate and not make too many generalizations about universal human behavior. Truth is the best antidote to propaganda, however well-intentioned.


Editor's Note: Review of “Barracoon: The Story of the Last ‘Black Cargo,’” by Zora Neale Hurston. Amistad, 2018, 171 pages.





Despised, But Not Resisted


After reviewing Quentin Tarantino’s The Hateful Eight (2015), I swore I had seen my last QT film. The acting was stagy, the bloody violence gratuitous, the storyline beyond unbelievable. He hadn’t just “jumped the shark”; he had catapulted the cow over the moon. I was done.

But something about his latest offering, Once Upon a Time . . . in Hollywood, drew me back. The stellar cast, led by Leonardo DiCaprio and Brad Pitt, promised committed, unexpected performances. The setting, 1960s southern California, was where and when I grew up, and I was drawn to the nostalgia I would certainly experience. And the story, leading up to the Manson murders, was not only tragic but also somehow romantic in the classical sense — a story that captured the interest of the nation when it occurred. Many say the ’60s died that day, along with Sharon Tate and her friends. Yes, I assumed there would be blood (and there is), but at least it wouldn’t be gratuitous this time. And in fact, it doesn’t show up until late in the film. I was willing to give QT another look.

Similar to Tarantino’s breakout Pulp Fiction (1994), Once Upon a Time presents multiple disconnected storylines while foreshadowing an explosive climax. Rick Dalton (DiCaprio) is a TV western star nearing the end of his career. Dalton is based not-so-loosely on Clint Eastwood in “Rawhide” or Steve McQueen in “Wanted Dead or Alive.” Like Eastwood, Dalton is encouraged to move to Europe to make spaghetti westerns with a director named Sergio. And like McQueen, he is a bounty hunter in his TV series. Also like McQueen, Dalton carelessly knocks a young girl onto the floor in a movie scene; McQueen is reported to have knocked an actress across the room during a “method acting” improvisation in class. After the scene cuts, the little girl tells Dalton, “That’s the best piece of acting I’ve ever seen.” McQueen’s acting teacher is said to have told him much the same after he smacked the young starlet.


McQueen himself shows up in the film, by the way — played by Damian Lewis, who utterly nails McQueen’s piercing eyes and brooding mouth. The film is full of homages and allusions such as this, and one could enjoy it just by looking for the Easter eggs. Tarantino knows his Hollywood trivia! But there is much more to this movie than homage.

Another storyline focuses on Cliff Booth (Brad Pitt), who works as Dalton’s stunt double and gofer. He drives Dalton around town, grabs him a beer when he’s thirsty, runs his errands, fixes his antenna, listens when he’s despondent, and does it all with that winning Brad Pitt smile. Audiences at the premiere in Cannes whistled and clapped when Pitt stripped off his shirt to work on said antenna. At 55, Pitt is still plenty buff. Dalton might be the protagonist, but Booth is clearly the star. Even the way he side-clicks his tongue, signaling to his dog that it’s time for dinner, is gobsmacking.

Dalton lives next door to Roman Polanski, who he hopes will notice him and cast him in a movie. Meanwhile, Sharon Tate (Margot Robbie), Polanski’s pregnant wife, is luminously happy about her breakout role in a Dean Martin movie, The Wrecking Crew. Having started her career in TV shows like “Mister Ed” and “The Beverly Hillbillies,” she is understandably ecstatic to see her name and image on a movie poster. Robbie plays her shy excitement just right — almost embarrassed to look at the poster in the movie theater lobby, wanting to be recognized, finally having to say who she is, then basking in the recognition she has created and positively glowing as she listens to the crowd reacting to her scenes. You can’t help feeling empathy for this pretty girl whose life was cut so gruesomely short, back in 1969.


And then there’s Charlie Manson (Damon Herriman), who makes a brief appearance at 10050 Cielo Drive in Benedict Canyon, looking for its previous resident, Terry Melcher. We see him almost as a shadow, a ghost that hovers and lingers without really touching down. His “family” — Squeaky Fromme (Dakota Fanning, all grown up and sporting a potty mouth), Froggie (Harley Quinn Smith), Pussycat (Margaret Qualley), and Tex (Austin Butler) — provide a constant simmering background to the film and an ongoing foreshadowing of the climax we know is going to come. They dive into dumpsters, thumb rides on street corners, maraud through the town like bandits, and preen like sirens. They are spooky and scary, even without blood. Take note, QT.

As expected, the stories eventually come together, but in unexpected ways. And that’s all I’ll say about that.

Despite its length and somewhat slow development, this is Tarantino’s best work since Inglourious Basterds. I will probably see it again, next time to focus more on the Hollywood allusions and Easter eggs. And, reluctantly perhaps, I will continue to see and review Tarantino’s movies. He is the most maddening and brilliant of directors. I despise him — but I can’t resist him. Ironically, I think that’s what the “family” said about Manson.


Editor's Note: Review of "Once Upon a Time . . . in Hollywood," directed by Quentin Tarantino. Sony Pictures, 2019, 161 minutes.





Four Theories about the Great Depression


More than most people, libertarians have beliefs about the Great Depression. Having spent several years studying the matter, I have some conclusions about four such beliefs: first, that what caused the depression was the Federal Reserve allowing a drop in the money supply; second, that what made it terrible was the passage of the Smoot-Hawley Tariff, which collapsed America’s foreign trade; third, that the New Deal really began under Herbert Hoover; and fourth, that what lengthened the Depression was fear of what the New Deal government would do.

In addressing these questions, I am relying heavily on my hometown newspapers — the Seattle Times, Seattle Post-Intelligencer, and Seattle Star — because newspapers are “the raw material of history.” They are not the only sources available, and they have their mistakes, omissions, and biases. But they are broader than politicians’ collected personal papers and broader, in a different sense, than the economists’ statistical tables. As sources for general research about a period, I like newspapers best. I know newspapers. I spent 37 years working for newspapers and magazines, about half that time on the business and financial pages.

The first of the four beliefs, associated with Milton Friedman and the Chicago School, is that the Federal Reserve was responsible for turning a recession into a depression — the deepest and longest in American history — by shrinking the money supply. It’s true that there was less money in people’s pockets, and that was a bad effect. But when economists talk about the Fed shrinking the money supply, they mean shrinking the money available to the banks — and during most of the Depression banks were loaded to the gunwales with money. With few willing and qualified borrowers, they simply parked depositors’ money in US Treasury bonds and local bonds and warrants (thereby helping to finance their local governments and the New Deal). Bankers talked about this on the business pages, and showed it in the year-end bank balance sheets presented in newspaper display ads. For those reasons I find it difficult to indict the Fed for starving the banking system of money.


A variant of this argument is that the Fed mistakenly turned a recession into a depression by raising interest rates.

Overall, the Fed lowered interest rates in the Depression. In the two years following the Crash of 1929, the Fed cut its rate on short-term loans to banks from 6% to 1.5%. But to stop the outflow of the Treasury’s gold during the currency crisis of September 1931, the Fed temporarily raised the rate to 3.5%. This two-percentage-point bump is the “mistake” that the economists holler about. At the time the Fed did this, critics said it would retard recovery, and when recovery didn’t come, the critics pronounced themselves right. But at the time, the financial editor of the Seattle Times noted that the Fed’s supposedly stimulative 1.5% interest rate hadn’t done anything to stimulate recovery. (The Keynesians would later say the Fed was “pushing on a string.”) Investors weren’t holding back because of two percentage points. They were holding back because they were afraid to borrow at all.

I’m not a historian of the Fed, and I am not claiming the Fed made no mistakes. But pinning the Depression on the stinginess of the Fed toward the banks doesn’t seem right. If it were true, interest rates would have been higher. Also, there would have been furious complaints in the newspapers, even in Seattle. And I saw none.


The second belief is that the Smoot-Hawley Tariff caused the Depression by posting the highest taxes on imports in the 20th century. The figure usually cited is that the average tariff rate under Smoot-Hawley was 59% — a horrible rate. This, however, was the rate on dutiable goods, and excludes the many goods on the free list. The average rate on all goods was 19.8% — still bad, but something less than torture.

Free traders always reach for the Smoot-Hawley argument. I have heard it not only from libertarians but also from supporters of the WTO, TPP, and NAFTA, and from the promoters of trade in my hometown. And politically, I am on the free traders’ side. I agree that the Smoot-Hawley Tariff, signed in June 1930 by Herbert Hoover, was bad medicine. And in this case, there was protest in the newspapers, with voices saying it was a terrible, self-defeating law and predicting that other countries would retaliate. The newspapers ran stories when the other countries did retaliate.

Smoot-Hawley was also a contributing cause of the collapse in the international bond market in 1931, because it made it more difficult for America’s debtors — Britain, France, Germany, Brazil, Bolivia, Peru, and others — to earn the dollars to repay their debts. But this one bad law cannot bear all the blame for the subsequent implosion of America’s imports and exports.

I can think of four reasons why. First, the Depression was already on, so that by June 1930 imports and exports were already headed downward. Second, if you want to blame tariffs, put two-thirds of the blame on the tariffs in place before Smoot-Hawley was signed, which were an average of 13.5% on all goods. Third, in 1930 exports made up only about 5% of US output (versus 12.5% today), so that the shrinkage in trade, though dramatic in itself, was only two or three percentage points of the overall economy.


Finally, in September 1931, the British Commonwealth went off the gold standard. The British, Australian, and Canadian currencies were immediately devalued by 15 to 20%. Austria, Germany, Japan, and Sweden also went off gold, effectively devaluing their own currencies. The products of these fiat-money countries immediately dropped in price relative to the products of the United States. One example: Swedish wood pulp pushed US pulp out of world markets, so that almost all the pulp mills in Washington state shut down.

When Franklin Roosevelt came into office in March 1933, he ended the convertibility of the dollar into gold at the old rate of $20.67 an ounce. The reason for doing this was not a shortage of gold; the Treasury had stacks of it. The reason was to match the foreign devaluations and make American goods competitive again. And it did. Trade, the stock market, and the real economy jumped immediately when the dollar went off gold. From April to July 1933 there was a kind of boom, even though Smoot-Hawley was still in effect. (The boom ended because of the National Industrial Recovery Act and some other things, but that is another story.)


The third belief, that Herbert Hoover was an interventionist and implemented a kind of proto-New Deal, is a thesis of Murray Rothbard in America’s Great Depression. Rothbard recounts that after the Crash of 1929, Hoover called leaders of industry to the White House and made them promise not to cut wages. The theory at the time was that this would maintain “purchasing power” and thereby prevent a depression. That was a precedent for the New Deal. It was noted at the time by business columnist Merryle Rukeyser (father of Louis Rukeyser, host of PBS-TV’s “Wall Street Week” from 1970 to 2002). Merryle Rukeyser wrote in December 1929 of the Hoover meetings, “The old-fashioned idea of leaving such matters to the individualism of business leaders — known as the doctrine of laissez faire among economists — has been formally laid to rest and buried.”

So Rothbard had a point: in principle, Hoover was an interventionist. But if you focus on principles, which libertarians like to do, you can lose sight of magnitudes and proportions that matter more. The larger fact is that the Hoover and Roosevelt regimes were hugely different in what the federal government undertook to do, what constitutional precedents they set, how many people they employed, how much money they spent, and how much they affected the world we still live in.

The fourth belief, that the New Deal prolonged the depression by frightening investors, is the thesis of libertarian historian Robert Higgs in his essay, “Regime Uncertainty: Why the Great Depression Lasted So Long and Why Prosperity Resumed After the War.” (Reprinted in Depression, War and Cold War, Independent Institute, 2006.) Higgs argues that the Depression lasted for more than ten years because of “a pervasive uncertainty among investors about the security of their property rights in their capital and its prospective returns” during the later New Deal of 1935–1940.

I can’t comment on much past the beginning of 1935, because that’s where I am in my reading. But I can verify that “regime uncertainty” was real, and that I saw evidence of it beginning in mid-1933, when the initial Roosevelt boom faltered.


In the newspapers I read, the best barometer of this is B.C. Forbes’ business-page column. Forbes — the founder of the eponymous magazine — was very much a pro-capitalist guy. (The magazine calls itself a “capitalist tool.”) Forbes once wrote that his job as a newspaper columnist was to explain the economy to ordinary readers by interviewing industrialists and bankers. Much of the time Forbes was a transmission belt of their doings, thoughts, and feelings along with his own.

It was predictable that Forbes would not like the New Deal. At first he advised his business readers to swallow it and said he was loyally swallowing it himself. But he quickly began choking on the two principal “recovery” programs, the Agricultural Adjustment Act (AAA) and the National Recovery Administration (NRA). The NRA’s boss, Gen. Hugh Johnson, was a loud, imperious man who had run military conscription for President Wilson during World War I. During the early New Deal, Johnson helped to popularize two expressions: to chisel, meaning to lower one’s price below the government minimum, and to crack down, meaning to punish. In July 1933, Johnson went right to work, cracking down on the chiselers in American industry.

General Johnson was the closest that peacetime American business ever had to a military dictator. In August 1933, Forbes called him “a Vesuvius, in epochal, thundering eruption . . . Not even Teddy Roosevelt in his most explosive days matched General Johnson’s Titanic energy and action — or his wielding of the big stick.”

And: “Mussolini has nothing on him in readiness to undertake multitudinous tasks and to swing the Big Stick.” (This was when Italy’s dictator, Benito Mussolini, was popular with many Americans.)


In the fall of 1934, when Gen. Johnson was replaced by labor attorney Donald Richberg, Forbes wrote: “Reason is expected to replace ranting swashbucklerism.” Forbes loved to publicize good omens, but during these years he was repeatedly disappointed.

In March 1934, Forbes quoted an anonymous industrialist (probably Charles Schwab of Bethlehem Steel, whom he named elsewhere in the column): “No, don’t quote me as saying anything that would sound like criticism of the administration or any branch of it. It’s too dangerous. I don’t want to be cracked down on at this time when Washington has unlimited power to do what it likes.”

Later in the same month Forbes wrote, “The fear today is not of the law but of bureaucrats. Few employers regard themselves as in a position to stand up against dictation as Henry Ford has done.” (Ford had refused to accept the NRA’s “voluntary” price and production controls, and was not allowed to display the Blue Eagle and its motto “We Do Our Part.”)

One of Forbes’ October 1934 columns was an open letter to Franklin Roosevelt, titled in the Seattle Post-Intelligencer “Mr. President, All Employers Aren’t Crooks.”


Forbes was not the only wellspring of business angst. Here is Merryle Rukeyser, a man more sympathetic to the New Deal than Forbes, in September 1934: “Business men are in a timid mood because of lack of assurance as to their tax liability and as to the attitude of the powers that be toward business profits.”

A doubter might argue that a handful of newspaper columns aren’t enough to prove Higgs’ thesis. I suppose so; but how would you prove it? It is about a state of mind — “confidence” — and how do you demonstrate that except by considering what people say and do? In fact, investors talked and acted as if they lacked confidence; statistics show a shortage of long-term investment. And in fact, there were statements by Roosevelt and by Hugh Johnson, Harold Ickes, Henry Wallace, Rexford Tugwell, and other New Dealers that might very well cause investors to lack confidence. And it was not only the New Dealers, but also their opponents on the left: Dr. Francis Townsend, who wanted every American over 60 to have $200 a month of government money (about $3,000 in today’s terms); Upton Sinclair, the Democratic nominee for governor who wanted to set up a socialist economy in California; Father Coughlin, a radio priest who ranted against the rich; and Sen. Huey Long, the “Kingfish” of Louisiana, who called his program “Share the Wealth” and who was stopped only by an assassin’s bullet. This was a different time — and newspapers give you a flavor of it.

Of the four beliefs about the Depression I mentioned at the beginning, I think Robert Higgs’ “regime uncertainty” is most clearly verified. (Read his essay!) The crucial fact about the Depression of the 1930s is not that America got out of it; it always gets out. It’s that the getting out took more than ten years, which was longer than any other depression in US history; that Canada, Britain, Germany, and most other countries got out sooner; and that it took a worldwide war and the eclipse of the New Dealers for America to get all the way out.


But I don’t think the depression of the 1930s — the onset of it, the depth of it, the duration of it — was caused by any single thing. The commercial world is more complicated than that. I think the Austrian theory of overinvestment, or “mal-investment,” explains much of the setup of the crash, because in the late 1920s and into 1930 there were a lot of bad investments in real estate, commercial buildings, holding companies, and junky stocks. The Crash in 1929 shrank people’s assets and, more important, their confidence — for years. The Dow Jones Industrials went down almost 90%. The reparations owed by Germany to Britain and France, the sovereign debts owed to the United States by Germany, Britain, and France, as well as by Brazil and other South American republics, all had something to do with it, because in 1931 this grand edifice of debt went down in a heap. The bond market was so thoroughly wrecked that counties, cities, school districts, and corporations were locked out of long-term borrowing for several years. Smoot-Hawley and the whole movement toward economic nationalism had a bad effect. The gold standard deepened the Depression because it imposed a discipline on government finances — heavy spending cuts — at a time when they were painful, and when some countries freed themselves of that discipline it shifted the pain to the other ones. Finally, the anti-capitalist political currents and the ad hoc, experimental, extralegal character of the New Deal frightened investors, whose long-term commitments were needed for economic recovery.

That’s the best I can do. I’m still reading old newspapers.






The Two Socialisms


When I was in college, the selling point of socialism, communism, revolutionary activism, all of that, was something called “participatory democracy.” That’s what the mighty SDS (Students for a Democratic Society) stood for. That’s what the neo-Marxists stood for. That’s what all the “community organizers” stood for. The idea, endlessly reiterated, was that “decisions must be made by the people affected by those decisions.” No one talked about Medicare for all, or government-funded preschools, or government-mandated revisions of the environment. The idea was that centralized “state capitalism” was wrong, not primarily because it was inefficient, or even inequitable in its effects, but because its decisions were not “democratic.” They had not been made by the people affected by them. If state capitalism was inequitable or “slow” (i.e., inefficient), its lack of democracy was the reason.

Now we are witnessing an immense revival of “socialism,” led by Democratic Party opportunists and hacks. And it is all about laws that need to be made to increase the power of the centralized state. It is about giving professional politicians sole power over healthcare, housing, education, transportation, employment, qualifications for voting, and the possibility of self-defense — and all this without the tiniest hint that anyone except the Philosopher Kings who compose the Democratic Majority in the House of Representatives should be consulted. Participation? What’s that?


I have to be honest. I am a foe of “participatory democracy.” I do not believe it is optimal, in any sense, to give power over the individual’s existence to whoever happens to be a coworker, a fellow student, or just a guy who happens to turn up at a meeting. I find myself unable to decide whether a regime of little Red Guards is more repellent than a regime of Bernie Sanders bureaucrats arrayed, rank on rank and cube on cube, to decide what the width of my bathroom door should be.

But I think it’s worthy of notice that American “socialism” has shifted, in our time, from a demotic and “participatory” style to a rule-from-the-top dogmatism, constantly twisting in response to the whims of the politicians but always determined to enforce those whims.

I wonder whether any of the socialists have noticed this. Perhaps they are as ignorant of their own traditions as they are of economics or sociology, or respect for anyone except themselves.






Somebody’s Favorite


In the wake of last year’s militant #MeToo movement, when actresses haughtily proclaimed, “We will no longer be pressured into trading sex for jobs” (and bullied other actresses into wearing black at awards ceremonies to show their solidarity), the Academy this year has bizarrely honored The Favourite with ten Oscar nominations, tying Roma for first place in number and confirming once and for all (as if there were any doubt) that the Academy of Motion Picture Arts and Sciences has zero credibility and doesn’t know what the hell it is doing.

Loosely based on the reign of Queen Anne and her relationships with Sarah Churchill, Duchess of Marlborough, and a servant named Abigail (eventually Lady Masham), the film suggests that the silly and childlike Anne made all of her decisions based on which woman’s tongue pleased her best — and I don’t mean by talking. The film fairly drips with transactional sex, from stagecoach wanking to arranged marriages to child trafficking to extortionate sex to withholding of affection for political positioning to ordinary prostitution. We even see ducks mating.


Despite its praise from a supposedly “woke” Hollywood culture, the film’s theme is simply appalling. Yet Rachel Weisz, who plays Sarah Marlborough, called the film “a funnier, sex-driven All About Eve.” In that film, an established star (Margo Channing) befriends an aspiring actress (Eve Harrington), only to see her try to usurp her position in the theater. Similarly, in The Favourite, a young social climber, Abigail (Emma Stone), formerly an aristocrat but working now as a servant, worms her way cunningly — or in this case, cunnilingually — into the favor of Queen Anne (Olivia Colman) by befriending and then pushing aside the queen’s long-standing confidante and advisor, Lady Churchill (Weisz), simultaneously finagling a financially and socially beneficial marriage to regain her aristocratic status.

Don’t misunderstand my objection — I enjoy a good bedroom farce, with doors slamming, lovers hiding, comic timing, and double entendres galore. But this is different. The Favourite doesn’t just joke about sex; it celebrates the use of sex to gain political power, and hypocritically undermines everything these same preening, moralizing Hollywood hotshots stood up for just last year.

It also seems to justify rape, as long as it’s funny and as long as the women are in charge. When Lord Masham enters Abigail’s servant quarters without being invited, she asks him, “Are you here to seduce me or to rape me?” He responds, “I’m a gentleman.” “To rape me, then,” she deadpans, and the audience chuckles.

Forgive me if I’m wrong, but I thought rape had ceased to be funny, even in the movies. And nary a trigger warning in the trailers. Tsk, tsk.

All I’m asking is that the Academy pick a side and stick with it. Or admit that it really has no backbone or underlying moral principles whatsoever, and quit pretending to have the upper hand on social morality.


So why the accolades for The Favourite? It’s all in the technique (to echo Lady Abigail’s words to Lord Masham on their wedding night, as she turns her back and offers him her hand — you get the idea). First are the obvious acting nominations: all three women have been nominated, and all three deliver stellar performances. Weisz and Stone are deliciously nasty to one another and grovel appropriately, if disgustingly, for Anne’s sexual attention. Colman’s Queen Anne is gouty, needy, dumpy, screechy, and even develops a convincing stroke midway through. She’s amazing. Nominations for the Big Three — Best Picture, Best Director, and Best Screenplay — bring the tally to six.

Of course, any time you make a “costume drama,” you can expect to see a nomination for Best Costume Design, and in this case, it is well deserved. The early 18th century is not a common era for filmmaking, so costume designer Sandy Powell couldn’t just rent the costumes from a local supplier; most of them had to be made specifically for this film. And they are spectacular. The opulent textures and colors, and especially the tailoring details of the pockets, lace, and scarves, are stunning, although the fabrics — including recycled denim and a chenille blanket — are far from authentic. The massive 18th-century wigs are impressive too, and even more impressive because, due to budget constraints, Powell often took the wigs apart after they were used in one scene and remade them for another. Interestingly, Lady Sarah is often dressed in men’s fashions. It prompts the question: can a woman only be powerful if she’s manly?

The opulent costumes fit perfectly within the opulent production design, also nominated for an Oscar, as it demonstrates the aristocratic decadence of the time. England is at war with France, and Queen Anne keeps threatening to double the taxes, but her courtiers are fiddling while the figurative fires burn. We see duck races inside the castle. Live pigeons used for skeet shooting over the sumptuous lawns. Exotic pineapples, imported from who knows where. A naked courtier being pummeled with blood oranges in one of the palace salons, just for fun.


Lord Harley (Nicholas Hoult) says, “A man’s dignity is the one thing that keeps him from running amok,” but we don’t see much that inspires dignity among these characters. In one scene, Queen Anne’s cheeks are painted with heart-shaped rouge, and in a later scene she murmurs distractedly, “Off with her head. Off with her head!” It does feel as though we have fallen through the looking-glass.

Adding to that looking-glass sensation is the bizarre use of fisheye lenses and dizzying panorama shots of interiors that create distorted scenes, almost as though we are looking through a giant peephole. And to a certain extent, we are. Screenwriters Deborah Davis and Tony McNamara based their characterization on letters between Queen Anne and Lady Churchill that indicate an intimately affectionate friendship and chose to play up the lesbian angle as the driving force in their characters and in their politics. All three important women in this film were married, but that doesn’t necessarily indicate heterosexual preference, especially in court marriages.

Still, the sexual relationship between Anne and Sarah — if indeed it existed — was intended to be private and, I hope, loving and intimate and true. The fisheye lenses and peephole angles reinforce that sense of peeking in on something we aren’t supposed to see — and that we might have a distorted impression of what really happened. Although Abigail did eventually take Sarah’s place as the Queen’s Mistress of the Robes, there is no historical indication that Abigail used sex to win the Queen’s affection. Sarah and Anne did indeed have a falling out, possibly over money for building Blenheim Palace, and the Marlboroughs were banished to the continent. Abigail then became the “queen’s favorite,” or personal lady-in-waiting. After Queen Anne’s death the Marlboroughs returned to England and finished building Blenheim. That’s what we know.


The Favourite opened with a limited run in November to a dismal $442,000 box office in its first weekend. Trailers had been somewhat misleading, suggesting that the story was a more audience-friendly knock-down, drag-out catfight between two ladies-in-waiting, not a fairly graphic lesbian love triangle. Either way, it didn’t do well at first. After its Oscar nominations, however, it returned to theaters and as of January 31 had grossed over $42 million worldwide, from an audience of mostly bewildered moviegoers. That’s the power of an Oscar nomination.

Liberty readers might well enjoy The Favourite, depending on where they stand on the situations I’ve described. It’s bizarre in many ways, but it’s also witty, opulent, and well-acted. It presents three powerful women controlling the throne and politics of England in their own womanly way, especially Lady Sarah, who evidently really did have the queen’s ear from their childhood and ruled from Anne’s shoulder until the war with France ended. All three women use their sex for trade, but they do it willingly and deliberately, from a position of power rather than victimhood. Is it possible — even probable — that women in Hollywood have been doing the same thing for over a century, and only cried “outrage!” (and somehow managed to blame Republicans) after they were caught?

The Favourite might even turn out to be your favorite, even though it isn’t mine.


Editor's Note: Review of "The Favourite," directed by Yorgos Lanthimos. Element Pictures, 2018, 119 minutes.





No Escape from Human Nature


Are humans instinctively brutal? Do we attend hockey games, boxing matches, and race car events hoping to see blood? Do we rubberneck at car accidents hoping to see death? Have we really made no moral progress since gladiator games were used as public executions?

The producers of Escape Room want us to think so. From The Most Dangerous Game (1932) to The Naked Prey (1965) to The Hunger Games trilogy (2012–2015), movies have explored the concept of humans hunting humans and have tapped into the idea of execution as entertainment. And that’s what happens in this movie.

Inspired by the escape-the-room genre of video games, real-life escape room adventures have become popular over the past decade in cities all over the world. Contestants are locked inside a room decorated to resemble a haunted house, prison cell, space station, or other isolated location and are given a time limit during which to discover clues, solve riddles, and find the escape hatch. It’s a fun, socially interactive, real-life alternative to sitting in front of a computer screen discovering clues, solving riddles, and finding the escape hatch.

They soon realize that one person will die in each room. Who will it be? What would you do to make sure it isn’t you?

The premise of Escape Room is simple. Six strangers are invited to compete for a high-stakes prize by solving a series of puzzles in order to escape from a series of rooms. Danny (Nik Dodani) is a videogame nerd who has played in nearly a hundred escape rooms before. Zooey (Taylor Russell) is a shy math prodigy with a talent for solving puzzles. Jason (Jay Ellis) is an investment banker with expensive tastes. Ben (Logan Miller) is a stock clerk for a grocery store. Amanda (Deborah Ann Woll) is an army veteran, and Mike (Tyler Labine) is a blue-collar worker. What has brought these six together? And how will they interact under pressure?

The six soon realize, of course, that this is no game. If they fail, they die.

With its PG-13 rating, Escape Room is high on suspense and low on blood and guts, making it an entertaining film as the audience members work along with the characters to solve the riddles and unlock the doors.

What makes the film interesting is the gradual reveal of the characters’ backgrounds and the way they interact with one another as they do what it takes to survive. They soon realize that one person will die in each room. Who will it be? What would you do to make sure it isn’t you? They’re all strangers, after all. They only just met, and they have no personal connection with one another. Will self-interest lead to treachery? Or will goodness win out?

You couldn’t share. There simply wasn’t enough. So you did what you must.

Despite being driven by self-interest, we still seem to want our heroes to be self-sacrificing — at least in Hollywood. We cheered when Han Solo, that maverick businessman of the cosmos, returned to help the rebellion in Star Wars. We took heart when Katniss Everdeen refused to kill her youthful opponents in The Hunger Games. We even approved when Hombre (Paul Newman), the ultimate libertarian hero, reluctantly risked his life to rescue the wife of the thieving, racist Bureau of Indian Affairs agent from the stagecoach robbers.

But in reality, when push comes to shove and our own lives are on the line, what would we do to survive?

I recently listened to The Women in the Castle, by Jessica Shattuck, a fictionalized account of the widows of German resistance leaders and their experiences during and after World War II. It’s a sappy, sentimental novel full of 21st-century morality and clichés. For example, Shattuck refers to “racial profiling” when her characters are asked to show their papers, a term that did not exist during World War II. Moreover, her protagonist is cloyingly egalitarian. She comes from an aristocratic background and thus has special access to food and protection. Yet she refuses to accept those special favors, or at least expresses consternation about accepting them. To hell with accuracy; Shattuck seems compelled to imbue her 20th-century protagonist with 21st-century values, no matter what. Such egalitarianism is a fine principle in times of plenty, but when your children are truly starving or threatened by death, you will accept any special opportunity offered to feed and protect them.

Hollywood conveniently whitewashes the truth about the survival instinct in order to celebrate community, sacrifice, and cooperation.

Actual Holocaust survivor Viktor Frankl belies Shattuck’s politically correct fantasy about genteel survival morality in his concentration camp memoir Man’s Search for Meaning. Frankl reveals a particularly troubling source of survivor’s guilt — he admits that in order to live through the kind of brutal starvation they experienced in the camps, those who survived had to be ruthlessly selfish at times. There might be one piece of potato in the soup pot, and that one piece of potato would determine who had enough sustenance to survive the march to work the next day, and who would collapse in the snow. You couldn’t share. There simply wasn’t enough. So you did what you must to scavenge that bite of potato, reach the warm spot at the center of the mass of prisoners, avoid the front of the line when the camp guards were looking for someone to shoot. You might feel guilty. You might be furtive. But you did it anyway.

In such films as Escape Room, Hollywood conveniently whitewashes the truth about the survival instinct in order to celebrate community, sacrifice, and cooperation. The hero manages to be self-sacrificing and self-interested, to fall on the grenade and make it out alive. And that’s OK. After all, we’re looking for escapism, not realism, in entertaining movies like this one.


Editor's Note: Review of "Escape Room," directed by Adam Robitel. Original Film Production, 2019, 100 minutes.





Remembering the Great War


As the world prepared to commemorate the 100th anniversary of the ending of World War I on November 11, 1918, director Peter Jackson accepted a daunting commission: to create a documentary that would honor the soldiers who fought in the trenches, using the original footage that was filmed 100 years ago.

This would not be a documentary about generals, military strategy, assassinations of obscure archdukes, or theaters of war. Jackson would not interview modern historians about the significance of the war or provide any scripted narration. Instead, Jackson would bring these long-dead soldiers to life by allowing them to tell their own story.

The result is a magnificent piece of work, both in the story it tells and in the technology Jackson used to tell it. This is a film made entirely in the editing room.

This would not be a documentary about generals, military strategy, assassinations of obscure archdukes, or theaters of war.

To create the storyline, Jackson and his team reviewed over 600 hours of interviews with survivors, conducted during various commemorations of the War to End All Wars. Jackson then began selecting portions of the interviews, taking a snippet here and a snippet there, until he was able to cobble together a narrative line that begins with young 16- and 17-year-old boys sneaking off to lie about their ages in order to join the army; follows them into the trenches, villages, and battlefields; and ends with the survivors returning home, many of them injured, many of them “loony” (an earlier term for PTSD), and many of them (according to one of the narrators) facing employment signs that said “Army veterans need not apply.” Their remembrances, told with voices that are cracked with age, are moving and authentic. No historian’s expertise could tell their story better.

Once the storyline had been established, Jackson reviewed 100 hours of footage from the war, selecting the best scenes to match the narration. Much of the footage was third- or fourth-generation, meaning it was a copy of a copy of a copy, each generation becoming less and less crisp. Much of it was either too dark or too light to be viewed clearly. And all of the movements were jerky and unnatural because the cameramen had to crank the film through the camera by hand, trying to keep it steady at approximately twelve frames per second, only half the rate we are accustomed to seeing in today’s movies.

And here is where the magic begins. Jackson used computer technology to add frames to the footage, smoothing out the action and making it feel as normal as any film you would see today. Then he colorized the film, using actual uniforms, tanks, and other artifacts from his own considerable collection of WWI memorabilia to help the artists get the colors just right. Next he enlisted professional lipreaders to figure out what the men were saying in the footage, and hired voice actors from the actual regions of each regiment, so the accents would be authentic. He added sound effects made by recording actual tank movements, mortar explosions, the fixing of bayonets, and other background noises. Finally, he created a musical score largely based on whistling and other natural music of the battlefield. The result brings these antique films to life. We simply forget that cameras couldn’t do this 100 years ago.
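Jackson hasn’t published the details of his interpolation pipeline, and professional restoration tools estimate motion between frames rather than simply blending them. Still, as a rough, hypothetical sketch of what “adding frames” means (the function name and the naive frame-averaging below are mine, purely for illustration), doubling twelve-frames-per-second footage toward modern speed amounts to synthesizing a new image between each original pair:

import numpy as np

def double_frame_rate(frames):
    """Return the clip at roughly twice the frame rate by inserting
    a synthetic in-between frame after each original frame.
    frames: a list of H x W x 3 uint8 arrays (e.g., a ~12 fps scan)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        # Naive tween: average the two neighboring frames. Real
        # restoration software estimates per-pixel motion and warps
        # along it, which avoids the ghosting this simple blend
        # would produce on fast movement.
        tween = (a.astype(np.float32) + b.astype(np.float32)) / 2.0
        out.append(tween.astype(np.uint8))
    out.append(frames[-1])  # the last frame has no successor to blend with
    return out

The difference between this toy blend and true motion compensation is, in essence, the difference between a smeared double exposure and the smooth, natural movement Jackson achieved.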

Jackson brings these long-dead soldiers to life by allowing them to tell their own story.

I’m not usually a fan of colorization; while it does make a film feel more natural for modern viewers, it neutralizes the skillful play of shadow and contrast designed deliberately and carefully by directors of the ’30s and ’40s. They knew what they were doing, and they did it well. However, in this film the colorization is a masterful addition. It brings out details in the film that in black and white were hidden or completely lost. Most notable is the blood; we simply don’t see blood as anything but dirt in black and white.

We also see how terribly young these soldiers were, marching off to war and grinning for the cameras. Although we never know their names, Jackson edits the footage so that certain men come into view more than once, and we begin to identify with them. We see not only the war, but how they lived, what they ate, how they slept, and even how they played. And in many cases, we are seeing them just before they died. It is a sobering, respectful, and impressive film.

They Shall Not Grow Old is neither pro-war nor anti-war; it simply asks us to consider the cost of war — not in the billions of dollars that are spent, but in the millions of lives that are lost. The title of the film is based on a selection from Laurence Binyon’s poem “For the Fallen,” the stanzas known as the “Ode of Remembrance,” which have long been used as a tribute to all who die in war:

They shall grow not old, as we that are left grow old:
Age shall not weary them, nor the years condemn.
At the going down of the sun and in the morning
We will remember them.

They mingle not with their laughing comrades again;
They sit no more at familiar tables of home;
They have no lot in our labour of the day-time;
They sleep beyond England's foam.

Lee Teter’s painting “Vietnam Reflections” pays a similar tribute to the fallen, but from a different perspective, that of the grieving survivor. It depicts a man, clearly a veteran though he wears no uniform, mourning at the Vietnam Veterans Memorial in Washington DC, where the names of all the fallen are etched on a long, low wall deliberately situated below ground level. His head is bowed in quiet anguish, his arm outstretched and his hand leaning heavily against the wall, willing it to reach inside and touch his comrades on the other side. Unseen by him, because his eyes are closed, several soldiers seem to be standing inside the wall, their reflections ghostly as they reach out, hand to hand, to console the man who, having survived the war, continues to carry its burdens. His survivor’s guilt is conveyed through the clothing Teter chose to give him. He is dressed in a business suit; the soldiers wear army fatigues. A briefcase rests on the ground beside the veteran; the soldiers carry field kits. The businessman’s hair is flowing and tinged with gray; theirs is dark and crew cut. The fallen soldiers shall not grow old, start businesses, or have children.

Most notable is the blood; we simply don’t see blood as anything but dirt in black and white.

And therein lies the survivor’s grief. “We that are left grow old,” as Binyon says in his poem, but survival is neither a reward nor a relief. It is a burden. Age does weary them, and the years do condemn.

No one knows the true story of war except those who experience it, and even then, it is a private, individual grief that none of them can truly share or understand. Consequently, using the voices of the actual soldiers to tell their story was a brilliant narrative strategy for They Shall Not Grow Old. They speak next to one another, but not in conversation with one another. The viewer remains enveloped in the immediacy of the story and simply observes their experience without explanation, editorializing, or the distraction of a modern historian’s interpretation.

The film is moving and impressive, but you’ll have to find it on Netflix or another platform because its theatrical release was limited to just December 17 and December 27. And that’s a shame, because the moment when Jackson switches from the jerky, original, black and white footage to his colorized and edited version is breathtaking. I’m so glad I got to see it on a full-sized screen. If you do see it, make sure you watch the director’s cut with Peter Jackson’s interview explaining how he did it. It’s like listening to a magician’s reveal.


Editor's Note: Review of "They Shall Not Grow Old," directed by Peter Jackson. WingNut Films, 2018, 99 minutes.





Beer, Bikes & Brexit


“Ride left, die right!”

Our mantra, often a cheerful running admonition and sometimes shouted in panic, was mostly a silent meditation while we pedaled our bikes from the toe of Cornwall to Scotland’s sagittal crest during June and parts of May and July. The Land’s End to John O’Groats quest has become something of an obsession not only in the UK but also among a cross-section of aficionados worldwide — a British version of the Way of St. James, if you like. Like the Camino de Santiago, it has many alternates, the shortest at 874 miles. Our choice, Sustrans’ National Cycle Network route, is 1,200 miles long.

One aspirant, who with his wife runs a B&B in Bodmin, Cornwall (in which we overnighted), was leaving for John O’Groats the following day to run one variant. Yes, run. Or as the placard proclaimed on the back of the ungainly plastic box that contained his essentials (including a sleeping-rough kit) and was duct-taped to a tiny day-pack he’d strap to his back:

Colin is running for ShelterBox disaster relief charity
1 man 3 Peaks
1 ShelterBox
1,000 miles marathon a day

The 3 Peaks were Ben Nevis, Scotland’s highest; Scafell Pike, England’s highest; and Brown Willy, a small hill atop Bodmin Moor, Cornwall’s highest point (and one over which Tina, my wife, and I biked on our adventure). The man was 53 years of age, and this was his third British Isles end-to-end ultra-ultra-marathon, as these inconceivably long long-distance runs are called.

He wasn’t the only eccentric adventurer we encountered. Another runner, whom we met at John O’Groats just as we were finishing, was just starting out. Unlike Colin, he’d packed his gear into a two-wheeled trailer attached to his chest with a harness. As we watched him start, he jogged into the arm swing and curious gait that ultra-marathoners affect to make it look as if they were running when in fact they proceed little faster than a fast walk, or about four miles per hour. We never found out his raison de run. One tiny, 73-year-old man with a backpack the size of a manatee and a pace that rivaled varve deposition in Loch Lomond (where we encountered him) was doing the walk to commemorate the Queen’s longevity. He presented us with his card. It requested donations to cover his expenses.

The man was 53 years of age, and this was his third British Isles end-to-end ultra-ultra-marathon.

Ian Stocks was bicycling a 20-day, 1,500-mile variant that included the UK’s easternmost and westernmost salients, for Alzheimer’s Research UK. At a projected 75-mile-per-day pace he ought to have been much farther south than where we met him. I noticed that his gear — bike, panniers, electronics — all looked new, and my BS antenna began to quiver. The received wisdom in the classical-liberal view is that as welfare programs take over the functions of private charities, the latter tend to atrophy. Great Britain, definitely a welfare state, seems to have a surfeit of charitable initiatives. What was going on?

I’d once been solicited for a contribution to “raise awareness for breast cancer” by a group of women breast cancer survivors who were planning on skiing across the Greenland ice cap. They were all seasoned adventurers. I knew what they were up to. Contributions would pay for gear and transportation first; any money left over would go to the “raise awareness” bit.

At this point, let me clarify a popular misconception concerning folks who participate in extreme sports, objectives, deeds, adventures, and such for charity. Their shtick is to persuade the public that they are willing to undergo extreme exertion and privation for a good cause. But nothing could be further from the truth. They do what they are doing because they are addicted to adventure, unique accomplishments, firsts, personal challenges, transcendence of the quotidian, making their mark, even adrenaline or drama; in a word — they love what they do. But extreme adventures are costly, so many fund their objectives by invoking the charity label. I told Trish, the leader of the Greenland expedition (who, years before, had taught me to cross-country ski), that I needed my money to fund my own adventures and that I wished them luck. She didn’t take that well.

Great Britain, definitely a welfare state, seems to have a surfeit of charitable initiatives. What was going on?

So I checked out Ian Stocks’ website. What a surprise! All contributions go directly to the charity; nothing passes through Ian’s hands. Ian’s motivation is his father’s dementia. As of this writing, Ian is still behind schedule, mile-wise, but he has raised over 100% of his targeted contributions.

To me the more fundamental question is why this whole charade is necessary. If an individual wants to make a charitable contribution to a cause he cares for, why does he need a sideshow with no connection to the cause to spur him? Is it even entertainment? Perhaps, in a gladiatorial sort of way.

My wife Tina and I had decided to tackle the end-to-end ride for purely selfish reasons: beer — unlimited traditional cask ales (more on them later), single malt whiskies, and daily feasts of delicious UK food: the full breakfast fry — bacon, sausage, egg, baked beans, fried mushrooms and tomato, black pudding, hash browns, and fried bread; the best curry restaurants in the world (Kashmiri, Bengali, Pakistani, Bangladeshi, South Indian); fish and chips to die for; carveries featuring four roasts with all the trimmings, including Yorkshire pudding and meat pastries that defy counting; steak and ale and steak and kidney pies, sausage rolls, Cornish pasties, shepherd’s and pork pies, and many local, flaky variants. And, of course, the chance to explore a country in low gear, meet the people — prone to extreme civility with a subtle but outrageous sense of humor — and get our fair share of exercise in order to continue such adventures well into our senility.

If an individual wants to make a charitable contribution to a cause he cares for, why does he need a sideshow with no connection to the cause to spur him? Is it even entertainment?

Many end-to-end bikers from a variety of countries crossed our path. Unlike motorists, long-distance bikers always stop to pass the time of day, to inquire about one another’s journey, objectives, provenance, etc. Nearly all who were heading north targeted John O’Groats. Just to add a little spice to the repetitive answers and one-up them all, I decided to tell everyone that Tina and I were headed for Scapa Flow. Only the Brits got the joke.

Two separate couples had come all the way from New Zealand. I asked them why they’d come halfway around the world for this biking adventure, when they lived in a country famous for its natural beauty and low population density, a country that would seem to offer a biking paradise. Both couples shook their heads and looked at each other. They both — separately — responded that New Zealand was a relatively new country, and so did not have a well-developed network of old or secondary roads crisscrossing the two main islands. Only primary highways, mostly two-lane, bind the country together. These have narrow shoulders (when they have them at all), and drivers are not sensitive to bikers.

The Road Less Traveled

The Sustrans route we chose uses traffic-free paths and quiet single-lane roads, hence its 1,200-mile length. Those quiet single-lane roads have their own quirks. Nearly all are bordered by 6–9’ hedges, perfectly vertical and maintained by vertical mowers. They are so narrow that planners have installed “passing places” and “lay-bys” about every 100 yards. A car meeting an oncoming or passing car — or bike — must wait at one of these for the other to get by. However, the Sustrans route also seems to go out of its way to stitch together every hilltop, traverse watersheds cross-wise instead of following drainages, and generally adhere to Mae West’s observation that “the loveliest distance between two points is a curved line.”

Just to add a little spice to the repetitive answers and one-up them all, I decided to tell everyone that Tina and I were headed for Scapa Flow.

England’s myriad roads, in plan view, mimic the pattern formed by cracked tempered glass — an intricate grid twisted and crabbed beyond any recognizably geometric shape and resembling a Voronoi tessellation. They started out that way and only got more complex as time went on. When Rome conquered England, according to Nicholas Crane in The Making of the British Landscape, “the web of footpaths and tracks serving settlements were an ill-fitting jigsaw of local and regional networks which were difficult for outsiders to navigate.” The bends and salients in England’s roads had evolved over hundreds (or even thousands) of years to link settlements, sources of raw materials, strongholds, religious sites, and so on. These evolved organically before the invention of bulldozers and certainly of modern road engineering with road cuts and fills that reduce gradients and straighten out unnecessary curves. Except for the historic nature of English roads, which sometimes subjected us to 20% grades and less-than-direct transects, they’re a biker’s paradise.

The Hills Are Afoot

Cornwall, the forgotten Celtic country, was a disheartening start to a very challenging ride. Not only does the Cornish section of the route gain little ground in a northerly direction — sometimes even trending south — but its ride profile resembles tightly clustered stalagmites, with significant climbs over Bodmin Moor, the St. Burian and St. Columb Major summits, and a queue of lesser hills. Our old bodies required two rest days in quick succession — at Truro and Bude — if we were to have any chance of reaching Scotland pedaling.

Cornwall might seem forgotten because it’s bedeviled by an identity crisis. Although Celtic in origin, distinct in language and separate as an entity from England, with many unique cultural traits, it somehow missed Tony Blair’s devolution revolution in 1997. Rob, our host at the very modest Truro Lodge, told us that Truro, a cathedral city, was the capital of Cornwall. Since he’d been Truro’s last mayor, I asked him if that made him First Minister of Cornwall. He smiled wryly, admitting that Cornwall had had such an influx of English settlers that there wasn’t much enthusiasm for Cornish devolution, much less independence.

Except for the historic nature of English roads, which sometimes subjected us to 20% grades and less-than-direct transects, they’re a biker’s paradise.

But there is some ambivalence. The Cornish language is being revived. Cornish music, nowhere near as popular as Irish or Scottish music, can still be heard. Outside Truro Cathedral, a couple of buskers with fiddle, guitar, and microphone played traditional tunes to an enthusiastic audience. And in Penzance, along the waterfront promenade, a Cornish band led by a baton-waving, tails-wearing drum major marched in front of our hotel at dusk each evening, playing Cornish Morris-type music (I later found out that the all-volunteer ensemble was short on musicians and was soliciting participants).

In 2007, David Cameron promised to put Cornwall’s concerns "at the heart of Conservative thinking." However, the new coalition government established in 2010 under his leadership did not appoint a Minister for Cornwall. Although Cornwall only holds the status of a county in Great Britain, as recently as 2009 a Liberal Democrat MP presented a Cornwall devolution bill in Parliament (it got nowhere), and advocacy groups demanding greater autonomy from Westminster have been waxing and waning over the years.

On June 5 we left Cornwall and entered Devon, part of Thomas Hardy’s Wessex, complete with irresistibly cute, whitewashed, thatched-roof cottages. Though every bit as hilly as Cornwall (1,640’ Exmoor, the Mendip Hills, and the Blackdown Hills to the fore), it welcomed us with a roadside sign declaring: Where there are flowers there is hope.

The Cornish language is being revived. Cornish music, nowhere near as popular as Irish or Scottish music, can still be heard.

“Oh, how much fun!” Tina declared — her enthusiastic response to any novelty, serendipitous triviality, unanchored excess of exuberance, or even the prospect of another 20% uphill grade. Up these hills our odometers would sometimes only display zero miles per hour, even though we were making progress. To pass the time on the slow pedal I recounted the libertarian themes in Hardy’s Jude the Obscure, a novel she’d never read: his depiction of marriage as a crushing force, his belief that organized religion complicates and obstructs ambition, and his critique of the Victorian class system.

At Glastonbury (another rest day) our route finally turned resolutely north. The famous abbey town and final resting place of the legendary King Arthur has become a bit of a Wessex Sedona, with crystal shops, goddess centers, metaphysical bookstores, vitamin, herb, and natural food shops, and a vibrant cast of street characters in various stages of mendicancy, sanity, and hygiene, exhibiting sartorial extremes from total nakedness through purposeful dishevelment to natty eccentricity. Even our B&B hostess had a claim to fame: Sarah Chapman held the Guinness Book of World Records women’s record for walking five kilometers upright on her hands! But the town’s saving grace is the ruins of the abbey, legendarily founded by Joseph of Arimathea in 63 AD and associated with Saints Columba, Patrick, and Bridget, and sacked and burned by Henry VIII when, after his break with Rome, it refused to submit to him instead of the Pope.

By the time we reached Bristol we were deep in the bosom of Old England. Bristol, once England’s doorway to the world, is a thriving, lively, modern city. In its harbor docks a lovingly built replica of the Matthew, John Cabot’s ship. A plaque next to his oversize statue reads: In May 1497 John Cabot sailed from this harbour in the Matthew and discovered North America. The only drawback to being a port city is the seagulls, loud giant avian dive bombers. They are brazen and incorrigible in their quest for food. Early mornings reveal overturned trash bins throughout the city. Gulls have been reported snatching burgers out of people’s hands and even whacking a pedestrian eating a snack on the back of the head so that he drops the tidbit and the gull steals it. One municipal mayor complained that gulls are a protected species.

On to the Midlands

Past the moors and fens, the landscape turned to rolling farm and toft country dotted with rhododendron copses. Through the humid and fecund West Midlands, we developed a fondness for the heady odor of pungent silage mixed with barnyard manure — definitely an acquired taste. One evening at a pub, a morris troupe, performing traditional English music and dance dating from before the 17th century, enhanced our after-ride pints. The all-male troupe, wearing bells — perhaps the original source of the phrase “with bells on their toes” — and accompanied by a squeeze box, was delighted to entertain foreigners familiar with morris dancing. We stayed in an old Tudor building with buckled floors, absurdly low pass-throughs, and narrow winding stairs whose commemorative plaque read: Crown Hotel: Rebuilt in 1585 on site of a much earlier inn destroyed by the fire of 1583. A coaching stop on the London–Chester run.

By now Britain’s schizophrenic weights and measures were beginning to puzzle us. Road distances were in miles, temperatures in centigrade, beer and milk in pints, and folks gave their weight in “stone,” with large weights measured in Imperial tons. While the metric system may be simpler in computation, the English system is ergonomic and evolved organically, rendering it more intuitive. And, most curious of all to me, a northern country that in summer experiences 19 or 20 hours of daylight and invented standard time, which it measures from the Prime Meridian of the World at Greenwich, succumbs to the idiocy of Daylight Saving Time.

Refreshingly, the government has not been able — by and large — to impose metric mandates or force observance of DST throughout the realm. When the time changes, businesses readjust their opening and closing times to GMT. With barely four or five hours of total darkness, how much daylight needs to be “saved”? As to the other weights and measures, one informant told me that, except for centigrade temperatures, all new and traditional systems coexist peacefully, with only a handful of rigid requirements such as strong spirits in pubs, which must be sold in 25ml, 35ml, 50ml, and 70ml increments.

Up these hills our odometers would sometimes only display zero miles per hour, even though we were making progress.

Worcester (pronounced Wooster) is the home of Worcestershire Sauce and site of the last battle of the Civil War, in which Cromwell decisively defeated the Royalists. Even more importantly, Worcester Cathedral holds the remains of King John, he of the Magna Carta. The mausoleum was extremely moving, not just for its considerable age and all the empty space surrounding it, but also for the immense significance of Magna Carta itself. For all that a lot of it is unintelligible, Magna Carta was the first assault on the absolute power of English royalty through the separation of powers and the recognition of the rights of a portion of the populace.

In keeping with Sustrans’ objective of avoiding traffic, we bypassed Birmingham, Britain’s second largest city. Not so for Manchester. Inevitably, we got lost there. Signage was poor, our map not detailed enough, and Google not up to the task. So, contrary to the clichéd stereotype of the male, I asked a passerby for directions. The lady responded, “You’re in luck, I’m a geographer. Where are you going?”

Now, asking passersby has its drawbacks — too many to detail here — but, in this instance, we weren’t going to a particular place but rather trying to find National Cycle Network Route 6 to get back on track. Never mind; an academic geographer informant — here was the gold standard! After detailing our trip to her I showed her our guidebook’s map. She was no biker and had never heard of the National Cycle Network. She wasn’t impressed by either our guidebook or our map, of which she couldn’t make sense. At once she launched into a tirade about computer-generated maps and lectured us on the preeminence of British Ordnance Survey maps.

Through the humid and fecund West Midlands, we developed a fondness for the heady odor of pungent silage mixed with barnyard manure — definitely an acquired taste.

I responded that she was absolutely correct, except that we would have needed over 100 Ordnance Survey maps to cover our entire route, at a prohibitive cost in space and pounds sterling. Then she and Tina, interrupting their on-again, off-again chitchat in between attempts to solve the riddle at hand, pulled out their smartphones — the last resort of the self-unreliant — and sought guidance from Google.

By now I was losing patience. We’d eaten up precious time getting nowhere, so I fell back on a navigator’s last resort: bracketing. I thanked our geographer for her help, gently disengaged Tina from her, and explored four separate directional salients for a mile each, starting from the roundabout where we’d stopped, to determine which of them led toward our route. Through the process of elimination, a compass, a closer examination of the clues in our guide, and not a little intuition, we found our route. Lo and behold, we were nigh on it! A block further along the last salient explored, we encountered a National Cycle Network Route 6 sign.

The lessons: Never mistake a geographer for a cartographer: the former specializes in the distribution of the human population over the surface of the land; the latter makes maps. And . . . have confidence in your own abilities.

North by Northwest

The Yorkshire Dales, Cumbria, and the Lake District welcomed us with a smorgasbord of all-you-can-climb hills, appetizers to the Scottish Highlands. By now we’d talked to a lot of innkeepers, publicans, bikers, walkers, shopkeepers, and random strangers. With the 70th anniversary of the National Health Service (NHS) imminent on July 5, I took what opportunities arose to gather anecdotes about people’s experience with the service, especially now that Conservative governments had floated proposals to make the NHS financially more viable, most of which included increasing copays. I never brought up the subject but always managed to get folks to elaborate on offhand remarks. One lady mentioned that she’d recently broken her wrist playing cricket. So I asked her if the NHS had taken care of her (Britain has a dual — private and public — insurance and medical system).

For all that a lot of it is unintelligible, Magna Carta was the first assault on the absolute power of English royalty.

“Yes, they did,” she said. But then she backtracked, saying, “No, they didn’t.” So she explained. She went to the nearest hospital with her hand bent at an unnatural angle to her forearm. The staff said they had no room for her, to go to another hospital. So she did. The next hospital looked at her wrist and said it was broken. But they had no room for her. “Go home and wrap it up,” they said. Luckily, her husband had private insurance. The private doctor immediately took care of the fracture.

Another B&B host, an elderly lady who had recently lost her husband and ran a very modest one-woman operation, told us she’d had a hip replacement. I asked how well the NHS had treated her. She responded that it had taken a while to get the procedure done, but only because she didn’t understand and had difficulty navigating the bureaucratic requirements. Once she mastered them she was put in the queue, awaited her turn, and was very happy with the results.

Of course, the other hot topic of conversation was Brexit. I wasn’t shy about soliciting opinions on that. Two issues determined the close vote: immigration and EU rules (trade, a third issue, was uncontentious: everyone favored trade). However, the first two are interpreted very differently along the political continuum.

Luckily, her husband had private insurance. The private doctor immediately took care of the fracture.

In the course of our five-week traverse of the island we encountered numerous resident immigrants from a very broad array of countries working in sales, labor, and the service sector. I made a point of listing the countries they hailed from: Italy, Romania, Poland, Venezuela, Eritrea, Somalia, India, France, Pakistan, Greece, Spain, Bangladesh, Hungary, Czech Republic, Ethiopia, Thailand, Russia, Germany, Argentina, China, Latvia, Bulgaria, Slovakia, Belgium, Brazil, Philippines, Ukraine, Ireland, and the USA. These were not tourists or ethnic waiters at ethnic restaurants.

Left-leaning reportage attributes the pro-Brexit, anti-immigration vote to “racism,” or “little Englanders,” the British version of chauvinist rednecks. Right-wingers claim that immigrants are taking over jobs. Neither of these glib explanations struck a chord with us or our informants. But all, regardless of whether they were “leavers” or “remainers,” expressed strong concern about Britain’s nearly limitless immigration. One Welsh AI entrepreneur, a remainer named Gareth, averred that with an unemployment rate of 4.1% there was no employment problem in the UK. He was so fixated on trade that he blithely dismissed any other concern as illusory.

As to racism, none of the immigrants we interviewed alluded to it; in fact, all expressed a great deal of affection and respect among themselves, the Brits, their neighbors, and their employers (ours was a very limited random sample). And none of the Brits expressed any — even the slightest — unfavorable sentiment about foreigners. Only when riding through Muslim enclaves did we sense any, admittedly vague, tension. So what was going on?

One waitress complained about the niggling EU rules — another erosion of British sovereignty — that even control the shape of bananas an Englishman can eat.

My sense is that the Brexit vote was a continuation of a British exceptionalism that goes back to 1066 — it’s been nearly a millennium since the last invasion. Compared to the continental countries, Britain has been uniquely stable, especially — being an island — as to its borders. In that sense, there is a nebulous perception of continental countries as entities akin to banana republics, with feuds, invasions, and shifting boundaries. To Brits, joining that club has always cost some degree of sovereignty. Margaret Thatcher personified that sentiment when she was unwilling to sacrifice the pound sterling, the world’s oldest, most stable currency (except under Callaghan and Wilson), to a committee of continental bureaucrats. Britain did not join the Euro currency; but it did join the European Union, a continuation of the aspiring free-trade policies of the earlier Common Market. The Brits want to trade but don’t want others to control them.

One Scots barmaid was in favor of leaving but voted to remain for the sake of her children. She complained about the niggling EU rules — another erosion of British sovereignty — that even control the shape of bananas an Englishman can eat. Gareth, our Welsh informant, thought this a red herring. But immigration rules are part of the broader EU rules: both require a surrender of sovereignty, and the Brits have had enough of ceding it.

Finally, there was a general concern that Britain was losing its identity — its culture, if you will — and becoming a nation of immigrants like the US. The August 11 issue of The Economist reports that “more than a third of London’s population was born abroad.”

Scotland the Heatwave

It was uncanny. As soon as we crossed the unmarked border into Scotland, the plaintive tones of a highland bagpipe filled the air. Around the corner we suddenly found ourselves in Gretna Green, once Britain’s answer to America’s Texas, where Scottish law allowed marriage between underage couples, but now a slightly pathetic tourist trap where couples with a romantic disposition to elopement still choose to tie the knot. Never mind, we were entranced and let the piper grab our souls, wrench our hearts, draw tears, and make us feel that we could transcend our limits. And, remarkably, accents turned on a penny from Yorkie to brogue.

As they say in Kentucky, “we were in pig heaven!”

On the first day of summer hordes of embarrassingly (to us, anyway) scantily clad Scots crowded along the shores of every loch, river, canal, and estuary, suntanning their albescent flesh. The unusually hot and dry weather, which had started earlier, was the cause of much comment. Tina, ever one to engage anyone in friendly conversation, asked a middle-aged lady if the unusual circumstances might be caused by global warming. The lady replied that if they were, “Bring it on!” In the 20 days we spent in Scotland it never rained. On June 29 the temperature at Pitlochry hit 89 degrees Fahrenheit, leading to a hot, muggy night with little sleep in a land where air conditioners and fans are a waste of money.

We looked forward every day to a pint or two of “real ale,” available in participating pubs everywhere but sadly lacking in Gretna Green — another disappointing aspect of the little town. I’m an avid fan of British real ale, a beer nearly unavailable anywhere else and a primary reason for our trip. Real or cask ales (cask-conditioned beer) are unfiltered (they still retain yeast, though it drops to the bottom of the cask) and unpasteurized, conditioned by processes including secondary fermentation and served from the cask without additional nitrogen or carbon dioxide pressure. They must be pumped by hand to serve, and they give good head despite being lightly carbonated compared to bottled beers. Whether brewed as bitters, stouts, porters, or even IPAs, there is nothing quite like them.

Breweries are small and local, and until recently most supplied only a handful of establishments. We visited one brewery in Pitlochry, the Moulin Traditional Ale Brewery, that brews only 120 liters per day of four different ales and supplies only one pub and one hotel. In the latter half of the last century corporate brewers began buying up pubs, pushing their own beers and sidelining — or even eliminating — cask ales. Brits were not amused. In response, the Campaign for Real Ale was founded in 1971 and managed to convince the corporates to keep cask ales alive. Some, such as Adnams, Greene King, and Marston’s, now even brew their own cask ales.

Although this anecdote is either false — Hume died in 1776 — or was altered in the retelling, it well captures Hume’s thinking.

While in Glasgow we managed to hit the Fifth Glasgow Real Ale Festival, offering over 150 different real ales from all over the realm. As they say in Kentucky, “we were in pig heaven!” We’d barely finished our first pint when the 18-piece Caledonian Brewery Edinburgh Pipe Band marched in playing “Scotland the Brave,” forcing us to freeze in place and raising the hairs on the nape of our necks. We imbibed 105 different real ales during our ride. Only space prevents me from listing them all and their creative names. As of 2014 there were 738 real ale brewers or pubs in the US. There might even be one near you.

In Killin we took a rest day and visited the Clan McNab burial grounds on Inchbuie Island in the River Dochart, along with the associated Iron Age fort and even earlier stone circle. Here in Prescott, Arizona, my hometown, David McNab books Celtic musicians who come on tour to the US. Married to a Scots lassie, he treasures his heritage. We’d be a culturally poorer town without his concerts.

As we passed Loch Tay, the Scottish Crannog Centre, an outdoor museum with a restored lake dwelling dating from about 800 BC, beckoned. The crannogs were built on stilts or artificial rock islands on the water. Villages consisting of three crannogs, each with about 90 inhabitants, were common in Scotland and Ireland as early as 5,000 years ago and as late as the early 18th century. While Scotland has only 350–500 crannog villages, Ireland — on a much larger land mass — boasts about 1,200. Doubtless both countries have many more, since underwater archaeology presents considerably more obstacles (in survey and excavation) than terrestrial work.

This odd dwelling pattern was first glibly explained as defensive in nature (most 19th-century archaeologists were retired military men), but few weapons and little evidence of warfare have been found in association with the crannogs. The newer explanation is that the dense vegetation of the Celtic countries reserved cleared land for agriculture rather than mere habitation, while the riparian location facilitated extensive trade networks, for which evidence — including networks reaching all the way to mainland Europe — is abundant.

The Loch Tay Crannog Centre, near Kenmore in Perth and Kinross, isn’t just one reconstructed crannog with three dugouts. The staff has recreated the entire lifestyle of the inhabitants: foot-operated lathes; grain-grinding stones; wool spinning, dyeing, and weaving; and fire-starting by “rubbing two sticks together,” a practice often mentioned but seldom seen. It means using a fire drill. With the proper knowledge, preparation, and materials, all things are possible. The demonstrator (even his shoes and clothing were authentic) started a fire in less than a minute.

The Braw Hielands

Somewhere beyond the Crannog Centre we crossed into the political subdivision known as the Highlands and Islands of Scotland. Trees and settlements became scarcer, midges and cleggs more numerous. Heather (purple), gorse (yellow), and bracken (green) gilded the landscape. Long-haired Highland cattle and Scottish Blackface, Wensleydale, Cheviot, and Shetland sheep predominated. It is here — not in Gretna Green — that the romance of Scotland kicks in: Rabbie Burns; Bonnie Prince Charlie; Nessie; Capercaillie and Old Blind Dogs; kilts, sporrans, and claymores; haggis; the Outlander miniseries; and even Mel Gibson berserking over the moors as William Wallace come to mind.

However, my own mind gravitated to those two giants of the Scottish Enlightenment, David Hume and Adam Smith. I’d not run across any memorials, statues, or even streets named for either in their homeland. That’s more understandable for Hume, whose somewhat counterintuitive, esoteric — albeit undogmatic — thinking isn’t readily accessible. But Adam Smith, the father of economics, the Charles Darwin (or Albert Einstein) of the dismal science, is a household name. His insights are readily accessible and intuitive.

In three separate trips to Scotland, I have been struck by the lack of Adam Smith memorials.

Smith and Hume were drinking buddies (which is saying a lot in 18th-century Scotland, where getting plastered to oblivion was a national pastime). One bit of Hume’s thought that was accessible — though still counterintuitive — is encapsulated in an exchange he supposedly had with Smith. The United States had failed to agree on an official religion for the new country: a first for its time. Smith, a man of indeterminate religious beliefs, bemoaned the fact, opining that the lack of an official faith would doom the country to irreligiosity. Hume, an agnostic, disagreed. He predicted that countries without official faiths would experience a flowering of religions, while the official religions of countries that had them would wither into irrelevance. Although this anecdote is either false — Hume died in 1776 — or was altered in the retelling, it well captures Hume’s thinking.

The anecdotal Hume was right. America soon experienced the Second Great Awakening, the birth of a multiplicity of religious sects in the 1800s. Today, according to The Guardian (September 4, 2017), more than half the UK population has no religion, while nearly 76% of Americans identify as religious.

In three separate trips to Scotland (one where I walked across the country) I was struck by the lack of Adam Smith memorials. One informant said the Scots had little affection for Smith. Public opinion inside Scotland holds Adam Smith, the father of capitalism, responsible for the Highland Clearances. And public opinion outside Scotland perceives the Scots as socialist. It’s not so simple.

In Scotland in the 2017 UK elections, the Conservative Party received 28.6% of the vote and overtook the Labour Party, the real far-left socialists, who received 27.1%, as the main opposition to the dominant Scottish National Party, which got 36.9%. The Scots are nationalistic, thrifty, good businessmen who hate taxes — traits not often associated with socialism (though they abhor class and status pretensions).

But back to Smith and the Highland Clearances. Smith was a strong advocate of both private property and efficiency in production. When The Wealth of Nations came out, Scottish clan chiefs reinterpreted their position as not just clan heads but also fee-simple owners of clan lands, according to their reading of Smith’s concept of private property. They became lairds, owners of the clan lands, instead of feudal lords. As feudal lords they’d had a complex set of rights and duties with their crofters. As lairds, however, they suddenly became absolute owners of what was now their private property. Since Scottish law had not formalized feudal rights and duties, the transition from a feudal system to a modern market economy was — to say the least — awkward.

The crofters were subsistence farmers. Their part of the deal was to give a percentage of their harvest to the clan chief in return for protection, leadership, dispute resolution, and so on. Advances in agronomy and a booming market for wool indicated to the new self-declared lairds that sheep grazing would enrich them much more than a few bushels of oats. Most chose sheep over oats and evicted the crofters, hence the Clearances. (This is a simplified version.) Not all lairds ignored the crofters’ feudal rights. Lairds’ responses ran the gamut from keeping the crofters as tenant farmers, to buying them out, to cruel dispossession and eviction. There was no uniform formula; the greediest landlords made the headlines. Adam Smith got the blame. Finally, however, in 2008, an elegant ten-foot bronze statue of Adam Smith, set on a massive stone plinth and financed by private donations, was unveiled on Edinburgh’s Royal Mile, within sight of a rare statue of his friend David Hume.

Outside Inverness, capital of the Highlands, the Culloden battlefield, site of the last battle (1746) fought on British soil, cast its spell. Supporters of the Stuart (Jacobite) dynasty fought the army of the by-then established Hanoverian dynasty of George II. The German Hanoverians had been installed as monarchs of the United Kingdom after Parliament tired of both Stuarts and civil wars. A common misconception holds that Jacobitism was a Scottish cause because the Stuarts, before being invited to rule over England, had been kings of Scotland, and most of the Jacobites were Scots. Again, not so simple.

Since Scottish law had not formalized feudal rights and duties, the transition from a feudal system to a modern market economy was — to say the least — awkward.

Monarchy has its own rules of succession. Under those rules, Charles Stuart (Bonnie Prince Charlie) ought to have become king of the United Kingdom. The problem was that the Stuarts were Catholics, and a Catholic, according to the Act of Settlement passed by Parliament in 1701 — the expedient to finally dump the Stuarts — could not rule over a Church of England realm, much less head that church. Adherents to the monarchy’s rules of succession did not accept Parliament’s power to overturn those rules, hence the Jacobite uprising. Scots, English, and Irish participated. The presumptive heir to the Jacobite crown today is Franz Bonaventura Adalbert Maria von Wittelsbach, who, if he were on the throne, would be known as King Francis II.

We took a rest day in Inverness, got a dose of fire-and-brimstone Scottish Calvinism, and attended a couple of ceilidhs — once both at the same time. A determined preacher in white shirt and tie stood on the Crown Road, Inverness’s high street, reading the Bible in thunderous and orotund sonority to the passersby while fiercely gesticulating with his free hand. We were entranced. Particularly when a young fellow in a t-shirt and a newsboy cap took up a stance across the street, pulled a bagpipe out of its case, laid out the case to collect donations, and struck up “MacPherson’s Lament.” He completely drowned out the homilist, who nonetheless persevered, impervious to all external distractions. As for the other ceilidhs, one particular impromptu session at a pub included two fiddles, a guitar, uilleann pipes, and a saxophone — the last two a particularly innovative and sonorous combination.

North of Inverness nearly all the trees disappeared, as did fences, buildings, and power poles; even the livestock thinned. It was a magical, surreal landscape with the odd abandoned stone sheep enclosure. At Tongue, the North Sea finally came into view. When the Orkneys appeared on the horizon, our hearts skipped a beat: we knew we were nearly done. Stroma, the nearest of the islands, presented a spectral appearance. It had been abandoned in 1962. A scattering of old stone cottages, unconnected by roads, eerily dotted its landscape. Soon John O’Groats, little more than an inn and tourist shops, materialized out of the grassy plain. We’d covered 1,291 miles — according to our bike odometers — in 29 days, with an additional eight rest days.

The piper completely drowned out the homilist, who nonetheless persevered, impervious to all external distractions.

After a shuttle to Inverness and an overnight ride on the Caledonian Sleeper we arrived at Euston Station, London. During the rides — both of them — we reflected on Britain’s virtues. It’s a country with no earthquakes, volcanoes, hurricanes, tornadoes, forest fires, or mudslides; an ideal climate with no extremes of heat or cold, aridity or rain; a varied and undulating topography of grasslands, moorland, woodland, glades, estuaries, highlands, and lowlands; hamlets, villages, towns, and cities with a minimum of sprawl; little crime and few slums or homeless; a cultured people with a generally sensible disposition (and oodles of book stores); and a separation of head of state from head of government. Finally, it’s always been Great and, best of all, has unsurpassed beer and whisky. What more can you ask for? Lower taxes?








Vietnam Revisited


I never fought in Vietnam. By the time I was old enough to go, I held a high draft-lottery number and a student deferment, and was never called up. I do remember the war, though. Early on, when Kennedy sent in the advisers, I was in elementary school, and saw the pictures on TV and in Life magazine. When Johnson sent in half a million men, I was in junior high, and we argued about the war in class. When Nixon came to power I was in high school, and we debated it more. When the four protesters were killed at Kent State University, I was finishing my first year at the University of Washington in Seattle. My instructor in German cancelled classes and gave us all A’s so we could go protest. I stood aside, watching the protesters flood onto Interstate 5 and block traffic until the cops pushed them off the exit to what are now the offices of Amazon.

My sentiments on the Vietnam War, like those of most Americans, evolved. In 1975, when South Vietnam collapsed, its government appealing for help and the US Congress and President Ford offering none, I was as coldhearted as anyone. I thought, “To hell with Vietnam.” I had been reading about it, thinking about it, arguing about it since I was a kid. During that time 58,000 Americans, of whom nearly 18,000 were draftees, had been killed there, along with maybe a million Vietnamese, and for what? In economists’ terms, the mountain of corpses was a “sunk cost” — and I was ready to watch the whole damn thing sink into the South China Sea.

I was living in Berkeley, California, when the South fell. I remember standing outside my apartment on May 1, 1975, taking photographs of a parade down Telegraph Avenue welcoming the Communist victory. “All Indochina Must Go Communist,” one banner said. Well, I hadn’t evolved that much. For me the fall of South Vietnam was a day for quiet sadness.

By 1975, 58,000 Americans, of whom nearly 18,000 were draftees, had been killed there, along with maybe a million Vietnamese, and for what?

As a kid in junior high, I had supported the war. Recall the geopolitical situation: Communists had eaten up a third of the world, with big bites in Eastern Europe in 1945–48, China in 1949, North Vietnam in 1954, and Cuba in 1959. They had been stopped in a few places — in Malaya, by the Brits — but once firmly established they had never been pushed back. The Cold War’s rules of engagement were that the Communists could contest our ground — what we called the Free World — but we dared not contest theirs. And the end of that road did not look good.

When I used that argument — and “domino theory” is not a good name for it — no one knew the Communist system was facing extinction. People knew it was a poor system for satisfying private wants, but as a foundation for political power, it did seem to pass the Darwin test: it had survived and spread.

All the old arguments came back as I was reading Max Hastings’ new book, Vietnam: An Epic Tragedy, 1945–1975. Hastings, an Englishman, is my favorite military historian; for years I have had his 1987 book, The Korean War, on my shelf, and I breezed through his 752-page Vietnam in a few days. In this book Hastings has undertaken to write the narrative of the war, not only from the American side but also in the voices of South and North Vietnam. He reveals that there were arguments and worries on their side as well as ours. Many in the North hated the draft and did not want to trek down the Ho Chi Minh Trail to fight. Over the years, 200,000 Northerners deserted while in the South. The Northern soldiers also endured far more privation than the Americans or their Southern allies, living on rice and water spinach (sold in Asian markets here as on choy) and often starving. On one occasion, Hastings says, they killed and ate an orangutan.

People knew communism was a poor system for satisfying private wants, but as a foundation for political power, it did seem to pass the Darwin test: it had survived and spread.

Hastings analyzes the assumptions and the strategies of both sides. To the low-level Vietcong, the war was mostly about getting rid of Americans who looked and acted like the “long-nose” French, Vietnam’s late imperial overlords. The cadres tried to indoctrinate the VC in Marxism, but identity politics had the stronger pull.

Strength of belief and feeling makes a difference in war, and in Vietnam the advantage went to the other side. For a military historian, Hastings makes a key admission when he says that fighting was less important than “the social and cultural contest between Hanoi and Saigon.”

In that contest, the North’s standard-bearer was “Uncle Ho,” the Gandhi-like figure of Ho Chi Minh, who had kicked out the imperialist French. In the South, a society that included landowners, merchants, and bureaucrats who had worked for the French and prayed in the same church as the French, one of the icons was Vice President Nguyen Cao Ky. One observer said that Ky, an air force pilot with slick black hair and a pencil-thin moustache, looked like a saxophone player in a cheap Manila nightclub. Writes Hastings of Ky, “He was publicly affable, fluent, enthusiastic about all things American but the taste of Coca-Cola — and as remote as a Martian from the Vietnamese people.”

Strength of belief and feeling makes a difference in war, and in Vietnam the advantage went to the other side.

South Vietnam was a society rotten with corruption and ill-gotten wealth. “Again and again,” writes Hastings, “peasants were heard to say that whatever else was wrong with the communists, they were not getting rich.” History shows, though, that life is easier in a society in which some are wrongly rich than in one in which the rich are rounded up and shot, leaving everyone else poor. Hastings writes that when the North Vietnamese army rolled into Saigon, the soldiers were amazed at how much stuff the people had.

The Vietcong were terrorists. They beheaded the village chieftains who opposed them, and sometimes buried them alive. The Americans were told to behave better than that, but with their B-52s, high explosives, and napalm they dispensed death wholesale. American soldiers, Hastings writes, went to war “wearing sunglasses, helmets, and body armor to give them the appearance of robots empowered to kill.” Back at base, “Army enlisted men took it for granted that Vietnamese would clean their boots and police their huts.” They also used the bar girls for sexual entertainment.

Hundreds of thousands of South Vietnamese still fought and died for their state, and also worked with the Americans. First-generation Vietnamese in my home state are fiercely loyal to the old Republic of Vietnam, and still fly the yellow flag with the three stripes. Apparently they were not a majority of their countrymen, else the conflict would have come out differently.

With their B-52s, high explosives, and napalm the Americans dispensed death wholesale.

As the Pentagon Papers showed, smart people in the US government saw early on that South Vietnam was ultimately not a viable cause. President Kennedy expressed his doubts, but he also believed deeply that his mission was to stop the Communists. “Nothing that came later was inevitable,” Hastings writes, “but everything derived from the fact that sixteen thousand men were in country because John F. Kennedy had put them there.”

Hastings doesn’t buy the theory propagated in Oliver Stone’s movie JFK that Kennedy was on the verge of backtracking when he was shot.

Kennedy’s successor, Lyndon Johnson, sent half a million men to Vietnam because he didn’t want to be blamed for losing it, as Truman had been blamed for losing China. Johnson’s successor, Richard Nixon, saw that the war was lost, but he took four years to pull the troops out — an indecent interval in which thousands of Americans and Vietnamese died — because he didn’t want his name on an American defeat. For each of these US leaders, the concern was his country’s prestige (a Sixties word) and his own political standing. “An extraordinary fact about the decision making in Washington between 1961 and 1975,” Hastings observes, “was that Vietnamese were seldom, if ever, allowed to intrude upon it.”

Kennedy, Johnson, and Nixon were focused on the Chinese and the Russians, and assumed they were in charge in Hanoi as much as the Americans were in Saigon. Hastings says it was not so. The Russians and the Chinese were frustrated by North Vietnamese aggressiveness, and repeatedly advised them to cool it. Within the North Vietnamese leadership, Ho often agreed with his foreign advisers, but Hastings says that policy was set not by Ho but by Communist Party General Secretary Le Duan, “though the world would not know this.”

Nixon saw that the war was lost, but he took four years to pull the troops out — an indecent interval in which thousands of Americans and Vietnamese died — because he didn’t want his name on an American defeat.

By Hastings’ account the Americans were not the only ones who made big mistakes on the battlefield. Militarily, the biggest Communist mistake was the Tet (Lunar New Year) offensive of 1968. Le Duan’s idea was to show the flag in all the Southern cities, spark an uprising among the people, and swamp the Southern government in one big wave. In the event, the South Vietnamese didn’t rise. In Saigon, the Vietcong breached the wall of the US embassy, and in Hue, North Vietnamese regulars occupied the town north of the Perfume River for several weeks and methodically executed all their enemies. But everywhere the Communists were driven back.

The Vietcong lost 50,000 dead in Tet and follow-on attacks, five times the combined US and South Vietnamese military deaths. Largely cleansed of Vietcong, the countryside was quieter in the following year, as the North Vietnamese Army built up forces to fill the void left by the defeated Southern guerrillas. Though Tet was a military defeat for the North, the US press played it as a Communist show of strength, thereby tipping the balance of opinion in America against the war. For the Communists, a military defeat became a political victory.

The journalists had played it the way it looked, and it hadn’t looked like a South Vietnamese victory. American journalists had learned to distrust their government’s statements about the war. They should have begun four years earlier by distrusting the Tonkin Gulf Incident, which was used in 1964 to justify the de facto US declaration of war. Of the two supposed attacks on the destroyer USS Maddox, Hastings writes, one wasn’t real and the other was “a brush at sea that could easily and should rightfully have been dismissed as trivial.”

For the Communists, the military defeat of the Tet Offensive became a political victory.

In the case of Tet, US journalists inadvertently helped the enemy, but generally the press gave Americans a more accurate picture of the war in South Vietnam than the government did. The press did a poor job of reporting the shortcomings of the North, but it wasn’t allowed to go there. In 1966, when I was arguing for the war with my schoolmates, I repeatedly heard them say that communism would be a bad system for us but a better one for the Vietnamese. If Americans had had good reporting from North Vietnam, I don’t think my schoolmates would have said things like that. We anti-communists were right about one thing: communism turned out to be just as bad as we said it was.

The question remains as to what, if anything, America should have done to stop the Communists in Vietnam. Hastings quotes CIA officer Rufus Phillips describing what America did: “We decided that we were going to win the war and then give the country back to the Vietnamese. That was the coup de grace to Vietnamese nationalism.” But if it was wrong to do that in Vietnam, it should have been wrong in Korea, and it worked there, at least well enough to preserve the Republic of Korea. It can be no surprise that Kennedy and Johnson would try a military solution again.

What was the difference? Hastings touches on this question only briefly, mentioning the obvious: Korea is a peninsula with a border just 160 miles long, while South Vietnam had a border with Cambodia, Laos, and North Vietnam more than 1,000 miles long, perforated in many spots by the Ho Chi Minh Trail, the complex of corridors through which the Communists infiltrated the South with fighters and supplies. The warfare on the Korean peninsula was conventional, with front lines; in Vietnam it was a guerrilla contest while the Americans were there, becoming conventional only after they had decided to go. The physical climate was different, too. The Koreas were divided at the 38th parallel, about the latitude of San Francisco; the Vietnams were divided at the 17th parallel, about the latitude of Belize City. All of Vietnam is in the tropics, with attendant cloudbursts, humidity, bacteria, and bugs.

American journalists had learned to distrust their government’s statements about the war. They should have begun four years earlier by distrusting the Tonkin Gulf Incident.

And there were political differences. Ho Chi Minh was a hero of national independence; Kim Il Sung pretended to be one, but had a less inspiring story. Also, in Korea the old imperial masters were not long-nosed Caucasians but Japanese.

A penultimate thought. Hastings quotes without comment Lee Kuan Yew, the patriarch of capitalist Singapore, to the effect that if the Americans had not resisted the Communists in Vietnam, “We would have been gone.” Call this the “domino theory” if you like. It was a view I encountered in the early ’90s, when I worked in Hong Kong for Asiaweek magazine. Our founder and publisher, a Kiwi named Michael O’Neill, maintained that the American effort in Vietnam had stopped the Communists from pushing on to Thailand, Malaysia, Singapore, Indonesia, and the Philippines. Meanwhile, China had junked communist economics, and Vietnam, unless it wanted to remain poor, would have to do the same. And that, O’Neill argued, meant that the Americans had really won the Vietnam War, even if they didn’t know it.

Or maybe, I thought, we had lost the war but in the long run it didn’t matter — because the war wasn’t the decisive contest.

Twenty-one years after the war ended, I traveled to Vietnam with my wife and six-year-old son. In Danang I met a group of men who had fought for the South and suffered persecution from the victors. They weren’t bitter at the Americans, nor were the tour guides who drove us to Khe Sanh, who were too young to remember. In the North, at Ha Long, I chatted with the proprietor of a tiny restaurant who said that during the war, when he had been a truck driver for a state-owned coal mine, he had lost his house to American bombing. I told him I was very sorry my countrymen had destroyed his house.

He shrugged. “I have a better one now.”


Editor's Note: Review of "Vietnam: An Epic Tragedy, 1945–1975," by Max Hastings. Harper, 2018, 857 pages.


