No Escape from Human Nature


Are humans instinctively brutal? Do we attend hockey games, boxing matches, and race car events hoping to see blood? Do we rubberneck at car accidents hoping to see death? Have we really made no moral progress since gladiator games were used as public executions?

The producers of Escape Room want us to think so. From The Most Dangerous Game (1932) to The Naked Prey (1965) to The Hunger Games series (2012–2015), movies have explored the concept of humans hunting humans and have tapped into the idea of execution as entertainment. And that’s what happens in this movie.

Inspired by the escape-the-room genre of video games, real-life escape room adventures have become popular over the past decade in cities all over the world. Contestants are locked inside a room decorated to resemble a haunted house, prison cell, space station, or other isolated location and are given a time limit during which to discover clues, solve riddles, and find the escape hatch. It’s a fun, socially interactive, real-life alternative to sitting in front of a computer screen discovering clues, solving riddles, and finding the escape hatch.

They soon realize that one person will die in each room. Who will it be? What would you do to make sure it isn’t you?

The premise of Escape Room is simple. Six strangers are invited to compete for a high-stakes prize by solving a series of puzzles in order to escape from a series of rooms. Danny (Nik Dodani) is a videogame nerd who has played in nearly a hundred escape rooms before. Zooey (Taylor Russell) is a shy math prodigy with a talent for solving puzzles. Jason (Jay Ellis) is an investment banker with expensive tastes. Ben (Logan Miller) is a stock clerk for a grocery store. Amanda (Deborah Ann Woll) is an army veteran, and Mike (Tyler Labine) is a blue-collar worker. What has brought these six together? And how will they interact under pressure?

The six soon realize, of course, that this is no game. If they fail, they die.

With its PG-13 rating, Escape Room is high on suspense and low on blood and guts, making it an entertaining film as the audience members work along with the characters to solve the riddles and unlock the doors.

What makes the film interesting is the gradual reveal of the characters’ backgrounds and their interactions with one another as they do what it takes to survive. They soon realize that one person will die in each room. Who will it be? What would you do to make sure it isn’t you? They’re all strangers, after all. They only just met, and they have no personal connection with one another. Will self-interest lead to treachery? Or will goodness win out?

You couldn’t share. There simply wasn’t enough. So you did what you must.

Despite being driven by self-interest, we still seem to want our heroes to be self-sacrificing — at least in Hollywood. We cheered when Han Solo, that maverick businessman of the cosmos, returned to help the rebels in Star Wars. We took heart when Katniss Everdeen refused to kill her youthful opponents in The Hunger Games. We even approved when Paul Newman’s title character in Hombre, the ultimate libertarian hero, reluctantly risked his life to rescue the wife of the thieving, racist Bureau of Indian Affairs agent from the stagecoach robbers.

But in reality, when push comes to shove and our own lives are on the line, what would we do to survive?

I recently listened to The Women in the Castle, by Jessica Shattuck, a fictionalized account of the widows of German resistance leaders and their experiences during and after World War II. It’s a sappy, sentimental novel full of 21st-century morality and clichés. For example, Shattuck refers to “racial profiling” when her characters are asked to show their papers, a term that did not exist during World War II. Moreover, her protagonist is cloyingly egalitarian. She comes from an aristocratic background and thus has special access to food and protection. Yet she refuses to accept those special favors, or at least expresses consternation about accepting them. To hell with accuracy; Shattuck seems compelled to imbue her 20th-century protagonist with 21st-century values, no matter what. Such egalitarianism is a fine principle in times of plenty, but when your children are truly starving or threatened with death, you will accept any special opportunity offered to feed and protect them.

Hollywood conveniently whitewashes the truth about the survival instinct in order to celebrate community, sacrifice, and cooperation.

Actual Holocaust survivor Viktor Frankl belies Shattuck’s politically correct fantasy about genteel survival morality in his concentration camp memoir Man’s Search for Meaning. Frankl reveals a particularly troubling source of survivor’s guilt — he admits that in order to live through the kind of brutal starvation they experienced in the camps, those who survived had to be ruthlessly selfish at times. There might be one piece of potato in the soup pot, and that one piece of potato would determine who had enough sustenance to survive the march to work the next day, and who would collapse in the snow. You couldn’t share. There simply wasn’t enough. So you did what you must to scavenge that bite of potato, reach the warm spot at the center of the mass of prisoners, avoid the front of the line when the camp guards were looking for someone to shoot. You might feel guilty. You might be furtive. But you did it anyway.

In such films as Escape Room, Hollywood conveniently whitewashes the truth about the survival instinct in order to celebrate community, sacrifice, and cooperation. The hero manages to be self-sacrificing and self-interested, to fall on the grenade and make it out alive. And that’s OK. After all, we’re looking for escapism, not realism, in entertaining movies like this one.


Editor's Note: Review of "Escape Room," directed by Adam Robitel. Original Film Production, 2019, 100 minutes.





Remembering the Great War


As the world prepared to commemorate the 100th anniversary of the armistice that ended World War I on November 11, 1918, director Peter Jackson accepted a daunting commission: to create a documentary that would honor the soldiers who fought in the trenches, using the original footage filmed 100 years ago.

This would not be a documentary about generals, military strategy, assassinations of obscure archdukes, or theaters of war. Jackson would not interview modern historians about the significance of the war or provide any scripted narration. Instead, Jackson would bring these long-dead soldiers to life by allowing them to tell their own story.

The result is a magnificent piece of work, both in the story it tells and in the technology Jackson used to tell it. This is a film made entirely in the editing room.

This would not be a documentary about generals, military strategy, assassinations of obscure archdukes, or theaters of war.

To create the storyline, Jackson and his team reviewed over 600 hours of interviews with survivors, conducted during various commemorations of the War to End All Wars. Jackson then began selecting portions of the interviews, taking a snippet here and a snippet there, until he was able to cobble together a narrative line that begins with young 16- and 17-year-old boys sneaking off to lie about their ages in order to join the army; follows them into the trenches, villages, and battlefields; and ends with the survivors returning home, many of them injured, many of them “loony” (an earlier term for PTSD), and many of them (according to one of the narrators) facing employment signs that said “Army veterans need not apply.” Their remembrances, told with voices that are cracked with age, are moving and authentic. No historian’s expertise could tell their story better.

Once the storyline had been established, Jackson reviewed 100 hours of footage from the war, selecting the best scenes to match the narration. Much of the footage was third- or fourth-generation, meaning it was a copy of a copy of a copy, each generation becoming less and less crisp. Much of it was either too dark or too light to be viewed clearly. And all of the movements were jerky and unnatural, because the cameramen had to crank the film through the camera by hand, trying to keep it steady at approximately twelve frames per second, only half the rate we are accustomed to seeing in today’s movies.

And here is where the magic begins. Jackson used computer technology to add frames to the footage, smoothing out the action and making it feel as normal as any film you would see today. Then he colorized the film, using actual uniforms, tanks, and other artifacts from his own considerable collection of WWI memorabilia to help the artists get the colors just right. Next he enlisted professional lipreaders to figure out what the men were saying in the footage, and hired voice actors from the actual regions of each regiment, so the accents would be authentic. He added sound effects made by recording actual tank movements, mortar explosions, the fixing of bayonets, and other background noises. Finally, he created a musical score largely based on whistling and other natural music of the battlefield. The result brings these antique films to life. We simply forget that cameras couldn’t do this 100 years ago.

Jackson brings these long-dead soldiers to life by allowing them to tell their own story.

I’m not usually a fan of colorization; while it does make a film feel more natural for modern viewers, it neutralizes the skillful play of shadow and contrast designed deliberately and carefully by directors of the ’30s and ’40s. They knew what they were doing, and they did it well. However, in this film the colorization is a masterful addition. It brings out details in the film that in black and white were hidden or completely lost. Most notable is the blood; we simply don’t see blood as anything but dirt in black and white.

We also see how terribly young these soldiers were, marching off to war and grinning for the cameras. Although we never know their names, Jackson edits the footage so that some of the men come into view several times, and we begin to identify with them. We see not only the war, but how they lived, what they ate, how they slept, and even how they played. And in many cases, we are seeing them just before they died. It is a sobering, respectful, and impressive film.

They Shall Not Grow Old is neither pro-war nor anti-war; it simply asks us to consider the cost of war — not in the billions of dollars that are spent, but in the millions of lives that are lost. The title of the film is based on the “Ode of Remembrance,” a selection from Laurence Binyon’s poem “For the Fallen,” which has been used as a tribute to all who die in war:

They shall grow not old, as we that are left grow old:
Age shall not weary them, nor the years condemn.
At the going down of the sun and in the morning
We will remember them.

They mingle not with their laughing comrades again;
They sit no more at familiar tables of home;
They have no lot in our labour of the day-time;
They sleep beyond England's foam.

Lee Teter’s painting “Vietnam Reflections” pays a similar tribute to the fallen, but from a different perspective: that of the grieving survivor. It depicts a man, clearly a veteran though he wears no uniform, mourning at the Vietnam Veterans Memorial in Washington, DC, where the names of all the fallen are etched on a long, low wall deliberately situated below ground level. His head is bowed in quiet anguish, his arm outstretched and his hand leaning heavily against the wall, willing it to reach inside and touch his comrades on the other side. Unseen by him, because his eyes are closed, several soldiers seem to be standing inside the wall, their reflections ghostly as they reach out, hand to hand, to console the man who, having survived the war, continues to carry its burdens. His guilt is conveyed by the clothing Teter chose to give him. He is dressed in a business suit; the soldiers wear army fatigues. A briefcase rests on the ground beside the veteran; the soldiers carry field kits. The businessman’s hair is flowing and tinged with gray; theirs is dark and crew cut. The fallen soldiers shall not grow old, start businesses, or have children.

Most notable is the blood; we simply don’t see blood as anything but dirt in black and white.

And therein lies the survivor’s grief. “We that are left grow old,” as Binyon says in his poem, but survival is neither a reward nor a relief. It is a burden. Age does weary them, and the years do condemn.

No one knows the true story of war except those who experience it, and even then, it is a private, individual grief that none of them can truly share or understand. Consequently, using the voices of the actual soldiers to tell their story was a brilliant narrative strategy for They Shall Not Grow Old. They speak next to one another, but not in conversation with one another. The viewer remains enveloped in the immediacy of the story and simply observes their experience without explanation, editorializing, or the distraction of a modern historian’s modern interpretation.

The film is moving and impressive, but you’ll have to find it on Netflix or another platform because its theatrical release was limited to just December 17 and December 27. And that’s a shame, because the moment when Jackson switches from the jerky, original, black and white footage to his colorized and edited version is breathtaking. I’m so glad I got to see it on a full-sized screen. If you do see it, make sure you watch the director’s cut with Peter Jackson’s interview explaining how he did it. It’s like listening to a magician’s reveal.


Editor's Note: Review of "They Shall Not Grow Old," directed by Peter Jackson. WingNut Films, 2018, 99 minutes.





Beer, Bikes & Brexit


“Ride left, die right!”

Our mantra, often repeated to each other as a cheerful, running admonition, sometimes shouted in panic, but mostly recited as a silent meditation, accompanied us as we pedaled our bikes from the toe of Cornwall to Scotland’s sagittal crest during June and parts of May and July. The Land’s End to John O’Groats quest has become something of an obsession not only in the UK but also among a cross-section of aficionados worldwide — a British version of the Way of St. James, if you like. Like the Camino de Santiago, it has many alternates, the shortest at 874 miles. Our choice, the Sustrans route along the National Cycle Network, is 1,200 miles long.

One aspirant, who with his wife runs a B&B in Bodmin, Cornwall (in which we overnighted), was leaving for John O’Groats the following day to run one variant. Yes, run. Or so proclaimed the placard on the back of the ungainly plastic box that contained his essentials (including a sleeping-rough kit) and was duct-taped to a tiny day-pack he’d strap to his back:

Colin is running for ShelterBox disaster relief charity
1 man 3 Peaks
1 ShelterBox
1,000 miles marathon a day

The 3 Peaks were Ben Nevis, Scotland’s highest; Scafell Pike, England’s highest; and Brown Willy, a small hill atop Bodmin Moor, Cornwall’s highest point (and one over which Tina, my wife, and I biked on our adventure). The man was 53 years of age, and this was his third British Isles end-to-end ultra-ultra-marathon, as these inconceivably long long-distance runs are called.

He wasn’t the only eccentric adventurer we encountered. Another runner, whom we met at John O’Groats just as we were finishing, was just starting out. Unlike Colin, he’d packed his gear into a two-wheeled trailer attached to his chest with a harness. As we watched him start, he jogged into the arm swing and curious gait that ultra-marathoners affect to make it look as if they were running when in fact they proceed little faster than a fast walk, or about four miles per hour. We never found out his raison de run. One tiny, 73-year-old man with a backpack the size of a manatee and a pace that rivaled varve deposition in Loch Lomond (where we encountered him) was doing the walk to commemorate the Queen’s longevity. He presented us with his card. It requested donations to cover his expenses.

The man was 53 years of age, and this was his third British Isles end-to-end ultra-ultra-marathon.

Ian Stocks was bicycling a 20-day, 1,500-mile variant that included the UK’s easternmost and westernmost salients, for Alzheimer’s Research UK. At a projected 75-mile-per-day pace he ought to have been much further south than where we met him. I noticed that his gear — bike, panniers, electronics — all looked new, and my BS antenna began to quiver. The received wisdom among classical liberals is that as welfare programs take over the functions of private charities, the latter tend to atrophy. Great Britain, definitely a welfare state, seems to have a surfeit of charitable initiatives. What was going on?

I’d once been solicited for a contribution to “raise awareness for breast cancer” by a group of women breast cancer survivors who were planning on skiing across the Greenland ice cap. They were all seasoned adventurers. I knew what they were up to. Contributions would pay for gear and transportation first; any money left over would go to the “raise awareness” bit.

At this point, let me correct a popular misconception concerning folks who participate in extreme sports, objectives, deeds, adventures, and such for charity. Their shtick is to persuade the public that they are willing to undergo extreme exertion and privation for a good cause. But nothing could be further from the truth. They do what they do because they are addicted to adventure, unique accomplishments, firsts, personal challenges, transcendence of the quotidian, making their mark, even adrenaline or drama; in short — they love what they do. But extreme adventures are costly, so many fund their objectives by invoking the charity label. I told Trish, the leader of the Greenland expedition (who, years before, had taught me to cross-country ski), that I needed my money to fund my own adventures and that I wished them luck. She didn’t take that well.

Great Britain, definitely a welfare state, seems to have a surfeit of charitable initiatives. What was going on?

So I checked out Ian Stocks’ website. What a surprise! All contributions go directly to the charity; nothing passes through Ian’s hands. Ian’s motivation is his father’s dementia. As of this writing, Ian is still behind schedule, mile-wise, but he has raised over 100% of his targeted contributions.

To me the more fundamental question is why this whole charade is necessary. If an individual wants to make a charitable contribution to a cause he cares for, why does he need a sideshow with no connection to the cause to spur him? Is it even entertainment? Perhaps, in a gladiatorial sort of way.

My wife Tina and I had decided to tackle the end-to-end ride for purely selfish reasons: beer — unlimited traditional cask ales (more on them later), single malt whiskies, and daily feasts of delicious UK food: the full breakfast fry — bacon, sausage, egg, baked beans, fried mushrooms and tomato, black pudding, hash browns and fried toast; the best curry restaurants in the world (Kashmiri, Bengali, Pakistani, Bangladeshi, South Indian); fish and chips to die for; carveries featuring four roasts with all the trimmings, including Yorkshire pudding and meat pastries that defy counting; steak and ale and steak and kidney pies, sausage rolls, Cornish pasties, shepherd’s and pork pies, and many local, flaky variants. And, of course, the chance to explore a country in low gear, meet the people — prone to extreme civility with a subtle but outrageous sense of humor — and get our fair share of exercise in order to continue such adventures well into our senility.

If an individual wants to make a charitable contribution to a cause he cares for, why does he need a sideshow with no connection to the cause to spur him? Is it even entertainment?

Many end-to-end bikers from a variety of countries crossed our path. Unlike motorists, long-distance bikers always stop to pass the time of day, to inquire about one another’s journey, objectives, provenance, etc. Nearly all who were heading north targeted John O’Groats. Just to add a little spice to the repetitive answers and one-up them all, I decided to tell everyone that Tina and I were headed for Scapa Flow. Only the Brits got the joke.

Two separate couples had come all the way from New Zealand. I asked them why they’d come halfway around the world for this biking adventure, when they lived in a country famous for its natural beauty and low population density, a country that would seem to offer a biking paradise. Both couples shook their heads and looked at each other. They both — separately — responded that New Zealand was a relatively new country, and so did not have a well-developed network of old or secondary roads crisscrossing the two main islands. Only primary highways, mostly two-lane, bind the country together. These have narrow shoulders (when they have shoulders at all), and drivers are not sensitive to bikers.

The Road Less Traveled

The Sustrans route we chose uses traffic-free paths and quiet single-lane roads, hence its 1,200-mile length. Those quiet single-lane roads have their own quirks. Nearly all are bordered by 6–9’ hedges, perfectly vertical and maintained by vertical mowers. They are so narrow that planners have installed “passing places” and “lay-bys” about every 100 yards. A car encountering an oncoming car — or bike — must wait in one of these for the other to get by. However, the Sustrans route also seems to go out of its way to stitch together every hilltop, traverse watersheds cross-wise instead of following drainages, and generally adhere to Mae West’s observation that “the loveliest distance between two points is a curved line.”

Just to add a little spice to the repetitive answers and one-up them all, I decided to tell everyone that Tina and I were headed for Scapa Flow.

England’s myriad roads, in plan view, mimic the pattern formed by cracked tempered glass — an intricate grid twisted and crabbed beyond any recognizably geometric shape and resembling a Voronoi tessellation. They started out that way and only got more complex as time went on. When Rome conquered Britain, according to Nicholas Crane in The Making of the British Landscape, “the web of footpaths and tracks serving settlements were an ill-fitting jigsaw of local and regional networks which were difficult for outsiders to navigate.” The bends and salients in England’s roads had evolved over hundreds (or even thousands) of years to link settlements, sources of raw materials, strongholds, religious sites, and so on. They evolved organically before the invention of bulldozers, and certainly before modern road engineering, with its cuts and fills that reduce gradients and straighten out unnecessary curves. Except for the historic nature of English roads, which sometimes subjected us to 20% grades and less-than-direct transects, they’re a biker’s paradise.

The Hills Are Afoot

Cornwall, the forgotten Celtic country, was a disheartening start to a very challenging ride. Not only does the Cornish section of the route gain little ground in a northerly direction — it sometimes even trends south — but its ride profile resembles tightly clustered stalagmites, with significant climbs over Bodmin Moor, the St. Buryan and St. Columb Major summits, and a queue of lesser hills. Our old bodies required two rest days in quick succession — at Truro and Bude — if we were to have any chance of reaching Scotland pedaling.

Cornwall might seem forgotten because it’s bedeviled by an identity crisis. Although Celtic in origin, distinct in language and separate as an entity from England, with many unique cultural traits, it somehow missed Tony Blair’s devolution revolution in 1997. Rob, our host at the very modest Truro Lodge, told us that Truro, a cathedral city, was the capital of Cornwall. Since he’d been Truro’s last mayor, I asked him if that made him First Minister of Cornwall. He smiled wryly, admitting that Cornwall had had such an influx of English settlers that there wasn’t much enthusiasm for Cornish devolution, much less independence.

Except for the historic nature of English roads, which sometimes subjected us to 20% grades and less-than-direct transects, they’re a biker’s paradise.

But there is some ambivalence. The Cornish language is being revived. Cornish music, nowhere near as popular as Irish or Scottish music, can still be heard. Outside Truro Cathedral, a couple of buskers with fiddle, guitar, and microphone played traditional tunes to an enthusiastic audience. And in Penzance, along the waterfront promenade, a Cornish band led by a baton-waving, tails-wearing drum major marched in front of our hotel at dusk each evening playing Cornish morris-type music (I later found out that the all-volunteer ensemble was short on musicians and was soliciting participants).

In 2007, David Cameron promised to put Cornwall’s concerns "at the heart of Conservative thinking." However, the new coalition government established in 2010 under his leadership did not appoint a Minister for Cornwall. Although Cornwall holds only the status of a county in Great Britain, as recently as 2009 a Liberal Democrat MP presented a Cornwall devolution bill in Parliament (it got nowhere), and advocacy groups demanding greater autonomy from Westminster have waxed and waned over the years.

On June 5 we left Cornwall and entered Devon, part of Thomas Hardy’s Wessex, complete with irresistibly cute, white-washed thatched-roof cottages. Though every bit as hilly as Cornwall (1,640’ Exmoor, the Mendip Hills, and the Blackdown Hills to the fore), it welcomed us with a roadside sign declaring: Where there are flowers there is hope.

The Cornish language is being revived. Cornish music, nowhere near as popular as Irish or Scottish music, can still be heard.

“Oh, how much fun!” Tina declared — her enthusiastic response to any novelty, serendipitous triviality, unanchored excess of exuberance, or even the prospect of another 20% uphill grade. Up these hills our odometers would sometimes display only zero miles per hour, even though we were making progress. To pass the time on the slow pedal I recounted the libertarian themes in Hardy’s Jude the Obscure, a novel she’d never read: his depiction of marriage as a crushing force, his belief that organized religion complicates and obstructs ambition, and his critique of the Victorian class system.

At Glastonbury (another rest day) our route finally turned resolutely north. The famous abbey town and final resting place of the legendary King Arthur has become a bit of a Wessex Sedona, with crystal shops, goddess centers, metaphysical bookstores, vitamin, herb, and natural food shops, and a vibrant cast of street characters in various stages of mendicancy, sanity, and hygiene, exhibiting sartorial extremes from total nakedness through purposeful dishevelment to natty eccentricity. Even our B&B hostess had a claim to fame: Sarah Chapman held the Guinness World Records women’s record for walking five kilometers on her hands! But the ruins of the abbey, legendarily founded by Joseph of Arimathea in 63 AD and associated with Saints Columba, Patrick, and Bridget, but sacked by Henry VIII after his break with Rome, when the abbey refused to submit to him instead of the Pope, are the town’s saving grace.

By the time we reached Bristol we were deep in the bosom of Old England. Bristol, once England’s doorway to the world, is a thriving, lively, modern city. In its harbor docks a lovingly replicated Matthew, John Cabot’s ship. A plaque next to his oversize statue reads: In May 1497 John Cabot sailed from this harbour in the Matthew and discovered North America. The only drawback to being a port city is the seagulls: loud, giant avian dive-bombers. They are brazen and incorrigible in their quest for food. Early mornings reveal overturned trash bins throughout the city. Gulls have been reported snatching burgers out of hands and even whacking a snacking pedestrian on the back of the head so that he dropped the tidbit for the gull to steal. One mayor complained that gulls are a protected species.

On to the Midlands

Past the moors and fens, the landscape turned to rolling farm and toft country dotted with rhododendron copses. Through the humid and fecund West Midlands, we developed a fondness for the heady odor of pungent silage mixed with barnyard manure — definitely an acquired taste. One evening at a pub, a morris troupe, performing traditional English music and dance dating from before the 17th century, enhanced our after-ride pints. The all-male troupe, wearing bells (perhaps the original source of the phrase “with bells on their toes”) and accompanied by a squeeze box, was delighted to entertain foreigners familiar with morris dancing. We stayed in an old Tudor building with buckled floors, absurdly low pass-throughs, and narrow winding stairs whose commemorative plaque read: Crown Hotel: Rebuilt in 1585 on site of a much earlier inn destroyed by the fire of 1583. A coaching stop on the London–Chester run.

By now Britain’s schizophrenic weights and measures standards were beginning to puzzle us. Road distances were in miles, temperatures in centigrade, beer and milk in pints, and folks gave their weight in “stone,” with large weights measured in Imperial tons. While the metric system may be simpler in computation, the English system is ergonomic and evolved organically, thereby rendering it more intuitive. And, most curious of all to me, a northern country that in summer experiences 19 or 20 hours of daylight and invented standard time, which it measures from the Prime Meridian of the World at Greenwich, succumbs to the idiocy of Daylight Saving Time.

Refreshingly, the government has not been able — by and large — to impose metric mandates or force observance of DST throughout the realm. When the time changes, businesses readjust their opening and closing times to GMT. With barely four or five hours of total darkness, how much daylight needs to be “saved”? As to the other weights and measures, one informant told me that, except for centigrade temperatures, all new and traditional systems coexist peacefully, with only a handful of rigid requirements such as strong spirits in pubs, which must be sold in 25ml, 35ml, 50ml, and 70ml increments.

Up these hills our odometers would sometimes only display zero miles per hour, even though we were making progress.

Worcester (pronounced Wooster) is the home of Worcestershire Sauce and the site of the last battle of the English Civil War, in which Cromwell decisively defeated the Royalists. Even more importantly, Worcester Cathedral holds the remains of King John, he of the Magna Carta. The tomb was extremely moving, not just for its considerable age and all the empty space surrounding it, but also for the immense significance of Magna Carta itself. For all that a lot of it is unintelligible, Magna Carta was the first assault on the absolute power of English royalty through the separation of powers and the recognition of the rights of a portion of the populace.

In keeping with Sustrans’ objective of avoiding traffic, we bypassed Birmingham, Britain’s second largest city. Not so for Manchester. Inevitably, we got lost there. Signage was poor, our map not detailed enough, and Google not up to the task. So, contrary to the clichéd male stereotype, I asked a passerby for directions. The lady responded, “You’re in luck, I’m a geographer. Where are you going?”

Now, asking passersby has its drawbacks — too many to detail here — but, in this instance, we weren’t going to a particular place but rather trying to find National Cycle Network Route 6 to get back on track. Never mind; an academic geographer informant — here was the gold standard! After detailing our trip to her I showed her our guidebook’s map. She was no biker and had never heard of the National Cycle Network. She wasn’t impressed by either our guidebook or our map, of which she couldn’t make sense. At once she launched into a tirade about computer-generated maps and lectured us on the preeminence of British Ordnance Survey maps.

Through the humid and fecund West Midlands, we developed a fondness for the heady odor of pungent silage mixed with barnyard manure — definitely an acquired taste.

I responded that she was absolutely correct, except that we would have needed over 100 Ordnance Survey maps to cover our entire route, at a prohibitive cost in space and pounds sterling. Then she and Tina, interrupting their on-again, off-again chitchat about the riddle at hand, pulled out their smartphones — the last resort of the self-unreliant — and sought guidance from Google.

By now I was losing patience. We’d eaten up precious time getting nowhere, so I fell back on a navigator’s last resort: bracketing. I thanked our geographer for her help, gently disengaged Tina from her, and explored four separate directional salients for a mile each, starting from the roundabout where we’d stopped, to determine whether each was or wasn’t the way we were headed. Through the process of elimination, a compass, a closer examination of the clues in our guide, and not a little intuition, we found our route. Lo and behold, we were nigh on it! A block further along the last salient explored, we encountered a National Cycle Network Route 6 sign.

The lessons: never mistake a geographer for a cartographer. The former specializes in the distribution of the human population over the surface of the land; the latter makes maps. And . . . have confidence in your own abilities.

North by Northwest

The Yorkshire Dales, Cumbria, and the Lake District welcomed us with a smorgasbord of all-you-can-climb hills, appetizers to the Scottish Highlands. By now we’d talked to a lot of innkeepers, publicans, bikers, walkers, shopkeepers, and random strangers. With the 70th anniversary of the National Health Service (NHS) imminent on July 5, I seized the infrequent opportunities that arose to gather anecdotes about people’s experience with the service, especially now that Conservative governments had floated proposals to make the NHS financially more viable, most of which included increasing copays. I never brought up the subject but always managed to get folks to elaborate on offhand remarks. One lady mentioned that she’d recently broken her wrist playing cricket. So I asked her if the NHS had taken care of her (Britain has a dual — private and public — insurance and medical system).

For all that a lot of it is unintelligible, Magna Carta was the first assault on the absolute power of English royalty.

“Yes, they did,” she said. But then she backtracked, saying, “No, they didn’t.” So she explained. She went to the nearest hospital with her hand bent at an unnatural angle to her forearm. The staff said they had no room for her, to go to another hospital. So she did. The next hospital looked at her wrist and said it was broken. But they had no room for her. “Go home and wrap it up,” they said. Luckily, her husband had private insurance. The private doctor immediately took care of the fracture.

Another B&B host, an elderly lady who had recently lost her husband and ran a very modest one-woman operation, told us she’d had a hip replacement. I asked how well the NHS had treated her. She responded that it had taken a while to get the procedure done, but only because she didn’t understand and had difficulty navigating the bureaucratic requirements. Once she mastered them she was put in the queue, awaited her turn, and was very happy with the results.

Of course, the other hot topic of conversation was Brexit. I wasn’t shy about soliciting opinions on that. Two issues determined the close vote: immigration and EU rules (trade, a third issue, was uncontentious: everyone favored trade). However, the first two are interpreted very differently along the political continuum.

Luckily, her husband had private insurance. The private doctor immediately took care of the fracture.

In the course of our five-week traverse of the island we encountered numerous resident immigrants from a very broad array of countries working in sales, labor, and the service sector. I made a point of listing the countries they hailed from: Italy, Romania, Poland, Venezuela, Eritrea, Somalia, India, France, Pakistan, Greece, Spain, Bangladesh, Hungary, Czech Republic, Ethiopia, Thailand, Russia, Germany, Argentina, China, Latvia, Bulgaria, Slovakia, Belgium, Brazil, Philippines, Ukraine, Ireland, and the USA. These were not tourists or ethnic waiters at ethnic restaurants.

Left-leaning reportage attributes the pro-Brexit, anti-immigration vote to “racism,” or “little Englanders,” the British version of chauvinist rednecks. Right-wingers claim that immigrants are taking over jobs. Neither of these glib explanations struck a chord with us or our informants. But all, regardless of whether they were “leavers” or “remainers,” expressed strong concern about Britain’s nearly limitless immigration. One Welsh AI entrepreneur, Gareth — a remainer — averred that with an unemployment rate of 4.1% there was no employment problem in the UK. He was so fixated on trade that he blithely dismissed any other concern as illusory.

As to racism, none of the immigrants we interviewed alluded to it; in fact, all expressed a great deal of mutual affection and respect among themselves, the Brits, their neighbors, and their employers (ours was a very limited random sample). And none of the Brits expressed any — even the slightest — unfavorable sentiment about foreigners. Only when riding through Muslim enclaves did we sense any, admittedly vague, tension. So what was going on?

One waitress complained about the niggling EU rules — another erosion of British sovereignty — that even control the shape of bananas an Englishman can eat.

My sense is that the Brexit vote was a continuation of a British exceptionalism that goes back to 1066 — it’s been nearly a millennium since the last invasion. Compared to the continental countries, Britain has been uniquely stable, especially — being an island — as to its borders. In that sense, there is a nebulous perception of continental countries as entities akin to banana republics, with feuds, invasions, and shifting boundaries. To Brits, joining that club has always cost some degree of sovereignty. Margaret Thatcher personified that sentiment when she was unwilling to sacrifice the pound sterling, the world’s oldest, most stable currency (except under Callaghan and Wilson), to a committee of continental bureaucrats. Britain did not join the euro, but it did join the European Union, a continuation of the aspiring free-trade policies of the earlier Common Market. The Brits want to trade but don’t want others to control them.

One Scots barmaid was in favor of leaving, but voted to remain for the sake of her children. She complained about the niggling EU rules — another erosion of British sovereignty — that even control the shape of bananas an Englishman can eat. Gareth, our Welsh informant, thought this a red herring. But immigration rules are part of the broader EU rules: both require a surrender of sovereignty, and the Brits have had enough of ceding it.

Finally, there was a general concern that Britain was losing its identity — its culture, if you will — and becoming a nation of immigrants like the US. The August 11 issue of The Economist reports that “more than a third of London’s population was born abroad.”

Scotland the Heatwave

It was uncanny. As soon as we crossed the unmarked border into Scotland, the plaintive tones of a highland bagpipe filled the air. Around the corner we suddenly found ourselves in Gretna Green, once Britain’s answer to America’s Texas, where Scottish law allowed marriage between underage couples, but now a slightly pathetic tourist trap where couples with a romantic disposition to elopement still choose to tie the knot. Never mind, we were entranced and let the piper grab our souls, wrench our hearts, draw tears, and make us feel that we could transcend our limits. And, remarkably, accents turned on a penny from Yorkie to brogue.

As they say in Kentucky, “we were in pig heaven!”

On the first day of summer, hordes of embarrassingly (to us, anyway) scantily clad Scots crowded along the shores of every loch, river, canal, and estuary, suntanning their albescent flesh. The unusually hot and dry weather, which had started earlier, was the cause of much comment. Tina, ever one to engage anyone in friendly conversation, asked a middle-aged lady if the unusual circumstances might be caused by global warming. The lady replied that if they were, “Bring it on!” In the 20 days we spent in Scotland it never rained. On June 29, while we were at Pitlochry, the temperature hit 89 degrees Fahrenheit, leading to a hot, muggy night with little sleep in a land where air conditioners and fans are a waste of money.

We looked forward every day to a pint or two of “real ale,” available in participating pubs everywhere but sadly lacking in Gretna Green — another disappointing aspect of the little town. I’m an avid fan of British real ale, a beer nearly unavailable anywhere else and a primary reason for our trip. Real or cask ales (cask-conditioned beers) are unfiltered (they still retain yeast, though it drops to the bottom of the cask) and unpasteurized, conditioned by processes including secondary fermentation and served from the cask without additional nitrogen or carbon dioxide pressure. They are drawn by hand pump and give a good head in spite of being lightly carbonated compared with bottled beers. Whether brewed as bitters, stouts, porters, or even IPAs, there is nothing quite like them.

Breweries are small and local, and mostly supply only a handful of establishments — until recently, that is. We visited one brewery in Pitlochry, the Moulin Traditional Ale Brewery, that brews only 120 liters per day of four different ales and supplies only one pub and one hotel. In the latter half of the last century corporate brewers began buying up pubs, pushing their own beers and sidelining — or even eliminating — cask ales. Brits were not amused. In response, the Campaign for Real Ale was founded in 1971, and it managed to convince the corporates to keep cask ales. Some, such as Adnams, Greene King, and Marston’s, now even brew their own cask ales.

Although this anecdote is either false — Hume died in 1776 — or was altered in the retelling, it well captures Hume’s thinking.

While in Glasgow we managed to hit the Fifth Glasgow Real Ale Festival, offering over 150 different real ales from all over the realm. As they say in Kentucky, “we were in pig heaven!” We’d barely finished our first pint when the 18-piece Caledonian Brewery Edinburgh Pipe Band marched in playing “Scotland the Brave,” forcing us to freeze in place and raising the hairs on the nape of our necks. We imbibed 105 different real ales during our ride. Only space prevents me from listing them all and their creative names. As of 2014 there were 738 real ale brewers or pubs in the US. There might even be one near you.

In Killin we took a rest day and visited the Clan McNab burial grounds on Inchbuie Island in the River Dochart, along with the associated Iron Age fort and even earlier stone circle. Here in Prescott, Arizona, my hometown, David McNab books Celtic musicians who come on tour to the US. Married to a Scots lassie, he treasures his heritage. We’d be a culturally poorer town without his concerts.

As we passed Loch Tay, the Scottish Crannog Centre, an outdoor museum with a restored lake dwelling dating from about 800 BC, beckoned. The crannogs were built on stilts or artificial rock islands in the water. Villages consisting of three crannogs, each with about 90 inhabitants, were common in Scotland and Ireland as early as 5,000 years ago and as late as the early 18th century. While Scotland has only 350–500 known crannog villages, Ireland — on a much larger land mass — boasts about 1,200. Doubtless both countries have many more, since underwater archaeology presents considerably more obstacles (in survey and excavation) than terrestrial digs.

This odd dwelling pattern was first glibly explained as defensive in nature (most 19th century archaeologists being retired military men), but little evidence of weapons or warfare associated with the crannogs exists. The new explanation is that the dense vegetation of the Celtic countries made cleared land too valuable for mere habitation, reserving it for agriculture, while the riparian location facilitated extensive trade networks, evidence for which — including networks reaching all the way to mainland Europe — is abundant.

The Loch Tay Crannog Centre, near Kenmore in Perth and Kinross, isn’t just one reconstructed crannog with three dugouts. The staff has recreated the entire lifestyle of the inhabitants: foot-operated lathes; grain-grinding stones; wool spinning, dyeing, and weaving; and fire-starting by “rubbing two sticks together,” a practice often mentioned but seldom seen. It means using a fire drill. With the proper knowledge, preparation, and materials, all things are possible. The demonstrator (even his shoes and clothing were authentic) started a fire in less than a minute.

The Braw Hielands

Somewhere beyond the Crannog Centre we crossed into the political subdivision known as the Highlands and Islands of Scotland. Trees and settlements became scarcer, midges and cleggs more numerous. Heather (purple), gorse (yellow), and bracken (green) gilded the landscape. Long-haired Highland cattle and Scottish Blackface, Wensleydale, Cheviot, and Shetland sheep predominated. It is here — not in Gretna Green — that the romance of Scotland kicks in: Rabbie Burns; Bonnie Prince Charlie; Nessie; Capercaillie and Old Blind Dogs; kilts, sporrans, and claymores; haggis; the Outlander miniseries; and even Mel Gibson berserking over the moors as William Wallace come to mind.

However, my own mind gravitated to those two giants of the Scottish Enlightenment, David Hume and Adam Smith. I’d not run across any memorials, statues, or even streets named for either in their homeland. That’s more understandable for Hume, whose somewhat counterintuitive, esoteric — albeit undogmatic — thinking isn’t readily accessible. But Adam Smith, the father of economics, the Charles Darwin (or Albert Einstein) of the dismal science, is a household name. His insights are readily accessible and intuitive.

In three separate trips to Scotland, I have been struck by the lack of Adam Smith memorials.

Smith and Hume were drinking buddies (which is saying a lot in 18th century Scotland, where getting plastered to oblivion was a national pastime). One bit of Hume’s thought that was accessible — though still counterintuitive — is encapsulated in an exchange he had with Smith. The United States had declined to establish an official religion for the new country: a first for its time. Smith, a man of indeterminate religious beliefs, bemoaned the fact, opining that the lack of an official faith would doom the country to irreligiosity. Hume, an agnostic, disagreed. He predicted that countries without official faiths would experience a flowering of religions, while the official religions of countries that had them would wither into irrelevance. Although this anecdote is either false — Hume died in 1776 — or was altered in the retelling, it well captures Hume’s thinking.

The anecdotal Hume was right. America soon experienced the Second Great Awakening, the birth of a multiplicity of religious sects in the 1800s. Today, according to The Guardian (September 4, 2017), more than half the UK population has no religion, while nearly 76% of Americans identify as religious.

In three separate trips to Scotland (on one of which I walked across the country) I was struck by the lack of Adam Smith memorials. One informant said the Scots had little affection for Smith. Public opinion inside Scotland holds Adam Smith, the father of capitalism, responsible for the Highland Clearances. And public opinion outside Scotland perceives the Scots as socialist. It’s not so simple.

In Scotland, in the 2017 UK elections, the Conservative Party received 28.6% of the vote and overtook the Labour Party, the real far-left socialists, who received 27.1%, as the main opposition to the dominant Scottish National Party, which got 36.9%. The Scots are nationalistic, thrifty, good businessmen who hate taxes — traits not often associated with socialism (though they abhor class and status pretensions).

But back to Smith and the Highland Clearances. Smith was a strong advocate of both private property and efficiency in production. When The Wealth of Nations came out, Scottish clan chiefs reinterpreted their position as not just clan heads but also fee-simple owners of clan lands, based on their reading of Smith’s concept of private property. They became lairds, owners of the clan lands, instead of feudal lords. As feudal lords they’d had a complex set of rights and duties with their crofters. However, as lairds, they suddenly became absolute owners of what was now their private property. Since Scottish law had not formalized feudal rights and duties, the transition from a feudal system to a modern market economy was — to say the least — awkward.

The crofters were subsistence farmers. Their part of the deal was to give a percentage of their harvest to the clan chief in return for protection, leadership, dispute resolution, and so on. Advances in agronomy and a booming market for wool indicated to the new self-declared lairds that sheep grazing would enrich them much more than a few bushels of oats. Most chose sheep over oats and evicted the crofters, hence the Clearances. (This is a simplified version.) Not all lairds ignored the crofters’ feudal rights. Lairds’ responses ran the gamut from keeping the crofters as tenant farmers, to buying them out, to cruel dispossession and eviction. There was no uniform formula; the greediest landlords made the headlines. Adam Smith got the blame. Finally, however, in 2008, an elegant ten-foot bronze statue of Adam Smith, mounted on a massive stone plinth and financed by private donations, was unveiled on Edinburgh’s Royal Mile, within sight of a rare statue of his friend David Hume.

Outside Inverness, capital of the Highlands, the Culloden battlefield, site of the last battle (1746) fought on British soil, cast its spell. Supporters of the Stuart (Jacobite) dynasty fought the by-then established Hanoverian army of George II. The German Hanoverians had been installed as monarchs of the United Kingdom after Parliament tired of both Stuarts and civil wars. A common misconception holds that Jacobitism was a Scottish cause because the Stuarts, before being invited to rule over England, had been kings of Scotland, and most of the Jacobites were Scots. Again, not so simple.

Since Scottish law had not formalized feudal rights and duties, the transition from a feudal system to a modern market economy was — to say the least — awkward.

Monarchy has its own rules of succession. Under those rules, Charles Stuart (Bonnie Prince Charlie) ought to have become king of the United Kingdom. The problem was that the Stuarts were Catholics, and a Catholic, according to the Act of Settlement passed by Parliament in 1701 — the expedient to finally dump the Stuarts — could not rule over a Church of England realm, much less head that church. Adherents to the monarchy’s rules of succession did not accept Parliament’s power to overturn those rules, hence the Jacobite uprising. Scots, English, and Irish participated. The presumptive heir to the Jacobite crown today is Franz Bonaventura Adalbert Maria von Wittelsbach, who, if he were on the throne, would be known as King Francis II.

We took a rest day in Inverness, got a dose of fire-and-brimstone Scottish Calvinism, and attended a couple of ceilidhs — once both at the same time. A determined preacher in white shirt and tie stood on the Crown Road, Inverness’s high street, reading the Bible in thunderous and orotund sonority to the passersby while fiercely gesticulating with his free hand. We were entranced, particularly when a young fellow in a t-shirt and a newsboy cap took a stance across the street, pulled a bagpipe out of its case, laid out the case to collect donations, and struck up “MacPherson’s Lament.” He completely drowned out the homilist, who nonetheless persevered, impervious to all external distractions. As for the other ceilidhs, one particular impromptu session at a pub included two fiddles, a guitar, uilleann pipes, and a saxophone — the last two a particularly innovative and sonorous combination.

North of Inverness nearly all the trees disappeared, as did fences, buildings, and power poles; even the livestock thinned. It was a magical, surreal landscape with the odd abandoned stone sheep enclosure. At Tongue, the North Sea finally came into view. When the Orkneys appeared on the horizon, our hearts skipped a beat: we knew we were nearly done. Stroma, the nearest Orkney, presented a spectral appearance. It had been abandoned in 1962. A scattering of old stone cottages, unconnected by roads, eerily dotted its landscape. Soon John O’Groats, little more than an inn and tourist shops, materialized out of the grassy plain. We’d covered 1,291 miles — according to our bike odometers — in 29 days, with an additional eight rest days.

The piper completely drowned out the homilist, who nonetheless persevered, impervious to all external distractions.

After a shuttle to Inverness and an overnight ride on the Caledonian Sleeper we arrived at Euston Station, London. During the rides — both of them — we reflected on Britain’s virtues. It’s a country with no earthquakes, volcanoes, hurricanes, tornadoes, forest fires, or mudslides; an ideal climate with no extremes of heat or cold, aridity or rain; a varied and undulating topography of grasslands, moorland, woodland, glades, estuaries, highlands, and lowlands; hamlets, villages, towns, and cities with a minimum of sprawl; little crime and few slums or homeless; a cultured people with a generally sensible disposition (and oodles of bookstores); and a separation of head of state from head of government. Finally, it’s always been Great, and, best of all, it has unsurpassed beer and whisky. What more can you ask for? Lower taxes?








Vietnam Revisited


I never fought in Vietnam. By the time I was old enough to go, I held a high draft-lottery number and a student deferment, and was never called up. I do remember the war, though. Early on, when Kennedy sent in the advisers, I was in elementary school, and saw the pictures on TV and in Life magazine. When Johnson sent in half a million men, I was in junior high, and we argued about the war in class. When Nixon came to power I was in high school, and we debated it more. When the four protesters were killed at Kent State University, I was finishing my first year at the University of Washington in Seattle. My instructor in German cancelled classes and gave us all A’s so we could go protest. I stood aside, watching the protesters flood onto Interstate 5 and block traffic until the cops pushed them off the exit to what are now the offices of Amazon.

My sentiments on the Vietnam War, like those of most Americans, evolved. In 1975, when South Vietnam collapsed, its government appealing for help and the US Congress and President Ford offering none, I was as coldhearted as anyone. I thought, “To hell with Vietnam.” I had been reading about it, thinking about it, arguing about it since I was a kid. During that time 58,000 Americans, of whom nearly 18,000 were draftees, had been killed there, along with maybe a million Vietnamese, and for what? In economists’ terms, the mountain of corpses was a “sunk cost” — and I was ready to watch the whole damn thing sink into the South China Sea.

I was living in Berkeley, California, when the South fell. I remember standing outside my apartment on May 1, 1975, taking photographs of a parade down Telegraph Avenue welcoming the Communist victory. “All Indochina Must Go Communist,” one banner said. Well, I hadn’t evolved that much. For me the fall of South Vietnam was a day for quiet sadness.

By 1975, 58,000 Americans, of whom nearly 18,000 were draftees, had been killed there, along with maybe a million Vietnamese, and for what?

As a kid in junior high, I had supported the war. Recall the geopolitical situation: Communists had eaten up a third of the world, with big bites in Eastern Europe in 1945–48, China in 1949, North Vietnam in 1954, and Cuba in 1959. They had been stopped in a few places — in Malaya, by the Brits — but once firmly established they had never been pushed back. The Cold War’s rules of engagement were that the Communists could contest our ground — what we called the Free World — but we dared not contest theirs. And the end of that road did not look good.

When I used that argument — and “domino theory” is not a good name for it — no one knew the Communist system was facing extinction. People knew it was a poor system for satisfying private wants, but as a foundation for political power, it did seem to pass the Darwin test: it had survived and spread.

All the old arguments came back as I was reading Max Hastings’ new book, Vietnam: An Epic Tragedy, 1945–1975. Hastings, an Englishman, is my favorite military historian; for years I have had his 1987 book, The Korean War, on my shelf, and I breezed through his 752-page Vietnam in a few days. In this book Hastings has undertaken to write the narrative of the war, not only from the American side but also in the voices of South and North Vietnam. Hastings reveals that there were arguments and worries on their side as well as ours. Many in the North hated the draft and did not want to trek down the Ho Chi Minh Trail to fight. Over the years, 200,000 Northerners deserted while in the South. The Northern soldiers also underwent far more privations than the Americans or their Southern allies, living on rice and water spinach (sold in Asian markets here as on choy) and often starving. On one occasion, Hastings says, they killed and ate an orangutan.

People knew communism was a poor system for satisfying private wants, but as a foundation for political power, it did seem to pass the Darwin test: it had survived and spread.

Hastings analyzes the assumptions and the strategies of both sides. To the low-level Vietcong, the war was mostly about getting rid of Americans who looked and acted like the “long-nose” French, Vietnam’s late imperial overlords. The cadres tried to indoctrinate the VC in Marxism, but identity politics had the stronger pull.

Strength of belief and feeling makes a difference in war, and in Vietnam the advantage went to the other side. For a military historian, Hastings makes a key admission when he says that fighting was less important than “the social and cultural contest between Hanoi and Saigon.”

In that contest, the North’s standard-bearer was “Uncle Ho,” the Gandhi-like figure of Ho Chi Minh, who had kicked out the imperialist French. In the South, a society that included landowners, merchants, and bureaucrats who had worked for the French and prayed in the same church as the French, one of the icons was Vice President Nguyen Cao Ky. One observer said that Ky, an air force pilot with slick black hair and a pencil-thin moustache, looked like a saxophone player in a cheap Manila nightclub. Writes Hastings of Ky, “He was publicly affable, fluent, enthusiastic about all things American but the taste of Coca-Cola — and as remote as a Martian from the Vietnamese people.”

Strength of belief and feeling makes a difference in war, and in Vietnam the advantage went to the other side.

South Vietnam was a society rotten with corruption and ill-gotten wealth. “Again and again,” writes Hastings, “peasants were heard to say that whatever else was wrong with the communists, they were not getting rich.” History shows, though, that life is easier in a society in which some are wrongly rich than in one in which the rich are rounded up and shot, leaving everyone else poor. Hastings writes that when the North Vietnamese army rolled into Saigon, the soldiers were amazed at how much stuff the people had.

The Vietcong were terrorists. They beheaded the village chieftains who opposed them, and sometimes buried them alive. The Americans were told to behave better than that, but with their B-52s, high explosives, and napalm they dispensed death wholesale. American soldiers, Hastings writes, went to war “wearing sunglasses, helmets, and body armor to give them the appearance of robots empowered to kill.” Back at base, “Army enlisted men took it for granted that Vietnamese would clean their boots and police their huts.” They also used the bar girls for sexual entertainment.

Hundreds of thousands of South Vietnamese still fought and died for their state, and also worked with the Americans. First-generation Vietnamese in my home state are fiercely loyal to the old Republic of Vietnam, and still fly the yellow flag with the three stripes. Apparently they were not a majority of their countrymen, else the conflict would have come out differently.

With their B-52s, high explosives, and napalm the Americans dispensed death wholesale.

As the Pentagon Papers showed, smart people in the US government saw early on that South Vietnam was ultimately not a viable cause. President Kennedy expressed his doubts, but he also believed deeply that his mission was to stop the Communists. “Nothing that came later was inevitable,” Hastings writes, “but everything derived from the fact that sixteen thousand men were in country because John F. Kennedy had put them there.”

Hastings doesn’t buy the theory propagated in Oliver Stone’s movie JFK that Kennedy was on the verge of backtracking when he was shot.

Kennedy’s successor, Lyndon Johnson, sent half a million men to Vietnam because he didn’t want to be blamed for losing it, as Truman had been blamed for losing China. Johnson’s successor, Richard Nixon, saw that the war was lost, but he took four years to pull the troops out — an indecent interval in which thousands of Americans and Vietnamese died — because he didn’t want his name on an American defeat. For each of these US leaders, the concern was his country’s prestige (a Sixties word) and his own political standing. “An extraordinary fact about the decision making in Washington between 1961 and 1975,” Hastings observes, “was that Vietnamese were seldom, if ever, allowed to intrude upon it.”

Kennedy, Johnson, and Nixon were focused on the Chinese and the Russians, and assumed they were in charge in Hanoi as much as the Americans were in Saigon. Hastings says it was not so. The Russians and the Chinese were frustrated at the North Vietnamese aggressiveness, and repeatedly advised them to cool it. Within the North Vietnamese leadership, Ho often agreed with his foreign advisors, but Hastings says that policy was set not by Ho but by Communist Party General Secretary Le Duan, “though the world would not know this.”

Nixon saw that the war was lost, but he took four years to pull the troops out — an indecent interval in which thousands of Americans and Vietnamese died — because he didn’t want his name on an American defeat.

By Hastings’ account the Americans were not the only ones who made big mistakes on the battlefield. Militarily, the biggest Communist mistake was the Tet (Lunar New Year) offensive of 1968. Le Duan’s idea was to show the flag in all the Southern cities, spark an uprising among the people, and swamp the Southern government in one big wave. In the event, the South Vietnamese didn’t rise. In Saigon, the Vietcong breached the wall of the US embassy, and in Hue, North Vietnamese regulars occupied the town north of the Perfume River for several weeks and methodically executed all their enemies. But everywhere the Communists were driven back.

The Vietcong lost 50,000 dead in Tet and follow-on attacks, five times the combined US and South Vietnamese military deaths. Largely cleansed of Vietcong, the countryside was quieter in the following year, as the North Vietnamese Army built up forces to fill the void left by the defeated Southern guerrillas. Though Tet was a military defeat for the North, the US press played it as a Communist show of strength, thereby tipping the balance of opinion in America against the war. For the Communists, a military defeat became a political victory.

The journalists had played it the way it looked, and it hadn’t looked like a South Vietnamese victory. American journalists had learned to distrust their government’s statements about the war. They should have begun four years earlier by distrusting the Tonkin Gulf Incident, which was used in 1964 to justify the de facto US declaration of war. Of the two supposed attacks on the destroyer USS Maddox, Hastings writes, one wasn’t real and the other was “a brush at sea that could easily and should rightfully have been dismissed as trivial.”

For the Communists, the military defeat of the Tet Offensive became a political victory.

In the case of Tet, US journalists inadvertently helped the enemy, but generally the press gave Americans a more accurate picture of the war in South Vietnam than the government did. The press did a poor job of reporting the shortcomings of the North, but it wasn’t allowed to go there. In 1966, when I was arguing with my schoolmates for the war, I repeatedly heard them say that communism would be a bad system for us, but it was a better one for the Vietnamese. If Americans had had good reporting from North Vietnam, I don’t think my schoolmates would have said things like that. We anti-communists were right about one thing: communism turned out to be just as bad as we said it was.

The question remains as to what, if anything, America should have done to stop the Communists in Vietnam. Hastings quotes CIA officer Rufus Phillips describing what America did: “We decided that we were going to win the war and then give the country back to the Vietnamese. That was the coup de grace to Vietnamese nationalism.” But if it was wrong to do that in Vietnam, it should have been wrong in Korea, and it worked there, at least well enough to preserve the Republic of Korea. It can be no surprise that Kennedy and Johnson would try a military solution again.

What was the difference? Hastings touches on this question only briefly, mentioning the obvious: Korea is a peninsula with a border just 160 miles long, while South Vietnam had a border with Cambodia, Laos, and North Vietnam more than 1,000 miles long, perforated in many spots by the Ho Chi Minh Trail, the complex of corridors through which the Communists infiltrated the South with fighters and supplies. The warfare on the Korean peninsula was conventional, with front lines; in Vietnam it was a guerrilla contest while the Americans were there, becoming conventional only after they had decided to go. The physical climate was different, too. The Koreas were divided on the 38th parallel, about the latitude of San Francisco; the Vietnams were divided on the 17th parallel, about the latitude of Belize City. All of Vietnam is in the tropics, with attendant cloudbursts, humidity, bacteria, and bugs.

American journalists had learned to distrust their government’s statements about the war. They should have begun four years earlier by distrusting the Tonkin Gulf Incident.

And there were political differences. Ho Chi Minh was a hero of national independence; Kim Il Sung pretended to be one, but had a less inspiring story. Also, in Korea the old imperial masters were not long-nosed Caucasians but Japanese.

A penultimate thought. Hastings quotes without comment Lee Kuan Yew, the patriarch of capitalist Singapore, to the effect that if the Americans had not resisted the Communists in Vietnam, “We would have been gone.” Call this the “domino theory” if you like. It was a view I encountered in the early ’90s, when I worked in Hong Kong for Asiaweek magazine. Our founder and publisher, a Kiwi named Michael O’Neill, maintained that the American effort in Vietnam had stopped the Communists from pushing on to Thailand, Malaysia, Singapore, Indonesia, and the Philippines. Meanwhile, China had junked communist economics, and Vietnam, unless it wanted to remain poor, would have to do the same. And that, O’Neill argued, meant that the Americans had really won the Vietnam War, even if they didn’t know it.

Or maybe, I thought, we had lost the war but in the long run it didn’t matter — because the war wasn’t the decisive contest.

Twenty-one years after the war ended, I traveled to Vietnam with my wife and six-year-old son. In Danang I met a group of men who had fought for the South and suffered persecution from the victors. They weren’t bitter at the Americans, nor were the tour guys who drove us to Khe Sanh and were too young to remember. In the North, at Ha Long, I chatted up the proprietor of a tiny restaurant who said that during the war, when he had been a truck driver for a state-owned coalmine, he had lost his house to American bombing. I told him I was very sorry my countrymen destroyed his house.

He shrugged. “I have a better one now.”


Editor's Note: Review of "Vietnam: An Epic Tragedy, 1945–1975," by Max Hastings. Harper, 2018, 857 pages.





Still Amazing After All These Years


Nearly 50 years ago I was a high school student working my first summer job as a maid-waitress-cook at a rustic lodge in the Trinity Alps of northern California. The lodge was owned by one of my high school teachers and was still under construction. My sister had worked there for a week and gone home, saying it was too much for too little. I stuck it out for another three weeks, until one evening when a coworker badly mistreated me. I fled the lodge and walked three miles to the nearest phone to call my parents and ask them to come get me. I then trudged the three miles back to where I had been working so they would be able to find me (life before cellphones!). It was July 20, 1969. I looked up through the trees at the starry sky, totally unaware that Neil Armstrong and Buzz Aldrin were about to touch down on the moon. Because of my call, my father missed the moon landing on TV. He never let me forget it.

Watching First Man, Damien Chazelle’s new film with Ryan Gosling as Neil Armstrong making his way to that historic step onto the moon, I finally understood why my father was so upset. What a glorious, terrifying, awe-inspiring moment that was, and we sense it with more caution than elation as Armstrong hesitates on the final rung of the ladder before finally stepping down onto the dusty landscape. I don’t know whether Chazelle got it right, but he certainly got it impressively — the absolute quiet of space, the broad expanse and rugged terrain of the moon’s surface, the suspenseful risk of training, the view from the cockpit. And the musical score by Justin Hurwitz, who has worked with Chazelle on four award-winning films, works perfectly throughout the film.

So much could have gone wrong — and did, along the way.

For two hours the film builds to that moment when Armstrong steps onto the moon, demonstrating that it was more harrowing than glorious. So much could have gone wrong — and did, along the way. The film opens with Armstrong fighting to control his X-15 supersonic jet and land it without crashing. Training requires astronauts to practice precision tasks while spinning at such dizzying speeds that it’s a race against the inevitable moment when they will pass out. As the men are strapped into the Gemini module before taking off to practice docking in space, we expect to see the excitement and jubilation of astronauts finally realizing their little-boy dreams. Instead, their faces are subdued, focused, and even a bit apprehensive. And with good reason: despite all their earthbound preparations, there was no guarantee that they would return successfully. Indeed, numerous pilots had died during the testing phase. The space race was a grim undertaking, punctuated by moments of exhilaration, performed against a backdrop of angry protestors chanting against the enormous financial and personal cost.

Armstrong is calm, almost emotionless, as he contends with the rigors of space travel, the tragedy of a child’s death, and the stoicism of his wife Janet (Claire Foy). He can roughhouse with his young sons, but he can’t tell them he loves them. As he leaves for the moon, he hugs one son but shakes hands with the other. Such passionless focus is a strength, not a weakness, for someone in his position; Armstrong’s ability to think and react impassively in an emergency is a primary reason for his success, and Gosling portrays him masterfully.

But as an audience we want our heroes to be exciting and outgoing. My husband happened to meet Armstrong, Aldrin, and Collins a few months later during their victory tour around the world. He was in Colombia at the time, and called out Armstrong’s name in his strong American English. As Mark describes it, Armstrong turned with a broad, winning smile and shook his hand vigorously before rejoining the parade. The weighty burden of weightless space had lifted for a while, and he could enjoy the gravitas of what he and the others had accomplished.

Raising the American flag and leaving it on the moon as a reminder of who got there first was a huge deal. It did not “transcend countries and borders.”

The quietness of Armstrong’s character makes the film less compelling and may partly explain the poor turnout on opening weekend. More likely, though, the turnout suffered from the controversy that preceded opening weekend. You’ve probably heard the disgruntled rumblings about the flag on the moon being left out of the movie, so let me address the controversy right here: yes, the American flag does appear on the moon in the film. It’s distant, and it’s small, but it’s there. Rumors about the flag’s absence began shortly after the film’s premiere at the Venice Film Festival, when audiences waited expectantly for that iconic moment. It didn’t happen, and social media exploded with boycott-laden outrage. (I don’t know whether the film was re-edited after the festival, or if audiences were simply expecting more, but the flag is definitely in the scene now.) Gosling explained in an interview that the moon landing “transcended countries and borders [and]… was widely regarded in the end as a human achievement,” so that’s why they didn’t include the flag-raising moment.

This is pure 21st-century poppycock, of course. Competition with the Russians was the driving force behind the space program, and the reason JFK dedicated so many billions of dollars to it. The Russians were ahead of the Americans nearly every step of the way. Raising the American flag and leaving it on the moon as a reminder of who got there first was a huge deal. It did not “transcend countries and borders.” It was the reason we were there. Chazelle and company should not have minimized it into a kumbaya moment of one-world humanism. Moreover, in his zeal to turn American exceptionalism into ordinary human accomplishment, Chazelle missed a great opportunity to make his statement — with the flag. Armstrong’s biography recounts how the astronauts struggled to assemble the malfunctioning flagpole. That struggle could have been presented as a metaphor for 21st-century American politics and the difficulty of raising the flag today.

First Man is a good film with some great special effects and fine acting all around. Jason Clarke is especially good as Ed White, and Corey Stoll is feisty as Buzz Aldrin. Claire Foy, best known for her role as Queen Elizabeth II in the excellent Netflix series The Crown, is wonderful as the stoic yet passionate Janet Armstrong. And with the 50th anniversary of the moon landing just six months away, I’m happy that this film has been made.


Editor's Note: Review of "First Man," directed by Damien Chazelle. Universal Pictures, 2018, 141 minutes.





Racism


In 1979, undercover Colorado Springs police officer Ron Stallworth noticed a small ad in a local newspaper seeking members to begin a new chapter of the Ku Klux Klan. He called the number listed and pretended to be a white supremacist, hoping to infiltrate the organization in order to thwart the rising violence against black residents in general and the black student union at the college in particular. Soon the KKK leader suggested that they meet in person. The only hitch? Ron Stallworth was black.

Spike Lee’s BlacKkKlansman tells the tale, and it’s a gripping, suspenseful, often humorous, and often troubling one. As the film narrates the story, when KKK leader Walter Breachway (played in the movie by Ryan Eggold) eventually asks for a face-to-face meeting with Stallworth (John David Washington, Denzel’s son), Stallworth arranges for a white undercover narcotics cop, Flip Zimmerman (Adam Driver), to stand in for him. Yes, a black and a Jew both manage to infiltrate the hateful KKK by posing as the same white supremacist. Stallworth continues to talk with Walter by phone while Zimmerman continues to meet with Klan members in person, necessitating that their stories and even their voices match. Walter’s second in command, Felix (Jasper Paakkonen), grows suspicious, or perhaps jealous, and as his sadistic streak surfaces we worry for Zimmerman’s life.

Director Lee chooses caricature rather than character with some of his KKK subjects, but after watching decades of black caricature on film, I can forgive him this hamhandedness.

During the course of his investigation Stallworth contacts David Duke himself (Topher Grace), then the Grand Wizard of the Ku Klux Klan and future Louisiana State Representative. The boyish Grace, best known for the TV series That ’70s Show, plays Duke as perfectly oblivious to his own bigotry. Lee is a bit heavyhanded, however, in his determination to connect Duke’s rhetoric with Trump’s “Make America Great Again” rhetoric.

Adam Driver provides a nuanced performance as the lapsed, nonchalant Jew forced to confront his feelings about his heritage when he is threatened simply because of his genetic stew. Corey Hawkins is fiery as Kwame Ture (aka Stokely Carmichael), and Laura Harrier channels Angela Davis luminously with her big round glasses and bigger round afro as Patrice Dumas, president of the black student union. Harry Belafonte is a standout as Jerome Turner, carrying with him the weary weight of his own decades in the civil rights movement. Director Lee chooses caricature rather than character with some of his KKK subjects, particularly the slack-jawed near-imbecile Ivanhoe (Paul Walter Hauser) and Walter’s perky, overweight, frilly aproned wife Connie (Ashlie Atkinson). But after watching decades of black caricature on film, I can forgive him this hamhandedness.

While the plot of BlacKkKlansman covers just nine months in the 1970s, the story spans more than a century. It opens with a scene from Gone with the Wind, presents upsetting clips from Birth of a Nation, which celebrated the KKK, and ends with footage from the deadly riot in Charlottesville last year. And Harry Belafonte as Jerome Turner provides a soft-spoken, emotional, and tender account of the horrifying 1916 lynching and burning of Jesse Washington in Waco, Texas.

If there is one underlying truth about racism, it is this: government is the Grand Wizard of bigotry.

I’m always a little uncomfortable and defensive when I see films like this; it’s important to be aware of black history, and I’m glad these stories are being recorded on film. But it feels as though I’m intruding somehow, as though all whites are being accused of the same ignorant, bigoted mindset that we see on the screen. In reality, of course, white supremacists represent a tiny minority of the population, while white voters, white activists, white teachers, and white politicians have worked vigorously in the cause of civil rights.

If there is one underlying truth about racism, it is this: government is the Grand Wizard of bigotry. Government legalized slavery and enforced the Fugitive Slave Law. Government institutionalized segregation through neighborhood-based public schools and “separate but equal” policies, and governments outlawed miscegenation. Government imposed poll taxes and voting questionnaires. Government grants and welfare in the 1960s were well-intentioned, but they incentivized single motherhood, established barriers to work through public assistance programs that were difficult to relinquish for an entry-level job, and created a dragnet rather than a safety net that virtually destroyed the black family in urban neighborhoods.

Meanwhile, activists — black and white, male and female — exercising their rights to free speech and open dialogue were the catalyst for change and inclusion. Freedom of speech is the most important right we have. It’s the foundation for all other rights. Yet too many activists today are turning to government to establish hate laws that limit free speech. These films seldom acknowledge the friendship and genuine concern felt by so many white Americans, or the fact that discovery of truth is a process. Lee gives a welcome nod to this idea at the end of the film, but it takes a long time to get there. Still, BlacKkKlansman is well made and well worth seeing.


Editor's Note: Review of "BlacKkKlansman," directed by Spike Lee. Focus Features and Legendary World, 2018, 135 minutes.





The Great Panic


I have long been a fan of the Panic of 1893, which is the usual name for the great depression of the 1890s. When I say “great” I mean it is comparable by all available measures (business losses, unemployment, political turmoil) to the Great Depression of the 1930s — with two exceptions. First, the depression of the 1930s lasted for more than ten years, ending only with the start of the Second World War in Europe; the depression of the 1890s lasted less than half as long. Second, in the 1930s the federal government intervened massively to try to end the depression, whereas the government of the 1890s did as little as it could.

These two exceptions are closely related. In 1893 and after, President Grover Cleveland had the political and above all the intellectual courage to allow prices to sink until recovery could begin. He devoted his best efforts to stabilizing the dollar, so that sound money and real prices could beget confidence, and confidence could beget reinvestment. This happened. But in 1929 and after, Presidents Herbert Hoover and Franklin Roosevelt were guided by the economic ignorance and sheer quackery of their times (and ours); they intervened to keep prices up and bail out bad investments — using money, of course, extorted from the people who had made good investments. Roosevelt’s subsidies extended to the destructive political ideas of his time; he encouraged political action to fulfill the borderline-crazy terms of his first inaugural address, in which he announced:

The money changers have fled from their high seats in the temple of our civilization. We may now restore that temple to the ancient truths. The measure of the restoration lies in the extent to which we apply social values more noble than mere monetary profit.

The result was not only chronic political turmoil but a failure of reinvestment caused by a chronic absence of confidence in the nation’s economic and political prospects. Money, as R.W. Bradford used to say, wants to be invested, but it didn’t during the 1930s, when for a series of years there was actually “negative investment” in the economy.

In 1929 and after, Presidents Herbert Hoover and Franklin Roosevelt were guided by the economic ignorance and sheer quackery of their times (and ours).

So you see one reason why I am a fan of the depression of the 1890s — it provides clear and persuasive economic, political, and, if you will, spiritual lessons. But another reason is that the economic and political controversies of the 1890s are a lot of fun. Communism is dull stuff, no matter where it appears, and in the 1930s it manifested itself in remarkably dull, stupid, pompous, and oppressive forms. Compared with that, the nostrums of the 1890s are bright, delusive rays of sunshine. You just have to smile at Jacob Coxey’s plan to save the country by a complicated scheme for the federal government to print tons of paper money and use it to give free loans to local governments so they could create jobs in public building programs — a plan he implemented in the first of the great marches on Washington, the march of Coxey’s Army. The march culminated in Coxey’s arrest at the Capitol, for walking on the grass.

And who wouldn’t have fun trying to follow the logical permutations of the Free Silver idea, the notion that the American economy would be perfected if the federal government would simply produce unlimited quantities of silver dollars (and paper instruments representing them), valuing silver at one-sixteenth the price of gold, when the market price of gold was much more than 16 times that of silver? This was a recipe for outrageous inflation, yet in 1896 it captured the Democratic Party and could have led to the election of the Democratic candidate, William Jennings Bryan, he of the stirring Cross of Gold speech:

You shall not press down upon the brow of labor this crown of thorns. You shall not crucify mankind upon a cross of gold.

It’s a good speech, and some of the books and pamphlets written in favor of Free Silver are immensely clever complications of an argument that is clearly wrong but has a way of starting to look right if you don’t take a step backward and remind yourself of what it’s really about.

Compared with the remarkably dull, stupid, pompous, and oppressive forms of communism that manifested in the 1930s, the nostrums of the 1890s are bright, delusive rays of sunshine.

Now comes Bruce Ramsey, author of the book I am reviewing and — all cards on the table — senior editor of Liberty and a good friend of mine. Bruce is a tireless researcher of the events, theories, and movements of the 1890s. He knows their importance. He knows they reveal important truths about the ways in which economies function, and in which people function within them. And he knows they’re fun. The only problem is that the vast majority of Americans have simply forgotten about the depression of the 1890s. They forgot about it almost as soon as it was over. (I have an essay about this in Edward Younkins’ Capitalism and Commerce in Imaginative Literature [Lexington Books, 2015].) In the popular imagination, the decade of desperation was soon transformed into the Gay Nineties.

There aren’t a lot of good treatments of national politics and economics in the 1890s. Allan Nevins’ biography of Cleveland (1932) remains the best. And there are few decent treatments of the effects of the depression on individual men and women, in their local communities. That’s the vital part of the story that practically nobody knows. And that’s what Ramsey gives us in his brilliant new book about the state of Washington during the Panic.

In writing such a book, Ramsey faced one of the hardest challenges a writer of history can encounter. A straight-line narrative of national political and economic events would capture only part of the picture. So would an exclusive concern with one particular locality, such as Bruce’s home state, Washington. So would concentration on certain personalities, as in the cheap, tangential approach to history that one sees in the Ken Burns films. What Bruce needed to present was the full tapestry of local people and local events, rippling in the strong winds of national affairs; he needed to capture not only the big patterns but the individual figures in the tapestry, and he needed to show those ripples of history too. But he was equal to the challenge.

The vast majority of Americans have simply forgotten about the depression of the 1890s. They forgot about it almost as soon as it was over.

Bruce Ramsey is a quick but colorful narrator. He provides the pungent detail and the suggestive episode and then moves briskly onward to the next significant picture, whether it’s the portrait of an interesting man or woman, an array of statistics, a sketch of political developments nationwide, or a tale of something that’s too ridiculous to be true, but is. Did you know that in 1893 the Populist governor of Kansas tried to use the state militia to oust the Republicans (who happened to be in a majority) from the House of Representatives in Topeka? (If Dorothy wanted adventure, she could have stayed right in Kansas.) This absurd drama — one of many in Ramsey’s book — offers some perspective on the absurd politics of the present era. To say that Ramsey’s political narrative is entertaining is itself absurd; it’s an absurd understatement.

Here are thousands of stories, small in the number of words that Ramsey, a thrifty narrator, allots to each, but large in drama and implication. We see people who are found talking gibberish in darkened hotel rooms because their bank deposits of $256 had been lost to the panic. We see government officials who steal money, and lose it, and then escape to Argentina, or to a place off the coast of Washington called Tatoosh Island, thence to change identities and be discovered working as mowers in Idaho. We learn of a government official who is acquitted by a jury that doesn’t believe that bribery is against the law. We listen to a contractor for the Northern Pacific railway who says he “had put white men at work at $2 and gradually raised their wages to $2.50, although there was no time when [he] could not have employed Chinamen at 80 cents” (p. 51). We meet mayors who work in shingle mills because their cities can’t pay them a salary, and unionists who resort to riot and terror to keep their salaries from being cut.

The sheer number of stories that Ramsey tells is remarkable; still more remarkable is his unfailing ability to integrate them into larger contexts of meaning. Here’s one of the general patterns he sees. Businesses and banks that made it through this great depression often did so because they backed each other up. Seattle, where the spirit of cooperation was strong, suffered many fewer losses than such competing communities as Tacoma and Spokane. Seattle’s bankers went so far as to refuse deposits from people who had withdrawn them in panic from other banks. This was individual action, but it was mutually supportive. It was a kind of spontaneous order, and it often saved the day.

We see people who are found talking gibberish in darkened hotel rooms because their bank deposits of $256 had been lost to the panic.

Here’s another pattern. Led by President Cleveland, the federal government disclaimed responsibility for helping individuals — whether bankers or street sweepers — get out of their financial jam. Most public opinion seems to have backed him up. Newspapers in the Pacific Northwest counseled their readers to take responsibility for themselves — and above all not to hurt business by fleeing to some place with a marginally better economy. Their message was “stay here and keep pitching.” A Baptist potentate cautioned against giving money to the poor indiscriminately; this was “a selfish act, done to make the giver feel good” (p. 83). Some local governments acted in what they regarded as the spirit of community and provided employment on public works projects, and some of them went broke doing it. But charity ordinarily began at home. As Ramsey observes, very perceptively, “In a world with little free public food, people tend to be generous with their private food” (p. 93).

A darker side of community spirit was the almost universal feeling that if anyone was going to be without a job, it shouldn’t be someone white. Everywhere Asians were fired from jobs or prevented from getting any, and mobs formed to destroy Chinatowns throughout the region. It was only a temporary rescue when the wife of a local missionary faced down a mob that came for the Chinese people of La Grande, Oregon: “She appeared with a Winchester and announced that the first man to enter the house would be shot” (p. 79). Most of the Chinese left town anyway; and although 14 rioters were arrested, none was convicted. Oregon’s Progressive governor haughtily rejected President Cleveland’s request that he protect the rights of the Chinese.

A darker side of community spirit was the almost universal feeling that if anyone was going to be without a job, it shouldn’t be someone white.

Much of Ramsey’s book is devoted to racism and progressivism during the depression. It’s quite a story, and again, it’s a gift of perspective: then as now, the predominant individualism of America was too much of a burden for many Americans to bear.

Obviously, the implications of Ramsey’s stories go far beyond the Pacific Northwest. The stories of that region cannot be explained without reference to the bigger stories of the nation’s money policy, its “reform” and “progressive” movements, and its national elections. Ramsey devotes lively chapters to all these things. If you don’t know the 1890s, this is the book for you, wherever you live. If you do know the 1890s, you know a lot about America, and this book will help you learn even more.

The Panic of 1893 is beautifully illustrated, with fine contemporary pictures, and backed by years of patient research. It is a distinguished and compelling book.


Editor's Note: Review of "The Panic of 1893: The Untold Story of Washington State’s First Depression," by Bruce Ramsey. Caxton, 2018, 324 pages.





When Nobody Knew What a Dollar Would Be


The Caxton Press has just published my book, The Panic of 1893, and I can now write for Liberty about it. Its topic is the final economic downturn of the 19th century. For more than three years, my head was in the 1890s — in books, articles, personal and official papers, lawsuits, and, especially, old newspapers, chiefly from my home state. The book’s subtitle is The Untold Story of Washington State’s First Depression.

It is a popular history, not a libertarian book as such. But I have a few thoughts for a libertarian audience.

Many libertarians espouse the Austrian theory of the trade cycle, in which the central bank sets interest rates lower than the market rate, leading to a speculative boom, bad investments, and a collapse. In the 1890s the United States had no central bank. Interest rates before the Panic of 1893 were not low, at least not in Washington. The common rate on a business loan was 10%, in gold, during a period in which the general price level had been gently falling. Washington was a frontier state then, and it needed to pay high interest rates to attract capital from the East and from Europe. Credit standards, however, were low, sometimes appallingly low. Many of Washington’s banks had been founded by pioneers — optimistic patriarchs who lent freely to their neighbors, associates, relatives, and themselves. By a different road from the Austrians’ theory, the economy was led to the place it describes: a Hallowe’en house of bad investments.

The Sherman Silver Purchase Act was a sop to the inflationists, who wanted an increase in the money supply, and to the silver mining interests, who wanted the government to continue buying their silver.

The dollar was backed by gold, with the US Treasury intending to keep at least $100 million of gold on hand. But in 1890, at the peak of the boom period, Congress passed the Sherman Silver Purchase Act, obligating the Treasury to buy up the nation’s silver output with newly printed paper money. It was a sop to the inflationists, who wanted an increase in the money supply, and to the silver mining interests, who wanted the government to continue buying their silver, which it had been doing to create silver dollars. Politically the Sherman Silver Purchase Act was also part of a deal to pass the McKinley Tariff, which raised America’s already high tariff rates even higher.

The problem with the Sherman Silver Purchase Act was that the new paper money being paid to the silver miners could be redeemed in gold. The prospect of an increase every year in paper claims against the Treasury’s gold alarmed foreign investors, and they began to pull gold out. Two crises abroad also shifted the psychology of lenders and borrowers worldwide: Argentina defaulted on a gold loan from the Baring Brothers in 1890 and a real estate boom in Australia collapsed in 1893. These crises shifted the thoughts of financial men from putting money out to getting it back, from a preference for holding promises to a preference for cash.

By the time Grover Cleveland took office in March 1893, the Treasury’s gold cover had shrunk to $101 million. A run began on the Treasury’s gold — and that triggered the Panic of 1893.

In the Pacific Northwest, the four-year-old state of Washington (pop. 350,000 then) had 80 bank failures in the following four years.

Two crises abroad also shifted the psychology of lenders and borrowers worldwide: Argentina defaulted on a gold loan from the Baring Brothers in 1890 and a real estate boom in Australia collapsed in 1893.

Economists have listed the ensuing depression as the second-deepest in U.S. history. (One estimate: 18% unemployment.) But they don’t know. The government didn’t measure unemployment in the 1890s. And the rate of unemployment may not be the best comparison. America was less wealthy in the 1890s than in the 1930s, and living conditions were harsher. In absolute terms, the bottom of the depression of the 1890s was clearly lower than that of the 1930s.

The Left of the 1890s, the Populists and silverites, wanted cheap money. They blamed the depression on the gold standard. And gold is not an easy taskmaster; libertarians have to admit that.

The silverites wanted a silver standard. Most of them were “bimetallists,” claiming to favor a gold standard and a silver standard at the same time, with 16 ounces of silver equal to one ounce of gold. Their idea was that by using gold and silver the people would have more money to spend.

Free silver was a policy well beyond the Sherman Silver Purchase Act, which compelled the Treasury to buy silver at the market price. In the mid-1890s, silver fell as low as 68 cents an ounce. At that price, a silver dollar had 53 cents’ worth of silver in it and the silver-gold ratio was 30-to-1.
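A quick check of that arithmetic, assuming the standard silver dollar’s 0.7734 troy ounces of fine silver and the official gold price of $20.67 an ounce (both standard figures, though neither appears in the text above):

\[
0.7734\ \text{oz} \times \$0.68/\text{oz} \approx \$0.53,
\qquad
\frac{\$20.67/\text{oz (gold)}}{\$0.68/\text{oz (silver)}} \approx 30.4 \approx 30\text{-to-}1.
\]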

In absolute terms, the bottom of the depression of the 1890s was clearly lower than that of the 1930s.

The bimetallists wanted 16-to-1. Congress had set the original ratio at 15-to-1 in the 1790s, when that matched the market; the market later shifted, and in the 1830s Congress changed the ratio to roughly 16-to-1. Then came the Civil War, and the U.S. government suspended the gold standard, and printed up its first “greenbacks,” the United States Notes.

The United States Notes were effectively a new currency, and traded at a discount from metallic dollars. In September 1896, the Seattle Post-Intelligencer reminded readers of those times:

There never was a time from the beginning of the first issue of greenbacks down to the resumption of specie payments when the greenback dollar was ever accepted on the Pacific Coast for anything more than its market price in terms of gold.

The greenback was discounted, sometimes by 50 to 60%.

In 1873, Congress decided to define the dollar as a certain weight of gold, but not silver. The silver people in the 1890s called this “The Crime of ’73.”

Redemption of paper money under the gold standard began in 1879. To placate the silver interests, Congress had passed a law requiring the government to buy silver at the market price and coin it into dollars — the Morgan dollars prized by collectors today. At the beginning, the silver in a Morgan dollar was worth about a dollar, but by the 1890s, the value of silver had fallen.

In 1890, the silver-dollar law was replaced by the Sherman Silver Purchase Act, which created paper money. The government still coined silver dollars, and by 1896 had more than 400 million of them in circulation.

To placate the silver interests, Congress had passed a law requiring the government to buy silver at the market price and coin it into dollars.

The law did not require the Treasury to pay out gold for silver dollars, and it hadn’t. But the law declared all the different kinds of dollars (and there were five different kinds of paper money, at that point) to be equally good for everyday use except for taxes on imports. At the amounts an individual was ever likely to have, a silver dollar was as good as a gold dollar.

If you ask why a sane person would have designed a monetary system with gold dollars, silver dollars, Gold Certificates, Silver Certificates, National Currency, Treasury Notes, and United States Notes, the answer is that no one person designed it: Congress had built it, one variety at a time.

Under the proposal for “free silver,” gold would be kept at the official price of $20.67 and silver set at one-sixteenth that price, or $1.29. Just as the world was free to bring an ounce of gold to the Treasury and take away $20.67 — “free gold” — the world would be free to bring an ounce of silver to the Treasury and take away $1.29. Free silver! The advocates called this the “unlimited coinage” of silver, but the aim was to create dollars, not coins. Most of the silver could pile up in the Treasury and be represented by crisp new pieces of paper.

The gold people argued that for the United States to set up a 16-to-1 currency standard in a 30-to-1 world was nuts. Essentially, the Treasury would be offering to pay out one ounce of gold for 16 ounces of silver. It would be a grand blowout sale on gold, and the world would come and get it until the gold was gone. The Treasury would be left with a Fort Knox full of silver, and the U.S. dollar would become a silver currency like the Mexican peso.
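To make the “blowout sale” concrete, here is the arbitrage spelled out as a stylized round trip, using the round numbers above (a sketch of the logic, not figures from the text):

\[
\underbrace{16\ \text{oz silver} \times \$0.68}_{\text{market cost: } \$10.88}
\;\longrightarrow\;
\underbrace{16 \times \$1.29}_{\text{coined into} \approx \$20.67}
\;\longrightarrow\;
\underbrace{1\ \text{oz gold}}_{\text{redeemed, worth } \$20.67}
\]

Each round trip nets roughly $20.67 − $10.88 = $9.79, and the trade repeats until the Treasury’s gold is gone.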

Surely the gold people were right about that. (And today’s ratio is 78 to 1.)

Milton Friedman argues in his book Money Mischief that two standards, with the cheapest metal defining the dollar in current use, would have worked all right. If the cheap metal got too expensive, the system would flip and the dollar would be defined by the other metal. In theory it makes sense, and apparently before the Civil War it had worked that way. But the financial people didn’t want a system like that.

The Treasury would be left with a Fort Knox full of silver, and the U.S. dollar would become a silver currency like the Mexican peso.

In 1896, America had a watershed election, with the silver people for Bryan, the Democrat, and the gold people for McKinley, the Republican. A third party, the People’s Party, endorsed Bryan. Its followers, the Populists, didn’t want a silver standard. They were fiat-money people. But Bryan was against the gold standard, and that was enough.

In that contest, the silver people were derided as inflationists. They were, to a point. They wanted to inflate the dollar until the value of the silver in dollars, halves, quarters, and dimes covered the full value of the coin. The silver people were not for fiat money.

Here is the Spokane Spokesman-Review of October 1, 1894, distinguishing its silver-Republicanism from Populism:

Fiat money is the cornerstone of the Populist faith . . . Silver money is hard money, and the fiatist is essentially opposed to hard money . . . He wants irredeemable paper money, and his heart goes out to the printing press rather than the mint.

The Populists and silverites argued in 1896 that the gold standard had caused the depression, and that as long as gold ruled, the nation would never recover. History proved them wrong. They lost, and the nation recovered, beginning almost as soon as the election settled the monetary question. Investors and lenders knew what kind of money they’d be paid with.

Milton Friedman makes a monetarist point in Money Mischief that starting in about 1890, gold miners had begun to use the cyanide process, which allowed gold to be profitably extracted from lower-grade ore. The result was an increase in gold production all through the decade. I came across a different story in my research. The increase in the supply of gold (about which Friedman was correct) was outstripped by the increase in the demand for gold. Prices in gold dollars declined sharply during the depression of the 1890s, including the prices of labor and materials used in gold mining. It became more profitable to dig for gold. Deflation helped spur a gold-mining boom — in the Yukon, famously, but also in British Columbia, in Colorado, and in South Africa.

The US began a recovery after the election settled the monetary question. Investors and lenders knew what kind of money they’d be paid with.

Under a gold standard, a deflation sets in motion the forces that can reverse it. This is a useful feature, but it can take a long time.

The recovery from the depression of the 1890s began not with a burst of new money but with a quickening of the existing money. What changed after the election was the psychology of the people. They knew what sort of money they held and could expect. The important point wasn’t that it was gold, but that it was certain. If Bryan had been elected and the dollar became a silver currency, people would have adjusted. With gold, they didn’t have to adjust, because it was what they already had.

The writers of the 1890s had a less mechanistic view of the economy than people have today. People then didn’t even use the term “the economy.” They might say “business” or even “times,” as if they were talking of weather conditions. They talked less of mechanisms (except the silver thing) and more of the thoughts and feelings of the people. People today are cynical about politicians who try to manipulate their thoughts and feelings, and think that it’s the mechanisms that matter. And sometimes mechanisms matter, but the thoughts and feelings always matter.

Prices in gold dollars declined sharply during the depression of the 1890s, including the prices of labor and materials used in gold mining. It became more profitable to dig for gold.

Now some observations about the ideas of the 1890s.

The Populists, called by the conservative papers “Pops,” were much like the Occupy Wall Street rabblerousers of a decade ago: anti-corporate, anti-banker, anti-bondholder, anti-Wall Street, and anti-bourgeois, but more in a peasant, almost medieval way than a New Left, university student way. Many of the Pops were farmers, with full beards at a time when urban men were shaving theirs off or sporting a mustache only. More than anti-Wall Street, the Pops were anti-debt, always looking for reasons for borrowers not to pay what they owed. On Wikipedia, Populism is labeled left-wing, which it mainly was. It was also rural, Southern, Western, anti-immigrant, and often racist. In Washington state it was anti-Chinese.

In the 1890s traditional American libertarianism was in the mainstream. In the newspapers this is very striking, with the Republican papers championing self-reliance and the Democratic papers championing limited government. Democrats, for example, argued against the McKinley Tariff — which imposed an average rate of more than 50% — as an impingement on individual freedom. Here is Seattle’s gold-Democrat daily, the Telegraph, of September 10, 1893:

If it be abstractly right that the government shall say that a man shall buy his shoes in the United States, why is it not equally right for it to say that he shall buy them in Seattle? . . . Where shall we draw the line when we start out from the position that it is the legitimate and natural function of government to regulate the affairs of individuals . . .

Our idea is that the least government we can get along with and yet enjoy the advantages of organized society, the better.

Here is the silver-Republican Tacoma Ledger of December 3, 1895:

Thoughtful men must perceive that our whole system of civilization is undergoing a revolution in its ideas; and we are in danger of gradually supplanting the old, distinctive idea of the Anglo-Saxon civilization — the ideas of the individualism of the man, his house as his castle, and the family as his little state, which he represents in the confederation of families in the state — by the Jacobinical ideas of . . . continental republicanism . . . The continental republican theory contemplates the individual man as an atom of the great machine called the nation. The Anglo-Saxon considers every man a complete machine, with a young steam engine inside to run it. The continental republican must have a government that will find him work and give him bread. The Anglo-Saxon wants a government only to keep loafers off while every man finds his own work and earns his own bread.

Contrast that with today’s editorial pages.

The Populists were anti-debt, always looking for reasons for borrowers not to pay what they owed.

Here’s a final one I particularly liked. Archduke Franz Ferdinand of Austria-Hungary — the same gent whose assassination 21 years later would touch off World War I — came through Spokane on the train in 1893. Americans, fascinated with him just as they would be a century later with Princess Diana, stood in the rain for hours to get a glimpse of the famous archduke — and they were sore because he never showed himself. On October 9, 1893, here is what the Seattle Telegraph had to say about that:

Why in the name of common sense should the people of this country go out of their way to honor a man simply because he happens to be in the line of succession to a throne . . . The correct thing is to let their highnesses and their lordships and all the rest of them come and go like other people. To the titled aristocracy of Europe there is no social distinction in America.

The America of the 1890s had some unlovely aspects. But in my view, the Telegraph’s attitude toward princes is exactly right. I recalled the Telegraph’s patriotic comment during all the blather over the wedding of Princess Diana’s son.

The 1890s had its blather, but after 125 years, sorting out facts from nonsense is easier. Silly statements, especially wrong predictions, don’t weather well. It makes me wonder what of today’s rhetoric will seem utterly preposterous in the 2100s.






Hip Replacement: Lesson One


“In a soldier’s stance, I aimed my hand
at the mongrel dogs who teach . . .”
— Bob Dylan, My Back Pages (1964)

English, like every other living language, constantly evolves. Every utterance holds the promise of change: a new take, a fresh twist, an old word with a new meaning, or a neatly turned phrase that nudges the language, and the people who speak it, in a new direction. This is Donald Trump in the ’80s: “You have to think anyway, so why not think big?”

New words are created all the time. The verb “mansplain,” coined less than a decade ago, describes a practice at least twice as old: when a man explains something, a new word, say, to a woman in a condescending manner. And, of course, words just disappear. I haven’t heard “tergiversation” since William Safire died. Some words are like mayflies, here and gone. A word used only once is called an “onanym,” which, appropriately, is one.

As changes accumulate, the distance between the old and new versions of the language grows and the older version gradually becomes dated, then archaic, and, eventually, incomprehensible. Read Beowulf. (Or consider that less than 60 years ago we elected a president named “Dick.”)

And, of course, words just disappear. I haven’t heard “tergiversation” since William Safire died.

The sound of English changes, too. Its phonological components, such as tone, pitch, timbre, and even melody, change. If you learned American English, the North American dialect of Modern English, scores of years ago, as I did, you have heard many such changes and, while you can probably understand the current version, provided the slang isn’t too dense, you probably cannot reproduce its sound.

This, then, is a music lesson of sorts, designed to help you, my fellow older American, replicate the sounds of what we will call Post-Modern English, or PME, the successor to American English. Not the slang, mind you, but the sound of it, the music. If you are wondering why you should bother, reflect on this: you wouldn’t parade around in public wearing the same clothes that you wore as a child, would you? Of course not, because fashion changes and we should change with it, provided that we do it in an unaffected way. Choosing to update the sound of your English is as sensible as hanging up your coonskin cap. One must make an effort to ensure that one’s outfit looks snatched, after all.

The lesson includes a short passage from a radio broadcast I heard that contains many of the phonological changes that American English has undergone during the past several decades. While I managed to jot it down, I couldn’t get a copy of the audio. No matter. You can tune into any pop radio station and listen to the banter of the DJs. They are first-rate role models for Post-Modern English. (Dakota Fanning is not. I heard her interviewed on NPR, and to my ear she sounded like nothing so much as the valedictorian at a finishing school graduation, circa 1940. To be fair, NPR features many guests, and hosts, for that matter, whose mastery of PME is just totally awesome.)

Choosing to update the sound of your English is as sensible as hanging up your coonskin cap.

Ready? There are five parts. The first reveals the essence of Post-Modern English, so that you will know how to comport yourself when speaking it. The second will help you adjust your vocal cords to the proper register. The third comprises ten exercises drawn from the transcript of that radio broadcast. The fourth alerts you to a few problems you may encounter once you have mastered PME, and suggests practical solutions. The fifth and final part will put American English where it belongs: in the rear-view mirror. Just as Professor Higgins taught Miss Eliza Doolittle to drop the cockney and pass as posh, so I will teach you to drop your stodgy American English and sound cool. By the end of this linguistic makeover you will sound like a real hep cat. (The spellchecker just underlined “hep.” Bastards.)

* * *

Part One: The Essence

As French is the language of love and German is the language of Nietzsche, Post-Modern English is the language of wimps.

(Just now, you may have jumped to the conclusion that the word “wimps” was deployed in the previous sentence as a pejorative. It was not. It was chosen because it is the word that best embodies the defining characteristics of Post-Modern English. If you’ll bear with me, I think you’ll come to agree.)

When a French woman explains a line from Also Sprach Zarathustra, she sounds as if she were flirting. When a German man puts the moves on a fräulein in a dimly lit hotel bar, he sounds as if he were explaining how to change the oil in a diesel engine. Let us stipulate that the French woman is not a flirt and the German man is not a mechanic. It doesn’t matter; their languages make them sound that way. And when a fluent speaker of Post-Modern English asks you to move your car because he’s been boxed in, he sounds like a puppy that has just peed on your white carpet. He may not be a wimp, but he sure does sound like one. It is simply the case that each of these languages, at least when heard by Americans of a certain age, creates a vivid impression of the speaker. It is no more complicated than that. So why does the American guy sound like such a wimp?

Post-Modern English is the language of wimps.

At the core of Post-Modern English are two directives that determine not just the attitude but also the moral stance that its speakers assume as they turn to face the oncoming challenges of the modern world. These two directives, sometimes called the Twin Primes, preempt both the laws enacted by governments and the commandments handed down by ancient religions. (Practically, this means that in the event of a conflict between any of those laws or commandments and either of these two directives, it is the latter that will be adhered to, not the laws of God and man, all other things being equal.) You may have heard one or both of the Twin Primes invoked when a speaker of Post-Modern English suspects a violation has occurred in the vicinity.

The First Directive is “Don’t judge.” The Second is “Don’t be a dick.”

How, you may be asking yourself, could two such sensible and straightforward prohibitions make millions of people sound wimpy? Enforced separately, they probably couldn’t, but enforced together, they lay a paradoxical trap that can make even the straightest spine go all wobbly.

When a fluent speaker of Post-Modern English asks you to move your car because he’s been boxed in, he sounds like a puppy that has just peed on your white carpet.

Step by step, now. To judge others is considered rude in Post-Modern English, especially if the judgment is thought to be harsh. A person who judges others in this way and then communicates that judgment to those being judged is often referred to as a dick. If, for example, you saw someone who was judging others and, in a completely sincere attempt to be helpful, you said to that person, “Don’t be a dick,” you would have, in effect, not only made a judgment about that person’s behavior, but also communicated it to that person in a harsh way. By definition, then, by telling this person that he has behaved badly, you yourself would have strayed off the reservation to which PME speakers have agreed to confine themselves, and would have become the very thing that you have judged so harshly: a dick.

Now, Post-Modern English speakers are not stupid. They are aware of this trap and, not wishing to be hoist with their own petards, do what any reasonable person would do. Not only do they rarely call other people “dicks,” but they fall all over themselves to avoid any communication that can be interpreted as passing judgment on others. Simple statements about mundane matters are qualified and watered down so that the likelihood of giving offense is minimized. Explanations are inflected to sound like questions, apologies, or cries for help. Commonplace opinions are framed semi-ironically, often attached to the word “like,” so that they can be retracted at the first sign of disagreement. This feature of the language is called “ironic deniability.” It also allows one to blame irony when the real culprit is stupidity.

As a result, fluent PME speakers, when compared with speakers of earlier forms of American English, sound more uncertain, unassertive, and nonjudgmental. To put it bluntly, they sound more sheepish. Not because they are, you understand, any more than the French woman was flirtatious. It is just that the rules of the language have prodded them, bleating, into the chute that leads inescapably to the waiting tub of dip. In short, to avoid being dicks, they end up being wimps.

By telling this person that he has behaved badly, you yourself would have become the very thing that you have judged so harshly: a dick.

Wake up, old son, and smell the nitro coffee. In this brave new world, wimpiness is cool.

And that, my crusty-but-benign student, is all you need to know. You don’t need a dissertation on the cultural and historical forces that forged this pained linguistic posture; all you need is to imitate its cringe as you complete the lesson ahead and go on to achieve fluency in Post-Modern English. Here’s an aspirational commandment: “Thou shalt be cool.” You can do this. It’s goals.

Part Two: The Vocal Register

Please say, “So, that’s pretty much it, right?” in your normal 20th-century voice. OK? Now say it again, but make the pitch of your voice as low as you can.

How’d it go? When you lowered the pitch, did you hear a sizzle, a popping sound, like bacon frying? No? Try again, as low as it will go. Once you’ve achieved this effect, I’ll give you the backstory.

Ready? That sizzle is the sound of liquid burbling around your slackened vocal cords. As you may have noticed, this register, often called vocal fry, has been growing in popularity during the past few decades.

Fluent PME speakers, when compared with speakers of earlier forms of American English, sound more uncertain, unassertive, and nonjudgmental. To put it bluntly, they sound more sheepish.

In the 1987 movie Made in Heaven, Debra Winger played the archangel Emmett, God’s right-hand man, who was supposed to be a chain-smoker. As Ms. Winger was not, she had to simulate a smoker’s voice for the part, serendipitously producing a pitch-perfect proto-vocal fry. While this early mutation event does not appear to have lodged in the inheritable DNA of the language, it is fascinating in the same way that the Lost Colony of Roanoke is.

Vocal fry’s current run on the linguistic hit parade is more likely to have begun when Britney Spears croaked “Baby One More Time” in 1998, although it is occasionally said that the real patient zero was someone named Kardashian. Whatever.

Women tend to use vocal fry more than men. A wag on TV once said that women are trying to usurp the authority of the patriarchy by imitating the vocal register of the male. This would be in stark contrast to the southern belle or transvestite, both of whom artificially raise the pitch of their voices, sometimes into falsetto, to enhance the appearance of femininity.

Isn’t the theory that these bubbling vocal cords were repeatedly sautéed and baked less likely than the much simpler explanation of demonic possession?

Another theory holds that the phenomenon is simply the result of too much booze and marijuana. For this “Animal House Hypothesis” to be taken seriously, however, it must account for the fact that vocal fry did not present in the ’50s or ’60s (except briefly in Clarence “Frogman” Henry’s 1956 recording of “Ain’t Got No Home”). Considering that the sound more nearly resembles an audition for the next installment of the Exorcist franchise, isn’t the theory that these bubbling vocal cords were repeatedly sautéed and baked less likely than the much simpler explanation of demonic possession? The smoker’s rasp sounds much drier, anyway.

There has been an effort to dismiss the bubbling as a mere affectation. But ask yourself: what are the odds that a vocalization nearly indistinguishable from Mongolian throat singing will be adopted by millions of young people, simply to strike a pose? I’m just not buying it. The simplest explanation may be best: it was an innocently adopted, thoroughly harmless preteen fad that unexpectedly took root in adolescence and grew into a well-established, widespread adult habit, like picking one’s nose.

Don’t sizzle when you uptalk. You’ll frighten the children.

We may not know where it came from, and we may not know why it came, but we do know that vocal fry, while not quite the sine qua non of Post-Modern English, sends the loud and clear message, to anyone who happens to be within earshot, that standing here is a proud master of the 21st-century version of American English, gargling spit while speaking. (I seem to recall once seeing something similar being done by a ventriloquist.)

Learn the sounds in the lesson below; sing them with the sizzle above, while acting like a sick toy poodle at the vet’s, and your quest will be over. The Holy Grail of this Elders’ Crusade will be yours: PME fluency. (Oh, and remember: don’t sizzle when you uptalk. You’ll frighten the children.)

Part Three: The Exercises

So, in the 2016 election, Clinton was really sure she would sort of capture the states in the rust belt, but she didn’t. I mean, the turnout there was pretty much deplorable, right?

1. Discourse markers, sometimes called fillers, such as those used above (so, really, sort of, I mean, pretty much, and right), while not integral to either the meaning or the music of Post-Modern English, enhance its aesthetics, signal that the speaker is part of the linguistic in-crowd, and help the speaker sound as if his grip on what he’s saying is less than firm. It gives him wiggle room and makes him seem all squirmy: the Daily Double. Placing fillers in a phrase to best effect calls for a keen ear, rigorous practice, and a constant monitoring of how it is being done by the cool kids.

Beginning immediately, insert at least one filler into each sentence you speak. Yes, it requires self-discipline, but don’t worry; in time, it will become habitual and you will be able to dispense with self-discipline entirely.

There are fillers galore. To gain perspective, note that like, actually, and dude, while still heard, have grown slightly stale.

Yes, it requires self-discipline, but don’t worry; in time, it will become habitual and you will be able to dispense with self-discipline entirely.

About ten years ago, like was like ubiquitous. Like it was in like every sentence like three or four times. I mean, it had like metastasized. Then, over the next few years, its rate of use fell by 73%, as though it had gone into remission. Often, when a word or fad becomes a pandemic, it burns itself out. There was a sign on a Mississippi country store: “Live Bait – Nitecrawlers – Cappuccino.” It could be that the overuse of like was deemed uncool by some shadowy teen language tribunal and labeled a bad habit, like smoking tobacco. But as with that addiction, many found it impossible to go cold turkey. You’ve probably heard of Nicorette, a gum used by smokers trying to ease withdrawal. Well, the discourse markers sort of, kind of, you know, I mean, and pretty much have been the linguistic Nicorette to millions of like addicts trying to kick the habit. Some former addicts have resorted to saying kinda-sorta. They are sincere in their belief that this constitutes an evolutionary step forward.

Actually, which often sounds a trifle pompous, has largely been replaced by so in the initial position and right in the final position, as demonstrated in the lesson. It can still be used, but sparingly. Once per minute ought to do it, actually; twice, at most.

In place of dude, try bro, or brah, or bruh, or perhaps you could consider using nothing at all.

In summary, “Actually, I like know what I’m talking about, dude,” compares unfavorably to, “So, that’s pretty much, you know, how it sort of is, brah — I mean, right?” While both sets of words still appear in the lexicon of Post-Modern English, the latter reflects the more gracile stage of linguistic evolution that has been achieved, and is, therefore, preferred. It sounds more woke, too, doesn’t it, or is that just me?

They are sincere in their belief that this constitutes an evolutionary step forward.

2. The first two syllables in the word “election” should be mid-range in pitch, and clearly and crisply enunciated, while the final syllable should be lower pitched and slightly drawn out: “shuuun.” (In other applications, the terminal syllable must be uptalked. This will be covered in Lesson Two.) The increase in duration for the final “shun” is mandatory for all words ending in “-tion.” God knows why. But try it again, with a little sizzle: “elek- shuuun.” Nice.

3. “Clinton” should be pronounced “Cli/en” with a glottal stop entirely replacing the medial “nt” consonant blend. Glottal stops are a thing right now. “Mountain” is “mow/en,” and “important” is “impor/ent,” not to be confused with the mid-Atlantic pronunciation “impordent.” (Note that in the go-to example for glottal stops in American English, “mitten” becoming “mi/en,” it is only the “t” sound that is replaced, as it is in “impor/ent.” Replacing the “nt” seems to be the more recent, bolder approach, and is thus more worthy of imitation.) Practice these glottal stops in front of a mirror. To avoid embarrassment, it’s better to practice when you’re alone than to try them out in public before they’ve been thoroughly polished.

4. The word “sure” should not be pronounced like “shirt” without the “t” but rather as “shore,” rhyming with “snore,” with the long “o” and a strongly vocalized “r.” This pronunciation probably hails from Brooklyn, where it had been successfully detained for decades. Similarly, don’t say “toorist,” say, “toarist.” (By George, you’ve got it.) Again, practice. This is hot stuff. Cutting edge. Hundo P.

To avoid embarrassment, it’s better to practice when you’re alone than to try things out in public before they’ve been thoroughly polished.

5. In the word “capture,” the first syllable, “cap,” should be mid-range in pitch and clipped at the end, with a fraction of a second pause before dropping down to the second syllable, “chur,” which must be at a low pitch and slightly drawn out, so that it sounds like the endearing growl of a small dog.

This rule, first promulgated by anonymous Valley Girls back in the eighties, applies to all multi-syllabic words that end in “-ture” and most words of more than one syllable that end in “r.” The amount of fry used in this application has varied over time, and the appropriate level has been the subject of a lively but inconclusive debate. I take the view that it is a matter of personal taste. Experiment with the sizzle; go ahead. Practice with this list: rapture, juncture, fracture, puncture, rupture. Remember: Start high, go low, go long. Grrrr.

6. In “the rust belt,” “the” should be at mid-register pitch, while both “rust” and “belt” should be about five full notes higher. Yes, this is the famous sound of uptalk. The higher pitch of “rust” and “belt” suggests that a question is being asked. The goal is to create the impression that you are checking to see if the listener knows, as you are pretending to know, exactly what the rust belt is. What is desired is the illusion of a simultaneous, unspoken, empathetic interaction of mutual insecurity, something like, “Are you still with me? Am I doing OK?”, evoking at most an instant, tiny nod from the listener and a silent “Yes, I’m still with you, and you’re doing just fine, I think.” Try not to sound too needy. Aim for a subtle patina of clingy insecurity. It’s more credible. No need to ham it up.

Again, it is the legendary Valley Girls who are credited with this classic innovation. Australia recently filed a suit with the International Court of Justice disputing this claim. As if!

Aim for a subtle patina of clingy insecurity. It’s more credible.

Uptalk, like vocal fry, is used by women more than men, and is frowned upon by some, especially when it is “overused” and “exaggerated.” What crap. When it’s used once or twice per sentence, and the high-pitched words don’t pierce the falsetto barrier too often, uptalk reliably contributes to an authentic-sounding PME fluency. While I’ll grant that it may be something of an acquired taste, with practice and patience you’ll come to find its chirping high notes as precious as I do. Uptalk is cool and is likely to remain so. (I suspect that some men avoid uptalk because it makes their mansplaining hilarious.)

7. Then, after “rust belt,” comes a pause, as though the speaker were waiting for some confirmation of comprehension. This is a faux pause. The pause should not be so long that it gives the listener sufficient time to formulate and initiate an inquiry — in this instance, into the actual membership roster of states or cities in the rust belt. The duration of the pause will vary according to the speaker’s assessment of the listener’s level of expertise. Here, the assessment would involve the fields of (a) voter behavior in 2016 and (b) the deindustrialization of the non-Canadian area around the Great Lakes during the past half-century. To use the faux pause correctly, then, refer to this rule of thumb: Low expertise? Short pause. High expertise? Shorter pause. As always, the primary concern should be style, not substance.

8. The words “but she” should be two full steps lower than “belt” (from the fifth to the third), but “didn’t” should be right back at the same pitch as “belt.” That’s right, another dose of uptalk.

To master the technique, the novice should start by uptalking at least 50 times a day. When I was starting out, I kept a pencil stub and a little note pad in my shirt pocket to tally up my uses of uptalk during the course of the day with neatly crosshatched bundles of five. You might want to give it a try, as it keeps your shoulder to the wheel. I am proud to say that I uptalk effortlessly all the time now, and the surprise and sheer delight on the faces of young people when they hear an older gentleman “talking up” makes all the hours of practice worthwhile. I feel like I’m really making a difference.

While I’ll grant that it may be something of an acquired taste, with practice and patience you’ll come to find its chirping high notes as precious as I do.

A word of caution. When uptalk is employed at a very high frequency, volume, and pitch, and the whole sampler of fillers is tossed in, a critical mass can be achieved that has been known to set off a chain reaction. First your dog, then the neighbors’, then their neighbors’ — before you know it, the whole neighborhood is filled with the sound of a howling canine chorus. Once, when I overdid it, the damned coyotes even joined in. So mix fillers into your uptalk carefully. I’m just saying.

9. The word “didn’t” should be pronounced as a crisp, two-syllable “dident.” The short “e” sound should be clearly heard as in “Polident.” (Think “prissy.”) This same rule applies to “doesn’t,” which becomes “duhzent,” emphasis again on the short “e.” While “couldn’t” and “shouldn’t” also sometimes become “couldent” and “shouldent,” as one might expect, just as frequently they come out as “coont” and “shoont,” utilizing the short “oo” of “schnook.” (Thinking back, the guys I heard using this pronunciation may have been lit.) Either of these modern variants is acceptable, but eschew the fuddy-duddy standard pronunciations of the original contractions, “could/nt” and “should/nt,” which, oddly, feature glottal stops. (Yesterday, I heard “coo/ent.” Very chill.) Oh, and don’t say “did/nt.” (With all due respect, you’d sound like a cave man.)

10. The final word, “right,” should be pronounced in a way that places it at an equal distance from (a) assuring the listener that what you just said was not only correct, but cool, and (b) seeking assurance from the listener that what you just said was not only correct, but cool. In order to achieve this effect, the coloration of “right” must be subtly blended so as to become a creature of the twilight world between the declarative and the interrogative: not falling, not rising, not whining, and never, ever abrupt. With the proper inflection, “right” will hit this sweet spot, where the listener will wonder, “Wait. What? Is he asking me or telling me?”

Practice these ten exercises. Practice hard, then get out there and commence pussyfooting.

Part Four: Problems and Solutions

As you gain fluency in Post-Modern English, what you seem to lose in self-confidence, you will more than make up for with an increased capacity to appear empathetic. Your use of PME will lower the walls and build new bridges between you and the people around you. Your sacrifice of the ability to assert your will and pass judgment on others will help create a more open, tolerant, and nonjudgmental human community. You will contribute to a world in which nobody will feel the need to say “Don’t judge me,” or “Don’t be a dick,” because there will be no one judging them and no one will be acting like a dick. That’s right: no more judges and no more dicks. It will be a world of greater respect, warmth, and, yes, love.

The bad news is that you’ll have to keep an eye out for three problems that may rear their ugly little heads.

What you seem to lose in self-confidence, you will more than make up for with an increased capacity to appear empathetic.

First, there is backsliding. Although you now sound hip, as you approach your dotage you may find among your life’s baggage a few truths that you feel should be self-evident to everyone. You may even feel a need to share these truths with the people who, sad to say, have not had the pleasure of reading the self-published revisions to your personal Boy Scout Handbook. (You may also feel a constant pain in your lower back. These symptoms often occur together.) Pretending to be wimpy may have grown so taxing that, as therapy, you decide to briefly drop the Post-Modern English charade and revert to your former pre-PME self. But how do you safely remount your high horse?

To avoid unjust accusations of hypocrisy, it is best to choose the venue and target of these code-switching episodes carefully. I’ve heard that a marvelous place to engage in them is on urban motorways. I am told that it is easy to find drivers who are unaware of your exhaustive personal list of driving dos and don’ts. What next?

You may find yourself behind the wheel of a large automobile. Some knucklehead in a little Kia cuts in front of you without even signaling, missing you by mere yards. Gunning it, you pull up next to him. You lower your window. He lowers his. Then you let him have it with both barrels — figuratively, of course. You tell him, in stark Anglo-Saxon terms, in as loud and clear a voice as you can muster, the obscene fate that awaits him and his mother before their imminent and humiliating deaths. After that, spleen thoroughly vented, you brake and swerve onto the exit ramp, switch back to PME, and reassume your Oprah-like pose of nonjudgmental equanimity.

Here are a few tips. Before you switch codes, make absolutely sure that the knucklehead in your crosshairs doesn’t know who you are. Anonymity is crucial. And avoid the rush hour, when traffic sometimes grinds to a halt. Offended knuckleheads have been known to leap from their cars, screaming obscenities and brandishing revolvers. They are, after all, knuckleheads. (Good thing it’s illegal to use a wireless telephone while driving. No one will be able to post your outburst on the internet.)

The best way to keep from backsliding is, obviously, to get a grip on yourself.

Before you switch codes, make absolutely sure that the knucklehead in your crosshairs doesn’t know who you are. Anonymity is crucial.

Second, should you choose to “just say no” to the temptation to backslide, beware of unsuccessful repression. If, in order to achieve PME fluency, you have to repress the wish to lord it over everybody, and the repression fails to keep that wish contained, you may catch it sneaking out of that darkened back room of your consciousness, where you’ve been keeping it out of public view, and exposing itself in what is popularly known as a “Freudian slip.”

Attending a lovely garden party, you might intend to say, “Oh, you’re so kind. Thank you so much,” only to find yourself saying, “Why don’t you just go fuck yourself.” Remember, you could have said this to the knucklehead who cut you off, but you didn’t want to be seen as a hypocrite.

What then? The best way to avoid Freudian slips is to keep a civil tongue in your head. If you think that you might need professional help to accomplish this, weekly sessions with a competent therapist for a year or two should do the trick. And don’t be put off if the hourly fee is hundreds of dollars. Medicare covers it.

Third, and finally: As bad as that slip would be, there is the possibility of something even more embarrassing. Freud himself believed that a sufficiently strong unfulfilled wish, if locked away in some dark dungeon of the subconscious, could create intolerable internal feelings that were then projected onto an external object in the form of a paranoid delusion of the kind that motivates such modern political extremists as white supremacists and their mirror-twins, the antifas. You may find yourself on the campus of a large university, waving simplistic placards, shouting incoherent platitudes, and trading ineffectual blows with someone very much like yourself, a person who speaks Post-Modern English fluently but finds it difficult to express his opinions nonviolently. Why, he may even lack the most basic linguistic tools that are needed to engage in civil discourse.

You might intend to say, “Oh, you’re so kind. Thank you so much,” only to find yourself saying, “Why don’t you just go fuck yourself.”

The solution? Just pull yourself together, man. Snap out of it, for the love of God.

Given your age, maturity, and ability in archaic English, spotting these pitfalls early on and avoiding them should not be difficult. If, however, you find that you’re experiencing uncontrollable urges to play the pontiff, convert the heathen, or some such, and you feel the need for relief, there is a category of medications called anti-androgens that lower the testosterone levels often associated with judgmentalism. Most of the side effects are limited to the secondary sexual characteristics and are not life threatening. If this sounds right for you, you should consult your health care provider.

Should the medication prove ineffective and your symptoms persist, there is a growing body of evidence indicating that immediate and lasting relief can be achieved through gender reassignment surgery, provided that you are a male. While this has become a relatively safe and routine procedure, boasting a single-digit mortality rate, a small percentage of otherwise qualified candidates hesitate to “go under the knife.” But if you count yourself among these reluctant few, take heart. There is one more glimmer of hope: the experimental treatment protocol called “strategic relocation.” While there is insufficient data to conclusively prove the treatment’s therapeutic efficacy, the available anecdotal evidence suggests that, at the very least, more research is warranted.

Ferris T. Pranz, a postdoctoral fellow in the Department of Applied Metaphysics of Eastern Montana State University at Purdie, has been observing a band of people living with judgmentalism. These people were individually tagged and released over the past decade by the Montana Department of Behavior Management (MDBM) outside Fertin, a farming town near Lake Gombay, just south of the Missouri River. In his unpublished 2017 field notebook, Pranz records his painstaking efforts to gain the trust of this strategically relocated band at their watering hole, a smoke-filled bar called “Grumpy’s.”

There is one more glimmer of hope: the experimental treatment protocol called “strategic relocation.”

Pranz’s observations have raised some eyebrows in the judgmentalism community in Montana. Despite the Fertin band’s characteristically opinionated and aggressive communicational style and constant abuse of both alcohol and tobacco, they seem to share a gruff good humor while playing at pool, darts, and cards. Interestingly, they often refer to themselves as “blowhards,” apparently without shame or irony, and then laugh loudly. When Pranz would ask the group to explain the laughter, they would invariably laugh again, more loudly. Pranz has recommended that further research be conducted to discern the motives behind this laughter, possibly utilizing a double-blind design.

More broadly, Pranz and his colleagues at EMSUP have proposed a major longitudinal study to explore the incongruity of the seemingly upbeat ambience in “Grumpy’s” by designing instruments to quantify (1) the specific characteristics of these Fertin people and the effect that such characteristics may have on their communicational dynamics; and (2) the effects of the complete absence of treatment by means of any of the experimentally proven therapies for people living with late-stage degenerative judgmentalism. These effects can then be compared with therapeutic outcomes in matched groups receiving such treatments. Pranz has also recommended that the proposed longitudinal study be completed prior to authorization of an expanded “strategic relocation” program to include areas beyond Fertin. In October of 2017, the Board of Directors of the Friends of Judgmentalism in Bozeman passed a resolution in support of Pranz’s proposal. Pranz plans to apply for a grant through the MDBM in June of 2018.

Part Five: Looking Backward

American English is the language of our past, already dated and quickly becoming archaic. As will be shown, the impression that it makes when spoken is not good. More importantly, it conveys an aggressive smugness that is out of step with today’s world. Even the founding documents of the United States, written in American English, sound absolutist, judgmental, and harsh.

By now, you must have asked yourself: “If French is the language of love, and German is the language of Nietzsche, and Post-Modern English is the language of wimps, then what the heck is American English?” Well?

American English is the language of our past, already dated and quickly becoming archaic. It conveys an aggressive smugness that is out of step with today’s world.

As a native speaker of American English, I am not qualified to answer. To find a seat somewhere outside one’s own language and culture, and to listen from there to one’s language being spoken, gathering an impression of the speaker from the sound alone rather than the meaning, is like trying to street-park a Class A RV on the Upper East Side: it may be possible, but I’ve never seen it done. No, this question should be answered by people who don’t speak the language.

American English began in 1607, when the first British colonist stepped on the shore of the James River. How do you suppose American English sounds to British ears today? I’m told there are three main impressions. First, it is spoken more slowly than British English, giving the impression that the speaker is also a little slow. Second, it is spoken more loudly than British English, and with more emotion. As both of these characteristics are associated with children, the impression is that the speaker is somewhat immature. Third, American English is rhotic, meaning that “r” is pronounced both at the end of a word and before another consonant. As this pronunciation is normally associated with Scotland, Ireland, and remote rural areas, the impression is that the speaker is a bit rustic.

Taken together, then, to British ears American English is the language of dim-witted, childish yokels. One might call it the language of knuckleheads. That is not to say that Americans are knuckleheads. It simply means that our language makes us seem that way.

Post-Modern English, while less given to the glacial John Wayne drawl or the grating Jerry Lewis bray of American English, retains the rhotic accent, even doubling down on it with the vocal fry. Still, in two of the three categories, it constitutes an evolutionary step beyond its parent language. Even British children have begun to pick up Post-Modern English from Netflix, much to the delight and amusement of their parents.

To British ears American English is the language of dim-witted, childish yokels. One might call it the language of knuckleheads.

I was once told by a friend who spoke only the Arabic of the Nejd that French sounded like someone saying, “Loo, loo, loo, loo, loo,” and English sounded like someone saying, “Raw, raw, raw, raw, raw.” That was just one Bedouin’s opinion, of course. It seemed funnier in Arabic, somehow. “Loo, loo, loo.” We had a good laugh.

In 1776, less than 200 years after that first colonist was beached, Thomas Jefferson wrote the Declaration of Independence. What a marvelous symbolic moment in the evolution of English! He had to write it in American English, of course, because the Post-Modern branch wouldn’t emerge for two centuries. While this does not excuse him, it reduces his level of culpability. Listen:

We hold these truths to be self-evident, that all men are created equal.

Can you hear his certainty? Why, the phrase simply drips with self-confidence. To assert that a truth is self-evident is an act of rhetorical bravado that positively swaggers. (“Because I said so.”) Note the absence of fillers to dull the sharp edges. He seems to have missed the lesson that explains how “you have your truths and I have mine.” He seems to be saying that “all truths are not created equal,” pardon my French. And what is this nonsense about “men”?

So Jefferson was sure of himself, and assertive. But was he judgmental? Ask yourself: What is this Declaration of Independence, at its core? Is it a celebratory birth announcement, like a Hallmark card? (“Please welcome…”)

It seemed funnier in Arabic, somehow. “Loo, loo, loo.” We had a good laugh.

Far from it. This is Thomas Jefferson leveling a public and harsh judgment against one King George III. It spells out George’s crimes, like a rap sheet or an indictment. It is clear: Tom is judging George. Tommy is calling Georgie a dick. Listen:

A prince, whose character is thus marked by every act which may define a tyrant, is unfit to be the ruler of a free people.

This white, male, rich, privileged, powerful, slaveholding “founder” of America is writing in the scathingly self-righteous tones of archaic American English. The sound of Jefferson’s voice is clear. He is cocksure and in-your-face. He is your judge, jury, and executioner. The music of his American English is a march being played by a big brass band oompahing down Main Street on the Fourth of July, snare drums rattling like assault rifles. Courage is needed to follow the facts, no matter where they lead. It pains me to have to say it, but Thomas Jefferson was a dick.

Your final assignment is to translate the first of the two fragments above (the one with the “self-evident truths”) from American English into Post-Modern English. You have five minutes. Begin.

OK, time’s up. Answers will vary, of course, but it might be useful to compare your translation with the following:

So, some of us were sorta thinking? that a coupla of these like, ideas? or whatever? we had were, oh, I don’t know, kind of, you know, well, not so bad? I guess, right? And, uh, oh yeah, that all men, I mean, like women, too, kind of like, everybody? I mean, are pretty much, I’m gonna say, created? you know, like, equal? right. or whatever, so...

It sounds vaguely Canadian, eh?

Yes, it is time to put American English out to pasture. Post-Modern English is not just cooler; it is more in keeping with the zeitgeist. It is the language best suited to the more equitable, inclusive, and nonjudgmental world that we are building together.

It pains me to have to say it, but Thomas Jefferson was a dick.

It is time to hang up that coonskin cap.

* * *

All living languages are continuously evolving — as are we, the species that speaks those languages. Do these two forms of evolution influence each other? Of course they do. Through millennia, the evolutionary pas de deux of our species on this earth has been and continues to be shaped by, and to shape, the words and music of our languages. To the extent that there is intent in this choreography, it is ours. We are making ourselves. The changes we make to our languages have consequences beyond the merely aesthetic. They affect the choices we make that determine our destiny. We should, therefore, make changes to our languages with the same caution we exercise in rearranging the molecules of our genome. Are we good?

“. . . Fearing not that I’d become my enemy
in the instant that I preach.”
                          — Bob Dylan, “My Back Pages” (1964)




