Hong Kong in Context


Taking a casual survey of American political rhetoric, one would suspect that we were at the dawn of a new age — or at least that this nation had a poor memory. Somehow everything has become unprecedented. Unprecedented healthcare reform; unprecedented opposition to healthcare reform; unprecedented Republican victories in the midterm elections; unprecedented demonstrations in Hong Kong. But China has a long memory.

The recent protests in Hong Kong have adhered to the choreography of Chinese politics in at least one important respect: the Communist regime has accused its political opponents of being unpatriotic. Xinhua, the state news agency, recently published a commentary denouncing celebrities who supported the protests for the putative crime of challenging the authority of the Party, and — by a heroic leap of logic — of betraying a lack of love for the motherland. CY Leung, the Chief Executive, has accused foreign actors of orchestrating the demonstrations. He did not specify who these foreign actors were, but we all know that he means the United States, as if we weren’t content with the existing friction in bilateral relations and had decided on a whim to make life more difficult for the Chinese government.


Such ham-fisted tactics could be dismissed, were it not for the real danger that the accusations might actually be taken seriously. There is an ugly history of antagonism between the people of Hong Kong and their estranged brethren on mainland China, inspired by subjects ranging from the status of the Cantonese dialect to patriotic education to reports of tourists doing unseemly things in unseemly places in Hong Kong. To people from mainland China, the aloofness of people from Hong Kong often smacks of arrogance and snobbery. But the Chinese can put up with snobbery. It plagues Beijing and Shanghai, and nobody seems to mind. In the case of Hong Kong, the danger is that the protests may be viewed in light of this antagonism and interpreted as a posture of “more-democratic-than-thou.”

Hong Kong has always been viewed as an enclave of wealthy, westernized Chinese, enjoying a wide measure of civil liberties that have been resolutely denied to people from the mainland. There is a significant possibility that its people will be regarded as spoiled children, not content with their privileges and clamoring for more. The Communist regime will avail itself of every opportunity to cast aspersions on the pro-democracy demonstrators, and any indication that this is a struggle for Hong Kong’s exclusive rights will only serve to alienate the territory from the rest of China.

The democratic aspirations of the people of Hong Kong should be framed, by them and by their friends abroad, not in terms of their unique identity — for that would invite references to their former status as a colony of the West — but in terms of universal values that all Chinese can share. To Americans nurtured on the idea of universal values, this should not seem unprecedented.



Three Good Books


I have an apology to make. I have been far behind in letting you know about books I’ve enjoyed, books that I think you will enjoy as well.

To me, one of the most interesting categories of literature is a work by a friend of liberty that is not the normal work by a friend of liberty. The typical libertarian book (A) concerns itself exclusively with public policy, (B) assumes that its readers know nothing about public policy, and (C) assumes that its readers are either modern liberals or modern conservatives, who need to be argued out of their ignorance, or modern libertarians, who need to be congratulated on their wisdom. I find these books very dull. I suspect that when you’ve read one of them, you’ve read them all. But I have no intention of reading them all.

What I want is a book that has a libertarian perspective and actually tells me something new. One such book is Philosophic Thoughts, by Gary Jason. You know Gary; besides being a professor of philosophy, he is also one of Liberty’s senior editors. The book presents 42 essays, some on logic, some on ethical theory, some on metaphysics, some on applications of philosophy to contemporary issues. Libertarian perspectives are especially important in the discussions of ethical theory, where we have essays on such matters as tort reform, free trade, boycotts of industry, and unionization (issues that Jason follows intently). The attentive reader will, however, notice the spirit of individualism everywhere in the book.


The essays are always provocative, and Jason knows how to keep them short and incisive, so that the reader isn’t just invited to think but is also given time to do so. Of course, you can skip around. I went for the section about logic first, because, as readers of Liberty know, I understand that topic least. I wasn’t disappointed. There is nothing dry about Jason’s approach to problems that are unfairly regarded as “abstract” or “merely theoretical.” He is always smart and challenging, but he makes sure to be accessible to non-philosophers. In these days of fanatical academic specialization, it’s satisfying to see real intellectual curiosity (42 essays!). And Jason doesn’t just display his curiosity — he is no dilettante. He contributes substantially to the understanding of every topic he considers.

Another book that I’ve enjoyed, and I don’t want other people to miss, is a work by Jacques Delacroix, who has contributed frequently to these pages. In this case, you can tell a book by its cover, because the cover of Delacroix’s book bears the title I Used to Be French. Here is the cultural biography — cultural in the broadest sense — of a man who became an American, and an American of the classic kind: ingenuous, daring, engaging, funny, and again, curious about everything in the world. Whether the author began with these characteristics, I don’t know, but he has them now; and what you see in the book is someone learning, as he moves from France to America and from mid-century to the present, that “American” is the best name for his own best qualities.


It takes literary skill to project a many-sided personality; and the strange thing is that it takes even more skill to project the differences we all feel between American culture (bad or good) and French — or any other European — culture (bad or good). We feel those differences, but when we try to describe them we usually get ourselves lost in generalizations. Delacroix doesn’t. He has a taste for the pungent episode, the memorable anecdote. He also displays two of the best qualities of which a good author, American or French, can ever be possessed: an exact knowledge of formal language and an intimate and loving acquaintance with the colloquial tongue.

Sampling Delacroix’s topics, one finds authoritarianism, Catholicism, Catholic iconography, the Cold War, communism, diving, driving, the end of the Middle Ages, existentialism, food, French borrowings from English, the French navy (being in it), getting arrested, grunion, jazz, Levis, lovemaking, Muslims, the People’s Republic of Santa Cruz, political correctness, the Third World in its many forms. . . . Most (even grunion) are topics that a lesser author would inevitably get himself stuck to, but Delacroix romps through them all. If you want a loftier metaphor, you can say that they (even the grunion) are jewels strung on the book’s central story, as sketched in the summary on the back cover: “A boy grows up in the distant, half-imaginary continent of post-World War II France. Bad behavior and good luck will eventually carry him to California where he will find redemption.” And a lot of fun, for both the reader and himself.

Fun, also, in another way, is a book I’ve been perversely withholding from you for three years. It’s Back to the Land: Arthurdale, FDR’s New Deal, and the Costs of Economic Planning, by C.J. Maloney (also, be it noted, a contributor to Liberty). What does that title mean? Well, Arthurdale, West Virginia, was a settlement begun in 1933 by the United States government under the inspiration of Eleanor Roosevelt. It was the result of Mrs. Roosevelt’s commendable concern for the poor and of her utter inability to understand what to do about poverty. Her idea — which was shared by a multitude of college professors, pundits, quack economists, and the usual products of “good” Eastern schools — was that there was an “imbalance” between rural and urban America; that the latter was too big and the former too small; and that the government should “resettle” hordes of Americans “back on the land” (where, incidentally, most of them had never lived). Mrs. Roosevelt was especially concerned with converting out-of-work miners into “subsistence” farmers. She and her New Deal accomplices designed a turnkey community for 800 or so lucky recipients of government largesse — land, houses, furnishings, equipment, expert advice. What could go wrong?

The answer, as Maloney shows, is “virtually everything.” The planned community had no plans except bad ones. The farms didn’t support themselves, and the farmers didn’t really want to farm them. Everything cost more — lots more — than it should have. Attempts to supplement small farming by small industry repeatedly failed. When the “colonists” managed to produce a surplus of something, the government wouldn’t let them sell it. The democratic and communitarian ideals hailed by government bureaucrats — who included some of the nastiest specimens of the New Deal, such as Rexford Guy Tugwell, one of the smuggest and stupidest creatures who ever attracted national attention — were continuously negated by the power of the Planners themselves.

It’s a good story, amusing though sad; and I wish I could say it was amazing. Unfortunately, it was just one of the predictable results of those dominating impulses of big government: arrogance and wishful thinking. Maloney’s well-researched book places Arthurdale firmly in the context of 20th-century interventionism, with plenty of information about the broader movements it represented and the people involved in them. The book is lively and pointed. Like the other books mentioned here, it is both an education and an entertainment. Like those other books, it is one of a kind, and not to be missed.


Editor's Note: Review of "Philosophic Thoughts: Essays on Logic and Philosophy," by Gary Jason. New York: Peter Lang, 2014. 416 pages; "I Used to Be French: an Immature Autobiography," by Jacques Delacroix. Santa Cruz, CA: By the Author (but you can get it on Amazon), 2014. 420 pages; and "Back to the Land: Arthurdale, FDR’s New Deal, and the Costs of Economic Planning," by C. J. Maloney. Hoboken: John Wiley & Sons, 2011. 292 pages.





Point Counterpoint


Dinesh D’Souza is a debater beyond compare. I have watched him debate at least a dozen times, and he is simply brilliant in the way he sets up his opponent, recognizes the opponent’s position, and then systematically takes it apart and refutes it. Once when he was debating Christopher Hitchens on the value of religion, Hitchens called D’Souza’s bluff by not making his own case, thereby giving D’Souza nothing to tear apart. Undaunted, D’Souza first told the audience what Hitchens should have said about the bad things that have happened in the name of religion, and then went ahead with his own side of the debate, never missing a beat and managing to stay within his time limit to boot.

I thought about those debating skills while watching D’Souza’s new movie, America: Imagine a World Without Her. The film begins with an imagined reenactment of a Revolutionary War battle in which Washington dies and America never comes into existence. What might the world look like without the American philosophy? He then plays devil’s advocate, listing five significant areas in which Americans should feel deep shame:

  1. Theft of lands from Native Americans, and genocide against them
  2. Theft of the American Southwest from Mexico
  3. Theft of life and labor from African-Americans
  4. Theft of resources from around the world through war and expansionism
  5. Theft of profits from consumers through capitalism (“You didn’t create that business — someone else built those roads, educated those employees, etc.”)

Watching this part of the film, especially as the first three points were elaborated, I nodded my head in agreement and disgust. These were terrible events that blot our nation’s history. How, I wondered, would D’Souza debate his way out of this one?

D’Souza then steps back to give context and historical background to these situations. He does not denigrate or trivialize the suffering of the people involved, but he widens the story to give a broader perspective. By the time he is finished we feel humbled by the bad things, but no longer shamed by our history. In fact, our pride is restored for the good that we have accomplished, despite our slowness sometimes in getting there. Quoting both Martin Luther King and Abraham Lincoln, he calls the equal rights vouchsafed in the Declaration of Independence a “promissory note” that took decades — nay, two centuries — to pay off, and indeed is still a promissory note in some instances.


I was especially pleased that D’Souza included a segment on Madam C.J. Walker, the first black American woman to become a millionaire. Walker made her million manufacturing and selling cosmetics and pomades for African-Americans. She started as a cotton picker, worked her way up to cook, and saved her money to start her business. She is a true entrepreneurial hero who is often overlooked in the history books, I think, because she doesn’t fit the cult of victimhood ascribed to blacks and women, and because she made it on her own through entrepreneurship, not through political activism. I only know about her because her mansion is a mile from my house. (It survived the Roosevelt wealth tax devastation by serving as a tax-exempt old folks home for several decades, but is now a private residence again.) Now, thanks to D’Souza’s movie, others will know about this American entrepreneurial hero.

I would have been happy if the film had ended there, but then D’Souza turns to his opponents in this debate, such people as Boston University professor Howard Zinn, whose 1980 book A People’s History of the United States 1492–Present has influenced many political activists; and Saul Alinsky, whose Rules for Radicals heavily influenced such politicians and “community organizers” as Hillary Clinton and Barack Obama. Like a good debater, D’Souza defuses the ammunition his detractors might use against him, the business about his recent run-in with the law, by addressing it head-on instead of giving his opponents an opportunity to whisper about it or suggest that he is hiding something. He admits that what he did was wrong (he reimbursed two friends who donated to another friend’s campaign in order to circumvent campaign contribution limits established by law — a law, by the way, that many people consider a violation of the First Amendment right to free speech). D’Souza frames his admission within the context of selective prosecution (some would call it political persecution) in retaliation for his previous film, 2016: Obama’s America.

America: Imagine a World Without Her opened this week to coincide with the Fourth of July. It is an impressive piece of filmmaking, not only for its well-structured arguments but for its production qualities. Producer Gerald Molen, who won an Oscar as producer of Schindler’s List, is the man behind the magic. The film is also a featured selection at the Anthem Libertarian Film Festival as part of FreedomFest at Planet Hollywood in Las Vegas next week (information about FilmLovers Passes is at anthemfilmfestival.com).


Editor's Note: Review of "America: Imagine a World Without Her," directed by Dinesh D’Souza and John Sullivan. Lionsgate, 2014, 103 minutes.





They Didn’t Want a War


Margaret MacMillan’s The War that Ended Peace gives a fascinating description of the background, stretching back to around 1900, of what she, like people at the time, calls the “Great War.” She relates how the Bosnian crisis of 1908, the Moroccan crises of 1905 and 1911, the crises arising from wars among the Balkan countries in 1912 and 1913, and various minor incidents were successfully muddled through without war among the great powers. The most general source of tension seems to have been fear of being attacked first and concern to make and maintain alliances.

Leading statesmen optimistically expected that tension between Austria-Hungary and Serbia, exacerbated by the assassination of Archduke Franz Ferdinand on 28 June 1914, would somehow be resolved like the earlier crises. Even after Austria-Hungary rejected Serbia’s compliant but not total acceptance of its ultimatum and declared war, hope lingered of keeping the war contained.

Few policymakers had wanted war (the main exception perhaps being Franz Conrad von Hötzendorf, Austro-Hungarian Chief of Staff). The German Kaiser was no exception, although he was addicted to impulsive speeches and interviews, liked to strut in military uniform, and even enjoyed fiddling with the detailed design of uniforms (as did his fellow emperors Franz Joseph and Nicholas II).


As those examples suggest, MacMillan goes into revealing detail not only about demographic, economic, political, diplomatic, and military situations and events but also about people — royalty, politicians, foreign ministers, diplomats, generals and admirals, journalists, and influential or well-connected socialites — together with their backgrounds, illnesses, deaths, and strengths or quirks of personality.

Much of this is relevant to the role of sheer and even trivial accident in momentous history. MacMillan herself notes several examples. The Russian monk Rasputin, whatever his faults, strongly advocated peace and had great influence with the Imperial family; but he had been stabbed by a madwoman on the very day of the Austrian Archduke’s assassination and was recovering slowly, far from St. Petersburg. The Archduke himself had long realized that Austria-Hungary was too weak to risk an aggressive foreign policy. Alfred von Kiderlen-Wächter, German Foreign Minister and in MacMillan’s opinion a force for peace, had died in December 1912. Joseph Caillaux, France’s peace-minded Prime Minister, had had to resign in January 1912, partly in connection with his second wife’s shooting of an editor who had threatened to publish some indiscreet love letters that Caillaux had sent to her while she was still married to someone else. Although MacMillan does not explicitly raise the question, I was set to wondering how events would have evolved if Otto von Bismarck, a realist who was satisfied with Germany’s international position achieved by 1871, had been alive and in office in 1914. Or what if Gavrilo Princip’s bullet had missed the Archduke?

MacMillan ends her book, apart from a 13-page epilogue, with the outbreak of war in July–August 1914. That is fine with a reader more interested in the consequences of particular wars and in how the wars might have been avoided (as many potential wars no doubt were barely avoided) than in the details of the actual fighting. World War I was a momentous and enduring tragedy. Germany, for one, had everything to gain from continuing peace, including its growing leadership in science and industry. MacMillan writes a gripping story. She conveys a feel of the suspense that must have prevailed during the final crisis. My opinion of her book is overwhelmingly favorable.

Or it would be except for one minor but pervasive and annoying defect. The book is erratically punctuated, mainly but not everywhere underpunctuated. Even independent clauses, often even ones with their own internal punctuation, go unseparated by a comma or semicolon. Restrictive and nonrestrictive phrases and clauses are not distinguished, as clarity requires, by absence or presence of punctuation. Such erratic and erroneous punctuation delays understanding, if usually only for a second. Even so, it distracted me from the book’s fascinating story.

Above all, it distracted me with sustained wonder about how so untypically mispunctuated a book could emerge from a major publishing house. Could the copyeditor have given up in the face of a daunting and tedious task? Could an incompetent editor have imposed the damage, which the author then passively left standing? Could the author have committed the errors herself and then, perhaps out of bad experience with previous copyeditors, have insisted on none of their tampering this time? None of these hypotheses seems plausible, but I can’t think of a better one. The author’s including her copyeditor in her long list of Acknowledgments adds to the mystery.

I’d be grateful if someone could relieve my curiosity with the true story.


Editor's Note: Review of "The War that Ended Peace," by Margaret MacMillan. Random House, 2013, 784 pages.





The Crimean Crisis


As readers of this journal may remember, I am not an isolationist, if “isolationist” be defined as “one who deems it immoral for the United States to use force across its borders”; but I am an isolationist if the word be defined as “one who thinks the United States should mind its own business.” To my mind, the Crimean crisis is a classic instance of a conflict about which the United States should do just that.

I am at least a mild supporter of the Ukrainian revolution, as I understand it. And I have little or no use for Vladimir Putin, as I understand him. I can understand why Ukrainian-speaking Ukrainians would like to hang on to the Crimea. But I can also understand why Putin would like to get it away from them (as he virtually has, right now). It’s the location of a Russian fleet. The majority of its population speaks Russian, is Russian, and resents attempts of Ukrainian nationalists to make them speak and be Ukrainian. The Russians presumably know what can happen to dissenting nationalities when even the most “liberal” revolution heats up. And after all, the Crimea is part of Ukraine only because the old Soviet dictatorship, in an idle moment, gave it to Ukraine.

There are some reasons why the United States should not want Russia to annex the Crimea. It’s generally best for us when the Russians have an unstable base, such as Ukraine, for their military power. Even the least legitimate borders are often better than no borders at all, so it would generally be better if nationalists of every kind thought it was futile to try rearranging them. And it would be unfortunate to see a guy like Putin win.

This does not add up to a reason for us to “get tough” with Putin. It would be almost impossible to do so with any degree of success anyway. President Obama may draw “lines in the sand,” but no one in the world believes what he says, even if it’s accidentally true.

So this is a good time for us to enjoy our isolationist traditions to the full. Bad things may happen; bad things undoubtedly are happening. This is to be expected wherever 19th-century European nationalism rears its head again. But that is, and must be, their problem.






Rose Wilder Lane Takes Another Bow


Rose Wilder Lane fans should not miss Susan Wittig Albert’s new book, A Wilder Rose (Persevero Press, 2013). The book is written as a novel but is really novelized biography. It focuses on Lane’s life in the 1930s, when she went to live with her mother, Laura Ingalls Wilder, and rewrote her mother’s manuscripts as the Little House books.

I didn’t grow up with those books and have read only the first one. I have read William Holtz’s biography, The Ghost in the Little House (University of Missouri Press, 1993; see Liberty, Mar. 1992, p. 51), which explained how Rose transformed her mother’s oral-tradition stories into commercially valuable fiction. I can’t vouch for everything in Albert’s new book, but the Rose she presents — and this is written in the first person — sounds very much like Rose’s voice.

Albert has a chapter on Rose’s brief romance with Garet Garrett, a writer I know very well. I can vouch for the fact that the Garrett in A Wilder Rose sounds like him. Some of his statements in the book are right out of his letters to Rose.

Albert’s novel is mostly about relationships: between Rose and her mother; between Rose and her seven-year companion, Helen Boylston; between Rose and a boy she took under her wing, John Turner; between Rose and Garet; and, most of all, between Rose and her writing.


Rose wrote for money. Despite her pinched upbringing, or maybe because of it, she was a spender, not a saver. When she had money she went on trips and enjoyed herself. She paid for the education of John Turner, and of Rexh Mehta, a boy she had known in Albania. She built her mother a stone house and brought electricity to their hardscrabble farm. The rewriting of her mother’s unpublishable drafts was partly motivated by a desire for her mother to have money so that Rose would not feel obligated to give her so much of it.

The central event of A Wilder Rose is mother and daughter agreeing, after struggle and face-saving, that Rose would rewrite the Little House manuscripts without credit or disclosure. Another theme is Rose’s incessant desire to shake free of the need to earn “cash, cash, cash,” and write about the ideas she cared about, all the while she was spending money on the people she cared about.

At the end of the 1930s Rose Wilder Lane did shake free of financial obligations and write what she cared about. Her 1943 polemic, The Discovery of Freedom, has made her a historical figure for libertarians. It is not, however, an achievement that much interests biographers, who are attracted much more to the story of the unsung ghostwriter of the famous Little House books.

The later Rose became an enemy of the state. She did this by not signing up for Social Security and not making a lot of money the state could tax. She no longer wrote novels or thousand-dollar stories for the Saturday Evening Post. She did write some things, but there was less of a market for them and her output declined. She was no longer famous.

There is not much in Albert’s book about her life after 1939.


Editor's Note: Review of "A Wilder Rose," by Susan Wittig Albert. Persevero Press, 2013, 302 pages.





Barbara Branden, RIP


Two weeks ago I received a message from Barbara Branden expressing joy that her book, The Passion of Ayn Rand (1986), was now available as an ebook, with a new introduction by her. Nice going! I thought, to have a book in print for 27 years, and to be reintroducing it today, in a form of publication unknown when the book was written.

In her 84 years, Barbara herself passed through many forms and editions, without ever losing her essential being, or her essential spunk. When very young, she and her former husband Nathaniel Branden became acquainted with Ayn Rand — first as inquirers into the philosophic and literary work of an author who was not, at the time, particularly well known; then as virtual family members, the innermost of Rand’s inner circle; then as Rand’s chief publicists; then as her first biographers (Who Is Ayn Rand? [1962]); then as disillusioned former disciples (1968).

Now here is the very unusual thing: both Barbara and Nathaniel repudiated their absurdly flattering and credulous biography and many of the fanatical conclusions that their mentor had derived from her libertarian and Objectivist premises, but they didn’t throw the accomplishments out with the failures. They kept investigating and publicizing the best parts of Rand, her true intellectual accomplishments. And in 1986, Barbara produced the first real biography of her former friend, a work that demonstrated she could not only admire but also distinguish what was worthy of admiration. She showed where her earlier biography had gone wrong, and she had a lot to say about where she herself had gone wrong during the time when she wrote it. No maudlin emotions, no spite was expressed — but a great deal of gratitude for the true things Rand taught.

Very few authors ever repudiate anything they’ve written; even fewer repudiate their writings in a candid and discriminating manner. And very few libertarians or Objectivists have ever possessed the charm, the personal persuasiveness of Barbara Branden. I sometimes think that there would be millions more libertarians if there were only a few more people able to speak like Barbara. She was never interested in rhetorical victories or smart remarks (though she did have a taste for ironic epigram); she was interested in stating a case clearly and smoothly (no “ums” allowed). She succeeded, both in private and in public.

Barbara was a prize speaker at libertarian events, but I can tell you that she was also an excellent listener, one of the best listeners I have ever known among ideologically inclined people. She didn’t debate; she didn’t spar for intellectual advantage; she didn’t pretend to know what she didn’t know; she asked questions, acknowledged contrasting ideas, made suggestions, said things like “I hope you’re right,” and smiled with joy over the human fellowship that real conversation brings.


Memories. I remember sitting on the big couch in Barbara’s apartment in Los Angeles, while she took a day to help me with the research I was doing for The Woman and the Dynamo, my biography of Isabel Paterson. Rand was Paterson’s disciple, and Barbara was Rand’s disciple, and now Barbara was helping me, the latter-day disciple of Paterson. She was completing one of the many circles that libertarians needed to complete. When my book came out, Barbara received it with pleasure, despite the different interpretations it presented of some important things in her own book. Another author would have resented them; she assuredly did not.

I remember attending the party that preceded the auction of some of Rand’s papers, in Los Angeles in 1998, talking with Barbara, and watching her pose for pictures with Nathaniel. She didn’t pretend not to cry; not all the cycles of her life had been pleasant for her, although she was happy to see this particular cycle returning on an upward curve. She did not cry when I talked with her on the phone while she was recovering — oh, this was many years ago — from a cancer that could have claimed her life. I called, fearing to find her at death’s door. Not at all! Her voice was a little weak, but her spirit was confident. “I am learning,” she said, “not to be a cancer-prone person.”

I remember Barbara telling me about the time when she (and Nathaniel, I believe) were arguing with Bennett Cerf, Rand’s publisher, a man known as a modern liberal. “I don’t think that went very deep,” Barbara said. “When we pressed him about the liberal idea that people should sacrifice to help ‘those less fortunate than themselves,’ he finally said, ‘We have to do it, because otherwise they’ll destroy us.’”

I remember looking forward to visiting Los Angeles so I could go with Barbara to her favorite restaurant (a place with “Hamburger” in the name) and hear more of her stories. I remember Barbara’s healthy appreciation for handsome, hunky men. I remember her humor. And I remember her good humor. Some people are born bitter; others have bitterness thrust upon them; Barbara always refused that gift. She was interested in more vital matters.

I remember so many other things about Barbara . . . but how strange it seems to say “remember,” as if she were actually gone. True, she died on December 11, 2013 — in her sleep, after leaving a hospital where she had been treated, apparently with at least temporary success, for a lung ailment. But no one who knew Barbara Branden will believe she is actually gone.



To Protect Us from Ourselves


When the AIDS epidemic began in the late 1970s, contracting the virus was a virtual death sentence. No one had a cure. In fact, at first, there wasn't even a diagnosis. People just weakened and wasted away until they died, usually of pneumonia. I don't know anyone who didn't know someone who died from the mysterious illness during the 1980s.

Once it was diagnosed, finding a cure became a top priority, and pharmaceutical researchers who had promising results in lab experiments were fast-tracked to human trials in an effort to outpace the death toll. But people were dying faster than a cure could be found. Moreover, only half the people participating in the trials were given the medications that might cure them; the other half were given a placebo, and even their doctors did not know who was getting the real thing. The FDA controlled the game, and though it was fast-tracking the research, it wasn't fast enough for the patients, who were dying at an alarming rate.

Meanwhile, researchers in other countries were working just as hard to find a cure. AIDS sufferers desperate for medicine went abroad for treatment. Many treatments consisted of high doses of vitamin and mineral supplements that would boost the compromised immune system, giving the body the strength to fight the virus. These supplements and medications were not illegal, but they were not approved either. Consequently, individuals could use them, but they could not sell them. To circumvent this technicality, "buyers clubs" were born. By purchasing a monthly membership, people could have all the supplements they needed for free. The FDA didn't like these buyers clubs, but it couldn't stop them unless the specific supplements were declared illegal to use. Buyers clubs flourished around the country as thousands of terminally ill patients lined up for treatment.

He shouts at his doctor, "Screw the FDA! I'm going to be DOA!" Then he drives to Mexico to find his own treatment.

Dallas Buyers Club tells the story of Ron Woodroof (Matthew McConaughey), an electrical engineer and rodeo rider who contracted the AIDS virus in 1985. Given just 30 days to live, he begs to get into the clinical trials or to buy AZT, the only drug that was showing any promise. When he can't get into the clinical trials or buy the drug outright, he shouts at his doctor (Jennifer Garner), "Screw the FDA! I'm going to be DOA!" Then he drives to Mexico to find his own treatment from Dr. Vass (Griffin Dunne), an American who has lost his license to practice medicine in the US. The Dallas Buyers Club is born, and Woodroof lives another seven years, along with hundreds of other survivors who purchase memberships from him. The film documents his fight with the FDA as he struggles to keep his supplements from being actively banned instead of simply "not approved."

Ron Woodroof is about as unlikely a hero as you will ever find in a film. A disgusting man with disgusting habits, he's a foul-mouthed, homophobic, alcoholic, coke-snorting, porn-viewing womanizer without an ounce of the milk of human kindness. Both F words — “fuckin'” and “faggot” — regularly spew from his mouth. "Fifty bucks?" he says incredulously to a desperate young man who has come to join the buyers club. Then he strides to the door of his motel-room-turned-"club"-office and shouts to the men lined up in the parking lot, "Membership is four hundred bucks. You got that? Four hundred bucks. I'm not running no goddam charity!" He turns to the frightened young man: "Don't you come back here till you got $350 more." He's in it for the money. Saving lives is just a byproduct.

Ron learns what prejudice feels like when his friends turn against him. They call him "faggot" because they assume that's how he acquired the disease, yet they avoid him because they are afraid of catching it by standing too close. In anger Woodroof spits at them, knowing that his body fluids have become a deadly weapon. Early research demonstrated that AIDS mostly occurred among the "4H" group: homosexuals, heroin users, Haitians, and hemophiliacs. I remember the dark joke that used to circulate in the 80s: "What's the worst thing about getting AIDS? Convincing your parents that you're Haitian." But it was also a danger among promiscuous heterosexuals who engaged in indiscriminate, unprotected sex. And that was the way Ron Woodroof lived his life. He practically shouts "Hallelujah" when a woman who is HIV-positive joins the Buyers Club, because now he can have sex again without worrying about transmitting the disease.

The film portrays the FDA as the bad guys, in cahoots with the pharmaceutical companies and preventing sick people from getting the treatment they want and need. Like most libertarians, I am convinced that the FDA does as much harm in delaying the approval of effective treatments or approving the use of harmful treatments as it does good in its stated purpose of protecting the public. Dr. Vass tells Woodroof that the high dose of AZT used in the FDA-approved trials was toxic, poisoning the body along with the virus. Woodroof gets better when he stops taking his black-market AZT and starts taking Vass' supplements (as well as experimental Interferon he eventually buys in Japan).

However, I have to suggest that the patients involved in the clinical trials bear some of the blame for the skewed results of the early tests of AZT. Many of them were sharing or selling their meds in order to help friends who were also infected but could not get into the trials. For example, Rayon (Jared Leto), a transvestite whom Ron reluctantly befriends in the hospital, is selling half his AZT to his partner, who also has AIDS. This would have skewed Rayon's results. When Rayon got better, researchers naturally assumed that the dosage they prescribed was correct, when actually he was taking half as much as they thought he was taking. Future patients would be prescribed more than they needed, and they would not get better. These trials were flawed, because the patients were not being honest.

Of course, the whole system was flawed because the market was not allowed to operate in the open. As one almost-wise judge says in the movie, "Someone who is terminally ill ought to be allowed to take whatever he wants. But that is not the law." I would go one step further: we are all terminal. We are all going to die. We ought to be able to decide what we put into our bodies, as long as we accept the consequences of our actions — which includes getting sick and having to pay for treatment from our own pockets or the private insurance we pay for (which might not be available to us if our willful actions have caused the problem). We don't need government watchdogs. Private organizations such as Consumer Reports, the Better Business Bureau, PCGS (Professional Coin Grading Service), and even Good Housekeeping, with its Seal of Approval, work just fine, thank you very much. But if someone agrees to participate in a clinical trial, whether publicly or privately funded, that person is obligated to be honest and diligent in maintaining the integrity of the tests.

We are all terminal. We are all going to die. We ought to be able to decide what we put into our bodies, as long as we accept the consequences of our actions.

Matthew McConaughey lost 38 pounds for this role, and he looks terrible. His cheeks are sunken, his eyes dull, his skin sallow. Other actors have undergone massive weight loss for particular roles; Christian Bale and Tom Hanks come immediately to mind, as well as Jared Leto, who lost 30 pounds for his role as Rayon in this film. But McConaughey does not seem to be bouncing back from this extreme weight loss as well as others have. In more recent roles this year his skin still looks sallow, and his eyes still have that dark, almost vacant brightness. While I admire his dedication to his craft, and I'm not surprised that so many critics are predicting Oscar nominations for McConaughey and Leto, I hope that this fine actor has not inflicted permanent damage on his liver or other organs in order to make this film, especially because it is not a great film. It's an important topic, but the movie drags in places, and I caught myself looking at my watch several times.

Moreover, it is borderline pornographic, from the opening scene when Woodroof is having a threesome at a rodeo and continuing through his voyeuristic visits to strip clubs, to the porn adorning his walls, to additional threesomes — or maybe it was foursomes; I had to stop looking — even after he finds out he has AIDS. I realize that director Jean-Marc Vallée was developing Woodroof's seedy character with these scenes, but I think the audience could have gotten the point without the scenes being so graphic. As a result, this important movie with its strong libertarian theme is making the rounds of the art houses instead of the major theaters, where it could (and should) have been seen by hundreds of thousands more viewers, viewers whose minds might have been changed about the FDA and other government agencies created to "protect us from ourselves." These scenes might not bother you, but I will be recommending that my friends read the article written by Bill Minutaglio for the Dallas Morning News on which this story is based. Here is a link: http://www.buyersclubdallas.com/.


Editor's Note: Review of "Dallas Buyers Club," directed by Jean-Marc Vallée. Voltage Pictures, 2013, 117 minutes.





O Tempora! O Bama!


For anyone who enjoys linguistic spectacle, who savors both the triumphs and the flops of the American language, there is just too much to savor in the political carnival now going on. You’re reduced to picking a few favorites — but there are so many to pick from.

For a while my favorite performance was the testimony, if you want to call it that, of Kathleen Sebelius, God’s gift to satirists, who on October 30 told a congressional committee investigating the zany antics of the Obamacare website, “Today, more individuals are successfully creating accounts, logging in, and moving on to apply for coverage and shop for plans. We are pleased with these quick improvements, but we know there is still significant, additional work to be done. We continue to conduct regular maintenance nearly every night to improve the consumer experience.”

That was her way of describing the worst disaster in the history of computation. Unluckily for her, the website crashed (for the thousandth time) during the hour of her testimony, a testimony in which she said, “The website has never crashed. It is functional but at a very slow speed and very low reliability.”

I thought that was hard to beat, but then I discovered Marilyn Tavenner, administrator of something called the Centers for Medicare and Medicaid Services. (Everything is a “center” these days, and every center has as many “services” as confidence men have “angles.” Pretty much the same angles, too.) On November 5, Tavenner let Congress know what her center is doing about people whose insurance plans have been swept away by Obamacare: “This is actually a conversation we're having today. . . . Is there a way we can actively engage to reach out to people who have been canceled?"

From these heights of metaphor one lands with a thump on the pancake-like flatness of a quickly succeeding passage.

Rome burned while Nero conversed. “Conversations,” thoughts of “engagement,” and questions about whether there are ways to “reach out” (“actively,” not passively) are good means of wasting time if you’re chairman of the country club greens committee, or if you’re a highly paid bureaucrat who finds that she has nothing to say for herself when the public finally discovers her existence. I’m not sure they do much for “people who have been canceled.” As the Beatles might have sung, “Oh, look at all the canceled people.”

Tavenner looked like a winner — until I encountered US Sen. Kay Hagan (D-NC). On November 12, Hagan panicked and called a press conference to rescue herself from the Obamacare wreckage (she’s up for reelection next year). Someone asked her to comment on the miserably small number of people signing up for Obamacare. According to Dana Milbank of the Washington Post, this is what ensued:

“You know,” she replied. “I know the — I believe this coming Friday, those numbers are going to be published and uh, you know, as soon as I see them, you know, obviously it’s, it’s m-much fewer than the administration expected.”

A reporter from the Greensboro (N.C.) News & Record asked why Hagan, like President Obama, had told people that if they liked their health plans they’d be able to keep their health plans.

There was a long pause before Hagan responded, then a deep intake of breath. “You know, Doug,” she responded, “the, um” — here she exhaled and paused again — “the way these, the — the regulations and the law, uh” — pause — “came forward recently, I think people were surprised that the, uh, the — the actual original plans would be, um, would be canceled.”

You may say that all politicians would sound like that, if the statements they made were accurately reproduced; and if so, you’d be close to right. Deprived of his teleprompters, President Obama says “uh” about 20 times a minute, up to 40 when he’s agitated (these subverbal attempts to communicate are tactfully omitted from the reported versions). And of course President Obama, and Rep. Boehner, and former Gov. Palin (shall I go on?) often have no more meaning in their utterances than poor Sen. Hagan.

But we mustn’t judge rhetorical effectiveness simply by the content of a politician’s remarks, or noise. It’s charm that counts, and our politicians have little or none of that quality. The “uhs” contribute to the effect, but even a total absence of “uhs” couldn’t make Harry Reid look like something other than the troll who wanted to eat the billy goats gruff. Nor would it turn President Obama into a charming character.

Whatever Obama touches, he disfigures. His speech has as much relation to literature as an advertising brochure.

For some, certainly, Obama has “charisma,” but of charm he is completely destitute. He comes across as a phony and a blowhard, and it’s hard not to see a wide vein of meanness and chronic anger beneath the high-school-principal intonations. When he’s not looking at his teleprompter — when he’s supposed to be conversing with an actual human being — he’s usually gazing fixedly at a point about 12 inches in front of his chest, as if he were studying an invisible set of instructions for dealing with the underclass. This is the antithesis of charm. It’s the kind of thing one expects from bank examiners, experts on epistemology, and actors emerging from a heavy course of anger therapy. Sen. Hagan, by contrast, manifests herself as a hapless innocent, as someone so childish that she calls a press conference to display her knowledge — of a subject she knows nothing about. She’s like a little girl who begs to show everyone how well she can play the piano, without ever realizing that you can’t play a tune without learning the notes. But isn’t it cute, the way she’s trying? Less cute is President Obama.

There are four types of rhetoric in which he habitually indulges, and none of them is even mildly amusing, let alone endearing:

1. The “soaring” mode that even his supporters now derisively call “the hopey-changey thing.”

2. The false-plebeian style that he uses in exact proportion to his slippage in the polls. This style, or pretense at style, consists largely of dropping final g’s, saying “a whole buncha” instead of a number, and referring constantly to “folks.” In that speech he gave at Boston, the one in which he tried to save his lie about Obamacare by claiming he had always told people “you can keep your insurance . . . if,” he said of his failed healthcare scheme, “We’re just gonna keep workin’ at it. We’re gonna grind it out.” That might be charming if the accent weren’t so obviously faked, if “grind it out” meant anything under the circumstances, and if he (“we”) were actually doing any work, as opposed to golf.

3. The paranoid style, in which he unmasks the monstrous forces scheming against his official program, the “some people” who “don’t want it to succeed” and therefore, magically, keep it from succeeding. Evidence? Most of them voted against it!

4. The cold, haughty, you’re-so-dumb-you’ll-just-have-to-believe-this, lie-flinging mode. “I was not informed directly that the website would not be working, as [sic] the way it was supposed to,” he said on November 14. Wait. What do you mean? Do you mean that you didn’t know? That nobody ever told you? No, they didn’t. They didn’t tell me directly. Now go away.

Of course, when people insert “directly” into a sentence like that, you know they’re trying to deceive someone. You also know that the someone is not going to be you; almost anybody (most certainly including you) can catch on to the fact that “directly” means “I hope to fool you.” Indeed, the trick is so obvious that only a fool would use it. Obama himself has recognized that people might possibly think he’s a fool — and by recognizing the possibility, he has tried to eliminate it. “You know,” he said on November 14, “I’m accused of a lot of things [there’s that paranoid style again] but I don’t think I’m stupid enough to go around saying this is going to be like shopping on Amazon or Travelocity a week before the website opens if I thought that it wasn’t going to work.” But either he is stupid enough to keep telling obvious lies or he is stupid enough not to insist on being informed directly about the stuff he seems to be lying about. Take your pick; either way, he’s stupid enough.

The mystery to me is why people ever thought there was any force or meaning in Obama’s verbiage. At its best, it was just the same awful guff that politicians are always dishing out. In his second inaugural address, where he might have been expected to be on his best behavior, he made such sparkling utterances as:

We have always understood that when times change, so must we; that fidelity to our founding principles requires new responses to new challenges; that preserving our individual freedoms ultimately requires collective action. [A fresh thought, that.]

This generation of Americans has been tested by crises that steeled our resolve and proved our resilience. [What happened to changing when the times change?]

My fellow Americans, we are made for this moment, and we will seize it – so long as we seize it together. [Damn! And here I was just about to seize it myself. I guess I’ll have to wait for a consensus to emerge.]

We, the people, still believe that our obligations as Americans are not just to ourselves, but to all posterity. [Note: not just to some posterity.]

My selection of these idiotic sentiments is as close to random as selection can get; the speech is all like that, although sometimes Obama decides to give you something extra special in the way of metaphor. This attempt always fails. One example may suffice. After quoting the Declaration of Independence, Obama says, “Today we continue a never-ending journey, to bridge the meaning of those words with the realities of our time.” What in the world can those words signify? Picture words, words that have meaning. Now picture bridging that meaning. Huh? Already it makes no sense. But then we’re supposed to picture the bridge as the realities of our time. And this journey to do something with the realities of our time is never-ending? It’s going to last forever? No, it’s all too much for me.

From these heights of metaphor one lands with a thump on the pancake-like flatness of a quickly succeeding passage. This one is about the great discoveries that “we” have made during “our” history: “Together, we determined that a modern economy requires railroads and highways to speed travel and commerce; schools and colleges to train our workers.” Gosh, really? Schools and highways? Glad we determined that requirement.

I have little sympathy with the worldview evoked by President Kennedy’s inaugural address, but it is a work of literature — not great literature, but certainly very respectable. Anyone who, having read that speech, turns to Obama’s reinaugural remarks will be struck by the attempted resemblance. But whatever Obama touches, he disfigures. His speech has as much relation to literature as an advertising brochure. Indeed, it was written for the same purpose. The only literary excellence that Obama ever showed was his curious refusal to speak at Gettysburg on the 150th anniversary of Lincoln’s speech. There’s just one way to explain it. Obama thought he could top John F. Kennedy, but he feared he couldn’t top Abraham Lincoln, and for once a kind of humility came over him. It’s too bad, because that speech would have offered a lot of entertainment.

Even a total absence of “uhs” couldn’t make Harry Reid look like something other than the troll who wanted to eat the billy goats gruff.

Given the glaring weaknesses of Obama’s prose, it is shocking, almost horrifying, that both his friends and his adversaries keep paying tribute to it. His critics, astonishingly, condemn him for his inability to live up to his rhetoric. Here’s Obama foe Rich Lowry, writing in National Review Online: “The launch of HealthCare.gov should cast a shadow over the stirring passage in the president’s second inaugural address where he spoke of how ‘we must harness new ideas and technology to remake our government.’” Pardon me — harness ideas? Technology to remake our government? This stuff is “stirring”? It’s barely intelligible. Before we harness those ideas, do we have to brush them and feed them and make sure they’re well shod? Is that something Obama neglected to do with his healthcare “ideas”?

The biggest contribution that Obama has made to stirring the linguistic pot has been the license he has given to other people who think it’s cool and smart to enact the role of political used-car salesmen. They don’t understand how funny they are. And the comedy leaks from the op-ed page into the news reports. Consider the following from Reuters (Nov. 19):

The rollout of Obama's signature domestic policy has hurt the popularity of the initiative, but the decline has been fairly modest, a Reuters/Ipsos poll showed on Monday.

Forty-one percent of Americans expressed support for Obamacare in a survey conducted from Thursday to Monday. That was down 3 percentage points from a Reuters/Ipsos poll taken from September 27 to October 1.

Opposition to the healthcare law stood at 59 percent in the latest poll, versus 56 percent in the earlier survey.

In other words, once you’ve fallen down the first 56 steps, the next three are only a modest reduction in altitude. After you’ve passed the landing on the 50th step, it’s hard for anything to do much more damage to your unpopularity. But wouldn’t it be funny if you thought you could talk your way upstairs?






Kennedy and Communism


On November 22, it will be 50 years since I sat in my typing-for-infants class and heard a radio voice coming over the PA system. “There are reports,” it said, “that shots were fired at President Kennedy’s motorcade in Dallas, Texas.” My teacher, a model of business efficiency, concluded very plausibly that someone in the principal’s office was playing around with the equipment. Unfortunately, she was wrong.

I can’t say that I regard Kennedy’s death as a world-historical event. He was a brighter and, to me, a much more interesting and sympathetic personality than his kinfolk or most of the other political figures of the time. Several times in his life he faced the virtual certainty of death, and faced it with courage and cheerfulness. He learned enough about economics to advocate a large tax cut that vastly increased the nation’s wealth. He also helped to get us into the Cuban missile crisis — and then rather skillfully got us out of it. I don’t know what he would have done about Vietnam. I do know that he fostered a cult of military masculinity (fifty-mile hikes!) that produced some very sorry thinking and acting. He believed that Robert McNamara was a real smart guy; he had a soft spot for can-do fools like that. The scion of a gangsterlike family, he plotted to make his brother Robert and then his brother Edward presidents after him. He lied habitually and outrageously about almost every aspect of his own life. He accepted the Pulitzer Prize for a book he didn’t write, and became angry when people suggested that he hadn’t written it. There is reason to believe that in 1960 he was able to defeat his good friend Richard Nixon because his allies in Texas and Illinois stuffed the ballot boxes for him. Sadly, the evil part of Kennedy’s legacy was passed along, and amplified; most of the good died with him.

About the assassination I have little to say. To my mind, David Ramsay Steele made a conclusive case for Oswald as the sole assassin; see his article in Liberty in November 2003. Since then, no evidence has been discovered that threatens Steele’s argument, and much analysis has confirmed it. I am bothered, however, by something closely connected with the assassination (but not with Kennedy himself), something that appears not to bother anyone else. It is a strange idea: the idea that communism was never of any significance in America; that either there weren’t any communists or they never really did much of anything (such as killing President Kennedy). Even intelligent and well-disposed people believe this.

Sadly, the evil part of Kennedy’s legacy was passed along, and amplified; most of the good died with him.

But of course there were communists, and they did lots of things. They were very busy bees. It’s not for nothing that the 1930s were once called the Red Decade in American intellectual life, or that a ton of intellectual autobiographies were written from the standpoint of “I was a communist although later I quit.” About communist influence in the popular media during the 1930s and 1940s, take a look at Red Star Over Hollywood by Ronald and Allis Radosh — and even the Radoshes couldn’t get all the red influences into a book. In 1948, the Democratic Party was split by a conflict between anticommunists, communists, and communist stooges; out of it came the Progressive Party, an outfit managed by communists and their friends. Its presidential candidate was the former vice president of the United States, Henry Wallace. In 1956, there were still American intellectuals fighting it out over the issue of whether Khrushchev should have trashed the memory of Stalin.

How does all this connect with Kennedy? The connection is that the person who shot him, Lee Harvey Oswald, was a communist activist. Oswald defected to the Soviet Union and upon returning to the United States became a professional defender of Castro. He denied being “a communist” but proclaimed himself “a Marxist.” He had his picture taken holding a gun in one hand and militant literature in the other; his wife wrote “Hunter of fascists” on the back of it. Oswald lay in wait for and attempted to murder Edwin Walker, a rightwing general. When the fervently anticommunist President Kennedy came to Dallas, Oswald succeeded in murdering him. Now, why do you think he did that? Do you think that communism might not have had something to do with it?

According to most conspiracy theories, however, Oswald either didn’t shoot Kennedy at all, or he was the least important member of a murder group that had nothing to do with communism. The theorists believe that Kennedy was murdered by rightwing CIA operatives, or rightwing oil companies, or rightwing militarists — anyone on the right will do. Even sensible people have trouble with the simple notion that Oswald was a freak for communism. Consider Fred Kaplan, writing for the Washington Post on November 14. Kaplan says that he himself, in his callow youth, accepted various conspiracy theories, only to discover that they weren’t decently based on fact. (I can say something similar about my own intellectual development.) But then he says:

The only remaining mystery, really, is Oswald’s motives — and yet, here too, no convincing evidence has emerged that links his action to the Mafia, the CIA, the Cubans, or anything of the sort. The most persuasive theory I’ve read — first put forth in a New York Review of Books article by Daniel Schorr (later reprinted in his book Clearing the Air) — is that Oswald killed Kennedy, believing the deed would earn him favor with Castro. But who knows? The mystery at the heart of the matter (why did Oswald do it?) remains unsolved.

“Really”? Do people talk this way about Leon Czolgosz, who assassinated President McKinley because Czolgosz was an anarchist and McKinley wasn’t? Do people talk this way about Charles Guiteau, who assassinated President Garfield because Garfield failed to gratify Guiteau’s insane idea that he deserved to be appointed ambassador to France? Do people talk this way about John Wilkes Booth, who assassinated President Lincoln because Booth was a supporter of the Confederacy and Lincoln had just destroyed it? Do people talk this way about . . . oh, why go on? If a member of the American Nazi Party, or the NRA, or even the PTA had killed John F. Kennedy, there would be no “unsolved mystery.”

The real mystery is why even well-meaning, well-educated Americans can’t just accept communism for what it was (and is): a political movement capable of interesting people and inspiring them, even inspiring them to violent action — which it has often praised and rewarded. Oswald killed Kennedy because Oswald was a communist, and acted up to it.

So silly is the cover-up-the-communists routine that the hosts of movies on my beloved Turner Classics are always alleging that someone was “blacklisted” or otherwise injured by “accusations” of communism, without ever wondering — just as a subject of curiosity, now that we’re discussing old so-and-so’s difficult life — whether he or she may actually have been a communist.

And speaking of cultural authorities, I recently (don’t ask me why) looked up the Wikipedia article on Ed Sullivan, the prune-faced impresario of early television song-and-dance shows, and discovered that its account of Sullivan’s life occupies itself mightily with the question of whether Sullivan excluded communists from his program. I have to admit that I am an agnostic on this grave moral issue. If I were Ed Sullivan, maybe I’d have had communists on my show, and maybe I wouldn’t have. I probably would have, if they were good enough dancers — but if you substitute “Nazis” for “communists” in this thought experiment, fewer people would say that my decision should be obvious. But look at what the Wikipedia entry says: “[A] guest who never appeared on the show because of the controversy surrounding him was legendary black singer-actor Paul Robeson, who . . . was undergoing his own troubles with the US entertainment industry's hunt for Communist sympathizers.”
All right; I guess so. But Robeson didn’t need to be “hunted”; everybody knew where he was on the ideological spectrum. And his politics ensured that he had other troubles, of the intellectual and moral kind, troubles far worse than not getting on the Ed Sullivan show. The facts are simple. Robeson had a great voice. He could even act. He was also America’s best-known communist. He was proud of this morally repellent role. Accepting the Stalin Peace Prize in 1953, he said, among many other things:

I have always insisted — and will insist, even more in the future — on my right to tell the truth as I know it about the Soviet peoples: of their deep desires and hopes for peace, of their peaceful pursuits of reconstruction from the ravages of war, as in historic Stalingrad; and to tell of the heroic efforts of the friendly peoples in Poland, Czechoslovakia, Hungary, Albania, Romania, Bulgaria, great, new China and North Korea — to explain, to answer the endless falsehoods of the warmongering press with clarity and courage.

For Robeson’s tribute to the “deep kindliness and wisdom” of Joseph Stalin, go here.

Wikipedia’s own page on the “Political Views of Paul Robeson” does its best for him, but it concludes, “At no time during his retirement (or his life) is Paul Robeson on record of mentioning any unhappiness or regrets about his beliefs in socialism or the Soviet Union nor did he ever express any disappointment in its leaders including Vladimir Lenin and Joseph Stalin. Moreover, only a few sources out of hundreds interviewed and researched by two of his biographers Martin Duberman and Lloyd Brown agreed with the claims made in the mainstream media of Robeson's supposed embitterment over the USSR.”

Why bring these things up? Mainly because there’s a significant historical question at stake: were there communists or not, and were they important or not? That’s enough, but there are political reasons too. The abolition of communism from American history has been a way of arousing sympathy for the authoritarian Left and any ideas or people associated with it. It has been a way of keeping the Left from self-criticism, the kind of criticism that, one is given to believe, would automatically lead to such excesses as “witch-hunts” against “alleged communists.” Denying the presence of communism has been a way of obeying the old slogan, “No Enemies on the Left.” There is a danger here, similar to the danger of forgetting the sometime appeal of fascism.

This month witnessed another anniversary besides that of the Kennedy assassination. Thirty-five years ago, on Nov. 18, 1978, a man named Jim Jones engineered the murder-suicide of more than 900 people, mostly Americans, at Jonestown, Guyana. People think of Jones as some kind of offbeat Christian who got a little more offbeat. What he did is regarded as a warning against religious cultism. But he wasn’t, and it isn’t. Jones was a political agitator who used a pretense of religion — and it was a pretty feeble pretense — to sell what he called “revolutionary communism.” This approach enabled him to become a major player in San Francisco politics. Some of his fellow politicians covered up for him, ignoring or denying his communism; others were actually inspired by him — by his politics, not by his “religion.”

If you go to yet another Wikipedia page — “Peoples Temple” — you will learn a lot of things about this, although you won’t learn why the Jonestown episode isn’t seen as Americans’ most impressive and also most disastrous attempt to build a communist utopia. Yet the take-home message can still be found. It appears in the clichéd slogan that was posted behind the speaker’s stand from which Jones delivered his death decrees: “Those Who Do Not Remember the Past Are Condemned to Repeat It.”

© Copyright 2018 Liberty Foundation. All rights reserved.
