Unsettling Climate Science


The central issue in the maddeningly intransigent climate change debate is equilibrium climate sensitivity (ECS).

ECS measures the climate's response to increasing levels of atmospheric CO2. Specifically, it is the increase in the global average temperature anomaly (GATA) produced by a doubling of the atmospheric CO2 concentration. For climate change policy, nothing else matters. The type and magnitude of phenomena attributable to current and future warming depend on the value of ECS, as does the type and magnitude of appropriate climate policy.

In its latest climate assessment report (the “Fifth Assessment Report,” AR5), the United Nations Intergovernmental Panel on Climate Change (IPCC) states that the ECS "is in the range 1.5 °C to 4.5 °C (high confidence)." If the actual ECS were less than 1.5 °C, future warming would be quite tolerable to humans (though intolerable to the climate change theory of climate cultists). An ECS of 2 °C is a level of anthropogenic global warming (AGW) to which humanity could adapt; indeed, it might be beneficial to humans. An ECS in the neighborhood of 2.5 °C would require more mitigation (e.g., non-trivial reductions in greenhouse gas [GHG] emissions) than adaptation. By 3 °C, AGW changes to catastrophic AGW (CAGW), with extreme climate damage likely. An ECS of 4.5 °C is apocalypse territory. Beyond that, contact Al Gore.
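Since ECS is defined per doubling of CO2, the equilibrium warming implied by any given concentration scales with the base-2 logarithm of the concentration ratio — a standard relationship rather than anything stated here. The minimal back-of-the-envelope sketch below only illustrates that arithmetic; the 280 ppm baseline, the 450 and 560 ppm levels, and the helper function are illustrative assumptions, while the ECS values are the ones discussed above.

```python
# Illustrative sketch (assumptions noted above, not figures from the IPCC report):
# equilibrium warming implied by an assumed ECS, using the standard
# logarithmic relationship  dT = ECS * log2(C / C0).
import math

def equilibrium_warming(ecs_c, co2_ppm, baseline_ppm=280.0):
    """Equilibrium temperature rise (deg C) for a given ECS and CO2 level."""
    return ecs_c * math.log2(co2_ppm / baseline_ppm)

for ecs in (1.5, 3.0, 4.5):  # low end, AR4 "best estimate", high end
    at_450 = equilibrium_warming(ecs, 450.0)   # hypothetical mid-range CO2 level
    at_560 = equilibrium_warming(ecs, 560.0)   # an exact doubling of 280 ppm
    print(f"ECS {ecs:.1f} C: {at_450:.2f} C at 450 ppm, {at_560:.2f} C at doubling")
```

At an exact doubling the result is, by definition, the ECS itself; the point is simply that every downstream warming figure scales linearly with whichever ECS value turns out to be correct.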

Not even scientific uncertainty will stand between John Kerry and an historic treaty enshrining his name.

Now, despite its declaration of high confidence, the IPCC's ECS range is too wide to be of use to policymakers. At the low end, doing nothing seems like a reasonable policy. At the high end, we should move to the mountains, preferably the mountains of Canada, and build dikes around our solar-powered doomsday cities.

The IPCC's 2007 report (AR4) gave an ECS range and a "best estimate" (namely, 3.0 °C). But no best estimate was given in AR5. The reason: a significant discrepancy between observation-based estimates and IPCC climate model estimates. Of 19 observation-based studies of ECS, 11 showed values below 1.5 °C — i.e., below what the IPCC said was the minimum. Could this be the work of the "shoddy scientists" and "extreme ideologues" that John Kerry warned us about — the ones (in this case, the authors of 11 studies) we should not allow "to compete with scientific fact"?

Apparently so. And such "scientific facts" can only weaken Kerry's hand in his climate change negotiations — which require a large ECS to elevate global warming to the status of pandemics, poverty, terrorism, and weapons of mass destruction. According to the New York Times, he wants to be "the lead broker of a global climate treaty in 2015 that will commit the United States and other nations to historic reductions in fossil fuel pollution." Little, not even scientific uncertainty, will stand between Mr. Kerry and an historic treaty enshrining his name. Rest assured that in advancing US interests, Kerry will fully rely on the negotiating skills he has demonstrated in his work on the Syrian chemical weapons deal, the Iranian nuclear weapons agreement, the ISIS coalition structure, and the Israeli-Palestinian peace treaty.

In the meantime, under the auspices of its National Climate Assessment (NCA), the Obama administration is moving forward aggressively with its climate change policies, undeterred by the ambiguity of ECS science. To Mr. Obama, the apocalypse is already in progress. For example, his NCA asserts:

Sea level rise, storm surge, and heavy downpours, in combination with the pattern of continued development in coastal areas, are increasing damage to U.S. infrastructure including roads, buildings, and industrial facilities, and are also increasing risks to ports and coastal military installations. Flooding along rivers, lakes, and in cities following heavy downpours, prolonged rains, and rapid melting of snowpack is exceeding the limits of flood protection infrastructure designed for historical conditions.

Climate havoc of such magnitude corresponds to an ECS exceeding 3.0 °C, putting the planet on the fast track to CAGW, and NCA recommendations on the fast track to trillions of dollars.

If it were true. Recent studies indicate an ECS significantly lower than both the IPCC and NCA estimates. According to the Cato Institute, since 2011, 14 peer-reviewed studies have found the earth to be much less sensitive to CO2 increases than previously thought. "Most of these sensitivities are a good 40% below the average climate sensitivity of the [IPCC] models." The most recent study puts the ECS at 1.64 °C, "a value that is nearly half of the number underpinning all of President Obama’s executive actions under his Climate Action Plan." Such low estimates are hardly the stuff of rapid ice melt, surging sea levels, and extreme weather events. We may be transitioning to DAGW (decrepit AGW; for climateers, disconcerting AGW).

The current, and continuing, warming pause further erodes the NCA position. In defiance of the more than 100 billion tons of CO2 that have been spewed into the atmosphere since 1998, the temperature has not increased. The AGW hypothesis called for it to rise; the CAGW hypothesis called for it to shoot through the roof — as Al Gore demonstrated in An Inconvenient Truth, by propelling himself on a pneumatic scissors lift to ever-loftier heights of temperature. But the GATA hasn't budged.

History is replete with grand schemes that shattered the dreams of the central planners who concocted them.

Climate scientists are aghast. They can't explain the missing heat. At least a dozen possibilities are discussed in “A Sensitive Matter” and “Climate Change: The Case of the Missing Heat.” Some don't even involve CO2. It might be aerosol particles, reflecting heat back into space. It might be clouds. And let's not forget the sun, which has been experiencing a weak “solar maximum.” Perhaps the heat is hiding in the ocean, over 700 meters below the surface, or deeper still. It might be moving around — shuttled by the Pacific Decadal Oscillation (PDO), alternately favoring El Niño and La Niña in 15–30 year cycles.

But, as we read in “A Sensitive Matter,” "it might be that the 1990s, when temperatures were rising fast, was the anomalous period." Or, "as an increasing body of research is suggesting, it may be that the climate is responding to higher concentrations of carbon dioxide in ways that had not been properly understood before." The science grows unclearer.

Possibly more disconcerting than the ambiguous ECS or the perplexing warming pause are the unfounded claims of damage from future warming. General Circulation Models (aka Global Climate Models, GCMs), which are used to project future warming, have consistently overstated temperature trends. They are plagued with flaws that undermine their reliability. The IPCC itself concedes as much. As Steven Hayward observed in "Climate Cultists,"

While climate skeptics are denounced for mentioning “uncertainty,” the terms “uncertain” and “uncertainty” appear 173 times, while “error” and “errors” appear 192 times, in the 218-page chapter on climate models in the latest IPCC report released last September [2013]. As the IPCC admits, “there remain significant errors in the model simulation of clouds. It is very likely that these errors contribute significantly to the uncertainties in estimates of cloud feedbacks and consequently in the climate change projections.”

Why, then, is the Obama administration clamoring for urgent, profligate government action? According to AR5, there is low confidence that today's "sea level rise, storm surge, and heavy downpours" can be attributed to AGW. Nor can droughts, wildfires, and other "extreme weather" events — no matter how many times catastrophists say otherwise. Such events require climate-ravaging temperatures that, having been projected by flawed GCMs, may never be reached. Maybe it's too soon for the wholesale replacement of extraordinarily cheap and reliable fossil fuels with extraordinarily expensive and unreliable wind and solar farms. After all, history is replete with similarly grand schemes that shattered the dreams of the central planners who concocted them (the Soviet Union's collectivization of farming and China's Great Leap Forward come to mind).

But, what if Obama and Kerry are right? After all, AGW is a plausible theory, there has been post-industrial warming (a 0.8 °C increase since 1850), and, through the burning of fossil fuels, humans (especially in China and India, where carbon emissions are sharply rising, while US emissions are declining) pump immense quantities of CO2 into the atmosphere. Who knows? The warming could resume — possibly at the alarming rates assumed in the NCA.

It is precisely this possibility that Messrs. Obama and Kerry flaunt, in making the case for immediate "climate action." The "cost of inaction" is too great, they tell us; we can't afford to wait. The possibility of abrupt and rapid temperature rise, however remote it may be, is of such grave concern that, last June, President Obama used his executive authority (bypassing Congressional approval) to issue new EPA rules requiring US power plants to cut CO2 emissions 30% by 2030. Yet, with full compliance through 2100, these rules would reduce the GATA by an unnoticeable 0.02 °C. But Americans whose electricity is generated by coal-fired power plants will painfully notice that it's the "cost of action" that's too great.

Moreover, according to the EPA’s own model (the Model for the Assessment of Greenhouse-Gas Induced Climate Change [MAGICC]), a 100% reduction in US emissions would reduce the end-of-century GATA by a distressingly futile 0.14 °C. Who could possibly be undisturbed by this result — other than the EPA employee who is, no doubt, in line to receive a Champions of the Earth Award for his inspiration in coining the model's name?

More unsettling are the results of integrated economic and climate models (described in "Examining the Threats Posed by Climate Change") that measure the cost of policy action to mitigate climate damage. For scenarios similar to those assumed by the NCA (e.g., an ECS of 3.2 °C, resulting in a 3.4 °C temperature increase by 2100), the cost to the US economy of global climate inaction (i.e., unmitigated warming through 2100) is a 1.8% reduction in GDP. The cost of global climate action (action that prevents a 2.0 °C GATA increase) reduces US GDP by 3.2%. Thus, with the Obama administration's "climate insurance" investment, the abatement cost could be twice that of the averted climate damage — not unlike the administration’s Solyndra investment, which involved solar panels whose manufacturing cost was almost twice their selling price.

We simply do not know, with any precision, the earth's climate sensitivity.

Very likely, we can afford to wait. NCA plans (carbon regulation, carbon taxes, cap-and-trade, global emissions treaties, etc.) that are meant to control the climate are, at best, an expensive fantasy — a green dereism that is anathema to almost 6 of the 7 billion people inhabiting the planet. The 6 billion have no choice but to burn increasingly large quantities of affordable fossil fuels. Who would insist on draconian climate policies that will be ignored by the vast majority of the world's population; that will have no measurable effect on GATA if they are not enforced globally; and that would cost the US economy twice as much as the damage they avert, if they could be enforced? All this to insure against the possibility that the warming resumes and follows a hellish pace for the next 86 years.

With AR5, the IPCC's fifth attempt to quantify ECS, this most important measure of climate response remains too vague for identifying the appropriate climate change policy. We simply do not know, with any precision, the earth's climate sensitivity. Its obscurity is exceeded only by the idiosyncrasies of atmospheric CO2, the biases of GCM errors, and the cajolement by which countries such as China and India will be brought into emissions compliance. The essence of AR5 is uncertainty, garnished with ambiguity and doubt.

To skeptics (aka deniers, flat-earthers, merchants of doubt), the recent estimates of dramatically lower ECS dictate caution, and possibly a reexamination of the AGW hypothesis. Common sense demands much greater scientific clarity. Use the warming downtime to find the missing heat and the modeling errors — and a better case for urgent, radical action. The integrated modeling results, which show the alarmingly high cost and low effectiveness of such action, make a more compelling case for inaction. Perhaps a reevaluation of present policy is in order.

Not likely. As Hayward noted,

Despite all this, there has been not even the hint of a second thought from the climateers, nor any reflection that their opinions or strategies could bear some modification. The environmental community is so deeply invested in looming catastrophe that it’s difficult to envision a scientific result that would alter their cult-like bearing.

Accordingly, on the day AR5 was released, John Kerry rushed out to declare,

Once again, the science grows clearer, the case grows more compelling, and the costs of inaction grow beyond anything that anyone with conscience or common sense should be willing to even contemplate.

Once again, John Kerry's arrogance grows clearer, and most unsettlingly so to people who believe that precision in climate science should trump hysteria in climate policy — people who, in the contemplation of the Obama administration, are the "extreme ideologues."

Editor's note: Readers are referred to the author’s previous contribution to this subject, Liberty, Oct. 14.



When You Wish Upon a Czar


Two minutes after President Obama gave his political crony Ron Klain the job of Ebola Czar, I got a text message from a friend. He’s a political scientist, so I was expecting him to complain about Klain’s being nothing but a Democratic Party hack, but he didn’t. His comment took an historical turn. What he said was, “If trends continue, America will have more czars than Russia had in its whole history.”

I saw that as a protest, not against the Russian monarchy, but against the current assumption that words prove their worth, not in use, but in overuse. To my friend, a word is valuable because it’s both appropriate and fresh. To many other people, it’s valuable because it’s capable of being used over and over again, in any possible circumstance.

There’s nothing wrong, in itself, about the use of “czar” to mean something like “an official appointed to exercise full power over a designated matter.” Czar is an admirably brief, concrete, imagistically evocative word to express that meaning. But one can be driven to suicide by other people’s overuse of even the finest words. No one wants to hear “I love you” every minute of every day, and certainly no one wants to contemplate an endless sequence of organization charts in which every position is labeled “Czar.”

We don’t consider the fact that “czars” have one important characteristic in common with actual czars: it would take the Bolsheviks to get rid of them.

Consider: the United States now has two czars in the same realm. The first was Dr. Nicole Lurie, whose existence no one remembered until the president started being urged to appoint an emergency preparedness czar. Then we learned that we already had one, and it was Dr. Lurie, who is Assistant Secretary for Preparedness and Response for the Department of Health and Human Services. But that made no difference; another monarch was added to America’s ever-growing College of Czars.

In 1908, when Ferdinand I, Prince Regnant of Bulgaria, proclaimed himself Czar of Bulgaria, his action excited much unfavorable comment from other monarchs. There already was a Czar of Russia, and the general opinion was that one was enough. Contemporary Americans are clearly without that kind of taste and discrimination. We want a czar in every pot. We don’t consider the fact that “czars” have one important characteristic in common with actual czars: they are very hard to get rid of. Even if they’ve finished their job and wiped out Ebola or baseball or whatever else it is they’re supposed to handle, they or their bureaucratic progeny remain in office. It would take the Bolsheviks to get rid of them.

There’s another term that has been spread by the nation’s romance with Ebola — the old but increasingly dangerous abundance of caution. How long those six syllables had, until recently, been incubating deep in our linguistic organs, only the zombies know, but now, suddenly, the contagion is everywhere. Whenever a government official delays some urgent job, it’s out of an abundance of caution. Whenever an American citizen is prevented from exercising his rights, it’s because an abundance of caution led the FDA to deny him a drug, or led the gun suppressors to deny him a permit, or led the cops to arrest him for reminding them of the law, or led the high school principal to tell him not to wear a flag-print t-shirt, thereby offending non-Americans. Once it gets going, abundance of caution can do a lot of damage.

State-friendly terms such as czar, abundance of caution, and of course national crisis have been big winners in this, the Ebola Period of our history. Meanwhile, phrases dear to the hearts of (certain) libertarians have suffered badly — indeed, have virtually disappeared from public use: open borders, freedom to immigrate, right to immigrate, and the like. I confess that such terms have never been favorites of mine. To the disgust of (certain) other libertarians, I have argued at length against the concepts they express (Liberty, October 2006). Those terms will have a difficult time regaining the spotlight now occupied by domestic terrorists, the terrorists’ wacko foreign exemplars, and the Ebola virus. It’s hard to see how a radical immigrationist would answer the question, “Do you mean that Thomas Eric Duncan had a right to enter America and spread a deadly disease?”, or the questions that obviously follow, “So you’re saying that the right to immigrate isn’t universal, after all? So why do you think it’s a right?” We’ll see what the friends of open borders do to revive their favorite words. I’m sure they’ll think of something.

Where would we be without "adults in the room" and the other pseudo-psychological clutter that appears in almost every political analysis?

While they’re thinking, we await in horror the coming election. The political results may be bad or good — more or less crippling to our actual rights — but the linguistic phenomena are already gruesome. A friend recently asked whether American political commentary could do without stupid sports metaphors. The answer is, Apparently not. Where would we be if elections weren’t up for grabs, if the trailing candidate didn’t need to hit a home run, if the leading candidate weren’t trying to run the clock out, or if one of the two parties weren’t just playing DE-fense, never managing to get across the goal line?

And where would we be without adults in the room and the other pseudo-psychological clutter that appears in almost every political analysis? Protestors, for example, never yell and scream; they vent their frustrations; they act them out. Their actions are signals that our communities need healing, and that healing can come only from a therapeutic national conversation or bipartisan dialogue — both parties on the psychiatrist’s couch.

Does Biden understand the poem that he slightly misquotes? Clearly not.

But here I must apologize. At some point in this column, I went out of bounds. I stopped blaming the victims — blaming phrases that started their lives with hope and promise, only to lose it because of community pressure to be something they’re not — and I started displaying my phobias about expressions that were losers to begin with. So I’ll adopt a more proactive stance and pose the challenging question: what would happen if an American public figure actually tried to ignore all insipid current clichés and restore the greatness of the English language, the language of Shakespeare and Emerson and Jefferson, of Emily Dickinson and Robert Frost and . . . oh, maybe, of William Butler Yeats?

Well, here is what would happen, and did happen, when, on Oct. 3, Vice President Biden spoke at the prestigious John F. Kennedy Forum, “Harvard’s premier arena for public speech.” “Folks,” said Biden,

Folks, “all’s changed, changed utterly. A terrible beauty has been born.” Those are the words written by an Irish poet William Butler Yeats about the Easter Rising in 1916 in Ireland. They were meant to describe the status of the circumstance in Ireland at that time. But I would argue that in recent years, they better describe the world as we see it today because all has changed. The world has changed.

There’s been an incredible diffusion of power within states and among states that has led to greater instability. Emerging economies like India and China have grown stronger, and they seek a great force in the global order and global affairs. . . .

The international order that we painstakingly built after World War II and defended over the past several decades is literally fraying at the seams right now.

Now, let’s see. Yeats did write a poem, called “Easter 1916,” about the Irish nationalist Easter Rebellion. His poem suggests that commonplace people were transformed, at least in imagination, by their participation in that failed revolt:

All changed, changed utterly:
A terrible beauty is born.

Does Biden understand the poem that he slightly misquotes? Clearly not. No good poem, and particularly not Yeats’s poem, “describe[s] the status of the circumstance” of something. But does Biden understand his own remarks? Again, clearly not. What terrible beauty could he possibly see in “the status of the circumstance” that he himself describes — “diffusion of power within states and among states that has led to greater instability,” an “international order” that is “literally fraying at the seams” (and can ya believe it, “right now,” too)? That’s not beautiful. It’s not even terrible, in the sense that Biden wants to import from Yeats. A person who doesn’t understand that literally means literally, not figuratively, or that something that was “built” doesn’t have “seams” and therefore cannot “fray” . . . this person should stay as far away as possible from other people’s poetry. We’re used to the vice president’s torrent of clichés; must we now be visited with his attempts to be learned and original?

It’s interesting to speculate how many people would say what they say, if they understood it. Here’s a passage that the vice president presumably wouldn’t like; it’s from a political analysis by Jennifer Rubin, issued on Sept. 30 by the Washington Post. It’s about a number of Democratic senators who may not win their elections. I’ll put the most obvious clichés in italics:

They were napping while the Islamic State surged and were asleep during the wheel for other Obama foreign policy flubs. They didn’t raise any objection to zeroing troops out of Iraq and Afghanistan. They were unmoved by the atrocious Iran interim deal. They were quite happy to watch the sequestration cuts wreak havoc on military preparedness. Now the bill has come due for circling the wagons around Obama.

The quantity of clichés is bad enough, but does she really mean to say that the senators were happy to watch even when they were asleep? Is she really able to picture a cut, much less such a passive, somnolent thing as a sequestration cut, wreaking havoc? Does she really think that people who circle wagons get a bill for it? And what picture was in her mind when she thought of people sleeping during the wheel?

Enough. I’m tired. I’m going to find some wheel to sleep during.






Good-Bye, Uncle Kodie


The recent bankruptcy filing by the Eastman Kodak Company was a shock to me, but not exactly a surprise. It was certainly another sad reminder that the world I once knew was gone forever. I worked for the company at its Kodak Park Works in Rochester, New York, in the 1960s, leaving of my own accord in 1970. A detailed autopsy of its decline and fall must await a soul with perceptions keener than mine. But I suspect its decline was a result of its very domination of the imaging market. With little competition, the company’s leaders had simply forgotten how to compete — which involves adjusting to changing market conditions, which involves making sound and timely decisions.

I hired on as a research chemist in the Organic and Polymer Chemistry Department, part of the Chemistry Division, which was part of the famous Research Laboratories established in 1912 by noted photographic theorist C.E.K. Mees. I had done poorly in graduate school — quantum mechanics and its chemical and philosophical extensions struck me as moonshine. Still, I revealed some small gift for research in organic chemistry. Of course, there were many fine organic chemists at Kodak’s laboratories, some famous, and they sent a steady stream of publications to scientific journals.

In the 1960s, Kodak was riding high. Its most profitable market was the amateur photography market. Its Instamatic camera appeared in the early part of the decade. It was a huge success, and Kodak’s amateur film business was booming. And of course, the company sold film to professional photographers, motion picture film to Hollywood, and X-ray film to the medical profession. Needing chemicals free from impurities that could harm silver halide emulsions, Kodak had long ago begun making its own. And this led it into the successful marketing of research chemicals and polymers through its Distillation Products Industries and their Tennessee Eastman and Texas Eastman Divisions.

The annual wage dividend was a result of a profit-sharing plan begun by George Eastman to discourage socialist tendencies.

I worked in Building 129, and then in Building 82, the latter a brand new research building with the best interior design for the working chemist I’ve ever seen. There were other chemistry laboratories in Building 59, which also housed the Applied Photography and Emulsion Research divisions. These two divisions were considered chimneys to the top administrative positions in the Research Laboratories and to those in the company hierarchy. In those days, the company promoted entirely from within — was this the fatal flaw?

In any event, for the plain old-fashioned organic chemist, the opportunities to learn and grow within the science were enormous. In research at a fundamental level, no one is really certain of what will yield useful results. Some with imposing credentials may think they have an accurate crystal ball, and some may even prove correct in their educated guesses. But in the long run, innovation is best served by research leaders who know when to stand aside — and how to choose people who don’t require nagging supervision. The chemists at Kodak, free to roam, were devoted experimentalists and produced a huge amount of new work, including new synthetic techniques, new reactions, and new organic compounds.

Of course, every chemist took the company course in photography, one that covered both theory and practice. It gave everyone an intellectual nudge toward the practical problems of image-making with silver-halide emulsions. And there was a photography-related testing program. Each new compound was sent to the Emulsion Research Division for testing — was it an antifoggant? A diffusion transfer agent? Did it promote undercut? And from time to time, a request would come back for more of a particular compound that had proved interesting. But I was quite free to explore my favorite field — heterocyclic chemistry. I might have done it all for room and board, but I was paid a decent salary and, in addition, got that famous annual wage dividend. The dividend was a result of a profit-sharing plan begun by George Eastman — to discourage socialist tendencies. Such rewards helped me endure the long snowy winters of Rochester.

Curiously, at each implosion, the mood was festive, and the onlookers cheered. I don’t think the cancelled-stock holders cheered.

The company had accumulated an enormous expertise in the manufacture of photographic film. It had developed the precise system of emulsion coatings, the proper mixes of silver halides, the sensitizing dyes, and the dye couplers for the amazing color processes. And with all this knowledge, and the success of the Instamatic line, came the idea that nothing would ever change. Oh, there was some distant thunder — I recall the suggestion from Varian Associates’ Edward Ginzton that his company was looking for an electric camera. This was back in the sixties, after I had bought some Varian stock. Yet within Eastman Kodak, I heard it said that, like the internal combustion engine (so help me), silver photography was such a perfect invention that it could never be replaced. Perhaps voicing this assumption was a gesture of loyalty to the company. Widely held, it lightened the burden of its top executives. Any far-reaching decisions, however imperative in the light of reality, could be postponed, if not altogether avoided. Still, as early as 1975, the executives had good reason to believe that digital imaging would, sooner or later, replace silver photography as a means of taking pictures. In that year, Steve Sasson, an electrical engineer working in a Kodak laboratory, constructed the first crude digital camera.

Looking back, I can recall seeing signs of fatty degeneration within the company. There were organizational slots being filled, but little work to occupy those who filled them. Some employees seemed to be struggling to find things to do. And I myself wondered, from time to time, why I was there. Perhaps I should have been replaced by an electrical engineer, though I couldn’t have guessed that at the time. I did leave my name on nine published papers and a number of company reports and memoranda, along with some novel unpublished work and two patents — neither patent of any real importance. My papers were sniffed at by certain academics, but I still have a collection of requests for reprints, and the papers are still referenced here and there. Certain compounds I made were superior antifoggants — but fogging isn’t a problem in digital imaging, at least not fogging by allylthiourea.

During my stay at Kodak, one new road to possible profit was almost, but not quite taken. The company hired a professor away from academe to establish a testing program, meant to identify potential drugs among the huge number of new compounds prepared by the organic chemists. But Kodak fired the man not long after it hired him. The “powers that be” decided they didn’t want to get involved in the making and marketing of drugs. I remember being surprised by the firing — having already made organic compounds by the boxful and sent them off to some storage area. I’ve always wondered what happened to those compounds and whether some wonder drug existed among them. Much later, of course, Kodak bought Sterling Drugs, to give it “worldwide infrastructure” — for what exactly? If it had tested its own compounds as potential drugs, it might have made plenty through licensing, without acquiring an enormous debt.

Within Eastman Kodak, I heard it said that, like the internal combustion engine, silver photography was such a perfect invention that it could never be replaced.

As I indicated earlier, the Eastman Kodak Company had for years been more than just a camera-and-film company. Eastman organic chemicals were common in research laboratories everywhere, and the company marketed its manufactured polymers through Tennessee Eastman and Texas Eastman. Yet none of these functions now belong to the parent company — all were “spun off” as the Eastman Chemical Company in 1993. The following year, Kodak sold its remaining interests in Sterling, the drug company it had bought just five years earlier. Its management team had apparently given up on its idea of diversification. It had decided instead to concentrate on its core business — and cast away those profitable but distracting assets. From diversification to downsizing in five years? This is the picture of a floundering management team.

Kodak’s decline had, I’m sure, a terrifying effect on Rochester. The misfortunes of the company nearly erased the value of Kodak stock — and in reorganizing under bankruptcy, the company cancelled the stock. It created new stock shares, but the former stockholders were left with nothing. In the 1960s, the earlier issue had risen above $140 a share, then split and headed upward toward its previous high. The annual wage dividend was calculated, in part, on the value of the common stock, and the company’s stock acquisition plan provided many employees with what they regarded as a nest egg. Local businesses prospered from Kodak’s payroll. I can recall Christmas shopping at the B. Forman Department Store. Mr. Forman would walk the floor, and once, when I told him I worked at Kodak, he said, “Good, you can have the whole store.”

Life was good in those days. Eastman Kodak was not just a company, but a city within a city, a kind of mini-civilization. There was a Kodak Park Athletic Association, whose softball team once had a pitcher named “Shifty” Gears — his feats are now recorded in the National Softball Hall of Fame. And there were the Kodactors, the employees’ prize-winning theatrical group. Many of Kodak’s professional people lived on the same street and attended the same social gatherings. For perhaps too many employees, the company was their world, encouraging the sense of a carefree existence. And it all proved to be a summer before the storm.

From diversification to downsizing in five years? This is the picture of a floundering management team.

Ah, but I recall my years at Kodak as a time of youth and affluence. I took dates to Eddie’s Chop House, heard my favorite piano player, Erroll Garner, at the Eastman Theater, and swam and sun-bathed at Ontario Beach. I recall talking to a Ph.D. candidate who had worked at Kodak and, when he got his degree, planned on returning to “Uncle Kodie.” Alas — Kodak is no longer Rochester’s rich uncle. And the world it created is now, if not gone, then greatly contracted.

Small businesses along State Street have disappeared — their clientele was mostly Kodak employees. From what I’ve read and seen online, Kodak Park, once an enormous manufacturing and research complex on Lake Avenue, is now much reduced. A number of its once-important buildings have been imploded. And curiously, at each implosion, the mood was festive, and the onlookers cheered. I don’t think the cancelled-stock holders cheered.

Markets change — and when markets change, management must respond. As Ludwig von Mises told us long ago, a business makes its profits by adjusting its total business practice to market conditions. Fujifilm, the Japanese photographic company, adjusted competently; Kodak simply failed to adjust with comparable skill. The capacity for sound decisions simply wasn’t there. Kodak’s leaders had the future in their hands, but didn’t recognize it — or found some excuse for evading the necessary decisions. In recent decades, the company had anointed a procession of George McClellans, when what it needed was a Robert E. Lee, or even a Nathan Bedford Forrest. Before stepping aside in March 2014, CEO Antonio Perez was himself a significant drain on the company’s assets. But he did smash and bash the company into some new things, leading it into and out of bankruptcy, drumming up trade in business markets. In the process, silver-emulsion coating became touch-screen technology and color photography became ink-jet printing. And now, the company’s stock is back on the Big Board. There may yet be a life for Eastman Kodak — though I suspect it will be as a mere pebble in a huge cultural and economic crater.

SOURCES
“Antonio Perez.” Forbes. www.Forbes.com/profile/antonio-perez/
“Antonio Perez Won’t Have Many More Kodak Moments.” New York Business Journal, 1 Aug. 2013. www.bizjournals.com/newyork/news/2013/07/31/kodak-ceo-to-resign-after-bankruptcy.html?page=all
Appelbome, Peter. “Despite Long Slide by Kodak, Company Town Avoids Decay.” The New York Times, 16 Jan. 2012. www.nytimes.com/2012/01/17/nyregion/despite-long-slide-by-kodak-rochester-avoids-decay.html?pagewanted=all&_r=0
Brancaccio, David. “Decline of Kodak Offers Lessons for U.S. Business.” Marketplace, 20 Dec. 2011. www.marketplace.org/topics/business/economy-40/decline-kodak-offers-lessons-us-business
DiSalvo, David. “The Fall of Kodak: A Tale of Disruptive Technology and Bad Business.” Forbes, 2 Oct. 2011. www.forbes.com/sites/daviddisalvo/2011/10/02/what-i-saw-as-kodak-crumbled/
Dobbin, Ben. “Digital Camera Turns 30 — Sort Of.” NBC News.com, 9 Sept. 2005. www.msnbc.msn.com/id/9261340/ns/technology_and_science-tech_and_gadgets/t/digital-camera-turns-sort/
“Eastman Kodak Building 23 Demolition.” YouTube, 1 July 2007, inter alia. http://www.youtube.com/results?search_query=eastman+kodak+building+demolition
Feder, Barnaby J. “Kodak’s Diversification Plan Moves into a Higher Gear.” The New York Times, 25 Jan. 1988. www.nytimes.com/1988/01/25/business/kodak-s-diversification-plan-moves-into-a-higher-gear.html
Freudenheim, Milt. “Business People: Senior Kodak Officer to Head Sterling Drug.” The New York Times, 21 Aug. 1988. www.nytimes.com/1988/08/12/business/business-people-senior-kodak-officer-to-head-sterling-drug.html
“Harold (Shifty) Gears.” The National Softball Hall of Fame. www.asasoftball.com/hall_of_fame/memberDetail.asp?mbrid=177
Keeley, Larry. “The Kodak Lie.” CNN Money, 18 Jan. 2012 http://tech.fortune.cnn.com/2012/01/18/the-kodak-lie/
“Kodak to Sell Off Eastman Chemical Company: Restructuring: The Spinoff, Which Will Wipe Out $2 Billion of Debt, Is in Response to Stockholder Pressure.” The Los Angeles Times, 16 June 1993. http://articles.latimes.com/1993-06-16/business/fi-3622_1_eastman-chemical
“Kodak to Sell Remaining Sterling Winthrop Unit: Drug: Smith Kline Will Buy the Consumer Health Products Business for $2.925 Billion.” Ibid, 30 Aug. 1994. http://articles.latimes.com/1994-08-30/business/fi-32940_1_health-products-business
LaMonica, Paul. “The Anti-Kodak: Eastman Chemical.” CNN Money, 27 Jan. 2012. http://money.cnn.com/2012/01/27/markets/thebuzz/index.htm
Mees, Charles Edward Kenneth. The Organization of Industrial Scientific Research. New York: McGraw-Hill, 1920. http://books.google.com/
Miles, Stuart. “The Decline and Fall of Kodak.” Pocket-Lint, 1 Oct. 2011. www.pocket-lint.com/news/42342/kodak-shares-plunge-bankruptcy-fears/
Mises, Ludwig von. Human Action: A Treatise on Economics. Third Revised Ed. Chicago: Contemporary Books, 1966.
Munir, Kamal. “The Demise of Kodak: Five Reasons.” The Wall Street Journal, 26 Feb. 2012. http://blogs.wsj.com/source/2012/02/26/the-demise-of-kodak-five-reasons/
“The Rise and Fall of Eastman Kodak.” The Night Owl Trader, 25 Sept. 2011 and added posts. http://nightowltrader.blogspot.com/2011/09/rise-and-fall-of-eastman-kodak.html
Pfanner, Eric. “Fujifilm Finds Niche With Old-Style Cameras That Mask a High-Tech Core.” The New York Times, 19 Nov. 2013. www.nytimes.com/2013/11/20/business/international/as-digital-camera-sales-sputter-fujifilm-finds-its-niche.html
Scheyder, Ernest. “Focus on Past Glory Kept Kodak from Digital Win.” Reuters, 19 Jan. 2012. www.reuters.com/article/2012/01/19/us-kodak-bankruptcy-idUSTRE80I1N020120119
“Summer Arts Theater Presents Two Plays.” Spencerport NY Suburban News, 23 July 1964. inter alia. (The Kodactors). http://fultonhistory.com/



You Be the Judge


I like a character-driven film even more than a film with a good plot. Happily for viewers, The Judge offers both: it's a satisfying courtroom whodunit encased in a family drama portrayed by two powerhouse actors who convey the complex, competitive, and often painful relationship between a father and a son.

Hank Palmer (Robert Downey, Jr.) is one of those clever, conniving, cutthroat attorneys who put lawyers at the bottom of the list of most-trusted occupations. When he asks for a continuance in a difficult case because his mother has died, the prosecutor asks cynically, "How many times has your mother died this year?"

The truth is, only once. Hank's mother has indeed died, and he leaves immediately, alone, to attend her funeral. He has not visited his family since leaving home after high school.

Hank's father, Joseph (Robert Duvall), is the judge of the film's title. He approaches the law in a way opposite to his son’s — not as a game to be manipulated but as a protective force to be honored and upheld.

Hank's first stop when he arrives in town is his father's courtroom, where he sits in the gallery unnoticed to watch his father at work adjudicating a case. Clearly he admires and has been shaped by his father. He even followed in his father's career path.

Downey and Duvall are giants of nuance; we read more in their eyes than we hear from their lips.

Nevertheless, when Hank shows up for the funeral, Joseph can barely contain his disdain, and Hank can barely control his eagerness to get away. We learn that Hank hasn't been home in 20 years, yet he had stayed in contact with his mother (by phone) and was close enough to confide in her the most intimate part of his life — his failing marriage. Why? What happened? Of course we blame the harsh, domineering father. But when Joseph is arrested for a hit-and-run that occurs on the night after the funeral, Hank jumps from the first-class seat in which he is preparing to fly out of town and rushes to oversee the judge's defense.

Hank hates his father bitterly, yet he clearly admires him, loves him, and can't stand the thought of him going to prison. As other details about Hank's childhood emerge we realize that something has happened between them that was so deeply scarring that neither of them has been able to address it directly. It holds them together even as it drives them apart.

Director David Dobkin reveals all of this to us in flashes and snippets — not all at once, because the characters themselves can't face it all at once, and he wants us to feel their aversion. He wants us to feel how hard it is for them to talk about it or even to remember it.

A film like this could easily devolve into sentimental drivel, but it does not, largely because of the skills of its actors. Downey and Duvall are giants of nuance; we read more in their eyes than we hear from their lips. Jeremy Strong, as the mentally disabled brother Dale, and Vincent D'Onofrio, as the older brother Glen, who remained in the small hometown to hold the family together while Hank went on to defense-attorney glory, also deliver powerful performances.

At almost 2 1/2 hours, The Judge is a bit long, but if you have father-child issues of your own that deal with judging and being judged (and who doesn't?), you will find this film an absorbing and compelling opportunity for reflection and catharsis.


Editor's Note: Review of "The Judge," directed by David Dobkin. Team Downey, 2014, 141 minutes.





Public Servants


I’ve always liked the comedian Paul Mecurio. He’s a smart, funny, attractive guy. The other day, when I was surfing around and landed on Fox’s soft-news program “Outnumbered,” I found that he was the guest, so I decided to watch.

Someone on the show said that President Obama behaves as he does because he “doesn’t like America,” and Paul got upset and said that he often doesn’t agree with Obama himself, but he didn’t like that kind of thing to be said about a man who has devoted his life to “public service.”

His statement came as a shock — not to the people of Fox, but to me. I’ve been listening to talk about “public service” all my life, but hearing Obama called a public servant made the concept seem even stranger than it had before.

Who, besides government employees, especially politicians, is associated with “service”? Who “serves” other people? Well, for instance, people in restaurants; they serve the public. They’re even called “servers.” So what, if anything, do a politician and a waiter or a waitress — a public servant and a servant of the public — have in common? That’s the question I asked myself, and one question led to others.

The last time you went to a restaurant, did you see your server punching, kicking, and biting the other servers, for the privilege of waiting on your table? Did your server claim to be the only person qualified to do so? Did you see him passing out money to the other diners, so they would choose him to wait on them? Or did he just promise them good jobs, cheap but perfect healthcare, and lavish retirement benefits? When you sat down, did he deliver an hour-long speech, saying how glad he was to see you and how much he had already done for you?

When you objected, did your server call the police and have you arrested for “hate speech”?

If you came with your children, were they ushered into a back room to be educated about how great the servers were? If you objected, were you sharply reminded that “this is the law”? After you’d been there a while, did you notice that many of the tables were filled with people who were eating and drinking but never appeared to receive a check? Did you notice that when they were rowdy and disruptive, the servers went to them and apologized for the disapproving looks that other diners cast in their direction? Did you notice that when the server brought your food, he first gave half of it to the people at neighboring tables?

When you read the menu, did you notice that many of the advertised dishes had been labeled “Unconstitutional,” and dishes with new and unfamiliar names had been penciled in? If you ordered filet mignon with the chef’s special sauce, did your server return with a cold turkey burger and an empty ketchup bottle? If you ordered a good cabernet, were you told that anything but grape juice was available only by prescription? If you complained about the food, did your server refuse to comment, because the matter was under investigation?

In the middle of your meal, did the servers suddenly head for the windows and start shooting at the restaurant next door? Did they grab all the young males in the place and use them as human shields? When the firing mysteriously ceased, did they demand a loan to cover the unexpected cost of ammunition?

When you studied the bill, did you notice that after your waiter added up the surcharges, special surcharges, seat rental fees, menu licensing fees, and other sources of revenue not previously mentioned, you were paying 15 times more than the amount listed next to the items you ordered — which, again, your server never brought you? When you objected, did your server call the police and have you arrested for “hate speech”?

Did those things happen to you? No? They didn’t? The people serving you never did any of those things? Then perhaps there is a difference between public servants and people who actually perform a service to the public. And perhaps it’s time we clarified our vocabulary.






Updated Aphorism #6


chambers succeed






Hong Kong: Democracy and Liberties


As I write (October 15), protestors in Hong Kong are still trying to make the city more democratic and to wean it off Chinese government influence.

Protestors were seen cleaning up after themselves and even helping out the police with umbrellas during downpours. Indeed, HK is one of the most civilized places I have been to, and I visit several times a year. Despite its congestion, people respect your space and are hard-working, making it one of the freest, safest, and most competitive places in the world.

China itself is a communist dictatorship, or so it is believed. When the UK transferred the administration of HK to China in 1997, the world was convinced that China would destroy HK’s liberties. Between 1997 and 2003, the HK property market fell between 30% and 50%, and in some areas even more. There was a mass migration to Canada, Australia, New Zealand, and the UK.

Democratic pressures lead to a consistent increase in the size of government, as the majority insists on getting more and more from the pockets of wealth-generators.

By 2003, the realization had set in that the Chinese Communist Party had no intention of destroying HK’s liberties. HK continued to boom and stayed one of the freest places in the world. China not only did not flood HK with mainland Chinese, as had been suspected, but maintained a visa regime like the one that had existed before it took over: even today it is Chinese who need a visa to visit, not Indians, the stark enemies of China. Those who had left HK for good started returning. Businesses, the stock market, and the general economy boomed.

Within HK, you could speak, shout, and write against China and the Communist Party, on the streets and in the parliament, and still find yourself feeling as secure as you would in a similar situation in Canada or the UK.

International observers — from social democrats to believers in the free market — sacrificed their integrity when they refused to admit that their forecasts about what China would do with HK had been proven wrong. They refused to express respect toward China for how well it had maintained HK. Even a criminal deserves fair treatment.

But should HK not get democracy, more liberties, and freedom of speech?

People’s understanding of democracy is utterly twisted, in an Orwellian sense. “Begging the question,” they treat liberty and democracy as synonymous. As defined, “democracy” is a system in which the government is elected, in some form, by the majority of people. By itself the concept says nothing about institutions of liberty and the size of government.

The fanatic believers in democracy, despite the common failure of democracies around the world — in Afghanistan, Iraq, Iran, Nepal, Pakistan, and more recently in Libya and Egypt — refuse to see the shallowness of their New Age religion. They refuse to see that democratic pressures lead to a consistent increase in the size of government, as the majority insists on getting more and more from the pockets of wealth-generators. This invariably leads to an overall reduction of liberties and relegates the majority to the culture and mentality of beggars.

The bazaar of bribes was conducted openly, without an iota of fear. People were groveling and pleading. The bureaucrats were demeaning these people and shouting at them.

But what about the freedom of speech and liberties that democracy promotes? As a student at my university in India, I could be beaten up without any moral hiccups if that was what the majority decided. These days, I podcast interviews with people from around the world, to discuss cultures. Most of my contacts feel flattered and are happy to talk. The country with the highest refusal rate for interviews is democratic India. In fact, the rate is close to 100%. In India, you can speak against systemic corruption, as long as you do so in vague, broad terms, although what really matters in any fight is to pinpoint the corruption of specific institutions. Hardly any Indian will talk to me about specific corruption.

Institutional corruption entangles people, for they must be a part of it even if they hate it, if they want to survive. Last week, I was in a government office in India. There were more private “facilitators,” to help navigate the corruption, than bureaucrats. The bazaar of bribes was conducted openly, without an iota of fear. People were groveling and pleading. The bureaucrats were demeaning these people and shouting at them. Where are liberties and freedom of speech in the world’s biggest democracy?

Should it be so difficult to understand that democracy and liberties are not synonymous?

If you want freedom of speech and other liberties, you must fight for better institutions, preferably private and non-democratic and hence unpoisoned by the majority who care less for virtues and more for material pleasures.

Or let’s consider the world’s second biggest democracy and the most passionate proselytizer, the land of the free, the USA. Americans can talk freely about broad, amorphous subjects. But can they talk about specific ones? How many people can claim to speak their minds openly about race, native Indian issues, the sexual orientation of others, women, etc.? And how many fail to speak freely because they fear they might end up on the no-fly list or in the records of the CIA, or that an unhappy government might initiate IRS audits? When at American airports, I make sure I don’t utter certain words — even in an innocent sentence — to avoid having a SWAT team descend on me. The lack of freedom of speech has become so institutionalized in the minds of Americans that they don’t even realize what they don’t have.

In comparison, non-democratic Hong Kong is a freewheeling place where people have the freedom to say what they think. Hardly any country in the world does better. Only those prepared to fool themselves or incapable of deeper thinking conflate freedom of speech with democracy.

Another way in which international society — the secular but fanatic believers in democracy — has lacked integrity is its failure to recognize that some of the best improvements in liberties and economic growth have appeared in non-democratic countries: HK, China, Singapore, and Macau. Korea and Taiwan grew the most when they lacked a proper democratic system. So did Japan and Chile. I struggle to find a nation in recent times that has begun to succeed under democracy.

Our lack of integrity is not just a standalone vice. It detaches us from seeing the truth, from weighing the situation properly and assessing what must be done to improve society.

But given the liberties and higher intellectual environment in HK — as I concede above — should its people not have the right to vote freely for their own government? Aren’t the students and people of HK among the most civilized people anywhere?

It is an error to believe that what people say is what they want. The fever of democracy has now been sweeping the world for a few years. This is not a demand for more liberties or an improvement in human rights, as the protestors seem to think, but in essence a demand for a magic wand, to get something for nothing.

People should fight for more liberties and an even smaller government. But “democracy” will take them in the opposite direction.

Collectives and mass movements are based on such desires, and it is an error to expect higher ideals from them. Ready to follow unexamined romantic ideas, the students of HK are supporting leftist elements. The parliamentarian Leung Kwok-hung, a Che Guevara lover, shouts and protests against the Chinese regime openly and without fear while he is in HK; I wonder whether he would allow the same liberties to others if he came to power in a democratic Hong Kong.

One of the worst political disasters of recent times has been to give the vote to students. However good they might be, they simply lack the life experience to understand the relationships between ideas and, if they do, to weigh them based on their importance. They lack the experience to comprehend life in its complexities. Formal education at best is about learning the alphabet of life. But life must be lived and experienced to create prose from this alphabet. Moreover, education around the world, including HK and Singapore, indoctrinates students in what must be accepted as beliefs. And it is the “progressive” agenda of those in the West and their wishy-washy Marxist ideology that is now a matter of faith among students around the world. HK’s recent movement is heavily influenced by this.

So, what should Hong Kong do, if not fight for more liberties? HK has perhaps the smallest government in the world and is among the freest societies. Even then it’s worth reducing the size of its government, one hopes to nothing. Yes, indeed, people should fight for more liberties and an even smaller government. But “democracy” will take them in the opposite direction. Moreover, fighting on the street is always a wrong start, for it presumes that the protestor can infringe on other people’s liberties, to somehow gain larger liberties for everyone. Our path must be in sync with our goals. What one sees in HK today is the path backwards.






Principles of Climate Science Estimation Theory

 | 

At the People's Climate March last month, a throng of boisterous protestors trudged through the streets of Manhattan, demanding that elected officials finally begin treating climate change as a top priority. "Climate Action Now," demanded a popular sign. With such climate change luminaries as UN Secretary General Ban Ki-moon, former Vice President Al Gore, comedian Chris Rock, and actors Leonardo DiCaprio and Mark Ruffalo in attendance, the climate cause's message would be heard loud and clear, at last. The size of the crowd (estimated at anywhere from tens of thousands to 400,000, and, according to MediaMatters, "by far the largest climate-related protest in history") moved NYC mayor Bill De Blasio to hope that this time it would be a “turning point moment” in sounding the alarm of climate change — an outcry that, to De Blasio and fellow climateers, had the auditory effect of "the science is settled" being shrieked 400,000 times.

Secretary of State John Kerry, who has equated global warming with weapons of mass destruction, was also hopeful. In town to attend a separate, private climate-change event, Kerry expressed an optimism "that world leaders [would] come to the United Nations to recognize this threat [global warming, not WMDs] in the way that it requires and demands." An ardent believer in settled science, Mr. Kerry may have overestimated its power when he urged governments to exploit "the small window of time that we have left in order to be able to prevent the worst impacts of climate change from already happening." Few stand in greater awe of science than John Kerry.

And there was no shortage of Superstorm Sandy reminders, testifying to the rising sea levels that will inundate such cities as New York. "We're seeing storms that are devastating the East Coast and the Gulf Coast,” cried Ricken Patel, the executive director of the march. “We're seeing flooding that's threatened this city and many others.” “Cut your emissions or you'll sleep with the fishes," warned a popular sign. To all in attendance, it was time to build dikes.


How high should we build them? The current Intergovernmental Panel on Climate Change (IPCC) estimate is about two feet, unless one is designing for the worst-case scenario, which is three feet. These are estimates (from the IPCC's latest climate assessment report, AR5, released in September 2013) for global mean sea level rise (GMSLR) by the year 2100. More recently, the Obama administration's National Climate Assessment (NCA) has given two much higher estimates. The first, which assumes that humanity will adopt NCA recommendations for curbing CO2 emissions, is three feet. The second, which assumes that humanity will ignore its recommendations, is six feet. That is, the dikes should be six feet high.

In his 2006 Academy Award-winning documentary, An Inconvenient Truth, Al Gore estimated a 20-foot sea level rise, driven by rapidly melting Arctic ice. In 2007, as he accepted the Nobel Peace Prize for his climate change speculations, Gore exclaimed, "The North Polar ice cap is falling off a cliff," estimating that "it could be completely gone in summer" by 2013. James Hansen, the father of anthropogenic global warming (AGW), estimated a similar but more sinister rise: the current linear GMSLR trend will change to exponential growth (a dog-whistle term, invoking unimaginable imaginary rage from the climate cult) with the approach of 2100.

But the accuracy of such estimates — of accelerated ice melt flow abruptly raising global sea levels — is not without controversy. In a 2007 hearing by the House Committee on Science and Technology, IPCC scientist Richard Alley testified that "on this particular issue, the trend of acceleration of this flow with warming, we don’t have a good assessed scientific foundation right now."

Testifying again, in 2010, Dr. Alley discussed climate "tipping points" (another cultist dog-whistle), stating that "available assessments . . . do not point to a high likelihood of triggering an abrupt climate change in the near future that is large relative to natural variability, rapid relative to the response of human economies, and widespread across much or all of the globe. However, such an event cannot be ruled out entirely."


Then there is the suite of General Circulation Models (GCMs) — climate simulations used by scientists to estimate the magnitude of future climate havoc, and used by politicians as the scientific basis for estimating the magnitude of their agendas. Such simulations have demonstrated little predictive value. Despite the IPCC's resounding 95% certainty (the gold standard, said CNN) of AGW and Kerry's assurance (another gold standard) that "the science has never been clearer," levee designers would do well actually to read AR5, especially where it states that “there remain significant errors in the model simulation of clouds. It is very likely that these errors contribute significantly to the uncertainties in estimates of cloud feedbacks and consequently in the climate change projections.”

Nevertheless, many of us are reluctant to dismiss the infernal claims of the catastrophists. After all, their estimates are generated by highly sophisticated and complex computer simulations. Who cares if the models are deeply flawed? It feels like they are accurate. How else can extreme weather events (storms, droughts, wildfires, famines, violent crime, terrorism, etc.) be explained? Besides, we've seen the melting Arctic — over and over again, every summer. And, God have mercy, the beleaguered polar bears, waiting despondently for the ice that will never return, and their consequent extinction. More alarming is some scientists’ claim that West Antarctica is beyond saving. Are we only left to hope, along with John Kerry, that science can prevent it "from already happening"?

Hope may not be enough. The phrase "cannot be ruled out entirely" leaves the door open for larger estimates. It is the door to cataclysm, through which Dr. Alley — the voice of reason, under oath — scurried in a Mother Jones interview last May, when he estimated that the melting of the West Antarctic ice sheet "will unleash a global Superstorm Sandy that never ends." Combined with a Greenland ice melt (next in line for catastrophe), which will be equivalent to "the storm surge caused by Supertyphoon Haiyan," this could produce, according to Alley’s estimates, a sea level rise of 33 feet — apparently unleashing a Super Hurricane Sandy and Super Typhoon Haiyan that never ends. Alley went on to claim that if governments continue to "fiddle and do nothing," then the entire continent (Antarctica) would melt; he estimated that "someday, it would reward you with as much as 200 feet of sea level rise."

It seems that the scientific foundation Dr. Alley discovered as a basis for these estimates, the foundation that was missing in 2007, was lost again the following month, when it was reported that Antarctic sea ice, which has been increasing since sea ice extent measurements began in 1979, reached a record level. And, while it is true that the Arctic sea ice extent has been decreasing since 1979, it began to rebound in 2013 — ironically, the very year Mr. Gore picked to mark the end of its summer ice. The Arctic sea ice extent at the end of this summer's melt season was 48% greater than that of 2012. Over the past two years, annual Arctic ice has increased dramatically in both area (up 43 to 63%) and volume (up 50%).

These developments have led some scientists to conclude that "the Arctic sea ice spiral of death seems to have reversed." Yet they have led others to invoke CO2, ecologism's god of climate, which is supposedly planning to rid the Arctic of summer ice "by September 2015" — just in time for next year's ice melt season and, given the now-expected resumption of Arctic summer tours, idyllic climate change vacations, with happy climate changers photographing forlorn polar bears and retreating glaciers.

Such a rapid climate reversal would be seen as a mystical event by climate cultists. It would certainly mystify John Kerry, not to mention Al Gore, whose standing as a climate prophet would be restored (what's a two-year error in climate forecasting?). It would end the warming pause — now in its 16th year, befuddling our best climate scientists, who can't explain how the more than 100 billion tons of CO2 that have been belched into the atmosphere since 1998 have produced no warming — and the yearning of catastrophists for the return of rising temperatures. In that coming warmth, they will revel in their bombastic estimates of danger and their equally alarming prescriptions (i.e., humanity's penance) for saving the planet.


But there is growing evidence that next September may be too early for celebration. The apocalypse might be postponed. The sluggish rise in sea level that began around 1850 (at the end of the Little Ice Age, when sea level was low, and could be expected to rise) remains sluggish. Many people (possibly everyone who actually read AR5) should find that the IPCC's estimate of GMSLR is not supported by the evidence it provided. For example, the IPCC analysis assumes that the accelerated sea level rise beginning around 1970 was the result of anthropogenic forcing. But the sea level rise from 1910 to 1950, a period during which human influence was not "the dominant cause of the observed warming," was of similar magnitude. Several recent studies (e.g., American Meteorological Society, Environmental Science, and the National Oceanic and Atmospheric Administration) agree, finding no evidence of a global warming influence on sea levels, and estimating a GMSLR of less than 5 inches per century.

Thus, after more than 25 years of intense climate research, the estimated end-of-century sea level rise is somewhere between 5 inches and 20 feet; but it could be 33 feet, and 200 feet cannot be ruled out entirely. Thanks, climate scientists, for settling the science. But what's the safe dike height?

Unfortunately, politicians, the de facto gurus of climate science, think that they know. Trampling over the principles of climate science (principles for estimating the rate of warming and its human component), they jump with alacrity to unprincipled estimates of human attribution and government remedies of future warming — all of them inexplicably precise. But the vast majority of climate scientists agree, we are told.

The search for scientific truth to inform climate change policy has become, however well-intentioned, a campaign of public deception to promote a political agenda. Can an agenda whose success depends on unrelenting estimates of looming catastrophe, ceaseless exploitation of fear, and infantile suppression of debate (the “consensus,” the “settled science,” the vilification of skeptics, etc.) be expected to do more than provoke record-breaking climate change marches, demonstrations of science-illiterates and the willfully uninformed? Is climate change policy based on sound science, designed to ensure our safety, or is it based on green hysteria, maintained to ensure an omnipotent government state? The liberal French philosopher Pascal Bruckner (in “Against Environmental Panic”) suspects the latter: a cynical ideology in which "All the foolishness of Bolshevism, Maoism, and Trotskyism are somehow reformulated exponentially in the name of saving the planet."

Are the new climate Cassandras (Obama, Gore, Kerry, et alia) principled climate change heroes, seeking scientific truth? In Bruckner's estimation, it might be that "these are not great souls who alert us to troubles but tiny minds who wish us suffering if we have the presumption to refuse to listen to them. Catastrophe is not their fear but their joy." It cannot be ruled out entirely.






Yum

 | 

This election year has been full of odd little funny things. It’s like a buffet loaded with wilted salads and overcooked chicken — but down near the end of the table they’ve laid out some tiny, tasty desserts.

One of these delicious offerings was the response of someone named Alison Lundergan Grimes, the Democratic nominee for Senate from Kentucky, when she was asked, in an editorial conference at the Louisville Courier-Journal, whether she had voted for the much-detested-in-Kentucky Barack Obama. Over and over, she refused to provide an answer, blathering instead about what the election is really about, slamming her opponent, and saying that she “respect[s] the sanctity of the ballot box” (an odd way of indicating that although you want to go to the Senate and vote all the time, you won’t say how you voted for president). In this case, the hardcore partisan decided (and it was a decision, because her response was immediate and well-rehearsed) that hiding from her own party allegiance was worth the price of looking like a clown. Either that, or she’s so stupid she didn’t realize that she’d look like a clown. Anyway, it was a hilarious performance.

A few days before, Republican activists had secretly recorded conversations with activists in the Grimes campaign, including a major donor. They then shared these conversations with the national audience — chiefly comments about how Grimes was lying all the time about her support for the state’s leading industry, coal. Her reason? Otherwise, you dope, she would never get elected!

Who can withstand the force of arguments like that? Who can resist the comedy of people working in a moralistic cause while espousing a philosophy of amoralism?

Even funnier was an editorial statement that appeared in the Louisville Courier-Journal, in which the paper’s politics writer explained why it wouldn’t publish anything about the Republicans’ conversational adventures. You can read the statement for yourself and assess the reasons. But the first thing you’ll notice is that in explaining why the paper won’t run the story, the writer goes ahead and recites the whole thing!

As the waiters say: here’s your dessert — enjoy!






All We Really Need to Know . . .

 | 

Kindergarten was a lot of fun, but I’m glad it’s over. Some people liked it so well they wish they’d never left. A few give every indication that they wish they could go back. I think a great many really need to.

In 1988, a Unitarian minister named Robert Fulghum published a bestselling book entitled All I Really Need to Know I Learned in Kindergarten. I’ve only read excerpts from it, so I can’t be sure of the author’s intention. From the parts I’ve seen, my guess is that he agreed with me.

I used to think that growing up was, you know, some sort of goal; it was the state of being that was ultimately desired by most human beings. The only alternative I could envision, as a child, was dying before I got old enough to be an adult. That didn’t seem like a very attractive option.

But our government, in its infinite benevolence, offers us another one.

The dominating State doesn’t want us to be adults, because adults are independent and think for themselves. It wants us to remain forever little children. It doesn’t even mind that we might be oversized brats, because then it has a ready excuse to use whatever force may be necessary to control us. Because of this, it directs much of its efforts toward treating us like children. And when we’re persistently treated in this way, most of us are going to behave like children.

That, of course, gives the State an excuse to go on treating us like kids, and on and on it goes. None of us wants to think that we are anything less than adults. But we see all those other people out there carrying on like toddlers, so we easily become convinced that for the sake of us grownups, the government must be stern and parental with them, just to keep them in line.


Libertarians annoy people, because we tend to remind them of the things they learned in kindergarten and then, evidently, forgot. Most people think they remember everything they learned in kindergarten. It’s all those other fools who need to be reminded. When libertarians remind them of the basics, they’re insulted. But they really ought to humor us. All those other poor fools need every reminder they can get.

Among the admonitions issued by the Rev. Fulghum are that we must share everything, play fair, not hit people, clean up our own messes, and never take things that aren’t ours. There are more rules — 16 in all — but those are the ones that absolutely must be remembered if we are to have a harmonious society. If we don’t always flush, wash our hands before eating, consume warm cookies and cold milk, or take a nap every afternoon, we might be a little tired and somewhat unhygienic, but most people will never know. And putting things back where we found them, saying we’re sorry when we’ve hurt people, watching out for traffic, and holding hands and sticking together pretty much go along with the most important suggestions. The others — living balanced lives, being aware of wonder, remembering that we will all die, and just looking — we either figure out over the course of our years on this planet or suffer the consequences ourselves.

But libertarianism is an even simpler philosophy. It boils everything down to basic logical and moral principle. It can be gunked-up and expanded into all sorts of things, many of them complicated and some even crazy. Those who, for whatever reason, dislike the notion that others might enjoy the same degree of freedom they want for themselves seem to have an extra bone in their heads that blocks them from understanding libertarian ideas.

It especially irks “progressives” — civilized, evolved, peaceful, and nonviolent as they want to think they are — to be told that when they resort to government action against people they dislike, they are using violence. It isn’t being administered directly, because they aren’t going out and shooting them or personally threatening them with guns, so they don’t want to see the connection. When libertarians patiently explain that the State has guns, bombs, tanks, police dogs, and now drones, means of force that it uses with ever-increasing frequency even on its own citizens, they pretend that’s just a technicality. No doubt they even want to believe it.

They have fallen so totally in love with government intervention in every dispute that they are actually all about aggression. Instead of progressives, they could more accurately be called aggressives.

When I debate this with aggressives on political blogs, the argument always runs something like this: “They [whoever they are, though almost always conservatives] are bad people. So we must hit them.” It’s never articulated this plainly, but that is what their argument always comes down to.

That’s the reasoning of a 5-year-old — a 5-year-old who has either yet to enter kindergarten or flunked it. And when this is pointed out to them, however gently, they almost invariably resort to calling people names and using profanity. They may think this makes them look more grown up, but it makes them look like seriously delinquent 5-year-olds. In an era when their favorite means of settling disputes was more readily employed, they’d have been hauled out behind the woodshed and paddled.

“But-but-but,” goes the standard whine, “they do it, too!” Johnny’s mommy lets him, so why can’t I?

As for conservatives, they are frankly authoritarians. They groove on violence. They can’t understand why 5-year-olds aren’t still being hauled out behind the woodshed and paddled. Johnny’s mommy probably takes him to the playground with an Uzi on her shoulder. This is the attitude they want to emulate?


How much aggression can a progressive society tolerate? That is not a trivial question. If everybody in a society behaves like a kindergartener, is real progress possible? Can such a society even function on a basically civilized level?

Libertarians may be annoying, but they’re raising a concern it behooves any serious progressive to consider. How, for example, can we withdraw from imperialistic military adventures in other countries if we see violence as the solution to absolutely every problem? If all we have is a hammer, as the saying goes, will everything in the world, at home as well as abroad, not look like a nail? How we behave at home, toward one another, does in large part determine how we behave abroad.

And if we can muster no greater fellow-feeling for other people in our own country, how on earth are we to deal with those in faraway lands with genuine compassion? There’s also a lot to the saying that charity begins at home.

I may be horribly misguided, but I’ve always been under the impression that progressives wanted to be “the adults in the room,” as they often say. That they believed human beings needed to continue evolving from a more primitive and childish state to a higher consciousness. That they wanted to keep the torch of the Enlightenment lit and moving forward through the generations. Yet increasingly they carry on like the studio audience of Captain Kangaroo.

Their response to nearly every situation is, indeed, to use government force. Not as a last resort — as may occasionally be necessary, out of self-defense, when their adversaries insist on using force against them — but as the very first and only resort. Without even trying, as one of their heroes, John Lennon, so famously sang, to “Give Peace a Chance.”

Another holy word in the progressive vocabulary — ranking right up there alongside peace — is democracy, in which they claim fervently to believe, and for the sake of which they can apparently justify almost anything they do. But without the sort of mutual respect and willingness to listen, to share everything, play fair, and not hit people that we were supposed to have learned in kindergarten, democracy is impossible. As are peace, equality, justice, and everything else that self-professed progressives say they favor.

Our school years, even the later ones, often seem to have been meaningless. “When I think back on all the crap I learned in high school,” sang Simon and Garfunkel, “it’s a wonder I can think at all.” But some of that stuff was, indeed, meaningful — and what we learned in kindergarten actually may have been some of the most important stuff of all. They gave it to us early not because it was OK if we forgot it, but because it would be most fundamental to our lives from that time on.

Do we know enough to read the writing on the wall? Will we awaken to the realization that only in a society where everyone’s rights and freedoms are respected can anyone’s be safe? If not, that moving finger’s message on the wall will spell not progress, but doom.

Any society that has degenerated into a gigantic, unruly kindergarten will eventually find itself deprived of freedom. The jackboots will step in to restore order. For the big-moneyed backers of big government — those who actually benefit from it, those whom it ensconces in power — this is undoubtedly the plan. I wonder when “progressives” are going to wake up and see that.

I know it will happen eventually. They’ll figure it out sooner or later. I only hope that later doesn’t turn into too late.






The Film You’ve Been Hearing About

 | 

Normally I go to a movie theater with a pen in my hand and a notebook in my lap. Yes, it requires me to break away from the universe created on the screen, but it’s a small price to pay on behalf of my readers. Ten minutes into Gone Girl, however, I put both away and settled back for the ride. Don’t even bother to fasten your seat belt — you’ll want to feel every twist and turn.

It’s a beautiful sunny morning when Nick Dunne (Ben Affleck) arrives at The Bar with a board game under his arm and begins bantering with the barmaid Margo (Carrie Coon), who turns out to be his twin sister. Soon Nick’s phone rings. It’s his neighbor, and he rushes home. His cat is outside. The door is ajar. The glass coffee table is upturned and shattered. There’s a speck of blood on the range hood. And Amy, his wife — his girl — is gone.

Gone Girl is a “whodunit” in the tradition of the best classic murder mysteries but with a modern twist that keeps the audience guessing all the way to the end. Not only do we not know who done it; we don’t even know the answer to “done what?” Amy (Rosamund Pike) is gone, and someone has mopped up a pile of her blood from the kitchen floor. But without a body, homicide detective Rhonda Boney (Kim Dickens) can’t make an arrest. Meanwhile, there’s a boatload of possible suspects in the vicinity, including Nick’s mentally unstable father (Leonard Kelly-Young), Amy’s overachieving parents (David Clennon and Lisa Banes), Amy’s spurned former boyfriends (Neil Patrick Harris and Boyd Holbrook), the neighbor down the street who claims to be Amy’s best friend (Casey Wilson), and even Nick’s oh-so-close twin, Margo.


Dark, good-looking, and lantern-jawed, Ben Affleck was obviously cast for his striking resemblance to Scott Peterson, who was tried in the media (and then in court) for the murder of his pregnant wife, Laci, after she went missing on Christmas Eve, 2002. In both the movie and the Peterson case, a wandering pet alerted neighbors that something was amiss. In both, the husband was alone on the water when his wife went missing. In both, the parents of the missing woman supported their son-in-law (until the girlfriend showed up). And in both, the cable news networks made it their lead story every night.

In many ways this story is an indictment of the “trial by media” that has become a regular staple in the daily diet of the news. Simply put, sensationalism sells. “We all know” that JonBenét Ramsey was killed by her father. Unless it was her uptight mother. Or her creepy stepbrother. (Choose a team.) Ed Smart was a prime media suspect in the disappearance of his daughter, Elizabeth, until she was found, alive, nine months later. (To his credit, Sean Hannity came to believe Smart’s story and gave him plenty of competing air time.) Casey Anthony was acquitted of the murder of her little girl, but “everyone knows” she did it; we reviewed the evidence night after night on cable, even before her trial began. Amanda Knox, a college student studying in Italy, was convicted of the murder of her roommate, Meredith Kercher, in part because she was seen kissing her boyfriend and sitting on his lap while waiting to be interviewed by the local police. She just didn’t look distraught enough. And “we all know” what that means.

But we also know what a camera can do in the blink of a lens. Someone snaps a candid photo from across the room while you are in the middle of saying a word or while you are squinting into the sun, and you look angry or sullen or goofy. Someone stands next to you for a photo or a selfie, and you automatically smile, no matter what you are feeling inside. You see a friend across the room, and you smile as you wave hello, even if the occasion is as somber as a funeral or a trial. It’s automatic, even when you’re upset. Someone says, “Smile,” and you do. You just do. And Greta Van Susteren takes it upon herself to broadcast that photo and give it an entire backstory.

The beauty of Gone Girl is that you just don’t know which of the snapshots to believe, or what is going to happen next. It’s a thrill ride of epic proportions, and I’m not going to spoil it for you by saying another word.

Unfasten your seatbelt. It’s going to be a gloriously bumpy ride.


Editor's Note: Review of "Gone Girl," directed by David Fincher. Regency Enterprises-Pacific Standard, 2014, 149 minutes.





Updated Aphorism #5

 | 






From Books to Film

 | 

Liam Neeson has made a name for himself in the last few years as the Old Geezer of action heroes in movies known for their simple plots, video-game action, and one-word titles such as Taken, Unknown, and Non-Stop. . . . Regular readers of Liberty know that I’m rather taken with the Taken films, regardless of their simplicity.

But Neeson is more than just a rugged face with a powerful punch; he’s a classically trained actor with more than a dozen major awards and two dozen major nominations, so it’s nice to see him back in a role that allows him to flex his acting muscles again. A Walk Among the Tombstones is a cool, atmospheric crime drama based on a series of novels written by Lawrence Block that feature former-cop-turned-private-investigator Matt Scudder. Scudder is also a former-drunk-turned-recovering-alcoholic.

In Walk, Scudder (Neeson) is the privatest of private eyes; he doesn’t have a license and operates outside the law. He is driven by a mixture of justice and revenge, garnished with a twist of guilt over a tragedy that occurred while he was a cop. This combination can become a dangerous cocktail. Like Mel Gibson’s character in the Lethal Weapon series, Scudder doesn’t have a strong survival instinct. In some ways, in fact, he sees death as a welcome escape — and this adds to the tension in the film. Contributing to the tension are the unexpected and jarring juxtapositions of beauty and horror that lift the quality of the filmmaking and enhance the viewers’ expectations. Especially effective is the way the AA 12-Step affirmations are used at a significant point in the film.

Initially Scudder rejects the job of tracking down the ruthless pervs who have kidnapped and then gruesomely murdered the wife of wealthy drug dealer Kenny Kristo (Dan Stevens). He knows Kenny just wants the pleasure of killing them slowly — and gruesomely — and he doesn’t want to be a part of that. But he is drawn into the case when he realizes that the men might be serial killers who will strike again. The film then becomes a race to find the killers and stop them.


In the books Scudder works closely with police detective Joe Durkin, and the movie role of Durkin was originally cast with Ruth Wilson as a female Joe. But director Scott Frank decided that Scudder’s character is more realistic as a brooding loner, so Joe Durkin was cut from the story line — after Wilson had already been filmed in many of the scenes! (That these scenes aren’t even missed is a tribute to the film’s editor.) Elaine, Scudder’s call-girl girlfriend in the novels, is conspicuously absent as well. Instead, Scudder interacts with a homeless, tech-savvy, African-American teenager named TJ (Brian “Astro” Bradley), who serves to soften Scudder in the eyes of the audience while emphasizing the lack of family connection in Scudder’s life. Cutting the two once-prominent characters was a smart move for the movie, despite their integral parts in the novel series.

A Walk Among the Tombstones is a throwback to the heyday of gritty crime movies. I almost expected to see Popeye Doyle show up for a car chase under the elevated train. The acting is superb at every level. Dan Stevens is especially good as Kenny Kristo, the grieving and vengeful husband; his intense, icy eyes glare up under heavy brows in every scene, convincing us that he is capable of anything. Astro (who was discovered as a young rapper on The X Factor a couple of years ago) is also effective as the homeless kid with the optimistic outlook and poignant determination. And it’s always good to see Liam Neeson step out of his one-word, one-tone, one-dimensional action romps to remind us that he’s still an A-list actor.


Editor's Note: Review of "A Walk Among the Tombstones," directed by Scott Frank. Cross Creek Pictures, 2014, 114 minutes.





Whatever Happened to His Nobel Prize?

 | 

I’ve been asking my friends a question. It’s a question that should have occurred to me before, but it hit me rather suddenly a few days ago, during President Obama’s fulminations about what he was going to do to ISIS (“ISIL,” in his chronic though unexplained vocabulary). I couldn’t answer the question, so I began asking other people.

The question is: whatever happened to Obama’s Nobel Peace Prize? I mean, when was the last time you heard anybody mention it?

I can only speculate about the last occasion when I heard of it. I imagine it was mentioned when Obama was destroying the government of Libya and replacing it with another one (and that turned out well, didn’t it?). But I don’t actually remember anybody bringing it up. I would also imagine that someone mentioned it when Obama was campaigning for reelection on the claim that he had killed Osama bin Laden. Again, however, I can’t specifically recall anyone drawing attention to the Nobel Prize. The Prize for Peace, remember.


Then came the Drone Wars, with more brags from Obama about liquidating his enemies. Then his first attempt at invading Syria, with all those statements about drawing lines in the sand. I can’t remember any discussion, at the time, of the peculiar moral and intellectual evolution experienced by the Nobel laureate. Then came . . .

You get the picture. I can’t identify anyone who discussed that issue, ever. Of course, there must have been someone who did. I can’t read everything.

So when we got to Obama’s ISIS bombing campaign, I started asking other people. Nobody could remember any references, printed or televised, to a Nobel Prize for Peace. A few said they hoped that meant it was all a bad dream — Obama, the prize, everything. A few wanted to debate what Obama should have done about the prize in the first place. Some thought he should have refused it, saying he wanted to do something to deserve the honor, which he hadn’t had the opportunity to do as yet; or saying that as the president of a country that often needs to protect itself by engaging in military force, he would be hypocritical if he accepted a prize for Peace. I’d favor the first option, myself. I think it would have been the best public relations move a president ever made. But what’s obvious to me isn’t obvious to Obama.

Anyway, since my friends couldn’t remember any references to the irony of Obama the peace-prize man, I started monitoring my TV more closely. I have yet to encounter the faintest allusion to Obama’s Nobel Prize. Indeed, everyone seems to be studiously avoiding it. To specify just one example: Peter Baker, a big guy at the New York Times, prattling to CNN on Sept. 29. The subject was promising for a Peace Prize mention: Baker had been invited to discuss the president’s inability to describe his actions regarding ISIS as warfare, not just “being in a war environment” and so on. So now, I thought, Baker will certainly mention the Prize. Now he’ll have to mention the Prize. But no. He dished out the usual statements about Obama’s wanting to be “a peace president,” as his interviewer said, but he never even got close to a Nobel Prize.

I hope this means that the Nobel Prize has become irrelevant. I mean, Al Gore got one. I also hope that Obama is becoming irrelevant. But I’m afraid that what is now irrelevant is the human memory.

For memory’s sake, therefore, I wish to specify, for the record, that according to the Nobel Prize website, “the Nobel Peace Prize 2009 was awarded to Barack H. Obama ‘for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples.’”

Well, that’s all right. They gave him the prize about one second after he became president. How did they know what would happen afterward?




