The Reality of “Emerging Markets”


The British Empire was the largest in history. At the end of World War II, Britain had to start pulling out. Ironically, a major part of the reason was the economic prosperity that had come through industrialization, massive improvements in transportation, and the advent of telecommunications, as well as the ethnic and religious tolerance, freedom of speech, and other liberties the empire offered.

After the departure of the British — as well as the French, the Belgians, the Germans, and other European colonizers — most of the newly “independent” countries suffered rapid decay in their institutions, stagnant economies, massive social strife, and a fall in standards of living. An age of anti-liberalism and tyranny descended on these ex-colonized countries. They came, rightly, to be known as third-world countries.

The blame — at least among those on the Right — went mostly to socialism and the rise of dictators. This is not incorrect, but it is merely a proximate cause.

An armchair economist would have assumed that these ex-colonized countries, still very backward and starting from a very low base compared to Europe, would grow economically at a faster rate. Quite the contrary: as time passed, their growth rates stayed lower than those of the West.

Clarity might have been reached if people had contemplated the reason why Marxism and socialism grew like weeds in the formerly colonial countries.

According to conventional wisdom, the situation changed after the fall of the socialist ringleader, the USSR, in the late ’80s. Ex-colonized countries started to liberalize their economies and widely accepted democracy, leading to peace, the spread of education and equality, the establishment of liberal, independent institutions, and massive economic growth sustained over the past three decades. The “third world” would soon be known as the “emerging markets.”

Alas, this is a faulty narrative. Economic growth did pick up in these poor countries, and the rate of growth did markedly exceed that of the West, but the conventional narrative confuses correlation with causality. It tries to fit events to ideological preferences, which assume that we are all the same, that if Europeans could progress, so should everyone else, and that all that matters is correct incentives and appropriate institutions.

The claimed liberalization in the “emerging markets” after the collapse of the USSR did not really happen. Progress was always one step forward and two steps back. In some ways, government regulations and repression of businesses in the “emerging markets” have actually gotten much worse. Financed by increased taxes, governments have grown by leaps and bounds — not for the benefit of society but for that of the ruling class — and are now addicted to their own growth.

The ultimate underpinnings of the so-called emerging markets haven’t changed. Their rapid economic progress during the past three decades — a one-off event — happened for reasons completely different from those assumed by most economists. The question is: once the effect of the one-off event has worn off, will the so-called emerging markets revert to the stagnation, institutional degradation, and tyranny that they had lapsed into soon after the European colonizers left?

In the “emerging markets” (except for China) synchronized favorable economic changes were an anomaly. They resulted in large part from the new, extremely cheap telephony that came into existence (a result of massive cabling of the planet done in the ’80s) and the subsequent advent of the new technology of the internet. The internet enabled instantaneous transfer of technology from the West and, by consequence, an unprecedented economic growth in the “emerging markets.”

Meanwhile, a real cultural, political, and economic renaissance started in China. It was an event so momentous that it changed the economic structure not just of China but of the whole world. Because China is seen as a communist dictatorship, it fails to be fully appreciated and respected by intellectuals who are obsessed with the institution of democracy. But now that the low-hanging fruit from the emergence of the internet and of China (which continues to progress) has been plucked, the “emerging markets” (except, again, for China) are regressing to the normal: decay in their institutions, stagnant economies, and social strife. They should still be called the “third world.”

There are those who hold China in contempt for copying Western technology, but they don’t understand that if copying were so easy, Africa, the Middle East, Latin America, and South Asia would have done the same. They were, after all, prepared for progress by their colonial history. European colonizers brought in the rule of law and significantly reduced the tribal warfare that had been a daily routine in many of the colonies — in the Americas, Africa, the Middle East, and Asia. Britain and other European nations set up institutional structures that allowed for accumulation of intellectual and financial capital. Western-style education and democracy were initiated. But this was helpful in a very marginal way.

For those who have not travelled and immersed themselves in formerly colonized countries, it is hard to understand that although there was piping for water and sewage in Roman days, it still does not exist for a very large segment of the world’s population. The wheel has been in existence for more than 5,000 years, but a very large number of people still carry water pots on their heads.

It is not the absence of technology or money that is stopping these people from starting to use some basic forms of technology. It is something else.

A remark often attributed to Churchill, although unverified, has more than passed the test of time: “If Independence is granted to India, power will go to the hands of rascals, rogues, freebooters; all Indian leaders will be of low calibre and men of straw. They will have sweet tongues and silly hearts. They will fight amongst themselves for power and India will be lost in political squabbles. A day would come when even air and water would be taxed in India.”

Europeans of that time clearly knew that there was something fundamentally different between the West and the Rest, and that the colonies would not survive without the pillars and cement that European management provided. With the rise of political correctness this wisdom was erased from our common understanding, but it is something that may well return to haunt us in the near future as expectations for the third world are disappointed and those who immigrate to Europe, Canada, Australia, and the US fail to assimilate.

For now, the hope among those in the World Bank, the IMF, and other armchair intellectuals has been that once the correct incentives are in place and institutions have been organized, imposed structures will put the third world on a path to perpetual growth. They couldn’t be more wrong.

The cart has been put before the horse. Institutions emerge from the underlying culture, not the other way around. And cultural change is a process of millennia, perhaps longer. As soon as Europeans quit their colonies, the institutional structures they left began to crumble. Alas, it takes a Ph.D. from an Ivy League college and a quarter-million-dollar salary at the World Bank or IMF to miss the key issue in development economics and institutional failure: the missing ingredient in the third world was the concept of objective, impartial reason, the basis of laws and institutions that protect individual rights. This concept took 2,500 years to develop and become infused into the culture, memes, and genes of Europeans — a difficult process that, even in Europe, has never been completed.

European institutions were at root a product of this concept. Despite massive efforts by missionaries, religious and secular, and despite the institutions imposed on poor countries, reason failed to be transmitted. Whatever marginal improvement was achieved over 200 to 300 years of colonization is therefore slowly but surely being undone.

Without reason, the concepts of equal law, compassion, and empathy do not operate. Such societies simply cannot have institutions of the rule of law and of fairness. The consequence is that they cannot evolve or even maintain the Western institutions that European colonizers left behind. Any imposed institutions — schools, armies, elections, national executives, banking and taxation systems — must mutate to cater to the underlying irrationalities and tribalisms of the third world.

In these “emerging markets,” education has become a dogma, not a tool; it floats unassimilated in the minds of people lacking objective reason. It has burdened their minds. Instead of leading to creativity and critical thinking, it is used for propaganda by demagogues. Without impartial reason, democracy is a mere tribal, geographical concept steeped in arrogance. All popular and “educated” rhetoric to the contrary, I can think of no country in the nonwestern world that did well after it took to “democracy.”

The spread of nationalism (which to a rational mind is about the commonality of values) has created crises by unifying people tribally. The most visible example is what is happening in the Middle East, but the basic problem is the same in every South Asian and African country. It is the same problem in most of South America. India, the geographical entity I grew up in, has rapidly been collectivized under the flag and the anthem. It might eventually become the Middle East on steroids, once Hindutva (Hindu nationalism) has become well-rooted.

In Burma, a whiff of democracy does not seem to have inhibited Buddhists’ genocide against the Muslim Rohingya. Thailand (which wasn’t colonized in a strictly political sense) has gone silent, but its crisis hasn’t. Turkey and Malaysia, among the better of these backward societies, have taken paths of rapid regression to their medieval pasts. South Africa, which not too far in the past was seen as a first-world country, got rid of apartheid, but what it now has is even worse. The same happened with Venezuela, which in the recent past was among the richer countries of the world. It is ready to implode, as may Brazil one day. Pakistan, Bangladesh, Nepal, and East Timor are acknowledged to be in a mess, and are getting worse by the day. Indonesia took a breather for a few years and is now again on the path of fanaticism. India is the biggest democracy, so its problems are actively ignored by the Western press, but they won’t be for long, as India continues to become a police state.

Botswana was seen as a country whose growth was among the fastest for the longest. What was ignored was the fact that this rather large country has a very small population, which benefited hugely from diamonds and other natural resources. The top political layer of Botswana is still a leftover from the British. The local culture continues to corrode what was left by them, and there are clear signs that Botswana is past its peak. Papua New Guinea was another country that was doing reasonably well, before the Australians left. It is now rapidly regressing to its tribal, irrational, and extremely violent norm, where for practical purposes a rape is not even a crime.

The world may recognize most of the above, but it sees these countries’ problems as isolated events that can be corrected by a further imposition of Western institutions, under the guidance of the UN or some such international (and therefore “noncolonialist”) organization. Amusingly, our intellectual climate — a product of political correctness — is such that the third world is seen as the backbone of humanity’s future economic growth.

Unfortunately, so-called emerging markets are headed for a chaotic future. The likeliest prospect is that these countries will continue catering to irrational forces, particularly tribalism, and that they will consequently cease to exist, disintegrating into much smaller entities. As their tide of economic growth goes out with the final phase of plucking the free gift of internet technology, their problems will surface rapidly, exactly when the last of those who were trained in the colonial system are sent to the history books.

Buchanan the Wicked?


Morbid curiosity tempts me to buy and read Nancy MacLean’s new book Democracy in Chains, but I have resisted so far; for I don’t want to add to the unearned wealth that her book’s notoriety will probably bring her. I know enough from reviews, favorable and unfavorable, and from a published interview with the author herself, to understand that one main theme is the supposed wicked influence of James Buchanan.

I knew Buchanan very well from 1957, when, as Economics Department chairman, he brought me to the University of Virginia. There I was his academic colleague and friend. After he left the University of Virginia (in honorable protest against the University administration’s maltreatment of a colleague), I kept in contact with him and often saw him at professional meetings and occasionally at his homes in Blacksburg and then Fairfax.

Buchanan took economics seriously. He drew inspiration from his admired professor at Chicago, Frank Knight, and from the writings of the Swedish economist Knut Wicksell. He encouraged the creative thinking of his graduate students. He was a fabulously hard worker whose collected writings fill 20 large volumes and whose Nobel Prize was amply deserved. He wouldn’t waste time on conspiracies and was no apologist for the wealthy and powerful.

I understand Buchanan’s economic and political philosophy quite well, for my own is close to his. He was more of an egalitarian than I am, favoring an extreme estate tax and pondering redistributionary taxation as an arrangement whereby people insure one another against economic distress. While not completely agreeing with John Rawls, who called for social and economic arrangements designed to maximize the welfare of the least-well-off stratum of the population, he admired Rawls and his writings.

As for his thought on limits to democracy, I could expound it at length and enthusiastically. He admired the American Founders, who wisely tried to create a constitutional republic charged with protecting people’s rights even against abusive majorities and government itself. (“Democracy” is a much abused word, sabotaging clear thought by cramming various and even inconsistent good things together under a single label.)

In short, James Buchanan was an entirely different person from the one Ms. MacLean imagines. She did not bother to learn what she was writing about. Yet historians and journalists have a professional duty to check the truth of what they write.

Bring On the Trillion Dollar Coin!


Most Americans regard the federal deficit and the national debt as a single problem. In reality, they are two separate but related problems. Let’s decouple them and discover whether the widely disparaged idea of a “trillion dollar coin” would actually be an improvement over the practice of continuous borrowing to cover the federal government’s deficit.

Hard money is money backed by a tangible good, typically gold or silver. Fiat money is money backed only by the “good faith and credit” of the government issuing it. In the United States, fiat money comes in two flavors. The vast majority of US currency consists of Federal Reserve Notes and their electronic equivalents, backed by government bonds sold to the public and various central banks. These bonds pay interest that is booked as an expense within the government’s annual budget. For fiscal year 2016 interest payments on these bonds totaled $432 billion, or more than $2,800 for each income tax return filed.
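The per-return figure above can be sanity-checked with simple arithmetic. A minimal sketch — the $432 billion interest total is from the article, but the roughly 150 million income tax returns filed is my assumption, not the article’s:

```python
# Back-of-envelope check of the "more than $2,800 per return" figure.
interest_fy2016 = 432e9   # FY2016 interest on bond-backed debt (from the article)
returns_filed = 150e6     # assumed number of income tax returns filed

per_return = interest_fy2016 / returns_filed
print(round(per_return))  # 2880 -- consistent with "more than $2,800"
```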

The second type of US fiat money consists of all coinage and a small amount of paper money with the designation “United States Note” rather than “Federal Reserve Note.” This type of paper money, issued mostly in $2 and $5 denominations, circulated alongside Federal Reserve Notes until the 1970s, and is still occasionally found in circulation today. Coins and US notes are not backed by government debt and pay no interest.

A few years ago, when Republicans in Congress were refusing to raise the national debt ceiling, an idea was floated for minting a platinum coin with a face value of one trillion dollars. This was and still is technically legal, thanks to a 1996 law authorizing the minting of platinum bullion and proof coins. The law empowers the Secretary of the Treasury to strike platinum coins in any denomination that he or she deems appropriate. The idea was that the trillion dollar coin would be minted and deposited at the Federal Reserve, which would then credit the government’s account with a trillion dollars. The government could then spend this newly created money by “writing checks” on this account without having to increase the national debt ceiling or issue additional interest-paying bonds.

The idea died when Republicans caved in and agreed to raise the national debt ceiling. Fast forward to 2017, and now it’s the Democrats who are playing budget brinksmanship in an effort to force President Trump to restore funding for many of their pet causes, such as environmental projects and Planned Parenthood. Currently the fight is over the legislation needed to avoid a government “shutdown” by the end of April. Shortly thereafter, Congress must deal with raising the national debt ceiling. Many Democrats can be expected to oppose such an increase if Trump is unwilling to fund their most critical spending priorities. By teaming up with conservative Republicans who oppose on principle any increase in the national debt, congressional Democrats would likely have the votes to block any debt ceiling increase and thus threaten another government “shutdown.”

However, President Trump has the option to do an end run around the Democrats’ plan by dusting off the “trillion dollar coin” idea and actually implementing it. He needs no additional congressional authority to do so, since the enabling legislation is already in place. This would be a bold move with far-reaching consequences, most of them good.

For starters, it would deprive the Democrats of their most potent legislative weapon in their drive to maintain and increase spending on programs that subsidize and empower their core constituencies. Defeating the Democrats’ plans would not eliminate the deficit, but it would lead to less government spending than any plan forged by a “bipartisan consensus.”

More importantly, the “trillion dollar coin” would sever the link between mounting federal deficits and ever-higher interest payments on the national debt. Freezing and then lowering these interest payments are essential to the nation’s economic health, as this interest is a substantial drain on disposable income and productivity.

All government-created fiat money is inflationary, and money backed by hard assets would be preferable. But fiat money backed by “trillion dollar coins” is neither more nor less inflationary than fiat money backed by interest-paying bonds. Of the two choices, the “trillion dollar coin” option is better for both taxpayers’ pocketbooks and the nation’s economic health.

Rothbard’s Mistake


Being interested in the history of the 1930s, I recently picked up a copy of America’s Great Depression by the influential libertarian Murray Rothbard (1926–1995). I choked on the introduction, where Rothbard lays out his theory about theory, which makes no sense to me.

“This book rests squarely on the Misesian interpretation of the business cycle,” he writes, referring to the theories of the older libertarian economist, Ludwig von Mises (1881–1973). “Note that I make no pretense of using the historical facts to ‘test’ the truth of the theory. On the contrary, I contend that economic theories cannot be ‘tested’ by historical or statistical fact. These historical facts are complex and cannot, like the controlled and isolable physical facts of the scientific laboratory, be used to test theory . . . The only test of a theory is the correctness of the premises and of the logical chain of reasoning.”

Philosophers make a distinction between statements that are valid and statements that are true. Validity is like math. It’s about logic. If P then Q. It’s theory. Truth is about what’s real, which is not the same thing. Logic is useful, but ultimately what we care about is what’s real.

I am reminded of the accounting classes I took many years ago. I gave up on accounting, but one thing has stuck in my mind: the professor described accounting as a map of the “territory” of a firm, and warned us not to confuse the map with the territory. The “map” might say the company is making money, but the truth might be that it runs out of cash before the owners are paid. (As a business journalist I wrote about some companies like that.) The map is useful; to steer the company you need the map. But you have to keep in mind that it sometimes lies, or maybe tells you a truth different from the one you need to know.

Back to Rothbard. He says that an economic theory is “a priori to all other historical facts.” It can be used to explain the historical record, but it cannot be tested. Here is his argument:

Suppose a theory asserts that a certain policy will cure a depression. The government, obedient to the theory, puts the policy into effect. The depression is not cured. The critics and advocates of the theory now leap to the fore with interpretations. The critics say that failure proves the theory incorrect. The advocates say that the government erred in not pursuing the theory boldly enough, and that what is needed is stronger measures in the same direction. Now the point is that empirically there is no possible way of deciding between them. Where is the empirical “test” to resolve the debate? How can the government rationally decide upon its next step? Clearly, the only possible way of resolving the issue is in the realm of pure theory — by examining the conflicting premises and chains of reasoning.

This strikes me as piffle. There are several ways of deciding between the two claimants. You can compare what happened at times when the policy was imposed with what happened at times when it wasn’t. You might compare the depression of the 1930s with the depressions of 1920–21 or 1893–97 or 1873–79, etc., and see that the one in the 1930s featured the slowest recovery in US history. That is evidence (not proof) that whatever policies were tried didn’t work too well. You can dig deeper. How did investors, entrepreneurs, company managers, workers, and other people in the 1930s respond to the National Recovery Administration? To mass unionization? To the retained-earnings tax? To the abandonment of gold? What did supporters and opponents predict the players would do, and what did they do?

Robert Higgs asks these kinds of questions in Depression, War, and Cold War. You can reject what he does — none of his arguments amounts to a drop-dead test such as you find in a chemistry lab — but they are ingenious. They are instructive. They make a case.

Rothbard argues, in essence, that such questions are too messy to answer. A theory cannot be “tested” in the way a question in chemistry can be “tested” by heating compounds in a test tube. He’s right in thinking that you can’t test that way with economic policies, but it doesn’t mean that “empirically there is no possible way of deciding between them.” You can look at what lawyers call “the preponderance of the evidence.” “Test” is a high-hurdle word, the wrong word. You can evaluate. You won’t get to 100% certainty, but it’s unlikely that you’ll be stuck at 50-50, either. You can decide, but you have to look at the territory as well as your map — and you may find yourself correcting your map to make it fit the territory better.

Essentially Rothbard denies this.

“Clearly,” he asserts, “the only possible way of resolving the issue [of choosing the best economic policy] is in the realm of pure theory — by examining the conflicting premises and chains of reasoning.” In other words, the only way to decide what to do “in the territory” is to pick the best-looking map without looking at the territory.

No, no, no! Because the social life of humans is more complicated than a test tube, and because cause and effect are mixed up and piled on each other, you have to check your “map” against the territory all the time. Because your theory is only an approximation. A simplification. It is not life.

Praxeology is not primary. Supply and demand curves are not reality.

To quote the science fiction writer Robert A. Heinlein: “What are the facts? Again and again — what are the facts?”

If you say, “I don’t care about what facts you have. What experiences, or what statistics, or anything. I have my theory, I’m sure it’s right, and I don’t need to ‘test’ it,” you become irrelevant. You become ignorable. You become the frog at the bottom of the well.

Manna from Heaven


When we talk of economics, we often do it by means of labels and mantras. Discussing economic subjects in this way means that we do not fully discuss them; we just use words and phrases that suggest preconceived notions. I think this is because economics is predominantly political, and “political” is another way of saying “snake oil sales.”

One mantra that I often hear is people’s invocation of a Robin Hood morality, the morality of robbing Peter to pay Paul: Robin Hood cared for the poor downtrodden (Paul) with the wealth he stole from the fat cats (Peter). What is ignored about this fairy tale is that Peter is the lord of the land who uses his governmental authority to confiscate the property of Paul, the peasant. Robin is a hero because he fights the totalitarian government of Peter to return confiscated wealth to oppressed taxpayers.

What got me thinking about the labels that political commentators use in discussing economics was Hillary Clinton’s assertion that Donald Trump’s plan to cut taxes in order to revive the economy was just “Trumped up trickle-down.” “Trickle-down” is the label often used by the political enemies of leaving wealth in the hands of CEOs and others of corporate administrative rank. The “trickle-down” label comes from the idea that these people spend the wealth hiring workers to construct whatever their companies’ products may be. Thus, wealth “trickles down” from the wealthy administrators to the needy workers.

But what is the government’s economic system of high taxes and “wealth redistribution”? In its intention, the wealth redistribution system is also trickle-down. In this system, government takes the place of corporate administration. It accumulates wealth — by taxation. This wealth is then supposed to trickle down to the subjects of the government, by means of redistribution programs. So, why is trickle-down bad when wealth trickles down from company administration, but good when it trickles down from government?

The feudal system that I mentioned when talking about Robin Hood was actually a wealth redistribution system. But in such systems, does wealth really trickle down? “Trickle-down” is appropriate to the sales pitch used by politicians when they claim that they intend to do such things as pay for infrastructure, education, and retirement. However, the wealth redistribution system is, in fact, trickle-out. “Trickle-out” means that the government takes wealth from its subjects and distributes it to its preferred lobbyists — think military contractors, Elon Musk, and Planned Parenthood. Does the wealth ever get back to the subjects? Well, some does, but the amount that the subjects get is inversely proportional to the number of lobbyists who get some of the wealth before it makes its way back.

The lobbyists and their clients reward the government by giving back some of the loot they received, prompting politicians to increase their take by selling more and more “economic stimuli” to the public, as if they were actually providing some kind of free food.

In the book of Exodus, God gives the children of Israel a miraculous food called manna, which is meant to sustain them on their journey out of servitude to the king of Egypt. In the modern form of this story, politicians claim the place of God: they sell themselves as all-powerful beings that you need to take care of you. They prefer this story about themselves to the reality of “trickle-down,” which is how we truly get our bread from heaven. In every light rain, water trickles down from above; this water is the food for plants, and thus the origin of our daily bread. And I think this is why politicians hate trickle-down economics: our food comes from sources beyond their control. This kind of economics dethrones them from their delusion of almighty power; and it exempts us — if we reflect on it — from our dependency on them.

Customer Care


Going Halfsies

Unfair Competition from Robotland


This campaign season brings many complaints about “shipping jobs overseas.” Candidates promise to crack down on the offending corporations. American workers and the United States as a whole must compete on a slanted playing field against foreigners paid much below a dollar an hour. Moreover, the foreigners manipulate their currencies. They buy less from us than we from them, putting the US into a trade deficit (more exactly, a current-account deficit) costing us many billions of dollars a year. China, Japan, and Mexico count among the worst offenders. Free trade is fine, but only when it is fair.

In a similar but imaginary scenario, technology has advanced so far that “Robots” (in a stretched sense of the word) displace American workers at costs equivalent to Robot wages of 50 cents an hour. What is the difference between shipping jobs to Bangladesh and shipping jobs to Robotland? Well, Robotland does not have a balance of payments, so it cannot be accused of buying less from us than we from it, fleecing us of the difference. In the real world, automatic market mechanisms, if allowed to operate, forestall worrisome trade deficits and surpluses; and if the foreigners do make unbalanced sales to us, what can they do with the money? They acquire American bank accounts, securities, and properties, so supplying us with financial capital on advantageous terms.

What sense does the notion of one country competing with others have? Does it mean that international trade is a zero-sum game, with countries squabbling over shares in a fixed total of gains? On the contrary, international trade and advanced technology are alike in making desired goods more abundant. One country’s relatively low standard of living would trace to technological and entrepreneurial backwardness and perhaps to bad government. It would be absurd to blame its relative poverty on incompetent trade-policy negotiators.

One country’s relatively low standard of living would trace to technological and entrepreneurial backwardness and perhaps to bad government.

In the real world, conceivably, Robotland technology might displace many American workers, inviting Luddite arguments. I do not want to get into that issue here. I merely ask what the difference is between the scenarios of foreign competition and robots.

I wish that today’s vapid political debates could give way to ones with candidates testing one another’s policy-relevant understanding by posing questions like the one about robots. Other questions might be: How do your trade-policy proposals square with the principle of comparative advantage? What light might the absorption approach to balance-of-payments analysis shed on a connection between a trade deficit and a government budget deficit? In what sense is the Social Security trust fund a reassuring reality and in what sense a deceptive farce?

Unfortunately, such questions would not faze Donald Trump. He would respond with vicious personal insults and with reiterations of his own excellence. Anyway, allowing such questions could be entertaining. They might even enlighten some voters.


Only Nostrums Need Apply


The "Great Depression" began with the stock market crash of 1929. In all previous depressions, there was little, if any, federal government intervention to extricate America from economic travail. It was held that the federal government possessed neither the knowledge nor the constitutional authority to meddle with the free-market capitalist economy that had propelled America from a fledgling agricultural enclave to a global industrial power in less than 150 years.

Everything was about to change. The 1920s experienced at once the reckless expansion of credit by the Federal Reserve and of economic thought by the liberal elite. The former produced an enormous margin-driven stock market bubble that burst in October 1929; the latter produced a remedy that burst any chance of recovery from economic distress. Unlike all previous economic downturns, the calamity in 1929 invoked intense federal government intervention; it also invoked the longest depression in American history. The days of limited government — so expressively and resolutely defined by the Constitution — would be gone for good.

Then-president Herbert Hoover transformed the ensuing mild recession (from which the economy was already recovering by June 1930) into a depression, which, in 1932, was delivered to newly elected president Franklin Delano Roosevelt. With his Brain Trust of lawyers, journalists, and college professors and the freshly minted ideas of Keynesian scholars, he concocted an unprecedented grab-bag of nostrums known as the "New Deal" and parlayed Hoover's depression into a prolonged Great Depression. The American economy would not return to its pre-crash prosperity until 1946.

Unlike all previous economic downturns, the calamity in 1929 invoked intense federal government intervention; it also invoked the longest depression in American history.

In fairness, the bungling of both presidents was enhanced by the Federal Reserve System. The primary function of the Fed was to ensure that US banks could withstand "runs" by depositors attempting, en masse, to withdraw their assets during financial downturns. The Fed was established in 1913 to act as the lender of last resort (LLR) for banks. It replaced the private clearinghouse system that had successfully provided LLR services prior to 1913. But between 1930 and 1933, when stressed banks were desperate for liquidity, the Fed followed a tight money policy. This inexplicable neglect of its primary function contributed to the collapse of more than 10,000 banks between 1930 and 1933. Then, in 1936 and 1937, its insistence on raising bank-reserve requirements (once again shrinking the money supply) contributed to the severe recession of 1937–38 — the recession within the Depression.

Unlike President Harding, who did not intervene in the depression of 1920, Hoover believed that not intervening in 1929 "would have been utter ruin." Accordingly, he increased federal spending 42% by 1932, boasting that his administration had embarked on "the most gigantic program of economic defense and counterattack ever evolved in the history of the Republic." Hoover was then excoriated by FDR for extravagant spending and excessive taxing, for entertaining the idea that "we ought to center control of everything in Washington as rapidly as possible," and for "leading the country down the path of socialism."

FDR's public objection to Hoover's intervention was, however, merely a ploy to win the election of 1932. Privately, he believed that Hoover's most gigantic program was not gigantic enough. Roosevelt’s New Deal would put Hoover's reckless extravagance to shame. And while Hoover's intervention was to be temporary and limited, FDR's would become permanent and unlimited. FDR radically transformed government's role in the economy to "center control of everything in Washington as rapidly as possible" and "lead the country down the path of socialism."

By all accounts, the intentions of the New Deal were noble and praiseworthy. To an objective observer, little else can be said that is favorable. Although Democrats hail the welfare and regulatory state that FDR created, the establishment of a welfare and regulatory state was not a New Deal objective. Its objective was economic recovery — which was never achieved under New Deal programs. Unable to restore the American economy, the charismatic FDR gave only ironic hope to a nation in despair: the hope that it could endure the seemingly endless hardship that his policies inflicted.

Hoover believed that not intervening in 1929 "would have been utter ruin." Accordingly, he increased federal spending 42% by 1932.

If not for World War II, FDR's intervention "would have been utter ruin" for the nation. As economist Larry Summers, former director of President Obama's National Economic Council, admonished: “Never forget, never forget, and I think it’s very important for Democrats especially to remember this, that if Hitler had not come along, Franklin Roosevelt would have left office in 1941 with an unemployment rate in excess of 15% and an economic recovery strategy that had basically failed.”

The New Deal was the paragon of nostrums: a political fantasy whose probability of success was inversely proportional to the conceit of its exaggerated claims. Blaming both capitalism and his predecessor for the nation's economic plight, FDR felt compelled to rely on untested Keynesian concepts for stimulating the economy. What emerged was a haphazard torrent of elixirs, boondoggles, and utopian schemes ("a saturnalia of expropriation and waste," to H.L. Mencken) whose only centering force was a frenetic shove to expand federal power. Brain Trust professor Raymond Moley, a close FDR advisor who eventually became critical of the New Deal, found FDR a rank amateur in such matters. Said Moley in 1939, “To look upon these programs as the result of a unified plan, was to believe that the accumulation of stuffed snakes, baseball pictures, school flags, old tennis shoes, carpenter’s tools, geometry books, and chemistry sets in a boy’s bedroom could have been put there by an interior decorator.”

Brain Trust professor Alvin Hansen (aka the "American Keynes"), who favored "highly centralized collectivism" as the optimal method to "command and direct the productive resources," also became frustrated. According to Hansen, "Every attempt at a solution involves it in a maze of contradictions. Every artificial stimulant saps its inner strength. Every new measure conjures out of the ground a hundred new problems."

FDR set the precedent for government by nostrum, and demonstrated that the only thing worse than a liberal nostrum is a well-funded liberal nostrum. Said FDR's Treasury Secretary Henry Morgenthau: "We are spending more than we have ever spent before and it does not work. I say after eight years of this administration we have just as much unemployment as when we started . . . And an enormous debt to boot!"

The charismatic Roosevelt gave only ironic hope to a nation in despair: the hope that it could endure the seemingly endless hardship that his policies inflicted.

The spending continued until after WWII. Keynesian economists such as Hansen were beside themselves with fear that postwar budget cuts would drastically harm the New Deal goal of pre-crash prosperity. If the spending did not continue, warned Paul Samuelson, America would experience “the greatest period of unemployment and industrial dislocation which any economy has ever faced.”

To their intellectual dismay, once tax rates were cut and price controls removed, the private sector (i.e., capitalism) took over, and the American economy soared. According to economist Stephen Moore, "personal consumption grew by 6.2% in 1945 and 12.4% in 1946 even as government spending crashed. At the same time, private investment spending grew by 28.6% and 139.6%." Unemployment dropped to 4% in 1946 and "stayed that low for the better part of a decade . . . during the biggest reduction in government spending in American history."

The Great Depression finally ended when the spending finally stopped. It was not the New Deal that ended the depression. Nor was it WWII. It was the curtailment of the New Deal that ended the depression, 17 years after it started.

What has America learned from this tragic ordeal? Libertarians and conservatives have learned that there is no better argument for limited government than the New Deal. Prior to 1929, the federal government did not intervene in times of economic distress. Recovery was left to the forces of capitalism; individuals and businesses were left to fend for themselves, receiving relief primarily from private charities, occasionally from state and local governments. During that long history, the federal government nostrum was peddled only by snake-oil salesmen, and recovery from economic downturns averaged four years. It often occurred in two years or less. Capitalism did not produce depressions, and less intrusive means of intervention, including no federal intervention, produced far superior results.

After the Panic of 1893, President Grover Cleveland did virtually nothing, except to arrange the repeal of interventionist laws; the ensuing depression, which, according to many, was every bit as devastating as the Great Depression, ended in about four years. (On the contrast between the two depressions, see an essay by Stephen Cox in Edward Younkins, Capitalism and Commerce in Imaginative Literature.) After the Crash of 1920, in which the stock market fell further than it would in 1929, President Harding did less than nothing interventionist. He cut taxes for all income groups, cut the federal budget by almost 50%, and reduced the national debt by 33%. The ensuing depression ended in less than two years and was followed by eight years of unprecedented prosperity, the "Roaring Twenties." Harding succeeded where FDR failed. "Wobbly" Warren Harding!

From this evidence, libertarians and conservatives conclude that nostrums should be avoided at all costs. Chances are that, without nostrums, the Great Depression would have ended in four years instead of 17. With its return to prosperity, America would have had more than enough money to finance all the roads, schools, parks, and bridges that were built under FDR's make-work programs. But it is critically important to understand that the problem wasn't that the nostrums took 17 years to work. They never worked. The idea that the New Deal succeeded is a myth. The Great Depression did not start until after politicians intervened and did not end until their intervention finally stopped, after subjecting the nation to more than 17 years of want and despair.

Capitalism did not produce depressions, and less intrusive means of intervention, including no federal intervention, produced far superior results.

But liberals, who live in a world in which ideology trumps evidence, missed the tragically abysmal failure that was the New Deal. To them the lesson of the Great Depression was the power of the nostrum: once established, nostrums never go away; they stay and breed more nostrums. In the hands of liberals, a nostrum is a ratchet. While libertarians and conservatives are appalled by the failure of FDR's economic assistance programs, liberals are enraptured by the welfare state that they established: the vibrant, lucrative poverty industry and the languid, needy underclass that it services, both intractable, both agitating for more and bolder nostrums.

This is why the New Deal consumes liberal thought, and why a nostrum is modern liberalism's only thought. The New Deal spawned the "Great Society," Lyndon Johnson's New Deal. And today America endures Barack Obama's "saturnalia of expropriation and waste." Today's liberal can conceive of neither a problem that does not require government intervention nor a solution that does not require a nostrum. Liberals do not care that their nostrums do not work (if one did, it wouldn't be a nostrum). A nostrum's main value lies in its ratcheting effect. As noted historian and FDR worshiper Arthur Schlesinger, Jr. put it, "There seems no inherent obstacle to the gradual advance of socialism in the United States through a series of New Deals.”

When Republicans took power in 1953, even President Eisenhower, the architect of our victory in World War II, was afraid to scale back New Deal legislation. In 1969, President Nixon was afraid to cut back already failing Great Society legislation. He was warned by the friendly, and sincere, advice of Democrat Daniel Patrick Moynihan, then a Nixon advisor: “All the Great Society activist constituencies are lying out there in wait, poised to get you if you try to come after them, the professional welfarists, the urban planners, the day-carers, the social workers, the public housers. . . . Just take [the] Model Cities [program], the urban ghettos will go up in flames if you cut it out.”

FDR gave us Social Security, the largest and most popular program of his legacy — the “most successful government program in the history of the world,” as Democrat Senator Harry Reid exclaimed. Johnson gave us Medicare, an even larger program. In retirement, all but the wealthiest among us depend on the benefits paid out by these two programs. But both are colossal Ponzi schemes, on track to go broke by 2034. This is not to say that a "social safety net" is unimportant or unnecessary. But, despite their laudable intentions, such entitlement programs, as they have been formulated by pandering politicians, are nostrums that have created an unfunded liability of $90 trillion and threaten to bankrupt the nation.

Today's liberal can conceive of neither a problem that does not require government intervention nor a solution that does not require a nostrum.

Let’s review the history of intervention in another way. When the market crashed in 1920, unemployment increased from 4% to 12%. By August 1921, the economy began its recovery. When the market crashed in 1929, unemployment increased from 4% to 9%, where it lingered for one month, before gradually decreasing to 6.3% in June 1930. The economy was recovering on its own from a mild recession. But that June, Republican President Hoover and a Republican Congress enacted the Smoot-Hawley tariffs, the first in a long series of heavy-handed federal interventions. Unemployment soared to 16% in 1931. Massive federal spending, debt financing, tax increases, the denial of liquidity (by the Federal Reserve) to failing banks, and numerous other forms of federal tinkering crushed US GDP growth for the rest of the decade. Throughout the 1930s, the median annual unemployment rate was 17.2%. Unemployment did not fall below 14% until the early 1940s, when 12 million Americans were hired by the military.

In June 1930, had the federal government pursued the limited-government policies of the Harding-Coolidge administrations, the depression would have been over by 1932. But the nostrums that were pursued instead prolonged the depression, and, in the process, writes Robert Higgs (in “The Mythology of Roosevelt and the New Deal”), revolutionized "the institutions of American political and economic life," changed "the country’s dominant ideology," and created a leviathan that is "still hampering the successful operation of the market economy and diminishing individual liberties." The New Deal agencies, whose supreme ineptitude caused America to suffer more than ten years longer than it would have under limited-government policies, remain today. Notes Higgs:

One need look no further than an organization chart of the federal government. There one finds such agencies as the Export-Import Bank, the Farm Credit Administration, the Rural Development Administration (formerly the Farmers Home Administration), the Federal Deposit Insurance Corporation, the Federal Housing Administration, the National Labor Relations Board, the Rural Utility Service (formerly the Rural Electrification Administration), the Securities and Exchange Commission, the Social Security Administration, and the Tennessee Valley Authority — all of them the offspring of the New Deal. Each in its own fashion interferes with the effective operation of the free market. By subsidizing, financing, insuring, regulating, and thereby diverting resources from the uses most valued by consumers, each renders the economy less productive than it could be — and all in the service of one special interest or another.

Yet the myth — the pernicious myth — of the New Deal lives on. Today, as FDR blamed his predecessor and capitalism for America's economic plight, Mr. Obama, who won election waving the New Deal banner against the "Great Recession" of 2008, blames capitalism and George W. Bush. Obama even blamed Bush for adding $4.9 trillion to the national debt (money borrowed during eight years of Bush's tenure to finance establishment-Republican nostrums), calling it "irresponsible" and "unpatriotic" — just before he [Obama] went on to borrow $10.6 trillion for his nostrums, running up the national debt to an unprecedented $19 trillion.

With almost one year left for Mr. Obama to enlarge this staggering arrearage, both Democrat candidates for the 2016 presidential election propose the only thing that liberalism allows them to offer: more nostrums.

Hillary Clinton and Bernie Sanders, ever ready to put taxpayer money where their mouths are, clamor for a new New Deal — no doubt to build on the success of Obama's New Deal. A new New Deal is needed, they say, because "for too long,” as one activist put it, “the federal government has tolerated and perpetuated practices of racial and gender discrimination, allowed rampant pollution to contaminate our water and air, sent millions to prison instead of colleges and permitted Wall Street and CEOs to rig all the rules." Correcting the deficiencies of existing big government nostrums calls for a new New Deal with bigger Big Government.

Sanders has a new New Deal for $19.6 trillion (paid for with a 47% tax increase). Clinton even has one for "communities of color" — perhaps to lock in the votes of Americans, who, with Job-like patience, have been waiting more than 50 years for the ruthlessly inept Great Society programs to eliminate poverty and racial injustice, reconcile immigration, and improve public education.

Both Democrat candidates for the 2016 presidential election propose the only thing that liberalism allows them to offer: more nostrums.

After almost eight years of suffering Mr. Obama's nostrums (the Wall Street bailout, the auto industry bailout, the Stimulus, Quantitative Easing, the Green Economy, Dodd-Frank, Obamacare, Middle Class economics, the profligate regulatory morass . . . ), all of America waits, its economy in chronic stagnation — for bigger, better nostrums. We might as well wait for the Treasury Secretary finally to admit, "We are spending more than we have ever spent before and it does not work. I say after eight years of this administration we have just as much unemployment as when we started . . . And an enormous [$19 trillion] debt to boot!"

Willfully oblivious to the evidence, we resign ourselves to a stifling federal patrimony, where no problem escapes a New Deal style nostrum and "every attempt at a solution involves it in a maze of contradictions. Every artificial stimulant saps its inner strength. Every new measure conjures out of the ground a hundred new problems."

* * *

Further Reading


1920: The Great Depression That Wasn't, by C.J. Maloney
The Depression You've Never Heard Of: 1920–1921, by Robert P. Murphy
Great Myths of the Great Depression, by Lawrence W. Reed
The Great Depression, by Hans F. Sennholz
The Mythology of Roosevelt and the New Deal, by Robert Higgs
FDR's Folly: How Roosevelt and His New Deal Prolonged the Great Depression, by Jim Powell
The Forgotten Man: A New History of the Great Depression, by Amity Shlaes
New Deal or Raw Deal?: How FDR's Economic Legacy Has Damaged America, by Burton Folsom
A Monetary History of the United States, 1867–1960, by Milton Friedman and Anna Schwartz


A Depression that Should Not Be Forgotten


I liked The Forgotten Depression but did not love it.

Its subject is the depression of 1921 — a valuable subject because, as the author indicates, the depression went away without massive government intervention. Imagine that.

The author's brief history of the problems some big banks got into in the late 1800s and early 1900s is excellent. Grant shows how the principals of the banks had their own fortunes tied up in the banks’ capital, which usually kept them prudent. Still, they made mistakes. National City Bank (“forerunner of today’s Citigroup,” p. 18), for instance, invested in Tsarist Russia — just before the Bolshevik revolution. The firm also had over 60% of its capital tied up in Cuban sugar investments when prices were high; then prices crashed. Guaranty Trust, another huge bank of the time, was considered "too big to fail," almost a hundred years before our present policies on that subject.

The author aptly characterizes the 1920–21 depression as the last major downturn to be "un-medicated" (by government stimulus policies), and makes telling comparisons with the activist Herbert Hoover and Franklin Roosevelt administrations. Notably, the earlier depression was of short duration, while the “medicated” depression of the 1930s and the recent Great Recession went on and on. Grant’s discussions of the various economists, bankers, and policy makers involved in the problems of the 1920s are challenges to today’s economists, policy makers, and historians.

Meanwhile, Grant adds texture and depth to his story with descriptions of the difficulties suffered by the various sectors of the economy during the Forgotten Depression: farming, steel production, the auto industry, construction, and even haberdashers (including one very famous and resentfully unsuccessful one). But be prepared for a massive amount and variety of statistics about earnings and losses, interest rates, unemployment rates, sales rates and amounts, etc., etc. The author is an expert who knows how to deal with statistics. His writing is not nearly as dry as it could be.

Notably, this earlier depression was of short duration, while the “medicated” depression of the 1930s and the recent Great Recession went on and on.

He is also basically sound on substantive economic issues. He provides a good explanation of the classic gold standard up to the 1920–21 depression, and then of the fake "gold exchange" standard thereafter. He understands market forces and government intrusions and distortions. His description of the anti-business, anti-market biases, or ideology, of the key figures in the Woodrow Wilson administration is appropriately sickening.

Unfortunately, Grant’s presentation of basic business-cycle theory is lacking, save for a discussion of people who believe in the market vs. people who think government competent to force “solutions” on it. His explanation of how government coercion usually has unfavorable, unforeseen, and mostly unacknowledged repercussions is good, but probably not good enough to convince those who believe in such things.

Two other matters need to be mentioned.

First, the book has a section called Acknowledgements, which is more like a bibliographic essay. I liked this section very much. It is moderately short, fun, and informative to read, and it gives a great commentary on Grant’s main sources, some of which I highly recommend. The Great Depression sources, which he uses to excellent effect throughout the book, are very important.

Second, the "hero" of Grant’s story is wonderful. But I won’t give it away. It’s in the book.

Editor's Note: Review of "The Forgotten Depression: 1921: The Crash that Cured Itself," by James Grant. Simon & Schuster, 2014, 272 pages.


© Copyright 2017 Liberty Foundation. All rights reserved.

Opinions expressed in Liberty are those of the authors and not necessarily those of the Liberty Foundation.

All letters to the editor are assumed to be for publication unless otherwise indicated.