As even its harshest critics concede, neoliberalism is hard to pin down. In broad terms, it denotes a preference for markets over government, economic incentives over social or cultural norms, and private entrepreneurship over collective or community action. It has been used to describe a wide range of phenomena—from Augusto Pinochet to Margaret Thatcher and Ronald Reagan, from the Clinton Democrats and Britain’s New Labour to the economic opening in China and the reform of the welfare state in Sweden.
Even though neoliberalism as an ideology springs from a desire to curtail the state’s role, neoliberalism as a political-economic reality has produced increasingly powerful, interventionist and far-reaching – even authoritarian – state apparatuses.
The process of neoliberalisation has entailed extensive and permanent state intervention, including: the liberalisation of goods and capital markets; the privatisation of resources and social services; the deregulation of business, and financial markets in particular; the reduction of workers’ rights (first and foremost, the right to collective bargaining) and, more generally, the repression of labour activism; the lowering of taxes on wealth and capital, at the expense of the middle and working classes; the slashing of social programmes, and so on.
These policies were systematically pursued throughout the West (and imposed on developing countries) with unprecedented determination, and with the support of all the major international institutions and political parties.
In this sense, neoliberal ideology, at least in its official anti-state guise, should be considered little more than a convenient alibi for what has been and is essentially a political and state-driven project, aimed at placing the commanding heights of economic policy ‘in the hands of capital, and primarily financial interests’.
Capital remains as dependent on the state today as it did under ‘Keynesianism’ – to police the working classes, bail out large firms that would otherwise go bankrupt, open up markets abroad, etc. In the months and years that followed the financial crash of 2007-9, capital’s – and capitalism’s – continued dependency on the state in the age of neoliberalism became glaringly obvious, as governments in the US, Europe and elsewhere bailed out their respective financial institutions to the tune of trillions of euros and dollars.
In Europe, following the outbreak of the so-called ‘euro crisis’ in 2010, this was accompanied by a multi-level assault on the post-war European social and economic model aimed at restructuring and re-engineering European societies and economies along lines more favourable to capital.
Nonetheless, the flawed notion that neoliberalism entails a retreat of the state remains a fixture of the left. This is further compounded by the idea that the state has been rendered powerless by the forces of globalisation. Conventional wisdom holds that globalisation and the internationalisation of finance have ended the era of nation states and their capacity to pursue policies not in accord with the diktats of global capital. But does the evidence support the assertion that national sovereignty has truly reached the end of its days?
New Zealanders like to think that we are, in most respects, up with – if not actually ahead of – the play. Sadly, however, as a new government is about to emerge, there is no sign that our politicians and policymakers are aware of recent developments in a crucial area of policy, and that, as a result, we are in danger of missing out on opportunities that others have been ready to take.
The story starts, at least in its most recent form, with two important developments. First, there is the now almost universal recognition that the vast majority of money in circulation is not – as most people once believed – notes and coins issued on behalf of the government by the Reserve Bank, but is actually created by the commercial banks through the credit they advance, using bank entries rather than cash, and usually on mortgage.
The truth of this proposition, so long denied, is now explicitly accepted by the Bank of England, and was explained as long ago as 1994 in a letter written by our own Reserve Bank to an enquirer, which stated in terms that 97% of the money included in the most commonly used definition of money, M3, is created by the commercial banks.
The proposition is endorsed by the world’s leading monetary economists – Lord Adair Turner, the former chair of the UK’s Financial Services Authority and Professor Richard Werner of Southampton University, to name but two. These men are not snake-oil salesmen, to be easily dismissed. They have been joined by leading financial journalists, such as Martin Wolf of the Financial Times.
The second development was the use by Western governments of “quantitative easing” in the aftermath of the Global Financial Crisis. “Quantitative easing” was a sanitised term for what is often pejoratively called “printing money” – but, whatever it is called, it was new money created at the behest of the government and used to bail out the banks by adding it to their balance sheets.
These two developments, not surprisingly, generated a number of unavoidable questions about monetary policy. If banks could create billions in new money for their own profit-making purposes (they make their money by charging interest on the money they create), why could governments not do the same, but for public purposes, such as investment in new infrastructure and productive capacity?
And if governments were indeed to create new money through “quantitative easing”, why could that new money not be applied to purposes other than shoring up the banks?
The conventional answer to such questions (and the one invariably given in New Zealand by supposed experts in recent times) is that “printing money” will be inflationary – though it is never explained why it is miraculously non-inflationary when the new money is created by bank loans on mortgage or is applied to bail out the banks.
But, in any case, the master economist, John Maynard Keynes, had got there long before the closed minds and had carefully explained that new money could not be inflationary if it was applied to productive purposes so that new output matched the increased money supply. Nor was there any reason why the new money should not precede the increased output, provided that the increased output materialised in due course.
Those timorous souls who doubt the Keynesian argument might care to look instead at practical experience. Franklin Delano Roosevelt used exactly this technique to increase investment in American industry in the year or two before the US entered the Second World War. It was that substantial boost to American industrial capacity that was the decisive factor in allowing the Allies to win the war.
And the great Japanese (and Keynesian) economist, Osamu Shimomura, (almost unknown in the West), took the same approach in advising the post-war Japanese government on how to re-build Japanese industry in a country devastated by defeat and nuclear bombs.
The current Japanese Prime Minister, Shinzo Abe, is a follower of Shimomura. His policies, reapplied today, have Japan growing, after years of stagnation, at 4% per annum and with minimal inflation.
Our leaders, however, including luminaries of both right and left, some with experience of senior roles in managing our economy – and in case it is thought impolite to name them I leave it to you to guess who they are – prefer to remain in their fearful self-imposed shackles, ignoring not only the views of experts and the experience of braver leaders in other countries and earlier times, but – surprisingly enough – denying even our own home-grown New Zealand experience.
Many of today’s generation will have forgotten or be unaware of the brave and successful initiative taken by our Prime Minister in the 1930s – the great Michael Joseph Savage. He created new money with which he built thousands of state houses, thereby bringing an end to the Great Depression in New Zealand and providing decent houses for young families (my own included) who needed them.
Who among our current leaders would disown that hugely valuable legacy?
Bryan Gould, 2 October 2017
“In a very rapidly changing scenario, with a burgeoning population, fast-changing demographic profile, and growth aspirations of people around the world putting pressure on natural resources, our economic thoughts and practices have to change.”
In the beginning there was nothing: no human beings, no animals, no trees, no oceans, no earth, no sun, no stars, not even space or time. A quantum fluctuation leading to the Big Bang almost 14 billion years ago sowed the seeds of the Universe and of space and time as we know them. In the initial phase, stars, black holes, and galaxies formed. The Earth, our home planet, was born over 9 billion years later, about 4.5 billion years ago. It was then a fiery ball and took almost 1 billion years to cool down. Seeds of life sprouted about 3 billion years ago – some say spontaneously, others through panspermia; no one knows for sure.
While the earth was cooling, life forms were evolving and the planet was undergoing cataclysmic changes. Continents were shifting and breaking apart, ocean floors were rising and sinking, volcanoes were erupting. Forests, animals, fishes, and amphibians came and disappeared, so much so that, according to some, 99.9% of the species in existence since the beginning of life on Earth have ceased to exist. These changes, over a period of hundreds of millions of years, left us the legacy of natural resources – coal, crude oil, natural gas – and the minerals so necessary for industrial processes and the evolution of a technological civilisation.
Life forms continued to evolve. Humans came on the scene. No one is sure, but it is said that human sub-species evolved about half a million years ago in the African Savannah. With human civilisations, human aspiration too continued to develop and grow, perhaps slowly, if we were to compare it with the developments in the last 100 years.
The advent of the Industrial Revolution, which started in Europe around 1760, brought in its wake a transformation. Progress brought about by technology encouraged a shift from a primarily agricultural world to an industrial one. Rapid shifts took place in many parts of the world, mainly Europe and North America, and, in the earlier part of the last century, in Japan. Such shifts are now taking place in parts of Asia, mainly India and China, Latin America, and Africa. These changes, by themselves great achievements for mankind, have led to a burgeoning population and major demographic changes. An offshoot of this technological progress has been that more intensive and concentrated methods of food production are required to support technological societies and the longer human life spans stemming from better healthcare.
About the time of the birth of Jesus Christ, the planet supported a population of about 200 million human beings, which, by the early 19th century – a period of about 1,830 years – had grown to a billion people. In another 185 years, we expanded 7-fold to over 7.2 billion people, and we are still expanding. Technological change and the exploitation of natural resources have improved the living conditions of human beings, and on average a human being today lives better, is better fed, and is better educated than at any other time in the history of mankind.
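The two growth phases described above imply very different average annual rates. A rough back-of-the-envelope check, using only the figures in the text and assuming annual compounding:

```python
# Implied average annual population growth rates, from the figures above.
# Compounding assumed: P_end = P_start * (1 + r) ** years.

def annual_growth_rate(start, end, years):
    """Average compound annual growth rate taking `start` to `end` over `years`."""
    return (end / start) ** (1 / years) - 1

# ~200 million around 1 AD to ~1 billion by the early 19th century
pre_industrial = annual_growth_rate(200e6, 1e9, 1830)
# ~1 billion to ~7.2 billion in the following 185 years
modern = annual_growth_rate(1e9, 7.2e9, 185)

print(f"pre-industrial era: {pre_industrial:.3%}")  # below 0.1% per year
print(f"modern era:         {modern:.3%}")          # roughly 1.1% per year
```

The modern rate is more than ten times the pre-industrial one, which is the sense in which the text speaks of exponential expansion.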
All this has been brought about by scientific advances in fields as diverse as quantum physics, relativity, materials science, chemistry, and the agricultural sciences. The list is endless.
However, a large population and better living standards have created their own challenges in fields as diverse as economics, social sciences, ecology, and environment. At the heart of these is the rapid exploitation of natural resources, be it in the form of energy-generating resources like coal or crude oil, mineral resources like ores, or environmental resources, which are being degraded in the pursuit of economic growth.
These issues are well known, and have been discussed in various fora for decades now. The first Club of Rome report, Limits to Growth, which was published in 1972, raises many issues pertinent to these changes. That landmark report and subsequent Club of Rome reports, which generated extensive debates in the 1970s, now lie peacefully buried in the archives of libraries around the world. While these issues are still relevant, it is not the intent of this book to reiterate them.
Along with technological progress, economic theories evolved as well. A key aspect of economic theories was better and more efficient utilisation of resources, be it capital, land or labour. These concepts and theories optimized utilisation of resources and went a long way in improving the living standards of mankind across the world.
These economic theories, which have served us well for many decades, need a relook, particularly from the point of view of sustainability. If we lived in a world where resources were infinite, or virtually limitless in relation to our consumption, there would be no issue. But that is not the case, the more so as our population and resource consumption have been expanding exponentially. Under current methods of economic analysis, capital allocation promotes gross long-term inefficiencies in our resource utilisation. If we continue with these approaches, our societies will become unsustainable.
The authors have long held the view that our economic theories not only lead to unsustainable development but really amount to stealing from future generations. We compare our society to a rich man who sells the family silver to sustain his lifestyle and in the end leaves practically nothing for his children. What is worse in our case is that we would leave our children a huge debt, which they would have to pay. This book will provide ample evidence that our economic and capital allocation models do the same thing: they promote current consumption at the cost of future generations. The problem is further compounded by the short-sightedness of the political class in most nations, whose focus seems to be the next year, the next election or, in non-democratic societies, growth in personal wealth or stature. Similarly, the corporate world generally thinks of the next quarter, the next shareholders’ meeting, and the bonuses which top managers can persuade boards and shareholders to pay them. Few think of long-term strategies for the company, and fewer still of long-term sustainability.
Most businesses use capital allocation models to optimise their working. Similar concepts are, at least in theory, used by countries (where their leaders are not driven by political considerations, which is not often) to allocate national resources. Few realise the pitfalls of such models. So widely are they used that the working of every bank would come to a standstill if these formulae were somehow erased from its computers.
Capital allocation models are generally skewed in favour of current consumption. They place a premium on current consumption and early use of resources over saving them for future generations. For example, if we can pump a barrel of oil now and its price is US$100, our benefit (less the pumping cost, which for simplicity we assume to be zero) is US$100. But if we leave the same barrel of oil underground so that someone else can use it 50 years later, then at a 10% cost of capital the value of that barrel today is 85 cents. If we were more far-sighted still and did not use it for 100 years, the present value falls to 0.7 cents. So our incentive is to use the resource as fast as possible. Of course, in doing this analysis we conveniently forget that nature took several hundred million years to generate that same barrel of oil.
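The figures above are standard present-value arithmetic. A minimal sketch reproducing them:

```python
def present_value(amount, rate, years):
    """Value today of `amount` received `years` from now,
    discounted at a constant annual `rate` (cost of capital)."""
    return amount / (1 + rate) ** years

# A US$100 barrel of oil at a 10% cost of capital:
print(present_value(100, 0.10, 50))   # ~0.85  -> about 85 cents
print(present_value(100, 0.10, 100))  # ~0.007 -> about 0.7 cents
```

The exponential denominator is why any deferred use of a resource shrinks toward nothing in these models, regardless of how scarce the resource will be by then.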
Another way of looking at the same situation: suppose, for the sake of argument, that through some technological breakthrough it would be possible to extract 100 barrels of oil after 50 years, but that if the field were exploited now, only 1 barrel could be extracted and the remaining 99 barrels lost forever. Managers would still find it desirable to extract that one barrel now, notwithstanding that future generations would lose 99 barrels. This example may sound extreme, but analogous decisions are routinely taken globally. As a result, the rate of consumption of natural resources is so high that world reserves of many key resources would be exhausted within a couple of generations. As these resources are exhausted their availability will decline, although the fall will generally be gradual. A fall in resource availability would nonetheless impact industrial production, with all the consequences that would inevitably follow.
Everybody would be impacted; no one would be spared. But youngsters in their twenties and thirties, with 30 to 40 years of working life remaining, would be most affected. Their hopes, aspirations and dreams of a comfortable and peaceful retirement after years of hard work would be shattered, as money not backed by the availability of goods and services loses its purchasing power.
The aim of this book is to bring out the deep lacunae in our economic thought and practices. The existing economic practices were developed when natural resources were plentiful, the global population small, and natural resource consumption minuscule in relation to the reserves. But in a very rapidly changing scenario, with a burgeoning population, fast-changing demographic profile, and growth aspirations of people around the world putting pressure on natural resources, our economic thoughts and practices have to change.
No change is without associated pain. We are all comfortable with the present thought processes, which predict steady and sustained growth based on the implicit assumption that resources are unlimited. But the reality is that we live in a finite world with limited resources, and after that reality is factored in, none of these projections hold true. And the sooner we realise this, the better it is and perhaps less painful too.
This book is divided into two sections. The first section, The Context, highlights the world we live in and how fast we are consuming our resources and impacting the environment. Some readers may find The Context grim and depressing, but we have painted the picture as we see it, based on the best available information. We would request that such readers bear with us, or simply move on to the second section, The New Economic Paradigm, and then come back to The Context. In the second section, The New Economic Paradigm, we suggest a new approach to our economic theories, one that would lead to a more sustainable world.
“Humans are extremely intelligent and yet extremely foolish. They have failed to perceive the inter-linkages in the Web of Life; remove a few links and the Web could collapse, threatening their own existence.”
Stealing From Our Children: The Real Dilemma of Growth and the Need for New Economics – Kamal K. Kothari and Chitra Chandrasekhar.
The tax system plays a crucial role in New Zealand’s housing markets. At the simplest level, the Goods and Services Tax is applied to new land development and new house construction, raising the price of new housing by 15%.
But the effects of the tax system are more complicated than this.
Since 1986 several tax changes have caused an intergenerational rift in New Zealand society by increasing the prices young people pay to purchase houses.
Some of these tax changes appear justifiable on efficiency grounds, but even these have made it more expensive for young people to purchase or rent property.
In conjunction with other tax changes that have artificially raised property prices, a generation of older property owners have become rich at the expense of current and future generations of New Zealanders.
The scale of the problem
The scale of the problem can be seen in how average property prices have increased by over 220% in inflation-adjusted terms since 1989 – the highest rate of increase in the developed world.
The average size of new houses has also increased more quickly than in Australia or the United States, the only two countries that publish this data.
The average size of a new dwelling in 2013 was 198 m2, up from 125 m2 in 1989, and nearly twice as large as the average new house in Europe.
The tax changes that have affected housing can be divided into those that affect the cost of supplying housing and those that affect the demand for housing.
Unfortunately, unravelling the effect of taxes on house prices and rents is challenging.
The effects depend on the extent that the supply of new housing is responsive to prices.
If the supply of housing is very responsive to prices, taxes that affect supply prices (such as GST) become fully reflected in prices, while taxes that affect demand (such as the relative size of taxes on housing income and other assets) do not. Conversely, if the supply of housing is not really responsive to prices, supply taxes like GST have little effect on prices but demand taxes have large effects.
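The supply-responsiveness argument above is the textbook tax-incidence result: the more elastic the supply, the larger the share of a supply-side tax that lands on buyers. A minimal numerical sketch (the elasticity values are purely illustrative, not estimates for New Zealand housing):

```python
def buyer_share_of_tax(supply_elasticity, demand_elasticity):
    """Fraction of a per-unit tax borne by buyers under the standard
    partial-equilibrium incidence formula: e_s / (e_s + e_d),
    with both elasticities expressed as positive magnitudes."""
    return supply_elasticity / (supply_elasticity + demand_elasticity)

# Very responsive (elastic) housing supply: a supply-side tax such as GST
# is passed almost entirely into the price buyers pay.
print(buyer_share_of_tax(10.0, 1.0))  # ~0.91

# Unresponsive (inelastic) supply, e.g. well-located land: sellers absorb
# most of the tax and the price buyers pay barely moves.
print(buyer_share_of_tax(0.1, 1.0))   # ~0.09
```

The same formula, read in reverse, explains why demand-side tax distortions matter most precisely where supply is least responsive, such as land in good locations.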
The analysis is further complicated because the supply of land – particularly land in good locations – is less responsive to price than the supply of new houses.
It is quite possible that a particular tax can simultaneously lead to higher land prices but not much new land, and larger houses but not much of an increase in building costs.
Since 1989, the way the tax system affects the demand for housing has been the biggest problem.
The fundamental difficulty is that the returns from other classes of assets such as interest income are more heavily taxed than the returns from housing.
Because interest is more heavily taxed than the returns from owner-occupied housing – which are essentially the rent people get from their own home – people have an incentive to live in larger houses than they otherwise would, and to pay more for well-located properties.
In the absence of this tax distortion, many people would choose to live in smaller houses and land prices in major cities would be a lot lower.
It is not unreasonable to suspect the premium people pay for well-located properties is twice as high as they would pay under a non-distortionary tax system.
But this is not all. The tax system gives landlords an incentive to pay a much higher price/rent multiple for the houses they lease, largely because of the absence of a capital gains tax.
Because the house-price/rent multiple could increase either because house prices rise or because rents fall (or some combination of both), the tax system could make buying more expensive or it could make renting more affordable. Most of the evidence suggests that house prices have risen rather than that rents have fallen; either way, the result is a tax-induced decline in home-ownership rates.
When the tax system causes artificially high house prices, costs are imposed on current and future generations of young people, who have to borrow more and pay higher mortgage costs.
Why 1989? New Zealanders have never paid tax on the capital gains associated with house price increases, and the way housing is taxed was not fundamentally changed in 1989.
This is true. But the distortionary effects of taxation depend on the way houses are taxed relative to other asset classes, and in 1989 the government changed the way some other capital income is taxed.
Until 1989, money placed into retirement saving schemes was tax deductible, and the earnings from this money were not taxed as they accumulated.
Under this tax scheme – which is used in most developed countries including the United Kingdom, the United States, France, Germany, and Japan – the money placed in these savings schemes is taxed in a similar way to housing.
Taxing retirement savings in this way reduces the incentive for owner-occupiers and landlords to overinvest in housing.
While the distortions in the current tax system could be eliminated by introducing a capital gains tax on housing and all other assets, and by taxing the rent you implicitly pay yourself when you own your home, most countries have found this too difficult to do.
As they have discovered, it is far simpler to change the way other savings are taxed.
On the supply side, in addition to GST, the Local Government Act (2002) has also affected the cost of supplying housing by changing taxes.
Instead of levying property taxes (rates) to fund the costs of developing new sections, local governments have progressively imposed development charges.
This change has improved efficiency by moving the costs of a larger city to the new people populating it, but it has also increased the price of housing right across cities.
People who bought before 2002 shifted the cost of new development to others, increasing the value of their houses, even though their development costs had been paid by other ratepayers.
For a long time, economists have pointed out that if you tax the income from housing less than other assets, you tend to increase land prices.
At the macroeconomic level, they have noted that this tends to increase national debt levels, and lower national income.
The first owners of land benefit from these schemes, but everyone else loses.
Perhaps this is a reason why other countries have been concerned to tax housing on a similar basis to other assets.
It is unfortunate New Zealand does not do so, even if the tax changes implemented since the late 1980s have proved very advantageous to middle-aged and older generations.
In November 1980 the prophets returned to Nashville, Tennessee, to be honored. Vanderbilt University hosted a symposium honoring the Southern Agrarians on the fiftieth anniversary of the publication of I’ll Take My Stand: The South and the Agrarian Tradition (1930).
I’ll Take My Stand was an indictment of the industrial civilization of modern America. The authors hoped to preserve the manners and culture of the rural South as a healthy alternative. The book was the inspiration of two Vanderbilt English professors and poets, John Crowe Ransom and Donald Davidson, and their former student, the poet Allen Tate.
It was composed of twelve essays written by twelve separate individuals, the title page declaring them to be Twelve Southerners.
An essayist in Time magazine, claiming that 150 doctoral theses had been written about the book, remarked on the appeal of Agrarianism to modern-day environmentalists and theorists of the “zero-sum” society.
“Why do the Agrarians, with their crusty prophecies and affirmations, still sound so pertinent, half a very non-agrarian century later?” he asked.
The answer, he felt, lay in the power of Agrarianism as a poetic metaphor. This was a view shared by the organizers of the event, who, in a volume derived from it, argued that I’ll Take My Stand was a prophetic book. Once dismissed as a nostalgic, backward-looking defense of a romanticized Old South, the book was rather “an affirmation of universal values” and a defense of the “religious, aesthetic, and moral foundations of the old European civilization.”
Industrial society devalues human labor by replacing it with machines, argued the Twelve Southerners. Machine society undercut the dignity of labor and left modern man bereft of vocation and in an attenuated state of “satiety and aimlessness,” glutted with the surfeit of consumer goods produced by the industrial economy. Industrialism, they argued, was inimical to religion, the arts, and the elements of a good life, leisure, conversation, hospitality.
The Twelve Southerners were frankly reactionary and seriously proposed returning to an economy dominated by subsistence agriculture.
The theory of agrarianism, they declared, “is that the culture of the soil is the best and most sensitive of vocations, and that therefore it should have the economic preference and enlist the maximum number of workers.” Why, they asked, should modern men accept a social system so manifestly inferior to what had gone before? “If a community, or a section, or a race, or an age, is groaning under industrialism, and well aware that it is an evil dispensation,” the Twelve Southerners declared, “it must find the way to throw it off. To think that this cannot be done is pusillanimous.”
I’ll Take My Stand was a self-conscious defense of the South, undertaken sixty-five years after Robert E. Lee surrendered at Appomattox Court House.
The passage of years revealed an almost protean quality to Agrarianism. It came to mean very different things to a variety of different thinkers. Indeed, the contributors themselves, over the years, interpreted and reinterpreted their original impulse in light of changing convictions and interests. In 1930, I’ll Take My Stand was an indictment of industrial capitalism and a warning of its potential to destroy what the Agrarians considered a more humane and leisurely social order.
For some, it later came to be a statement of Christian humanism. For others, it was a rousing defense of the southern heritage and southern culture, which, in turn, meant a defense of the Western tradition. For others, Agrarianism was merely a metaphor for the simple life—one not consumed with materialism. For others still, the symposium was part of a traditional southern political discourse, which warned against centralized power and a strong state and which stood against bourgeois liberalism.
After World War II, the nascent conservative movement—poised against what it perceived to be an unwise liberal elite and in defense of traditional values and American capitalism—subsumed the Agrarians within its intellectual tradition. The Agrarians became respected, if quixotic, dissenters from the main trend of American progressivism.
Since the founding of the nation, southerners had sought a way to reconcile modernity and tradition, to participate in the modern market economy while retaining the shockingly premodern (yet profitable) system of slave labor. Slaveholders were alternately beguiled by the riches of the capitalist marketplace and appalled at the prospect of a society based on the pecuniary impulse and the self-interest, chicanery, and competitiveness of the market.
The 1920s offered a richer discourse on the crises of faith, morals, and science produced by modernity than any decade since. Agrarianism was an attempt to respond to questions being asked by others besides southerners: Is it possible to satisfy the felt needs for community, leisure, and stability in the dizzying whirl of modern life? How do we validate values in a disenchanted and secular age?
The Twelve Southerners’ response was both radical and conservative. They rejected industrial capitalism and the culture it produced. In I’ll Take My Stand they called for a return to the small-scale economy of rural America as a means to preserve the cultural amenities of the society they knew. Ransom and Tate believed that only by arresting the progress of industrial capitalism and its imperatives of science and efficiency could a social order capable of fostering and validating humane values and traditional religious faith be preserved.
The South as a symbolic marker of both traditional society and Western civilization became the central element of the Agrarian discourse. Modernism and modernization were no longer treated as deeply related; what had been a radical conservatism now became southern traditionalism. This bifurcation of economic and cultural analysis, which the Agrarians had originally resisted, reflects a distinctive attribute of the conservative movement that was emerging in the 1950s and transforming the leadership of the American Right.
Conservatives, southern and otherwise, constitute the final group to preserve the memory of the Agrarians. Conservatives have proudly honored the Agrarians as perceptive forefathers and tend to present them as southern traditionalists—proponents of a social order based on religion, opponents of a godless and untraditional leviathan state, critics of a rootless individualism, and, above all, stout defenders of the South, which necessarily entails a defense of southern tradition, culture, and values.
The paradox facing southerners of the Agrarians’ generation, to change yet remain loyal to history, was a continuing source of division among the Agrarians and undercut the radical conservatism of I’ll Take My Stand.
The growth of great nation-states, even if democratic, had marginalized the individual. Indeed, the individual was reduced to meaninglessness, with no sense of responsibility, no sense of past and place. In this context, the Agrarian image of a better antebellum South came to represent for Warren a potential source of spiritual revitalization. The past recalled not as a mythical “golden age” but “imaginatively conceived and historically conceived in the strictest readings of the researchers” could be a “rebuke to the present.”
In the end, the history of the Agrarian tradition was shaped by the pressure of the past on this group of southern intellectuals, a past whose legacy included segregation and white supremacy.
The southerner, Ransom wrote in I’ll Take My Stand, “identifies himself with a spot of ground, and this ground carries a good deal of meaning; it defines itself for him as nature.”
This may be so, but the interpretation of this meaning has been the subject of much conflict among southerners, white and black, throughout the century. At the heart of Agrarianism was the question not only of where do I stand, but also, who belongs? And it was not the ground that provided the answers but the human beings who took their stands upon it.
The Rebuke of History: The Southern Agrarians and American Conservative Thought
by Paul V. Murphy
In 1955 the U.S. Supreme Court issued its second Brown v. Board of Education ruling, calling for the dismantling of segregation in public schools with “all deliberate speed.”
Thirty-seven-year-old James McGill Buchanan liked to call himself a Tennessee country boy. No less a figure than Milton Friedman had extolled Buchanan’s potential. As Colgate Whitehead Darden Jr., the president of the University of Virginia, reviewed the document, he might have wondered if the newly hired economist had read his mind. For without mentioning the crisis at hand, Buchanan’s proposal put in writing what Darden was thinking: Virginia needed to find a better way to deal with the incursion on states’ rights represented by Brown.
States’ rights, in effect, were yielding in preeminence to individual rights. It was not difficult for either Darden or Buchanan to imagine how a court might now rule if presented with evidence of the state of Virginia’s archaic labor relations, its measures to suppress voting, or its efforts to buttress the power of reactionary rural whites by underrepresenting the moderate voters of the cities and suburbs of Northern Virginia. Federal meddling could rise to levels once unimaginable.
What the court ruling represented to Buchanan was personal. Northern liberals—the very people who looked down upon southern whites like him, he was sure—were now going to tell his people how to run their society. And to add insult to injury, he and people like him with property were no doubt going to be taxed more to pay for all the improvements that were now deemed necessary and proper for the state to make.
Find the resources, he proposed to Darden, for me to create a new center on the campus of the University of Virginia, and I will use this center to create a new school of political economy and social philosophy. It would be an academic center, rigorously so, but one with a quiet political agenda: to defeat the “perverted form” of liberalism that sought to destroy their way of life, “a social order,” as he described it, “built on individual liberty,” a term with its own coded meaning but one that Darden surely understood. The center, Buchanan promised, would train “a line of new thinkers” in how to argue against those seeking to impose an “increasing role of government in economic and social life.”
Buchanan fully understood the scale of the challenge he was undertaking and promised no immediate results. But he made clear that he would devote himself passionately to this cause.
Buchanan’s team had no discernible success in decreasing the federal government’s pressure on the South all the way through the 1960s and ’70s. But take a longer view—follow the story forward to the second decade of the twenty-first century—and a different picture emerges, one that is both a testament to Buchanan’s intellectual powers and, at the same time, the utterly chilling story of the ideological origins of the single most powerful and least understood threat to democracy today: the attempt by the billionaire-backed radical right to undo democratic governance.
A quest that began as a quiet attempt to prevent the state of Virginia from having to meet national democratic standards of fair treatment and equal protection under the law would, some sixty years later, become the veritable opposite of itself: a stealth bid to reverse-engineer all of America, at both the state and the national levels, back to the political economy and oligarchic governance of midcentury Virginia, minus the segregation.
The goal of all these actions was to destroy our institutions, or at least to change them so radically that they became shadows of their former selves.
This, then, is the true origin story of today’s well-heeled radical right, told through the intellectual arguments, goals, and actions of the man without whom this movement would represent yet another dead-end fantasy of the far right, incapable of doing serious damage to American society.
When I entered Buchanan’s personal office, part of a stately second-floor suite, I felt overwhelmed. There were papers stacked everywhere, in no discernible order. Not knowing where to begin, I decided to proceed clockwise, starting with a pile of correspondence that was resting, helter-skelter, on a chair to the left of the door. I picked it up and began to read. It contained confidential letters from 1997 and 1998 concerning Charles Koch’s investment of millions of dollars in Buchanan’s Center for Study of Public Choice and a flare-up that followed.
Catching my breath, I pulled up an empty chair and set to work. It took me time—a great deal of time—to piece together what these documents were telling me. They revealed how the program Buchanan had first established at the University of Virginia in 1956 and later relocated to George Mason University, the one meant to train a new generation of thinkers to push back against Brown and the changes in constitutional thought and federal policy that had enabled it, had become the research-and-design center for a much more audacious project, one that was national in scope. This project was no longer simply about training intellectuals for a battle of ideas; it was training operatives to staff the far-flung and purportedly separate, yet intricately connected, institutions funded by the Koch brothers and their now large network of fellow wealthy donors. These included the Cato Institute, the Heritage Foundation, Citizens for a Sound Economy, Americans for Prosperity, FreedomWorks, the Club for Growth, the State Policy Network, the Competitive Enterprise Institute, the Tax Foundation, the Reason Foundation, the Leadership Institute, and more, to say nothing of the Charles Koch Foundation and Koch Industries itself.
I learned how and why Charles Koch first became interested in Buchanan’s work in the early 1970s, called on his help with what became the Cato Institute, and worked with his team in various organizations. What became clear is that by the late 1990s, Koch had concluded that he’d finally found the set of ideas he had been seeking for at least a quarter century by then—ideas so groundbreaking, so thoroughly thought-out, so rigorously tight, that once put into operation, they could secure the transformation in American governance he wanted. From then on, Koch contributed generously to turning those ideas into his personal operational strategy to, as the team saw it, save capitalism from democracy—permanently.
In his first big gift to Buchanan’s program, Charles Koch signaled his desire for the work he funded to be conducted behind the backs of the majority. “Since we are greatly outnumbered,” Koch conceded to the assembled team, the movement could not win simply by persuasion. Instead, the cause’s insiders had to use their knowledge of “the rules of the game”—that game being how modern democratic governance works—“to create winning strategies.” A brilliant engineer with three degrees from MIT, Koch warned, “The failure to use our superior technology ensures failure.” Translation: the American people would not support their plans, so to win they had to work behind the scenes, using a covert strategy instead of open declaration of what they really wanted.
Future-oriented, Koch’s men (and they are, overwhelmingly, men) gave no thought to the fate of the historical trail they left unguarded. And thus, a movement that prided itself, even congratulated itself, on its ability to carry out a revolution below the radar of prying eyes (especially those of reporters) had failed to lock one crucial door: the front door to a house that let an academic archive rat like me, operating on a vague hunch, into the mind of the man who started it all.
What animated Buchanan, what became the laser focus of his deeply analytic mind, was the seemingly unfettered ability of an increasingly powerful federal government to force individuals with wealth to pay for a growing number of public goods and social programs they had had no personal say in approving. Better schools, newer textbooks, and more courses for black students might help the children, for example, but whose responsibility was it to pay for these improvements? The parents of these students? Others who wished voluntarily to help out? Or people like himself, compelled through increasing taxation to contribute to projects they did not wish to support? To Buchanan, what others described as taxation to advance social justice or the common good was nothing more than a modern version of mob attempts to take by force what the takers had no moral right to: the fruits of another person’s efforts. In his mind, to protect wealth was to protect the individual against a form of legally sanctioned gangsterism. Where did this gangsterism begin? Not in the way we might have expected him to explain it to Darden: with do-good politicians, aspiring attorneys seeking to make a name for themselves in constitutional law, or even activist judges. It began before that: with individuals, powerless on their own, who had figured out that if they joined together to form social movements, they could use their strength in numbers to move government officials to hear their concerns and act upon them.
The only fact that registered in his mind was the “collective” source of their power—and that, once formed, such movements tended to stick around, keeping tabs on government officials and sometimes using their numbers to vote out those who stopped responding to their needs. How was this fair to other individuals? How was this American?
Even when conservatives later gained the upper hand in American politics, Buchanan saw his idea of economic liberty pushed aside. Richard Nixon expanded government more than his predecessors had, with costly new agencies and regulations, among them a vast new Environmental Protection Agency. George Wallace, a candidate strongly identified with the South and with the right, nonetheless supported public spending that helped white people. Ronald Reagan talked the talk of small government, but in the end, the deficit ballooned during his eight years in office.
Had there not been someone else as deeply frustrated as Buchanan, as determined to fight the uphill fight, but in his case with much keener organizational acumen, the story this book tells would no doubt have been very different. But there was. His name was Charles Koch. An entrepreneurial genius who had multiplied the earnings of the corporation he inherited by a factor of at least one thousand, he, too, had an unrealized dream of liberty, of a capitalism all but free of governmental interference and, at least in his mind, thus able to achieve the prosperity and peace that only this form of capitalism could produce. The puzzle that preoccupied him was how to achieve this in a democracy where most people did not want what he did.
Ordinary electoral politics would never get Koch what he wanted. Passionate about ideas to the point of obsession, Charles Koch had worked for three decades to identify and groom the most promising libertarian thinkers in hopes of somehow finding a way to break the impasse. He subsidized and at one point even ran an obscure academic outfit called the Institute for Humane Studies in that quest. “I have supported so many hundreds of scholars” over the years, he once explained, “because, to me, this is an experimental process to find the best people and strategies.”
The goal of the cause, Buchanan announced to his associates, should no longer be to influence who makes the rules, to vest hopes in one party or candidate. The focus must shift from who rules to changing the rules. For liberty to thrive, Buchanan now argued, the cause must figure out how to put legal, indeed constitutional, shackles on public officials, shackles so powerful that no matter how sympathetic these officials might be to the will of majorities, no matter how concerned they were with their own reelections, they would no longer have the ability to respond to those who used their numbers to get government to do their bidding. There was a second, more diabolical aspect to the solution Buchanan proposed, one that we can now see influenced Koch’s own thinking: once these shackles were put in place, they had to be binding and permanent. The only way to ensure that the will of the majority could no longer influence representative government on core matters of political economy was through what he called “constitutional revolution.”
By the late 1990s, Charles Koch realized that the thinker he was looking for, the one who understood how government became so powerful in the first place and how to take it down in order to free up capitalism—the one who grasped the need for stealth, because only piecemeal, yet mutually reinforcing, assaults on the system would survive the prying eyes of the media—was James Buchanan.
The Koch team’s most important stealth move, and the one that proved most critical to success, was to wrest control over the machinery of the Republican Party, beginning in the late 1990s and with sharply escalating determination after 2008. From there it was just a short step to lay claim to being the true representatives of the party, declaring all others RINOs—Republicans in name only. But while these radicals of the right operate within the Republican Party and use that party as a delivery vehicle, make no mistake about it: the cadre’s loyalty is not to the Grand Old Party or its traditions or standard-bearers. Their loyalty is to their revolutionary cause.
Our trouble in grasping what has happened comes, in part, from our inherited way of seeing the political divide. Americans have been told for so long, from so many quarters, that political debate can be broken down into conservative versus liberal, pro-market versus pro-government, Republican versus Democrat, that it is hard to recognize that something more confounding is afoot, a shrewd long game blocked from our sight by these stale classifications.
The Republican Party is now in the control of a group of true believers for whom compromise is a dirty word. Their cause, they say, is liberty. But by that they mean the insulation of private property rights from the reach of government, and the takeover of what was long public (schools, prisons, western lands, and much more) by corporations, a system that would radically reduce the freedom of the many. In a nutshell, they aim to hollow out democratic resistance. And by its own lights, the cause is nearing success.
The 2016 election looked likely to bring a big presidential win with across-the-board benefits. The donor network had so much money and power at its disposal as the primary season began that every single Republican presidential front-runner was bowing to its agenda. Not one of them would admit that climate change was a real problem; all held that guns were good, and the more widely distributed, the better. Every one of them attacked public education and teachers’ unions and advocated more charter schools and even tax subsidies for religious schools. All called for radical changes in taxation and government spending. Each one claimed that Social Security and Medicare were in mortal crisis and that individual retirement and health savings accounts, presumably to be invested with Wall Street firms, were the best solution.
Although Trump himself may not fully understand what his victory signaled, it put him between two fundamentally different, and opposed, approaches to political economy, with real-life consequences for us all. One was in its heyday when Buchanan set to work. In economics, its standard-bearer was John Maynard Keynes, who believed that for a modern capitalist democracy to flourish, all must have a share in the economy’s benefits and in its governance. Markets had great virtues, Keynes knew—but also significant built-in flaws that only government had the capacity to correct.
As a historian, I know that his way of thinking, as implemented by elected officials during the Great Depression, saved liberal democracy in the United States from the rival challenges of fascism and Communism in the face of capitalism’s most cataclysmic collapse. And that it went on to shape a postwar order whose operating framework yielded ever more universal hope that, by acting together and levying taxes to support shared goals, life could be made better for all.
The most starkly opposed vision is that of Buchanan’s Virginia school. It teaches that all such talk of the common good has been a smoke screen for “takers” to exploit “makers,” in the language now current, using political coalitions to “vote themselves a living” instead of earning it by the sweat of their brows. Where Milton Friedman and F. A. Hayek allowed that public officials were earnestly trying to do right by the citizenry, even as they disputed the methods, Buchanan believed that government failed because of bad faith: because activists, voters, and officials alike used talk of the public interest to mask the pursuit of their own personal self-interest at others’ expense. His was a cynicism so toxic that, if widely believed, it could eat like acid at the foundations of civic life. And he went further by the 1970s, insisting that the people and their representatives must be permanently prevented from using public power as they had for so long. Manacles, as it were, must be put on their grasping hands.
Is what we are dealing with merely a social movement of the right whose radical ideas must eventually face public scrutiny and rise or fall on their merits? Or is this the story of something quite different, something never before seen in American history? Could it be—and I use these words quite hesitantly and carefully—a fifth-column assault on American democratic governance?
The term “fifth column” has been applied to stealth supporters of an enemy who assist by engaging in propaganda and even sabotage to prepare the way for its conquest.
This cause is different. Pushed by relatively small numbers of radical-right billionaires and millionaires who have become profoundly hostile to America’s modern system of government, an apparatus decades in the making, funded by those same billionaires and millionaires, has been working to undermine the normal governance of our democracy. Indeed, one such manifesto calls for a “hostile takeover” of Washington, D.C. That hostile takeover maneuvers very much like a fifth column, operating in a highly calculated fashion, more akin to an occupying force than to an open group engaged in the usual give-and-take of politics. The size of this force is enormous. The social scientists who have led the scholarly research on the Koch network write that it “operates on the scale of a national U.S. political party” and employs more than three times as many people as the Republican committees had on their payrolls in 2015.
For all its fine phrases, what this cause really seeks is a return to oligarchy, to a world in which both economic and effective political power are to be concentrated in the hands of a few. It would like to reinstate the kind of political economy that prevailed in America at the opening of the twentieth century, when the mass disfranchisement of voters and the legal treatment of labor unions as illegitimate enabled large corporations and wealthy individuals to dominate Congress and most state governments alike, and to feel secure that the nation’s courts would not interfere with their reign. The first step toward understanding what this cause actually wants is to identify the deep lineage of its core ideas. And although its spokespersons would like you to believe they are disciples of James Madison, the leading architect of the U.S. Constitution, it is not true.
Their intellectual lodestar is John C. Calhoun. He developed his radical critique of democracy a generation after the nation’s founding, as the brutal economy of chattel slavery became entrenched in the South, and his vision horrified Madison.
Democracy in Chains: The Deep History of the Radical Right’s Stealth Plan for America
by Nancy MacLean
Nancy K. MacLean is an American historian. She is the William H. Chafe Professor of History and Public Policy at Duke University and the author of numerous books and articles on various aspects of twentieth-century United States history.