In fiscal year 2000, the federal government consumed about 18.2 percent of U.S. GDP, while state and local government consumed a further 11.5 percent, for a total of 29.7 percent. This is significantly lower than the level prevailing in 1975, and has therefore been a cause of much optimism about the economy. However, the long-term historical picture is very different.
The last time the stock market was as overvalued as it is currently, U.S. federal public expenditure was three percent of GDP, while state and local public expenditure totaled 7.5 percent. Total public spending was therefore 10.5 percent of GDP, in line with the historic rule that had applied in every society where such things were measured: if public spending exceeds 10 percent of GDP, national bankruptcy and misery will be the result.
Of course, that was in 1929.
It’s an interesting historical fact, nevertheless. Lord Liverpool, Britain’s least remembered major Prime Minister and one of her best, won the Napoleonic Wars with public spending that, as far as we can judge, never exceeded 11-12 percent of GDP — this is one of the reasons why Jane Austen’s novels don’t mention the wars. Louis XVI lost control of France after the American Revolutionary War, with public spending at only around nine percent of GDP — French tax collection was a great deal less effective than Britain’s, because the rich were largely exempt. Through most of the 19th Century, British public spending declined as a percentage of GDP, in spite of the temptation offered by Sir Robert Peel’s reintroduction of the income tax in 1842. Even in 1913, by which time Britain had old-age pensions and unemployment insurance, public spending was still a fraction below 10 percent of GDP.
U.S. public spending, in both North and South, rose above 10 percent of GDP during the Civil War, but only briefly. The Confederacy went bankrupt, while public spending in the victorious North sank far below 10 percent of GDP in the years after the war, to the extent that the national debt was close to being paid off by 1890 — a calamity prevented only by the profligacy of the “Billion-Dollar Congress” of 1889-91. U.S. public spending, around 7.5 percent of GDP in 1900, rose during World War I to about 15 percent of GDP at the peak, but it was reduced again by the 1920s boom and the fiscally careful policies of Treasury Secretary Andrew Mellon.
As the nineteenth century wore on, with the rise of good communications and accurate record-keeping, governments found themselves able to mulct their taxpayers more effectively than ever before. From about 1860, instantaneous communication by telegraph and rapid transportation by rail, together in Britain with the Victorian administrative reforms, made it possible for bureaucrats to monitor incomes and expenditures far more closely, and to collect a larger share of them in taxes. The Industrial Revolution also played a part in this change, since it left even the poorest taxpayers receiving cash income rather than subsistence agricultural produce.
In Britain, the first great breakthrough came with World War I, when the top rate of income tax rose to 52.5 percent. After the war, the Lloyd George government of 1918-22 — accused at the time of being a bunch of greedy war profiteers but actually led by a leftist Welshman — proved remarkably fertile in devising new social programs and benefits, which were then consolidated by the fearful Baldwin regime in the 1920s. Consequently, spending remained far above its pre-1914 level.
In World War II, wimpy inhibitions about income tax levels were cast aside; the top rate rose to 97.5 percent. This time, in an effort to remove the profiteers, the electorate installed an out-and-out socialist government, which proceeded to increase public spending to around 35 percent of GDP and introduce a National Health Service. Public spending continued to creep up until 1979, when Margaret Thatcher, the heroic scourge of the public sector, decreased it over 10 years from 43 percent of GDP to 39 percent, an advance that was wholly lost by the Major government’s pre-election spending spree in 1991-92. Currently, after eight years of economic boom, public spending is just over 40 percent of GDP, at which level it is the second lowest in the European Union.
In the United States, the first big increases came under the New Deal, which took its inspiration partly from Keynes, who in turn had been instrumental in the British spending increases of 1914-22. World War II and the Cold War played a role, as did the Great Society. By 1975, public spending was somewhat above its present level, with federal spending representing 21.3 percent of GDP and state and local spending 12.3 percent, for a total of 33.6 percent. The drop since then, while significant, has hardly been revolutionary.
While the technological advances of the nineteenth and twentieth centuries made the breakthrough in public spending above the 10 percent level possible, they didn’t make it inevitable: after all, in advanced countries, little more tax-collecting technology was available in 1940 than in 1860. Had Britain not entered World War I, Keynes would presumably have remained an obscure India Office bureaucrat, and the revolution in public spending might never have occurred — nor, indeed, would the British Empire have been lost, unless Keynes at the India Office had managed to lose that instead.
It is not at all clear what the taxpayer gets for this expenditure. Government spending to relieve the plight of the truly poor, ballyhooed as a social necessity when the spending was imposed, is in fact a small proportion of the total — certainly no more than 3-4 percent of GDP. Defense, while a key need during the Cold War, need be no larger now than in 1900, and indeed is barely larger today than in 1900 if you take Britain and the U.S. together (the U.S. has inherited Britain’s “world policeman” role, and the need for greater expenditure that flows from it).
The big increases in outlays are for health care, education and retirement programs, all of which were provided largely through the private sector in 1900. Today, all three involve a large government role. The case of U.S. health care demonstrates the inefficiency of this. Even though the government’s role in U.S. health care is smaller than in many other countries, the overall cost, at over 14 percent of GDP, is monstrously high. Yet U.S. life expectancy remains lower than in countries with much lower health care costs.
This exorbitant cost arises because private health care provision is made very largely through insurance schemes, and both government and insurance schemes involve the same problem of third-party provision: the party paying the bill is not the party receiving the service.
Likewise, U.S. education suffers from the fact that the providers of the service are not paid directly by the users, but by a state bureaucracy. Retirement provision also suffers from the fact that investment of the funds saved for retirement is left not to individuals but to the state, and investment returns are accordingly depressed.
Health care, education and retirement are all costs that would in any case have to be borne by the individual; it is thus quite clear that their provision is made drastically less efficient and less user-friendly when the cost is instead borne by the government (or by a third-party insurance company).
Hence, at a minimum, a system like Singapore’s Central Provident Fund is needed. This compels individuals to save a substantial portion of their income, but leaves them able to direct the investment of the proceeds, and to spend them as they wish on medical care, retirement and, in the Singaporean model, education. Such a system would remove the problem of third-party provision, and render these three services as efficient as those in the rest of the private sector.
Even the Singaporean system, however, represents an unhealthy intrusion of the “nanny state” into the private lives of its citizens. The mere fact that the state has the technological ability to do something does not make it desirable to do. After all, the advent of the Internet will very soon make it possible for the state to spy on its citizens’ intimate activities. Yet, except for the very few who regard ‘1984’ as a Utopian fantasy, nobody suggests that it should do so.
Let us instead resolve that the 21st Century will reverse the great mistake of the 20th, and return public spending to its historically sanctioned level of 10 percent of GDP, a level at which Big Government dies and freedom is reborn.
(The Bear’s Lair is a weekly column that is intended to appear each Monday, an appropriately gloomy day of the week. Its rationale is that the proportion of “sell” recommendations put out by Wall Street houses remains far below that of “buy” recommendations. Accordingly, investors have an excess of positive information and very little negative information. The column thus takes the ursine view of life and the market, in the hope that it may be usefully different from what investors see elsewhere.)
This article originally appeared on United Press International.