Britain lost its AAA credit rating from Moody's last week, amidst much wailing from the government and cries of triumph from the opposition that had been largely responsible for the country's declining creditworthiness. However, since the United States lost its AAA from Standard & Poor's in 2011, with no visible adverse effects on its bond issuance, the schadenfreude seems a little overdone. In any case, the departure of France, Britain and the United States from the AAA ranks raises an important question: in this age of democratically mandated social expenditures, does any country deserve a AAA rating?
National credit ratings for rich countries are fairly recent; the United Kingdom got its Moody’s rating only in 1978. At that time Venezuela, more or less as badly run as it is today, albeit with rather less international debt, was rated AAA because of its oil reserves. With that precedent, there was little question of giving the U.K. a rating below AAA, even though the country had come close to bankruptcy only two years earlier.
Before that time, rich countries borrowed domestically more or less at will – it wasn't thought necessary for them to get a certification from a private rating agency. Internationally, credit availability for them was by no means infinite. Britain, generally regarded as the world's solidest credit, came near to exhausting the international market's credit availability before the U.S. entry into World War I in 1917, found itself unable to borrow money from J.P. Morgan during the "gold standard" crisis of 1931, and acted as if the private capital markets were closed to it in 1944-45, although in fact private finance was a possible alternative never properly explored (more on that next week, when my subject will be the follies of the 1944 Bretton Woods Agreement).
Under the pre-1914 Gold Standard, money markets were pretty unforgiving to gold standard countries. If you ran your economy too sloppily or borrowed too much, a massive gold outflow would occur, and short-term interest rates would rise into the double digits, bringing about a wrenching recession. Your only alternative to suffering economic pain was to leave the gold standard and admit that your economic management was a washout. At that point, international markets would be unlikely to lend you money anyway, so you still suffered the economic pain. The United States suffered a deep and painful depression in the 1840s, for example, after several states defaulted following the wind-up of the Second Bank of the United States. The U.S. railroad boom was stopped in its tracks until after 1850, because international credit markets distrusted American paper. Fortunately, the Federal government was running surpluses, or close to them, throughout this period, so that its credit never came into serious question.
Default in those days meant default, and the only way you could borrow money thereafter was to have a revolution and install a competent government. France defaulted during the Napoleonic Wars, but was able to borrow after 1815 (a famous operation led by Barings) because with Allied help it had thrown out the mountebank Napoleon and restored order in the form of Louis XVIII and the Duc de Richelieu. However, the Russian defaults after 1917 resulted in the country being barred from private international capital markets until the late 1960s.
It was only with the fiat money regime after World War II that the theory grew up that “countries don’t go bust,” in the words of former Citicorp Chairman Walter Wriston. Countries could print as much domestic currency as they needed to finance their needs, while internationally there was the International Monetary Fund, whose sole purpose initially was to provide emergency funding to countries that had messed up their finances and so couldn’t tap conventional bond markets.
Emerging markets in Latin America continued to go bankrupt, but rich countries were able to extend their debt more or less ad infinitum, suffering a bout of inflation that hurt domestic savers (especially if they were subject to high post-World War II rates of income tax) but imposed very few penalties on the erring government concerned. This policy of "repression" was pursued successfully and disgracefully by Britain after World War II (exchange controls, preventing the middle classes from getting their money out, were an important additional engine of financial waterboarding). The policy thus enabled Britain to reduce its debt from 250% of GDP to normal levels by the 1970s, at the cost of impoverishing millions of Britons. However, no formal default occurred.
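The arithmetic of that debt reduction is worth a brief illustration. The sketch below, in Python, uses assumed round numbers rather than actual post-war British data: if nominal yields are capped a few points below inflation year after year, bondholders are repaid every penny they are owed while the debt quietly shrinks relative to GDP.

# Illustrative sketch of financial repression, using assumed round numbers
# rather than actual post-war British figures. Bondholders receive every
# nominal payment, but yields capped below inflation erode the debt/GDP ratio.

def repression_path(debt_to_gdp, nominal_yield, inflation, real_growth, years):
    """Return the debt/GDP ratio at the end of each year of rolling the debt over."""
    path = []
    for _ in range(years):
        nominal_gdp_growth = (1 + inflation) * (1 + real_growth) - 1
        # Debt compounds at the capped yield; nominal GDP grows with inflation plus real growth.
        debt_to_gdp *= (1 + nominal_yield) / (1 + nominal_gdp_growth)
        path.append(debt_to_gdp)
    return path

# Assumed figures: debt at 250% of GDP, 3% capped yields, 6% inflation, 2% real growth.
path = repression_path(debt_to_gdp=2.50, nominal_yield=0.03,
                       inflation=0.06, real_growth=0.02, years=25)
print(f"Debt/GDP after 25 years: {path[-1]:.0%}")  # roughly 75% of GDP

On those assumed numbers, a quarter-century of mildly negative real yields takes the ratio from 250% of GDP to roughly 75%, without a single missed payment – which is the whole point of the technique.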
The financial crisis of 2008 and the global response thereto have put us in a new world. No longer are governments compelled to behave themselves by the strictures of the Gold Standard – Maynard Keynes, who had all sorts of ways in which he wanted governments to spend money, was right from his own viewpoint to assist in the destruction of that admirable system. On the other hand, we are also not in a world in which exchange controls can be imposed, forcing residents to keep their money in the domestic economy, where it can be looted by the government.
More important, the traditional constraints, or rather self-restraints, followed by government policy in the 1945-2008 period have gone by the wayside. Whereas François Mitterrand's profligate socialist government of 1981-86 in France was pretty quickly brought to heel by a collapse in the value of the franc, today all governments have discovered the joys of budget deficits and "quantitative easing." Governments like those of South Korea and Australia that are still pursuing more or less traditional fiscal and monetary policies are finding their exporters heavily disadvantaged in the international market by an inexorable rise in their currencies against those of their sloppier neighbors.
In the long run, and we're talking 5 years, not 20, the profligate governments' finances must collapse. We are seeing a preview of the effect of such a collapse on their domestic economies in the troubles of the Eurozone PIIGS. Government austerity sufficient to rebalance budgets will be deeply unpopular with the electorate, which is used to an expansive Santa Claus government, and will produce politically pathological election results such as the recent one in Italy. Thus government finances will accelerate out of control.
At that point, if the local central bank remains stimulative, outright default may be avoided; the economy will simply suffer a Weimar spiral of inflation, with the government financing itself primarily by central bank money printing. Technically, no default will occur; the government's bonds will merely become worthless, as did those of the pre-war Reich in the Weimar period. If your idea of a AAA investment is one that loses all its value within a decade or so while continuing to pay its now-trivial nominal obligations, then Britain and the United States will technically remain not just AA plus but AAA. However, I would suggest that the credit rating concerned, while technically correct, will not have given investors any useful information in that event.
The Weimar inflation of 1923-24 wiped out German middle-class savings and indirectly led to the rise of Hitler a decade later. From the point of view of Germany's people as well as its creditors, it represented the worst possible outcome. It is thus possible that the central banks of countries that are running into trouble will limit their "quantitative easing," thereby forcing the governments to service their debts in more or less real money.
In that case, a full default is unlikely because the country's economy will remain reasonably prosperous. However, years of deficit finance will have left the country's debt at an excessive multiple of GDP, and central bank restraint, pushing interest rates above the level of inflation, will make the debt impossible to service. As has recently happened in Greece, creditors will then be forced to take a write-down of the country's obligations – which is graded, correctly, as a partial default by the rating agencies.
Such a partial default will prove the credit rating of AAA or even AA plus to have been wrong. Ironically, a partial default will be the best outcome for the government, its people and its creditors; far more wealth will be preserved than in a hyperinflation. Yet the hyperinflation will validate the credit rating, while the partial default will prove it in error.
History suggests pretty strongly that the default-free experience of British creditors since 1672 and U.S. federal government creditors since 1790 is anomalous, and that government defaults are relatively likely occurrences, even among rich countries. They become especially probable when, as at present, those governments engage in self-deluding fiscal and monetary experimentation.
For the reasons outlined above, modern governments, with electorates to bribe, should never be rated AAA. After all, they generally run at a loss year by year. There are AAA-quality credits, but they are corporate: companies with strong businesses diversified both geographically and by product line, prudent management and low debt. Believe it or not, there are 82 public companies in the United States that have increased their dividends every year for more than 30 years, with the longest such track record stretching back to 1954.
Companies such as Procter & Gamble (NYSE:PG) and Emerson Electric (NYSE:EMR) have well-balanced global businesses with a wide range of stable products, and have increased their dividends every year since 1954 and 1957 respectively. Also, importantly, they are probably too big to be taken over, or in the case of Emerson (which has had only 3 CEOs in 60 years) too well liked by their shareholders. (Takeovers are death to debt holders, because they are generally financed by loading up on leverage, wrecking the quality of outstanding debt.) However, Moody's rates Procter & Gamble only Aa3 and Emerson A2.
That's ridiculous. For any debt with a maturity of less than a century (a longer horizon could find the companies overwhelmed by strategic change), Procter & Gamble and Emerson should be rated AAA. Unlike governments, they are conservatively run with an emphasis on long-term value. Unlike governments, they make profits. And unlike governments, they aren't meaningfully subject to elections.
-0-
(The Bear’s Lair is a weekly column that is intended to appear each Monday, an appropriately gloomy day of the week. Its rationale is that the proportion of “sell” recommendations put out by Wall Street houses remains far below that of “buy” recommendations. Accordingly, investors have an excess of positive information and very little negative information. The column thus takes the ursine view of life and the market, in the hope that it may be usefully different from what investors see elsewhere.)