If social policy were medicine, and countries were the patients, the United States today would be a post-surgical charge under observation after an ambitious and previously untested transplant operation. Surgeons have grafted a foreign organ — the European welfare state — into the American body. The transplanted organ has thrived — in fact, it has grown immensely. The condition of the patient, however, is another question altogether. The patient’s vital signs have not responded entirely positively to this social surgery; in fact, by some important metrics, the patient’s post-operative behavior appears to be impaired. And, like many other transplant patients, this one seems to have undergone a disturbing change in mood, even personality, as a consequence of the operation.
The modern welfare state has a distinctly European pedigree. Naturally enough, the architecture of the welfare state was designed and developed with European realities in mind, the most important of which were European beliefs about poverty. Thanks to their history of Old World feudalism, with its centuries of rigid class barriers and attendant lack of opportunity for mobility based on merit, Europeans held a powerful, continentally pervasive belief that ordinary people who found themselves in poverty or need were effectively stuck in it — and, no less important, that they were stuck through no fault of their own, but rather by an accident of birth. (Whether this belief was entirely accurate is another story, though beside the point: This was what people perceived and believed, and at the end of the day those perceptions shaped the formation and development of Europe’s welfare states.) The state provision of old-age pensions, unemployment benefits, and health services — along with official family support and other household-income guarantees — served a multiplicity of purposes for European political economies, not the least of which was to assuage voters’ discontent with the perceived shortcomings of their countries’ social structures through a highly visible and explicitly political mechanism for broadly based and compensatory income redistribution.
But America’s historical experience has been rather different from Europe’s, and from the earliest days of the great American experiment, people in the United States exhibited strikingly different views from their trans-Atlantic cousins on the questions of poverty and social welfare. These differences were noted both by Americans themselves and by foreign visitors, not least among them Alexis de Tocqueville, whose conception of American exceptionalism was heavily influenced by the distinctive American worldview on such matters. Because America had no feudal past and no lingering aristocracy, poverty was not viewed as the result of an unalterable accident of birth but instead as a temporary challenge that could be overcome with determination and character — with enterprise, hard work, and grit. Rightly or wrongly, Americans viewed themselves as masters of their own fate, intensely proud because they were self-reliant.
To the American mind, poverty could never be regarded as a permanent condition for anyone in any stratum of society because of the country’s boundless possibilities for individual self-advancement. Self-reliance and personal initiative were, in this way of thinking, the critical factors in staying out of need. Generosity, too, was very much a part of that American ethos; the American impulse to lend a hand (sometimes a very generous hand) to neighbors in need of help was ingrained in the immigrant and settler traditions. But thanks to a strong underlying streak of Puritanism, Americans reflexively parsed the needy into two categories: what came to be called the deserving and the undeserving poor. To assist the former, the American prescription was community-based charity from its famously vibrant “voluntary associations.” The latter — men and women judged responsible for their own dire circumstances due to laziness, or drinking problems, or other behavior associated with flawed character — were seen as mainly needing assistance in “changing their ways.” In either case, charitable aid was typically envisioned as a temporary intervention to help good people get through a bad spell and back on their feet. Long-term dependence upon handouts was “pauperism,” an odious condition no self-respecting American would readily accept.
The American mythos, in short, offered less than fertile soil for cultivating a modern welfare state. This is not to say that the American myth of unlimited opportunity for the rugged individualist always conformed to the facts on the ground. That myth rang hollow for many Americans — most especially for African-Americans, who first suffered for generations under slavery and thereafter endured a full century of officially enforced discrimination, as well as other barriers to self-advancement. Though the facts certainly did not always fit the ideal, the American myth was so generally accepted that the nation displayed an enduring aversion to all the trappings of the welfare state, and put up prolonged resistance to their establishment on our shores.
Over the past several decades, however, something fundamental has changed. The American welfare state today transfers over 14% of the nation’s GDP to the recipients of its many programs, and over a third of the population now accepts “need-based” benefits from the government. This is not the America that Tocqueville encountered. To begin to appreciate the differences, we need to understand how Americans’ relationship to the welfare state has changed, and with it, the American character itself.
AN AMERICAN REVOLUTION
The road to our modern welfare state traces its way through northern Europe, most notably through Bismarck’s social-insurance legislation in late 19th-century Germany, Sweden’s pioneering “social democracy” policies during the interwar period, and Britain’s 1942 “Beveridge Report,” which offered the embattled nation a vision of far-reaching and generous social-welfare guarantees after victory.
Over the first three decades of the 20th century, while welfare programs were blossoming in Europe, in the United States the share of the national output devoted to public-welfare spending (pensions, unemployment, health, and all the rest) not only failed to rise but apparently declined. The ratio of government social outlays to GDP actually appears to have been lower in 1930 than it was in 1890, due in part to the death of Civil War veterans (of the Union army) and their dependents who had been receiving pensions. Thirty-six European and Latin American countries — many of which lagged far behind the U.S. in terms of educational attainment and socioeconomic development — already had put in place nationwide “social insurance” systems for old-age pensions by the time the United States passed the Social Security Act in 1935, establishing our first federal legislation committing Washington to providing public benefits for the general population.
Suffice it to say, the United States arrived late to the 20th century’s entitlement party, and the hesitance to embrace the welfare state lingered on well after the Depression. As recently as the early 1960s, the “footprint” left on America’s GDP by the welfare state was not dramatically larger than it had been under Franklin Roosevelt — or Herbert Hoover, for that matter. In 1961, at the start of the Kennedy Administration, total government entitlement transfers to individual recipients accounted for a little less than 5% of GDP, as opposed to 2.5% of GDP in 1931 just before the New Deal. In 1963 — the year of Kennedy’s assassination — these entitlement transfers accounted for about 6% of total personal income in America, as against a bit less than 4% in 1936.
During the 1960s, however, America’s traditional aversion to the welfare state and all its works largely collapsed. President Johnson’s “War on Poverty” (declared in 1964) and his “Great Society” pledge of the same year ushered in a new era for America, in which Washington finally commenced in earnest the construction of a massive welfare state. In the decades that followed, America not only markedly expanded provision for current or past workers who qualified for benefits under existing “social insurance” arrangements (retirement, unemployment, and disability), it also inaugurated a panoply of nationwide programs for “income maintenance” (food stamps, housing subsidies, Supplemental Security Income, and the like) where eligibility turned not on work history but on officially designated “poverty” status. The government also added health-care guarantees for retirees and the officially poor, with Medicare, Medicaid, and their accompaniments. In other words, Americans could claim, and obtain, an increasing trove of economic benefits from the government simply by dint of being citizens; they were now incontestably entitled under law to some measure of transferred public bounty, thanks to our new “entitlement state.”
The expansion of the American welfare state remains very much a work in progress; the latest addition to that edifice is, of course, the Affordable Care Act. Despite its recent decades of rapid growth, the American welfare state may still look modest in scope and scale compared to some of its European counterparts. Nonetheless, over the past two generations, the remarkable growth of the entitlement state has radically transformed both the American government and the American way of life itself. It is not too much to call those changes revolutionary.
The impact on the federal government has been revolutionary in the literal meaning of the term, in that the structure of state spending has been completely overturned within living memory. Over the past half-century, social-welfare-program payments and subventions have mutated from a familiar but nonetheless decidedly limited item on the federal ledger into its dominant and indeed most distinguishing feature. The metamorphosis is underscored by estimates from the Bureau of Economic Analysis, the unit in the federal government that calculates GDP and other elements of our national accounts. According to BEA figures, official transfers of money, goods, and services to individual recipients through social-welfare programs accounted for less than one federal dollar in four (24%) in 1963. (And, to go by BEA data, that share was not much higher than what it had been in 1929.) But by 2013, roughly three out of every five federal dollars (59%) were going to social-entitlement transfers. The still-shrinking residual — barely two budgetary dollars in five, at this writing — is now left to apply to all the remaining purposes of the federal government, including the considerable bureaucratic costs of overseeing the various transfer programs themselves.
Thus did the great experiment begun in the Constitution devolve into an entitlements machine — at least, so far as daily operations, budgetary priorities, and administrative emphases are concerned. Federal politics, correspondingly, are now in the main the politics of entitlement programs — activities never mentioned in the Constitution or its amendments.
THE ROAD TO WELFARE
Scarcely less revolutionary has been the remolding of daily life for ordinary Americans under the shadow of the entitlement state. Over the half-century between 1963 and 2013, entitlement transfers were the fastest growing source of personal income in America — expanding at twice the rate for real per capita personal income from all other sources, in fact. Relentless, exponential growth of entitlement payments recast the American family budget over the course of just two generations. In 1963, these transfers accounted for less than one out of every 15 dollars of overall personal income; by 2013, they accounted for more than one dollar out of every six.
The explosive growth of entitlement outlays, of course, was accompanied by a corresponding surge in the number of Americans who would routinely apply for, and accept, such government benefits. Despite episodic attempts to limit the growth of the welfare state or occasional assurances from Washington that “the era of big government is over,” the pool of entitlement beneficiaries has apparently grown almost ceaselessly. The qualifier “apparently” is necessary because, curiously enough, the government did not actually begin systematically tracking the demographics of America’s “program participation” until a generation ago. Such data as are available, however, depict a sea change over the past 30 years.
By 2012, the most recent year for such figures at this writing, Census Bureau estimates indicated that more than 150 million Americans, or a little more than 49% of the population, lived in households that received at least one entitlement benefit. Since under-reporting of government transfers is characteristic for survey respondents, and since administrative records suggest the Census Bureau’s own adjustments and corrections do not completely compensate for the under-reporting problem, this likely means that America has already passed the symbolic threshold where a majority of the population is asking for, and accepting, welfare-state transfers.
Between 1983 and 2012, by Census Bureau estimates, the percentage of Americans “participating” in entitlement programs jumped by nearly 20 percentage points. One might at first assume that the upsurge was largely due to the graying of the population and the consequent increase in the number of beneficiaries of Social Security and Medicare, entitlement programs designed to help the elderly. But that is not the case. Over the period in question, the share of Americans receiving Social Security payments increased by less than three percentage points — and by less than four points for those availing themselves of Medicare. Less than one-fifth of that 20-percentage-point jump can be attributed to increased reliance on these two “old age” programs.
Overwhelmingly, the growth in claimants of entitlement benefits has stemmed from an extraordinary rise in “means-tested” entitlements. (These entitlements are often called “anti-poverty programs,” since the criterion for eligibility is an income below some designated multiple of the officially calculated poverty threshold.) By late 2012, more than 109 million Americans lived in households that obtained one or more such benefits — over twice as many as received Social Security or Medicare. The population of what we might call “means-tested America” was more than two-and-a-half times as large in 2012 as it had been in 1983. Over those intervening years, there was population growth to be sure, but not enough to explain the huge increase in the share of the population receiving anti-poverty benefits. The total U.S. population grew by almost 83 million, while the number of people accepting means-tested benefits rose by 67 million — an astonishing trajectory, implying a growth of the means-tested population of 80 persons for each 100-person increase in national population over that interval.
In the mid-1990s, during the Clinton era, Congress famously passed legislation to rein in one notorious entitlement program: Aid to Families with Dependent Children. Established under a different name as part of the 1935 Social Security Act, AFDC was a Social Security program originally intended to support the orphaned children of deceased workers; it was subsequently diverted to supporting children from broken homes and eventually the children of unwed mothers. By the 1980s, the great majority of children born to never-married mothers were AFDC recipients, and almost half of AFDC recipients were the children of never-married mothers. The program’s design seemed to create incentives against marriage and against work, and it was ultimately determined by bipartisan political consensus that such an arrangement must not continue. So with the welfare reforms of the 1990s, AFDC was changed to TANF — Temporary Assistance for Needy Families — and eligibility for benefits was indeed restricted. By 2012, the fraction of Americans in homes obtaining AFDC/TANF aid was less than half of what it had been in 1983.
The story of AFDC/TANF, however, is a one-off, a major exception to the general trend. Over the same three decades, the rolls of claimants receiving food stamps (a program that was officially rebranded the Supplemental Nutrition Assistance Program, or SNAP, in 2008 because of the stigma the phrase had acquired) jumped from 19 million to 51 million. By 2012 almost one American in six lived in a home enrolled in the SNAP program. The ranks of Medicaid, the means-tested national health-care program, increased by over 65 million between 1983 and 2012, and now include over one in four Americans. And while the door to means-tested cash benefits from the Social Security program through AFDC/TANF had been partly (though not entirely) closed, a much larger window for such benefits was simultaneously thrown open in the form of Supplemental Security Income, a program intended to provide income for the disabled poor. Between 1983 and 2012, the number of Americans in households receiving Federal SSI more than sextupled; by 2012, over 20 million people were counted as dependents of the program.
All told, more than 35% of Americans were taking home at least some benefits from means-tested programs by 2012 — nearly twice the share in 1983. Some may be tempted to blame such an increase on increasingly widespread material hardship. It is true that the American economy in 2012 was still recovering from the huge global crash of 2008, and unemployment levels were still painfully high: 8.1% for the year as a whole. But 1983 was a recovery year for the U.S. economy, too; the recession of 1981 and 1982 was the most severe in postwar American history up to that point, and the unemployment rate in 1983 was 9.6%, even higher than in 2012.
By the same token, although the official poverty rate was almost identical for the two years — the total population estimated to be below the official poverty line was 15.2% in 1983 and 15.0% in 2012 — the proportion of Americans drawing means-tested benefits was dramatically higher in 2012. By 2012, there was no longer any readily observable correspondence between the officially designated condition of poverty and the recipience of “anti-poverty” entitlements. In that year, the number of people taking home means-tested benefits was more than twice the number of those living below the poverty line — meaning a decisive majority of recipients of such aid were the non-poor. In fact, by 2012 roughly one in four Americans above the poverty line was receiving at least one means-tested benefit.
How could this be? America today is almost certainly the richest society in history, anywhere at any time. And it is certainly more prosperous and productive now (and in 2012) than it was three decades ago. Yet paradoxically, our entitlement state behaves as if Americans have never been more “needy.” The paradox is easily explained: Means-tested entitlement transfers are no longer an instrument strictly for addressing absolute poverty, but instead a device for a more general redistribution of resources. And the fact that so many are willing to accept need-based aid signals a fundamental change in the American character.
THE MORAL FABRIC
Asking for, and accepting, purportedly need-based government welfare benefits has become a fact of life for a significant and still growing minority of our population: Every decade, a higher proportion of Americans appear to be habituated to the practice. If the trajectory continues, the coming generation could see means-tested beneficiaries become a majority of the United States population. This notion may seem absurd, but it is not as fanciful as it sounds. In recent years, after all, nearly half of all children under 18 years of age received means-tested benefits (or lived in homes that did). For this rising cohort of young Americans, reliance on public, need-based entitlement programs is already the norm — here and now.
It risks belaboring the obvious to observe that today’s real existing American entitlement state, and the habits — including habits of mind — that it engenders, do not coexist easily with the values and principles, or with the traditions, culture, and styles of life, subsumed under the shorthand of “American exceptionalism.” Especially subversive of that ethos, we might argue, are essentially unconditional and indefinite guarantees of means-tested public largesse.
Some components of the welfare state look distinctly less objectionable to that traditional sensibility than others. Given proper design, for example, an old-age benefit program such as Social Security could more or less function as the social-insurance program it claims to be. With the right structure and internal incentives, it is possible to imagine a publicly administered retirement program entirely self-financed by the eventual recipients of these benefits over the course of their working lives. The United States is very far from achieving a self-funded Social Security program, of course, but if such a schema could be put in place, it would not in itself do violence to the conceptions of self-reliance, personal responsibility, and self-advancement that sit at the heart of the traditional American mythos. (Much the same could likewise be said of publicly funded education.) Moral hazard is inherent, and inescapable, in all public social-welfare projects — but it is easiest to minimize or contain in efforts like these. By contrast, the moral hazard in ostensibly need-based programs is epidemic, contagious, and essentially uncontrollable. Mass public provision of means-tested entitlements perforce invites long-term consumption of those entitlements.
The corrosive nature of mass dependence on entitlements is evident from the nature of the pathologies so closely associated with its spread. Two of the most pernicious of them are so tightly intertwined as to be inseparable: the breakdown of the pre-existing American family structure and the dramatic decrease in participation in work among working-age men.
When the “War on Poverty” was launched in 1964, 7% of children were born outside of marriage; by 2012, that number had grown to an astounding 41%, and nearly a quarter of all American children under the age of 18 were living with a single mother. (In the interest of brevity, let us merely say much, much more data could be adduced on this score, almost all of it depressing.)
As for men of parenting age, a steadily rising share has been opting out of the labor force altogether. Between 1964 and early 2014, the fraction of civilian men between the ages of 25 and 34 who were neither working nor looking for work roughly quadrupled, from less than 3% to more than 11%. In 1965, fewer than 5% of American men between 45 and 54 years of age were totally out of the work force; by early 2014, the fraction was almost 15%. To judge by mortality statistics, American men in the prime of life have never been healthier than they are today — yet they are less committed to working, or to attempting to find work, than at any previous point in our nation’s history.
No one can prove (or disprove) that the entitlement state is responsible for this rending of the national fabric. But it is clear that the rise of the entitlement state has coincided with these disheartening developments; that it has abetted these developments; and that, at the end of the day, its interventions have served to finance and underwrite these developments. For a great many women and children in America, and a perhaps surprisingly large number of working-age men as well, the entitlement state is now the breadwinner of the household.
ENTITLEMENTS AND EXCEPTIONALISM
Changes in popular mores and norms are less easily and precisely tracked than changes in behavior, but here as well modern America has witnessed immense shifts under the shadow of the entitlement state. Difficult as these shifts may be to quantify, we may nevertheless dare to identify, and at least impressionistically describe, some of the ways the entitlements revolution may be shaping the contemporary American mind and fundamentally changing the American character.
To begin, the rise of long-term entitlement dependence — with the concomitant “mainstreaming” of inter-generational welfare dependence — self-evidently delivers a heavy blow against general belief in the notion that everyone can succeed in America, no matter their station at birth. Perhaps less obvious is what increasing acceptance of entitlements means for American exceptionalism. The burning personal ambition and hunger for success that both domestic and foreign observers have long taken to be distinctively American traits are being undermined and supplanted by the character challenges posed by the entitlement state. The incentive structure of our means-tested welfare state invites citizens to qualify for benefits by demonstrating need — making the criterion for receiving public grants a display of personal or familial financial failure, a condition that used to be a source of shame.
Unlike all American governance before it, our new means-tested arrangements enforce a poverty policy that must be blind to any broad differentiation between the “deserving” and “undeserving” poor. That basic Puritan conception is dying today in America, except perhaps in the circles and reaches where it was already dead. More broadly, the politics surrounding the entitlement system tends to undermine — by and large deliberately — the legitimacy of utilizing stigma and opprobrium to condition the behavior of beneficiaries, even when the behavior in question is irresponsible or plainly destructive. For a growing number of Americans, especially younger Americans, the very notion of “shaming” entitlement recipients for their personal behavior is regarded as completely inappropriate, if not offensive. This is a strikingly new point of view in American political culture. A “judgment-free” attitude toward the official provision of social support, one that takes personal responsibility out of the discussion, marks a fundamental break with the past on this basic American precept about civic life and civic duty.
The entitlement state appears to be degrading standards of citizenship in other ways as well. For example, mass gaming of the welfare system appears to be a fact of modern American life. The country’s ballooning “disability” claims attest to this. Disability awards are a key source of financial support for non-working men now, and disability judgments also serve as a gateway to qualifying for a whole assortment of subsidiary welfare benefits. Successful claims by working-age adults against the Social Security Disability Insurance (SSDI) program rose almost six-fold between 1970 and 2012 — and that number does not include claims against other major government disability programs, such as SSI. There has never been a serious official effort to audit SSDI — or, for that matter, virtually any of the country’s current entitlement programs.
The late senator Daniel Patrick Moynihan once wrote, “It cannot too often be stated that the issue of welfare is not what it costs those who provide it, but what it costs those who receive it.” The full tally of those costs must now include the loss of public honesty occasioned by chronic deception to extract unwarranted entitlement benefits from our government — and by the tolerance of such deception by the family members and friends of those who commit it.
Finally, there is the relation between entitlements and the middle-class mentality. An important aspect of the American national myth is that anyone who works hard and plays by the rules can gain entry to the country’s middle class, regardless of their income or background. Yet while low incomes, limited educational attainment, and other material constraints manifestly have not prevented successive generations of Americans from aspiring to the middle class or even entering it, the same cannot be said of constraints emanating from the mind. Being part of the American middle class is not just an income distinction — it is a mentality, a self-conception. To be middle class is to be hard-working and self-sufficient, with self-respect rooted in providing a good life for oneself and one’s family. Can members in good standing of the American middle class really maintain that self-conception while simultaneously taking need-based government benefits that symbolically brand them and their families as wards of the state?
It is no secret that the American middle class is under great pressure these days. Most commentary and analysis on this question has focused on “structural,” material reasons for this phenomenon: globalization, the faltering American jobs machine, widening economic differences in society, difficulties in keeping up the pace of mobility, and many others. Conspicuously absent from this discussion have been the consequences of enrolling a sizable and still-growing share of the populace in welfare programs intended for the helpless and needy. With more than 35% of America receiving means-tested benefits, should it really be surprising that over a third of the country no longer considers itself “middle class”?
THE END OF EXCEPTIONALISM
The worldwide spread and growth of the social-welfare state seems strongly to suggest that there is a universal demand today for such services and guarantees in affluent, democratic societies. Given the disproportionate growth almost everywhere of entitlements in relation to increases in national income, it would seem that voters in modern democracies the world over regard such benefits as “luxury goods.” In one sense, we might therefore say there is nothing particularly special about the recent American experience with the entitlement state. But as we have also seen, there is good reason to think that the entitlement state may be especially poorly suited for a nation with America’s particular political culture, sensibilities, and tradition.
The qualities celebrated under the banner of “American exceptionalism” are perhaps in poorer repair than at any time in our nation’s history. There can be little doubt (to return to our medical metaphor) that the grafting of a social-welfare system onto our body politic is in no small part responsible for this state of affairs.
And there is little reason to believe that the transplant will be rejected any time soon. To date the American voter’s appetite for entitlement transfers appears to be scarcely less insatiable than that of voters anywhere else. Our political leadership, for its part, has no stomach for taking the lead in weaning the nation from entitlement dependence. Despite tactical, rhetorical opposition to further expansion of the entitlement state by many voices in Washington, and firm resistance by an honorable and principled few, collusive bipartisan support for an ever-larger welfare state is the central fact of politics in our nation’s capital today, as it has been for decades. Until and unless America undergoes some sort of awakening that turns the public against its blandishments, or some sort of financial crisis that suddenly restricts the resources available to it, continued growth of the entitlement state looks very likely in the years immediately ahead. And in at least that respect, America today does not look exceptional at all.
Nicholas Eberstadt holds the Henry Wendt Chair in Political Economy at the American Enterprise Institute. This essay is adapted from his chapter in the forthcoming volume The State of the American Mind, edited by Mark Bauerlein and Adam Bellow (Templeton Press).