Monday, September 3, 2012

What Was Wrong With The Age of Black and White Education?

Obama to Iran: don't you dare harm us.  (See 1 below.)

Netanyahu getting frustrated and testy with Obama.  (See 1a below.)

Is it flim-flam time? (See 1b below.)
---
Would it not be better if we went back to black-and-white education? The progressives helped wreck education. (See 2 below.)
---
Have we become a nation of takers?  Eberstadt thinks so.

Long but well worth reading.  (See 3 below.)
---
A close friend and fellow memo reader expresses his thoughts about Obama's black and white comments: "As for your comment about Obama mocking Romney's proposals as being from the last century, fit to be watched on black and white TV: weren't things better when they were simply black or white? We didn't have these varying 'shades of gray,' 'blue sky programs,' 'rose colored glasses,' or 'purple haze.' Didn't the world seem a bit clearer when we weren't ruled by the 'color of money'? Remember, neither black nor white is used in camouflage. And we used to use color-blind point men when working in the jungles because they could see the enemy hiding in the trees. It seems that much can be hidden in the color wheel of politics. Wasn't it simpler when it was just 'black and white'?"
---
Obama is both thin-skinned and naive, so it is hard for him to distinguish between being dissed and insulted and being tweaked and satirized!  (See 4 below.)
---
We are now moving into the last two months of the campaign, and I believe Romney and Ryan will rise steadily in the various polls. More importantly, I believe a degree of panic and testiness will begin to take over the Obama camp, leading to reactions and responses that turn voters off; thus frustration will mount.

Finally, I believe Romney and Ryan will best their opponents in the debates.

Obama does not do well under pressure and relentless searing heat.  He cannot handle it. He gets prickly.  Talk to staff members who deal with him daily, and this is what you will hear from those willing to let their hair down.
----
Dick
---------------------------------------------------------------------------------------------------------------------
1) 'Iran must steer clear of US interests in Gulf'

Washington reportedly sends Tehran indirect message saying it will not back Israeli strike on nuclear facilities as long as Iran refrains from attacking American facilities in Persian Gulf
By Shimon Shiffer


The United States has indirectly informed Iran, via two European nations, that it would not back an Israeli strike against the country's nuclear facilities, as long as Tehran refrains from attacking American interests in the Persian Gulf, Yedioth Ahronoth reported Monday.

According to the report, Washington used covert back-channels in Europe to clarify that the US does not intend to back Israel in a strike that may spark a regional conflict.



In return, Washington reportedly expects Iran to steer clear of strategic American assets in the Persian Gulf, such as military bases and aircraft carriers.

Israeli officials reported an unprecedented low in the two nations' defense ties, which stems from the Obama administration's desire to warn Israel against mounting an uncoordinated attack on Iran.

The New York Times reported Monday that US President Barack Obama is promoting a series of steps meant to curb an Israeli offensive against Iran, while forcing the Islamic Republic to take the nuclear negotiations more seriously.



Iranian drill in Strait of Hormuz (Photo: MCT)


One of the steps considered is "an official declaration by Obama about what might bring about American military action, as well as covert activities that have been previously considered and rejected," the report said.

Several of Obama's top advisors believe that Jerusalem is seeking an unequivocal American statement regarding a US strike on Iran – should it actively pursue a nuclear bomb.

Israel hopes such a statement will be made during Obama's address before the UN General Assembly on September 25.

Others in the White House said Israel is trying to drag the US into an unnecessary conflict in the Gulf.

White House spokesman Jay Carney said Monday that "There is absolutely no daylight between the United States and Israel when it comes to preventing Iran from getting a nuclear weapon."


Carney said that all options remain on the table for Iran. He said the "window for diplomacy remains open," adding that the diplomatic process remains the best way to deal with the Islamic Republic, though "that window will not remain open indefinitely."

Cyber war a go?

According to the New York Times, Washington has also offered Iran a back-channel deal suggesting it curb its nuclear ambitions, but Tehran rejected the deal, saying no agreement is possible without the lifting of all Western-imposed sanctions.

According to the report, the Obama administration is exploring the possibility of mounting a covert operation, as well as waging a "quiet" cyber war against Iran.

President Obama had previously rejected the notion, fearing such cyber assaults would wreak havoc on Iranian civilian life.


Later in September, the United States and more than 25 other nations will hold the largest-ever minesweeping exercise in the Persian Gulf, in what military officials say is a demonstration of unity and a defensive step to prevent Iran from attempting to block oil exports through the Strait of Hormuz.

In fact, the United States and Iran have each announced what amounted to dueling defensive exercises to be conducted this fall, each intended to dissuade the other from attack.


1a)

Report: Netanyahu slams Obama over Iran nukes

By Ryan Jones

Israel was abuzz over the weekend with news that a recent meeting between Prime Minister Binyamin Netanyahu and US Ambassador Daniel Shapiro became heated after the Israeli leader criticized US President Barack Obama for not doing enough to thwart Iran's nuclear program.

The Netanyahu-Shapiro meeting took place behind closed doors two weeks ago. Last Friday, Israeli newspaper Yediot Ahronot cited unnamed sources present at the meeting as saying "sparks and lightning were flying" when the two men discussed the Iran situation.

According to the sources, Netanyahu openly blasted what he called Obama's ineffectual policies vis-a-vis Iran's nuclear program, which Israel views as an existential threat considering the Iranian leadership's very public declarations that it wishes to annihilate the Jewish state.

Netanyahu and many other Israeli leaders have been vocal in their criticism of Obama and the entire international community for punishing Iran with little more than words while the Islamic Republic has continued its defiant nuclear program with only minor hindrance.

Netanyahu reportedly told Shapiro that instead of worrying about whether or not Israel will strike Iran, Obama should focus on the root of the problem and put some real pressure on Tehran.

At that point, Shapiro was said to have broken diplomatic protocol and snapped back at Netanyahu, insisting that the Israeli leader was misrepresenting Obama's position. Shapiro then reiterated Obama's promise to not allow Iran to attain nuclear weapons, to which Netanyahu responded, "Time is running out."

Israeli cabinet ministers who spoke to Yediot's Internet portal, Ynet, said they believe Obama will make Netanyahu pay if the former is reelected as America's president in November.

At present, Obama cannot risk being seen as creating a rift between America and Israel, lest he lose the votes of most Jews and many Christians. But after the election, Obama will have no such constraints.

"The US elections are in two months, and there is no doubt that President Barack Obama, if he is reelected, will make Netanyahu pay for his behavior. It will not pass quietly," one minister told Ynet.

Already there are indications that Obama is taking a more hostile position toward Israel.

For instance, next month Israel and the US were scheduled to hold their largest ever joint military exercise, which was to be largely focused on confronting a major ballistic missile threat (read: Iran). But over the weekend, the Pentagon suddenly and significantly downsized American participation in the exercise.

"Relations between Israel and the US have soured" as a result of differing views on the urgency of the Iran situation, concluded another cabinet minister.

1b) US CIA chief Petraeus arrives Monday to cool Israeli ire

CIA Chief David Petraeus to Turkey and Israel



President Barack Obama is sending CIA Director David Petraeus to Israel in a hurry Monday, Sept. 3, in an attempt to quench the flames of discord between Israel and his administration on the Iran issue. He will fly in from a visit to Ankara on Sunday, where he also faces recriminations over US handling of the Syrian crisis.


Israel has a double grievance over Obama’s Iran policy: not only does his administration spare Iran’s leaders any sense of military threat that might give them pause in their dash for a nuclear weapon, but US officials are actively preventing Israel from striking out in its own defense to dispel the dark shadow of a nuclear Iran.

Behind closed doors in Ankara, Turkish Prime Minister Tayyip Erdogan and President Abdullah Gul are preparing to vent their anger against the US administration for tying their hands against establishing safe havens in Syria for rebel operations against the Assad regime. The Turkish Air Force has been on standby for the last two months for this mission, along with the Saudi and UAE air forces. However, none are prepared to go forward without logistical backing from the US Air Force.

They blame Obama’s refusal to engage directly in the Syrian conflict for the escalating terrorist threats confronting Turkey from Assad’s open door to PKK (Kurdish Workers Party) bases in northern Syria and the Iraqi-Syrian-Turkish border triangle.  Turkey is also stuck with a swelling influx of Syrian refugees piling an unmanageable burden on its economy.

Israel does not expect anything useful to come out of the Petraeus visit – or even any alleviation of the bad feeling between Binyamin Netanyahu and Barack Obama. High-placed officials in Jerusalem were of the view that the CIA chief fits the US president’s bill at this time. His visit is a non-binding gesture of goodwill for Israel which requires neither the White House nor the Chairman of the Joint Chiefs of Staff, Gen. Martin Dempsey, to backtrack or apologize for Dempsey's derogatory remarks about the IDF's capacity for taking on Iran. Another advantage is that any words passing between the CIA chief and Israeli leaders may be classified.

His visit to Jerusalem will therefore not stem the ill will prevailing between Jerusalem and Washington.

All the same, Prime Minister Netanyahu chose his words carefully Sunday to avoid fingering the US directly when he urged the international community to get tougher against Iran, saying that without a "clear red line," Tehran will not halt its nuclear program. He was addressing the weekly cabinet meeting in Jerusalem.

"I believe that the truth must be said, the international community [not the US] is not drawing a clear red line for Iran, and Iran does not see international determination to stop its nuclear program," Netanyahu said.

 "Until Iran sees this clear red line and this determination, it will not stop its advancement of the Iranian nuclear program. Iran must not have a nuclear weapon," he declared.
--------------------------------------------------------------------------------------------------
2)

Dreaming Up a New America: Progressive Education and the Perversion of American Democracy

By L.E. Ikenga


As opposed to the 2008 election, which had many frustrated and emotionally charged voters dreaming up a new America with a historic presidential candidate leading the charge, the 2010 midterms had people doing the exact opposite.  In 2010, a majority of Americans stopped dreaming and started to face reality.  America was accelerating toward an irreversible and all-encompassing decline.  The path envisioned by the president and his supporters for a radically changed United States was starting to look like a dead end.  America was breaking down.
The year 2010 was also when essayist Walter Russell Mead began to ascribe many aspects of this breakdown to the failures within what he called the Blue Social Model.  His prognostications were based on what he saw as the disintegration of core American institutions and ideas, which developed and flourished under the post-Second World War industrial system.  According to Mr. Mead, the model had reached its expiration date; and among other things, what has followed is a stagnant and deeply indebted economy, crumbling social institutions, and controlled yet massive citizen dissent.  Our bona fides as an advanced industrial democracy were therefore being challenged both domestically and internationally.  He might have been right.
If he was right, then Mr. Mead's argument should have immediately raised some major concerns.  If for the past sixty years our core institutions and ideas have, as Mr. Mead had put it, "rested on the commanding heights of a few monopolistic and oligopolistic American firms and a government with runaway entitlement programs," then we should have been asking ourselves more essential questions about the nature of our history and society.  One of those questions should have been: when did America stop being a serious democracy?
But if Mr. Mead's conclusion should not have warranted such a question, and though by increasingly unqualified means we can still call ourselves a serious democracy, then we obviously have another more fundamental problem on our hands.
Bootleg Blackberries, Fake iPads, and a Fabricated Democracy
The Blue Social Model theory was articulate and intelligent, but it did not identify the central problem facing American society.  It did, however, do a good job of camouflaging it.
That problem has now become an epidemic, and it is this: a citizenry, both young and old, whose members have become increasingly ignorant of and apathetic towards the basic pillars of history and civic culture upon which their democracy has been built.  This is why Mr. Mead was able to draw attention to a false target, which marked the real crisis as the "accelerating collapse of blue government," and get away with it.
Essentially, the American masses -- now not unlike the masses in various parts of the democratic underdeveloped world -- have little to no understanding of how genuine democracies are supposed to work, and they don't care.
Instead, they identify their democracy not in terms of the revolutionary political ideas and events of Western civilization, but in terms of the following new norms: government-backed credit systems that motivate compulsive consumerism by encouraging people to live well beyond their means; unempirical race theories that promote thoughtless, face-value diversity and multicultural relativism within the body politic; and a mainstreamed, age-inclusive addiction towards insipid web 2.0 entertainments that some have said is pushing our society towards an era of digital serfdom.
In other words, democracy in America now means being able to purchase a million-dollar house when you cannot afford it and having the power to open a Facebook account when you are eight years old.  
In that the roots of this kind of perverted democracy are not grounded in watershed documents such as the Twelve Tables of Roman Law or the English Bill of Rights of 1689, it should come as no surprise that today, a petty electronics trader hustling fake iPads and bootleg BlackBerries on a chaotic street steaming with fresh sewage in Port Harcourt, Nigeria should feel comfortable equating his democracy with ours.  Based on what the trader understands, his fifty-year-old, post-imperial democracy is supposed to be based on the promise of the Blue Social Model.  But what he does not understand is that our two-hundred-plus-year-old anti-imperial democracy is not supposed to be.
And so, although it is our democracy that has produced an environment for all types of industry and real innovation to flourish, and although his democracy has produced the exact opposite -- one so corrupt and replete with ironies too incredible to believe (including the lack of internet for the fraudulent gadgets that he sells) -- both of our democracies are becoming the same.
Until Americans begin to set themselves apart, once again, as the gatekeepers for the democratic civic standards of Western civilization, our political outcomes will continue to be no more respectable than those of the third-world electronics trader.  Just as he commands a market that is eager for his fake gadgets, our government now governs a citizenry that is eager for a fabricated democracy -- one that discourages genuine civic responsibility and comprehension.
Some people have balked and jeered when they have been shown images of the American president who bows to foreign dictators.  They have said that the leader of the free world has no business bending over for autocrats who do not represent freedom and the rule of law.  But it is not the president who is doing the bowing.  We are doing the bowing.  The American president is simply reflecting the will of a good portion of the people whom he governs. 
Rousseau, Dewey, et al.: How the "Me, Myself, and I" Generation Became an Unintelligent Mob
It is America's decades-long experiment with progressive and postmodern progressive education that has produced almost two generations of low-information citizens who have become easier to dupe for the benefit of our increasingly intrusive and imperial government.
The progressive education model began to receive widespread attention during the first decades of the twentieth century.  The ideas and scholarly works of American philosopher and education theorist John Dewey were used to develop teaching models, which began to captivate groups of exclusive and superbly credentialed left-leaning educators, who lauded the models and who sought to actualize Dewey's vision in their own schools.  Dewey and his disciples decried the status-quo paradigms of the American school system -- especially those that they saw were intent on bringing up dutiful but uncritical citizens, or those that they felt motivated academic discrimination, particularly by means of standardized testing.  Their progressive education models were therefore aimed at making American schools more reflective of genuine democracies.
For them, dreaming up a new America meant gleaning many ideas about education from prominent European social philosophers of the eighteenth century.  One such philosopher, Jean-Jacques Rousseau, via his tract on education, Emile, became invaluable for providing examples for the progressive education model.
Emile outlines the assumptions under which young boys, especially, should be educated.  Rousseau's philosophy on education stressed the natural goodness of man and a condemnation of social conventions, most of which he believed were culpable for man's corruptive behavior.  To rehabilitate mankind, Emile emphasizes the following for the various stages of a person's initial education: 
  1. The purpose of education is to develop a child's natural capacities.  Natural education should be as far-removed from society as possible.
  2. The aim of education should always be child-centered and individualized.  Children learn by utilizing their senses; they are guided by natural curiosity.
  3. A good teacher is unobtrusive; teachers are not there to enforce doctrine or rigid instruction.
  4. Children must never be pushed to acquire information.  If they are moved on their own to learn about something, they will.
  5. Children will develop a sense of morality through their trials and errors.  They do not acquire morals by being punished for bad behavior.  Teachers are never to discipline children for perceived wrongdoing.
From such ideas, many American educators were able to promote and systematize a progressive agenda in education that placed a premium on child-centered (as opposed to knowledge-centered) instinctual "learning activities."  As progressive teaching models came to have more influence, authoritative, well-informed teachers and traditional textbooks began to be viewed as antediluvian and unnecessary.
Once the progressive education models of the '60s and '70s turned into their present-day postmodern structures, administrators became especially devoted to using the following paradigms to motivate students to learn:
  1. Defining a student's intellectual abilities through self-expression activities such as dance, unstructured writing, self-written poetry readings, and various forms of play.
  2. A de-emphasizing of the core curriculum subjects of Western civilization in favor of subjects that underscore minority issues and excessive openness towards diversity.
  3. Achieving academic equality through non-competitive groupthink projects.
  4. Caricaturing and condemning traditional learning methods and devices such as rote memorization, drill, and recitation.
  5. "Dumbing down" or avoiding subjects that can be mastered only through ongoing practice and hard work. 
  6. Grade inflation.  
All of these ideas and practices have failed American students by the factory-load and are responsible for creating successive generations of "me, myself, and I" citizens who lack intellectual depth and who are prone, paradoxically, to unproductive mob behavior.
In a snarky but well-deserved memorandum to the many classes of 2012, Bret Stephens writes:
A few months ago, I interviewed a young man with an astonishingly high GPA from an Ivy-League university and aspirations to write about Middle East politics. We got on the subject of the Suez Canal Crisis of 1956. He was vaguely familiar with it. But he didn't know who was the president of the United States in 1956. And he didn't know who succeeded that president.
Pop quiz, Class of '12: Do you?
His summary:
In every generation there's a strong tendency for everyone to think like everyone else. But your generation has an especially bad case, because your mass conformism is masked by the appearance of mass nonconformism.
Disguised as a crotchety hit piece against a generation of know-nothings, Stephens's memo is actually a warning regarding our nation's clear and present danger.  Here's one possible interpretation of it: at worst, progressive and postmodern education has produced ignorant masses composed of people incapable of acting as individuals.  People who are incapable of acting as individuals have no need for constitutions that protect their individual rights.  Thus the continued perversion and eventual death of a democracy.
Classical Education: Let the Revolution Begin 
Meeting American middle school students who are required to write one-act plays on the shooting of Trayvon Martin, or who begin to count on their fingers when asked how many states are in our Union, is depressing.  It is also eye-opening.  The progressive education movement has imposed many misguided ideas on America.  And in doing so, the movement's purveyors have willfully denied this unavoidable truth: children are born ignorant and require a great deal of academic guidance and instruction in order to acquire specific skill sets.  It is the discipline that it takes to acquire such skill sets that will serve them as they grow to discover what truly distinguishes them as thoughtful individuals.  
The state of our education crisis -- especially our urban education crisis -- has led many to embrace unrealistic ideas that are simply not working.  For instance, Waiting for Superman to solve our nation's K-12 education problems is an option, but it is ultimately impractical and does not offer a permanent solution.  Superheroes don't exist.  As Walter Russell Mead's Blue Social Model theory exposed, America's near-obsessive reliance on government to create and administer a wealthy and growing state was erroneous.  Therefore, the idea of cheaper and better-managed state-supported education is preposterous because, as Mr. Mead makes clear, "public schools are increasingly expensive to run, and yet they do not provide improved services to match those exploding costs."
The costs of public education are exploding because of progressive education gimmickry.  Progressive education is gimmick-prone because its entire pedagogic framework is one that almost always shuns rigorous, systematic, and sequential learning that is based on a history-dependent approach.  It favors, however, curricula that are non-cumulative, ahistorical, and theme-based, with subjects and languages often being studied in isolation from one another.  And from such curricula, teachers create new and "innovative" ways to entertain their students.  Entertainment industries tend to be expensive to run. 
Most Department of Education bureaucrats are oblivious to the realities of how children learn.  In New York City, for instance, Department of Education teachers employed at failing public schools continue to require students in primary grades to learn algebraic equations (for state-wide exams) when those students have yet to master basic arithmetic.  That is also why many of those same students, when they get to high school, will be taught to cram into their heads, within a three- to four-month period, a bunch of meaningless events and facts for a state Regents exam on U.S. history and government, only to go back to their previous dimwitted understanding of how our government is supposed to work once the exam is over.  Consequently, it is not shocking to meet seventh-graders who still struggle with addition and subtraction, or high school seniors who believe that the American Revolution was fought between France and Germany.  If this does not expose the writing on the wall for the future of our democracy, then what else will?  American education does not need reform.  It needs a revolution.
Going back to the lost standards of a classical education is the only way forward for American students, and this needs to happen now.  People are already expressing a hunger for something very different.  The hunger is being expressed in the form of the relentless criticism towards what many schools are offering.  But we need a credible alternative.  
I am a classical educator in urban America. Most of my students, when first introduced to the stems and endings system of Latin grammar, become anxious and afraid.  They are not used to the intellectual discipline that the study of Latin grammar demands.  They have not been taught about the Roman republic or its relevance to our political system, and because of the rampant hyper-liberal proselytizing that goes on in many of their schools, they would probably believe that the Trojan War began under the administration of George W. Bush.  This is because, among other things, their parents, especially, once felt deeply obliged to school systems that were short-changing their children.  Their children were not acquiring concrete and relevant information, nor were these children being instructed based on a credible system that allowed them to organize, analyze, and interpret such information properly.  But a classical education, if made available to a greater number of students, can change all of this.
Rather than putting forth an extensive amount of information about why a classical education works, which it does, or what a classical education is, which many who are much more qualified than myself have already done, I would like to share my summary of what a classical education, at the very least, has done for my students:
  1. Motivated intellectual curiosity in core subjects and languages.
  2. Given them an excellent understanding of the origins of Western civilization and its intellectual traditions.
  3. Given them, through the study of classical languages (Latin in particular), an in-depth knowledge of English grammar, a more extensive vocabulary, and basic logic skills.
  4. Enabled them to tackle more challenging subjects.
  5. Given them (and their parents) the ability to recognize and to stay clear of gimmicks and trends in education.
  6. Given them a concrete pattern for learning, which they can use to master any subject or acquire any skill.
  7. Instilled humility.
The need for a genuine revolution is obvious.  But what is not obvious is how to arm more of our citizens in the making with the knowledge and wisdom that they will one day use to protect themselves from various forms of undemocratic government, one of which has begun to materialize right before our very eyes.  Classical home-schools and traditional private classical schools are an excellent start, but they are not enough.  We need to reach even more students with what these schools have to offer.
America is not only breaking down; it is breaking apart.  Right now, we must find a clear and precise path back to our democratic roots.  But this can happen only if our children are properly educated.  So let the revolution begin. 
-----------------------------------------------------------------------------------
3) A Nation of Takers: America's Entitlement Epidemic, by Nicholas Eberstadt
“The issue of welfare is the issue of dependency. It is different from poverty. To be poor is an objective condition; to be dependent, a subjective one as well. That the two circumstances interact is evident enough, and it is no secret that they are frequently combined. Yet a distinction must be made. Being poor is often combined with considerable personal qualities; being dependent rarely so. That is not to say that dependent people are not brave, resourceful, admirable, but simply that their situation is never enviable, and rarely admired. It is an incomplete state of life: normal in a child, abnormal in an adult. In a world where completed men and women stand on their own feet, persons who are dependent—as the buried imagery of the word denotes—hang.” —Daniel Patrick Moynihan, 1973[1]
 Introduction: The Unstoppable Rise of Entitlements in Modern America, 1960–2010
The American republic has endured for almost two and a quarter centuries; the United States is the world’s oldest constitutional democracy. But over the past fifty years, the apparatus of American governance has undergone a fundamental and radical transformation. In some basic respects—its scale, its preoccupations, even many of its purposes—the U.S. government today would be scarcely recognizable to a Franklin D. Roosevelt, much less an Abraham Lincoln or a Thomas Jefferson.
What is monumentally new about the American state today is the vast and colossal empire of entitlement payments that it protects, manages, and finances. Within living memory, the government of the United States has become an entitlements machine. As a day-to-day operation, the U.S. government devotes more attention and resources to the public transfers of money, goods, and services to individual citizens than to any other objective: and for the federal government, these amounts outpace those spent for all other ends combined.
Government entitlement payments are benefits to which a person holds an established right under law (i.e., to which a person is entitled). A defining feature of these payments (also sometimes officially referred to as government transfers to individuals) is that they “are benefits received for which no current service is performed.”[2] Entitlements are a relatively new concept in U.S. politics and policy: according to Merriam-Webster, the first known use of the term was not until 1942.[3] But entitlements have become very familiar, very fast. By the reckoning of the Bureau of Economic Analysis (BEA), the U.S. government’s main unit for macroeconomic assessments and estimates, income from entitlement programs in the year 2010 was transferred to Americans under a panoply of over fifty separate types of programs, and accounted for almost one-fifth (18.5 percent) of personal income in that year.[4]
The breathtaking growth of entitlement payments over the past half century is shown in Figure 1. In 1960 U.S. government transfers to individuals from all programs totaled about $24 billion. By 2010 that total was almost 100 times larger. Over that interim, nominal entitlement payments to Americans by their government grew at an explosive average of 9.5 percent per annum for fifty straight years. The tempo of growth, of course, is exaggerated by concurrent inflation—but after adjusting for inflation, entitlement payments soared more than twelve-fold (1,248 percent), with an implied average real annual growth rate of about 5.2 percent per annum (see Figure 2).[5] Even after adjusting for inflation and population growth, entitlement transfers to individuals have more than septupled (727 percent) over the past half century, rising at an overall average of about 4 percent per annum (see Figure 3).[6]
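A quick back-of-the-envelope check of that compounding arithmetic (my own verification, not part of Eberstadt's text): a total growth factor F over n years implies an average annual rate of r = F^(1/n) − 1. Applied to the figures above:

\[
r = F^{1/n} - 1, \qquad
r_{\text{nominal}} = 100^{1/50} - 1 \approx 9.6\%, \qquad
r_{\text{real}} \approx 13.5^{1/50} - 1 \approx 5.3\%
\]

Here 100 is the nominal growth factor cited, and 13.5 is the real growth factor implied by a 1,248 percent increase; both results land within rounding distance of the 9.5 percent and 5.2 percent annual rates quoted, so the cited figures are internally consistent.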
These long-term spending trends mask shorter-run tendencies, to be sure. Over the past two decades, for example, the nominal growth in these entitlement outlays has slowed, to an average of “only” 7.1 percent a year (or a doubling every decade). Adjusted for inflation by the Consumer Price Index, real entitlement outlays rose by an average of “just” 4.4 percent over those years—and by a “mere” 3.2 percent a year on a per capita basis. But if the pace of entitlement growth has slowed in recent decades, so has the growth in per capita income. From 1960 to 2010 real per capita income in America grew by a measured 2.2 percent on average—but over the past twenty years, it has increased by 1.6 percent per annum.[7] In other words, total entitlement payouts on a real per capita basis have been growing twice as fast as per capita income over the past twenty years; the disparity between entitlement growth on the one hand and overall income growth on the other is greater in recent times than it was in earlier decades.
The magnitude of entitlement outlays today is staggering. In 2010 alone, government at all levels oversaw a transfer of over $2.2 trillion in money, goods, and services to recipient men, women, and children in the United States. At prevailing exchange rates, that would have been greater than the entire GDP of Italy, and almost as large as Britain’s—advanced economies with populations of roughly 60 million.[8] (The U.S. transfer numbers, incidentally, do not include the cost of administering the entitlement programs.) In 2010 the burden of entitlement transfers came to slightly more than $7,200 for every man, woman, and child in America. Scaled against a notional family of four, the average entitlements burden for that year alone would have approached $29,000. And that payout required payment, too: through taxes, borrowing, or some combination of the two.
A half century of wholly unfettered expansion of entitlement outlays has completely inverted the priorities, structure, and functions of federal administration as these had been understood by all previous generations of American citizens. Until 1960 the accepted task of the federal government, in keeping with its constitutional charge, was governing. The federal government’s spending patterns reflected that mandate. The overwhelming share of federal expenditures was allocated to defending the republic against enemies foreign and domestic (defense, justice, interest payments on the national debt) and some limited public services and infrastructural investments (the postal authority, agricultural extension, transport infrastructure, and the like). Historically, transfer payments did not figure prominently (or, sometimes, at all) in our federal ledgers. The BEA, which prepares America’s GNP estimates and related national accounts, identifies only two calendar years before 1960 in which federal transfer payments exceeded other federal expenditures: in 1931, with President Herbert Hoover’s heretofore unprecedented public relief programs, and in 1935, under President Roosevelt. (Even then, given the limited size of the U.S. government in those years, these entitlement transfers were negligible from a contemporary perspective, totaling just over 3 percent of GDP in 1931, and under 3 percent in 1935.[9]) For most of FDR’s tenure, and for much of the Great Depression, the share of federal spending devoted to income transfers was a third or less of total spending.
In 1960, entitlement program transfer payments accounted for well under a third of the federal government’s total outlays—about the same fraction as in 1940, when the Great Depression was still shaping American life, with unemployment running in the range of 15 percent (see Figure 4). But in the decades that followed, the share of entitlements in total federal spending soared from 28 percent to 51 percent, then receded; it did not surpass the 50 percent mark again until the early 1990s. Over the past two decades, however, entitlements as a percentage of total federal spending rose almost relentlessly; by 2010 this share accounted for just about two-thirds of all federal spending, with all other responsibilities of the federal government—defense, justice, and all the other charges specified in the Constitution or undertaken in the intervening decades—making up barely one-third (see Figure 5 and Figure 6). Thus, in a very real sense, entitlements have turned American governance upside-down—and within living memory.
The story of the (im)balance between entitlement transfers and overall government activities—at the federal, state, and local levels—is none too different (see Figure 7 and Figure 8). In 1940, government transfers to individuals amounted to under one-sixth of total U.S. government outlays; in 1960 these entitlements still comprised barely 19 percent of all U.S. government expenditures. Between 1960 and 2010 the share of entitlements in government spending at all levels jumped from 19 percent to 43 percent—and the ratio of nonentitlement to entitlement spending fell from 4.2:1 down to 1.3:1. On that trajectory, the day in which entitlement spending comes to exceed all other activities of all levels and branches of the U.S. government is within sight, here and now.
Although the U.S. entitlements archipelago is by now extraordinarily far-flung and complex, with dozens upon dozens of separate programmatic accounts, the overall structure of government entitlement spending falls into just a few categories. U.S. government budgeters divide them into six overall baskets: income maintenance, Medicaid, Medicare, Social Security, unemployment insurance, and all the others (see Figure 9 and Figure 10). Broadly speaking, the first two baskets attend to entitlements based upon poverty or income status; the second two, entitlements attendant upon aging or old-age status; and the next, entitlements based upon employment status. These entitlements account for about 90 percent of total government transfers to individuals, and the first four categories comprise about five-sixths of all such spending. These four bear closest consideration.
Poverty- or income-related entitlements—transfers of money, goods, or services, including health-care services—accounted for over $650 billion in government outlays in 2010 (see Figure 11). Between 1960 and 2010, inflation-adjusted transfers for these objectives increased by over thirty-fold, or by over 7 percent a year; significantly, however, income and benefit transfers associated with traditional safety-net programs comprised only about a third of entitlements granted on income status, with two-thirds of those allocations absorbed by the health-care guarantees offered through the Medicaid program.
For their part, entitlements for older Americans—Medicare, Social Security, and other pension payments—worked out to even more by 2010, about $1.2 trillion (see Figure 12). In real terms, these transfers multiplied by a factor of about 12 over that period—or an average growth of more than 5 percent a year. But in purely arithmetic terms, the most astonishing growth of entitlements has been for health-care guarantees based on claims of age (Medicare) or income (Medicaid) (see Figure 13). Until the mid-1960s, no such entitlements existed; by 2010, these two programs were absorbing more than $900 billion annually.
In current political discourse, it is common to think of the Democrats as the party of entitlements—but long-term trends seem to tell a somewhat different tale. From a purely statistical standpoint, the growth of entitlement spending over the past half century has been distinctly greater under Republican administrations than Democratic ones. Between 1960 and 2010, to be sure, the growth of entitlement spending was exponential—but in any given calendar year, it was on the whole roughly 8 percent higher if the president happened to be a Republican rather than a Democrat. This is in keeping with the basic facts of the time: notwithstanding the criticisms of “big government” that emanated from their Oval Offices from time to time, the Richard Nixon, Gerald Ford, and George W. Bush administrations presided over especially lavish expansions of the American entitlement state. Irrespective of the reputations and the rhetoric of the Democratic and Republican parties today, the empirical correspondence between Republican presidencies and turbocharged entitlement expenditures should underscore the unsettling truth that both political parties have, on the whole, been working together in an often unspoken consensus to fuel the explosion of entitlement spending in modern America.
Entitlements and America’s New Way of Life: Our Declaration of Dependence
From the founding of our state up to the present—or rather, until quite recently—the United States and the citizens who peopled it were regarded, at home and abroad, as exceptional in a number of deep and important respects. One of these was their fierce and principled independence, which informed not only the design of the political experiment that is the U.S. Constitution but also the approach to everyday affairs. The proud self-reliance that struck Alexis de Tocqueville in his visit to the United States in the early 1830s extended to personal finances. The American “individualism” about which he wrote included social cooperation, and on a grand scale—the young nation was a hotbed of civic associations and voluntary organizations. American men and women viewed themselves as accountable for their own situation through their own achievements in an environment bursting with opportunity—a novel outlook at that time, markedly different from the prevailing Old World (or at least Continental) attitudes.
The corollaries of this American ethos (which might be described as a sort of optimistic Puritanism) were, on the one hand, an affinity for personal enterprise and industry, and on the other a horror of dependency and contempt for anything that smacked of a mendicant mentality. Although many Americans in earlier times were poor—before the twentieth century, practically everyone was living on income that would be considered penurious nowadays—even people in fairly desperate circumstances were known to refuse help or handouts as an affront to their dignity and independence. People who subsisted on public resources were known as “paupers,” and provision for them was a local undertaking. Neither benefactors nor recipients held the condition of pauperism in high regard.[10]
Overcoming America’s historic cultural resistance to government entitlements has been a long and formidable endeavor. But as we know today, this resistance did not ultimately prove an insurmountable obstacle to the establishment of mass public entitlements and the normalizing of the entitlement lifestyle in modern America. The United States is now on the verge of a symbolic threshold: the point at which more than half of all American households receive, and accept, transfer benefits from the government. From cradle (strictly speaking, from before the cradle) to grave, a treasure chest of government-supplied benefits is there for the taking for every American citizen—and exercising one’s legal rights to these many blandishments is now part and parcel of the American way of life.
Just how the great American postwar migration to general entitlement dependency was accomplished is a matter for future historians. For now we can note that a certain supply-and-demand dynamic was in play; in this saga, supply helped to create its own demand. Government purveyed, and to sell these particular wares effectively, government needed to get into the business of norm-changing. A succession of presidential administrations did just that, with continuing dedication and some ingenuity. Two of the many milestones in this effort deserve brief mention here.
The first is the promulgation of the electronic benefit transfer (EBT) card, which began its march through the federal entitlement apparatus in the 1990s. EBTs were issued in the place of food stamps—coupons that could be used at grocery stores but which were made to look different from cash, and which carried restrictions on what the possessor could purchase. EBTs, in contrast, were plastic swipe cards basically indistinguishable from ordinary debit or credit cards. In 2008—under President George W. Bush—the Supplemental Farm Bill, which had always previously spoken of food “stamps” and “coupons,” struck those words from the law and replaced all mention of these possibly stigmatizing instruments with “EBT” and “card.”[11]
More recently, President Barack H. Obama offered an encomium to the new lifelong procession of entitlements—as advertisements for his 2012 reelection campaign. These feature an imaginary woman named “Julia,” who is shown to benefit from government transfers and programs from preschool (Head Start) to childbearing (Medicaid, Obamacare) to working ages (loans from the Small Business Administration) to retirement (Social Security and Medicare).[12] In this important new political departure, entitlements and social welfare programs are no longer reluctantly defended, but instead positively celebrated as part of the American dream: the promise to not only defend these but to increase their scope still further is offered as a positive reason for Obama’s reelection.
Whatever the particulars of the supply-demand interaction, the plain fact is that the utilization of government entitlement benefits by American citizens registered what epidemiologists would call a “breakout” into the general population over the past two generations. Figure 14 and Figure 15 detail one dimension of that breakout: the average share of personal income derived from government transfer benefits. (These were commissioned by the New York Times in conjunction with a major article on the prevalence of entitlement dependency in middle-class America;[13] the figures present even more detail in their interactive online version.[14]) According to this work, relying upon data from the BEA and the Census Bureau, the share of government transfer benefits in overall personal income for the nation as a whole rose from under 8 percent to almost 18 percent in the four decades between 1969 and 2009. (To be sure, 2009 was an unusually bad year for the American economy, but the long-term trend, decade by decade, was unmistakably upward.)
Given the obvious arithmetic fact that many of the United States’ three-thousand-plus counties had to register above these national averages, the geography of entitlement dependency perforce suggested some true extremes by 2010. In that year, the populations of many U.S. counties were deriving more than 40 percent of personal income from government transfers to individuals and related entitlement benefits. But interestingly enough, by 2010 the most extreme regional dependence on government transfers tended to be in rural areas rather than urban ones, and in red states rather than blue states. According to the estimates by the aforementioned New York Times team, in fact, two-thirds of the one hundred most dependent counties in America voted for the Republican rather than the Democratic candidate in the 2008 presidential election.[15] (Thus is revealed another paradox of daily life in many reaches of entitlement America: the simultaneous arranging of personal affairs to count on growing support from public transfers while unselfconsciously professing to prefer a smaller government.) 
Overall regional ratios of government transfers to personal income cannot speak to another dimension of the entitlement epidemic: the prevalence of government transfer recipience. Such estimates, however, can be derived from the Census Bureau’s survey data and from administrative records of the entitlement programs themselves, although these two sources tend to give consistently different readings. As they do with some other sources of income, including dividends and interest, recipients tend to seriously underreport their government benefits. One 2004 study by the BEA and Census Bureau researchers estimated for the year 2001 that the Census Bureau’s tallies undercounted means-tested income transfers by two-fifths, and disability benefits by two-thirds—even after official Census adjustments for underreporting.[16] By the same token, a 2007 study by a researcher at the Urban Institute found for the year 2002 that the Census Bureau’s Survey of Income and Program Participation (SIPP) figures understated the actual administrative caseload of the food stamp program by 17 percent, Medicaid / State Children’s Health Insurance Program (SCHIP) by 20 percent, and Temporary Assistance for Needy Families (TANF) (the successor to Aid to Families with Dependent Children [AFDC]) by over 40 percent—and that the performance of the Census’s Current Population Survey (CPS) was even worse.[17] The clear takeaway is that the true levels of entitlement recipience in America are even higher than the Census Bureau estimates indicate.
According to a Census Bureau data run requested by the Wall Street Journal, just over 49 percent of U.S. households were using at least one government benefit to help support themselves in early 2011 (see Figure 16). This was a tremendous increase over the early 1980s, at which time about 30 percent of households were already estimated to be on at least one of the government’s many benefit programs, although the rise was not entirely uninterrupted. In the late 1990s (in the aftermath of welfare reform and at a time of relatively robust economic growth), the prevalence of benefit recipience declined temporarily before continuing on its further ascent. If the Census Bureau reports that over 49 percent of U.S. households are obtaining at least one government benefit, we can safely say that the true number is actually already well over 50 percent. To put it another way: a majority of homes with voters in them are now applying for and obtaining one or more benefits from U.S. government programs.
The prevalence of entitlement program usage is by no means uniform by age group or ethnicity. Meaningful variations within American society and the public at large are illustrated in Figure 17, Figure 18, and Figure 19. In 2004, according to a study based on CPS data conducted by a researcher at AARP (formerly the American Association of Retired Persons), nearly 48 percent of American families were already obtaining at least one government benefit (a somewhat higher level than Census Bureau researchers indicated for 2004 [see Figure 16]). By these numbers, nearly every household (98 percent) with someone sixty-five or older was obtaining at least one benefit, with 95 percent of them obtaining benefits from two programs—generally speaking, Medicare and Social Security / Old-Age and Survivors Insurance (OASI).[18]
Perhaps more striking, though, is the proportion of households with no one sixty-five or older obtaining government benefits: entitlement prevalence for this group was already at 35 percent in the year 2004. Relatively few of these beneficiaries were Social Security / OASI or Medicare cases—and of the rest, only a minority was accounted for by unemployment or disability benefits. The overwhelming majority instead were accounted for by households and families availing themselves of means-tested benefits or antipoverty programs.
As Figure 18 shows, the proportion of American households accepting means-tested benefits has soared over the past three decades. According to data from the Census SIPP survey, which began in the late 1970s, the share was just under 17 percent in 1979, but over 30 percent by 2009—and since recipience is understated in SIPP, the actual level is already higher than this. In any case, according to Census Bureau estimates, as of 2009 roughly 4 percent of Americans lived in public housing; 6 percent of Americans lived in households receiving some means-tested cash assistance, 11 percent in households accepting food stamps, and almost 25 percent in households accepting Medicaid.[19] Moreover, as of 2009, an estimated 45 percent of all American children under eighteen years of age were receiving at least one form of means-tested government aid. It is quite possible, considering the scale of underreporting in these surveys, that a majority today are getting benefits from government antipoverty programs. An outright majority of Hispanic and African Americans of all ages were reportedly using such programs, as well as almost 30 percent of Asian Americans and over 20 percent of non-Hispanic whites or “Anglos.”
It is worth noting, incidentally, that the level of means-tested benefit dependency for Anglos today is almost as high as it was for black Americans when Daniel Patrick Moynihan was prompted to write his famous report on the crisis in the African American family[20] (although the degree of dependency on government entitlements for the families in question is arguably not nearly as extreme today among the former as it was in the early 1960s among the latter). In another eerie echo of the Moynihan Report, we may see today exactly the same statistical “scissors” nationwide opening up between trend lines in unemployment rates and welfare benefits that moved Moynihan to alarm about conditions in the African American community nearly fifty years ago (see Figure 19). Over the three decades 1979–2009, the unemployment rate has risen, and fallen, and risen again in successive cycles—but the proportion of Americans living in households seeking and receiving means-tested benefits has moved in an almost steady upward direction, essentially unaffected by the gravitational pull of the unemployment rate. The same is true for the relationship between means-tested benefits and the official poverty rate for American families (see Figure 20). Even together, the unemployment rate and the family poverty rate (see Figure 21) provide almost no predictive information for tracking the trajectory of the proportion of American families obtaining one or more means-tested benefits. (By 2009 the share of American families receiving poverty-related entitlements was almost three times as high as the official poverty rate for families—and it was well over three times as high as the national unemployment rate.) One predictor of this “family dependency rate,” however, happens to be fearfully good: calendar year. All other things being equal, the family dependency rate was on a relentless rise between 1979 and 2009: after controlling for the reported unemployment and family poverty rates, dependency was nevertheless increasing by over four percentage points every decade.[21] On this track, it will only be a matter of time before a majority of Americans are seeking and obtaining “antipoverty” benefits from the government—regardless of their wealth or their employment prospects.
Entitlement recipience—even means-tested entitlement recipience—is now a Main Street phenomenon in modern America: a truly amazing turn of events for the nation of legatees to the Declaration of Independence. Entitlement dependence comes at great cost—and as Moynihan warned nearly forty years ago, “It cannot too often be stated that the issue of welfare is not what it costs those who provide it, but what it costs those who receive it.”[22] In the pages that follow we discuss some of these costs to entitlement recipients.
The Male Flight from Work in the Entitlement Society
The omnipresence of entitlements and their attendant panoply of temptations have already markedly altered what was formerly known as the American way of life, as well as the value structure that supported it. This result should hardly surprise anyone. With personal dependence on government handouts not only destigmatized, but increasingly enshrined as a basic civil right of all U.S. citizens, mass behavior and popular attitudes could not help but mutate—often in highly uncivil directions.
The adverse influence of transfer payments on family values and family formation in America is one of these critical consequences. Important as it is, this is already old news. The deterioration of the postwar U.S. family structure under the shadow of a growing welfare state was a topic that had already attracted comment for decades before Charles Murray’s seminal study, Losing Ground,[23] appeared almost thirty years ago. Murray’s exegesis formally explained what many concerned citizens had already suspected or concluded: that the perverse incentives embedded in federal family-support policies were actually encouraging the proliferation of fatherless families and an epidemic of illegitimacy. Although the AFDC program had been established back in the 1930s to provide for orphans, by the early 1980s paternal orphans accounted for just 1 percent of the AFDC caseload; by 1982 nearly half of the children on AFDC qualified because their mothers were unwed, and three-fifths of the children of never-married mothers were receiving AFDC payments.[24] It may suffice to say that AFDC and its allied benefit programs had, by these specifications, incontestably become a vehicle for financing single motherhood and the out-of-wedlock lifestyle in America. The tangled pathology linking entitlement programs to the feminization of poverty and the rise of the female-headed family was addressed, after a fashion, by the welfare reform efforts of the mid-1990s (about which more later).
While the insidious effects of entitlement programs on the lifestyles of women and children have occasioned tremendous attention since the end of World War II, much less has been devoted to their consequences for men. American manhood, however, has not been left untouched by the entitlements revolution. Before the age of entitlements, self-reliance and the work ethic were integral and indispensable elements of the ideal of manliness in America. Able-bodied men who did not support themselves were shamed—and quite commonly, ashamed: the epithet “shiftless” was reserved for such men, and they were widely looked down upon by other Americans, irrespective of age, gender, or ethnicity.
The world is very different today. The dignity of work no longer has the same call on men as in earlier times. Over the past several generations, America has come to accept a huge move out of employment by men—in an era when work was readily available and when jobs were taken up increasingly by women. Put simply, the arrival of the entitlement society in America has coincided with a historically unprecedented exit from gainful work by adult men.
Figure 22 frames the dynamic by outlining trends in the labor force participation rate—the ratio of persons working or seeking work to the total reference population. From 1948 to 2011 the overall labor force participation rate for American adults age twenty and over rose—from about 59 percent to about 66 percent, despite the 2008 crash. But this arithmetic average is the confluence—really, a convergence—of two very different trends. Since 1948 the U.S. female labor force participation rate has soared: from about 32 percent to almost 60 percent. But over those same years, the male labor force participation rate plummeted: from about 89 percent to just 73 percent. Labor force participation rates for men and women are closer today than ever before—not only because of the inflow of women into the workforce, but also because of the withdrawal of men.
Under the force of these trends, men are literally becoming less important than women in keeping America at work. Over the past twenty years (1991–2011), for example, nearly 13 million of the country’s new adult jobholders were women—but just 11 million were men.[25] A multiplicity of social changes help explain the postwar feminization of the U.S. labor force, but the great decline in work by America’s men also demands notice and requires explanation.
Figure 23 helps us understand the phenomenon of the vanishing male worker in contemporary America. For U.S. men twenty years of age and older, Figure 23 depicts both the employment-to-population ratio and the labor force participation rate. The gap between these two lines represents the unemployed: those in the workforce, seeking employment, but without jobs. As may be seen, a terrible gap between these two lines opened up in 2008, with the paroxysms of the Great Recession. At its widest point in postwar history—in the year 2010—that gap amounted to 6.5 percent of the total male population age twenty and older. On the other hand, in the sixty years between 1948 and 2008—that is to say, before the crash—the male labor force participation rate fell by nearly 13 percentage points. In other words, male employment levels today have been depressed twice as much by the drop in the share of men seeking work as by the lack of work, in the depths of the Great Recession, for those seeking jobs. Between 1948 and 2011 the proportion of adult men who did not consider themselves part of the workforce steadily rose, from under 13 percent to almost 27 percent. From this perspective, a large part of the jobs problem for American men today is not wanting one.
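The "twice as much" comparison in the preceding paragraph is simple arithmetic on the two series in Figure 23. A quick check in Python, using the rounded figures quoted in the text rather than fresh BLS extracts:

    # Back-of-envelope check of the "twice as much" claim, using the
    # rounded figures quoted in the text (not fresh BLS data pulls).
    lfp_1948 = 89.0              # male labor force participation, % of pop. 20+
    lfp_2008 = 89.0 - 13.0       # participation had fallen ~13 points by 2008
    unemployment_gap_2010 = 6.5  # LFP minus employment/population ratio, 2010 peak

    participation_drop = lfp_1948 - lfp_2008
    print(participation_drop / unemployment_gap_2010)   # -> 2.0

The 13-point withdrawal from the workforce is twice the size of even the worst recession-era gap between those seeking work and those finding it.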
The decline in male labor force participation rates since the end of World War II admittedly reflects in part the aging of American society. But that particular aspect of the overarching postwar male flight from work should not be overstated. In 1950, men age sixty-five or older comprised just under 12 percent of all men above the age of twenty;[26] sixty years later, the corresponding figure was just under 16 percent.[27] Thus the growth in the share of senior citizens can explain only about 4 percentage points of the roughly 16-point drop in male labor force participation rates. More consequential was the retreat from work in the prime working-age groups. In 1950 just 3.5 percent of noninstitutionalized American men between the ages of twenty-five and fifty-four did not count themselves as part of the country’s workforce. Sixty years later, the corresponding share was over three times as high—almost 11 percent.[28] Over those same decades, incidentally, the health status of that twenty-five- to fifty-four-year-old group improved substantially: where the odds of dying during that portion of the life course were about 19 percent for American men back in 1950, they had dropped to less than 9 percent by 2009.[29]
Americans still tend to regard themselves as a distinctively hard-working people, and in important respects, hard facts do bear this out. Americans with jobs work much longer nowadays than their continental European counterparts: by the reckoning of Harvard’s Alberto Alesina and his colleagues, in the early years of the 2000s, employed Americans were working an average of more than eighteen hundred hours per year—20 to 25 percent longer than the average German or French worker, 35 percent longer than the average for Sweden, and almost 50 percent longer than counterparts in the Netherlands.[30] But these averages are for people actually at work. Paradoxically, labor force participation ratios for men in the prime of life are demonstrably lower in America than in Europe today.
The paradox is highlighted in Figure 24, which contrasts labor force participation rates for men in their late thirties in the United States and Greece. In the United States, as in most modern societies, men in their late thirties are the demographic with the highest rates of labor force participation. And Greece, given its ongoing public debt and finance travails, is at the moment a sort of poster child for the over-bloated, unsustainable European welfare state. Be all that as it may: the fact is that a decidedly smaller share of men in their late thirties have opted out of the workforce in Greece than in the United States. By 2003—well before the Great Recession—7.2 percent of American men in this age group were outside the workforce, as against just 3 percent in Greece. Nor is Greece an anomalous representative of European work patterns in this regard. Quite the contrary: according to the International Labor Office’s LABORSTA database,[31] almost every Western European society was maintaining higher labor force participation rates than America by this criterion. Around the year 2004, thirteen members of the EU-15 reported higher participation rates for men in their late thirties than America’s own (only Sweden’s was a shade lower; Britain’s data do not break out participation rates for the thirty-five- to thirty-nine-year-old age group). To the American eye, Europeans may take a great many holidays and vacations—but the fact of the matter is that American men near the height of their powers are much more likely than their European brethren to go on permanent vacation.
How has America’s great postwar male flight from work been possible? To ask the question is to answer it: this flight is a creature of our entitlement society and could not have occurred without it. Transfers for retirement, income maintenance, unemployment insurance, and all the rest have made it possible for a smaller fraction of adult men to be engaged in work today than at any time since the Great Depression—and, quite possibly, at any previous point in our national history. For American men, work is no longer a duty or a necessity: rather, it is an option. In making work merely optional for America’s men, the US entitlement state has undermined the foundations of what earlier generations termed “the manly virtues”—unapologetically, and without irony. Whatever else may be said about our country’s earlier gender roles and stereotypes, it was the case that the manly virtues cast able-bodied men as protectors of society, not predators living off of it. That much can no longer be said.
From a Nation of Takers to a Nation of Gamers to a Nation of Chiselers
With the disappearance of the historical stigma against dependence on government largesse, and the normalization of lifestyles relying upon official resource transfers, it is not surprising that ordinary Americans should have turned their noted entrepreneurial spirit, not simply to maximizing their take from the existing entitlement system, but to extracting payouts from the transfer state that were never intended under its programs. In this environment, gaming and defrauding the entitlement system have emerged as a mass phenomenon in modern America, a way of life for millions of men and women who would no doubt unhesitatingly describe themselves as law-abiding and patriotic citizens of the United States.
Abuse of the generosity of our welfare state has, to be sure, aroused the ire of the American public in the past, and continues to arouse it from time to time today. For decades, a special spot in the rhetorical public square has been reserved for pillorying unemployed “underclass” gamers who cadge undeserved social benefits. (This is the “welfare Cadillac” trope, and its many coded alternatives.) Public disapproval of this particular variant of entitlement misuse was sufficiently strong that Congress managed to overhaul the notorious AFDC program in a reform of welfare that replaced the old structure with TANF. But entitlement fiddling in modern America is by no means the exclusive preserve of a troubled underclass. Quite the contrary: it is today characteristic of working America, and even those who would identify themselves as middle class.
Exhibit A in the documentation of widespread entitlement abuse in mainstream America is the explosion over the past half century of disability claims and awards under the disability insurance provisions of the U.S. Social Security program. In 1960 an average of 455,000 erstwhile workers were receiving monthly federal payments for disability. By 2010 that total had skyrocketed to 8.2 million (and by 2011 had risen still further, to almost 8.6 million).[32] Thus, the number of Americans collecting government disability payments soared eighteen-fold over the fifty years from 1960 to 2010. In the early 1960s almost twice as many adults were receiving AFDC checks as disability payments;[33] by 2010, disability payees outnumbered the average calendar-year TANF caseload by more than four to one (8.20 million vs. 1.86 million[34]). Moreover, recipients of government disability payments had jumped from the equivalent of 0.65 percent of the economically active eighteen- to sixty-four-year-old population in 1960 to 4.6 percent by 2010. In 1960, there were over 150 men and women in that cohort working or seeking employment for every person on disability; by 2010, the ratio was 22 to 1 and continuing to decrease. The ratios are even starker when it comes to paid work: in 1960, roughly 120 Americans were engaged in nonfarm employment for every officially disabled worker; by December 2011 there were just over 15.[35]
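The ratios in this paragraph follow directly from the quoted counts and shares. A small Python sketch, again using the rounded figures from the text rather than the underlying SSA tables:

    # Disability ratios implied by the rounded figures quoted in the text.
    recipients_1960 = 455_000
    recipients_2010 = 8_200_000
    print(recipients_2010 / recipients_1960)   # ~18: the eighteen-fold rise

    # Recipients as a share of the economically active 18-64 population:
    share_1960, share_2010 = 0.0065, 0.046
    print(1 / share_1960, 1 / share_2010)      # ~154 active adults per recipient, then ~22

At 0.65 percent, there is one recipient for every 154 or so economically active adults; at 4.6 percent, one for every 22.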
Although the Social Security Administration does not publish data on the ethnicity of its disability payees, it does publish information on a state-by-state basis. These data suggest that the proclivity to rely upon government disability payments today is at least as much a “white thing” as a tendency of any other group. As of December 2011 the state with the very highest ratio of working-age disability awardees to the resident population ages eighteen to sixty-four was West Virginia (9.0 percent—meaning that every eleventh adult in this age group was on paid government disability). According to Census Bureau estimates, 93 percent of West Virginia’s population was “non-Hispanic white” in 2011.[36] In New England, by the same token, all-but-lily-white Maine (where ethnic minorities accounted for less than 6 percent of the population[37] in 2011) records a 7.4 percent ratio of working-age disability payees to resident working-age population: more than one out of fourteen.
On the other hand, in the District of Columbia, where so-called Anglos or non-Hispanic whites composed just 35 percent of the population in 2011,[38] the ratio of working-age disability recipients to working-age resident population was 3.3 percent—less than half of Maine’s, and a bit more than a third of West Virginia’s.
America’s dramatic long-term rise in the proportion of working-age men and women designated as possessing entitlement-worthy disabilities is all the more remarkable when one bears in mind the tremendous improvements in public health between 1960 and 2010. Between 1960 and 2009, according to the reckoning of the Human Mortality Database, overall life expectancy at birth in the United States increased by nearly nine years, and life expectancy at age eighteen jumped by seven years (from 54.5 to 61.5). Over that same period, the odds of dying between one’s eighteenth and sixty-fifth birthdays fell markedly: from 26.1 percent to 15.1 percent, or by well over two-fifths[39] (see Figure 25). Furthermore, the automation of work and the rise of the service/information economy over those same decades made the daily routines of Americans ever less physically demanding. Given these factors, what is the source of the seven-fold rise in the proportion of working-age Americans on government-paid disability over the past half century?
Paradoxically, despite the general aging of the population as a whole and the workforce in particular, there has been a gradual reduction in the age of the disability-entitled over time.[40] In 1960, for example, 6.6 percent of men and 6.4 percent of women on disability were in their thirties or early forties; by 2011 the corresponding shares were 15 percent and 16.2 percent, respectively.[41] More and more Americans, it would seem, are making the securing of disability status their lifelong career; collecting disability is an increasingly important “profession” in America these days.
Hints can be found in the diagnostic categories under which disability claimants were awarded their federal stipends. In December 2011, of the 8.6 million workers on government disability, 1.5 million, or slightly over 15 percent, had been granted benefits on the basis of “mood disorders,” and another 2.5 million, or 29 percent, on the basis of diseases of the “musculoskeletal system and the connective tissue.”[42] Together, these diagnoses account for nearly half of all disability diagnoses today. In 1960, in contrast, musculoskeletal problems and mental disorders of all types accounted for only one-fifth of disability awards.[43]
The exceptionally rapid increase in awards for mood disorders and musculoskeletal problems over the past fifty-plus years may speak in part to improvements in diagnostics and redress of previously unreported afflictions. On the other hand, one may note that it is exceptionally difficult—for all practical purposes, impossible—for a medical professional to disprove a patient’s claim that he or she is suffering from sad feelings or back pain.
By year-end 2011, more Americans were “employed” (in the sense of having a source of steady income) via government disability than in construction, or in transport and warehousing—and over three times as many as in information technology services. In terms of gross manpower, workers on disability pay are today nearly in the same league as the entire U.S. manufacturing sector: for every one hundred industrial workers in December 2011, there were seventy-three ex-workers receiving federal disability pay.
Apparently, disability is also the healthiest and most dynamic area of the labor force. Between January 2010 and December 2011, for example, the U.S. economy generated 1.73 million nonfarm jobs—and added almost half as many (790,000) workers to the rolls of federal disability payments.[44] Nor can this perverse pattern be discounted as a short-term phenomenon peculiar to the nature of the recovery from the crash of 2008. Over the fifteen years between December 1996 and December 2011, America gained 8.8 million nonfarm private-sector jobs—and 4.1 million workers on disability payments. In the decade between December 2001 and December 2011, nongovernment nonfarm employment rose by fewer than 1 million jobs (828,000), while the ranks of the working-disabled swelled by over 3 million (3.036 million, to be precise).
In FY 2011 the Social Security Administration disbursed over $130 billion in payments for its Disability Insurance (DI) program. An additional $56 billion went to the Supplemental Security Income (SSI) program, many of whose recipients qualify on the grounds of being work-disabled.[45] Many more claimants are taking benefits from the DI program today than was envisioned by its overseers even a few years ago. According to recent projections by the Congressional Budget Office, the Social Security DI trust fund is on track to go bankrupt in just four years.[46] The greatest costs from the mass gaming of disability payments, however, are not necessarily economic.
In “playing” the disability system, or cheating it outright, many millions of Americans are making a living by putting their hands into the pockets of their fellow citizens—be they taxpayers now alive or as yet unborn (a steadily growing phenomenon, as we shall see in a moment). And it is not simply the disability gamers themselves who are complicit in this modern scam. The army of doctors and health-care professionals who are involved in, and paid for their services in, certifying dubious workers’ compensation cases are direct—indeed indispensable—collaborators in the operation. The U.S. judicial system—which rules on disability cases and sets the standards for disability qualification—is likewise compromised. More fundamentally, U.S. voters and their elected representatives are ultimately responsible for this state of affairs, as willing and often knowing enablers. This popular tolerance for widespread dishonesty at the demonstrable expense of fellow citizens leads to an impoverishment of the country’s civic spirit and an incalculable degradation of the nation’s constituting principles.
The Myth of “Pay-as-You-Go” Entitlements: In Reality, Increasingly Financed by the Unborn
As Americans opt to reward themselves ever more lavishly with entitlement benefits, the question of how to pay for these government transfers inescapably comes to the fore. As the transfer payment lifestyle has become normalized and generalized in modern America, U.S. citizens have become ever more broadminded about the propriety of tapping new sources of finance to support their appetite for more, and more immediate, entitlements. The taker mentality has thus ineluctably gravitated toward taking from a pool of citizens who can offer no resistance to such schemes: the unborn descendants of today’s entitlement-seeking population.
The intention to plunder the earnings of future generations of Americans through current entitlement programs is, at least for now, most transparent in the design of our policies for income maintenance and health care for our retirees and older citizens: namely, Social Security (more technically, OASI) and Medicare. In theory, Social Security and Medicare are both meant to be self-financed social insurance programs, by which an enrollee’s premium payments during the working years cover needs in retirement and later life. As actually structured, these programs rely upon contributions from current workers to sustain current recipients: these arrangements are known as the “pay-as-you-go” approach. And although Social Security and Medicare beneficiaries formally draw their payments from officially established trust funds, as a practical matter these outlays are not meant to be paid for through set-asides from the recipient cohorts themselves (though Medicare has been in operation for over forty-five years, and Social Security over seventy-five years). Instead they are designed to rely upon the resources of subsequent cohorts of income earners. In effect, both are intergenerational resource transfer plans, whereby today’s takers, with very few exceptions, consume at the expense of those born after them.
Under such circumstances, it may seem like only a small step to move from taxing the current generation of workers to taxing the following generation—or untold ones after that—in order to provide today’s older Americans with the government pensions and health-care services they take as their due. That fateful line was crossed by the U.S. welfare state long ago: there has never been any great interest in protecting the rights of the unborn on the part of U.S. social insurance programs and their presumptive beneficiaries. In consequence, all too predictably, the U.S. trust funds for both Social Security–OASI and Medicare are not endowments at all, but accounting contrivances built upon a mountain of future IOUs.
The plain fact is that neither the Social Security nor the Medicare trust funds can honor the future promises they have made today. Both are woefully unsound from an actuarial standpoint, which is no secret: the administrators of both entitlement programs not only admit as much, but calculate the estimated magnitude of these unpayable promises (the “net present value of unfunded liabilities”) every year in a report to the respective funds’ trustees.
In its most recent report, the Social Security program reckons these unfunded liabilities to be on the order of $8.6 trillion in current US dollars for the seventy-five years commencing January 1, 2012. If the program is to last indefinitely, the implied “unfunded liability through an infinite horizon” would be over $20 trillion.[47] By way of comparison: at the start of 2012, US GDP amounted to a little over $15 trillion.[48]
The Medicare program, for its part, may be even further out of kilter. The 2012 Medicare trustees’ report indicates its unfunded obligations over the next seventy-five years to total nearly $27 trillion,[49] and this assessment may be optimistic: an alternative scenario offered by Medicare’s Office of the Actuary presented an outlook of nearly $37 trillion in unfunded obligations over those same years.[50] Some careful independent analysts have suggested that the true size of these unfunded liabilities may be still greater than the government’s actuaries suggest.[51] (And the “infinite horizon” estimate of Medicare’s unfunded liabilities would presumably be larger still.)
These calculations are, of course, subject to technical considerations and assumptions (the growth outlook, the demographic outlook, the suitable long-term interest rate, the cost outlook for goods and services inside and outside the health sector, and so forth). Their results, accordingly, are sensitive to changes in assumptions about the future. They are thus best regarded as ballpark projections rather than precise forecasts.
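In outline, the "net present value of unfunded liabilities" is nothing more exotic than a discounted sum of projected annual shortfalls (scheduled outlays minus dedicated revenues). The Python sketch below uses an invented flat shortfall purely to illustrate how sensitive the headline number is to the discount-rate assumption just mentioned; it is not a projection of either trust fund.

    # Stylized present value of unfunded obligations over a 75-year horizon.
    # The flat $0.5 trillion annual shortfall is an invented placeholder,
    # not an actuarial projection of Social Security or Medicare.
    def unfunded_pv(shortfalls, rate):
        return sum(s / (1 + rate) ** t for t, s in enumerate(shortfalls, start=1))

    shortfalls = [0.5] * 75   # $ trillions per year, hypothetical
    for r in (0.02, 0.03, 0.04):
        print(f"discount rate {r:.0%}: PV = ${unfunded_pv(shortfalls, r):.1f} trillion")

Even in this toy version, moving the discount rate by a single percentage point in either direction swings the present value by several trillion dollars, which is one reason these figures are best read as ballpark projections.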
Such computational arcana, however, should not distract from the central driver behind these truly stupendous imbalances underlying our two largest entitlement programs: the impulse to take benefits here and now, and leave it to other people later on to figure out how to pay for it all. There are many possible rationalizations for such a disposition: the notion that we have “already paid for” the old-age benefits we expect to collect through our past payroll taxes (a patent arithmetic misconception), for example, or the notion that our descendants will likely be more affluent than we are and thus better placed to take care of the bills we are leaving behind for them. Yet whatever the rationalizations—and a great many of them circulate in public discourse today—none can make up for the sacrifice of principle that lies at the heart of this problem. For the sake of pure short-term expedience, the U.S. democracy has determined to mortgage its tomorrow for a more comfortable retirement today.
To some, the question of the unfunded balances in our very biggest entitlement programs may seem abstract, or even moot. The future, after all, is full of incertitudes, and a world seventy-five years hence is hard to envision with any confidence. The here and now, on the other hand, is all too real and pressing; politics, moreover, as a sport played on a day-by-day basis, is especially responsive to immediate desires and urges. Yet the consequences of the increasing American fiscal predilection for using the unborn as a sort of all-purpose credit card are already coming into view.
As much may be seen in Figure 26, which contraposes government outlays for Social Security and Medicare against the federal budget deficit over the past four decades. There is an irregular but all too steady correspondence between those two quantities over the years in question. The federal deficit is an arithmetic difference between receipts and expenditures, and thus not program-specific. Yet to judge only by its performance specifics, one would be tempted to say that the purpose of the federal deficit in recent decades has in effect been to fund our pay-as-you-go entitlement programs. (This was not true in the mid- to late 1990s, in the years when the Clinton administration was squaring off against a Republican Congress—but those years in retrospect look like a temporary aberration.) In recent years, the federal deficit has been almost exactly equal to our programmatic spending on Social Security and Medicare combined.
There are perfectly good reasons for free peoples to run government deficits and thus contract public debt: these include providing for response to dire national emergencies or perhaps underwriting investment projects in potentially productive infrastructure. The wholesale financing of current public consumption through the device of obliging unborn Americans to cover those costs (plus interest), however, has not previously been characteristic of our democratic governance.[52] Irrespective of the economic implications of this insidious innovation, this new approach to entitlements necessarily means we will be leaving a very different heritage of mores to our legatees from the one that we inherited from our forebears.     
Crowding Out Defense: Making National Security “Unaffordable” for History’s Richest Country
Unlike entitlement payments, which are nowhere mentioned in the Constitution (and might even have been unimaginable to our Founding Fathers as a function of government), the U.S. Constitution expressly establishes national security, and the maintenance of the armed forces to provide for “the common defense,” as a prime responsibility of the American state. Recall, perhaps, that the president’s first (and thus foremost) enumerated power under the Constitution is his role as “commander in chief” (Article II, Section 2).
Over the past generations, that task of providing for the common defense was acquitted tolerably well. For better or worse, the United States in fact ended up as the world’s sole superpower at the dawn of the new millennium (at least for the time being), having won, inter alia, two world wars and a cold war. As one might expect from such exertions, American defense has been a staggeringly expensive project over the past century. At this writing (2012), overall national defense expenditures are running at over $700 billion a year[53]—a level that not only dwarfs any other presumptive contemporary competitor, but accounts for close to half of all worldwide military expenditures, according to many analysts.[54]
The size and scope of America’s military allocations, to be sure, has many critics—at home as well as abroad. Commentators on the right and left of the U.S. political spectrum decry what they call the American “national security state” (a term that has acquired a marked opprobrium in the decades since its early and decidedly more neutral Cold War coinage). And concern about excessive defense spending has hardly been a preserve of fringe extremists: no less a figure than Dwight Eisenhower, architect of the D-Day invasion and onetime supreme commander of NATO forces, warned of the dangers of a “military-industrial complex” in his famous 1961 farewell presidential address.[55]
A healthy measure of informed public skepticism toward any and all proposed military expenditures is not only suitable but essential for open democratic societies. A free people, after all, will jealously guard against impingements upon their liberties—including those arising from excessive, wasteful, or unwise outlays in the name of national defense.
But the notion that defense spending today is entirely or even mainly responsible for the burden of government that the American citizenry shoulders, though still widely believed, is by now utterly antique and completely at odds with the most basic facts. The days in which the national security state consumed more public resources than the welfare state are long past. U.S. government outlays on entitlements do not merely exceed those for defense nowadays: entitlements completely overshadow defense in the U.S. fiscal picture. Increasingly, moreover, our seemingly insatiable national hunger for government transfer payments to individual citizens stands to compromise our present and future capabilities for military readiness.
In 1961, the year of Eisenhower’s admonition about the military-industrial complex, America was devoting close to two dollars to defense for every dollar it provided in domestic entitlement payments.[56] Up to that point, defense expenditures had routinely exceeded any and all allocations for social insurance and social welfare throughout U.S. history.[57] But in 1961 a geometric growth of entitlement payments was just starting. Thanks to the unrelenting force of that spending surge, government transfer payments to individuals surpassed defense spending in just a decade—in 1971, in the midst of the Vietnam War. And for the following forty years, entitlements have continued to surpass defense expenditures—by progressively widening margins, to boot. By 2010 the United States was spending well over three times as much on transfer payments as on its entire national security budget—notwithstanding active and simultaneous overseas military campaigns in Iraq and Afghanistan (see Figure 27).
America’s ramp-up of military outlays in the decade after the September 11, 2001, attacks is well known. Much less widely known is the fact that this massive upsurge in military spending was more or less eclipsed by the enormous increase in spending on domestic entitlements over those same years. This fact may be demonstrated in many different ways, but a comparison of current spending trends for defense and entitlements over the 2001–10 period may be clearest.[58]
In FY 2001 the United States spent $305 billion on defense; for 2001–10, the cumulative total was $5.05 trillion. Over those ten years, in other words, America spent $2 trillion more on defense than if it had simply continued along at the 2001 baseline. In contrast, America was spending $1.13 trillion on entitlements in 2001, and ended up spending a cumulative total of $16.03 trillion on those transfers for 2001–10—nearly $4.8 trillion more than would have been spent on the nominal-dollar baseline from 2001. By this measure, the absolute growth over the last decade in entitlement spending was nearly two and a half times greater than the corresponding increment in defense spending. The upsurge in military spending was widely discussed over those years, and often decried as unaffordable. Curiously, considering the magnitude of the quantities involved, the great simultaneous leap in entitlement spending did not seem to attract similar critical public attention.
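The baseline comparison above reduces to a few lines of arithmetic. A quick check in Python, using the current-dollar figures quoted in the text:

    # Reproducing the 2001-baseline comparison (current dollars, $ trillions),
    # using only the figures quoted in the text.
    defense_2001, defense_cum = 0.305, 5.05
    entitlements_2001, entitlements_cum = 1.13, 16.03

    defense_excess = defense_cum - 10 * defense_2001                # ~$2.0T over baseline
    entitlement_excess = entitlements_cum - 10 * entitlements_2001  # ~$4.7T over baseline
    print(defense_excess, entitlement_excess, entitlement_excess / defense_excess)  # ~2.4x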
At this writing, the US defense budget is under mounting pressure. It is more than just a matter of winding down America’s commitments in Iraq and Afghanistan: a much more fundamental and far-reaching recasting of defense posture and global capabilities appears to be afoot. Its gathering manifestations are seen in the 2012 White House budget request, which slashed nearly half a trillion dollars from the previous official plans for military spending over the coming decade, and in the president’s January 2012 strategic guidance to revise and downsize America’s global force structure.[59] And these cuts may only be a foretaste of what is to come: last year’s bipartisan congressional “sequestration” deal, for example, calls for another half trillion dollars in prospective defense cuts in the years immediately ahead if a deficit reduction deal is not hammered out by the end of 2012.
The rationale for the slashing of overall U.S. defense capabilities in the years ahead is spelled out plainly in the Defense Department’s document on the new policy: budgetary exigency. As the president’s transmittal letter puts it, “We must put our fiscal house in order and renew our long-term economic strength. To that end, the Budget Control Act of 2011 [the aforementioned ‘sequestration’ deal] mandates reductions in federal spending, including defense spending.”[60] In short, America’s current defense posture is unsustainable because it is unaffordable.
But why, exactly, should America’s current and (heretofore) future military commitments be regarded as “unaffordable”?  
In 2010 the national defense budget amounted to 4.8 percent of current GDP (see Figure 28). As a fraction of U.S. national output, our military burden was thus lower in 2010 than in almost any year during the four-plus decades of the Cold War era. In 1961—the year of Eisenhower’s farewell address—the ratio of defense spending to GDP was 9.4 percent,[61] nearly twice as high as in 2010. Americans may deem our defense commitments to be ill-advised, poorly purchased, or otherwise of questionable provenance—but as a pure question of affordability, the United States is in a better position to carry its current defense burden than at virtually any time during the Cold War era.
The true problem with defense “affordability” today is not our ability to pay for these outlays per se, but rather our overall national spending priorities. Entitlements are still sacrosanct: there is as yet no serious talk of reining in their growth path. The aforementioned congressional sequestration deal does not dare to cut into any of this group of outlays—nor does the president’s new budget proposal. By the calculus of American policymakers today, then, US defense capabilities seem to be the primary area sacrificed to make the world safe for the unrestrained growth of American entitlements.
But this inverted, perverse, and feckless mind-set can only take a nation of takers so far, notwithstanding all that this approach may presage for the security of our country. As a matter of pure arithmetic, the urge to skimp on national defense to support our welfare state is utterly unsustainable. Consider: on its current trajectory, the U.S. government’s transfer payments are on track to increase by over $700 billion over the next four years. As it happens, our total current national outlay for all defense and security programs is roughly $700 billion. Even if our national defenses were eliminated totally tomorrow, it would take just one presidential term for the growth of personal transfers from the U.S. government to absorb the whole of our current defense budget.
Entitlements, Dependence, and the Politics of Populist Redistribution
An unavoidable consequence of the noxious something-for-nothing thinking that lies at the heart of the modern entitlement mentality is the resort to redistributionist politics. Actuarially sound insurance programs are one thing, but if we are to count on drawing more from the public purse than we can predictably be said to have contributed, someone else is going to have to foot the bill. There is no alternative answer to this simple arithmetic proposition. And today, the potential funding sources for subsidizing something-for-nothing policies are three: other countries, other citizens, or other generations. As the unsated contemporary public appetite for government transfers continues to mount, each of these sources will be assiduously tapped.
Plundering our descendants’ wealth to finance today’s entitlements is easy to do under America’s current political economy—and is being done now, as noted here. Financing current entitlements from foreign wealth is a potentially feasible but decidedly more complex proposition, and it need not detain us in the context of this discussion. The final option under consideration is financing today’s something-for-nothing politics from within the pool of living, voting Americans.
Curiously, very little empirical research is available on the overall incidence of taxation and transfer recipience in modern America. Available research for the most part deals with the burden of income taxes alone, excluding the many other taxes Americans bear these days—and neglecting almost entirely the issue of transfer benefits. One admirable exception to this general observation is a study by the Heritage Foundation’s Robert Rector and Christine Kim, focused on the year 2004—before the Great Recession, and its attendant calls for more redistributionist policies (see Figure 29 and Figure 30).
By Rector and Kim’s estimate, U.S. fiscal policy in 2004 was highly progressive: the top fifth was transferring out a net of about $15,000 per person per year, and the bottom fifth was taking in a net of nearly $14,000 per person per year in government transfers. Poorer people paid taxes too, to be sure. But for every dollar in federal, state, and local taxes the top fifth gave over, they received an estimated 31 cents back in transfer benefits—whereas the bottom fifth received close to $7 in transfer benefits.
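The per-dollar figures above also imply rough per-person tax and benefit levels for the top fifth. A small Python sketch of that implication, under the text's rounded numbers; the derived amounts are inferences from those inputs, not figures from the Rector and Kim study itself:

    # Implied per-person taxes and benefits for the top fifth, 2004 dollars.
    # Inputs are the rounded figures quoted in the text; outputs are inferred.
    net_transfer_out = 15_000        # net fiscal outflow per person per year
    benefits_per_tax_dollar = 0.31   # transfer benefits received per $1 of taxes

    # net_out = taxes - benefits, and benefits = 0.31 * taxes,
    # so taxes = net_out / (1 - 0.31)
    taxes = net_transfer_out / (1 - benefits_per_tax_dollar)
    print(round(taxes), round(taxes * benefits_per_tax_dollar))  # ~21739 in taxes, ~6739 back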
We cannot know the current landscape of redistribution until the data underlying Figures 29 and 30 are updated. Yet we do know two critical facts about the political environment that will be shaping the dynamics of political redistribution in the years immediately ahead.
The first is that earned success is being delegitimized in some political circles—including some highly powerful ones at this writing. President Obama’s momentous recent campaign speech insinuated that entrepreneurs had not achieved their accomplishments on their own, but rather owed their wealth to preexisting government expenditures[62] on infrastructure and in their own upbringing. That narrative not only attributes today’s American wealth to government entitlement programs, but also opens the rhetorical door to almost limitless extractions from the well-to-do, on the grounds that their success is not really theirs to enjoy.
The second is the brute fact of rising calendar-year income dispersion in modern America, and of seemingly ever greater year-to-year volatility in household earnings.[63] With more or less steadily rising “income inequality” by such metrics (though not by all metrics of economic inequality[64]), the call may gradually grow for explicitly redistributionist policies—whether to redress purportedly embedded structural defects in our economic system, or to stimulate prosperity through neo-Keynesian spending schemes, or both.
The underlying problem here, unfortunately, is that such battle cries are in themselves subversive of both the formula that has to date facilitated such an extraordinary generation of wealth in the American republic and its moral underpinnings. Something-for-nothing politics requires at least a pretense of justification for its takings, and one of the most convenient rationales for excusing such takings is the claim that our opportunity society no longer really works. The irony here is that something-for-nothing politics can itself make the claim come true—if pursued on a sufficiently grand scale and in a sufficiently reckless manner.
A Nation of Takers: Is the Syndrome Unsustainable?
Within policymaking circles in Washington today, it is very close to received wisdom to observe that America’s national hunger for entitlement benefits has placed the country on a financially untenable trajectory, with the federal budget—which is to say, the entitlements machine—generating ultimately unbearable expenditures and levels of public debt. The bipartisan 2010 Bowles/Simpson Commission put the viewpoint plainly: “Our nation is on an unsustainable fiscal path.”[65] The same point has been made by many in Congress, by the Congressional Budget Office (CBO), and by the current head of the President’s Council of Economic Advisers (CEA), who wrote an academic paper shortly before entering the government warning that projections by the CBO and the trust funds for Social Security and Medicare were likely much too optimistic, that fiscal imbalances were on track to pose a public debt crisis within a decade, and that avoiding such a crisis would “require much larger changes than have received serious consideration in the policy process to date.”[66]
The prospect of careening along an unsustainable economic road is deeply disturbing. But another possibility is even more frightening—namely, that the present course may in fact be sustainable for far longer than most people today might imagine.
The United States is a very wealthy society. If it so chooses, it has vast resources to liquidate. In the public sphere are many trillions of dollars worth of assets, accumulated over the centuries—land, buildings, art, and the like—that can still be sold.
 In the private sector, corporate and personal debt still exceeds net public debt by a very healthy ratio: according to the Federal Reserve Bank’s estimate, as of 2011, three-quarters of all U.S. credit market debt had been contracted for private purposes (mortgages, business investments, and the like).[67] Thus there remains plenty of room for diminishing the role of such transactions in American economic life—or perhaps for crowding them out almost altogether.
And internationally the U.S. dollar is still the world’s reserve currency; there remains great scope for taking financial advantage of that privilege. As a practical matter, there is no realistic international alternative to the dollar—at least for now. It could take many years—maybe decades—for the United States to sacrifice this status by undermining the dollar’s credibility as an international medium of exchange. Debasing the dollar to finance continuing expansion of domestic spending could eventually look like an option worthy of serious consideration—at least, in an America addicted to and enslaved by entitlements.
Such devices admittedly would not forestall economic ruin indefinitely—but they might well greatly postpone the day of judgment. Not so the day of reckoning for the American character, which may be sacrificed long before the credibility of the U.S. economy. Some would argue that this asset is already wasting away before our very eyes.
Whether you agree or disagree with Eberstadt’s point of view and arguments, you will want to read a dissenting point of view by William A. Galston, entitled “Have We Become a ‘Nation of Takers’?”


[1] Daniel Patrick Moynihan, The Politics of a Guaranteed Income (New York: Random House, 1973), 17.
[2] Bureau of Economic Analysis, State Personal Income 2005, Section VI: Personal Current Transfer Receipts, March 28, 2006, http://www.bea.gov.
[3] Merriam-Webster.com, s.v. “entitlement,” http://www.merriam-webster.com.
[4] Bureau of Economic Analysis, State Personal Income and Employment Methodology, September 2011, http://www.bea.gov.
[5] These calculations use the Consumer Price Index (CPI) to adjust for inflation. U.S. Department of Labor, Bureau of Labor Statistics, Consumer Price Index, ftp://ftp.bls.gov. A good technical argument can be made for using the Personal Consumption Expenditure Price Index (PCEPI) instead. See Bureau of Economic Analysis, Table 2.3.3, Real Personal Consumption Expenditures by Major Type of Product, available at http://www.bea.gov. Using the PCEPI, the real growth of entitlement transfers would have been over fifteen-fold, at an average annual pace of 5.6 percent.
[6] Using the PCEPI, the tempo would have been closer to 4.4 percent per annum, implying a total increase of about 860 percent.
[7] Bureau of Economic Analysis, Personal Income and Outlays, Table 2.1, Personal Income and Its Disposition, http://www.bea.gov.
[8] The World Factbook 2010 (Washington, DC: Central Intelligence Agency, 2010).
[9] Bureau of Economic Analysis, Survey of Current Business, vol. 9, no. 8, August 2011, GDP and Other Major NIPA Series, 1929–2011:II, http://www.bea.gov.
[10] Statistical and historical reference to poor relief experience pre New Deal
[11] At much the same time, it was learned that EBT spending could no longer be demonstrably limited to food purchase. In practice, the government apparently no longer tries to account for the purposes for which these debit cards are used. See Luke Rosiak, “Top Secret: Feds Won’t Say What Food Stamps Buy,” Washington Times, June 24, 2012, http://times247.com.
[14] Jeremy White, Robert Gebeloff, Ford Fessenden, Archie Tse, and Alan McLean, “The Geography of Government Benefits,” New York Times, February 11, 2012, http://www.nytimes.com.
[18] One may well ask why these reported rates are not 100 percent, since virtually all Americans are eligible for Medicare and Social Security by their sixty-fifth birthday. This could in part be a matter of underreporting, as with means-tested benefits. It is also possible that the difference could reflect the fact that some older persons wait past the age of sixty-five to collect Social Security benefits (monthly payments are scaled progressively upward for those who defer to age seventy), and some are theoretically ineligible for Medicare (such as illegal aliens who have not contributed to Social Security). Or it could be some combination of the two.
[20] See http://www.stanford.edu/~mrosenfe/Moynihan's%20The%20Negro%20Family.pdf.
[21] Results refer to simple bivariate and multivariate regressions with calendar year, unemployment, and family poverty rates as independent (or “x”) variables and the family dependency rate as the dependent (or “y”) variable. R-squares for unemployment vs. family dependency and family poverty vs. family dependency were each under 0.05, and neither relationship was statistically significant; calendar year vs. family dependency yields an R-square of 0.8 and was statistically significant. Multivariate regression with all three independent variables gave an R-square of 0.88, but produced nonsense results for family poverty (indicating that a rise in the former tended to reduce family dependence)—while offering an extraordinarily strong degree of statistical significance for the calendar-year variable, whose coefficient was 0.44.
[22] Daniel Patrick Moynihan, The Politics of a Guaranteed Income (New York: Random House, 1973), 18.
[23] Charles Murray, Losing Ground: American Social Policy, 1950–1980 (New York: Basic Books, 1984).
[24] Nicholas Eberstadt, “Economic and Material Poverty in Modern America” (unpublished paper, November 1986), 42.
[26] Derived from U.S. Census Bureau, “Historical Statistics of the United States, Colonial Times to 1970, Part 1,” Series A 119–34, 1975, p. 15.
[27] Derived from U.S. Census Bureau, “Statistical Abstract of the United States: 2012,” Table 9, 2012, p. 12.
[28] U.S. Bureau of Labor Statistics, One Screen database, Labor Force Statistics, Series LNU01300001, http://www.bls.gov/data/.
[29] Derived from life tables presented in the Human Mortality Database, University of California–Berkeley and Max Planck Institute for Demographic Research,  http://www.mortality.org.
[30] Alberto F. Alesina, Edward L. Glaeser, and Bruce Sacerdote, “Work and Leisure in the U.S. and Europe: Why So Different?” In Mark Gertler and Kenneth Rogoff, eds.,  NBER Macroeconomic Annual 2005 (Cambridge, MA: MIT Press, 2006), Table 2,  http://www.nber.org.
[31] International Labor Office Department of Statistics, LABORSTA Database, “Total and Economically Active Population by Age Group” for USA and EU-15 Countries (Austria, Belgium, Denmark, Finland, France, Germany, Greece, Ireland, Italy, Luxembourg, Netherlands, Portugal, Spain, Sweden, and United Kingdom), http://laborsta.ilo.org/.
[32] U.S. Social Security Administration, Annual Statistical Report on the Social Security Disability Insurance Program, 2011, Disabled Beneficiaries and Nondisabled Dependents, Table 1: Number, December 1960–2011, selected years, http://www.ssa.gov.
[33] Forthcoming
[34] U.S. Department of Health and Human Services, Administration for Children and Families, Office of Family Assistance, TANF: Total Number of Families, Fiscal and Calendar Year 2010 as of 05/16/2011, http://www.acf.hhs.gov.
[35] Derived from U.S. Social Security Administration, Annual Statistical Report on the Social Security Disability Insurance Program, 2011, Disabled Workers, Table 19,  http://www.ssa.gov; Bureau of Labor Statistics, http://data.bls.gov/pdq/SurveyOutputServlet.
[36] U.S. Department of Commerce, U.S. Census Bureau, West Virginia Quickfacts, June 7, 2012, http://quickfacts.census.gov.
[37] U.S. Department of Commerce, U.S. Census Bureau, Maine Quickfacts, June 7, 2012,  http://quickfacts.census.gov.
[38] U.S. Department of Commerce, U.S. Census Bureau, District of Columbia Quickfacts, June 7, 2012, http://quickfacts.census.gov/qfd/states/11000.html.
[39] Human Mortality Database, United States of America, Life tables (period 1x1), http://www.mortality.org.
[40] A phenomenon noted almost two decades ago. See Kalman Rupp and David Stapleton, “Determinants of the Growth in the Social Security Administration’s Disability Programs: An Overview,” Social Security Bulletin 58, no. 4 (1995),  http://www.ssa.gov.
[41] U.S. Social Security Administration, Annual Statistical Report on the Social Security Disability Insurance Program, 2011, Disabled Workers, Table 19.
[42] U.S. Social Security Administration, Annual Statistical Report on the Social Security Disability Insurance Program, 2011, Disabled Workers, Table 24: Distribution by Diagnostic Group and Age, December 2011, http://www.ssa.gov.
[43] U.S. Social Security Administration, Annual Statistical Report on the Social Security Disability Insurance Program, 2011, Awards to Disabled Workers, Table 18: By Diagnostic Group, 1960–2000, http://www.ssa.gov.
[44] See FN 
[45] U.S. Social Security Administration, FY 2013 President’s Budget, Table 3: SSA Outlays by Program, http://www.ssa.gov.
[46] U.S. Congressional Budget Office, Policy Options for the Social Security Disability Insurance Program, July 2012, http://www.cbo.gov.
[47] U.S. Social Security Administration, 2012 Old-Age, Survivors, and Disability Insurance (OASDI) Trustees Report, Table IV.B6: Unfunded OASDI Obligations through the Infinite Horizon, http://www.ssa.gov.
[48] U.S. Bureau of Economic Analysis, News Release, June 18, 2012, Table 3; figure is for First Quarter 2012, current dollars, http://www.bea.gov.
[49] Joseph Antos, “Medicare’s Fiscal Crisis and Options for Reform,” American Enterprise Institute, April 30, 2012, 1, http://www.aei.org.
[50] Romina Boccia, “CBO Report Echoes Trustees on Medicare, Social Security,” Heritage Foundation, June 14, 2012, Chart 4, http://www.heritage.org.
[51] See, for example, Jagadeesh Gokhale and Kent Andrew Smetters, “Fiscal and Generational Imbalances: New Budget Measures for New Budget Priorities,” American Enterprise Institute, August 1, 2003, which estimated the unfunded liabilities of these programs in 2003 at much higher levels than those officially reported at that time.
[52] A point made with both nuance and force in Christopher DeMuth, “Debt and Democracy,” Legatum Institute, May 12, 2012, http://www.hudson.org.
[53] The U.S. White House Office of Management and Budget, Historical Tables, Table 3.2: Outlays by Function and Subfunction: 1962–2017, http://www.whitehouse.gov. These definitions and estimates of national defense expenditure trends come from the White House—the office of the commander in chief.
[54] International Institute for Strategic Studies, The Military Balance 2012, March 2012, Figure: Comparative Defence Statistics, http://www.iiss.org.
[55] Dwight D. Eisenhower Presidential Library and Museum, Farewell Radio and Television Address to the American People, January 17, 1961, http://www.eisenhower.archives.gov.
[56] $49.6 billion vs. $29.5 billion (in current dollars). Defense spending from the U.S. White House Office of Management and Budget, Historical Tables, Table 3.1: Outlays by Superfunction and Function, 1960–2017, http://www.whitehouse.gov; government transfers to individuals from Bureau of Economic Analysis, Personal Current Transfer Receipts, http://www.bea.gov/iTable/iTableHtml.cfm?reqid=70&step=30&isuri=1&7028=-….
[57] There had been exceptions to this generalization—pension payments for veterans exceeded defense budgets in the period after the Civil War, for instance—but they were just that: exceptions.
[58] The comparison is admittedly imperfect. Current dollars are not adjusted for inflation, so they exaggerate the dimensions of the true increases in spending over time. As a matter of pure arithmetic, however, an inflation-adjusted contrast of trends in defense and entitlement spending would make the increases in entitlement spending look even more massive in relation to those in defense spending.
[59] U.S. Department of Defense, Sustainable U.S. Global Leadership: Priorities for 21st-Century Defense, January 2012, http://www.defense.gov.
[60] Ibid., 3.
[61] U.S. White House Office of Management and Budget, Historical Tables, Table 3.1: Outlays by Superfunction and Function, 1960–2017.
[62] Forthcoming
[63] Subjects addressed in my study, The Poverty of “The Poverty Rate.”
[64] Hagopian; consumption inequality Attanasio et al AEI 2010 Eberstadt intro. 
[67] See http://www.federalreserve.gov/releases/z1/current/z1.pdf

4)

The Region: Egypt kicks sand in Obama’s face


Brotherhood’s leading liberal ally defects; West still doesn’t get it

Morsy and Ahmadinejad (Photo: Reuters)
I could write a 300-page book on how the Obama administration’s Middle East policy has damaged Israel. I could write an 800-page book about how the Obama administration’s Middle East policy has damaged US interests. But why bother?

This is all you need to know: The US government asked its good buddy Egyptian President Mohamed Morsy to inspect an Iranian ship suspected of carrying arms to Syria while it passed through the Suez Canal. Remember that to do so is arguably in Egypt’s own interest since Cairo is supporting the rebels while Tehran backs the regime.

The Egyptian government – despite three decades of massive US aid, licenses to produce advanced American tanks and other equipment, strategic backing, and an invitation to Washington to meet Obama – refused. Indeed, Morsy headed for Tehran to attend a “nonaligned” conference.

Does this mean Egypt is going to ally with Iran? No, Egypt will fight Iran for influence tooth and nail. The two countries will kill each other’s surrogates. But it means Morsy feels no friendlier toward America than he does toward Iran. And Cairo will not lift a finger to help Washington against Tehran unless, perhaps, America is willing to put a Muslim Brotherhood government in place in Syria, which might well happen.

In other words, under Jimmy Carter’s watch we got Islamist Iran – and, yes, things could have turned out very differently – and under Obama’s watch – and, yes, things could have turned out very differently – we got Islamist Egypt.

Egypt, the Arab world’s most important country, has been turned from an ally of America against the Iranian threat into, at best, a neutral between Washington and Tehran that will do nothing to help America.

Egypt, the Arab world’s most important country, has been turned from an ally of America – albeit an imperfect one, of course – in maintaining and trying to extend Arab-Israeli peace into a leading advocate of expanding the conflict and even potentially of going to war.

Egypt, the Arab world’s most important country, has been turned from an ally of America in fighting international terrorism into an ally of most international terrorist groups (except those that occasionally target Egypt itself).

But here’s one for the 600 rabbis who front for Obama: The destruction of the Egyptian natural gas pipeline and deal, as a result of the instability and revolution that the US government helped promote, has done as much economic damage to Israel as all the Arab and Islamic sabotage, boycotts, and Western sanctions and disinvestment in Israel’s history combined.

Egypt alone is a catastrophe, even without mentioning another dozen examples. How much longer is the obvious fact that Egypt’s Muslim Brotherhood regime is anti-democratic, anti-American and anti-Semitic going to be denied?

But wait, there’s more. Lots more.

After meeting Egypt’s new president, Secretary of Defense Leon Panetta said, “I was convinced that President Morsy is his own man,” adding that the new president is committed to democratic reforms and to representing all Egyptians.

How does Panetta know this? Simple: this is what Morsy told him.

Of course, by endorsing Morsy before he actually does anything, the US government puts its seal of approval on the Muslim Brotherhood regime. Shouldn’t the regime have to prove itself before Obama gives up all that leverage? What’s next, the Nobel Peace Prize? After all, Morsy’s been in office for a few months.

Note the phrase “his own man.” What does that mean? Why, that Morsy won’t follow the Brotherhood’s orders. He will even stand up to it – presumably to be more moderate – right? Except there is no reason to believe that this is true.

Panetta added: “They agreed that they would cooperate in every way possible to ensure that extremists like al-Qaida are dealt with.” Of course, they are more likely to cooperate against al-Qaida – a group they don’t like. But will they cooperate against Egyptian Salafist terrorists, Hamas and lots of other terrorists? Of course not.

Indeed, at the precise moment Panetta was meeting Morsy, the new president was releasing Islamist terrorists from Egyptian prisons. These include terrorists from Islamic Jihad, which is part of the al-Qaida coalition! How do you square that one, Secretary Panetta?

And finally, Morsy pointed out to Panetta that his own son was born in California, when the future Egyptian president was studying there. His son, Morsy pointed out, could be the president of the United States one day.

I’ll leave it to you, dear readers, to ponder that statement.

Of course, the Obama administration can claim one success in Egypt: the regime pulled its forces out of eastern Sinai in accordance with the Egypt-Israel peace treaty. The problem is that it has been reported in the Egyptian media – a good source, though not confirmed – that the regime made a deal with the al-Qaida terrorists who attacked Israel: if they promised to stop fighting (for how long?), the Egyptian government would release all of their gunmen.

Meanwhile, the most important (formerly) pro-Islamist moderate intellectual in the Arabic-speaking world has defected, an event of monumental importance that is being ignored in the West. The Egyptian sociologist Sa’ad Eddin Ibrahim hated the Mubarak regime so much that he joined with the Islamists as allies and insisted that they were really moderate.

Now here are some tidbits from an interview he just gave (the full interview can be watched on MEMRI TV):

Interviewer: “You indicated that the Muslim Brotherhood are hijacking the country, not merely the top political posts. Is the Muslim Brotherhood indeed about to hijack the country?”

Ibrahim: “Well, this is how it seems to me, as well as to other observers, some of whom are more knowledgeable than me about the Brotherhood,” a reference to long-time members who he said have helped him understand the Brotherhood’s “desire to hijack everything and to control everything.”

Ibrahim was the most articulate advocate of a liberal-Islamist alliance. Now he’s scared – and that should warn all of us to change policies.

Fast.

The writer is director of the Global Research in International Affairs (GLORIA) Center, Interdisciplinary Center Herzliya, and editor of the Middle East Review of International Affairs (MERIA) journal. His latest books are The Israel-Arab Reader (seventh edition), The Long War for Freedom: The Arab Struggle for Democracy in the Middle East (Wiley), and The Truth About Syria (Palgrave-Macmillan). GLORIA Center is at www.gloria-center.org.
--------------------------------------------------------------------------------




