
Archive for May 30, 2014

Has Rising Inequality Brought Us Back to the 1920s? It Depends on How We Measure Income

Brookings  May 20, 2014

It is now commonplace to say American inequality has reached a peak not seen since the roaring ‘20s. Though often repeated, the claim is flatly untrue under the most comprehensive—and meaningful—definition of family income.

In a widely circulated column, Adair Turner, former chairman of the UK’s Financial Services Authority, told readers “The top 1 percent of Americans … have seen their incomes almost triple [since 1979], with their share of national income reaching 20 percent, a figure not seen since the 1920s.” Eduardo Porter recently informed New York Times readers that “The share of national income captured by the richest 1 percent of Americans is even higher than it was at the dawn of the 20th century.” Both writers also noted that U.S. households at the bottom or in the middle of the income distribution have seen almost no gain in income since the 1970s.

It is not hard to find income series that support Turner’s and Porter’s summary of the historical trends. Thomas Piketty and Emmanuel Saez have published statistics based on IRS records and the national income accounts suggesting that the percentage of cash market income received by top income recipients is near the peak seen in the late 1920s. (“Cash market income” consists of taxable wages, self-employment income, interest, dividends, and other cash income other than government benefits.) The Census Bureau publishes income distribution statistics based on households’ pre-tax cash incomes, that is, their cash market incomes plus the cash government benefits they receive.

The Piketty-Saez estimates confirm the claim that Americans in the top 1% receive a historically large share of pre-tax cash market incomes. The Census money income statistics show that, between 1979 and 2012, households in the middle of the income distribution saw small income gains while households in the bottom one-fifth experienced small losses in average incomes.  Both income series have a venerable place in the nation’s economic statistics. The income tax statistics give us one of our oldest statistical series on the distribution and trend in U.S. incomes. The Census Bureau’s money income statistics, which date back to the mid-1940s, add to the information provided by IRS statistics by expanding the types of income that are included and providing information on a more broadly representative sample of Americans.  (A sizeable but varying percentage of families do not file income tax returns, while nearly all households can be interviewed in the Census Bureau’s annual income survey.)

A notable problem with both the IRS income series and the Census Bureau’s money income statistics is the omission of personal tax payments and noncash income items from the income measure. For example, neither income series includes food stamps, housing assistance to low-income families, or the free health benefits provided by companies to their employees and by the government to people enrolled in Medicare or Medicaid. The IRS income tabulation published by Piketty and Saez excludes all government transfer benefits, including cash as well as in-kind transfers.

An unfortunate side effect of these omissions is that an increasing percentage of the gross incomes received by Americans is excluded from the most commonly cited income measures.  At the same time, the two income series miss the effect of shifting tax policy on family tax burdens. The Congressional Budget Office has tried to remedy these deficiencies by including most of the missing income items in a more comprehensive income definition. In addition, it has adjusted the income statistics to reflect size differences among households. It seems plausible to think a single-person household can live more comfortably on $40,000 a year than a four-person family.  Household size has shrunk over time, so even if median household income has remained unchanged, the income available to support each household member has gone up.  Finally, the CBO has made adjustments in households’ income to reflect the federal taxes they are expected to pay through social insurance contributions, personal and corporate income taxes, and excise taxes. (Unfortunately, the adjustments do not include the effects of state and local taxes.)

The CBO income measure is far from perfect, but it comes closer than the older income series to reflecting the spendable incomes of American families. If we define income to mean the annual resource flow available to a family to pay for its consumption, including health care, then the CBO income measure does a far better job than either the cash market income reported on families’ income tax returns or the Census money income measure. Instead of showing that the incomes of low- and middle-income families barely budged after the late 1970s, the CBO tabulations suggest that Americans in the bottom one-fifth of the distribution saw their real net incomes climb by almost 50%.  Those in the middle fifth of the distribution saw their incomes grow 36%.  Their after-tax income gains are nowhere near as large as those enjoyed by the top 1%, who saw their after-tax incomes triple, but they reflect a sizeable improvement in household net incomes. The estimated gains are also more consistent with our aggregate statistics on the overall trend in disposable income.

It is harder to evaluate longer term changes in American inequality under a comprehensive net income definition. So far as I know, no government agency or scholar has offered such estimates. It should be plain, however, that under a comprehensive income definition inequality is far lower today than it was in the late 1920s.  In 1929 government transfer payments to households represented less than 1% of U.S. personal income. Fifty years later, in 1979, government transfers were 11% of personal income. By 2012 they were 17% of personal income.

We have only one inequality estimate showing distributional trends back to the late 1920s, and that is the one calculated by Piketty and Saez. Their measure of income excludes government transfers. Everything we know about the distribution of government benefits suggests they narrow income disparities. They are a much more important component of income for low- and middle-income families than for the well-to-do, who derive nearly all their pre-tax incomes from the market. The CBO estimates show, for example, that when we rank households by their market incomes, households in the bottom one-fifth of the distribution receive almost three times as much in government transfers as in market income. Households in the middle fifth of the market income distribution receive about $1 in government transfers for every $5 in market income. The households in the top 1% of market income recipients receive about $150 in market income for every $1 they receive in government transfers. These figures are based on CBO estimates of 2010 incomes. We do not have comparable estimates for the late 1920s, but we know that government transfer benefits constituted less than 1% of personal income at that time. It follows that the final distribution of income—including tax payments and transfer benefits—was much closer to the distribution of market income. In the 1920s government transfers were far too small to have a noticeable impact on the distribution of final incomes.

Though most experts on income measurement are aware of the shortcomings of the standard income measures, it is surprising how little of this knowledge has seeped into popular discussion of inequality. Based on our best statistics it is almost certain that market income inequality, after shrinking in the four decades through 1970, began to grow again and has now reached a peak last seen in 1928. However, progressive income taxes and government redistribution are far more important in determining Americans’ incomes today than they were in the 1920s. To disregard the impact of transfers and progressive taxation on the distribution of income and family well-being is to ignore America’s most expensive efforts to lessen the gap between the nation’s rich, middle class, and poor.

Editor’s Note: This piece was originally published in RealClearMarkets.

Gary Burtless

Senior Fellow, Economic Studies

The John C. and Nancy D. Whitehead Chair

Gary Burtless researches labor market policy, income distribution, population aging, social insurance, household saving, and the behavioral effects of taxes and government transfers. He was an economist with the U.S. Department of Labor.

 


A Mysterious Map of Louisiana

Susan Schulten

The New York Times   May 25, 2014

 

These days the intersection of cartography and Big Data is all the rage: Using information from the 2010 census, countless news outfits, including The New York Times, have created tools allowing readers to make customized maps of everything from trends in ethnic and racial composition to the dynamics of housing development. Indeed, we have come to expect that any large body of data will be visualized through maps and infographics. Such tools help to transform information into knowledge, and at their best allow us to see patterns that might otherwise be lost.

But while the technology may be new, the idea of mapping data in the United States can actually be traced to the Civil War. Earlier posts in Disunion have discussed the maps of slavery generated by the United States Coast Survey. At the same time, the Census Office (also part of the Treasury Department) was experimenting with maps of not just one but multiple types of data. These were designed to aid the Union war effort, but perhaps more importantly to plan for Reconstruction.

The National Archives


One of the most fascinating — and mysterious — of these experiments is an unsigned, undated map of Louisiana, buried within the voluminous war records of the National Archives. The map contains almost no environmental information save for the river systems and a few railroads. Even roads are omitted, truly unusual for any 19th-century map.

Instead, the emphasis is on parish boundaries, within which are listed free and slave populations alongside data about resources, from swine to ginned cotton. While this population data would have been available as early as 1862, the agricultural data was only published in 1864. With this information, officers and administrators moving through the state could locate the richest parishes, the largest sources of labor and the easiest means of river and rail transportation. (Oddly, the map does not list the output from over 1,500 sugar plantations located along the lower Mississippi River.)

A closer look at southeastern Louisiana    (The National Archives)

The Census Office was experimenting with this type of map throughout the war. At the request of Gen. William Tecumseh Sherman, for instance, the superintendent of the census annotated a previously printed map of Georgia with information on livestock and crop yields as Sherman embarked on his ambitious march deep into enemy territory in the fall of 1864. But Louisiana presented an entirely different — though equally unprecedented — challenge to the Union Army: how to control and administer a conquered region where nearly half the population was no longer strictly enslaved, but which was largely exempt from the Emancipation Proclamation.

The quandary began in April 1862, when Adm. David Farragut captured New Orleans. Soon after, President Lincoln appointed Gen. Benjamin Butler as commander of the Gulf. Lincoln hoped to cultivate Unionist sentiment in New Orleans, and thereby lure Louisiana out of the Confederacy. But Butler’s rigid policies and questionable confiscation of cotton alienated many in New Orleans and the parishes beyond, even though his military quarantine effectively ended the murderous yellow fever epidemic that had ravaged the city for decades.

Butler’s tenure was brief, and by the end of 1862 Lincoln had replaced him with the former governor of Massachusetts, Maj. Gen. Nathaniel P. Banks. As commander of the Gulf, Banks’s military charge was to expand the realm of Union control into Texas and up the Mississippi. But equally complex was the political task of governing an area under Union occupation. In 1860 Louisiana had a population of 600,000, slightly more than half of whom were white. Yet in some of the parishes with large plantations, blacks far outnumbered whites, especially after the war took men of military age to the Confederate Army. The Confiscation Act of March 1862 prohibited Union soldiers from returning slaves to their masters, so the very presence of the Army disrupted slavery. But without any clear mandate for emancipation, many of the conditions of slavery remained. Louisiana was in limbo.

Thus Banks faced the problem of rebuilding an immensely fertile region with a profoundly unstable (and still unfree) labor system. That’s where the map came in, for it allowed Banks to see the general economic capacity of the state. While such data would have been available to anyone with access to the published records of the 1860 census (published in 1862), to see such information organized geographically enabled Banks to think strategically about managing the population, its chief crop and its food supply.

Banks’s system of labor contracts drew intense criticism from all sides, including freedmen, former plantation owners and especially antislavery Republicans in the Union. Historians have also judged it harshly for its repressive techniques, which reflected a desire to control the black population and keep plantations functioning. At the height of its operation in 1864, Banks’s system of labor contracts involved 50,000 laborers on 1,500 estates. And in part because of his labor policies, the state’s agricultural production grew significantly in 1863. In this situation, Banks probably used the map to measure the strength and resources of individual parishes. The map probably also aided Banks as he began to conscript blacks (sometimes forcibly) into the Army. By the end of 1864 he had organized more than 28 regiments, which meant that Louisiana contributed more black soldiers to the Union Army than any other state.

In both the management of labor and soldiers, the map enabled Banks to govern and control by seeing the aggregate strength and composition of the population and its resources. In this respect the map anticipated the extensive federal mapping efforts of the census in postwar decades; today we live with such tools as a matter of course.

In the summer of 1864, Louisiana adopted a new state constitution that abolished slavery. In some respects the map was thus immediately outdated, and in fact it may be one of the last maps that used the term “slave.” Yet while such a category was crumbling throughout 1864, the conditions of true freedom lay far in the future, and in fact Banks’s strict efforts to regulate the movement of African-Americans laid the groundwork for the punitive black codes of the early Reconstruction period. After all, his primary goal was to control the population, and in this respect the map was no mystery at all, but the result of the logic of war.

Map courtesy of the National Archives and Records Administration in College Park, Maryland.

Follow Disunion at twitter.com/NYTcivilwar or join us on Facebook.


Susan Schulten

Susan Schulten is a history professor at the University of Denver and the author of “The Geographical Imagination in America, 1880-1950” and “Mapping the Nation: History and Cartography in Nineteenth-Century America.”



From Germany to Mexico: How America’s source of immigrants has changed over a century

Where US immigrants come from, state by state today and a century ago

With more than 40 million immigrants, the United States is the top destination in the world for those moving from one country to another. Mexico, which shares a nearly 2,000-mile border with the U.S., is the source of the largest wave of immigration in history from a single country to the United States.

But today’s volume of immigrants, in some ways, is a return to America’s past. A century ago, the U.S. experienced another large wave of immigrants. Although that wave was smaller, at 18.2 million, it hailed largely from Europe. Many Americans can trace their roots to that wave of migrants from 1890-1919, when Germany dominated as the country sending the most immigrants to many of the U.S. states, although the United Kingdom, Canada and Italy were also strongly represented.

In 1910, Germany was the top country of birth among U.S. immigrants, accounting for 18% of all immigrants (or 2.5 million) in the United States. Germans made up the biggest immigrant group in 17 states and the District of Columbia, while Mexico accounted for the most immigrants in just three states (Arizona, New Mexico and Texas). Behind Germany, the second-largest number of immigrants came from Russia and the countries that would become the USSR (11%, or 1.6 million).

US Immigrants from Germany, Mexico

Since 1965, when Congress passed legislation to open the nation’s borders, immigrants have largely hailed from Latin America and Asia. In states that have attracted many immigrants, the current share of immigrants is below peaks reached more than a century ago. Today there are four states (California, New York, New Jersey and Florida) in which about one-in-five or more people are foreign born. California peaked in 1860 at 39.7%, when China was the top country of birth among immigrants there. Meanwhile, New York and New Jersey peaked in 1910 at 30.1% (Russia and the USSR) and 26.2% (Italy), respectively.

Today, five times as many immigrants in the U.S. are from Mexico as from China, the country with the second-highest number of immigrants (5% of all immigrants in the U.S., or 2.2 million). Mexico is the birthplace of 29% (or 11.7 million) of all immigrants in the United States. Immigrants born in Mexico account for more than half of all of the foreign born in four states: New Mexico (72.4%), Arizona (60.2%), Texas (59.7%) and Idaho (53.5%).

Despite Mexico’s large numbers, immigrants come to the U.S. from all over the world. India is the top country of birth among immigrants in New Jersey, West Virginia  and Pennsylvania, even though only about one-in-ten immigrants in each state are from India. Canada is the top country of birth for immigrants in Maine (27%), New Hampshire (14%), Vermont (23%), North Dakota (19%) and Montana (25%). Filipinos account for a large share of immigrants in Hawaii (45%) and Alaska (30%).

Percentage of U.S. population that is foreign born

Note: Countries are defined by their modern-day boundaries, which may be different from their historical boundaries. For example, China includes Hong Kong, Macau and Taiwan. Russia and the former USSR countries are combined in this analysis, even though the Soviet Union was only in existence between 1922 and 1991. Birthplace is self-reported by respondents. 

 


Thomas Paine, Our Contemporary

Chris Hedges

Truthdig   May 25, 2014


U.S. Covert Intervention in Chile: Planning to Block Allende Began Long before September 1970 Election

National Security Archive Electronic Briefing Book No. 470

May 23, 2014

For more information contact:
Peter Kornbluh 202/374-7281 or peter.kornbluh@gmail.com

 

Washington, DC, May 23, 2014 – Covert U.S. planning to block the democratic election of Salvador Allende in Chile began weeks before his September 4, 1970, victory, according to just-declassified minutes of an August 19, 1970, meeting of the high-level interagency committee known as the Special Review Group, chaired by National Security Advisor Henry Kissinger. “Kissinger asked that the plan be as precise as possible and include what orders would be given September 5, to whom, and in what way,” as the summary recorded Kissinger’s instructions to CIA Director Richard Helms. “Kissinger said we should present to the President an action plan to prevent [the Chilean Congress from ratifying] an Allende victory…and noted that the President may decide to move even if we do not recommend it.”

The document is one of a compendium of some 366 records released by the State Department as part of its Foreign Relations of the United States (FRUS) series. The much-delayed collection, titled “Chile: 1969-1973,” addresses Richard Nixon’s and Kissinger’s efforts to destabilize the democratically elected Socialist government of Salvador Allende, and the U.S.-supported coup that brought General Augusto Pinochet to power in 1973. The controversial volume was edited by two former officials of the State Department’s Office of the Historian, James Siekmeier and James McElveen.

“This collection represents a substantive step forward in opening the historical record on U.S. intervention in Chile,” said Peter Kornbluh, who directs the Chile documentation project at the National Security Archive, and is the author of The Pinochet File: A Declassified Dossier on Atrocity and Accountability. Kornbluh called on the State Department to continue to pursue the declassification of all relevant records on the U.S. role in Chile, including all records of CIA contacts with the Chilean military leading up to the September 11, 1973, coup; CIA funding for the truckers’ strike as part of the “destabilization” campaign; and CIA intelligence on the executions of two U.S. citizens in the wake of the military takeover, Charles Horman and Frank Teruggi.

The FRUS series is scheduled to release an electronic supplement of additional records in the fall, and to publish another volume, Chile, 1973-1976, next year. “The next volume could advance the historical record on CIA support for the Chilean secret police, DINA, CIA knowledge of Operation Condor, and Pinochet’s act of international terrorism in Washington D.C. that killed Orlando Letelier and Ronni Karpen Moffitt,” Kornbluh suggested.

In the aftermath of General Augusto Pinochet’s arrest in October 1998, the National Security Archive, along with victims of the Pinochet regime, led a campaign to press the Clinton administration to declassify the still-secret documents on Chile, the coup and the repression that followed. Some 23,000 NSC, State Department, Defense Department and CIA records were released. Some of those have been included in the new FRUS collection, which contains a set of meeting memoranda of the “40 Committee,” an interagency group chaired by Henry Kissinger that oversaw covert operations in Chile, as well as dozens of formerly secret cables, including CIA communications.

The release of the records comes amidst renewed debate over the CIA role in supporting the military coup in Chile. The forthcoming issue of Foreign Affairs contains an article by former CIA operative Jack Devine, “What Really Happened in Chile: the CIA, the Coup Against Allende, and the Rise of Pinochet,” which reveals that intelligence he obtained on September 9, 1973, alerted President Nixon in advance to the timing of the coup. “I sent CIA headquarters in Langley a special type of top-secret cable known as a CRITIC, which takes priority over all other cables and goes directly to the highest levels of government. President Richard Nixon and other top U.S. policymakers received it immediately. ‘A coup attempt will be initiated on 11 September,’ the cable read.”

Nevertheless, Devine asserts that the CIA “did not plot with the Chilean military to overthrow Allende in 1973.”

However, according to a transcript of the first phone conversation between Kissinger and Nixon following the coup, when the President asked if “our hand” showed in the coup, Kissinger explained that “we didn’t do it,” in terms of direct participation in the military actions: “I mean we helped them,” Kissinger continued. “[deleted word] created the conditions as great as possible.”

The Kissinger-Nixon transcript is reproduced in the 2013 edition of The Pinochet File.

Read the FRUS volume here

 



The Civil War and P.T.S.D.

Dillon Carroll

The New York Times   May 23, 2014

Edson Bemis was a hard man to kill. Rebel soldiers tried three times, and three times they failed. At the Battle of Antietam, a musket ball ripped through his left arm. Two years later, in the horrible fighting in the Wilderness, he was shot in the abdomen, just above the groin. The ball was never extracted, remaining in his body until the day he died.

The Confederates came the closest to killing Bemis in February 1865. At Hatcher’s Run, Va., a Minié ball struck him in the head. He lay near death for several days, his skull cracked and leaking brain matter. Most gave him up for dead. Dr. Albert VanDevour, however, did not, and instead performed a risky surgery to remove the bullet from his skull. Bemis improved immediately, eventually recovering, much to the shock of everyone.

The war was finally over for Bemis. He moved to Suffield, Conn., with his wife, Jane, where they hoped to start a new life. He began working for W.W. Cooper’s, a local merchant house, but very quickly it became clear to everyone that Bemis was not right. One of his colleagues at W.W. Cooper’s, George N. Kendall, described his health as “never very good,” and Bemis began to suffer from “spells of vertigo” or “something that afflicted his head” so much so that he frequently could not work.

Kendall noticed that Edson was also “very forgetful.” He had wild mood swings, and Kendall wrote “any little thing irritates him.” He was increasingly subject to memory loss. Sometimes, for several hours each day, he had no memory of where he had been or what he had done. Eventually he had to stop working at W.W. Cooper’s because of his condition.

In 1890, Bemis suffered what appeared to be a stroke, and his condition, which was already bad, got exponentially worse. A pension official came to Suffield to interview the Bemis family and friends, and immediately noticed that although Bemis was only 55 in 1895, he walked “like a man of 80!” His wife had to assist him in dressing, she had to “cut his meat and wash his potatoes” and she described him as being “like a child.” The pension official wrote that Bemis’s only job each day was to go to the post office “right below here for the mail and to a few houses above for a pail of milk every day this is all he can do.”

In 1900, Jane had apparently had enough, and Bemis was examined and institutionalized in Westboro Insane Hospital in Westboro, Mass. By this time, his condition had spiraled even further. A doctor at Westboro, Lewis Bryant, wrote that Bemis believed he was “thirty years old” but he could not recall the present “year month or day of the week.” Bemis believed that “the civil war is still going on” and, occasionally, would “see dogs in the room.” Bryant described him as “silly, emotional, crying and laughing without apparent cause” and having “little memory confusing the present with the past…soils his clothing has had delusions and false sights, and at times requires the care and attentions usually given a child.”

Celestia Bemis, his sister, who coincidentally married a man with the last name Bemis, came to Westboro and took charge of Edson, taking him to her farm in North Brookfield. Celestia and Jane did not get along, and their feud spilled over into the notes of the pension official who occasionally checked up on Bemis. Jane claimed that Celestia ordered her to stay away from him, because her presence excited him too much, while Celestia claimed that Jane had never once tried to visit Bemis, and was content to keep cashing his pension checks without ever seeing him. Jane last saw her husband in August of 1900; he died two months later. She continued collecting a pension until her death in 1917.

Bemis’s story was not an uncommon one among Civil War veterans. Historians are beginning to uncover what was a virtual epidemic of emotional, psychological and neurological trauma that afflicted soldiers after the war. Veterans labored under emotional and psychological stress in ways that are disturbingly similar to the present. Alcoholism was rampant, as was unemployment. Suicide was endemic. Civil War veterans dotted the wards of insane asylums across the country.

Modern science would most likely have given Bemis a diagnosis of traumatic brain injury, caused by a blow to the head or a penetrating injury of the skull. Such injuries are all too common among veterans of Iraq and Afghanistan today. Symptoms of T.B.I. range from headaches, confusion, lightheadedness and dizziness to fatigue, mood changes, depression, changes in sleep patterns, restlessness and agitation. That seems to be consistent with Bemis’s litany of postwar complaints.


If so many Civil War veterans were troubled with emotional and psychological trauma, why has it taken us so long to discover them? Veterans were loath to admit they were traumatized. In the 19th century, mental illness carried a tremendous stigma, and most veterans fought a private battle rather than disclose their trauma.

Additionally, most families preferred to care for mentally ill loved ones at home. Bemis’s care as his mental health declined became a community project. Jane certainly performed the lion’s share of the work. She dressed him, fed him, and sometimes had to help him in the bathroom. But she could not watch him all the time. A.P. Sherwin, a local doctor, later testified that everyone “in town knows soldier to be mentally afflicted” and all the people in Suffield near the Bemis household “watch him closely.” Jane Bemis testified that she did not watch him “on the street” because “everybody knows him” and that he only “goes a short way from home.”

Finally, the relationship between warfare and psychological trauma has only recently become better understood. War trauma has distressed veterans in nearly every war, but the whispers of shell shock and combat fatigue never really entered the public consciousness. It was not until after Vietnam that veterans’ groups successfully lobbied the American Psychiatric Association to include post-traumatic stress disorder in the Diagnostic and Statistical Manual of Mental Disorders. Since then, our understanding of and empathy for veterans afflicted with psychological trauma have grown rapidly. Bemis’s life demonstrates that combat was damaging the human brain and the human psyche long before we were willing and able to give the maladies a name.



Sources: Soldier’s Certificate No. 59,267, Cpl. Edson D. Bemis, Company K, 20th Massachusetts Volunteer Infantry, National Archives; Case Files of Approved Pension Applications of Veterans Who Served in the Army and Navy Mainly in the Civil War and the War with Spain, 1861-1934, National Archives; Steven T. DeKosky, “Traumatic Brain Injury: Football, Warfare, and Long-Term Effects,” in the New England Journal of Medicine 363, No. 14 (Sept. 30, 2010); Rebecca J. Anderson, “Shell Shock: An Old Injury with New Weapons,” Molecular Interventions 8, No. 5 (Oct. 2008); Emily Singer, “Brain Trauma in Iraq,” Technology Review 111, No. 3 (May–June 2008); Jeanne Marie Laskas, “Game Brain,” GQ, Oct. 2009; Ben McGrath, “Does Football Have a Future?” New Yorker, Jan. 31, 2011.

Dillon Carroll

Dillon Carroll is a graduate student in history at the University of Georgia.

 


The Ludlow Massacre Still Matters 

Ben Mauk

The New Yorker  May 19, 2014

 

 


On April 20, 1914, members of the Colorado National Guard opened fire on a group of armed coal miners and set fire to a makeshift settlement in Ludlow, Colorado, where more than a thousand striking workers and their families were camped out. Today, the Ludlow massacre, which Caleb Crain wrote about in The New Yorker in 2009, remains one of the bloodiest episodes in the history of American industrial enterprise; at least sixty-six men, women, and children were killed in the attack and the days of rioting that followed, according to most historical accounts. Although it is less well-remembered today than other dark episodes in American labor history, such as the Triangle Shirtwaist Factory fire that claimed a hundred and forty-six lives, the Ludlow massacre—which Wallace Stegner once called “one of the bleakest and blackest episodes of American labor history”—changed the nation’s attitude toward labor and capital for the next several decades. Its memory continues to reverberate in contemporary political discourse.

In the summer of 1913, the United Mine Workers began to organize the eleven thousand coal miners employed by the Rockefeller-owned Colorado Fuel & Iron Company. Most of the workers were first-generation immigrants from Italy, Greece, and Serbia; many had been hired, a decade prior, to replace workers who had gone on strike. In August, the union extended invitations to company representatives to meet about their grievances—including low pay, long and unregulated hours, and management practices they felt were corrupt—but they were rebuffed. A month later, eight thousand Colorado mine workers went on strike. Among their demands were a ten-per-cent pay raise, the enforcement of an eight-hour working day, and the right to live and trade outside the company-owned town. Many of the rights they sought were already required by Colorado law but remained unenforced.

After being evicted from their company-owned homes, the workers based their operations in makeshift tent cities surrounding the mines, the largest of which was the Ludlow camp. The Rockefellers responded by hiring a detective agency—made up of “Texas desperadoes and thugs,” according to “Legacy of the Ludlow Massacre,” a sharply researched 1988 book by Howard M. Gitelman—whose agents would periodically raid the camps, firing rifles and shotguns. In November, the state governor called in the Colorado National Guard at the company’s behest; the Guard’s wages were supplied by the Rockefeller family, and Guardsmen helped to form militias whose members carried out sporadic raids and shootings in the tent cities.

The strike stretched on for months, and in April, 1914, John D. Rockefeller, Jr., appeared before Congress, where he framed the standoff as “a national issue, whether workers shall be allowed to work under such conditions as they may choose.” He balked at the possibility of allowing “outside people”—meaning union organizers—“to come in and interfere with employees who are thoroughly satisfied with their labor conditions.” The committee chairman asked Rockefeller whether he would stand by his anti-union principles even “if it costs all your property and kills all your employees.” Rockefeller replied, “It is a great principle.”

On April 20th, a day after Orthodox Easter, four militiamen brandished a machine gun at some of the striking miners. At some point, shots were fired—the accounts are predictably inconsistent as to who fired first—and a day-long gunfight ensued.

That evening, the National Guardsmen set fire to the Ludlow colony. Thirteen residents who tried to flee were shot and killed as the camp burned to the ground, and many more burned to death. Discovered among the ruins the following morning was a women’s infirmary, where four women and eleven children had sought to escape the fighting by hiding in a cellar-like pit. All the children and two of the women died. One survivor, Mary Petrucci, lost three of her own children in the infirmary. Years later, she recalled, “I came out of the hole. There was light and lots of smoke. I wandered among the ashes until a priest found me. I couldn’t feel anything. I was cold.”

News of the attack—and especially of the deaths under the infirmary tent—pulled the nation’s attention from the United States’ potential involvement in the Mexican Revolution. To many Americans, the massacre exposed the consequences of unchecked corporate might, and it roused the conscience of a country that had previously demonstrated impassive ambivalence toward organized labor. (Decades later, a song by Woody Guthrie captured the common sentiment of the event’s immediate aftermath: “We took some cement and walled the cave up where you killed these thirteen children inside / I said ‘God bless the Mine Workers Union,’ then I hung my head and cried.”)

Two days later, Congress convened to discuss the events at Ludlow, and to consider how the government might check martial power wielded by private industrialists. One senator, Iowa’s “radical Republican,” William Kenyon, decried the government’s ties to the violence, noting that “the Colorado Fuel & Iron Company, or the company controlling it, has certain of its bonds on deposit with the General Education Board of the Rockefeller Foundation, with which the Department of Agriculture of our Government seems to have been in partnership for some little time.” Another senator expressed a broader concern: “I fear that unless society can in some manner reconcile these troubled conditions as between capital and labor, Mexico is not the only country that will be torn by internecine strife.”

Rockefeller, for his part, released a memorandum in June, months after federal troops had been ordered to Colorado to quell the days of violent rioting that had followed the events of April 20th. “There was no Ludlow massacre,” he wrote. “The engagement started as a desperate fight for life by two small squads of militia … against the entire tent colony, which attacked them with over three hundred armed men.” He also offered a lengthy technical explanation of why the deaths in the infirmary were the result of inadequate ventilation and overcrowding, not of actions taken by “the defenders of law and property, who were in no slightest way responsible for it.”

Despite Rockefeller’s arguments, after Ludlow the Wild West era of company towns began to wane, and stricter labor laws began to appear on the books—and were even enforced. Support for unions reached an all-time high in the nineteen-thirties, as described by James Surowiecki in a 2011 article for the magazine. Yet, as Surowiecki also noted, the influence of trade unions, which supplanted company unions following the 1935 Wagner Act, has been declining for decades, as part of a general rightward shift in American politics that began in the sixties. Since the 2008 recession, there has been growing resentment toward union members among non-unionized workers; in 2010, support for unions reached a historic low, according to a Pew poll.

Yet the struggle that Ludlow embodied—and that, historically, unions have taken up—is a contemporary one, even if unions are no longer playing as public a role. Today, some of the fiercest workers’-rights battles take place over government regulations that protect low-income workers’ access to Medicaid and other social services, and that buoy the federal minimum wage, which is currently far below its 1968 peak value. In her recently published autobiography, Senator Elizabeth Warren wrote that “Big corporations hire armies of lobbyists to get billion-dollar loopholes into the tax system and persuade their friends in Congress to support laws that keep the playing field tilted in their favor.” In this, she sounds almost exactly like the Republican senators who, in the days after Ludlow, worried about Colorado Iron & Fuel’s deep government influence.

What was at stake at Ludlow remains pertinent even within the modern coal industry. Last week, the Center for Public Integrity won a Pulitzer Prize for its investigative report on efforts to deny benefits to coal miners with black-lung disease. The series describes how industry-compensated lawyers have frequently withheld evidence from judges in order to defeat the medical claims of miners suffering from the resurgent ailment, which today affects about six per cent of miners in central Appalachia, according to government statistics reported in the series.

A different kind of violence is visited upon today’s miners. There are no overt, bloody showdowns between striking workers and armed National Guardsmen whose paychecks come from corporate barons. But industry money—in the form of fees paid by mine companies for consultant work—still appears to influence the diagnoses of doctors and radiologists, according to copious research compiled by the Center. And the coal industry’s go-to law firm withheld dissenting medical evidence that supported miners’ claims in eleven of the fifteen cases featured in the report. As a result, ailing and dying miners are denied the support they are owed.

There are eighty-five thousand coal miners left in the United States, and, while many are union members, the influence of the United Mine Workers was already waning by the early eighties, according to the Center for Public Integrity report. Today the union represents about twenty thousand active miners, according to the Wall Street Journal. Instead of union pressure, it was more likely the Center’s investigation that prompted the Department of Labor to announce, in February, a series of reforms that will make it easier for miners with black-lung disease to collect their medical benefits. A hundred years ago, it took a great and deadly injustice to spur lasting government reform. Here’s hoping we learn from it.

Photograph: Fotosearch/Getty

Ben Mauk is a writer from Baltimore, Maryland. He is a regular online contributor to The New Yorker, and his essays and stories appear in The Sun magazine, The American Reader, The Believer, The Los Angeles Review of Books, and elsewhere.

From 2007 to 2009, he was an editor at Dana Press in Washington, D.C., and New York City. He is a graduate of Cornell University and the Iowa Writers’ Workshop, and the recipient of a 2014-15 U.S. Fulbright Award for Young Journalists.

He lives in Iowa City, where he teaches at the University of Iowa. 

