
Archive for November 28, 2014

Remember the Sand Creek Massacre

Illustration credit: Christine Marie Larsen

NEW HAVEN — Many people think of the Civil War and America’s Indian wars as distinct subjects, one following the other. But those who study the Sand Creek Massacre know differently.

On Nov. 29, 1864, as Union armies fought through Virginia and Georgia, Col. John Chivington led some 700 cavalry troops in an unprovoked attack on peaceful Cheyenne and Arapaho villagers at Sand Creek in Colorado. They murdered nearly 200 women, children and older men.

Sand Creek was one of many assaults on American Indians during the war, from Patrick Edward Connor’s massacre of Shoshone villagers along the Idaho-Utah border at Bear River on Jan. 29, 1863, to the forced removal and incarceration of thousands of Navajo people in 1864 known as the Long Walk.

In terms of sheer horror, few events matched Sand Creek. Pregnant women were murdered and scalped, genitalia were paraded as trophies, and scores of wanton acts of violence characterize the accounts of the few Army officers who dared to report them. Among them was Capt. Silas Soule, who had been with Black Kettle and Cheyenne leaders at the September peace negotiations with Gov. John Evans of Colorado, the region’s superintendent of Indian affairs (as well as a founder of both the University of Denver and Northwestern University). Soule publicly exposed Chivington’s actions and, in retribution, was later murdered in Denver.

After news of the massacre spread, Evans and Chivington were forced to resign from their appointments. But neither faced criminal charges, and the government refused to compensate the victims or their families in any way. Indeed, Sand Creek was just one part of a campaign to take the Cheyenne’s once vast land holdings across the region. A territory that had hardly any white communities in 1850 had, by 1870, lost most of its Indian population, pushed violently off the Great Plains by white settlers and the federal government.

These and other campaigns amounted to what is today called ethnic cleansing: an attempted eradication and dispossession of an entire indigenous population. Many scholars suggest that such violence conforms to other 20th-century categories of analysis, like settler colonial genocide and crimes against humanity.

Sand Creek, Bear River and the Long Walk remain important parts of the Civil War and of American history. But in our popular narrative, the Civil War obscures such campaigns against American Indians. In fact, the war made such violence possible: The paltry federal army of 1858, before its wartime expansion, could not have attacked, let alone removed, the fortified Navajo communities in the Four Corners, while Southern secession gave a powerful impetus to expand American territory westward. Territorial leaders like Evans were given more resources and power to negotiate with, and fight against, powerful Western tribes like the Shoshone, Cheyenne, Lakota and Comanche. The violence of this time was fueled partly by the lust for power of civilian and military leaders desperate to obtain glory and wartime recognition.

The United States has yet to fully recognize the violent destruction wrought against indigenous peoples by the Civil War and the Union Army. Connor and Evans have cities, monuments and plaques in their honor, as well as two universities and even Colorado’s Mount Evans, home to the highest paved road in North America.

Saturday’s 150th anniversary will be commemorated in many ways: The National Park Service’s Sand Creek Massacre Historic Site, the descendant Cheyenne and Arapaho communities, other Native American community members and their non-Native supporters will all mark the massacre. An annual memorial run will trace the route of Chivington’s troops from Sand Creek to Denver, where an evening vigil will be held Dec. 2.

The University of Denver and Northwestern are also reckoning with this legacy, creating committees that have recognized Evans’s culpability. Like many academic institutions, both are deliberating how to expand Native American studies and student service programs. Yet the near-absence of Native American faculty members, administrators and courses reflects their continued failure to take more than partial steps.

While the government has made efforts to recognize individual atrocities, it has a long way to go toward recognizing how deeply the decades-long campaign of eradication ran, let alone recognizing how, in the face of such violence, Native American nations and their cultures have survived. Few Americans know of the violence of this time, let alone the subsequent violation of Indian treaties, of reservation boundaries and of Indian families by government actions, including the half-century of forced removal of Indian children to boarding schools.

One symbolic but necessary first step would be a National Day of Indigenous Remembrance and Survival, perhaps on Nov. 29, the anniversary of Sand Creek. Another would be commemorative memorials, not only in Denver and Evanston but in Washington, too. We commemorate “discovery” and “expansion” with Columbus Day and the Gateway Arch, but nowhere is there national recognition of the people who suffered from those “achievements” — and have survived amid continuing cycles of colonialism.

Correction: November 27, 2014
An earlier version of this article incorrectly stated that the American Indian leader Black Kettle was killed in the Sand Creek Massacre. He died at the Battle of Washita in Oklahoma in 1868. 


America’s Founding Myths

This Thanksgiving, it’s worth remembering that the narrative we hear about America’s founding is wrong. The country was built on genocide.

Massacre of American Indian women and children in Idaho.

Under the crust of that portion of Earth called the United States of America — “from California . . . to the Gulf Stream waters” — are interred the bones, villages, fields, and sacred objects of American Indians. They cry out for their stories to be heard through their descendants who carry the memories of how the country was founded and how it came to be as it is today.

It should not have happened that the great civilizations of the Western Hemisphere, the very evidence of the Western Hemisphere, were wantonly destroyed, the gradual progress of humanity interrupted and set upon a path of greed and destruction. Choices were made that forged that path toward destruction of life itself—the moment in which we now live and die as our planet shrivels, overheated. To learn and know this history is both a necessity and a responsibility to the ancestors and descendants of all parties.

US policies and actions related to indigenous peoples, though often termed “racist” or “discriminatory,” are rarely depicted as what they are: classic cases of imperialism and a particular form of colonialism—settler colonialism. As anthropologist Patrick Wolfe writes, “The question of genocide is never far from discussions of settler colonialism. Land is life — or, at least, land is necessary for life.”

The history of the United States is a history of settler colonialism — the founding of a state based on the ideology of white supremacy, the widespread practice of African slavery, and a policy of genocide and land theft. Those who seek history with an upbeat ending, a history of redemption and reconciliation, may look around and observe that such a conclusion is not visible, not even in utopian dreams of a better society.

That narrative is wrong or deficient, not in its facts, dates, or details but rather in its essence. Inherent in the myth we’ve been taught is an embrace of settler colonialism and genocide. The myth persists, not for a lack of free speech or poverty of information but rather for an absence of motivation to ask questions that challenge the core of the scripted narrative of the origin story.

Woody Guthrie’s “This Land Is Your Land” celebrates that the land belongs to everyone, reflecting the unconscious manifest destiny we live with. But the extension of the United States from sea to shining sea was the intention and design of the country’s founders.

“Free” land was the magnet that attracted European settlers. Many were slave owners who desired limitless land for lucrative cash crops. After the war for independence but before the US Constitution, the Continental Congress produced the Northwest Ordinance. This was the first law of the incipient republic, revealing the motive for those desiring independence. It was the blueprint for gobbling up the British-protected Indian Territory (“Ohio Country”) on the other side of the Appalachians and Alleghenies. Britain had made settlement there illegal with the Proclamation of 1763.

In 1801, President Jefferson aptly described the new settler-state’s intentions for horizontal and vertical continental expansion, stating, “However our present interests may restrain us within our own limits, it is impossible not to look forward to distant times, when our rapid multiplication will expand itself beyond those limits and cover the whole northern, if not the southern continent, with a people speaking the same language, governed in similar form by similar laws.”

Origin narratives form the vital core of a people’s unifying identity and of the values that guide them. In the United States, the founding and development of the Anglo-American settler-state involves a narrative about Puritan settlers who had a covenant with God to take the land. That part of the origin story is supported and reinforced by the Columbus myth and the “Doctrine of Discovery.”

The Columbus myth suggests that from US independence onward, colonial settlers saw themselves as part of a world system of colonization. “Columbia,” the poetic, Latinate name used in reference to the United States from its founding throughout the nineteenth century, was based on the name of Christopher Columbus.

The “Land of Columbus” was—and still is—represented by the image of a woman in sculptures and paintings, by institutions such as Columbia University, and by countless place names, including that of the national capital, the District of Columbia. The 1798 hymn “Hail, Columbia” was the early national anthem and is now used whenever the vice president of the United States makes a public appearance, and Columbus Day is still a federal holiday despite Columbus never having set foot on any territory ever claimed by the United States.

To say that the United States is a colonialist settler-state is not to make an accusation but rather to face historical reality. But indigenous nations, through resistance, have survived and bear witness to this history. The fundamental problem is the absence of the colonial framework.

Settler colonialism, as an institution or system, requires violence or the threat of violence to attain its goals. People do not hand over their land, resources, children, and futures without a fight, and that fight is met with violence. In employing the force necessary to accomplish its expansionist goals, a colonizing regime institutionalizes violence. The notion that settler-indigenous conflict is an inevitable product of cultural differences and misunderstandings, or that violence was committed equally by the colonized and the colonizer, blurs the nature of the historical processes. Euro-American colonialism had from its beginnings a genocidal tendency.

The term “genocide” was coined following the Shoah, or Holocaust, and its prohibition was enshrined in the United Nations convention adopted in 1948: the UN Convention on the Prevention and Punishment of the Crime of Genocide.

The convention is not retroactive but is applicable to US-indigenous relations since 1988, when the US Senate ratified it. The terms of the genocide convention are also useful tools for historical analysis of the effects of colonialism in any era. In the convention, any one of five acts is considered genocide if “committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group”:

  • killing members of the group;
  • causing serious bodily or mental harm to members of the group;
  • deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part;
  • imposing measures intended to prevent births within the group;
  • forcibly transferring children of the group to another group.

Settler colonialism is inherently genocidal in terms of the genocide convention. In the case of the British North American colonies and the United States, not only extermination and removal were practiced but also the disappearing of the prior existence of indigenous peoples—and this continues to be perpetuated in local histories.

Anishinaabe (Ojibwe) historian Jean O’Brien names this practice of writing Indians out of existence “firsting and lasting.” All over the continent, local histories, monuments, and signage narrate the story of first settlement: the founder(s), the first school, first dwelling, first everything, as if there had never been occupants who thrived in those places before Euro-Americans. On the other hand, the national narrative tells of “last” Indians or last tribes, such as “the last of the Mohicans,” “Ishi, the last Indian,” and End of the Trail, as a famous sculpture by James Earle Fraser is titled.

From the Atlantic Ocean to the Mississippi River and south to the Gulf of Mexico lay one of the most fertile agricultural belts in the world, crisscrossed with great rivers. Naturally watered, teeming with plant and animal life, temperate in climate, the region was home to multiple agricultural nations. In the twelfth century, the Mississippi Valley region was marked by one enormous city-state, Cahokia, and several large ones built of earthen, stepped pyramids, much like those in Mexico. Cahokia supported a population of tens of thousands, larger than that of London during the same period.

Other architectural monuments were sculpted in the shape of gigantic birds, lizards, bears, alligators, and even a 1,330-foot-long serpent. These feats of monumental construction testify to the levels of civic and social organization. Called “mound builders” by European settlers, the people of this civilization had dispersed before the European invasion, but their influence had spread throughout the eastern half of the North American continent through cultural influence and trade.

What European colonizers found in the southeastern region of the continent were nations of villages with economies based on agriculture, with corn as the mainstay. This was the territory of the nations of the Cherokee, Chickasaw, and Choctaw and the Muskogee Creek and Seminole, along with the Natchez Nation in the western part of the region, the Mississippi Valley.

To the north, a remarkable federal state structure, the Haudenosaunee Confederacy — often referred to as the Six Nations of the Iroquois Confederacy — was made up of the Seneca, Cayuga, Onondaga, Oneida, and Mohawk Nations and, from early in the eighteenth century, the Tuscaroras. This system incorporated six widely dispersed and unique nations of thousands of agricultural villages and hunting grounds from the Great Lakes and the St. Lawrence River to the Atlantic, and as far south as the Carolinas and inland to Pennsylvania.

The Haudenosaunee peoples avoided centralized power by means of a clan-village system of democracy based on collective stewardship of the land. Corn, the staple crop, was stored in granaries and distributed equitably in this matrilineal society by the clan mothers, the oldest women from every extended family. Many other nations flourished in the Great Lakes region where now the US-Canada border cuts through their realms. Among them, the Anishinaabe Nation (called by others Ojibwe and Chippewa) was the largest.

In the beginning, Anglo settlers organized irregular units to brutally attack and destroy unarmed indigenous women, children, and old people using unlimited violence in unrelenting attacks. During nearly two centuries of British colonization, generations of settlers, mostly farmers, gained experience as “Indian fighters” outside any organized military institution.

Anglo-French conflict may appear to have been the dominant factor of European colonization in North America during the eighteenth century, but while large regular armies fought over geopolitical goals in Europe, Anglo settlers in North America waged deadly irregular warfare against the indigenous communities.

The chief characteristic of irregular warfare is extreme violence against civilians, in this case the tendency to seek the utter annihilation of the indigenous population. “In cases where a rough balance of power existed,” observes historian John Grenier, “and the Indians even appeared dominant—as was the situation in virtually every frontier war until the first decade of the 19th century—[settler] Americans were quick to turn to extravagant violence.”

Grenier continues: “Indeed, only after seventeenth- and early-eighteenth-century Americans made the first way of war a key to being a white American could later generations of ‘Indian haters,’ men like Andrew Jackson, turn the Indian wars into race wars.” By then, the indigenous peoples’ villages, farmlands, towns, and entire nations formed the only barrier to the settlers’ total freedom to acquire land and wealth. Settler colonialists again chose their own means of conquest. Such fighters are often viewed as courageous heroes, but killing the unarmed women, children, and old people and burning homes and fields involved neither courage nor sacrifice.

US history, as well as inherited indigenous trauma, cannot be understood without dealing with the genocide that the United States committed against indigenous peoples. From the colonial period through the founding of the United States and continuing in the twenty-first century, this has entailed torture, terror, sexual abuse, massacres, systematic military occupations, removals of indigenous peoples from their ancestral territories, and removals of indigenous children to military-like boarding schools.

Once in the hands of settlers, the land itself was no longer sacred, as it had been for the indigenous. Rather, it was private property, a commodity to be acquired and sold. Later, when Anglo-Americans had occupied the continent and urbanized much of it, this quest for land and the sanctity of private property were reduced to a lot with a house on it, and “the land” came to mean the country, the flag, the military, as in “the land of the free” of the national anthem, or Woody Guthrie’s “This Land Is Your Land.”

Those who died fighting in foreign wars were said to have sacrificed their lives to protect “this land” that the old settlers had spilled blood to acquire. The blood spilled was largely indigenous.

Adapted from An Indigenous Peoples’ History of the United States, out now from Beacon Press.


This 75th Anniversary’s Been Overlooked. It Shouldn’t Be

Paula Rabinowitz  

HNN   November 22, 2014

Seventy-five years ago, paperback books returned to the United States with the brand name Pocket Books, which began publishing its mass-market paperbacks, sold at a quarter each, with ten titles, among them: Frank Buck’s Bring ‘Em Back Alive, Bambi by Felix Salten, James Hilton’s Lost Horizon and Emily Brontë’s Wuthering Heights. Returned, because nineteenth-century printers often bound books in paper, yet the practice had all but disappeared during the early part of the twentieth century. It may seem odd to commemorate the advent of cheap pulpy books instead of the far more significant anniversary: the signing of the Hitler-Stalin Pact on August 23, 1939. But the saga of cheap paperbacks’ arrival on American soil is intimately tied to the Second World War and its aftermath in a number of ways, deriving from and contributing to wartime innovation, necessity, mobility and censorship.

Modern paperbacks were the Depression-era brainchild of English editor Allen Lane, who developed Penguin Books in 1935 in order to provide high-quality literary works as cheaply as a pack of cigarettes. Publishing on such a massive scale depended on huge supplies of paper, which, once Britain declared war on Germany, was sharply curtailed in the UK. But the US still had an abundance of trees and paper mills, and whether Lane’s assistant Ian Ballantine and others stole the idea, as E.L. Doctorow remembers in Reporting the Universe, or Lane shipped the enterprise overseas with Kurt Enoch and Victor Weybright (as he recalled in his memoir The Making of a Publisher), the new medium, appearing on drugstore racks, in bus stations and at corner candy stores, became a kitschy icon that indelibly altered American tastes and habits during the mid-twentieth century. Within a few months of their initial arrival, paperbacks were everywhere. Despite the ubiquity of radio and the Hollywood banner year of 1939, when Gone with the Wind and The Wizard of Oz swept into movie theaters with lush colors, books were the mass media of wartime America. The advent of color assured a renewed love affair with the movies, even as the 1939 World’s Fair in New York marked the introduction of television, the next frontier in mass communications, which would come into its own in the 1950s. But the ability to own a book, one printed by the millions, connected Americans to new ideas in science, economics, art—not to mention new sensations about reading and the self and each other.

These new objects, emblazoned with lurid cover art and risqué tag lines, were priced to sell and, once the US entered the war, were imprinted with an admonition to send the volumes overseas to servicemen. War spurs technological breakthroughs, usually in weaponry or communication; paperbacks were part of this process, a new technology that transformed both the battlefield and home front. Books, unlike other mass media such as the radio or movies, were tangible things that could be purchased and, like a salami from Katz’s Delicatessen, then sent “to your boy in the army.” Paperback books participated directly in the war effort when publishers and booksellers banded together to produce the Armed Services Editions—millions of books distributed free to the Army and Navy, which left a legacy that influenced a generation of Japanese and Italian scholars to study American literature when they found these handy yellow-covered books or their companions, the Overseas Editions, among their grandfathers’ war surplus (the ASE books could not be brought back to the US, so they were dumped overseas). Books are always causing trouble; even this patriotic gesture ran afoul of Congressional attempts, through amendments to the 1944 Soldier’s Voting Act, to limit the use of certain words in publications distributed to troops that might appear to sway their political opinions.

By the 1950s, after paperback publishing exploded to encompass many imprint houses and augmented reprints with PBOs (paperback originals), the books’ provocative covers—which had been crucial design elements meant to spur sales but also to bring vernacular modernist visual culture into private life—sparked police departments to impound books and Congress to investigate “Current Pornographic Materials” (during the 1952 Gathings subcommittee hearings), including paperbacks. What had been allowed to proliferate during the Second World War, when millions were in uniform and social mores superintending men’s and women’s behaviors loosened considerably, needed to be reined in during the Korean War and the Cold War.

Paperback book publishers had long been aware of real and potential censorship efforts mounted in the United States, most notably the 1933 case United States v. One Book Called “Ulysses.” Its 1934 appeal decision by Augustus Hand declared that the book must be “taken as a whole,” so that even patently “obscene” portions “relevant to the purpose of depicting the thoughts of the characters … are introduced to give meaning to the whole.” This decision was aimed at the literary merits of the work and its “sincerity” in portraying characters, but because the law was aimed at “one book,” the book itself, as a total package from cover to cover, might be considered “as a whole.” Paperback publishers exploited the pulpy aspects of their product, with louche and debauched cover art attracting visual attention; but the covers rigorously conformed to the Ulysses ruling: each depicted a scene found within the book—even if only in a few words. The paperback was a complete work consisting not only of words but art as well.

This handy package, arriving on American shores in the midst of war’s horrors—offering its owners a “complete and unabridged” work, easily carried in pocket or pocketbook, complete with a visually compelling cover—helped usher readers into new sensations through art, science and literature. As objects that circulated along with their owners during and after WWII, they brought modernism to Main Street.

Paula Rabinowitz is the author of AMERICAN PULP: HOW PAPERBACKS BROUGHT MODERNISM TO MAIN STREET and Editor-in-Chief of the Oxford Encyclopedia of Literature. She is a Professor of English at the University of Minnesota, where she teaches courses on twentieth-century American literature, film and visual cultures, and material culture.


 


Slavery in America: Back in the Headlines

Daina Ramey Berry

HNN  November 23, 2014

 

People think they know everything about slavery in the United States, but they don’t. They think the majority of African slaves came to the American colonies, but they didn’t. They talk about 400 years of slavery, but it wasn’t. They claim all Southerners owned slaves, but they didn’t. Some argue it was a long time ago, but it wasn’t.

Slavery has been in the news a lot lately. Perhaps it’s because of the increase in human trafficking on American soil or the headlines about income inequality, the mass incarceration of African Americans or discussions about reparations to the descendants of slaves. Several publications have fueled these conversations: Ta-Nehisi Coates’ The Case for Reparations in The Atlantic, French economist Thomas Piketty’s Capital in the Twenty-First Century, historian Edward Baptist’s The Half Has Never Been Told: Slavery and the Making of American Capitalism, and law professor Bryan A. Stevenson’s Just Mercy: A Story of Justice and Redemption.

As a scholar of slavery at the University of Texas at Austin, I welcome the public debates and connections the American people are making with history. However, there are still many misconceptions about slavery.

I’ve spent my career dispelling myths about “the peculiar institution.” The goal in my courses is not to victimize one group and celebrate another. Instead, we trace the history of slavery in all its forms to make sense of the origins of wealth inequality and the roots of discrimination today. The history of slavery provides deep context to contemporary conversations and counters the distorted facts, internet hoaxes and poor scholarship I caution my students against.

Four myths about slavery

Myth One: The majority of African captives came to what became the United States.

Truth: Only 380,000, or 4-6%, came to the United States. The majority of enslaved Africans went to Brazil, followed by the Caribbean. A significant number of enslaved Africans arrived in the American colonies by way of the Caribbean, where they were “seasoned” and mentored into slave life. They spent months or years recovering from the harsh realities of the Middle Passage. Once they were forcibly accustomed to slave labor, many were then brought to plantations on American soil.

Myth Two: Slavery lasted for 400 years.

Popular culture is rich with references to 400 years of oppression. There seems to be confusion between the Transatlantic Slave Trade (1440-1888) and the institution of slavery, confusion only reinforced by the Bible, Genesis 15:13:

Then the Lord said to him, ‘Know for certain that for four hundred years your descendants will be strangers in a country not their own and that they will be enslaved and mistreated there.’

Listen to Lupe Fiasco – just one Hip Hop artist to refer to the 400 years – in his 2011 imagining of America without slavery, “All Black Everything”:

[Hook]

You would never know

If you could ever be

If you never try

You would never see

Stayed in Africa

We ain’t never leave

So there were no slaves in our history

Were no slave ships, were no misery, call me crazy, or isn’t he

See I fell asleep and I had a dream, it was all black everything

[Verse 1]

Uh, and we ain’t get exploited

White man ain’t feared so he did not destroy it

We ain’t work for free, see they had to employ it

Built it up together so we equally appointed

First 400 years, see we actually enjoyed it

A plantation owner with his slaves. (National Media Museum from UK)

Truth: Slavery was not unique to the United States; it is a part of almost every nation’s history from Greek and Roman civilizations to contemporary forms of human trafficking. The American part of the story lasted fewer than 400 years.

How do we calculate it? Most historians use 1619 as a starting point: 20 Africans referred to as ”servants” arrived in Jamestown, VA, on a Dutch ship. It’s important to note, however, that they were not the first Africans on American soil. Africans first arrived in America in the early 16th century, not as slaves but as explorers together with Spanish and Portuguese explorers. One of the best known of these African “conquistadors” was Estevanico, who traveled throughout the southeast from present-day Florida to Texas. As far as the institution of chattel slavery – the treatment of slaves as property – in the United States is concerned, if we use 1619 as the beginning and the 1865 Thirteenth Amendment as its end, then it lasted 246 years, not 400.

Myth Three: All Southerners owned slaves.

Truth: Roughly 25% of all Southerners owned slaves. The fact that one-quarter of the Southern population were slaveholders is still shocking to many. This truth brings historical insight to modern conversations about the Occupy Movement, its challenge to the inequality gap and its slogan “we are the 99%.”

Take the case of Texas. When it achieved statehood, the Lone Star State had a shorter period of Anglo-American chattel slavery than other Southern states – only 1845 to 1865 – because Spain and Mexico had occupied the region for almost half of the 19th century with policies that either abolished or limited slavery. Still, the number of people impacted by wealth and income inequality is staggering. By 1860, the Texas enslaved population was 182,566, but slaveholders represented 27% of the population, controlled 68% of the government positions and 73% of the wealth. These are shocking figures, but today’s income gap in Texas is arguably more stark, with 10% of tax filers taking home 50% of the income.

Myth Four: Slavery was a long time ago.

Truth: African-Americans have been free in this country for less time than they were enslaved. Do the math: Blacks have been free for 149 years, which means that most Americans are two to three generations removed from slavery. However, former slaveholding families have built their legacies on the institution and generated wealth that African-Americans have not been privy to because enslaved labor was forced; segregation maintained wealth disparities; and overt and covert discrimination limited African-American recovery efforts.

The value of slaves

Economists and historians have examined detailed aspects of the enslaved experience for as long as slavery existed. Recent publications related to slavery and capitalism explore economic aspects of cotton production and offer commentary on the amount of wealth generated from enslaved labor.

My own work enters this conversation looking at the value of individual slaves and the ways enslaved people responded to being treated as a commodity. They were bought and sold just like we sell cars and cattle today. They were gifted, deeded and mortgaged the same way we sell houses today. They were itemized and insured the same way we manage our assets and protect our valuables.

Extensive Sale of Choice Slaves, New Orleans 1859, Girardey, C.E. (Natchez Trace Collection, Broadside Collection, Dolph Briscoe Center for American History)

Enslaved people were valued at every stage of their lives, from before birth until after death. Slaveholders examined women for their fertility and projected the value of their “future increase.” As they grew up, enslavers assessed their value through a rating system that quantified their work. “A1 Prime hand” was one term used for a “first rate” slave who could do the most work in a given day. Their values decreased on a quarter scale, from three-fourths hands to one-fourth hands, to a rate of zero, which was typically reserved for elderly or differently abled bondpeople (another term for slaves).

Guy and Andrew, two prime males sold at the largest auction in US history in 1859, commanded different prices. Although similar in “all marketable points in size, age, and skill,” Guy commanded $1,240 while Andrew sold for $1,040 because “he had lost his right eye.” A reporter from the New York Tribune noted “that the market value of the right eye in the Southern country is $240.” Enslaved bodies were reduced to monetary values assessed from year to year and sometimes from month to month for their entire lifespan and beyond. By today’s standards, Andrew and Guy would be worth about $33,000-$40,000.

Slavery was an extremely diverse economic institution, one that extracted unpaid labor from people in a variety of settings, from small single-crop farms and plantations to urban universities. This diversity is also reflected in their prices. Enslaved people understood they were treated as commodities.

“I was sold away from mammy at three years old,” recalled Harriett Hill of Georgia. “I remembers it! It lack selling a calf from the cow,” she shared in a 1930s interview with the Works Progress Administration. “We are human beings,” she told her interviewer. Those in bondage understood their status. Although Harriett Hill was too little to remember her price when she was sold at three, she recalled being sold for $1,400 at age 9 or 10: “I never could forget it.”

Slavery in popular culture

Slavery is part and parcel of American popular culture, but for more than 30 years the television mini-series Roots was the primary visual representation of the institution, except for a handful of independent (and not widely known) films such as Haile Gerima’s Sankofa or the Brazilian Quilombo. Today Steve McQueen’s 12 Years a Slave is a box office success, actress Azie Mira Dungey has a popular web series called Ask a Slave, and in Cash Crop, sculptor Stephen Hayes compares the slave ships of the 18th century with third world sweatshops.

From the serious – PBS’s award-winning Many Rivers to Cross – and the interactive Slave Dwelling Project – whereby school-aged children spend the night in slave cabins – to the comic at Saturday Night Live, slavery is today front and center.

The elephant that sits at the center of our history is coming into focus. American slavery happened — we are still living with its consequences.

Daina Ramey Berry, Ph.D., is an associate professor of history and African and African Diaspora Studies at the University of Texas at Austin. She is also a Public Voices Fellow, author and award-winning editor of three books, currently at work on a book about slave prices in the United States funded in part by the National Endowment for the Humanities. Follow her on Twitter: @lbofflesh. This article was first published by Not Even Past.


What Happened the Last Time Republicans Had a Majority This Huge? They lost it.

Josh Zeitz

Politico.com    November 15, 2014

Since last week, many Republicans have been feeling singularly nostalgic for November 1928, and with good reason. It’s the last time that the party won such commanding majorities in the House of Representatives while also dominating the Senate. And, let’s face it, 1928 was a good time.

America was rich—or so it seemed. Charles Lindbergh was on the cover of Time. Amelia Earhart became the first woman to fly across the Atlantic. Jean Lussier went over Niagara Falls in a rubber ball (thus trumping the previous year’s vogue for flagpole sitting). Mickey Mouse made his first appearance in a talkie (“Steamboat Willie”). Irving Aaronson and His Commanders raised eyebrows with the popular—and, for its time, scandalous—song, “Let’s Misbehave,” and presidential nominee Herbert Hoover gave his Democratic opponent, Al Smith, a shellacking worthy of the history books.

The key takeaway: It’s been a really, really long time since Republicans have owned Capitol Hill as they do now.

But victory can be a fleeting thing. In 1928, Republicans won 270 seats in the House. They were on top of the world. Two years later, they narrowly lost their majority. Two years after that, in 1932, their caucus shrank to 117 members and the number of Republican-held seats in the Senate fell to just 36. To borrow the title of a popular 1929 novel (which had nothing whatsoever to do with American politics): Goodbye to all that.

A surface explanation for the quick rise and fall of the GOP House majority of 1928 is the Great Depression. As the party in power, Republicans owned the economy, and voters punished them for it. In this sense, today’s Republicans have no historical parallel to fear. Voters—at least a working majority of the minority who turned out last week—clearly blame Barack Obama for the lingering aftershocks of the recent economic crash.

But what if the Republicans of 1928 owed their demise to a more fundamental force? What if it was demography, not economics, that truly killed the elephant?

In fact, the Great Depression was just one factor in the GOP’s stunning reversal of fortune, and in the 1930 cycle that saw Republicans lose their commanding House majority it was probably a minor factor. To be sure, the Republicans of yesteryear were victims of historical contingency (the Great Depression), but they also failed to appreciate and prepare for a long-building trend—the rise of a new urban majority made up of over 14 million immigrants and many millions more of their children. Democrats did see the trend, and they built a majority that lasted half a century.

The lesson for President Obama and the Democrats is to go big—very, very big—on immigration reform. Like the New Dealers, today’s Democrats have a unique opportunity to build a majority coalition that dominates American politics well into the century.

***

For the 1928 GOP House majority, victory was unusually short-lived. About one in five GOP House members elected in the Hoover landslide served little more than a year and a half before losing their seats in November 1930.

On a surface level, the Great Depression was to blame.

The stock market crash of October 1929 destroyed untold wealth. Shares in Eastman Kodak plunged from a high of $264.75 to $150. General Electric, $403 to $168.13. General Motors, $91.75 to $33.50. In the following months, millions of men and women were thrown out of work. Tens of thousands of businesses shut their doors and never reopened.

But in the 1920s—before the rise of pensions and 401Ks, college savings accounts and retail investment vehicles—very few Americans were directly implicated in the market. Moreover, in the context of their recent experience, the sudden downtick of 1929-1930 was jarring but not altogether unusual. Hoover later recalled that “for some time after the crash,” most businessmen simply did not perceive “that the danger was any more than that of run-of-the-mill, temporary slumps such as had occurred at three-to-seven year intervals in the past.”

By April 1930, stocks had recouped 20 percent of lost value and seemed on a steady course to recovery. Bank failures, though vexing, were occurring at no greater a clip than the decade’s norm. Yes, gross national product fell 12.6 percent in just one year, and roughly 8.9 percent of able-bodied Americans were out of work. But events were not nearly as dire as in 1921, when a recession sent GNP plunging 24 percent and 11.9 percent of workers were unemployed.

In fact, Americans in the Jazz Age were accustomed to a great deal of economic volatility and risk exposure. It was the age of Scott and Zelda, Babe Ruth, the Charleston, Clara Bow and Colleen Moore—the Ford Model T and the radio set. But it was also an era of massive wealth and income inequality. In these days before the emergence of the safety net—before unemployment and disability insurance—most industrial workers expected to be without work for several months of each year. For farm workers, the entire decade was particularly unforgiving, as a combination of domestic over-production and foreign competition drove down crop prices precipitously.

In hindsight, we know that voters in November 1930 were standing on the edge of a deep canyon. But in the moment, hard times struck many Americans as a normal, cyclical part of their existence.

Unsurprisingly, then, many House and local races in 1930 hinged more on cultural issues—especially on Prohibition, which in many districts set “wet” Democrats against “dry” Republicans—than economic ones.

If the Depression was not a singular determinant in the 1930 elections, neither had Herbert Hoover yet acquired an undeserved reputation for callous indifference to human suffering. Today, we think of Hoover as the laissez-faire foil to Franklin Roosevelt’s brand of muscular liberalism. But in 1930, Hoover was still widely regarded as a progressive Republican who, in his capacity as U.S. relief coordinator, saved Europe from starvation during World War I. When he was elected president, recalled a prominent journalist, we “were in a mood for magic … We had summoned a great engineer to solve our problems for us; now we sat back comfortably and confidently to watch problems being solved.”

In 1929 and 1930, Hoover acted swiftly to address what was still a seemingly routine economic emergency. He jawboned business leaders into maintaining job rolls and wages. He cajoled the Federal Reserve System into easing credit. He requested increased appropriations for public works and grew the federal budget to its largest-ever peacetime levels. In most contemporary press accounts, he had not yet acquired the stigma of a loser.

Still, in 1930 Hoover’s party took a beating. Republicans lost eight seats in the Senate and 52 seats in the House. By the time the new House was seated in December 1931, several deaths and vacancies resulted in a razor-thin Democratic majority.

If the election was not exclusively or even necessarily about economics, the same cannot be said of FDR’s historic landslide two years later. As Europe plunged headlong into the Depression in 1931 and 1932, the American banking and financial system all but collapsed. With well over 1,000 banks failing each year, millions of depositors lost their life savings. By the eve of the election, more than 50 percent of American workers were unemployed or underemployed.

In response to the crisis, Hoover broke with decades of Republican economic orthodoxy. He stepped up work on the Boulder Dam and Grand Coulee Dam (popular lore notwithstanding, these were not first conceived as New Deal projects). He signed legislation outlawing anti-union (“yellow dog”) clauses in work agreements. And he chartered the Reconstruction Finance Corporation, a government-sponsored entity that loaned money directly to financial institutions, railroads and agricultural stabilization agencies, thereby helping them maintain liquidity. The RFC was in many ways the first New Deal agency, though Herbert Hoover pioneered it. Even the editors of the New Republic, among the president’s sharpest liberal critics, admitted at the time, “There has been nothing quite like it.”


The FBI vs. Martin Luther King: Inside J. Edgar Hoover’s “Suicide Letter” to Civil Rights Leader

Democracy Now   November 18, 2014

It was 50 years ago today that FBI Director J. Edgar Hoover made headlines by calling Rev. Dr. Martin Luther King Jr. the “most notorious liar in the country.” Hoover made the comment in front of a group of female journalists ahead of King’s trip to Oslo where he received the 1964 Nobel Peace Prize, becoming the youngest recipient of the prize. While Hoover was trying to publicly discredit King, the agency also sent King an anonymous letter threatening to expose the civil rights leader’s extramarital affairs. The unsigned, typed letter was written in the voice of a disillusioned civil rights activist, but it is believed to have been written by one of Hoover’s deputies, William Sullivan. The letter concluded by saying, “King, there is only one thing left for you to do. You know what it is. … You are done. There is but one way out for you. You better take it before your filthy, abnormal fraudulent self is bared to the nation.” The existence of the so-called “suicide letter” has been known for years, but only last week did the public see the unredacted version. We speak to Yale University professor Beverly Gage, who uncovered the unredacted letter.


If Obama Faces Impeachment over Immigration, Roosevelt, Truman, Eisenhower and Kennedy Should Have as Well

HNN   November 16, 2014

 

When President Obama announced last week, following the mid-term elections, that he would use his executive powers to make immigration changes, the incoming Senate majority leader Mitch McConnell warned that it “would be like waving a red flag in front of a bull.”  Representative Joe Barton from Texas already saw red, claiming such executive action would be grounds for impeachment.

If so, then Presidents Roosevelt, Truman, Eisenhower and Kennedy should all have been impeached.  All four skirted Congress, at times openly flaunting their administrative prerogative, to implement a guest worker program.

This was the “Bracero” agreement with the Government of Mexico to recruit workers during World War II, starting in 1942 but lasting beyond the war, all the way until 1964.  At its height in the mid-1950s, this program accounted for 450,000 Mexicans per year coming to the U.S. to work, primarily as agricultural workers.

Several aspects of the Bracero program stand out as relevant to the impasse on immigration reform over the last 15 years.  First, the program began with executive branch action, without Congressional approval.  Second, negotiations with the Mexican government occurred throughout the program’s duration, with the State Department taking the lead in those talks.  Finally, this guest worker initiative, originally conceived as a wartime emergency, evolved into a program in the 1950s that served specifically to dampen illegal migration.

Even before Pearl Harbor, growers in the Southwest faced labor shortages in their fields and had lobbied Washington, unsuccessfully, to allow for migrant workers.  It took less than five months following the declaration of war to reverse U.S. government intransigence on the need for temporary workers.  Informal negotiations had been taking place between the State Department and the Mexican government, so that an agreement between the two countries could be signed on April 4, 1942.  By the time legislation had passed authorizing the program seven months later, thousands of workers had already arrived in the U.S.

The absence of Congress was not just due to a wartime emergency.  On April 28, 1947, Congress passed Public Law 40 declaring an official end to the program by the end of January the following year.   Hearings were held in the House Agriculture Committee to deal with the closure, but its members proceeded to propose ways to keep guest workers in the country and extend the program, despite the law closing it down.  Further, without the approval of Congress, the State Department was negotiating a new agreement with Mexico, signed on February 21, 1948, weeks after Congress mandated its termination.  Another seven months later, though, Congress gave its stamp of approval to the new program and authorized it for another year.  When the year lapsed, the program continued without Congressional approval or oversight.

The Bracero Program started out as a wartime emergency, but by the mid-1950s, its streamlined procedures made it easier for growers to hire foreign labor without having to resort to undocumented workers.  Illegal border crossings fell.

Still, there were many problems making the Bracero Program an unlikely model for the current immigration reforms.  Disregard for the treatment of the contract workers tops the list of problems and became a primary reason for shutting the program down.  However, the use of executive authority in conceiving and implementing an immigration program is undeniable.

The extent of the executive branch’s involvement on immigration was best captured in 1951, when a commission established by President Truman to review the status of migratory labor concluded that “The negotiation of the Mexican International Agreement is a collective bargaining situation in which the Mexican Government is the representative of the workers and the Department of State is the representative of our farm employers.”  Not only was the executive branch acting on immigration, but it was negotiating its terms and conditions, not with Congress, but with a foreign country.  Remarkable language, especially looking forward to 2014, when we are told that such action would be an impeachable offense.

Senator McConnell used the bullfighting analogy because the red flag makes the bull angry; following the analogy to its inevitable outcome is probably not what he had in mind.  The poor but angry bull never stands a chance.  In this case, though, it won’t be those in Congress who stand no chance; it will be those caught in our messy and broken immigration system.

John Dickson was Deputy Chief of Mission in Mexico and Director of the Office of Mexican Affairs at the Department of State and is a recent graduate of the University of Massachusetts public history program.

