Christopher Hager

Christopher Hager is the Schaenen Scholar for the current school year. Dr. Hager received his bachelor's degree from Stanford University and his Ph.D. from Northwestern University. He is an Associate Professor of English at Trinity College in Hartford, Connecticut, where he teaches courses in American literature and cultural history and, for the past three years, has co-directed the Center for Teaching and Learning. He is the author of Word by Word: Emancipation and the Act of Writing (Harvard Univ. Press, 2013), a study of the writing practices of enslaved and recently emancipated African Americans, which won the Frederick Douglass Prize and was a finalist for the Lincoln Prize.

Untitled by Phil Dokas is licensed under CC BY-NC-SA 2.0 (via Flickr)

This is the fourth in a series about American History and the Ethics of Memory. This post originally appeared on February 9, 2016.

It was a hotly contested presidential election, and the mudslinging was fierce. There were allegations of fiscal corruption, sexual impropriety, and—perhaps most damning of all—bad writing. 

The Democratic candidate, it was rumored, spelled Congress with a K. Couldn’t construct a complete sentence. Had to hire someone to write his letters for him. Was almost entirely illiterate.

The charges went viral. They even inspired snatches of satirical poetry in the newspapers:

Then a nice writing-man I have hired for my use,
To hide the bad spelin I skrawl
And them are as says how my grammar is bad,
Don’t know nothing of it all.

The man the poem was mocking, the one supposed to be guilty of these several crimes against the English language, now appears on the $20 bill. The John Quincy Adams campaign’s efforts to smear their upstart rival as illiterate did not stop Andrew Jackson from winning the White House.

Modern scholars have actually tried to figure out, “Could Andrew Jackson Spell?” The evidence is inconclusive, but the question doesn’t seem especially important for us now. What is relevant today is what the episode suggests about how we evaluate candidates—the role ideas about literacy play in political discourse, and to what effect. Left-leaning commentators’ gleefulness over Sarah Palin’s recent display of verbal clumsiness, in her speech endorsing Donald Trump, doesn’t look very different from the hilarity that ensued among Adams supporters when they heard about a 25-line letter by Jackson that included 23 misspellings.

Spelling Congress with a K doesn’t by itself seem like a disqualifier from the presidency. An effective chief executive must be able to do many things with Congress, but spelling is lower on the list than cooperating, negotiating, persuading, and maneuvering. The general idea behind the Adams campaign’s gambit was that by portraying Jackson, born in the backwoods of the Carolinas, as illiterate, they could persuade voters he lacked the aptitude to manage the complexities of the national government—as the incumbent Adams, scion of one of the founding families of the republic, obviously could.

Arguably there was some truth to this. By all appearances, Jackson failed to comprehend the function and importance of the Bank of the U.S. when, with devastating economic results, he effectively destroyed it in the 1830s (one of the reasons many people would like to see an American woman replacing Jackson on the $20 bill, rather than Alexander Hamilton on the $10). But this may have been a coincidence. People who did grasp the ins and outs of central banking in the 1830s probably were highly literate, but the converse isn’t necessarily true. Plenty of people who knew the correct spelling of Congress still didn’t understand what the Bank of the U.S. was good for, just as many well-read and eloquent people in 2008 had no idea what a collateralized debt obligation was.

Besides, it didn’t work. Jackson beat Adams. The election of 1828 proved to be an early installment in the long American tradition of affection for politicians who are “regular guys” (or, in the lexicon of pollsters during the election of 2000, people you’d like to have a beer with). Not for the last time, a bookish and bespectacled candidate inspired more distrust among voters than a rough-edged, inarticulate one. Never mind that the supposedly effete Bostonian went on to serve nine terms in Congress and successfully defend the Amistad rebels, while the manly frontiersman earned a reputation for exterminating American Indians. Maybe the Adams camp would have done better for their candidate, and the country, by talking more about principles than orthography.

Which may be useful to remember in our own era. The whirlwind of attention paid to Sarah Palin’s recent speech has been dominated by derision of her odd phraseology and general incoherence—which is a perfectly legitimate (and certainly amusing) subject for Saturday Night Live (“She sounds like a greeting card from a Chinese dollar store!”). But even the venerable New York Times’s coverage devolved into a listicle called “The Most Mystifying Lines of Sarah Palin’s Endorsement Speech.”

The first question about that speech or any other politician’s shouldn’t be whether or not it’s a fluid sequence of grammatical sentences (as nice as that would be) but whether or not it’s bullshit—a word I use here in its technical sense to refer to indifference to truthfulness. Misused and made-up words are great fodder for social-media mockery (refudiate! squirmishes!), but they’re less outrageous than (to choose just one example from Palin’s speech) the claim that military veterans are not “treated better than illegal immigrants are treated in this country.” And they’re far less damaging than an attitude toward political discourse that doesn’t care whether that claim, or any other, can even be backed up. Sarah Palin may be inarticulate, but there is more important work to be done than pointing that out.

Illustration created by Antislavery Usable Past Project at the University of Nottingham in 2014. Used with permission from the creators.

This is the third in a series about American History and the Ethics of Memory. This post originally appeared on November 17, 2015.

“Using History to Make Slavery History.” That was the title of this year’s conference of Historians Against Slavery (HAS), a four-year-old organization created to “bring historical context and scholarship to the modern-day antislavery movement in order to inform activism.” A related endeavor, the Antislavery Usable Past project, aims similarly to “bring to the present the important lessons from antislavery movements and policies of the past, and translate those lessons into effective tools for policy makers, civil society, and citizens.” There aren’t many venues in which—as at the HAS conference, which I attended last month—history professors, human-rights activists, and survivors of modern slavery sit side by side behind a panel table on an auditorium stage.

Plenty of critical issues could benefit from richer public knowledge of the past—climate change and the early history of industrialism, say—but contemporary antislavery activism stands out for the extensiveness of its recourse to academic history. It makes for an illuminating study in the ways different notions of “history” may inform contemporary ethics.

In the wake of the June 2015 Charleston shooting, a New York Times blogger wrote about her determination no longer to speak to her children about America’s “history” of racial violence. She felt it her responsibility to acknowledge, rather, that “the saga of racism in this country is ongoing.” In that formulation, the present is (regrettably) continuous with the past, and the word “history” means what it means at the end of the HAS conference’s title—dead and gone. History is imagined chiefly in contrast to our own time: if the events around us resemble those of decades past, then they belong not to “history” but rather to us. Getting “on the right side of history,” as some like to say (a phrasing Andy Cullison and I recently discussed in a podcast), means getting out in front of history; putting it, and its injustices, behind us.

When it comes to slavery’s history, the word is no simple synonym for “over.”

There are indeed striking and distressing similarities between the 2015 Charleston shooting and, for instance, the 1963 bombing of 16th Street Baptist Church in Birmingham, or lynchings in the early post-Civil War South. All were acts of criminal homicide—indeed, terrorism—by white supremacists targeting African Americans. Racist violence is a part of American history that, tragically, is not over. Past and present forms of slavery, on the other hand, are distinct from each other, and in some fundamental ways. Slavery in the pre-Civil War U.S. was legal, highly visible, and situated at the center of the economy. Today, it exists under the radar of laws and in the shadows of global trade. Slavery is not “history”—in the sense of being gone from the earth—but neither is it today a straightforward continuation of what it was in the past.

The slavery of the pre-Civil War U.S. has been the object of tremendous scholarly attention for years, and public consciousness of its atrocities perhaps reached its high-water mark at the moment 12 Years a Slave won the Academy Award for Best Picture. Of course, it took a long time for that water to rise from where it was when Gone With the Wind won the same award in 1939. And the recent controversy about a high-school history textbook’s allusion to the arrival of African “workers” on southern plantations shows how deeply flawed cultural memory of slavery still can be. Still, a relatively strong cultural memory of slavery (Slate.com recently devoted its first Slate Academy to “The History of American Slavery”) forms a foundation on which contemporary anti-slavery activists can build.

The Polaris Project, one of the leading international organizations opposing human trafficking, seized on the success of 12 Years a Slave to publicize its own work, naming the film’s director, Steve McQueen, an awareness-raising “ambassador” and “highlighting the striking parallels” between Solomon Northup’s kidnapping and enslavement in the 1840s “and the experiences of sex and labor trafficking victims today.” The image at the top of this page, likewise, seeks to make continuities between historical and contemporary slavery visible amid their differences—by situating the famous chart of the slave ship Brookes, which galvanized British abolitionism when it was published in 1788, inside a commercial jetliner.

In fact, there are several ways of seeing continuities between slavery’s American history and its presence in the world now. Along one stream, the race-based chattel slavery of the antebellum U.S. persists in illegal forced labor around the world today, an altered form of rapacious capital-accumulation on the backs of vulnerable people. Another stream flows from historical slavery—in particular, its notorious instances of sexual exploitation of enslaved women—to modern sex trafficking. Still another passes through the “exception clause” in the Thirteenth Amendment, which abolished slavery “except as a punishment for crime whereof the party shall have been duly convicted.” Here, race matters once again, given the disproportionate incarceration of African American men, and the other streams re-converge with this one when U.S. prison labor serves corporate interests or when the prison system harbors sexual slavery.

It can seem, as David Gellman has written in this space, “jarring” to move from Louisiana plantations 175 years ago “to children prostituted in Thailand and brick kiln laborers in the thralls of debt bondage in India, or to migrant laborers in Florida and housekeepers in Washington, D.C.” But the potential payoff is significant. Making that move can tap into a vast public consensus that slavery is an intolerable injustice. Believing in a human right to freedom isn’t a partisan position.

Yet there remains relatively little public awareness that enslavement still happens, or that the very bad things people do acknowledge merit the odious name of “slavery.” (Are candidates talking about slavery in presidential debates? No.) The will to stop human trafficking and commercial exploitation is hampered less by ideological divisions than by ideas about the past—notions of a history that’s believed to be over.

In the U.S., emancipation is a central part of our cultural memory, our heritage. America may be a nation that was built on the backs of slaves, but it is also a nation that fought a great war to set those slaves free. It is well that it should be so remembered. Little progress toward justice in our own time will be accomplished if Americans do not remember that their nation endured terrible suffering to secure the liberty of four million African Americans. The challenge is to celebrate that without failing to perceive, acknowledge, and work to understand both slavery’s tragic legacies and the forms in which slavery endures.

"Memphis Riots, 1866" by Alfred Rudolph Waud (Public Domain, via Tennessee State Library and Archives)

This is the second in a series on American History and the Ethics of Memory. This post originally appeared on September 15, 2015.

Warner Madison doesn’t trust the police. He thinks they view all black people with suspicion, harass them on the streets, and arrest them without cause. When police accost his children on their way to school, he can barely contain his anger. He fires off a letter protesting what he calls “one of the most obnoxious and foul and mean” things he has ever witnessed. But he possesses little hope that police treatment of African Americans in his city will change.

What does Warner Madison think of Trayvon Martin, Eric Garner, Michael Brown, Walter Scott, Freddie Gray, Samuel DuBose? There’s no telling. He’s been dead for probably more than a century. Warner Madison’s outrage about race and policing came to a head just after the Civil War, in 1865, when he was a 31-year-old barber living in Memphis, Tennessee.

Memphis isn’t among recent flashpoints—Ferguson, North Charleston, Baltimore—nor does it crop up among the place names associated with racial violence of decades past—South Central, Crown Heights, Watts. In collective memory of American history, Memphis figures as the scene of a lot of great music and the assassination of Martin Luther King. Yet the Memphis of Warner Madison’s time is an essential, though largely forgotten, part of understanding race and policing in America.

A recent New York Times article suggests Americans are living through “a purge moment” in their relationship to history. Especially since the June shooting at Emanuel AME Church in Charleston, icons of slavery and the Confederate States of America have been challenged and, in many cases, removed: the Confederate battle flag from the South Carolina statehouse grounds and, most recently, a statue of Jefferson Davis from the University of Texas at Austin’s Main Mall. At Yale University, students are now debating whether to rename Calhoun College, which honors a Yale alumnus who is best known as the antebellum South’s most ardent defender of slavery.

Skeptics are concerned about “whitewashing” the past or hiding important if controversial aspects of our history in “a moral skeleton closet.” (At the extreme end, one can find online commenters likening such reconsiderations to ISIS’s destruction of pre-Islamic antiquities in Syria.) But there’s little harm and often much good in collective soul-searching, as among the students at Yale, about how best to remember a community’s past and express its shared values—whatever decision that community may finally come to about its memorials. And, as I argued here previously, memory is a scarce resource. Figures like Jefferson Davis and John C. Calhoun shouldn’t be forgotten, but their public memorialization, in statues, street names, and the like, keeps them before our eyes to the exclusion of things we could be remembering instead—things we might find more useful for understanding the world around us.

Take Memphis in the 1860s. A southern city that fell to Union forces only about a year into the Civil War, in June 1862, it became a testing ground for the new order that would follow the destruction of slavery. As word spread across the countryside that the reputed forces of freedom had taken Memphis, African Americans flocked to the city. By the end of the war, Memphis’s black population had grown from 3,000 to 20,000, and surrounding plantations were proportionately emptied—to the dismay of cotton planters who needed laborers in their fields.

White authorities forcibly moved former slaves back onto the plantations. How? Using newly invented laws against “vagrancy.” Memphis blacks who could not prove gainful employment in the city were deemed vagrants, and vagrants were subject to arrest and impressment into the agricultural labor force.

Vagrancy had existed as a word and a phenomenon for centuries. In the post-Civil War South, it became a crime. Vagrancy laws were mainstays of southern states’ “black codes” during the late nineteenth century, because they helped white supremacists restore a social order that resembled slavery. Black men without jobs were guilty of the crime of vagrancy simply by going outside and walking down the street. Once arrested and imprisoned, they could be put on chain gangs—and white southerners could once again exploit their unpaid labor.

It’s not surprising that former Confederates were responsible for criminalizing black unemployment. But so was the Freedmen’s Bureau—the federal agency expressly charged by Congress and Abraham Lincoln with assisting former slaves in their transition to freedom. The bureau’s Memphis superintendent wrote in 1865 that the city had a “surplus population of at least six thousand colored persons [who] are lazy, worthless vagrants” and authorized patrols that were arresting black Memphians indiscriminately.

When some of the leading black citizens of Memphis had seen enough of this—men taken away to plantations at the points of bayonets, children stopped on their way to school and challenged to prove they weren’t “vagrants”—they called on the most literate people among them, including Warner Madison, to compose petitions to high-ranking federal officials. Paramount among their grievances was the harassment of “Children going to School With there arms full of Book[s].” To the African American community, freedom meant access to education. But to whites—even the very officials responsible for protecting black rights—freedom meant that African Americans needed to get to work or be policed.

Clinton Fisk, a Union general during the war, now oversaw Freedmen’s Bureau operations in all of Tennessee and Kentucky. He replied politely to the letters he received, but he never credited or even mentioned the reports of abuse Warner Madison and his compatriots provided. He asked his subordinate in Memphis to investigate, and the unbothered report came back: “I can find no evidence whatever that School children, with Books in their hands have been arrested, except in two or three cases.” Even Clinton Fisk—an abolitionist who so strongly advocated African American education that Fisk University bears his name—failed to affirm that having a book kept a person from being vagrant.

This period of conflict culminated in the Memphis riots of 1866—an episode that ought to be infamous (and is the subject of a few good books) but is generally absent from public consciousness. White Memphians initially assumed the “riots” were a black protest that turned violent, but what actually occurred in Memphis on the first days of May in 1866 was a massacre of black men, women, and children by white mobs, among whom were many police officers.

In failing to remember 1860s Memphis—failing even to know the name of someone like Warner Madison, after whom no highways or elementary schools are named—we fail to remember that the federal government once made it the special province of law enforcement agents to accost African Americans in public places. Without remembering that, we cannot apprehend the complexity and durability of the problems underlying current events.

What we now call “racial profiling,” and even the appallingly frequent uses of lethal force against black citizens, may result less from the implicit bias of police officers than from a historical legacy. Abiding modes of law enforcement and criminal justice, brought to us by nineteenth-century white Americans’ anxieties about the abolition of slavery, were designed to treat black people walking freely on city streets—unless they were being economically productive in ways white people approved—as social threats.

Racism may be only a partial explanation. Some of the people who were arresting “vagrants” in Memphis were African American—they were soldiers in the U.S. Army acting under Freedmen’s Bureau orders—and so are three of the six Baltimore police officers charged with the death of Freddie Gray. Blame may rest, too, with habits of mind upon which few people even frown—like taking gainful employment as a measure of human worth (a pernicious corollary of the belief that markets possess wisdom), or presuming that someone must be up to no good if he has (in Chuck Berry’s words) no particular place to go.

Clintonville Plaque by Robert Walker (via Wikimedia Commons, CC BY 3.0)

This post originally appeared August 11, 2015.

When I was in graduate school, I walked almost every day past a rock bearing a plaque that read “On this site in 1897 nothing happened.” You may have seen it yourself. The “nothing happened” plaque has become a minor meme in embossed iron. A Flickr photo pool features several dozen pictures of it in various locations around the English-speaking world. You can even buy one for yourself on Amazon.

Many people seem to think it’s a good joke. If you stop and think about it, it’s actually two jokes.

On one hand, it’s funny because it’s a reductio ad absurdum. We live in a culture of rampant commemoration—historical markers, day- and month-long observances, commemorative street names, “On this Day in History” features. The National Register of Historic Places includes more than 80,000 sites. The “nothing happened” plaque is a joke that says we tend to over-remember—that there is literally nothing we won’t devote a plaque to.

At the same time, it can be understood as a joke that says we tend to under-remember. As the writer of the Book of Ecclesiastes already knew, millennia ago, “What has been will be again, what has been done will be done again; there is nothing new under the sun.” The world is very old: everything has already happened, everywhere, in every year of recorded history (even in poor derided 1897), and a site where truly nothing happened is indeed notable enough to warrant a plaque. Every day we tread ground where past generations lived lives from which we doubtless could learn a great deal, if only we had a way to listen to them. But without concerted efforts to remember what preceded us, we remain mostly oblivious to the archaeology of human experience.

These competing meanings of the joke present a quandary: when are we remembering too much, and when are we remembering too little? Or, more precisely: given that we can’t possibly remember everything all the time, how should we choose, and how should we remember, the finite elements of the past for which our brains (and the surfaces on which we can install plaques) have room? Memory, personal and collective, has some of the same features as other scarcity-of-resources problems, including challenges in determining the most ethical distribution of those resources.

More than a decade after the terrorist attacks of September 11, you can walk into a truck stop and find a t-shirt or bumper sticker bearing pictures of the World Trade Center and the words “Never Forget.” Most people would agree that it ought never be forgotten—and the shirts and stickers are a testament to their own effectiveness in shaping public memory. But what are we doing, and what should we do, with that memory? Few people would find any fault with its heightening our admiration for first responders or making us appreciative of the peace and security most of us enjoy. But how is memory of 9/11 shaping public ideas about the U.S.’s relationship to the Islamic world? Should it be accompanied by greater remembrance of the Iraq and Afghanistan wars, about which much (civilian death tolls in the region, long-term tolls on veterans’ lives) still is not widely known?

The country’s deeper past shapes the present, too, of course. To Dylann Roof, the Confederate flag probably represented a lost past of white supremacy. Politicians who now say the flag “belongs in a museum” seem to be implying that it should be remembered differently (although no politician ventures very specific recommendations about how). Or maybe they’re suggesting it should be all but forgotten—it depends on whether one sees museums as living institutions or as society’s attic. While there may be emerging widespread agreement that the Confederate flag should not be valorized, it is much less clear exactly what we should do instead. Are there ways we could remember the histories of slavery and the Civil War that might better contribute to racial justice in our own time?

In a series of columns over the coming year, while I’m in residence at the Prindle Institute, I’ll explore questions like these that arise from the ways contemporary culture seems to remember—or not remember—various aspects of American history.