"2016-06-14_20-03-47_ILCE-6300_DSC08367" by Miguel Discart is licensed by CC BY-SA 2.0 (via Flickr)

In the wake of the largest mass shooting in the United States to date, Facebook and other social media sites have been flooded with posts honoring the victims in Orlando. Many such posts feature the faces of the victims, rainbow banners, and “share if you’re praying for Orlando” messages. Although there is nothing particularly harmful about sharing encouraging thoughts through social media, some have begun to argue that such posts might do more harm than good.

Image created from a photograph by Conner Gordon

How Was Your Smartphone Made? Nobody Really Knows (Wall Street Journal)
by Geoffrey A. Fowler
“The more I ask about my phone’s roots in African mines and Asian assembly lines, the more uncomfortable I become. My phone might have supported forced labor or warlords.”

Indiana Governor Stunned by How Many People Seem To Have Gay Friends (New Yorker)
by Andy Borowitz
“Pence said that from what he has been able to gather thus far, the phenomenon of ‘ordinary folks’ having gay friends ‘has been going on for years.’”

The Church Camps That Aim to Bridge Race Relations (Atlantic)
by Jesse James Deconto
“Many American Christians still grieve something Martin Luther King Jr. articulated more than 50 years ago: Churches are among the most segregated spaces in America.”

When College Students Need Food Pantries More Than Textbooks (Atlantic)
by Emily Deruy
“The report found that many universities have been offering emergency aid to students at risk of dropping out for financial reasons for years, but often in an ad hoc fashion.”

Keep Abortion Legal - "Protest against Focus on the Family's “Stand for the Family” event" by Tony Webster is licensed under CC BY 2.0 (via Flickr)

On June 27th, the Supreme Court decided the hotly debated case Whole Woman’s Health v. Hellerstedt, which dealt with access to abortion clinics in Texas. In 2013, Texas passed a law requiring abortion clinics in the state to hire only doctors with “admitting privileges at local hospitals” and to “meet outpatient surgical center standards.” The law would have shut down nearly 30 of the 40 abortion clinics in Texas, a state home to 5.4 million women of reproductive age.

"Toilet" by dinky123uk is licensed under CC0 Public Domain (via Pixabay)

Recently in the United States, bathroom usage rights for transgender people have come to the political fore. As part of Title IX’s protections against gender discrimination in federally funded educational institutions, the Obama administration has recently ordered public schools to allow students to use whichever bathrooms they please. This should free transgender students from the unpleasantness of using what they perceive to be the wrong bathroom, or of being asked to use single-user facilities (unlike and apart from their classmates).

This development is the culmination of a debate that first brewed on college campuses across the country and later gave rise to various state-level “bathroom bills” that would require people to use the bathrooms corresponding to the gender on their birth certificates. But now that President Obama himself is involved, the issue is unlikely to dissipate without further legislative or judicial action.

It’s not too difficult to see why bathrooms have historically been a focal point during times of social change. Before bathrooms became a pressure point in figuring out how transgender people should be included and accommodated publicly, they served as a literal and metaphorical site of racial tensions during the Civil Rights movement and of sexist tensions as women increasingly worked and ventured outside the home.

In a hypothetical, robustly free country, businesses and organizations would be left alone to determine their own bathroom policies, while customers would be free to patronize whichever establishments they liked. In theory, this means businesses could choose to offer bathrooms segregated along any conceivable dimension. In practice, however, establishments with odious bathroom (and other) policies would likely fail fast.

The only places likely to thrive with such practices in place would be, for instance, small ideological clubs or foundations and houses of worship. And the existence of self-contained islands of social dissent does not threaten the liberal order. On the contrary, the protection of peaceful freedom of association is an essential feature of liberalism.

But starting from the quite non-libertarian status quo, things are much more complicated. The provision of bathrooms is already heavily regulated. For instance, overlapping and even conflicting bathroom regulations in New York City mean it’s often unclear whether a restaurant or coffee shop is in compliance with bathroom code, which depends on the number of seats, age of the building and business, and other factors.

The already-regulated status quo means that when the government declines to further regulate bathrooms, that refusal bears greater symbolic weight than it would if public bathrooms remained a generally extra-legal matter (as in the ideal libertarian state of affairs). If it was appropriate to legally protect bathroom access for people of color, and later for people with disabilities, then refusing to do so for transgender people suggests that their status is somehow less important.

That being said, state-level bathroom laws will probably have little effect in practice. It would be incredibly burdensome to actively check that bathroom visitors at any given venue were choosing the right door, whatever the standard: the facility corresponding to their birth gender, their current legal gender, their apparent gender, or their personally professed gender.

Of course, acts of voyeurism and sexual assault are already criminal, so police are empowered to prevent and investigate them whether or not they are also empowered to act as gender enforcers. Perhaps a few would-be bathroom criminals would be deterred by the prospect of an extra charge for having used the wrong bathroom, but criminal penalties for sex crimes should already be deterrent enough.

Finally, we should remember that it is okay to personally disagree with the law. Even more importantly, a liberal society requires us merely to tolerate peaceful others, not to eagerly approve of everything about them in our hearts. Social conservatives have been losing the culture wars for some time and are not wrong to feel that their ethical ideals are waning. It will take time and experience to show those uneasy with changes in bathrooms that those changes are really a non-issue. Top-down action, like that of the Obama administration, can change policies, but it doesn’t necessarily win hearts and minds, and may even provoke political backlash.

Image created from a photograph by Conner Gordon

Climate change: the missing issue of the 2016 campaign (Guardian)
by Ed Pilkington and Mona Chalabi
“Many of the respondents vented despair at a political system that in their view allowed a matter of such overwhelming significance to be so overlooked. ‘The fact that no one is really talking about climate change, to me, is indicative of just how lost we are,’ said Linda Hayden, 51, from Oregon. ‘Our house is on fire and we are arguing about who is more angry!’”

The Fines and Fees That Keep Former Prisoners Poor (Atlantic)
by Alana Semuels
“The uptick in LFOs comes as states look for ways to pay for their corrections system while facing other revenue shortfalls. The fees levied on the formerly incarcerated include bench-warrant fees, filing-clerks fees, court-appointed attorney fees, crime-lab analysis fees, DNA-database fees, jury fees, and incarceration costs.”

“The Best Revenge is Your Paper”: Notes on Women’s Work (LA Review of Books)
by Alice Bolin
“If dating and marriage are work for women, in today’s economy they have found many ways to monetize them.”

Adding Classes and Content, Resurgent Libraries Turn a Whisper Into a Roar (New York Times)
by Winnie Hu
“No longer just repositories for books, public libraries have reinvented themselves as one-stop community centers that aim to offer something for everyone. In so doing, they are reaffirming their role as an essential part of civic life in America by making themselves indispensable to new generations of patrons.”

Untitled by Phil Dokas is licensed under CC BY-NC-SA 2.0 (via Flickr)

This is the fourth in a series about American History and the Ethics of Memory. This post originally appeared on February 9, 2016.

It was a hotly contested presidential election, and the mudslinging was fierce. There were allegations of fiscal corruption, sexual impropriety, and—perhaps most damning of all—bad writing. 

The Democratic candidate, it was rumored, spelled Congress with a K. Couldn’t construct a complete sentence. Had to hire someone to write his letters for him. Was almost entirely illiterate.

The charges went viral. They even inspired snatches of satirical poetry in the newspapers:

Then a nice writing-man I have hired for my use,
To hide the bad spelin I skrawl
And them are as says how my grammar is bad,
Don’t know nothing of it all.

The man the poem was mocking, the one supposed to be guilty of these several crimes against the English language, now appears on the $20 bill. The John Quincy Adams campaign’s efforts to smear their upstart rival as illiterate did not stop Andrew Jackson from winning the White House.

Modern scholars have actually tried to figure out, “Could Andrew Jackson Spell?” The evidence is inconclusive, but the question doesn’t seem especially important for us now. What is relevant today is what the episode suggests about how we evaluate candidates—the role ideas about literacy play in political discourse, and to what effect. Left-leaning commentators’ gleefulness over Sarah Palin’s recent display of verbal clumsiness, in her speech endorsing Donald Trump, doesn’t look very different from the hilarity that ensued among Adams supporters when they heard about a 25-line letter by Jackson that included 23 misspellings.

Spelling Congress with a K doesn’t by itself seem like a disqualifier from the presidency. An effective chief executive must be able to do many things with Congress, but spelling is lower on the list than cooperating, negotiating, persuading, and maneuvering. The general idea behind the Adams campaign’s gambit was that by portraying Jackson, a rough-hewn product of the frontier backwoods, as illiterate, they could persuade voters he lacked the aptitude to manage the complexities of the national government—as the incumbent Adams, scion of one of the founding families of the republic, obviously could.

Arguably there was some truth to this. By all appearances, Jackson failed to comprehend the function and importance of the Bank of the U.S. when, with devastating economic results, he effectively destroyed it in the 1830s (one of the reasons many people would like to see an American woman replacing Jackson on the $20 bill, rather than Alexander Hamilton on the $10). But this may have been a coincidence. People who did grasp the ins and outs of central banking in the 1830s probably were highly literate, but the converse isn’t necessarily true. Plenty of people who knew the correct spelling of Congress still didn’t understand what the Bank of the U.S. was good for, just as many well-read and eloquent people in 2008 had no idea what a collateralized debt obligation was.

Besides, it didn’t work. Jackson beat Adams. The election of 1828 proved to be an early installment in the long American tradition of affection for politicians who are “regular guys” (or, in the lexicon of pollsters during the election of 2000, people you’d like to have a beer with). Not for the last time, a bookish and bespectacled candidate inspired more distrust among voters than a rough-edged, inarticulate one. Never mind that the supposedly effete Bostonian went on to serve nine terms in Congress and successfully defend the Amistad rebels, while the manly frontiersman earned a reputation for exterminating American Indians. Maybe the Adams camp would have done better for their candidate, and the country, by talking more about principles than orthography.

Which may be useful to remember in our own era. The whirlwind of attention paid to Sarah Palin’s recent speech has been dominated by derision of her odd phraseology and general incoherence—which is a perfectly legitimate (and certainly amusing) subject for Saturday Night Live (“She sounds like a greeting card from a Chinese dollar store!”). But even the venerable New York Times’s coverage devolved into a listicle called “The Most Mystifying Lines of Sarah Palin’s Endorsement Speech.”

The first question about that speech or any other politician’s shouldn’t be whether or not it’s a fluid sequence of grammatical sentences (as nice as that would be) but whether or not it’s bullshit—a word I use here in its technical sense to refer to indifference to truthfulness. Misused and made-up words are great fodder for social-media mockery (refudiate! squirmishes!), but they’re less outrageous than (to choose just one example from Palin’s speech) the claim that military veterans are not “treated better than illegal immigrants are treated in this country.” And they’re far less damaging than an attitude toward political discourse that doesn’t care whether that claim, or any other, can even be backed up. Sarah Palin may be inarticulate, but there is more important work to be done than pointing that out.

"CMRF 08/10 OLCHC Photos 44" by CMRF Crumlin is licensed under CC BY 2.0 via Flickr

Recently, a 5-year-old child named Julianna Snow passed away from a neurological disease known as Charcot-Marie-Tooth, which causes nerves to degenerate, weakening the muscles involved in chewing, swallowing, and eventually breathing. Although Charcot-Marie-Tooth disease is one of the world’s most commonly inherited neurological disorders, this story made national headlines due to Julianna’s independent decision to refuse treatment.

Illustration created by Antislavery Usable Past Project at the University of Nottingham in 2014. Used with permission from the creators.

This is the third in a series about American History and the Ethics of Memory. This post originally appeared on November 17, 2015.

“Using History to Make Slavery History.” That was the title of this year’s conference of Historians Against Slavery (HAS), a four-year-old organization created to “bring historical context and scholarship to the modern-day antislavery movement in order to inform activism.” A related endeavor, the Antislavery Usable Past project, aims similarly to “bring to the present the important lessons from antislavery movements and policies of the past, and translate those lessons into effective tools for policy makers, civil society, and citizens.” There aren’t many venues in which—as at the HAS conference, which I attended last month—history professors, human-rights activists, and survivors of modern slavery sit side-by-side behind a panel table on an auditorium stage.

Plenty of critical issues could benefit from richer public knowledge of the past—climate change and the early history of industrialism, say—but contemporary antislavery activism stands out for the extensiveness of its recourse to academic history. It makes for an illuminating study in the ways different notions of “history” may inform contemporary ethics.

In the wake of the June 2015 Charleston shooting, a New York Times blogger wrote about her determination no longer to speak to her children about America’s “history” of racial violence. She felt it her responsibility to acknowledge, rather, that “the saga of racism in this country is ongoing.” In that formulation, the present is (regrettably) continuous with the past, and the word “history” means what it means at the end of the HAS conference’s title—dead and gone. History is imagined chiefly in contrast to our own time: if the events around us resemble those of decades past, then they belong not to “history” but rather to us. Getting “on the right side of history,” as some like to say (a phrasing Andy Cullison and I recently discussed in a podcast), means getting out in front of history; putting it, and its injustices, behind us.

When it comes to slavery’s history, the word is no simple synonym for “over.”

There are indeed striking and distressing similarities between the 2015 Charleston shooting and, for instance, the 1963 bombing of 16th Street Baptist Church in Birmingham, or lynchings in the early post-Civil War South. All were acts of criminal homicide—indeed, terrorism—by white supremacists targeting African Americans. Racist violence is a part of American history that, tragically, is not over. Past and present forms of slavery, on the other hand, are distinct from each other, and in some fundamental ways. Slavery in the pre-Civil War U.S. was legal, highly visible, and situated at the center of the economy. Today, it exists under the radar of laws and in the shadows of global trade. Slavery is not “history”—in the sense of being gone from the earth—but neither is it today a straightforward continuation of what it was in the past.

The slavery of the pre-Civil War U.S. has been the object of tremendous scholarly attention for years, and public consciousness of its atrocities perhaps reached its high-water mark at the moment 12 Years a Slave won the Academy Award for Best Picture. Of course, it took a long time for that water to rise from where it was when Gone With the Wind won the same award in 1939. And the recent controversy about a high-school history textbook’s allusion to the arrival of African “workers” on southern plantations shows how deeply flawed cultural memory of slavery still can be. Still, a relatively strong cultural memory of slavery (Slate.com recently devoted its first Slate Academy to “The History of American Slavery”) forms a foundation on which contemporary anti-slavery activists can build.

The Polaris Project, one of the leading international organizations opposing human trafficking, seized on the success of 12 Years a Slave to publicize its own work, naming the film’s director, Steve McQueen, an awareness-raising “ambassador” and “highlighting the striking parallels” between Solomon Northup’s kidnapping and enslavement in the 1840s “and the experiences of sex and labor trafficking victims today.” The image at the top of this page, likewise, seeks to make continuities between historical and contemporary slavery visible amid their differences—by situating the famous chart of the slave ship Brookes, which galvanized British abolitionism when it was published in 1788, inside a commercial jetliner.

In fact, there are several ways of seeing continuities between slavery’s American history and its presence in the world now. Along one stream, the race-based chattel slavery of the antebellum U.S. persists in illegal forced labor around the world today, an altered form of rapacious capital-accumulation on the backs of vulnerable people. Another stream flows from historical slavery—in particular, its notorious instances of sexual exploitation of enslaved women—to modern sex trafficking. Still another passes through the “exception clause” in the Thirteenth Amendment, which abolished slavery “except as a punishment for crime whereof the party shall have been duly convicted.” Here, race matters once again, given the disproportionate incarceration of African American men, and the other streams re-converge with this one when U.S. prison labor serves corporate interests or when the prison system harbors sexual slavery.

It can seem, as David Gellman has written in this space, “jarring” to move from Louisiana plantations 175 years ago “to children prostituted in Thailand and brick kiln laborers in the thralls of debt bondage in India, or to migrant laborers in Florida and housekeepers in Washington, D.C.” But the potential payoff is significant. Making that move can tap into a vast public consensus that slavery is an intolerable injustice. Believing in a human right to freedom isn’t a partisan position.

Yet the fact remains that there is relatively little public awareness that enslavement still happens, or that the very bad things that do happen merit the odious name of “slavery.” (Are candidates talking about slavery in presidential debates? No.) The will to stop human trafficking and commercial exploitation is hampered less by ideological divisions than by ideas about the past—notions of a history that’s believed to be over.

In the U.S., emancipation is a central part of our cultural memory, our heritage. America may be a nation that was built on the backs of slaves, but it is also a nation that fought a great war to set those slaves free. It is well that it should be so remembered. Little progress toward justice in our own time will be accomplished if Americans do not remember that their nation endured terrible suffering to secure the liberty of four million African Americans. The challenge is to celebrate that without failing to perceive, acknowledge, and work to understand both slavery’s tragic legacies and the forms in which slavery endures.

"delete" by Mixy Lorenzo is licensed under CC BY-NC-SA 2.0 (via Flickr)

In 2000, roughly 415 million people used the Internet. By July 1, 2016, that number is estimated to have grown to nearly 3.425 billion, or about 46% of the world’s population. Moreover, there are now about 1.04 billion websites on the World Wide Web. Maybe one of those websites contains something you would rather keep out of public view, perhaps some evidence of a youthful indiscretion or an embarrassing social media post. Not only do you have to worry about friends and family finding out; now nearly half of the world’s population has near-instant access to it, if they know how to find it. Wouldn’t it be great if you could just get Google to take those links down?
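A quick check of that 46% figure, assuming a mid-2016 world population of roughly 7.4 billion (the commonly cited estimate):

\[
\frac{3.425\ \text{billion Internet users}}{7.4\ \text{billion people}} \approx 0.46 \approx 46\%
\]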

Precisely this question arose in a 2014 court case in the European Union. A man petitioned for the right to request that Google remove from its search results a link to an announcement of the forced sale of one of his properties, arising from old social security debts. Because the sale had concluded years before, he believed the information was no longer relevant and asked Google to remove the link. Google refused. Eventually, the court sided with the petitioner, ruling that search engines must consider requests from individuals to remove links to pages that turn up in a search on their name. The decision recognized for the first time the “right to be forgotten.”

This right, legally speaking, now exists in Europe. Morally speaking, however, the debate is far from over. Many worry that the right to be forgotten threatens a dearly cherished right to free speech. I think, however, that some accommodation of this right is justified by appeal to the protection of individual autonomy.

First, what are rights good for? Human rights matter because their enforcement helps protect the free exercise of agency—something that everyone values if they value anything at all. Alan Gewirth points out that the aim of all human rights is “that each person have rational autonomy in the sense of being a self-controlling, self-developing agent who can relate to other persons on a basis of mutual respect and cooperation.” Now, virtually every life goal we have requires the cooperation of others. We cannot build a successful career, start a family, or be good citizens without other people’s help. Since an exercise of agency that has no chance of success is, in effect, worthless, the effective enforcement of human rights entails ensuring that our opportunities to cooperate with others are not severely constrained.

Whether people want to cooperate depends on what they think of us. Do they think of us as trustworthy, for example? Here is where “the right to be forgotten” comes in. This right promotes personal control over access to personal information that may unfairly influence another person’s estimation of our worthiness for engaging in cooperative activities—say, in being hired for a job or qualifying for a mortgage.

No doubt, you might think, we have a responsibility to ignore irrelevant information about someone’s past when evaluating their worthiness for cooperation. “Forgive and forget” is, after all, a well-worn cliché. But do we need legal interventions? I think so. First, information on the internet is often decontextualized. We find disparate links reporting personal information in a piecemeal way. Rarely do we find sources that link these pieces of information together into a whole picture. Second, people do not generally behave as skeptical consumers of information. Consider the anchoring effect, a widely shared human tendency to attribute more relevance to the first piece of information we encounter than we objectively should. Combine these considerations with the fact that the internet has exponentially increased our access to personal information about others, and you have reason to suspect that we can no longer rely upon the moral integrity of others alone to disregard irrelevant personal information. We need legal protections.

This argument is not intended to be a conversation stopper, but rather an invitation to explore the moral and political questions that the implementation of such a right would raise. What standards should be used to determine if a request should be honored? Should search engines include explicit notices in their search results that a link has been removed, or should it appear as if the link never existed in the first place? Recognizing the right to be forgotten does not entail the rejection of the right to free speech, but it does entail that these rights need to be balanced in a thoughtful and context-sensitive way.

"Memphis Riots, 1866" by Alfred Rudolph Waud (Public Domain, via Tennessee State Library and Archives)

This is the second in a series on American History and the Ethics of Memory. This post originally appeared on September 15, 2015.

Warner Madison doesn’t trust the police. He thinks they view all black people with suspicion, harass them on the streets, and arrest them without cause. When police accost his children on their way to school, he can barely contain his anger. He fires off a letter protesting what he calls “one of the most obnoxious and foul and mean” things he has ever witnessed. But he possesses little hope that police treatment of African Americans in his city will change.

What does Warner Madison think of Trayvon Martin, Eric Garner, Michael Brown, Walter Scott, Freddie Gray, Samuel DuBose? There’s no telling. He’s been dead for probably more than a century. Warner Madison’s outrage about race and policing came to a head just after the Civil War, in 1865, when he was a 31-year-old barber living in Memphis, Tennessee.

Memphis isn’t among recent flashpoints—Ferguson, North Charleston, Baltimore—nor does it crop up among placenames associated with racial violence of decades past—South Central, Crown Heights, Watts. In collective memory of American history, Memphis figures as the scene of a lot of great music and the assassination of Martin Luther King. Yet the Memphis of Warner Madison’s time is an essential, though largely forgotten, part of understanding race and policing in America.

A recent New York Times article suggests Americans are living through “a purge moment” in their relationship to history. Especially since the June shooting at Emanuel AME Church in Charleston, icons of slavery and the Confederate States of America have been challenged and, in many cases, removed: the Confederate battle flag from the South Carolina statehouse grounds and, most recently, a statue of Jefferson Davis from the University of Texas at Austin’s Main Mall. At Yale University, students are now debating whether to rename Calhoun College, which honors a Yale alumnus who is best known as the antebellum South’s most ardent defender of slavery.

Skeptics are concerned about “whitewashing” the past or hiding important if controversial aspects of our history in “a moral skeleton closet.” (At the extreme end, one can find online commenters likening such reconsiderations to ISIS’s destruction of pre-Islamic antiquities in Syria.) But there’s little harm and often much good in collective soul-searching, as among the students at Yale, about how best to remember a community’s past and express its shared values—whatever decision that community may finally come to about its memorials. And, as I argued here previously, memory is a scarce resource. Figures like Jefferson Davis and John C. Calhoun shouldn’t be forgotten, but their public memorialization, in statues, street names, and the like, keeps them before our eyes to the exclusion of things we could be remembering instead—things we might find more useful for understanding the world around us.

Take Memphis in the 1860s. A southern city that fell to Union forces only about a year into the Civil War, in June 1862, it became a testing ground for the new order that would follow the destruction of slavery. As word spread across the countryside that the reputed forces of freedom had taken Memphis, African Americans flocked to the city. By the end of the war, Memphis’s black population had grown from 3,000 to 20,000, and surrounding plantations were proportionately emptied—to the dismay of cotton planters who needed laborers in their fields.

White authorities forcibly moved former slaves back onto the plantations. How? Using newly invented laws against “vagrancy.” Memphis blacks who could not prove gainful employment in the city were deemed vagrants, and vagrants were subject to arrest and impressment into the agricultural labor force.

Vagrancy had existed as a word and a phenomenon for centuries. In the post-Civil War South, it became a crime. Vagrancy laws were mainstays of southern states’ “black codes” during the late nineteenth century, because they helped white supremacists restore a social order that resembled slavery. Black men without jobs were guilty of the crime of vagrancy simply by going outside and walking down the street. Once arrested and imprisoned, they could be put on chain gangs—and white southerners could once again exploit their unpaid labor.

It’s not surprising that former Confederates were responsible for criminalizing black unemployment. But so was the Freedmen’s Bureau—the federal agency expressly charged by Congress and Abraham Lincoln with assisting former slaves in their transition to freedom. The bureau’s Memphis superintendent wrote in 1865 that the city had a “surplus population of at least six thousand colored persons [who] are lazy, worthless vagrants” and authorized patrols that were arresting black Memphians indiscriminately.

When some of the leading black citizens of Memphis had seen enough of this—men taken away to plantations at the points of bayonets, children stopped on their way to school and challenged to prove they weren’t “vagrants”—they called on the most literate people among them, including Warner Madison, to compose petitions to high-ranking federal officials. Paramount among their grievances was the harassment of “Children going to School With there arms full of Book[s].” To the African American community, freedom meant access to education. But to whites—even the very officials responsible for protecting black rights—freedom meant that African Americans needed to get to work or be policed.

Clinton Fisk, a Union general during the war, now oversaw Freedmen’s Bureau operations in all of Tennessee and Kentucky. He replied politely to the letters he received, but he never credited or even mentioned the reports of abuse Warner Madison and his compatriots provided. He asked his subordinate in Memphis to investigate, and the unbothered report came back: “I can find no evidence whatever that School children, with Books in their hands have been arrested, except in two or three cases.” Even Clinton Fisk—an abolitionist who so strongly advocated African American education that Fisk University bears his name—failed to affirm that having a book kept a person from being vagrant.

This period of conflict culminated in the Memphis riots of 1866—an episode that ought to be infamous (and is the subject of a few good books) but is generally absent from public consciousness. White Memphians initially assumed the “riots” were a black protest that turned violent, but what actually occurred in Memphis on the first days of May in 1866 was a massacre of black men, women, and children by white mobs, among whom were many police officers.

In failing to remember 1860s Memphis—failing even to know the name of someone like Warner Madison, after whom no highways or elementary schools are named—we fail to remember that the federal government once made it the special province of law enforcement agents to accost African Americans in public places. Without remembering that, we cannot apprehend the complexity and durability of the problems underlying current events.

What we now call “racial profiling,” and even the appallingly frequent uses of lethal force against black citizens, may result less from the implicit bias of police officers than from a historical legacy. Abiding modes of law enforcement and criminal justice, brought to us by nineteenth-century white Americans’ anxieties about the abolition of slavery, were designed to treat black people walking freely on city streets—unless they were being economically productive in ways white people approved—as social threats.

Racism may be only a partial explanation. Some of the people who were arresting “vagrants” in Memphis were African American—they were soldiers in the U.S. Army acting under Freedmen’s Bureau orders—and so are three of the six Baltimore police officers charged with the death of Freddie Gray. Blame may rest, too, with habits of mind upon which few people even frown—like taking gainful employment as a measure of human worth (a pernicious corollary of the belief that markets possess wisdom), or presuming that someone must be up to no good if he has (in Chuck Berry’s words) no particular place to go.