"Memphis Riots, 1866" by Alfred Rudolph Waud (Public Domain, via Tennessee State Library and Archives)

This is the second in a series on American History and the Ethics of Memory. This post originally appeared on September 15, 2015.

Warner Madison doesn’t trust the police. He thinks they view all black people with suspicion, harass them on the streets, and arrest them without cause. When police accost his children on their way to school, he can barely contain his anger. He fires off a letter protesting what he calls “one of the most obnoxious and foul and mean” things he has ever witnessed. But he possesses little hope that police treatment of African Americans in his city will change.

What does Warner Madison think of Trayvon Martin, Eric Garner, Michael Brown, Walter Scott, Freddie Gray, Samuel DuBose? There’s no telling. He’s been dead for probably more than a century. Warner Madison’s outrage about race and policing came to a head just after the Civil War, in 1865, when he was a 31-year-old barber living in Memphis, Tennessee.

Memphis isn’t among recent flashpoints—Ferguson, North Charleston, Baltimore—nor does it crop up among placenames associated with racial violence of decades past—South Central, Crown Heights, Watts. In collective memory of American history, Memphis figures as the scene of a lot of great music and the assassination of Martin Luther King. Yet the Memphis of Warner Madison’s time is an essential, though largely forgotten, part of understanding race and policing in America.

A recent New York Times article suggests Americans are living through “a purge moment” in their relationship to history. Especially since the June shooting at Emanuel AME Church in Charleston, icons of slavery and the Confederate States of America have been challenged and, in many cases, removed: the Confederate battle flag from the South Carolina statehouse grounds and, most recently, a statue of Jefferson Davis from the University of Texas at Austin’s Main Mall. At Yale University, students are now debating whether to rename Calhoun College, which honors a Yale alumnus who is best known as the antebellum South’s most ardent defender of slavery.

Skeptics are concerned about “whitewashing” the past or hiding important if controversial aspects of our history in “a moral skeleton closet.” (At the extreme end, one can find online commenters likening such reconsiderations to ISIS’s destruction of pre-Islamic antiquities in Syria.) But there’s little harm and often much good in collective soul-searching, as among the students at Yale, about how best to remember a community’s past and express its shared values—whatever decision that community may finally come to about its memorials. And, as I argued here previously, memory is a scarce resource. Figures like Jefferson Davis and John C. Calhoun shouldn’t be forgotten, but their public memorialization, in statues, street names, and the like, keeps them before our eyes to the exclusion of things we could be remembering instead—things we might find more useful for understanding the world around us.

Take Memphis in the 1860s. A southern city that fell to Union forces only about a year into the Civil War, in June 1862, it became a testing ground for the new order that would follow the destruction of slavery. As word spread across the countryside that the reputed forces of freedom had taken Memphis, African Americans flocked to the city. By the end of the war, Memphis’s black population had grown from 3,000 to 20,000, and surrounding plantations were proportionately emptied—to the dismay of cotton planters who needed laborers in their fields.

White authorities forcibly moved former slaves back onto the plantations. How? Using newly invented laws against “vagrancy.” Memphis blacks who could not prove gainful employment in the city were deemed vagrants, and vagrants were subject to arrest and impressment into the agricultural labor force.

Vagrancy had existed as a word and a phenomenon for centuries. In the post-Civil War South, it became a crime. Vagrancy laws were mainstays of southern states’ “black codes” during the late nineteenth century, because they helped white supremacists restore a social order that resembled slavery. Black men without jobs were guilty of the crime of vagrancy simply by going outside and walking down the street. Once arrested and imprisoned, they could be put on chain gangs—and white southerners could once again exploit their unpaid labor.

It’s not surprising that former Confederates were responsible for criminalizing black unemployment. But so was the Freedmen’s Bureau—the federal agency expressly charged by Congress and Abraham Lincoln with assisting former slaves in their transition to freedom. The bureau’s Memphis superintendent wrote in 1865 that the city had a “surplus population of at least six thousand colored persons [who] are lazy, worthless vagrants” and authorized patrols that were arresting black Memphians indiscriminately.

When some of the leading black citizens of Memphis had seen enough of this—men taken away to plantations at the points of bayonets, children stopped on their way to school and challenged to prove they weren’t “vagrants”—they called on the most literate people among them, including Warner Madison, to compose petitions to high-ranking federal officials. Paramount among their grievances was the harassment of “Children going to School With there arms full of Book[s].” To the African American community, freedom meant access to education. But to whites—even the very officials responsible for protecting black rights—freedom meant that African Americans needed to get to work or be policed.

Clinton Fisk, a Union general during the war, now oversaw Freedmen’s Bureau operations in all of Tennessee and Kentucky. He replied politely to the letters he received, but he never credited or even mentioned the reports of abuse Warner Madison and his compatriots provided. He asked his subordinate in Memphis to investigate, and the unbothered report came back: “I can find no evidence whatever that School children, with Books in their hands have been arrested, except in two or three cases.” Even Clinton Fisk—an abolitionist who so strongly advocated African American education that Fisk University bears his name—failed to affirm that having a book kept a person from being vagrant.

This period of conflict culminated in the Memphis riots of 1866—an episode that ought to be infamous (and is the subject of a few good books) but is generally absent from public consciousness. White Memphians initially assumed the “riots” were a black protest that turned violent, but what actually occurred in Memphis on the first days of May in 1866 was a massacre of black men, women, and children by white mobs, among whom were many police officers.

In failing to remember 1860s Memphis—failing even to know the name of someone like Warner Madison, after whom no highways or elementary schools are named—we fail to remember that the federal government once made it the special province of law enforcement agents to accost African Americans in public places. Without remembering that, we cannot apprehend the complexity and durability of the problems underlying current events.

What we now call “racial profiling,” and even the appallingly frequent uses of lethal force against black citizens, may result less from the implicit bias of police officers than from a historical legacy. Abiding modes of law enforcement and criminal justice, brought to us by nineteenth-century white Americans’ anxieties about the abolition of slavery, were designed to treat black people walking freely on city streets—unless they were being economically productive in ways white people approved—as social threats.

Racism may be only a partial explanation. Some of the people who were arresting “vagrants” in Memphis were African American—they were soldiers in the U.S. Army acting under Freedmen’s Bureau orders—and so are three of the six Baltimore police officers charged with the death of Freddie Gray. Blame may rest, too, with habits of mind upon which few people even frown—like taking gainful employment as a measure of human worth (a pernicious corollary of the belief that markets possess wisdom), or presuming that someone must be up to no good if he has (in Chuck Berry’s words) no particular place to go.

Clintonville Plaque by Robert Walker (via Wikimedia Commons, CC A 3.0)

This post originally appeared August 11, 2015.

When I was in graduate school, I walked almost every day past a rock bearing a plaque that read “On this site in 1897 nothing happened.” You may have seen it yourself. The “nothing happened” plaque has become a minor meme in embossed iron. A Flickr photo pool features several dozen pictures of it in various locations around the English-speaking world. You can even buy one for yourself on Amazon.

Many people seem to think it’s a good joke. If you stop and think about it, it’s actually two jokes.

On one hand, it’s funny because it’s a reductio ad absurdum. We live in a culture of rampant commemoration—historical markers, day- and month-long observances, commemorative street names, “On this Day in History” features. The National Register of Historic Places includes more than 80,000 sites. The “nothing happened” plaque is a joke that says we tend to over-remember—that there is literally nothing we won’t devote a plaque to.

At the same time, it can be understood as a joke that says we tend to under-remember. As the writer of the Book of Ecclesiastes already knew, millennia ago, “What has been will be again, what has been done will be done again; there is nothing new under the sun.” The world is very old: everything has already happened, everywhere, in every year of recorded history (even in poor derided 1897), and a site where truly nothing happened is indeed notable enough to warrant a plaque. Every day we tread ground where past generations lived lives from which we doubtless could learn a great deal, if only we had a way to listen to them. But without concerted efforts to remember what preceded us, we remain mostly oblivious to the archaeology of human experience.

These competing meanings of the joke present a quandary: when are we remembering too much, and when are we remembering too little? Or, more precisely: given that we can’t possibly remember everything all the time, how should we choose, and how should we remember, the finite elements of the past for which our brains (and the surfaces on which we can install plaques) have room? Memory, personal and collective, has some of the same features as other scarcity-of-resources problems, including challenges in determining the most ethical distribution of those resources.

More than a decade after the terrorist attacks of September 11, you can walk into a truck stop and find a t-shirt or bumper sticker bearing pictures of the World Trade Center and the words “Never Forget.” Most people would agree that it ought never be forgotten—and the shirts and stickers are a testament to their own effectiveness in shaping public memory. But what are we doing, and what should we do, with that memory? Few people would find any fault with its heightening our admiration for first responders or making us appreciative of the peace and security most of us enjoy. But how is memory of 9/11 shaping public ideas about the U.S.’s relationship to the Islamic world? Should it be accompanied by greater remembrance of the Iraq and Afghanistan wars, about which much (civilian death tolls in the region, long-term tolls on veterans’ lives) still is not widely known?

The country’s deeper past shapes the present, too, of course. To Dylann Roof, the Confederate flag probably represented a lost past of white supremacy. Politicians who now say the flag “belongs in a museum” seem to be implying that it should be remembered differently (although no politician ventures very specific recommendations about how). Or maybe they’re suggesting it should be all but forgotten—it depends on whether one sees museums as living institutions or as society’s attic. While there may be emerging widespread agreement that the Confederate flag should not be valorized, it is much less clear exactly what we should do instead. Are there ways we could remember the histories of slavery and the Civil War that might better contribute to racial justice in our own time?

In a series of columns over the coming year, while I’m in residence at the Prindle Institute, I’ll explore questions like these that arise from the ways contemporary culture seems to remember—or not remember—various aspects of American history.

Image created from a photograph by Conner Gordon

Untangling Gun Violence from Mental Illness (Atlantic)
by Julie Beck
“Unfortunately, a consistent and dangerous narrative has emerged—an explanation all-too-readily at hand when a mass shooting or other violent tragedy occurs: The perpetrator must have been mentally ill. ‘We have a strong responsibility as researchers who study mental illness to try to debunk that myth,’ says Jeffrey Swanson, a professor of psychiatry at Duke University. ‘I say as loudly and as strongly and as frequently as I can, that mental illness is not a very big part of the problem of gun violence in the United States.'”

This New Neighborhood Will Grow Its Own Food, Power Itself, And Handle Its Own Waste (CoExist)
by Adele Peters
“In Almere, the village is likely to grow about half of the food that the community eats—it won’t grow coffee or bananas, for example. It will also feed energy back to the local grid. But in some locations, the company believes that the neighborhood could be fully self-sufficient.”

When ‘Diversity’ and ‘Inclusion’ Are Tenure Requirements (Atlantic)
by Conor Friedersdorf
“Last November, student activists at Pomona College, a selective liberal arts school in Southern California, demanded a change in the way that professors are evaluated. Alleging ‘unsafe academic environments,’ they wanted future candidates for promotion or tenure to be judged in part on ‘a faculty member’s support of a diverse student body.’ College President David Oxtoby dubbed it ‘an idea with merit.’ And a semester later, faculty were set to formally vote on the matter.”

Black students in US nearly four times as likely to be suspended as white students (Guardian)
by Ryan Felton
“As early as preschool, black children are 3.6 times as likely to receive one or more suspensions as white children. According to the data, black girls represent 20% of female preschool enrollment, but account for 54% of preschool children suspensions. Black students were also twice as likely to be expelled as white students.”

"Poor People's March at Lafayette Park and on Connecticut Avenue" by Warren K. Leffler, June 1968 (via Library of Congress, LC-U9- 19271-33A)

This post originally appeared June 16, 2015.

My last post discussed the bifurcated incentivization structure of capitalism: owners profit while workers become disempowered by working harder. In this post, I want to address a myth that accompanies the myth that capitalism compensates you better for working harder: the myth that collective ownership divests individuals of the motivation to work.

People say that the problem with collective ownership, as a source of incentive to work, is that no one takes responsibility: if you don’t own it, you won’t care to maintain it. But the incentive in capitalism isn’t that you work on a thing because you own it; you work because otherwise you will starve. The ideology here is that we are working on our own thing and that we are more invested because it is ours. That is true in capitalism for the self-employed and small business owners–the middle class–but the middle class has shrunk considerably. A 2011 Pew Charitable Trusts study shows that a third of those raised in the middle class (earning between 30 and 70% of their state’s average income) fall out of it in adulthood. A recent article in The Washington Post on the cost of college shows that it isn’t college costs that have risen but the purchasing power of the middle class that has shrunk.

The second myth–that collective ownership divests individuals of motivation to work–follows from the failure to think the collective as such. Instead of having the value of your work product go into the pocket of the owner, suppose that you own the means of production collectively with all the other workers — that is, suppose you live in communism. Ending the division between the worker and the owner, and thus the cross-purposes between them, would dissolve the opposition between the work and the benefit of the work. It would incentivize you to work, not because otherwise you would die, but because you would in fact reap the rewards of your work; the “you” here, though, is a collective you, not an individual worker. Capitalism lives and dies on getting the worker to see herself more as an individual than as part of a collective. If the collective is something we have to struggle to conceive in order to recognize, so is the individual; capitalist ideology has simply been better at the task, by limiting the possibilities for subjectivization to that of the individual, as Jodi Dean argues in Political Theory and as I discuss here.

There are two ways we think collectivity fails to incentivize: one is a fear of getting more than you deserve, and the other is a concern that what is held in common won’t be taken care of. In one of the earliest defenses of private property, Aristotle argues in Politics II.5 that if citizens communally work the land and communally enjoy the profits from it, there will be resentment when citizens who do less take more (Pol. 1263a9-14), and further that people tend to care for what is their own. Note, though, that even in that situation, “friends share everything in common” (Pol. 1263a29).

Both of these concerns–getting more than you deserve and not taking care of the common–seem to be about desire and motivation. And they assume that the desires and motivations of individuals are at odds with those of the community. How do we make people work? How do we make people care? When people say capitalism incentivizes, they are saying that the way to make people work and care is to threaten them with death. So much for right replacing might.

We think the individual’s desires and motivations are opposed to the collective’s desires and motivations–indeed, we deny that there are any collective desires and motivations at all–because we have already interpellated the subject as an individual; we then deny that the subject became an individual through this process and suppose instead that individuality is natural and given. Then we foreclose the possibilities for interpellating the subject as the collective (most of pop culture is an engine for this foreclosure). This foreclosure justifies the anxiety that the individual will try to take advantage of the community, finding her ends at odds with its ends.

My point here is not to contribute to the imaginative work of conceiving the collective, but to argue that capitalism produces the very conditions (the individual as the only conceivable way of thinking of the subject, the individual’s desires at odds with the community’s) to which it then presents itself as the only solution.

Image created from a photograph by Conner Gordon

The Donald Trump dove myth: why he’s actually a bigger hawk than Hillary Clinton (Vox)
by Zack Beauchamp
“Trump isn’t a leftist, nor is he a pacifist. In fact, Trump is an ardent militarist, who has been proposing actual colonial wars of conquest for years. It’s a kind of nationalist hawkishness that we haven’t seen much of in the United States since the Cold War — but has supported some of the most aggressive uses of force in American history. As surprising as it may seem, Clinton is actually the dove in this race.”

Welfare Utopia (Atlantic)
by Alana Semuels
“That Oregon still maintains a safety net while other states have eradicated theirs is testament to the state’s progressivism. But the example of Oregon also highlights a troubling aspect of federal policy that turns social programs over to the states. Now that states have so much discretion, a few miles can make a big difference in how a poor person is helped by the government. Across the border, in Idaho, poor people are not as lucky.”

President Obama’s Overtime Pay Plan Threatens the ‘Prada’ Economy (New York Times)
by Noam Scheiber
“For decades, bosses at publishing houses, glossy magazines, consulting firms, advocacy groups, movie production companies and talent agencies have groomed their assistants to be the next generation of big shots by working them long hours for low wages.”

Finished reading? Check out this video from The Atlantic: Why Virginia’s Restoration of Voting Rights Matters

Colt's armory complex--east armory workers, 1909 (via Flickr, CC BY 2.0)

This post originally appeared June 9, 2015.

In my youth my parents would defend capitalism by saying that it incentivized work in contrast to communism. If you thought you could get paid the same amount whether you worked hard or not, you would see no reason to work hard or better. It isn’t just my parents. A recent This American Life podcast, “Same Bed, Different Dreams (transcript),” includes a recording smuggled out of North Korea in the early 1980s of Kim Jong-Il saying that North Korean filmmakers have no incentive to make creative and interesting work because of communism. How did everyone from one of the last communist dictators to my parents come to believe that capitalism incentivizes hard work, creative and inventive work, while communism does not?

It turns out that the men who defend and articulate the systems under dispute here–Adam Smith and Karl Marx–agree on quite a bit when it comes to what incentivizes workers. Workers work in order to eat. Smith and Marx agree that capitalists try to pay workers as close as possible to whatever it costs to reproduce labor: “A man must always live by his work, and his wages must at least be sufficient to maintain him” (Smith, The Wealth of Nations, 61). Reproducing labor can be understood in two ways: getting it to come back again the next day, or adding more labor to the force. In the first sense, to reproduce labor is to renew the body of the worker for more work. In the second sense, to reproduce labor is to produce more bodies of workers through biological reproduction. Both senses are biological: one is the renewal of energy in the same body, and the other is procreative.

Smith explains the reproduction of labor in connection to the law of wages:

If this demand [for labor] is continually increasing, the reward of labour must necessarily encourage in such a manner the marriage and multiplication of labourers, as may enable them to supply that continually increasing demand by a continually increasing population. If the reward should at any time be less than what was requisite for this purpose, the deficiency of hands would soon raise it; and if it should at any time be more, their excessive multiplication would soon lower it to this necessary rate. The market would be so much understocked with labour in the one case, and so much overstocked in the other, as would soon force back its price to that proper rate which the circumstances of the society required. It is in this manner that the demand for men, like that for any other commodity, necessarily regulates the production of men, quickens it when it goes on too slowly, and stops it when it advances too fast (71).

You need workers, so you have to attract them through wages. You pay them. They produce. You pay them. But this doesn’t seem to produce wealth. To produce wealth you do two things: you invest your capital in machines and you divide the tasks of labor. So then you need more workers. You produce more. The more you divide labor, the fewer skills your workers need, so the less you need to pay them. But demand cannot keep up with this cycle. Demand levels off, and now you don’t need those workers. What happened while you needed more workers is that the workers responded to the labor market’s demand for more workers by having more workers, i.e., children. Now they have more workers to satisfy the market, but the market is glutted. Demand levels off. You can pay your workers a whole lot less, because you don’t need to pay as much to reproduce the worker: if that worker can’t sustain herself to come back the next day, someone else who wasn’t working can step in to fill her place.

In a capitalist society, the incentive to work is to live. The capitalist has an interest in keeping the worker as close to the edge of living as possible, so that she will work for a lower wage. Even Smith warns against this outcome of his own system when he says that “Where wages are high, accordingly, we shall always find the workmen more active, diligent, and expeditious, than where they are low” (72).

The capitalist’s incentive is productivity, and the capitalist might realize that she can pay workers more and find more productivity. But she can also realize that she can pay them less than they are worth (less than the value they produce by adding labor to the raw materials on which they work) and still recoup a profit. The reason the capitalist can pay the worker less than she is worth is that the worker must work in order to eat. There is very little bargaining when one would have to walk away to protest, and to walk away is to die. Part of the problem with supposing that capitalism incentivizes workers to work harder is that it assumes that workers recoup the benefits of their increased productivity. But if you work a wage job in capitalism, the more productive you are, the more precarious your situation becomes and the smaller the percentage of the value you produce that you are paid.

Consider an example. The capitalist pays you an hourly wage. Say you work eight hours and are paid ten dollars an hour, for 80 bucks a day. If you make a widget that will be sold for $12 and it costs the capitalist $2 in materials, you add $10 in value per widget. Let’s say you produce 8 in a day. Then the capitalist is paying you for as much as you have produced; you are paid as much as the value you produce. The capitalist examines this situation and wonders whether you can live on less, so that she can reduce the wage and make more profit, or whether she can make you more productive by dividing labor or giving you some tools. So the next day you make 16 in a day. You are really productive. But you went from getting 100% of the value you added to 50% of the value you added. Now really, what’s your incentive to work harder? The harder you work now, the more it costs to reproduce your labor, but you aren’t producing more for yourself; you’re producing more for the owner. Perhaps you think this makes you more indispensable, so that you are more secure in your work? But no: the more the labor is divided into smaller parts and the more tools are added to make you more productive, the easier it is to replace you, and so the more willing you are to work for less.
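To make the arithmetic of that example explicit, here is a minimal sketch in Python. The numbers are the hypothetical ones from the example above ($10 an hour for eight hours, widgets sold for $12 with $2 in materials), and the function name is mine, introduced only for illustration.

# A minimal sketch of the hypothetical widget arithmetic above:
# an 8-hour day at $10/hour, widgets sold for $12 with $2 in materials,
# so each widget adds $10 in value.

def workers_share(widgets_per_day, hours=8, hourly_wage=10.0,
                  sale_price=12.0, materials_cost=2.0):
    """Return the daily wage, the value the worker adds, and her share of it."""
    daily_wage = hours * hourly_wage                            # $80 per day
    value_added = widgets_per_day * (sale_price - materials_cost)
    return daily_wage, value_added, daily_wage / value_added

for output in (8, 16):
    wage, value, share = workers_share(output)
    print(f"{output} widgets/day: wage ${wage:.0f}, "
          f"value added ${value:.0f}, worker keeps {share:.0%}")

# Prints:
# 8 widgets/day: wage $80, value added $80, worker keeps 100%
# 16 widgets/day: wage $80, value added $160, worker keeps 50%

Doubling output leaves the wage fixed at $80 while the value added doubles to $160, which is exactly the drop from 100% to 50% described above.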

So when people say that capitalism provides an incentive where communism does not, they are saying that capitalism does a better job of keeping people in the precarious position where they must work for whatever little they can get in order to live. The harder workers work, the less of their own value they recoup. So what does capitalism incentivize? The problem with saying it incentivizes people to work harder and be more creative is that it assumes that everyone in capitalism–owners and workers alike–has the same incentives. But this is not the case. Smith argues that everyone in capitalism pursues her self-interest, but the interest of the capitalist is to make more money through creative uses of machinery and labor, reading the market well, and paying labor less, while the interest of the worker is to be paid more so that she can stop worrying about the day-to-day concern of staying alive. The better the capitalist organizes work, the more she recoups. The harder the worker works, the less she recoups.

Dan Horn, a columnist at The Cincinnati Enquirer, interviewed a middle-class citizen, Donna Palmatary, about the lifestyle and income of the middle class for a USA Today article on the 2015 Pew Charitable Trusts study on the shrinking middle class. Palmatary aptly distinguishes the middle class from both owners and workers when she says, “You have to work for what you have.” Owners don’t work for what they have, and workers don’t have what they work for.

Image created from a photograph by Conner Gordon

Machine Bias (ProPublica)
by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner
“Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden — who is black — was rated a high risk. Prater — who is white — was rated a low risk.”

Scientists say there’s such a thing as “ethical amnesia” and it’s probably happened to you (Quartz)
by Katherine Ellen Foley
“A study published (paywall) today (May 16) in the Proceedings of the National Academy of Sciences indicates that when we act unethically, we’re more likely to remember these actions less clearly. Researchers from Northwestern University and Harvard University coined the term “unethical amnesia” to describe this phenomenon, which they believe stems from the fact that memories of ourselves acting in ways we shouldn’t are uncomfortable.”

The Federal Government Quietly Expands Transgender Rights (Atlantic)
by Emma Green
“Something big just quietly happened to the Affordable Care Act. More than a half decade after the law’s passage, the U.S. Department of Health and Human Services has explicitly ruled that hospitals, clinics, and other health-care providers can’t discriminate against patients on the basis of gender identity.”

It’s Gotten A Lot Harder To Act Like Whiteness Doesn’t Shape Our Politics (NPR, Codeswitch)
by Gene Demby
“Whiteness has always been a central dynamic of American cultural and political life, though we don’t tend to talk about it as such. But this election cycle is making it much harder to avoid discussions of white racial grievance and identity politics when, for instance, Donald Trump’s only viable pathway to the White House is to essentially win all of the white dudes.”

"345/365" by martinak15 is licensed under CC BY 2.0 (via Flickr)

This post originally appeared October 13, 2015.

A father hands over the keys to his house to a stranger, his children fast asleep upstairs. Two grandparents share their living room with a traveling salesman in town for the week. A young woman falls asleep in the guest room of a man she has never met before that night. While such scenarios may sound like the beginning of a horror film, it is now a fact that millions of individuals in more than 190 countries rely on online services to rent lodgings, most often in private homes. The broader sharing economy encompasses, among other things, the sharing of workspace, the sharing of automobiles, and even the sharing of home-cooked meals. In some cases what is shared is publicly owned, such as in most bicycle sharing schemes. But typically one party owns what is shared in an exchange between strangers.

All this cooperative activity between strangers is taking place in an age when parents feel the need to install “nanny-cams” in their children’s rooms, companies monitor their employees’ web surfing, and a proliferation of online services allows anyone to order a background check on anyone else. Do these apparently divergent cultural trends point to a more fundamental polarization of values? Or do they simply represent differential responses to varying social circumstances?

To the skeptic, the trustful enthusiasm of the sharing economy is a symptom of naïveté. The only cure is a painful experience with cynical breach of faith. Recent cases like the alleged sexual assault of an airbnb guest are the canaries in the coal mine. To the optimist, these sensational cases are remarkable precisely because of their rarity. What the amazingly rapid growth of the sharing economy teaches us is that human beings, in aggregate, are much more trustworthy than previously imagined.

I think that both the skeptic and the optimist have got it wrong. On the one hand, it’s silly to think that involvement in the sharing economy confers upon its participants the esteemed moral character trait of trustworthiness. On the other hand, the trusting attitudes manifested in many corners of the sharing economy are both rational and prudent, under the right conditions.

Borrowing a term from Princeton philosopher Philip Pettit, I will refer to these as the conditions for trust-responsiveness. In a paper delightfully entitled “The Cunning of Trust”, Pettit makes the case that you can have reason to trust others even if you have no antecedent knowledge about their reliability. This is because you can make them responsive to your trust simply by communicating that you trust them. This might seem like pulling a rabbit out of a hat. But on reflection, the dynamic is not unfamiliar.

Pettit’s analysis rests on a relatively uncontroversial psychological claim: human beings care very much about their standing in the eyes of others, and are often moved by love of regard. In his Theory of Moral Sentiments, Adam Smith, the father of modern economics and a keen psychologist, went so far as to say that Nature has endowed Man “with an original desire to please, and an original aversion to offend his brethren. She taught him to feel pleasure in their favorable, and pain in their unfavorable regard. She rendered their approbation most flattering and most agreeable to him for its own sake; and their disapprobation most mortifying and offensive.”

What other people think of us matters to us a great deal. In particular, a reputation for trustworthiness has both intrinsic and instrumental value. Pettit notices that when people decide to rely on us, they signal to others that they regard us as trustworthy. We are motivated to be responsive to their trust because we want that signal to be broadcast. As Pettit puts it, “The act of trust will communicate in the most credible currency available to human beings – in the gold currency of action, not the paper money of words – that the trustor believes the trustee to be truly trustworthy, or is prepared to act on the presumption that he is.”

When a person is widely perceived to be trusted, he or she gains a highly valuable status. When we manifest trusting reliance, we give the person we rely on an incentive to do the very thing we rely on them to do, because they want to cultivate and maintain that status. This is why the trust of strangers can be a rational gamble. It is never a sure bet, but it is a good bet more often than one might imagine. And this is why the skeptic is wrong. The dramatic growth of the sharing economy is predicated on fundamental facts about human psychology.

But the optimist gets something wrong as well. There is no necessary connection between ubiquitous sharing and the dawn of a new age of trustworthiness. Trust-responsiveness and trustworthiness are altogether different animals. A trustworthy person will do what he or she is trusted to do regardless of whether anyone else is watching. This is why we hold trustworthy people in esteem and think that trustworthiness is a morally desirable trait of character. In contrast, trust-responsiveness is predicated on a desire for good opinion and is therefore, at best, morally neutral. Moreover, trust-responsiveness will only survive under certain institutional conditions.

It’s worth noting that these conditions exist par excellence in many corners of the sharing economy. The oxygen in which this economy exists is the collection and dissemination of reviews. On airbnb, for example, hosts who meet certain criteria of responsiveness, commitment, and five-star reviews are granted the coveted status of “superhost,” which is signified by a red and yellow badge of approval on their profile. This status may increase demand for booking, thereby providing a financial incentive to hosts looking to juice their profits. It also works because it flatters people who self-identify as open, warm, and hospitable.

But we shouldn’t be too cynical about all this. Aristotle noticed that moral virtue could be acquired by cultivating good habits. It may be that exercise of the dispositions of trust-responsiveness can help cultivate the morally desirable trait of genuine trustworthiness. Maybe. I think the jury is still out on that one.

Our judgments about whether to expose ourselves to the hazard of trust are influenced both by our beliefs and by arational factors. Sometimes we just have a bad feeling about someone – we don’t like the cut of their jib. These kinds of knee-jerk responses can be wiser than our reflective selves, which are prone to rationalization. But just as often our “intuitive” judgments reflect unexamined biases. A 2014 Harvard Business School study found that “non-black hosts are able to charge approximately 12% more than black hosts, holding location, rental characteristics, and quality constant. Moreover, black hosts receive a larger price penalty for having a poor location score relative to non-black hosts.” Clearly we have a long way to go in learning how to trust well and without prejudice.

My own family rents out a garden-level apartment in our house on airbnb. We’ve met many interesting people, including one guest who eventually became a co-author of mine. And the money we earn helps to pay a hefty daycare bill. When we tell our friends and family that we have lent our keys to scores of people, they sometimes respond with horror and disbelief. And to be honest, in some frames of mind, we feel pretty nervous ourselves. But overall I think we are making a rational bet, and not one that presupposes a Pollyannaish faith in humanity. Of course, a truly malicious person can always rack up sterling reviews with the express purpose of lowering his victim’s defenses. But this kind of evil, like moral virtue, is rare.

My other work on trust and promises can be found here.

"Voting United States" by Tom Arthur is licensed under CC BY-SA 2.0 via Wikimedia

Donald Trump. Not a day goes by when I don’t hear that name. It is constantly on the news and it is what everybody is talking about. So much so, it is almost inescapable. This man has killed it. Since the start of his campaign he has managed to capture the attention of the media, the nation and the world by saying whatever he wants, especially if it causes controversy. This tactic—whether purposeful or a mere reflection of his values and beliefs—has worked: Donald Trump is now the de facto Republican nominee. So hats off to you, Mr. Trump: you have shown us how anger (against “Washington” politicians) and fear (of economic instability, foreigners, etc.) can be preyed on to mobilize a winning campaign. In the meantime, the Republican Party is struggling and making a concerted effort to unite the party behind its champion. This might prove to be a challenge because Trump has essentially vilified everyone: not only his former opponents for the Republican nomination (and, in one case, an opponent’s wife) but entire nationalities, ethnicities and religions.

"Some of us read. But most of us don't." by Ed Yourdon is licensed under CC BY-NC-SA 2.0 (via Flickr)

The economy continues to struggle, the educational system underperforms and tensions exist at just about every point on the international landscape. And there is a national presidential selection process underway. It seems, in such an environment, that citizens would feel compelled to get themselves fully up to date on news that matters. It also would stand to reason that the nation’s news media would feel an obligation to focus on news of substance.

Instead, too many citizens are woefully uninformed of the day’s significant events. A pandering media, primarily television, is content to post a lowest-common-denominator news agenda, featuring Beyoncé’s “Lemonade” release and extensive tributes to Prince.

Constitutional framer James Madison once famously wrote, “Knowledge will forever govern ignorance. And a people who mean to be their own governors must arm themselves with the power which knowledge gives.” Citizens who are unable or unwilling to arm themselves with civic knowledge diminish the nation’s ability to self-govern.

Technological advances have made it easier than ever for citizens to stay informed. The days of waiting for the evening television news to come on or the newspaper to get tossed on your doorstep are long gone. News is available constantly and from multiple sources.

A growing number of citizens, particularly millennials, now rely on social media for “news.” While that might seem like a convenient and timely way to stay informed, those people aren’t necessarily aware of anything more than what their friends had for lunch. Data from the Pew Research Center indicates that about two-thirds of Twitter and Facebook users say they get news from those social media sites. The two “news” categories of most interest among social media consumers, however, are sports and entertainment updates.

Sadly, only about a third of social media users follow an actual news organization or recognized journalist. Thus, the information these people get is likely to be only what friends have posted. Pew further reports that during this election season, only 18 percent of social media users have posted election information on a site. So, less than a fifth of the social media population is helping to determine the political agenda for the other 80 percent.

The lack of news literacy is consistent with an overall lack of civic literacy in our culture. A Newseum Institute survey last year found that a third of Americans failed to name a single right guaranteed in the First Amendment. Forty-three percent could not name freedom of speech as one of those rights.

A study released earlier this year by the American Council of Trustees and Alumni had more frightening results. In a national survey of college graduates, with a multiple-choice format, just 28 percent of respondents could name James Madison as father of the Constitution. That’s barely better than random chance out of four choices on the survey. Almost half didn’t know the term lengths for U.S. senators and representatives. And almost 10 percent identified Judith Sheindlin (Judge Judy) as being on the Supreme Court.

The blame for an under-informed citizenry can be shared widely. The curriculum creep into trendy subjects has infected too many high schools and colleges, diminishing the study of public affairs, civics, history and news literacy.

The television news industry has softened its news agenda to the point where serious news consumers find little substance. Television’s coverage of this presidential election cycle could prompt even the most determined news hounds to tune out. The Media Research Center tracked how the big three broadcast networks covered the Trump campaign in the early evening newscasts of March. The coverage overwhelmingly focused on protests at Trump campaign events, assault charges against a Trump campaign staffer and Trump’s attacks on Heidi Cruz. Missing from the coverage were Trump’s economic plans, national security vision or anything else with a policy dimension.

When the Constitutional Convention wrapped up in 1787, Benjamin Franklin emerged from the closed-door proceedings and was asked what kind of government had been formed. He replied, “A republic, if you can keep it.” Those citizens who, for whatever reasons, are determined to remain uninformed, make it harder to keep that republic intact. Our nation, suffering now from political confusion and ugly protests, sorely needs a renewed commitment to civic knowledge.