"Barry Goldwater Statuary Hall" by Gage Skidmore is licensed under CC BY-SA 3.0 (via Wikimedia Commons)

The September/October 1964 issue of Fact magazine was dedicated to Barry Goldwater, then the Republican nominee for president, and his fitness for office. One of Fact’s founders, Ralph Ginzburg, had sent a survey to over 12,000 psychiatrists asking a single question: “Do you believe Barry Goldwater is psychologically fit to serve as President of the United States?” Only about 2,400 responses were received, and about half of them indicated that Goldwater was not psychologically fit to be president. The headline of that issue of Fact read: “1,189 Psychiatrists Say Goldwater is Psychologically Unfit to be President!”
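It’s worth pausing on the arithmetic behind that headline. Here is a minimal sketch (in Python, using the approximate figures reported above; the exact counts vary slightly by source) of how small a share of the surveyed psychiatrists the headline number actually represents:

```python
# Approximate figures from the account above (exact counts vary by source).
surveys_sent = 12_000   # "over 12,000 psychiatrists" were mailed the survey
responses = 2_400       # "about 2,400 responses" came back
said_unfit = 1_189      # the number featured in the Fact headline

response_rate = responses / surveys_sent          # share who replied at all
unfit_among_respondents = said_unfit / responses  # share of repliers saying "unfit"
unfit_among_surveyed = said_unfit / surveys_sent  # share of everyone surveyed

print(f"Response rate: {response_rate:.0%}")                        # ~20%
print(f"'Unfit' among respondents: {unfit_among_respondents:.0%}")  # ~50%
print(f"'Unfit' among all surveyed: {unfit_among_surveyed:.0%}")    # ~10%
```

On these numbers, the headline’s 1,189 psychiatrists amount to roughly half of a self-selected fifth of those asked: about one in ten of the psychiatrists surveyed, with no way of knowing what the silent majority thought.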

Black Friday @ 5th Avenue by Frank Tasche (CC BY-NC-ND 2.0)

Shopping hangovers are just as real a threat at this time of year as drinking ones. With Black Friday and Cyber Monday just behind us (but Christmas still ahead), it can seem once again like everyone’s busy typing denunciations of consumerism with one hand while grabbing sweet deals and swiping plastic with the other. Amidst the holidays, we are sure to continue to hear complaints that the season has been ruined by a focus on material possessions and rampant “consumerism.”

Few would deny that the Western world today exists in a state of “consumer culture.” By many accounts, capitalism is its cause. Though “consumerism” also refers to political efforts to support consumers’ interests, the term has come to carry a rather pejorative connotation, referring to the prominence of consumption in everyday life, especially insofar as consumption seems to be escalating for individuals and societies at an unsustainable rate.

Criticisms of consumerism take several forms, of varying plausibility. Sometimes critics suggest that people harm themselves by placing too much importance on material possessions. Other critics worry that consumerism destroys the potential for genuine individuality, as through the spread of homogeneous mass-market products. Or perhaps, au contraire, consumerism fosters individuality, but of a pernicious and illusory sort. Or perhaps the means, and not the ends, of consumerism are morally problematic, insofar as this social state has been brought about through psychological manipulation by advertisers.

Defenders of a basically capitalist order must bite several bullets in the interest of intellectual honesty. It’s true that not all purchases make consumers meaningfully better off. Indeed, the paralysis of facing too many choices is a well-known psychological phenomenon. Sometimes people find their behaviors unduly shaped by external sources, and marketing can be among them. And expressing individuality (as through consumption) is of dubious moral value when doing so seems to make people isolated and self-absorbed instead of happier.

But what is the alternative? If adults don’t or shouldn’t derive their identities from autonomous personal activities – many of which are consumptive – from where exactly are they supposed to derive them? Liberal progressives (including those of an anti-capitalist bent) also worry about citizens’ identities being shaped too heavily by their arbitrary family circumstances, their race or ethnicity, their religions, and their work lives. But figuring out who we are is an essential feature of the “human condition,” and alleviating one kind of external pressure just makes room for the others to flood in.

Buying things really does often help us to form our mature selves: the way you dress, the way you decorate your home, and the range of your hobbies express a nascent identity, allow you to take it for a test drive, and provide pathways for changing it in a kind of ongoing identity feedback loop. And buying experiences (instead of stuff) shapes human lives better still: restaurant expenditures aren’t just food; they fuel social gatherings. Vacations aren’t just plane tickets; they’re memories to anticipate and then treasure.

And, through a supply-and-demand lens, notice that you can’t have it both ways: when the prices of things go down, people want more of them. It’s implausible to imagine any world in which consumer goods become cheaper (and therefore more widely and equitably available) but ordinary humans don’t buy more at the margin. Looking down on the people waiting in hours-long Walmart lines for cheap televisions and computers may signal one’s higher social status, but there’s certainly nothing inherently wrong with taking advantage of deals to get things for your family that would otherwise be financially out of reach.

Capitalism may be the cause of consumer culture, but consumerism is only partially a problem – and, to that extent, capitalism can also provide the solution. When people are generally rich in historical terms, we can afford (figuratively and literally) to spend time criticizing the ways in which they spend their money. Self-help and self-improvement have captured philosophical interest since ancient times, but the circumstances within which we conduct these activities are historically contingent. Navigating the prosperity of a post-industrial world requires consumer habits that can be learned and practiced as part of a balanced life. Though happiness will never be handed to us on a silver platter (or in a shopping bag), we should be glad to find ourselves operating so close to the top of Maslow’s hierarchy of needs.

Vulpes et lignator from Sebastian Brant's 1501 edition of Aesop's Fables (Public Domain). The image above is an illustration from The Fox and the Woodsman, a fable which warns readers against hypocrisy.

Hypocrisy is the practice of professing moral standards or beliefs to which one’s own behavior does not conform. Hypocrites are usually disliked and seen as lacking moral fiber; many people claim that nothing annoys them more about a person than hypocrisy. I believe this is because we as a species dislike criticism, and being criticized by someone who is committing the very act they criticize strikes us as absurd and disgusting. Mankind has an interesting history of hypocrisy: I would wager that humans have committed this moral treason since before we were able to recognize it.

We start seeing mass hypocrisy in recorded history around the time of the Holy Roman Empire, founded around 800 AD. Many blame the church for the problems of this era: while preaching charity and goodwill, the Catholic Church would demand high tribute and start wars with foreign lands in the name of religion. The hypocrisy here is obvious: preaching the values of early Christianity while simultaneously violating them. Eventually the Enlightenment would reconstruct Europe, but the empire’s dissolution in 1806 marks the end of over 1,000 years of mass hypocrisy.

Continuing into the 19th century, a distinctly American hypocrisy emerges. Slavery was seen as perfectly acceptable, yet the Judeo-Christian values held high in American society did not support it. In fact, in the Old Testament the Jews were slaves to the Pharaoh of Egypt, and in those stories the Egyptians were the antagonists. This moral dilemma was ignored in the sermons of Southern Protestant churches. Of course, hypocrisy in politics was not a new concept. That said, the 19th century saw public outcry at such corruption, and so some political leaders, such as those of New York’s Tammany Hall, spoke out against corruption while continuing to have their back pockets stuffed with funds from criminals and community leaders. Toward the end of the century, we see a hypocritical sentiment within much of the population leading up to and during the Spanish-American War: many citizens of the United States accused the Spanish of being evil because of their treatment of Cubans and Filipinos, and Americans were quick to attack Spain for these colonial actions. Yet after the war, when Puerto Rico, Guam, and the Philippines fell into U.S. hands, there was no protest when the Dole Fruit Company spent the lives of Filipino and Hawaiian nationals to turn a profit. Nor was Spain the only target of public hate in the U.S.: France, England, Germany, and Belgium all took flak for their colonial holdings, despite the fact that American industry was doing the exact same thing.

The 20th century brought improvements in technology and new world conflicts. World War II, seen as the greatest conflict in human history, reshaped the world as we know it, and its aftereffects can still be felt years later. The creation of Israel was one of the first acts of the new world order: the new nation was created on top of the existing territory of Palestine and received support worldwide, seen as compensation for the atrocities committed against the Jewish population by Germany and Palestine. Now, around 70 years later, the Palestinians of Israel are treated as second-class citizens; in fact, most minorities experience either societal or institutional discrimination within Israel’s borders. The treatment of minorities in Israel is eerily similar to the treatment of Jews in 1930s Germany and pre-war Palestine. Also in the Middle East, Saudi Arabia still stands as a monarchy despite receiving support from the U.S., a staunch supporter of democracy that often uses it as an excuse to invade other nations. The same can be said of the former dictators Saddam Hussein of Iraq and Muammar Gaddafi of Libya: both rulers were supported, and their regimes sustained, by the United States. Hussein was even given the key to the city of Detroit.

There are countless examples of this type of behavior throughout human history and in society today. All around the world, hypocrisy is condemned yet practiced constantly; in fact, the act of discouraging hypocrisy is itself hypocritical. Hypocrisy is as natural for humans as breathing. Rather than getting depressed about this fact, I embrace the realization. A film critic does not have to make films to be able to criticize them. In the same way, we can accept other people’s suggestions even if they do not follow them themselves.

This Guest Post was written by Benjamin Booher, a first-year at DePauw University.

Misty Copeland, American Ballet Theatre, Gong, November 1, 2013 by Kent G Becker (CC BY-NC-ND 2.0)

When I was 8, I started ballet. I was a disciplined kid who took everything seriously, and dance quickly became a great passion of mine. But for many years I wasn’t that good; I felt I lacked the natural physical abilities that bless talented ballerinas.

One day something changed. I was observing the class immediately after mine. In the center of the studio, Laura, the first in her class, was performing a step in which one leg is elevated above 90 degrees. She was very similar to me in many respects, rich in determination but lacking in natural talent, her legs and feet molded only by hours of obstinate exercise. Her leg was so much higher than mine had ever been! She looked fierce and strong, and I wished I could be like that. But beneath her smiling face, I could see the strain: she was sweating a lot, and her leg was shaking slightly. I felt a complex, painful emotion. I was ashamed both of my inferiority and of minding it so intensely. At the same time, I was inspired and determined to work harder: if she could do that, so could I! I kept dancing, with renewed enthusiasm, and by the time I graduated from my dance school I, too, was the best in my class.

I believe that what I felt that day toward my peer was what we can call emulative envy. It is a kind of non-malicious envy that has two fundamental characteristics. First, it is focused more on the lacked good than on the fact that the envied person has it. In my case, I was more bothered by my lack of excellence than by the fact that Laura was excellent. This kind of envy therefore has an inspirational quality rather than an adversarial one: the envied appears to the envier more like a model to reach, or a target to aim at, than a rival to beat, or a target at which to shoot. When, conversely, the envier is more bothered by the fact that the envied is better than them than by the lack of the good itself, the envied is regarded with hostility and malice, and the envier is inclined to take the good away from them.

Second, emulative envy is hopeful: it involves the perception that the envier can close the gap with the envied. When this optimism about one’s chances is lacking, being focused on the good is insufficient for emulative envy, because we don’t believe in our capacity to emulate the envied. Thus, we may fall prey to what I call inert envy, an unproductive version of emulative envy in which the agent is stuck desiring something she can’t have, a dangerous state that can develop into more malicious forms of envy. In Dorothy Sayers’s vivid metaphor: “Envy is the great leveler: if it cannot level things up, it will level them down” (Sayers 1999).

There are two possible ways of leveling down. In what I call aggressive envy, the envier is confident that she can “steal” the good: think of a ballerina who secures another dancer’s role not by her own merit but by other means, such as spreading a rumor about that dancer’s lack of confidence on stage. But this kind of leveling down is not always possible, or at any rate does not appear possible to the envier. In such a case, the envier is likely to feel spiteful envy, as may have been the case with Iago: he could not take away Othello’s good fortune, but he was certainly able to spoil it.

Spiteful, aggressive, and inert envy are all bad in one way or another, but emulative envy seems devoid of any badness. Here I cannot detail the ways in which it differs from the other three, but I’d like to conclude this post with one very interesting feature it possesses. Empirical evidence (van de Ven et al. 2011) shows that emulative envy spurs self-improvement more effectively than its more respectable cousin, admiration. This result is less surprising once we consider that admiration is a pleasant state of contentment, and thus unlikely to move the agent to do much at all. As Kierkegaard aptly put it: “Admiration is happy self-surrender; envy is unhappy self-assertion” (Kierkegaard 1941).

References
Kierkegaard, S. 1941, The Sickness Unto Death: A Christian Psychological Exposition for Upbuilding and Awakening (1849), New York: Oxford University Press.
Sayers, D. 1999, Creed or Chaos? Why Christians Must Choose Either Dogma or Disaster (Or, Why It Really Does Matter What You Believe), Manchester, NH: Sophia Institute Press.
van de Ven, N., Zeelenberg, M., and Pieters, R. 2011, “Why Envy Outperforms Admiration,” Personality and Social Psychology Bulletin, 37(6): 784–795.

Colorful Eyes by M Yashna (via Flickr CC BY 2.0)

Imagine you check your email and find a congratulatory message from your boss announcing that your colleague has just been promoted. This colleague joined the company at approximately the same time as you did and works in your sector. You were in line for the same promotion and were anxiously waiting for the outcome. How do you feel?

Wielenberg, pioneer of robust godless normative realism (Image by Richard Fields for the DePauw Photo Gallery)

Last week we published the abstract of Erik Wielenberg’s new book, Robust Ethics: The Metaphysics and Epistemology of Godless Normative Realism. In this guest post, Wielenberg, Professor of Philosophy at DePauw University, follows up with a more in-depth discussion of the book and some of the philosophers who have influenced his thinking on moral realism and God’s existence.

In 1977, two events that would significantly impact my life took place. First, the film Star Wars was released. Second, two prominent philosophers, J.L. Mackie and Gilbert Harman, unleashed some influential arguments against moral realism.  My book is about the second of these two events.

In his famous argument from queerness, Mackie listed various respects in which objective values, if they existed, would be “queer.” Mackie took the apparent queerness of such values to be evidence against their existence. One feature of objective values that he found to be particularly queer was the alleged connection between a thing’s objective moral qualities and its natural features: “What is the connection between the natural fact that an action is a piece of deliberate cruelty — say, causing pain just for fun — and the moral fact that it is wrong? … The wrongness must somehow be ‘consequential’ or ‘supervenient’; it is wrong because it is a piece of deliberate cruelty.  But just what in the world is signified by this ‘because’?” (1977, 41)  Mackie was also dubious of the view that we could come to have knowledge of the objective moral qualities of things. He wrote that friends of objective moral values must in the end lamely posit “a special sort of intuition” that gives us knowledge of objective values.

Harman, for his part, noted an apparent contrast between ethics and science.  He compared a case in which a physicist observes a vapor trail in a cloud chamber and forms the belief “there goes a proton” with a case in which you observe some hoodlums setting a cat on fire and form the belief “what they’re doing is wrong” (1977, 4-6).  Harman was happy to classify both of these as cases of observation (scientific observation and moral observation respectively), but he noted that the moral features of things, supposing that they exist at all, seem to be causally inert, unlike the physical features of things. Harman thought that this feature of moral properties suggests that we ought to take seriously the possible truth of nihilism, the view that no moral properties are instantiated (1977, 23).  But others have drawn on Harman’s premise to support not nihilism but rather moral skepticism, the view that we do not (and perhaps cannot) possess moral knowledge. It is the latter kind of argument that I discuss in my book.

Some have suggested that theism provides the resources to answer these challenges. Mackie himself, although an atheist, suggested that theism might be able to answer his worries about the queerness of the alleged supervenience relation between moral and natural properties. In his 1982 book The Miracle of Theism, he suggested that “objective intrinsically prescriptive features, supervening upon natural ones, constitute so odd a cluster of qualities and relations that they are most unlikely to have arisen in the ordinary course of events, without an all-powerful God to create them” (1982, 115-6). More recently, the Christian philosopher Robert Adams has suggested that the epistemological worries arising from Harman’s contrast between science and ethics can be put to rest by bringing God into the picture (Adams 1999, 62-70).

Thus, an interesting dialectic presents itself. Mackie and Harman, who do not believe that God exists, see their arguments as posing serious challenges for moral realism. Some theistic philosophers argue this way: if we suppose that God does exist, then we can answer these challenges to moral realism. Without God, these challenges cannot be answered. Since moral realism is a plausible view, the fact that we can answer such challenges only by positing the existence of God gives us reason to believe that God exists.

I accept moral realism yet I believe that God does not exist. I also find it unsatisfying, perhaps even “lame” as Mackie would have it, to posit mysterious, quasi-mystical cognitive faculties that are somehow able to make contact with causally inert moral features of the world and provide us with knowledge of them. The central goal of my book is to defend the plausibility of a robust brand of moral realism without appealing to God or any weird cognitive faculties.

A lot has happened since 1977.  A number of increasingly mediocre sequels and prequels to the original Star Wars have been released; disco, mercifully, has died. But there have also been some important developments in philosophy and psychology that bear on the arguments of Mackie and Harman sketched above. In philosophy, the brand of moral realism criticized by Mackie has found new life. In psychology, there has been a flurry of empirical investigation into the nature of the cognitive processes that generate human moral beliefs, emotions, and actions. As a result of these developments the challenges from Mackie and Harman sketched above can be given better answers than they have received so far — without appealing to God or weird cognitive faculties. That, at any rate, is what I attempt to do in my book. In short, my aim is to defend a robust approach to ethics (without appealing to God or weird cognitive faculties) by developing positive accounts of the nature of moral facts and knowledge and by defending these accounts against challenging objections.

Works Cited
Adams, Robert. 1999. Finite and Infinite Goods. Oxford: Oxford University Press.
Harman, Gilbert. 1977. The Nature of Morality: An Introduction to Ethics. Oxford: Oxford University Press.
Mackie, J.L. 1977. Ethics: Inventing Right and Wrong. New York: Penguin.
Mackie, J.L. 1982. The Miracle of Theism. Oxford: Oxford University Press.

The FDA is currently debating whether skin-shock devices should be banned for use on patients with autism and developmental or intellectual disabilities who display self-harming or violent behaviors. The only treatment center to use skin shocks, the Judge Rotenberg Center in Canton, Mass., treats 55 patients with them. The center and some families claim that these skin shocks have been the only way to treat the violent behavior of certain disabled individuals in its care. The shocks are delivered by a device attached to the arms or legs, called a GED or graduated electronic decelerator, which delivers a 2-second shock when the patient becomes violent or self-harming. Before the center can use the shocking device on a patient, it must be granted court approval. The center and various patients describe the shocks as equivalent to a bee sting or a hard pinch; others say they are worse than that, with one patient comparing the sensation to being “stung by a thousand bees.” Another patient claims that the shocks are terribly painful and have caused nightmares. However, one mother said that shock therapy was the only treatment that worked on her son, telling the FDA via video, “Do not take away what is saving his life.” An advisory committee for the FDA found that the devices pose a high risk, although it noted that some less jarring therapy methods have proved ineffective.

Drink Me by Erin (CC BY 2.0)

Lithium is classically associated with extreme mental illness and carries a somewhat negative connotation with the public. In its concentrated form, it has been documented to alleviate several symptoms of mental illness, but it can also have (and has had) severe negative health consequences for people taking it in high doses.

But this op-ed raises the question: should we all be taking a (very) little bit of lithium? It turns out that lithium occurs naturally, in very small amounts, in many drinking water sources, and there is some evidence that these small amounts have a surprisingly positive effect: several studies have shown correlations between levels of naturally occurring lithium and positive social outcomes.

Researchers began to ask whether low levels of lithium might correlate with poor behavioral outcomes in humans. In 1990, a study was published looking at 27 Texas counties with a variety of lithium levels in their water. The authors discovered that people whose water had the least lithium had significantly higher rates of suicide, homicide, and rape than people whose water had higher levels. The group whose water had the highest lithium level had nearly 40 percent fewer suicides than the group with the lowest.

Almost 20 years later, a Japanese study that looked at 18 municipalities with more than a million inhabitants over a five-year period confirmed the earlier study’s finding: Suicide rates were inversely correlated with the lithium content in the local water supply.

More recently, there have been corroborating studies in Greece and Austria.

This raises several interesting questions. First, should the government and the scientific community be devoting more resources to studying the effects of lithium in these small doses? Second, suppose we found out that there are positive effects. Should we all be drinking water with these naturally occurring levels of lithium or not? Would you want your municipal water supply augmented to achieve this result? What do you think?

"Waiting for the Word", (cc by 2.0)

Helen De Cruz draws on some interesting insights from the cognitive science of religion to examine a popular response to an argument against God’s existence called The Problem of Divine Hiddenness.

The basic argument is that a loving God would make his/her presence obviously known to us. Why? Because a loving God would want a loving personal relationship with us, and that’s only possible if we are in a position to know with reasonable certainty that God exists.

One kind of response to this argument is that if God’s presence were obvious, it would undermine our ability to make morally significant choices: we wouldn’t feel free to decide how to act, but would instead feel compelled to always act in a particular way. Mike Murray endorses this kind of response (shameless plug alert: so do some other philosophers).

De Cruz examines some recent work in the cognitive science of religion and draws on it to consider whether the above line of response is a good one. It turns out that there is some evidence, but the results are rather mixed. If people believe in a vengeful God and are reminded of God (or primed with spiritual words), they are less likely to cheat or act immorally in various ways. However, this is not true if people’s beliefs emphasize a more loving and forgiving God. Her full post is an interesting read. Check it out and let us know what you think.

Would the obvious existence of a God make it difficult for you to freely choose how to act (and thus be unable to make morally significant choices), or would you still feel free to act?

Puppeteer by Jo Morcom (CC BY-NC-SA 2.0)

From sharing posts on social media sites to spreading germs of the common cold, it seems that many things in life are contagious. But does this trend toward contagion apply to morality in the form of good deeds?

Innovations in technology have allowed us to track, research, and study human interactions in a way that seemed impossible a decade ago. And while researchers might enjoy newfound methods of quantifying human behaviors, it is important to question whether this type of inquiry into our daily lives is overreaching.