
#Ferguson: Knowledge and Justice in a Social Media Age

August 15, 2014

If you’re looking for an opinion on the killing of Michael Brown, look elsewhere.

I know that any clash between the police and young African-Americans can be co-opted by the internet Leftist hype machine. On the other hand, I know enough American history not to pretend that the relationship between the police and young African-Americans is all sunshine and lollipops. I don’t believe that black people are all criminals, or that the police are a jackbooted Gestapo ruthlessly targeting minorities. The truth is more complex than that. And it’s the complexity of the truth that I want to deal with.

So this post is not an opinion on Michael Brown. In a way, it’s not about Michael Brown at all. Instead, it’s about the way that people from my demographic (young, rich, white, Left-leaning Millennials) react to stories like this in the social media world. I don’t have any inside knowledge about what happened in Ferguson, MO. On the other hand, I do read a lot of Twitter and Facebook. And my diagnosis is this: more often than not, social media hampers rather than helps social justice. This is not a slam on social media per se, just a statement of its limitations. Social media plays to human epistemological limits in a way that discourages ethical action.

Social Media Can’t Handle Complexity

Bumper stickers are designed to be simple. The message of a bumper sticker is easily digestible and signals that the driver of the car takes a certain posture towards the world. “Smoking is not a crime.” “Nobody died when Clinton lied.” “Impeach Obama.” The bumper sticker isn’t primarily interested in starting a serious dialogue about constitutional law. Rather, the bumper sticker identifies the driver as a certain kind of person–a caring Leftist, a realistic right-winger, a smoker, a parent of an honor student. The bumper sticker presents the driver, not as a complex individual, but as a member of a certain group, as a caricature.

I am Leftist; hear me roar

Social media posts are like bumper stickers. Social media is biased towards simplicity. This is not necessarily bad–Twitter’s 140 character limit can be a boon to creativity. But it does mean that social media is not a good vehicle for delivering complex arguments, the kind of arguments that sway people’s opinions. Instead, social media posts about serious issues usually function like bumper stickers–they identify the poster as a certain kind of person (Leftist, right-wing, etc).

This simplification plays to human epistemological limits. Thought takes effort. To reduce effort, the brain uses shortcuts. We tend to assimilate events into different schemas or narratives. Thomas Sowell calls these narratives “visions,” and he writes,

Reality is far too complex to be comprehended by any given mind. Visions are like maps that guide us through a tangle of bewildering complexities. Like maps, visions have to leave out many concrete features in order to enable us to focus on a few key paths to our goals. Visions are indispensable–but dangerous, precisely to the extent that we confuse them with reality itself. What has been deliberately neglected may not in fact turn out to be negligible in its effect on the results. (A Conflict of Visions, 13-14).

Real-world example: If you see a dude in the mall wearing a tank-top, gym shorts, and big sunglasses, you’re probably going to put him into the “bro” category. Maybe he reads Aristotle and enjoys knitting. But since you don’t immediately know that, and since your brain doesn’t want to take the effort to carefully scrutinize a passerby, you put him into a category and move on. The same thing applies to a narrative. If your narrative is “no power in the ‘verse can stop me,” you will tend to interpret a girl’s rejection of your offer of dinner and a movie as proof that she’s not good enough for you. If your narrative is “I am a worm and not a man,” her rejection will simply confirm your self-hatred.

People do the same with news stories. Whenever we read a news story, we tend to assimilate it into our pre-existing narrative. We also ignore details that challenge our narrative. To make matters worse, the writers of the news story are usually doing the exact same thing. Thus, after reading the news story or the editorial piece, we think we’ve gained new knowledge, whereas we’ve actually just confirmed a pre-existing bias. Then, we post our opinion on Facebook or Twitter, blissfully unaware that we’re not offering up a considered analysis of the situation, but instead just presenting our (non-empirical) vision.

It gets better. Once people start reading our posts, then they start assimilating those into their narrative. Thus, what we thought was a poignant and witty defense of democracy and justice gets categorized as “just more Leftist buffoonery.” We don’t have time to think seriously about every single person on our Facebook newsfeed; if we did, we would have better ways to use our time. The people who agree with us are just going to have their opinions confirmed. The people who disagree with us are, at best, going to be irritated. And worst of all, we are going to be fooled into thinking that we are making serious strides in public discourse, when we’re really just patting ourselves on the back.

“Why don’t we just make an effort to think more clearly?” This is a nice thought, but it’s just not practical. Creating narratives and visions is the brain’s default option. It takes energy to try to look at the individuality of each thing. Humans have a limited amount of energy. We can’t spend it all on trying to nuance every single thing that we read. I have many things that I have to do in the day. There is not enough time for me to worry about whether I am thinking clearly about every single passerby on the street, or whether I have accurately interpreted some rando’s Facebook post. This is not to say that we shouldn’t ever try to think outside of our visions or narratives. We just have to be selective about when to do so.

Edward Elric, Political Philosopher

Proposal #1: Stop talking about these things on social media. Don’t Tweet. Don’t post a Facebook status. For heaven’s sake, don’t write something on MySpace. It’s not going to help.

Objection #1: You are counseling apathy. Social media is the main channel of public discourse, and the main way to raise awareness about injustice. If you are against discussing serious issues on social media, you are de facto promoting apathy about social situations.

Reply to Objection #1: I don’t think that all discussion is fruitless. I don’t even think that social media is entirely useless for promoting discussion of serious issues. It is an excellent way of sharing articles, for instance. I do, however, think that social media has serious limitations. Twitter is a great place for funny jokes. It is not a great place for mutually uplifting dialogue.

As shown above, the medium of social media accommodates humans’ epistemic limits rather than challenging them. Social media posts are biased towards simplicity, and as such they are almost always going to be interpreted simplistically. Thus, they mainly confirm the writer or reader’s bias.

“But simplicity can still be effective.” Perhaps, but there’s another problem with social media posts, a problem illuminated by Fullmetal Alchemist. In the anime/manga, the magic system is governed by the law of Equivalent Exchange. As the protagonist Edward Elric says,

Humankind cannot gain anything without first giving something in return. To obtain, something of equal value must be lost. That is alchemy’s first law of Equivalent Exchange.

This principle applies to life. Accomplishing valuable things requires effort and risk. It takes no effort or risk to post something on Facebook. It takes effort and risk to have a serious conversation with a close friend, especially a close friend who disagrees with you. In face-to-face conversation, you have to make an effort to understand the other person’s point of view, to show charity, to respond gracefully. A nasty comment in a conversation can end a friendship. A nasty comment on a status will usually get a couple of likes. The anonymity and accessibility of the internet blunt most of the consequences that taking a position usually entails.

Think of it this way: Do you really want to argue politics with a man who has a metal arm?

And why should anyone care about your opinion? If you want to get buff, you go to the gym and watch what the buff guys are doing. If you want to make money, you don’t get advice from your broke Uncle Larry. The opinion of someone on Facebook carries very little weight (“kind of like this blog post”–Greek Chorus). So you read a couple of articles from HuffPost or National Review that confirmed your view. Who died and made you the arbiter of all truth?

In Antifragile, Nassim Nicholas Taleb tackles the problem of doing ethics while recognizing our epistemic limitations. His solution is “skin in the game.” He writes, “Never ask anyone for their opinion, forecast, or recommendation. Just ask them what they have–or don’t have–in their portfolio.” Most social media posters don’t have anything in their portfolio. I’ve admired my professors who have moved to the inner city to pursue social justice or racial reconciliation. I may not agree with their opinions, but I know that their opinion has some weight behind it. The person who lives in the white suburbs, on the other hand, and waxes eloquent on the state of race in America–their opinion has no weight.

Now What?

“But what if I’ve read these articles about Michael Brown [or whatever] and I feel legitimately concerned about this issue? Are you saying I should just not care?” No. I’m saying that if you do care, you need to be smart. And if you feel called to some social justice pursuit, then start by doing something.

One of the most annoying things about Cornel West’s Race Matters was the way he evaded making specific policy pronouncements. Every ten pages he would write something like “what the black community needs is a focus on justice” or “what America needs is prophetic leadership.” What this actually looked like on the ground was left to the imagination. In contrast, John McWhorter’s All About The Beat: Why Hip-Hop Can’t Save Black America contained many specific policy suggestions for the problems facing black America. Whether or not you agree with McWhorter’s politics is beside the point–he wasn’t making broad statements, but instead proposing a plan of action.

I’m not against talking; if I were, I wouldn’t be writing this blog. I’m not against thinking; successful action rarely comes from thrashing about wildly. I am against empty hand-wringing. I once heard a student in my college’s chapel say “We have not done enough to remember the legacy of slavery.” He meant well, but it does little good to “remember” the legacy of slavery. All the white Millennials in the greater Chicagoland area could get together in a room and have a good cry about slavery, and nothing would change. And lest you think that this is just the cranky rumination of a stone-hearted right-winger, arch-Leftist Ta-Nehisi Coates agrees. In an inspiring quote that I unfortunately can’t remember accurately, Coates says that if the “conversation” about race in America just ends in nodding and chin-stroking, then it is useless. Talking loudly about the need for change will not create change any more than talking loudly about your need to have a relationship will land you a girlfriend (trust me, I’ve tried).

“How many more times will I have to tell her that I’m single before she takes the hint?”

So do something. Do something tangible. Do something that doesn’t involve talking. If you think that the police are too antagonistic towards young black men, the question you should be asking isn’t “What should I put on Facebook?” but “What can I do to heal the rift?” This may involve becoming friends with the police. Same thing if you think that young black men are too antagonistic towards the police–sitting at your desk, complaining about people you’ve never met is only going to make you feel good about yourself. So if you want to fight racism (or whatever), find a concrete thing you can do. Maybe it’s volunteering. Maybe it’s making friends with people of another race. Maybe it’s having a serious conversation with Uncle Larry. Doing actual work won’t have the dramatic flair that posting incendiary things on Twitter will. On the other hand, your good-faith effort will have a more lasting impact than Twitter. Remember: equivalent exchange.

And if you don’t feel called to this particular battlefield, don’t feel bad. Christians especially have a problem of turning a social cause into the social cause. I remember another chapel where the speaker beat everyone over the head about not being compassionate enough to gay AIDS victims in Chicagoland. Helping them is a noble calling; it’s not my calling. If I try to help gay AIDS victims in Chicagoland, orphans in Africa, the black community in Missouri, the homeless in Scarsdale, alcoholics in Serbia, and the shoplifters of the world, I’ll spread myself thin and won’t be able to help anyone well. It would be like trying to passionately love ten women at the same time (“Only ten?”–Giovanni Casanova). Instead, it would be a better idea to find the one thing you are called to help, and then focus on that. Not feeling particularly called to racial reconciliation doesn’t make you a racist. If you don’t find yourself called to it, then help out by focusing on what you are called to do, and lend a hand to the people called to do other things whenever they’re in need.


The Problem with Armchair Anthropologists

August 11, 2014

This is the first piece I’ve put on the blog in a while. I wish I could say my absence was due to work or lack of interest, but the fact is that I have had better things to do–TV, video games, suburban hang-gliding. Recently, however, I have missed the intellectual stimulation that comes from blogging (or, at least, fooling myself into thinking I’m doing something useful), so I’ve taken it up again. I don’t know how long this will last. Keep your fingers crossed.

Photo: Don't you just love the public exposé of trumpet-tone biblical illiteracy?

This little gem from Rachel Held Evans illustrates a popular view among Millennial Christians–that normative gender roles are an illusion, and a nasty illusion at that. The idea of “gender as a social construct” has migrated beyond Leftist academia and into the Christian intellectual world. Hundreds of young Christians, inspired by that one Anthropology class they had sophomore year, are taking to the barricades to fight against the assumption of anything close to fixed gender roles. More often than not, the target is masculinity. The phrase “act like a man” is occasion for scorn. Pastors who try to talk about men’s issues are typecast as chest-beating gorillas. The general attitude was summed up in a comment a friend made to me: “Masculinity is a social construct used to marginalize the unsuccessful.” Another friend put it even more simply: “The whole idea of ‘becoming a man’ is stupid.”

Of course, after he said that I ripped his arms off with my bare hands and then fed him to these wolves.

There’s nothing wrong with critically investigating cultural concepts like “manliness”; my target in this piece is not real anthropologists or sociologists. Rather, my target is the people who rattle off a cliche like “manliness is a social construct used to marginalize the unsuccessful” as if it were groundbreaking wisdom. The scorn heaped upon “social constructs” (manliness or otherwise) has three main flaws. First, it misunderstands the nature of social constructs. Second, it misreads the concept of masculinity. Third, it is grounded in a “Cartesian” view of the self which does not reflect actual, lived experience.

Social Constructs: Good For You, Good For America

Gender is socially constructed. This is not just true, it’s banal. Many of our ideas about what it means to be a man or a woman are part of our cultural context. For example, in American society, it is expected that women wear skirts or dresses and men do not. This is not a given–men in traditional Scotland wore kilts, which look like skirts to American eyes. There’s nothing morally wrong with wearing a kilt, nor is there anything inherent in being a male that makes wearing a kilt “unmasculine.” The decision to not wear kilts is pretty much arbitrary. And all that being said, there’s no way that you’re going to get me to wear a kilt in public.

While it’s true that the idea “men should not wear skirts” is fairly arbitrary, the fact is that, in American culture, a man wearing a kilt signifies something very different than what it would signify in Scottish culture. Armchair anthropologists talk of social constructs as if they were completely arbitrary, things that we can jettison at will. Don’t like this aspect of society? Get rid of it.

The problem is that we live in a social world. Social constructs may be “arbitrary,” but that doesn’t mean they don’t have any force. They are deeply embedded in the culture, and interconnected with other social constructs. For example, I often hear girls complain that the “system” makes men the exclusive initiators in romantic relationships. Thus, there is a bias against women asking men on dates. This is socially constructed, and it’s rather arbitrary. But this social construct is also deeply embedded in our “social imaginary.” If a woman asks men on dates, she runs the risk of seeming “desperate” or “too forward.” Whether this is good or bad is neither here nor there–the point is that at no point can you step outside your context and pick and choose which social constructs everyone should get rid of. Peer pressure is just as ephemeral as social constructs, yet it does no good to tell people “don’t give in to peer pressure.”

There are many things in our world that are social constructs–handshakes, traffic laws, table manners–and we can’t simply ignore those things because they are “arbitrary.” Holding out against handshakes because they are “socially constructed” may seem like a noble goal, but don’t expect your much more thought-out custom of touching noses to catch on. Asking a girl to coffee at Wheaton College (a cruel joke, some would say) is fairly arbitrary (as Matt Damon noted in Good Will Hunting), but it is important because of what it signifies: it is part of the “social imaginary” that we inhabit. Asking a girl to the state hog-calling contest will produce a different reaction (though if she says yes, good for you, man).

I guess the point I’m trying to make is that social constructs are “thick.” They’re not just arbitrary inventions imposed on us by the patriarchy, Hollywood, or a horde of mutant lizard aliens. They arise out of certain historical-cultural conditions and they become “embedded” in the social imaginary, such that one cannot easily get rid of them without creating ripples. Anyone who says that social constructs aren’t important is probably playing dumb, like the kid who wears black leather in the summertime and then gets mad when his parents tell him he’s trying to be a rebel.

“That’s all well and good,” you say, “but what about social constructs that are wrong or evil? Aren’t Christians supposed to resist those?” Yes, of course. Just because social constructs are deep-rooted doesn’t mean that they’re all good. Sometimes ripples need to be made. But…

Misreading Masculinity

…I propose that masculinity or “manliness” isn’t one of them.

Academic types always tell us to “nuance,” and they’re right. If we don’t nuance, we run the risk of creating a straw-man fallacy. Here, we need to nuance our concept of “masculinity.” Occasionally I hear a Leftish person use the word “masculine” as a smear word (if they’re more academically minded, they’ll use the word “phallic”). Whenever I hear that, I get a bad taste in my mouth.

Here’s why. Readers of ancient or medieval literature will remember that the word “womanish” or “effeminate” is a smear-word in most older texts. The Roman Stoics, for example, often used the word to describe the behavior that a good Stoic would avoid. There was even a genre in medieval writing, later labeled “Antifeminism,” in which men would rant and complain about how bad women were. Feminism has made a good effort to balance the scales, arguing (rightly) that women and “womanly” qualities were not lesser than men and “manly” ones. Important work was done to show how “women’s ways of knowing” were as valid as men’s ways.

But, as always, the pendulum has swung too far in the other direction. The word “masculine” began to take on a negative connotation, associated with violence, individualism, and the Republican party. If the person attacking masculinity is not thinking carefully, they often create a straw-man masculinity to attack, a cartoonish mix of John Wayne, Ernest Hemingway, and 1970s Hercules films. This kind of lazy thinking assumes that when a church leader says “act like a man,” he really means for us men to run through the jungle beating our chests, clothed only in an American Flag loincloth, looking for skulls to crack (in other words, a typical afternoon for me).

“Do you think I got where I am today through reading Rachel Held Evans and listening to Pedro the Lion? Forget about it!”

It is defensible to attack a cultural aspect of “masculinity”; nay, it is glorious. The notorious “double standard” for men and women–“he’s a stud; she’s a slut”–is something that we should get away from. But it’s wrong to attack masculinity by pretending that masculinity is encapsulated perfectly by Conan the Barbarian.

Classical masculinity has had many different permutations. The saying “real men don’t cry” didn’t apply to King David or Achilles, both of whom famously cried after the death of their best friends. In some contexts, sexual conquests are highly valued; in others, like Stoicism, “real men” keep their passions in check. Masculinity is a big tent, and condemning all of it is a bit naive.

Even if one boils masculinity down to its most basic form, one still comes up with something fairly neutral. Anthropologist David Gilmore boiled down the almost universal (!) code of manhood to three principles: Protect, Provide, and Procreate. These could be fulfilled by a father who protects and provides for his family, or by a drug dealer who protects his turf, makes good money, and sleeps with a lot of random women. It’s important to remember that there is a difference between “being a good man and being good at being a man.” A person could be manly, but evil. But they could just as well be manly and good. Strength (in the broad sense) is not a prerequisite for salvation, but that in turn does not mean that we should seek to be Weenies for Jesus.


Effeminate Aryan Jesus asks you to give him your life, but only if it’s not too risky or dangerous, or would cut down on League of Legends time.


You Are Not a Thinking Thing

In Introducing Radical Orthodoxy, James K. A. Smith argues that the faulty politics of modernity (liberalism) results from a faulty epistemology, which is in turn based on a faulty ontology. Following Smith, I argue that the criticism of the armchair anthropologists is the result of a flawed view of human nature and knowledge. It is grounded in a modern overconfidence in knowledge, and a “Cartesian” view of the individual.

Despite being characterized as “postmodern,” Millennial Christian square-glasses types still tend to think of themselves in Cartesian terms–I am a thinking thing, separate from the world out there. Thus, I can step away from this world of “social constructs” and enter a “higher” view. The subject [i.e. the college student] floats uncontaminated over the deluded masses. By critically viewing his own society, the subject thinks he has actually transcended it.

This false view of the self results in a false view of epistemology. It views knowledge as primarily discursive and individual. In other words, knowledge is made up of propositional statements that you and I and Uncle Bob have in our minds. Although many Millennials would bristle at this claim (“How dare you accuse me of believing in an Enlightenment model of knowledge!”), they still live by this model of epistemology. In fact, an academic context that privileges the research of “experts” tacitly teaches this belief about epistemology.

The world outside academia, however, is quite different. In the real world, knowledge is mostly intuitive and mostly communal. It is quite right to say that we know more than we can say. F. A. Hayek says,

The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess.

The first thing this quote shows is that F.A. Hayek’s Twitter feed would have been absolutely scintillating. The second thing is that knowledge is “spread out,” as it were. No one individual can ever know enough about anything (not even Batman). But many individuals can know things that one individual can’t. It irks me to no end that Christian Leftish Millennials can recognize the systemic aspect of human society when it comes to racism–and then treat humans like atomistic individuals in every other context. Just as tacit ideas or attitudes about race become embedded in a culture through the disparate actions of multiple people, so other ideas and attitudes (good or bad) develop as well. Jonah Goldberg writes:

But here’s the thing: concepts, traditions, customs, and habits are also huge storehouses of knowledge. For instance, we don’t know all the reasons we do all of the things that fall under the rubric of “good manners.” We just do them because we should. Handshakes probably originated in the need to demonstrate that you weren’t holding a weapon. That rationale has vanished, but the handshake still has great value — but it has no price.

The difference between a tradition, like the handshake, and a substitute created by an expert, like nose-touching, is that the first has a long history of trial-and-error behind it. Customs formulated by individuals rarely catch on. This applies even to relatively recent customs. Freshman year, I had an idea that my nickname should be “Eduardo Valdez”; it never caught on (“Maybe that was not the best example”–Greek Chorus). Facebook’s original purpose was to allow users to see whether other users were single. Now, it’s common practice to not put your relationship status on Facebook. No one told anyone else to do so; it just sort of happened. Similarly, no one decreed that Myspace should be vacated by anyone except for child predators and the undercover cops who love them. Rather, millions of users all had intuitions about the website, intuitions that couldn’t be accurately boiled down to a single “fact.”


An alternate theory is that Mark Zuckerberg was commanding a band of ninjas involved in a covert mission known only as “Operation Myspace Sux”

This is why the college sophomore who comes home and tells Ma and Pa that he’s a feminist is so darn annoying. He may know a lot of (atomistic, propositional) facts about gender. But he’s not only fighting against Ma and Pa, but against Grandma and Grandpa, their parents, and so on: against the collective knowledge about gender, encapsulated in stories, stereotypes (!), jokes, and so on. The reason why normal people often don’t listen to feminists isn’t necessarily because they’re boneheads. It’s because they know things that feminists don’t know, things that can’t be learned from books. One anthropology major may be smarter than a given cisgender working-class male, but five cisgender working-class males down at the machine shop will know things that the anthropology major could never learn while studying at his Midwestern Christian liberal arts college.

What does this have to do with social constructs in general, and masculinity in particular? For one, it means that we can’t just dismiss masculinity as a conspiracy to “marginalize the unsuccessful.” Nor can we address the current interest in Christian masculinity as a purely cognitive issue. Walk into any Christian bookstore and you’ll find about ten to fifteen books (if you’re lucky) about Christian masculinity. Obviously this is touching a nerve. The people reading these books know something that the feminists and professors don’t know (even if the feminists’ and professors’ knowledge should also be brought to bear on the topic). Social constructs are not arbitrary conspiracies imposed top-down, but evolve bottom-up from certain historical conditions. Even if the social constructs are evil (like the systemic racism mentioned above), it is still important to know how they function in their context. For more benign constructs (like masculinity), it is important to look to see what role they are playing in the world, and whether they need to be modified. Normative gender roles can be argued about, but they can’t be dismissed. It’s not the manly thing to do.

Self-Esteem vs. Self-Respect

May 31, 2014

Self-esteem has fallen on hard times. From its inception, it has been ruthlessly mocked, criticized, and pointed to as a sign of the “crisis of Western Civilization.” And it’s hard not to see self-esteem as a “gold-stars-for-everyone” sickly egalitarianism. Critics of self-esteem (and they are many) characterize it as the smug satisfaction of people who haven’t accomplished anything—a slacker in his parents’ basement congratulating himself on his existence. It is often pointed out that self-esteem is weak sauce when a person is faced with genuine suffering and difficulty—are you going to tell someone in Buchenwald that they need to “feel good about themselves”? And self-esteem’s moral vacuity, its lack of moral orientation, is a frequent target. Hitler, after all, had high self-esteem. So do drunk drivers, criminals, and Justin Bieber.

Self-Esteem is also very 90s, which is another good reason for rejecting it.

And there’s much to be said for the criticism of self-esteem. When one reads sentences like “You are a child of the Universe no less than the trees and the stars. You have a right to be here” (qtd. in Taylor, Sources of the Self, 497), then one longs for the bracing realness of a Camus or a Sartre. Much of the criticism of self-esteem has come from the vaguely conservative, culturally Christian side of the intellectual spectrum, and in this field, at least, the conservatives and the Christians are on the right track. But although the criticisms of “self-esteem” have been devastating, its opponents have yet to propose a viable alternative. I don’t think anyone seriously thinks that constantly feeling miserable about yourself is a good way to live your life, although there are some defeatists who get pretty close. The evisceration of self-esteem has left a void that needs to be filled.

One option, put forth by blogger Fredrik deBoer in his article “Self-Confidence is Stupid,” is “self-ownership.” DeBoer points out the vacuous and phony nature of self-confidence and self-esteem. He says, “…self-confidence is a big con that we perpetuate on each other out of fear….The problem with those kind of feelings [of self-confidence] are that they change.” For deBoer, self-confidence is inherently insecure, a strategy used by emotionally brittle people to disguise their problems that doesn’t work. And this insecurity makes self-confidence highly problematic: “it may not be the case that literally everyone with what we think of as self-confidence is a jerk, but it’s pretty close… it’s entirely unclear to me that there’s actually such a thing as a projected self-confidence that isn’t ultimately a matter of saying not just ‘I’m good’ but of saying ‘I’m better than you.’” DeBoer is correct on this point. Since self-confidence isn’t grounded in anything solid, it works primarily as a rationalization of self-worth. And it is those who don’t have any actual accomplishments to ground their worth in who need affirmations of self-esteem. This creates a petty egalitarianism which lashes out against people who have actually accomplished something. “I’m just as good as she is,” says the person who is by no means as good as she is.

“LeBron and D-Wade think they’re so good at basketball. But they’ll never be as stylish as me.”

Nevertheless, deBoer’s alternative to self-esteem, “self-ownership,” is equally problematic. DeBoer writes, “Self-ownership means that everything that you are and do are yours, even when they’re embarrassing or sucky. Everything that’s you is yours, and you become your only judge.” This sounds refreshing at first; a person who practices self-ownership won’t have a problem with failure or suffering the way a person who practices self-esteem will. Self-ownership strikes us as true, because it factors in the bad as well as the good.

The problem with self-ownership is that it is amoral. DeBoer’s definition of self-ownership is eerily reminiscent of Meursault’s words at the end of The Stranger: “I had been right, I was still right, I was always right. I had lived my life one way and I could just as well have lived it another. I had done this and I hadn’t done that. I hadn’t done this thing but I had done another. And so?” (120-121, Matthew Ward translation). DeBoer’s self-ownership ends up being even more vacuous than self-esteem. At best, it’s banal; at worst, it gives license to a person’s worst inclinations simply because they belong to him. And it doesn’t do justice to the experience that principled people have of committing actions that go against their “real selves,” as shown in phrases like “That’s not who I am,” “I wasn’t myself,” “I was a different person then,” etc.

One might counter that one could hitch self-ownership to a pre-existing ethic (Christian, Buddhist, Kantian, etc.). But that’s just not possible, because self-ownership is itself an ethic. The goodness (or authenticity, if you will) of the actions in the self-ownership ethic is determined by the fact that they are “owned.” This is a far cry from Buddhist ethics, in which (as I remember) the goal is to get away from the self, or Christian ethics, where an action’s goodness is determined by its accordance with God’s will. And, as deBoer said, in the self-ownership ethic “you become your only judge.” The moral vacuousness of self-esteem is never escaped; it might even be deepened.

How are we to find a way out of this trap? I propose an alternative to both self-esteem and self-ownership: self-respect. Oxford Dictionary defines self-respect as “Pride and confidence in oneself; a feeling that one is behaving with honor and dignity.” We can disregard the first part of the definition, as it is too close to self-esteem. It is the second part that interests me. I want to modify it a bit; I define self-respect as a posture of honor and dignity taken towards oneself.

Now, the emphasis of self-respect is on the posture taken, not on the being of the agent. Of course, the agent’s being can be the grounds of self-respect—I respect myself because I am a rational creature, a child of God, an emanation of divine fire, etc. But the important thing about self-respect is actions. Self-respect is a motivator for actions, specifically morally right actions. A person with self-respect does not feel shame because she is doing the right thing. And self-respect is inherently moral, just as self-ownership is inherently amoral. It makes no sense to say “Have some self-respect and rob a convenience store” or “She has no self-respect; she feeds the poor.”

The action-oriented character of self-respect makes a tangible difference in its application to life. Self-esteem is goal-oriented. If the self-confidence program fails to “win friends and influence people,” if the person suffers a terrible setback, then their self-esteem crumbles. But self-respect is not based on goals. It is based on virtuous actions, whose worth isn’t based on their outcomes. Scott Adams (the Dilbert guy) compares “systems people” with “goals people.” He says, “Goal-oriented people exist in a state of continuous pre-success failure at best, and permanent failure at worst if things never work out. Systems people succeed every time they apply their systems, in the sense that they did what they intended to do” (How to Fail at Almost Everything and Still Win Big). Self-respect means being a systems person in regards to virtuous actions. And in this regard, self-respect works at its best in the face of failure, suffering, and hardship. The person in the prison camp or the front lines won’t be served by thinking of how they are “a child of the universe.” But they can get by if they consistently apply an ethic of self-respect, lived out in continual virtuous actions (in those cases, the virtues of courage and fortitude).

One might reply, “Hey, self-respect is nothing more than just doing the right thing!” And in a sense it is. But it relies on our inner sense of shame and dignity as a motivator for right actions. There is a feedback loop at work here: we do virtuous actions because we do not want to feel shame, but we also feel honorable because the things we have done are honorable. Nevertheless, the focus of self-respect is not on the person’s being, but on the outside moral principles which the person is endeavoring to follow. In this way, self-respect provides a better alternative to both self-esteem and self-ownership.

Plus, you get a giant flaming sword, and you get to kill Jason Schwartzman and date Mary Elizabeth Winstead.

Four Non-Religious Arguments Against Homosexuality

March 14, 2014

The debate over gay marriage continues to rage across America. Opponents say that legalizing it strikes at the foundation of marriage traditionally understood. Supporters say that it is extending freedoms to everyone. If asked for my position on the issue, my answer is “I have no position.” There hasn’t been an argument or piece of evidence that has convinced me of one position, so I’m content to withhold judgment until I have a better grasp of the issue.

Nevertheless, the continuing debate has interested me in a different question–is it possible to argue for the moral wrong of homosexual behavior from a non-religious basis? Most Muslims, Christians and Jews would consider homosexual behavior immoral, but their basis comes from supernatural revelation. Could someone come to the conclusion through pure reason? In this article, I present and evaluate four arguments against homosexual behavior from a non-religious standpoint. I don’t necessarily endorse any of the arguments–I’m merely presenting them for consideration.

A few explanatory notes before I begin: These arguments aren’t taking a particular political stand on the issue, i.e. arguing that homosexual behavior should be illegal. I believe in toleration within a pluralist political framework, and that is the context in which these arguments should be taken. But, as Michael Novak points out, tolerance does not imply being “gung-ho” for someone’s beliefs or choices. Also, I will be using the term “homosexual” to refer primarily to homosexual behavior regardless of “sexual orientation.” I’m aware that the term is no longer politically correct; however, the purpose of the essay is to sidestep the issues of “sexual identity.” The terms “gay” and “straight” are relatively recent inventions, and although it’s impossible to totally escape one’s context, we can at least try to look at things from a more universal perspective. “LGBTQA” doesn’t accurately refer to what I’m describing, and it’s just harder to read.

Argument 1: The Argument from Religious Authority

  1. Most traditional religions (Islam, Christianity, Judaism, Buddhism) condemn homosexual behavior as immoral.
  2.  Therefore, there’s a strong chance that it is, in fact, immoral.

Evaluation: Arguments from authority are the weakest arguments. The reason is that no authority is infallible. As Bertrand Russell said, “Aristotle could have avoided the mistake of thinking that women have fewer teeth than men, by the simple device of asking Mrs. Aristotle to keep her mouth open while he counted.” Even the greatest thinkers make goofs sometimes. Therefore, it won’t be enough to appeal to the force of religious authority or traditional culture to close this question. A person could simply retort that the religions were wrong on this point, or wrong altogether, or that religions have countenanced so many bad things that it is foolish to look to them for moral guidance. This not only destroys the force of the argument, but also shifts the discussion away from its original focus to the question of whether “religion” is good for the world. The argument, however, can be made stronger by reducing the scope of its claims, like so:

  1. Most traditional religions (Islam, Christianity, Judaism, Buddhism) condemn homosexual behavior as immoral.
  2. Therefore, since so many wise men and women considered this in the past, we should at least be open to considering their views, even if ultimately we reject them.

This formulation of the argument is much more powerful. Instead of claiming a definite answer that it can’t deliver, it acts as an invitation to conversation. If tradition is “the democracy of the dead,” as Chesterton claimed, then this argument attempts to give the dead a fair shake. But it can’t settle the argument; it can only open the door to further considerations.

Argument 2: The Argument from Evolution

  1. Evolution is true.
  2. Humans are natural creatures, i.e. creatures who evolved through natural selection.
  3. Natural selection has “designed” humans for heterosexual relationships, in that a mammal, if it is healthy, has a natural inclination to breed, which necessarily involves a heterosexual relationship.
  4. Therefore, there is something unnatural about homosexual behavior.

Evaluation: In this form, the argument is not a syllogism; rather, it is a loosely organized set of principles, the upshot of which is that evolutionary biology should make us skeptical of homosexual behavior. After all, if the “purpose” of an animal is to breed and make more animals, then there would seem to be something biologically wrong with an animal that isn’t following its instinct.

The argument seems strong initially, but trouble arises upon further reflection. First, it doesn’t give us any moral guidance whatsoever. It can’t make the claim that there is something “wrong” with homosexual behavior, merely that there is something “unnatural” about it. And there are many things that we may consider “unnatural” that we don’t consider wrong. A person born without a limb is “unnatural;” an inclination to collect ballpoint pens is “unnatural;” stuttering is “unnatural.” We don’t consider any of these things to be morally wrong, and rightly so. Second, the term “unnatural” is itself suspect. The idea of learning precepts from nature is a shaky one, because nature itself is so shaky and changeable. If we find other animals that engage in homosexual behavior (I’ve heard there are such ones), does that make it “natural?” If we find that homosexual inclinations have their roots only in genetic predispositions, does that make them “natural?” To which “nature” are we appealing as a standard? Finally, the best refutation to this argument may be to scream, Johnny Rotten-style, “I’m not an animal!” We don’t normally treat people as animals, even if we tenaciously cling to the most naturalistic of evolutionary theories. We definitely think it wrong to treat heterosexual romantic relationships as mere animal mating rituals (this refusal is the basis of all romantic comedies). Why should we suddenly adopt this view towards homosexual relationships?

Ultimately, I think this argument fails. It may work out as a defense of heteronormativity, but heteronormativity already has a strong defense, viz. the very small number of people who engage in any homosexual behavior and the even smaller number of people whose “sexual identity” is defined by such behavior or relationships. And even if one affirms heteronormativity, one of the key tenets of the Western idea of freedom is respecting people who fall outside the norm, provided they are not harming others.

Argument 3: The sort-of Aristotelian Argument

  1. The final cause of, um, reproductive organs is reproduction.
  2. Homosexual behavior uses these contrary to the final cause.
  3. Therefore, homosexual behavior is, in some measure, wrong.

Evaluation: This argument is similar to the argument from evolution, but with a more metaphysical cast. It relies on a vaguely Aristotelian realist picture of the universe, in which teleology plays a part in things. A flower’s teleology is to bloom (I’m sure scientists will say that’s not the whole story, but let’s keep it simple). The teleology of the reproductive organs is, well, procreation. This is, I believe, very similar to the argument that Sherif Girgis presents against gay marriage.

This argument is more effective than the argument from evolution, simply because this view of the universe is probably more widespread than evolutionary naturalism (and isn’t itself incompatible with evolution). It, however, has a host of problems as well. Setting aside the straight-up rejection of Aristotelian realism, the reliance on final causes is problematic. For starters, it may be that we have the final cause wrong, or that there are multiple goals that humans could strive for. Most people wouldn’t want to say that creating children is the only reason to engage in amorous dalliance, nor is it the most pressing. This argument also has the effect of condemning–how can I say this delicately–all sexual practices that don’t have the end of creating children in mind, or that won’t result in this end. And what about heterosexual couples who aren’t able to have children? One could argue that they are the exception to the rule, but the entire issue is about exceptions to the rule, and whether those exceptions are bad or neutral. It may be that heterosexual couples who can’t have children are still fulfilling the final cause, but are being hindered by an outside force. But that just brings us back to square one.

Argument 4: The Argument from Metaphysics

  1. All of human life is a search for the Other.
  2. This search for the Other is an imperative.
  3. This search is and should be expressed in romantic relationships.
  4. Man and Woman are, on some level, each other’s Other.
  5. Homosexual relationships are same-sex; each term of the relationship does not seek his or her “Other,” but only a mirror of him/herself.
  6. Therefore, there is something wrong with homosexual relationships as such.

Evaluation: This is probably the strongest argument of the bunch. Its strength is that, although it relies on a metaphysical principle, it doesn’t reference any specific religion, but rather a “universal” requirement. I discovered the argument in Erik von Kuehnelt-Leddihn’s Leftism Revisited. It’s also the argument used in Michael Novak’s article “The Double Hell of the Homosexual” (read before judging). The idea is that there is some element of a homosexual relationship that “misses the mark” of what humans are supposed to do. And it goes beyond the sexual aspect to cast suspicion upon the entire relationship itself.

The initial objection to this would be to say “devil take the metaphysics.” This kind of metaphysical view is going to be unpopular in many circles, especially among evolutionary naturalists. One could also propose a metaphysical system of a different kind. This metaphysical system itself also seems to privilege marriage in a bad way. Could a person who is celibate not “search for the Other”? If you open the door to celibates, it may be a hop, skip, and a jump to gays, lesbians, and committed bisexuals. My biggest worry about this argument is that, if developed, it might just end up being a reiteration of a Christian position on the issue. That’s not a problem in itself–it may be a good explanation of why Christians consider homosexual behavior to be wrong, beyond “The Bible says so.” But then it is no longer a non-religious argument, and it relies on a supernatural authority which not everyone accepts.

Conclusion

Ultimately, I don’t think any one of these arguments gets close to giving us a settled answer, with the possible exception of #4. #1 is merely a conversation starter. #2 and #3 fall apart under closer investigation, while #4, which seems like the tightest argument of the bunch, can’t seem to get away from religious assumptions, or at least religious trappings. The last three arguments may present a good case for heteronormativity, but that’s about it. I can’t see myself using any of the arguments in a discussion, except perhaps a modified version of #4 (and what that modification would entail, I cannot say offhand). Still, I think it’s good to consider these arguments. It’s helpful for religious people (most of whom are the opponents of “gay” marriage, etc.) to think about whether their positions can be grounded in “natural reason.” And it’s helpful for supporters to see that their “enlightened” positions aren’t eternally self-evident truths, but are grounded in a specific time and place, and open to objections (and that holds even if their positions are true). I hope that this has added something for both sides of the discussion.

Please keep all comments civil, grace-filled, and respectful of the personhood of others. Abusive comments will not be tolerated.


In Defense of Individualism

February 18, 2014

It’s been a while, hasn’t it?

In the realms of academia, and especially Christian academia, where I make my humble abode, it has become fashionable to rail against “individualism.” These sorts of academic bogeys come and go–for a while it was fashionable to be against “the Enlightenment.” It’s tempting to think of these things as merely fads, the academic equivalent of parachute pants or the Bay City Rollers. But academia has a huge influence upon the rest of “the culture,” and it is important to know what is going on.

The current challenge to “individualism” often comes dressed in a Christian cloak; sometimes it’s a front for liberation theology, sometimes it isn’t. The individualism decried is usually “American” individualism (as opposed to, I suppose, Norwegian individualism or Azerbaijani individualism, which are completely innocuous). Sometimes the thing decried is “rugged” individualism. And usually this is connected to an opposition to philosophical conservatism and (especially) to capitalism. The range of attitudes towards individualism is varied, but the general consensus is that individualism is 1) a Bad Thing, 2) characteristic of the “right-wing” position, and 3) (and this point is more often implied than stated) thus makes the “right-wing” position a Bad Thing as well. In this view, Conservatism is dominated by a selfish, dog-eat-dog ethic that privatizes everything (especially faith) and encourages fragmentation and unhealthy competition. On the other hand, Leftism is presented as being concerned with “community” and care for others or the Other.

Nevertheless, as a conservative I find this view to be a poor caricature of the conservative project. For starters, the picture of “rugged individualism” that many people say or imply is a feature of conservatism is completely off-base. I can think of only three vaguely conservative figures who would hold to anything like “rugged individualism”: Louis L’Amour, Ayn Rand, and P.J. O’Rourke. L’Amour’s novels are basically advertisements about how awesome Louis L’Amour is, and there is a sense in which his heroes are “rugged individualists” who model traits such as self-reliance and independence. L’Amour’s idea of the hero, however, is not the Nietzschean superman who towers above everyone, but the ordinary person who uses his self-reliance and independence to protect the weaker people within the community. In Sackett (the only L’Amour novel I’ve read all the way through), the title character is struck by a passage in Montesquieu about how the strong members of a community have a responsibility to protect the weaker members of the community against bad guys. This is hardly “rugged individualism.” Ayn Rand’s idea of individualism is darker and more selfish, but we have to remember that it is a knee-jerk reaction to the Communist environment that Rand grew up in and escaped. Additionally, Ayn Rand’s ideas are rejected by a large number of conservatives–John Robbins (who was about as far from Leftism as you could get) spent a good deal of his writing refuting Rand’s ideas. Benjamin Wiker, in his book 10 Books Every Conservative Must Read, labels Ayn Rand as an “impostor.” Whittaker Chambers, one of the founders of the conservative movement in the ’40s and ’50s, was opposed to Rand’s ideas. And in his newest book, P. J. O’Rourke calls her a “loony old b***.” And though the libertarian O’Rourke often takes an exaggerated pose of individualism and selfishness, he does it as a joke. I sometimes feel like there are folks on the Left whose concept of conservatism is taken from reading O’Rourke’s punchlines as if they were serious statements of belief.

A Leftist might counter with the example of, say, their redneck uncle who is always watching Fox News and talking about how people should “pull themselves up by their own bootstraps.” But it’s clearly not fair to pit the best of one side against the worst of the other side; the fact that Erik von Kuehnelt-Leddihn is a more intellectually scintillating writer than Al Franken does not prove that liberals are idiots. Another tack might be to say that opposition to Affirmative Action or welfare programs is a sign of conservative individualism. Perhaps in some cases, like the aforementioned redneck uncle, it is. But the conservative “opposition” to these programs is by no means monolithic (I, for one, am not opposed in principle to either one), nor is it always based on the idea that a person should “pull themselves up by their own bootstraps.”

What, then, do philosophical conservatives mean when they talk about “individualism”? The crux of individualism is individual rights. One of the foundational ideas that unites all conservatives is the idea of innate human rights. Some people (I’m lookin’ at you, James K.A. Smith) call these “negative” rights, which makes them sound like something bad. But the essence of these rights is that they are innate, given to you by God or nature. No one can “give” you freedom of speech, or conscience, or respect. They can only take it away. Of course, there is debate about how far these rights extend, and conservatism is by no means of one opinion in that regard. But the point of the matter is that these rights have a common grounding, viz. the fact that all humans come from their mother’s womb and all die.

These individual rights are the basis of all other rights. Women’s rights, minority rights, gay rights, etc., are all grounded in the fact that the people agitating for those rights deserve them because of their status as individuals. In a strongly anti-individualist climate, the rights of some people are going to be privileged over the rights of others. In the Norse world, the rights of the invading Viking pillagers take precedence over those of the screaming villagers. In the strongly Communist country, the rights of the people within the Party take precedence over the rights of the dissidents. In the pre-Civil Rights South, the rights of the whites took precedence over the rights of the “Negroes.” Whenever rights are grounded in membership in a certain group, those who don’t belong in the group (the peasants/women/blacks/gays) are naturally going to be given less dignity than the ones within the group (what Leftists might call “demonizing the Other”). But when the rights are based on the dignity of the individual, regardless of class, sex, race, or sexual orientation, then all persons have a solid basis from which to claim dignity. This doesn’t mean that simply affirming the rights of the individual means that the individual will actually get those rights–it took the US 200 years to get to the point where we were giving blacks and women the rights that they deserved. Nevertheless, the dignity and worth of the individual is the basis of those rights.

“Individualism” is also inherently opposed to tyranny. Just today I read in Christopher Hitchens’ Letters to a Young Contrarian, “Milton Friedman might be wrong about sweatshops and free-market opportunities, but he was not wrong to state that one man plus a correct opinion outvotes a majority.” I don’t think that “individualism” is the only or greatest defense against tyranny–in fact, I think you need strong communal structures to avoid tyranny. Tyranny, however, generally relies on a collective consciousness. Massive crimes are not rationalized by an appeal to the naked self-interest of one person, but to things like “The Aryan fatherland” or “the purity of our religion” or “a perfect world.” Whenever revolutionaries have had a vision of the “ideal” world, the individuals are usually left out. It’s hard to imagine a tyranny based on individual rights–though it’s possible to imagine one that uses the language of individual rights as a smokescreen.

And let us not forget that it is individuals who strike the greatest blows against tyranny. Leftists often like to point out things like “systemic evil,” and use this idea to discredit individualism. But “systemic evil” is nothing more than a way of describing the famous statement that Edmund Burke never said: “evil will only triumph when good men do nothing.” The very language of “systemic evil” suggests that there is some sort of “system” that exists independently of actual people, and tends to shift the blame from the actual people involved to this ethereal “system.” In reality, “systemic evils” are often struck down by individuals. Solzhenitsyn struck a powerful blow against the Soviet Union almost single-handedly. Although the Civil Rights Movement was not exclusively the work of Martin Luther King, Jr., it is doubtful that it would have achieved the gains it did had not Dr. King been a visible, individual figure at its forefront. The two most influential men in the ancient world, Socrates and Jesus Christ, were both executed because they refused to disavow their individual beliefs in the face of collective opposition.

So is individualism a good thing? It depends on what the word means–it can certainly be used as a disguise for selfishness and apathy. But I still contend that the kind of individualism that I have outlined in this essay is the basis, or one of the bases, of a good and just society.

Best Music of 2013

January 1, 2014

I haven’t written a blog post all year!

Every year I do my wrap-up of the best music of the year. Due to my cash-strapped condition, I don’t have the money to buy all the new CDs that come out in the year, so this is a list of the best music that I listened to this year, not the best music that came out this year. This was an interesting year for my music listening. I didn’t get to listen to as much music as I wanted to (of course, I never do). This year’s list of best albums contains two hip-hop offerings and a punk classic, while the list of best songs ranges from a Beatles cover to a blend of classical and rock. Enjoy!

[Note: Some of these songs/albums contain profanity, etc. For those of you who are offended by such stuff, use your own good discretion when listening to these things.]

The Ten Best Albums of the Year

A picture of the Smiths

  1. The Smiths’ Best…vol. 1 by the Smiths: I know that it’s not cool to like greatest hits compilations, but let’s face it: sometimes it’s nice to get a CD with all of the major songs by an artist. This collection introduced me to the melancholy British band with the cult fandom (they were J.K. Rowling’s favorite band in high school). Morrissey’s eccentric, sad lyrics form the perfect counterpoint to Johnny Marr’s hovering guitar. This album brought me back to high school, in the best possible way.
  2. Boxer by The National: The first album by a newer band that I’ve bought on vinyl. Matt Berninger has more testosterone than the entire indie music scene, and with his voice, he could make the phone book sound like a heartbreaking love story. Fortunately, he’s a great lyricist as well, tending to minimalist, koan-like songs about the lives of quiet desperation that young city dwellers live. The rest of the band does beautifully as well. It will make you depressed–but in a good way.
  3. Bon Iver by Bon Iver: I listened to this record all the way through for the first time this year. I used to be skeptical of Justin Vernon, simply because he was so popular. This album won me over. It has its weak spots–the auto-tune on “Beth/Rest” sounds unbearably pretentious. But the majesty of compositions like “Perth” and “Holocene” is enough to justify buying the album.
  4. Trouble Will Find Me by The National: The newest album from The National. Though it is not quite as consistent as Boxer, it is still a thoroughly enjoyable record. It’s much more muted and slow, and drags in the latter half. Still, it’s worth it for gems like “I Should Live in Salt” and “Sea of Love.”
  5. The Animal Years by Josh Ritter: I’ve had this album for a while, but I never got around to listening to it until this summer. Josh Ritter is a true successor to Bob Dylan, nasally voice and all. I kept this album on heavy rotation as I drove the streets of Texarkana in my Ford Explorer this summer.
  6. The College Dropout by Kanye West: There are many things to hate about Kanye West: He’s pretentious, he’s married to Kim Kardashian, and the way he refers to women in his songs is often disgusting and wrong. But make no mistake: the guy is talented. Before he was dating reality stars or interrupting Taylor Swift, he made this album, a record focused more on storytelling than boasting. I usually only listen to hip-hop one day out of the month–I could listen to this record any day.
  7. Never Mind The B****cks, Here’s The Sex Pistols by The Sex Pistols: I can’t recommend this band or this record. The Sex Pistols were boorish, vulgar, and nihilistic. But their influence on rock and roll can’t be denied. This was probably the fastest, loudest, most punk thing to happen in 1977. Anthems like “God Save The Queen” propelled the music world to new heights of aggression. Love them or hate them, the world would be very different without the Sex Pistols.
  8. Youth by Matisyahu: Matisyahu is a Hasidic Jew who performs hip-hop/reggae music about his faith. That may make him seem like a novelty act, but he has the skills to keep himself relevant. His songs are great chill tunes. Another CD that I enjoyed very much this summer.
  9. All The Times We Had by Ivan and Alyosha: Ever since hearing their song “Glorify,” I have become hooked on Ivan and Alyosha. They have all the ingredients of a good folk band: tight harmonies, catchy songs, and meaningful lyrics. Their music has a spiritual tone to it, no doubt due to the influence of The Brothers Karamazov (from which the band takes its name). The first album that I paid for on Noisetrade.
  10. Two Men With The Blues by Willie Nelson and Wynton Marsalis: Two American music legends, together again for the first time. It’s impossible to have a bad Willie album. Pairing him with one of the foremost jazz musicians of our time is a match made in heaven. The album grooves, in a very 1930s way.

Eleven Best Songs of the Year

Another picture of the Smiths. Morrissey will be flying down to Arkansas soon to pick up his awards.

  1. “Girlfriend In A Coma” by The Smiths: I tried not to have any overlap between the best songs and best albums, but it would be criminal for me not to give this song the number one slot. A heartbreaking ballad that captures the tension felt by the narrator, an overbearing and possibly abusive boyfriend torn over the fact that his girlfriend is in a coma. The chorus has one of the most beautiful string parts in all of pop music.
  2. “Keeper” by Shovels and Rope: One of the most passionate performances you’ll ever hear, from the husband-and-wife alt-country duo who mix redneck and hipster tastes into one of the most delicious sounds out there.
  3. “Don’t Let Me Down” by Mason Jar Music: This is kind of cheating, since I listened to this song yesterday. A Beatles cover that does the unexpected–improves upon the original. Instead of the simmering, barely contained rock-and-roll of the original, Mason Jar Music gives this song a slinky R&B-meets-indie vibe.
  4. “Virginia” by The Deep Dark Woods: A haunting folk song about a girl and being lonely. Of course, that’s what 50% of folk songs are about, but this one is really, really good.
  5. “Heavy Bells” by J. Roddy Walston and The Business: This song doesn’t sound anything like the rest of J. Roddy’s catalogue, which is straight-up boogie-woogie classic rock revival. Instead, this punkish tune is the fiercest song of the year, with Mr. Roddy wailing and hollering unintelligible lyrics over The Business’ crushing guitars. Rock and Roll!
  6. “Horses” by Sean Rowe: A really weird and dark song by the deep-voiced troubadour Sean Rowe. If Clint Eastwood had ever starred in a surrealist spaghetti western, this would have been the title track.
  7. “Stay Young” by Okkervil River: The best 80s song that didn’t come out in the 80s, right down to the cheesy synths and saxophones. Anyone over the age of 40 should be disappointed that this song wasn’t played at their senior prom.
  8. “Two Hearts” by Paper Route: The pop song–three-and-a-half minutes, simple chords, catchy melodies–has fallen on hard times as of late. It’s hard to think that the same form that gave us, say, “Let It Be,” has now given us such forgettable piffle as “Can’t Stop Won’t Stop” and “Applause” (More Miley Cyrus-bashing below). This cut may not quite redeem the pop song, but it’s slick and well-polished without being smarmy or dumbed-down.
  9. “Idol” by Smith Westerns: The Chicago band has a talent for shimmery, Beatles-esque songs driven by falsetto vocals. This is one of their best.
  10. “Trophy Sixty-One” by England in 1819: The Sigur Ros of the flyover states. The Louisiana band, made up of two brothers and their dad, plays music with the reverb-heavy layered sound that has become quite popular recently. Post-Rock-and-Roll!
  11. “Vale” by Midlake: Although I wasn’t particularly impressed by Midlake’s newest album, this track shows their blend of classical and rock at its best. Both styles are blended without either one overshadowing the other.

Best Concert: The Avett Brothers: This was a very good year for concert-going, and I went to some fun shows: blink-182, J. Roddy Walston and the Business, Derek Webb, Veil of Maya. However, the best concert of the year goes to The Avett Brothers. A show by the Avett Brothers is filled with a wide musical variety–they can rock out on one song, and croon bluegrass on the next. Everything that you love from their albums you will find at their concerts. Plus they just have a lot of good songs.

Scott has an epic beard

Obligatory Lorde Award: Following in the footsteps of Mumford and Sons and Of Monsters and Men, Lorde has appealed to both the alternative and mainstream music worlds. Few non-hip-hop artists are able to have top 40 hits and make the best-of lists of alternative music websites. It seemed like every music best-of this year gave an award to Lorde’s Pure Heroine album. There’s plenty not to like about Lorde: Her young age seems like a gimmick, her album is repetitive, and that “Royals” song gets on my nerves. But let’s give credit where credit is due. Lorde occasionally comes up with some of the most effective parodies of hip-hop culture. Her song “Team” is flat-out awesome. And while she may not be the best role model for young girls, at least she’s not dirty dancing with Robin Thicke at the VMAs.

“Hey! I’m a kid! I write my own songs! Give me an award already!”

Most Interesting New Artist of the Year: Bars of Gold. Five dads from Detroit playing foot-stomping, floor-punching rock-and-roll. Guys with day jobs aren’t supposed to sound this good.


Should Christians Play Video Games?

December 29, 2013

This is the first in a new series called “Should Christians…” Like most of my other series, this one will go on until I lose interest in it and start chasing another ignis fatuus across the swamp of my mind. The purpose of this series is to look at a few leisure activities from a Christian perspective. The question I am asking is not “are Christians allowed to do these things?” I believe that the Bible condemns and forbids sin; we should not be legalists and increase the scope of sin to include anything we don’t like. At the same time, however, Christians should use wisdom in their entertainment choices. “All things are lawful,” says the Apostle, “but not all things are helpful.” Previous generations have erred on the side of legalism; my generation tends to go to the other extreme. On the other hand, many critiques of different leisure activities are written by (older) people who are unfamiliar with or unsympathetic to the activities in question, and whose criticisms are therefore off-base. I hope that in this series I can be a “friendly critic” of the activities that I am looking at.

Don't Mess With Nick

I used to be a gamer. For a while I even collected video games. That began to change in eighth grade, when Mr. Deaton assigned me so much math homework that I no longer had time to play video games. My gaming habit slowed down until I couldn’t remember the last time I had picked up a controller. Then, during this Christmas break, I decided to do a little gaming. I downloaded the SNES classic Chrono Trigger (I don’t have any money for new games), and I’ve been playing through it for the last five days or so.

First, I’d like to dispel a myth. Every pop culture medium has its group of fans who insist that “the old stuff is better”–“The Andy Griffith Show is better than all that new trash on television,” “I wish they made music like they did in the 70s, before all this rap that sounds like noise.” I assume that there are video game fans who think that video games have gotten worse since the “good old days,” that “the classics” are the best, that the industry lost its soul in 1999 and never recovered. They are all wrong. They only believe this because they played the old games when they were kids. The art of the video game has improved since Chrono Trigger. Graphics have gone from looking like a bunch of dots painted on a screen to near photorealism. Controllers have evolved from joysticks to Wiimotes. Chrono Trigger pioneered the idea of giving the player multiple choices that affect the story (it’s basically a glorified choose-your-own-adventure book); now you have to look hard to find a game where your choices don’t affect the story. The level of customization and detail in any given game that came out in the last year makes Chrono Trigger look like a product of the dark ages. The advances made in video games in the last 30-40 years are the equivalent of going from flip-books to Avatar. All around, video games are more detailed, more engrossing, and flat-out cooler than they used to be.

They are also less cool than they used to be–at least, I find them that way. The ideal age to begin gaming is between eight and twelve (if you’re a boy). At that point, you have plenty of time on your hands, you haven’t discovered girls yet, and the phrase “Daxtron whirled his flaming sword and destroyed Borthrogon the ice-demon” sounds incredible (If this phrase still sounds incredible to you, seek therapy immediately). Video games are kind of like fantasy novels, except that instead of reading about Bilbo Baggins or Rand al’Thor doing cool things, you get to go and do them yourself. The age of wide-eyed innocence is the best age for playing video games.

If you no longer think this picture is cool in an un-ironic way, then you are too old for video games.

After this age, it ceases to be cool. One thing I noticed as I got older is that video games became more and more repetitive. I realized that no matter what game I played, I was still pressing the same buttons. After the allure of magical realms and huge guns began to wear off, I got tired of mindless button-mashing and long sidequests. I’ve been using my Chrono Trigger sessions as an excuse to listen to podcasts. Another thing I noticed was the massive amount of time that playing video games takes. If you’re not careful, you can lose hours playing a game and not even notice. I can hear the voices now: “But you can do the same thing with a book.” Yes, but when I finish a marathon reading session, I feel good about myself, like I’ve actually accomplished something. Whenever I finish gaming for five hours, I feel like I’ve wasted a lot of time.

One tangible reason that I feel better about reading is that I get more out of it. Sure, there are all the great classics that I could be reading: Plato’s Republic, Oscar Wilde’s The Picture of Dorian Gray, Tim Allen’s Don’t Stand Too Close to a Naked Man. But man cannot live on hifalutin’ literature alone, and if he could, he’d be an insufferable snob. I’m not against “mindless” entertainment; it’s just that I think books are better for you even if they aren’t quite Shakespeare and Faulkner. Professor and jovial old man James V. Schall writes,

I remember [Rudolf Allers] saying in class one day that we should always be reading novels, even bad novels; for in their particularity, we will always find something, some incident, some character, some chance insight, that will teach us something we could have learned nowhere else. (A Student’s Guide to Liberal Learning 47).

The “mindless” novels that I read (Wheel of Time, The Dresden Files, Dune) still have an individuality about them that I don’t find with video games. I find it hard to think about living life without having known Matrim Cauthon or Harry Dresden; they are almost like real people, friends whom you want to introduce to your other friends. Video games, by their very nature, tend to put character development on the backburner and focus on killing monsters or blowing up giant robots. Fifty million Mario games later, we still have no idea what the guy is like.

For all we know, he could be a communist spy. After all, he does wear red and support the working man.

Character development is half of a good story, and most video games don’t have good stories. Don’t take my word for it; Damien Walter says,

“there doesn’t seem to be a single decent writer working in video games. Whenever I say this, people shout things like Mass Effect and Bioshock at me and I play them and they are about as well-written as an episode of Star Trek. Which isn’t awful, but neither is it great. It’s just functional.”

That’s a pretty generous assessment. Most video game storylines are somewhere in between a cheesy anime and Star Wars Episode I. In fact, while I was playing Chrono Trigger, I thought that I would enjoy it a lot more if it actually was an anime. There are some cool plot ideas (time-travel, sword-wielding talking frogs), but the character development is nonexistent and the dialogue, well….

Zombor

So much for the aesthetic aspect of gaming. Looking at video games qua games presents another set of problems. The lack of a human element in gaming is problematic, because it takes the elements of pain and surprise out of the game. The possibility of failure is what makes something an adventure. G.K. Chesterton said, “When Thomas Aquinas asserted the spiritual liberty of man, he created all the bad novels in the circulating libraries” (Heretics 100). If a team loses the big game, they feel a sense of pain. If I’m wrassling with my brother and he throws me to the ground, I feel a sense of pain. In video games, failure has no teeth. Sure, you may have to go back to the beginning of a level if you lose, but you don’t have to see your friend Bob do his ridiculous victory dance like he does whenever he beats you at ping-pong.

Along with failure, video games lack the element of surprise. To put it simply, you can’t cheat. Because you’re limited by the code that the games are written in, there’s no possibility of rule-breaking, no arguing over the rules, no individual style of playing. There are no Babe Ruths or Muhammad Alis in the world of gaming. In a game of cards or kickball, there’s always the possibility that a player will do something unexpected or even illegal, and this gives everyone a sense of adventure. In a video game, the worst you can do is screenhack.

Finally, gaming culture is just plain weird and creepy. There’s something a little off about grown men who are obsessed with playing games. I don’t think there is anything wrong with playing occasionally, but there are legions of pasty, overweight men (and they are mostly men) who have frittered their lives away gaining “powers” instead of doing something constructive. Video games have grown so popular as to be inescapable, but I would still shy away from the hardcore gaming crowd. As Bruce Springsteen sings, “It’s a town full of losers / I’m pulling out of here to win.” Additionally, video games tend to cater to male fantasies in an unhealthy way. Although the violence in video games is often cited and often exaggerated, it may be the least problematic aspect of gaming. The way that video games portray heroism is juvenile–while the hero in a movie or fantasy novel is likely to be a plucky underdog (Peter Parker, Harry Potter, Harry Dresden), the hero of a game is more likely to be an invincible space marine with bulging muscles and a name like “Marcus Fenix.” And if any women show up in a game, they’re either bikini-clad, triple-D-sized sex bombs (American games) or personality-challenged, plaid-skirt-wearing schoolgirls (Japanese games) or some disturbing combination of the two. While other entertainment industries give their token nod to feminism, the gaming industry is apparently still stuck in the hormone-crazed age of Conan the Barbarian.

Sometimes there are aliens, too.

In conclusion, I find video games lacking on both aesthetic and entertainment grounds. Yes, there’s some fun to be had in playing them, but on the whole they fall short: they fail as art, they aren’t as fun as other activities, and the culture surrounding them is kind of creepy. Should Christians play video games? There are better ways to spend time.

In the next installment, I ask “Should Christians listen to…HEAVY METAL!?”

Current Listening: Album of the Year by The Good Life