Do the homework and come to class. [Woody Allen has said that “80 percent of success is showing up.” But be sure to show up for your homework, too; there will be approximately two hours of homework for every hour of class, i.e. five hours of homework a week. Plan for that.]
Stay in contact. Visit Dr. Miller during his office hours. Use email and the phone number. [“If you come to my office hours, then I will help you.” “If you call me after nine pm, then I’ll be tired and irritable.”]
Always ask a question. [Take responsibility for your education. “Fake it until you make it.” We’re not on television, and education is not a passive enterprise.]
I had the pleasure and discomfort of attending parts of the Reason Rally on Saturday, a march on Washington by atheists, agnostics, and heathens. It was cold, rainy, and frequently quite boring. I mostly went to see Bad Religion, but I enjoyed Eddie Izzard’s routine and Cristina Rad, who responds to theists this way: “You can keep your personal relationship with Jesus Christ. I have a personal relationship with reality.”
But I also found myself disappointed by how much it sounded like a meeting of milquetoast liberalism, and wondering, again, why atheism needs to be a social movement.
It’s popular to quote the study showing that atheists are distrusted about as much as rapists. But this study doesn’t quite pass the smell test: the average atheist is a well-educated white male with plenty of status and more than his fair share of trust. Asking survey respondents about “atheists” in the abstract, without context, produces ungrounded evaluations. My students and colleagues don’t treat me like they’d treat a rapist, even though they know I’m an atheist. They treat me like a college professor.
Of course, I had a harder time as an atheist teen, and indeed we see a steady stream of outrageous news about the mistreatment of young atheists as a part of the overall attention to bullying. I suspect, however, that such young atheists face intersecting oppressions as women or homosexuals, or are partly being punished for otherwise transgressing gender norms. First and foremost, an atheist teen will tend to be seen as effeminate or tomboyish: too thoughtful for a man, too argumentative for a woman. So I’m not convinced that atheist teens as a group have it worse than gay and lesbian teens, even though those groups rate higher than atheists on “trust.” A gay teen atheist might disagree, but in a social setting where all difference is violently bullied, how can we be sure what’s cued the mistrust?
So why cast atheists as victims? Why the mobilization about “coming out of the religious closet”? Recent work by Robert Putnam and David Campbell suggests an answer:
[R]eligion’s influence on U.S. politics has hit a high-water mark, especially on the right. Yet at the same time, its role in Americans’ personal lives is ebbing. As religion and politics have become entangled, many Americans, especially younger ones, have pulled away from religion. And that correlation turns out to be causal, not coincidental.
By using religion to justify their politics, theologically conservative Republicans have conveyed the message to young liberals that they must reject religion in order to reject that politics. Putnam and Campbell trace much of the growth in atheism directly to the growth of politically partisan religion, which is partly why the millennial generation has taken up the cause with such force:
The best evidence indicates that this dramatic generational shift is primarily in reaction to the religious right. Politically moderate and progressive Americans have a general allergy to the mingling of religion and party politics. And millennials are even more sensitive to it, partly because many of them are liberal (especially on the touchstone issue of gay rights) and partly because they have only known a world in which religion and the right are intertwined. To them, “religion” means “Republican,” “intolerant,” and “homophobic.” Since those traits do not represent their views, they do not see themselves — or wish to be seen by their peers — as religious.
That’s why a lot of the talk at the Rally yesterday sounded like banal moderate liberalism: increasingly for this generation, that’s what it means to be an atheist. Once upon a time, God was being used on both sides of these arguments. But today, it’s hard for progressive theists to be heard and understood as both progressive and theists, and young people have decided that if they must choose between those two identities, they’d rather be progressive. If you’re in favor of gay marriage, and you look around the world and see that all the objections to gay marriage come from religion, you conclude that you have to chuck God. The same goes for environmentalism, feminism, and the Occupy movement: God has too often appeared publicly on the wrong sides of these debates, and it’s hurting the brand.
I know a lot of wonderful, caring theistic activists who are smart, committed, and reasonable. But as we’ve grown older, these theists have either grown more disillusioned with their faith or more disillusioned with their youthful activism. Clearly there was once a way to make those things compatible, and just as clearly something in the larger culture has changed, exposing an inconsistency in the psychic lives of individual citizens.
Theists are increasingly recognizing that the humanists were right: you can be Good without God; and worse, you can be Bad with God. When your co-religionists are Success-Theology, Federalist-Society, Dominionist-Ideology Social Conservatives, you’ve got to acknowledge that faith isn’t sufficient for like-mindedness. But once you decide that faith is irrelevant to the things you thought you cared about, neither necessary nor sufficient for commitment to a political cause or civic engagement with fellow citizens on matters of fundamental concern, where do you go from there? If you’re older, you make it work and ignore the inconsistencies. If you’re a young person, you don’t think you ought to have to stomach that kind of inconsistency. So you don’t:
Consider the growth in the number of people whom sociologists call “nones,” those who report no religious affiliation. Historically, this category made up a constant 5-7 percent of the American population, even during the 1960s, when religious attendance dropped. In the early 1990s, however, just as the God gap widened in politics, the percentage of nones began to shoot up. By the mid-1990s, nones made up 12 percent of the population. By 2011, they were 19 percent. In demographic terms, this shift was huge. To put the figures in context, in the two decades between the early 1970s and the early 1990s, the heyday of evangelicalism, the fraction of the population that was evangelical grew by only about five percentage points. The percentage of nones grew twice as much in the last two decades and is still climbing. Moreover, the rise is heavily concentrated among people under 30, the so-called millennial generation. To be sure, the young are always less religiously observant than their elders; people tend to become more religious when they get married, have children, and put down roots in a community (demographers call this the life-cycle effect). Yet 20-somethings in 2012 are much more likely to reject all religious affiliation than their parents and grandparents were when they were young — 33 percent today, compared with 12 percent in the 1970s.
One-third of all young people have rejected religion because it has been co-opted by the Republican Party. I’m not particularly excited about that, as it doesn’t seem to lead to the world I want, where religion doesn’t play an important role in politics. I don’t care enough about atheism to want people to join me in it, but I care enough about public reason to wish we could have more of the discussions that matter without bad biblical exegesis, Christianist dog whistles, and silly claims about the incommensurability of secular and religious reasons.
One-third of all young people have rejected religion because it has been co-opted by the Republican Party. I’m not particularly worried about that, but theists probably should be. So, theists: what are you going to do about it?
“I am not throwing away my shot” is just an awesomely perfect refrain: it refers to ‘reserving and throwing away’ the shot in a pistol duel: deliberately firing into the ground in order to make a merely symbolic gesture of courage. It was early American custom to fire until satisfaction: this could mean until one duelist was unable to continue, or until the mutual exchange of volleys had so spooked one of the parties that they acquiesced, usually through their second, to whatever half-hearted apology was offered. Death was very rarely the result: most opponents would be satisfied with whatever face-saving injury they managed to inflict or sustain in the first three volleys, especially because of the legal and social repercussions of committing a murder in a country that viewed dueling as a European extravagance.
More honor could be lost by stubbornly refusing to accept a negotiated settlement and thus killing a man than might have been at stake in the original insult. The desire to maintain decorum even in the midst of violence required participants to restrain their rage or bloody-minded vengefulness. Today we see a similar judgment in the opprobrium heaped upon those who ‘kick a man while he’s down.’ A defeat suffered with aplomb is better than a victory sullied by distasteful displays of man’s base instincts. However, Alexander Hamilton supposedly did “throw away his shot” in the duel with Aaron Burr:
I have resolved, if our interview is conducted in the usual manner, and it pleases God to give me the opportunity, to reserve and throw away my first fire, and I have thoughts even of reserving my second fire.
Thus Hamilton claimed that he would take at most one or two shots at Burr. His first shot was a deliberate miss. Since Burr’s responding shot killed Hamilton, we can’t know what Hamilton would have done in a second round. The noted traitor Burr is said to have responded to the allegation that Hamilton never intended to fire upon him with a laconic, “Contemptible, if true.”
In my last post, I noted that Jason Brennan’s published work strongly opposed disenfranchisement in the ordinary sense, and I claimed that Eric Schliesser had misrepresented Brennan’s words in deriving the opposite conclusion. Today, Schliesser supplied an unpublished paper in which Brennan offers an argument for experimentation with competency tests to disenfranchise incompetent voters.
Eric, please accept my apology.
Here are some telling highlights from Brennan’s paper:
In this paper, I argue that the practice of unrestricted, universal suffrage is unjust. Citizens have a right that any political power held over them should be exercised by competent people in a competent way. In realistic circumstances, universal suffrage violates this right. Since nearly all current democracies have universal suffrage, all current democracies are to that extent unjust.
Restricted suffrage is about as unjust as voting age laws. It creates a ruling relationship between different classes of citizens based on a distinction that all reasonable people can accept in the abstract, but about which in practice there will be reasonable disagreement. In contrast, universal suffrage is about as unjust as a policy of enforcing jury decisions no matter what, even when we have conclusive grounds for thinking the jurors were incompetent or made their decisions incompetently. Thus, universal suffrage appears to be more intrinsically unjust than restricted suffrage.
We do not know for sure whether voter examination systems would produce better or worse results than democracy with universal suffrage. However, as I have argued, such systems are less intrinsically unjust than democracies with universal suffrage. And there are good reasons to think they will produce better results than democracy with universal suffrage, though there are reasons to worry they will not. Since we are unsure of the consequences, but have reason to expect them to be positive, we might experiment with voter examination systems on a relatively small scale at first. For instance, perhaps it would be best if one state in the U.S. tried the system first. (We would want to start with a relatively non-corrupt state, such as New Hampshire, rather than a corrupt state, such as Rhode Island.) If the experiment succeeds, then the rules could be scaled up. Similarly, consider that a few hundred years ago, we had little experience with democracy. Some advocated democracy in part because they believed it would tend to produce better and more just outcomes than monarchy. Others worried that democracies would be even more corrupt, or would collapse into chaos. In light of their lack of experience, a democrat might reasonably have argued in favor of experimenting with democracy on a relatively small scale, and then scaling up if the experiment succeeded.
Unlike a call for abstention from voting, a call for restricted suffrage is certainly support for disenfranchisement. I intend to respond to this paper in depth when it is published, but for now I will say that Brennan ought not to look to Burkean conservatism for practical objections to such experiments, but rather to Hayekian liberalism or Arendtian republicanism (cf. Brennan’s comments on civic virtue). As always, the difference between micro and macro, small experiments and institutional redesigns, should not be overlooked. (No doubt Brennan is exploring territory similar to the prediction markets I discussed with Robin Hanson last year here and here.)
Prague would double in size. As he lay in state in the old Castle of the Bohemian kings above the city, a queue some miles long would spring up. Mourners would wait all day, and all night, to see his body for the last time. The day of the funeral would be a public holiday. Hundreds of thousands of people, dressed in black and clutching flowers, would be seen lining the route taken by the cortège on the way to his final resting place. Huge black banners would fly from every office; his photograph, draped in black, would crowd every shop and news-stand and public place. Shared feelings of embarrassment would hold words back. Half-buried or forgotten anxieties about death would collectively resurface; fantasies of personal immortality would temporarily weaken. Around the graveside a forest of microphones, tripods, cameras, pads and pens would suddenly spring up. Obituaries, many of them written long ago and updated several times already, would appear in all four corners of the earth. Millions of words would be uttered. Many hundreds of different and conflicting points would be made. The words of the dead man (as Auden said) would be modified in the guts of the living. It would be said that he was a good man, a great man, a hero of the century. Harry S. Truman’s remark that a statesman is a dead politician would be confirmed. Loud sounds of grinding axes would also be heard.
The source of Havel’s tragedy, however, is not the tension between the public figure and the ‘real person’, not even his gradual loss of charisma in recent years. Such things characterise every successful political career (with the exception of those touched by the grace of premature demise). Keane writes that Havel’s life resembles a ‘classical political tragedy’ because it has been ‘clamped by moments of … triumph spoiled by defeat’, and notes that ‘most of the citizens in President Havel’s republic think less of him than they did a year ago.’ The crucial issue, however, is the tension between his two public images: that of heroic dissident who, in the oppressive and cynical universe of Late Socialism, practised and wrote about ‘living in truth’, and that of Post-Modern President who (not unlike Al Gore) indulges in New Age ruminations that aim to legitimise Nato military interventions. How do we get from the lone, fragile dissident with a crumpled jacket and uncompromising ethics, who opposes the almighty totalitarian power, to the President who babbles about the anthropic principle and the end of the Cartesian paradigm, reminds us that human rights are conferred on us by the Creator, and is applauded in the US Congress for his defence of Western values? Is this depressing spectacle the necessary outcome, the ‘truth’, of Havel the heroic dissident? To put it in Hegel’s terms: how does the ethically impeccable ‘noble consciousness’ imperceptibly pass into the servile ‘base consciousness’?
Žižek notes that Havel’s support for the NATO campaign is rooted in falsehood masquerading as truth:
The predominant form of today’s ‘politically correct’ moralism, on the other hand, is that of Nietzschean ressentiment and envy: it is the fake gesture of disavowed politics, the assuming of a ‘moral’, depoliticised position in order to make a stronger political case. This is a perverted version of Havel’s ‘power of the powerless’: powerlessness can be manipulated as a stratagem in order to gain more power, in exactly the same way that today, in order for one’s voice to gain authority, one has to legitimise oneself as being some kind of (potential or actual) victim of power.
This, then, is Havel’s tragedy: his authentic ethical stance has become a moralising idiom cynically appropriated by the knaves of capitalism. His heroic insistence on doing the impossible (opposing the seemingly invincible Communist regime) has ended up serving those who ‘realistically’ argue that any real change in today’s world is impossible. This reversal is not a betrayal of his original ethical stance, but is inherent in it. The ultimate lesson of Havel’s tragedy is thus a cruel, but inexorable one: the direct ethical foundation of politics sooner or later turns into its own comic caricature, adopting the very cynicism it originally opposed.