…[S]cience has realized and affirmed what men anticipated in dreams that were neither wild nor idle. What is new is only that one of this country’s most respectable newspapers finally brought to its front page what up to then had been buried in the highly non-respectable literature of science fiction (to which, unfortunately, nobody yet has paid the attention it deserves as a vehicle of mass sentiments and mass desires).
[…] For some time now, a great many scientific endeavors have been directed toward making life also “artificial,” toward cutting the last tie through which even man belongs among the children of nature. It is the same desire to escape from imprisonment to the earth that is manifest in the attempt to create life in the test tube, in the desire to mix “frozen germ plasm from people of demonstrated ability under the microscope to produce superior human beings” and “to alter [their] size, shape and function”; and the wish to escape the human condition, I suspect, also underlies the hope to extend man’s life-span far beyond the hundred-year limit.
This future man, whom the scientists tell us they will produce in no more than a hundred years, seems to be possessed by a rebellion against human existence as it has been given, a free gift from nowhere (secularly speaking), which he wishes to exchange, as it were, for something he has made himself.
I can certainly see how fiction helps to illuminate fact. At least when they are marked out as created-rather-than-discovered, works of fiction can create vivid and meaningful depictions of the world which would otherwise recede into the massiveness of numbers and complexity. As much as I love fiction, however, I’ve never been quite clear why this fictionalized vividness is preferable to the real experiences of real folks, which are also vivid (literally lived), concrete (literally occurrent), and meaningful (literally full of significance for those who underwent them).
At its best, the fictionalization of an event makes it more palatable by fitting it into a pre-arranged narrative structure: a science-fiction fan prefers the rhythms and conventions of a certain kind of story, so she might be better able to understand the horrors of colonialism through the lens of a film like Avatar than she could through an ethnographic account of the post-colonial misery of the Peyizan Yo of Haiti. The great white savior-gone-native in that film stands as an important fictionalized falsehood that must then be overcome, but we must start somewhere, and fiction is frequently an easier beginning.
But would anyone really want to say that the fiction is truer or preferable to the ethnography? I haven’t encountered that argument, at least, outside of hyperbolic Rortyanism. Instead, we occasionally get arguments like Martha Nussbaum’s “‘Finely Aware and Richly Responsible’: Literature and the Moral Imagination.” Because of her specific views on the role of the concrete and particular in informing and grounding our general ethical views, Nussbaum argues that:
“we will need to turn to texts no less elaborate, no less linguistically fine-tuned, concrete, and intensely focused, no less metaphorically resourceful, than this novel [Henry James’ The Golden Bowl].”
But even for Nussbaum, who differs a bit from the dogmatic particularists like Jonathan Dancy, it is possible to “take fine-tuned perception to a dangerous rootless extreme” such that we “delight in the complexity of particulars for its own sake, without sufficiently feeling the pull of a moral obligation to any.” Such imagining “too freely strays, embroiders, embellishes.”
For Nussbaum, then, we turn to fictional texts as a pedagogical exercise to cultivate the kind of moral imagination that attends to and improvises with the concrete: “an ability to miss less, while being responsible to more.” But this pedagogical exercise actually constrains the fictional text:
“We must at the same time remember that artists, as James sees it, are not free simply to create anything they like.”
The fictional text must at least aspire to the complexity of the human phenomena it intends to map. Yet one thing that jumps out of Daisey’s show is how heavy-handed and simplistic it is:
“You will carry it to your homes, and when you sit down in front of your laptops, when you open them up, you will see the blood welling up between the keys.”
This is not the cultivation of a bewildering modern tragedy, where harsh working conditions and negligent dangers are the perhaps-too-high price developing countries pay for their development. It is bullshit, a technical term best analyzed by Harry Frankfurt:
“One who is concerned to report or to conceal the facts assumes that there are indeed facts that are in some way both determinate and knowable. His interest in telling the truth or in lying presupposes that there is a difference between getting things wrong and getting them right, and that it is at least occasionally possible to tell the difference. Someone who ceases to believe in the possibility of identifying certain statements as true and others as false can have only two alternatives. The first is to desist both from efforts to tell the truth and from efforts to deceive. This would mean refraining from making any assertion whatever about the facts. The second alternative is to continue making assertions that purport to describe the way things are but that cannot be anything except bullshit.”
Deliberate fabrication in order to tell a “better story” doesn’t ever really reveal a greater truth, because it undermines the truth-seeking sensibility. From the perspective of truth seeking, bullshitters who don’t care much about truth seem particularly pernicious: the cost of false vividness is the loss of the trust and credulity that make story-telling meaningful. Of course, some readers may not care much about the truth, either. From some other perspective than truth-seeking, like an aesthetic of care, bullshitting is not necessarily a big deal… except: what happens when that unconcern with truth leads to a threat to the values of that particular perspective?
There are two stories about Apple: one is about its brilliant business performance, and the other is about the blood and sweat behind Apple miracles. I strongly recommend that all Apple fans read this. Corporations should bear social responsibilities, and customers should also understand and be responsible to society. — 花甲小猪
Apple is definitely a vampire factory. But if you boycott Apple, what would those workers eat without demand (for Apple products)? By then they would even lose their jobs! And now the U.S. is planning to move a chunk of manufacturing back to its soil, as manufacturing costs in China are soaring. What would these surplus workers be facing? The profit margin for the entire Chinese manufacturing sector is thin; nobody enjoys high salaries or good benefits, yet the work is intense and conditions are poor. This is common, and not only for the manufacturers of Apple! Think first how to change the miserable status quo of a giant manufacturing country! —Quasi-Economist
There are many others, collected by the New York Times. Their responses were not all finely aware or richly responsible, and possibly some of them were working for China’s infamous “Fifty Cent Party” (a state corps of internet propagandists), but certainly less was lost on them than seems to have been lost on us.
When I was an undergraduate, I took a class called “Truth and Beauty” with the poet Ann Lauterbach. It was basically a class on reading and writing essays, but I took it because I was a philosophy major and I thought it would be about aesthetics, i.e. about whether judgments about beauty can be true or false. Every week we’d read a collection of essays and we would turn in a response essay of our own. We also met with Ann regularly to discuss our work, which was great because she had the kind of presence that made one-on-one encounters particularly powerful and instructive, like academic therapy.
During one of our sessions, I remember bemoaning the fact that my essays were all so analytical. I had read some of her poetry and I yearned for the kind of imaginative approach to language that I thought she had. (I really had no idea about poetry.) I can’t remember her exact response, but it was something like this:
Everybody has their own way of thinking, their own voice. You shouldn’t try to change the way you think, but rather work on improving it.
At the time, I found that inspiring. Here was a brilliant poet giving me permission (nay, charging me with the duty!) to dig deeper into the habits of thought and writing that were most comfortable for me. It was liberating. I’ve since come to realize that my style of thinking is much less strictly analytical and much more about exploring questions and the various possible ways of answering them. (Those links point to a couple of posts addressing different approaches to power and freedom.) But I’m glad I took Ann’s advice, because look where it got me: I got a PhD in philosophy, and I get to teach my favorite texts and questions for a living!
Now, here’s the question: why did I tell you that story?
Notice how my story works: it puts some pretty banal clichés into the mouth of a famous poet, but all she said was “be yourself.” I start by establishing her authority and gravitas, I introduce a problem via a distinction with an implicit hierarchy (analytic versus imaginative), and then the authority figure in my story teaches me a lesson that reverses the hierarchy: it’s okay to be analytic and nerdy! Then I pretend like this simple lesson is what got me to where I am today. Yay poets! Yay philosophy nerds!
Narratives tend to be too simple. The point of a narrative is to strip [detail] away, not just into 18 minutes, but most narratives you could present in a sentence or two. So when you strip away detail, you tend to tell stories in terms of good vs. evil, whether it’s a story about your own life or a story about politics. Now, some things actually are good vs. evil. We all know this, right? But I think, as a general rule, we’re too inclined to tell the good vs. evil story. As a simple rule of thumb, just imagine every time you’re telling a good vs. evil story, you’re basically lowering your IQ by ten points or more. If you just adopt that as a kind of inner mental habit, it’s, in my view, one way to get a lot smarter pretty quickly. You don’t have to read any books. Just imagine yourself pressing a button every time you tell the good vs. evil story, and by pressing that button you’re lowering your IQ by ten points or more.
Oh shit! Did I just make myself and my readers dumber? Did my little “A Man Learns a Lesson”-style story just get us all stoned on narrative inanities?
Cowen goes on to qualify this:
we use stories to make sense of what we’ve done, to give meaning to our lives, to establish connections with other people. None of this will go away, should go away, or can go away.
But, he explains, we should worry about stories more, and embrace the messiness of life more. I wonder, though, whether he’s right. After all, Lauterbach told me I shouldn’t try to change the way I think, but rather get really good at the modes of thinking that I already prefer. Surely the same thing is true for people who love stories and think primarily in terms of stories?
So, here’s how I think about this question: Should we listen to Cowen or to Lauterbach? Why?
It seems to me that we should be suspicious of stories if we think that letting reality be messy is good for thinking clearly. The problem there is that we’re only likely to think that if we’ve had good experiences with other forms of analysis: plotting data or formalizing syllogisms. In that case, we’ll hear Cowen’s comments like I heard Lauterbach’s: “Be yourself! Those story-tellers are phonies, anyway.”
On the other hand, we might also want to dig deeper into stories and develop our critical thinking skills from within the narrative form: when is a story too neat? When is a narrator’s omniscience really pandering to the reader? What are the other stories we can tell about authors, about cultures, and about narrative manipulation that might help us to avoid the traps that narratives set for us? If we’ve already got a pretty good sense of the structure of stories, the kinds of things that narratives do and can do, we might prefer to dig deeper and hone this method. But still, the message is Lauterbach’s: “Don’t kick the poets out of the city! Poets can be wise, too!”
In this post, Lauterbach is going to stay the hero. But Cowen is a smart guy, and he tries to inoculate himself against this kind of criticism in the section on cognitive biases. Basically, he reminds us that people tend to misuse their knowledge of psychology through a kind of motivated reasoning that reproduces their earlier, ignorant biases, but now with supposed expert certification. In this, as in most things, “a little learning is a dangerous thing.” (But isn’t that what TED is for?) Then he reminds us of the epistemic portfolio theory, which holds that we’ll tend to balance our subjects of agnosticism, unpopular beliefs, and dogmatism in a rough equilibrium, so we ought to beware of the ways we abjure narratives in only some parts of our lives. (This is pretty much like ending his whole talk with the prankster’s “NOT!” Silly rationalists: truth-tracking and reason-responsiveness are myths we tell to children to hide the messy emotional facts of the matter.)
The passage in his talk where he typologizes the various narratives we’ll tell about the talk is also pretty funny: “I used to think too much in terms of stories, but then I heard Tyler Cowen, and now I think less in terms of stories!” Yay economists! They’re smart and have all the bases covered. Hey wait: do you think that’s why he told us that story?