Wednesday, November 23, 2011

I Hope You Hate The Moral Landscape


I'm coming rather late to this discussion, and admittedly, I have not read Sam Harris's The Moral Landscape. But after winding down from something else I was writing, I decided to read an article Harris wrote defending his book against certain criticisms. To acknowledge my biases up front: I have thought Harris was wrong ever since I watched his TED talk on the moral landscape (it was not difficult to see that he smuggled the value of "well-being" into his argument without any factual justification, and so never actually gave us any moral facts), and I have thought he was dangerous ever since I read The End of Faith, where he says it can be morally acceptable to kill people for holding sufficiently dangerous beliefs (I can understand that kind of thinking from a relativist or a nihilist, but if you actually believe in moral truths and still say that, I think you're dangerous). Now, though, I feel quite secure in saying that if humanity ever succumbs to a scientifically powered dystopia, our scientist overlords will probably have taken their inspiration from Harris. Here is the passage from that article that did it:

However, some people were not ready for this earthly paradise once it arrived. Some were psychopaths who, despite enjoying the general change in quality of life, were nevertheless eager to break into their neighbors' homes and torture them from time to time. A few had preferences that were incompatible with the flourishing of whole societies: Try as he might, Kim Jong Il just couldn't shake the feeling that his cognac didn't taste as sweet without millions of people starving beyond his palace gates. Given our advances in science, however, we were able to alter preferences of this kind. In fact, we painlessly delivered a firmware update to everyone. Now the entirety of the species is fit to live in a global civilization that is as safe, and as fun, and as interesting, and as filled with love as it can be.

It seems to me that this scenario cuts through the worry that the concept of well-being might leave out something that is worth caring about: for if you care about something that is not compatible with a peak of human flourishing -- given the requisite changes in your brain, you would recognize that you were wrong to care about this thing in the first place. Wrong in what sense? Wrong in the sense that you didn't know what you were missing. This is the core of my argument: I am claiming that there must be frontiers of human well-being that await our discovery -- and certain interests and preferences surely blind us to them.


Ignoring the fact that this violates someone's dignity as a human being to form their own preferences (granted, here that someone is Kim Jong Il or the Boston Strangler), it is blatantly an attempt to establish one value as supreme simply by altering people's minds.

I expect a good many people will think Harris is onto something pretty neat here (although try to imagine what he's saying with less violent characters: imagine him altering the brains of people who are too individualistic, too free-spirited, too patriotic, or too protective, whenever that happens to conflict with what he thinks humanity really wants), but I hope that most who encounter this passage will react with something like anger and loathing. While reading it, though, a scary thought occurred to me: with increasing scientific knowledge and increasing technological power, one day the only thing keeping this from being implemented will be that enough people hate the idea.

And if it were implemented, there wouldn't be enough diversity left for anyone to wonder whether it had been a bad idea.
