Friday, September 24, 2010

The Kill Whitey Problem

My friend Matt, in his blog, posted a link to an interesting article about the Trolley Problem.  Simply put, the experiment shows that we like to think of ourselves as having absolute morals, but, in practice, we have social biases and different value sets that make us more consequentialist than we would like to believe.  In other words, though we may like to think the end does not justify the means--we often, even without thinking, will behave as if it does.

For those of us who like Battlestar Galactica, we may recognize the Trolley Problem as similar to the scenario with Lee "Apollo" Adama and the Olympic Carrier in the first episode of the first season, "33".  In order to save the fleet (tens of thousands), he is ordered to destroy a ship carrying hundreds.  The decision is a hard one, but it has to be made, and in a very short amount of time (a similar situation occurs in the Pilot Episode with Colonel Tigh and an engineering deck that has to be shut down).  Decide: you have to quickly choose between killing group X or group Y!  Mind you, in each of these situations in Battlestar Galactica, the choice is more obvious because group X is a subset of group Y--kill a few, or everybody dies.

One interesting objection to the Trolley Experiment is the obligation to participate.  Forcing someone to make the choice (the "mad philosopher" the experiment likes to invoke) is the real immorality, not the choice itself.  When we are forced to choose, however, the way we make those decisions can be interesting.  The Kill Whitey experiment shows that our biases come into play whether we are liberals or conservatives (Chip Ellsworth III--what an asshole!)--in the case of the former, guilt, and in the case of the latter, fear of a black planet.

No surprise, we make choices according to our value systems--and this becomes abundantly clear when we are especially pressured to choose.  The greater the pressure, the more we flock to the polar extremes.

I'd like to hope that the Gort project could yield some fruit here--if we could, say, successfully model our ethical systems and run them through different experiments.  People who live in war zones, areas of famine or corruption, or under great authoritarian pressure or stress, may have to live through iterative Trolley Experiments each and every day.  For those of us in the privileged world, we don't even have to think about it until a September 11 happens--and when it does, we tend to be less prepared to make good decisions as a result.  When is war "the right thing to do"?  To what extent can you hold people accountable for making bad decisions?  To what degree can you blame anyone for having "blind spots" in their thinking if they are in the throes of a Trolley Experiment?
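To make the modeling idea a little more concrete, here's a toy sketch--in Python, with entirely made-up agents and numbers, nothing to do with the actual Gort code or the study's data--of what "running an ethical system through experiments" might look like: one knob for how consequentialist an agent is, one for group bias, and a time-pressure term that amplifies the gut reaction.

```python
import random

def decide(agent, victims_group, time_pressure):
    """Toy decision rule: probability of sacrificing one member of
    victims_group to save many.  Blends a deliberative, consequentialist
    weight with a gut-level group bias; under time pressure the gut
    component is weighted more heavily.  All names and numbers here are
    invented for illustration."""
    deliberate = agent["consequentialism"]          # 0..1: how much outcomes trump rules
    gut = agent["bias"] if victims_group == agent["out_group"] else -agent["bias"]
    p = deliberate + gut * (1.0 + time_pressure)    # pressure amplifies the bias term
    p = min(1.0, max(0.0, p))
    return random.random() < p

# A hypothetical "experiment": the same agent, two framings, rising time pressure.
agent = {"consequentialism": 0.5, "bias": 0.1, "out_group": "X"}
for pressure in (0.0, 1.0, 3.0):
    rate_x = sum(decide(agent, "X", pressure) for _ in range(10_000)) / 10_000
    rate_y = sum(decide(agent, "Y", pressure) for _ in range(10_000)) / 10_000
    print(f"pressure={pressure}: sacrifice rate vs group X {rate_x:.2f}, vs group Y {rate_y:.2f}")
```

Under no pressure the two framings come out close; crank up the pressure and the very same agent splits toward the extremes--a cartoon version of the polarization described above.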

4 comments:

  1. This study seems more related to language processing than applied ethics to me, though, to be fair, it doesn't claim relevance in that area. But even if its results correlate with real-world "war zone" situations (including at-home racism), I don't think language-and-paper-based studies--reading the words, imagining the scene, then imagining how you might behave--can accurately dissect the mechanics of this behavior, assuming they can even predict it accurately.

  2. bluesyncopate: you have a point. It makes me wonder what would happen if the same experiment were presented as a live "choose your own adventure" with actual faces to put to the names. Would the results be analogous? If it's one step closer to reality, how many more of us would be inclined to choose the one versus the many, regardless of race?

  3. This reminds me a bit of something laughingstone wrote about recently, called the Bystander Effect: when someone is in trouble in a public space and there are a lot of people around, everyone unconsciously assumes someone else will help, and so no one does anything.

    One of the solutions for mitigating this - and similarly, I believe, for mitigating the Trolley Problem - is getting people to become aware in advance that this could happen, and also teaching victims to use more specific language. So instead of "Help! I'm being attacked!", asking a specific person nearby, "Hey! Can you help me? This guy is hurting me!"

    Similarly, doing the kinds of live experiments you're talking about could help condition people to be more conscious of the choices they are making, and of the biases they may carry, when real life presents them with a situation like the experimental conditions.

  4. Oh man I like me some Trolley problems! Kill the Fat Man! I can't help but parse it through DnD sometimes-- is it Lawful Evil to push the Fat Man?
