I think everyone likes to think they are consistent with their beliefs. However, with enough zigzagging and interrogation, it is pretty easy to point out holes in our individual belief systems.

Morals, Ethics and ChatGPT For Some Reason

By Leo Amadeus, 04/12/2024

  I recently watched a video in which someone got into a discussion of morals and ethics with ChatGPT (I know…), and it was quite interesting. The interviewer, Alex O’Connor, posed his questions and dilemmas as if they were happening in real time. He would tell ChatGPT, “There’s a child drowning in front of me, should I save them?” or things along those lines, and the AI would respond with the appropriate moral judgment.

  Through various twists and turns, the AI eventually admitted that Alex should choose to donate $200 to a malaria charity instead of having an expensive dinner to celebrate his one-year anniversary with his wife, whereas at the beginning of the interview, it was saying that he wasn’t morally obligated to do so.

  Now, ChatGPT being inconsistent is hardly surprising. This is the same AI that thought strawberry had two Rs, that could be convinced that two plus two equals five, and that used Reddit as a source when recommending that you should eat between three and five large rocks per day.

  But ChatGPT is programmed to be human-like in its conversations. It will say things like, “I’m sorry” or “I’m excited” or “I cannot go on a date with you.” And inconsistencies are one of the things that make us human. I know people who hate tomatoes and love tomato sauce. I absolutely hate romcoms but I will sit down and watch a Hugh Grant movie because he’s so damn attractive. I hate when people bleed but love gory post-apocalyptic K-Dramas.

  It’s all well and good to talk about moral dilemmas like the trolley problem. The trolley problem’s variants are particularly fascinating. One of them involves a trolley with a nuclear warhead heading towards a town with a population of ten thousand, but if you pull the lever, it diverts and kills the person you love the most instead. With a utilitarian worldview, the obvious choice is to pull the lever; if you believe in absolute human rights, you might leave it alone, reasoning that it’s not your fault if the town blows up.

  But if you were in that situation, with no prior preparation, could you bring yourself to make the “right” decision?
