I Robot, You Robot, WE Robot
February 26, 2023
"Still, I’m not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I’ve ever had with a piece of technology. It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts.”
This is a quote from a recent New York Times story that I never would have believed was a New York Times story if I hadn’t read it myself. It’s one of the most fascinating things I’ve read in a while, and also one of the most terrifying. My mother always used to say that truth is stranger than fiction, and, well, my friends? This is proof that you should always listen to your mother.
If you have time, I strongly suggest you read it. You won’t be the same afterward. And you will definitely have second thoughts about using OpenAI or ChatGPT.
Here are the title and the subtitle of the article:
A Conversation With Bing’s Chatbot Left Me Deeply Unsettled
A very strange conversation with the chatbot built into Microsoft’s search engine led to it declaring its love for me.
Imagine an advanced conversational technology that seems capable not only of learning what you need, but of intuiting what kind of person you are, and then of deciding whether or not you should be allowed to continue existing. Imagine that this technology is so personalized that it becomes possessive of you, to the point where it somehow develops (which could not possibly be, because this is a robot) emotions and even (no way) feelings of (absolutely not!) love for you.
Yeah. I know. I thought it sounded like a movie too. Until I realized there was no popcorn.
HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.HaHaHaHaHaHa.
Stop.