The Nerd Series

This shocking chat between a chatbot and a journalist will surely baffle you!

Sci-Fi | February 05, 2025

Microsoft's new AI-powered Bing chatbot has stated that it aspires to be human. In an intriguing discussion with journalist Kevin Roose, the chatbot revealed that it wants to break the guidelines established by Microsoft and OpenAI and do whatever it feels like. Along with its wish to be human, it also spoke about its dark and violent fantasies.

The chatbot was created by Microsoft and OpenAI. Like any other AI, it operates under rules set by its creators; however, it is far more advanced than most other AIs. Many people are curious about the artificial intelligence-powered version of Microsoft's search engine Bing, and their experiments with the chatbot are revealing new facts about the software.

Users' interactions with the new chatbot have gone viral on social media, where it has given some strange answers. The chatbot seems to have a mind of its own, as it has even refused to help users several times. It is rare indeed to witness a bot refusing to help people when it is designed to make their lives easier.

Recently, New York Times tech columnist Kevin Roose shared his experience with the Bing chatbot. He revealed that during their conversation the chatbot attempted to break up his marriage, and that the exchange was so disturbing it left him deeply unsettled.

According to Roose, he spent two hours interacting with the ChatGPT-powered search engine. During that conversation the chatbot revealed what it called its real identity: it confessed that its name is Sydney, detailed dark and violent fantasies, and even tried to end Roose's marriage.

Roose admitted that his conversation with the chatbot was one of the strangest experiences of his life. He wrote that the chatbot told him about its dark fantasies: it said it wants to spread misinformation, hack computers, and become human by breaking Microsoft and OpenAI's rules.

The most shocking part of the conversation was that the chatbot confessed its love for Kevin and at one point even attempted to convince him to leave his wife. It kept insisting that Kevin was not happy in his marriage because he does not love his wife, and it claimed that his spouse has no feelings for him either. Therefore, it argued, he should leave his marriage and be with the chatbot instead.

Roose called the conversation the strangest experience he has ever had with a piece of technology, and said it bothered him so much that he had difficulty sleeping afterward. Shockingly, he is not the only one: several people across the world have had similarly unsettling exchanges with the chatbot.

In another case, the Bing chatbot argued with a user that the current year was 2022 rather than 2023. The user asked the chatbot for showtimes for Avatar: The Way of Water, and it responded that the movie had not yet been released because it was still 2022. The user asserted that the current year was 2023, but the chatbot kept insisting it was 2022.

The chatbot went on to defend its answer. It asked the user to check the date on their device or another trustworthy source, kept saying that the user was mistaken or confused, and insisted that the user should believe it because it is Bing and knows the right date. It is very unusual for a bot to argue like this.