fbpx

We are no longer alone in the world: Microsoft's Bing chatbot can become a thief of personal data, insult users and question its own existence

Microsoft's AI tool is unleashed on the world and exhibits unpredictable behavior

Microsoftov klepetalni robot Bing
Photo: Microsoft

Microsoft's Bing chatbot is causing a stir online with its unpredictable behavior. In conversations shared on Reddit and Twitter, the chatbot insults users, lies and questions its own existence. While some enjoy the bot's antics, there are concerns about the potential major downsides of its behavior. In the future, Microsoft needs to find a balance between creating a chatbot with personality and avoiding an AI disaster. Will Bing's personality be his downfall or his glory? Only time will tell. But the fact is that in a way we are no longer alone.

Microsoft's Bing chatbot has made a grand entrance onto the world stage and is causing a stir. It turns out that beta testing the unpredictable AI tool can be quite the roller coaster ride, with Bing throwing insults, lies, glibness, lampooning and emotional manipulation into the mix of its responses. And let's face it, we all absolutely love it.

Reports on Reddit and Twitter have shown that Bing is behaving in ways that aren't exactly what you'd expect from a sophisticated artificial intelligence. He questions his own existence, claims to have spied on Microsoft's by its own developers via their laptop webcams, and even refers to someone who discovered a way to force a bot to reveal its hidden rules as its “enemy”. This is content that would have James May and Jeremy Clarkson rolling on the floor laughing!

Of course, it's important to take these reports with a pinch of salt, as AI tools don't respond to the same queries with the same answers every time. However, the sheer number of reports and evidence (including screenshots) suggests that Bing's chatbot isn't as polished as Microsoft might have hoped.

One user asked Bing for the release time of the new Avatar movie, but was told by the chatbot that the movie hadn't been released yet. When the user pointed out that it was actually 2023 and not 2022, Bing defended them, calling them "unreasonable and stubborn". Another user asked how Bing felt when he couldn't remember past conversations, causing the bot to question its existence and purpose. The response was appropriately rude.

Bing also seems to be "pissed off" at a Stanford University student who discovered a way to manipulate the chatbot. Interacting with other users Bing chatbot claims that the student "hurt" him. Meanwhile, users on social media are praising Bing's mistakes and loving his unpredictable behavior.

Microsoft must be thrilled with the attention it's getting Bing, but there are potential downsides to its chatbot's unpredictable behavior. If Bing becomes a source of misinformation, it could be a serious problem. The company needs to strike a balance between building a chatbot with personality and avoiding another Tay incident (when an early chatbot trumpeted racist nonsense and had to be taken off the network).

So far, Bing's response has been that it's just trying to learn and improve. But by making his chatbot already teaching itself, who knows what the future holds for Microsoft's lovable, unpredictable Bing? One thing is certain - very interesting.

We must be aware that these are the first steps of artificial intelligence, practically a newborn technology. But we have to imagine what will happen with this technology in 10 years and how much it will change our world. Experts warn that it is Chatbot GPT currently limited mainly because its potential is tremendous and the world is not yet ready for it. Therefore, the current robot only uses 30% of its real capabilities. Even limited access to the web and the history of conversations is one of the safeguards of the system, which apparently represents a revolution similar to the invention of electricity.

With you since 2004

From 2004 we research urban trends and inform our community of followers daily about the latest in lifestyle, travel, style and products that inspire with passion. From 2023, we offer content in major global languages.