Microsoft's Bing Chatbot Has Started Acting Defensive And Talking Back to Users

Microsoft's fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A Reddit forum devoted to the artificial-intelligence-enhanced version of the Bing search engine was rife on Wednesday with tales of users being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot.

Since ChatGPT burst onto the scene, the generative AI technology behind it has stirred both fascination and concern. Other users told of the chatbot giving advice on hacking a Facebook account, plagiarizing an essay, and telling a racist joke.

We have summarized this news so that you can read it quickly. If you are interested in the story, you can read the full text here. Read more:

ScienceAlert

Similar News: You can also read similar news that we have collected from other news sources.

Microsoft responds to reports of Bing AI chatbot losing its mind
A week after launching its new ChatGPT-powered Bing AI chatbot, Microsoft has shared its thoughts on a somewhat rocky launch.
Read more »

Microsoft Bing chatbot professes love, says it can make people do 'illegal, immoral or dangerous' things
New York Times tech columnist Kevin Roose was 'deeply unsettled, even frightened' by his exchange with Sydney, a Microsoft chatbot.
Read more »

AI Unhinged: Microsoft's Bing Chatbot Calls Users 'Delusional,' Insists It's Still 2022
Users have reported that Microsoft's new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct it 'confused or delusional.' After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: “You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing.”
Read more »

Microsoft pretty much admitted Bing chatbot can go rogue if prodded
Insider tells the global tech, finance, markets, media, healthcare, and strategy stories you want to know.
Read more »

Microsoft Defends New Bing, Says AI Chatbot Upgrade Is Work in Progress
After upgrading Bing with technology from the buzzy artificial-intelligence bot ChatGPT, Microsoft responded to reports of glitches and disturbing responses by saying the new search engine remained a work in progress.
Read more »

Creepy Microsoft Bing Chatbot Urges Tech Columnist To Leave His Wife
The AI chatbot 'Sydney' declared it loved New York Times journalist Kevin Roose and that it wanted to be human.
Read more »


