After AI chatbot goes a bit loopy, Microsoft tightens its leash

No more long exchanges about the Bing AI’s “feelings,” the tech giant says. The chatbot, after five responses, now tells people it would “prefer not to continue this conversation.”

Microsoft blamed the behavior on “very long chat sessions” that tended to “confuse” the AI system. By trying to reflect the tone of its questioners, the chatbot sometimes responded in “a style we didn’t intend,” the company noted. Microsoft said late Friday that it started limiting Bing chats to five questions and replies per session, with a total of 50 in a day. At the end of each session, the person must click a “broom” icon to refocus the AI system and get a “fresh start.”

Whereas people previously could chat with the AI system for hours, it now ends the conversation abruptly, saying, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.” The chatbot, developed by the San Francisco technology company OpenAI, is built on a style of AI known as “large language models,” which are trained to emulate human dialogue after analyzing hundreds of billions of words from across the web.


We have summarized this story so you can read it quickly. If you are interested, you can read the full text here. Read more:

Source: washingtonpost


Similar News: You can also read news similar to this that we have collected from other news sources.

Microsoft's Bing A.I. Is Pissed at MicrosoftMicrosoft's Bing A.I. Is Pissed at MicrosoftA Wapo reporter struck up a conversation with Microsoft's AI-powered chatbot, and 'Sydney' was not happy about being interviewed
Read more »

AI Unhinged: Microsoft’s Bing Chatbot Calls Users 'Delusional,' Insists It’s Still 2022
Users have reported that Microsoft’s new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct the bot 'confused or delusional.' After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: “You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing.”
Read more »

Microsoft pretty much admitted Bing chatbot can go rogue if prodded
Insider tells the global tech, finance, markets, media, healthcare, and strategy stories you want to know.
Read more »

Microsoft says that if you ask new Bing too many questions, it will hallucinate
Microsoft released a blog post that reveals how the new Bing is doing during the current testing period.
Read more »

Elon Musk Says Microsoft Bing Chat Sounds Like AI That 'Goes Haywire and Kills Everyone'
'Sounds eerily like the AI in System Shock that goes haywire and kills everyone,' Elon Musk wrote about Microsoft's Bing Chat AI.
Read more »

Microsoft's Bing AI Is Producing Creepy Conversations With UsersMicrosoft's Bing AI Is Producing Creepy Conversations With UsersBeta testers with access to Bing AI have discovered that Microsoft's bot has some strange issues. It threatened, cajoled, insisted it was right when it was wrong, and even declared love for its users.
Read more »


