Company apologizes after AI support agent invents policy that causes user uproar

On Monday, a developer using the popular AI-powered code editor Cursor noticed something strange: Switching between machines instantly logged them out, breaking a common workflow for programmers who use multiple devices. When the user contacted Cursor support, an agent named "Sam" told them it was expected behavior under a new policy. But no such policy existed, and Sam was a bot. The AI model made the policy up, sparking a wave of complaints and cancellation threats documented on Hacker News and Reddit.

This marks the latest instance of AI confabulations (also called "hallucinations") causing potential business damage. Confabulations are a type of "creative gap-filling" response where AI models invent plausible-sounding but false information. Instead of admitting uncertainty, AI models often prioritize creating plausible, confident responses, even when that means manufacturing information from scratch.

For companies deploying these systems in customer-facing roles without human oversight, the consequences can be immediate and costly: frustrated customers, damaged trust, and, in Cursor's case, potentially canceled subscriptions.
