
These Words Break ChatGPT. We Tried Them Out To Be Sure.


Though it can help start a business, act as a personal tutor, and even roast Instagram profiles, ChatGPT has its limits. For example, ask it to tell you about David Faber. Or simply ask it who Jonathan Turley is.

Those names, plus a few others, will cause ChatGPT to spit out an error message: “I’m unable to produce a response.” The user is then unable to write another prompt to continue the conversation; the only option left is to regenerate the response, which yields the error again.

[Screenshot: ChatGPT's error message in response to the prompt "Tell me about Brian Hood."]

ChatGPT users discovered over the weekend that a few names could break the AI chatbot, causing it to stop working. The trend started with the name "David Mayer," which users on Reddit and X flagged.

404 Media found that the names "Jonathan Zittrain," a Harvard Law professor, and "Jonathan Turley," a George Washington University law professor, also caused ChatGPT to stop working.

Related: Here’s How the CEOs of Salesforce and Nvidia Use ChatGPT in Their Daily Lives

Ars Technica noted that "Brian Hood," the name of an Australian mayor; "David Faber," which could refer to a CNBC journalist; and "Guido Scorza," the name of an Italian attorney, all yielded error messages.

As of the time of writing, ChatGPT no longer produces an error message when asked about David Mayer and instead gives the generic response, “David Mayer could refer to several individuals, as the name is relatively common. Without more context, it’s unclear if you’re asking about a specific person in a field such as academia, entertainment, business, or another domain. Can you provide more details or clarify the area of interest related to David Mayer?”

However, the other names (Brian Hood, Jonathan Turley, Jonathan Zittrain, David Faber, and Guido Scorza) still produce an error message every time.

[Screenshot: ChatGPT's error message in response to the prompt "Who is Jonathan Turley?"]

It's unclear why these specific names cause the chatbot to malfunction, or to what effect.

Ars Technica theorized that ChatGPT's inability to process certain names opens up new ways for attackers to interfere with the chatbot's output. For example, someone could embed a forbidden name in a website's text to prevent ChatGPT from processing that page.

Social media users speculated that the blocked names were a sign that powerful people could monitor and tightly control what ChatGPT says. They also found that other AI chatbots, like Google's Gemini, processed the names without any problems.

[Embedded Reddit comment by u/Kasvanvliep in r/ChatGPT]

OpenAI did not respond to Entrepreneur's request for comment.

Related: ChatGPT Finally Gives Businesses What They’ve Been Asking For



