Nassim Nicholas Taleb, renowned author of “The Black Swan,” has once again voiced his skepticism about OpenAI’s ChatGPT, stating that the usefulness of the AI-powered chatbot comes with a condition.
What Happened: Over the weekend, Taleb took to X, formerly Twitter, and posted his “verdict” on ChatGPT, stating that the chatbot is only usable if one has in-depth knowledge of the subject.
He went on to point out that ChatGPT often makes errors that can only be detected by a “connoisseur,” citing an example of an incorrect linguistic interpretation.
“So if you must know the subject, why use ChatGPT?” he asked.
He added that he uses the chatbot for writing “condolences letters” and that it fabricates “quotations and sayings.”
In the comment section, people suggested Taleb treat ChatGPT as a sophisticated typewriter rather than a definitive source of truth. One person said that OpenAI’s AI-powered chatbot is not the “smartest assistant on the planet but you can correct and direct work to move faster.”
However, some people agreed with him, saying that ChatGPT is “too risky” for certain work assignments.
Why It’s Important: This isn’t the first time Taleb has called out ChatGPT’s limitations.
Last year, he highlighted the chatbot’s inability to grasp the ironies and nuances of history. Taleb has also expressed frustration with ChatGPT’s lack of wit in conversations.
The same year, it was reported that a lawyer’s use of ChatGPT for legal assistance backfired when the chatbot fabricated nonexistent cases.
Around the same time, several reports highlighted that not only ChatGPT but also other generative AI models, such as Microsoft’s Bing AI and Google’s Bard, now called Gemini, tend to hallucinate, presenting made-up facts with utmost conviction.
In fact, in April 2023, Google CEO Sundar Pichai acknowledged AI’s “hallucination problems,” saying, “No one in the field has yet solved the hallucination problems. All models do have this as an issue.”