Case Against "Hallucinating" ChatGPT App


A Vienna-based privacy group said it would file a complaint against ChatGPT in Austria, claiming the "hallucinating" flagship AI tool invents wrong answers, which creator OpenAI cannot correct.

NOYB ("None of Your Business") said there was no way to guarantee that ChatGPT provided accurate information.

"ChatGPT keeps hallucinating, and not even OpenAI can stop it," it said. The firm has openly acknowledged that it cannot correct inaccurate information produced by its GenAI tool, and has failed to explain where the data comes from and what ChatGPT stores about individuals, the group said.

Such errors are unacceptable because EU law stipulates that personal data must be accurate, NOYB said.


Compiled by: Pushpendra Maurya


Profession: Data scientist
