Technology

Experts warn of a "hallucination" problem with ChatGPT and LaMDA, as these chatbots take what they have learned and reshape it without regard for what is true (Cade Metz/New York Times)


Cade Metz / New York Times:

Siri, Google Search, online marketing and your child’s homework will never be the same.  Then there’s the misinformation problem.


