
One of the issues with LLMs is that they still suffer from the hallucination problem, whereby they often confidently claim wildly wrong information as accurate. This is doubly dangerous given that they are often right, to an expert level. As a user, it’s all too easy to be lulled into a false sense of security and assume anything coming out of the system is true.
