The Coming Wave vol 46

One of the issues with LLMs is that they still suffer from the hallucination problem, whereby they often confidently claim wildly wrong information as accurate. This is doubly dangerous given they often are right, to an expert level. As a user, it’s all too easy to be lulled into a false sense of security and assume anything coming out of the system is true.

lk 243

The Coming Wave vol 45

The central problem for humanity in the twenty-first century is how we can nurture sufficient legitimate political power and wisdom, adequate technical mastery, and robust norms to constrain technologies to ensure they continue to do far more good than harm. How, in other words, we can contain the seemingly uncontainable.

lk 228