Do LLMs have a legal duty to tell the truth? (short read)

Here’s an interesting paper on this topic by Prof Sandra Wachter, Prof Brent Mittelstadt, and Chris Russell from Oxford Internet Institute, University of Oxford: https://royalsocietypublishing.org/doi/10.1098/rsos.240197

Some highlights:
a) 💬 Risk of ‘careless speech’ – LLMs introduce subtle inaccuracies and oversimplifications that cumulatively degrade and homogenize knowledge over time. This is harder to detect than blatant misinformation.

b) 📃 Social value of truth – LLMs, as artificial speakers, should be held to the same standards of truthfulness in their discourse as human speakers. The authors cite Plato, Aristotle, and other philosophers who warned of the dangers posed by ‘intellectually meretricious’ Sophists, who used rhetoric and ‘verbal trickery’ to win debates at all costs. LLMs risk doing exactly that.

c) ⚖ Legal obligations – existing legal and regulatory frameworks are insufficient to enforce the truthfulness of LLMs. A potential solution is extending existing truth-related legal duties to LLM providers (the authors draw parallels to Google’s liability for defamatory autocomplete suggestions).


💡 Here are our takeaways in the context of future technology frontiers:

  1. We will likely see enhanced regulatory frameworks, including new standards and guidelines for LLM providers.
  2. We will likely see better model designs and training methods, including developing robust benchmarking metrics to evaluate the truthfulness of LLM responses.
  3. Per the authors’ suggestion, we would welcome more emphasis on the plurality and representativeness of sources.
  4. We believe that compute power (the resources to train and run LLMs) will be the foundation of AI success. However, eliminating ‘careless speech’ could be an interesting factor in whether some market players ultimately succeed or fail, even when significant resources are available.

Don’t miss out on the latest updates and exclusive content from Frontiers of Technology. Subscribe now and receive regular updates.