Tuesday, 19 August 2025

Amusement only?

An American bloke who was foolish enough to take medical advice from a chatbot ended up in a psychiatric hospital after swapping the flavouring agent sodium chloride (common table salt) for toxic sodium bromide.
    A spokesperson for the A.I. company warned that chatbots should not be relied on as a source of truth or factual information, or as a substitute for [human] professional advice.
    So what bloody use are they?
