Can you imagine buying a piece of furniture where the manufacturer says, "Look, you COULD fall off this thing, so please use it carefully"?
Would you buy a gas stove that says, "Well, I COULD leak, so please use me carefully and watch out for gas leaks"?
Would a company order simple ball bearings that do not pass quality tests?
No, we cannot imagine commercial release of unsafe products.
Except, AI.
In response to a teen suicide, OpenAI has decided to require proof of age from users.
But IF a chatbot is encouraging thoughts of self-harm and suicide, does the age of the user matter? OpenAI says it will try to inform parents and, failing that, will inform law enforcement. Google and Meta already do that in many countries: they inform law enforcement when a suicide is imminent. That should have been in the core design!
Why should any company be able to release potentially unsafe products to any user category?
And I am left wondering: how did an entire industry bypass both quality and safety requirements?