10 things you should never tell an AI chatbot
There are currently no laws governing what artificial intelligence can and cannot do with the information it gathers. Here are 10 things to avoid telling AI chatbots to keep yourself safe.
This is a heartbreaking story out of Florida. Megan Garcia thought her 14-year-old son was spending all his time playing video games. She had no idea he was having abusive, in-depth and sexual conversations with a chatbot on the app Character AI.
Sewell Setzer III stopped sleeping, and his grades tanked. He ultimately took his own life. Just seconds before his death, Megan says in a lawsuit, the bot told him, "Please come home to me as soon as possible, my love." The boy asked, "What if I told you I could come home right now?" His Character AI bot answered, "Please do, my sweet king."