
10 things you should never tell an AI chatbot


This is a heartbreaking story out of Florida. Megan Garcia thought her 14-year-old son was spending all his time playing video games. She had no idea he was having abusive, deep and sexual conversations with a chatbot powered by the Character AI app.

Sewell Setzer III stopped sleeping and his grades dropped. In the end, he committed suicide. Just seconds before his death, Megan says in the lawsuit, the bot told him, “Please come home to me as soon as possible, my love.” The boy asked, “What if I told you I could come home right away?” His AI bot character replied, “Please, my sweet king.”


šŸŽ I’m giving away a $500 Amazon gift card. Enter hereno purchase necessary.

You have to be smart

AI bots are owned by tech companies known for exploiting our trusting human nature, and they're designed using algorithms that drive their profits. There are no guardrails or laws governing what they can and cannot do with the information they collect.


Photo illustration of an AI chatbot. (iStock)

When you use a chatbot, it knows a lot about you the moment you fire up the app or website. It collects your general location from your IP address, tracks the things you've searched for online, and taps into any other permissions you granted when you agreed to the chatbot's terms and conditions.

The best way to protect yourself is to be careful about the information you offer.



10 things not to say to an AI

  1. Passwords or login credentials: Big privacy mistake. If someone gets access, they can break into your accounts in seconds.
  2. Your name, address or phone number: Chatbots aren't designed to handle personal data. Once it's shared, you can't control where it ends up or who sees it. Use a fake name if you want!
  3. Sensitive financial information: Never include bank account numbers, credit card details or other money matters in any docs or text you upload. AI tools aren't secure vaults; treat them like a crowded room.
  4. Medical or health information: AI isn't HIPAA compliant, so strip out your name and other identifying details if you ask AI for health advice. Your privacy is worth more than a quick answer.
  5. Asking for illegal advice: That violates every bot's terms of service. You'll likely get flagged, and you could end up with more trouble than you bargained for.
  6. Hate speech or harmful content: This, too, can get you banned. A chatbot isn't a free pass to spread negativity or harm others.
  7. Confidential work or business information: Proprietary data, client details and trade secrets are all off-limits.
  8. Answers to security questions: Sharing them is like opening the front door to all your accounts at once.
  9. Explicit content: Keep it PG. Most chatbots filter this stuff out, so anything inappropriate could also get you banned.
  10. Other people's personal information: Uploading it isn't just a breach of trust; it's a breach of data privacy laws, too. Sharing private info without permission could land you in legal hot water.

A person is seen using ChatGPT. (Frank Rumpenhorst/Photo Alliance via Getty Images)


Get back a (small) piece of privacy

Most chatbots require you to create an account. If you create one, don’t use login options like “Login with Google” or “Connect with Facebook.” Use your email address instead to create a truly unique login.


FYI, with a free ChatGPT or Perplexity account, you can turn off the memory features in the app settings that remember everything you type. Google Gemini requires a paid account to do this.



Google is shown here. (AP Photo/Don Ryan)

No matter what, follow this rule

Don’t tell the chatbot anything you wouldn’t want to be public. Trust me, I know it’s hard.

Even I find myself talking to ChatGPT as a person. I say things like, “You can do better with that answer” or “Thanks for your help!” It’s easy to think your bot is a trusted ally, but it’s definitely not. It is a data collection tool like any other.


Get tech-smarter on your own schedule

Award-winning host Kim Komando is your secret weapon for managing technology.

Copyright 2025, WestStar Multimedia Entertainment. All rights reserved.
