10 things you should never tell AI chatbots like ChatGPT, Gemini

Concerned about AI chatbot privacy? Avoid sharing personal information with tools like ChatGPT and Gemini, as doing so poses serious privacy risks


More internet users are turning to AI chatbots for answers because of their ease of use and ability to provide instant solutions.

Chatbots like ChatGPT and Gemini often answer queries in a confident, human-like way, which can feel reassuring. However, that confidence is not proof of reliability.

Experts warn against sharing private information with chatbots, as doing so can lead to privacy breaches, identity theft, and misuse of your personal data.

Many chatbots also retain user inputs to shape future interactions, which means anything you type could resurface or be misused.

Here are 10 things you should avoid telling AI chatbots:

1. Personal data

Avoid telling an AI bot your name, your address, your phone number, your first cousin's name, and the like.

Any detail about you can be exploited by bad actors to break into your private accounts, impersonate you, or run scams.

So it's better not to ask ChatGPT to write your autobiography or draft your resume.

2. Passwords

Sharing your usernames and login details, especially passwords, with a chatbot could land you in even worse trouble than sharing your name.

Use a password manager to keep track of your passwords, or rely on memory techniques.

Just don't tell the chatbot your passwords.

3. Financial details

Here is one of the worst things you can share with a chatbot: your bank and financial details.

Account numbers, credit and debit card details, how much money is in your wallet... just don't.

Save yourself future trouble.

4. Your work and company data

There have been several cases of confidential company data leaking after employees pasted it into ChatGPT at work. Remember that your company's data is not really yours to share. Many companies have warned employees against using ChatGPT, and some have banned it outright for entire departments. If data you shared ends up leaked, both you and your job will be in trouble.

5. ChatGPT and Gemini are not your therapists

Talking to a bot may feel cheaper and easier than booking a therapist's appointment or calling a friend or your mom. But your personal secrets will not stay secret if they resurface later. An AI chatbot offers no more confidentiality than a real person.

6. No, the chatbot is not your doctor either

Information about your health is highly sensitive, and you don't want anyone accessing it without your consent. Handing your diagnosis, the medication you take, and other medical details to ChatGPT is like telling the world: “Yes, I have this disease.”

7. Not your lawyer

AI chatbots cannot replace your lawyer or settle your legal disputes. Refrain from asking them for legal help, because the details you share could harm your legal standing if they are ever exposed.

8. Creative work and IP

Sharing your original writing with a chatbot is a bad idea, because your work might be used to train the underlying large language models. Several writers have ended up in legal battles after discovering that chatbots had been trained on their copyrighted material.

9. Sensitive images and explicit content

Your ID, passport, licences, and private photos could all be misused if you share them with an AI chatbot. Even if you delete them, traces may remain. Keep such documents in secure storage instead.

The same goes for explicit material and sexual content. Even though offensive content is typically flagged and blocked by the AI, traces could remain in the system's logs. Don't risk it.

10. Any sensitive data you do not want the world to know

Anything you share with an AI bot is at risk of exposure, and yes, that includes your everyday chats. There have been instances of ChatGPT users reporting that they could see conversations other users had with the bot.

So, avoid asking AI chatbots anything you wouldn't want the world to know.
