(WS News) – Artificial intelligence can build an eerily accurate picture of your life and who you are.
Now security experts have warned The Sun readers not to give away too much info when speaking to chatbots – even if it seems “harmless”.
The problem with AI-powered chatbots is that they can seem almost humanlike.
They’re designed that way to be more helpful and approachable – but it can also mean you drop your guard.
“We’re getting so comfortable talking to AI tools like ChatGPT that we often forget something important,” said cybersecurity pro Akhil Mittal, speaking to The Sun.
“We’re still sharing data, and that data helps these systems learn more about us.”
You might think you’re sensible enough not to hand over sensitive financial info.
But the kinds of information that you shouldn’t be sharing with artificial intelligence bots go far beyond your bank log-in.
“Most people know not to share passwords or credit card numbers,” said Akhil, a senior security consulting manager at Black Duck Software.
“The real concern is the small and everyday details.
“You might mention your upcoming travel plans, a work project, or even your health, thinking it’s harmless.
“But these small bits of information can add up over time, creating a picture of your life that you never intended to share.
“You’re giving away more than you think.”
There are several reasons why you wouldn’t want an AI knowing too much about you.
Firstly, it’s possible that you’re talking to a dodgy chatbot created by cyber-criminals specifically to extract info from you.
Secondly, even if the chatbot itself isn’t malicious, your account could be hacked – leaking all of your info to whoever breaks in.
Hackers could exploit that info for financial gain.
And thirdly, you have very little control over where your info ends up once it has been “ingested” by the AI machine.
“Think of it like sending a text – just because it feels private doesn’t mean it can’t be shared or even seen by others,” Akhil warned.
“The point is AI systems learn from what you tell them.
“Even if they don’t store everything, they process that information to improve responses.
“So, it’s important to treat your conversations with the same caution as social media.
“Once it’s out there, it’s out of your control.”
It’ll become increasingly difficult to avoid chatbots in the future.
So experts say it’s important to be extremely vigilant with what you send to an AI – or it could prove very costly.
If you hand over enough info, hackers might even be able to steal your identity or break into your bank account.
Speaking to The Sun, security expert Chris Hauk said you should try to avoid using a chatbot as a therapist or life guide.
“You should never share financial information, or your deepest thoughts,” said Chris, a consumer privacy advocate at Pixel Privacy.
“Some folks may be inclined to share problems with chatbots, using them as a therapist of sorts.
“This is not a good idea – it’s a serious privacy concern, as both types of information could be used by bad actors to cause issues for you down the line.
“Also, never share confidential workplace information, which could result in the unintentional exposure of information about your employer.
“Never provide login information for your accounts in chatbot conversations. Hackers could exploit that info for financial gain.”