Can sensitive personal data be shared with ChatGPT?

With the rapid democratization of artificial intelligence (one in two French people say they have already used ChatGPT), tools like OpenAI's ChatGPT and Google's Gemini have become accessible to everyone, making interaction with this technology intuitive. However, while these tools are useful for many everyday tasks, you should be aware of the risks of sharing personal or private data with them.

Copying and pasting personal data to an AI: good or bad idea?

We use both search engines and AI for personal subjects concerning our finances, our health, and more. Many questions arise: can we give our medical test results as a PDF to ChatGPT for interpretation? Can we copy and paste personal data (email address, telephone number, etc.) into Gemini? What happens to this information once it is transmitted to your favorite AI? Here are the main reasons to avoid transmitting private data.

ChatGPT uses your data to feed its artificial intelligence

First thing to know: your entries can be used to train LLMs such as ChatGPT's. Here is what appears in ChatGPT's settings:

Data sharing settings for AI training

By clicking on "Improve the model for everyone", you access the setting that lets you stop sharing your information with ChatGPT:

Model improvement

Improve the model for everyone: Yes / No

"Allow your content to be used to train our models and improve ChatGPT's performance for you and everyone who uses it. We take steps to protect your privacy."

Uncheck the box to stop sharing your data with ChatGPT.

In its documentation, OpenAI explains that it uses data from free users to train its models. Logically, accounts dedicated to businesses (such as ChatGPT Enterprise) are not affected, since these accounts handle sensitive data from client companies.

Our advice: if you have a free ChatGPT account, uncheck this box so your data is no longer shared with the AI.

A lack of transparency and control

The inner workings of AI systems often remain opaque: it is difficult to determine precisely how your data is used, processed and stored, or whether the information you transmit feeds the AI's knowledge base. By sharing personal information with an AI, you lose a significant part of your control over that data.

A risk of data leaks

No computer system is safe from hacks. Data leaks are a real threat, and your personal information could be compromised if stored by an AI. It is therefore important to limit the sharing of sensitive data to reduce this risk. One tip is to anonymize your data before sharing it with an AI tool.
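The anonymization tip above can be partly automated. Here is a minimal, illustrative Python sketch (the `redact` function and its patterns are our own example, not part of any AI tool) that replaces email addresses and French-style phone numbers with placeholders before text is pasted into a chatbot:

```python
import re

def redact(text):
    """Replace common personal identifiers with placeholders
    before pasting text into an AI chatbot. Illustrative only:
    real anonymization needs broader patterns (names, IDs, etc.)."""
    # Email addresses
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)
    # French-style phone numbers (e.g. "06 12 34 56 78" or "+33 6 ...")
    text = re.sub(r"\b0[1-9](?:[ .-]?\d{2}){4}\b", "[PHONE]", text)
    return text

sample = "Contact me at jean.dupont@example.com or 06 12 34 56 78."
print(redact(sample))  # Contact me at [EMAIL] or [PHONE].
```

A simple filter like this catches only the most obvious identifiers; for medical results or financial documents, manually removing names and reference numbers remains necessary.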

Malicious use of data

Personal data can be exploited for malicious purposes. This includes identity theft, targeted spam, or the spread of false information. These risks highlight the importance of protecting your personal information.

Bias and discrimination

AI systems are trained on large datasets that may contain biases present in society. These biases can lead to discriminatory outcomes, particularly in sensitive areas such as employment, credit or housing. Caution is therefore advised.

Invasion of privacy

Sharing personal information with an AI can compromise your privacy: you give up some control over your data and its use. Be aware that certain exchanges between the user and the AI can be reviewed by the teams behind the tool in order to improve it. Keeping your personal information confidential is essential to preserving your privacy.

Alternatives to conversational AI

To process sensitive data, it is often preferable to use open-source and transparent tools. Solutions are emerging in the field of AI, such as LibreChat. These alternatives allow better control over your data and reduce the risk of leaks and malicious use.

Doesn’t GDPR protect against the exploitation of our data by AI?

Regulations like the General Data Protection Regulation (GDPR) play a role in protecting personal data. They impose strict obligations on companies on how they collect, process and store personal information, providing an extra layer of security for users.

However, this does not mean that AI tools like ChatGPT are GDPR compliant: artificial intelligence is progressing at a rapid pace, and Europe is still working to regulate it. Even with protection at the European level, it is wise to take precautions in your use of AI: limit the sharing of sensitive data and favor transparent tools to protect your privacy.
