ChatGPT could return to Italy as early as May: here are the Privacy Guarantor's requests

ChatGPT could return to Italy as early as May. The path is marked out, according to a note issued today by the Privacy Guarantor, which asks OpenAI to comply with its requests regarding privacy by 30 April. At that point, "the reasons for urgency having lapsed, the Authority will suspend the provision for the temporary limitation of the processing of Italian users' data adopted against the US company, and ChatGPT will once again be accessible from Italy," reads the Guarantor's note. Experts see this as a signal that OpenAI, in its discussions with the Guarantor, has in fact committed to doing what was requested. "It is confirmation that the Guarantor's action was appropriate and timely, because OpenAI has now stated its intention to respect the privacy rules of the GDPR," Franco Pizzetti, former Privacy Guarantor and constitutional law scholar at the University of Turin, told Sole24Ore.

The Privacy Guarantor's requests

Here, in detail, is what the Guarantor requested today, in a provision that spells out what it had already indicated on 31 March (the well-known provision that led to the ChatGPT block). First, OpenAI will have to prepare and publish on its website a transparent privacy notice explaining the methods and logic behind the data processing needed for ChatGPT to function, as well as the rights of users and non-user data subjects. The notice must be easily accessible and placed so that it can be read before proceeding with any registration for the service. Users connecting from Italy must be shown the notice before completing registration and, also before completing registration, must be asked to declare that they are of age. Already registered users must be shown the notice at their first access after the service is reactivated and, on the same occasion, must pass through an age gate that, based on the declared age, filters out underage users.

The question of the legal basis

As for the legal basis for processing users' personal data to train the algorithms, the Privacy Guarantor has ordered OpenAI to remove any reference to the performance of a contract and to indicate instead, on the basis of the accountability principle, consent or legitimate interest as the precondition for using such data, without prejudice to the Authority's powers of verification and assessment following that choice. Further measures concern providing tools that allow data subjects, including non-users, to request the rectification of personal data about them that the service has generated inaccurately, or its erasure where rectification is not technically feasible. OpenAI will also have to allow non-user data subjects to exercise, in a simple and accessible way, the right to object to the processing of their personal data used to run the algorithms, and to grant the same right to users if it identifies legitimate interest as the legal basis for the processing.

Age verification for minors

As regards verifying the age of minors, in addition to the immediate introduction of an age declaration for registering with the service, the Authority has ordered OpenAI to submit, by 31 May, an action plan providing for the roll-out, by 30 September 2023 at the latest, of an age verification system capable of excluding access by users under thirteen and by minors lacking parental consent. Finally, in agreement with the Guarantor, OpenAI will have to run, by 15 May, an information campaign on radio, television, newspapers and the web to inform people about the use of their personal data for training the algorithms. This "also confirms that the Guarantor's intention was not to block the development of AI but to promote users' rights and a wider diffusion of AI culture," says Pizzetti, "because by implementing what has been requested, OpenAI will help all users better understand how these chatbots work with our personal data."
