Legal and data privacy professionals are turning their attention to OpenAI’s video generation tool, Sora, as the Italian Data Protection Authority, the Garante, investigates its operations. The outcome of this investigation could have implications for generative AI companies that rely on legitimate interest as their legal basis for data processing under the General Data Protection Regulation (GDPR).
Generative AI tools such as OpenAI’s Sora create new content based on the data they were trained on, which often includes vast amounts of material scraped from the internet. While these tools can produce realistic, high-quality content quickly and efficiently, their use of training data may conflict with the GDPR. The question hinges on whether the data these systems process qualifies as personal data under the regulation and, if so, whether the company can rely on legitimate interest as its legal basis for processing it.
The Garante’s investigation into Sora is therefore being closely watched by the legal community. If the authority decides in OpenAI’s favor and finds that the company’s use of data to train Sora qualifies as a legitimate interest, it could set a precedent for other generative AI companies. If it decides against OpenAI, however, the ruling could prompt those companies to review their operations to ensure compliance with the GDPR.
This is just one instance of a larger trend: legal authorities grappling with the implications of new technology for established laws and regulations. With technology advancing rapidly, aligning it with the current legal framework is challenging but essential. Legal professionals will have to keep adapting and expanding their understanding of how technologies like AI fit into existing privacy law, and of how that law may need to evolve.