OpenAI, the organization behind ChatGPT, announced on Tuesday the launch of a new tool capable of detecting images created by its text-to-image generator, DALL·E 3.

The decision to develop and release the software comes amid growing concern about the role of AI-generated content in this year’s global elections, particularly its use to spread misinformation.

OpenAI, which is backed by Microsoft, said the tool correctly identified images created by DALL·E 3 approximately 98% of the time in internal testing and can withstand common image alterations such as compression, cropping, and changes in saturation.

Additionally, OpenAI is implementing tamper-resistant watermarking to help authenticate digital content such as photos and audio, with watermarks designed to be difficult to remove. To help establish more robust standards for media provenance, OpenAI has also joined the Coalition for Content Provenance and Authenticity (C2PA), an industry group whose members include Google, Microsoft, and Adobe.

To further address these challenges, OpenAI is partnering with Microsoft to launch a $2 million “societal resilience” fund aimed at bolstering AI education and awareness. The initiative reflects a growing recognition of the need to equip societies with the knowledge to navigate the complexities introduced by advanced AI.

