Meta Will Begin Labeling Images Generated by AI

Meta will roll out a new system for detecting and labeling images created by artificial intelligence (AI) tools from other companies, including OpenAI, Google, and Microsoft. The aim is to inform users that the images they see, which often resemble real photographs, are digitally generated.

Nick Clegg, Meta’s president of global affairs, announced on February 6 that Meta will use invisible markers embedded in image files to detect AI-generated content and label it when it is posted on Meta’s platforms. The move aims to curb realistic-looking but fake imagery and to help remove harmful content across its platforms.
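One concrete example of the kind of marker involved is the IPTC “Digital Source Type” metadata field, whose trainedAlgorithmicMedia value identifies media produced by a generative model. As a rough illustration only (this sketch is an assumption, not Meta’s actual detection pipeline, and the byte scan below is a crude stand-in for proper XMP parsing), a check for that marker might look like this:

```python
# Sketch: look for the IPTC "Digital Source Type" value that marks
# AI-generated media inside an image file's embedded metadata.
# Illustrative assumption only, not Meta's detection code; real systems
# also rely on invisible watermarks that metadata inspection cannot see.

# IPTC NewsCodes URI identifying media created by a generative model.
TRAINED_ALGORITHMIC_MEDIA = (
    b"http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"
)

def looks_ai_generated(path: str) -> bool:
    """Return True if the file's raw bytes contain the IPTC marker.

    Scanning raw bytes is a crude stand-in for real XMP parsing, and a
    negative result proves nothing: metadata is easily stripped, which
    is why watermarks are used alongside it.
    """
    with open(path, "rb") as f:
        data = f.read()
    return TRAINED_ALGORITHMIC_MEDIA in data

if __name__ == "__main__":
    import sys
    for image_path in sys.argv[1:]:
        flag = "AI-marked" if looks_ai_generated(image_path) else "no marker found"
        print(f"{image_path}: {flag}")
```

Because such metadata can be removed by a simple re-save, any real detection system pairs it with watermarks embedded in the pixels themselves.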

Meta is confident in its ability to label AI-generated images, but marking audio and video content is more complicated and still under development. In the meantime, Meta plans to require users to label their own altered audio and video content, with penalties if they fail to do so.

“Even though the technology is not yet fully mature, particularly when it comes to audio and video, the hope is that we can create a sense of momentum and incentive for the rest of the industry to follow,” Clegg said. Labeling text written by AI tools such as ChatGPT, however, remains impractical, because no viable mechanism exists for marking generated text.

Meta’s Oversight Board recently criticized the company’s policy on misleadingly altered videos, arguing that such content should be labeled rather than removed. Clegg said he agreed with the critique and indicated that Meta is already moving in the direction the board proposed, saying that Meta’s existing policy “is just simply not fit for purpose in an environment where you’re going to have way more synthetic content and hybrid content than before.”
