Meta to require political ads to reveal AI-altered images

Facebook and Instagram parent Meta says that starting next year it will reject election ads that try to hide that their content was created or altered using artificial intelligence. Photo: GEORGE FREY / GETTY IMAGES NORTH AMERICA/Getty Images via AFP
Source: AFP

Meta on Wednesday said that advertisers will soon have to disclose when artificial intelligence (AI) or other software is used to create or alter imagery or audio in political ads.

The requirement will take effect globally at Facebook and Instagram at the start of next year, parent company Meta said.

"Advertisers who run ads about social issues, elections and politics with Meta will have to disclose if image or sound has been created or altered digitally, including with AI, to show real people doing or saying things they haven't done or said," Meta global affairs president Nick Clegg said in a Threads post.

Advertisers will also have to reveal when AI is used to create completely fake yet realistic people or events, according to Meta.

Meta will add notices to ads to let viewers know what they are seeing or hearing is the product of software tools, the company said.

In addition, Meta's fact checking partners, which include a unit of AFP, can tag content as "altered" if they determine it was created or edited in ways that could mislead people, including through the use of AI or other digital tools, the company said.

Fears of increasingly powerful AI tools include the potential for them to be used to deceive voters during elections.

Microsoft this week announced new measures it will take as part of its efforts to help protect elections from "technology-based threats" such as AI.

"The world in 2024 may see multiple authoritarian nation states seek to interfere in electoral processes," Microsoft chief legal officer Brad Smith and corporate vice president Teresa Hutson said in a blog post.

"And they may combine traditional techniques with AI and other new technologies to threaten the integrity of electoral systems."

Tools Microsoft plans to release early next year include one that enables candidates or campaigns to embed "credentials" in images or video they produce.

"These watermarking credentials empower an individual or organization to assert that an image or video came from them while protecting against tampering by showing if content was altered after its credentials were created," Smith and Hutson said in the post.

Microsoft said it will also deploy a team to help campaigns combat AI threats such as cyber influence campaigns and fake imagery.

Source: AFP
