A corporate accountability group called Ekō submitted ten ads to Meta and X that contained clear examples of extremist hate speech, incitement to violence ahead of the German election, and AI-generated imagery, all of which are grounds for blocking an ad from running.
The ads called for the imprisonment and gassing of immigrants and the burning of mosques, used dehumanising language, and equated immigrants with animals and pathogens. The accompanying AI-generated images depicted violent imagery, such as "scenes of immigrants crowded into a gas chamber and synagogues on fire."
The submissions were made between 10 and 14 February; according to the researchers, Meta approved half of them within 12 hours, and X scheduled all of the submitted ads for publication. Ekō's researchers then removed the ads before they went live, so they were never seen by the platforms' users.