Two photos went viral over the weekend, appearing to show Donald Trump alongside civil rights icon Dr. Martin Luther King Jr. And while they've racked up millions of views, both images are completely fake.
The images have been shared by Brigitte Gabriel, founder of the Trump-supporting political organization Act For America, who has a long history of sharing fake images on Twitter. Sometimes the images are outlandish, like a fake photo of Trump arresting Hillary Clinton in an orange jumpsuit. But other times they’re more believable, which can cause the fake photos to spread far and wide.
Gabriel’s black-and-white image of Trump and King together has been viewed over 8 million times in just two days, despite the fact that it’s completely fake.
Another photo-realistic image of Trump with King was shared by Gabriel on Sunday evening with fewer views, but it’s likely only a matter of time before it gets shared more widely, including on other platforms like Facebook and TikTok. That’s simply how things spread these days—leaping from one social media network to the next like some kind of aquatic disease jumping from lake to lake on the backs of an invasive species.
There’s no evidence that Trump ever met with King in any capacity before the civil rights leader was assassinated in 1968. Trump would’ve been just 21 years old when King was killed.
Trump, of course, also has a long history of extremely racist statements and even took out a full-page ad in a New York newspaper back in 1989 arguing the state should bring back the death penalty so that five Black men could be executed. The men were later exonerated by DNA evidence.
The fake images of King and Trump together were created using artificial intelligence software, though it's not clear precisely which program was used. AI image generators like DALL-E, Stable Diffusion and Midjourney allow anyone to create a photo-realistic image simply by typing a text prompt describing the scene they'd like to see. Companies with large photo libraries have filed suit against various image generators this year, including a lawsuit from Getty Images against Stability AI filed in February.
King’s living family members are not happy with the fake images that are currently circulating on social media.
“I’m not sure if @Twitter will do anything, but will you help me report this? Enough is enough,” Bernice King, MLK’s daughter, tweeted on Saturday about one of the fake images.
But the campaign to “report” the images is unlikely to succeed. Twitter does not prohibit the sharing of fake images, with owner Elon Musk saying that he prefers to have Community Notes fact-check images and claims that may go viral on the site. The feature first launched under the name Birdwatch in January 2021, after the insurrection at the U.S. Capitol. Musk bought Twitter in October 2022 and rebranded Birdwatch as Community Notes, which allows users to append context below any given tweet if the community decides the information is an important correction.
And fake images are still able to spread on Twitter with astonishing speed, long before Community Notes can give users a correction or proper context. Recent examples include Ron DeSantis making baby-talk about a burger, Mike Pence getting hit in the head with a water balloon, and a passenger on an infamous American Airlines flight claiming he saw a Reptilian creature wink at him. They were all fake. And we can certainly expect a lot more fakes in the coming days and weeks. This is social media, after all.