White House Shares AI-Altered Image of Arrested Civil Rights Lawyer

At a Glance

  • Nekima Levy Armstrong was arrested on Jan. 23 for violating the FACE Act.
  • The White House posted an AI-generated image of her crying, which her lawyer calls defamatory.
  • Legal experts say a lawsuit would face major hurdles, and the remedy may be political rather than judicial.

Why it matters: The incident shows how government officials can use AI to spread false images, and how difficult it can be to seek redress.

The arrest of civil rights attorney Nekima Levy Armstrong in St. Paul, Minnesota, sparked a controversy that has reached the White House. After participating in a protest at a church where the pastor had reportedly worked with ICE, Armstrong was taken into custody on Jan. 23 for allegedly violating the FACE Act, which protects places of worship from intimidation and interference. A video of the arrest, captured by her husband, shows federal agents recording her but assuring her that the footage would not be posted online.

Arrest and Video Footage

In the seven-minute clip, Armstrong asks the agent, “Why are you recording? I would ask that you not record.” The agent replies, “It’s not going to be on Twitter. It’s not going to be on anything like that.” Despite the assurance, the video was later posted to Twitter, now known as X. The clip shows Armstrong calm and composed, not in distress.

The Fake Image

The White House’s X account shared a different image: a portrait of Armstrong with tears rolling down her face. The photo was later identified as an AI manipulation, designed to make her appear crying. Her lawyer, Jordan Kushner, told the Associated Press that the image was a clear case of defamation. “It is just so outrageous that the White House would make up stories about someone to try and discredit them,” Kushner said. “She was completely calm and composed and rational. There was no one crying. So this is just outrageous defamation.”

Legal Implications

Defamation law requires a false statement of fact that harms a person’s reputation. Courts generally treat photographs as evidence of truth. In this case, experts note that the government could argue the image is a parody or so obviously false that it is not presented as truth. Even if the photo is deemed a false statement, Armstrong would still need to show actual harm to her reputation and that the government acted with actual malice: knowing the image was false and intending to harm her.

The First Amendment adds another layer. For a public figure, the bar for proving defamation is higher, requiring proof of actual malice. Armstrong’s arrest was a matter of public concern, and she may be considered a public figure. Legal scholars say it is unclear whether the government could meet the actual malice standard in this situation.

Expert Opinions

Law professor Eric Goldman of Santa Clara University School of Law emphasized the difficulty of a successful claim. He noted that the government’s use of a false image could be seen as government propaganda. “It’s so shocking to see the government put out a deliberately false image without claiming that they were manipulating the image,” Goldman said. He added that the photo would need to be presented as truthful for defamation to apply.

Goldman also highlighted the hurdles of proving harm and actual malice. “If you fictionalize a photo and present it as true, I think you might have actual malice,” he explained. “However, I’m not sure how that would play out in this circumstance.”

Other legal experts echoed Goldman’s assessment. They argued that a strong case for defamation is unlikely and that the primary remedy for government misinformation is political: voters replacing officials who publish false information.

AI Image Generators Tested

The article’s author tested several AI chatbots to see if they could produce a crying image of Armstrong. Google’s Gemini and OpenAI’s ChatGPT successfully generated such an image. Microsoft Copilot and Anthropic’s Claude refused, citing policies against editing images to add manipulated emotional expressions to real people. xAI’s Grok was unavailable at the time of testing.

These results illustrate the varying guardrails across AI platforms. Some allow the creation of deepfakes, while others restrict them to prevent misinformation.

Broader Context

The incident is part of a larger pattern of government officials using AI and other technologies to shape public perception. Earlier this week, Homeland Security Secretary Kristi Noem called Alex Pretti, a 37-year-old ICU nurse killed by ICE agents in Minneapolis, a domestic terrorist. Noem’s remarks were widely criticized for mischaracterizing a tragic death.

“I don’t think we’ve had enough discussion about AI deepfakes being weaponized by the government’s propaganda so they can lie against their constituents,” Goldman said. He added that the current legal framework may not be strong enough to punish the government for such abuses.

The case underscores the tension between technological capabilities and democratic accountability. While the legal path for Armstrong may be fraught, the public scrutiny of the White House’s actions could lead to political consequences.

Key Takeaways

  • Nekima Levy Armstrong was arrested on Jan. 23 for violating the FACE Act.
  • The White House posted an AI-generated image of her crying, which her lawyer claims is defamatory.
  • Defamation law requires a false statement, reputational harm, and actual malice, criteria that may be hard to meet in this case.
  • AI tools vary in their ability to create deepfakes; some refuse to manipulate real images.
  • The primary remedy for government misinformation may be political rather than judicial.

Author

  • Aiden V. Crossfield covers urban development, housing, and transportation for News of Austin, reporting on how growth reshapes neighborhoods and who bears the cost. A former urban planning consultant, he’s known for deeply researched, investigative reporting that connects zoning maps, data, and lived community impact.
