In the hours after a masked federal agent shot and killed Renee Nicole Good, a 37-year-old woman in Minneapolis, social media users began sharing AI-altered images they falsely claim “unmask” the officer and reveal his real identity. The agent was later identified by Department of Homeland Security spokesperson Tricia McLaughlin as an Immigration and Customs Enforcement officer.
The shooting occurred on Wednesday morning, and social media footage of the scene shows two masked federal agents approaching an SUV parked in the middle of the road in a neighborhood south of downtown Minneapolis. One of the officers appears to ask the driver to get out of the vehicle before grabbing the door handle. At this point, the driver appears to reverse, then drive forward and turn. A third masked federal officer, standing near the front of the vehicle, pulls out a gun and fires at the vehicle, killing Good.
The videos of the incident shared on social media in the moments after the shooting did not show any of the masked ICE agents with their masks off. Still, multiple images purporting to show an unmasked agent began circulating online within hours of the shooting.
The images appear to be screenshots taken from the actual video footage, altered with artificial intelligence tools to fabricate a face for the officer.
WIRED reviewed multiple AI-altered images of the supposedly unmasked agent shared across every mainstream social media platform, including X, Facebook, Threads, Instagram, Bluesky, and TikTok. “We need his name,” Claude Taylor, the founder of anti-Trump Mad Dog PAC, wrote in a post on X featuring an AI-altered image of the agent. The post has been viewed over 1.2 million times. Taylor did not respond to a request for comment.
On Threads, an account called “Influencer_Queeen” posted an AI-altered image of the agent and wrote: “Let’s get his address. But only focus on HIM. Not his kids.” The post has been liked almost 3,500 times.
“AI-powered enhancement has a tendency to hallucinate facial details leading to an enhanced image that may be visually clear, but that may also be devoid of reality with respect to biometric identification,” Hany Farid, a UC Berkeley professor who has studied AI’s ability to enhance facial images, tells WIRED. “In this situation where half of the face is obscured, AI, or any other technique, is not, in my opinion, able to accurately reconstruct the facial identity.”
Some of the people posting the images also claimed, without evidence, to have identified the agent, sharing the names of real people and, in some cases, linking to those people’s social media accounts.
WIRED found that two of the names circulating have no apparent connection to anyone associated with ICE. While many of the posts sharing these AI-altered images have limited engagement, some have gained significant traction.
One of the names shared online without evidence is Steve Grove, the CEO and publisher of the Minnesota Star Tribune, who previously worked in Minnesota Governor Tim Walz’s administration. “We are currently monitoring a coordinated online disinformation campaign incorrectly identifying the ICE agent involved in yesterday’s shooting,” Chris Iles, vice president of communications at the Star Tribune, tells WIRED. “To be clear, the ICE agent has no known affiliation with the Minnesota Star Tribune and is certainly not our publisher and CEO Steve Grove.”
This is not the first time AI-altered images have spread misinformation in the wake of a shooting. A similar situation unfolded in September, after Charlie Kirk was killed, when an AI-altered image of the shooter, based on grainy video footage released by law enforcement, was shared widely online. The AI image looked nothing like the man who was ultimately captured and charged with Kirk’s murder.