Facebook Parent Meta Removes Deepfake Video Of Ukrainian President Zelenskyy

Facebook's parent company, Meta, said Wednesday that it removed a deepfake video of Ukrainian President Volodymyr Zelenskyy for violating the social network's rules against manipulated media. 

"It appeared on a reportedly compromised website and then started showing across the internet," said Nathaniel Gleicher, who heads security policy at Meta, in a tweet about the video. "We've quickly reviewed and removed this video for violating our policy against misleading manipulated media, and notified our peers at other platforms."

Deepfakes use artificial intelligence to create videos of people doing or saying things they never did. Gleicher said the Zelenskyy video made it appear the politician uttered a statement he actually didn't. Meta didn't identify the video or say what the statement was. CNET hasn't seen the video.

The removal of the video highlights the ongoing challenges social networks are facing as they try to curb the spread of misinformation after Russia's invasion of Ukraine. Some social media users have been posting old video footage on Twitter and Facebook to make it seem like they were recording events happening in real time. On TikTok, some people used old or out-of-context audio to create fake videos.

Meta doesn't always remove false content, leaning instead on directing users to authoritative sources or labeling misinformation. The social network partners with third-party fact-checkers to flag misinformation on its services, which also include photo-and-video app Instagram. Meta says it'll take down misinformation if there's a risk of physical harm or if the media is highly deceptive. 

"We remove this content because it can go viral quickly and experts advise that false beliefs regarding manipulated media often cannot be corrected through further discourse," Meta's rules against manipulated media says.
