Channel 4 has sparked controversy and debate with a deepfake video of the Queen, to be broadcast on Christmas Day as an alternative to her traditional festive message.
The broadcaster will air a five-minute video in which a digitally rendered version of the Queen shares her thoughts on the year, including the departure of Prince Harry and Meghan Markle from the senior royal family and the Duke of York’s association with the disgraced financier Jeffrey Epstein.
The deepfake Queen, voiced by actress Debra Stephenson, is also seen performing a dance routine popularised on the social network TikTok.
Channel 4 said the broadcast was intended as a “stern warning” about the threat of fake news in the digital age, and its director of programmes, Ian Katz, described the video as “a powerful reminder that we can no longer trust our own eyes.”
Some experts suggest the broadcast might lead the public to believe that deepfake technology is used more widely than is actually the case.
“We have yet to see deepfakes used widely, except in attacks on women,” said Sam Gregory, program director of Witness, an organization that uses video technology to defend human rights. “We should be very careful about persuading people that they can’t believe what they see.
“If you haven’t seen one before, this could make you believe that deepfakes are a more widespread problem than they are,” he said. “It’s fine to expose people to deepfakes, but we shouldn’t escalate the rhetoric by claiming we’re surrounded by them.”
Areeq Chowdhury, the technology policy researcher behind deepfake videos of Jeremy Corbyn and Boris Johnson during the 2019 general election, said he supported the decision to highlight the impact of deepfakes, but that the technology does not currently pose a broad threat to the information ecosystem.
“The risk is that deepfakes become easier and easier to make. The obvious challenge is the spread of false information, but there is also the threat that genuine videos could be undermined by being dismissed as deepfakes,” he said.
“My view is that we should generally be concerned about this technology, but that the main problem with deepfakes today is their use in non-consensual deepfake pornography, not in misinformation.
“I don’t think the video is realistic enough to be a concern in this case, but adding disclaimers before showing a deepfake video, or adding a watermark so it can’t be cropped and edited, could help deliver them responsibly.”
“As a society, we need to work out which uses of deepfakes we find acceptable, and how we can navigate a future in which synthetic media are an increasing part of our lives. Channel 4 should encourage best practice.”