Queen’s Christmas Speech: Channel 4 criticized for deepfake version

  • The British broadcaster Channel 4 has sparked controversy with a deepfake video presenting an alternative Christmas message, set to air on Friday.
  • The video shows a fake Queen discussing the royal family’s controversies, including Prince Andrew’s ties to Jeffrey Epstein and Prince Harry and Meghan Markle’s departure from royal duties.
  • Channel 4 said the video is intended as a “stark warning” about deepfake technology and fake news.
  • Critics, however, say the video makes deepfakes seem more widespread than they actually are.
  • Visit the Business Insider homepage for more news.

The British broadcaster Channel 4 has sparked controversy with a deepfake video presenting an alternative Christmas message, set to air on Friday.

Queen Elizabeth II delivers her annual address to the nation at 3 p.m. on Christmas Day, reflecting on the highs and lows of the past year. The message usually focuses on one theme, and in 2020 it is likely to center on the coronavirus pandemic and its impact on the UK.

Channel 4’s alternative message, however, will be a little different.

The five-minute video shows a digitally altered version of the Queen, voiced by actress Debra Stephenson, discussing some of the royal family’s most controversial moments of the year, including Prince Harry and Meghan Markle’s departure from royal duties and the Duke of York’s relationship with the disgraced financier and alleged sex offender Jeffrey Epstein, The Guardian reported.

In a short clip of the video released by the BBC, the fake Queen jokes, “There are few things more hurtful than someone telling you they prefer the company of Canadians” — a reference to Harry and Meghan’s move to Canada.

The video was intended as a “stark warning” about deepfake technology and fake news.

Ian Katz, Channel 4’s director of programmes, told the Guardian it was “a powerful reminder that we can no longer trust our own eyes”.

However, the project appears to have somewhat backfired, with experts noting that the video makes deepfake technology seem more common than it actually is.

“We have not yet seen deepfakes used widely except to attack women,” Sam Gregory, program director of Witness, an organization focused on video and human rights, told the Guardian.

“We should be very careful about making people think they can’t believe what they see. If you haven’t seen them before, this could lead you to believe that deepfakes are a more widespread problem than they are,” he added.

Deepfake technology is a growing problem, particularly for women targeted with nonconsensual deepfake pornography.

A disturbing investigation into a bot service that generates fake nudes underscored that the most urgent danger posed by internet deepfakes is not misinformation but revenge porn.

Deepfake-tracking firm Sensity, formerly known as Deeptrace, revealed on Tuesday that it had uncovered a massive operation disseminating AI-generated nude images of women and, in some cases, underage girls.

The service operated primarily on the encrypted messaging app Telegram, using AI-powered bots.

Deepfakes expert Henry Ajder told the Guardian: “I don’t think the video is realistic enough to be worrying in this case, but adding disclaimers before showing deepfake videos, or adding a watermark that can’t be cropped out or edited, can help ensure they are delivered responsibly.

“As a society, we need to work out what uses of deepfakes we find acceptable, and how we can navigate a future in which synthetic media are an increasing part of our lives.

“Channel 4 should be encouraging best practice.”