We all got an update on the quality of deepfake videos last week with the popularity of a set of "DeepTomCruise" videos on TikTok. I have been keeping track of these videos, which are created by various computer programs, and last wrote about them for Avast here. It doesn't take much imagination to see how this technology can be exploited, but lately there are some positive things to say about deepfake vids. Let's go to Korean TV, covered in this story by the BBC.
The announcer shown in the screen grab above is supposed to be the anchor Kim Joo-Ha, one of the regulars on the MBN channel. The broadcast looks pretty ordinary. But she was replaced by a computer program that generated a digital copy mimicking her facial expressions, voice and gestures. Now, before you get all in a twist, viewers were told ahead of time that this wasn't the real Kim and that the network was running it as a test. One place deepfakes could be useful is during real breaking news reports, when a network has to put someone on air quickly (as opposed to what American cable news calls breaking news).
Deepfake videos are increasingly being used for legitimate purposes, such as the corporate training videos created by Synthesia, a London-based firm. The tech can be useful and can cut production costs significantly if you are trying to produce a series in different languages and don't want to hire native speakers. USC's Shoah Foundation has produced a series of deepfake video interviews of Holocaust survivors: the public can ask questions of the survivors and get their answers in real time, all assembled by computers from hours of videotaped interviews.
The issue is the negative taint that has attached to deepfakes. In my post for Avast, I mentioned four different categories of abuse: porn, misinformation campaigns, evidence tampering and just plain fraud. Clearly, that leaves a lot of tempting ways for criminals to use them. So we have some work ahead to swing the technology toward more legitimate uses.
Also an issue: who owns the rights to the person being depicted, particularly if that person is no longer alive? This calls for some truth in labeling, so that viewers, as in the Korean example cited above, know exactly what they are watching.
The latest FBI warning about deepfake exploits: https://assets.documentcloud.org/documents/20509703/fbipin-3102021.pdf
Shelly Palmer's piece on them: https://www.shellypalmer.com/2021/03/be-very-worried-about-the-tom-cruise-deepfake/