Myanmar's military coup government recently added serious corruption charges to the series of existing spurious cases against Myanmar leader Aung San Suu Kyi. The new charges rest on a statement made by a prominent detained politician, first released in a video in March, which many people in Myanmar suspected was a deepfake.
In the video, the political prisoner's voice and face appear distorted and unnatural as he claims, in detail, to have provided gold and cash to Aung San Suu Kyi. Social media users and journalists in Myanmar immediately questioned whether the statement was real. The incident illustrates a problem that will only get worse: as actual deepfakes get better, people become increasingly willing to dismiss real footage as a deepfake. What tools and skills are available to investigate both kinds of claims, and who will use them?
In the video, Phyo Min Thein, the former chief minister of Yangon, Myanmar's largest city, sits in a bare room, apparently reading from a statement. His voice sounds odd and unlike his normal voice, his face barely moves, and in the poor-quality version that first circulated, his lips seem out of sync with his words. Seemingly everyone wanted to believe it was fake. Screenshots from an online deepfake detector spread quickly, showing a red box around the politician's face and asserting with more than 90% confidence that the confession was a deepfake. Myanmar journalists lack the forensic skills to make that judgment, and the military's past and present conduct reinforced the grounds for suspicion: a government spokesperson had previously shared staged pictures targeting the Rohingya ethnic group, and the organizers of the military coup have denied that social media evidence of their killings could be real.
But is the prisoner's "confession" actually a deepfake? Together with deepfake researcher Henry Ajder, I consulted deepfake creators and media forensics experts. Some pointed out that the video was of such low quality that the mouth glitches people saw were as likely to be artifacts of compression as evidence of deepfakery, and that detection algorithms are unreliable on low-quality compressed video. His unnatural-sounding voice could be the result of reading a script under extreme pressure. If the video is a fake, it is a very good one, because his throat and chest move in sync with his words at key moments. The researchers and creators were generally skeptical that it was a deepfake, though not certain. At this point, it seems more likely to be something human rights activists like me are all too familiar with: a coerced or forced confession delivered on camera. Moreover, given the circumstances of the military coup, the substance of the allegations should not be believed absent a legitimate judicial process.
Why does this matter? Whether the video is a forced confession or a deepfake, the result is likely the same: words forced out of a prisoner's mouth, digitally or physically, by a coup government. And although the use of deepfakes to create nonconsensual sexual images currently far outstrips political examples, deepfake and synthetic media technology is rapidly improving, proliferating, and commercializing, expanding the potential for harmful uses. The Myanmar case demonstrates the growing gap between the capability to produce deepfakes, the opportunity to claim that a real video is a deepfake, and our ability to challenge either.
It also illustrates the challenge of having the public rely on free online detectors without understanding the strengths and limitations of detection, or how to second-guess a misleading result. Deepfake detection is still an emerging technology, and a detection tool suited to one creation method often fails on another. We must also be wary of counter-forensics, in which someone deliberately takes steps to confuse detection methods. And it is not always possible to know which detection tools to trust.
How can we avoid conflicts and crises around the world being blindsided by deepfakes and supposed deepfakes?
We should not be turning ordinary people into deepfake spotters, parsing pixels to tell truth from falsehood. Most people will do better relying on simpler media literacy methods that emphasize checking other sources or tracing a video's original context. In fact, encouraging people to become amateur forensics experts can send them down the conspiracy rabbit hole of distrusting all images.