
Deepfake video detectors have been tricked for the first time: a universal way to deceive them is here

Programmers in California were the first to demonstrate that deepfake detectors can be tricked, the press service of Cornell University reported. The researchers declined to release their code publicly so that it could not be used to distort reality for illegal purposes.

Deepfake technology makes it possible to swap the face of any subject in a video for someone else’s, and the altered footage looks quite believable. With a deepfake, you can create realistic footage of events that never actually happened.

Nicolas Cage instead of Amy Adams in Man of Steel / © DeepFake Videos, YouTube

According to the work of American computer-engineering specialists, the new neural network offers a fairly simple way of “cheating” any deepfake detector. The algorithm was presented at the Winter Conference on Applications of Computer Vision (WACV 2021), held in January this year. The researchers insist that their solution demonstrates the ineffectiveness of even the most advanced video authentication tools available.

The deception method works by embedding special adversarial examples into every frame of the video. The effect is truly discouraging: even a hastily concocted deepfake is classified by the detector as a genuine recording, although the unnatural behavior of the person in the video is obvious to the naked eye. The researchers demonstrated such deception in practice.

©University of California San Diego, SciTechDaily, YouTube
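The authors’ actual attack code has not been released, but the general idea of embedding adversarial noise in each frame can be roughly sketched. The snippet below is a minimal illustration, not the paper’s method: it uses a hypothetical frame-level classifier called `detector` (standing in for any real deepfake detector) and a single FGSM-style step that nudges each frame toward the “real” class; the label convention and the `epsilon` value are assumptions.

```python
import torch
import torch.nn.functional as F


def perturb_frames(frames, detector, epsilon=2 / 255, real_label=0):
    """FGSM-style sketch: nudge each frame toward the detector's 'real' class.

    frames   -- float tensor of shape (N, 3, H, W) with values in [0, 1]
    detector -- hypothetical frame-level deepfake classifier returning logits
                (assumed convention: class 0 = real, class 1 = fake)
    epsilon  -- maximum per-pixel change, kept small so the edit stays invisible
    """
    frames = frames.clone().detach().requires_grad_(True)
    logits = detector(frames)                              # (N, 2) logits per frame
    target = torch.full((frames.size(0),), real_label, dtype=torch.long)
    loss = F.cross_entropy(logits, target)                 # loss for "looks real"
    loss.backward()
    # Step against the gradient, then clamp back into the valid pixel range.
    adv = frames - epsilon * frames.grad.sign()
    return adv.clamp(0.0, 1.0).detach()
```

A real attack would iterate such a step many times and account for compression, but the core trick is the same: a tiny, targeted change applied to every frame of the clip.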


Importantly, the trick survives video transcoding and works against many of the most advanced detectors. In fact, the experts conclude that, because of how forgery-detection algorithms work, it is remarkably easy to get around them all.
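The article does not describe how robustness to transcoding was verified, but a basic check can be sketched: write the perturbed frames to a lossily compressed video, read it back, and see whether the detector still labels the result as real. The `detector`, the output path, and the decision threshold below are placeholders, not the authors’ evaluation protocol.

```python
import cv2
import numpy as np
import torch


def survives_transcoding(adv_frames, detector, path="adv.mp4", fps=30.0, threshold=0.5):
    """Rough check (an assumption, not the paper's protocol): does the adversarial
    video still fool the hypothetical detector after lossy re-encoding?"""
    n, _, h, w = adv_frames.shape
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for f in adv_frames:
        # (3, H, W) float in [0, 1] -> (H, W, 3) BGR uint8 expected by OpenCV
        img = (f.permute(1, 2, 0).numpy()[:, :, ::-1] * 255).astype(np.uint8)
        writer.write(np.ascontiguousarray(img))
    writer.release()

    cap, scores = cv2.VideoCapture(path), []
    while True:
        ok, img = cap.read()
        if not ok:
            break
        t = torch.from_numpy(img[:, :, ::-1].copy()).permute(2, 0, 1).float() / 255
        with torch.no_grad():
            # Probability of class 0 ("real") under the assumed label convention.
            scores.append(torch.softmax(detector(t.unsqueeze(0)), dim=1)[0, 0].item())
    cap.release()
    return float(np.mean(scores)) > threshold
```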

Such techniques can become far more sophisticated, which again calls into question the very idea of trusting video footage. More precisely, it raises concerns that manipulating public opinion may prove much easier for attackers than law enforcement agencies and the public would like. If at first the wave of deepfakes was seen as mere entertainment, the danger of the technology soon became obvious.

People are used to trusting what they see with their own eyes, and a well-made deepfake can be indistinguishable from a real video. Sometimes it is not even necessary to replace a person’s face; synchronizing their lip movements with fabricated speech is enough. The range of possible uses for such manipulation is virtually limitless, and detectors of such fakes are being developed by hundreds of teams around the world.

