Facebook Wants to Detect Deepfakes via Reverse Engineering


Facebook wants to decipher deepfakes backwards, so to speak, in order to recognize them as such. Using reverse engineering, individual images are broken down into their components and the information they carry. The method was developed jointly with Michigan State University.

Instead of trying to recognize a deepfake from the image itself, detection is now based on attributes of the image, such as which generative model was used to create the fake. Until now, the focus has often been on checking whether specific images were used to train known models. According to a Facebook blog post, however, this is rarely the case in practice, which makes that approach of little use for detection.

The reverse-engineering method derives information about the generative model from the finished deepfake. The goal is to recognize fingerprints: just as cameras and sensors leave unique patterns in photos, generative models leave characteristic traces in their output. "It is the first time that researchers have managed to identify properties of a model that was used to create a deepfake without first knowing anything about the model." This should make it easier to discover deepfakes and to show whether different forgeries come from the same source. Above all, Facebook wants to expose coordinated disinformation campaigns in which deepfakes are used.
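The fingerprint idea can be illustrated with a toy sketch. This is not Facebook's actual method; it simply assumes, by analogy with camera sensor noise, that a generative model leaves a faint additive high-frequency pattern in every image it produces. Averaging the high-pass residuals of many images from the same source suppresses the varying content and leaves an estimate of the shared pattern, which a new image's residual can then be correlated against. All function names here are illustrative.

```python
import numpy as np

def highpass_residual(img):
    """High-frequency residual: image minus a crude 3x3 box-blur denoiser."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(
        padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    return img - blurred

def estimate_fingerprint(images):
    """Average residuals over many images from one source; varying content
    averages out, the shared pattern remains."""
    return np.mean([highpass_residual(im) for im in images], axis=0)

def correlation(a, b):
    """Normalized cross-correlation between two residual patterns."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

To attribute a suspect image, one would compute `correlation(highpass_residual(img), fp)` against each known fingerprint `fp` and pick the best match. Real systems learn far richer model attributes (architecture, loss type) than this additive-noise caricature.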

Videos, photos, and audio created with deep learning and intended to manipulate viewers in a targeted way are already prohibited on the platform; satire and parody are exempt. Facebook worked out the criteria for deleting such content with experts. They also apply to contributions assembled from multiple sources. Distinguishing permitted parody from forbidden forgery is likely to remain difficult.


Other networks are also trying to curb deepfakes. Google has created an experimental platform, "Assembler", to help researchers and journalists spot them. It runs under the umbrella of Jigsaw, an Alphabet subsidiary that works to combat extremism and protect freedom of information. There, Google brings together various detectors from universities in order to improve results and increase the hit rate.
