Light reflections in the eyes reveal deepfakes

Computer scientists at the University at Buffalo have developed a tool that can automatically identify deepfake photos by analyzing the light reflections in the eyes.

The tool proved 94% effective on portrait-like photos in experiments described in a paper accepted at the IEEE International Conference on Acoustics, Speech and Signal Processing, held in Toronto in June.

"The cornea is almost like a perfect semisphere and is very reflective. Anything that comes to the eye with light emitted from those sources will be reflected on the cornea," said the paper's lead author, Siwei Lyu, SUNY Empire Innovation Professor in the Department of Computer Science and Engineering.

"The two eyes should have very similar reflective patterns because they're seeing the same thing. It's something we typically don't notice when we look at a face," added Lyu, an expert in multimedia and digital forensics.

However, most images generated by artificial intelligence, including those produced by generative adversarial networks (GANs), fail to render these reflections accurately or consistently.

The new tool exploits this shortcoming by detecting small discrepancies in the light reflected in the eyes of deepfake photos.

To conduct the experiments, the research team gathered real photographs along with fake, AI-generated faces.

All of the images were portrait-style, showing real and fake people looking directly into the camera under good lighting, at a resolution of 1,024 by 1,024 pixels.

The tool works by mapping each face, then examining the eyes, the eyeballs, and the light reflected in each eyeball, comparing potential differences in the shape, intensity, and other characteristics of the reflected light.
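The comparison step described above can be sketched in a few lines. This is a minimal illustration, not the published method: it assumes the two eye regions have already been cropped and aligned, treats near-maximum-brightness pixels as specular highlights, and scores their agreement with a simple intersection-over-union; the function names and the brightness threshold are hypothetical.

```python
import numpy as np

def highlight_mask(eye_region, thresh=0.9):
    """Binarize an eye crop: pixels near the maximum brightness are
    treated as specular highlights (reflections of light sources)."""
    gray = eye_region.astype(float)
    return gray >= thresh * gray.max()

def highlight_similarity(left_eye, right_eye):
    """Intersection-over-union of the two highlight masks.
    Consistent reflections score near 1; mismatched ones score low."""
    a = highlight_mask(left_eye)
    b = highlight_mask(right_eye)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

# Toy demo: identical highlight spots vs. displaced ones.
eye = np.zeros((32, 32))
eye[10:14, 10:14] = 1.0          # a small bright spot
shifted = np.zeros((32, 32))
shifted[18:22, 18:22] = 1.0      # same spot, but elsewhere
print(highlight_similarity(eye, eye))      # 1.0 (consistent)
print(highlight_similarity(eye, shifted))  # 0.0 (mismatched)
```

In a real pipeline the eye crops would come from a face-landmark detector, and the score would feed a threshold or classifier separating real portraits from GAN-generated ones.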

While the technique appears promising, it has limitations: it requires a reflected source of light, and mismatched light reflections in the eyes can be touched up when an image is edited.

In addition, the technique looks only at the individual pixels reflected in the eyes, not the shape of the eyes, the shapes of the reflections, or the nature of the objects being reflected.

The technique compares the highlights in both eyes; if the subject is missing an eye, or an eye is not visible, it fails.

Lyu, who has researched machine learning and computer vision projects for more than 20 years, previously showed that the blink rate of subjects in deepfake videos is often inconsistent or absent.
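A blink-rate check of this kind can be illustrated with a toy counter. This sketch is hypothetical and not taken from the article: it assumes a per-frame "eye openness" signal has already been extracted (in practice, an eye aspect ratio computed from facial landmarks) and simply counts runs of frames where the signal drops below a closed-eye threshold.

```python
def count_blinks(openness, closed_thresh=0.2):
    """Count blinks in a per-frame eye-openness signal.
    A blink is a contiguous run of frames below the threshold."""
    blinks, in_blink = 0, False
    for v in openness:
        if v < closed_thresh and not in_blink:
            blinks += 1          # a new closed-eye run begins
            in_blink = True
        elif v >= closed_thresh:
            in_blink = False     # eye reopened
    return blinks

# Synthetic signal: two dips below the threshold -> two blinks.
signal = [0.30, 0.31, 0.05, 0.04, 0.30, 0.32, 0.06, 0.30]
print(count_blinks(signal))  # 2
```

A forensic tool would then compare the resulting blinks-per-minute against typical human rates; an implausibly low count is a cue that the video may be synthetic.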

In 2020, Lyu helped Facebook with its global deepfake detection challenge and helped create the DeepFake-o-meter, an online resource that lets ordinary people test whether a video they have watched is a deepfake.

From disinformation campaigns to nonconsensual pornography, deepfakes are being put to many sinister uses and are becoming increasingly difficult to detect.
