AI is a powerful tool. The original photo (left) was cleaned up with an AI deep-learning algorithm, restoring tremendous clarity (image source: Murilo Gustineli).
The researchers describe their work in the paper Towards Real-World Blind Face Restoration with Generative Facial Prior (https://arxiv.org/pdf/2101.04061), and code is available for others to try on their project page: https://xinntao.github.io/projects/gfpgan.
The GFP-GAN system (GFP: Generative Facial Prior; GAN: Generative Adversarial Network), published by Xintao Wang, Yu Li, Honglun Zhang, and Ying Shan, restores images far better than previous AI systems. The results are nothing short of impressive.
As a privacy professional, when I see these transformational examples, I have grave concerns about undesired monitoring of the population: the ability to clean up distant or low-quality surveillance images could be used to identify and track people.
Digital cameras are widely deployed by businesses and governments. A major limitation is the clarity of images at a distance, which makes it very difficult to positively identify subjects. With AI image-enhancement tools, identifying people at great distances or with poor-resolution cameras could be automated at scale. That could allow tracking people wherever they go, cataloging everyone they speak with, and, if eventually applied to lip reading, even eavesdropping on conversations at a distance.
However, you may be shocked to know that I am equally excited, because this is also a potentially PRIVACY-ENHANCING technology! The same type of AI can be used to perturb clear images in ways that undermine facial-recognition algorithms.
Imagine this tech embedded in privacy-supporting cameras that modify pixels in ways that are unnoticeable to the human eye but thwart AI systems from conducting bulk identification of people in the video feed. Humans would still see unblurred images, while automated processes would be prevented from harvesting identified personal data at scale. Such a use could strike a desirable balance between security and privacy.
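The core idea behind such pixel-level cloaking can be sketched in a few lines. This is a toy illustration, not a real attack: production systems (e.g. Fawkes-style cloaking) derive the perturbation from a face-recognition model's gradients, whereas here random noise simply demonstrates the "imperceptibility budget" concept — changes bounded by a small epsilon per pixel. The function name and epsilon value are illustrative assumptions.

```python
import numpy as np

def perturb_image(img: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add noise bounded by +/- epsilon (on a 0-255 scale) to each pixel.

    A real privacy-cloaking system would compute this perturbation from
    a recognition model's gradients; random noise is used here purely to
    illustrate how small the per-pixel change can be.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=img.shape)
    # Clip to the valid pixel range and convert back to 8-bit.
    return np.clip(img.astype(np.float64) + noise, 0, 255).astype(np.uint8)

# Toy frame: a uniform gray 8x8 "image" standing in for a video frame.
frame = np.full((8, 8, 3), 128, dtype=np.uint8)
cloaked = perturb_image(frame)

# Every pixel moved by at most epsilon -- invisible to a human viewer,
# yet it alters the exact values a recognition model consumes.
max_delta = np.abs(cloaked.astype(int) - frame.astype(int)).max()
```

A viewer sees an unchanged gray frame, while each pixel a downstream model reads has shifted by up to 2/255 — the same budget an adversarial cloaking method would spend in directions chosen to break recognition.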
It is up to everyone to decide how such tools will be used.