The UK Information Commissioner has said she is “deeply concerned” live facial recognition (LFR) may be used “inappropriately, excessively or even recklessly”.
Elizabeth Denham questioned what would happen if it was combined with social media and other big data.
There is a “high bar” for LFR where “we shop, socialise or gather”, she wrote.
New guidance for companies and public organisations using the technology has also been published.
In a blog post, Ms Denham addressed the use of LFR, saying facial recognition technology could be useful, allowing us to unlock our mobile phones or set up a bank account online.
But when people’s faces were scanned and processed by algorithms in real time and in public places, the risks to privacy increased.
“We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take,” she wrote.
The tech could create instant profiles of people to serve them personalised adverts, or match shoppers’ faces against watch-lists of known shoplifters.
In a separate Commissioner’s Opinion, the ICO revealed it was aware of proposals to use live facial recognition in billboards.
Ads in public spaces might be able to tell how engaged a person was, or estimate their age, ethnicity, sex and even clothing styles and brands, in order to serve personalised content.
Billboards might even remember faces, allowing companies to track individuals’ visits across different locations.
Taking steps
Companies also needed to be aware of the dangers of bias in facial recognition systems and the risks of misidentification.
The Commissioner’s Opinion sets standards for the use of live facial recognition by companies and public bodies; police use was addressed in an earlier document.
The new opinion revealed that out of six ICO investigations into LFR systems, none of the systems that actually went live were fully compliant with data protection law.
All of the organisations chose to stop, or not to proceed with, their use of the technology.