Facial recognition company Clearview AI is facing a potential fine in the UK and has also been handed a provisional notice to stop further processing of UK citizens’ data and to delete any data it already holds, following what the UK Information Commissioner’s Office (ICO) described as “alleged serious breaches” of national data protection law.
The ICO announced the provisional orders following a joint investigation with Australia’s privacy regulator, the Office of the Australian Information Commissioner (OAIC). The OAIC recently ordered Clearview AI to delete all images and facial templates belonging to individuals living in Australia.
Violating privacy
Oosto, a vendor of Vision AI and facial recognition technology, endorsed the Australian regulator’s decision. Following the OAIC privacy commissioner’s statement in November, Oosto CEO Avi Golan said, “Oosto endorses the Australian Government’s decision. Scraping images of people from the web without their consent is, in our view, a serious violation of the right to privacy.”
“As a vendor of AI-based facial recognition products for private companies and government agencies, it is important for me to emphasise: facial recognition apps should be provided with an empty database.”
Golan added, “It’s important to understand that there are alternative ethical approaches to facial recognition that do not require scraping images of people from social media, Google Images, LinkedIn, Instagram, among others.”
Cryptography and biometrics
Oosto’s position is that biometric systems should be deployed with empty databases, that adequate data and privacy safeguards should be built into the technology, and that improved operational due diligence should be observed.
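To illustrate what one such safeguard can look like in practice, the sketch below shows a captured biometric template being encrypted before it is written to storage. This is a hypothetical example, not Oosto’s implementation: it assumes the template is held as a NumPy array and uses the open-source cryptography library’s Fernet scheme, with key handling simplified for clarity.

```python
import numpy as np
from cryptography.fernet import Fernet

# Hypothetical illustration: encrypt a face template before it is persisted.
key = Fernet.generate_key()          # in practice the key would sit in a KMS/HSM
cipher = Fernet(key)

embedding = np.random.rand(512).astype(np.float32)  # stand-in for a real face template
token = cipher.encrypt(embedding.tobytes())          # only this ciphertext is stored

# Decryption happens only inside the matching service
restored = np.frombuffer(cipher.decrypt(token), dtype=np.float32)
assert np.array_equal(embedding, restored)
```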
Oosto says its safeguards include databases built from scratch by the customer organisation to meet its specific security needs, and strong encryption of any captured biometric data. The company also points out that its software includes a ‘GDPR mode’, which blurs the faces of people who do not appear on a watchlist.
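As a conceptual sketch of how such a ‘GDPR mode’ could work, faces detected in a video frame that are not matched to a watchlist could be blurred before the frame is displayed or stored. The interfaces below (the detection tuples and the `is_on_watchlist` matcher) are assumptions for illustration, not Oosto’s actual API.

```python
import cv2

def redact_non_watchlist(frame, detections, is_on_watchlist, ksize=(51, 51)):
    """Blur every detected face that is not matched to the watchlist.

    `detections` is assumed to be an iterable of (x, y, w, h, embedding) tuples
    from an upstream face detector/encoder; `is_on_watchlist` is a caller-supplied
    matching function. Both interfaces are hypothetical.
    """
    out = frame.copy()
    for (x, y, w, h, embedding) in detections:
        if not is_on_watchlist(embedding):               # unknown person: redact
            face = out[y:y + h, x:x + w]
            out[y:y + h, x:x + w] = cv2.GaussianBlur(face, ksize, 0)
    return out
```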