- cross-posted to:
- [email protected]
"A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK’s privacy watchdog.
Last year, Clearview AI was fined more than £7.5m by the Information Commissioner’s Office (ICO) for unlawfully storing facial images."
Privacy International (who, I believe, helped bring the original case) responded to this on Mastodon:
"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview’s activities are entirely “related to the monitoring of behaviour” of UK data subjects.
In essence, what Clearview does is large-scale processing of a highly intrusive nature. That, the Tribunal agreed.
BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn’t fall under UK GDPR jurisdiction.
So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical."
Because it operates on the data of UK residents.
The internet has made jurisdiction really weird. You can have photos of UK citizens, taken in the UK and stored on a UK server, and if a company from somewhere else scrapes that data without permission and moves it out of the UK, it doesn’t obviously follow that the data is now fine to use for whatever.
Now of course the law has to have some jurisdictional limits, but it’s not surprising that there has been disagreement about where those limits lie.