VANCOUVER -- The privacy commissioner of British Columbia has revealed that individual police officers in five departments across the province used a controversial facial recognition service that “illegally” harvested billions of social media images.
Michael McEvoy announced the details at a teleconference with other privacy commissioners on Wednesday, the result of months of investigation and review by their offices into Clearview AI, a company notorious for stealing online pictures and selling access to them to hundreds of companies around the world.
“In British Columbia we found no evidence of police forces officially taking on this technology,” he told reporters. “There were instances in, I think, five different departments across the province where individual officers had taken it upon themselves to do that.”
McEvoy said local police forces didn't authorize the use of the technology and were quick to shut it down.
"To the credit of the forces here in B.C. they recognize this is not a proper use of this kind of technology, and they dealt with it," he said.
The Abbotsford and Saanich police departments replied to an email request from CTV News saying none of their members had used Clearview AI. West Vancouver police, Metro Vancouver Transit Police, Victoria police and Delta police did not reply, while Vancouver police confirmed one of their child exploitation investigators had used the software as part of a free 30-day trial the company offered.
“The detective attempted to use the technology in a child pornography investigation and conducted one search, but was unsuccessful,” said Sgt. Steve Addison in an email. “This was the only time the officer used the free account, and the detective later cancelled it. The VPD has not authorized the use of Clearview AI for any of our members and we have no plans to use Clearview AI.”
Not just law enforcement using Clearview
McEvoy said a single employee at a private company was the only instance they found of Clearview AI being used outside the policing sphere, but he declined to name the employee or the company. Commissioners across Canada are discussing whether they will disclose the individuals or private companies that have been using the service.
Clearview AI itself agreed to stop providing its services in Canada in July of 2020, as the federal privacy commissioner and three provincial counterparts began investigating the company.
The U.S.-based technology firm boasts it has scraped billions of images from social media and has touted its identification services to law enforcement agencies as well as private companies. Clearview AI argued that since the images were posted to social media and publicly available, users couldn’t have an expectation of privacy.
But Canada’s top privacy watchdog denounced the technology and its premise as unlawful in this country, describing the company’s defence as “preposterous.”
“What Clearview does is mass surveillance and it is illegal,” said Daniel Therrien, federal privacy commissioner. “The worst part of the argument, as far as I’m concerned, is that their corporate interests should prevail over the privacy interests of the people whose images were scraped and then disclosed, so I think that’s preposterous, frankly.”
Police across Canada used controversial software
The RCMP acknowledged in February of last year that it had used the facial recognition technology, but insisted it was in a limited capacity for only a few months to investigate child exploitation cases. Other police agencies across the country have admitted to using it, but all insist their use was brief and limited.
In B.C., officers were quick to end their use of the service when departments were contacted by the privacy commissioner’s staff, so no repercussions are expected. But the province’s privacy watchdog admits his powers don’t have teeth against a company like Clearview.
No teeth for enforcement against foreign companies
“Our tools are far too weak,” said McEvoy. “I have the ability to make an order against this company, for example, to stop them from doing what they’re doing here in British Columbia with B.C. residents, but other penalties are needed — monetary penalties are needed to act as a deterrent for companies, to prevent them from doing these kinds of things.”
McEvoy said those kinds of penalties don't exist in Canada yet.
Evan Solomon, host of CTV’s Power Play, recently asked the federal Public Safety Minister whether he would ban use of the technology in Canada.
“The invasion of Canadians’ privacy rights is not insignificant,” said Bill Blair. “Going forward, the use of every innovative and emerging technology for public safety purposes has to be informed by the advice and direction we receive from the privacy commissioner.”
“We can't stop them from collecting our data, even though we've asked them to – they've basically flipped us the bird on this,” countered Solomon. “Can we stop them?"
Blair appeared flustered by the question, replying: "I think that needs to be determined. I want legal advice."
CTV News reached out to Clearview AI with several questions; so far, the company hasn’t responded.
Legislation not keeping up with technology
Thinking twice about what photos to post on social media and keeping privacy settings high are golden rules of internet use, but the crux of the issue in this case is a company profiting off those images.
“There are exemptions for journalists and the (Privacy) Act is very clear about that, but where it’s a commercial enterprise operating for-profit using your image for profit they need to get your permission and that’s what they haven’t done,” McEvoy said. “That’s what the law in British Columbia says and that’s what the law federally is as well and that’s what the company did without anyone’s permission.”
And while regulators and lawmakers are working on modern-day issues like for-profit mass surveillance, it’s a slow process; watchdogs are always playing catch-up, waiting for the next privacy issue to present itself. Clearview AI had flown under the radar until an exposé by the New York Times last year.
“We wanted to send a clear signal: this kind of action by a company is not acceptable, it’s not compliant with the law and won’t be tolerated in this country,” McEvoy said, adding that society needs to consider whether this kind of technology should be used.
"We all, in effect, become suspects, as it were, in a police lineup by having our faces put into this databank for surveillance purposes."