

Of the 84 criminal investigations where searches were completed, the report says that 25 were advanced through Clearview AI, with investigators identifying or confirming the whereabouts of four suspects, 12 victims and two witnesses.

Detective introduced to Clearview AI at conference

Toronto police attended a conference in October 2019 in the Netherlands, where one detective was introduced to Clearview AI through a showcase put on by the FBI and the U.S. Department of Homeland Security, according to the service's internal report. (Thomas Peter/Reuters)

According to the report obtained by CBC News, Toronto police were first introduced to Clearview AI at a victim identification conference in the Netherlands in October 2019. While there, a detective attended an FBI and U.S. Department of Homeland Security showcase of the technology as an investigative tool in identifying exploited children online - and also used Clearview AI in connection with real child exploitation investigations.
Within days of returning to Toronto, the service obtained a free trial of Clearview AI. By the end of October, investigators from both the child exploitation and intelligence services were using the technology. By mid-December 2019, an internal showcase of Clearview AI was held for roughly 100 investigators from sex crimes, homicide and financial crimes units.

Brenda McPhail of the Canadian Civil Liberties Association says the police use of facial recognition technology as an investigative tool is a 'slippery slope' when it comes to privacy. (Submitted by Brenda McPhail)

According to the report, detectives using the technology only met with Crown attorneys about Clearview AI after a New York Times investigation in January 2020 revealed details of how the company compiled its database and its use by more than 600 law enforcement agencies in Canada, the United States and elsewhere. Soon after, then-Toronto police chief Mark Saunders was informed that his officers were using the software and ordered them to stop in February 2020.

Since then, four Canadian privacy commissioners have determined that Clearview AI conducted mass surveillance and broke Canadian privacy laws by collecting photos of Canadians without their knowledge or consent.

Given those findings, the co-chair of the Criminal Lawyers' Association's criminal law and technology committee says the police service's lack of due diligence before using Clearview AI could put cases where it was used at risk.

"If police violated the law as part of their investigations, this could make those investigations vulnerable to charter challenges," said Eric Neubauer, a Toronto lawyer. Given those findings, the co-chair of the Criminal Lawyers' Association's criminal law and technology committee says the police service's lack of due diligence before using Clearview AI could put cases where it was used at risk. Since then, four Canadian privacy commissioners have determined that Clearview AI conducted mass surveillance and broke Canadian privacy laws by collecting photos of Canadians without their knowledge or consent.
Firefighters found Maryna Kudzianiuk with serious injuries while responding to a fire at this highrise building in Toronto in January 2020.

Toronto police confirmed with CBC News that the victim in that case was Maryna Kudzianiuk. Police ruled her death a homicide after she died in hospital.

No plans to use Clearview AI: Toronto police

In a statement, Toronto police spokesperson Connie Osborne told CBC News that the service has no plans to use Clearview AI again. "The Toronto Police Services Board is currently developing a policy for the use of artificial intelligence technology and machine learning following public consultation," Osborne said.
