In 2020, the Office of the Privacy Commissioner of Canada (OPCC) led a joint federal-provincial investigation into privacy violations stemming from the use of facial recognition technologies. The investigation was prompted specifically by the use of Clearview AI’s facial recognition software by law enforcement, including regional police services as well as the Royal Canadian Mounted Police. Clearview AI’s technology is based on scraping social media images, which, as the investigation found, violates provincial and federal private-sector privacy legislation. In response to the investigation, Clearview AI claimed that user consent was not required for scraping social media images because the information is already public. This common fallacy about social media privacy serves as a pivot point for integrating digital policy literacy into the OPCC’s digital literacy materials, in order to consider the regulatory environment around digital media alongside their political-economic and infrastructural components. Digital policy literacy is a model that expands what is typically an individual- or organization-level responsibility for privacy protection by considering the wider socio-technical context in which a company like Clearview can emerge.
Shepherd, T. (2024). The Canadian Clearview AI Investigation as a Call for Digital Policy Literacy. Surveillance and Society, 22(2), 179–191. https://doi.org/10.24908/ss.v22i2.16300