The application of big data, radiomics, machine learning, and artificial intelligence (AI) algorithms in radiology requires access to large data sets containing personal health information. Because machine learning projects often require collaboration between different sites or data transfer to a third party, precautions are required to safeguard patient privacy and to prevent inadvertent access to, or transfer of, identifiable information. The Canadian Association of Radiologists (CAR) is the national voice of radiology, committed to promoting the highest standards in patient-centered imaging, lifelong learning, and research. The CAR has created an AI Ethical and Legal Standing Committee with a mandate to guide the medical imaging community on best practices in data management, access to health care data, de-identification, and accountability. Part 1 of this article informs CAR members on the principles of de-identification, pseudonymization, encryption, direct and indirect identifiers, k-anonymization, risks of reidentification, implementations, data set release models, and validation of AI algorithms, with a view to developing appropriate standards that safeguard patient information effectively.
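As a rough illustration of one of the concepts listed above (not taken from the white paper itself), the sketch below shows how k-anonymity can be measured over a small table of quasi-identifiers (indirect identifiers). The column names, values, and the use of pandas are assumptions made purely for this example: a data set is k-anonymous when every combination of quasi-identifier values is shared by at least k records.

```python
# Illustrative sketch only (not from the white paper): measuring k-anonymity
# over a toy table of quasi-identifiers (indirect identifiers) using pandas.
import pandas as pd

# Hypothetical quasi-identifiers: age band, sex, and 3-character postal prefix.
records = pd.DataFrame({
    "age_band":      ["40-49", "40-49", "40-49", "50-59", "50-59"],
    "sex":           ["F", "F", "F", "M", "M"],
    "postal_prefix": ["T2N", "T2N", "T2N", "K1A", "K1A"],
})

def k_anonymity(df: pd.DataFrame, quasi_identifiers: list) -> int:
    """Return k: the size of the smallest group of records sharing the same
    combination of quasi-identifier values."""
    return int(df.groupby(quasi_identifiers).size().min())

k = k_anonymity(records, ["age_band", "sex", "postal_prefix"])
print(f"The table is {k}-anonymous")  # here: 2-anonymous (smallest group has 2 rows)
```

A larger k generally corresponds to a lower risk of reidentification from indirect identifiers, which is why generalization steps such as age banding or truncating postal codes are commonly applied before data release.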
Parker, W., Jaremko, J. L., Cicero, M., Azar, M., El-Emam, K., Gray, B. G., … Bromwich, R. (2021). Canadian Association of Radiologists White Paper on De-Identification of Medical Imaging: Part 1, General Principles. Canadian Association of Radiologists Journal. https://doi.org/10.1177/0846537120967349