AI Scribes and Privacy Risks: Balancing Convenience, Patient Rights, and Legal Obligations

Introduction
Artificial intelligence (“AI”) powered tools are becoming increasingly common in clinical and administrative environments, introducing new privacy risks and governance challenges that are particularly salient in the health sector. One such tool is the “AI scribe”: a tool that transcribes and summarizes a healthcare provider’s verbal interaction with a patient. AI scribes, which have seen rapid adoption in healthcare, raise a variety of privacy considerations, including secure data storage, de-identification and retention practices, and obtaining valid patient consent.
Various provincial information and privacy commissioners in Canada have recently issued guidance to help personal health information custodians implement robust practices for safeguarding patient data against inadvertent breaches when implementing AI scribes in the clinical environment.
The importance of such guidance was underscored by a recent incident in Ontario, discussed below, where an AI transcription tool accessed a virtual medical meeting without authorization, exposing sensitive patient personal health information.
Overview of Recent Breach
Background
On December 17, 2024, an Ontario hospital notified the Privacy Commissioner of Ontario that a privacy breach had occurred when a former hospital physician’s AI scribe joined and recorded a virtual hepatology rounds meeting.[1] The AI tool transcribes spoken words into text and can integrate with various virtual meeting platforms by accessing invitations stored in users’ digital calendars; this is how the AI scribe joined the meeting even though the physician was no longer employed at the hospital.
The physician had stopped working at the hospital in 2023 and later installed the AI scribe on their personal device in September 2024. The AI scribe was not approved for use at the hospital. The AI scribe was able to access the hospital rounds meeting because, while working at the hospital, the physician had used their personal email address instead of their work email address (contrary to hospital policy), and the meeting invite sent to the personal email address was never rescinded by the hospital following the physician’s departure. The AI scribe automatically accessed the meeting link in the physician’s calendar, joined the hepatology rounds meeting, and recorded the meeting discussion without notice to the participants. The breach was discovered only when the AI scribe automatically emailed a meeting summary and transcript to the meeting participants.
The personal health information of seven patients was discussed during the meeting and captured in the recorded transcript. The information included patient names, sex, physician names, diagnoses, medical notes, and treatment information.[2]
Response
To address the breach, the hospital cancelled the digital invite to prevent further unauthorized access, instructed meeting attendees and email recipients to delete the transcript, directed staff to remove the AI tool and similar tools from devices connected to hospital accounts, required the former hospital physician to delete all hospital-related material from the physician’s personal systems, and directed the physician to request that the AI tool delete the recorded information.[3]
The hospital notified the individuals affected by the breach and implemented further preventative measures, including new firewalls, updated privacy training, and revised AI-related policies.[4]
Recommendations
The Privacy Commissioner of Ontario issued several recommendations, which included:
- requiring the hospital to request deletion of all patient information from the meeting retained by the AI tool;
- updating the hospital’s privacy breach protocol to ensure immediate contact with third-party organizations when unauthorized collection occurs;
- clarifying in the Acceptable Use Policy that agents may only use hospital-approved devices for hospital work;
- auditing the employee offboarding process;
- enforcing the use of a lobby for all virtual meetings involving personal health information;
- ensuring that the hospital’s AI procurement and implementation framework aligns with IPC guidance; and
- updating policies, procedures, and training materials to include administrative monetary penalties for privacy breaches.[5]
Legal Implications
Sections 12(1) and 17(3) of the Personal Health Information Protection Act (“PHIPA”) (Ontario) require health information custodians to protect personal health information against unauthorized use or disclosure and to ensure records are not copied without authorization.[6] Notably, the Ontario Privacy Commissioner has the power to impose administrative penalties for contraventions of these provisions of PHIPA.[7]
Applicable Privacy Requirements and Recent Guidance
Under their respective health privacy laws, Alberta and Quebec both require that privacy impact assessments (“PIAs”) be prepared and sent to the Privacy Commissioner for review and comment before implementing any proposed new practice or system relating to the collection, use and disclosure of personal health information.[8] Introducing the use of AI scribes during hospital rounds or medical appointments would likely qualify as a new practice or new system within the meaning of these Acts.
Some provincial regulators are already taking steps to limit unauthorized AI scribe use in healthcare environments. The Privacy Commissioner of Alberta has specifically advised that before using an AI scribe, custodians must submit a privacy impact assessment under the Health Information Act (the “HIA”).[9] The Privacy Commissioner of Alberta also noted that the HIA likely does not permit vendors to use patient information provided by custodians to train AI tools.[10]
While not statutorily mandated in Ontario, the Privacy Commissioner of Ontario also recently recommended conducting a privacy impact assessment before introducing an AI system that involves the collection, use or disclosure of personal health information.[11]
The Privacy Commissioner for British Columbia also recently recommended that healthcare providers carefully review claims that an AI scribe de‑identifies data. In the absence of a standard definition of “de‑identified,” such data may still constitute personal information under the Personal Information Protection Act (“PIPA”) if it can be combined with other information to identify an individual.[12]
The Privacy Commissioners for both British Columbia and Ontario recently published checklists which provide healthcare organizations with considerations when reviewing an AI scribe’s privacy policy and services agreement to ensure compliance with applicable personal health information legislation.[13]
The Canadian Medical Protective Association (“CMPA”) has also issued guidance on AI scribes. It recommends obtaining patient consent before recording clinical encounters, explaining the purpose of the recording or transcription, and outlining the privacy and accuracy risks. Because AI tools can misinterpret information, hallucinate, or introduce bias, clinicians must review AI‑generated transcriptions to ensure patient records remain accurate and complete.[14]
Best Practices for Minimizing Risk
In light of the foregoing, the following practices can help manage and reduce the risks associated with using AI scribes in healthcare environments:
- Implement and use PIAs to screen new tools, evaluate the technical and privacy limitations of novel technologies, and better understand the risks and potential magnitude of a data breach.
- Vet new software before use and ensure that personnel download, and share personal information with, only approved tools that meet applicable privacy compliance requirements.
- Ensure appropriate human oversight and controls are in place to minimize the likelihood of hallucinations (and the consequences of relying on a hallucinated output when deciding on a course of treatment).
- Audit tools regularly, especially after software updates or changes to privacy policies, to understand how functionalities evolve over time.
- Configure tools to limit access and minimize retention periods.
- Use a meeting lobby for virtual meetings, with a designated person to screen for unauthorized chatbots or scribes, and provide clear notice to meeting participants when transcription tools are being used.
- Create and enforce distinctions between using personal and professional accounts. Ensure that professional accounts are being properly offboarded, and individuals’ access is removed from shared files, meeting invitations, and mailing lists upon departure or termination of employment.
- Ensure that policies and privacy training remain up to date and relevant as tools and technology continue to evolve. The risks evolve as both the underlying technology and reliance on the output of transcription tools develop.
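To illustrate the offboarding point above, the kind of audit the Ontario regulator recommended can be approximated in code. The sketch below is purely hypothetical (all names, email addresses, and data structures are illustrative, not any real calendar or HR system's API): it flags recurring meeting invites that still include attendees who are no longer active staff, which is the gap that allowed the breach described in this article.

```python
# Hypothetical sketch only: a real audit would query the organization's
# calendar and HR/identity systems rather than these in-memory structures.

def stale_invitees(meetings, active_staff):
    """Return (meeting title, attendee) pairs where the attendee
    is not in the set of currently active staff addresses."""
    flagged = []
    for meeting in meetings:
        for attendee in meeting["attendees"]:
            if attendee not in active_staff:
                flagged.append((meeting["title"], attendee))
    return flagged

# Illustrative data: one recurring meeting with a lingering personal address.
meetings = [
    {"title": "Hepatology rounds",
     "attendees": ["a@hospital.example", "b@personal.example"]},
]
active_staff = {"a@hospital.example"}

print(stale_invitees(meetings, active_staff))
# [('Hepatology rounds', 'b@personal.example')]
```

Running such a check on a schedule, and whenever an employee departs, would surface invites that should be rescinded before an unattended tool can use them.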
To learn more about how our team can help you navigate the health privacy and data landscape, please contact Dana Siddle or Marissa Caldwell.
[1] Reported Breach HR24-00691, Information and Privacy Commissioner of Ontario, October 27, 2025, at pg 1.
[2] Ibid, at pg 2.
[3] Ibid, at pg 2.
[4] Ibid, at pg 3.
[5] Ibid, at pgs 4-5.
[6] Personal Health Information Protection Act, 2004, SO 2004, c 3, Sch A, ss 12 (1) and 17(3).
[7] Personal Health Information Protection Act, 2004, SO 2004, c 3, Sch A, s 61(1)(h.1). This amount can be up to $50,000 for individuals and up to $500,000 for organizations; see General, O Reg 329/04.
[8] Health Information Act, RSA 2000, c H-5, s 64(2); Act respecting health and social services information, CQLR c R-22.1, s 84(7)(e).
[9] Sebastian Paauwe & Christine Wagoner, “AI Scribes in your practice: ensuring patient privacy”, (September 11, 2025) CPSA, online: https://cpsa.ca/news/ai-scribes-in-your-practice-ensuring-patient-privacy/
[10] “Artificial Intelligence (AI) Scribe Privacy Impact Assessment Guidance”, (September 2025) Office of the Information and Privacy Commissioner of Alberta, online at page 5: https://oipc.ab.ca/wp-content/uploads/2025/09/AI-Scribe-PIA-Guidance-Sept-2025.pdf
[11] “AI Scribes: Key Considerations for the Health Sector”, (January 28, 2026) Information and Privacy Commissioner of Ontario, online: https://www.ipc.on.ca/en/resources/ai-scribes-key-considerations-health-sector
[12] “PIPA and AI scribes: best practices for healthcare organizations in BC”, (January 2026) Office of the Information & Privacy Commissioner for British Columbia, online: https://www.oipc.bc.ca/documents/guidance-documents/3082
[13] “PIPA and AI scribes: best practices for healthcare organizations in BC”, (January 2026) Office of the Information & Privacy Commissioner for British Columbia, online: https://www.oipc.bc.ca/documents/guidance-documents/3082; “AI Scribes: Checklist of Key Considerations for the Health Sector”, (January 2026) Information and Privacy Commissioner of Ontario, online: https://www.ipc.on.ca/en/resources/ai-scribes-checklist-key-considerations-health-sector
[14] “AI Scribes: Answers to frequently asked questions”, (December 2025) Canadian Medical Protective Association, online: https://www.cmpa-acpm.ca/en/advice-publications/browse-articles/2023/ai-scribes-answers-to-frequently-asked-questions