Managing AI in Meetings
Last updated March 2026
Artificial intelligence is now a common presence in virtual and hybrid meetings. AI meeting assistants, or “bots”, like Otter.ai and Fireflies.ai can join meetings automatically, record discussions, capture images of participants and transcribe conversations, often before anyone even notices.
This use of AI is not inherently negative, and AI tools offer real value to lawyers. They can transcribe conversations, highlight action items and let participants focus on the discussion instead of note-taking. But these benefits come with privacy, security and ethical considerations that need to be managed carefully.
Awareness: How AI Enters the Room and Why It Matters
Awareness of bots is critical.
Many lawyers do not realize that AI meeting assistants can join meetings automatically. Sometimes a bot joins a meeting without a human participant, and even a brief appearance can be enough to capture sensitive information like client names, discussion topics and meeting notes. Bots may then store this data on servers located outside of Canada. Legislation in other countries may give a foreign government the ability to examine this data, giving rise to security and privacy concerns.
Bots raise concerns about unauthorized access, recording and misuse of data. For lawyers, this could mean inadvertent breaches of confidentiality, exposure of sensitive information, and even violations of privacy law and the Code of Conduct.
Communication: Setting Expectations Early
Clear communication is essential for using AI responsibly.
If you don’t want external AI bots in your meeting, tell participants in advance rather than waiting until a bot is trying to enter the meeting. A brief note in the meeting invitation usually works. For example: “External AI assistants and transcription tools are not permitted; internal tools may be used with notice.”
Conversely, if you are planning to record or transcribe a meeting, let participants know beforehand and obtain their consent. Be prepared to explain why a recording is being made and how long the file will be retained.
Controlling Meeting Access
Platform controls can help manage AI in meetings.
You can sometimes identify AI bots from a default username such as “[Attendee’s first name]’s Notetaker (Otter.ai)”. But users can rename their bots, making them harder to spot.
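For firms that maintain their own meeting tooling, the display-name check above can be partially automated. The following is a minimal sketch, not a reliable detector: the name patterns are illustrative assumptions, and a renamed bot will evade them entirely.

```python
import re

# Illustrative patterns only -- real bot display names vary,
# and users can rename their bots to avoid detection.
BOT_NAME_PATTERNS = [
    re.compile(r"notetaker", re.IGNORECASE),
    re.compile(r"otter\.ai", re.IGNORECASE),
    re.compile(r"fireflies", re.IGNORECASE),
    re.compile(r"\bAI\s+assistant\b", re.IGNORECASE),
]

def looks_like_bot(display_name: str) -> bool:
    """Flag a display name that matches a known AI-assistant naming pattern."""
    return any(p.search(display_name) for p in BOT_NAME_PATTERNS)
```

A check like this can only flag the obvious cases, which is why the platform-level controls below matter more than name matching.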
Some extra steps you can take to control virtual meeting access include:
- disabling third-party apps and bots in the meeting platform’s admin settings;
- enabling waiting rooms or lobbies to admit only verified participants;
- locking meetings once all attendees have joined;
- restricting who can admit attendees to prevent participants from admitting their own bots; and
- avoiding using the “admit all” function in your platform, which can bypass the steps taken to control access.
If you need to record a meeting, use the tools built into your meeting platform, which are generally more secure and allow you to control the data. You should also avoid storing meeting data on external servers.
Context: Assess AI Use by Risk Level
The risk of using AI depends on the situation; it is not an all-or-nothing decision. The question is not whether to use AI, but when to use it and how to do so responsibly.
Consider developing firm policies based on the nature of the meeting or the participants involved. For example:
- You may feel more comfortable using AI bots to record internal meetings as opposed to external meetings with unfamiliar participants.
- You may want to use AI in meetings with tech-savvy clients but avoid it with more traditional or cautious clients.
- You may want to record less sensitive meetings, such as those of your firm’s social committee, but prohibit recording for confidential or strategic discussions, like litigation strategy or expert reports. Your decision to use AI may ultimately turn on what is being discussed.
Align your AI policy with your firm’s values and ensure all staff know the rules for meetings.
Host Responsibilities and Meeting Hygiene
Meeting hosts are responsible for enforcing your firm’s policies and technological controls. This includes managing platform settings, verifying participants’ identities and staying alert during the meeting.
For example, before admitting participants, you should check display names against the attendee list and remove any uninvited guests. If an unexpected participant joins your meeting, stop the discussion until you understand the situation and can resolve the issue.
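For hosts who want a mechanical first pass at the display-name check described above, a simple comparison against the invite list can help. This is a sketch under stated assumptions: the names are hypothetical, each platform exposes attendee lists differently, and an exact-name match will not catch a bot renamed to look like an invitee.

```python
def uninvited(admitted: set[str], invited: set[str]) -> set[str]:
    """Return display names present in the meeting that are not on the invite list.

    Comparison is case-insensitive but exact-match only, so a bot renamed
    to impersonate an invitee will still slip through -- treat this as a
    first-pass screen, not a guarantee.
    """
    invited_norm = {name.casefold() for name in invited}
    return {name for name in admitted if name.casefold() not in invited_norm}
```

Anything the check flags should prompt the human step described above: pause the discussion and verify the participant before continuing.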
If you have questions, an IT service provider can help you with settings, troubleshooting and recommendations for different types of meetings.
Prioritizing People Over Technology
Thoughtful use of AI in meetings is about the people, not just the technology. With clear communication, well-defined boundaries and transparent practices, lawyers can ensure that convenience does not overshadow consent, productivity does not erode privacy, and innovation does not outpace good ethical judgment.
Microsoft Copilot was used to edit and enhance the content of this document, Feb. 2026.