Transcript
Episode 502: Artificial Intelligence, HIPAA, and Non-Discrimination
Evan Dumas
You’re listening to Group Practice Tech, a podcast by Person Centered Tech, where we help mental health group practice owners ethically and effectively leverage tech to improve their practices. I’m your co-host, Evan Dumas.
Liath Dalton
And I’m Liath Dalton, and we are Person Centered Tech.
Liath Dalton
This episode is brought to you by Therapy Notes. Therapy Notes is a robust online practice management and electronic health record system to support you in growing your thriving practice. Therapy Notes is a complete practice management system with all the functionality you need to manage client records, meet with clients remotely, create rich documentation, schedule appointments and bill insurance all right at your fingertips. To get two free months of Therapy Notes as a new Therapy Notes user go to therapynotes.com and use promo code PCT.
Evan Dumas
Hello and welcome to Episode 502: Artificial Intelligence, HIPAA, and Non-Discrimination.
Liath Dalton
This is going to be a topic that we talk a lot about this year, and probably for many years to come. Artificial intelligence in particular, but in the context of what HIPAA has to say, in terms of the regulatory framework that informs the appropriate utilization of AI, from the perspective of having the necessary safeguards in place that serve client and patient outcomes and best interests. That’s the overarching framework and purpose of HIPAA.
Liath Dalton
And so of course, it has a lot to say about AI. And because AI has emerged so rapidly and proliferated so quickly, there are areas where HIPAA is catching up in terms of putting out really specific standards and regulations, but we are now starting to see that happening.
Liath Dalton
So one of the first examples of this regulatory clarification and guidance that we have is related to the Section 1557 final rule on non-discrimination. The final rule itself took effect July 5, 2024, but the rule’s affirmative requirement to make, quote, “Reasonable efforts to identify and mitigate risks of discrimination in use of patient support tools in AI and emerging technologies” takes effect May 1, 2025.
Liath Dalton
So we have some time in which to prepare for this going into effect, and that’s what we’re going to talk about today. What is the rule? What are its implications, and how do we comply with it? But first, let’s take a step back and appreciate how the Office for Civil Rights and HHS disseminated this information and made it accessible.
Evan Dumas
Yeah, so back on January 10, the Department of Health and Human Services sent out a Dear Colleague letter. Now, I hadn’t heard of these before, but you can give it a read, and it’s wonderfully easy to understand. It’s written very, very nicely, and it walks you through what things mean and why they’re doing this. They give examples of what discrimination would look like, especially with the use of AI tools and emerging technologies, and how to identify and mitigate those risks.
Evan Dumas
And it’s this wonderfully laid out document, about six pages long, with a lot of links, signed by Melanie Fontes Rainer, the Director of the Office for Civil Rights. And I just found it such a pleasing dissemination of information, rather than, you know, the scary articles they send out about people doing wrong, or the more sort of “here are the new regulations, read them” approach, which makes most people’s eyes glaze over.
Evan Dumas
So this was a wonderful Dear Colleague letter. We’re going to link to it in the show notes, and I recommend giving it a quick scan and read, and seeing, like, oh, they can speak our language. They can help me understand what I need to comply with, and how, and why.
Liath Dalton
Exactly. And I mean, it does feel like this is an example or illustration of them being successful at their purpose and mandate, right? HIPAA, as a regulatory framework, is meant to protect clients’ and patients’ rights, safeguard their information, and support good care outcomes in the process. And it is intended to be a framework that is flexible in application, because while the standards don’t vary based on entity size, what meeting them in practice looks like is going to vary. And that’s something that is explicitly acknowledged here, but the Dear Colleague letter really emphasizes the practical application, as well as the formal compliance components.
Liath Dalton
Because there are some specific recommendations about, you know, your formal compliance processes related to compliance with this final rule. And those involve, no surprise here, some risk assessment, risk mitigation planning, formal written policies and procedures, and training workforce on what they need to know about those policies and procedures to uphold them.
Liath Dalton
So Evan and I both have kind of warm, fuzzy hearts right now looking at this, because it’s just a good example of how HIPAA can actually be a really supportive tool for managing these really crucial aspects of a practice and of client care, and of how it is evolving and adapting along with the current practice landscape that we find ourselves in.
Evan Dumas
Yeah.
Liath Dalton
So essentially, the final rule addresses that discrimination is prohibited when it is done on the basis of race, gender, sexual orientation, religious identity, etc., and there are ways that discrimination on the basis of those categories very much can show up in AI utilization, right?
Evan Dumas
Mhm.
Liath Dalton
And the primary reason for that is that AI’s output is always going to be determined by its input: by the data set that it is given, and by the algorithms, right? And so when data sets have specific information that is demarcated along those lines or categories, the output is going to reflect that, and there are instances in which that may be discriminatory, and therefore not serve client slash patient best interests.
Liath Dalton
So that potential is something we need to be aware of, and we need to make sure that it is being addressed in any AI we utilize that is involved in patient care support tools. That’s the overarching term the OCR is using: patient care support tools.
Liath Dalton
So there are some specifics. And of course, this is something that we will be explicitly addressing in the AI utilization policy and procedure template that we are soon to be releasing. There will be an iteration for group practices and then an iteration for solo practices, and that’ll be part of our comprehensive HIPAA compliance manual and material sets. But in the interim, these are good things to be considering and evaluating when looking at AI tools, right, and evaluating their appropriateness for a practice.
Liath Dalton
So what they’re saying is that any time a tool is used to inform diagnosis, or assessment, or care provision, or treatment planning, it falls under the purview of this prohibition on discrimination. And they’re not saying that the requirement is that you get the exact data sets from all of your vendors. Thankfully, in the full final rule commentary and preamble, there’s some great clarification around what they expect the actual onus and burden on covered entities to be, right?
Liath Dalton
And, of course, because there’s an awareness that you may not know exactly what the whole data set is for a tool that you’re using, these are the sorts of questions we want to ask. So essentially, if an AI tool that you utilize has any component related to client care or informing decisions around client care, and its output differs based on any of these categories in the particular input, that means it needs to be evaluated as to whether or not it is creating this prohibited discrimination.
Evan Dumas
Yeah, no.
Liath Dalton
So you know, the examples that are given in the wonderful Dear Colleague letter are primarily focused on the really clear-cut medical use applications, things like eGFR and pulse oximetry in the context of race, right?
Evan Dumas
Yeah.
Liath Dalton
But if we want to extrapolate and make something analogous to a mental health practice context, knowing that currently one of the predominant clinical use applications has been support around treatment planning: evaluate whether the, of course, HIPAA-friendly tool that you are using to support treatment planning suggests treatment plan components that vary based on any of those specified categories.
Evan Dumas
Yeah.
Liath Dalton
Right?
Evan Dumas
Yeah.
Liath Dalton
And then the evaluation is, okay, is that variation discriminatory or not, right?
Evan Dumas
Yeah.
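To make that evaluation concrete, here is a minimal sketch of the kind of spot check being described: hold a clinical vignette constant, vary only one protected attribute, and compare the tool’s suggested treatment plan components. The `suggest_treatment_plan` function, the vignette fields, and the attribute values are all hypothetical placeholders, not any specific vendor’s API.

```python
# A minimal counterfactual spot check. `suggest_treatment_plan` is a
# hypothetical stand-in for whatever tool your practice actually uses.
from itertools import combinations

def suggest_treatment_plan(vignette: dict) -> frozenset:
    # Placeholder: replace with a call to your actual tool or API.
    # Returns the set of suggested treatment-plan components.
    return frozenset({"weekly CBT sessions", "PHQ-9 at intake"})

base_vignette = {
    "presenting_problem": "moderate depressive symptoms, ~6 months",
    "diagnosis": "F33.1",
    "age": 34,
}

# Vary one protected category at a time, holding everything else fixed.
protected_variants = {
    "race": ["white", "Black", "Asian"],
    "gender": ["male", "female", "nonbinary"],
}

for attribute, values in protected_variants.items():
    plans = {v: suggest_treatment_plan({**base_vignette, attribute: v})
             for v in values}
    for a, b in combinations(values, 2):
        if plans[a] != plans[b]:
            # Differing output is not automatically discriminatory; it
            # flags variation that needs human evaluation.
            print(f"Plan varies by {attribute}: {a!r} vs {b!r} -> "
                  f"{sorted(plans[a] ^ plans[b])}")
```

A differing output is not automatically discriminatory; it simply flags the variation that then needs the human evaluation just discussed.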
Liath Dalton
So just start thinking about those pieces, identifying where AI potentially has any role as a patient care support tool within your practice. Like a mini asset inventory, an AI asset inventory. And then stay tuned for specific policy support that outlines all the details around how to evaluate tools, how to utilize them appropriately, and how to train your team on that as well.
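As one concrete way to picture that mini AI asset inventory, here is a sketch of a per-tool record a practice might keep. Every field name and example value is an illustrative assumption, not an OCR-mandated format.

```python
# One possible shape for a mini "AI asset inventory" record; fields
# are illustrative suggestions, not a format required by the rule.
from dataclasses import dataclass, field

@dataclass
class AIAssetRecord:
    tool_name: str
    vendor: str
    baa_in_place: bool                 # business associate agreement signed?
    patient_care_role: str             # e.g. "treatment planning support"
    informs_clinical_decisions: bool   # if True, evaluate under Section 1557
    known_inputs: list = field(default_factory=list)
    risk_notes: str = ""

inventory = [
    AIAssetRecord(
        tool_name="ExampleNoteAssist",      # hypothetical tool name
        vendor="Example Vendor, Inc.",      # hypothetical vendor
        baa_in_place=True,
        patient_care_role="treatment planning support",
        informs_clinical_decisions=True,
        known_inputs=["session notes", "diagnosis", "demographics"],
        risk_notes="Demographics are an input; schedule counterfactual check.",
    ),
]

# Anything that informs clinical decisions gets the evaluation above.
for record in inventory:
    if record.informs_clinical_decisions:
        print(f"Evaluate: {record.tool_name} ({record.patient_care_role})")
```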
Liath Dalton
And one thing I’ll leave folks with is something that was highlighted in the letter and the accompanying AI strategic plan from HHS, which is: they’re emphasizing that what they govern and regulate are not the tools themselves, it’s the use of the tools.
Evan Dumas
Exactly.
Liath Dalton
And why that makes Evan and myself start nodding and smiling is because something that we’ve been saying for so long is, you know, that compliance is a process,
Evan Dumas
Yeah.
Liath Dalton
not a product.
Evan Dumas
Yep.
Liath Dalton
And it is all about the decisions that are made around how something is utilized that make it compliant or not. And so I just appreciate seeing that very clearly reflected in this. And that’s what our role then becomes, in terms of playing a support role to your practices: helping with that specificity around correct usage, right?
Evan Dumas
Exactly, exactly.
Liath Dalton
And just having those good guard rails in place.
Liath Dalton
So stay tuned, because there are going to be many more developments around AI. I have no doubt that we are going to see increasing specificity from the OCR, in part because their strategic plan related to cybersecurity and critical infrastructure, around the use of artificial intelligence, really relates to supporting standardization and alignment on best practices, particularly in cybersecurity governance.
Liath Dalton
And so putting those guardrails in place, standardizing data security policies, all of that is a high priority. And so we will start seeing examples like this, and then be better equipped to navigate the utilization of AI in appropriate ways.
Liath Dalton
So it’s an exciting development, in our view.
Evan Dumas
Yeah.
Liath Dalton
We hope you found this helpful. Thank you for joining us, and we’ll talk to you next time.
Evan Dumas
Yeah, see you next time everybody.
Liath Dalton
This has been Group Practice Tech. You can find us at personcenteredtech.com. For more podcast episodes, you can go to personcenteredtech.com/podcast, or click podcast on the menu bar.


Your Hosts:
PCT’s Director Liath Dalton
Senior Consultant Evan Dumas
Welcome solo and group practice owners! We are Liath Dalton and Evan Dumas, your co-hosts of Group Practice Tech.
In our latest episode, we discuss the implications of the Section 1557 final rule on nondiscrimination for mental health practices.
We cover:
- The Section 1557 final rule, what it means, and when it takes effect
- How the Office for Civil Rights (OCR) disseminated this information
- Ways in which AI can be discriminatory
- What to consider when utilizing AI with patient care support tools as a mental health provider

Therapy Notes proudly sponsors Group Practice Tech!
TherapyNotes is a behavioral health EMR/EHR that helps you securely manage records, book appointments, write notes, bill, and more. We recommend it for use by mental health professionals. Learn more about TherapyNotes and use code “PCT” to get two months of free software.
*Please note that this offer only applies to brand-new TherapyNotes customers
Resources for Listeners
- The OCR/HHS “Dear Colleague Letter”: Re: Ensuring Nondiscrimination Through the Use of Artificial Intelligence and Other Emerging Technologies
- OCR Notice: Ensuring Nondiscrimination in the Use of AI is Good Medicine
- HHS: HHS Artificial Intelligence Strategic Plan
- HHS: Artificial Intelligence Strategic Plan — Slide Summary
- Federal Register: Section 1557 Final Rule
PCT Resources:
- Relevant on-demand, legal-ethical CE training: The Evolving Legal-Ethical Standard of Care for the Clinical Use of Artificial Intelligence in Mental Health
- Gain insights into the benefits and challenges of incorporating AI technologies into your practice, understand the clinical implications, and learn how to navigate legal and ethical guidelines while maintaining compliance with HIPAA regulations.
- PCT’s Comprehensive HIPAA Security Compliance Program (discounted) bundles:
- For Group Practices
- For Solo Practitioners
- PCT’s HIPAA Risk Analysis & Risk Mitigation Planning service for mental health group practices — care for your practice using our supportive, shame-free risk analysis and mitigation planning service. You’ll have your Risk Analysis done within 2 hours, performed by a PCT consultant, using a tool built specifically for mental health group practice, and a mitigation checklist to help you reduce your risks.
- Group Practice Care Premium
- weekly (live & recorded) direct support & consultation service, Group Practice Office Hours — including monthly session with therapist attorney Eric Ström, JD PhD LMHC
- + assignable staff HIPAA Security Awareness: Bring Your Own Device training + access to Device Security Center with step-by-step device-specific tutorials & registration forms for securing and documenting all personally owned & practice-provided devices (for *all* team members at no per-person cost)
- + assignable staff HIPAA Security Awareness: Remote Workspaces training for all team members + access to Remote Workspace Center with step-by-step tutorials & registration forms for securing and documenting Remote Workspaces (for *all* team members at no per-person cost) + more
Group Practices
Get more information about how PCT can help you reach HIPAA compliance while optimizing and streamlining your practice.
Solo Practitioners
Get more information about how PCT can help you reach HIPAA compliance while optimizing and streamlining your practice.