2 CE Credit Hour Presentation on Using AI in Documentation and Clinical Notes

Using Artificial Intelligence (AI) as a Mental Health Clinician: Managing risk, ethics, and clinical benefits

Join Dr. Maelisa McCaffrey (Hall) for an interactive demonstration of the ways mental health clinicians can ethically use AI for clinical work.

Plus: Dr. McCaffrey (Hall)’s decision tree and AI note handout to help you navigate the AI landscape

2 legal-ethical CE credit hours

On-demand Self Study

You’ve heard about mental health clinicians using artificial intelligence (AI) to simplify their work, but is it ethical?


What even is AI and how can you know what is ethical when this industry is so rapidly evolving?

This workshop will answer those questions and more. Join guest speaker Dr. Maelisa McCaffrey for an interactive demonstration of the ways mental health clinicians can ethically use AI for clinical work.

You will learn:

  • How AI is currently being used in various mental health settings, and which ethical considerations are most important for clinicians
  • Exactly which aspects of HIPAA are most relevant to common AI usage
  • How to navigate the ever-changing landscape of AI tools for therapists

Maelisa will demonstrate multiple ways to use AI in clinical practice and review what clients commonly think about AI in mental health. She will show progress notes generated by multiple AI platforms and analyze how well these progress notes meet requirements for ethical documentation (and even for medical necessity). 

Bring your questions and be ready to take notes because this will be an interactive webinar focused on giving you practical ways to use AI, and ways to determine when AI is best left alone (for now). 

Real feedback from the live event:

“This was the best AI training I have attended thus far.”

“Great presentation! AI was a topic I was previously intimidated by how/if to use as a therapist – I’m walking away from this presentation feeling more confident in what ethical options are available.”

“Well organized and professional. New information that I can use right away.”

“The course and the presenter were excellent. I had no exposure at all before this.”

“Thanks for tackling this emerging topic with such clarity and depth!”

“The course was cutting edge. As usual, PCT leads the way.”

Who is this event for?

This course is designed for solo practitioners, group practice leaders, and group practice clinical staff members. It is also suitable for practices that consist of 100% in-person, 100% telehealth, or a mixture of in-person and telehealth treatment.

✓ In-person Practices

✓ Hybrid Practices

✓ Teletherapy Only Practices

Plus a Documentation Pack to Improve Your Notes:

  • Decision Tree: Is this AI service ethical and useful?
  • Handout: Examples and analysis of AI generated progress notes

“I’ve been watching several of your CE programs and, while I’ve always been impressed with your services, I just have to say, your programs are excellently done with production and content and simultaneously warm and accessible. I really appreciate what you do!”

Tara Ingram

Understand the AI experience

Define artificial intelligence (AI) and how it is commonly experienced in everyday life

Ethical Use Cases

Identify three ethical uses of AI in a mental health clinical setting


Tips for Tricky Situations

Describe what components of HIPAA apply to using AI in clinical practice

Strategies for your practice

Use a decision tree to determine whether or not an AI platform is both ethical and useful for your specific clinical work

Training is Step 2 of the PCT Way.

Learn more about the PCT Way here.

Course Details

2 CE Credit Hours, Self Study

Title: Using Artificial Intelligence (AI) as a Mental Health Clinician: Managing risk, ethics, and clinical benefits

Authors/Presenters: Maelisa McCaffrey, PsyD
CE Length: 2 CE credit hours, legal-ethical
Legal-Ethical CE Hours: 2 legal-ethical CE hours

Educational Objectives:

  • Define artificial intelligence (AI) and how it is commonly experienced in everyday life
  • Identify three ethical uses of AI in a mental health clinical setting
  • Describe what components of HIPAA apply to using AI in clinical practice
  • Use a decision tree to determine whether or not an AI platform is both ethical and useful for your specific clinical work

Syllabus: 

  • Introductions
  • Audience poll and discussion
  • Artificial intelligence (AI) in everyday life
    • Definition of AI
    • Common nonclinical uses of AI
  • Ethics, HIPAA, and AI
    • How HIPAA applies to common uses of AI
    • Ethical guidelines for using AI, even if your ethics code has not specifically addressed AI
    • Cautions and concerns about AI tools for therapists
  • AI in mental health practice
    • Common uses of AI in clinical practice
    • Potential benefits of AI in clinical practice
    • Client perceptions of using AI in mental health services
  • Review of various AI tools for clinical documentation
    • Informed consent and client questions when using AI services
    • How AI services write progress notes for clinicians
    • Examples and analysis of AI generated progress notes
  • Closing comments – Mitigating risk and determining the usefulness of AI in your clinical work
  • Final Q&A
  • Note: Yes! Medical necessity will be addressed in this presentation.

    Meet Our Presenters

    Presented by

    Maelisa McCaffrey (Hall), PsyD

    Rob Reinhardt, LPC-S NCC

    Dr. Maelisa McCaffrey (Hall) is a licensed psychologist, nail design enthusiast, and multi-passionate entrepreneur. With her business QA Prep, she empowers therapists through trainings and consultation on clinical documentation. Maelisa focuses on the “why” behind the usual recommendations and encourages clinicians to think outside the box, while also keeping their ethics intact. A true ENFP, Maelisa aims to make sure all of her endeavors are meeting a need in the community while also allowing for plenty of laughter and fun.

    Resources & Citations

    • AMHCA. (2020). AMHCA code of ethics. https://higherlogicdownload.s3-external-1.amazonaws.com/AMHCA/2 AMHCA Code of Ethics-2020-2.pdf?AWSAccessKeyId=AKIAVRDO7IEREB57R7MT&Expires=1693846114&Signature=bl6rjlWRj1qkEQG8zjkDdXQAlF4= 
    • Abrams, Z. (2023). AI is changing every aspect of psychology. APA Monitor on Psychology, 54(5). https://www.apa.org/monitor/2023/07/psychology-embracing-ai
    • Allen, S. (2023). Improving Psychotherapy with AI: From the Couch to the Keyboard. IEEE Pulse: https://www.embs.org/pulse/articles/improving-psychotherapy-with-ai-from-the-couch-to-the-keyboard/ 
    • American Counseling Association. (2014). ACA code of ethics. https://www.counseling.org/resources/aca-code-of-ethics.pdf 
    • American Psychological Association (2017). Ethical Principles of Psychologists and Code of Conduct. Washington DC: American Psychological Association. 
    • Forester-Miller, H., & Davis, T. E. (2016). Practitioner’s guide to ethical decision making (Rev. ed.). Retrieved from http://www.counseling.org/docs/default-source/ethics/practioner’s-guide-toethical-decision-making.pdf
    • Maheu, M. (2023). Psychotherapy notes in the age of AI: Considering ChatGPT healthcare ethical implications. telehealth.org: https://telehealth.org/chatgpt-healthcare/
    • McAdoo, T. (2023). How to Cite ChatGPT. APA Style: https://apastyle.apa.org/blog/how-to-cite-chatgpt 
    • McCaffrey, M. (2023). Modern progress notes: Considerations for teletherapy, insurance audits, and artificial intelligence (AI). Webinar: https://dev-personcenteredtech.com/modern-progress-notes-teletherapy-insurance-audits-artificial-intelligence-ai/ 
    • American Psychological Association. (2007). Record keeping guidelines. American Psychologist, 62(9), 993–1004.
    • U.S. Dept. of Health and Human Services. (2006). HIPAA Administrative Simplification. Washington D.C.: Author.
    • U.S. Dept. of Health and Human Services. (2013). HIPAA Omnibus Final Rule. Washington D.C.: Author.
    • Weisman, H. (2023). Making therapy documentation easier: A comparison of AI note automation tools for private practice therapists. LinkedIn: https://www.linkedin.com/pulse/making-therapy-documentation-easier-comparison-ai-hannah-weisman-phd/ 
    • Zuckerman, E.L. and Kolmes, K. (2017) The Paper Office for the Digital Age. New York, NY: The Guilford Press.

    Accuracy, Utility, and Risks Statement: The contents of this program are based primarily on the presenters’ extensive combined experience handling legal, ethical, and usable paperwork needs for mental health organizations. Statements about applicability are according to presenters’ understanding of the state of the art and legal precedents at the time of presentation. This program discusses strategies for complying with applicable ethics codes and laws, for improving clinical documentation, and the informed consent process. It may not include information on all applicable state laws. Misapplication of the materials, or errors in the materials, could result in non-compliance with applicable laws or ethics codes.

    Conflicts of Interest: none

    Commercial Support: none.

