Like a lot of clinicians, I worried about how the rise of AI might impact my field.
I’m not opposed to innovation, but in healthcare, new tools have consequences when they’re adopted too fast or without enough clinical oversight.
As a licensed physical therapist, I was wary of what adopting AI at scale could mean for patient care. In healthcare, new tools are introduced one after another, often framed as increasing efficiency or cutting costs. My concern: would human decision-making be sidelined?
Those concerns are especially relevant in musculoskeletal (MSK) care, where outcomes depend on nuance and trust. At the same time, I understand that healthcare is under enormous pressure. Payment is increasingly tied to outcomes and total costs of care. It’s important to get the right care to the right person earlier, and AI has a lot of potential to do that. So the question I’ve been wrestling with isn’t whether AI belongs in healthcare, it’s where AI fits in improving the patient and provider experience.
Why MSK care can’t be automated away
MSK conditions are a major driver of pain, disability, and healthcare spending in the U.S. Nearly 1 in 2 U.S. adults lives with an MSK condition at any given time. I chose this field because I wanted to help people navigate pain and stay active in their lives, not because I wanted to move them through a system faster.
What I’ve seen over time is that patients are often funneled into care after the problem has already escalated, or pushed toward low-value interventions that don’t address the root of the issue. By the time they reach the right kind of support, their quality of life has often been impacted in ways that could have been avoided.
It’s a complex field. Two people can present with the same diagnosis and still need a very different next step. There’s a lot of “it depends.” Readiness, fear, finances, family responsibilities, and past experiences with healthcare are all highly relevant contexts for care. Getting to that point takes rapport, and building rapport takes clinical judgment, experience, and time.
Patients need to feel heard and supported, especially when their pain persists or their path forward isn’t obvious. Technology can help with responsiveness and continuity, but trust is still earned through human connection. For high-stakes or complex moments, tools should support the people doing the listening and adapting, with clear clinical accountability for the decisions that shape a person’s body and life.
Where AI fits
AI is already being used across healthcare, including MSK care. We’re seeing virtual PT platforms using motion tracking, AI scribes helping with documentation, and decision-support tools supporting care teams behind the scenes.
Now I routinely use a number of new AI-powered tools that help clinicians like me intervene in a patient's care more thoughtfully. I see how AI can make a real difference, especially when it works alongside my clinical expertise, providing real-time support during calls and identifying coaching opportunities or patient patterns across conversations.
Support from these tools is important because clinicians are stretched thin. Administrative work and documentation pull attention away from patients. Holding everything in your head while staying fully present with someone in pain is cognitively demanding — and when we can't operate at the top of our license, the resulting burnout, turnover, and less meaningful patient interactions take a toll on everyone.
What responsible AI requires
Healthy skepticism doesn’t disappear just because a tool seems promising. If AI is going to be used in MSK care — or any area of healthcare — it needs clear, non-negotiable rules around it.
From my perspective, responsible AI requires human oversight during initial design and deployment, and ongoing oversight to ensure it keeps working the way it should. That includes clear escalation paths, regular audits, population-based bias testing, and privacy protections patients can trust.
Looking ahead
There's potential for AI to legitimately improve what we do. I imagine it providing real-time suggestions to clinical navigators during patient interactions, reminding us of things we already know but may be overlooking, or predicting the optimal way to communicate with each patient given their individual learning style and readiness.
A lot of this is already underway. Over the next year, in the industry at large, we’ll see more tools move toward keeping patients engaged in care that fits their lives rather than pushing them through rigid pathways of traditional MSK care. The potential inspires me. If AI can help clinicians intervene earlier, communicate better, and make the process easier without affecting trust, we can change what MSK care feels like before pain becomes chronic, expensive, and defining.
I’m on board so long as AI is built and deployed responsibly.
Steven Griffin, PT, DPT is the Senior Manager of Clinical Navigation at TailorCare, where he leads a team of specialized physical therapists dedicated to guiding patients through complex musculoskeletal care. He focuses on designing the clinical workflows and care models that empower his team to deliver precise, evidence-based recommendations. By bridging the gap between clinical expertise and smart system design, Steven ensures that every patient journey is structured for the best possible outcome. He now applies his clinical insight to scaling high-quality navigation services. He holds a Doctor of Physical Therapy (DPT) from Emory University and a Bachelor's degree in Kinesiology from the University of Alabama.
This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers. Click here to find out how.
