Lisa Mathis, FNP-BC
As the season ramps up with flu, strep, and pneumonia, patient visits increase, and advanced practice providers' schedules become more demanding as work-in lists continue to grow. Providers are expected to be more productive, deliver high-quality patient care, and complete documentation before leaving for the day. As a result, we are constantly looking for ways to work faster and smarter—not harder.
This raises an important question: is there a new solution? Is it ChatGPT, artificial intelligence in the office, or other emerging tools designed to assist with charting and diagnosing patients? As a provider, one can enter symptoms and—almost instantly—receive an answer. Technology can even record and document an entire visit. Is this the new wave of the future?
We must be cautious in this thought process.
Bias and Inequity: AI systems are trained on existing data, which may underrepresent certain populations. This can lead to misdiagnosis or unequal care for minorities, women, children, or older adults.
Errors and Inaccuracy: AI can produce confident but incorrect recommendations. Clinical decisions based solely on AI outputs may result in patient harm if not verified.
Lack of Transparency (Black Box Problem): Many AI models do not clearly explain how they reach conclusions. This makes it difficult for clinicians to trust, validate, or defend decisions legally and ethically.
Overreliance by Clinicians: Providers may become too dependent on AI tools, potentially weakening clinical judgment and critical thinking. Automation bias can cause clinicians to ignore contradictory clinical signs.
Data Privacy and Security Risks: AI systems require large volumes of patient data. There is increased risk of data breaches, misuse, or unauthorized access, raising HIPAA and confidentiality concerns.
Legal and Liability Issues: Responsibility is unclear when AI contributes to an error, and regulatory frameworks are still evolving.
Ethical Concerns: Informed consent is compromised when patients are unaware that AI is involved in their care, and reduced human interaction can erode patient trust and empathy. (1)
AI can enhance efficiency and decision-making in medicine, but it must support—not replace—clinical judgment. Safe use requires transparency, human oversight, strong data governance, and ongoing evaluation. (2)
However, AI is not the only “shortcut” providers are beginning to take. There is an emerging trend toward less detailed documentation, often justified by the belief that “you aren’t getting paid to do that.” This mindset particularly affects physical exam documentation, with greater emphasis placed on the Impression and Plan portion of the visit for billing purposes.
While this approach may be appealing to newer clinicians, it can be difficult for more experienced providers to adopt—and may not represent a best practice. This shift is often driven by coding and billing guidance; however, the provider delivering the care must remember an essential principle: if it is not documented, it is considered not done.
For example, how can a provider demonstrate that a leg wound was examined if there is no description of its appearance? Even a simple statement such as, “The patient was out of bed and seated in a chair at the time of my examination,” can be critically important in a legal setting. Documentation serves not only billing purposes, but also patient safety, continuity of care, and legal protection.
Ultimately, it is the provider—not the coder or the software—who is held accountable for the medical record. This reality must remain at the forefront as documentation practices continue to evolve.
Medical trends come and go. Some become lasting standards of practice, while others prove ineffective and fade away. Regardless of evolving documentation methods, providers must always prioritize what is best for the patient and ensure their work is thoroughly and appropriately documented. In this new era of medicine, strive to be efficient—not lazy.
“Dance like no one is watching,
Chart like it may be read aloud in a deposition!”
1) Toward Safe and Ethical Implementation of Health Care Artificial Intelligence: Insights From an Academic Medical Center. https://www.sciencedirect.com/science/article/pii/S2949761224001226
2) Benefits and Risks of AI in Health Care: Narrative Review. https://www.i-jmr.org/2024/1/e53616