The Legal Risks of AI in Medicine
By: Anna K. Marvin – Attorney
Artificial Intelligence Changing Health Care
Like most other industries, artificial intelligence (“AI”) has made its way into health care. For example, FotoFinder®, a Germany-based company, has developed AI software, Moleanalyzer pro, to assist dermatologists in distinguishing between melanocytic lesions (e.g., moles, melanoma) and non-melanocytic lesions (e.g., basal cell carcinoma, seborrheic keratosis). Moleanalyzer pro, The Intelligent AI Assistant. Other AI models have been developed to detect cancer cells, predict a tumor’s “molecular profile based on cellular features,” and pinpoint features of a tumor’s microenvironment that are “related to a patient’s response to standard treatments.” A New Artificial Intelligence Tool for Cancer, published Sept. 4, 2024.
Additionally, with the help of AI, robots are becoming increasingly autonomous in the performance of surgery, with one system, hierarchical surgical robot transformer (SRT-H), “adapting to individual anatomical features in real-time, making decisions on the fly, and self-correcting when things don’t go as expected.” Robot Performs First Realistic Surgery Without Human Help, published July 9, 2025.
AI medical technologies seem to be multiplying daily, and many are still in their infancy. However, some assistive technologies are already in clinical use today. For example, Franciscan Health locations in some northern Indiana cities are now using the Ceribell System, an AI-powered EEG device, for seizure detection and continuous monitoring. See, e.g., AI-Powered Technology at Local Franciscan Health Hospitals Enhances Neurological Care in Northwest Indiana and Chicago’s South Suburbs, published June 26, 2025.
This expansion raises an interesting question for medical facilities and providers: will there come a time when a hospital or provider commits malpractice by NOT using AI-based medical technologies?
Standard of Care
Indiana’s former standard of care for medical malpractice cases was the modified locality rule, which provided that the standard of care was the
“degree of care, skill, and proficiency which is commonly exercised by ordinarily careful, skillful, and prudent [physicians], at the time of the operation in similar localities.”
Vergara v. Doan, 593 N.E.2d 185, 186 (Ind. 1992) (quoting Burke v. Capello, 520 N.E.2d 439, 441 (Ind. 1988)) (alteration and emphasis in original).
However, as technology, communication systems, medical education, and transportation systems advanced, Indiana abandoned the modified locality rule, adopting a modified standard of care:
“a physician must exercise that degree of care, skill, and proficiency exercised by reasonably careful, skillful, and prudent practitioners in the same class to which he belongs, acting under the same or similar circumstances.”
Vergara, 593 N.E.2d at 187.[1] The Vergara court further provided that “other relevant considerations” when determining whether a doctor acted reasonably would include “advances in the profession.” Id. (emphasis added).
Indiana Legislation Regarding AI Use in Medicine
Unless and until Indiana’s General Assembly enacts legislation governing AI’s use in medicine, our courts will be tasked with addressing how this ever-changing and multiplying technology might (or might not) fit into various existing legal frameworks, including torts such as medical malpractice and products liability. Given AI’s relative newness, it will likely be some time before practicing medicine without adopting new AI technologies could be deemed malpractice. Additionally, under Indiana’s existing medical malpractice standard, several different factors are analyzed in determining whether a doctor acted reasonably in a specific set of circumstances, and appropriate use of technology is likely to be only one of those factors. Eventually, however, use of new AI technologies could very well become part of the applicable standard of care.
Liability when Medical AI Goes Wrong
That leads to the next interesting question: when use of AI technology becomes part of the standard of care for medical professionals, who is liable if things go wrong? For example, if a medical provider uses AI technology to assist in patient diagnosis or treatment, and those AI analyses or recommendations cause the provider to misdiagnose the patient, fail to diagnose the patient, or implement the wrong form of treatment, can that provider, and/or its hospital, be liable for malpractice? And to what extent could the developer or distributor of the AI technology be liable?
One thing is clear: just like in seemingly every facet of modern society, medical professionals would be wise to stay abreast of all technological developments in their particular fields, especially in the rapidly changing world of AI.
[1] While the modified locality rule would no longer apply going forward, the court also provided that the “[l]ocality of practice remains a proper subject for evidence and argument because it may be relevant to the circumstances in which the doctor acted.” Vergara, 593 N.E.2d at 188.

Anna K. Marvin – Attorney at Law
Anna Marvin is an associate attorney who focuses her practice in health care law and business litigation.
An Indiana native, Anna grew up in the scenic Lake Shafer area. She completed her undergraduate studies in psychology in three years prior to graduating cum laude from the Indiana University Maurer School of Law.
During her summers in law school, Anna gained valuable experience by serving as a Judicial Intern for Judge Jason Thompson in the White County Circuit Court. She further honed her legal skills working as a law clerk at MacGill PC, where she contributed to complex cases and developed a strong foundation in litigation.
Anna’s professional journey reflects her commitment to delivering impactful legal solutions in her areas of practice.
© Riley Bennett Egloff LLP
Disclaimer: Article is made available for educational purposes only and is not intended as legal advice. If you have questions about any matters in this article, please contact the author directly.
Permissions: You are permitted to reproduce this material in any format, provided that you do not alter the content in any way and do not charge a fee beyond the cost of reproduction. Please include the following statement on any distributed copy: “By Anna K. Marvin © Riley Bennett Egloff LLP — Indianapolis, Indiana. https://rbelaw.com”
Posted: October 21, 2025, by Anna K. Marvin