The American Medical Association (AMA) rolled out a comprehensive framework to protect physicians from unauthorized artificial intelligence-generated deepfakes.
The guide, created by the organization’s Center for Digital Health and AI, aims to modernize physician identity protections while closing legal gaps. The AMA uses the term “augmented intelligence” when referring to AI to emphasize its assistive role in medicine.
The framework is based on seven policy principles: physician identity as a protected right; prohibition of deceptive medical impersonation; informed, opt-in and revocable consent; mandatory transparency and labeling; shared responsibility for preventing impersonation; enforcement and practical remedies; and minimizing administrative burden.
AMA CEO John Whyte, M.D., said in a statement that deepfakes impersonating physicians are not merely scams but “a public health and safety crisis.”
“When bad actors exploit a doctor’s identity, they undermine patient trust and can steer people toward harmful, unproven care,” Whyte said. “We need strong action by federal and state lawmakers to protect physicians’ identities, ensure transparency, and stop this fraud. Safeguarding professional integrity is essential to preserving trust and delivering high-quality care in a rapidly evolving digital landscape.”
The new framework comes as the organization ramps up calls to improve the safety of AI tools in medicine as solutions become more widespread in the industry.
In late April, the AMA called on federal lawmakers to strengthen safeguards against chatbots amid their increasing use for mental health.
While the organization recognized legislators’ efforts toward “advancing conversations about AI’s role in society and mental health,” it said the rise of mental health chatbots, including reports of chatbots encouraging self-harm and breaching user privacy, “highlights the urgent need for clear guardrails.”
The AMA’s recommended safeguards include strict data protection standards, transparency requirements and penalties for deceptive practices.