AM Best Warns of Emerging AI Threat to Insurance Data Integrity

red skull in blue binary code on a monitor in a corporate office

October 23, 2025


The October 2025 edition of AM Best's Best's Review features an article by Lori Chordas titled "AM Best: AI Model Poisoning Emerges as New Threat to Insurance Data Integrity." The article explores how model poisoning, an emerging form of artificial intelligence (AI) manipulation, poses new risks to data reliability and insurance operations, according to AM Best Director Edin Imsirovic.

As insurers expand the use of AI for underwriting, claims, and fraud detection, model poisoning has surfaced as a potential vulnerability that extends beyond traditional cybersecurity concerns. Attackers can subtly alter training data, leading to flawed model outputs that may go undetected, per the article.
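The article itself contains no code, but as a rough illustration of the mechanism it describes, the following Python sketch (hypothetical, not from AM Best) shows one simple form of training-data poisoning: silently flipping a small share of labels before a model is trained. The dataset, model, and 10% flip rate are arbitrary choices made purely for illustration.

```python
# Illustrative sketch only (not from the article): a small amount of label
# tampering in training data can degrade a model while the model keeps
# training and predicting without any visible error.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline model trained on clean labels.
clean_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Poisoned copy: flip the labels on 10% of the training rows.
y_poisoned = y_train.copy()
flip = rng.choice(len(y_poisoned), size=int(0.10 * len(y_poisoned)), replace=False)
y_poisoned[flip] = 1 - y_poisoned[flip]
poisoned_model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)

# Both models run and return predictions normally; only a comparison of
# held-out accuracy hints that something upstream was tampered with.
print("clean accuracy:   ", clean_model.score(X_test, y_test))
print("poisoned accuracy:", poisoned_model.score(X_test, y_test))
```

The point of the sketch mirrors the article's warning: the poisoned model does not crash or raise alerts, so traditional monitoring focused on system availability would not catch the manipulation.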

According to the author, insurers' reliance on third-party data—from telematics and credit bureaus to internet of things (IoT) devices—introduces multiple entry points for corrupted or manipulated data. Traditional cybersecurity tools may not flag these threats because models can continue operating while quietly producing errors, adding operational and reputational risks for insurers. 

The article notes that while there are no confirmed public cases of model poisoning in insurance, the issue is gaining attention in cybersecurity and academic research. Controlled studies in other sectors show that even limited data manipulation can distort AI model behavior, which may justify its inclusion in insurers' emerging risk frameworks, according to AM Best. 

Mr. Imsirovic identified potential threat actors, including organized criminal groups seeking financial gain, insiders with privileged system access, and ideologically motivated individuals aiming to disrupt industry practices. Each presents unique detection and mitigation challenges, per AM Best. 

Copyright © 2025 by AM Best Rating Services, Inc. and/or its affiliates. ALL RIGHTS RESERVED. 
