RADV Audit Prep: How to Pass a Risk Adjustment Data Validation Audit
By Daniel Plasencia — Certified Risk Coder (CRC), Certified Professional Coder (CPC)

A Risk Adjustment Data Validation (RADV) audit is the mechanism CMS uses to verify that the diagnoses a Medicare Advantage plan submitted for risk-adjusted payment are actually supported by the underlying medical records. RADV is the single largest source of risk adjustment recoupment exposure in the industry — the 2023 RADV Final Rule, published in the Federal Register, allows CMS to extrapolate audit findings across an entire contract, which means a small sample of unsupported HCCs can translate into a multi-million-dollar refund.
This guide walks through what RADV auditors actually look for, the documentation gaps that fail the most charts, and a practical pre-audit checklist your coding team can use whether you are facing a contract-level audit, a targeted CMS audit, or an internal mock RADV.
How RADV Actually Works
CMS contracts with independent medical review contractors to pull a sample of beneficiaries from the plan's risk-adjusted population for a specific payment year. For each sampled beneficiary, the plan must produce one medical record per HCC that was submitted for that member during the audit year; the HCC categories in scope are pulled directly from the CMS risk-adjustment model software and mapping files. The auditor then re-reviews each chart and decides whether the documentation supports the HCC.
The plan does not get to pick the chart. The plan must submit the "best" record — typically defined as a face-to-face encounter from an acceptable provider type that contains the strongest possible documentation for the diagnosis being validated. If the auditor cannot find documentation that supports the HCC in the submitted chart, the HCC is failed and the corresponding RAF dollars are recouped.
Under the 2023 Final Rule, CMS can extrapolate the audit error rate from the sample to the entire contract starting with payment year 2018 audits. Extrapolation means the dollars at risk are not just the sample's HCCs — they are a percentage of the contract's total risk-adjusted revenue for that year. This aggressive posture tracks the concerns MedPAC raised in its February 2026 CY2027 Advance Notice comment letter about the scale of MA overpayments tied to weakly supported diagnoses.
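The extrapolation arithmetic is worth making concrete. The figures below are entirely hypothetical (CMS's actual methodology uses stratified sampling and a formal payment error calculation), but they show why a modest sample error rate becomes a contract-level liability:

```python
# Hypothetical illustration of why extrapolation dwarfs sample-level recoupment.
# None of these figures are real CMS numbers.

sampled_hccs = 600                # HCCs reviewed across the audit sample
failed_hccs = 90                  # HCCs the auditor could not validate
avg_raf_dollars_per_hcc = 2_400   # assumed average payment per failed HCC

# Without extrapolation, exposure is limited to the sample itself.
sample_recoupment = failed_hccs * avg_raf_dollars_per_hcc

# With extrapolation, the sample error rate is applied to the
# contract's total risk-adjusted revenue for the payment year.
error_rate = failed_hccs / sampled_hccs          # 15% in this sketch
contract_risk_revenue = 400_000_000              # assumed contract-level revenue
extrapolated_recoupment = error_rate * contract_risk_revenue

print(f"Sample-only recoupment:  ${sample_recoupment:,.0f}")
print(f"Extrapolated recoupment: ${extrapolated_recoupment:,.0f}")
```

Under these assumed numbers, the same 90 failed HCCs move the exposure from roughly $216,000 to $60 million once the error rate is applied contract-wide.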
What Auditors Actually Check
RADV auditors are not looking for clever coding. They are looking for four things:
1. Acceptable provider type. The encounter must be with a physician or qualifying non-physician practitioner. Outpatient hospital and physician office encounters are the cleanest. Inpatient discharge summaries, ED notes, and home health face-to-face notes are also acceptable when properly signed and dated. Lab reports, radiology reports without an interpretation by a treating provider, and pharmacy records by themselves are not acceptable RADV documentation.
2. Face-to-face encounter. The diagnosis must be tied to a face-to-face visit during the data collection period. A diagnosis that appears only in a problem list or a copy-forwarded note from a prior year does not count.
3. Provider signature and credentials. The note must be signed by the rendering provider, with credentials, and dated within the data collection period. Unsigned notes, electronically queued notes, and notes with mismatched signatures are common audit failures. CMS specifically allows authentication via electronic signature, but the signature must be associated with the specific note, not a separate cover sheet.
4. Documentation that supports the HCC. The diagnosis must be documented with enough specificity to map to the submitted ICD-10-CM code, verified against the official CMS ICD-10-CM code set, and the provider must show that the condition was actually being managed at that visit. This is where MEAT criteria (Monitor, Evaluate, Assess, Treat) come in: the documentation must show at least one of those elements for the diagnosis being validated. AAPC's practical MEAT documentation primer is the industry reference coders typically train against.
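The four criteria above can be sketched as a pre-submission screening function a coding team might run over chart metadata. Everything here (the field names, the acceptable-encounter list, the dict shape) is an illustrative assumption, not CMS review logic:

```python
# Sketch of a pre-submission screen applying the four RADV criteria above.
# Field names and the acceptable-type list are illustrative, not a CMS spec.

ACCEPTABLE_ENCOUNTER_TYPES = {
    "physician_office", "outpatient_hospital",
    "inpatient_discharge", "ed", "home_health_f2f",
}
MEAT_ELEMENTS = {"monitor", "evaluate", "assess", "treat"}

def screen_chart(chart: dict) -> list:
    """Return a list of failure reasons; an empty list means the chart screens clean."""
    failures = []
    if chart.get("encounter_type") not in ACCEPTABLE_ENCOUNTER_TYPES:
        failures.append("unacceptable provider/encounter type")
    if not chart.get("face_to_face"):
        failures.append("no face-to-face encounter")
    if not (chart.get("signed") and chart.get("credentials") and chart.get("dated_in_period")):
        failures.append("signature/credential/date problem")
    if not MEAT_ELEMENTS & set(chart.get("meat_elements", [])):
        failures.append("no MEAT element for the validated diagnosis")
    return failures

chart = {
    "encounter_type": "physician_office",
    "face_to_face": True,
    "signed": True, "credentials": "MD", "dated_in_period": True,
    "meat_elements": ["assess", "treat"],
}
print(screen_chart(chart))  # → []
```

A screen like this catches the mechanical failures (provider type, face-to-face, signature) before a human coder spends time on the clinical question in item 4.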
The Documentation Gaps That Fail Most Charts
After reviewing thousands of failed RADV charts, four documentation gaps account for the vast majority of failures:
Gap 1 — Status conditions reported without active management. A common failure pattern: the plan submitted HCC 18 (Diabetes with chronic complications, V24 model) based on a code like E11.22 (Type 2 diabetes mellitus with diabetic chronic kidney disease). The chart shows the diagnosis in the problem list, but the encounter note never addresses diabetes or kidney disease. There is no medication, no labs ordered, no plan, no mention. This fails RADV. The provider has to actually evaluate or manage the condition during the encounter for the diagnosis to count.
Gap 2 — Specificity that the chart does not support. The plan submitted I50.22 (Chronic systolic heart failure), which maps to HCC 226 in V28 with a meaningful weight. The chart says "CHF" — nothing about systolic vs. diastolic, nothing about acute vs. chronic. The auditor will downgrade this to I50.9 (Heart failure, unspecified), which does not map to an HCC in V28. The HCC is failed.
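Mechanically, the Gap 2 failure is just a lookup against the code-to-HCC map. A minimal sketch with a hand-built one-entry table (real mappings come from the CMS model software and mapping files mentioned earlier, not this dict):

```python
# Illustrative fragment of a V28 code-to-HCC map. The real table ships with
# the CMS risk-adjustment model software; this one-entry dict is a stand-in.
V28_HCC_MAP = {
    "I50.22": 226,   # Chronic systolic heart failure -> HCC 226
    # "I50.9" (Heart failure, unspecified) is intentionally absent:
    # it does not map to a payment HCC in V28.
}

def hcc_for(code):
    """Return the V28 HCC for an ICD-10-CM code, or None if it does not map."""
    return V28_HCC_MAP.get(code)

submitted, supported_by_chart = "I50.22", "I50.9"
print(hcc_for(submitted))            # → 226  (what the plan was paid for)
print(hcc_for(supported_by_chart))   # → None (what the chart actually supports)
```

The auditor's downgrade turns a paying code into a non-mapping one, and the gap between the two is the recoupment.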
Gap 3 — Cancer history coded as active. The plan submitted C50.911 (Malignant neoplasm of unspecified site of right female breast), which maps to HCC 22 (Breast, prostate, and other cancers and tumors) in V28. The chart shows the patient is on maintenance tamoxifen to reduce recurrence risk and is followed by oncology, but active treatment is complete, the cancer is in remission, and the documentation describes it as "history of breast cancer." The correct code is Z85.3 (Personal history of malignant neoplasm of breast), which does not map to an HCC. The auditor fails the chart.
Gap 4 — The signature problem. The chart is technically supportive, but the rendering provider's signature is missing, illegible, or applied via a stamp. CMS requires a legible identifier on the signature — handwritten with credentials, electronic with provider name, or a printed name and credential block adjacent to the signature line. Charts that pass clinical review still fail RADV when the signature does not meet CMS standards.
A Practical Pre-Audit Checklist
If you have advance notice that your contract is being audited, or if you want to run an internal mock RADV, work through this checklist for every chart in your sample:
1. Confirm the encounter is from an acceptable provider type — physician office, outpatient hospital, inpatient discharge, ED, or a properly signed home health face-to-face note.
2. Confirm the encounter is face-to-face and falls within the data collection period, not a problem-list entry or copy-forwarded note from a prior year.
3. Verify the rendering provider's signature, credentials, and date meet CMS authentication standards, with the electronic signature tied to the specific note.
4. Verify the documentation supports the submitted ICD-10-CM code at full specificity and shows at least one MEAT element for the diagnosis being validated.
5. Confirm the diagnosis appears in the encounter data or chart-review supplemental file actually submitted to CMS.
6. If the chart fails any of the above, search for a better record for the same member and HCC — the plan must submit the strongest available documentation.
How to Build a Sustainable RADV Defense
The plans that consistently survive RADV audits do three things differently:
They retrieve charts proactively, not reactively. Waiting until CMS announces an audit to start chart retrieval is too late. The best plans pull a 5–10% sample of HCCs every quarter, route them through internal coders, and remediate the failures before they become audit findings. This is sometimes called a continuous mock RADV program.
They train providers on documentation, not just on coding. A coder cannot fix a chart that the provider did not document. The plans with the lowest RADV failure rates invest in provider documentation improvement (CDI) for the diagnoses that drive the most RAF — diabetes with complications, CHF, COPD, CKD, vascular disease, and major depression. These six condition families typically account for more than half of a Medicare Advantage plan's RAF and an even higher share of RADV exposure.
They reconcile encounter data and submitted diagnoses every month. A diagnosis that was submitted to CMS but does not appear in any encounter file is a guaranteed RADV failure. Monthly reconciliation between the plan's encounter data, the chart-review supplemental file, and the RAPS/EDS submission catches these gaps before the audit period closes.
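The reconciliation step reduces to a per-member set comparison: every diagnosis submitted for payment must be backed by at least one encounter or chart-review record. A sketch using assumed (member_id, ICD-10 code) tuples, not a real EDS/RAPS file layout:

```python
# Sketch of a monthly submitted-vs-encounter reconciliation.
# The (member_id, icd10) tuple shape is an assumption, not a real EDS/RAPS layout.

submitted = {
    ("M001", "E11.22"),
    ("M001", "I50.22"),
    ("M002", "C50.911"),
    ("M003", "J44.9"),   # submitted for payment, but see below
}
encounter_data = {
    ("M001", "E11.22"),
    ("M002", "C50.911"),
}
chart_review_supplemental = {
    ("M001", "I50.22"),
}

# A submitted diagnosis with no encounter or supplemental support is a
# guaranteed RADV failure, so flag it before the audit period closes.
supported = encounter_data | chart_review_supplemental
orphaned = submitted - supported

print(sorted(orphaned))  # → [('M003', 'J44.9')]
```

Run monthly, a comparison like this surfaces orphaned diagnoses while there is still time to retrieve a supporting chart or delete the submission.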
What Happens When You Fail
If a chart fails RADV and the failure stands through the appeals process, the plan refunds the RAF dollars associated with that HCC for the data collection year. Under the 2023 Final Rule, CMS can extrapolate findings across the contract starting with payment year 2018 audits, meaning the dollars at risk are a percentage of the contract's total risk-adjusted revenue, not just the sample's dollars. The OIG work plan on V24 vs. V28 trends indicates that extrapolated RADV recoupments and their cross-model comparisons remain an active federal oversight priority.
Plans have appeal rights at multiple stages: medical record review reconsideration, error rate appeal, and payment error calculation appeal. Each appeal stage requires submitting additional documentation and is decided on the existing record — you cannot create new documentation to support a previously failed chart.
The clearest path to survival is to stop the failures before they happen. Continuous mock audits, provider documentation training, and rigorous month-to-month reconciliation of submitted diagnoses against the medical record give plans the best chance of passing CMS RADV with minimal recoupment.

