Pre-Submission Validation Testing for CMS Network Adequacy: A Practical Guide
Before you submit to HPMS, your adequacy calculations need to survive CMS scrutiny. Here's how experienced network teams validate their adequacy scores before the filing window closes.
Why Pre-Submission Validation Matters
Network adequacy submissions to HPMS are not drafts. Once the filing window closes, what you submitted becomes the basis for CMS's adequacy determination — and any errors discovered after submission require formal deficiency responses, corrective action plans, or amended filings that consume significant team resources and carry compliance risk. The cost of a failed adequacy submission is measured not only in staff time but in potential enrollment limitations that directly affect plan revenue and member access.
High-performing network ops teams treat pre-submission validation as a structured discipline — not a last-minute sanity check. They run multiple validation passes at defined milestones in the submission cycle, assign clear ownership to each pass, and document their validation results as part of the submission file. When CMS asks how you verified your adequacy calculations, you want a paper trail, not a verbal explanation.
This guide walks through the full pre-submission validation workflow that experienced network teams use, from the first internal QA pass 90 days out through the final spot-checks on filing day.
What CMS Checks in HPMS Validation
HPMS (the Health Plan Management System) runs its own automated validation checks when you upload provider data and adequacy calculations. Understanding what those checks catch — and what they miss — is essential for knowing where to focus your internal validation effort.
HPMS automated checks include: NPI format validation (ensuring all submitted NPIs are 10-digit numeric values present in NPPES), taxonomy code validation against CMS's recognized specialty list, county FIPS code verification, service area consistency checks confirming that providers submitted for a county are within your approved service area, and threshold percentage calculations that flag specialty-county combinations falling below the CMS adequacy floor.
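Several of these format-level checks can be replicated locally before upload so that HPMS rejections never surprise you. A minimal sketch in Python — the Luhn-with-"80840"-prefix check digit is part of the NPI standard, but the helper names here are ours, and a real NPPES presence check still requires a registry lookup:

```python
def npi_check_digit(base9: str) -> str:
    """Luhn check digit for a 9-digit NPI base, computed after prepending
    the '80840' issuer prefix as the NPI standard specifies."""
    digits = [int(c) for c in "80840" + base9]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 0:          # every second digit from the right is doubled
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def is_valid_npi(npi: str) -> bool:
    """Format-level check only; confirming the NPI actually exists in
    NPPES requires querying the registry."""
    return (len(npi) == 10 and npi.isdigit()
            and npi[-1] == npi_check_digit(npi[:9]))

def is_valid_county_fips(fips: str) -> bool:
    """County FIPS codes are exactly five digits (state + county)."""
    return len(fips) == 5 and fips.isdigit()
```

Running these checks over the full roster extract before upload turns HPMS format rejections into an internal pre-flight report.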
What HPMS validation does not catch is equally important. The system does not verify that submitted providers are actually contracted with your plan, actively credentialed, or accepting new patients. It does not cross-check your panel status data against provider attestations. It does not verify that your time-distance calculations used the correct member population centroid for each county. These are all internal validation responsibilities that CMS expects the plan to execute independently before submission.
The practical implication: passing HPMS automated validation is necessary but not sufficient. Plans that treat HPMS acceptance as the finish line are relying on a system that only checks data format and threshold arithmetic — not the substantive accuracy of the underlying network.
Internal QA Testing Steps: The 90-60-30 Framework
The most reliable pre-submission validation cadence is structured around three milestone passes at 90 days, 60 days, and 30 days before the submission window opens. Each pass has a defined scope and escalation protocol.
The 90-day pass focuses on roster completeness. At this stage, the goal is to confirm that your provider roster extract contains all contracted, credentialed providers and that the data fields required for HPMS submission are populated. Pull your full roster from the provider data system, match it against your executed contract records, and flag any providers in the roster who lack an active contract or have not cleared initial credentialing. This pass typically reveals a class of "ghost providers" — providers carried in the system from prior contract terms who are no longer active — that need to be removed before the adequacy calculation runs.
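The ghost-provider sweep at the 90-day pass is, mechanically, set logic: any roster entry lacking either an active contract or cleared credentialing gets flagged. A sketch with hypothetical data (real extracts carry many more fields):

```python
# Hypothetical roster extract from the provider data system
roster = [
    {"npi": "1234567893", "name": "Dr. Alvarez"},
    {"npi": "1111111111", "name": "Dr. Baker"},   # ghost: no active contract
]
active_contract_npis = {"1234567893"}   # NPIs with executed, active contracts
credentialed_npis = {"1234567893"}      # NPIs cleared initial credentialing

# Flag providers lacking either an active contract or cleared credentialing
ghost_providers = [
    p for p in roster
    if p["npi"] not in active_contract_npis
    or p["npi"] not in credentialed_npis
]
```

The flagged list becomes the remediation worklist for the 90-day pass, with each entry resolved or removed before the adequacy calculation runs.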
The 60-day pass focuses on adequacy calculation accuracy. Run your adequacy model using the cleaned roster from the 90-day pass and compare the output against CMS's expected thresholds for each county-specialty combination in your service area. Document every county-specialty combination below threshold and classify each gap as: closeable through recruitment still in progress, requiring an exception filing, or requiring escalation because the gap is material and recruitment has stalled. This pass should produce a gap remediation action plan with assigned owners and deadlines.
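The three-way gap classification described above can be encoded as a simple triage function. This is an illustrative sketch — the 10-point cutoff separating "exception" from "escalate" is our assumption, not a CMS rule, and each plan should set its own escalation threshold:

```python
def triage_gap(current_pct: float, required_pct: float,
               projected_pct: float, recruitment_active: bool) -> str:
    """Classify a county-specialty combination at the 60-day pass.
    projected_pct is the adequacy percentage expected once in-flight
    recruitment closes; the 10-point cutoff below is an assumption."""
    if current_pct >= required_pct:
        return "passing"
    if recruitment_active and projected_pct >= required_pct:
        return "closeable"      # recruitment in flight should close it
    if required_pct - current_pct <= 10.0:
        return "exception"      # small residual gap: prepare an exception filing
    return "escalate"           # material gap and recruitment has stalled
```

Running every county-specialty combination through a function like this produces the gap remediation action plan directly, with one owner assigned per non-passing classification.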
The 30-day pass focuses on final reconciliation and exception documentation. At this stage, all recruitment activity should be complete or nearly complete. Confirm that every provider added to the roster since the 60-day pass has a fully executed contract and cleared credentialing. Review all exception filings for completeness — every exception must have a rationale narrative and an outreach log. Run the adequacy calculation one final time using the final roster and confirm that all closeable gaps are closed. Document the final adequacy percentages for every county-specialty combination and compare them to the prior benefit year's submission to identify any regression.
Provider Roster Reconciliation Before Submission
Roster reconciliation is the single most error-prone step in the pre-submission process, and it is the step where most adequacy calculation failures originate. The reconciliation process has three components: contract reconciliation, credentialing reconciliation, and NPPES reconciliation.
Contract reconciliation means verifying that every provider in your adequacy model has an executed participating provider agreement that is currently active. Pull your contract management records — whether in a dedicated contracting system or in a spreadsheet — and match them against your roster. Providers whose contracts have expired, been terminated, or are awaiting counter-signature should be flagged and excluded from the adequacy model until their contract status is resolved. Pay particular attention to providers whose contract term end dates fall within the benefit year you are submitting for; a provider contracted through June of the benefit year cannot be counted for the full year.
Credentialing reconciliation means verifying that every provider in your adequacy model has a current, complete credentialing file. Coordinate with your credentialing team to obtain a file-status extract that shows credentialing expiration dates and file-completeness status for every provider in the roster. Exclude any provider whose credentialing is in expired, suspended, or pending status. This is a more dynamic process than contract reconciliation because credentialing files expire on rolling dates, and a provider who was fully credentialed at the 60-day pass may have an expired file by submission day.
NPPES reconciliation means comparing the NPIs in your roster against the National Plan and Provider Enumeration System to verify that each NPI is active, that the specialty taxonomy codes you are using match the provider's NPPES enumeration, and that the practice location addresses you have on file match the NPPES record. Address mismatches are a frequent source of time-distance calculation errors — if your model is calculating drive times from a stale address, the output is wrong even if the arithmetic is correct.
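Address comparison is where naive string equality fails: "123 Main Street, Ste 4" and "123 MAIN ST STE 4" are the same location. A crude normalization sketch — real matching usually warrants USPS-standardized addresses, and the abbreviation table here is deliberately minimal:

```python
import re

# Minimal abbreviation table; production matching needs USPS standardization
_ABBREVIATIONS = [("STREET", "ST"), ("AVENUE", "AVE"),
                  ("SUITE", "STE"), ("BOULEVARD", "BLVD")]

def normalize_addr(addr: str) -> str:
    """Uppercase, strip punctuation, abbreviate, collapse whitespace."""
    a = re.sub(r"[.,#]", " ", addr.upper())
    for full, abbr in _ABBREVIATIONS:
        a = re.sub(rf"\b{full}\b", abbr, a)
    return re.sub(r"\s+", " ", a).strip()

def address_mismatch(roster_addr: str, nppes_addr: str) -> bool:
    """True when the roster and NPPES addresses disagree after normalization."""
    return normalize_addr(roster_addr) != normalize_addr(nppes_addr)
```

Mismatches surfaced this way feed the address-verification queue, since a stale address corrupts the time-distance output downstream.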
Common Data Errors That Fail Validation
Certain data errors appear repeatedly across network adequacy submissions and are predictable enough that your validation workflow should check for them explicitly.
Stale provider addresses are the most common error. Provider groups relocate, multi-site practices open and close satellite offices, and hospital systems reorganize service delivery locations. If your provider database is not reconciled against NPPES on at least a quarterly basis, you will accumulate address errors that produce incorrect time-distance outputs. A specialist whose practice address is 25 miles from where your model thinks it is can flip a marginal county from passing to failing.
Incorrect specialty taxonomy codes cause providers to count toward the wrong specialty category or not count at all. CMS's HPMS adequacy tool uses specific taxonomy code mappings for each of its 22 evaluated specialty categories. If a psychiatrist is enumerated in NPPES under a taxonomy code that maps to a different category than the one CMS uses for psychiatry in its adequacy model, the provider will be excluded from your behavioral health adequacy count. Review the CMS specialty-to-taxonomy mapping table annually and reconcile it against your provider data.
Duplicate NPI entries — where the same provider appears multiple times in your roster under different practice locations — can inflate your adequacy counts if your model treats each entry as a separate provider. For adequacy calculation purposes, a provider counts once per county in which they have a practice location. If a cardiologist has offices in three counties, they count toward adequacy in each of those counties separately, but should not be counted multiple times within a single county.
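The once-per-county counting rule is easy to get wrong in a raw row count and easy to get right with deduplication on the (NPI, county) pair. A sketch:

```python
# Each row is (npi, county_fips); duplicates within a county must collapse
rows = [
    ("1234567893", "17031"),
    ("1234567893", "17031"),   # second location, same county: collapses
    ("1234567893", "17043"),   # different county: counts separately
]
counts_by_county = {}
for npi, fips in set(rows):    # set() drops within-county duplicates
    counts_by_county[fips] = counts_by_county.get(fips, 0) + 1
```

The cardiologist-with-three-offices case from the paragraph above falls out correctly: one count per county, never two counts in the same county.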
Accepting-new-patients status errors cause providers to be counted toward adequacy when they are effectively unavailable. CMS's adequacy standards require that counted providers be available to accept new patients — a provider with a formally closed panel cannot be counted. If your provider attestation data is more than 180 days old, your panel-status information is unreliable. Run a panel-status attestation sweep at least 90 days before submission to update this field.
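The 180-day reliability rule translates directly into a staleness check over the attestation extract. A sketch (field names assumed):

```python
from datetime import date, timedelta

def panel_status_stale(attested_on: date, as_of: date,
                       max_age_days: int = 180) -> bool:
    """Flag panel-status attestations older than the 180-day window."""
    return (as_of - attested_on) > timedelta(days=max_age_days)
```

Providers flagged by this check go into the attestation sweep rather than being excluded outright — a stale attestation means the data is unreliable, not that the panel is closed.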
Spot-Checking Time-Distance Calculations
Time-distance calculations are the mechanical core of adequacy determinations, and they are worth spot-checking manually on a sample basis even if you are running them through a validated calculation tool. The goal of spot-checking is not to recalculate every county-specialty combination — that would defeat the purpose of using a calculation tool — but to verify that the tool's outputs are reasonable for a representative sample of counties.
Select 10 to 15 counties for spot-checking, weighted toward marginal counties where your adequacy percentage is close to the threshold. For each selected county, manually identify the member population centroid, identify the nearest contracted provider in each specialty category, and use a mapping tool to calculate drive-time distance. Compare your manual result against the calculation tool's output. Discrepancies of more than 5 minutes or 5 miles warrant investigation into whether the tool is using the correct provider location, the correct centroid, or the correct road network data.
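The 5-minute / 5-mile tolerance test is worth encoding so that every spot-check is scored the same way across reviewers. A sketch:

```python
def spot_check_flag(tool_minutes: float, manual_minutes: float,
                    tool_miles: float, manual_miles: float,
                    tol_minutes: float = 5.0, tol_miles: float = 5.0) -> bool:
    """True when the calculation tool's output diverges from the manual
    check by more than the tolerance on either axis, warranting a look at
    the provider location, centroid, or road network data the tool used."""
    return (abs(tool_minutes - manual_minutes) > tol_minutes
            or abs(tool_miles - manual_miles) > tol_miles)
```

A discrepancy on either axis alone is enough to flag: a drive-time match with a large mileage gap usually points at a wrong provider location rather than a road-network issue.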
Pay particular attention to counties that were recently reclassified from one RUCA code category to another. When a county moves from rural to suburban — or suburban to urban — the applicable time-distance threshold changes, and if your calculation tool's county-classification table hasn't been updated, it may be applying the wrong standard. CMS updates RUCA classifications periodically based on updated census data, and these updates can change the applicable adequacy standard in ways that flip counties from passing to failing without any change in your actual network.
How to Document Your Validation Process
CMS auditors expect plans to have documented their adequacy validation process as part of their internal controls framework. This documentation serves two purposes: it demonstrates that the plan exercised reasonable diligence in verifying its submission, and it provides an audit trail that can be produced during a data request or on-site review.
At minimum, your validation documentation should include: a description of each validation pass (date, scope, responsible staff, tools used), the roster reconciliation outputs (number of providers reviewed, number flagged, number removed and rationale), the adequacy calculation output at each validation milestone, the gap remediation log showing gaps identified and how each was resolved, and the final pre-submission sign-off with signature from the network operations leader responsible for the filing.
Store this documentation in a location accessible to your audit response team — not in a personal drive or email thread. When CMS sends an audit data request, the clock starts immediately, and the ability to export your validation documentation quickly is a meaningful operational advantage.
When Validation Reveals Late-Breaking Gaps
The hardest situation in pre-submission validation is discovering a material adequacy gap within 30 days of the submission deadline. At this stage, recruitment timelines have typically closed, and the options for closing the gap through contracting are limited. But late-breaking gaps are manageable if you have a defined protocol.
First, classify the gap by severity. A gap that puts a county at 85% adequacy when 90% is required is different from a gap that puts a county at 40% adequacy. The severity determines whether the gap can be addressed through an exception filing alone or whether it requires escalation to plan leadership and potentially to your CMS regional office.
Second, exhaust your emergency contracting options. Some specialties have providers who can be activated through letter-of-agreement arrangements faster than a full contracting workflow. Hospital-employed specialists sometimes can be added under an existing facility contract. Locum tenens providers can sometimes be engaged quickly enough to clear credentialing before the submission window closes. Document every option pursued and every response received.
Third, prepare a thorough exception filing for any gaps that cannot be closed. The exception narrative must explain specifically why the gap exists, document every outreach attempt made to close it, and describe the plan's access-continuity commitments for members in the affected county. A strong exception filing for a late-breaking gap — one that demonstrates genuine good-faith effort — is materially better positioned than a weak filing for a gap you knew about months earlier.
Finally, notify your compliance officer. Any gap that requires an exception filing in a material specialty category should be disclosed to your compliance function before submission. The compliance officer may need to assess whether the gap creates regulatory exposure beyond the standard exception-filing process and whether any additional stakeholder communications are warranted.
See Blueprint in action
Blueprint automates the network build workflows described in this article — from adequacy modeling to provider outreach tracking. See it with your state and line of business.