Welcome to NAHDO's "data blog" highlighting issues related to health care data collection and use.
The journey to improve the quality of hospital, claims, and encounter data sets never quite ends. High-quality data are needed for quality measurement, risk adjustment, and program evaluation and oversight.
While most of us agree that data quality is important, what does it mean, and what unique challenges do data agencies face in improving it? Data sets, especially those aggregated from different health systems and platforms, are never perfect, and perceptions of data quality differ by user and use. A claims transaction must conform to national X12N standards to qualify for automated acceptance and payment. Data aggregated across dozens (or hundreds) of data supplier platforms for third-party use (policy, research, market analysis) pose huge challenges for the data steward, and data quality is at the heart of comparative provider performance and value-based payment purposes.
State data agencies apply a series of audit tools and system checks to detect anomalies in the data (gender conflicts, age conflicts, code range compliance, etc.). Working closely with data suppliers to correct or align reporting issues has a high payoff in improved data utility, but it is time-consuming, requiring ongoing relationship-building and feedback to data suppliers.
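The edit checks described above can be illustrated with a minimal sketch. This is not NAHDO's or any state's actual audit tooling; the field names and diagnosis code lists below are hypothetical examples chosen only to show the pattern of record-level anomaly flagging.

```python
def audit_record(record):
    """Return a list of anomaly flags for one claims/encounter record.

    A simplified illustration of record-level edit checks: gender
    conflicts, age conflicts, and code-range compliance. All field
    names and code sets here are hypothetical examples.
    """
    flags = []

    # Gender conflict: e.g., a pregnancy-related diagnosis on a male record.
    pregnancy_codes = {"O80", "Z34"}  # illustrative ICD-10 examples only
    if record.get("sex") == "M" and record.get("dx") in pregnancy_codes:
        flags.append("gender_conflict")

    # Age conflict: e.g., a newborn-only code on an adult record.
    newborn_codes = {"Z38"}  # illustrative example only
    if record.get("age", 0) > 1 and record.get("dx") in newborn_codes:
        flags.append("age_conflict")

    # Code-range compliance: reported age must fall in a plausible range.
    if not 0 <= record.get("age", -1) <= 120:
        flags.append("age_out_of_range")

    return flags
```

In practice, flagged records would be fed back to the data supplier for correction rather than silently dropped, which is where the relationship-building described above comes in.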
All data stewards agree that accuracy, completeness, and timeliness translate into the usability of the large-scale data sets we maintain. These and other issues were discussed at a lively Data Quality Roundtable at NAHDO's 32nd Annual Meeting in Washington, DC, on October 4. Participants called for:
A NAHDO Data Quality Forum to promote best practices in data improvement and minimum edit protocols.
A way to alert states about policies likely to affect data quality (NUBC changes, ICD-10 issues).