Author
Software Engineer, Freedman HealthCare

Ms. Sieber is a Software Engineer who enjoys making the analytical process more feasible and meaningful by bringing varied data sources together into consistent results.

Inconsistent Data Sets: Normalizing CMS Quality Data Files for Analysis

In the last post, we discussed the five file formats the Freedman Analytical Engine (FAE) encounters as it processes Centers for Medicare and Medicaid Services (CMS) quality data. FAE must standardize these files to create one merged CSV, which most closely resembles a Vertical ID file. This post will focus…
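As a rough illustration of the standardization the teaser describes, the sketch below reshapes a "horizontal" quality file (one row per provider, one column per measure) into a vertical layout with one row per provider-measure pair. The column names and measure IDs here are hypothetical examples, not FAE's actual schema.

```python
import csv
import io

# Hypothetical horizontal file: one row per provider, one column per measure.
horizontal = """provider_id,MORT_30_AMI,MORT_30_HF
10001,12.4,11.0
10005,13.1,10.2
"""

def to_vertical(text):
    """Reshape a horizontal CSV into vertical rows: (provider_id, measure_id, score)."""
    reader = csv.DictReader(io.StringIO(text))
    rows = []
    for record in reader:
        pid = record.pop("provider_id")
        # Each remaining column becomes its own row, keyed by measure ID.
        for measure_id, score in record.items():
            rows.append({"provider_id": pid,
                         "measure_id": measure_id,
                         "score": score})
    return rows

vertical = to_vertical(horizontal)
# Two providers x two measures -> four vertical rows.
```

Once every input format is reshaped into this common vertical layout, merging the files into one CSV is a simple concatenation.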

Inconsistent Data Sets: The Five CMS Quality Data File Structures

In the last blog post, we discussed augmenting the ETL process for CMS quality data. The Freedman Analytical Engine (FAE) starts with this quality data, which does not have a consistent structure, normalizes the column headers and measure identifiers, and merges it into one CSV for analysis. To normalize these varying formats, the different file structures…

Inconsistent Data Sets: Augmentation of the ETL Process for CMS Quality Data

When a task is automatic, the computer completes it from start to finish. In an augmented task, the computer helps the person complete the work but needs some form of interaction or input from the user. Previously, we explained how the Freedman Analytical Engine (FAE) uses metadata dictionaries to convert column headers and measure identifiers…
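A minimal sketch of the automatic-versus-augmented distinction: known headers resolve without intervention, while an unknown header pauses to ask the user and records the answer. The function and dictionary names are illustrative assumptions, not FAE's actual interface.

```python
def resolve_header(raw, dictionary):
    """Augmented lookup: auto-resolve known headers, ask the user otherwise."""
    if raw in dictionary:
        return dictionary[raw]  # automatic path: no user interaction needed
    # Augmented path: the computer needs input from the user to continue.
    canonical = input(f"Unknown header '{raw}'; enter canonical name: ")
    dictionary[raw] = canonical  # remember the answer for future files
    return canonical

known = {"Score": "score"}
resolve_header("Score", known)  # resolves automatically
```

Saving each user answer back into the dictionary means the augmented path is taken at most once per new header.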

Inconsistent Data Sets: The Use of Dictionaries for CMS Quality Data ETL

In the previous blog post, I explained the general ETL process for CMS quality data. Analysis of quality data from the Centers for Medicare and Medicaid Services (CMS), just like all-payer claims data, population health management data, and other data sources, requires the data to be extracted, transformed, and loaded. The majority of this…
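The dictionary-driven header normalization the teaser refers to can be sketched as a simple lookup table: raw header variants map to one canonical name, and unrecognized headers pass through unchanged. The specific raw headers and canonical names below are hypothetical.

```python
# Hypothetical metadata dictionary mapping raw CMS headers to canonical names.
HEADER_DICTIONARY = {
    "Provider ID": "provider_id",
    "Facility ID": "provider_id",
    "Measure ID": "measure_id",
    "Measure Code": "measure_id",
    "Score": "score",
}

def normalize_headers(headers):
    """Translate raw headers via the dictionary; leave unknown headers as-is."""
    return [HEADER_DICTIONARY.get(h, h) for h in headers]

normalize_headers(["Facility ID", "Measure Code", "Score"])
# -> ["provider_id", "measure_id", "score"]
```

Because several raw variants map to the same canonical name, files from different CMS releases end up with identical headers after this transform step.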