Annual Performance Plan and Report

Fiscal Year 2016
Released February 2015
 

Data Sources and Validation

Reporting HHS Agencies: ACF | ACL | AHRQ | ASA | ASPR | CDC | CMS | FDA | HRSA | IHS | IOS | NIH | OASH | OMHA | ONC | SAMHSA


Administration for Children and Families (ACF)

Measure ID Data Source Data Validation
1.1LT and 1A; 1.1LT and 1B
(ACF)
State LIHEAP Household Report and Census Bureau’s Annual Social and Economic Supplement (ASEC) to the Current Population Survey ACF obtains weighted national estimates of LIHEAP income-eligible (low-income) households from the Census Bureau’s Annual Social and Economic Supplement (ASEC) to the Current Population Survey. Specialized tabulations are developed to select those ASEC households that would be eligible under the federal LIHEAP maximum cutoff of the greater of 150 percent of HHS’ Poverty Guidelines or 60 percent of HHS’ state median income estimates. [1] The weighted estimates include data on the number of those households having at least one member who is 60 years or older and the number of those households having at least one member who is five years or younger. The estimates are subject to sampling variability. The Census Bureau validates ASEC data.

ACF aggregates data from the states’ annual LIHEAP Household Report to obtain a national count of LIHEAP households that receive heating assistance. The count includes data on the number of households having at least one member who is 60 years or older and the number of those households having at least one member who is five years or younger. The aggregation and editing of state-reported LIHEAP recipiency data for the previous fiscal year are typically completed in November of the following fiscal year. Consequently, the data are not available in time to modify ACF interventions prior to the current fiscal year (i.e., there is at least a one-year data lag). There is now some electronic data validation, as ACF uses a web-based system for grantees to submit and validate their data for the LIHEAP Household Report. ACF also cross-checks the data against LIHEAP benefit data obtained from the states’ submission of the annual LIHEAP Grantee Survey on sources and uses of LIHEAP funds.
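The federal maximum income cutoff applied in these tabulations is a simple "greater of" rule. A minimal sketch in Python, using hypothetical placeholder figures rather than official Poverty Guideline or state median income values:

```python
def liheap_income_eligible(household_income: float,
                           poverty_guideline: float,
                           state_median_income: float) -> bool:
    """Greater-of federal LIHEAP maximum income cutoff (placeholder inputs)."""
    cutoff = max(1.50 * poverty_guideline, 0.60 * state_median_income)
    return household_income <= cutoff

# Hypothetical figures for a four-person household:
print(liheap_income_eligible(30_000, poverty_guideline=24_000,
                             state_median_income=70_000))  # True (cutoff = 42,000)
```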

 

[1] Congress raised the federal LIHEAP maximum income cutoff to 75 percent of state median income for both FY 2009 and FY 2010. Most states did not elect to use 75 percent of state median income (SMI). With the enactment of P.L. 112-10 on April 15, 2011, grantees could qualify only those LIHEAP applicants who were income eligible up to the 60 percent of SMI threshold. The use of 75 percent of SMI was no longer allowable.
2B
(ACF)
Biennial CCDF Report of State Plans The CCDF State Plan preprint requires states to provide information about their progress in implementing the program components related to quality rating and improvement systems (QRIS). CCDF State Plans are submitted on a biennial basis. In order to collect data on years when CCDF State Plans are not submitted, updates are provided by states and territories using the same questions as included in the CCDF State Plan to ensure data consistency.
3.6LT and 3B
(ACF)
Program Information Report (PIR) Data collection for the PIR is automated to improve efficiency in the collection and analysis of data. Head Start achieves a 100 percent response rate annually from 2,600 respondents. The Office of Head Start also engages in significant monitoring of Head Start grantees through monitoring reviews of Head Start and Early Head Start grantees, which examine and track Head Start Program Performance Standards compliance at least every three years for each program. Teams of ACF Regional Office and Central Office staff, along with trained reviewers, conduct more than 500 on-site reviews each year. The automated data system provides trend data so that the team can examine strengths and weaknesses in all programs.
3A
(ACF)
Classroom Assessment Scoring System (CLASS: Pre-K) CLASS: Pre-K is a valid and reliable tool that uses observations to rate the interactions between adults and children in the classroom. Reviewers, who have achieved the standard of reliability, assess classroom quality by rating multiple dimensions of teacher-child interaction on a seven-point scale (with scores of one to two being in the low range; three to five in the mid-range; and six to seven in the high range of quality). ACF will implement ongoing training for CLASS: Pre-K reviewers to ensure their continued reliability. Periodic double-coding of reviewers will also be used, a process in which two reviewers rate the same observations to ensure they continue to be reliable in their scoring.
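The score bands and double-coding check described above can be illustrated with a short sketch; the one-point agreement tolerance below is an assumption for illustration, not an ACF reliability standard.

```python
def quality_band(score: int) -> str:
    """Map a CLASS: Pre-K dimension rating (1-7) to its quality range."""
    if not 1 <= score <= 7:
        raise ValueError("CLASS dimensions are rated on a seven-point scale")
    if score <= 2:
        return "low"
    if score <= 5:
        return "mid"
    return "high"

def double_coded_agreement(reviewer_a, reviewer_b, tolerance=1):
    """Share of dimensions on which two reviewers' ratings fall within
    `tolerance` points of each other (assumed agreement rule)."""
    pairs = list(zip(reviewer_a, reviewer_b))
    return sum(abs(a - b) <= tolerance for a, b in pairs) / len(pairs)

print(quality_band(4))                               # "mid"
print(double_coded_agreement([5, 6, 3], [4, 6, 5]))  # ≈ 0.67 (2 of 3 within 1 point)
```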
3F
(ACF)
Program Information Report (PIR) The PIR is a survey of all grantees that provides comprehensive data on Head Start, Early Head Start and Migrant Head Start programs nationwide. Data collection for the PIR is automated to improve efficiency in the collection and analysis of data. Head Start achieves a 100 percent response rate annually from 2,600 respondents. The automated data system provides trend data so that the team can examine strengths and weaknesses in all programs.
4.1LT and 4A
(ACF)
The Runaway and Homeless Youth Management Information System (RHYMIS) RHYMIS incorporates numerous business rules and edit checks, provides a hot-line/help desk and undergoes continuous improvement and upgrading. Extensive cleanup and validation of data take place after each semi-annual transfer of data from grantee systems into the national database. Historically, the reporting response rate of grantees has exceeded 97 percent every year.
7D
(ACF)
State Annual Reports States are required to submit an Annual Report addressing each of the CBCAP performance measures outlined in Title II of CAPTA. One section of the report must “provide evaluation data on the outcomes of funded programs and activities.” The 2006 CBCAP Program Instruction adds a requirement that states also report on the OMB performance measures and national outcomes for the CBCAP program. States were required to report on this new efficiency measure starting in December 2006. The three percent annual increase represents an ambitious target, since this is the first time the program has required its grantees to target their funding toward evidence-based and evidence-informed programs, and it will take time for states to adjust their funding priorities to meet these new requirements.
7P1
(ACF)
Adoption and Foster Care Analysis and Reporting System (AFCARS) States report child welfare data to ACF through AFCARS. All state semi-annual AFCARS data submissions undergo extensive edit-checks for validity. The results of the AFCARS edit-checks for each of the six-month data submissions are automatically generated and sent back to each state, to help the state to improve data quality. Many states submit revised data to ensure that accurate data are submitted, often for more than one prior submission period. The Children’s Bureau conducts several AFCARS compliance reviews each year, which typically result in a comprehensive AFCARS Improvement Plan (AIP). States’ Statewide Automated Child Welfare Information Systems (SACWIS) are undergoing reviews to determine the status of their operation and the system’s capability of reporting accurate AFCARS data. To speed improvement in these data, the agency funds the National Resource Center for Child Welfare Data and Technology. This Resource Center provides technical assistance to states to improve reporting to AFCARS, improve statewide information systems, and make better use of their data. All of these activities should continue to generate additional improvements in the data over the next few years.
7P2
(ACF)
Adoption and Foster Care Analysis and Reporting System (AFCARS) States report child welfare data to ACF through AFCARS. All state semi-annual AFCARS data submissions undergo extensive edit-checks for validity. The results of the AFCARS edit-checks for each of the six-month data submissions are automatically generated and sent back to each state, to help the state to improve data quality. Many states submit revised data to ensure that accurate data are submitted, often for more than one prior submission period. The Children’s Bureau conducts several AFCARS compliance reviews each year, which typically result in a comprehensive AFCARS Improvement Plan (AIP). States’ Statewide Automated Child Welfare Information Systems (SACWIS) are undergoing reviews to determine the status of their operation and the system’s capability of reporting accurate AFCARS data. To speed improvement in these data, the agency funds the National Resource Center for Child Welfare Data and Technology. This Resource Center provides technical assistance to states to improve reporting to AFCARS, improve statewide information systems, and make better use of their data. All of these activities should continue to generate additional improvements in the data over the next few years.
7Q
(ACF)
Adoption and Foster Care Analysis and Reporting System (AFCARS) States report child welfare data to ACF through AFCARS. All state semi-annual AFCARS data submissions undergo extensive edit-checks for validity. The results of the AFCARS edit-checks for each of the six-month data submissions are automatically generated and sent back to each state, to help the state to improve data quality. Many states submit revised data to ensure that accurate data are submitted, often for more than one prior submission period. The Children’s Bureau conducts several AFCARS compliance reviews each year, which typically result in a comprehensive AFCARS Improvement Plan (AIP). States’ Statewide Automated Child Welfare Information Systems (SACWIS) are undergoing reviews to determine the status of their operation and the system’s capability of reporting accurate AFCARS data. To speed improvement in these data, the agency funds the National Resource Center for Child Welfare Data and Technology. This Resource Center provides technical assistance to states to improve reporting to AFCARS, improve statewide information systems, and make better use of their data. All of these activities should continue to generate additional improvements in the data over the next few years.
7S
(ACF)
Regulatory Title IV-E Foster Care Eligibility Reviews Data validation occurs on multiple levels. Information collected during the onsite portion of the review is subject to quality assurance procedures to assure the accuracy of the findings of substantial compliance and reports are carefully examined by the Children’s Bureau Central and Regional Office staff for accuracy and completeness before a state report is finalized. Through the error rate contract, data is systematically monitored and extensively checked to make sure the latest available review data on each state is incorporated and updated to reflect rulings by the Departmental Appeals Board and payment adjustments from state quarterly fiscal reports. This ensures the annual program error rate estimates accurately represent each state’s fiscal reporting and performance for specified periods. The Children’s Bureau also has a database (maintained by the contractor) that tracks all key milestones for the state eligibility reviews.
12B
(ACF)
CSBG Information System (CSBG/IS) survey administered by the National Association for State Community Services Programs (NASCSP) The Office of Community Services (OCS) and NASCSP have worked to ensure that the survey captures the required information. The CSBG Block Grant allows states to have different program years; this can create a substantial time lag in preparing annual reports. States and local agencies are working toward improving their data collection and reporting technology. In order to improve the timeliness and accuracy of these reports, NASCSP and OCS are providing states with training, better survey tools, and improved reporting processes.
14D
(ACF)
Family Violence Prevention and Services Program Performance Progress Report Form Submission of this report is a program requirement. The outcome measures and the means of data collection were developed with extensive input from researchers and the domestic violence field. The forms, instructions, and several types of training have been given to states, tribes and domestic violence coalitions.
16.1LT and 16C
(ACF)
Matching Grant Progress Report forms Data are validated with methods similar to those used with Performance Reports. Data are validated by periodic desk and on-site monitoring, in which refugee cases are randomly selected and reviewed. During on-site monitoring, outcomes reported by service providers are verified with both employers and refugees to ensure accurate reporting of job placements, wages, and retentions.
18.1LT and 18A
(ACF)
Performance Report (Form ORR-6) Data are validated by periodic desk and on-site monitoring, in which refugee cases are randomly selected and reviewed. During on-site monitoring, outcomes reported by service providers are verified with both employers and refugees to ensure accurate reporting of job placements, wages, and retentions.
20C
(ACF)
Office of Child Support Enforcement (OCSE) Form 157 States currently maintain information on the necessary data elements for the above performance measures. All states were required to have a comprehensive, statewide, automated Child Support Enforcement system in place by October 1, 1997. Fifty-three states and territories were certified under the Family Support Act and the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) as of July 2007. Certification requires states to meet the automation systems provisions of the specific act. Continuing implementation of these systems, in conjunction with cleanup of case data, will improve the accuracy and consistency of reporting. As part of OCSE’s audit of performance data, OCSE Auditors review each state’s and territory’s ability to produce valid data. Data reliability audits are conducted annually. Self-evaluation by states and OCSE audits provide an ongoing review of the validity of data and the ability of automated systems to produce accurate data. Each year OCSE Auditors review the data that states report for the previous fiscal year. The OCSE Office of Audit has completed the FY 2012 data reliability audits. Since FY 2001, the reliability standard has been 95 percent.
22.2LT and 22B
(ACF)
National Directory of New Hires (NDNH) Beginning with performance in FY 2001, the above employment measures – job entry, job retention, and earnings gain – are based solely on performance data obtained from the NDNH. Data are updated by states, and data validity is ensured with normal auditing functions for submitted data. Prior to use of the NDNH, states had flexibility in the data source(s) they used to obtain wage information on current and former TANF recipients under high performance bonus (HPB) specifications for performance years FY 1998 through FY 2000. ACF moved to this single source national database (NDNH) to ensure equal access to wage data and uniform application of the performance specifications.


Administration for Community Living (ACL)

Measure ID Data Source Data Validation
1.1
(ACL)
State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data.
2.6
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging's (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients who are currently participating in OAA programs. These high cooperation rates occur because of several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible, and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy. This includes using the inverse of the probability of selection to weight the sample of agencies and clients, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratification to control totals to ensure consistency with official administrative records.
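The three post-survey weighting steps follow a standard pattern: base weights from the inverse probability of selection, a nonresponse adjustment, and post-stratification to a control total. A toy sketch with invented numbers:

```python
def base_weight(selection_probability: float) -> float:
    # Inverse of the probability of selection.
    return 1.0 / selection_probability

def nonresponse_adjust(weight: float, response_rate: float) -> float:
    # Inflate respondent weights to cover nonrespondents in their cell.
    return weight / response_rate

def poststratify(weights: list, control_total: float) -> list:
    # Scale weights so they sum to an official control total.
    factor = control_total / sum(weights)
    return [w * factor for w in weights]

w = nonresponse_adjust(base_weight(0.001), response_rate=0.9)  # ~1111.1
print(poststratify([w, w, w], control_total=3600.0))           # three weights of 1200.0
```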
2.9a
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging's (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients who are currently participating in OAA programs. These high cooperation rates occur because of several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible, and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy. This includes using the inverse of the probability of selection to weight the sample of agencies and clients, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratification to control totals to ensure consistency with official administrative records.
2.9b
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging's (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients who are currently participating in OAA programs. These high cooperation rates occur because of several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible, and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy. This includes using the inverse of the probability of selection to weight the sample of agencies and clients, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratification to control totals to ensure consistency with official administrative records.
2.9c
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging's (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients who are currently participating in OAA programs. These high cooperation rates occur because of several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible, and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy. This includes using the inverse of the probability of selection to weight the sample of agencies and clients, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratification to control totals to ensure consistency with official administrative records.
2.10
(ACL)
State Program Report and National Survey of Older Americans Act Participants. This is a composite measure that utilizes data from multiple sources. One source is the State Program Report. Another source is the National Survey. State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data. The National Survey draws a sample of Area Agencies on Aging to obtain a random sample of clients receiving selected Older Americans Act (OAA) services. Trained staff administer telephone surveys. Results are analyzed and compared to the client population to assure a representative sample.
3.1
(ACL)
State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data.
3.5
(ACL)
State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data.


Agency for Healthcare Research and Quality (AHRQ)

Measure ID Data Source Data Validation
1.3.16
(AHRQ)
MEPS website - MEPSnet/IC interactive tool. Data published on website

A number of steps are taken from the time of sample selection up to data release to ensure the reliability and accuracy of MEPS data, including:
  • Quality control checks are applied to the MEPS sample frame when it is received from NCHS as well as to the subsample selected for MEPS.
  • Following interviewer training, performance is monitored through interview observations and validation interviews.
  • A variety of materials and strategies are employed to stimulate and maintain respondent cooperation.
  • All manual coding and data entry tasks are monitored for quality by verification at 100 percent until an error rate of less than 2 percent is achieved for coding work or less than 1 percent for data entry (see the sketch after this list).
  • All specifications developed to guide the editing, variable construction and file creation are monitored through data runs that are used to verify that processes are conducted correctly and to identify data anomalies.
  • Analytic weights are developed in a manner that reduces nonresponse bias and improves national representativeness of survey estimates.
  • The precision of survey estimates is reviewed to ensure they achieve the precision specifications for the survey.
  • Prior to data release, survey estimates on health care utilization, expenditures, insurance coverage, priority conditions and income are compared to previous year MEPS data and other studies. Significant changes in values of constructed variables are investigated to determine whether differences are attributable to data collection or variable construction problems that require correction.
Expenditure data obtained from the MEPS medical provider survey are used to improve the accuracy of household-reported data.
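A minimal sketch of the verification rule flagged in the list above: stay at 100 percent verification until the observed error rate falls below the task's threshold (2 percent for coding, 1 percent for data entry). The sample counts are hypothetical.

```python
# Thresholds taken from the bullet above.
THRESHOLDS = {"coding": 0.02, "data_entry": 0.01}

def full_verification_required(task: str, errors: int, verified: int) -> bool:
    """True while the task must remain at 100 percent verification."""
    return errors / verified >= THRESHOLDS[task]

print(full_verification_required("coding", errors=30, verified=1000))     # True (3.0%)
print(full_verification_required("data_entry", errors=5, verified=1000))  # False (0.5%)
```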
1.3.21
(AHRQ)
MEPS website. Data published on website. A number of steps are taken from the time of sample selection up to data release to ensure the reliability and accuracy of MEPS data, including:
  • Quality control checks are applied to the MEPS sample frame when it is received from NCHS as well as to the subsample selected for MEPS.
  • Following interviewer training, performance is monitored through interview observations and validation interviews.
  • A variety of materials and strategies are employed to stimulate and maintain respondent cooperation.
  • All manual coding and data entry tasks are monitored for quality by verification at 100 percent until an error rate of less than 2 percent is achieved for coding work or less than 1 percent for data entry.
  • All specifications developed to guide the editing, variable construction and file creation are monitored through data runs that are used to verify that processes are conducted correctly and to identify data anomalies.
  • Analytic weights are developed in a manner that reduces nonresponse bias and improves national representativeness of survey estimates.
  • The precision of survey estimates is reviewed to ensure they achieve the precision specifications for the survey.
  • Prior to data release, survey estimates on health care utilization, expenditures, insurance coverage, priority conditions and income are compared to previous year MEPS data and other studies. Significant changes in values of constructed variables are investigated to determine whether differences are attributable to data collection or variable construction problems that require correction.
  • Expenditure data obtained from the MEPS medical provider survey are used to improve the accuracy of household-reported data.
1.3.23
(AHRQ)
CAHPS database (National CAHPS Benchmarking Database) Prior to placing survey and related reporting products in the public domain, a rigorous development, testing, and vetting process with stakeholders is followed. Survey results are analyzed to assess internal consistency, construct validity, and power to discriminate among measured providers.
1.3.38
(AHRQ)
Surveys/case studies The Hospital Survey on Patient Safety Culture (HSOPS) is a survey which measures organizational patient safety climate. AHRQ staff from the Patient Safety Portfolio and the Office of Communications and Knowledge Transfer (OCKT), in collaboration with Westat, the HSOPS support contractor, have developed methods and conducted validation studies of HSOPS using a multi-modal approach. First, AHRQ compared HSOPS to the AHRQ Patient Safety Indicators (PSIs), which are based on individual hospital administrative data and are indicators of harm. Next, AHRQ compared HSOPS to HCAHPS, which characterizes the patient’s experience with care. In addition, AHRQ compared HSOPS to CMS pay-for-performance measures.
Finally, AHRQ has conducted multiple case studies of the utilization and implementation of HSOPS by individual hospitals and hospital systems.


Assistant Secretary for Administration (ASA)

Measure ID Data Source Data Validation
1.1
(ASA)
Manual data calls for telework information through the use of spreadsheets. Office of Human Resources Telework Liaisons
1.2
(ASA)
HHS data for fleet statistics comes from analysis and output via a resource called the Federal Automotive Statistical Tool (FAST). The input for the HHS data comes from an internal HHS data resource called the HHS Motor Vehicle Management Information System (MVMIS). Per the intent of the metric, HHS’s measure reflects actual fleet performance values when excluding all fuel products used by HHS law enforcement, protective, emergency response, or military tactical vehicles (if any). Both FAST and MVMIS have internal validation processes.
1.3
(ASA)
OCIO HHS Electronic Stewardship data. OpDiv electronic stewardship workgroup
2.1
(ASA)
HHS personnel records  


Assistant Secretary for Preparedness and Response (ASPR)

Measure ID Data Source Data Validation
2.4.13
(ASPR)
Program files and contract documents Contracts awarded and draft Requests for Proposal for industry comment are negotiated and issued, respectively, in accordance with the Federal Acquisition Regulation (FAR) and the HHS Acquisition Regulation (HHSAR). Interagency Agreements are developed with federal laboratories to address specific advanced research questions.


Centers for Disease Control and Prevention (CDC)

Measure ID Data Source Data Validation
1.2.1c
(CDC)
Childhood data are collected through the National Immunization Survey (NIS) and reflect calendar years. The NIS uses a nationally representative sample and provides estimates of vaccination coverage rates that are weighted to represent the entire population, nationally, and by region, state, and selected large metropolitan areas. The NIS, a telephone-based survey, is administered by random-digit-dialing to find households with children aged 19 to 35 months. Parents or guardians are asked about the vaccines, with dates, that appear on the child’s "shot card" kept in the home; demographic and socioeconomic information is also collected. At the end of the interview with parents or guardians, survey administrators request permission to contact the child's vaccination providers. Providers are then contacted by mail to provide a record of all immunizations given to the child. Examples of quality control procedures include 100% verification of all entered data with a sub-sample of records independently entered. The biannual data files are reviewed for consistency and completeness by CDC’s National Center for Immunization and Respiratory Diseases, Immunization Services Division - Assessment Branch and CDC’s National Center for Health Statistics’ Office of Research and Methodology. Random monitoring by supervisors of interviewers' questionnaire administration styles and data entry accuracy occurs daily. Annual methodology reports and public use data files are available to the public for review and analysis.
1.3.3a
(CDC)
Behavioral Risk Factor Surveillance System (BRFSS)

Behavioral Risk Factor Surveillance System (BRFSS), interviews conducted September-June for an influenza season (e.g., September 2011-June 2012 for the 2011-12 influenza season) and provided to ISD from OSELS by August (e.g., August 2012 for the 2011-12 influenza season). Final results are usually available by September (e.g., September 2012 for the 2011-12 influenza season). BRFSS is an ongoing state-based monthly telephone survey that collects information on health conditions and risk behaviors from ~400,000 randomly selected persons ≥18 years among the non-institutionalized U.S. civilian population. Numerator:
BRFSS respondents were asked if they had received a ‘flu’ vaccine in the past 12 months, and if so, in which month and year. Persons reporting influenza vaccination from August through May (e.g., August 2011-May 2012 for the 2011-12 flu season) were considered vaccinated for the season. Persons reporting influenza vaccination in the past 12 months but with missing month or year of vaccination had month and year imputed from donor pools matched for week of interview, age group, state of residence and race/ethnicity.
The cumulative proportion of persons receiving influenza vaccination coverage during August through May is estimated via Kaplan-Meier analysis in SUDAAN using monthly interview data collected September through June.

Denominator:
Respondents aged ≥18 years responding to the BRFSS in the 50 states and the District of Columbia with interviews conducted September-June for an influenza season (e.g., September 2011-June 2012 for the 2011-12 influenza season) and provided to ISD from OSELS by August (e.g., August 2012 for the 2011-12 influenza season). Persons with unknown, refused, or missing status for flu vaccination in the past 12 months are excluded.
Data validation methodology: Estimates from BRFSS are subject to the following limitations. First, influenza vaccination status is based on self or parental report, was not validated with medical records, and thus is subject to respondent recall bias. Second, BRFSS is a telephone-based survey and does not include households without telephone service (about 2% of U.S. households) and estimates prior to the 2011-12 influenza season did not include households with cellular telephone service only, which may affect some geographic areas and racial/ethnic groups more than others. Third, the median state CASRO BRFSS response rate was 54.4% in 2010, and nonresponse bias may remain after weighting adjustments. Fourth, the estimated number of persons vaccinated might be overestimated, as previous estimates resulted in higher numbers vaccinated than doses distributed.
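The cumulative coverage estimate described under the numerator can be sketched in a simplified, unweighted form: each month's hazard is the share of still-unvaccinated respondents vaccinated that month, and coverage is one minus the product of the survival terms. The production estimate is computed in SUDAAN with survey weights; the counts below are invented.

```python
def cumulative_coverage(monthly_counts):
    """monthly_counts: (vaccinated_this_month, still_at_risk) pairs,
    ordered August through May."""
    survival = 1.0
    for vaccinated, at_risk in monthly_counts:
        survival *= 1.0 - vaccinated / at_risk
    return 1.0 - survival

# Three months of toy data (Aug, Sep, Oct):
print(cumulative_coverage([(50, 1000), (120, 950), (90, 830)]))  # ≈ 0.26
```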
2.1.8
(CDC)
National HIV surveillance system CDC evaluates surveillance programs to determine the quality of data. HIV data are reported from all 50 states, DC, and the U.S. dependent areas. The period of time between a diagnosis of HIV and the arrival of a case report at CDC is called the "reporting delay."

Data on diagnoses of HIV from the National HIV Surveillance System will be available in November. Transitioning to updated data processing has caused data delays. Once this transition is complete, data availability will improve.
2.2.4
(CDC)
DHAP Legal Assessment Project The Legal Assessment Project (LAP) is a legal research and policy analysis project led by CDC’s Division of HIV/AIDS Prevention, Office of the Director. Using standard legal research methods, the LAP researches state statutes, regulations, and policies that affect states’ ability to conduct effective HIV prevention.
2.8.1
(CDC)
The National TB Surveillance System TB morbidity data and related information submitted via the national TB Surveillance System are entered locally or at the state level into CDC-developed software which contains numerous data validation checks. Data received at CDC are reviewed to confirm their integrity and evaluate completeness. Routine data quality reports are generated to assess data completeness and identify inconsistencies. Problems are resolved by CDC staff working with state and local TB program staff. During regular visits to state, local, and territorial health departments, CDC staff review TB registers and other records and data systems and compare records for verification and accuracy. At the end of each year, data are again reviewed before data and counts are finalized and published.
3.3.2a
(CDC)
Emerging Infections Program / Active Bacterial Core Surveillance / Emerging Infections Program Surveillance for Invasive MRSA Infections Surveillance Site personnel are trained in the methodology, with annual updates; laboratory audits are performed by site staff.
3.3.3
(CDC)
National Healthcare Safety Network (NHSN) Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
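Cross-field edit checks of this kind can be illustrated with a toy record validator: fields may be individually valid yet mutually inconsistent, and incomplete records are rejected outright. The field names below are hypothetical, not the actual NHSN schema.

```python
REQUIRED = ("admission_date", "event_date", "event_type")

def validate_record(record: dict) -> list:
    errors = [f"missing field: {f}" for f in REQUIRED if f not in record]
    if errors:
        return errors  # incomplete records cannot be reported
    # Cross-field check: an event cannot precede admission (ISO dates compare lexically).
    if record["event_date"] < record["admission_date"]:
        errors.append("event_date precedes admission_date")
    return errors

print(validate_record({"admission_date": "2016-01-05",
                       "event_date": "2016-01-02",
                       "event_type": "CLABSI"}))
# ['event_date precedes admission_date']
```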
3.3.4
(CDC)
National Healthcare Safety Network (NHSN) Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
4.6.3
(CDC)
National Health Interview Survey, NCHS NCHS validates the data.
4.6.5
(CDC)
Youth Risk Behavior Surveillance System (YRBSS), which monitors priority health-risk behaviors and is conducted every other year (odd years). Beginning in FY 2011, the National Youth Tobacco Survey (NYTS) was added as an additional data source but removed in 2014 when the variance of the data reported in the years between YRBSS data reporting became too great as compared with YRBSS. Validity and reliability studies of YRBSS attest to the quality of the data. CDC conducts quality control and logical edit checks on each record.
4.11.9
(CDC)
National Health Interview Survey (NHIS), CDC, NCHS Data are reported from a national surveillance system and follow predetermined quality control standards.
8.B.1.3a
(CDC)
Tracking spreadsheets from the ELC-ELR monitoring project Data is validated by the Meaningful Use program by collaborating with the various programs (ELR, ISS, SS) to determine the number of awardees that meet the requirements of the EHR-MU standards.
8.B.2.2
(CDC)
Electronic media reach of CDC Vital Signs is measured by CDC.gov web traffic and actual followers and subscribers of CDC’s social media, e-mail updates, and texting service The data source for this measure is Omniture® web analytics, which is a software product that provides consolidated and accurate statistics about interactions with CDC.gov and social media outlets as individuals seek and access information about CDC Vital Signs. Monthly review of Omniture data is conducted by the CDC Office of the Associate Director for Communication (OADC) and Vital Signs staff.
8.B.2.5
(CDC)
The data source for this measure is Omniture® web analytics, which is a software product that provides consolidated and accurate statistics about interactions with CDC.gov Ongoing review of Omniture reports by Community Guide staff.
8.B.4.2
(CDC)
Emerging Infectious Diseases (EID) Laboratory Fellowships, CDC/Council of State and Territorial Epidemiologists’ (CSTE) Applied Epidemiology Fellowship, Post-EIS Practicum (now known as the Health Systems Integration Program), PHPS Residency, and Applied Public Health Informatics Fellowship were added to the measure in FY 2011. The PHPS Residency pilot program ended in FY 2012. The Informatics Training in Place Program was added in FY 2014. Trainees funded by other federal agencies are excluded. Staff reviews and validates data through the fellowship programs’ personnel systems.
10.A.1.5
(CDC)
Annual Program Results (APRs) Data are validated through routine site monitoring and data quality assurance activities. These activities are performed routinely at sites providing direct service delivery and sites that receive technical assistance for service delivery improvement for this performance measure. Data validation includes routine procedures for assessing and maintaining data completeness and accuracy throughout the data lifecycle as well as systematic procedures for assessing that the reported data are validated. Final aggregated numbers for results and future targets are reviewed by an expert team representing both programmatic technical area experts at headquarters and country technical team members with expert knowledge of country program context, historical performance, and current performance capacity.
10.F.1a
(CDC)
FETP Annual Program Reports Reports from countries are submitted to CDC annually. These reports are confirmed by program directors in each country.
10.F.1b
(CDC)
FETP Annual Program Reports Reports from countries are submitted to CDC annually. These reports are confirmed by program directors in each country.
13.5.3
(CDC)
Self-reported data from 62 PHEP grantees. Quality assurance reviews with follow-up with grantees.


Centers for Medicare and Medicaid Services (CMS)

Measure ID Data Source Data Validation
ACO1.1
(CMS)
Master Data Management System CMS contractors perform assignment and reconciliation tasks; the data are cleaned and reviewed through quality assurance and validation processes.
ACO1.2
(CMS)
Medicare Provider Enrollment, Chain, and Ownership System (PECOS) CMS contractors perform assignment and reconciliation tasks; the data are cleaned and reviewed through quality assurance and validation processes.
ACO1.3
(CMS)
Central Repository of Alignment Files (CCRAF);
Central Repository of Alignment Files and Payment (CCRAFP)
CMS contractors perform assignment and reconciliation tasks; the data are cleaned and reviewed through quality assurance and validation processes.
CHIP 3.3
(CMS)
Statistical Enrollment Data System Each State must assure that the information is accurate and correct when the information is submitted to SEDS by certifying that the information shown on the CHIP forms is correct and in accordance with the State's child health plan as approved by the Secretary.

CMS staff populates the data into various SEDS reports and verifies each of the enrollment measures. Each form has the following seven measures that are reported by service delivery system: 1: Unduplicated Number Ever Enrolled During the Quarter. 2: Unduplicated Number of New Enrollees in the Quarter. 3: Unduplicated Number of Disenrollees in the Quarter. 4: Number of Member-Months of Enrollment in the Quarter. 5: Average Number of Months of Enrollment (item 4 divided by item 1). 6: Number Enrolled At Quarter’s End (point in time). 7: Unduplicated Number Ever Enrolled in the Year (4th Quarter Only).

CMS compares these enrollment measures to past quarters and trends over the life of each program to ensure that there are not any anomalies in the data, and if apparent errors are detected, CMS corresponds with the State staff who are responsible for reporting enrollment statistics. If there are major increases or decreases, CMS investigates the causes of the changes in enrollment patterns.
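Measure 5 above is a derived quantity (member-months divided by the unduplicated number ever enrolled), and the trend review can be sketched as a simple threshold flag; the 20 percent flag threshold below is an assumption for illustration.

```python
def average_months_of_enrollment(member_months: int, ever_enrolled: int) -> float:
    # SEDS item 4 divided by item 1.
    return member_months / ever_enrolled

def flag_anomaly(current: float, prior: float, threshold: float = 0.20) -> bool:
    """Flag enrollment that moved more than `threshold` versus the prior quarter."""
    return abs(current - prior) / prior > threshold

print(average_months_of_enrollment(member_months=270_000, ever_enrolled=100_000))  # 2.7
print(flag_anomaly(current=130_000, prior=100_000))  # True: +30% quarter over quarter
```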
MCD6
(CMS)
Developmental. The core set of measures required under CHIPRA was published in December 2009. CMS will initially use an automated web-based system, the CHIP Annual Reporting Template System (CARTS), for the reporting of quality measures developed by the new program. This is the same system that was used for the CHIP Quality GPRA goal that was discontinued after FY 2010 (MCD2). Developmental. CMS will monitor performance measurement data related to the core set of measures through CARTS.
MCD8
(CMS)
Developmental. For FY 2011 and FY 2012, the data source will be the links to the Federal Register. The link to the recommended core set is: http://federalregister.gov/a/2010-32978. The link to the published core set is http://federalregister.gov/a/2011-33756. In February 2013, CMS provided States with technical specifications for reporting information on the adult quality core measures set and technical assistance to increase the feasibility of reporting. Information voluntarily reported to CMS by early 2014 will serve as the data source for assessing States’ progress in reporting standardized adult quality measurement data to CMS. Developmental. For FY 2011 and FY 2012, the data validation will be the link to the core set in the Federal Register. The link to the recommended core set is: http://federalregister.gov/a/2010-32978. The link to the published core set is http://federalregister.gov/a/2011-33756.
MCR1.1a
(CMS)
The Medicare Consumer Assessment of Healthcare Providers and Systems (CAHPS) is a set of annual surveys of beneficiaries enrolled in all Medicare Advantage plans and in the original Medicare fee-for-service plan. The Medicare CAHPS surveys are administered according to the standardized protocols delineated in the CAHPS 4.0 Survey and Reporting Kit developed by the Agency for Healthcare Research and Quality (AHRQ). This protocol includes two mailings of the survey instruments to randomized samples of Medicare beneficiaries in health plans and geographic areas, with telephone follow-up of non-respondents with valid telephone numbers. CAHPS data are carefully edited and cleaned prior to the creation of composite measures using techniques employed comparably in all surveys. Both non-respondent sample weights and managed care-FFS comparability weights are employed to adjust collected data for differential probabilities of sample selection, under-coverage, and item response.
MCR1.1b
(CMS)
The Medicare Consumer Assessment of Healthcare Providers and Systems (CAHPS) is a set of annual surveys of beneficiaries enrolled in the original Medicare fee-for-service plan and in all Medicare Advantage plans. The Medicare CAHPS surveys are administered according to the standardized protocols delineated in the Medicare Advantage and Prescription Drug Plan CAHPS Survey Quality Assurance Protocols & Technical Specifications. This protocol includes two mailings of the survey instruments to randomized samples of Medicare beneficiaries in health plans and geographic areas, with telephone follow-up of non-respondents with valid telephone numbers. CAHPS data are carefully analyzed and cleaned prior to the creation of composite measures using techniques employed comparably in all surveys. Both non-respondent sample weights and managed care-FFS comparability weights are employed to adjust collected data for differential probabilities of sample selection, under-coverage, and item response.
MCR23
(CMS)
The Prescription Drug Event (PDE) data CMS has a rigorous data quality program for ensuring the accuracy and reliability of the PDE data. The first phase in this process is on-line PDE editing. The purpose of on-line editing is to apply format rules, check for legal values, compare data in individual fields to other known information (such as beneficiary, plan, or drug characteristics) and evaluate logical consistency between multiple fields reported on the same PDE. On-line editing also enforces business order logic which ensures only one PDE is active for each prescription drug event. The second phase of our data quality program occurs after PDE data has passed all initial on-line edits and is saved in our data repository. We conduct a variety of routine and ad hoc data analysis of saved PDEs to ensure data quality and payment accuracy.
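The two phases described above might look like the following simplified sketch: field-level format and legal-value edits, plus the business order rule that keeps only one PDE active per prescription drug event. Field names and legal values are hypothetical simplifications, not the actual PDE record layout.

```python
VALID_COVERAGE_CODES = {"C", "E", "O"}  # placeholder legal values

def field_edits(pde: dict) -> list:
    """Format and legal-value checks on a single PDE record."""
    errors = []
    if pde.get("days_supply", 0) <= 0:
        errors.append("days_supply must be positive")
    if pde.get("coverage_code") not in VALID_COVERAGE_CODES:
        errors.append("illegal coverage_code")
    return errors

def enforce_one_active(pdes: list) -> list:
    """Business order logic: keep the latest submission per event key."""
    latest = {}
    for pde in pdes:  # assumed ordered by submission time
        latest[pde["event_key"]] = pde
    return list(latest.values())

batch = [{"event_key": "A1", "days_supply": 30, "coverage_code": "C"},
         {"event_key": "A1", "days_supply": 30, "coverage_code": "C"}]
print([field_edits(p) for p in batch])  # [[], []] - both pass field edits
print(len(enforce_one_active(batch)))   # 1 active PDE for event A1
```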
MCR26
(CMS)
Medicare claims data. The data used to calculate the performance measures are administrative claims data submitted by hospitals. Administrative claims data is a validated data source and is the data source for public reporting of hospital readmission rates on the Hospital Compare website (www.hospitalcompare.hhs.gov). As stated on the Hospital Compare website, research conducted when the measures were being developed demonstrated that the administrative claims-based model performs well in predicting readmissions compared with models based on chart reviews. CMS uses national administrative inpatient hospital claims data to calculate the readmission rate measure. The claims processing systems have validation methods to accept accurate Medicare claims into the claims database. Inpatient hospital claims information is assumed to be accurate and reliable as presented in the database.
MCR28.2
(CMS)
The CDC National Healthcare Safety Network Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
MIP1
(CMS)
Comprehensive Error Rate Testing (CERT) Program. The CERT program is monitored for compliance by CMS through monthly reports from the contractors. In addition, the OIG periodically conducts reviews of CERT and its contractors.
MIP5
(CMS)
The Part C Error Rate estimate measures errors in clinical diagnostic data submitted to CMS by plans. The diagnostic data is used to determine risk adjusted payments made to plans. Data used to determine the Part C program payment error rate is validated by several contractors.

The Part C program payment error estimate is based on data obtained from a rigorous Risk Adjustment Data Validation process in which medical records are reviewed by independent coding entities in the process of confirming that medical record documentation supports risk adjustment diagnosis data for payment.
MIP6
(CMS)
The components of payment error measurement in the Part D program are:

A rate(s) that measures payment errors related to low income subsidy (LIS) payments for beneficiaries dually-eligible for Medicare and Medicaid and non-duals also eligible for LIS status.

A rate that measures payment errors from errors in Prescription Drug Event (PDE) records. A PDE record represents a prescription filled by a beneficiary that was covered by the plan.

A rate that measures payment errors resulting from incorrect assignment of Medicaid status to beneficiaries who are not dually eligible for Medicare and Medicaid.

A rate that measures payment errors from errors in Direct and Indirect Remuneration (DIR) amounts reported by Part D sponsors to CMS. DIR is defined as price concessions (offered to purchasers by drug manufacturers, pharmacies, or other sources) that serve to decrease the costs incurred by the Part D sponsor for prescription drugs.
For the Part D component payment error rates, the data to validate payments comes from multiple internal and external sources, including CMS’ enrollment and payment files. Data are validated by several contractors.

Data for the LIS payment error measure come from CMS’ internal payment and enrollment files for all Part D plan beneficiaries.

Data for the PDE error measure come from CMS’ PDE Data Validation process, which validates PDE data through contractor review of supporting documentation submitted to CMS by a national sample of Part D plans.

The data element for incorrect Medicaid status is the PERM eligibility error rate, which is validated by the Medicaid program for the entire Medicaid population and is used by the Part D program as a proxy for incorrect Medicaid status. From the population of Part D beneficiaries who are eligible for Medicare and Medicaid, we randomly assign a subset, equal to the PERM rate, to be ineligible for Medicaid, resulting in payment error.

Data for the DIR error measure come from audit findings for a national sample of Part D plans; the audits are conducted by contractors as part of the Financial Audit process conducted by CMS' Office of Financial Management (OFM).
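The PERM-based proxy for incorrect Medicaid status can be sketched as a random assignment; the rate and beneficiary identifiers below are placeholders, not actual program data.

```python
import random

def assign_perm_proxy(dual_eligible_ids: list, perm_rate: float,
                      seed: int = 0) -> set:
    """Randomly treat a PERM-rate share of duals as Medicaid-ineligible."""
    rng = random.Random(seed)
    k = round(perm_rate * len(dual_eligible_ids))
    return set(rng.sample(dual_eligible_ids, k))

duals = [f"bene{i}" for i in range(1000)]
ineligible = assign_perm_proxy(duals, perm_rate=0.05)
print(len(ineligible))  # 50 beneficiaries counted as payment error
```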
MIP8
(CMS)
Our predictive analytics work, using FPS, will focus on activities in the areas where incidence or opportunity for improper payments and/or fraud are greatest. While this risk-based approach increases contractors’ efficiency, it also reduces the burden on legitimate providers by focusing the majority of fraud detection and prevention resources on those posing a higher risk of fraud. FPS captures the link between each individual ASR and each subsequent administrative action. The FPS Dashboard and supporting systems will enable seamless reporting of all data necessary to develop the baseline and to measure performance against any future targets.
MIP9.1
(CMS)
As part of a national contracting strategy, adjudicated claims data and medical policies are gathered from the States for purposes of conducting medical and data processing reviews on a sample of the claims paid in each State. CMS and our contractors are working with the 17 States to ensure that the Medicaid and CHIP universe data and sampled claims are complete and accurate and contain the data needed to conduct the reviews. In addition, the OIG conducts annual reviews of the PERM program and its contractors.
MIP9.2
(CMS)
As part of a national contracting strategy, adjudicated claims data and medical policies are gathered from the States for purposes of conducting medical and data processing reviews on a sample of the claims paid in each State. CMS and our contractors are working with the 17 States to ensure that the Medicaid and CHIP universe data and sampled claims are complete and accurate and contain the data needed to conduct the reviews.
MSC1
(CMS)
CMS reports the prevalence of pressure ulcers in long-stay nursing home residents with quality measures (QMs) derived from the Minimum Data Set (MDS). For this goal, we report the prevalence of pressure ulcers measured in the last three months of the fiscal year. The numerator consists of high-risk residents with a pressure ulcer, stages 2-4, on the most recent assessment. The denominator is all high-risk residents. Beginning with the FY 2012 reporting period, the data source is changing from MDS version 2.0 to MDS version 3.0. The new pressure ulcer measure excludes less serious Stage 1 pressure ulcers. The MDS, version 3.0, is the source of the data used to calculate this measure. The MDS is considered to be part of the medical record. The nursing home must maintain the MDS and submit it electronically to CMS for every resident of the certified part of the nursing home. However, MDS data are self-reported by the nursing home. MDS data quality assurance currently consists of onsite and offsite reviews by surveyors and by CMS contractors to ensure that MDS assessments are reported in a timely and complete manner.
MSC5
(CMS)
CMS reports the percentage of long-stay nursing home residents that received an antipsychotic medication with a quality measure (QM) derived from the Minimum Data Set (MDS). The MDS is the source of the data used to calculate this measure. The MDS is considered part of the medical record. The nursing home must maintain the MDS and submit it electronically to CMS for every resident of the certified part of the nursing home.


Food and Drug Administration (FDA)

Measure ID Data Source Data Validation
212409
(FDA)
CDC/FoodNet FoodNet Annual Reports are summaries of information collected through active surveillance of nine pathogens. A preliminary version of this report becomes available in the spring of each year and forms the basis of each year's Morbidity and Mortality Weekly Report (MMWR) FoodNet Surveillance. The FoodNet Final Report becomes available later in the year when current census information becomes available. The illness rates calculated for this Priority Goal use the same data and same methodology as the illness rates in the MMWR. CDC’s FoodNet system reports pathogen-specific illness data based on the calendar year, not the fiscal year. Therefore, achievement of the annual targets reported here is evaluated based on calendar year data, not fiscal year data.
214305
(FDA)
Field Data Systems These maximum capacities are extrapolated to estimate capacity in times of emergency, with the laboratory operating under abnormal conditions that are variable and uncertain. FDA and FERN work to maximize capabilities by continually improving methods and training along with increasing automated functionality and the available cache of supplies. By using these laboratories, with known instrumentation and methods, after examining sample throughput during emergencies, and after consultation with the laboratories and FDA subject matter experts, the listed sample totals are the estimates reached. The surge capacity estimates provided in the performance measures for these laboratories have been examined under the stress of emergencies and outbreaks such as the melamine contamination, the Deepwater Horizon oil spill, and the Japan nuclear event.
214306
(FDA)
BioPlex and ibis Biosensor systems CFSAN scientists have developed the means to evaluate and adapt commercially available instruments to develop and validate more rapid, accurate, and transportable tests to stop the spread of foodborne illness and cases of chemical contamination. CFSAN scientists use one such system, known as BioPlex, to rapidly serotype pathogens such as Salmonella. The BioPlex system can serotype 48 different samples in 3 to 4 hours, which vastly improves response time in foodborne illness outbreaks. CFSAN scientists also are using the ibis Biosensor system to speed the identification of Salmonella, E. coli, and other pathogens, toxins, and chemical contaminants.
223205; 223215
(FDA)
Review performance monitoring is done in terms of cohorts; e.g., the FY 2009 cohort includes applications received from October 1, 2008, through September 30, 2009. CDER uses the Document Archiving, Reporting, and Regulatory Tracking System (DARRTS), its enterprise-wide system for supporting premarket and postmarket regulatory activities and the core database upon which most mission-critical applications depend. The type of information tracked in DARRTS includes status, type of document, review assignments, status for all assigned reviewers, and other pertinent comments. CDER has in place a quality control process for ensuring the reliability of the performance data in DARRTS. Document room task leaders conduct 100 percent daily quality control of all incoming data entered by their IND and NDA technicians. Senior task leaders then conduct a random quality control check of the entered data in DARRTS. The task leader then validates that all data entered into DARRTS are correct and cross-checks the information with the original document.
234101
(FDA)
CBER’s Office of Vaccines Research and Review; and CBER’s Emerging and Pandemic Threat Preparedness Office The data are validated by the appropriate CBER offices and officials.
243201
(FDA)
Submission Tracking and Reporting System (STARS). STARS tracks submissions, reflects the Center’s target submission processing times and monitors submissions during the developmental or investigational stages and the resulting application for marketing of the product.
262401
(FDA)
NCTR Project Management System; peer review through the FDA/NCTR Science Advisory Board (SAB) and the NTP Scientific Board of Counselors; presentations at national and international scientific meetings; use of the predictive and knowledge-based systems by FDA reviewers and other government regulators; and manuscripts prepared for publication in peer-reviewed journals. NCTR provides peer-reviewed research that supports FDA's regulatory function. To accomplish this mission, it is incumbent upon NCTR to solicit feedback from its stakeholders and partners, which include FDA product centers, other government agencies, industry, and academia. The NCTR SAB, composed of non-government scientists from industry, academia, and consumer organizations, as well as subject matter experts representing all of the FDA product centers, is guided by a charter that requires an intensive review of each of the Center's scientific programs at least once every five years to ensure high-quality programs and overall applicability to FDA's regulatory needs. Scientific and monetary collaborations include Interagency Agreements with other government agencies, Cooperative Research and Development Agreements that facilitate technology transfer with industry, and informal agreements with academic institutions. NCTR also uses an in-house strategy to ensure the high quality of its research and the accuracy of data collected. Research protocols are often developed collaboratively by principal investigators and scientists at FDA product centers, following a standardized process outlined in the "NCTR Protocol Handbook." NCTR's Project Management System tracks all planned and actual expenditures on each research project. The Quality Assurance Staff monitors experiments that fall within the Good Laboratory Practices (GLP) guidelines. NCTR's annual report of research accomplishments, goals, and publications is published and available on FDA.gov. Research findings are published in peer-reviewed journals and presented at national and international scientific conferences.
280005
(FDA)
CTP’s Tobacco Inspection Management System (TIMS) is a database that contains the tobacco retail inspection data submitted by state and territorial inspectors commissioned by FDA. CTP/OCE has in place a process for ensuring the quality of the data in TIMS. OCE staff conduct random quality control checks of inspection data submitted for tobacco retail inspections where no violations were found. OCE staff conduct quality control checks for all tobacco retail inspections where potential violations were found.


Health Resources and Services Administration (HRSA)

Measure ID Data Source Data Validation
1.I.A.1
(HRSA)
HRSA Bureau of Primary Health Care's Uniform Data System Validated using over 1,000 edit checks, both logical and specific. These include checks for missing data and outliers and checks against historical data and norms.
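For illustration only, the following minimal Python sketch shows the flavor of such edit checks (a missing-data check, an outlier check, and a comparison against history); the field names and thresholds are hypothetical assumptions, not actual UDS specifications.

    # Hypothetical sketch of UDS-style edit checks on one grantee record.

    def edit_checks(record, prior_year):
        """Return a list of edit-check findings for one reporting record."""
        findings = []

        # Missing-data check: required fields must be present.
        for field in ("total_patients", "total_visits"):
            if record.get(field) is None:
                findings.append(field + ": missing value")

        # Logical/outlier check: visits per patient outside a plausible range.
        patients = record.get("total_patients")
        visits = record.get("total_visits")
        if patients and visits:
            ratio = visits / patients
            if not 1.0 <= ratio <= 20.0:  # hypothetical plausibility bounds
                findings.append("visits/patient ratio %.1f is an outlier" % ratio)

        # History check: flag large swings against the prior-year norm.
        if patients and prior_year.get("total_patients"):
            change = abs(patients - prior_year["total_patients"]) / prior_year["total_patients"]
            if change > 0.30:  # hypothetical 30 percent tolerance
                findings.append("total_patients changed %.0f%% vs. prior year" % (change * 100))

        return findings

    print(edit_checks({"total_patients": 5000, "total_visits": 120000},
                      {"total_patients": 4800}))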
1.I.A.3
(HRSA)
HRSA/Bureau of Primary Health Care contractors that perform PCMH surveys. Data validated by Health Center program staff.
1.II.B.1
(HRSA)
Uniform Data System Validated using over 1,000 edit checks, both logical and specific. These include checks for missing data and outliers and checks against historical data and norms.
4.I.C.2
(HRSA)
HRSA Bureau of Clinician Recruitment Service's Management Information Support System (BMISS). BMISS is internally managed with support from NIH, which provides: Data Management Services, Data Requests and Dissemination, Analytics, Data Governance and Quality, Project Planning and Requirements Development, Training, and Process Improvement.
6.I.C.2
(HRSA)
Annual grantee data submitted through the Bureau of Health Professions' Performance Management System. Data are entered through a web-based system that incorporates extensive validation checks. Once approved by the project officer (1st level of review), data are cleaned, validated, and analyzed by scientists within BHPr's Office of Performance Measurement (2nd level of review). Inconsistencies identified during the 2nd level of review are flagged and sent to the project officer for follow-up and correction.
10.I.A.1
(HRSA)
Vital statistics compiled by the National Center for Health Statistics, Centers for Disease Control and Prevention (CDC). Data are validated by CDC.
16.E
(HRSA)
ADAP Quarterly Report data provided by State ADAPs. Web-based data are checked through a series of internal consistency/validity checks. HIV/AIDS program staff also review submitted quarterly reports and provide technical assistance on data-related issues.
16.I.A.1
(HRSA)
HRSA HIV/AIDS Bureau's Ryan White HIV/AIDS Program Services Report This web-based data collection method communicates errors and warnings through a built-in validation process. To ensure data quality, the Program conducts data verification for all Ryan White HIV/AIDS Program Services Report (RSR) submissions. Reports detailing items in need of correction and instructions for submitting revised data are sent to grantees.
24.II.A.2
(HRSA)
Data are captured within the National Marrow Donor Program's computerized system, containing information pertaining to registered volunteer adult donors willing to donate blood stem cells to patients in need. Monthly reports are generated from the computerized system to indicate the number of registered donors (broken down by self-reported race and ethnicity). Validated by project officers analyzing comprehensive monthly reports broken down by recruitment organization. To decrease the likelihood of data entry errors, the program contractor utilizes value-protected screens and optical scanning forms.
29.IV.A.3
(HRSA)
Reported by grantees through the Program's Performance Improvement Measurement System Validated by project officers.
36.II.B.1
(HRSA)
Family Planning Annual Report (FPAR). The FPAR consists of 14 tables in which grantees report data on user demographic characteristics, user social and economic characteristics, primary contraceptive use, utilization of family planning and related health services, utilization of health personnel, and the composition of project revenues.
For this measure, FPAR Table 11: "Unduplicated number of Users Tested for Chlamydia by Age and Gender," is the data source.
The responsibility for the collection and tabulation of annual service data from Title X grantees rests with the Office of Population Affairs (OPA), which is responsible for the administration of the program. Reports are submitted annually on a calendar year basis (January 1 - December 31) to the regional offices. Grantee reports are tabulated and an annual report is prepared summarizing the regional and national data. The annual report describes the methodology used both in collection and tabulation of grantee reports, as well as the definitions provided by OPA to the grantees for use in completing data requests. Also included in the report are lengthy notes that provide detailed information regarding any discrepancies between the OPA-requested data and what individual grantees were able to provide. Data inconsistencies are first identified by the Regional Office and then submitted back to the grantee for correction. Additionally, the contractor compiling the FPAR data submits any discrepancies it finds to the Office of Family Planning (OFP) FPAR data coordinator, who works with the Regional Office to make corrections. All data inconsistencies and their resolution are noted in an appendix to the report. These are included for two reasons: (1) to explain how adjustments were made to the data and how discrepancies affect the analysis, and (2) to identify the problems grantees have in collecting and reporting data, with the goal of improving the process.


Indian Health Service (IHS)

Measure ID Data Source Data Validation
2
(IHS)
Clinical Reporting System (CRS); yearly Diabetes care and outcome audit Comparison of CRS and audit results; CRS software testing; quality assurance review of site submissions
18
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions
20
(IHS)
IHS-operated hospitals and clinics report the accrediting body, the length of accreditation, and other significant information about their accreditation status to the IHS Headquarters, Office of Resource Access and Partnerships, which maintains a List of Federal Facilities - Status of Accreditation. The Joint Commission, AAAHC, CMS, and non-governmental organizations maintain lists of certified and accredited facilities at their public websites. Visit the Joint Commission website at http://www.qualitycheck.org/CertificationList.aspx. Visit the Accreditation Association for Ambulatory Health Care at http://www.aaahc.org/eweb/dynamicpage.aspx?site=aaahc_site&webcode=find_orgs.
24
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions; Immunization program reviews
30
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions
TOHP-SP
(IHS)
Routine IHS Tribal consultation documentation for the HHS consultation report and the IHS Director's Activities database serve as both the data source and the basis for validation.


Immediate Office of the Secretary (IOS)

Measure ID Data Source Data Validation
1.1
(IOS)
The data sources are: 1) the number of HHS federal advisory committees (per data released from the HHS Office of the White House Liaison); 2) all challenges and competitions made available to members of the public at http://challenge.gov/HHS; and 3) the number of HHS API-enabled databases (see http://www.hhs.gov/digitalstrategy/). Collection is based on annual data reported by the HHS Office of the White House Liaison, the HHS Open Innovation Manager, and the Digital Communications Division.
1.2
(IOS)
Quarterly reports on data via Data.Gov submissions and HHS data calls Datasets available at www.healthdata.gov and also tracked by the Director of the Health Data Initiative
1.3
(IOS)
HHS Innovation Council and HHS IDEA Lab administrative records. Many of the collaborations and resulting projects are listed on the HHS IDEA Lab website at http://www.hhs.gov/idealab/
1.4
(IOS)
The Open Innovation Manager's Database The HHS Open Innovation Manager tracks all challenges issued by HHS
1.5
(IOS)
Data are collected and maintained by HHS IDEA Lab staff Descriptions of innovative solutions are available on the HHS IDEA Lab website at http://www.hhs.gov/idealab/
1.6
(IOS)
NLM PubMed Central Database The validation is based on quarterly database queries. A listing of available articles and participating journals is available at http://www.ncbi.nlm.nih.gov/pmc/


National Institutes of Health (NIH)

Measure ID Data Source Data Validation
CBRR-1.1; CBRR-1.2
(NIH)
Doctorate Records File and the NIH IMPAC II database Analyses of career outcomes for predoctoral and postdoctoral NRSA participants, compared to individuals who did not receive NRSA support, using the Doctorate Records File and the NIH IMPAC II administrative database.

Contact:
Jennifer Sutton
Program Policy and Evaluation Officer
Office of Extramural Programs
(301) 435-2686
CBRR-10
(NIH)
Publications, databases, administrative records and/or public documents. NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities. The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports. Scientific journals use a process of peer review prior to publishing an article. Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

For further information on Probe Reports from the NIH Molecular Libraries Program: http://www.ncbi.nlm.nih.gov/books/NBK47352/

For further information on the Molecular Libraries Program:
http://commonfund.nih.gov/molecularlibraries/

For further information on the S1P1 compound in clinical development: http://ir.receptos.com/releases.cfm

Macrophage models of Gaucher disease for evaluating disease pathogenesis and candidate drugs. Aflaki E, Stubblefield BK, Maniwang E, Lopez G, Moaven N, Goldin E, Marugan J, Patnaik S, Dutra A, Southall N, Zheng W, Tayebi N, Sidransky E. Sci Transl Med. 2014 Jun 11;6(240):240ra73. doi: 10.1126/scitranslmed.3008659.
http://stm.sciencemag.org/content/6/240/240ra73.full.html
 
Cytokine storm plays a direct role in the morbidity and mortality from influenza virus infection and is chemically treatable with a single sphingosine-1-phosphate agonist molecule. Oldstone MB, Rosen H. Curr Top Microbiol Immunol. 2014;378:129-47. doi: 10.1007/978-3-319-05879-5_6. Review. PMID: 24728596.
http://www.ncbi.nlm.nih.gov/pubmed/24728596
SRO-3.9
(NIH)
Publications, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities. The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports. Scientific journals use a process of peer review prior to publishing an article. Through this rigorous process, other experts in the author's field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

For additional information contact: NIAMS SPPB (Reaya Reuss, 301-496-8271, [email protected])

Liu Y, Ramot Y, Torrelo A, Paller AS, Si N, Babay S, Kim PW, Sheikh A, Lee C-CR, Chen Y, Vera A, Zhang X, Goldbach-Mansky R*, Zlotogorski A*. (2012). Mutations in PSMB8 Cause CANDLE Syndrome with Evidence of Genetic and Phenotypic Heterogeneity. Arthritis Rheum. March; 64(3): 895-907. (PMID: 21953331)

Brehm A, Liu Y, Sheikh A, Omoyinmi E, Biancotto A, Zhou Q, Montealegre G, Reinhardt A, de Jesus AA, Pelletier M, Tsai WL, Remmers EF, Kardava L, Hill S, Kim H, Lachmann HL, Megarbane A, Chae JJ, Brady J, Castillo RD, Brown D, Vera Casano A, Ling G, Plass N, Chapelle D, Huang Y, Stone D, Chen Y, Lee C-CR, Kastner DL, Torrelo A, Zlotogorski A, Moir S, Gadina M, McCoy P, Rother, Hildebrand PW, Brogan P, Aksentijevich I,* Krüger E,* Goldbach-Mansky R*. Diverse proteasome subunit mutations link proteasome dysfunction and type I interferon induction in PRAAS/CANDLE (Manuscript in preparation).

Montealegre Sanchez G, de Jesus AA, Reinhardt A, Brogan P, Berkun Y, Brown D, Chira P, Gao L, Chapelle D, Plass N, Kim H, Davis M, Liu Y, Huang Y, Lee C-C, Hadigan C, Heller T, Zlotogorski A, O’Shea JJ, Hill S, Rother K, Gadina M, Goldbach-Mansky R. (2013). Chronic Atypical Neutrophilic Dermatosis with Lipodystrophy and Elevated Temperatures (CANDLE): Clinical Characterization and Initial Response to Janus Kinase Inhibition with Baricitinib. (ACR oral presentation) (Abstract)

Kim H, Brooks S, Liu Y, de Jesus AA, Montealegre Sanchez G, Chappelle D, Plass N, Huang Y, Goldbach-Mansky R. Validation of a Novel IFN-Regulated Gene Score As Biomarker in Chronic Atypical Neutrophilic Dermatosis with Lipodystrophy and Elevated temperature (CANDLE) Patients on Baricitinib, a Janus Kinase 1/2 Inhibitor, a Proof of Concept (ACR abstract and oral presentation) (Abstract)
SRO-5.2
(NIH)
Publications, databases, administrative records and/or public documents. NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities.  The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports.  Scientific journals use a process of peer review prior to publishing an article.  Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

Risk loci for chronic obstructive pulmonary disease: a genome-wide association study and meta-analysis. Cho M, et al., The Lancet Respiratory Medicine 2(3); 214-225, March 2014.

NHLBI contact: Dr. Lisa Postow [email protected]
SRO-5.5
(NIH)
Publications, databases, administrative records and/or public documents. NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities.  The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports.  Scientific journals use a process of peer review prior to publishing an article.  Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

For additional information contact: NIBIB OSPC (Christine Cooper, 301-594-8923, [email protected])
SRO-5.13
(NIH)
Publications, databases, administrative records, and/or public documents.
http://www.epa.gov/ncct/Tox2/
NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities.  The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports.  Scientific journals use a process of peer review prior to publishing an article.  Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.
  1. Attene-Ramos MS, Miller N, Huang R, Michael S, Itkin M, Kavlock RJ, Austin CP, Shinn P, Simeonov A, Tice RR, Xia M. The Tox21 robotic platform for assessment of environmental chemicals - from vision to reality. Drug Discovery Today 2013; 18:716-723.
  2. Myers SL, Yang CZ, Bittner GD, Witt KL, Tice RR, Baird DD. Estrogenic and anti-estrogenic activity of off-the-shelf hair and skin care products. Journal of Exposure Science and Environmental Epidemiology 2014; 1–7.
  3. Attene-Ramos MS, Huang R, Michael S, Witt KL, Richard A, Tice RR, Simeonov A, Austin CP, Xia M. Profiling of the Tox21 chemical collection for mitochondrial function to identify compounds that acutely decrease mitochondrial membrane potential. Environ Health Perspect 2014; DOI:10.1289/ehp.1408642.
  4. Huang R, Sakamuru S, Martin MT, Reif DM, Judson RS, Houck KA, Casey W, Hsieh J-H, Shockley K, Ceger P, Fostel J, Witt KL, Tong W, Rotroff DM, Zhao T, Shinn P, Simeonov A, Dix DJ, Austin CP, Kavlock RJ, Tice RR, Xia M. Profiling of the Tox21 10K compound library for agonists and antagonists of the estrogen receptor alpha signaling pathway. Sci. Rep. 2014; 4:5664; DOI:10.1038/srep05664.
SRO-6.4
(NIH)
Publications, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities.  The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports.  Scientific journals use a process of peer review prior to publishing an article.  Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.
Progress Reports for the following grant numbers: HL109146, HL109152, HL109164, HL109168, HL109172, HL109250, HL109257, and HL109086

Phenotype of asthmatics with increased airway S-nitrosoglutathione reductase activity
Nadzeya V et al., ERJ Express. Published on October 30, 2014 as doi: 10.1183/09031936.00042414

Regional Ventilation Changes in Severe Asthma after Bronchial Thermoplasty with 3He MR Imaging and CT, Thomen R et al., RSNA Radiology, DOI: http://dx.doi.org/10.1148/radiol.14140080

Improved CT-based estimate of pulmonary gas trapping accounting for scanner and lung-volume variations in a multicenter asthmatic study. Choi S et al., J Appl Physiol 117: 593-603, Aug 2014, doi:10.1152/japplphysiol.00280.2014
SRO-8.2
(NIH)
  NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities.  The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports.  Scientific journals use a process of peer review prior to publishing an article.  Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

Smith, RJ, Lobo, MK, Spencer, S, Kalivas, PW, Cocaine-induced adaptations in D1 and D2 accumbens projection neurons (a dichotomy not necessarily synonymous with direct and indirect pathways). Curr Opinion Neurobiol, 2013, 10:546-552.

Stamatakis, AM, Jennings, JH, Ung, RL, Glair, GA, Weinberg, RJ, Neve, RL, Boyce, F, Mattis, J, Ramakrishnan, C, Deisseroth, K, Stuber, GD. A unique population of ventral tegmental area neurons inhibits the lateral habenula to promote reward. Neuron, 2013, 80:1039-1053.

Stamatakis, AM, Sparta, DR, Jennings, JH, McElligott, ZA, Decot, H, Stuber, GH. Amygdala and bed nucleus of the stria terminalis circuitry: Implications for addiction-related behaviors. Neuropharmacology 2014, 76 PtB:320-328.

Stuber, GD, Britt, JP, Bonci, A. Optogenetic modulation of neural circuits that underlie reward seeking. Biol. Psychiat., 2012, 71:1061-1067.

Britt, JP, Bonci, A. Optogenetic interrogations of the neural circuits underlying addiction. Curr. Opinion Neurobiol, 2013; 23:539-545.

Ferguson, SM,. Phillips, PEM, Roth, BL, Wess, J, Neumaier, JF. Direct-pathway striatal neurons regulate the retention of decision-making strategies. J. Neurosci., 2013, 33:11668-11676

Barrot, M, Sesack, SR, Georges, F, Pistis, M, Hong, S, Jhou, TC. Braking dopamine systems: a new GABA master structure for mesolimbic and nigrostriatal functions. J. Neurosci., 2012, 32:14094-14101.

Jhou, TC, Good, CH, Rowley, CS, Xu, SP, Wang, H, Burnham, NW, Hoffman, AF, Lupica, CR, Ikemoto, S. Cocaine drives aversive conditioning via delayed activation of dopamine-responsive habenular and midbrain pathways. J. Neurosci., 2013, 33:7501-7512.

Jennings, JH, Sparta, DR, Stamatakis, AM, Ung, RL, Pleil, KE, Kash, TL, Stuber, GD. Distinct extended amygdala circuits for divergent motivational states. Nature, 2013, 496:224-230.
SRO-8.7
(NIH)
Publications, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities. The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports. Scientific journals use a process of peer review prior to publishing an article. Through this rigorous process, other experts in the author's field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

Isaac C. Rhew, Eric C. Brown, J. David Hawkins, and John S. Briney. Sustained Effects of the Communities That Care System on Prevention Service System Transformation. American Journal of Public Health: March 2013, Vol. 103, No. 3, pp. 529-535.

Greenberg MT, Feinberg ME, Johnson LE, Perkins DF, Welsh JA, Spoth RL. Factors That Predict Financial Sustainability of Community Coalitions: Five Years of Findings from the PROSPER Partnership Project. Prevention Science; 2014 Apr 6. [Epub ahead of print]

Mercer, SH.; McIntosh, K; Strickland-Cohen, MK; Horner, RH. Measurement invariance of an instrument assessing sustainability of school-based universal behavior practices. School Psychology Quarterly, Vol 29(2), Jun 2014, 125-137.

Szonja Vamos, MS, Miriam Mumbi, RN, Ryan Cook, BA, Ndashi Chitalu, MD, Stephen Marshall Weiss, PhD, MPH, and Deborah Lynne Jones, PhD. Translation and sustainability of an HIV prevention intervention in Lusaka, Zambia. Transl Behav Med. Jun 2014; 4(2): 141–148. Published online Sep 11, 2013.


Office of the Assistant Secretary for Health (OASH)

Measure ID Data Source Data Validation
1.5
(OASH)
Commerce and Census  


Office of Medicare Hearings and Appeals (OMHA)

Measure ID Data Source Data Validation
1.1.5
(OMHA)
Appellate Climate Survey The most recent version of the survey was administered by a third-party contractor using a stratified random sample of appellants whose cases were closed within fiscal year 2014. The survey was designed to collect appellant demographic information; overall satisfaction; satisfaction with hearing format; satisfaction with other aspects (e.g., scheduling, clarity of case processing documents, interaction with the ALJ team after scheduling and prior to the hearing, and use of the OMHA website); and possible predictors of satisfaction (e.g., whether the case was fully heard and considered, ALJ behavior).


Office of the National Coordinator for Health Information Technology (ONC)

Measure ID Data Source Data Validation
1.A.2; 1.B.4
(ONC)
Centers for Disease Control and Prevention, National Center for Health Statistics, National Ambulatory Medical Care Survey, Electronic Medical Record Supplement The NAMCS is nationally representative of office-based physicians. Historically, the response rate is approximately 68%. Beginning with survey year 2010, the survey allows ONC to evaluate trends in electronic health record adoption by region, provider specialty, and state. Estimates for FYs 2008-11 for this measure derive from the mail supplement to the NAMCS.


Substance Abuse and Mental Health Services Administration (SAMHSA)

Measure ID Data Source Data Validation
2.3.21
(SAMHSA)
National Survey on Drug Use and Health State estimates Performance results are based on state-level estimates obtained via the National Survey on Drug Use and Health (NSDUH). State estimates are entered by each SPF SIG grantee into the Prevention Management and Reporting Tool (PMRTS). Validation and verification checks are run on the data as they are being entered. Automated programs identify typical data errors such as missing data and outliers. Additionally, the data management team analyzes data to calculate annual performance results. Data are carefully cleaned using the pre-established Uniform Coding Conventions. Data abnormalities are communicated to the GPOs and grantees via cleaning sheets for explanation or correction. The data management team responsible for these data ensures that required fields are complete and that all edits are made. The SPF SIG cross-site evaluation team performs analyses and generates reports annually and on an ad hoc basis as needed. Information about methodology for the NSDUH is available at http://www.samhsa.gov/data/Methodological_Reports.aspx. The final year of data (2014 data to be reported at the end of 2015) will come from a different data source.
2.3.56
(SAMHSA)
Data are collected through dosage forms and/or participant-level instruments using standardized items. These data are collected and provided by grantees online through an electronic system called the Performance Management Reporting and Training system (PMRTS). All grantees receive training in using the system and are provided technical assistance on the data to be collected. These data are then cleaned and analyzed by an analytic contractor. In FY 2012, the Data Collection Analysis and Reporting (DCAR) contract was awarded and serves the data collection, cleaning, analysis, and reporting functions. The data to be reported in August 2015 will come from a different contract, the Pep C contract. FY 2013 data have been carefully collected, cleaned, analyzed, and reported by SAMHSA's DCAR contract. After data are entered into the PMRTS, the system uses automated programs to do the initial data cleaning (identifying outliers, missing data, etc.); the Data Management Team then reviews the data for completeness and accuracy. Information on any data problems identified is transmitted through "cleaning sheets" to the Government Project Officer (GPO) and the grantee to resolve. The Data Management Team then makes any required edits to the files, following the extensive and detailed Uniform Coding Conventions. The edited files are then sent to SAMHSA staff and the Data Analysis Team for analysis and reporting.
2.3.61
(SAMHSA)
The number of calls answered is reported in the National Suicide Prevention LifeLine Monthly Report Specialists in information technology at the National Suicide Prevention LifeLine evaluation center validate phone records received from Sprint to determine the number of calls received and answered at 1-800-273-TALK.
3.2.02a
(SAMHSA)
TRAC on-line data reporting and collection system. This measure is part of the phase one transition into the Consolidated Data Platform (CDP). All TRAC data are automatically checked as they are input into the TRAC system. Validation and verification checks are run as the data are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the TRAC database.
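Purely as an illustration of these two kinds of checks, and not the TRAC system's actual rules, a range check and a skip-pattern check might be sketched in Python as follows; the item names and bounds are hypothetical assumptions.

    # Hypothetical sketch of range and skip-pattern validation on data entry.

    def validate_entry(entry):
        """Return the errors that would block an entry from being saved."""
        errors = []

        # Range check: a value outside the allowable range is rejected.
        if not 0 <= entry.get("age", -1) <= 120:
            errors.append("age out of range")

        # Skip-pattern check: if no services were received, the follow-up
        # satisfaction item should have been skipped (left unanswered).
        if entry.get("received_services") == "no" and entry.get("satisfaction") is not None:
            errors.append("skip-pattern violation: satisfaction answered without services")

        return errors  # an empty list means the entry may be saved

    print(validate_entry({"age": 34, "received_services": "no", "satisfaction": 4}))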
3.2.16
(SAMHSA)
TRAC through 2014, with transition to the Consolidated Data Platform (CDP) in early 2015. All data are automatically checked as they are input to TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.2.26
(SAMHSA)
TRAC on-line data reporting and collection system. This measure is part of the phase 1 transition into the Consolidated Data Platform (CDP). Phase 1 of the CDP begins in early 2015. All TRAC data are automatically checked as they are input into TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.2.30
(SAMHSA)
Data on children's outcomes were reported in the grantees' ED524 Bi-Annual Report submitted to their GPO every six months. The methods for collecting these measures varied by grantee, but were generally student self-report for the violence and substance use measures and school records for attendance and mental health services. Grantees implement various forms of data validation as part of their local evaluations. To establish the accuracy and reliability of data used to measure outcome performance, local evaluators require double entry of data, code range checks into the data entry program, or assess concurrent validity with other measures of the same indicator, among other approaches.
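As an illustration of the double-entry approach mentioned above, and not any grantee's actual system, the comparison step can be sketched as follows; the field names are hypothetical assumptions.

    # Hypothetical sketch of a double-entry check: the same form is keyed
    # twice and the two versions are compared field by field.

    def double_entry_diff(first_pass, second_pass):
        """Return the fields where the two independent entries disagree."""
        fields = set(first_pass) | set(second_pass)
        return {f: (first_pass.get(f), second_pass.get(f))
                for f in fields
                if first_pass.get(f) != second_pass.get(f)}

    entry_a = {"student_id": "1017", "days_attended": 171, "referred": "yes"}
    entry_b = {"student_id": "1017", "days_attended": 117, "referred": "yes"}
    print(double_entry_diff(entry_a, entry_b))  # flags days_attended for re-keying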
3.2.37
(SAMHSA)
This measure will be in phase one of the CDP transition.  
3.2.50
(SAMHSA)
Until SAMHSA implements the Consolidated Data Platform (CDP) during 2015 and 2016, the data will be collected and tabulated by hand on an ED524 form and then submitted to SAMHSA on an annual basis. Grantees implement various forms of data validation as part of their local evaluations. To establish the accuracy and reliability of data used to measure outcome performance, local evaluators review the data, perform range checks, and otherwise assess concurrent validity by comparing data and measures.
3.4.02
(SAMHSA)
Data are collected through standard instruments and submitted through the TRAC on-line data reporting and collection system. This measure is included in phase one of SAMHSA's CDP transition. All TRAC data are automatically checked as they are input into TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.4.15
(SAMHSA)
TRAC on-line data reporting and collection system. TRAC is being discontinued in 2015; this measure, which draws on block grant data, is not part of phase one of the CDP. All TRAC data are automatically checked as they are input into TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.4.20
(SAMHSA)
Data are submitted annually to SAMHSA by States, which obtain the information from local human service agencies that provide services. SAMHSA's CMHS has developed additional error checks to screen data and contacts States and local providers concerning accuracy when data are reported outside expected ranges. CMHS has also issued guidance to all States and localities on data collection and monitors compliance with data collection through increased site visits to local PATH-funded agencies.
3.4.21
(SAMHSA)
Data are derived from standardized annual Program Performance Reports in which grantees estimate the potential number of individuals impacted through a pre-defined list of 7 possible interventions (e.g., group advocacy non-litigation, facility monitoring services, class litigation). This measure is not part of the CDP phase one transition. The information provided in the annual reports is checked for reliability during on-site PAIMI Program visits, annual reviews, and budget application reviews.
3.4.24; 3.4.25
(SAMHSA)
Services Accountability Improvement System (SAIS). These data are included in Phase 1 of the Consolidated Data Platform (CDP) and will be stored in that system starting in 2015. All data are automatically checked as they are input to SAIS. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
4.4.10
(SAMHSA)
Data are collected from the contractors who manage the Treatment Locator and SAMHDA websites via standard tracking software measuring unique hits. These numbers are provided to the COTRs via email at the end of each month and on January 2 of the following year. Validation checks are reviewed at that time. Data are maintained by the COTRs for each project.
