FY 2017 Annual Performance Plan and Report - Data Sources and Validation

Fiscal Year 2017
Released February 2016
 

Data Sources and Validation

Reporting HHS Agencies: ACF | ACL | AHRQ | ASA | ASPR | CDC | CMS | FDA | HRSA | IHS | IOS | NIH | OASH | OMHA | ONC | SAMHSA


Administration for Children and Families (ACF)

Measure ID Data Source Data Validation
1.1LT and 1A; 1.1LT and 1B
(ACF)
State LIHEAP Household Report and Census Bureau’s Annual Social and Economic Supplement (ASEC) to the Current Population Survey ACF obtains weighted national estimates of LIHEAP income-eligible (low-income) households from the Census Bureau’s Annual Social and Economic Supplement (ASEC) to the Current Population Survey.  Specialized tabulations are developed to select those ASEC households that would be eligible under the federal LIHEAP maximum income cutoff of the greater of 150 percent of HHS’ Poverty Guidelines or 60 percent of HHS’ state median income estimates. [1]  The weighted estimates include data on the number of those households having at least one member who is 60 years or older and the number of those households having at least one member who is five years or younger.  The estimates are subject to sampling variability.  The Census Bureau validates ASEC data.
 
ACF aggregates data from the states’ annual LIHEAP Household Report to obtain a national count of LIHEAP households that receive heating assistance.  The count includes data on the number of households having at least one member who is 60 years or older and the number of households having at least one member who is five years or younger.  The aggregation and editing of state-reported LIHEAP recipiency data for the previous fiscal year are typically completed in November of the following fiscal year.  Consequently, the data are not available in time to modify ACF interventions prior to the current fiscal year (i.e., there is at least a one-year data lag).  Some electronic data validation occurs now that ACF uses a web-based system for grantees to submit and validate their data for the LIHEAP Household Report.  ACF also cross-checks the data against LIHEAP benefit data obtained from the states’ submission of the annual LIHEAP Grantee Survey on sources and uses of LIHEAP funds.

 


[1] Congress raised the federal LIHEAP maximum income cutoff to 75 percent of state median income (SMI) for both FY 2009 and FY 2010.  Most states did not elect to use 75 percent of SMI.  With the enactment of P.L. 112-10 on April 15, 2011, grantees could qualify only those LIHEAP applicants who were income eligible up to the 60 percent of SMI threshold; the use of 75 percent of SMI was no longer allowable.  The resulting cutoff rule is sketched below.
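
For illustration only, the federal maximum income cutoff described above reduces to a simple comparison. The following Python sketch uses invented placeholder figures rather than official HHS Poverty Guidelines or state median income estimates.

    def liheap_income_cutoff(poverty_guideline, state_median_income):
        """Federal LIHEAP maximum income cutoff: the greater of 150 percent
        of the HHS Poverty Guidelines or 60 percent of the state median
        income estimate."""
        return max(1.50 * poverty_guideline, 0.60 * state_median_income)

    def is_income_eligible(household_income, poverty_guideline, state_median_income):
        # A household is income eligible under the federal maximum if its
        # income does not exceed the cutoff.
        return household_income <= liheap_income_cutoff(poverty_guideline,
                                                        state_median_income)

    # Placeholder figures for a hypothetical household size and state.
    print(is_income_eligible(30000, poverty_guideline=20000,
                             state_median_income=60000))  # True: cutoff is 36,000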

2B
(ACF)
Biennial CCDF Report of State Plans The CCDF State Plan preprint requires states to provide information about their progress in implementing the program components related to quality rating and improvement systems (QRIS). CCDF State Plans are submitted on a biennial basis. To collect data in years when CCDF State Plans are not submitted, states and territories provide updates using the same questions as the CCDF State Plan to ensure data consistency.  Starting with the FY 2016 actual result, State Plans will be reported on a triennial basis per the Child Care Reauthorization.
3.6LT and 3B
(ACF)
Program Information Report (PIR) Data collection for the PIR is automated to improve efficiency in the collection and analysis of data. Head Start achieves a 100 percent response rate annually from 2,600 respondents. The Office of Head Start also monitors grantees closely through reviews of Head Start and Early Head Start programs, which examine and track compliance with the Head Start Program Performance Standards at least every three years for each program. Teams of ACF Regional Office and Central Office staff, along with trained reviewers, conduct more than 500 on-site reviews each year. The automated data system provides trend data so that the team can examine strengths and weaknesses in all programs.
3A
(ACF)
Classroom Assessment Scoring System (CLASS: Pre-K) CLASS: Pre-K is a valid and reliable tool that uses observations to rate the interactions between adults and children in the classroom.  Reviewers who have achieved the standard of reliability assess classroom quality by rating multiple dimensions of teacher-child interaction on a seven-point scale (with scores of one to two in the low range, three to five in the mid-range, and six to seven in the high range of quality).  ACF will implement ongoing training for CLASS: Pre-K reviewers to ensure their continued reliability. Periodic double-coding, in which two reviewers score the same observation, will also be used to ensure reviewers remain reliable in their scoring.
3D
(ACF)
Program Information Report (PIR) The PIR is a survey of all grantees that provides comprehensive data on Head Start, Early Head Start and Migrant Head Start programs nationwide. Data collection for the PIR is automated to improve efficiency in the collection and analysis of data. Head Start achieves a 100 percent response rate annually from 2,600 respondents. The automated data system provides trend data so that the team can examine strengths and weaknesses in all programs.
3E
(ACF)
Program Information Report (PIR) The PIR is a survey of all grantees that provides comprehensive data on Head Start, Early Head Start and Migrant Head Start programs nationwide.  Data collection for the PIR is automated to improve efficiency in the collection and analysis of data. Head Start achieves a 100 percent response rate annually from 2,600 respondents.  The automated data system provides trend data so that the team can examine strengths and weaknesses in all programs.
4.1LT and 4A
(ACF)
The Runaway and Homeless Youth Management Information System (RHYMIS) RHYMIS incorporates numerous business rules and edit checks, provides a hotline/help desk, and undergoes continuous improvement and upgrading. Extensive cleanup and validation of data take place after each semi-annual transfer of data from grantee systems into the national database. The grantee reporting response rate has exceeded 96 percent in each of the past five years.
7D
(ACF)
State Annual Reports States are required to submit an Annual Report addressing each of the CBCAP performance measures outlined in Title II of CAPTA. One section of the report must “provide evaluation data on the outcomes of funded programs and activities.” The 2006 CBCAP Program Instruction adds a requirement that states must also report on the OMB performance measures reporting requirements and national outcomes for the CBCAP program. States were required to report on this efficiency measure starting in December 2006. The three percent annual increase represents an ambitious target, since this is the first time the program has required funded programs to target their funding toward evidence-based and evidence-informed programs, and it will take time for states to adjust their funding priorities to meet these requirements.
7P1
(ACF)
Adoption and Foster Care Analysis and Reporting System (AFCARS) States report child welfare data to ACF through AFCARS. All state semi-annual AFCARS data submissions undergo extensive edit-checks for validity. The results of the AFCARS edit-checks for each of the six-month data submissions are automatically generated and sent back to each state, to help the state to improve data quality. Many states submit revised data to ensure that accurate data are submitted, often for more than one prior submission period. The Children’s Bureau has conducted AFCARS compliance reviews in all states. All states reviewed were required to undertake a comprehensive AFCARS Improvement Plan (AIP). States’ Statewide Automated Child Welfare Information Systems (SACWIS) are undergoing reviews to determine the status of their operation and the system’s capability of reporting accurate AFCARS data. To speed improvement in these data, the agency provides technical assistance to states to improve reporting to AFCARS, improve statewide information systems, and to make better use of their data. All of these activities should continue to generate additional improvements in the data over the next few years.
7P2
(ACF)
Adoption and Foster Care Analysis and Reporting System (AFCARS) States report child welfare data to ACF through AFCARS. All state semi-annual AFCARS data submissions undergo extensive edit-checks for validity. The results of the AFCARS edit-checks for each of the six-month data submissions are automatically generated and sent back to each state, to help the state to improve data quality. Many states submit revised data to ensure that accurate data are submitted, often for more than one prior submission period. The Children’s Bureau has conducted AFCARS compliance reviews in all states. All states reviewed were required to undertake a comprehensive AFCARS Improvement Plan (AIP). States’ Statewide Automated Child Welfare Information Systems (SACWIS) are undergoing reviews to determine the status of their operation and the system’s capability of reporting accurate AFCARS data. To speed improvement in these data, the agency provides technical assistance to states to improve reporting to AFCARS, improve statewide information systems, and to make better use of their data. All of these activities should continue to generate additional improvements in the data over the next few years.
7Q
(ACF)
Adoption and Foster Care Analysis and Reporting System (AFCARS) States report child welfare data to ACF through AFCARS. All state semi-annual AFCARS data submissions undergo extensive edit-checks for validity. The results of the AFCARS edit-checks for each of the six-month data submissions are automatically generated and sent back to each state, to help the state to improve data quality. Many states submit revised data to ensure that accurate data are submitted, often for more than one prior submission period. The Children’s Bureau has conducted AFCARS compliance reviews in all states. All states reviewed were required to undertake a comprehensive AFCARS Improvement Plan (AIP). States’ Statewide Automated Child Welfare Information Systems (SACWIS) are undergoing reviews to determine the status of their operation and the system’s capability of reporting accurate AFCARS data. To speed improvement in these data, the agency provides technical assistance to states to improve reporting to AFCARS, improve statewide information systems, and to make better use of their data. All of these activities should continue to generate additional improvements in the data over the next few years.
7S
(ACF)
Regulatory Title IV-E Foster Care Eligibility Reviews Data validation occurs on multiple levels. Information collected during the onsite portion of the review is subject to quality assurance procedures to assure the accuracy of the findings of substantial compliance, and reports are carefully examined by Children’s Bureau Central and Regional Office staff for accuracy and completeness before a state report is finalized. Through the error rate contract, data are systematically monitored and extensively checked to make sure the latest available review data on each state are incorporated and updated to reflect rulings by the Departmental Appeals Board and payment adjustments from state quarterly fiscal reports. This ensures the annual program error rate estimates accurately represent each state’s fiscal reporting and performance for specified periods. The Children’s Bureau also has a database (maintained by the contractor) that tracks all key milestones for the state eligibility reviews.
12B
(ACF)
CSBG Information System (CSBG/IS) survey administered by the National Association for State Community Services Programs (NASCSP) The Office of Community Services (OCS) and NASCSP worked to ensure that the survey captures the required information. The Block Grant allows states to have different program years; this can create a substantial time lag in preparing annual reports. States and local agencies are working toward improving their data collection and reporting technology. To improve the timeliness and accuracy of these reports, NASCSP and OCS provide states with training and with better survey tools and reporting processes.
14D
(ACF)
Family Violence Prevention and Services Program Performance Progress Report Form Submission of this report is a program requirement. The outcome measures and the means of data collection were developed with extensive input from researchers and the domestic violence field. The forms, instructions, and several types of training have been given to states, tribes and domestic violence coalitions.
16.1LT and 16C
(ACF)
Matching Grant Progress Report forms Data are validated with methods similar to those used with Performance Reports. Data are validated by periodic desk and on-site monitoring, in which refugee cases are randomly selected and reviewed. During on-site monitoring, outcomes reported by service providers are verified with both employers and refugees to ensure accurate reporting of job placements, wages, and retentions. All of the grantees use database systems (online or manual) for data collection and monitoring of their program service locations.
18.1LT and 18A
(ACF)
Performance Report (Form ORR-6) Data are validated by periodic desk and on-site monitoring, in which refugee cases are randomly selected and reviewed. During on-site monitoring, outcomes reported by service providers are verified with both employers and refugees to ensure accurate reporting of job placements, wages, and retentions.
20C
(ACF)
Office of Child Support Enforcement (OCSE) Form 157 States currently maintain information on the necessary data elements for the above performance measures. All states were required to have a comprehensive, statewide, automated Child Support Enforcement system in place by October 1, 1997. Fifty-three states and territories were certified under the Family Support Act and the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) as of July 2007; the remaining state is in systems development.  Certification requires states to meet the automation systems provisions of the specific act. Continuing implementation of these systems, in conjunction with cleanup of case data, will improve the accuracy and consistency of reporting. As part of OCSE’s audit of performance data, OCSE auditors review each state’s and territory’s ability to produce valid data. Data reliability audits are conducted annually. Self-evaluation by states and OCSE audits provide an ongoing review of the validity of data and the ability of automated systems to produce accurate data. Each year OCSE auditors review the data that states report for the previous fiscal year. The OCSE Office of Audit has completed the FY 2012 data reliability audits.  Since FY 2001, the data reliability audit standard for reliable data has been 95 percent.
22B
(ACF)
National Directory of New Hires (NDNH) Beginning with performance in FY 2001, the above employment measures (job entry, job retention, and earnings gain) are based solely on performance data obtained from the NDNH. Data are updated by states, and data validity is ensured with normal auditing functions for submitted data. Prior to use of the NDNH, states had flexibility in the data source(s) they used to obtain wage information on current and former TANF recipients under high performance bonus (HPB) specifications for performance years FY 1998 through FY 2000. ACF moved to this single-source national database (NDNH) to ensure equal access to wage data and uniform application of the performance specifications.

 

Administration for Community Living (ACL)

Measure ID Data Source Data Validation
1.1
(ACL)
State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data.
 
2.6
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients currently participating in OAA programs. These high cooperation rates result from several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy: weighting the sample of agencies and clients by the inverse of the probability of selection, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratifying to control totals to ensure consistency with official administrative records.
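As a minimal sketch of the three post-survey weighting steps just described (inverse-probability base weights, a nonresponse adjustment, and post-stratification to control totals), the following Python fragment uses invented values; it is not the actual AoA survey procedure.

    # 1. Base weight: inverse of the probability of selection.
    selection_prob = 0.002              # e.g., client sampled with probability 1/500
    base_weight = 1.0 / selection_prob  # 500.0

    # 2. Nonresponse adjustment: inflate respondent weights within a
    #    weighting class by the inverse of that class's response rate.
    response_rate = 0.90
    nr_adjusted_weight = base_weight / response_rate

    # 3. Post-stratification: ratio-adjust so weighted totals match an
    #    official control total for the stratum.
    control_total = 120000        # administrative count for the stratum
    weighted_sum = 110000         # sum of adjusted weights in the stratum
    final_weight = nr_adjusted_weight * (control_total / weighted_sum)

    print(round(final_weight, 1))  # 606.1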
2.9a
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients currently participating in OAA programs. These high cooperation rates result from several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy: weighting the sample of agencies and clients by the inverse of the probability of selection, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratifying to control totals to ensure consistency with official administrative records.
2.9b
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients currently participating in OAA programs. These high cooperation rates result from several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy: weighting the sample of agencies and clients by the inverse of the probability of selection, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratifying to control totals to ensure consistency with official administrative records.
2.9c
(ACL)
National Survey of Older Americans Act Participants. ACL's Administration on Aging (AoA) national survey uses a range of quality assurance procedures, covering all steps in the survey process, to validate data on OAA participants and services. The surveys have consistently achieved a cooperation rate of over 80% for the sampled Area Agencies on Aging and over 90% for the sample of clients currently participating in OAA programs. These high cooperation rates result from several important steps in the quality assurance process, including intensive follow-up to contact and interview as many service participants as possible and calling back at times that are convenient for respondents. After the surveys are complete, range and consistency checks and edits, in conjunction with the CATI software applications, ensure that only correct responses appear in the data files. The data are weighted during three post-survey steps to ensure accuracy: weighting the sample of agencies and clients by the inverse of the probability of selection, adjusting for any non-response patterns and bias that might otherwise occur, and post-stratifying to control totals to ensure consistency with official administrative records.
2.10
(ACL)
State Program Report and National Survey of Older Americans Act Participants. This is a composite measure that utilizes data from multiple sources: the State Program Report and the National Survey. State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data. The National Survey draws a sample of Area Agencies on Aging to obtain a random sample of clients receiving selected Older Americans Act (OAA) services. Trained staff administer telephone surveys. Results are analyzed and compared to the client population to assure a representative sample.
3.1
(ACL)
State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data.
3.5
(ACL)
State Program Report data are submitted annually by States. The web-based submissions include multiple data checks for consistency. Multi-year comparison reports are reviewed by ACL's Administration on Aging (AoA) and State staff. AoA staff follow up with States to assure validity and accuracy. After revisions, States certify the accuracy of their data.

 

Agency for Healthcare Research and Quality (AHRQ)

Measure ID Data Source Data Validation
1.3.16
(AHRQ)
MEPS website - MEPSnet/IC interactive tool: http://meps.ahrq.gov/mepsweb/communication/whats_new.jsp (see Tabular Data). Data published on website.
A number of steps are taken from the time of sample selection up to data release to ensure the reliability and accuracy of MEPS data, including:

 

  • Quality control checks are applied to the MEPS sample frame when it is received from NCHS as well as to the subsample selected for MEPS.
  • Following interviewer training, performance is monitored through interview observations and validation interviews.
  • A variety of materials and strategies are employed to stimulate and maintain respondent cooperation.
  • All manual coding and data entry tasks are monitored for quality by verification at 100 percent until an error rate of less than 2 percent is achieved for coding work or less than 1 percent for data entry (illustrated in the sketch following this entry).
  • All specifications developed to guide the editing, variable construction and file creation are monitored through data runs that are used to verify that processes are conducted correctly and to identify data anomalies.
  • Analytic weights are developed in a manner that reduces nonresponse bias and improves national representativeness of survey estimates.
  • The precision of survey estimates is reviewed to ensure they achieve the precision specifications for the survey.
  • Prior to data release, survey estimates on health care utilization, expenditures, insurance coverage, priority conditions and income are compared to previous year MEPS data and other studies. Significant changes in values of constructed variables are investigated to determine whether differences are attributable to data collection or variable construction problems that require correction.

Expenditure data obtained from the MEPS medical provider survey are used to improve the accuracy of household reported data.
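
To make the verification rule in the list above concrete (100 percent verification until the observed error rate falls below 2 percent for coding or 1 percent for data entry), here is a hypothetical Python sketch; the fallback sampling rate after the threshold is met is an assumption, as the text does not specify one.

    CODING_THRESHOLD = 0.02      # per the text: coding work
    DATA_ENTRY_THRESHOLD = 0.01  # per the text: data entry
    FALLBACK_RATE = 0.10         # assumed sample rate once the threshold is met

    def verification_rate(errors, items_checked, threshold):
        """Return the fraction of work to verify: everything (1.0) until
        the observed error rate falls below the threshold."""
        observed = errors / items_checked
        return 1.0 if observed >= threshold else FALLBACK_RATE

    print(verification_rate(3, 100, CODING_THRESHOLD))      # 1.0: keep verifying all
    print(verification_rate(1, 200, DATA_ENTRY_THRESHOLD))  # 0.1: threshold met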

 
1.3.21
(AHRQ)
MEPS website at What's New: http://meps.ahrq.gov/mepsweb/communication/whats_new.jsp (see Data). Data published on website.

A number of steps are taken from the time of sample selection up to data release to ensure the reliability and accuracy of MEPS data, including:

 

  • Quality control checks are applied to the MEPS sample frame when it is received from NCHS as well as to the subsample selected for MEPS.
  • Following interviewer training, performance is monitored through interview observations and validation interviews.
  • A variety of materials and strategies are employed to stimulate and maintain respondent cooperation.
  • All manual coding and data entry tasks are monitored for quality by verification at 100 percent until an error rate of less than 2 percent is achieved for coding work or less than 1 percent for data entry.
  • All specifications developed to guide the editing, variable construction and file creation are monitored through data runs that are used to verify that processes are conducted correctly and to identify data anomalies.
  • Analytic weights are developed in a manner that reduces nonresponse bias and improves national representativeness of survey estimates.
  • The precision of survey estimates is reviewed to ensure they achieve the precision specifications for the survey.
  • Prior to data release, survey estimates on health care utilization, expenditures, insurance coverage, priority conditions and income are compared to previous year MEPS data and other studies.  Significant changes in values of constructed variables are investigated to determine whether differences are attributable to data collection or variable construction problems that require correction.

Expenditure data obtained from the MEPS medical provider survey are used to improve the accuracy of household reported data.

 
1.3.23
(AHRQ)
CAHPS database (National CAHPS Benchmarking Database) Prior to placing survey and related reporting products in the public domain, a rigorous development, testing, and vetting process with stakeholders is followed. Survey results are analyzed to assess internal consistency, construct validity, and power to discriminate among measured providers.
1.3.38
(AHRQ)
Surveys/case studies The Hospital Survey on Patient Safety Culture (HSOPS) measures organizational patient safety climate.  AHRQ staff from the Patient Safety Portfolio and the Office of Communications and Knowledge Transfer (OCKT), in collaboration with Westat, the HSOPS support contractor, have developed methods and conducted validation studies of HSOPS using a multi-modal approach. First, AHRQ has compared HSOPS to the AHRQ Patient Safety Indicators (PSIs), which are based on individual hospital administrative data and are indicators of harm.  Next, AHRQ compared HSOPS to HCAHPS, which characterizes the patient’s experience with care.  In addition, AHRQ has compared HSOPS to CMS pay-for-performance measures. Finally, AHRQ has conducted multiple case studies of the utilization and implementation of HSOPS by individual hospitals and hospital systems.
1.3.60
(AHRQ)
Ongoing findings from PA-11-99 Information on PHIM projects:

https://healthit.ahrq.gov/events/national-web-conference-assessing-patient-health-information-needs-developing-consumer-hit-tools
http://www.ncbi.nlm.nih.gov/pubmed/26147401
https://healthit.ahrq.gov/ahrq-funded-projects/infosage-information-sharing-across-generation-and-environments
https://healthit.ahrq.gov/ahrq-funded-projects/patient-reminders-and-notifications
http://www.ncbi.nlm.nih.gov/pubmed/26173040

 

 

Assistant Secretary for Administration (ASA)

Measure ID Data Source Data Validation
1.1
(ASA)
The Office of Human Resources Telework Liaisons collect telework data from the components using spreadsheets. For all components except NIH and CDC, the Integrated Time and Attendance System is used for AWS data. Data review and verification are completed during the GHG inventory process. 
1.2
(ASA)
HHS data for fleet statistics come from analysis and output via a resource called the Federal Automotive Statistical Tool (FAST).  The input for the HHS data comes from an internal HHS data resource called the HHS Motor Vehicle Management Information System (MVMIS).

The FY 2014 value is preliminary; data collection for the year is not closed until January 15, and the analysis will not be final until the fleet emission report is released January 30.

Note: This value excludes all fuel products used by HHS law enforcement, protective, emergency response, or military tactical vehicles (if any), as well as any HHS international deployments not already excluded by the previous categories, due to constraints of regulating and enforcing US standards abroad and the intent of the metric.
Both FAST and MVMIS have internal validation processes.
1.3
(ASA)
OCIO HHS Electronic Stewardship report.  Third-party verification is used (i.e., the report is reviewed by another office within ASA).
2.5
(ASA)
Office of Personnel Management Employee Viewpoint Survey, Question 21: “My work unit is able to recruit people with the right skills.” This federal survey is a self-administered web survey offered to all full-time and part-time employees by the Office of Personnel Management.
2.6
(ASA)
Office of Personnel Management Employee Viewpoint Survey (https://www.unlocktalent.gov/employee-engagement), Question 69: “Considering everything, how satisfied are you with your job?” This federal survey is a self-administered web survey offered to all full-time and part-time employees by the Office of Personnel Management. Data collected from the 2014 survey respondents were weighted to produce survey estimates that accurately represent the agency population.
2.7
(ASA)
Office of Personnel Management Employee Viewpoint Survey, Question 45: “My supervisor is committed to a workforce representative of all segments of society.” This federal survey is a self-administered web survey offered to all full-time and part-time employees by the Office of Personnel Management. Data collected from the 2014 survey respondents were weighted to produce survey estimates that accurately represent the agency population.
2.8
(ASA)
The results will be captured from the HHS HREPS tracking system and tracking systems at the Indian Health Service (IHS) and the National Institutes of Health (NIH). Data on staff hiring actions are captured within HREPS and reported on a weekly basis. This weekly data report is sent for review to each of the HHS HR Centers. The report distribution group consists of the HR Directors, their Deputies, and staff team leads and supervisors.

 

Assistant Secretary for Preparedness and Response (ASPR)

Measure ID Data Source Data Validation
2.4.13
(ASPR)
Program files and contract documents Contracts awarded and draft requests for proposals for industry comment are negotiated and issued, respectively, in accordance with the Federal Acquisition Regulation (FAR) and the HHS Acquisition Regulations (HHSAR). Interagency Agreements are developed with federal laboratories to address specific advanced research questions.

 

Centers for Disease Control and Prevention (CDC)

Measure ID Data Source Data Validation
1.2.1c
(CDC)
Childhood data are collected through the National Immunization Survey (NIS) and reflect calendar years. The NIS uses a nationally representative sample and provides estimates of vaccination coverage rates that are weighted to represent the entire population, nationally, and by region, state, and selected large metropolitan areas. The NIS, a telephone-based survey, is administered by random-digit-dialing to find households with children aged 19 to 35 months. Parents or guardians are asked about the vaccines, with dates, that appear on the child’s "shot card" kept in the home; demographic and socioeconomic information is also collected. At the end of the interview with parents or guardians, survey administrators request permission to contact the child's vaccination providers. Providers are then contacted by mail to provide a record of all immunizations given to the child. Examples of quality control procedures include 100% verification of all entered data with a sub-sample of records independently entered. The biannual data files are reviewed for consistency and completeness by CDC’s National Center for Immunization and Respiratory Diseases, Immunization Services Division - Assessment Branch and CDC’s National Center for Health Statistics’ Office of Research and Methodology. Random monitoring by supervisors of interviewers' questionnaire administration styles and data entry accuracy occurs daily. Annual methodology reports and public use data files are available to the public for review and analysis.
1.3.3a
(CDC)
Behavioral Risk Factor Surveillance System (BRFSS), interviews conducted September-June for an influenza season (e.g., September 2011-June 2012 for the 2011-12 influenza season) and provided to ISD from NCCDPHP by August (e.g., August 2012 for the 2011-12 influenza season). Final results are usually available by September (e.g., September 2012 for the 2011-12 influenza season). BRFSS is an ongoing state-based monthly telephone survey that collects information on health conditions and risk behaviors from ~400,000 randomly selected persons ≥18 years among the non-institutionalized U.S. civilian population.

Numerator:
BRFSS respondents were asked if they had received a ‘flu’ vaccine in the past 12 months, and if so, in which month and year.  Persons reporting influenza vaccination from August through May (e.g., August 2011-May 2012 for the 2011-12 flu season) were considered vaccinated for the season.  Persons reporting influenza vaccination in the past 12 months but with missing month or year of vaccination had month and year imputed from donor pools matched for week of interview, age group, state of residence and race/ethnicity.
The cumulative proportion of persons receiving influenza vaccination during August through May is estimated via Kaplan-Meier analysis in SUDAAN, using monthly interview data collected September through June (a simplified sketch follows this entry).
 
Denominator:
Respondents age ≥18 years responding to the BRFSS in the 50 states and the District of Columbia with interviews conducted September-June for an influenza season (e.g., September 2011-June 2012 for the 2011-12 influenza season) and provided to ISD from NCCDPHP by August (e.g. August 2012 for the 2011-12 influenza season).  Persons with unknown, refused or missing status for flu vaccination in the past 12 months are excluded. 
Data validation methodology:  Estimates from BRFSS are subject to the following limitations. First, influenza vaccination status is based on self or parental report, was not validated with medical records, and thus is subject to respondent recall bias. Second, BRFSS is a telephone-based survey and does not include households without telephone service (about 2% of U.S. households) and estimates prior to the 2011-12 influenza season did not include households with cellular telephone service only, which may affect some geographic areas and racial/ethnic groups more than others.  Third, the median state CASRO BRFSS response rate was 54.4% in 2010, and nonresponse bias may remain after weighting adjustments. Fourth, the estimated number of persons vaccinated might be overestimated, as previous estimates resulted in higher numbers vaccinated than doses distributed.
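A minimal, unweighted Python sketch of the Kaplan-Meier calculation referenced in the numerator above, with invented monthly counts; the production analysis is run in SUDAAN with survey weights and design adjustments.

    def km_cumulative_coverage(monthly):
        """monthly: list of (vaccinated_this_month, at_risk_at_month_start)
        pairs, August through May. Returns cumulative season coverage."""
        surviving_unvaccinated = 1.0
        for vaccinated, at_risk in monthly:
            surviving_unvaccinated *= 1.0 - vaccinated / at_risk
        return 1.0 - surviving_unvaccinated

    # Invented example: three months of follow-up.
    print(round(km_cumulative_coverage([(100, 1000), (90, 850), (50, 700)]), 3))  # 0.253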
2.1.8
(CDC)
National HIV surveillance system CDC conducts validation and evaluation studies of the data systems that monitor HIV to determine the quality of the data they generate. HIV data for 2010 and after include data from all 50 states. The period of time between a diagnosis of HIV or AIDS and the arrival of a case report at CDC is called the "reporting delay." CDC requires a minimum of 12 months after the end of a calendar year to provide accurate trend data. Data from the National HIV Surveillance System are available, and the latest annual surveillance report was published November 24, 2015. Plans are for annual surveillance reports to be published on December 1 of upcoming years, which would include data collected from the previous year.
2.2.4
(CDC)
DHAP Legal Assessment Project The Legal Assessment Project (LAP) is a legal research and policy analysis project led by CDC’s Division of HIV/AIDS Prevention, Office of the Director.  Using standard legal research methods, the LAP researches state statutes, regulations, and policies that affect states’ ability to conduct effective HIV prevention. 
2.8.1
(CDC)
The National TB Surveillance System TB morbidity data and related information submitted via the national TB Surveillance System are entered locally or at the state level into CDC-developed software which contains numerous data validation checks. Data received at CDC are reviewed to confirm their integrity and evaluate completeness. Routine data quality reports are generated to assess data completeness and identify inconsistencies. Problems are resolved by CDC staff working with state and local TB program staff. During regular visits to state, local, and territorial health departments, CDC staff review TB registers and other records and data systems and compare records for verification and accuracy. At the end of each year, data are again reviewed before data and counts are finalized and published.
3.2.5
(CDC)
National Healthcare Safety Network (NHSN) facility survey Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
3.3.2b
(CDC)
CDC’s Emerging Infections Program (EIP) Active Bacterial Core Surveillance (ABCs) invasive MRSA Surveillance System, and CDC’s National Healthcare Safety Network (NHSN). NHSN data is validated by the Centers for Medicare & Medicaid Services (CMS) and state/local health departments. EIP data undergoes annual audits to ensure accuracy.
3.3.3
(CDC)
National Healthcare Safety Network (NHSN) Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
3.3.4
(CDC)
National Healthcare Safety Network (NHSN) Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
4.6.3
(CDC)
National Health Interview Survey, NCHS. NCHS validates the data.
4.6.5
(CDC)
Youth Risk Behavior Surveillance System (YRBSS), which monitors priority health-risk behaviors and is conducted every other year (odd years). Beginning in FY 2011, the National Youth Tobacco Survey (NYTS) was added as an additional data source, but it was removed in 2014 when the variance of the data reported in the years between YRBSS data reporting became too great compared with YRBSS. Validity and reliability studies of YRBSS attest to the quality of the data.  CDC conducts quality control and logical edit checks on each record.
4.11.9
(CDC)
National Health Interview Survey (NHIS), CDC, NCHS Data are reported from a national surveillance system and follow predetermined quality control standards.
8.B.1.3a
(CDC)
Tracking spreadsheets from the ELC-ELR monitoring project Data are validated by the Meaningful Use program in collaboration with the various programs (ELR, ISS, SS) to determine the number of awardees that meet the requirements of the EHR-MU standards.
8.B.2.2
(CDC)
Electronic media reach of CDC Vital Signs is measured by CDC.gov web traffic and actual followers and subscribers of CDC’s social media, e-mail updates, and texting service. The data source for this measure is Omniture® web analytics, a software product that provides consolidated and accurate statistics about interactions with CDC.gov and social media outlets as individuals seek and access information about CDC Vital Signs. CDC’s Office of the Associate Director for Communication (OADC) and Vital Signs staff review Omniture data monthly.
8.B.2.5
(CDC)
The data source for this measure is Omniture® web analytics, a software product that provides consolidated and accurate statistics about interactions with CDC.gov. Community Guide staff review Omniture reports on an ongoing basis.
8.B.4.2
(CDC)
Emerging Infectious Diseases (EID) Laboratory Fellowships, the CDC/Council of State and Territorial Epidemiologists’ (CSTE) Applied Epidemiology Fellowship, the Post-EIS Practicum (now known as the Health Systems Integration Program), the PHPS Residency, and the Applied Public Health Informatics Fellowship were added to the measure in FY 2011.  The PHPS Residency pilot program ended in FY 2012. The Informatics Training in Place Program was added in FY 2014. Trainees funded by other federal agencies are excluded.  Staff review and validate data through the fellowship programs’ personnel systems.
10.A.1.5
(CDC)
Annual Program Results (APRs) Data are validated through routine site monitoring and data quality assurance activities.  These activities are performed routinely at sites providing direct service delivery and at sites that receive technical assistance for service delivery improvement for this performance measure.  Data validation includes routine procedures for assessing and maintaining data completeness and accuracy throughout the data lifecycle, as well as systematic procedures for verifying the reported data.  Final aggregated numbers for results and future targets are reviewed by an expert team representing both programmatic technical area experts at headquarters and country technical team members with expert knowledge of country program context, historical performance, and current performance capacity.
10.F.1a
(CDC)
FETP Annual Program Reports Reports from countries are submitted to CDC annually.  These reports are confirmed by program directors in each country.
10.F.1b
(CDC)
FETP Annual Program Reports Reports from countries are submitted to CDC annually.  These reports are confirmed by program directors in each country.
13.5.3
(CDC)
Self-reported data from 62 PHEP grantees.  Quality assurance reviews are conducted, with follow-up with grantees.

 

Centers for Medicare & Medicaid Services (CMS)

Measure ID Data Source Data Validation
CHIP 3.3
(CMS)
Statistical Enrollment Data System Each State must assure that the information is accurate and correct when the information is submitted to SEDS by certifying that the information shown on the CHIP forms is correct and in accordance with the State's child health plan as approved by the Secretary.

CMS staff populate the data into various SEDS reports and verify each of the enrollment measures. Each form has the following seven measures, reported by service delivery system: 1: Unduplicated Number Ever Enrolled During the Quarter. 2: Unduplicated Number of New Enrollees in the Quarter. 3: Unduplicated Number of Disenrollees in the Quarter. 4: Number of Member-Months of Enrollment in the Quarter. 5: Average Number of Months of Enrollment (item 4 divided by item 1; sketched after this entry). 6: Number Enrolled at Quarter’s End (point in time). 7: Unduplicated Number Ever Enrolled in the Year (4th Quarter Only).

CMS compares these enrollment measures to past quarters and trends over the life of each program to ensure that there are no anomalies in the data; if apparent errors are detected, CMS corresponds with the State staff responsible for reporting enrollment statistics. If there are major increases or decreases, CMS investigates the causes of the changes in enrollment patterns.
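For illustration, item 5 (the one derived measure above) and an anomaly screen of the kind described can be sketched in Python as follows; all figures and the 10 percent tolerance are invented placeholders.

    ever_enrolled = 125000   # item 1: unduplicated ever enrolled in the quarter
    member_months = 330000   # item 4: member-months of enrollment in the quarter

    # Item 5 is defined as item 4 divided by item 1.
    avg_months_of_enrollment = member_months / ever_enrolled
    print(round(avg_months_of_enrollment, 2))  # 2.64

    # Flag large quarter-over-quarter changes for follow-up with state staff.
    prior_quarter_ever_enrolled = 119000
    pct_change = (ever_enrolled - prior_quarter_ever_enrolled) / prior_quarter_ever_enrolled
    if abs(pct_change) > 0.10:  # assumed tolerance
        print("Flag for follow-up with the state")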
MCD6
(CMS)
The core set of measures required under CHIPRA was published in December 2009. CMS initially used the automated web-based system, CHIP Annual Reporting Template System (CARTS), for voluntary reporting of quality measures. This is the same system that was used for the CHIP Quality GPRA goal that was discontinued after FY 2010 (MCD2). Starting with FFY 2015 reporting, CMS will use the MACPro system as the data source for this measure. For Child Core Set reporting between FFY 2011 and FFY 2014, CMS monitored performance measurement data related to the Child Core Set through CARTS. Starting with FFY 2015 reporting, CMS will monitor data reported on the Child Core Set through the MACPro system.
MCD8
(CMS)
The initial Adult Core Set was published in January 2012. CMS initially used the automated web-based system, the Medicaid Adult Quality Measures Template in CARTS, for voluntary reporting of quality measures. Starting with FFY 2015 reporting, CMS will use the MACPro system as the data source for this measure. For FY 2011 and FY 2012, the data validation was the link to the core set in the Federal Register. The link to the recommended core set is http://federalregister.gov/a/2010-32978. The link to the published core set is http://federalregister.gov/a/2011-33756. For Adult Core Set reporting for FFY 2013 and FFY 2014, CMS monitored performance measurement data related to the core set of measures through CARTS. Starting with FFY 2015 reporting, CMS will monitor data reported on the Adult Core Set through the MACPro system.
MCR1.1a
(CMS)
The Medicare Consumer Assessment of Healthcare Providers and Systems (CAHPS) is a set of annual surveys of beneficiaries enrolled in all Medicare Advantage plans and in the original Medicare fee-for-service plan. The Medicare CAHPS are administered according to the standardized protocols as delineated in the CAHPS 4.0 Survey and Reporting Kit developed by the Agency for Healthcare Research and Quality (AHRQ). This protocol includes two mailings of the survey instruments to randomized samples of Medicare beneficiaries in health plans and geographic areas, with telephone follow-up of non-respondents with valid telephone numbers. CAHPS data are carefully edited and cleaned prior to the creation of composite measures using techniques employed comparably in all surveys. Both non-respondent sample weights and managed care-FFS comparability weights are employed to adjust collected data for differential probabilities of sample selection, under-coverage, and item response.
MCR1.1b
(CMS)
The Medicare Consumer Assessment of Healthcare Providers and Systems (CAHPS) is a set of annual surveys of beneficiaries enrolled in the original Medicare fee-for-service plan and in all Medicare Advantage plans. The Medicare CAHPS surveys are administered according to the standardized protocols delineated in the Medicare Advantage and Prescription Drug Plan CAHPS Survey Quality Assurance Protocols & Technical Specifications available at www.ma-pdpcahps.org. This protocol includes two mailings of the survey instruments to randomized samples of Medicare beneficiaries in health plans and geographic areas, with telephone follow-up of non-respondents with valid telephone numbers. CAHPS data are carefully analyzed and cleaned prior to the creation of composite measures using techniques employed comparably in all surveys. Both non-respondent sample weights and managed care-FFS comparability weights are employed to adjust collected data for differential probabilities of sample selection, under-coverage, and item response.
MCR23
(CMS)
The Prescription Drug Event (PDE) data CMS has a rigorous data quality program for ensuring the accuracy and reliability of the PDE data.  The first phase in this process is on-line PDE editing.  The purpose of on-line editing is to apply format rules, check for legal values, compare data in individual fields to other known information (such as beneficiary, plan, or drug characteristics), and evaluate logical consistency between multiple fields reported on the same PDE (a hypothetical example follows this entry).  On-line editing also enforces business order logic, which ensures only one PDE is active for each prescription drug event.  The second phase of the data quality program occurs after PDE data have passed all initial on-line edits and are saved in the data repository.  CMS conducts a variety of routine and ad hoc data analyses of saved PDEs to ensure data quality and payment accuracy.
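A hypothetical Python sketch of the kinds of on-line edits described above (legal-value checks and cross-field consistency checks); the field names and rules are invented for illustration and are not CMS’s actual edit specifications.

    from datetime import date

    def edit_pde(record):
        """Return a list of edit failures for a hypothetical PDE-style record."""
        errors = []
        # Legal-value edit: quantity dispensed must be positive.
        if record["quantity_dispensed"] <= 0:
            errors.append("quantity_dispensed must be positive")
        # Cross-field consistency edit: date of service cannot precede the
        # beneficiary's enrollment date.
        if record["date_of_service"] < record["enrollment_date"]:
            errors.append("date_of_service precedes enrollment_date")
        return errors

    rec = {"quantity_dispensed": 30,
           "date_of_service": date(2016, 3, 1),
           "enrollment_date": date(2016, 1, 1)}
    print(edit_pde(rec))  # []: the record passes these edits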
MCR26
(CMS)
Medicare claims data. The data used to calculate the performance measures are administrative claims data submitted by hospitals and Medicare Advantage plans. Administrative claims data are a validated data source and are the data source for public reporting of hospital readmission rates on the Hospital Compare website (www.hospitalcompare.hhs.gov). As stated on the Hospital Compare website, research conducted when the measures were being developed demonstrated that the administrative claims-based model performs well in predicting readmissions compared with models based on chart reviews. The claims processing systems have validation methods to accept accurate Medicare claims into the claims database.  CMS uses national administrative inpatient hospital claims data to calculate the readmission rate measure.  Inpatient hospital claims information is assumed to be accurate and reliable as presented in the database.
MCR28.2
(CMS)
The CDC National Healthcare Safety Network Extensive cross-field edit checks are used for validation and incomplete records cannot be reported. Detailed instructions for completion of report forms ensure consistency across sites. Process and quality improvements occur through email updates and annual meetings.
MCR30.1
(CMS)
A combination of data pulls from Integrated Data Repository (IDR), Master Data Management (MDM) warehouse, Chronic Conditions Data Warehouse (CCW), Centers for Medicare & Medicaid Services (CMS) Office of the Actuary (OACT) reports, Center for Medicare and Medicaid Innovation interim payment and reconciliation reports, and so on.

Numerator: The total amount of FFS payments made under alternative payment models and population-based payments (i.e., categories 3 and 4, such as Pioneer Accountable Care Organizations (ACOs), Shared Savings Program (SSP) ACOs, the Comprehensive Primary Care Initiative (CPC), Comprehensive End-Stage Renal Disease (ESRD), bundles, and so on) during a given performance year/calendar year.

Denominator: The total amount of FFS payments during a given calendar year as determined by OACT (the resulting ratio is sketched after this entry).
CMMI will work closely with OACT, program staff, and contractors to ensure data are accurate and reliable.
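The measure itself reduces to a simple ratio of the numerator to the denominator described above; a Python sketch with invented dollar amounts:

    apm_payments = 117_000_000_000        # numerator: FFS payments made through
                                          # alternative payment models (categories 3-4)
    total_ffs_payments = 390_000_000_000  # denominator: total FFS payments per OACT

    apm_share = apm_payments / total_ffs_payments
    print(f"{apm_share:.1%}")  # 30.0%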
MIP1
(CMS)
The Comprehensive Error Rate Testing (CERT) Program selects a random sample of Medicare Fee-for-Service (FFS) claims from a population of claims submitted for Medicare FFS payment. Complex medical review is performed on the sample of Medicare FFS claims to determine if the claims were properly paid under Medicare coverage, coding, and billing rules. The CERT program is monitored for compliance by CMS through monthly reports from the contractors. In addition, the HHS Office of the Inspector General (OIG) conducts annual reviews of the CERT program and its contractors.
MIP5
(CMS)
The Part C Error Rate estimate measures errors in clinical diagnostic data submitted to CMS by plans.  The diagnostic data is used to determine risk adjusted payments made to plans. Data used to determine the Part C program payment error rate is validated by several contractors. 
 
The Part C program payment error estimate is based on data obtained from a rigorous Risk Adjustment Data Validation process in which medical records are reviewed by independent coding entities in the process of confirming that medical record documentation supports risk adjustment diagnosis data for payment.
MIP6
(CMS)
The components of payment error measurement in the Part D program are:

A rate(s) that measures payment errors related to low income subsidy (LIS) payments for beneficiaries dually-eligible for Medicare and Medicaid and non-duals also eligible for LIS status.

A rate that measures payment errors from errors in Prescription Drug Event (PDE) records.  A PDE record represents a prescription filled by a beneficiary that was covered by the plan.

A rate that measures payment errors resulting from incorrect assignment of Medicaid status to beneficiaries who are not dually eligible for Medicare and Medicaid.

A rate that measures payment errors from errors in Direct and Indirect Remuneration (DIR) amounts reported by Part D sponsors to CMS.  DIR is defined as price concessions (offered to purchasers by drug manufacturers, pharmacies, or other sources) that serve to decrease the costs incurred by the Part D sponsor for prescription drugs.
For the Part D component payment error rates, the data to validate payments comes from multiple internal and external sources, including CMS’ enrollment and payment files.  Data are validated by several contractors.

Data for the LIS payment error measure come from CMS’ internal payment and enrollment files for all Part D plan beneficiaries.

Data for the PDE error measure come from CMS’ PDE Data Validation process, which validates PDE data through contractor review of supporting documentation submitted to CMS by a national sample of Part D plans.

The data element for incorrect Medicaid status is the PERM eligibility error rate, which is validated by the Medicaid program for the entire Medicaid population and is used by the Part D program as a proxy for incorrect Medicaid status. From the population of Part D beneficiaries who are eligible for both Medicare and Medicaid, CMS randomly assigns a subset, equal in share to the PERM rate, to be ineligible for Medicaid, resulting in payment error (a simplified sketch follows this entry).

Data for the DIR error measure come from audit findings for a national sample of Part D plans; the audits are conducted by contractors as part of the Financial Audit process conducted by CMS' Office of Financial Management (OFM).
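
A minimal Python sketch, with invented values, of the proxy step described above for incorrect Medicaid status: a randomly chosen share of dual-eligible beneficiaries, equal to the PERM eligibility error rate, is treated as ineligible, and the associated payments are counted as error.

    import random

    perm_eligibility_error_rate = 0.05        # placeholder PERM rate
    dual_eligible_ids = list(range(10000))    # stand-in beneficiary IDs

    # Randomly designate a subset, equal in share to the PERM rate, as
    # ineligible for Medicaid; their payments are then counted as error.
    k = round(perm_eligibility_error_rate * len(dual_eligible_ids))
    assigned_ineligible = set(random.sample(dual_eligible_ids, k))
    print(len(assigned_ineligible))  # 500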
MIP8
(CMS)
CMS’s predictive analytics work, using the Fraud Prevention System (FPS), will focus on activities in the areas where the incidence of or opportunity for improper payments and/or fraud is greatest. While this risk-based approach increases contractors’ efficiency, it also reduces the burden on legitimate providers by focusing the majority of fraud detection and prevention resources on those posing a higher risk of fraud. FPS captures the link between each individual alert summary record (ASR) and each subsequent administrative action.  The FPS Dashboard and supporting systems will enable seamless reporting of all data necessary to develop the baseline and to measure performance against any future targets.
MIP9.1
(CMS)
As part of a national contracting strategy, adjudicated claims data and medical policies are gathered from the States for purposes of conducting medical and data processing reviews on a sample of the claims paid in each State. CMS and our contractors are working with the 17 States to ensure that the Medicaid and CHIP universe data and sampled claims are complete and accurate and contain the data needed to conduct the reviews. In addition, the OIG conducts annual reviews of the PERM program and its contractors.
MIP9.2
(CMS)
As part of a national contracting strategy, adjudicated claims data and medical policies are gathered from the States for purposes of conducting medical and data processing reviews on a sample of the claims paid in each State. CMS and our contractors are working with the 17 States to ensure that the Medicaid and CHIP universe data and sampled claims are complete and accurate and contain the data needed to conduct the reviews.
MSC1
(CMS)
CMS reports the prevalence of pressure ulcers in long-stay nursing home residents with quality measures (QMs) derived from the Minimum Data Set (MDS). For this goal, CMS reports the prevalence of pressure ulcers measured in the last three months of the fiscal year. The numerator consists of high-risk residents with a pressure ulcer, stages 2-4, on the most recent assessment. The denominator is all high-risk residents. Beginning with the FY 2012 reporting period, the data source changed from MDS version 2.0 to MDS version 3.0. The new pressure ulcer measure excludes less serious Stage 1 pressure ulcers. The MDS, version 3.0, is the source of the data used to calculate this measure. The MDS is considered to be part of the medical record. The nursing home must maintain the MDS and submit it electronically to CMS for every resident of the certified part of the nursing home. However, MDS data are self-reported by the nursing home. MDS data quality assurance currently consists of onsite and offsite reviews by surveyors and by CMS contractors to ensure that MDS assessments are reported in a timely and complete manner.
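
As a worked illustration of the numerator and denominator just described, the sketch below computes the measure from hypothetical resident records; the field names are illustrative and do not correspond to actual MDS 3.0 item codes.

    def pressure_ulcer_qm(residents):
        # Percent of high-risk long-stay residents whose most recent
        # assessment shows a stage 2-4 pressure ulcer.
        high_risk = [r for r in residents if r["high_risk"]]
        if not high_risk:
            return None
        numerator = sum(1 for r in high_risk
                        if 2 <= r.get("worst_ulcer_stage", 0) <= 4)
        return 100.0 * numerator / len(high_risk)

    # Hypothetical assessments: Stage 1 ulcers and lower-risk residents are excluded.
    sample = [
        {"high_risk": True, "worst_ulcer_stage": 2},
        {"high_risk": True, "worst_ulcer_stage": 1},   # excluded from numerator
        {"high_risk": True, "worst_ulcer_stage": 0},
        {"high_risk": False, "worst_ulcer_stage": 3},  # excluded from denominator
    ]
    print(pressure_ulcer_qm(sample))  # 33.3... (one of three high-risk residents)
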
MSC5
(CMS)
CMS reports the percentage of long-stay nursing home residents that received an antipsychotic medication with a quality measure (QM) derived from the Minimum Data Set (MDS). The MDS is the source of the data used to calculate this measure.  The MDS is considered part of the medical record.  The nursing home must maintain the MDS and submit it electronically to CMS for every resident of the certified part of the nursing home.
PHI5
(CMS)
Enrollment Data Store (EDS) Data will be generated based on effectuated enrollments; CMS will have internal controls and oversight mechanisms to ensure validity.
PHI7
(CMS)
The Centers for Disease Control and Prevention (CDC) National Center for Health Statistics (NCHS) reports data on a quarterly basis from the National Health Interview Survey (NHIS), which provides for prompt monitoring of health insurance coverage as the Affordable Care Act provisions are implemented and go into effect. The numerator for the proposed measure is the estimated number of individuals with no health coverage (individuals who reported IHS or single-service plans only are treated as uninsured). The denominator is the total civilian noninstitutionalized nonelderly population. The NHIS is the principal source of information on the health of the civilian noninstitutionalized population of the United States and is one of the major data collection programs of NCHS. Data are collected through a personal household interview conducted by interviewers employed and trained by the U.S. Census Bureau according to procedures specified by the NCHS. http://www.cdc.gov/nchs/nhis/about_nhis.htm


NHIS estimates will be compared with estimates from the American Community Survey and Current Population Survey Annual Social and Economic Supplement, which are released later than NHIS.
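
The measure itself is a simple proportion. A minimal sketch, using hypothetical weighted counts (the numbers below are illustrative, not NHIS estimates):

    def uninsured_rate(n_uninsured_millions, n_nonelderly_civilian_millions):
        # Numerator: individuals with no coverage (IHS or single-service
        # plans only count as uninsured).  Denominator: total civilian
        # noninstitutionalized nonelderly population.
        return 100.0 * n_uninsured_millions / n_nonelderly_civilian_millions

    print(round(uninsured_rate(28.5, 270.0), 1))  # 10.6 percent
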

 

Food and Drug Administration (FDA)

Measure ID Data Source Data Validation
212406
(FDA)
FoodNet The proactive use of food safety surveillance information and scientific data and tools to prevent illness and injury from foods is a significant focus of FDA. FDA collects data from the FoodNet database to assess and communicate the risks associated with specific food products to American consumers and to industry, both on a routine basis and during foodborne illness outbreaks, to reduce the incidence of infection with key foodborne pathogens.
212409
(FDA)
CDC/FoodNet FoodNet Annual Reports are summaries of information collected through active surveillance of nine pathogens.  A preliminary version of this report becomes available in the spring of each year and forms the basis of each year's Morbidity and Mortality Weekly Report (MMWR) FoodNet Surveillance.  The FoodNet Final Report becomes available later in the year when current census information becomes available.  The illness rates calculated for this Priority Goal use the same data and same methodology as the illness rates in the MMWR.  CDC’s FoodNet system reports pathogen-specific illness data based on the calendar year, not the fiscal year.  Therefore, achievement of the annual targets reported here is evaluated based on the calendar year data, not fiscal year data.
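
For context, illness rates of this kind are conventionally expressed as laboratory-confirmed infections per 100,000 population in the surveillance catchment area; a minimal sketch of that calculation, with hypothetical counts:

    def incidence_per_100000(confirmed_cases, catchment_population):
        # Calendar-year incidence of a pathogen per 100,000 persons.
        return 100000.0 * confirmed_cases / catchment_population

    # Hypothetical: 7,400 confirmed cases in a catchment of 48 million.
    print(round(incidence_per_100000(7400, 48_000_000), 2))  # 15.42
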
214305
(FDA)
Field Data Systems These maximum capacities are extrapolated to estimate throughput in times of emergency, when laboratories operate under abnormal conditions that are variable and uncertain. FDA and FERN work to maximize capabilities by continually improving methods and training, along with increasing automated functionality and the available cache of supplies. The listed sample totals were estimated using these laboratories, with known instrumentation and methods, by examining sample throughput during emergencies and consulting with the laboratories and FDA subject matter experts. The surge capacity estimates provided in the performance measures for these laboratories have been examined under the stress of emergencies and outbreaks such as the melamine contamination, the Deepwater Horizon oil spill, and the Japan nuclear event.
214306
(FDA)
BioPlex and ibis Biosensor systems CFSAN scientists have developed the means to evaluate and adapt commercially available instruments to develop and validate more rapid, accurate, and transportable tests to stop the spread of foodborne illness and cases of chemical contamination. Using one such system, known as BioPlex, CFSAN scientists are rapidly serotyping pathogens such as Salmonella. The BioPlex system can serotype 48 different samples in 3 to 4 hours, which vastly improves response time in foodborne illness outbreaks. CFSAN scientists also are using the ibis Biosensor system to speed the identification of Salmonella, E. coli, and other pathogens, toxins, and chemical contaminants.
223215
(FDA)
CDER uses the Document Archiving, Reporting, and Regulatory Tracking System (DARRTS). FDA has a quality control process in place to ensure the reliability of the performance data in DARRTS. DARRTS is CDER’s enterprise-wide system for supporting premarket and postmarket regulatory activities and is the core database upon which most mission-critical applications depend. The type of information tracked in DARRTS includes status, type of document, review assignments, status for all assigned reviewers, and other pertinent comments. Document room task leaders conduct daily quality control of one hundred percent of incoming data entered by their IND and NDA technicians. Senior task leaders then conduct a random quality control check of the entered data in DARRTS. The task leader then validates that all data entered into DARRTS are correct and crosschecks the information with the original document.
234101
(FDA)
CBER’s Office of Vaccines Research and Review; and CBER’s Emerging and Pandemic Threat Preparedness Office The data are validated by the appropriate CBER offices and officials.
243201
(FDA)
Submission Tracking and Reporting System (STARS). STARS tracks submissions, reflects the Center’s target submission processing times and monitors submissions during the developmental or investigational stages and the resulting application for marketing of the product.
262401
(FDA)
NCTR Project Management System; peer-review through FDA/NCTR Science Advisory Board (SAB) and the NTP Scientific Board of Counselors; presentations at national and international scientific meetings; use of the predictive and knowledge-based systems by the FDA reviewers and other government regulators; and manuscripts prepared for publication in peer-reviewed journals. NCTR provides peer-reviewed research that supports FDA’s regulatory function. To accomplish this mission, it is incumbent upon NCTR to solicit feedback from its stakeholders and partners, which include FDA product centers, other government agencies, industry, and academia. The NCTR SAB, composed of non-government scientists from industry, academia, and consumer organizations, and subject matter experts representing all of the FDA product centers, is guided by a charter that requires an intensive review of each of the Center’s scientific programs at least once every five years to ensure high quality programs and overall applicability to FDA’s regulatory needs. Scientific and monetary collaborations include Interagency Agreements with other government agencies, Cooperative Research and Development Agreements that facilitate technology transfer with industry, and informal agreements with academic institutions. NCTR also uses an in-house strategy to ensure the high quality of its research and the accuracy of data collected. Research protocols are often developed collaboratively by principal investigators and scientists at FDA product centers and are developed according to a standardized process outlined in the “NCTR Protocol Handbook.” NCTR’s Project Management System tracks all planned and actual expenditures on each research project. The Quality Assurance Staff monitors experiments that fall within the Good Laboratory Practices (GLP) guidelines. NCTR’s annual report of research accomplishments, goals, and publications is published and available on FDA.gov. Research findings are published in peer-reviewed journals and presented at national and international scientific conferences.
280005
(FDA)
CTP’s Tobacco Inspection Management System (TIMS) is a database that contains the tobacco retail inspection data submitted by state and territorial inspectors commissioned by FDA. CTP/OCE has in place a process for ensuring the quality of the data in TIMS. OCE staff conduct random quality control checks of inspection data submitted for tobacco retail inspections where no violations were found. OCE staff conduct quality control checks for all tobacco retail inspections where potential violations were found.

 

Health Resources and Services Administration (HRSA)

Measure ID Data Source Data Validation
1.I.A.1
(HRSA)
HRSA Bureau of Primary Health Care's Uniform Data System Validated using over 1,000 edit checks, both logical and specific.  These include checks for missing data and outliers and checks against history and norms.
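
A minimal sketch of the kinds of edit checks described above (missing data, outliers, and comparison against history); the fields, limits, and 50 percent change threshold are hypothetical, not actual UDS edit specifications.

    def run_edit_checks(record, prior_year, outlier_limits):
        # Return (field, problem) flags for one reported value set.
        flags = []
        for field, value in record.items():
            if value is None:
                flags.append((field, "missing"))
                continue
            low, high = outlier_limits.get(field, (float("-inf"), float("inf")))
            if not low <= value <= high:
                flags.append((field, "outlier"))
            previous = prior_year.get(field)
            # Flag values that moved more than 50 percent against history.
            if previous and abs(value - previous) / previous > 0.5:
                flags.append((field, "large change vs. history"))
        return flags

    # Hypothetical reported values, prior-year values, and plausibility limits.
    record = {"patients": 12000, "visits": None, "staff_fte": 950}
    prior_year = {"patients": 7000, "visits": 40000, "staff_fte": 60}
    limits = {"staff_fte": (0, 500)}
    for flag in run_edit_checks(record, prior_year, limits):
        print(flag)  # patients: large change; visits: missing; staff_fte: outlier, large change
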
1.I.A.3
(HRSA)
HRSA/Bureau of Primary Health Care contractors that perform PCMH surveys. Data validated by Health Center program staff.
1.II.B.1
(HRSA)
Uniform Data System Validated using over 1,000 edit checks, both logical and specific. These include checks for missing data and outliers and checks against history and norms.
4.I.C.2
(HRSA)
HRSA Bureau of Clinician Recruitment Service's Management Information Support System (BMISS) BMISS is internally managed with support from NIH, which provides: Data Management Services, Data Requests and Dissemination, Analytics, Data Governance and Quality, Project Planning and Requirements Development, Training, and Process Improvement.
6.I.C.2
(HRSA)
Annual performance reports submitted by BHW grantees through the BHW Performance Management Handbook system. Data are entered through a web-based system that incorporates extensive validation checks.  Once approved by the project officer (1st level of review), data are cleaned, validated, and analyzed by scientists within BHW's National Center for Health Workforce Analysis (2nd level of review). Inconsistencies identified in the reported data during the 2nd level of review are flagged and sent to the project officer for follow-up and correction.
10.I.A.1
(HRSA)
MCH Block Grant’s Title V Information System (TVIS) that contains data on grantee performance from grantee annual reports Data are validated by project officers and program staff.
16.E
(HRSA)
ADAP Quarterly Report data provided by State ADAPs. Web-based data are checked through a series of internal consistency/validity checks. HIV/AIDS program staff also review submitted quarterly reports and provide technical assistance on data-related issues.
16.I.A.1
(HRSA)
HRSA HIV/AIDS Bureau's Ryan White HIV/AIDS Program Services Report This web-based data collection method communicates errors and warnings through its built-in validation process. To ensure data quality, the Program conducts data verification for all Ryan White HIV/AIDS Program Services Report (RSR) submissions. Reports detailing items in need of correction, along with instructions for submitting revised data, are sent to grantees.
24.II.A.2
(HRSA)
Data are captured within the National Marrow Donor Program's computerized system, containing information pertaining to registered volunteer adult donors willing to donate blood stem cells to patients in need.  Monthly reports are generated from the computerized system to indicate the number of registered donors (broken down by self-reported race and ethnicity). Validated by project officers analyzing comprehensive monthly reports broken down by recruitment organization.  To decrease the likelihood of data entry errors, the program contractor utilizes value-protected screens and optical scanning forms.
29.IV.A.3
(HRSA)
Reported by grantees through the Program’s Performance Improvement Measurement System Validated by project officers
36.II.B.1
(HRSA)
Family Planning Annual Report (FPAR). The FPAR consists of 14 tables in which grantees report data on user demographic characteristics, user social and economic characteristics, primary contraceptive use, utilization of family planning and related health services, utilization of health personnel, and the composition of project revenues. 
For this measure, FPAR Table 11: "Unduplicated number of Users Tested for Chlamydia by Age and Gender," is the data source. 
The responsibility for the collection and tabulation of annual service data from Title X grantees rests with the Office of Population Affairs (OPA), which is responsible for the administration of the program. Reports are submitted annually on a calendar year basis (January 1 - December 31) to the regional offices. Grantee reports are tabulated and an annual report is prepared summarizing the regional and national data. The annual report describes the methodology used in both the collection and tabulation of grantee reports, as well as the definitions provided by OPA to the grantees for use in completing data requests. Also included in the report are lengthy notes that provide detailed information regarding any discrepancies between the OPA-requested data and what individual grantees were able to provide. Data inconsistencies are first identified by the Regional Office and then submitted back to the grantee for correction. Additionally, the contractor compiling the FPAR data submits any discrepancies it finds to the Office of Family Planning (OFP) FPAR data coordinator, who works with the Regional Office to make corrections. All data inconsistencies and their resolution are noted in an appendix to the report. These are included for two reasons: (1) to explain how adjustments were made to the data, and how discrepancies affect the analysis, and (2) to identify the problems grantees have in collecting and reporting data, with the goal of improving the process.

 

Indian Health Service (IHS)

Measure ID Data Source Data Validation
2
(IHS)
Clinical Reporting System (CRS); yearly Diabetes care and outcome audit Comparison of CRS and audit results; CRS software testing; quality assurance review of site submissions
18
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions
20
(IHS)
IHS operated hospitals and clinics report the accrediting body, the length of accreditation, and other significant information about their accreditation status to the IHS Headquarters, Office of Resource Access and Partnerships, which maintains a List of Federal Facilities - Status of Accreditation. The Joint Commission and AAAHC, non-governmental organizations, maintain lists of certified and accredited facilities at their public websites.  Visit the Joint Commission website at http://www.qualitycheck.org/CertificationList.aspx.  Visit the Accreditation Association for Ambulatory Health Care at http://www.aaahc.org/eweb/dynamicpage.aspx?site=aaahc_site&webcode=find_orgs.
24
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions; Immunization program reviews
30
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions
51
(IHS)
Clinical Reporting System (CRS) CRS software testing; quality assurance review of site submissions
TOHP-SP
(IHS)
Routine IHS Tribal consultation documentation for HHS consultation report and IHS Director's Activities database Routine IHS Tribal consultation documentation for HHS consultation report and IHS Director's Activities database

 

Immediate Office of the Secretary (IOS)

Measure ID Data Source Data Validation
1.2
(IOS)
Quarterly reports on data via Data.Gov submissions and HHS data calls Datasets available at www.healthdata.gov and also tracked by the Director of the Health Data Initiative
1.4
(IOS)
The Open Innovation Manager's Database The HHS Open Innovation Manager tracks all challenges issued by HHS
1.5
(IOS)
Data are collected and maintained by HHS IDEA Lab staff Descriptions of innovative solutions are available on the HHS IDEA Lab website at http://www.hhs.gov/idealab/
1.6
(IOS)
NLM PubMed Central Database The validation is based on quarterly database queries.  A listing of available articles and participating journals is available at http://www.ncbi.nlm.nih.gov/pmc/
1.7
(IOS)
The HHS IDEA Lab Buyers Club Program Manager’s Database The HHS IDEA Lab Buyers Club tracks IT acquisitions that have utilized the Buyers Club

 

National Institutes of Health (NIH)

Measure ID Data Source Data Validation
CBRR-1.1, CBRR-1.2
(NIH)
Doctorate Records File and the NIH IMPAC II database Analyses of career outcomes for predoctoral and postdoctoral NRSA participants, compared to individuals who did not receive NRSA support, using the Doctorate Records File and the NIH IMPAC II administrative database.

Contact:
Jennifer Sutton
Program Policy and Evaluation Officer
Office of Extramural Programs
(301) 435-2686
CBRR-10
(NIH)
Publications, databases, administrative records and/or public documents. Information on the Molecular Libraries Program (MLP) can be found here: http://commonfund.nih.gov/molecularlibraries/

MLP research resources that are available:

- NCATS Small Molecule Resource’s public access to compounds of the MLI (http://ncats.nih.gov/preclinical/core/compound)

- NIH Clinical Collection (http://nihsmr.evotec.com/evotec/sets/ncc)

- NCGC Assay Guidance Manual and HTS Guidance Criteria (http://www.ncbi.nlm.nih.gov/books/NBK53196/)

- Probe Reports and Probe Report abstracts in PubChem (http://www.ncbi.nlm.nih.gov/books/NBK47352/; https://pubchem.ncbi.nlm.nih.gov/)

- Probes available from ML Centers and commercial vendors (http://www.ncbi.nlm.nih.gov/books/NBK47352/?term=molecular%5BAll%20Fields%5D%20AND%20(%22libraries%22%5BAll%20Fields%5D%20OR%20%22libraries%22%5BAll%20Fields%5D))

- PubChem (https://pubchem.ncbi.nlm.nih.gov/)

- BioAssay Research Database (BARD) (https://bard.nih.gov/)

SRO-3.9
(NIH)
Publication, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities.  The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports.  Scientific journals use a process of peer review prior to publishing an article.  Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected. 

Specific data sources for this measure are provided below.  A contact is listed if administrative records are included as a data source.

For additional information contact: NIAMS SPPB (Reaya Reuss, 301-496-8271, [email protected])

1. Preliminary Response to Janus Kinase Inhibition with Baricitinib in Chronic Atypical Neutrophilic Dermatosis with Lipodystrophy and Elevated Temperatures (CANDLE)

Gina A Montealegre Sanchez, Adam Reinhardt, Paul Brogan, Yackov Berkun, Abraham Zlotogorski, Diane Brown, Peter Chira, Ling Gao, Jason Dare, Susanne Schalm, Rosa Merino, Dawn Chapelle, Hanna Kim, Samantha Judd, Michelle O’Brien, Adriana Almeida de Jesus, Yun J. Kim, Bahar Kost, Yan Huang, Scott Paul, Robert Colbert, Alessandra Brofferio, Chyi-chia Lee, Colleen Hadigan, Theo Heller, Caterina P. Minniti, Kristina I. Rother, Raphaela Goldbach-Mansky.

Abstract and oral presentation by Dr. Montealegre with preliminary data were presented at the ISSAID meeting in Dresden, Germany, on October 4, 2015.

2. Additive Loss-of-Function Mutations in Proteasome-Subunit-Genes induce Type-I-interferon production in CANDLE/PRAAS.

Anja Brehm, Yin Liu, Afsal Sheikh, Bernadette Marrero, Ebun Omoyinmi, Qing Zhou, Gina Montealegre Sanchez, Angelique Biancotto, Adam Reinhardt, Adriana de Jesus, Martin Pelletier, Wanja Tsai, Elaine Remmers, Lena Kardava, Suvimol Hill, Hanna Kim, Helen Lachmann, Andres Megarbane, Jae Chae, Jilian Brady, Rena Castillo, Diane Brown, Angel Casano, Ling Gao, Dawn Chapelle, Yan Huang, Deborah Stone, Yongjing Chen, Frauke Sotzny, Chia Lee, Daniel Kastner, Antonio Torrelo, Abraham Zlotogorski, Susanne Moir, Massimo Gadina, Phil McCoy, Robert Wesley, Kristina Rother, Peter Hildebrand, Paul Brogan, Elke Krüger, Ivona Aksentijevich, Raphaela Goldbach-Mansky. J Clin Invest. October 2015.
SRO-5.2
(NIH)
Publication, databases, administrative records and/or public documents Cluster analysis in the COPDGene study identifies subtypes of smokers with distinct patterns of airway disease and emphysema. Castaldi PJ et al. Thorax. 2014 May;69(5):415-22. doi: 10.1136/thoraxjnl-2013-203601. Epub 2014 Feb 21. 

For additional information contact: Dr. Lisa Postow, NHLBI, NIH ([email protected]).

SRO-5.5
(NIH)
Publication, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities. The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports. Scientific journals use a process of peer review prior to publishing an article. Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected. Specific data sources for this measure are provided below. A contact is listed if administrative records are included as a data source. For additional information, contact: NIBIB OSPC (Christine Cooper, 301-594-8923, [email protected]).

Selected publications:

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4234275/

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4444489/

SRO-5.13
(NIH)
Publication, databases, administrative records and/or public documents Relevant results are reported in peer-reviewed journals:

1. Hsieh et al. (2015). A data analysis pipeline accounting for artifacts in Tox21 quantitative high-throughput screening assays. J Biomol. Screen. 20(7): 887-897.

2. Browne et al. (2015). Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model. Environ. Sci. Technol. 49(14): 8804-8814.

3. Judson et al. (2015). Integrated Model of Chemical Perturbations of a Biological Pathway Using 18 In Vitro High-Throughput Screening Assays for the Estrogen Receptor. Toxicol. Sci. 148(1): 137-154.

Information about Tox21 can be found at http://www.epa.gov/ncct/Tox21/.

SRO-6.4
(NIH)
Publications, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities. The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports. Scientific journals use a process of peer review prior to publishing an article. Through this rigorous process, other experts in the author's field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

The Role of Type 2 inflammation in the Pathogenesis of Asthma Exacerbations; Dunican E and Fahy J; Annals ATS, 12 (supp 2): S144-149; November 2015.

Vitamin D Supplementation and the Risk of Colds in Patients with Asthma; Denlinger L et al; AJRCCM; November 2015.

For additional information contact: Dr. Gail Weinmann, NHLBI, NIH at [email protected]
SRO-8.2
(NIH)
Publications, databases, administrative records and/or public documents 1. Papers identified by the August mini-review.

2. a. Jennings, JH, Rizzi, G, Stamatakis, AM, Ung, RL, Stuber, GD, 2013, Science, 341:1517-1518. The inhibitory circuit architecture of the lateral hypothalamus orchestrates feeding.
     b. Nieh, EH, et al, 2015, Cell 160:528-541. Decoding neural circuits that control compulsive sucrose seeking.
     c. Jennings, JH, et al, 2015, Cell 160:516-527. Visualizing hypothalamic network dynamics for appetitive and consummatory behaviors.
     d. Myers, MG, Olson, DP, 2014, Cell Metabolism 19:732-733. SnapShot: Neural pathways that control feeding.

3. Cruz, F.C., et al., Using c-fos to study neuronal ensembles in corticostriatal circuitry of addiction. Brain Research (2014), http://dx.doi.org/10.1016/j.brainres.2014.11.005
SRO-8.7
(NIH)
Publications, databases, administrative records and/or public documents NIH staff review relevant publications, databases, administrative records, and public documents to confirm whether the data sources support the scope of funded research activities. The most common data sources are articles from peer-reviewed journals, as well as presentations and progress reports. Scientific journals use a process of peer review prior to publishing an article. Through this rigorous process, other experts in the author’s field or specialty critically assess a draft of the article, and the paper may be accepted, accepted with revisions, or rejected.

Specific data sources for this measure are provided below. A contact is listed if administrative records are included as a data source.

Cowart-Osborne M, Jackson M, Chege E, Baker E, Whitaker D, Self-Brown S. Technology-Based Innovations in Child Maltreatment Prevention Programs: Examples from SafeCare®. Social Sciences (Basel). 2014 Aug 15;3(3):427-440.

Forsberg S, Fitzpatrick KK, Darcy A, Aspen V, Accurso EC, Bryson SW, Agras S, Arnow KD, Le Grange D, Lock J. Development and evaluation of a treatment fidelity instrument for family-based treatment of adolescent anorexia nervosa. The International Journal of Eating Disorders; 2015 Jan;48(1):91-9.

Brookman-Frazee L, Stahmer A, Stadnick N, Chlebowski C, Herschell A, Garland AF. Characterizing the Use of Research-Community Partnerships in Studies of Evidence-Based Interventions in Children's Community Services. Adm Policy Ment Health. 2015 Jan 13. [Epub ahead of print]

Isaac C. Rhew, Eric C. Brown, J. David Hawkins, and John S. Briney. Sustained Effects of the Communities That Care System on Prevention Service System Transformation. American Journal of Public Health: March 2013, Vol. 103, No. 3, pp. 529-535.

Greenberg MT, Feinberg ME, Johnson LE, Perkins DF, Welsh JA, Spoth RL. Factors That Predict Financial Sustainability of Community Coalitions: Five Years of Findings from the PROSPER Partnership Project. Prevention Science; 2014 Apr 6. [Epub ahead of print]

Mercer, SH.; McIntosh, K; Strickland-Cohen, MK; Horner, RH. Measurement invariance of an instrument assessing sustainability of school-based universal behavior practices. School Psychology Quarterly, Vol 29(2), Jun 2014, 125-137.

Szonja Vamos, MS, Miriam Mumbi, RN, Ryan Cook, BA, Ndashi Chitalu, MD, Stephen Marshall Weiss, PhD, MPH, and Deborah Lynne Jones, PhD. Translation and sustainability of an HIV prevention intervention in Lusaka, Zambia. Transl Behav Med. Jun 2014; 4(2): 141–148. Published online Sep 11, 2013.
 

 

Office of the Assistant Secretary for Health (OASH)

Measure ID Data Source Data Validation
1.5
(OASH)
Commerce and Census The Commerce data are actual sales of tobacco, and the Census data provide the early adult population calculation.

 

Office of Medicare Hearings and Appeals (OMHA)

Measure ID Data Source Data Validation
1.1.5
(OMHA)
Appellate Climate Survey The most recent version of the survey was administered by a third party contractor using a stratified random sample of appellants whose cases were closed within fiscal year 2015. The survey was designed to collect appellant demographic information, overall satisfaction, satisfaction with hearing format, satisfaction with other aspects (e.g., scheduling, clarity of case processing documents, interaction with the ALJ team after scheduling and prior to the hearing, and use of the OMHA website), and possible predictors of satisfaction (e.g., case fully heard and considered).

 

Office of the National Coordinator for Health Information Technology (ONC)

Measure ID Data Source Data Validation
1.A.2, 1.B.4
(ONC)
Centers for Disease Control and Prevention, National Center for Health Statistics, National Ambulatory Medical Care Survey, Electronic Medical Record Supplement The NAMCS is nationally representative of office-based physicians. Historically, the response rate is approximately 68%. Beginning with survey year 2010, the survey allows ONC to evaluate trends in electronic health record adoption by region, provider specialty, and state. Estimates for FYs 2008-11 for this measure derive from the mail supplement to the NAMCS.
1.E.4
(ONC)
National Electronic Health Records Survey (NEHRS), which is a supplemental mail survey to the National Ambulatory Medical Care Survey (NAMCS). The NEHRS was formerly called the NAMCS EMR Supplement. ONC partially funds the supplement through interagency agreements with the CDC National Center for Health Statistics, which fields the broader survey. The NAMCS is nationally representative of office-based physicians. Historically, the response rate is approximately 68%. Beginning with survey year 2010, the survey allows ONC to evaluate trends in electronic health record adoption by region, provider specialty, and state. Estimates for FYs 2008-11 for this measure derive from the mail supplement to the NAMCS.
1.E.7
(ONC)
Hospital measures: American Hospital Association (AHA) Information Technology (IT) Supplement to the AHA Annual Survey, which ONC partially funds through cooperative agreement. Data are from the American Hospital Association (AHA) Information Technology (IT) Supplement to the AHA Annual Survey. Since 2008, ONC has partnered with the AHA to measure the adoption and use of health IT in U.S. hospitals.

The chief executive officer of each U.S. hospital was invited to participate in the survey regardless of AHA membership status. The person most knowledgeable about the hospital's health IT (typically the chief information officer) was requested to provide the information via a mail survey or secure online site. Non-respondents received follow-up mailings and phone calls to encourage response. The FY 2014 estimates are derived from the survey that was fielded from November 2014 to the end of February 2015.

 

Substance Abuse and Mental Health Services Administration (SAMHSA)

Measure ID Data Source Data Validation
2.3.56
(SAMHSA)
Data are collected through dosage forms and/or participant-level instruments using standardized items.  These data are collected and provided by grantees online through an electronic system called the Performance Management Reporting and Training System (PMRTS).  All grantees receive training in using the system and are provided technical assistance on the data to be collected.  These data are then cleaned and analyzed by an analytic contractor.  In FY 2012, the Data Collection Analysis and Reporting (DCAR) contract was awarded; it serves the data collection, cleaning, analysis, and reporting functions.  The data to be reported in August 2015 will come from a different contract, the Pep C contract, as part of CDP Phase 1 data collection and reporting.  FY 2013 data have been carefully collected, cleaned, analyzed, and reported by SAMHSA’s DCAR contract.  After data are entered into the PMRTS, the system uses automated programs to perform initial data cleaning (identifying outliers, missing data, etc.); the Data Management Team then reviews the data for completeness and accuracy.  Information on any data problems identified is transmitted through the use of "cleaning sheets" to the Government Project Officer (GPO) and the grantee to resolve.  The Data Management Team then makes any required edits to the files, following the extensive and detailed Uniform Coding Conventions. The edited files are then sent to SAMHSA staff and the Data Analysis Team for analysis and reporting.
2.3.61
(SAMHSA)
The number of calls answered is reported in the National Suicide Prevention LifeLine Monthly Report.  Specialists in information technology at the National Suicide Prevention LifeLine evaluation center validate phone records received from Sprint to determine the number of calls received and answered at 1-800-273-TALK.
3.2.02a
(SAMHSA)
TRAC on-line data reporting and collection system.  All TRAC data are automatically checked as they are input into the TRAC system.  Validation and verification checks are run as data are being entered.  The system will not allow any data that are out of range or violate skip patterns to be saved into the TRAC database.  The NCTSI began using a web-based GPRA data collection system, the Transformation Accountability (TRAC) System, in FY 2008; TRAC is the only source for NCTSI's performance monitoring data.
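
As an illustration of entry-time range and skip-pattern checks like those described, consider the sketch below; the field names, age range, and skip rule are hypothetical, not the actual TRAC specifications.

    class ValidationError(Exception):
        pass

    def validate_entry(entry):
        # Range check: reject implausible ages before the record is saved.
        if not 0 <= entry.get("age", -1) <= 120:
            raise ValidationError("age out of range")
        # Skip pattern: follow-up date applies only when services were received.
        if entry.get("received_services") == "no" and entry.get("followup_date"):
            raise ValidationError("followup_date must be blank when no services were received")
        return entry  # only validated entries would be saved

    # This record violates the skip pattern, so it would not be saved:
    try:
        validate_entry({"age": 34, "received_services": "no",
                        "followup_date": "2016-01-15"})
    except ValidationError as err:
        print(err)
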
3.2.16
(SAMHSA)
TRAC For the data collected in TRAC, all data are automatically checked as they are input to TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.2.26
(SAMHSA)
TRAC on-line data reporting and collection system.  For the TRAC data, all TRAC data are automatically checked as they are input into TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.2.37
(SAMHSA)
TRAC All TRAC data are automatically checked as they are input into the TRAC system. Validation and verification checks are run as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the TRAC database
3.2.50
(SAMHSA)
The data will be collected and tabulated by hand on an ED524 form and then submitted to SAMHSA on an annual basis.  Grantees implement various forms of data validation as part of their local evaluations. To establish the accuracy and reliability of data used to measure outcome performance, local evaluators review the data, perform range checks, and otherwise assess concurrent validity by comparing data and measures.
3.4.02
(SAMHSA)
Data are collected through standard instruments and submitted through the TRAC on-line data reporting and collection system.  All TRAC data are automatically checked as they are input into TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.4.15
(SAMHSA)
This measure was never part of phase one in the CDP.  These data are from the mental health block grants and are hand-collected and reported by CMHS for input into TRAC and then PPTS. All TRAC data are automatically checked as they are input into TRAC. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.4.20
(SAMHSA)
Data are submitted annually to SAMHSA by States, which obtain the information from local human service agencies that provide services. SAMHSA's CMHS has developed additional error checks to screen data and contacts States and local providers concerning accuracy when data is reported outside expected ranges. CMHS has also issued guidance to all States and localities on data collection and monitors compliance with data collection through increased site visits to local PATH-funded agencies.
3.4.21
(SAMHSA)
Data are derived from standardized annual Program Performance Reports in which grantees estimate the potential number of individuals impacted through a pre-defined list of 7 possible interventions (e.g., group advocacy non-litigation, facility monitoring services, class litigation).  This measure was never part of the CDP. The information provided in the annual reports is checked for reliability during on-site PAIMI Program visits, annual reviews, and budget application reviews.
3.4.24
(SAMHSA)
Services Accountability Improvement System.  All data are automatically checked as they are input to SAIS. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
3.4.25
(SAMHSA)
Services Accountability Improvement System  All data are automatically checked as they are input to SAIS. Validation and verification checks are run on the data as they are being entered. The system will not allow any data that are out of range or violate skip patterns to be saved into the database.
4.4.10
(SAMHSA)
Data are collected from the contractors who manage the Treatment Locator and SAMHDA websites via standard tracking software measuring unique hits.  These numbers are provided to the COTRs via email at the end of each month and on January 2 of the next year.  Validation checks are reviewed at that time.  Data are maintained by the COTRs for each project.

 


 
