HMIS Data Quality Flashcards

1
Q

The overall utility of a dataset (or datasets) as a function of its ability
to be easily processed and analyzed for a database, data
warehouse, or data analytics system.

A

Data Quality

2
Q

It involves data rationalization and validation; often summarized as "fitness for use."

A

Data Quality

3
Q

Two techniques to check HMIS data accuracy:

A
  1. Lot Quality Assurance Sampling (LQAS)
  2. Routine Data Quality Assessment (RDQA)
4
Q

Aspects of Data Quality:

A
  • Accuracy
  • Completeness
  • Update Status
  • Relevance
  • Consistency
  • Reliability
  • Appropriate presentation
  • Accessibility
5
Q

Data collected and reported by HMIS is relevant to the
information needs of the health system for routine
monitoring of program performance.

A

Relevance

6
Q

Data is collected, transmitted, and processed within the
prescribed time and is available for making timely
decisions.

A

Timeliness

7
Q

Data that is compiled in databases and reporting forms is
accurate and reflects no inconsistency between what is in
the registers and what is in the databases/reporting
forms at facility level. Similarly, for data entered into
computers, there is no inconsistency between the
data in the reporting forms and the computer files.

A

Accuracy

8
Q

At the service delivery point, it means that all the relevant
elements in a patient/client register are filled in.
At the health administrative unit, data completeness has
two meanings.

A

Completeness

9
Q

Common Sources of Data Errors in HMIS reports: Data items for whole months missing.

A

Missing Data

10
Q

Common Sources of Data Errors in HMIS reports: Multiple counting of a fully immunized child.

A

Duplicate Data

11
Q

Common Sources of Data Errors in HMIS reports: When data collection tools are not used routinely, staff just fills in a likely-looking number.

A

Thumb-suck

12
Q

Common Sources of Data Errors in HMIS reports: A man being pregnant; low birth weight babies exceeding number of deliveries.

A

Unlikely values for variable

13
Q

Common Sources of Data Errors in HMIS reports: 100 births in a month when there are only 2,000 women of childbearing age.

A

Contradictions between variables
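
Note: the error types in the two cards above lend themselves to automatic screening. Below is a minimal Python sketch of such checks; the field names, values, and thresholds are invented for illustration and are not HMIS standards.

  # Hypothetical monthly facility report; all field names and values are invented.
  report = {
      "deliveries": 40,
      "low_birth_weight_babies": 55,     # contradiction: exceeds deliveries
      "births": 100,
      "women_childbearing_age": 2000,    # implausibly high birth rate for this population
      "male_pregnancies": 1,             # unlikely value: should always be 0
  }

  issues = []

  # Unlikely values for a variable (e.g. a man being pregnant).
  if report["male_pregnancies"] > 0:
      issues.append("Unlikely value: male pregnancies reported")

  # Contradictions between variables (one count exceeding a related count).
  if report["low_birth_weight_babies"] > report["deliveries"]:
      issues.append("Contradiction: low birth weight babies exceed deliveries")

  # Implausible rate given the catchment population (rough screening rule only).
  monthly_birth_rate = report["births"] / report["women_childbearing_age"]
  if monthly_birth_rate > 0.02:   # threshold is an assumption, not an HMIS standard
      issues.append(f"Implausible births: {report['births']} for "
                    f"{report['women_childbearing_age']} women of childbearing age")

  for issue in issues:
      print(issue)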

14
Q

Common Sources of Data Errors in HMIS reports: Mistakes in adding

A

Calculation Errors

15
Q

Common Sources of Data Errors in HMIS reports: Data is wrongly entered into the computer

A

Typing Error

16
Q

Common Sources of Data Errors in HMIS reports: "TB cured" recorded in place of "treatment completed."

A

Capture in wrong box

17
Q

It won't have any added value in
monitoring the program performance. It only adds
burden on data collectors.

A

Data is not relevant

18
Q

It will not help us make timely
decisions to fix the problem.

A

Data is not timely

19
Q

We will not be able to see the
complete picture of performance at different levels.

A

Data is not complete

20
Q

Decision making based on evidence will be
hampered.

A

Data is not accurate

21
Q

Tool that allows the use of small random samples
to distinguish between different groups of data with high and low data quality.

A

Lot Quality Assurance Sampling (LQAS)

22
Q

A technique useful for assessing whether the
desired level of data accuracy has been achieved,
by comparing data in the relevant record forms (i.e.,
registers or tallies) with the HMIS reports.

A

Lot Quality Assurance Sampling (LQAS)
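
Note: as a rough illustration of the idea, the Python sketch below draws a small random sample of facility-month values and compares the register recount with the HMIS report. The records, sample size, and allowed-error threshold are invented; a real LQAS exercise uses pre-set sample sizes and decision rules from LQAS tables.

  import random

  # Hypothetical facility-month records: value recounted from the register
  # vs. value appearing in the HMIS report. All numbers are invented.
  records = [
      {"facility": f"F{i:02d}", "register": random.randint(20, 60)} for i in range(1, 31)
  ]
  for r in records:
      # Simulate occasional reporting errors for the demonstration.
      r["report"] = r["register"] + random.choice([0, 0, 0, 0, 2, -3])

  # Draw a small random sample (sample size here is illustrative, not an LQAS table value).
  sample = random.sample(records, 12)

  # Count sampled records where the report does not match the register.
  errors = sum(1 for r in sample if r["report"] != r["register"])

  # Decision rule: the allowable number of errors would come from an LQAS table;
  # the threshold below is a placeholder for the example.
  ALLOWED_ERRORS = 2
  verdict = "acceptable" if errors <= ALLOWED_ERRORS else "unacceptable"
  print(f"{errors} mismatches in a sample of {len(sample)} -> data accuracy {verdict}")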

23
Q

Simplified version of the Data Quality Audit,
which allows programs and projects to verify
and assess the quality of their reported data.

A

Routine Data Quality Assessment Tool

24
Q

It aims to strengthen their data management and
reporting systems

A

Routine Data Quality Assessment Tool

25
Q

Objectives of the RDQA:

A
  1. Verify rapidly the quality of reported data for key indicators at selected sites
  2. Implement corrective measures with action plans to strengthen the data management and reporting system
  3. Monitor capacity and performance of the data management and reporting system to produce quality data

26
Q

The analysis of data capture statistics (metadata) that provide insight into the quality of the data and help to identify data quality issues.

A

Profiling

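Note: a minimal profiling sketch in Python (standard library only); it computes simple capture statistics (record counts, missing values, distinct values, numeric ranges) over invented HMIS-style rows.

  from collections import Counter

  # Hypothetical monthly rows extracted from an HMIS database; None marks a missing value.
  rows = [
      {"facility": "F01", "month": "2024-01", "anc_visits": 120},
      {"facility": "F02", "month": "2024-01", "anc_visits": None},
      {"facility": "F03", "month": "2024-1",  "anc_visits": 95},   # inconsistent month format
      {"facility": "F01", "month": "2024-02", "anc_visits": -4},   # out-of-range value
  ]

  for field in ("facility", "month", "anc_visits"):
      values = [r[field] for r in rows]
      missing = sum(v is None for v in values)
      distinct = Counter(v for v in values if v is not None)
      print(f"{field}: {len(values)} records, {missing} missing, "
            f"{len(distinct)} distinct values")

  # Profile numeric ranges separately to reveal impossible values such as negatives.
  numeric = [r["anc_visits"] for r in rows if r["anc_visits"] is not None]
  print("anc_visits range:", min(numeric), "to", max(numeric))
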
27
Q

The decomposition of text fields into component parts and the formatting of values into consistent layouts based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules, and knowledge bases of values and patterns.

A

Parsing and standardization

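Note: a small sketch of parsing and standardization applied to a hypothetical facility-address field; the layout rule, alias table, and postal format are assumptions made for the example.

  import re

  # User-defined rule: standardize district names to a canonical spelling.
  DISTRICT_ALIASES = {"n. district": "North District", "north dist.": "North District"}

  def parse_address(raw: str) -> dict:
      """Decompose 'facility name, district, postal code' into component parts
      and format each part consistently."""
      name, district, postal = [part.strip() for part in raw.split(",")]
      district = DISTRICT_ALIASES.get(district.lower(), district.title())
      postal = re.sub(r"\D", "", postal)          # keep digits only
      return {"facility": name.title(), "district": district, "postal_code": postal}

  print(parse_address("  riverside CLINIC , n. district,  PC-10234 "))
  # -> {'facility': 'Riverside Clinic', 'district': 'North District', 'postal_code': '10234'}
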
28
Q

The modification of data values to meet domain restrictions, integrity constraints, or other business rules that define when the quality of data is sufficient for the organization.

A

Generalized Cleansing

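Note: a sketch of generalized cleansing against a couple of example domain restrictions and integrity constraints; the rules and field names are assumptions, not standard HMIS business rules.

  # Hypothetical domain rule: allowed sex codes.
  VALID_SEX = {"M", "F"}

  def cleanse(record: dict) -> dict:
      """Modify values so the record meets the example's domain restrictions."""
      cleaned = dict(record)
      # Map free-text sex values onto the allowed code list.
      sex = str(cleaned.get("sex", "")).strip().upper()[:1]
      cleaned["sex"] = sex if sex in VALID_SEX else "U"        # 'U' = unknown, by our example rule
      # Integrity constraint: counts cannot be negative.
      if cleaned.get("anc_visits", 0) < 0:
          cleaned["anc_visits"] = None                         # flag for follow-up instead
      return cleaned

  print(cleanse({"sex": "female", "anc_visits": -3}))
  # -> {'sex': 'F', 'anc_visits': None}
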
29
Q

Identifying, linking, or merging related entries within or across sets of data.

A

Matching

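Note: a sketch of matching using name similarity from Python's standard library; the client records and the similarity threshold are invented for the example.

  from difflib import SequenceMatcher

  # Hypothetical client entries from two registers that may refer to the same person.
  register_a = [{"id": "A1", "name": "Amina Yusuf", "dob": "1990-05-01"}]
  register_b = [{"id": "B7", "name": "Amina Yussuf", "dob": "1990-05-01"},
                {"id": "B9", "name": "John Okello",  "dob": "1985-11-23"}]

  def similar(a: str, b: str) -> float:
      return SequenceMatcher(None, a.lower(), b.lower()).ratio()

  # Link entries that share a date of birth and have very similar names.
  for a in register_a:
      for b in register_b:
          if a["dob"] == b["dob"] and similar(a["name"], b["name"]) > 0.85:  # threshold assumed
              print(f"Probable match: {a['id']} <-> {b['id']} ({a['name']} / {b['name']})")
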
30
Q

Deploying controls to ensure that data continues to conform to business rules that define data quality for the organization.

A

Monitoring

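Note: a sketch of monitoring, with business rules expressed as reusable checks applied to each incoming reporting batch; the rules and reports are illustrative assumptions.

  # Business rules expressed as (description, predicate) pairs; all rules are examples.
  RULES = [
      ("counts must be non-negative", lambda r: all(v >= 0 for v in r["counts"].values())),
      ("report month is filled in",   lambda r: bool(r.get("month"))),
  ]

  def monitor(batch):
      """Check every incoming report against the rules and log violations."""
      for report in batch:
          for description, rule in RULES:
              if not rule(report):
                  print(f"{report['facility']}: rule violated - {description}")

  monitor([
      {"facility": "F01", "month": "2024-03", "counts": {"anc_visits": 88}},
      {"facility": "F02", "month": "",        "counts": {"anc_visits": -1}},
  ])
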
31
Q

Enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors). In addition, these products provide a range of related functional capabilities that are not unique to this market but which are required to execute many of the data quality core functions, or for specific data quality applications.

A

Enrichment

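Note: a sketch of enrichment, appending attributes from an assumed external reference table (district population and expected births) to internally held facility reports; all names and figures are invented.

  # Internal facility reports and an external reference table; all values are invented.
  reports = [{"facility": "F01", "district": "North District", "births": 40}]
  external_population = {"North District": {"population": 52000, "expected_births": 150}}

  # Append the external attributes to each internal record.
  for report in reports:
      extra = external_population.get(report["district"], {})
      report.update(extra)
      if "expected_births" in report:
          report["reporting_coverage"] = round(report["births"] / report["expected_births"], 2)

  print(reports[0])
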
32
Q

  • Problem-solving method that identifies the root causes of the problems or events instead of simply addressing the obvious symptoms
  • Aim is to improve the quality of the products by using systematic ways in order to be effective (Bowen, 2011)
  • A tool for identifying prevention strategies
  • Identification and analysis of factors that are contributing to a specific outcome or problem = QUALITY IMPROVEMENT

A

Root Cause Analysis

33
Q

It aims to find the various modes of failure within a system and addresses a set of questions for each failure mode during execution.

A

Failure Mode and Effects Analysis (FMEA)

34
Q

  • The idea that by doing 20% of the work one can generate 80% of the advantage of doing the entire job
  • Finding the changes that will give the biggest benefits
  • Useful where many possible courses of action are competing for attention
  • Lays out the potential causes in a bar graph and tracks the cumulative percentage in a line graph to the top of the chart

A

Pareto Analysis

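Note: a sketch of the Pareto calculation behind the chart: tally the causes, sort them, and track the cumulative percentage to find the vital few. The error tallies are invented.

  from collections import Counter

  # Hypothetical tally of data-error causes found during a review.
  errors = Counter({"missing data": 42, "typing error": 25, "calculation error": 18,
                    "duplicate data": 8, "capture in wrong box": 5, "thumb-suck": 2})

  total = sum(errors.values())
  cumulative = 0
  print(f"{'cause':<22}{'count':>6}{'cum %':>8}")
  for cause, count in errors.most_common():
      cumulative += count
      share = 100 * cumulative / total
      marker = "  <- vital few" if share <= 80 else ""
      print(f"{cause:<22}{count:>6}{share:>7.1f}%{marker}")
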
35
Q

  • Used in risk and safety analysis
  • Uses Boolean logic to determine the root cause of an undesirable event
  • Upside-down tree:
    o Undesirable result = top of the tree
    o Potential causes = down the tree

A

Fault Tree Analysis

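Note: a toy illustration of the Boolean logic in a fault tree; the basic events and gate structure are invented for the example and are not taken from the flashcards.

  # Basic events (leaf causes) and whether they occurred; values are illustrative.
  events = {
      "register_not_updated": True,
      "tally_sheet_lost": False,
      "no_trained_staff": True,
      "computer_failure": False,
  }

  # Gate structure for the top event "monthly report inaccurate":
  # (register_not_updated OR tally_sheet_lost) AND (no_trained_staff OR computer_failure)
  data_capture_failed = events["register_not_updated"] or events["tally_sheet_lost"]
  no_capacity_to_check = events["no_trained_staff"] or events["computer_failure"]
  report_inaccurate = data_capture_failed and no_capacity_to_check

  print("Top event occurs:", report_inaccurate)
  print("Contributing causes:", [name for name, occurred in events.items() if occurred])
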
36
Q

  • Shows the categorized causes and sub-causes of a problem
  • Useful in grouping causes (measurements, methods, materials, environment, machines) into categories
  • Categories should be the 4 Ms (manufacturing), the 4 Ss (service), or the 8 Ps (also service), depending on the industry
  • 4 Ms: Manpower, Machine, Material, Method

A

Fishbone/Ishikawa/Cause and Effect Diagram

37
Q

  • Breaks a problem down to its root cause by assessing a situation using priorities and orders of concern for specific issues
  • Various decisions are outlined
  • A potential problem analysis is made to ensure that the actions recommended are sustainable

A

Kepner-Tregoe Technique

38
Q

Diagnoses the causes of recurrent problems in three phases:
  A. DISCOVER – data gathering and analysis of findings
  B. INVESTIGATE – creation of a diagnostic plan and identification of the root cause through careful analysis of the diagnostic data
  C. FIX – fixing the problem and monitoring to confirm and validate that the correct root cause was identified

A

Rapid Problem Resolution (RPR Problem Diagnosis)

39
Q

  • An organization's values, norms, and practices with regard to the management and use of information; it affects outcomes of information use (Choo, Bergeron, Detlor and Heaton, 2008)
  • Determined by mission, history, leadership, employee traits, industry, and national culture
  • Sets of identified behaviors and values can account for significant proportions of the variance in information use outcomes
  • Management should continuously work on maintaining and improving the quality of data and information used in daily operations

A

Information Culture