Chapter 10 Flashcards
What is data governance?
- High-level organizational groups and processes overseeing data stewardship across the organization
What is a data steward?
A person responsible for ensuring that organizational applications properly support the organization’s data quality goals
What are the requirements for data governance to be successful?
- Sponsorship from both senior management and business units
- A data steward manager to support, train, and coordinate data stewards
- Data stewards for different business units, subjects, and/or source systems
- A governance committee to provide data management guidelines and standards
Why is data quality important?
If the data are bad, the business fails. Period.
- GIGO - Garbage in, garbage out
- Sarbanes-Oxley (SOX) compliance by law sets data and metadata quality standards
What is the purpose of data quality?
- Minimize IT project risk
- Make timely business decisions
- Ensure regulatory compliance
- Expand customer base
What are the characteristics of quality data?
- Uniqueness
- Accuracy
- Consistency
- Completeness
- Timeliness
- Currency
- Conformance
- Referential Integrity
What are some causes of poor data quality?
- External data sources (Lack of control over data quality)
- Redundant data storage and inconsistent metadata (Proliferation of databases with uncontrolled redundancy and metadata)
- Data entry (Poor data capture controls)
- Lack of organizational commitment (Not recognizing poor data quality as an organizational issue)
What are some steps that can be taken to improve data quality?
- Get business buy-in
- Perform data quality audit
- Establish data stewardship program
- Improve data capture processes
- Apply modern data management principles and technology
- Apply total quality management (TQM) practices
How can you create business buy-in?
- Executive sponsorship
- Building a business case
- Prove a return on investment (ROI)
- Avoidence of cost
- Avoidance of opportunity loss
What do you do in a data quality audit?
- Statistically profile all data files
- Document the set of values for all fields
- Analyze data patterns (distribution, outliers, frequencies)
- Verify whether controls and business rules are enforced
- Use specialized data profiling tools
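The audit steps above can be sketched as a tiny field profiler. This is a minimal sketch, not a real profiling tool; the `customers` records and the `state` field are hypothetical examples:

```python
from collections import Counter

def profile(records, field):
    """Statistically profile one field: counts, nulls, distinct values, top frequencies."""
    values = [r.get(field) for r in records]
    freq = Counter(v for v in values if v is not None)
    return {
        "count": len(values),
        "nulls": values.count(None),
        "distinct": len(freq),
        "top": freq.most_common(3),  # frequent values often reveal defaults or junk entries
    }

customers = [
    {"state": "TX"}, {"state": "TX"}, {"state": "tx"},  # inconsistent casing
    {"state": None},                                     # missing value
]
report = profile(customers, "state")
```

Even this small report surfaces two audit findings: a null value and inconsistent casing ("TX" vs. "tx") that inflates the distinct count.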
What are the roles of a data steward?
- Oversight of data stewardship program
- Manage data subject area
- Oversee data definitions
- Oversee production of data
- Oversee use of data
How can you improve data capture processes?
- Automate data entry as much as possible
- Where manual entry is necessary, have users select from preset options
- Use trained operators when possible
- Follow good user interface design principles
- Immediate data validation for entered data
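Two of the practices above (preset options and immediate validation) can be sketched together. A minimal sketch; the field names and valid values are hypothetical:

```python
VALID_STATES = {"TX", "CA", "NY"}  # hypothetical preset options instead of free text

def validate_entry(entry):
    """Immediate validation: reject bad values at capture time, not downstream."""
    errors = []
    if entry.get("state") not in VALID_STATES:
        errors.append("state must be one of the preset options")
    zip_code = entry.get("zip", "")
    if not (zip_code.isdigit() and len(zip_code) == 5):
        errors.append("zip must be exactly 5 digits")
    return errors
```

Rejecting "Texas" or a malformed ZIP at entry time is far cheaper than cleansing it later in ETL.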
What are some software tools for analyzing and correcting data quality problems?
- Pattern matching
- Fuzzy logic
- Expert systems
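Fuzzy logic for duplicate detection can be sketched with the standard library's `difflib`. The threshold of 0.85 is an arbitrary illustrative choice, not a standard value:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Fuzzy string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_probable_duplicate(name1, name2, threshold=0.85):
    """Flag likely duplicates that exact matching would miss."""
    return similarity(name1, name2) >= threshold
```

"Jon Smith" and "John Smith" score well above the threshold even though an exact comparison would treat them as distinct customers.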
Besides software tools, what other modern tools can be applied to data management?
- Sound data modeling and database design
What does TQM stand for?
Total Quality Management
What are the TQM Principles?
- Defect prevention
- Continuous Improvement
- Use of enterprise data standards
What are the components of a balanced focus?
- Customer
- Product/Service
- Strong foundation of measurement
What is master data management (MDM)?
Disciplines, technologies, and methods to ensure the currency, meaning, and quality of reference data within and across various subject areas
What are the three main architectures of MDM?
- Identity registry
- Integration hub
- Persistent
What is an identity registry in MDM?
Master data remains in source systems; registry provides applications with location
What is an integration hub in MDM?
Data changes broadcast through central service to subscribing databases
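The integration-hub broadcast can be sketched as a simple publish/subscribe loop. The `crm_copy` and `billing_copy` databases are hypothetical subscribers:

```python
class IntegrationHub:
    """Central service that broadcasts master-data changes to subscribing databases."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, change):
        for callback in self.subscribers:
            callback(change)

crm_copy, billing_copy = {}, {}
hub = IntegrationHub()
hub.subscribe(lambda c: crm_copy.update({c["id"]: c}))
hub.subscribe(lambda c: billing_copy.update({c["id"]: c}))
hub.publish({"id": 1, "name": "Acme Corp"})
```

One published change reaches every subscribing system, which is how the hub keeps the copies consistent without a single persistent golden record.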
What is the persistent approach in MDM?
Central “golden record” maintained; all applications have access. Requires applications to push data. Prone to data duplication.
What does data integration do?
Creates a unified view of business data
Other possibilities:
- Application integration
- Business process integration
- User interaction integration
In data integration, what does any approach require?
Change data capture (CDC)
What does change data capture do?
Indicates which data have changed since previous data integration activity
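One common way to implement CDC is a timestamp watermark: keep the time of the last integration run and select only rows modified after it. A minimal sketch with hypothetical rows:

```python
from datetime import datetime

def changed_since(rows, last_run):
    """Change data capture via watermark: rows modified after the last integration run."""
    return [r for r in rows if r["modified_at"] > last_run]

rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 3, 1)},
]
delta = changed_since(rows, last_run=datetime(2024, 2, 1))
```

Only row 2 is returned, so the downstream integration step moves just the changed data instead of the full table.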
What are three techniques for data integration?
- Consolidation (ETL)
- Data federation (EII)
- Data propagation (EAI and EDR)
What is consolidation (ETL) in data integration?
- Consolidating all data into a centralized database (like a data warehouse)
What is data federation (EII) in data integration?
- Provides a virtual view of data without actually creating one centralized database
What is data propagation (EAI and EDR) in data integration?
- Duplicate data across databases, with near real-time delay
Comparison of Consolidation, Federation, and Propagation forms of data integration
In the reconciled data layer, what is Typical operational data?
- Transient - not historical
- Not normalized (perhaps due to denormalization for performance)
- Restricted in scope - not comprehensive
- Sometimes poor quality - inconsistencies and errors
After ETL what characteristics should data have?
- Detailed - not summarized yet
- Historical - periodic
- Normalized - 3rd normal form or higher
- Comprehensive - enterprise-wide perspective
- Timely - data should be current enough to assist decision-making
- Quality controlled - accurate with full integrity
What does ETL stand for?
Extract, Transform, Load
What is the ETL process?
- Capture/Extract
- Scrub or data cleansing
- Transform
- Load and Index
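The four ETL steps can be sketched end to end. This is a toy pipeline, not a real ETL tool; the customer fields and the dict-based "warehouse" are hypothetical:

```python
def extract(source):
    """Capture/extract: snapshot a chosen subset of the source rows."""
    return [r for r in source if r.get("active")]

def scrub(rows):
    """Scrub/cleanse: trim whitespace, standardize casing, drop rows missing a key."""
    return [
        {**r, "name": r["name"].strip().title()}
        for r in rows if r.get("id") is not None
    ]

def transform(rows):
    """Transform: reshape from the operational format to the warehouse format."""
    return [{"customer_id": r["id"], "customer_name": r["name"]} for r in rows]

def load(rows, warehouse):
    """Load and index: store rows keyed by customer_id."""
    for r in rows:
        warehouse[r["customer_id"]] = r
    return warehouse

source = [
    {"id": 1, "name": "  acme corp ", "active": True},
    {"id": None, "name": "bad row", "active": True},   # fails cleansing
    {"id": 2, "name": "old co", "active": False},       # outside the extract subset
]
warehouse = load(transform(scrub(extract(source))), {})
```

Of the three source rows, only one survives all four steps, which mirrors how each ETL stage narrows and improves the data.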
When is the ETL process done?
- During initial load of Enterprise Data Warehouse (EDW)
- During subsequent periodic updates to EDW
When is mapping and metadata management completed?
It’s a design step prior to performing ETL
What is mapping?
Required data are mapped to data sources
(Graphical or matrix representations)
What information should mapping provide?
- Explanations of reformatting, transformations, and cleansing actions to be done
- Process flow involving tasks and jobs
What makes good metadata for mapping?
- Identifies data sources
- Recognizes same data in different systems
- Represents process flow steps
What is static extract?
Capturing a snapshot of the source data at a point in time
What is an incremental extract?
Capturing changes that have occurred since the last static extract
What is capture/extract?
Obtaining a snapshot of a chosen subset of the source data for loading into the data warehouse
Visual of data reconciliation in ETL
What is scrub/cleanse?
Uses pattern recognition and AI techniques to upgrade data quality
What are you looking for when fixing errors?
- Misspellings, erroneous dates, incorrect field usage, mismatched addresses, missing data, duplicate data, inconsistencies
What are some other things to look for in scrub/cleanse?
Decoding, reformatting, time stamping, conversion, key generation, merging, error detection/logging, locating missing data
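Two of these cleansing tasks (reformatting and key generation with merging) can be sketched briefly. The date formats and match-key fields are illustrative assumptions:

```python
from datetime import datetime

def standardize_date(value):
    """Reformatting: accept several source date formats, emit ISO 8601."""
    for fmt in ("%m/%d/%Y", "%Y-%m-%d", "%d %b %Y"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    return None  # error detection: flag for logging rather than guessing

def merge_duplicates(rows):
    """Key generation + merging: build a match key and keep the first occurrence."""
    seen = {}
    for r in rows:
        key = (r["name"].lower().strip(), r["zip"])
        seen.setdefault(key, r)
    return list(seen.values())
```

A real cleansing tool would add fuzzy matching and survivorship rules, but the shape is the same: normalize, key, merge, and log what cannot be fixed.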
What does transform mean?
Convert data from format of operational system to format of data warehouse
What is at the record-level?
- Selection - data partitioning
- Joining - data combining
- Aggregation - data summarization
What is at the field level?
- Single-field - from one field to one field
- Multi-field - from many fields to one, or one field to many
What is load/index?
Place transformed data into the warehouse and create indexes
What is refresh mode?
Bulk rewriting of target data at periodic intervals
What is update mode?
Only changes in source data are written to data warehouse
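The two load modes can be contrasted in a few lines. A minimal sketch using a dict as a stand-in warehouse:

```python
def refresh(warehouse, source_snapshot):
    """Refresh mode: bulk-rewrite the target from a full snapshot at periodic intervals."""
    warehouse.clear()
    warehouse.update(source_snapshot)
    return warehouse

def update(warehouse, changes):
    """Update mode: write only the changed source rows to the target."""
    warehouse.update(changes)
    return warehouse
```

Refresh discards stale rows automatically because it rewrites everything; update is cheaper per run but depends on change data capture to find the deltas.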
What are the four record level transformation functions?
- Selection
- Joining
- Normalization
- Aggregation
What is the process of partitioning data according to predefined criteria?
Selection
What is the process of combining data from various sources into a single table or view?
Joining
What is the process of decomposing relations with anomalies to produce smaller, well-structured relations?
Normalization
What is the process of transforming data from detailed to summary level?
Aggregation
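Three of the record-level functions (selection, joining, aggregation) can be sketched on one hypothetical order table:

```python
# Hypothetical order rows used to illustrate each record-level function.
orders = [
    {"id": 1, "region": "EU", "amount": 100},
    {"id": 2, "region": "US", "amount": 250},
    {"id": 3, "region": "EU", "amount": 50},
]
regions = {"EU": "Europe", "US": "United States"}

# Selection: partition data according to predefined criteria.
eu_orders = [o for o in orders if o["region"] == "EU"]

# Joining: combine data from two sources into a single view.
joined = [{**o, "region_name": regions[o["region"]]} for o in orders]

# Aggregation: transform detailed rows to summary level.
total_by_region = {}
for o in orders:
    total_by_region[o["region"]] = total_by_region.get(o["region"], 0) + o["amount"]
```

Normalization, the fourth function, is a schema-design step (decomposing relations) rather than a row-by-row operation, so it is omitted here.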
What is basic representation in single field transformation?
Translates data from old form to new form
What is algorithmic transformation in single-field transformation?
Uses a formula or logical expression
What is table lookup in single-field transformation?
Uses a separate table keyed by source record code
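A table lookup is just a code-to-value mapping consulted during transformation. The status codes here are hypothetical:

```python
# Table lookup: a separate table keyed by source record code.
STATUS_LOOKUP = {"A": "Active", "I": "Inactive", "P": "Pending"}

def decode_status(code):
    """Translate a coded source field into its decoded warehouse value."""
    return STATUS_LOOKUP.get(code, "Unknown")
```

Routing unknown codes to a default like "Unknown" (instead of failing) lets the load continue while the bad codes are logged for review.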
What is multi-field transformation?
Converting many source fields to one target field
or
Converting one source field to many target fields
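Both directions of multi-field transformation can be sketched briefly; the address and name fields are hypothetical:

```python
def many_to_one(row):
    """Multi-field: several source fields combined into one target field."""
    return f'{row["street"]}, {row["city"]} {row["zip"]}'

def one_to_many(full_name):
    """Multi-field: one source field split into several target fields."""
    first, _, last = full_name.partition(" ")
    return {"first_name": first, "last_name": last}
```

Combining address parts into a mailing label is many-to-one; splitting a full name into first and last name is one-to-many.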