Measure Phase Overview Flashcards
Measure Phase
Second phase of DMAIC
- Main activity is to define the baseline
- We get the ‘real story’ behind the current state by gathering data and interpreting what the current process is really capable of
- Team checks how the process is performing against customer expectations and the CTQs identified in the Define phase
- Helps understand the extent of the problem with the help of data
Goals of Measure Phase
- Establish the baseline performance of the process
- Identify process performance indicators
- Develop a data collection plan and then collect data
- Validate the measurement system
- Determine the process capability
- The Measure phase typically takes 2 to 3 weeks, depending on the project inputs
- Involvement of all relevant stakeholders is key to getting quality data
Measure Phase Basic Tools
- Process Map
- Value Stream Mapping
- Spaghetti Diagram
- Cause and Effect Matrix
Spaghetti Diagram
- A visual representation using a continuous flow line tracing the path of an item or activity through a process. The continuous flow line enables process teams to identify redundancies in workflow and opportunities to expedite process flow.
Data Collection
- The Measure phase is all about collecting as much data as possible to get the actual picture of the problem. Data must be accurate and precise.
Data Types
Data: a set of values of qualitative or quantitative variables. It may be numbers, measurements, observations, or even just descriptions of things. Below are the two types of quantitative data:
Discrete Data: aka attribute data. The data is discrete if the measurements are integers or counts. E.g. number of customer complaints, weekly defect data etc.
Continuous Data: The data is continuous if the measurement takes on any value, usually within some range. E.g. stack height, distance, cycle time, etc.
Discrete Data aka Attribute Data
- Count - Ex. counts of errors
- Binary data - data that can have only one of two values
- E.g. On-time delivery (yes/no); Acceptable product (pass/fail)
- Attribute-Nominal - The “data” are names or labels. There is no intrinsic reason to arrange them in any particular order or to make a statement about any quantitative difference between them
- E.g. In a company: Dept A, Dept B, Dept C
- E.g. In a shop: Machine 1, Machine 2, Machine 3
- E.g. Types of transportation: boat, train, plane
- Attribute-Ordinal - The names or labels represent some value inherent in the object or item (so there is an obvious order to the labels)
- E.g. On product performance: excellent, very good, good, fair, poor
- E.g. Customer survey: strongly agree, agree, disagree, strongly disagree
- *Though ordinal scales have a defined sequence, they do not imply anything about the degree of difference between the labels (that is, we can’t assume that “excellent” is twice as good as “very good”)
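To make the distinction concrete, here is a minimal sketch using pandas (the library choice is an assumption; the labels are carried over from the examples above). Ordinal labels support order-aware operations, nominal labels only support counting:

```python
import pandas as pd

# Ordinal: the labels carry an inherent order, so order-aware
# operations like min/max and sorting are meaningful
responses = pd.Categorical(
    ["agree", "strongly agree", "disagree", "agree"],
    categories=["strongly disagree", "disagree", "agree", "strongly agree"],
    ordered=True,
)
print(responses.min(), responses.max())  # disagree, strongly agree

# Nominal: department labels have no intrinsic order; only counting makes sense
depts = pd.Categorical(["Dept A", "Dept C", "Dept B", "Dept A"])
print(depts.value_counts())
```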
Coding Data
- Sometimes it is more efficient to code data by adding, subtracting, multiplying, or dividing by a factor
Types of Data Coding
Substitution: ex. replace measurements taken in 1/8ths of an inch with +/- integer deviations from the center value
Truncation: ex. for the data set 0.5540, 0.5542, 0.5547, you might just remove the common 0.554 portion and work with 0, 2, 7
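A minimal sketch of both coding approaches in Python (the truncation values follow the example above; the 1/8th-inch readings and the 1.0-inch center value are hypothetical):

```python
# Truncation: drop the common 0.554 portion and keep the varying digits
measurements = [0.5540, 0.5542, 0.5547]
coded_trunc = [round((x - 0.554) * 10_000) for x in measurements]
print(coded_trunc)  # [0, 2, 7]

# Substitution: express 1/8th-inch readings as integer deviations from
# a center value of 1.0 inch, e.g. 1.125 -> +1, 0.875 -> -1
readings = [1.125, 1.0, 0.875, 1.25]
coded_sub = [round((x - 1.0) / 0.125) for x in readings]
print(coded_sub)  # [1, 0, -1, 2]
```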
Data Collection Plan
- A useful tool to focus your data collection efforts; a directed approach helps avoid locating and measuring data just for the sake of doing so.
- Identify data collection goals
- Develop operational definitions
- Create a sampling plan
- Select and validate data collection methods
- Plan for and begin collecting data
Data collection form: a way of recording the approach to obtaining the data needed to perform the analysis. Data should be recorded by trained operators, with a calibrated instrument, and in a standard data collection form
- Checklist - used to perform repetitive activities, to check a list of requirements, or to collect data in an orderly and systematic manner; it makes systematic checks of activities or products, ensuring the operator does not forget anything important
- Check Sheet - a structured, well-prepared form for collecting and analyzing data, consisting of a list of items and some indication of how often each item occurs; collecting data in a standard form helps Six Sigma teams solve problems and make better decisions
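As an illustration, a check sheet boils down to tallying how often each predefined item occurs. A minimal sketch in Python (the defect categories are hypothetical):

```python
from collections import Counter

# Each observed occurrence is recorded against a predefined list of items
observed = ["scratch", "dent", "scratch", "misalignment", "scratch", "dent"]
tally = Counter(observed)

# Print a simple tally-mark view, most frequent first
for item, count in tally.most_common():
    print(f"{item:<15} {'|' * count} ({count})")
```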
Measurement System Analysis (MSA)
- Experimental and mathematical method of determining how much the variation within the measurement process contributes to overall process variability
Some Basic Factors:
- Accuracy
- Precision
- Gage R&R
- Repeatability
- Reproducibility
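The sketch below illustrates the idea behind repeatability and reproducibility with a rough variance breakdown. It is a simplified illustration with hypothetical measurements, not the standard AIAG average-and-range or ANOVA Gage R&R calculation:

```python
import statistics

# trials[operator][part] = repeated measurements of the same part
trials = {
    "op1": {"part1": [10.1, 10.2, 10.1], "part2": [12.0, 12.1, 12.0]},
    "op2": {"part1": [10.3, 10.4, 10.3], "part2": [12.2, 12.3, 12.2]},
}

# Repeatability (equipment variation): pooled variance of the repeats
# within each operator/part cell (same operator, same part, same gage)
cell_vars = [statistics.variance(m)
             for parts in trials.values() for m in parts.values()]
repeatability = statistics.mean(cell_vars)

# Reproducibility (appraiser variation): variance of the operator means,
# i.e. how much operators differ from each other on average
op_means = [statistics.mean([x for m in parts.values() for x in m])
            for parts in trials.values()]
reproducibility = statistics.variance(op_means)

print(f"repeatability (EV):   {repeatability:.5f}")
print(f"reproducibility (AV): {reproducibility:.5f}")
```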
Six Sigma Statistics
Statistics: the science of gathering, classifying, arranging, analyzing, interpreting, and presenting numerical data, to make inferences about the population from the sample drawn.
- Two basic categories: Analytical (aka inferential) statistics and Descriptive (aka enumerative) statistics
- Basic Six Sigma Statistics - foundation for six sigma projects. Allows us to numerically describe the data that characterizes the process Xs and Ys
- Inferential statistics - aka analytical statistics; used to determine whether a particular sample or test outcome is representative of the population from which it was originally drawn
- Descriptive statistics - aka enumerative statistics; organizing and summarizing the data using numbers and graphs. Describes the characteristics of the sample or population (see the sketch after this list):
- Measure of frequency (count, percentage, frequency)
- Measure of Central Tendency (mean, median, mode)
- Measure of dispersion or variation (range, variance, standard deviation)
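A minimal sketch of these three groups of descriptive statistics, using Python's standard library on a hypothetical sample of cycle times:

```python
import statistics
from collections import Counter

cycle_times = [4.2, 4.5, 4.1, 4.5, 4.8, 4.3, 4.5]

# Measures of frequency
print("counts:", dict(Counter(cycle_times)))

# Measures of central tendency
print("mean:  ", statistics.mean(cycle_times))
print("median:", statistics.median(cycle_times))
print("mode:  ", statistics.mode(cycle_times))

# Measures of dispersion
print("range: ", max(cycle_times) - min(cycle_times))
print("var:   ", statistics.variance(cycle_times))  # sample variance
print("stdev: ", statistics.stdev(cycle_times))     # sample standard deviation
```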
Shape of Data Distribution
- Depicted by its number of peaks and by its symmetry, skewness, or uniformity.
- Skewness - a measure of the lack of symmetry. In other words, skewness is the measure of how much the probability distribution of a random variable deviates from the Normal Distribution
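As a worked illustration, sample skewness is commonly computed from the second and third central moments as g1 = m3 / m2^(3/2). This particular formula is one common convention, not taken from the source, and the data are hypothetical:

```python
def skewness(data):
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n  # second central moment
    m3 = sum((x - mean) ** 3 for x in data) / n  # third central moment
    return m3 / m2 ** 1.5

print(skewness([1, 2, 2, 3, 3, 3, 10]))  # positive => right-skewed tail
```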
Data Organization/Data Display/Data Patterns
Graphical analysis creates pictures of the data, which help in understanding the patterns and the correlations between process parameters. Graphical analysis is the starting point for any problem-solving method.
- Control Chart: a graphical display of quality characteristics that have been measured or computed from a sample, plotted versus the sample number or time
- Frequency Plots: Frequency plots allow you to summarize lots of data in a graphical manner making it easy to see the distribution of that data and process capability, especially when compared to specifications.
- Box Plot: a pictorial representation of continuous data. In other words, the box plot shows the max, min, median, interquartile range (Q1 to Q3), and outliers.
- Main Effects plot: The main effects plot is the simplest graphical tool to determine the relative impact of a variety of inputs on the output of interest.
- Histogram: the graphical representation of a frequency distribution. It takes the form of rectangles with class intervals as bases and the corresponding frequencies as heights
- Scatter plot: A scatter analysis is used when you need to compare two data sets against each other to see if there is a relationship
- Pareto Chart: based on the 80:20 rule; a graphical tool to map and grade business process problems from the most recurrent to the least frequent
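A minimal sketch of two of these plots (histogram and Pareto chart) using matplotlib; the cycle-time and defect data are hypothetical:

```python
import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Histogram: frequency distribution of a continuous measurement
cycle_times = [4.2, 4.5, 4.1, 4.5, 4.8, 4.3, 4.5, 4.6, 4.4, 4.2]
ax1.hist(cycle_times, bins=5, edgecolor="black")
ax1.set(title="Histogram", xlabel="Cycle time", ylabel="Frequency")

# Pareto chart: defect categories sorted from most to least frequent,
# with a cumulative-percentage line on a secondary axis
defects = {"scratch": 42, "dent": 18, "misalignment": 9, "other": 4}
labels, counts = zip(*sorted(defects.items(), key=lambda kv: kv[1], reverse=True))
ax2.bar(labels, counts)
cum = [sum(counts[: i + 1]) / sum(counts) * 100 for i in range(len(counts))]
ax2.twinx().plot(labels, cum, color="red", marker="o")  # cumulative %
ax2.set(title="Pareto Chart", ylabel="Count")

plt.tight_layout()
plt.show()
```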
Basic Probability
Basic Six Sigma Probability terms like independence, mutually exclusive, compound events, and more are necessary foundations for statistical analysis.
Probability is the ratio of the number of favorable outcomes to the total number of possible outcomes. Probabilities are usually shown as fractions or decimals. A probability ALWAYS lies between 0 and 1. An event is one or more outcomes of an experiment. The probability of an event E indicates how likely that event is to occur.
Probability of an event: P(E) = number of favorable outcomes / number of possible outcomes
- Additive law
- Multiplication law
- Compound Event
- Independent Event
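A minimal sketch tying these terms together by enumerating the 36 equally likely outcomes of rolling two fair dice (a hypothetical example, not from the source):

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls

def p(event):
    """P(E) = number of favorable outcomes / number of possible outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

A = lambda o: o[0] == 6  # first die shows 6
B = lambda o: o[1] == 6  # second die shows 6 (independent of A)

# Multiplication law for independent events: P(A and B) = P(A) * P(B)
print(p(lambda o: A(o) and B(o)), p(A) * p(B))  # 1/36 both ways

# Additive law: P(A or B) = P(A) + P(B) - P(A and B)
print(p(lambda o: A(o) or B(o)),
      p(A) + p(B) - p(lambda o: A(o) and B(o)))  # 11/36 both ways
```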