Introduction to Engineering Statistics and Lean Sigma Flashcards

1
Q

What is six sigma?

A

Allen’s text defines it as follows: “Six sigma is an organized and systematic problem-solving method for strategic system improvement and new product and service development that relies on statistical methods and the scientific method to make dramatic reductions in customer defined defect rates and/or improvements in key output variables.”

2
Q

What are the two main benefits of six sigma?

A

The main benefits of six sigma are: (1) the method slows people down when they solve problems, preventing them from prematurely jumping to poor recommendations that lose money; and (2) six sigma forces people to evaluate quantitatively and carefully their proposed recommendations.

3
Q

Define and distinguish key input variables and key output variables. What generic statistical terms are these two analogous to?

A

Key output variable (analogous to a dependent variable): an output variable of prime interest to you or your team in relation to the effects of input variable changes on the system of interest. Often, this will be the monetary contribution of the system to some entity’s profits. Key input variable (analogous to an independent variable): a variable that is directly controllable by team members and whose changes will likely affect at least one key output variable.

4
Q

What does the acronym DMAIC stand for? When is it applied?

A

Define - Measure - Analyze - Improve - Control. This is the six sigma method for system improvement.

5
Q

What does the acronym DMADV stand for? When is it applied?

A

Define - Measure - Analyze - Design - Verify. This is the six sigma method for new system development.

6
Q

What is lean manufacturing?

A

The simple principle of _lean manufacturing_ is to create more value with less work and zero waste. Therefore, ultimate efficiency is the goal or desired end state of the methods and principles associated with the terms lean manufacturing, lean production, or alternatively, _lean_ for short. The methods focus on modeling and optimizing the flows of materials and information through systems.

7
Q

What is the “Theory of Constraints”? What are the five steps, according to Dr. Eliyahu Goldratt, to using this concept to improve a process?

A

The idea of the Theory of Constraints (ToC) is that in a complicated system there is generally a single constraint, or bottleneck, holding up the flow of production or throughput. Increasing throughput unavoidably involves elevating or alleviating that constraining subsystem. Dr. Goldratt outlines five steps: (1) identify the constraint, (2) decide how to exploit the constraint, (3) subordinate all other processes to that decision, (4) elevate the constraint, and (5) if the constraint has moved within the system, go to the new location and start over.
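
As a minimal illustration of step 1 (the station names and rates below are hypothetical), the bottleneck of a simple serial line is the station with the lowest capacity, and system throughput cannot exceed that rate:

```python
# Hypothetical serial production line: station -> capacity in units/hour.
station_rates = {"cutting": 120, "welding": 45, "painting": 80, "packing": 150}

# Theory of Constraints, step 1: identify the constraint (lowest-capacity station).
bottleneck = min(station_rates, key=station_rates.get)
throughput = station_rates[bottleneck]

print(f"Bottleneck: {bottleneck} ({throughput} units/hour)")
print("System throughput is limited to this rate until the constraint is elevated.")
```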

8
Q

What is the generic definition of quality?

A

Q = P / E, where P is the performance of the unit and E is the customer’s expectation for that unit.
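
For illustration (the numbers are hypothetical): if a unit performs at P = 8 on some agreed scale while the customer expected E = 10, then Q = 8/10 = 0.8, so the unit falls short of expectations; Q ≥ 1 would indicate performance that meets or exceeds expectations.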

9
Q

How can formal planning and “design freezes” reduce cost and improve quality?

A

A design freeze refers to exploring a large number of design alternatives early in the development process and then locking down the design. Therefore, when the first unit is produced, the company has high confidence in the design and does not need to waste resources on retooling or tweaking designs, which would otherwise add cost and variance (and therefore reduce quality).

10
Q

What are the benefits of formal Design of Experiments as opposed to trial-and-error (i.e., one-factor-at-a-time or OFAT) experimentation?

A

Formal methods (1) spread tests out inside the region of interest where good solutions are expected to be and (2) provide a thorough check of whether changes help. For example, by using interpolation models, e.g., linear regressions or neural nets, one can effectively test all the solutions throughout the region spanned by the experimental runs. OFAT procedures have the advantages of being relatively simple and permitting opportunistic decision-making; yet, for a given number of experimental runs, these procedures effectively test far fewer solutions.
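
A minimal sketch of this contrast (the response function and noise below are invented purely for illustration): a small designed grid of runs spread over the region of interest supports a regression model that can then predict the KOV anywhere in that region, which OFAT runs along one axis at a time cannot do as efficiently.

```python
import numpy as np

# Hypothetical region of interest for two KIVs, x1 and x2, each in [0, 10].
levels = [0.0, 5.0, 10.0]
design = np.array([(a, b) for a in levels for b in levels])  # 3x3 grid of runs

# Stand-in for the measured KOV at each run (in practice: experimental data).
rng = np.random.default_rng(0)
y = 2.0 + 1.5 * design[:, 0] - 0.8 * design[:, 1] + rng.normal(0, 0.5, len(design))

# Fit a first-order regression model y ~ b0 + b1*x1 + b2*x2 by least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted model interpolates anywhere in the region, not just at tested points.
x_new = np.array([1.0, 2.5, 7.5])  # [1, x1, x2]
print("Fitted coefficients:", np.round(coef, 2))
print("Predicted KOV at (2.5, 7.5):", round(float(x_new @ coef), 2))
```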

11
Q

Name and describe the five steps in learning according to Bloom’s taxonomy.

A

Roughly speaking, general knowledge divides into: (1) knowledge of the needed terminology and the typical application sequences, (2) comprehension of the relevant plots and tables, (3) experience with the application of several central approaches, (4) an ability to analyze how certain data-collection plans are linked to certain model-fitting and decision-making approaches, and (5) the synthesis needed to select an appropriate methodology for a given problem, in that order. Critiquing the knowledge being learned and its usefulness is associated with the steps of analysis and/or synthesis. The central thesis of Bloom’s Taxonomy is that teaching should ideally begin with knowledge and comprehension and build up to application, ending with synthesis and critique.

12
Q

Define acceptance sampling

A

Acceptance sampling involves collecting and analyzing a relatively small number of KOV measurements to make “accept or reject” decisions about a relatively large number of units. Statistical evidence is generated about the fraction of the units in the lot that are acceptable.
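
As a rough sketch of the statistical evidence involved (the plan parameters n = 50 and c = 2 are hypothetical, not from the text): a single sampling plan inspects n units from the lot and accepts the lot if at most c defectives are found, and the probability of acceptance depends on the true lot fraction defective.

```python
from math import comb

def prob_accept(p, n=50, c=2):
    """P(accept lot) for a single sampling plan: accept if <= c defectives in n units."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Operating-characteristic behavior for a hypothetical plan (n=50, c=2).
for p in (0.01, 0.05, 0.10):
    print(f"fraction defective {p:.2f} -> P(accept) = {prob_accept(p):.3f}")
```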

13
Q

Define control planning

A

Control planning is an activity performed by the “owners” of a process to assure that all key process output variables (KOVs) are being measured in a way that assures a high degree of quality. This effort can involve the application of multiple methods.

14
Q

Define design of experiments (DOE)

A

Design of experiments (DOE) methods are structured approaches for collecting response data while varying multiple KIVs of a system. After the experimental tests yield the response outputs, specific methods for analyzing the data are applied to establish approximate models for predicting outputs as a function of inputs.

15
Q

Define failure mode and effects analysis (FMEA)

A

Failure mode and effects analysis (FMEA) is a method for determining which response measurements and subsystems should be addressed with the highest priority.
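
One standard way this prioritization is carried out (a common FMEA convention, not stated in the answer above) is to score each failure mode for severity, occurrence, and detection and rank by the risk priority number, RPN = severity × occurrence × detection:

```python
# Hypothetical failure modes scored 1-10 for severity, occurrence, and detection.
failure_modes = [
    {"mode": "weld porosity",    "severity": 8, "occurrence": 4, "detection": 6},
    {"mode": "paint run",        "severity": 3, "occurrence": 7, "detection": 2},
    {"mode": "missing fastener", "severity": 9, "occurrence": 2, "detection": 8},
]

# Risk priority number: RPN = severity x occurrence x detection.
for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Highest-RPN subsystems/measurements get addressed first.
for fm in sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True):
    print(f'{fm["mode"]:<17} RPN = {fm["rpn"]}')
```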

16
Q

Define formal optimization

A

Formal optimization is itself a diverse set of methods for writing technical problems in a precise way and for developing recommended settings to improve a specific system or product, using input-output models as a starting point.
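
A minimal sketch of the idea (the quadratic input-output model below is invented for illustration): once a regression or DOE model predicts a KOV from KIV settings, formal optimization searches that model for recommended settings within allowed bounds.

```python
from scipy.optimize import minimize

# Hypothetical fitted input-output model: predicted cost as a function of two KIVs.
def predicted_cost(x):
    x1, x2 = x
    return (x1 - 3.0) ** 2 + 2.0 * (x2 - 1.5) ** 2 + 0.5 * x1 * x2

# Recommend KIV settings that minimize predicted cost within engineering limits.
result = minimize(predicted_cost, x0=[0.0, 0.0], bounds=[(0, 10), (0, 10)])
print("Recommended settings:", result.x.round(3))
print("Predicted cost at recommendation:", round(result.fun, 3))
```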

17
Q

Define gauge repeatability and reproducibility (R&R)

A

Gauge repeatability and reproducibility (R&R) involves collecting repeated measurements on an engineering system and performing complicated calculations to assess the acceptability of a specific measurement system.
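
A rough sketch of the idea, not the full standard calculation (the measurements below are invented): repeated readings by several operators on the same parts let one separate repeatability (scatter within an operator/part cell) from reproducibility (operator-to-operator differences).

```python
import numpy as np

# Hypothetical data: measurements[operator][part] = list of repeat readings.
measurements = {
    "op_A": {"part1": [10.1, 10.2, 10.1], "part2": [12.0, 12.1, 11.9]},
    "op_B": {"part1": [10.4, 10.5, 10.4], "part2": [12.3, 12.2, 12.4]},
}

# Repeatability: pooled variance of repeat readings within each operator/part cell.
cells = [reads for parts in measurements.values() for reads in parts.values()]
repeatability_var = np.mean([np.var(reads, ddof=1) for reads in cells])

# Reproducibility (rough): variance of operator averages, ignoring interaction.
op_means = [np.mean([r for reads in parts.values() for r in reads])
            for parts in measurements.values()]
reproducibility_var = np.var(op_means, ddof=1)

print(f"repeatability variance   ~ {repeatability_var:.4f}")
print(f"reproducibility variance ~ {reproducibility_var:.4f}")
```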

18
Q

Define process mapping

A

Process mapping involves creating a diagram of the steps involved with an engineering system. The exercise can be an important part of waste reduction efforts and lean engineering and can aid in identifying key input variables.

19
Q

Define regression

A

Regression is a curve-fitting method for developing approximate predictions of system KOVs (usually averages) as they depend on key input variable (KIV) settings. When used as part of a DOE method, it can also establish statistically that changes in KIVs affect KOVs.
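
A minimal single-input sketch (the data are invented): fit a line predicting the average KOV from one KIV setting, then use it to predict at an untested setting.

```python
import numpy as np

# Hypothetical data: KIV setting (e.g., oven temperature) vs. measured KOV (e.g., yield).
kiv = np.array([150.0, 160.0, 170.0, 180.0, 190.0])
kov = np.array([71.2, 74.8, 79.1, 82.6, 85.9])

# Least-squares line: predicted KOV ~ slope * KIV + intercept.
slope, intercept = np.polyfit(kiv, kov, deg=1)

print(f"fitted model: KOV ~ {slope:.3f} * KIV + {intercept:.2f}")
print(f"predicted KOV at KIV = 175: {slope * 175 + intercept:.1f}")
```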

20
Q

Define statistical process control (SPC)

A

Statistical process control (SPC) charting includes several methods to assess visually and statistically the quality and consistency of process KOVs and to identify unusual occurrences. Therefore, SPC charting is useful for initially establishing the value and accuracy of current settings and confirming whether recommended changes will consistently improve quality.
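
A brief sketch of one common SPC tool, the individuals chart (the data are invented; 1.128 is the standard d2 constant for a moving range of two): points outside the three-sigma limits are flagged as unusual occurrences.

```python
import numpy as np

# Hypothetical sequence of KOV measurements from a process, in time order.
x = np.array([5.1, 5.3, 4.9, 5.0, 5.2, 5.4, 5.1, 7.0, 5.0, 4.8])

center = x.mean()
# Estimate short-term sigma from the average moving range (d2 = 1.128 for n = 2).
sigma_hat = np.mean(np.abs(np.diff(x))) / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

print(f"center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
for i, xi in enumerate(x, start=1):
    if xi > ucl or xi < lcl:
        print(f"observation {i} ({xi}) is outside the control limits")
```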

21
Q

Define quality function deployment (QFD)

A

Quality function deployment (QFD) involves creating several matrices that help decision-makers better understand how their system differs from competitor systems, both in the eyes of their customers and in objective features.

22
Q

Define statistical quality control (SQC)

A

The phrase statistical quality control (SQC) refers to the application of statistical methods to monitor and evaluate systems and to determine whether changing key input variable (KIV) settings is appropriate.