Quality Management Flashcards
Process “Plan Quality Management”
- Goal: identify quality requirements and/or standards for the project and its deliverables, and document how the project will demonstrate compliance.
- Inputs: project charter, project mgmt plan (req mgmt plan, risk mgmt plan, stakeholder engagement plan), project documents (assumpt log, req documentation, RTM, risk reg, stakeholder reg), EEFs, OPAs
- Tools+Techniques: Expert judgment, data gathering (benchmarking, brainstorming, interviews), data analysis (cost-benefit analysis, COQ), multicriteria decision analysis, data representation (flowcharts, logical data model, matrix diagrams, mind mapping), test and inspection planning, meetings
- Outputs: Quality mgmt plan, quality metrics, PMPlan updates (Risk mgmt plan, scope baseline), Project documents updates (lessons learned reg, RTM, risk reg, stakeholder reg)
Process “Manage Quality”
- Goal: translate the quality mgmt plan into executable quality activities that incorporate the org’s quality policies into the project
- Inputs: Quality mgmt plan, Project documents (lessons learned reg, quality control measurements, quality metrics, risk report), OPAs
- Tools+Techniques: data gathering (checklists), data analysis (alternatives, document, process and root cause analysis), multicriteria decision analysis, data representation (affinity, cause-and-effect, matrix and scatter diagrams; flowcharts; histograms), audits, design for X, problem solving, quality improvement methods
- Outputs: quality reports, test and evaluation docs, CRs, PMPlan updates (quality mgmt plan; scope, schedule and cost baselines), project documents updates (issue log, lessons learned reg, risk reg)
Process “Control Quality”
- Goal: verify that project deliverables and work meet the requirements specified by key stakeholders for final acceptance
- Inputs: Quality mgmt plan, Project documents (lessons learned reg, quality metrics, test and evaluation docs), approved CRs, deliverables, work performance data, EEFs, OPAs
- Tools+Techniques: data gathering (checklists, check sheets, statistical sampling, questionnaires and surveys), data analysis (performance reviews, root cause analysis), inspection, testing/product evaluations, data representation (cause-and-effect diagrams, control charts, histograms, scatter diagrams), meetings
- Outputs: quality control measurements, verified deliverables, work performance information, CRs, updates to Quality mgmt plan, Project documents updates (issue log, lessons learned reg, risk reg, test and evaluation docs)
Quality vs. grade
- quality: degree to which a set of inherent characteristics fulfills requirements
- grade: a category assigned to deliverables having the same functional use but different technical characteristics
- Low-grade may not be a problem and may be acceptable, while low-quality is always a problem and never acceptable.
Attribute sampling vs. variable sampling
- attribute sampling: the result either conforms or does not conform
- variable sampling: the result is rated on a continuous scale that measures the degree of conformity
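The distinction can be sketched in a few lines of Python. Everything here (lot size, target diameter, tolerance) is a made-up illustration, not a prescribed sampling plan:

```python
import random

random.seed(7)

# Hypothetical lot of machined parts; target diameter 10.0 mm, tolerance +/- 0.1 mm.
TARGET, TOL = 10.0, 0.1
lot = [random.gauss(TARGET, 0.05) for _ in range(100)]
sample = random.sample(lot, 10)

# Attribute sampling: each unit is simply "conforms" or "does not conform".
defectives = sum(1 for d in sample if abs(d - TARGET) > TOL)

# Variable sampling: the measured value itself is kept, so the degree of
# conformity (how close to target) is rated on a continuous scale.
mean_deviation = sum(abs(d - TARGET) for d in sample) / len(sample)

print(defectives, round(mean_deviation, 3))
```

Attribute sampling throws away the measurement and keeps only a pass/fail count; variable sampling keeps the measurement, so smaller samples can carry more information.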
Cost of quality (COQ)
- Includes all costs incurred over the life of the product by investment in preventing nonconformance to requirements, appraising the product for conformance to requirements, and failing to meet requirements.
- In short: includes prevention costs (training, processes, time, etc.), appraisal costs (measuring, auditing, testing, etc.) and failure costs (internal: rework, scrap; external: warranty, lost business, liabilities)
- cost of conformance: cost to avoid failure (prevention + appraisal)
- cost of nonconformance: cost because of failures
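The COQ arithmetic can be made concrete with a short sketch. All figures below are invented for illustration:

```python
# Illustrative (made-up) quality cost figures for one release, in USD.
prevention = {"training": 5_000, "process design": 8_000}
appraisal = {"testing": 12_000, "audits": 4_000}
internal_failure = {"rework": 9_000, "scrap": 3_000}
external_failure = {"warranty": 15_000, "lost business": 20_000}

# Cost of conformance = money spent to avoid failures.
cost_of_conformance = sum(prevention.values()) + sum(appraisal.values())
# Cost of nonconformance = money spent because of failures.
cost_of_nonconformance = sum(internal_failure.values()) + sum(external_failure.values())
# COQ = everything spent on quality over the product's life.
cost_of_quality = cost_of_conformance + cost_of_nonconformance

print(cost_of_conformance, cost_of_nonconformance, cost_of_quality)
# -> 29000 47000 76000
```

In this example nonconformance dominates, which is the usual argument for investing more in prevention and appraisal up front.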
SIPOC
- type of flowchart
- summarizes the inputs and outputs of one or more processes in table form and stands for Suppliers, Inputs, Process, Outputs and Customers
- is used in the process “Plan Quality Management”
Audit
- structured, independent process used to determine if project activities comply with organizational and project policies, processes and procedures
- objectives: identify good practices being implemented; identify nonconformities, gaps and shortcomings; share good practices from similar projects; offer assistance to improve the implementation of processes; highlight contributions of each audit to the lessons learned repository of the organization
Design for X (DfX)
A set of technical guidelines that may be applied during the design of a product for the optimization of a specific aspect of the design. The X in DfX can be different aspects of product development, such as reliability, deployment, assembly, manufacturing, cost, service, usability, safety, and quality.
-> only used in ‘Manage Quality’
mutual exclusivity
a statistical term describing two or more events that cannot coincide/occur at the same time. E.g. You cannot roll both a five and a three simultaneously on a single die.
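For mutually exclusive events the probabilities of the individual events simply add, which the single-die example can demonstrate by enumeration:

```python
from fractions import Fraction

die = set(range(1, 7))  # the six equally likely outcomes of one die
A = {5}                 # event: roll a five
B = {3}                 # event: roll a three

# Mutually exclusive: the events share no outcomes, so both cannot occur at once.
assert A & B == set()

def p(event):
    # Probability = favourable outcomes / total outcomes (fair die).
    return Fraction(len(event), len(die))

# For mutually exclusive events: P(A or B) = P(A) + P(B).
assert p(A | B) == p(A) + p(B)
print(p(A | B))  # -> 1/3
```

If the events were not mutually exclusive, the overlap P(A and B) would have to be subtracted.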
Design of Experiments (DOE)
- technique that can be used to analyze alternatives, e.g. to look for ways to deliver the same level of quality at lower cost
- fast and accurate technique that allows you to systematically change the important factors in a process and see which combinations have an optimal impact on the project deliverables
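A minimal sketch of the "systematically change the important factors" idea is a full factorial design, which enumerates every combination of factor levels. The process and its factors below are hypothetical:

```python
from itertools import product

# Hypothetical factors in a soldering process (names and levels are made up).
factors = {
    "temperature": [330, 350],   # degrees C
    "speed": [1.0, 1.5],         # m/min
    "flux": ["A", "B"],
}

# A full factorial design runs every combination of factor levels,
# so effects of each factor and their interactions can be detected.
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]

print(len(design))  # -> 8 (2 * 2 * 2 runs)
for run in design:
    print(run)
```

Each run would be executed, its response (e.g. defect rate) recorded, and the analysis would then show which combination has the optimal impact; fractional factorial designs reduce the run count when full enumeration is too expensive.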
Common vs special causes
- Common causes are normal, expected variances that occur. They are predictable and are not considered unusual. On a control chart, common causes of variance would be indicated by the random points within the control limits. (also known as “random” causes)
- Special causes of variance are those causes that are not predictable or inherent in a system. They are usually related to some type of defect. On a control chart, special causes are represented by points beyond the control limits or as non-random points within the control limits. (also known as “assignable” causes)
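The control-chart rule "points beyond the control limits indicate special causes" can be sketched directly. The data and the 3-sigma limits below are illustrative assumptions:

```python
from statistics import mean, stdev

# Hypothetical daily defect counts; the last value is deliberately an outlier.
data = [12, 11, 13, 12, 14, 11, 12, 13, 12, 25]

# Center line and limits estimated from the stable history (all but the last point).
center = mean(data[:-1])
sigma = stdev(data[:-1])
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # upper/lower control limits

# Points inside the limits -> common (random) causes, expected variation.
# Points outside the limits -> special (assignable) causes worth investigating.
special = [x for x in data if not (lcl <= x <= ucl)]

print(special)  # -> [25]
```

A full implementation would also flag non-random patterns inside the limits (e.g. seven consecutive points on one side of the center line, the "rule of seven"), which likewise signal special causes.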
Accuracy vs. precision
- precision: less scatter, consistent, predictable
- accuracy: closeness to the target
-> high precision and low accuracy means the process should be adjusted
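The adjustment rule follows from treating accuracy as bias of the mean and precision as scatter. The two measurement series below are invented to show one instrument of each kind:

```python
from statistics import mean, pstdev

TARGET = 100.0  # true value being measured

# Hypothetical measurements of the same quantity by two instruments.
precise_but_biased = [104.9, 105.0, 105.1, 105.0, 104.9]  # tight cluster, off-target
accurate_but_noisy = [97.0, 103.0, 99.0, 101.0, 100.0]    # on-target, scattered

for name, xs in [("precise_but_biased", precise_but_biased),
                 ("accurate_but_noisy", accurate_but_noisy)]:
    bias = mean(xs) - TARGET  # accuracy: closeness of the mean to the target
    spread = pstdev(xs)       # precision: how little the values scatter
    print(name, round(bias, 2), round(spread, 2))
```

The first series is precise but inaccurate: its scatter is tiny but its mean sits far from the target, so the process should be adjusted (shift the mean), which is usually easier than reducing the scatter of an imprecise process.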