Let's focus on how the Measure phase of Six Sigma DMAIC is implemented during an eLearning course development process improvement initiative. The Measure phase is about validating the measurement system, analyzing data, gathering root causes, measuring high-impact elements and mapping the process. After the improvement effort is scoped in the Define phase, the team starts working on the Measure phase. Instead of looking for new measurements, the team takes the measurements that already exist in the process and unravels them. The team needs to identify high-impact eLearning course development process elements, develop measurable baseline metrics and create detailed process maps. In the Measure phase, the team ensures the measurement system's consistency and verifies whether the tool used to collect the output variable is flawed and whether all operators interpret the tool reading in the same way. The team decides on the purpose of data collection, the type of analysis and the period of data collection. Various tools are used for collecting and communicating baseline metrics and priorities.
Approach to Measure
Process Mapping
Process mapping helps a team understand how the activities take place. It gives a high-level overview of a process, defining how, when and where the process should be measured. It helps the team investigate where problems might occur and where alterations can be made to give optimum outcomes. The process map is a graphical representation of all process steps (value added and non-value added), process inputs (X's) and process outputs (Y's). A process map consists of a flowchart with symbols such as arrows, circles, diamonds, ovals and rectangles, and details the inputs, activities, decision points, rework loops and outputs of the process.
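As a rough illustration only (the step names, inputs and outputs below are hypothetical, not taken from any particular course development process), the information a process map captures can be sketched in Python before it is drawn as a flowchart:

# Hypothetical sketch: capturing process steps with their inputs (X's) and outputs (Y's)
process_map = [
    {"step": "Storyboard review", "type": "activity",
     "inputs": ["reviewer availability", "template version"],
     "outputs": ["review turnaround time"]},
    {"step": "Storyboard approved?", "type": "decision",
     "inputs": ["defect count"],
     "outputs": ["rework loop", "proceed to build"]},
    {"step": "Course build", "type": "activity",
     "inputs": ["authoring tool", "media assets"],
     "outputs": ["build cycle time", "defects per course"]},
]

for step in process_map:
    print(f"{step['type']:>8}: {step['step']}  X's={step['inputs']}  Y's={step['outputs']}")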
Data Gathering
Data collection is an important aspect of Six Sigma projects. Inaccurate data impacts the results of a study and eventually derails the project. In order to collect data, the team needs to identify the purpose of data collection, decide the period of data collection and find out whether the necessary data is already on hand. The numerical data measured during the Measure phase falls into two categories (a small example follows the list):
-> Continuous data can take any value within a range, for example time, weight or height.
-> Discrete data can only take certain values, for example the number of defects.
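A minimal sketch, with made-up values, of how the two categories might look in a course development project:

# Hypothetical examples of the two data categories
review_cycle_times_hours = [4.5, 6.25, 3.8, 7.0]   # continuous: can take any value in a range
defects_per_module = [2, 0, 5, 1]                   # discrete: only whole-number counts

print("Average review cycle time (hours):", sum(review_cycle_times_hours) / len(review_cycle_times_hours))
print("Total defects across modules:", sum(defects_per_module))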
Funneling
Funneling is choosing the appropriate measures to get precise information about the problem. The Six Sigma methodology provides two techniques to identify the "critical few" measures.
-> Prioritization Matrix / XY Matrix
-> FMEA
Prioritization Matrix - The Prioritization Matrix, also known as a criteria matrix, identifies the critical few variables that need to be measured and analyzed. It helps to focus the data collection effort, formulate theories about causes and effects, and improve decision making. The Prioritization Matrix is a systematic approach implemented when too many variables have an impact on the output of the process and collecting data on all possible variables would cost too much time and money. When team members have different theories about what happens in the process, the Prioritization Matrix promptly surfaces the basic disagreements. It allows the team to narrow its focus and leads to implementation success.
To construct a Prioritization Matrix for input/process variables (a worked sketch follows the list):
-> List the output variables
-> Rank order and weight the output variables
-> List the input variables
-> Evaluate the strength of the relationship between output and input variables (correlation factor)
-> Cross-multiply weight and correlation factor and add
-> Assess each variable against every other variable
-> Highlight the critical few variables
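As a rough illustration (the output weights, input variables and correlation scores below are made up for the example, not drawn from a real project), the cross-multiply-and-add step can be sketched in Python like this:

# Hypothetical sketch of a Prioritization (XY) Matrix calculation
# Output variables (Y's) with their relative weights
output_weights = {"course quality score": 10, "development cycle time": 7, "review rework": 5}

# Correlation factors (for example on a 0/1/3/9 scale) between each input (X) and each output (Y)
correlations = {
    "storyboard completeness": {"course quality score": 9, "development cycle time": 3, "review rework": 9},
    "SME availability":        {"course quality score": 3, "development cycle time": 9, "review rework": 1},
    "authoring tool version":  {"course quality score": 1, "development cycle time": 3, "review rework": 0},
}

# Cross-multiply each correlation factor by the output weight and add across outputs
scores = {
    x: sum(output_weights[y] * corr for y, corr in y_corrs.items())
    for x, y_corrs in correlations.items()
}

# The highest-scoring inputs are the "critical few" to measure first
for x, score in sorted(scores.items(), key=lambda item: item[1], reverse=True):
    print(f"{x}: {score}")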
FMEA (Failure Mode and Effects Analysis) - FMEA allows the team to discover potential issues in a process, the possible impact of each issue, and an approach to fix each issue. There are two types of FMEA: Design FMEA and Process FMEA. Design FMEA is used during process or product design and development; its primary objective is to uncover problems that will result in potential failures within the new product or process. With the help of FMEA, the team can collect useful information, capture engineering knowledge, minimize late changes and their associated costs, and reduce the same kind of failures in the future. Process FMEAs are used to uncover problems related to manpower, systems, methods, measurements and the environment.
FMEA Advantages
-> Bottom-up look at each criterion
-> Emphasis on problem prevention
-> Improved quality
-> Improved competitiveness
-> Reduced cost
-> Fewer defects, failures and downtime
-> Reduced possibility of the same kind of failure in future
-> Increased customer satisfaction
FMEA identifies, quantifies and evaluates failures; its goal is to improve quality, competitiveness and customer satisfaction. FMEA drives systematic thinking about a product or process and focuses on three basic issues:
-> What might cause a system/process stoppage? – Failure
-> How bad is the effect when something goes wrong? – Risk
-> What can be done to keep things from going wrong? – Corrective Action
FMEA attempts to identify and prioritize potential process or system failures. The failures are rated on three components:
-> Severity – Impact of a failure
-> Occurrence – Frequency of the causes of failure
-> Detectability – How easy it is to detect the failure
Steps of Conducting an FMEA
1. Define the scope of the FMEA
2. List the process steps of the existing process
3. List possible failure modes
4. List potential effects of failures
5. Assign a severity rating to each effect
6. List potential failure causes
7. Assign an occurrence rating to each cause
8. List current process controls for detection of failure modes
9. Assign a detection rating to each failure mode
10. Calculate the risk priority number (RPN) for each cause
11. Rank or prioritize causes
12. Take action on high-risk failure modes
13. Recalculate RPN numbers
Risk Priority Number (RPN)
The team develops a ranked list of potential failure modes by calculating the Risk Priority Number (RPN). The failures in an FMEA project are then prioritized by their RPN values. An RPN is calculated by multiplying together the Severity, Occurrence and Detection (SOD) ratings associated with each cause-and-effect item identified for each failure mode. The higher the RPN value, the higher the priority to work on that specific item. The team then takes action on the high-risk failure modes; each high-risk cause needs to be acted upon and closed properly. The reduction in RPN scores results in a dramatic improvement in the process sigma level.
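A minimal sketch of the RPN arithmetic in Python (the failure modes and their 1-10 ratings are invented for illustration, not drawn from a real FMEA):

# Hypothetical sketch of an RPN calculation: RPN = Severity x Occurrence x Detection
failure_modes = [
    # (failure mode,                           severity, occurrence, detection)
    ("Incorrect learning objective mapped",        8,        4,          6),
    ("Broken navigation in published course",      7,        3,          2),
    ("Audio out of sync with slides",              5,        6,          4),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

# Highest RPN first: these failure modes receive corrective action first
for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")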
Data Collection Plan
A data collection plan is a detailed document. It explains the technique and the sequence to be followed in gathering the data for the Six Sigma project. This document is critical because it makes sure that every member of the Six Sigma project team is in agreement with the data plan. The action items below ensure that the data collection process is stable and the measurement system is reliable (a minimal sketch of such a plan follows the list).
1. Identify the data collection plan objectives
2. Decide the time frame and the data elements to be included in the measurement
3. Create clear, concise and detailed operational definitions and a methodology for each variable. A specific and concrete operational definition reduces ambiguity, provides an understanding of the variable's characteristics and explains the method for measuring those characteristics.
4. Ensure the reliability of the data collection plan and of the measurement system by making sure that the plan is implemented precisely and consistently
5. Follow through on the data collection process and results to ensure consistency and accuracy of execution
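As an illustration only (the field names and values below are hypothetical, not a prescribed template), one entry of such a plan might capture:

# Hypothetical sketch of one entry in a data collection plan
plan_entry = {
    "metric": "storyboard review cycle time",
    "objective": "establish a baseline for review turnaround",
    "data_type": "continuous",  # hours, can take any value in a range
    "operational_definition": "hours from storyboard submission to reviewer sign-off, "
                              "recorded from the workflow tool's timestamps",
    "collection_period": "4 weeks",
    "data_source": "course workflow tracking sheet",
    "collector": "project coordinator",
    "sampling": "every storyboard submitted during the period",
}

for field, value in plan_entry.items():
    print(f"{field:>24}: {value}")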
Variation
The difference between two processes/products/services for the same characteristic is called variation. In Statistical Process Control, the variation in a process is classified in two ways: Common Cause Variation and Special Cause Variation.
Common Cause Variation – Common cause variation is created by many factors, also known as the traditional 6Ms (Manpower, Mother Nature, Materials, Method, Measurement and Machine). The 6Ms affect any process, and common cause variation will conform to a normal distribution. Common causes are inherent in the process, are always present to some extent and can only be reduced by fundamental changes to the system.
Special Cause Variation – Special cause variation arises from specific factors/causes or non-random events that are unusual and not previously observed. It has a considerable effect on the process. Special cause variation leads to unexpected changes in the process and makes the process unpredictable and unstable. Special causes can be tracked down and removed using a special cause problem solving methodology.
The objective of the eLearning course development team is to minimize variation in the process and stabilize the process. The team needs to identify and isolate the causes of variation that are influencing course quality and find solutions to them, in order to eliminate these causes on a long-term basis.
“The objective in driving Six Sigma
performance is to reduce or narrow variation to such a degree that six sigma –
or standard deviations – can be squeezed within the limits defined by the
customer’s specification.” (Pande)
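One common way to separate the two kinds of variation is a control-chart-style 3-sigma rule. A minimal sketch, assuming the control limits are computed from a baseline period believed to be stable (all numbers below are made up):

# Hypothetical sketch: flagging likely special cause points with a simple 3-sigma rule
baseline = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 12.1, 11.7, 12.3, 12.0]  # stable baseline (hours)
new_observations = [12.2, 11.9, 14.1, 12.0]                              # newly collected data

mean = sum(baseline) / len(baseline)
variance = sum((x - mean) ** 2 for x in baseline) / (len(baseline) - 1)
std_dev = variance ** 0.5
upper_limit, lower_limit = mean + 3 * std_dev, mean - 3 * std_dev

for i, x in enumerate(new_observations, start=1):
    flag = "within common cause limits" if lower_limit <= x <= upper_limit else "possible special cause"
    print(f"observation {i}: {x:.1f}  ({flag})")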
Data Summarization
Measuring Central Tendency
Mean, median and mode are the three ways to measure the central tendency of data. These 3Ms are measures that locate the middle point of the observations. This middle point helps us find a representative value for the entire distribution and enables us to compare data sets.
Mean represents the average of the data. The concept of the mean is simple and the most intuitively understood. However, the mean is affected by extreme values.
Median represents the middle-most or central value in a set of ordered data. Extreme values do not influence the median as strongly as they influence the mean.
Mode represents the value that is repeated most often in a data set. Extreme values do not influence the mode as strongly as they influence the mean. However, statistical calculations do not support the mode and the median as much as they support the mean.
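A minimal sketch of the three measures using Python's standard library (the data set is invented, with one deliberately extreme value):

# Hypothetical sketch: mean, median and mode for a small made-up data set
import statistics

defects_per_course = [2, 3, 3, 4, 5, 3, 21]  # 21 is an extreme value

print("mean:  ", statistics.mean(defects_per_course))    # pulled up by the extreme value
print("median:", statistics.median(defects_per_course))  # less affected by the extreme value
print("mode:  ", statistics.mode(defects_per_course))    # most frequently occurring value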
Measuring Dispersion
Measures of dispersion are important for determining the extent of the spread of the data from the mean value. Various methods are used to measure the dispersion of a dataset. These methods are
- Range (R)
- Standard Deviation (S)
- Variance (S²)
- Coefficient of Variation (CV)
To calculate these measures, the team can use a statistical software package such as Minitab.
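The same measures can also be computed with Python's standard library; a minimal sketch with made-up sample data:

# Hypothetical sketch: range, standard deviation, variance and coefficient of variation
import statistics

review_hours = [4.5, 6.2, 5.1, 7.3, 5.8]  # made-up sample data

r = max(review_hours) - min(review_hours)      # Range (R)
s = statistics.stdev(review_hours)             # sample Standard Deviation (S)
s2 = statistics.variance(review_hours)         # sample Variance (S squared)
cv = s / statistics.mean(review_hours) * 100   # Coefficient of Variation (CV), in percent

print(f"R = {r:.2f}, S = {s:.2f}, S^2 = {s2:.2f}, CV = {cv:.1f}%")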
Histogram
A histogram graphically represents the frequency distribution of data in bar form and helps summarize data from a process that has been collected over a period of time.
Box Plot
A box plot is a great tool for graphical analysis, especially for non-normal data. A box plot summarizes a set of data on an interval scale. The spacing between the different parts of the box indicates the degree of dispersion (spread) and skewness in the data and identifies outliers (the most extreme values) in the dataset.
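A minimal sketch of both charts, assuming the matplotlib plotting package is available (the cycle-time values are invented for illustration):

# Hypothetical sketch: drawing a histogram and a box plot of the same made-up data
import matplotlib.pyplot as plt

build_cycle_days = [8, 9, 10, 10, 11, 11, 11, 12, 12, 13, 14, 22]  # 22 is a potential outlier

fig, (ax_hist, ax_box) = plt.subplots(1, 2, figsize=(8, 3))

ax_hist.hist(build_cycle_days, bins=6)        # frequency distribution in bar form
ax_hist.set_title("Histogram of build cycle time (days)")

ax_box.boxplot(build_cycle_days, vert=False)  # spread, skewness and outliers
ax_box.set_title("Box plot of build cycle time (days)")

plt.tight_layout()
plt.show()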
In this section we reviewed the concepts of Process Mapping, Data Gathering, Funneling, Data Collection Plan, Variation and Data Summarization. My next blog will elaborate on the concepts of Measurement System Analysis, Pattern Analysis, Pareto Chart and Process Capability.
Share your views and experiences on how a robust measurement system leads to a successful Six Sigma initiative.