+300 Lean Six Sigma Terminology

The individual terms that make up the Lean Six Sigma vocabulary are not unique to the methodology, but collectively they are key to employing it effectively for process improvement. This post contains over 300 terms and acronyms used within Lean Six Sigma.


Comprehensive list of Lean Six Sigma Terminology

The Lean Six Sigma terminology is organised in alphabetical order for ease of identification.

3MsMuda, Mura, Muri
3PProduction, Preparation, Process
5 S A methodology for organising, cleaning, developing, and sustaining a productive work environment. Improved safety, ownership of the workspace, improved productivity, and improved maintenance are some of the benefits of the 5S programme.
5SSort, Straighten, Shine, Standardise, Sustain
6MMan, Machine, Measurement, Mother Nature, Method, Material
8 WastesTransport; Inventory; Motion; Waiting; Overproduction; Overprocessing; Defects; Human Intellect. Additional wastes are Energy, Pollution, and Space.
80/20 Rule (Pareto)The rule that suggests that 20% of causes (categories) will account for 80% of the trouble
8D processTeam-oriented problem-solving method for product and process improvement. Structured into 8 steps: Define the problem and prepare for process improvement, establish a team, describe the problem, develop interim containment, define & verify root cause, choose permanent corrective action, implement corrective action, prevent a recurrence, recognise and reward the contributors.
A
Acceptable Quality Level (AQL)The maximum proportion of defective units that is considered satisfactory as a process average.
Acceptance NumberThe highest number of nonconforming units or defects found in the sample that still permits acceptance of the lot.
AccuracyA statement about how close the data are to the target.
Affinity DiagramA tool used to organise and summarise large amounts of data (ideas, issues, solutions, problems) into logical categories based on user-perceived relationships. 
AlphaIn hypothesis testing: rejecting the null hypothesis (no difference) erroneously; assuming a relationship where none exists. Also referred to as a Type I error. E.g. convicting an innocent person.
Analysis of Variance (ANOVA)A calculation procedure to allocate the amount of variation in a process and determine if it is significant or is caused by random noise. 
AndonA visual production-control device (usually a lighted overhead display) that continuously shows the changing status of the production line and sounds alerts if a problem is imminent.
ANOVAAnalysis of Variance
AOPAnnual Operating Plan
Assignable Causessee Special Causes
Attribute DataNumerical information at the nominal level. Its subdivision is not conceptually meaningful. It represents the frequency of occurrence within some discrete category.
AuditA timely inspection of a process or system to ensure that it conforms to documented quality standards. An audit also brings out discrepancies between the documented standards and the standards followed, and might show how well or how badly the documented standards support the processes currently followed.
Average Outgoing QualityThe expected quality of outgoing product following the use of an acceptance sampling plan for a given value of incoming product quality.
B
Bar ChartHorizontal or vertical bars that graphically illustrate the magnitude of multiple situations. 
Bathtub CurveA curve used to describe the life cycle of a system/device as a function of usage. Initially, where the curve slopes downwards, the failure rate decreases with use. Then a constant failure rate is reached, and finally the curve slopes upwards as the failure rate increases with usage.
BAU“Business As Usual”. The old way of doing business: repetitive tasks performed with no critical eye toward improvement.
BBBlack Belt
BenchmarkingThe practice of discovering where the best performance is being achieved, whether in your company, by a competitor, or by an entirely different industry.
BetaIn hypothesis testing: failing to reject a false null hypothesis; assuming no relationship exists when one does. Also referred to as a Type II error. E.g. failing to convict a guilty person.
Binary Attribute DataData that can take only two values, indicating the presence or absence of some characteristic in the process. E.g. pass/fail.
Black BeltFull-time team leaders responsible for recommending and implementing process improvement projects (DMAIC or DMADV) within the business — to drive customer satisfaction levels and business productivity up. 
Black NoiseSpecial cause variation in the process
BSLBaseline for measurements
C
C & ECause and Effect matrix
C chartStatistical Process Control Chart used to determine whether the number of defects/unit is constant over time. 
C-PCapacity-Productivity
Capability AnalysisA mathematical calculation used to compare the process variation to a specification. Examples are Cp and Cpk. Used to compare processes across businesses.
Cause and Effect DiagramThe tool used for brainstorming and categorising potential causes to a problem. The most commonly used cause branches are Man, Method, Machine and Material. Also known as the Ishikawa and the Fishbone diagram.
CensusCollecting information on an entire population 
Central Limit TheoremThe means of samples from a population will tend to be normally distributed around the population mean.
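The Central Limit Theorem can be demonstrated with a short simulation. The sketch below (Python; the uniform population, sample size, and seed are arbitrary choices for illustration) draws many sample means from a decidedly non-normal population and shows that they cluster around the population mean:

```python
import random
import statistics

random.seed(7)

# Population: uniform on [0, 10]; its true mean is 5.0.
def sample_mean(n):
    """Mean of one random sample of size n drawn from the population."""
    return statistics.mean(random.uniform(0, 10) for _ in range(n))

# Draw many sample means: by the Central Limit Theorem they are
# approximately normally distributed around the population mean,
# regardless of the population's own (uniform) shape.
sample_means = [sample_mean(30) for _ in range(2000)]

grand_mean = statistics.mean(sample_means)   # close to 5.0
spread = statistics.stdev(sample_means)      # roughly sigma / sqrt(30)
```

With a population mean of 5.0, `grand_mean` lands very close to 5.0, and the spread of the sample means is roughly the population standard deviation divided by the square root of the sample size.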
Central TendencyAn indication of the location or centrality of the data. The most common measures of central tendency are: mean, median and mode. 
ChampionBusiness leaders and senior managers who ensure that resources are available for training and projects and who set and maintain broad goals for improvement projects. A business leader who identifies, selects, and charters continuous improvement projects and supports the delivery.
Chance CausesSee Common Causes
Check SheetsA data collection form consisting of multiple categories. Each category has an operational definition and can be checked off as it occurs. Properly designed, the Check Sheet helps to summarise the data, which is often displayed in a Pareto Chart.
Chi-Square TestA statistical goodness-of-fit test used to test the assumption that the distribution of a set of data is similar to the expected distribution.
Chronic ProblemsSee Common Causes
Common CausesSmall, random forces that act continuously on a process (Also called: by Juran – Chronic Problems, by Deming – System Faults, by Shewhart – Chance Causes)
Confidence IntervalA region containing the limits of a parameter, with an associated level of confidence that the bounds are large enough to contain the true parameter value
Continuous VariableA variable whose possible values consist of an entire interval on the number line, i.e. it can take any value.
Control SpecificationsCustomer requirements for a process under study or in routine operation.
Control ChartA procedure used for tracking a process through time to distinguish variation that is inherent in the process (common cause) from variation that signals a change to the process (special cause).
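A control chart's limits are conventionally placed three standard deviations either side of the centre line. The helper functions below are an illustrative sketch (the function names are invented for this example, and the sample standard deviation stands in for the moving-range estimate a real individuals chart would normally use):

```python
import statistics

def control_limits(baseline, sigmas=3):
    """Centre line and lower/upper control limits at mean +/- sigmas * s.
    Sketch only: uses the sample standard deviation of the baseline data."""
    centre = statistics.mean(baseline)
    s = statistics.stdev(baseline)
    return centre - sigmas * s, centre, centre + sigmas * s

def special_cause_points(points, lcl, ucl):
    """Points outside the control limits: candidate special-cause signals."""
    return [x for x in points if x < lcl or x > ucl]
```

Limits are computed from stable baseline data, then new points are checked against them; any point outside the limits is investigated as a possible special cause.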
Control PlanA process control document that lists all of the elements required to control variation in a process under study or in routine operation.
COPQ Cost of Poor Quality
CorrelationDenoted by R. A measure of the linear relationship between two variables. It can take any value between -1 and 1. If the correlation is 1, the two variables have a perfect positive linear relationship; if the correlation is -1, they have a perfect negative linear relationship. If |correlation| > 0.7, the relationship between the variables is considered strong.
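The coefficient R described here can be computed directly from two samples. A minimal pure-Python sketch (the function name is ours, not from any particular package):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient R between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A perfectly increasing pair of variables returns 1, a perfectly decreasing pair returns -1, and unrelated variables return a value near 0.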
Cost of Poor Quality (COPQ)The cost associated with providing poor-quality products or services: the cost of not producing first quality for the customer the first time and every time. Costs of quality issues are often given the broad categories of internal failure costs, external failure costs, appraisal costs, and prevention costs.
CpIndex of process capability – process centred
CpkProcess Capability Index, which takes off-centredness into account. Cpk = min[(USL-Mean)/(3 x sigma), (Mean-LSL)/(3 x sigma)] (i.e. depending on whether the shift is up or down). A typical objective is to have a Cpk > 1.67.
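The Cp and Cpk formulas translate directly into code. A hedged sketch (illustrative function names; USL and LSL are the upper and lower specification limits):

```python
def cp(usl, lsl, sigma):
    """Cp: tolerance width over six process standard deviations.
    Ignores where the process is centred."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Cpk: like Cp, but penalised when the mean drifts off centre.
    Equals Cp only when the process is perfectly centred."""
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
```

For a centred process Cpk equals Cp; as the mean drifts toward either specification limit, Cpk falls below Cp.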
CTCCritical to Cost
CTQCritical to Quality
Cumulative  sum (CUSUM) control chartAn alternative technique to Shewhart charting. CUSUM charts can detect small process shifts faster than Shewhart charts. 
Cumulative distribution function (CDF)The integral of the PDF from minus infinity to x.
CustomerPeople, processes, or equipment that receive products or services from you as a supplier. These may be internal or external customers.
D
DataSet of measurements obtained from a sample or census
DefectAny output of a process that fails to conform to a customer specification or requirement.
Defects Per Million (DPM)The number of defective parts out of one million. DPM = Fraction Nonconforming * 1M, where Fraction Nonconforming = 1 – Quality Yield. A quality metric often used in the Lean Six Sigma process.
Defects Per Million Opportunities (DPMO)The number of defects per million opportunities. It is used when an inspection unit has one or more categories of defects. DPMO = Fraction Nonconforming * 1M, where Fraction Nonconforming = Total # of defects / Total # of Opportunities (TOP), and TOP = # units * # opportunities per unit.
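Both metrics are simple ratios scaled to one million. An illustrative sketch of the two formulas (function and parameter names are our own):

```python
def dpm(defective_units, total_units):
    """Defects per million: fraction nonconforming scaled to one million."""
    return defective_units / total_units * 1_000_000

def dpmo(total_defects, units, opportunities_per_unit):
    """Defects per million opportunities: total defects over total
    opportunities (units x opportunities per unit), scaled to one million."""
    return total_defects / (units * opportunities_per_unit) * 1_000_000
```

For example, 5 defective units in a lot of 1,000 gives a DPM of 5,000; if each unit carries 10 opportunities for a defect, 5 defects give a DPMO of 500.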
Design of Experiments (DOE)A formal, proactive method for experimenting with Xs and Ys. It involves concepts such as factors, levels, centre points, blocking, and replications.
Discrete VariablesVariables whose only possible values are whole units.
DMADVThe methodology used for Design for Six Sigma Projects or new product/service introduction. It stands for Define, Measure, Analyse, Design and Verify. 
DMAICThe methodology used for Six Sigma Problem Solving. Its steps are Define, Measure, Analyse, Improve and Control. It is used only when a product or process exists at your company but is not meeting customer specifications or is not performing adequately.
DMAICDefine, Measure, Analyse, Improve, Control
DOEDesign of Experiments
Dot PlotFor nominal or ordinal data, a dot plot is similar to a bar chart, with the bars replaced by a series of dots. Each dot represents a fixed number of individuals. For continuous data, the dot plot is similar to a histogram, with the rectangles replaced by dots. A dot plot can also help detect any unusual observations (outliers) or gaps in the data set. 
DPUDefects Per Unit
DSODays Sales Outstanding
E
E(x)The expected value of the variable x (or the population mean). E(x) = (Σx_i)/n, where n is the sample size
ENTEntitlement for a process measurement
EPEXEvery product every interval
ErrorAmbiguity during data analysis caused by sources such as measurement bias, random measurement error, and mistakes.
Error ProofingA structured approach to ensuring a quality, error-free manufacturing environment. Error proofing assures that defects will never be passed to the next operation.
ExperimentA process undertaken to determine something that is not already known.
F
FThe function relating process inputs to process outputs
F-TestA statistical test that utilises tabular values from the F-distribution to assess significance.
Factorial ExperimentExperiment strategy that assesses several factors/variables simultaneously in one test. All possible combinations of factors at different levels are examined so that interactions, as well as the main effects, can be estimated. 
Failure Modes and Effects Analysis (FMEA)Analytical approach directed toward problem prevention, through which every possible failure mode is identified to determine its effect on the required function of the product or process.
Fault Tree AnalysisA schematic picture using logic symbols of possible failure modes and associated probabilities.
FIFOFirst in, First out
First Time Yield (FTY)see Quality Yield
Fishbonesee Cause and Effect Diagram
FMEAPotential Failure Modes and Effect Analysis
Force Field AnalysisA representation of the forces in an organisation that are supporting and driving toward a solution or that are restraining the process.
FPYFirst Pass Yield
Fractional factorial experimentA designed experiment strategy that assesses several factors/variables simultaneously in one test, where only a partial set of all possible combinations of factor levels are tested to more efficiently identify important factors. This type of test is much more efficient than a traditional one-at-a-time test strategy. 
G
Gauge AccuracyThe average difference observed between a gauge under evaluation and a master gauge when measuring the same parts over multiple readings.
Gauge R&RGauge Repeatability and Reproducibility. See Repeatability and Reproducibility
Gauge RepeatabilityA measure of the variation observed when a single operator uses a gauge to measure a group of randomly ordered (but identifiable) parts on a repetitive basis.
Gauge ReproducibilityA measure of the average variation observed between operators when multiple operators use the same gauge to measure a group of randomly ordered (but identifiable) parts on a repetitive basis.
Gantt ChartUsed in project management, it provides a graphical illustration of a schedule and helps plan, coordinate, and track specific tasks in a project.
GatingThe limitation of opportunities for deviation from the proven steps in the manufacturing process. The primary objective is to minimise human error. 
GBGreen Belt
Goodness of FitA value determined using statistical techniques that states probabilistically whether data can be shown to fit a theoretical distribution, such as the Normal or Poisson.
Green Belt (GB)A Lean Six Sigma role similar in function to the Black Belt, but with a shorter training period and reduced project scope. Green Belts are responsible for deploying Six Sigma techniques, managing small projects and implementing improvements.
H
Heijunka“Level production”: a technique for achieving even output flow by coordinated sequencing of very small production batches throughout the manufacturing line in a lean production or just-in-time (JIT) system.
HeteroscedasticHaving different variance. In a linear regression model, violation of the assumption of constant variance in the outcome variable (homoscedasticity) is called heteroscedasticity
HistogramA bar graph of a frequency distribution in which the widths of the bars are proportional to the classes into which the variable has been divided and the heights of the bars are proportional to the class frequencies
Homogeneous Poisson Process (HPP)A model that considers that failure rate does not change with time. 
HRHuman Resources
Hypergeometric DistributionA distribution that has a similar use to the binomial distribution; however, the sample size is “large” relative to the population size.
Hypothesis TestingConsists of a null hypothesis (H_o) and an alternative hypothesis (H_a), where, for example, H_o indicates equality between two process outputs, and H_a indicates nonequality. Through a hypothesis test, a decision is made to either reject or fail to reject the null hypothesis.
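As a concrete example, a one-sample z-test (a simple hypothesis test that assumes the population standard deviation is known; the function name is illustrative) can be written with the Python standard library alone:

```python
import math
from statistics import NormalDist, mean

def one_sample_z_test(sample, mu0, sigma):
    """Two-sided z-test of H_o: population mean == mu0, against
    H_a: population mean != mu0, assuming a known sigma."""
    z = (mean(sample) - mu0) / (sigma / math.sqrt(len(sample)))
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value
```

Reject H_o at significance level alpha (e.g. 0.05) when the returned p-value falls below alpha; otherwise fail to reject H_o.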
I
In Control An “In-Control” process is one that is free of assignable/special causes of variation. Such a condition is most often evidenced on a control chart which displays an absence of nonrandom variation.
Inner ArrayThe structuring in a Taguchi-style fractional factorial experiment of the factors that can be controlled in a process
Interquartile RangeDifference between the 75th percentile and the 25th percentile of a sample or population 
Inventory Turnover RateThe number of times an inventory cycles or turns over during the year. A frequently used method to compute inventory turnover is to divide the average inventory level into an annual cost of sales. 
ISInformation Systems
Ishikawa, Kaorusee Cause and Effect Diagram
ISOInternational Organization for Standardization
ISO 9000 Series of StandardsSeries of standards established in the 1980s by countries of Western Europe as a basis for judging the adequacy of the quality control systems of companies
ITInformation Technology
J
Jidoka“Automation with a human touch”: the ability to stop production lines, either manually by human intervention or mechanically, if there is a problem such as an equipment malfunction, a quality issue, or work that is delayed for whatever reason.
JIT ManufacturingA planning system for manufacturing processes that keeps material inventories at the manufacturing site to only what is needed. JIT is a pull system: the product is pulled along to its finish, rather than conventional mass production, which is a push system.
Just In Time (JIT) A philosophy of manufacturing based on planned elimination of all waste and continuous improvement of productivity. It encompasses the successful execution of all manufacturing activities required to produce a final product. 
K
kThousands 
KaizenA Japanese term that means continuous improvement: “kai” means change and “zen” means good. Continuing improvement involves everyone – managers and workers. In manufacturing, kaizen relates to finding and eliminating waste in machinery, labour or production methods.
KanbanJapanese term. It is one of the primary tools of the JIT system. It maintains an orderly and efficient flow of materials throughout the entire manufacturing process. It is usually a printed card that contains specific information such as part name, description, quantity, etc. It is a simple parts-movement system that depends on cards and boxes/containers to take parts from one workstation to another on a production line. The essence of the Kanban concept is that a supplier or the warehouse should only deliver components to the production line as and when they are needed so that there is no storage in the production area. 
KPOVKey Process Output Variable
KurtosisKurtosis is a measure of whether the data are peaked or flat relative to a normal distribution. For unimodal distributions, K = 3 is a mesokurtic distribution (normal or bell-shaped); K < 3 is a platykurtic distribution (flatter than normal, with shorter tails); and K > 3 is a leptokurtic distribution (more peaked than normal, with longer tails).
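The moment-based definition of kurtosis can be computed directly. A minimal sketch using the population formula K = m4/m2^2 (some texts subtract 3 to report "excess kurtosis"; this version does not):

```python
def kurtosis(data):
    """Population kurtosis m4 / m2^2: 3 for a normal distribution
    (mesokurtic), below 3 platykurtic, above 3 leptokurtic."""
    n = len(data)
    m = sum(data) / n
    m2 = sum((x - m) ** 2 for x in data) / n   # second central moment
    m4 = sum((x - m) ** 4 for x in data) / n   # fourth central moment
    return m4 / m2 ** 2
```

As a sanity check, a symmetric two-point distribution (the flattest possible shape) has K = 1, well below the normal distribution's K = 3.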
L
Layout Design / Cell DesignThe organisation of a production or service facility so that items having similar processing requirements are grouped together.
LCLLower Control Limit. Usually set three standard deviations below the mean.
Lean ManufacturingPhilosophy developed by Toyota that aims to eliminate waste (non-value-added steps, material, etc.) in the system. 
Lean MetricsMeasures that allow companies to evaluate and respond to their performance in a balanced way, without sacrificing quality to meet quantity objectives or increasing inventory levels to achieve machine efficiencies. The type of lean metric depends on the organisation and can fall into the following categories: financial performance, behavioural performance and core process performance.
Lean Performance Indicators (LPI)A consistent method to measure lean implementation effectiveness. Indicators include Real-Time Performance, Continuous Improvement Implementation, Lean Sustain, Waste Elimination and Profitability.
Least SquaresThe method used in regression to estimate the equation coefficients and constant so that the sum of squares of the differences between the individual responses and the fitted model is a minimum. 
Linear RegressionAims to find a linear relationship between a response variable and a possible predictor variable using the “least squares” method.
Local Faultssee Special Causes
Loss functionA continuous “Taguchi” function that measures the cost implications of product variability.
LSLLower Specification Limit. The minimum acceptable value of a variable.
LTPDLot Tolerance Percent Defective. The value of incoming quality where it is desirable to reject most lots. 
M
Malcolm Baldrige National Quality AwardThe annual self-evaluation covers the following seven categories of criteria: leadership, strategic planning, customer and market focus, information and analysis, human resource focus, process management, and business results. The National Institute of Standards and Technology (NIST), a federal agency within the Department of Commerce, is responsible for managing the Malcolm Baldrige National Quality Award. The American Society for Quality (ASQ) administers the Malcolm Baldrige National Quality Award under a contract with NIST.
Master Black Belt (MBB)Master Black Belts are Six Sigma or quality experts responsible for strategic implementations within the business. The Master Black Belt is qualified to teach other Six Sigma facilitators the methodologies, tools, and applications in all functions and levels of the company, and is a resource for utilising statistical process control within processes. A person who is an “expert” on Lean Six Sigma techniques and on project implementation, the Master Black Belt plays a key role in training and coaching.
MBBMaster Black Belt
MeanThe measure of central tendency. It is the average of a set of numbers. See E(x)
Mean SquareSum of squares divided by degrees of freedom. 
Mean Time Between Failure (MTBF)A term that can be used to describe the frequency of failures in a repairable system with a constant failure rate. MTBF is the average time that is expected between failures. MTBF = 1/failure rate. 
Measurement System (MS)The evaluation of a continuous or discrete measurement system with the intention of discovering how “good” or “bad” it is.
MedianThe measure of central tendency. It is calculated by ordering the data from smallest to largest. For an odd sample size, the median is the middle observation. For an even sample size, the median is the average of the middle two values of the sample.  
MinitabA statistical software package that operates in a Windows environment and is widely used in Six Sigma training.
Mistake Proofing (Poka Yoke)An approach to preventing errors in all aspects of manufacturing, customer service, procurement, etc. It employs visual signals that make mistakes stand out clearly, or devices that stop an assembly line or process if a part or step is missed. From the Japanese: to avoid (yokeru) inadvertent errors (poka).
MMMillions 
ModeThe value or item occurring most frequently in a series of observations or statistical data.
MSAMeasurement System Analysis
MUDAWaste
Multi-Vari Analysis(Inferential Statistics) A passive data collection method used to display the variation within parts, between machines or process parts, and over time.
Multi-VariType of multiple variable process study
Multi-vari chartA chart that is constructed to display the variance within units, between units, between samples, and between lots. 
MulticollinearityWhen there exists near linear dependencies between regressors, the problem of multicollinearity is said to exist. It can make the linear regression unstable and/or impossible to accomplish.  
Multimodal distributionA combination of more than one distribution that has more than one distinct peak. 
Multiple Correlation CoefficientThe square of the correlation (R). It measures the % of the variation in the response variable explained by the variation in the predictor variable. 
MURAUnevenness
MURIOverburden
N
Natural TolerancesThree standard deviations on either side of the mean.
Nominal Group Technique (NGT)A voting procedure to expedite team consensus on the relative importance of problems, issues, or solutions. 
Non-value addedAn activity performed in a process that does not add value to the output product or service, which may or may not have a valid business reason for being performed. 
Nonhomogeneous Poisson process (NHPP)A mathematical model that can often be used to describe the failure rate of a repairable system that has a decreasing, constant, or increasing rate.
Nonstationary ProcessA process with a variance that can grow without limit. 
Normal DistributionA continuous, symmetrical density function characterised by a bell-shaped curve. A bell-shaped distribution is often useful to describe various physical, mechanical, electrical, and chemical properties. 
NP chartStatistical Process Control chart on which the number of defective items is plotted (like the C chart), but the control limits are calculated using the binomial distribution. This chart can be used if producing defects is not rare.
Null HypothesisSee Hypothesis testing.
O
O.E.M.Original Equipment Manufacturer: a company that uses product components from one or more other companies to build a product that it sells under its own company name and brand.
OBCOperator Balance Chart
OEEOverall Equipment Effectiveness
One-Piece FlowAlso called continuous flow processing: a concept in which items are processed and moved directly from one processing step to the next, one piece at a time. One-piece flow helps to maximise utilisation of resources, shorten lead times, identify problems and improve communication between operations.
One-sided testA statistical consideration where, for example, an alternative hypothesis is that the mean of a population is less than a criterion value.
One-way analysis of variancesee Single Factor Analysis of Variance
Operation Cost Target (OCT)This value represents the maximum expenditure for material, labour, outsourcing, overhead, and all other costs associated with that project. This figure can then be divided between the various operations comprising the manufacturing process, in order to control costs at each step.
Orthogonal A property of a fractional factorial experiment that ensures effects can be determined separately without entanglement. The elements in an orthogonal set are not correlated.
OSHAOccupational Safety and Health Administration
OutlierA data point that does not fit a model because of an erroneous reading or some other abnormal situation. A value more than 3 sigma from the mean is considered an outlier.
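The 3-sigma rule above can be sketched in a few lines (illustrative only; on small samples an extreme point inflates the standard deviation and can mask itself):

```python
import statistics

def three_sigma_outliers(data):
    """Values lying more than three sample standard deviations
    from the mean of the data set."""
    m = statistics.mean(data)
    s = statistics.stdev(data)
    return [x for x in data if abs(x - m) > 3 * s]
```

Given twenty readings near 10 and one reading of 100, only the extreme value is flagged.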
OutputThe result of a process. Sometimes called the response of the process. The products or services that result from a process.
Overall Equipment Effectiveness (OEE) Measures the availability, performance efficiency, and quality rate of equipment – it is especially important to calculate OEE for constrained operations. 
P
P-chartA statistical process control chart used to determine whether the proportion of nonconformities is constant over time.
P-ValueThe smallest level of significance at which a null hypothesis would be rejected when a specified test procedure is used on a given data set.
P/TThe ratio of measurement process capability to customer tolerance
Pareto Chart A graphical technique used to quantify problems so that effort can be expended on fixing the “vital few” causes, as opposed to the “trivial many”. It is a bar chart that displays, in descending frequency, the number of observed defects in each category.
Pareto Principlesee 80/20 Rule
PDCAPlan-Do-Check-Act
Performance Ratio (PR)Represents the percent of tolerance width used by the variation. PR = 1/Pp
PFEPPlan for every part
Point Estimate An estimate calculated from sample data without a confidence interval.
Poisson DistributionA distribution that is useful, for example, to design reliability tests where the failure rate is considered to be constant as a function of usage. 
Poka YokeJapanese term which means mistake-proofing. A poka-yoke device is one that prevents incorrect parts from being made or assembled or easily identifies a flaw or error. To avoid (yokeru) inadvertent errors (poka)
PopulationThe entire collection of items that is the focus of concern
PpRepresents the capability of the process. It measures the relationship between the tolerance width and the range of variation. Pp = Specification Width / (6 x S), where sigma is estimated using the sample standard deviation S.
PpkRepresents the capability of the process, taking into account any difference between the design target and the actual process mean. Ppk = min(Ppk_upper, Ppk_lower), where Ppk_upper = (USL – X-bar)/(3 x S) and Ppk_lower = (X-bar – LSL)/(3 x S)
PPMParts Per Million
PrecisionThe closeness of agreement between randomly selected individual measurements or test results.
Prerequisite TreeA logical structure designed to identify all obstacles, and the responses needed to overcome them, in realising an objective. It identifies minimum necessary conditions without which the objective cannot be met.
Probability (P)A numerical expression for the likelihood of an occurrence.
Probability Density Function (PDF) – f(x)A mathematical function that can model the probability density reflected in a histogram.
Probability PlotData are plotted on a selected probability paper coordinate system to determine if a particular distribution is appropriate and to make statements about the percentiles of the population.
ProcessA method to make or do something that involves a number of steps. Also, a mathematical model such as the HPP (homogeneous Poisson process).
Process Input Variable The vital few input variables called “X’s”
Process ManagementModifying, altering, reshaping, redesigning any business/production process, work method or management style to deliver greater value.
Process MapA visual representation of the workflow either within a process – or an image of the whole operation. A good Process Map should allow people unfamiliar with the process to understand the interaction of causes during the workflow. A good Process Map should contain additional information relating to the Six Sigma project i.e. information per critical step about input and output variables, time, cost, DPU value. 
Process Output VariableThe output variable, called “Y”. Usually only one Y is studied at a time.
Process OwnerThe individual who accompanies the project work and assists with the implementation of the solution. This individual owns the process and ensures improvements implemented are sustained and continuously improved.
Process Route TableShows what machines and equipment are needed for processing a component or assembly. These tables aid in creating ordinary lines and grouping workpieces into work cells. 
Process Spread The range of values that a given process characteristic displays; this term most often applies to the range but may also include the variance. The spread may be based on a set of data collected at a specific point in time or may reflect the variability across a given period of time.
Q
QAQuality Assurance
QCQuality Control
Qualitative DataData that is non-numerical and allows partitioning of a population. The types of qualitative data are: nominal, ordinal and binary.
Quality Function Deployment (QFD)Quality Function Deployment (QFD) is a systematic process for motivating a business to focus on its customers. It is used by cross-functional teams to identify and resolve issues involved in providing products, processes, services and strategies which will more than satisfy their customers. A prerequisite to QFD is Market Research. This is the process of understanding what the customer wants, how important these benefits are, and how well different providers of products that address these benefits are perceived to perform. This is a prerequisite to QFD because it is impossible to consistently provide products that will attract customers unless you have a very good understanding of what they want.
Quality YieldPercentage of products that were not defective. Quality Yield = (1 – fraction defective) *100%. Also called First Time Yield.
Quantitative DataNumerical data. May allow each member of a population to be uniquely identified.
Quick ChangeoverA technique to analyse and reduce the resources needed for equipment setup, including the exchange of tools and dies. Single Minute Exchange of Dies (SMED) is an approach to reduce output and quality losses due to changeovers.
R
RSee Correlation
RRange
R & RGauge Repeatability and Reproducibility (often reported as % R&R)
R & RRecognition and Reward
R-chartSee X-bar and R chart
R^2See Multiple Correlation Coefficient
RandomHaving no specific pattern.
Random EffectsA factorial experiment where the variance of factors is investigated (as opposed to a fixed-effects model).
Random VariableA random variable is a function that associates a unique numerical value with every outcome of an experiment. The value of the random variable will vary from trial to trial as the experiment is repeated. There are two types of random variables: discrete and continuous.
RangeFor a set of numbers, the absolute difference between the largest and smallest value.
RegressionData collected from an experiment are used to quantify empirically, through a mathematical model, the relationship between the response variable and the influencing factors. 
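As a minimal sketch of the idea, an ordinary least-squares fit of a straight line to experimental data can be written as follows (names are illustrative, not from the entry above):

```python
def least_squares_fit(xs, ys):
    """Fit y = a + b*x by ordinary least squares; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by the variance of x.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b
```

Fitting the points (0, 1), (1, 3), (2, 5), (3, 7) recovers an intercept of 1 and a slope of 2.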
ReliabilityThe proportion surviving at some point in time during the life of a device. A generic description of tests evaluating failure rates. 
RepeatabilityThe variation in measurements obtained with one measurement instrument when used several times by one appraiser while measuring the identical characteristic on the same part.
ReplicationTest trials that are made under identical conditions.
ReproducibilityThe variation in the average of the measurements made by different appraisers using the same measuring instrument when measuring the identical characteristic on the same part
Residual ErrorExperimental error
ResidualsIn an experiment, the differences between the observed responses and the values predicted by a model.
Robust A description of a procedure that is not sensitive to deviations from some of its underlying assumptions.
Rolled Throughput YieldThe probability of being able to pass one unit of product or service through the entire process defect free.
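Since the rolled throughput yield is the product of the individual step yields, it can be sketched like this (a simple illustration with assumed step yields):

```python
from math import prod

def rolled_throughput_yield(step_yields):
    """Probability a unit passes every process step defect free:
    the product of the individual first time yields."""
    return prod(step_yields)

# Four steps at 98%, 95%, 99% and 97% first time yield give an
# overall RTY of roughly 0.894, i.e. about 89.4% defect free.
```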
Root CauseA factor that caused a non-conformance and should be permanently eliminated through process improvement
RTYRolled Throughput Yield
Run ChartA time series plot that permits the study of observed data for trends or patterns over time, where the x-axis is time and the y-axis is the measured variable.
S
sStandard Deviation
SampleA selection of items from a population.
Sampling distributionA distribution derived from a parent distribution by random sampling.
Scatter diagramA plot to assess the relationship between two variables. 
Screening experimentThe first step of a multiple factorial experiment strategy, where the experiment primarily assesses the significance of main effects. Two-factor interactions are normally considered in the experiments that follow a screening experiment. Screening experiments should typically consume only 25% of the monies that are allotted for the total experiment effort to solve a problem.
Shewhart Control ChartThe standard control chart, credited to Dr Walter Shewhart, which uses 3-sigma limits to separate the steady component of variation from assignable causes. 
SigmaThe Greek letter (σ) often used to describe the standard deviation of a population.
Sigma level or sigma quality level A metric calculated by some to describe the capability of a process to meet specification. A Six Sigma quality level is said to correspond to a 3.4 ppm defect rate. Pat Spagon from Motorola University prefers to distinguish between sigma as a measure of spread and sigma as used in sigma quality level. 
SignificanceA statistical statement indicating that the level of a factor causes a difference in a response, with a certain risk of being in error.
Single-factor analysis of varianceOne-way analysis of variance with two or more levels (or treatments), used to determine whether there is a significant difference between level effects.
SIPOC diagramA SIPOC diagram is a tool used by a team to identify all relevant elements of a process improvement project before work begins, in order to map the process. It is typically employed during the Define phase of the Six Sigma DMAIC methodology. The categories to map are: Supplier, Inputs, Process, Outputs and Customer.
Six SigmaA term coined by Motorola that emphasises the improvement of processes for the purpose of reducing variability and making general improvements. 
SkewSkewness is defined as asymmetry in the distribution of the sample data values. Values on one side of the distribution tend to be further from the ‘middle’ than values on the other side. Distributions of data that are positively skewed have a tail to the right; negatively skewed data have a tail to the left. 
SMEDSingle Minute Exchange of Die
Special CausesA problem that occurs in a process because of an unusual condition. An out-of-control condition in a process control chart. (Also called: by Juran – Sporadic problems, by Deming – Local Faults, by Shewhart – Assignable Causes)
SpecificationA criterion that is to be met by a part or product.
Sporadic ProblemSee Special Causes
StabilityThe total variation in the measurements obtained with a measurement system on the same master or parts when measuring a single characteristic over an extended time period.
Stable ProcessA process that does not contain any special cause variation, i.e. it is in statistical control. 
Standard DeviationA mathematical quantity that describes the variability of response. It equals the square root of the variance. 
Standard ErrorThe square root of the variance of the sampling distribution of a statistic.
Standard WorkThe length of time that should be required to set up a given machine or operation and run one part, assembly, batch, or end product through that operation. This time is used in determining machine requirements and labour requirements. 
Stationary processA process whose variance is ultimately constant over time.
Statistical process control (SPC)The application of statistical techniques in the control of processes. SPC is often considered a subset of SQC, where the emphasis in SPC is on the tools associated with the process but not product acceptance techniques. 
Statistical quality control (SQC)The application of statistical techniques in the control of quality. SQC includes the use of regressions analysis, tests of significance, acceptance sampling, control charts, distributions, and so on. 
Stratification AnalysisDecomposing a variable into groups in order to identify or narrow the search for root causes.
Sum of SquaresThe summation of the squared deviations relative to zero, to level means, or the grand mean of an experiment. 
SystemDevices that collectively perform a function. Here, systems are considered repairable, where a system failure is caused by the failure of one or more devices. System failure rates can either be constant or change as a function of usage (time).
System FaultsSee Common Causes
T
t-TestA statistical test that utilises tabular values from the t distribution to assess, for example, whether two population means are different. 
Taguchi philosophyThe basic philosophy of reducing product/process variability for the purpose of improving quality and decreasing the loss to society; however, the procedures used to achieve this objective often differ. 
Takt TimeThe time required between the completion of successive units of the end product. Takt time is used to pace lines in production environments. 
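Takt time is commonly computed as net available production time divided by customer demand. A minimal sketch (the formula is standard lean practice; the names are ours):

```python
def takt_time(available_time, customer_demand):
    """Net available production time divided by units demanded."""
    return available_time / customer_demand

# A 450-minute shift with a demand of 900 units gives a takt
# time of 0.5 minutes (30 seconds) per unit.
```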
Theory of Constraints (TOC)A management philosophy that can be viewed as three separate but interrelated areas – logistics, performance measurement, and logical thinking. TOC focuses the organisation’s scarce resources on improving the performance of the true constraint, and therefore the bottom line of the organisation. 
TOCTheory of Constraints
ToleranceSpecifies an allowable deviation from a target value where a character is still acceptable. It is the difference between the upper specification limit (USL) and the lower specification limit (LSL).
Total Productive Maintenance (TPM)A maintenance program concept that brings maintenance into focus in order to minimise downtime and maximise equipment usage. The goal of TPM is to avoid emergency repairs and keep unscheduled maintenance to a minimum. 
Total Quality ManagementTQM comprises management and control activities based on the leadership of top management and the involvement of all employees and departments, from planning and development through to sales and service. These activities focus on quality assurance: the qualities that satisfy the customer are built into products and services during these processes and then offered to consumers.
Toyota Production System (TPS)A comprehensive production management system. Its basic idea is to maintain a continuous flow of products in factories in order to adapt flexibly to changes in demand. Realising such a production flow is called just-in-time production: producing only the necessary units, in the necessary quantity, at the necessary time. As a result, excess inventories and excess workforce are naturally diminished, achieving increased productivity and cost reduction. 
TPMTotal Productive Maintenance
Transition TreeA cause-and-effect logic tree designed to provide step-by-step progress from initiation to completion of a course of action or change. It is an implementation tool. 
Trend chartA chart to view the resultant effect of a known variable on the response of a process. 
Trimmed meanA measure of central tendency: the average of a sample after discarding a percentage of the smallest and largest observations. For example, a 10% trimmed mean discards the smallest 10% and the largest 10% of observations. 
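The trimming described above can be sketched in a few lines (an illustrative implementation, not from the entry itself):

```python
def trimmed_mean(data, trim_fraction):
    """Average after discarding trim_fraction of the observations
    from each tail of the sorted sample."""
    k = int(len(data) * trim_fraction)
    kept = sorted(data)[k:len(data) - k]
    return sum(kept) / len(kept)

# A 20% trimmed mean of [1, 2, 3, 4, 100] discards 1 and 100,
# leaving (2 + 3 + 4) / 3 = 3.0.
```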
Two-sided TestA statistical test in which the null hypothesis states, for example, that a population mean equals a criterion value, and departures in either direction count as evidence against it. 
Type I ErrorSee Alpha
Type II ErrorSee Beta
U
U ChartA statistical control chart for the average defect rate. 
UCLUpper Control Limit. Usually represents a 3-sigma deviation from the mean value. 
Unbiased StatisticA statistic is an unbiased estimate of a given parameter when the mean of the sampling distribution of that statistic can be shown to be equal to the parameter being estimated.
UnimodalA distribution that only has one peak (ex: Normal distribution).
USLUpper Specification Limit. A value below which performance of a product is acceptable. 
V
Value Add (VA) & Non-Value Add (NVA) Lead time ratio Provides insight into how much of the lead time is spent on value-added activities compared to non-value-added activities, using time as the unit of measure. 
Value Stream Mapping (VSM)A graphical tool that helps you see and understand the flow of material and information as a product makes its way through the value stream, tying together lean concepts and techniques. The map is a visual picture of how material and information flow from suppliers, through manufacturing, to the customer, and includes calculations of total cycle time and value-added time. It is typically drawn for the current state of the value chain and for the future state, to indicate where the business is going.
Value-AddedAny action within a process that adds value to the product from the customer’s point of view. 
VarianceMeasure of dispersion: the average squared deviation from the process mean. It is the square of the standard deviation. For a sample, Var(x) = Σ(xi – x̄)^2 / (n – 1)
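The sample variance formula, and the standard deviation as its square root, can be sketched directly (function names are illustrative):

```python
from math import sqrt

def sample_variance(xs):
    """Sum of squared deviations from the mean, divided by n - 1."""
    n = len(xs)
    x_bar = sum(xs) / n
    return sum((x - x_bar) ** 2 for x in xs) / (n - 1)

def sample_std_dev(xs):
    """Standard deviation is the square root of the variance."""
    return sqrt(sample_variance(xs))
```

For the sample [2, 4, 4, 4, 5, 5, 7, 9], the mean is 5, the variance is 32/7 ≈ 4.571, and the standard deviation is about 2.138.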
Visual ManagementA set of techniques that makes operation standards visible so that workers can follow them more easily. These techniques expose waste so that it can be prevented and eliminated. 
VSMValue Stream Mapping
W
Weibull DistributionA widely used distribution because its density function can take many possible shapes. The two-parameter form is described by shape and scale parameters. 
White NoiseCommon cause variation in a process
WIPWork In Progress
Workflow Diagram (Spaghetti Diagram) Shows the movement of material, identifying areas of waste. Aids teams to plan future improvements, such as one-piece flow and work cells. 
X
X-BarThe sample mean; the average of the observations in a subgroup.
X-Bar Chart and R chartA pair of control charts used with processes that have subgroups of 2 or more. These standard charts help determine whether a process is stable and predictable: the X-bar chart shows how the average changes over time, and the R chart shows how the range within each subgroup changes over time. 
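The chart centre lines and limits can be sketched using the published control chart constants; the values below are the standard ones for subgroups of size five, and the function names are ours:

```python
# Published control chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Centre lines and control limits for X-bar and R charts,
    returned as (lower limit, centre line, upper limit) tuples."""
    xbars = [sum(s) / len(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    x_double_bar = sum(xbars) / len(xbars)   # grand average
    r_bar = sum(ranges) / len(ranges)        # average range
    return {
        "xbar": (x_double_bar - A2 * r_bar, x_double_bar, x_double_bar + A2 * r_bar),
        "r": (D3 * r_bar, r_bar, D4 * r_bar),
    }
```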
Y
YieldSee Quality Yield
Z
Z ScoreThe distance of a value from the mean, measured in standard deviations, typically assuming a normal distribution.
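The z score calculation is a one-liner (names are illustrative):

```python
def z_score(value, mean, std_dev):
    """Distance of value from the mean, in standard deviation units."""
    return (value - mean) / std_dev

# A measurement of 110 against a mean of 100 and a standard
# deviation of 5 sits two standard deviations above the mean.
```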

We trust you find the Lean Six Sigma Terminology post of value. For more on the graphical statistical tools and techniques used within Lean Six Sigma, visit