The individual terms that form part of the Lean Six Sigma terminology are not unique but as a collective, they are key to effectively employing the methodologies to achieve process improvement. This post contains over 300 terms and acronyms that are used within Lean Six Sigma.
Comprehensive list of Lean Six Sigma Terminology
The Lean Six Sigma terminology is organised in alphabetical order for ease of identification.
|Muda, Mura, Muri
|Production, Preparation, Process
|A methodology for organising, cleaning, developing, and sustaining a productive work environment. Improved safety, ownership of the workspace, improved productivity, and improved maintenance are some of the benefits of the 5S program.
|Sort, Straighten, Shine, Standardise, Sustain
|Man, Machine, Measure, Mother Nature, Method, Material
|Transport, Inventory, Motion, Waiting, Overproduction, Overprocessing, Defects, Human Intellect. Additional wastes are Energy, Pollution and Space.
|80/20 Rule (Pareto)
|The rule that suggests that 20% of causes (categories) will account for 80% of the trouble
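As a quick illustration of the 80/20 idea, the sketch below uses hypothetical defect counts (not data from this post) to find the "vital few" categories that account for roughly 80% of the trouble:

```python
# Hypothetical defect counts by category (illustrative only)
defects = {"Scratch": 120, "Dent": 45, "Misalignment": 18, "Crack": 10, "Stain": 7}

total = sum(defects.values())
cumulative = 0
vital_few = []
# Walk the categories from most to least frequent, accumulating their share
for category, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    vital_few.append(category)
    if cumulative / total >= 0.80:  # stop once ~80% of the trouble is covered
        break

print(vital_few)  # the "vital few" categories
```

Here two of the five categories account for over 80% of all defects, which is the pattern a Pareto Chart makes visible.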
|Team-oriented problem-solving method for product and process improvement, structured into eight disciplines (preceded by a preparation step): define the problem and prepare for process improvement, establish a team, describe the problem, develop interim containment, define and verify the root cause, choose a permanent corrective action, implement the corrective action, prevent recurrence, and recognise and reward the contributors.
|Acceptable Quality Level (AQL)
|The maximum proportion of defective units considered satisfactory as a process average.
|The highest number of nonconforming units or defects found in the sample that still permits acceptance of the lot.
|A statement about how close the data are to the target.
|A tool used to organise and summarise large amounts of data (ideas, issues, solutions, problems) into logical categories based on user-perceived relationships.
|In hypothesis testing: rejecting the null hypothesis (no difference) erroneously; assuming a relationship where none exists. Also referred to as a Type I error. E.g. convicting an innocent person.
|Analysis of Variance (ANOVA)
|A calculation procedure to allocate the amount of variation in a process and determine if it is significant or is caused by random noise.
|A visual production-control device (usually a lighted overhead display) that continuously shows the changing status of the production line and sounds alerts if a problem is imminent.
|Analysis of Variance
|Annual Operating Plan
|see Special Causes
|Numerical information at the nominal level. Its subdivision is not conceptually meaningful. It represents the frequency of occurrence within some discrete category.
|A timely inspection of a process or system to ensure that it conforms to documented quality standards. An audit also brings out discrepancies between the documented standards and the standards actually followed, and may show how well or how badly the documented standards support the processes currently in use.
|Average Outgoing Quality
|The expected quality of outgoing product following the use of an acceptance sampling plan for a given value of incoming product quality.
|Horizontal or vertical bars that graphically illustrate the magnitude of multiple situations.
|A curve used to describe the life cycle of a system/device as a function of usage. Initially, when the curve slopes downwards, the failure rate decreases with use. Then a constant failure rate is reached, and finally the curve slopes upwards as the failure rate increases with usage.
|“Business As Usual”: the old way of doing business, involving repetitive tasks with no critical sense of improvement.
|The concept of discovering where the best performance is being achieved, whether in your company, by a competitor, or in an entirely different industry.
|In hypothesis testing: the failure to reject a false null hypothesis; assuming no relationship exists when one does. E.g. failing to convict a guilty person.
|Binary Attribute Data
|Data that can take only two values, indicating the presence or absence of some characteristic of the process. E.g. pass/fail
|Full-time team leaders responsible for recommending and implementing process improvement projects (DMAIC or DMADV) within the business — to drive customer satisfaction levels and business productivity up.
|Special cause variation in the process
|Baseline for measurements
|C & E
|Cause and Effects matrix
|Statistical Process Control Chart used to determine whether the number of defects/unit is constant over time.
|A mathematical calculation used to compare the process variation to a specification. Examples are Cp and Cpk. Used to compare processes across businesses.
|Cause and Effect Diagram
|The tool used for brainstorming and categorising potential causes of a problem. The most commonly used cause branches are Man, Method, Machine and Material. Also known as the Ishikawa or Fishbone diagram.
|Collecting information on an entire population
|Central Limit Theorem
|The means of samples from a population will tend to be normally distributed around the population mean.
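The Central Limit Theorem can be demonstrated with a short simulation; the population shape, sample size and repetition counts below are arbitrary choices for illustration:

```python
import random
import statistics

random.seed(0)
# Draw from a decidedly non-normal (uniform) population on [0, 10]
population = [random.uniform(0, 10) for _ in range(100_000)]

# Means of many samples of size 50 cluster tightly around the population mean (~5)
sample_means = [
    statistics.mean(random.sample(population, 50)) for _ in range(1_000)
]

print(round(statistics.mean(sample_means), 2))   # close to 5
print(round(statistics.stdev(sample_means), 2))  # far smaller than the population stdev (~2.89)
```

The histogram of `sample_means` would look bell-shaped even though the underlying population is flat, which is the theorem's practical content.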
|An indication of the location or centrality of the data. The most common measures of central tendency are: mean, median and mode.
|Business leaders and senior managers who ensure that resources are available for training and projects and who set and maintain broad goals for improvement projects. A business leader who identifies, selects, and charters continuous improvement projects and supports the delivery.
|See Common Causes
|A data collection form consisting of multiple categories. Each category has an operational definition and can be checked off as it occurs. Properly designed, the Check Sheet helps to summarise the data, which is often displayed in a Pareto Chart.
|A statistical goodness-of-fit test used to test the assumption that the distribution of a set of data is similar to the expected distribution.
|See Common Causes
|Small, random forces that act continuously on a process (Also called: by Juran – Chronic Problems, by Deming – System Faults, by Shewhart – Chance Causes)
|A region containing the limits of a parameter, with an associated level of confidence that the bounds are large enough to contain the true parameter value
|A variable whose possible values consist of an entire interval on the number line, i.e. it can take any value.
|Customer requirements for a process under study or in routine operation.
|A procedure used for tracking a process through time to distinguish variation that is inherent in the process (common cause) from variation that signals a change to the process (special cause).
|A document that lists all of the elements required to control variation in a process under study or in routine operation.
|Cost of Poor Quality
|It is denoted by R. A measure of the linear relationship between two variables. It can take any value between −1 and 1. If the correlation is 1, the two variables have a perfect positive linear relationship; if it is −1, a perfect negative linear relationship. If |correlation| > 0.7, the relationship between the variables is considered strong.
|Cost of Poor Quality (COPQ)
|The cost associated with providing poor-quality products or services; the cost of not producing first quality for the customer the first time and every time. Costs of quality issues are often grouped into the broad categories of internal failure costs, external failure costs, appraisal costs, and prevention costs.
|Index of process capability – process centred
|Process Capability Index, which takes into account off-centredness. Cpk = min[ (USL-Mean)/(3 x sigma), (Mean-LSL)/(3 x sigma)] (i.e. depending on whether the shift is up or down). A typical objective is to have a Cpk > 1.67.
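A minimal sketch of the Cp and Cpk calculations, using hypothetical measurements and assumed specification limits:

```python
import statistics

# Hypothetical measurements of a characteristic with spec limits 9.0-11.0
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]
LSL, USL = 9.0, 11.0

mean = statistics.mean(data)
sigma = statistics.stdev(data)

cp = (USL - LSL) / (6 * sigma)         # ignores how the process is centred
cpk = min((USL - mean) / (3 * sigma),  # penalises an off-centre process
          (mean - LSL) / (3 * sigma))

print(round(cp, 2), round(cpk, 2))
```

Because this hypothetical process is perfectly centred between the limits, Cp and Cpk coincide; any shift of the mean would pull Cpk below Cp.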
|Index of process capability – process not centred
|Critical to Cost
|Critical to Quality
|Cumulative sum (CUSUM) control chart
|An alternative technique to Shewhart charting. CUSUM charts can detect small process shifts faster than Shewhart charts.
|Cumulative distribution function (CDF)
|The integral of the PDF from minus infinity to x.
|People, processes, or equipment that receive products or services from you as a supplier. These may be internal or external customers.
|Set of measurements obtained from a sample or census
|Defects Per Million (DPM)
|The number of defective parts out of one million. DPM = fraction nonconforming × 1,000,000, where fraction nonconforming = 1 − Quality Yield. A quality metric often used in the Lean Six Sigma process, calculated as the number of defects observed divided by the number of units inspected, scaled to one million units.
|Defects Per Million Opportunities (DPMO)
|The number of defective parts per million opportunities. It is used when an inspection unit has one or more categories of defects. DPMO = fraction nonconforming × 1,000,000, where fraction nonconforming = total # of defects / total # of opportunities (TOP), and TOP = # units × # opportunities per unit.
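The DPMO formula above can be worked through with hypothetical inspection figures:

```python
# Hypothetical inspection results (illustrative only)
units = 500
opportunities_per_unit = 4   # each unit has 4 distinct defect opportunities
total_defects = 6

total_opportunities = units * opportunities_per_unit   # TOP = # units x # opportunities
dpmo = (total_defects / total_opportunities) * 1_000_000

print(dpmo)  # 3000.0
```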
|Design of Experiments (DOE)
|A formal, proactive method for experimenting with Xs and Ys. It involves concepts such as factors, levels, centre points, blocking, and replications.
|Variables whose only possible values are whole units.
|The methodology used for Design for Six Sigma Projects or new product/service introduction. It stands for Define, Measure, Analyse, Design and Verify.
|The methodology used for Six Sigma Problem Solving. Its steps are Define, Measure, Analyse, Improve and Control. It is used only when a product or process exists at your company but is not meeting customer specifications or is not performing adequately.
|Define, Measure, Analyse, Improve, Control
|Design of Experiments
|For nominal or ordinal data, a dot plot is similar to a bar chart, with the bars replaced by a series of dots. Each dot represents a fixed number of individuals. For continuous data, the dot plot is similar to a histogram, with the rectangles replaced by dots. A dot plot can also help detect any unusual observations (outliers) or gaps in the data set.
|Defects Per Unit
|Days Sales Outstanding
|The expected value of the variable x (or the population mean). E(x) = (Σx_i)/n, where n is the sample size
|Entitlement for a process measurement
|Every product every interval
|Ambiguities during data analysis caused by sources such as measurement bias, random measurement error, and mistakes.
|A structured approach to ensuring a quality, error-free manufacturing environment. Error proofing ensures that defects are never passed to the next operation.
|A process undertaken to determine something that is not already known.
|The function relating process inputs to process outputs
|A statistical test that utilises tabular values from the F-distribution to assess significance.
|Experiment strategy that assesses several factors/variables simultaneously in one test. All possible combinations of factors at different levels are examined so that interactions, as well as the main effects, can be estimated.
|Failure Modes and Effects Analysis (FMEA)
|An analytical approach directed toward problem prevention in which every possible failure mode is identified to determine its effect on the required function of the product or process.
|Fault Tree Analysis
|A schematic picture using logic symbols of possible failure modes and associated probabilities.
|First in, First out
|First Time Yield (FTY)
|see Quality Yield
|see Cause and Effect Diagram
|Potential Failure Modes and Effect Analysis
|Force field Analysis
|A representation of the forces in an organisation that are supporting and driving toward a solution or restraining progress.
|First Pass Yield
|Fractional factorial experiment
|A designed experiment strategy that assesses several factors/variables simultaneously in one test, where only a partial set of all possible combinations of factor levels are tested to more efficiently identify important factors. This type of test is much more efficient than a traditional one-at-a-time test strategy.
|The average difference observed between a gage under evaluation and a master gage when measuring the same parts over multiple readings.
|Gage Repeatability and Reproducibility. See Repeatability and Reproducibility
|A measure of the variation observed when a single operator uses a gauge to measure a group of randomly ordered (but identifiable) parts on a repetitive basis.
|A measure of the average variation observed between operators when multiple operators use the same gauge to measure a group of randomly ordered (but identifiable) parts on a repetitive basis.
|Used in project management, it provides a graphical illustration of a schedule and helps plan, coordinate, and track specific tasks in a project.
|The limitation of opportunities for deviation from the proven steps in the manufacturing process. The primary objective is to minimise human error.
|Goodness of Fit
|A value determined using statistical techniques that states probabilistically whether data can be shown to fit a theoretical distribution, such as the Normal or Poisson.
|A Six Sigma practitioner responsible for deploying Six Sigma techniques, managing small projects and implementing improvements.
|Green Belt (GB)
|A Lean Six Sigma role similar in function to the Black Belt, but with reduced training length and project scope.
|“Level production”: a technique of achieving an even output flow by coordinated sequencing of very small production batches throughout the manufacturing line in a lean production or just-in-time (JIT) system.
|Having different variance. In a linear regression model, violation of the assumption of constant variance in the outcome variable (homoscedasticity) is called heteroscedasticity.
|A bar graph of a frequency distribution in which the widths of the bars are proportional to the classes into which the variable has been divided and the heights of the bars are proportional to the class frequencies
|Homogeneous Poisson Process (HPP)
|A model that assumes the failure rate does not change with time.
|A distribution that has a similar use to the binomial distribution; however, the sample size is “large” relative to the population size.
|Consists of a null hypothesis (H_o) and an alternative hypothesis (H_a), where, for example, H_o indicates equality between two process outputs, and H_a indicates nonequality. Through a hypothesis test, a decision is made to either reject or fail to reject the null hypothesis.
|An “In-Control” process is one that is free of assignable/special causes of variation. Such a condition is most often evidenced on a control chart which displays an absence of nonrandom variation.
|The structuring in a Taguchi-style fractional factorial experiment of the factors that can be controlled in a process
|Difference between the 75th percentile and the 25th percentile of a sample or population
|Inventory Turnover Rate
|The number of times an inventory cycles or turns over during the year. A frequently used method to compute inventory turnover is to divide the average inventory level into an annual cost of sales.
|see Cause and Effect Diagram
|International Organization for Standardization
|ISO 9000 Series of Standards
|Series of standards established in the 1980s by countries of Western Europe as a basis for judging the adequacy of the quality control systems of companies
|“Automation with a human touch”: the ability to stop production lines, either manually by human intervention or automatically, if there is a problem such as an equipment malfunction, a quality issue, or work that is delayed for whatever reason
|A planning system for manufacturing processes that optimises the needed material inventories at the manufacturing site to only what is needed. JIT is a pull system; the product is pulled along to its finish, rather than conventional mass production, which is a push system.
|Just In Time (JIT)
|A philosophy of manufacturing based on planned elimination of all waste and continuous improvement of productivity. It encompasses the successful execution of all manufacturing activities required to produce a final product.
|A Japanese term that means continuous improvement. “Kai” means change and “zen” means good. The Japanese term for improvement; continuing improvement involving everyone – managers and workers. In manufacturing, kaizen relates to finding and eliminating waste in machinery, labour or production methods.
|Japanese term. It is one of the primary tools of the JIT system. It maintains an orderly and efficient flow of materials throughout the entire manufacturing process. It is usually a printed card that contains specific information such as part name, description, quantity, etc. It is a simple parts-movement system that depends on cards and boxes/containers to take parts from one workstation to another on a production line. The essence of the Kanban concept is that a supplier or the warehouse should only deliver components to the production line as and when they are needed so that there is no storage in the production area.
|Key Process Output Variable
|Kurtosis is a measure of whether the data are peaked or flat relative to a normal distribution. For unimodal distributions, K = 3 is a mesokurtic distribution (normal or bell-shaped); K < 3 is a platykurtic distribution (flatter than normal, with shorter tails); and K > 3 is a leptokurtic distribution (more peaked than normal, with longer tails).
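Kurtosis can be computed by hand from its definition; the helper below uses the population convention described above (K = 3 for a normal distribution) and hypothetical data:

```python
import statistics

def kurtosis(data):
    """Population kurtosis (K = 3 for a normal distribution)."""
    mean = statistics.fmean(data)
    n = len(data)
    var = sum((x - mean) ** 2 for x in data) / n      # population variance
    return sum((x - mean) ** 4 for x in data) / (n * var ** 2)

# A flat, uniform-like sample is platykurtic (K < 3)
flat = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(round(kurtosis(flat), 2))
```

Note that some software reports "excess kurtosis" (K − 3), under which a normal distribution scores 0 rather than 3.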
|Layout Design / Cell Design
|The organisation of a production or service facility so that items having similar processing requirements are grouped together.
|Lower Control Limit. Usually represents a downwards 3-sigma deviation from the mean.
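Control limits at 3 sigma on either side of the mean can be sketched as follows, using hypothetical subgroup data:

```python
import statistics

# Hypothetical subgroup means from a stable process
samples = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 9.9]

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)

UCL = mean + 3 * sigma  # upper control limit
LCL = mean - 3 * sigma  # lower control limit

print(round(LCL, 3), round(UCL, 3))
```

Points falling outside these limits would be treated as evidence of special-cause variation; in practice, control-chart constants (e.g. for X-bar/R charts) are used rather than a raw standard deviation, so this is a simplified illustration.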
|Philosophy developed by Toyota that aims to eliminate waste (non-value-added steps, material, etc.) in the system.
|Lean metrics allow companies to measure, evaluate and respond to their performance in a balanced way, without sacrificing quality to meet quantity objectives or increasing inventory levels to achieve machine efficiencies. The type of lean metric depends on the organisation and can fall into the following categories: financial performance, behavioural performance and core process performance.
|Lean Performance Indicators (LPI)
|A consistent method to measure lean implementation effectiveness. Indicators include: real-time performance, continuous improvement implementation, lean sustainment, waste elimination and profitability.
|The method used in regression to estimate the equation coefficients and constant so that the sum of squares of the differences between the individual responses and the fitted model is a minimum.
|Aims to find a linear relationship between a response variable and a possible predictor variable using the “least squares” method.
|see Special Causes
|A continuous “Taguchi” function that measures the cost implications of product variability.
|Lower Specification Limit. The minimum acceptable value of a variable.
|Lower Specification Limit
|Lot Tolerance Percent Defective. The value of incoming quality where it is desirable to reject most lots.
|Malcolm Baldrige National Quality Award
|The annual self-evaluation covers the following seven categories of criteria: leadership, strategic planning, customer and market focus, information and analysis, human resource focus, process management, and business results. The National Institute of Standards and Technology (NIST), a federal agency within the Department of Commerce, is responsible for managing the Malcolm Baldrige National Quality Award. The American Society for Quality (ASQ) administers the Malcolm Baldrige National Quality Award under a contract with NIST.
|Master Black Belt (MBB)
|Master Black Belts are Six Sigma or quality experts and are responsible for strategic implementations within the business. The Master Black Belt is qualified to teach other Six Sigma facilitators the methodologies, tools, and applications in all functions and levels of the company, and is a resource for utilising statistical process control within processes. A person who is an “expert” on Lean Six Sigma techniques and on project implementation. Master Black Belts play a key role in training and coaching.
|Master Black Belt
|A measure of central tendency: the average of a set of numbers. See E(x)
|Sum of squares divided by degrees of freedom.
|Mean Time Between Failure (MTBF)
|A term that can be used to describe the frequency of failures in a repairable system with a constant failure rate. MTBF is the average time that is expected between failures. MTBF = 1/failure rate.
|Measurement System (MS)
|Evaluating a continuous or discrete measurement system with the intention of discovering how “good” or “bad” the measurement system is.
|A measure of central tendency, calculated by ordering the data from smallest to largest. For an odd sample size, the median is the middle observation; for an even sample size, it is the average of the middle two values.
|Statistical software package that operates in a Windows environment and is the main statistical package used by students.
|Mistake Proofing (Poka-Yoke)
|An approach to ‘mistake proofing’ in all aspects of manufacturing, customer service, procurement, etc. It employs visual signals that make mistakes clearly stand out from the rest, or devices that stop an assembly line or process if a part or step is missed. Its older name is Poka-Yoke (foolproofing).
|The value or item occurring most frequently in a series of observations or statistical data.
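The three measures of central tendency defined in this glossary (mean, median and mode) are available directly in Python's statistics module; the observations below are illustrative:

```python
import statistics

data = [2, 3, 3, 5, 7, 9, 13]  # hypothetical observations

print(statistics.mean(data))    # 6.0 - arithmetic average
print(statistics.median(data))  # 5   - middle value of the ordered data
print(statistics.mode(data))    # 3   - most frequently occurring value
```

The three values differ here because the data are skewed; for a symmetric distribution they would coincide.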
|Measurement System Analysis
|(Inferential statistics) A passive data collection method used to display variation within parts, between machines or process streams, and over time.
|Type of multiple variable process study
|A chart that is constructed to display the variance within units, between units, between samples, and between lots.
|When there exists near linear dependencies between regressors, the problem of multicollinearity is said to exist. It can make the linear regression unstable and/or impossible to accomplish.
|A combination of more than one distribution that has more than one distinct peak.
|Multiple Correlation Coefficient
|The square of the correlation (R). It measures the % of the variation in the response variable explained by the variation in the predictor variable.
|Three standard deviations on either side of the mean.
|Nominal Group Technique (NGT)
|A voting procedure to expedite team consensus on the relative importance of problems, issues, or solutions.
|An activity performed in a process that does not add value to the output product or service, which may or may not have a valid business reason for being performed.
|Nonhomogenous Poisson process (NHPP)
|A mathematical model that can often be used to describe the failure rate of a repairable system that has a decreasing, constant, or increasing rate.
|A process with a variance that can grow without limit.
|A continuous, symmetrical density function characterised by a bell-shaped curve. A bell-shaped distribution is often useful to describe various physical, mechanical, electrical, and chemical properties.
|Statistical Process Control chart where the number of defective items is plotted (like the C chart), but the control limits are calculated using the binomial distribution. This chart can be used when defects are not rare.
|See Hypothesis testing.
|A company that uses product components from one or more other companies to build a product that it sells under its own company name and brand
|Operator Balance Chart
|Overall Equipment Effectiveness
|One-piece flow, or continuous flow processing, is a concept in which items are processed and moved directly from one processing step to the next, one piece at a time. One-piece flow helps to maximise utilisation of resources, shorten lead times, identify problems and improve communication between operations.
|A statistical consideration where, for example, an alternative hypothesis is that the mean of a population is less than a criterion value.
|One-way analysis of variance
|see Single Factor Analysis of Variance
|Operation Cost Target (OCT)
|This value represents the maximum expenditure for material, labour, outsourcing, overhead, and all other costs associated with that project. This figure can then be divided between the various operations comprising the manufacturing process, in order to control costs at each step.
|A property of a fractional factorial experiment that ensures effects can be determined separately without entanglement. The elements in an orthogonal set are not correlated.
|Occupational Safety and Health Administration
|A data point that does not fit a model because of an erroneous reading or some other abnormal situation. A value more than 3 sigma from the mean is considered an outlier.
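A simple 3-sigma outlier check can be sketched as below; the mean and standard deviation are estimated from hypothetical in-control baseline data, so that a gross outlier does not inflate the estimates used to detect it:

```python
import statistics

# Hypothetical in-control history used to estimate the process mean and spread
baseline = [10.0, 10.2, 9.9, 10.1, 9.8, 10.0, 10.1, 9.9]
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)

def is_outlier(x):
    """Flag a reading more than 3 sigma from the baseline mean."""
    return abs(x - mean) > 3 * sigma

print(is_outlier(10.05))  # False
print(is_outlier(14.5))   # True
```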
|The result of a process. Sometimes called the response of the process. The products or services that result from a process.
|Overall Equipment Effectiveness (OEE)
|Measures the availability, performance efficiency, and quality rate of equipment – it is especially important to calculate OEE for constrained operations.
|A statistical process control chart used to determine whether the proportion of nonconforming units is constant over time.
|The smallest level of significance at which a null hypothesis would be rejected when a specified test procedure is used on a given data set.
|The ratio of measurement process capability to customer tolerance
|A graphical technique used to quantify problems so that effort can be spent fixing the “vital few” causes, as opposed to the “trivial many”. It is a bar chart that displays, in descending frequency, the number of observed defects in each category.
|see 80/20 Rule
|Performance Ratio (PR)
|Represents the percent of the tolerance width used by the variation. PR = 1/Pp
|Plan for every part
|An estimate calculated from sample data without a confidence interval.
|A distribution that is useful, for example, to design reliability tests where the failure rate is considered to be constant as a function of usage.
|Japanese term which means mistake-proofing. A poka-yoke device is one that prevents incorrect parts from being made or assembled or easily identifies a flaw or error. To avoid (yokeru) inadvertent errors (poka)
|The entire collection of items that is the focus of concern
|Represents the capability of the process. It measures the relationship between the tolerance width and the range of variation. Pp = specification width / (6 × S), where sigma is estimated using the sample standard deviation S.
|Represents the capability of the process, taking into account any difference between the design target and the actual process mean. Ppk = min(Ppk_upper, Ppk_lower), where Ppk_upper = (USL − X̄)/(3 × S) and Ppk_lower = (X̄ − LSL)/(3 × S)
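Pp and Ppk can be computed as a short sketch; the data, specification limits and target below are hypothetical:

```python
import statistics

# Hypothetical long-term data; spec limits 4.0-8.0, design target 6.0
data = [5.1, 5.4, 5.0, 5.3, 5.2, 5.5, 4.9, 5.2, 5.3, 5.1]
LSL, USL = 4.0, 8.0

xbar = statistics.mean(data)
s = statistics.stdev(data)

pp = (USL - LSL) / (6 * s)                 # overall capability, ignoring centring
ppk = min((USL - xbar) / (3 * s),          # accounts for the off-centre mean
          (xbar - LSL) / (3 * s))

print(round(pp, 2), round(ppk, 2))  # ppk < pp because the mean sits below target
```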
|Parts Per Million
|Parts Per Million
|The closeness of agreement between randomly selected individual measurements or test results.
|A logical structure designed to identify all obstacles and the responses needed to overcome them in realising an objective. It identifies the minimum necessary conditions without which the objective cannot be met.
|A numerical expression for the likelihood of an occurrence.
|Probability Density Function (PDF) – f(x)
|A mathematical function that can model the probability density reflected in a histogram.
|Data are plotted on a selected probability paper coordinate system to determine if a particular distribution is appropriate and to make statements about the percentiles of the population.
|A method to make or do something that involves a number of steps. Also, a mathematical model such as the HPP (homogeneous Poisson process)
|Process Input Variable
|The vital few input variables called “X’s”
|Modifying, altering, reshaping, redesigning any business/production process, work method or management style to deliver greater value.
|A visual representation of the workflow either within a process – or an image of the whole operation. A good Process Map should allow people unfamiliar with the process to understand the interaction of causes during the workflow. A good Process Map should contain additional information relating to the Six Sigma project i.e. information per critical step about input and output variables, time, cost, DPU value.
|Map of the process
|Process Output Variable
|The output variable, called “Y”. Usually only one Y is studied at a time.
|The individual who accompanies the project work and assists with the implementation of the solution. This individual owns the process and ensures improvements implemented are sustained and continuously improved.
|Process Route Table
|Shows which machines and equipment are needed for processing a component or assembly. These tables aid in creating production lines and grouping workpieces into work cells.
|The range of values that a given process characteristic displays; this term most often applies to the range but may also include the variance. The spread may be based on a set of data collected at a specific point in time or may reflect the variability across a given period of time.
|Data that is non-numerical and allows partitioning of a population. The types of qualitative data are: nominal, ordinal and binary.
|Quality Function Deployment (QFD)
|Quality Function Deployment (QFD) is a systematic process for motivating a business to focus on its customers. It is used by cross-functional teams to identify and resolve issues involved in providing products, processes, services and strategies which will more than satisfy their customers. A prerequisite to QFD is Market Research. This is the process of understanding what the customer wants, how important these benefits are, and how well different providers of products that address these benefits are perceived to perform. This is a prerequisite to QFD because it is impossible to consistently provide products that will attract customers unless you have a very good understanding of what they want.
|Percentage of products that were not defective. Quality Yield = (1 – fraction defective) *100%. Also called First Time Yield.
|Numerical data. It may allow each member of a population to be uniquely identified.
|A technique to analyse and reduce the resources needed for equipment setup, including the exchange of tools and dies. Single Minute Exchange of Dies (SMED) is an approach to reduce output and quality losses due to changeovers.
|R & R
|Gauge % Repeatability and Reproducibility
|R & R
|Recognition and Reward
|see X-bar and R chart
|see Multiple Correlation Coefficient
|Having no specific pattern.
|A factorial experiment where the variance of factors is investigated (as opposed to a fixed-effects model).
|A random variable is a function that associates a unique numerical value with every outcome of an experiment. The value of the random variable will vary from trial to trial as the experiment is repeated. There are two types of random variables: discrete and continuous.
|For a set of numbers, the absolute difference between the largest and smallest value.
|Data collected from an experiment are used to empirically quantify through a mathematical model the relationship that exists between the response variable and influencing factors.
|The proportion surviving at some point in time during the life of a device. A generic description of tests evaluating failure rates.
|The variation in measurements obtained with one measurement instrument when used several times by one appraiser while measuring the identical characteristic on the same part.
|Test trials that are made under identical conditions.
|The variation in the average of the measurements made by different appraisers using the same measuring instrument when measuring the identical characteristic on the same part
|In an experiment, the differences between the experimental responses and the values predicted by a model.
|A description of a procedure that is not sensitive to deviations from some of its underlying assumptions.
|Rolled Throughput Yield
|The probability of being able to pass one unit of product or service through the entire process defect free.
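Since RTY is the product of the first-time yields of the individual process steps, it can be sketched in Python as follows (the function name and yields are illustrative):

```python
from math import prod

def rolled_throughput_yield(step_yields):
    """RTY: probability that a unit passes every process step defect free.

    It is the product of the first-time yields of each step."""
    return prod(step_yields)

# Four process steps with individual first-time yields
print(rolled_throughput_yield([0.98, 0.95, 0.99, 0.97]))  # approximately 0.894
```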
|A factor that caused a non-conformance and should be permanently eliminated through process improvement
|Rolled Throughput Yield
|A time series plot permits the study of observed data for trends or patterns over time, where the x-axis is time and the y-axis is the measured variable.
|A selection of items from a population.
|A distribution derived from a parent distribution by random sampling.
|A plot to assess the relationship between two variables.
|The first step of a multiple factorial experiment strategy, where the experiment primarily assesses the significance of main effects. Two-factor interactions are normally considered in the experiments that follow a screening experiment. Screening experiments should typically consume no more than 25% of the budget allotted for the total experimental effort to solve a problem.
|Shewhart Control Chart
|Dr Shewhart is credited with developing the standard control chart test based on 3-sigma limits to separate the steady component of variation from assignable causes.
|The Greek letter σ (sigma) is often used to describe the standard deviation of a population.
|Sigma level or sigma quality level
|A quality metric that some practitioners calculate to describe the capability of a process to meet specification. A Six Sigma quality level is said to correspond to a defect rate of 3.4 ppm. Pat Spagon from Motorola University prefers to distinguish between sigma as a measure of spread and sigma as used in sigma quality level.
|A statistical statement indicating that the level of a factor causes a difference in a response, with a certain risk of the statement being in error.
|Single-factor analysis of variance
|One-way analysis of variance with two or more levels (or treatments), used to determine whether there is a significant difference between level effects.
|A SIPOC diagram is a tool used by a team to identify all relevant elements of a process improvement project before work begins, in order to map the process. It is typically employed during the Define phase of the Six Sigma DMAIC methodology. The categories included are: Suppliers, Inputs, Process, Outputs, and Customers.
|A term coined by Motorola that emphasises the improvement of processes for the purpose of reducing variability and making general improvements.
|Skewness is defined as asymmetry in the distribution of the sample data values. Values on one side of the distribution tend to be further from the ‘middle’ than values on the other side. Distributions of data that are positively skewed have a tail to the right; negatively skewed data have a tail to the left.
|Single Minute Exchange of Die
|A problem that occurs in a process because of an unusual condition. An out-of-control condition in a process control chart. (Also called: by Juran – Sporadic problems, by Deming – Local Faults, by Shewhart – Assignable Causes)
|A criterion that is to be met by a part or product.
|see Special Causes
|The total variation in the measurements obtained with a measurement system on the same master or parts when measuring a single characteristic over an extended time period.
|A process that does not contain any special cause variation, i.e. it is in statistical control.
|A mathematical quantity that describes the variability of response. It equals the square root of the variance.
|The square root of the variance of the sampling distribution of a statistic.
|The length of time that should be required to set up a given machine or operation and run one part, assembly, batch, or end product through that operation. This time is used in determining machine requirements and labour requirements.
|A process whose variance is ultimately constant over time.
|Statistical process control (SPC)
|The application of statistical techniques in the control of processes. SPC is often considered a subset of SQC, where the emphasis in SPC is on the tools associated with the process but not product acceptance techniques.
|Statistical quality control (SQC)
|The application of statistical techniques in the control of quality. SQC includes the use of regression analysis, tests of significance, acceptance sampling, control charts, distributions, and so on.
|Decomposing a variable into groups in order to identify or narrow the search for root causes.
|Sum of Squares
|The summation of the squared deviations relative to zero, to level means, or the grand mean of an experiment.
|Devices that collectively perform a function. Within this text, systems are considered repairable, where a failure is caused by the failure of a device(s). System failure rates can either be constant or change as a function of usage (time).
|see Common Causes
|A statistical test that utilises tabular values from the t distribution to assess, for example, whether two population means are different.
|The basic philosophy of reducing product/process variability for the purpose of improving quality and decreasing the loss to society; however, the procedures used to achieve this objective are often different.
|The time required between the completion of successive units of the end product. Takt time is used to pace lines in production environments.
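Takt time is commonly computed as available production time divided by customer demand. A minimal Python sketch (the figures are made up):

```python
def takt_time(available_time_minutes, customer_demand_units):
    """Takt time: available production time divided by customer demand."""
    return available_time_minutes / customer_demand_units

# 450 minutes of available time per shift, demand of 90 units per shift
print(takt_time(450, 90))  # 5.0 minutes per unit
```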
|Theory of Constraints (TOC)
|A management philosophy that can be viewed as three separate but interrelated areas – logistics, performance measurement, and logical thinking. TOC focuses the organisation’s scarce resources on improving the performance of the true constraint, and therefore the bottom line of the organisation.
|Theory of Constraints
|Specifies an allowable deviation from a target value where a character is still acceptable. It is the difference between the upper specification limit (USL) and the lower specification limit (LSL).
|Total Productive Maintenance (TPM)
|is a maintenance program concept, which brings maintenance into focus in order to minimise downtimes and maximise equipment usage. The goal of TPM is to avoid emergency repairs and keep unscheduled maintenance to a minimum.
|Total Quality Management
|TQM comprises management and control activities based on the leadership of top management and on the involvement of all employees and all departments, from planning and development to sales and service. These activities focus on quality assurance, by which the qualities that satisfy the customer are built into products and services during these processes and then offered to consumers.
|Toyota Production System (TPS)
|is a technology of comprehensive production management. The basic idea of this system is to maintain a continuous flow of products in factories in order to flexibly adapt to demand changes. The realisation of such production flow is called Just-in-time production, which means producing only necessary units in a necessary quantity at a necessary time. As a result, the excess inventories and the excess workforce will be naturally diminished, thereby achieving the purposes of increased productivity and cost reduction.
|Total Productive Maintenance
|A cause-and-effect logic tree designed to provide step-by-step progress from initiation to completion of a course of action or change. It is an implementation tool.
|A chart to view the resultant effect of a known variable on the response of a process.
|A measure of central tendency. It is the average of a sample with a percentage of the smallest and largest observations discarded. For example, a 10% trimmed mean discards the smallest 10% and the largest 10% of observations.
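The trimmed mean calculation can be sketched in Python (the function name and data are illustrative):

```python
def trimmed_mean(data, trim_fraction):
    """Mean after discarding the smallest and largest trim_fraction of observations."""
    n = len(data)
    k = int(n * trim_fraction)          # observations to drop from each end
    ordered = sorted(data)
    kept = ordered[k:n - k] if k > 0 else ordered
    return sum(kept) / len(kept)

# The outliers 1 and 100 are discarded by a 10% trim
values = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]
print(trimmed_mean(values, 0.10))  # 5.5, the mean of 2..9
```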
|A statistical consideration where, for example, the mean of a population is to be equal to a criterion, as stated in a null hypothesis.
|Type I Error
|Type II Error
|A statistical control chart for the average number of defects per unit.
|Upper Control Limit. Usually represents a 3-sigma deviation from the mean value.
|A statistic is an unbiased estimate of a given parameter when the mean of the sampling distribution of that statistic can be shown to be equal to the parameter being estimated.
|A distribution that only has one peak (ex: Normal distribution).
|Upper Specification Limit. A value below which performance of a product is acceptable.
|Value Add (VA) & Non-Value Add (NVA)
|The ratio of value-added time to total lead time. Provides insight into how many value-added activities are performed compared to non-value-added activities, using time as a unit of measure.
|Value Stream Mapping (VSM)
|is a graphical tool that helps you to see and understand the flow of the material and information as a product makes its way through the value stream. It ties together lean concepts and techniques.
|Value Stream Mapping
|A visual picture of how material and information flows from suppliers, through manufacturing, to the customer. It includes calculations of total cycle time and value-added time. Typically written for the current state of the value chain and the future, to indicate where the business is going.
|Any action within a process that adds value to the product from the customer’s point of view.
|Measure of dispersion; deviation from the process mean. It is the square of the standard deviation. Var(x) = Σ(xi – x̄)² / (n – 1)
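The sample variance formula, and the standard deviation as its square root, can be sketched in Python (function names are illustrative):

```python
def sample_variance(xs):
    """Var(x) = sum((xi - x_bar)^2) / (n - 1)"""
    n = len(xs)
    x_bar = sum(xs) / n
    return sum((x - x_bar) ** 2 for x in xs) / (n - 1)

def sample_std(xs):
    """Standard deviation: the square root of the variance."""
    return sample_variance(xs) ** 0.5

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(sample_variance(data))  # approximately 4.571 (i.e. 32/7)
```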
|Is a set of techniques that makes operation standards visible so that workers can follow them more easily. These techniques expose waste so that it can be prevented and eliminated.
|Value Stream Mapping
|Widely used distribution because it has a density function that can take many possible shapes. The two-parameter distribution is described by its shape and scale parameters.
|Common cause variation in a process
|Work In Progress
|Workflow Diagram (Spaghetti Diagram)
|Shows the movement of material, identifying areas of waste. Aids teams to plan future improvements, such as one-piece flow and work cells.
|Also known as the sample mean
|X-Bar Chart and R chart
|A pair of control charts used with processes that have subgroups of two or more. These standard charts help determine whether a process is stable and predictable. The X-bar chart shows how the average changes over time, and the R chart shows how the range of the subgroup changes over time.
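Under the assumption that every subgroup has size 5, the centre lines and control limits for the pair of charts can be sketched in Python; the constants A2, D3 and D4 are taken from standard control chart tables:

```python
def xbar_r_limits(subgroups):
    """Centre lines and 3-sigma control limits for X-bar and R charts.

    Assumes every subgroup has size 5; A2, D3 and D4 are the standard
    control chart constants for n = 5."""
    A2, D3, D4 = 0.577, 0.0, 2.114
    x_bar_bar = sum(sum(g) / len(g) for g in subgroups) / len(subgroups)
    r_bar = sum(max(g) - min(g) for g in subgroups) / len(subgroups)
    return {
        "xbar": (x_bar_bar - A2 * r_bar, x_bar_bar, x_bar_bar + A2 * r_bar),
        "r": (D3 * r_bar, r_bar, D4 * r_bar),
    }

# Two subgroups of five measurements each (illustrative data)
limits = xbar_r_limits([[10, 12, 11, 13, 14], [11, 11, 12, 10, 13]])
print(limits["xbar"], limits["r"])
```

Each tuple holds (LCL, centre line, UCL); plotting the subgroup means and ranges against these limits gives the two charts.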
|see Quality Yield
|A measure of the distance in standard deviations of a sample from the mean, assuming a standard normal process.
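The Z value arithmetic is a one-liner; for example, in Python (the names are illustrative):

```python
def z_value(x, mean, std_dev):
    """Distance of x from the mean, in standard deviation units."""
    return (x - mean) / std_dev

# An observation of 115 against a mean of 100 and standard deviation of 10
print(z_value(115, 100, 10))  # 1.5
```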
We trust you find the Lean Six Sigma Terminology post of value. For more on the graphical statistical tools and techniques used within Lean Six Sigma, visit