
Six Sigma tools and statistical techniques are crucial for process improvement. They include basic quality tools like process mapping and cause-effect diagrams, as well as statistical methods such as descriptive statistics and hypothesis testing. These tools help identify issues and analyze data effectively.

Process capability and graphical tools complement Six Sigma methods. Process capability indices assess a process's ability to meet specifications, while graphical tools like histograms and control charts visualize data patterns. Together, these techniques support data-driven decision-making and continuous improvement efforts.

Six Sigma Tools and Statistical Techniques

Basic Six Sigma quality tools

  • Process mapping visualizes workflow steps graphically
    • Flowcharts depict sequence of activities with symbols (rectangles, diamonds)
    • Value stream maps identify value-adding and non-value-adding activities (material flow, information flow)
    • SIPOC diagrams outline high-level process view (Suppliers, Inputs, Process, Outputs, Customers)
  • Cause-effect diagrams illustrate potential causes of problems
    • Ishikawa diagrams organize causes into categories (Man, Machine, Method, Material)
    • Fishbone diagrams visually resemble fish skeleton structure
  • 5 Whys analysis uncovers root causes by repeatedly asking "Why?" (equipment failure, human error)
  • Check sheets collect and organize data systematically (defect types, frequency counts), as tallied in the sketch after this list
  • Brainstorming techniques generate ideas collaboratively (round-robin, mind mapping)
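
Check sheets in particular reduce to a simple frequency tally, so they are easy to reproduce in code. Below is a minimal sketch in Python; the defect categories and observation counts are hypothetical, invented purely for illustration.

```python
from collections import Counter

# Hypothetical defect observations logged on a check sheet over one shift
observations = [
    "scratch", "dent", "scratch", "misalignment", "scratch",
    "dent", "paint run", "scratch", "misalignment", "scratch",
]

# Tally frequency counts per defect type, mirroring a paper check sheet
tally = Counter(observations)

# Print a tick-mark tally, most frequent defect first
for defect, count in tally.most_common():
    print(f"{defect:<15} {'|' * count} ({count})")
```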

Statistical techniques for analysis

  • Descriptive statistics summarize data characteristics (computed in the sketch after this list)
    • Measures of central tendency locate data center
      • Mean calculates the arithmetic average
      • Median identifies middle value in ordered dataset
      • Mode determines most frequent value
    • Measures of dispersion quantify data spread
      • Range measures difference between maximum and minimum values
      • Variance calculates average squared deviation from mean
      • Standard deviation, the square root of variance, indicates typical deviation from mean
  • Hypothesis testing evaluates claims about population parameters
    • Null and alternative hypotheses state competing claims
    • p-value gives the probability of obtaining results at least as extreme as those observed, assuming the null hypothesis is true
    • Type I error rejecting true null hypothesis (false positive)
    • Type II error failing to reject false null hypothesis (false negative)
    • t-tests compare means of two groups (independent samples, paired samples)
    • ANOVA compares means of three or more groups (one-way, two-way)
  • Regression analysis models relationships between variables
    • Simple linear regression predicts one dependent variable from one independent variable
    • Multiple regression predicts dependent variable from multiple independent variables
    • R-squared value measures proportion of variance explained by model
    • Residual analysis assesses model fit and assumptions (normality, homoscedasticity)
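
The descriptive statistics, two-sample t-test, and simple linear regression above can all be exercised with Python's standard statistics module and scipy.stats. The sketch below is illustrative only: the cycle-time samples, the operator-experience values, and the 0.05 significance level are assumed for the example.

```python
import statistics
from scipy import stats

# Hypothetical cycle-time samples (minutes) from two assembly lines
line_a = [12.1, 11.8, 12.5, 12.1, 11.9, 12.3, 12.1, 11.7]
line_b = [12.6, 12.9, 12.4, 13.1, 12.7, 12.8, 12.5, 13.0]

# Descriptive statistics for line A: center and spread
print("mean:  ", statistics.mean(line_a))
print("median:", statistics.median(line_a))
print("mode:  ", statistics.mode(line_a))      # most frequent value
print("range: ", max(line_a) - min(line_a))
print("var:   ", statistics.variance(line_a))  # sample variance
print("stdev: ", statistics.stdev(line_a))     # square root of variance

# Independent-samples t-test: H0 claims the two line means are equal
t_stat, p_value = stats.ttest_ind(line_a, line_b)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:                             # 5% significance level (assumed)
    print("Reject H0: mean cycle times differ")

# Simple linear regression: cycle time vs. operator experience (hypothetical)
experience = [1, 2, 3, 4, 5, 6, 7, 8]
fit = stats.linregress(experience, line_a)
print(f"slope = {fit.slope:.3f}, R^2 = {fit.rvalue**2:.3f}")
```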

Process Capability and Graphical Tools

Process capability and indices

  • Process capability assesses process ability to meet specifications
    • Natural process limits represent inherent process variation (±3 sigma)
    • Specification limits define acceptable output range (USL, LSL)
  • Process capability indices quantify process performance
    • $C_p$ measures potential capability assuming process is centered
    • $C_{pk}$ accounts for process centering relative to specification limits
    • $P_p$ similar to $C_p$ but uses overall process standard deviation
    • $P_{pk}$ similar to $C_{pk}$ but uses overall process standard deviation
  • Calculation formulas
    • $C_p = \frac{USL - LSL}{6\sigma}$ compares specification width to process spread
    • $C_{pk} = \min\left(\frac{USL - \mu}{3\sigma}, \frac{\mu - LSL}{3\sigma}\right)$ considers process centering
  • Interpretation of capability indices guides process improvement efforts (an index above 1.33 is generally considered capable; a worked calculation follows this list)
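
As a concrete illustration of the formulas above, the sketch below computes $C_p$ and $C_{pk}$ from a small sample; the shaft-diameter measurements and specification limits are hypothetical. Note that $C_p$ and $C_{pk}$ strictly call for the within-subgroup standard deviation; the overall sample standard deviation is used here for simplicity, which is closer to how $P_p$ and $P_{pk}$ are computed.

```python
import statistics

# Hypothetical shaft-diameter measurements (mm) and specification limits
measurements = [10.02, 9.98, 10.01, 10.03, 9.99, 10.00,
                10.02, 9.97, 10.01, 10.00, 9.99, 10.02]
USL, LSL = 10.10, 9.90

mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)  # sample standard deviation (see note above)

# Cp: specification width versus process spread, assuming a centered process
cp = (USL - LSL) / (6 * sigma)

# Cpk: the smaller one-sided capability, penalizing an off-center process
cpk = min((USL - mu) / (3 * sigma), (mu - LSL) / (3 * sigma))

print(f"Cp  = {cp:.2f}")
print(f"Cpk = {cpk:.2f}")
print("capable" if cpk > 1.33 else "improvement needed")
```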

Graphical tools for data visualization

  • Histograms display frequency distribution of continuous data
    • Frequency distribution shows data groupings (bar heights)
    • Shape of distribution reveals patterns (normal, skewed, bimodal)
    • Skewness indicates asymmetry (right-skewed, left-skewed)
  • Pareto charts prioritize improvement efforts
    • 80-20 rule suggests 80% of effects come from 20% of causes
    • Cumulative percentage line shows running total of categories
  • Control charts monitor process stability over time
    • Common cause variation results from inherent process factors
    • Special cause variation indicates assignable, non-random factors
    • Upper and lower control limits define expected process variation (±3 sigma)
    • X-bar and R charts monitor process mean and range for subgroups (their control limits are computed in the sketch after this list)
    • Individual and moving range charts used for individual measurements
  • Scatter plots explore relationships between two variables
    • Correlation analysis quantifies strength and direction of relationship
    • Trend identification reveals patterns (linear, curvilinear, clusters)
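
To make X-bar and R chart limits concrete, the sketch below computes them from hypothetical subgroup data using the standard SPC constants for subgroup size 5 ($A_2 = 0.577$, $D_3 = 0$, $D_4 = 2.114$); the measurements themselves are invented for illustration.

```python
import statistics

# Hypothetical subgroups of five measurements each, sampled hourly
subgroups = [
    [10.1, 10.0, 9.9, 10.2, 10.0],
    [10.0, 10.1, 10.1, 9.8, 10.0],
    [9.9, 10.2, 10.0, 10.1, 9.9],
    [10.0, 9.9, 10.1, 10.0, 10.2],
]

# Standard X-bar/R chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

xbars = [statistics.mean(s) for s in subgroups]  # subgroup means
ranges = [max(s) - min(s) for s in subgroups]    # subgroup ranges

xbar_bar = statistics.mean(xbars)                # grand mean (center line)
r_bar = statistics.mean(ranges)                  # average range (center line)

# X-bar chart limits: grand mean +/- A2 * R-bar approximates +/-3 sigma
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
# R chart limits
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

print(f"X-bar chart: LCL = {lcl_x:.3f}, CL = {xbar_bar:.3f}, UCL = {ucl_x:.3f}")
print(f"R chart:     LCL = {lcl_r:.3f}, CL = {r_bar:.3f}, UCL = {ucl_r:.3f}")

# A point outside the limits suggests special cause variation worth investigating
for i, xb in enumerate(xbars, start=1):
    if not lcl_x <= xb <= ucl_x:
        print(f"subgroup {i} out of control: mean = {xb:.3f}")
```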