The Tools

A selection of tools that may be delivered in support of project work. The tools are organized by the Four Platforms.

I. Critical Thinking:  the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, evaluating, and/or predicting information gathered from, or generated by, observation, experience, reflection, reasoning, communication, or sampling as a guide to belief and action.

Thought Map: an ongoing documentation of existing knowledge, the questions asked, the parallel paths of work needed to answer those questions, the hypotheses stated, the tools applied to answer the questions, the knowledge gained from work performed, and the direction of future work. The thought map is invaluable in any focused work effort: it captures the multitude of questions that arise, the many possible paths that must be considered to gain understanding, the work performed, the links between situations and the appropriate tools, and the solutions obtained.

Failure Mode & Effects Analysis (FMEA): a systematic method for identifying, analyzing, documenting, and prioritizing potential failure modes, the effects of such failures on system performance, and the possible causes of failures.
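
A minimal sketch of the prioritization step, assuming the common Risk Priority Number convention (RPN = severity × occurrence × detection, each rated 1–10); the failure modes and ratings below are hypothetical:

```python
# Hypothetical failure modes with severity, occurrence, and detection ratings (1-10).
failure_modes = [
    ("Seal leaks under pressure", 8, 3, 4),
    ("Fastener torque out of spec", 5, 6, 2),
    ("Label misprint", 2, 4, 7),
]

# Rank by RPN = severity * occurrence * detection, highest risk first.
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for mode, sev, occ, det in ranked:
    print(f"RPN {sev * occ * det:4d}  {mode}")
```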

II. Process and Product Description: use of graphical techniques to identify X’s, y’s, and Y’s. The intent is to engage the visual sense, increasing the likelihood of capturing all possible X’s. The maps provide context for data collection and analysis.

Process/Product Map: a tool that displays current process knowledge and supplements many of the traditional process investigation tools. By graphically combining the knowledge typically depicted on a flowchart with the type of knowledge captured in a Cause and Effect Diagram, the Process Map overcomes the weaknesses of the two tools used independently. Additionally, the Maps link to sampling plans and are customized for the data acquisition strategy (color-coded for sampling and labeled for DOE).

III. Sampling: a set of methodologies that combine the use of sampling strategies, sampling trees, and statistical process control (SPC). A proactive methodology, Sampling is used to discover the dominant sources and the nature of the variation in products or processes and thereby to provide guidance and direction for work. Sampling is a discovery tool, whereas SPC has typically been taught as a monitoring tool.

Components of Variance (COV) Studies: methods to partition the overall process variation into portions assignable to causes at each of several layers (components). For instance, the total process variation might be attributable to a measurement component, a within-piece component, a between-piece-within-lot component, and a between-lot component of variation. COV studies are used to evaluate the stability and magnitude of the various components of variation and therefore to provide focus for work to develop process knowledge.
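
A minimal sketch of a balanced nested COV analysis, assuming the standard method-of-moments estimates from a nested ANOVA; the lot/piece/measurement structure and data are simulated for illustration:

```python
import numpy as np

# Simulate a balanced nested structure: lots -> pieces within lots -> repeat
# measurements within pieces, with known sigmas (2.0, 1.0, 0.5 respectively).
rng = np.random.default_rng(0)
L, P, M = 5, 4, 3
y = (10
     + rng.normal(0, 2.0, size=(L, 1, 1))    # between-lot effects
     + rng.normal(0, 1.0, size=(L, P, 1))    # between-piece-within-lot effects
     + rng.normal(0, 0.5, size=(L, P, M)))   # measurement error

grand = y.mean()
lot_means = y.mean(axis=(1, 2))
piece_means = y.mean(axis=2)

# Mean squares from the balanced nested ANOVA.
ms_lot = P * M * np.sum((lot_means - grand) ** 2) / (L - 1)
ms_piece = M * np.sum((piece_means - lot_means[:, None]) ** 2) / (L * (P - 1))
ms_err = np.sum((y - piece_means[..., None]) ** 2) / (L * P * (M - 1))

# Method-of-moments variance component estimates (truncated at zero).
var_err = ms_err
var_piece = max((ms_piece - ms_err) / M, 0.0)
var_lot = max((ms_lot - ms_piece) / (P * M), 0.0)
print(f"between-lot: {var_lot:.2f}  between-piece: {var_piece:.2f}  measurement: {var_err:.2f}")
```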

Control Charts: certain variations in process or product measures belong to the category of chance variation, about which little can be done other than to change the process or system that produced the data. This category of variation is the sum of the effects of complex interactions between many factors; it is the variation “built” into the process by current practices and managerial behaviors. Besides these “chance” variations, there are variations produced by “assignable” or special causes, and these sources of variation can be isolated. Control charts are used to differentiate between the two kinds of variation.
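
A minimal sketch of an individuals chart, assuming the usual moving-range estimate of sigma (mean moving range divided by d2 = 1.128 for spans of two); the data series is simulated, with a shift injected to illustrate an assignable cause:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(100, 2, size=50)    # stable "chance" variation
x[30:] += 6                        # injected shift: an assignable cause

# Estimate the center line and limits from the stable baseline period.
baseline = x[:30]
sigma_hat = np.abs(np.diff(baseline)).mean() / 1.128   # MR-bar / d2
center = baseline.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for i, xi in enumerate(x):
    if xi > ucl or xi < lcl:
        print(f"point {i}: {xi:.1f} outside [{lcl:.1f}, {ucl:.1f}] -> investigate")
```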

Measurement System Evaluation (MSE): methodologies for identifying and quantifying the different sources of variation that affect a measurement system and for comparing those components to other sources of variation.
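
A minimal sketch of one such comparison, assuming each of 25 parts is measured twice on the same gauge; parts and readings are simulated:

```python
import numpy as np

rng = np.random.default_rng(2)
true_parts = rng.normal(50, 3, size=25)                        # part-to-part sigma = 3
meas = true_parts[:, None] + rng.normal(0, 1, size=(25, 2))    # gauge sigma = 1

# Within-part variance estimates repeatability; the pooled variance of all
# readings approximates part-to-part plus measurement variation combined.
var_repeat = np.mean(np.var(meas, axis=1, ddof=1))
var_total = np.var(meas, ddof=1)
print(f"repeatability variance: {var_repeat:.2f}")
print(f"share of total observed variance: {100 * var_repeat / var_total:.1f}%")
```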

Sampling Trees: a graphical description of the relationships between “layers” of variables in a sampling plan. Sampling trees describe the nature of the relationships (nested, systematic, or crossed) and are required for planning and analysis.
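
A minimal sketch of how the structure behind a sampling tree might be written down, assuming a hypothetical lot/piece/site plan:

```python
# Pieces are nested within lots (piece A1 exists only in lot A), while the
# same sites are measured on every piece, so site is crossed with piece.
sampling_tree = {
    "lot_A": {"piece_A1": ["top", "middle", "bottom"],
              "piece_A2": ["top", "middle", "bottom"]},
    "lot_B": {"piece_B1": ["top", "middle", "bottom"],
              "piece_B2": ["top", "middle", "bottom"]},
}
```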

IV. Sampling with Manipulation (Design of Experiments – DOE): efficient procedures for discovering relationships between independent factors (process X’s), such as temperature, pressure, time, and speed, and response variables (product Y’s), such as size, variation, ductility, and shrink, in the face of noise. The independent factors are varied in a specific fashion and the effect of these changes on the response variables (Y’s) is measured. The fashion in which the independent variables are manipulated may be thought of as the experimental design.
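
A minimal sketch of a 2³ full factorial, assuming coded −1/+1 levels for three hypothetical factors; the responses are simulated with built-in temperature and pressure effects:

```python
import itertools
import numpy as np

# All 8 runs of a 2^3 design in coded units (-1 = low, +1 = high).
design = np.array(list(itertools.product([-1, 1], repeat=3)))
temp, press, time_ = design.T

# Simulated responses: real effects on temperature (+3) and pressure (+1.5).
rng = np.random.default_rng(3)
y = 20 + 3 * temp + 1.5 * press + rng.normal(0, 0.5, size=8)

# Main effect = mean response at the high level minus mean at the low level.
for name, col in zip(("temperature", "pressure", "time"), design.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"{name:12s} effect: {effect:+.2f}")
```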

Factor Relationship Diagrams (FRD): tools that assist engineers in understanding and making decisions about the potential information content of an experiment and the cost of that information relative to experimental run order and experiment size (experimental cost in terms of production time and possibly scrapped units). An FRD graphically describes the relationship between noise (unit structure) and factors (design structure) and shows any partitioning of the unit structure and the degrees of freedom for analysis.

Ross’ Rules of Analysis: the appropriate order for analyzing any data set is Practical, then Graphical, then Quantitative.

Regression Analysis: a set of methodologies used to quantify the statistical relationship, if any, between variables so that the relationship can be used for prediction. Regression, or ordinary least squares (OLS), empirically describes the variation in a response variable (Y) in terms of a set of “independent” variables (x’s or y’s).
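
A minimal sketch of an OLS fit used for prediction, on simulated data:

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 10, 30)
y = 2.0 + 0.8 * x + rng.normal(0, 0.5, size=x.size)   # true line plus noise

# Design matrix with an intercept column; solve for beta by least squares.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"intercept: {beta[0]:.2f}  slope: {beta[1]:.2f}")
print(f"predicted Y at x = 12: {beta[0] + beta[1] * 12:.2f}")
```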

Diagnostics: a set of methods for determining the integrity of the data to be analyzed (tests for special causes, outliers, stability, statistical assumptions, etc.).
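
A minimal sketch of one such check, using a MAD-based robust z-score to screen for outliers (a stand-in for the broader set of diagnostics); the data are simulated:

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(10, 1, size=40)
data[7] = 17.0                      # injected outlier

# Robust z-score: scaled distance from the median, using the median absolute
# deviation so the outlier does not inflate its own yardstick.
med = np.median(data)
mad = np.median(np.abs(data - med))
robust_z = 0.6745 * (data - med) / mad
for i in np.where(np.abs(robust_z) > 3.5)[0]:
    print(f"point {i}: {data[i]:.1f} (robust z = {robust_z[i]:+.1f}) -> investigate")
```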

Additional tools are delivered depending on the nature of the situation. These include tools for attribute data, tolerancing, synergy with Lean concepts, data mining (PCA), mixture designs, non-manufacturing environments, etc.
