Project Abstract

Behavioral analysis of data in process systems is the ultimate application of my accumulated experience in statistics, mathematics, and physics. The processes themselves are derived from mathematical and physical models and are streamlined using optimization methods rooted in calculus. As theoretically formulated processes move into production, statistical analysis is used to isolate and perfect individual aspects of the procedures to ensure the efficiency of the overall system. Research in process engineering is essential to improving industrial practices. In the case of plastic injection molding, inefficiencies in the production process result in disruptions to product uniformity, excess material waste, and other potential failures related to inconsistent machine measurements and calibrations. Applying data-driven methods to the entire production process assists with the perpetual task of improving and optimizing outputs while reducing errors and eliminating opportunities for failure.

Sunday, April 23, 2017

Background Research: Part 2

As I discussed earlier, I have had the opportunity to access some industry-level certification materials, and here is a continuation of my background research.

Continuing from last time, I have delved further into the Master Black Belt materials. This time I focused on Design of Experiments (DOE), as it is the essence of the research I work on at Microtech. In engineering, experiments serve many functions, including process characterization, optimization, evaluation of material properties, product design, and system tolerance determination. In process engineering, experiments are mainly used to reduce the time needed to develop new processes, improve the performance of existing processes, improve the reliability of products, and evaluate and explore material design alternatives.

In industry, the design of experiments involves three key components: randomization, replication, and blocking. Randomization is probably the simplest, requiring that all trials be run in random order, which helps average out potential factors that are not accounted for. Replication is also fairly straightforward, improving the precision of effect and error estimation by providing a reasonable sample size. Blocking exists to neutralize nuisance factors: conditions over which we have no control but which still influence the outputs, since it is basically impossible to run a full DOE under perfectly homogeneous conditions.

Experiments are usually designed using some mathematical structure, and the most common one is the factorial design, in which all possible combinations of factor levels are tested. A two-level factorial design is written as 2^k, where k is the number of factors being tested and each factor is set at two levels, low and high. This structure is the foundation for many more advanced industrial experiments. Analysis of a factorial design has a few steps, the first being estimating the factor effects and formulating a model.
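To make the 2^k idea concrete, here is a minimal sketch in Python of generating a two-level full factorial design and estimating main effects. The run order, response values, and factor names are made up for illustration; they are not from the actual Microtech experiments.

```python
from itertools import product

# Build a 2^k full factorial design: every combination of k factors
# at two coded levels, -1 (low) and +1 (high).
def full_factorial(k):
    return list(product([-1, 1], repeat=k))

# Hypothetical responses for a 2^2 experiment (factors A and B),
# one response per run, listed in standard order.
runs = full_factorial(2)          # [(-1,-1), (-1,1), (1,-1), (1,1)]
responses = [20.0, 30.0, 25.0, 45.0]

# Main effect of a factor = mean response at its high level
# minus mean response at its low level.
def main_effect(factor_index, runs, responses):
    high = [y for r, y in zip(runs, responses) if r[factor_index] == 1]
    low = [y for r, y in zip(runs, responses) if r[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

effect_A = main_effect(0, runs, responses)  # (25+45)/2 - (20+30)/2 = 10.0
effect_B = main_effect(1, runs, responses)  # (30+45)/2 - (20+25)/2 = 15.0
```

Note that `full_factorial(k)` grows as 2^k runs, which is why in practice trials are randomized and, for many factors, fractional designs are often used instead.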
With a replicated design, the model should account for all factors; with an unreplicated design, normal probability plots can be used to judge which effects matter. Once a model is generated, it is put through a statistical testing process called ANOVA (Analysis of Variance). ANOVA extends the t-test from two groups to any number of data sets, letting us determine whether there is a statistically significant difference among them while controlling the Type I error (false positive) rate that running many separate t-tests would inflate, which is part of why it is so widely accepted in industry. After testing, the model can be refined and retested as many times as needed to reach a statistically favorable model. The final result usually comes in the form of a fitted regression model, a concept that I explored in my last background research post.
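The ANOVA step described above can be sketched with a hand-computed one-way F statistic. The three groups of measurements below are invented stand-ins for, say, three machine settings; the numbers are assumptions for illustration only.

```python
# One-way ANOVA F statistic computed from scratch, using made-up
# measurements from three hypothetical machine settings.
groups = [
    [10.2, 10.5, 10.1, 10.4],   # setting A
    [10.8, 11.0, 10.9, 11.1],   # setting B
    [10.3, 10.2, 10.4, 10.5],   # setting C
]

N = sum(len(g) for g in groups)          # total number of observations
k = len(groups)                          # number of groups
grand_mean = sum(sum(g) for g in groups) / N

# Between-group (treatment) and within-group (error) sums of squares.
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = sum(sum((y - sum(g) / len(g)) ** 2 for y in g) for g in groups)

ms_between = ss_between / (k - 1)        # mean square, treatment
ms_within = ss_within / (N - k)          # mean square, error
F = ms_between / ms_within               # large F suggests the means differ
```

The F value is then compared against an F distribution with (k-1, N-k) degrees of freedom to get a p-value; in practice a statistics package does this lookup for you.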

Another interesting part of the Six Sigma materials I read was about innovation, or as they referred to it, the successful exploitation of ideas. An invention is a unique and novel device or discovery, usually a breakthrough in science and technology. Innovation, on the other hand, often involves merging different ideas and moving them through development, commercialization, production, and practical use. Design of Experiments is considered a foundation for large-scale innovation, as interactions between various factors are often the key to innovation. Beyond DOE, many other methods support innovation, such as data mining, statistics, business analytics, and other mathematical techniques.

I will continue talking about some other background research in another post soon. 

3 comments:

  1. It was interesting reading about the "Design of Experiments". I'm familiar with the concepts of randomization and replication you mentioned, but I'm not as familiar with blocking. Always cool to learn something new! Definitely sounds like you're learning a lot.

  2. Will you be using the six sigma materials in your SRP?

    1. Definitely. Six Sigma training helps in many aspects of my work at Microtech, including in statistics, process management, and ongoing improvement.
