Project Abstract

Behavioral analysis of data in process systems is the ultimate application of my accumulated experience in statistics, mathematics, and physics. The processes themselves are derived from mathematical and physical models, and are streamlined using optimization methods that are at the heart of calculus. As the theoretically formulated processes are turned into production, statistical analysis is used to isolate and perfect different aspects of the procedures to ensure the utmost efficiency of the overall system. Research in process engineering is essential to improving industrial practices. In the case of plastic injection molding, inefficiencies in the production process disrupt the uniformity of the product, waste material, and invite other failures related to inconsistent measurements and machine calibrations. The use of data-driven methods to analyze the entire production process assists with the perpetual task of improving and optimizing outputs while reducing errors and eliminating opportunities for failure.

Sunday, April 23, 2017

Weeks 4 and 5 at Microtech

As I described in my last post about my internship, I was able to get right into my research project and apply my knowledge and understanding of statistics to industrial processes. Since then, I have expanded the scope of my work at Microtech even more. As I described in an earlier post, one of the key technologies we use is the Vision system, which uses cameras and a high-speed computer to measure parts passing by on a belt, determine whether they fit the specifications, and shoot them off the track if they don't. However, since the computer needs to maintain its speed and efficiency, it doesn't store any of the data by itself. It's possible to pull some data from the machine using a flash drive, but that is an extremely inefficient process. If the data from the vision system could be collected, the job of quality control would become much easier, because the staff wouldn't waste time measuring parts that a machine can measure at a much faster rate and with lower variability. Data collection from the vision system would ultimately allow for a more automated production facility, helping to lower costs and increase both output and quality.

There was a solution to this issue: a dedicated server would solve the inefficiency and let us access all the data at any time. Since I have a pretty strong background in computers, I got onto that project at Microtech, and since then I have been working closely with people across management to get this initiative running. First, we needed to purchase a server, so we spent a day shopping for servers that met all the specifications and requirements the vision company requested. Then we had to set it up and get it online. This process took a few days, and I was surprised by the complexity of coordinating with a company on the other side of the world. Finally we were able to get the server running the Vision server software, but we are still working to connect the individual vision systems on the manufacturing floor to the server network. That is the task I am currently coordinating, and I will hopefully be able to move forward with it next week.
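As a rough illustration of the pass/fail logic such a system applies to each part, here is a minimal sketch in Python; the dimension names and spec limits are hypothetical placeholders I made up, not Microtech's actual specifications.

```python
# A minimal sketch of a vision-style tolerance check. The dimension
# names and spec limits below are hypothetical placeholders.

# Spec limits per measured dimension: (lower limit, upper limit) in mm.
SPECS = {
    "diameter": (12.95, 13.05),
    "height": (4.48, 4.52),
}

def inspect(part_measurements: dict) -> bool:
    """Return True if every measured dimension is within its spec limits."""
    return all(
        lo <= part_measurements[dim] <= hi
        for dim, (lo, hi) in SPECS.items()
    )

# Example: one in-spec part and one part that would be shot off the track.
print(inspect({"diameter": 13.01, "height": 4.50}))  # True  -> keep
print(inspect({"diameter": 13.08, "height": 4.50}))  # False -> reject
```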


Background Research: Part 2

As I discussed earlier, I have had the opportunity to access some industry-level certification materials, and here is a continuation of my background research.

Continuing from last time, I have delved further into the Master Black Belt materials. This time I focused on the Design of Experiments, as it is the essence of the research I work on at Microtech. In engineering, experiments are used for various functions, including process characterization, optimization, evaluation of material properties, product design, and system tolerance determination, among others. Experiments in process engineering are mainly used to reduce the time needed to develop new processes, improve the performance of existing processes, improve the reliability of products, and evaluate and explore material design alternatives.

In industry, the design of experiments involves a few key components: randomization, replication, and blocking. Randomization is probably the simplest, requiring that all the trials be run in random order, which helps average out potential factors that may not be accounted for. Replication is also fairly straightforward, focusing on improving the precision of effect and error estimates by having a reasonable sample size. Blocking exists solely to handle nuisance factors, conditions over which we have no control but which still influence the outputs, since it is basically impossible to run a full DOE under homogeneous conditions.

Experiments are usually designed using some mathematical structure, and the most common one is the factorial design. A factorial experiment is one in which all possible combinations of factor settings are tested. The full factorial design is noted as 2^k, where k is the number of factors being tested and each factor is set at two levels, low and high. This structure is the foundation for many more advanced industrial experiments. Analysis of a factorial design has a few steps. First is estimating the factor effects and formulating a model. With a replicated design, the model should account for all factors, but with an unreplicated design, normal probability plots can be used to judge which effects matter. Once a model is generated, it is put through a statistical testing process called ANOVA (Analysis of Variance). Like a t-test, ANOVA compares sets of data to determine whether there is a statistically significant difference between them, but unlike running many separate t-tests, it controls the overall rate of Type I errors (false positives), which is why it is more widely accepted in industry. After testing the model, it can be refined and retested as many times as needed in order to reach a statistically sound model. The final result usually comes in the form of a fitted regression model, a concept that I explored in my last background research post.
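To make the 2^k structure concrete, here is a minimal sketch in Python of a 2^3 full factorial design matrix with randomized run order and main-effect estimation; the factor names and response values are hypothetical placeholders, not data from Microtech.

```python
# A minimal sketch of a 2^3 full factorial design. Factor names and
# responses are hypothetical, purely for illustration.
from itertools import product
import random

factors = ["mold_temp", "hold_pressure", "cooling_time"]  # k = 3 factors
levels = (-1, +1)  # coded low/high settings

# Build the full 2^3 = 8 run design matrix: every combination of levels.
design = list(product(levels, repeat=len(factors)))

# Randomization: the trials themselves would be run in a random order.
run_order = random.sample(range(len(design)), len(design))
print("randomized run order:", run_order)

# Hypothetical responses (e.g., a part dimension in mm), one per run,
# listed here in design-matrix order.
response = [10.2, 10.8, 9.9, 11.5, 10.1, 10.9, 10.0, 11.6]

# Main effect of each factor: mean response at the high level minus
# mean response at the low level.
for j, name in enumerate(factors):
    high = [y for run, y in zip(design, response) if run[j] == +1]
    low = [y for run, y in zip(design, response) if run[j] == -1]
    effect = sum(high) / len(high) - sum(low) / len(low)
    print(f"{name}: main effect = {effect:+.3f}")
```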

Another interesting part of the Six Sigma materials I read was about innovation, or as they referred to it, the successful exploitation of ideas. An invention is a unique and novel device or discovery, usually a breakthrough in science and technology. Innovation, on the other hand, involves invention but is often the merging of different ideas and moving them through development, commercialization, production, and practical use. Design of Experiments is considered a foundation for large-scale innovation, as interactions between various factors are often the key to innovation. Beyond DOE, many other analytical methods support innovation, such as data mining, statistics, business analytics, and other mathematical techniques.

I will continue talking about some other background research in another post soon. 

Weeks 2 and 3 at Microtech

I returned to Microtech a couple weeks ago, and have since worked on many more aspects of the production process.

I started by continuing to familiarize myself with the key functions and methods used at Microtech. In addition to the part measuring and auditing that I had learned to do before I left, I learned about a few more important functions of the quality lab. First is a process called burst testing. The inside of a battery is filled with paste, and as the cell discharges, the chemical reactions inside release gases. To prevent the battery from bursting during use, the battery seals must be breathable enough to allow the produced gases to escape. To ensure that the seals are breathable enough, we put them into a machine called a burst tester. The seals are capped using metal parts similar to those you would see on a battery, and then taken into a sealed chamber where they are pressurized until the plastic seals burst. The machine records the pressure at which each seal bursts, so that we can analyze trends in the data and identify defective cavities or groups. For each type of battery seal, there are established limits on the pressure at which the seal should burst. If the seal bursts too early, the battery will die out early during use because the seal would pop before all of the battery acid is used. If the seal bursts at a pressure exceeding the specification, the rest of the battery would break before the seal, allowing battery acid to pour out into the device and potentially cause more damage.

The next major task I got into in the quality lab is a process called gage R&R. Gage R&R is a widely used statistical method in industry to evaluate two major components of any measurement process: repeatability and reproducibility. In a gage study, multiple operators (people) measure the same set of parts using some technique (gage) in a randomized, double-blind setting. Once all the operators have measured all the parts, the values are entered into Minitab statistical software and evaluated. The results show the variability in the system, and Minitab isolates its various sources - operator error, instrument error, methodological error, and part-to-part variation. Using these results, it becomes possible to determine the viability and efficiency of a certain gage as a measurement method. The repeatability and reproducibility are also analyzed, providing mathematical backing for the variation in measurements by a single operator or instrument and between operators or instruments, respectively.
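To sketch what the gage R&R math looks like, here is a minimal, simplified Python version of a crossed study; the readings are hypothetical placeholders, and Minitab's actual ANOVA method is more involved than these back-of-the-envelope variance components.

```python
# A minimal sketch of a crossed gage R&R study: 3 operators each
# measure 5 parts twice. All readings are hypothetical, and the
# variance estimates are simplified for illustration only.
import statistics

# measurements[operator][part] -> repeat readings (in mm)
measurements = {
    "op1": {1: [2.01, 2.03], 2: [1.98, 1.99], 3: [2.10, 2.08],
            4: [1.95, 1.96], 5: [2.05, 2.04]},
    "op2": {1: [2.02, 2.00], 2: [1.97, 1.98], 3: [2.09, 2.11],
            4: [1.94, 1.97], 5: [2.06, 2.05]},
    "op3": {1: [2.04, 2.02], 2: [2.00, 1.99], 3: [2.12, 2.10],
            4: [1.97, 1.95], 5: [2.07, 2.06]},
}

# Repeatability (equipment variation): average variance among repeat
# readings of the same part by the same operator.
repeatability = statistics.mean(
    statistics.variance(reps)
    for op in measurements.values()
    for reps in op.values()
)

# Reproducibility (appraiser variation): variance between operator means.
op_means = [
    statistics.mean([r for reps in op.values() for r in reps])
    for op in measurements.values()
]
reproducibility = statistics.variance(op_means)

# Part-to-part variation: variance between part means across all operators.
parts = sorted(next(iter(measurements.values())))
part_means = [
    statistics.mean([r for op in measurements.values() for r in op[p]])
    for p in parts
]
part_to_part = statistics.variance(part_means)

gage_rr = repeatability + reproducibility
total = gage_rr + part_to_part
print(f"%Gage R&R (share of total variance): {100 * gage_rr / total:.1f}%")
```

A low %Gage R&R means most of the observed variation comes from the parts themselves rather than the measurement system, which is what makes a gage viable.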

Outside the quality lab, I delved right into the core of my research project, which is the statistical analysis of processes using the eDart system to collect data from the injection molding machines. The eDart collects data directly from the mold on various parameters such as injection integral, peak pressure, shot temperature, and more. In industrial production, the goal is to optimize the production process so that every parameter is fine-tuned to the best possible setting for producing high-quality outputs, and to find out exactly what those settings are, we use a process called DOE (Design of Experiments). In a DOE, each of the desired parameters is manipulated and isolated in a matrix that can then be compared with the output data to determine which changes resulted in positive, neutral, or negative quality. Additionally, the data from the eDart allows us to make a couple of additional inferences. First, it allows us to correlate the parameters that we manipulate (mold temperature, barrel temperature, hold pressure, etc.) with other parameters that we do not manipulate but that still change, which we call covariates (injection integral, peak shot stroke, injection pressure, etc.). Second, it allows us to see relationships between the covariates and the final quality of the products. By letting us break the process down further and draw these inferences, the task of creating the optimal process becomes easier and more concrete.
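As a rough illustration of the covariate analysis described above, here is a minimal sketch in Python that correlates one eDart covariate with one quality output; the numbers, and the choice of a plain Pearson correlation, are my own hypothetical example rather than the actual analysis used at Microtech.

```python
# A minimal sketch relating an eDart covariate (e.g., peak pressure)
# to a quality output. All values are hypothetical placeholders.
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-shot records: one covariate and one quality output.
peak_pressure = [8100, 8250, 7980, 8400, 8120, 8310, 8050, 8200]  # psi
part_weight = [1.012, 1.018, 1.005, 1.025, 1.011, 1.021, 1.008, 1.016]  # g

r = pearson_r(peak_pressure, part_weight)
print(f"r(peak pressure, part weight) = {r:.3f}")
```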