Maker of the world's leading risk and decision analysis software
@RISK and RISKOptimizer
Six Sigma Example Models

The following examples illustrate how @RISK and RISKOptimizer can be used in a wide variety of Six Sigma applications. Each model contains a full explanation, and all of the models are also documented in the @RISK Six Sigma User's Guide.

Download all examples
Download @RISK Six Sigma User's Guide

Tolerance Analysis: Gear Pump Assembly Optimization
This model represents a tolerance analysis study performed to solve a potential assembly issue in a gear pump of the type typically used in power packs, forklifts, and low-noise-emission machines. After the parts are assembled as shown in Fig. 1, the operating pressure inside the pump tends to press the blocks towards each other; in this condition there must be no gap between the blocks. If a gap is present, volumetric efficiency is lost because of the central bypass between the inlet and outlet of the pump. A gap can occur if there is undesired contact between the blocks and the housing at the points shown in Fig. 2. The gear housing and the blocks must therefore be designed so that the position of these points allows the blocks to come into contact at the center when pushed by the pressure.

The gap depends on several design parameters of the housing and of the blocks. A tolerance analysis was performed to optimize the nominal configuration of certain parameters, taking into account constraints arising from functional requirements, machining feasibility, and cost. The model simulates the gap distribution and predicts the potential scrap percentage as a function of both the nominal value and the variation (due to machining capability) of each parameter. The statistical distribution that best describes each parameter, and the values of its distribution parameters (standard deviation for the Normal distribution; shape and scale for the Weibull distribution), were identified through capability studies.

The assembly is considered scrap if 'gap' is greater than zero, that is, if a gap is present. Therefore, an Upper Specification Limit of zero has been set for the response 'gap'. Some of the parameters involved are shown in Fig. 3.

Looking at the parameter list, note that each parameter describing the shape of the block appears twice in the model. This choice simulates the assembly realistically: a real assembly contains two different blocks, drawn at random from the block population, so each parameter takes a different value for block 1 and block 2 within the same assembly. Because the population is the same, the same statistical distribution is used for both. Over a large number of simulated assembly combinations, a block design parameter 'x' therefore has the same influence whether it enters as 'x1' or as 'x2', and the sensitivity coefficients for 'x1' and 'x2' are equal. Nevertheless, both must be included in the model to simulate the potential scrap percentage correctly, as in the sketch below.
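For readers who want to see the sampling logic outside of Excel, the following minimal Python sketch reproduces the idea. The distribution choices, parameter names, and gap formula are hypothetical placeholders, not the values used in the model; the point is only that each block parameter is sampled twice, independently, from the same population, and that an assembly counts as scrap when the simulated gap exceeds the Upper Specification Limit of zero.

# Minimal Monte Carlo sketch of the gap/scrap logic (illustrative values only;
# the real model uses the Normal/Weibull parameters fitted from capability studies).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of simulated assemblies

# Housing parameter: Normal(mean, std dev) -- hypothetical numbers
housing_width = rng.normal(50.00, 0.02, N)

# Block shape parameter 'x': sampled twice per assembly (block 1 and block 2),
# independently but from the same population, as in the @RISK model.
x1 = rng.weibull(3.0, N) * 0.05 + 24.90   # scaled and shifted Weibull samples
x2 = rng.weibull(3.0, N) * 0.05 + 24.90

# Hypothetical gap relationship: a positive gap means the blocks cannot meet in the center.
gap = (x1 + x2) - housing_width

usl = 0.0                                  # Upper Specification Limit for 'gap'
scrap_fraction = np.mean(gap > usl)
print(f"Predicted scrap: {scrap_fraction:.2%}")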

@RISK has been used to find the combination of nominal design values and process variation that minimizes the scrap percentage. The functional constraints are modeled with a second response, 'beta', an angular value describing the position on the blocks of the red points shown in Fig. 2. Due to functional requirements its value must not fall below a certain amount, while there is no constraint on its upper limit, so only a Lower Specification Limit has been defined for 'beta'. A combined optimization of the two responses was performed, and an advanced 3D view of the simulated data was added.

This model was created by Franco Anzani of SixSigmaIn Team, www.sixsigmain.it, and Marco Manara of Casappa S.p.A., www.casappa.com. © 2008 SixSigmaIn Team and Casappa S.p.A.

Download example: DFSSGearPumpAssemblyOptimizationModel.zip


 

Design of Experiments: Catapult
The catapult or trebuchet model is a classic example used to teach Design of Experiments. It illustrates Monte Carlo simulation and tolerance analysis.

Suppose you are manufacturing catapults and customers demand that a catapult throw a standard ball a distance of 25 meters, plus or minus 1 meter. There are many design factors involved in producing your catapults, such as:

• Angle of Launch

• Mass of the Ball

• Distance Pulled

• Spring Constant

Each of the design factors contains an @RISK probability distribution to represent the different possible values that factor could take. @RISK probability distributions can be entered directly as formulas or by using the Define Distribution icon on the @RISK toolbar. For example, a Uniform distribution represents the possible values for Distance Pulled.

The output is Distance Thrown, and it contains a RiskSixSigma property function defining the Lower Specification Limit, Upper Specification Limit, and Target for Distance Thrown. Like inputs, an @RISK output can be typed into the formula bar or defined via a dialog box using the Add Output button on the @RISK toolbar.

Capability metrics Cpk, Cpk Upper, Cpk Lower, Sigma Level, and DPM are calculated for the catapult, enabling you to determine whether it is ready for production.
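The capability statistics follow directly from the simulated output values and the specification limits. The Python sketch below shows how Cpk, Sigma Level, and DPM could be computed from a vector of simulated Distance Thrown values; the launch physics is a simplified stand-in and every parameter value is illustrative, not taken from the example workbook.

# Capability metrics from a simulated output, using standard Six Sigma definitions
# (the @RISK functions also allow a long-term shift; that is omitted here).
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical stand-in for the catapult physics: spring energy becomes kinetic
# energy, and the ball flies a projectile range.  Illustrative values only.
g = 9.81
angle  = np.radians(rng.normal(45, 2, N))      # Angle of Launch (degrees -> radians)
mass   = rng.normal(0.10, 0.005, N)            # Mass of the Ball (kg)
pull   = rng.uniform(0.45, 0.55, N)            # Distance Pulled (m)
spring = rng.normal(100.0, 3.0, N)             # Spring Constant (N/m)

v = pull * np.sqrt(spring / mass)              # launch speed from 1/2 k x^2 = 1/2 m v^2
distance = v**2 * np.sin(2 * angle) / g        # Distance Thrown (m)

lsl, usl, target = 24.0, 26.0, 25.0
mu, sigma = distance.mean(), distance.std(ddof=1)

cpk_upper = (usl - mu) / (3 * sigma)
cpk_lower = (mu - lsl) / (3 * sigma)
cpk = min(cpk_upper, cpk_lower)
sigma_level = 3 * cpk                          # distance to the nearer limit, in sigmas
dpm = np.mean((distance < lsl) | (distance > usl)) * 1e6

print(cpk_upper, cpk_lower, cpk, sigma_level, dpm)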

The resulting distribution of Distance Thrown shows that about 60% of the time the distance is outside of specification limits.

Sensitivity analysis identifies Distance Pulled as the most important design factor affecting Distance Thrown, followed by Mass of the Ball.

This model can help explore the theory of Taguchi or Robust Parameter Design. Taguchi theory states that there are two types of variables which define a system – those whose levels affect the process variation, and those whose levels do not. The idea behind Taguchi Design is to set variables of the first type at a level which minimizes total process variation. Variables which don’t affect process variation are used to control and/or adjust the process.

In the catapult model, you can adjust various design parameters – such as Distance Pulled and Mass of the Ball – to try to minimize the variation in the output Distance Thrown. Considering that 60% of the time the Distance Thrown is outside the specification limits of 24 to 26 meters, there is room for improvement.

Example model: SixSigmaDOECatapult.xls

 

Design of Experiments: Welding
Suppose you are analyzing a metallic burst cup manufactured by welding a disk onto a ring (see below). The product functions as a seal and a safety device, so it must hold pressure in normal use, and it must separate if the internal pressure exceeds the safety limit.

 

The model relates the weld strength to process and design factors, models the variation for each factor, and forecasts the product performance in relation to the engineering specifications. Modeling a response based on multiple factors can often be accomplished by generating a statistically significant function through experimental design or multiple regression analysis.

In this example, @RISK simulates the variation using a Normal distribution for each factor. @RISK distributions support cell referencing, so you can easily set up a tabular model that can be updated throughout a product and process development lifecycle.

The uncertain factors are:

Design Variables

• Disk thickness

• Horn wall thickness

• Horn length

Process Variables

• Weld pressure

• Weld time

• Trigger point

• Amplitude

• Frequency

Adding a distribution to each factor is as easy as clicking the Define Distribution icon on the @RISK toolbar. From there you can select a Normal distribution and enter its parameters or cell references, as shown below. You could also type the formula directly into Excel's formula bar for each input. For example, the cell for Weld Pressure contains the formula

=RiskNormal(D73,E73)

The output is Weld Strength (N) in the Design & Process Performance section, and it contains a RiskSixSigma property function that specifies the Lower Specification Limit (LSL), Upper Specification Limit (USL), and Target value. As with input distributions, you can type the output formula directly in the output cell or use the Add Output dialog. The formula would be:

=RiskOutput("Weld Strength (N)",,,RiskSixSigma(D82,E82,105,0,1))+ [the mathematical calculation]

After the simulation is run, Six Sigma statistics are generated using @RISK Six Sigma functions for Cpk Upper, Cpk Lower, Cpk, and PPM Defects (or DPM). Standard @RISK statistics functions (such as RiskMean) are also used.

The @RISK output distribution displays the expected performance based on the design and process input variation and shows LSL, USL, and Target value with markers. You can easily access the output statistics using the reporting features or through @RISK functions.

The @RISK Sensitivity Analysis clearly shows that the Weld Time and Amplitude parameters are driving the Weld Strength variation.
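Sensitivity rankings of this kind are typically based on the correlation between each sampled input and the simulated output. The Python sketch below illustrates the idea with Spearman rank correlation; the input data and the transfer function standing in for the fitted DOE/regression model are hypothetical.

# Rank-correlation sensitivity sketch: which inputs move the output the most?
# Input names follow the welding example; the transfer function is hypothetical.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
N = 10_000

inputs = {
    "Weld pressure":  rng.normal(300, 10, N),
    "Weld time":      rng.normal(0.50, 0.05, N),
    "Amplitude":      rng.normal(40, 2, N),
    "Disk thickness": rng.normal(1.0, 0.02, N),
}

# Hypothetical linear transfer function in place of the fitted model.
weld_strength = (0.1 * inputs["Weld pressure"] + 120 * inputs["Weld time"]
                 + 2.0 * inputs["Amplitude"] + 15 * inputs["Disk thickness"]
                 + rng.normal(0, 2, N))

for name, x in inputs.items():
    rho, _ = spearmanr(x, weld_strength)
    print(f"{name:15s} rank correlation with Weld Strength: {rho:+.2f}")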

The next steps for this problem include two options: the engineer can attempt to reduce or better control the variation in Weld Time and Amplitude, or use RISKOptimizer to find the optimal process and design targets to maximize yield or reduce scrap cost.

Example model: SixSigmaDOE.xls

 

Design of Experiments with Optimization
This model demonstrates the use of RISKOptimizer in experimental design. RISKOptimizer combines Monte Carlo simulation with genetic algorithm-based optimization. Using these two techniques, RISKOptimizer is uniquely capable of solving complex optimization problems that involve uncertainty.

With RISKOptimizer, you can choose to maximize, minimize, or approach a target value for any given output in your model. RISKOptimizer tries many different combinations of controllable inputs that you specify in an effort to reach its goal. Each combination is called a “solution,” and the total group of solutions tried is called the “population.” “Mutation” refers to the process of randomly trying new solutions unrelated to previous trials. You can also set constraints that RISKOptimizer must abide by during the optimization.

For uncertain, uncontrollable factors in your model, you define @RISK probability distribution functions. For each trial combination of inputs, RISKOptimizer also runs a Monte Carlo simulation, sampling from those @RISK functions and recording the output for that particular trial. RISKOptimizer can run thousands of trials to get you the best possible answer. By accounting for uncertainty, RISKOptimizer is far more accurate than standard optimization programs.
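Conceptually, each trial is "pick candidate settings, run a Monte Carlo simulation, score the result, keep the best." The toy Python sketch below shows that structure using simple random search rather than a genetic algorithm, a hypothetical defect-cost model, and the Trigger Point <= Weld Time constraint described later in this example; all numbers are illustrative, and it is not RISKOptimizer's algorithm.

# Toy "optimization under uncertainty": random search over nominal settings,
# with a Monte Carlo simulation inside each trial.  All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(3)
N_SIM = 5_000            # Monte Carlo iterations per candidate
N_TRIALS = 200           # candidate solutions tried
LSL, USL = 90.0, 120.0   # hypothetical weld-strength specification (N)
COST_PER_DEFECT, ANNUAL_VOLUME = 0.50, 1_000_000

def annual_defect_cost(weld_time, trigger_point, amplitude):
    """Simulate weld strength for one candidate and return expected annual defect cost."""
    strength = (100 * rng.normal(weld_time, 0.05, N_SIM)
                + 1.5 * rng.normal(amplitude, 2.0, N_SIM)
                - 10 * rng.normal(trigger_point, 0.02, N_SIM))
    defect_rate = np.mean((strength < LSL) | (strength > USL))
    return defect_rate * ANNUAL_VOLUME * COST_PER_DEFECT

best = None
for _ in range(N_TRIALS):
    weld_time     = rng.uniform(0.3, 0.8)
    trigger_point = rng.uniform(0.1, 0.8)
    amplitude     = rng.uniform(30, 50)
    if trigger_point > weld_time:        # constraint: Trigger Point <= Weld Time
        continue
    cost = annual_defect_cost(weld_time, trigger_point, amplitude)
    if best is None or cost < best[0]:
        best = (cost, weld_time, trigger_point, amplitude)

print("Best annual defect cost found:", best)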

In this example, as above, the part under investigation is a metallic burst cup manufactured by welding a disk onto a ring. The product functions as a seal and a safety device, so it must hold pressure in normal use, and it must separate if the internal pressure exceeds the safety limit.

The model relates the weld strength to process and design factors, models the variation for each factor, and forecasts the product performance. RISKOptimizer was used to search for the optimal combination of process settings and nominal design values to minimize scrap cost, called Annual Defect Cost in the model. This is the same as maximizing yield.

The process and design variables RISKOptimizer will adjust are:

Design Variables

• Disk thickness

• Horn wall thickness

• Horn length

Process Variables

• Weld pressure

• Weld time

• Trigger point

• Amplitude

• Frequency

These are adjusted in an effort to minimize the output Annual Defect Cost. Clicking RISKOptimizer's Model Definition icon lets you define which cells to adjust, what your output is, and what constraints to apply. In addition to the inputs and outputs described above, we also define a constraint that the Trigger Point must always be less than or equal to the Weld Time.

When you click Start Optimization, the RISKOptimizer Progress window appears, showing you a summary status of the analysis.

After simulation and optimization, RISKOptimizer efficiently found a solution that reduced the Annual Defect Cost to under $8,000.

Using RISKOptimizer can save time and resources in a quality improvement and cost reduction effort. The next steps for this problem would be to validate the model and optimized solution through experimentation.

Example model: SixSigmaDOEOpt.xls

 

Six Sigma: Electrical Circuit Analysis
This simple direct-current circuit consists of two voltage sources, one independent and one dependent, and two resistors. The independent source specified by the design engineer has an operating power range of 5,550 W +/- 300 W. If the power drawn from the independent voltage source falls outside this specification, the circuit is defective. The design performance results clearly indicate that the design is not capable, with a percentage of circuits failing on both the high and low sides of the limits. The PNC value indicates the expected Percentage of Non-Conformance of units at the lower and upper ends of the specification.

The basic logic of the model is as follows:

The model calculates the standard deviation for each component based on the known information and on the following assumptions: 1) the mean component value is centered within the tolerance limits, and 2) component values are normally distributed. Note that @RISK can be used to fit a probability distribution to a data set, or to model other types of probability distributions if necessary.
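A common way to derive the standard deviation under these two assumptions is to treat the tolerance band as spanning a fixed number of standard deviations, for example +/-3 sigma. The rule actually used in the workbook is not stated here, so the Python sketch below is only one plausible reading:

# Derive a component's Normal parameters from its tolerance band, assuming the
# band spans +/- k_sigma standard deviations (a conventional but not universal choice).
def normal_from_tolerance(nominal, lower_tol, upper_tol, k_sigma=3.0):
    lsl, usl = nominal + lower_tol, nominal + upper_tol
    mean = (lsl + usl) / 2.0             # assumption 1: centered within the limits
    sigma = (usl - lsl) / (2 * k_sigma)  # assumption 2: Normal, band = +/- k_sigma
    return mean, sigma

# Example: a hypothetical 100-ohm resistor with a +/-5 ohm tolerance
print(normal_from_tolerance(100.0, -5.0, +5.0))   # -> (100.0, 1.666...)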

A RiskSixSigma property function in the PowerDep output cell defines the Upper Specification Limit, Lower Specification Limit, and Target used in the Six Sigma calculations. @RISK Six Sigma functions are used to calculate Sigma Capability, Cpk Lower, Cpk Upper, Cpk, Cp, DPM, PNC Upper, and PNC Lower. The output graph for PowerDep includes Six Sigma markers for the USL, LSL, and Target.

@RISK's Sensitivity Analysis identifies the input variables causing variation in the output. The sensitivity results show that the two voltage sources are the main contributors to the variation in power consumption. Armed with this information, the engineering team can focus its improvement efforts on the power sources rather than on the resistors.

The model can be used to test different components and tolerances; performance and yield can then be compared and the optimal solution selected to maximize yield and reduce cost.

Example model: SeisSigmaProjetoEletrico.xls

 

Lean Six Sigma: Analysis of Current State – Quotation Process

This model represents the process flow of a company's internal sales quotation process. The process was taken from an actual company and had over 36 individual steps involving ten individuals or departments. It took up to four weeks to get a quote through the system, yet for critical issues quotes could be expedited in less than one week. Long quote cycle times prevented the company from bidding on often lucrative emergency orders for its products and services. Management suspected the problems lay with personnel, not the process, but engineers suspected the process and needed a tool to prove it.

First the engineering team asked: how long does it take to process a quotation, from receipt of the customer's request to release of the quote to the Engineering department? To answer this, the team broke the process down into four steps. First, the data is collected and entered (Step A in the model). Next, it goes into a queue for Customer Service review (Step B). During review (Step C), corrections and additional data are entered onto the form and a tracking number is assigned. Finally, the packet is put into a queue for the Engineering department to perform the quotation activity (Step D).

The team captured the amount of time each quote spent in each step of the process. Data from many quotations appears below, and @RISK's distribution fitting tool was used to create distribution functions describing the amount of time each of Steps A-D takes. The output is the total time, that is, the sum of Steps A-D. Built into the output is a RiskSixSigma function defining the USL, LSL, and Target used to calculate the Six Sigma statistics Cp, Cpk Lower, Cpk Upper, and Cpk for the total time after the simulation. In addition, the mean, maximum, minimum, and standard deviation of the total time are calculated with @RISK statistics functions. The USL, LSL, and Target are marked on the Total Time output graph.
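The structure of that calculation is straightforward: sample a duration for each of the four steps from its fitted distribution and add them. The Python sketch below uses hypothetical lognormal fits as stand-ins for the distributions produced by @RISK's fitting tool, only to show how the total-time statistics would be produced.

# Total quotation cycle time = Step A + Step B + Step C + Step D.
# Distribution choices and parameters are hypothetical stand-ins for the fitted distributions.
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

step_a = rng.lognormal(mean=np.log(60),  sigma=0.6, size=N)   # data collection/entry (min)
step_b = rng.lognormal(mean=np.log(400), sigma=0.9, size=N)   # queue for Customer Service review
step_c = rng.lognormal(mean=np.log(35),  sigma=0.3, size=N)   # review (the value-added step)
step_d = rng.lognormal(mean=np.log(600), sigma=0.9, size=N)   # queue for Engineering

total = step_a + step_b + step_c + step_d
print("mean %.0f  min %.0f  max %.0f  std %.0f (minutes)"
      % (total.mean(), total.min(), total.max(), total.std(ddof=1)))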

@RISK simulation showed that the mean time to process a quote is about 1,700 minutes, which is over 28 hours, and that it could take anywhere from 350 minutes (almost 6 hours) to well over 2 calendar days. The team knew the only value-added portion of the process is the Review step (Step C), which took an average of 35 minutes to complete. When management saw that it took over 24 hours to complete 35 minutes of value-added work, they recognized the need for process improvement.

This model was created by Ed Biernat of Consulting With Impact, Ltd., www.consultingwithimpact.com.

Example model: SixSigmaQuotationProcess.xls

 

DMAIC: Roll Through Yield Analysis
DMAIC - or Define, Measure, Analyze, Improve, and Control - is used to improve existing products or processes. Imagine you are a costume jewelry manufacturer, coating inexpensive silver with thin layers of gold. You import materials and components from China. A small number of components are always defective, but you don't know how many or how much it is costing.

You've gathered data on the number of components that are defective or become defective at various points in the manufacturing process. On the surface, defective parts do not seem to be a major problem: upwards of 99% of components are acceptable at each stage of the process. However, the combined effect of the defective parts leads to 15-20% waste in final products, which can translate into 200,000 defective units per million produced. If materials cost $0.50 per unit, that is $100,000 in waste before counting labor, machine time, and other expenses.

You need to reduce the number of defective units produced. However, the process is long and complicated, and you don't know which stage to begin with. Using @RISK, you can simulate many different outcomes and pinpoint the manufacturing stage that is the worst offender. You can also get key process capability metrics for each stage as well as the entire process that will help you improve quality and reduce waste. In this way, @RISK is being used in the Measure and Analyze phases of the DMAIC method. @RISK is used to measure the existing state of the process (with capability metrics) and analyze how it might be improved (with sensitivity analysis).

Using the data gathered from the manufacturing process, @RISK's distribution fitting feature was used to define distribution functions describing the number of defective parts at each stage of the process - Unpackaging/Inspection, Cutting, Cleaning, and Electroplating. These fitted distributions were added directly to the model.

The Defective Parts per Million (DPPM) for each stage, and the process as a whole, were defined as @RISK outputs with Six Sigma specifications for Upper Specification Limit, Lower Specification Limit, and Target values. After the simulation run, a variety of Six Sigma metrics were calculated for each stage and the process as a whole.
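The rolled throughput yield logic behind these outputs is the product of the per-stage yields, and DPPM is simply the defective fraction scaled to a million units. A hedged Python sketch, with made-up Beta distributions in place of the fitted defect-rate distributions:

# Rolled throughput yield (RTY) and DPPM from per-stage defect rates.
# The Beta distributions below are hypothetical stand-ins for the fitted distributions.
import numpy as np

rng = np.random.default_rng(5)
N = 100_000

stages = {
    "Unpackaging/Inspection": rng.beta(2, 300, N),   # per-stage defective fraction
    "Cutting":                rng.beta(2, 200, N),
    "Cleaning":               rng.beta(20, 400, N),
    "Electroplating":         rng.beta(2, 300, N),
}

defect_rates = np.column_stack(list(stages.values()))
stage_yields = 1.0 - defect_rates
rty = stage_yields.prod(axis=1)              # rolled throughput yield per iteration
overall_defective = 1.0 - rty

print("Mean RTY: %.1f%%" % (100 * rty.mean()))
print("Overall DPPM: %.0f" % (1e6 * overall_defective.mean()))
for name, rate in stages.items():
    print("%-25s DPPM %.0f" % (name, 1e6 * rate.mean()))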

Finally, sensitivity analysis and a Tornado graph revealed that the Cutting stage was most to blame for overall product defects, despite the fact that another stage, Cleaning, had a lower First Time Yield (more defects). Even though the FTY of Cutting was higher, the Cutting process is less consistent and has more variation than the other processes.

Example model: SixSigmaDMAICRTY.xls

 

DMAIC Failure Rate
This is a failure rate model for use in quality control and planning. You are a manufacturer and need to calculate the likely percentage of defective products. In the DMAIC method - Define, Measure, Analyze, Improve, Control - this corresponds to the Measure and Analyze phases, where you wish to measure the current state of quality and analyze the causes of problems or defects.

A product is defective when any one of its components does not meet its required tolerance level. Each component is deemed to be satisfactory if some property of its finished state (e.g. its width) lies within the defined tolerance bands.

This property of each finished component (e.g. its width) is modeled with a Normal distribution in the Sample column. Those cells have also been added as @RISK outputs with RiskSixSigma property functions defining LSL, USL, and Target values for each component. The formula for Component1 appears below:

=RiskOutput(,,,RiskSixSigma(F26,G26,C26,0,0))+RiskNormal(C26,D26)

In this way we'll be able to see graphs of the components’ quality, and calculate Six Sigma statistics on each component.

The component and aggregate Failure Rates are calculated with the RiskMean function, an @RISK statistics function, and are therefore available only after the simulation has been run. After the simulation we can also see the component and aggregate Six Sigma statistics Z score and DPM.
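In simulation terms, the failure rate is the mean of an indicator: for each iteration, does any component fall outside its tolerance band? A minimal Python sketch with two hypothetical components:

# Aggregate failure rate = mean of "any component out of spec" across iterations.
import numpy as np

rng = np.random.default_rng(6)
N = 100_000

# (nominal, std dev, LSL, USL) -- hypothetical components
components = [
    (10.0, 0.10, 9.7, 10.3),
    (25.0, 0.30, 24.2, 25.8),
]

any_failed = np.zeros(N, dtype=bool)
for nominal, sd, lsl, usl in components:
    sample = rng.normal(nominal, sd, N)                 # the "Sample" column
    failed = (sample < lsl) | (sample > usl)
    print("component failure rate: %.4f  DPM: %.0f" % (failed.mean(), 1e6 * failed.mean()))
    any_failed |= failed

print("aggregate failure rate: %.4f" % any_failed.mean())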

Example model: SixSigmaDMAICFailure.xls

 

DMAIC Failure Rate using RiskTheo
This is an extension of the DMAIC Failure Rate model for use in quality control and planning. It uses RiskTheo functions (in this case RiskTheoXtoP) to determine the failure rate without actually running a simulation. RiskTheo functions return theoretical statistics on input distributions or formulas rather than statistics on the data from a simulation run.

You are a manufacturer and need to calculate the likely percentage of defective products. In the DMAIC method - Define, Measure, Analyze, Improve, Control - this corresponds to the Measure and Analyze phases, where you wish to measure the current state of quality and analyze the causes of problems or defects.

A product is defective when any one of its components does not meet its required tolerance level. Each component is deemed to be satisfactory if some property of its finished state (e.g. its width) lies within the defined tolerance bands.

This property of each finished component (e.g. its width) is modeled with a Normal distribution in the Sample column.

Those cells have also been added as @RISK outputs with RiskSixSigma property functions defining LSL, USL, and Target values for each component. The formula for Component1 appears below:

=RiskOutput(,,,RiskSixSigma(F26,G26,C26,0,0))+RiskNormal(C26,D26)

In this way we'll be able to see graphs of the components’ quality and calculate Six Sigma statistics on each component if we choose to run a simulation.

The component and aggregate Failure Rates are calculated with the RiskTheoXtoP function, which draws on the Normal distributions in the Sample column. If you choose to run a simulation, the Failure Rate is also calculated using the RiskMean function, so you can compare the simulated Failure Rate with the RiskTheo Failure Rate.

After simulation we can also see component and aggregate Six Sigma statistics Z score and DPM.
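For a Normal input, the theoretical out-of-tolerance probability is just the Normal CDF evaluated at the specification limits. The Python sketch below compares that closed-form probability with a Monte Carlo estimate for a hypothetical component; it illustrates the idea behind the theoretical calculation rather than the @RISK functions themselves.

# Theoretical vs simulated out-of-tolerance probability for one Normal component.
import numpy as np
from scipy.stats import norm

mu, sd = 10.0, 0.10          # hypothetical component: mean and std dev
lsl, usl = 9.7, 10.3

# Closed form: P(x < LSL) + P(x > USL)
p_fail_theory = norm.cdf(lsl, mu, sd) + norm.sf(usl, mu, sd)

# Monte Carlo estimate of the same quantity
rng = np.random.default_rng(7)
sample = rng.normal(mu, sd, 1_000_000)
p_fail_sim = np.mean((sample < lsl) | (sample > usl))

print("theoretical: %.6f   simulated: %.6f" % (p_fail_theory, p_fail_sim))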

Example model: SixSigmaDMAICFailureRiskTheo.xls



