Six Sigma and Simulation: Part 2

By Jeff Joines (Associate Professor In Textile Engineering at NCSU)

This is the second of the three-part series on Six Sigma, Lean Sigma, and Simulation. The first part explained the Six Sigma methodologies. Recall that the goal of the DMAIC continuous-improvement methodology is to control and reduce the variability of an existing process or product, while the Design for Six Sigma process (DMADV) is used to design a new process or product with minimal variability before it is created. Simulation modeling can be employed in almost every phase of either methodology.

Define

Six Sigma practitioners typically have to estimate the cost savings for each project in order to justify it or have it certified. However, most of these cost forecasts are based on point estimates of key parameters (e.g., raw material cost, customer/product demand, cost of capital, currency rates). By employing Monte Carlo simulation, variability and/or ranges on these point estimates can be incorporated to provide a more reliable forecast. Along these lines, when several projects have been proposed, simulation can be used to help management perform project selection based on resource constraints and objectives.
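As a sketch of that idea, a Monte Carlo cost-savings forecast only needs ranges in place of the point estimates. Every parameter value and distribution below is a hypothetical placeholder; substitute your own project estimates.

```python
import random

def simulate_savings(n_trials=100_000, seed=42):
    """Monte Carlo forecast of annual project savings.

    All ranges below are invented placeholders for illustration.
    """
    random.seed(seed)
    results = []
    for _ in range(n_trials):
        material_cost = random.triangular(2.00, 2.80, 2.30)   # $/unit: min, max, most likely
        demand = random.triangular(80_000, 140_000, 100_000)  # units/year
        savings_rate = random.uniform(0.03, 0.07)             # fraction of spend saved
        results.append(material_cost * demand * savings_rate)
    results.sort()
    return {
        "mean": sum(results) / n_trials,
        "p05": results[int(0.05 * n_trials)],  # 5th percentile
        "p95": results[int(0.95 * n_trials)],  # 95th percentile
    }

est = simulate_savings()
print(f"expected savings ${est['mean']:,.0f} "
      f"(90% range ${est['p05']:,.0f} to ${est['p95']:,.0f})")
```

Reporting the 5th-to-95th percentile range rather than one number is what makes the forecast more reliable than a point estimate.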

Analyze and Improve

During the Analyze and Improve phases, Design of Experiments (full, fractional, mixed, etc.) is the most common tool utilized. It provides a baseline to illustrate improvement when changes are made, as well as identifying the factors of interest to control or change. The usual baseline measure is the process capability index (Cpk), an indication of the ability of a process to produce consistent results: the ratio between the permissible spread and the actual spread of the process. The Cpk index takes into account off-centeredness and is defined as the minimum of (USL − Mean)/3σ and (Mean − LSL)/3σ, where USL and LSL are the upper and lower specification limits. A Six Sigma process is normally distributed with a Cpk value of at least 1.5.
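Computing Cpk from sample data is a few lines in any language; here is a minimal sketch (the measurement values and specification limits are made up):

```python
from statistics import mean, stdev

def cpk(samples, lsl, usl):
    """Process capability index: min((USL - mean)/3s, (mean - LSL)/3s)."""
    m, s = mean(samples), stdev(samples)
    return min((usl - m) / (3 * s), (m - lsl) / (3 * s))

# Invented measurements of a dimension with spec limits 9.0 and 11.0:
measurements = [9.8, 9.9, 10.0, 10.1, 10.2]
print(round(cpk(measurements, 9.0, 11.0), 2))  # → 2.11
```

Because Cpk takes the minimum of the two one-sided ratios, an off-center process is penalized even when its spread is small.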

Using the real system is better in terms of capturing all complexities, interactions, etc. However, as simulation practitioners, we recognize that this might not be possible or viable. The following list gives examples where simulation modeling, whether Monte Carlo or process simulation, can be used.

• If the product or process does not exist, as is the case in Design for Six Sigma, simulation models can be used to ascertain the capability of a new process or product before implementation. For example, the tolerance stack-up of individual parts or processes can be determined. Parts or processes may be within tolerance individually (e.g., a bearing and a shaft), but the assembly might not be capable owing to the tolerance stack-up problem, which occurs in manufacturing, service, and transactional processes.
• The cost of performing a DOE with replications is too high (e.g., raw material cost, cost of shutting down the current process). We have worked with companies to develop process and Monte Carlo simulation models that could be used to determine their capabilities and ascertain the potential improvement from proposed changes.
• The time required to run the set of experiments makes it impractical to determine the baseline or ascertain the improvements of a process. While working with a large company’s Six Sigma process-improvement team on a complex global supply chain, one of their projects was to reduce inventories for a series of products with a ten- to twelve-week lead time. The team had to evaluate six inventory policies, identify which of three suppliers was best, etc. A DOE with sufficient replications would have taken years to complete, rendering the project useless without the simulation model. Also, most of the data driving the model was based on lead times, which are not normally distributed.
• Think of systems where multiple processes feed one another (e.g., departments, plants, etc.), each containing only five or six factors. Transfer functions can be generated from a traditional DOE on each individual process but not on the entire system. A simulation model can combine the individual transfer functions to determine the capability of the whole system, as well as to test a wider range of values.
• There are several environments where performing a DOE is impractical or impossible. For example, we have trained dozens of people associated with hospital systems from around the country in Six Sigma. Simulation modeling and analysis allows these practitioners to ascertain process capability with a model, because the real system cannot be used when patient care is at stake. Other environments where we have used simulation modeling instead of the real system are transactional processes, such as those in the banking and insurance industries.
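The tolerance stack-up case in the first bullet can be illustrated with a few lines of Monte Carlo; all dimensions, tolerances, and specification limits here are invented for illustration:

```python
import random

random.seed(1)
N = 100_000
failures = 0
for _ in range(N):
    # Each part is individually capable: normal with a small standard deviation.
    shaft = random.gauss(25.00, 0.01)  # shaft diameter, mm (hypothetical)
    bore = random.gauss(25.05, 0.01)   # bearing bore diameter, mm (hypothetical)
    clearance = bore - shaft
    # Hypothetical assembly spec: clearance between 0.02 and 0.08 mm.
    if not 0.02 <= clearance <= 0.08:
        failures += 1
print(f"estimated assembly defect rate: {100 * failures / N:.2f}%")
```

Even though each part stays well inside its own tolerance, the variances add when the parts are combined, so the assembly fails its clearance spec far more often than either part fails its own.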

Control

Simulation can also be used as a process-control aid as the improved process is being implemented, to identify potential problems before they occur.

Hopefully it is apparent that simulation experts already possess skills that can greatly help Six Sigma projects. These types of projects are not unique; they are just the general simulation models we already know how to build. They only require us to learn the Six Sigma language and to calculate Cpk statistics. I find it easier to work with Six Sigma people because of their statistical training in input and output analysis, even though they have typically only used the normal distribution. In Six Sigma and Simulation: Part 3, the use of simulation in the Lean Sigma world will be addressed.

Six Sigma and Simulation: Part 1

By Jeff Joines (Associate Professor In Textile Engineering at NCSU)

This is a three part series on Six Sigma, Lean Sigma, and Simulation. The first blog will explain the Six Sigma methodology and the bridge to simulation analysis and modeling while the second and third parts will describe the uses of simulation in each of the Six Sigma phases and Lean Sigma (i.e., Lean Manufacturing) respectively.

“Systems rarely perform exactly as predicted” was the opening line of the blog Predicting Process Variability, and it is the driving force behind most improvement projects. As stated there, variability is inherent in all processes, whether those processes involve manufacturing a product within a plant, producing a product via an entire supply-chain complex, or providing a service in a retail, banking, entertainment, or hospital environment. If one could predict or eliminate the variability of a process or product, there would be no waste (or muda in the Lean world, which will be discussed in the third part) associated with the process: no overtime to finish an order, no lost sales owing to having the wrong inventory or lengthy lead times, no deaths owing to errors in health care, and so on, all of which ultimately leads to reduced costs. For any organization, manufacturing or service, reducing costs, lead times, etc. is, or should be, a priority in order to compete globally. Reducing, controlling, and/or eliminating the variability in a process is key to minimizing costs.

Six Sigma is a business philosophy focusing on continuous improvement to reduce and eliminate variability. In a service or manufacturing environment, a Six Sigma (6σ) process would be virtually defect free (i.e., only allowing 3.4 defects per million operations of a process). However, most companies operate at about four sigma, which allows roughly 6,200 defects per million. Six Sigma began in the 1980s when Motorola set out to reduce the number of defects in its own products. Motorola identified ways to cut waste, improve quality, reduce production time and costs, and focus on how products were designed and made. Six Sigma grew from this proactive initiative of using exact measurements to anticipate problem areas. In 1988, Motorola was selected as the first large manufacturing company to win the Malcolm Baldrige National Quality Award. As a result, Motorola’s methodologies were launched, and soon its suppliers were encouraged to adopt 6σ practices. Today, companies who use the Six Sigma methodology achieve significant cost reductions.
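Those defect rates follow from the normal distribution together with the conventional 1.5-sigma long-term shift of the process mean (the shift is a Six Sigma convention, not something derived here). A quick sketch of the conversion:

```python
from math import erf, sqrt

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities at a given sigma level,
    using the conventional 1.5-sigma long-term shift of the mean."""
    z = sigma_level - shift
    upper_tail = 0.5 * (1.0 - erf(z / sqrt(2.0)))  # P(X > z), standard normal
    return upper_tail * 1_000_000

print(round(dpmo(6.0), 1))  # → 3.4
print(round(dpmo(4.0)))     # → 6210
```

This is where the famous 3.4 defects per million comes from: six sigma of short-term capability, degraded by a 1.5-sigma drift, leaves 4.5 sigma between the mean and the nearest specification limit.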

Six Sigma evolved from other quality initiatives, such as ISO, Total Quality Management (TQM), and Baldrige, to become a quality standardization process based on hard data rather than hunches or gut feelings, hence the mathematical term, Six Sigma. Six Sigma utilizes a host of traditional statistical tools but encompasses them within a process-improvement framework. These tools include affinity diagrams, cause-and-effect diagrams, failure modes and effects analysis (FMEA), Poka Yoke (mistake proofing), survey analysis (voice of the customer), design of experiments (DOE), capability analysis, measurement system analysis, statistical process control charts and plans, etc.

There are two basic Six Sigma processes (DMAIC and DMADV). Both utilize data-intensive solution approaches and eliminate the use of gut feel or intuition in making decisions and improvements. The Six Sigma method based on the DMAIC process is utilized when the product or process already exists but is not meeting specifications or performing adequately. Its phases are as follows.

Define: identify, prioritize, and select the right projects; once a project is selected, define the project goals and deliverables.
Measure the key product characteristics and process parameters to create a base line.
Analyze and identify the key process determinants or root causes of the variability.
Improve and optimize performance by eliminating defects.
Control the current gains and future process performances.

If the process or product does not exist and needs to be developed, the Design for Six Sigma (DFSS) process (DMADV) is employed. Processes or products designed with the DMADV process typically reach market sooner, have less rework, and have lower costs. Even though DMADV is similar to the DMAIC method and starts with the same three steps, the two are quite different, as defined below.

Define: identify, prioritize, and select the right projects; once a project is selected, define the project goals and deliverables.
Measure and determine customer needs and specifications through voice of the customer.
Analyze and identify the process options necessary to meet the customer needs.
Design a detailed process or product to meet the customer needs.
Verify the design performance and its ability to meet the customer needs, where the customer may be internal or external to the organization.

Both processes are iterative, looping from a later stage back to an earlier one as needed. For example, if during the Analyze phase you determine that a key input is not being measured, new metrics have to be defined; and new projects can be defined once the Control phase is reached.

Now that we have defined Six Sigma, you may be wondering what the bridge to computer simulation and modeling is. Simulation modeling and analysis is just another tool in the Six Sigma toolbox. Many of the statistical tools (e.g., DOE) try to describe the dependent variables (Y’s) in terms of the independent variables (X’s) in order to improve them. Also, most of the statistical tools are parametric methods (i.e., they rely on the data being normally distributed, or utilize our friend the central limit theorem to make the data appear normally distributed). In such situations, many of the traditional tools might produce sub-optimal results or cannot be used at all. For example, if one is designing a new process or product, the system does not exist, so determining current capability or future performance cannot be done. The complexity and uncertainty of certain processes cannot be determined or analyzed using traditional methods. Simulation modeling and analysis makes none of these assumptions and can yield a more realistic range of results, especially where the independent variables (X’s) can be described as distributions of values. In Six Sigma and Simulation: Part 2, a more detailed look at how simulation is used in the two Six Sigma processes (DMAIC and DMADV) will be presented.
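The contrast between point estimates and distributed X’s can be sketched quickly. Suppose a DOE has yielded a transfer function for Y in terms of two X’s; the function and the input distributions below are invented for illustration:

```python
import random
from statistics import mean, stdev

random.seed(7)

def transfer(x1, x2):
    # Hypothetical transfer function from a DOE (e.g., yield vs. temperature, pressure).
    return 50 + 2.0 * x1 - 0.5 * x2 + 0.1 * x1 * x2

# Point estimates give a single answer with no sense of spread:
print(transfer(10.0, 20.0))  # → 80.0

# Describing the X's as distributions and sampling gives the whole range:
ys = [transfer(random.gauss(10.0, 1.0), random.gauss(20.0, 3.0))
      for _ in range(50_000)]
print(f"mean={mean(ys):.1f}  sd={stdev(ys):.1f}  "
      f"range=[{min(ys):.1f}, {max(ys):.1f}]")
```

The sampled output shows not just the expected Y but how widely it varies, which is exactly what a capability calculation needs.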

Professional Development

The annual Winter Simulation Conference (WSC) starts two weeks from today. Initially as a practitioner, and later as a vendor, I have attended over 20 of these conferences, in addition to dozens of other similar events. WSC is just one of many events that you could choose to attend. But why should you attend any of them?

All such events are not identical, but here are a few attributes of WSC that are often found in other events as well:

Basic tutorials – If you are new to simulation, this is a good place to learn the basics from experienced people.

Advanced tutorials – If you already have some experience, these sessions can extend your skills into new areas.

Practitioner papers – There is no better way to find out how simulation can be applied to your applications than to explore a case study in your industry and talk to someone who may have already faced the problems you might face.

Research – Catch up on state-of-the-art research through presentations by faculty and graduate students on what they have recently accomplished.

Networking – The chance to meet with your peers and make contacts is invaluable.

Software exhibits and tutorials – If you have not yet selected a product or you want to explore new options, it is extremely convenient to have many major vendors in one place, many of whom also provide scheduled product tutorials.

Supplemental sessions – Some half and full day sessions are offered before and after the conference to enhance your skill set in a particular area.

Proceedings – A quick way to preview a session, or explore a session that you could not attend. This serves as valuable reference material that you may find yourself reaching for throughout the year.

I think every professional involved in simulation should attend WSC or an equivalent conference at least once early in their career, and then periodically every 2-3 years, perhaps rotating among similar conferences. If you want to be successful you have to keep your skills and knowledge up to date. And in today’s economy, a strong personal network can be valuable when you least expect it.

I hope to see you at WSC in Miami!

Dave Sturrock
VP Products – Simio LLC

Read My Project Report!

I read a lot, both for business and pleasure. But it seems I never have enough time. So when I sit down with a magazine, for example, most articles probably get less than a couple seconds of attention. Unless an article immediately captures my attention, I quickly move on to the next one. I know that I occasionally miss out on good content, but it is a way to cope with the volume of information that I need to process each day. Consider the implications when you are writing a project report for others to read…

We are all busy. When we are presented with information to read or review, we often don’t have time to wade through the details to see if the content merits our time.

Tell me the most important thing first! Give me the summary! How many times have you asked (or wished) for that?

At one point, it was common to give presentations by starting with an introduction, building the content, and ending with the conclusion – “the big finish”. While this is appropriate for some audiences, many people don’t want to take the time to follow such a presentation. Instead, they want to be presented with a quick overview and a concise summary first. They will then decide to read on if the overview has captured their interest and they need more information.

Think about your own experiences. When you have a document to read and you are not sure it is worth your time, what do you do? If you are like most people you will probably consider most, if not all, of the following:
• Does the title look interesting?
• Do you know/respect the author?
• Do the major headings or callouts contain content of interest?
• Do the pictures/diagrams contain content of interest?
• Does the summary or abstract make the case for reading on?
While the order and details might differ slightly, at each stage of the above process, if you are not convinced of the value of continuing, you will put the document aside. Only after the document has passed this gauntlet of tests will you spend the time to seriously read the content.

What can we learn from this?

Content is not enough. The best content in the world is of little value unless it is read.

Write each report so everyone, including your busy stakeholders, will take the time to read it. Keeping these simple suggestions in mind will help you succeed at getting your message across.

Dave Sturrock
VP Products – Simio LLC

Simulation Applications in Assembly

Assembly processes are a common part of manufacturing and can be found in applications as diverse as apparel, electronics, automotive, aerospace, and even food processing. Assembly operations share many common simulation applications with general manufacturing, but also have many unique characteristics and problems which can often be assisted using simulation.

Material handling and other automated equipment are prevalent in most assembly operations. Simulation can help both in the initial design of this equipment and in analyzing existing operations for improved efficiency.

I have found that most people think they can predict process variability fairly well, but when pressed to predict the behavior of even the simplest system, they fail miserably. This is a dangerous combination. Process variability can make the performance of typical systems hard to predict and overconfidence can lead you to incorrect decisions. Fortunately, simulation can provide extensive analysis to project performance, demystify variability, and reduce risk.

Often assemblies are made following a Bill of Material (BOM). Some simulation software has built-in BOM modeling features to make this easy. Whether your supply chain for the assembly involves only other departments in the building or involves off shore companies, simulation can help you assess the supply chain risk and design a system to meet corporate objectives.

For both manual and highly automated systems, line balancing can be a difficult task in assembly. Getting it wrong, even by a small amount, can result in an expensive loss of efficiency. Simulation can help not only in tweaking a system for optimal efficiency, but also in evaluating major changes in a safe, inexpensive, off-line environment.

Assembly operations can be capital or labor-intensive. Effective allocation of capital and labor is often a need that simulation can fulfill. Simulation can help identify bottlenecks and underutilized resources so that you can gain insight into your operations and get more out of your resources.

Markets change. Technology changes. It sometimes seems like the sole job of Marketing is to make your job miserable by introducing new productivity-damaging products. Simulation can help you respond to change requests with objective data about the cost and other impacts to your system.

It is well known that simulation technology is very effective at creating work schedules while taking into account the complexities of the facility. A few simulation products offer features to enable this application. You can even use the model built for optimizing design as the basis for a plant scheduling model.

In summary, simulation applied to assembly, as in other applications, can help streamline designs, reduce risk, improve throughput, and increase your bottom-line profitability.

Dave Sturrock
VP Products – Simio LLC

Predicting Process Variability

Systems rarely perform exactly as predicted. A person doing a task may take six minutes one time and eight minutes the next. Sometimes variability is due to outside forces, like materials that behave differently based on ambient humidity. Some variability is fairly predictable, such as a tool that cuts more slowly as it gets dull with use. Other variability seems much more random, such as a machine that fails every now and then. Collectively we will refer to these as process variability.

How good are you at predicting the impact of process variability? Most people feel that they are fairly good at it.

For example, if someone asked you the probability of rolling a three in one roll of a common six-sided die, you could probably correctly answer one in six (17%). Likewise, you could probably answer the likelihood of flipping a coin twice and having it come up heads both times: one in four (25%).

But what about even slightly more complex systems? Say you have a single teller at a bank who always serves customers in exactly 55 seconds and customers come in exactly 60 seconds apart. Can you predict the average customer waiting time? I am always surprised at how many professionals get even this simple prediction wrong. (If you want to check your answer, look to the comment attached to this article.)

But let’s say that those times above are variable as they might be in a more typical system. Assume that they are average processing times (using exponential distributions for simplicity). Does that make a difference? Does that change your answer? Do you think the average customer would wait at all? Would he wait less than a minute? Less than 2 minutes? Less than 5 minutes? Less than 10 minutes? I have posed this problem many times to many groups and in an average group of 40 professionals, it is rare for even one person to answer these questions correctly.

This is not a tough problem. In fact this problem is trivial compared to even the smallest, simplest manufacturing system. And yet those same people will look at a work group or line containing five machines and feel confident that they can predict how a random downtime will impact overall system performance. Now extend that out to a typical system with all its variability in processing times, equipment failures, repair times, material arrivals, and all the other common variability. Can anyone predict its performance? Can anyone predict the impact of a change?

With the help of simulation, you can.

This simple problem can be easily solved with either queuing theory or a simple model in your favorite simulation program. More complex problems will require simulation. After using your intuition to guess the answer, I’d suggest that you determine the correct answer for yourself. If you want to check your answer look at the comment attached to this article.
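After you have made your guess, a few lines like the following will check it. This sketch assumes, as above, exponential interarrival gaps and service times, and generates waiting times directly with the Lindley recursion W[n+1] = max(0, W[n] + S[n] − A[n+1]); no simulation package is required for a system this small.

```python
import random

def avg_wait(n_customers=200_000, mean_gap=60.0, mean_service=55.0, seed=3):
    """Average wait (seconds) at a single-server queue with exponential
    interarrival gaps and service times, via the Lindley recursion."""
    random.seed(seed)
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        total += wait
        service = random.expovariate(1.0 / mean_service)
        gap = random.expovariate(1.0 / mean_gap)
        # Next customer's wait: W[n+1] = max(0, W[n] + S[n] - A[n+1])
        wait = max(0.0, wait + service - gap)
    return total / n_customers

print(f"average wait ≈ {avg_wait() / 60:.1f} minutes")
```

With constant 55-second services and 60-second gaps the wait would be zero; run the code to see how dramatically the exponential variability changes that.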

And the next time you or someone you know is tempted to predict system performance, I hope you will remember how well you did at predicting performance of a trivial system. Then use simulation for an accurate answer.

Dave Sturrock
VP Products – Simio LLC

Simulation in Agriculture

Guest article from Sophie Scotts

Over the past several months you have touched on many fields that would benefit from simulation, such as healthcare and disaster management. I would like now to recall something you said in your “Simulation Expertise through Tours” blog from September: “Don’t limit yourself to just your area of interest/expertise. Often you can learn even more from tours outside your comfort zone.” I think for many professionals in the simulation industry, applying simulation to the field of agriculture might be outside your expertise or comfort zone, but don’t let this stop you.

Since I work for the United States Department of Agriculture (USDA), I see firsthand how beneficial simulation could be to American farmers. Nowadays farmers must be both laborers and savvy businessmen in order to survive in the current economy. It isn’t just milking old Bessie in the barn anymore; they must consider how each area of the farm affects the bottom line, just like any business. Farmers must look at the efficiency of their livestock and harvesting processes, and at the possibility of diversification, in order to stay in business, and simulation could help in each of these areas.

Any farm that has livestock faces three main questions: How do I efficiently get livestock onto my farm? How do I efficiently get food to my livestock? And how do I efficiently use (or dispose of) the waste? A dairy must also consider the most efficient method of milking the cows. For instance, a poultry facility will house several thousand chickens a year for a few months each. During each cycle the chicks are trucked in, food is trucked in (or harvested from the fields), the chickens are provided a specified amount of food and space, then they are trucked out (full grown), and the wastes are trucked out so the nutrients can be utilized elsewhere. This process could benefit from simulation to identify the most efficient scenario.

It is very common now for farmers to turn to non-traditional methods of bringing income onto the farm. One of these methods is to market their goods directly to the public through farmers markets, community supported agriculture (CSA), or stores opened on-property. They must ask themselves: How do I efficiently transport my products to the farmers market? How do I efficiently package and deliver my products to my customers? How do I handle parking and lines in my store? Simulating each of these processes would allow the farmer to make an informed decision on the best management of his business.

So you can see that simulation can have a place in even the most unlikely fields (literally). American farms are a business and thus need to consider the efficiency of processes they undertake in order to meet the bottom line, and simulation can help. So don’t be afraid to think outside of the box and your area of expertise.

Sophie Scotts
United States Department of Agriculture

Help Wanted

Yes, it looks like hard economic times may be coming. But no, this has nothing to do with that.

This blog is a community service. To continue to be effective, we need community participation. That means you.

There are many ways you can participate.

1) Comment – At the end of each article is a link. Click it and add to the discussion. Agree. Disagree. Add new information or a different viewpoint. All civil discussion is welcome.
2) Suggest Topics – Contact me with any ideas you have about future content or ideas for making the blog more useful.
3) Write an Article – It doesn’t have to be rocket science. Nor does it have to be long or formal. Everyone has something to share. The main rule is to keep it unbiased and non-commercial. I am happy to edit it if you like and even publish it under a pen name if you are publicity shy (although I strongly prefer using your real name).
4) Become a Guest Author – I would like nothing better than to “share the limelight” with others. You can write one article or regular articles. Choose your own topics and frequency.

It’s all about sharing to help the simulation community. This is a simple way to give back. Anyone can do it. For any of the above or other ideas, you can contact me using dsturrock at Simio dot biz (name slightly obscured to slow down spammers).

Thanks for your help.

Dave Sturrock
VP Products – Simio LLC

Data Collection Basics Part 2

Last week in Data Collection Basics (Part 1) I discussed data collection, introducing the topics of identifying the required data and then locating or creating it. Once you have some data, you typically need to do some analysis before you can use that data effectively.

Select Distribution. Typically input data to a simulation model is specified as a distribution. If you have estimated data you must select the most appropriate distribution (for example a minimum time, typical time, and maximum time may be represented as a Triangular distribution). If you have actual data, then you will need to run a statistical analysis on it. Many software products (some generic and some simulation-specific) are available to help you with selecting (fitting) a distribution and its shape parameters, and even with cleaning the data to eliminate bad observations.
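For the estimated-data case, the minimum/most-likely/maximum triple maps directly onto a triangular distribution; a quick sketch (the times are invented):

```python
import random
from statistics import mean

random.seed(11)
# Estimated process time (invented): minimum 4, most likely 6, maximum 11 minutes.
# Note that random.triangular takes its arguments as (low, high, mode).
times = [random.triangular(4.0, 11.0, 6.0) for _ in range(100_000)]
print(f"sample mean = {mean(times):.2f} minutes (theory: (4 + 6 + 11) / 3 = 7.00)")
```

Sampling the distribution rather than plugging in the most-likely value matters: here the mean (7.00) is a full minute above the "typical" time of 6 because the distribution is skewed.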

Analyze Sensitivity. Once you have some data you can build it into your model and start making trial runs. Particularly if you have relied on an estimate, you might want to run your model with values above and below the estimated values to determine system sensitivity to that parameter. If you find that the system is sensitive to an estimated value (e.g. the results change significantly with a change to the input parameter), then you can determine if it is worth a greater investment to obtain a more reliable value. This is one potential solution to the problems of bias and inaccuracy discussed in the initial article. But more than that, it is also a good way to iteratively determine how much time to spend on your input data.
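Such a sensitivity sweep can be as simple as rerunning the model at values bracketing the estimate. In this sketch, `run_model` is a hypothetical stand-in for a real simulation run, and all numbers are invented:

```python
def run_model(process_time_min):
    """Hypothetical stand-in for a full simulation run; returns parts per hour.
    Replace the body with a call into your real model."""
    return 60.0 / process_time_min * 0.95  # assume 95% availability (invented)

estimate = 6.0  # estimated process time, minutes
for factor in (0.8, 1.0, 1.2):  # -20%, baseline, +20% around the estimate
    value = estimate * factor
    print(f"process time {value:4.1f} min -> throughput {run_model(value):5.2f}/hr")
```

If the three outputs differ by more than you can tolerate, the parameter is worth the cost of better data; if they are nearly identical, the estimate is good enough.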

Adjust Detail. Sometimes the quality of the available data can help you determine the appropriate level of detail for a model. If the data you intend to use is not very good, then there is little point to building a highly detailed model. This is not to imply that such a model is of no value, after all every model is just a representation or estimate of reality – no model will be perfect. But it is important to represent to your stakeholders the relative accuracy of the model and its underlying data.

This was a quick overview of some steps to data collection. Whole textbook chapters have been written about each of these, so be sure to look for greater detail when you are ready.

Dave Sturrock
VP Products – Simio LLC

Simulation and Disaster Management

While the last couple months have been pretty dry where I live here in the Northeastern part of the U.S., in the Southeastern part several severe hurricanes have already hit and it looks like more are coming. While every severe storm can have serious consequences, often the major difference between a severe storm and an outright disaster is the level of preparation.

Of course weather is just one of many potential causes of disasters. We have all seen floods, fires, earthquakes, and other disasters around the world that have been made much worse through inadequate planning and poor execution. Simulation can play a major role in preparing communities to avoid or at least reduce the impact of such disasters.

More accurate weather prediction is due in part to simulation. Combining advanced detection technology with sophisticated simulations has allowed us to become much better at predicting storm paths and severity. This allows for improved warnings and appropriate responses.

Simulation use in evacuation planning has a very high potential, but is not used as much as it could be. Communities should be able to examine various scenarios and evaluate the best ways to move people to safety, well before a dangerous situation actually occurs.

First-responder rescue efforts can also be pre-planned and evaluated. Where should various types of equipment be stored? How can it be moved? Who will staff it? What procedures should be used for various types of disasters?

As for relief scenarios, they too can be planned ahead of time with the assistance of simulation. What equipment and supplies should be stockpiled, and where? How can they be quickly relocated? Who will staff the effort? The logistics of a large-scale disaster-relief effort, including health care provisions, security at all levels, and even communications (all of which often involve multi-organization coordination), are a great opportunity to showcase the true benefits of using simulation.

Large corporations and other organizations can also do their own simulation-based planning. Contingency plans for various scenarios can minimize the impact of a local or regional event and help ensure that a single event does not cripple the entire organization.

Louisiana State University has a relatively new center for disaster management and has organized a conference November 16-18 dealing with some of these issues.

“Be Prepared” is a motto that anyone planning for a disaster should live by; simulation helps make that a bit easier.

Dave Sturrock
VP Products – Simio LLC