Scheduling in Industry 4.0

Today started badly.

As soon as I hopped into my car, the GPS system was flashing red to show queues of stationary traffic on my regular route to the office. Thankfully, the alternative offered allowed me to arrive on time and keep my scheduled appointments.

In the same way that a GPS combines live traffic data with an accurate map of the city, Simio Software connects real-time data sources with a modeled production situation. Just like a GPS, Simio can also impose rules, make decisions, and schedule and reschedule.

The major difference is in the scale.

Simio Simulation and Scheduling Software can model entire factories, holding huge quantities of detailed data about each resource, component and material. It leverages big data analysis to run thousands of permutations of scenarios, finding the optimum outcomes for specific circumstances. Lightning fast, it can detect and respond to changes with suggestions that will keep everything flowing in the best possible way.

Thank goodness for Simio, because Industry 4.0 is here.

Smart Factories employ fully integrated and connected equipment and people, each providing real-time feedback about their state. Data is constantly collected on each product component for process monitoring and control. Every aspect of the entire operation is managed through its associated specifications and status data. This large, constant stream of information coming from a known factory configuration can be received, stored, processed and reported upon by the powerful Simio software.

With Industry 4.0, nothing is left to chance. Everything is monitored and optimized, and performance is predicted, measured, improved and adapted on an ongoing basis. Management of so many interconnected components requires a scheduling system that is specifically designed to operate in this dynamic data environment. Simio Production Scheduling Software can be relied upon to provide the integrated solution for enabling technology in the Smart Factories of the future.

We are already seeing a rise in robotics and the increasing digitalization of the manufacturing industry under the effects of Industry 4.0. Soon all components of the factory model will be interconnected, just like my future driverless car that will communicate directly with my GPS to take the best route using current traffic information.

All I will have to do is sit back and enjoy the ride.

Why Simulation is Important in a Tough Economy

Everyone wants to cut costs. No one wants to spend unnecessarily. When budgets are tight, software and software projects are an easy place to cut. Staff positions like Industrial Engineers are sometimes easier to cut or redeploy than production jobs. I suggest that following this reasoning to eliminate simulation projects is often short-sighted and may end up costing much more than it saves. Here are a few reasons why it may make sense to increase your simulation work now.

1) Minimize your spending. Cash is tight. You cannot afford to waste a single dollar. But how do you really know what is a good investment? Simulate to ensure that you really need what you are purchasing. A frequent result of simulations intended to justify purchases is to find that the purchases are NOT justified, and that the objectives can in fact be met by using existing equipment more effectively. A simulation may save hundreds of times its cost with immediate payback.

2) Optimize use of what you have. Could you use a reduction in cost? Would it be useful to improve customer satisfaction? I assume that your answer would always be yes, but even more so in difficult times. But how can you get better, particularly with minimal investment? Simulation is a proven way to find bottlenecks and identify often low-cost opportunities to improve your operation.

3) Control change. In a down economy you are often using your facilities in new and creative ways – perhaps running lean or producing products in new ways or in new places. But how do you know these new and creative endeavors will actually work? How do you know they will not cost you even more than you save? Simulation helps you discover hidden interactions that can cause big problems. Different is not always better. Simulate first to avoid costly mistakes.

4) Retain/improve your talent pool. Some people who might otherwise be laid off may have the skills to be part of a simulation SWAT team. By letting them participate in simulation projects, they will likely achieve enough cost reduction and productivity improvements that they more than pay for themselves. As an added bonus, the team will learn much about your systems, the people, and communication – knowledge which will improve their value and contributions long after the project is complete.

5) Reduce risk. You are often forced to make changes. How do you know they are the right changes? Will a little more, a little less, or a different approach yield better results? How do you measure? A strength of simulation is its ability to objectively assess various approaches and configurations. Substitute objective criteria for a “best guess”, and, in turn, reduce the risk associated with those changes. In a down economy it is more important than ever that you don’t make mistakes.

In summary, rather than thinking of the cost of simulation, you should think of what the investment in simulation today will save you today, tomorrow and every day following. Simulation is not a cost, it is an investment that may return one of the best ROIs available in a tough economy.

Dave Sturrock
VP Products – Simio LLC

Six Sigma and Simulation: Part 2

By Jeff Joines (Associate Professor In Textile Engineering at NCSU)

This is the second of the three-part series on Six Sigma, Lean Sigma, and Simulation. The first part explained the Six Sigma methodologies. Recall that the goal of the DMAIC continuous improvement methodology is to control and reduce the process variability of a current process or product, while the Design for Six Sigma process (DMADV) is used to design a new process or product with minimal variability before creation. Simulation modeling can be employed in almost every phase of either methodology.


Six Sigma practitioners typically have to estimate the cost savings of each project, either to justify the project or to become certified. However, most of these cost forecasts are built on point estimates of key parameters (i.e., raw material cost, customer/product demand, cost of capital, currency rates, etc.). By employing Monte Carlo simulation, variability and/or ranges on these point estimates can be incorporated to provide a more reliable estimate. Along the same lines, when several projects have been proposed, simulation can help management perform project selection based on resource constraints and objectives.
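The idea can be sketched with a small Monte Carlo model that replaces point estimates with distributions. All of the distributions and numbers below are illustrative assumptions, not figures from any real project:

```python
import random
import statistics

def simulate_annual_savings(trials=50_000, seed=1):
    """Monte Carlo estimate of a project's annual savings when the key
    inputs are uncertain ranges rather than single point estimates."""
    random.seed(seed)
    outcomes = []
    for _ in range(trials):
        # Demand is uncertain: low, high, and most-likely values.
        demand = random.triangular(8_000, 14_000, 10_000)  # units/year
        # Savings per unit is only known to within a range.
        saving_per_unit = random.uniform(0.80, 1.40)       # dollars/unit
        outcomes.append(demand * saving_per_unit)
    outcomes.sort()
    mean = statistics.fmean(outcomes)
    p5 = outcomes[int(0.05 * trials)]
    p95 = outcomes[int(0.95 * trials)]
    return mean, p5, p95
```

A point estimate would report a single number (say, 10,000 units at $1.10 saved per unit); the simulation instead reports an expected value together with a credible range, which is a more reliable basis for comparing and selecting projects.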

Analyze and Improve

During the Analyze and Improve phases, Design of Experiments (full, fractional, mixed, etc.) is the most common tool. It provides a baseline to illustrate improvement when changes are made, as well as identifying the factors of interest to control or change. The usual baseline measure is the process capability index (Cpk), an indication of the ability of a process to produce consistent results: the ratio between the permissible spread and the actual spread of a process. The Cpk index takes off-centeredness into account and is defined as the minimum of (USL - Mean)/3σ and (Mean - LSL)/3σ, where USL and LSL are the upper and lower specification limits. A Six Sigma process is normally distributed with a Cpk value greater than 1.5.
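The Cpk calculation is straightforward to code directly from the definition. The sketch below uses the sample mean and sample standard deviation; the data and specification limits in the usage are illustrative:

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: the lesser of the upper and lower
    one-sided capabilities, so an off-center process scores lower."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    return min((usl - mean) / (3 * sigma),
               (mean - lsl) / (3 * sigma))
```

For a perfectly centered process the two one-sided terms are equal; moving the mean toward either specification limit reduces the minimum, which is exactly the off-centeredness penalty described above.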

Using the real system is better in terms of capturing all of its complexities, interactions, etc. However, as simulation practitioners, we recognize that using the real system is not always possible or viable. The following are examples where simulation modeling, in the form of Monte Carlo or process simulation, can be used.

  • If the product or process does not exist, as is the case in Design for Six Sigma, simulation models can be used to ascertain the capability of a new process or product before implementation. For example, the tolerance stack-up of individual parts or processes can be determined. Parts or processes that are within tolerance individually (e.g., a bearing and a shaft) may still yield an assembly that is not capable, owing to the tolerance stack-up problem, which occurs in manufacturing, service, and transactional processes.
  • The cost of performing a DOE with replications is too high (e.g., raw material cost, cost of shutting down the current process). We have worked with companies to develop process and Monte Carlo simulation models that determine current capability and ascertain the potential improvement from proposed changes.
  • The time required to run the set of experiments makes it impractical to determine the baseline or to ascertain the improvements to a process. While working with a large company's Six Sigma process improvement team on a complex global supply chain, one of their projects was to reduce inventories for a series of products with ten- to twelve-week lead times. The team had to evaluate six inventory policies, identify which of three suppliers was best, etc. A DOE with sufficient replications would have taken years to complete, making the project useless without the simulation model. Also, most of the data driving the model was based on lead times, which are not normally distributed.
  • Think of systems in which multiple processes feed one another (e.g., departments, plants, etc.), each containing only five or six factors. Transfer functions can be generated from a traditional DOE for each individual process, but not for the entire system. A simulation model can combine the individual transfer functions to determine the capability of the whole system, as well as to test a wider range of values.
  • There are several environments where performing a DOE is impractical or impossible. For example, we have trained dozens of people associated with hospital systems from around the country in Six Sigma. Simulation modeling and analysis allows these practitioners to ascertain process capability with a model, because the real system cannot be used when patient care is at stake. Other environments where we have used simulation modeling instead of the real system are transactional processes, such as those in the banking and insurance industries.
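The tolerance stack-up case in the first bullet above can be sketched with a few lines of Monte Carlo simulation. The shaft and bore dimensions below are illustrative assumptions, chosen only to show how in-tolerance parts can still produce a failing assembly:

```python
import random

def interference_rate(trials=100_000, seed=7):
    """Monte Carlo tolerance stack-up for a shaft assembled into a
    bearing bore. Each part is produced within its own tolerance, yet
    the assembly clearance (bore - shaft) can still fall out of spec."""
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        bore = random.gauss(25.02, 0.01)   # mm, bearing inner diameter
        shaft = random.gauss(25.00, 0.01)  # mm, shaft outer diameter
        if bore - shaft <= 0:              # no clearance: shaft will not fit
            failures += 1
    return failures / trials
```

With these assumed numbers the clearance has a mean of 0.02 mm but a standard deviation of about 0.014 mm (the two part variances add), so roughly 8% of assemblies fail even though every individual part is in tolerance.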


Simulation can also be used as a process control aid as the process is being implemented to determine potential problems.

Hopefully it is apparent that simulation experts already possess skills that can greatly help Six Sigma projects. These projects are not unique; they are just the kinds of simulation models we already know how to build. They only require us to learn the Six Sigma language and to calculate Cpk statistics. I find it easier to work with Six Sigma people because their statistical training prepares them to understand input and output analysis, even though they typically have used only the normal distribution. In Six Sigma and Simulation: Part 3, the use of simulation in the Lean Sigma world will be addressed.

Six Sigma and Simulation: Part 1

By Jeff Joines (Associate Professor In Textile Engineering at NCSU)

This is a three-part series on Six Sigma, Lean Sigma, and Simulation. The first post will explain the Six Sigma methodology and the bridge to simulation modeling and analysis, while the second and third parts will describe the uses of simulation in the Six Sigma phases and in Lean Sigma (i.e., Lean Manufacturing), respectively.

“Systems rarely perform exactly as predicted” was the opening line of the blog Predicting Process Variability, and it is the driving force behind most improvement projects. Variability is inherent in all processes, whether they involve manufacturing a product within a plant, producing product through an entire supply chain, or providing a service in a retail, banking, entertainment, or hospital environment. If one could predict or eliminate the variability of a process or product, there would be no waste (or muda, in the Lean world, which will be discussed in the third part): no overtime to finish an order, no lost sales owing to having the wrong inventory or lengthy lead times, no deaths owing to errors in health care, and shorter lead times overall, all of which ultimately reduces costs. For any organization, manufacturing or service, reducing costs and lead times is, or should be, a priority in order to compete globally. Reducing, controlling, and/or eliminating the variability in a process is key to minimizing costs.

Six Sigma is a business philosophy focused on continuous improvement to reduce and eliminate variability. In a service or manufacturing environment, a Six Sigma (6σ) process is virtually defect free, allowing only 3.4 defects per million operations of a process. Most companies, however, operate at about four sigma, which allows roughly 6,200 defects per million. Six Sigma began in the 1980s when Motorola set out to reduce the number of defects in its own products. Motorola identified ways to cut waste, improve quality, reduce production time and costs, and focus on how products were designed and made. Six Sigma grew from this proactive initiative of using exact measurements to anticipate problem areas. In 1988, Motorola was selected as the first large manufacturing company to win the Malcolm Baldrige National Quality Award. As a result, Motorola's methodologies gained visibility, and soon its suppliers were encouraged to adopt 6σ practices. Today, companies that use the Six Sigma methodology achieve significant cost reductions.
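The defect figures above follow directly from the normal tail probability, using the convention that the process mean drifts by 1.5σ in the long term. A quick sketch of the calculation:

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities at a given sigma level,
    applying the conventional 1.5-sigma long-term shift of the mean."""
    z = sigma_level - shift
    # Upper-tail probability P(Z > z) for a standard normal variable.
    upper_tail = 0.5 * math.erfc(z / math.sqrt(2))
    return upper_tail * 1_000_000
```

Here dpmo(6) comes out to about 3.4 defects per million, while dpmo(4) is about 6,210, which is where the four-sigma figure quoted above comes from.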

Six Sigma evolved from other quality initiatives, such as ISO, Total Quality Management (TQM), and Baldrige, to become a quality standardization process based on hard data rather than hunches or gut feelings (hence the mathematical term, Six Sigma). Six Sigma utilizes a host of traditional statistical tools but encompasses them within a process improvement framework. These tools include affinity diagrams, cause-and-effect diagrams, failure modes and effects analysis (FMEA), Poka-Yoke (mistake proofing), survey analysis (voice of the customer), design of experiments (DOE), capability analysis, measurement system analysis, statistical process control charts and plans, etc.

There are two basic Six Sigma processes, DMAIC and DMADV. Both take data-intensive approaches to decisions and improvements, eliminating reliance on gut feel or intuition. The DMAIC process is used when the product or process already exists but is not meeting specifications or performing adequately; its phases are as follows.

    Define: identify, prioritize, and select the right projects; once a project is selected, define its goals and deliverables.
    Measure the key product characteristics and process parameters to create a baseline.
    Analyze and identify the key process determinants or root causes of the variability.
    Improve and optimize performance by eliminating defects.
    Control the current gains and future process performance.

If the process or product does not exist and needs to be developed, the Design for Six Sigma (DFSS) process, DMADV, is employed. Processes or products designed with DMADV typically reach market sooner, have less rework, cost less, etc. Although DMADV resembles DMAIC and starts with the same three phase names, the phases are quite different, as defined below.

    Define: identify, prioritize, and select the right projects; once a project is selected, define its goals and deliverables.
    Measure and determine customer needs and specifications through the voice of the customer.
    Analyze and identify the process options necessary to meet the customer needs.
    Design a detailed process or product to meet the customer needs.
    Verify the design's performance and its ability to meet customer needs, where the customer may be internal or external to the organization.

Both processes are iterative, feeding back from a later stage to an earlier one. For example, if during the Analyze phase you determine that a key input is not being measured, new metrics have to be defined; likewise, new projects can be defined once the Control phase is reached.

Now that we have defined Six Sigma, you may be wondering what the bridge to computer simulation and modeling is. Simulation modeling and analysis is just another tool in the Six Sigma toolbox. Many of the statistical tools (e.g., DOE) try to describe the dependent variables (Y's) in terms of the independent variables (X's) in order to improve them. Most of these tools are also parametric methods: they rely on the data being normally distributed, or they invoke the central limit theorem to make the data appear normally distributed. In many situations the traditional tools may produce sub-optimal results or cannot be used at all. For example, if one is designing a new process or product, the system does not exist, so current capability or future performance cannot be measured directly. Likewise, the complexity and uncertainty of some processes cannot be analyzed using traditional methods. Simulation modeling and analysis makes none of these assumptions and can yield a more realistic range of results, especially where the independent variables (X's) are described as distributions of values. In Six Sigma and Simulation: Part 2, a more detailed look at how simulation is used in the two Six Sigma processes (DMAIC and DMADV) will be presented.
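The point about non-normal inputs can be sketched with a small model. Here an order's lead time is the sum of three right-skewed (lognormal) stage times; the parameters are illustrative assumptions, not data from any real system:

```python
import random
import statistics

def leadtime_summary(trials=50_000, seed=3):
    """Order lead time as the sum of three lognormally distributed
    stage times. A normal approximation built from the mean alone
    would understate the long upper tail that drives late orders."""
    random.seed(seed)
    totals = sorted(
        sum(random.lognormvariate(1.0, 0.5) for _ in range(3))
        for _ in range(trials)
    )
    mean = statistics.fmean(totals)
    p95 = totals[int(0.95 * trials)]  # 95th-percentile lead time
    return mean, p95
```

Because the input distributions are skewed, the 95th percentile sits well above the mean; a planner working only from the average lead time would routinely promise dates the process cannot keep, which is exactly the kind of insight the parametric tools can miss.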