One common validation technique is to start with a model of the existing system (assuming the real system exists) and compare the results of this "as is" model against the performance of the real system. A stochastic comparison might take a representative period (e.g., 30 days or 30 weeks) and compare the average results over that period. Another approach is to make the model as deterministic as feasible (e.g., use exact entity arrival times, exact failure data, etc.) and compare the results for that shorter period. Each of these approaches is valuable in its own way. In both cases you strive to identify and explain any significant differences.
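As a minimal sketch of the stochastic comparison described above, the Python fragment below compares the 30-day average throughput of an "as is" model against the real system and flags any gap above a chosen tolerance. The data values, function name, and 5% tolerance are all illustrative assumptions, not taken from the original text.

```python
import statistics

def compare_as_is(real_daily, model_daily, tolerance_pct=5.0):
    """Compare average daily throughput of the real system against the
    'as is' model; flag the gap if it exceeds the tolerance (assumed 5%)."""
    real_avg = statistics.mean(real_daily)
    model_avg = statistics.mean(model_daily)
    diff_pct = abs(model_avg - real_avg) / real_avg * 100
    return real_avg, model_avg, diff_pct, diff_pct <= tolerance_pct

# Hypothetical 30-day throughput observations (units/day)
real = [412, 398, 405, 420, 391, 415, 408, 402, 417, 399,
        410, 404, 395, 421, 407, 400, 413, 396, 409, 418,
        403, 411, 397, 406, 414, 401, 419, 405, 400, 412]
model = [r + 3 for r in real]  # stand-in for the model's daily output

real_avg, model_avg, diff_pct, ok = compare_as_is(real, model)
print(f"real {real_avg:.1f}, model {model_avg:.1f}, "
      f"gap {diff_pct:.2f}% -> {'OK' if ok else 'investigate'}")
```

A gap flagged by a check like this is the starting point for discussion, not a verdict: the goal stated above is to identify and then explain any significant difference.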
Another validation technique is to use the experience of your stakeholders. They know the system well and should be able to watch an animation and provide some measure of confidence. You should also give them the opportunity to see the model perform under a wide variety of situations, such as high volume, low volume, or recovering from a failure. Ideally, stakeholders should even be able to create such situations themselves, e.g., "I want to see Machine A fail …now."
While a single stakeholder can provide valuable insight, a group of stakeholders from different backgrounds can provide even greater value. Perhaps an engineer might say "Yes, you captured the design exactly as I described it" to which an operator might respond "Maybe so, but we would never actually do it that way. Here's how we would run it…". At that point the simulation is already providing significant value as a communication tool. Your role in the remainder of that meeting is to facilitate the discussion and take notes.
During the experimentation phase you will be generating the scenarios identified in the functional specification. Most likely, you will also need a few additional scenarios based on what you have learned as the project has progressed. The details of the statistical analysis are beyond the scope of this paper, but proper statistical analysis is critical. See the additional reading section for thorough treatments of appropriate experimentation and statistical analysis.
As with all other portions of the project, make sure you provide enough time in the schedule for experimentation and analysis. If you fall behind in the model building, verification, or validation phases, you may find yourself in a time crunch for the analysis. Keep in mind that the reason for doing the simulation project is typically to analyze various scenarios, so plan accordingly and protect the time scheduled for the final analysis phase.
Your primary goal should be to help your stakeholders make the best decision possible given the time and resources allocated. While you might have other personal goals such as to build credibility or make a profit, it is likely those goals will be met if you concentrate on helping the stakeholders.
Consider the background and particular needs of each stakeholder before creating your report. Although you are probably proud of your model and the detailed way in which you solved complex problems, few stakeholders will share that interest. Most stakeholders are interested in three things: first, what alternatives were considered; second, what you conclude or recommend; and third, what supporting information you can provide to merit their confidence in your analysis.
Although you need to have data to support your conclusions, do not overwhelm your stakeholders with too many details. Try to provide information in the context needed. For example, instead of simply stating "Average driver utilization was 76%", you might say "Since the average driver utilization is high (76%), there is inadequate slack time to catch up during peak periods without causing line delays."
Don't over-represent the accuracy of the output data. Acknowledge, and even emphasize to the stakeholders, that the model is an approximation and will not generate exact answers. Display your data with precision appropriate to the accuracy of your data and modeling assumptions (e.g., 76.2%, not 76.2315738%), and convey the accuracy of your numbers when possible. Most stakeholders can relate to a confidence interval like 76.2% ± 1.3%.
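As a sketch of reporting a confidence interval at a precision the data can support, the Python fragment below computes a 95% interval from hypothetical replication results. The utilization numbers are invented, and the Student's t critical value for 9 degrees of freedom is hardcoded (an assumption, to keep the example dependency-free) rather than looked up from a statistics library.

```python
import math
import statistics

def ci_half_width(samples, t_crit):
    """Half-width of a confidence interval: t * s / sqrt(n)."""
    return t_crit * statistics.stdev(samples) / math.sqrt(len(samples))

# Hypothetical driver utilization (%) from 10 independent replications
reps = [75.8, 77.1, 76.5, 75.2, 76.9, 76.0, 77.4, 75.6, 76.3, 76.2]
T_95_9DF = 2.262  # Student's t, two-sided 95%, 9 degrees of freedom

mean = statistics.mean(reps)
hw = ci_half_width(reps, T_95_9DF)

# Report at a precision the data can support, e.g. "76.3% ± 0.5%"
print(f"Average driver utilization: {mean:.1f}% ± {hw:.1f}%")
```

Note that the one-decimal formatting does the "appropriate precision" work: the raw mean and half-width carry many more digits than the ten replications can justify.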
In spite of what you might have heard, doing simulation projects well is not easy. There are many ways that even an experienced simulationist can fail. In this paper we have discussed some common traps and ways to avoid them. While following these suggestions will not guarantee a bull's-eye, it will certainly improve your chances of hitting the target.
Banks, J., J. S. Carson, B. L. Nelson, and D. M. Nicol. 2010. Discrete-Event System Simulation. 5th ed. Upper Saddle River, New Jersey: Prentice-Hall, Inc.
Law, A. M. 2007. Simulation Modeling and Analysis. 4th ed. New York: McGraw-Hill, Inc.
Sadowski, D. A. and M. R. Grabau. 1999. Tips for Successful Practice of Simulation. In Proceedings of the 1999 Winter Simulation Conference, ed. P. A. Farrington, H. B. Nembhard, D. T. Sturrock, and G. W. Evans, 60-66. Piscataway, New Jersey: Institute of Electrical and Electronics Engineers, Inc.
Sturrock, D. T. Success in Simulation. Ongoing blog and discussion.
DAVID T. STURROCK is Vice President of Operations for Simio LLC. He graduated from the Pennsylvania State University in Industrial Engineering. He has over 25 years' experience in the simulation field and has applied simulation techniques in the areas of transportation systems, scheduling, plant layout, call centers, capacity analysis, process design, health care, packaging systems, and real-time control. He is co-author of a leading simulation textbook and teaches simulation at the University of Pittsburgh. In his present role at Simio he is responsible for development, support, and services for their simulation and scheduling product suite. His email is email@example.com.