Don’t Waste Your Time with a Functional Spec

Recently I was called in as an independent third party in a dispute between a modeler and a stakeholder. The stakeholder said “I have significant experience in both my application and modeling and I know what I want, but I am not getting it.” The modeler said “I have been modeling for 30 years and I know exactly what the stakeholder needs, but he just won’t listen to me!”

It was obvious that they weren’t communicating well, but not so obvious why two such highly experienced people were at such odds. So my first question was “What was written in the functional specification?” You might guess the answer … “What functional specification?”

My second question was “Well then, what did the contract say?” The answer again was unfortunately along the lines of “I’ll give you $x to model this” (refer to a recent blog of mine on this topic).

So in hindsight it is pretty easy to see where the misunderstanding came from. They had not agreed on model scope, approach, animation, units of measure, or even basic modeling objectives! Of course that leaves lots of room for experienced professionals to interpret the problem in totally different ways and end up with totally different approaches.

Many people think that doing a functional specification (FS) is a waste of time. But an FS is rarely extra work. Rather, it is work that must be done at some point, and if it is done early it can have tremendous positive impact on project success. An FS is almost never a waste of time. Even if the project is cancelled as a result of the FS, it is better to have “wasted” a few hours on it than to have wasted significantly more time on the project before learning enough to cancel it.

So who was right? I don’t even need to discuss the technical merits. From my perspective it comes down to two things:

1) A modeler who embarks on a journey with little clue where he is heading (that is, without an FS) is setting himself up for failure. An experienced modeler should know better.
2) While it is certainly the responsibility of the modeler to attempt to educate and persuade a client of the best approach, ultimately the customer is the one who decides if the project is successful, not the modeler. So in the end, the customer is always right.

Happy Modeling!

How Much Data Do I Need?

I have discussed data issues in several previous articles. People are often confused about how much data they really need. In particular, I frequently hear the refrain “Simulation requires so much data, but I don’t have enough data to feed it.” So let’s examine a situation where you have, say, 40% of the data you would like to have in order to make a sound decision, and let’s look at the choices.

1) You can possibly defer the decision. In many cases no decision is a decision in itself because the decision will get made by the situation or by others involved. But if you truly do have the opportunity to wait and collect more data before making the decision, then you must measure the cost of waiting against the potential better decision that you might make with better data. But either way, after waiting you still have all of the following options available.

2) Use “seat of the pants” judgment and just decide based on what you know. This approach compounds the lack of data by also ignoring problem complexity and ignoring any analytic approach. (Ironically enough this approach often ignores the data you do have.) You make a totally subjective call, often heavily biased by politics. There is no doubt that some highly experienced people can make judgment calls that are fairly good. But it is also true that many judgment calls turn out to be poor and could have benefited greatly from a more analytical and objective approach.

3) Use a spreadsheet or other analytical approach that doesn’t require so much data. On the surface this sounds like a good idea and in fact, there is a set of problems for which spreadsheets are certainly the best (or at least an appropriate) choice. But for the modeling problems we typically come across, spreadsheets have two very significant limitations: they cannot deal with system complexity and they cannot adequately deal with system variability. With this approach you are simply “wishing away” the need for the missing data. You are not only making the decision without that data, but you are pretending that the missing data is not important to your decision. An oversimplified model that doesn’t consider variability or system complexity and ignores the missing data … doesn’t sound like the makings of a good decision.

4) Simulate with the data you have. No model is ever perfect. Your intent is generally to build a model that meets your project objectives to the best of your ability given the time, resources, and data available. We can probably all agree that better and more complete data results in a more accurate, complete, and robust model. But model value is not true/false (valuable or worthless); rather, it is a graduated scale of increasing value. Referring back to that variability problem, it is much better to model with estimates of variability than to just use a constant. Likewise, a model based on 40% of the data won’t provide nearly the results of one with all of the desired data, but it will still outperform the analytical techniques that are not only missing that same data, but are also missing the system complexity and variability.

And unlike the other approaches, simulation does not ignore the missing data; it can even help you identify its impact and prioritize the opportunities to collect more data. For example, some products have features that will help you assess the impact of guesses on your key outputs (KPIs). They also have features that can help you decide where to focus your data collection efforts, expanding small or sampled data sets where it will most improve your model accuracy. And all simulations provide what-if capability you can use to evaluate best and worst case possibilities.
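To make that last point concrete, here is a minimal sketch (my addition, not from the original article) of the kind of what-if sweep a simulation makes possible. It models a simple single-server operation with the Lindley recursion and sweeps a guessed mean service time across a plausible range to see how sensitive a key output, average waiting time, is to that guess; the single-server model and the specific numbers are assumptions chosen only for illustration.

```python
import random

def avg_wait(mean_interarrival, mean_service, n_customers=200_000, seed=1):
    """Average waiting time in a single-server queue with exponential
    interarrival and service times, computed via the Lindley recursion."""
    rng = random.Random(seed)
    wait = 0.0      # the first customer arrives to an empty system
    total = 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(1.0 / mean_service)
        gap = rng.expovariate(1.0 / mean_interarrival)
        # The next customer waits whatever remains of this customer's
        # wait plus service after the arrival gap has passed.
        wait = max(0.0, wait + service - gap)
    return total / n_customers

# We only have a guess for the mean service time, so sweep a plausible
# range (hypothetical values, in seconds) and watch how the KPI reacts.
for guess in (45, 50, 55, 58):
    print(f"mean service {guess}s -> avg wait {avg_wait(60, guess):.0f}s")
```

If the output barely moves across the swept range, the missing data matters little; if it swings widely, that is exactly where further data collection will pay off.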

Perfection is the enemy of success. You can’t stop making decisions while you wait for perfect data. But you can use tools that are resilient enough to provide value with limited data, especially if those same tools help you better understand the value of both the existing and the missing data.

Happy modeling!

Dave Sturrock
VP Operations – Simio LLC

Simulation Stakeholder Bill of Rights

The people who request, pay for, consume, or are affected by a simulation project and its results are often referred to as its stakeholders. For any simulation project the stakeholders should have reasonable expectations from the people actually doing the work.

Here I propose some basic stakeholder rights that should be assured.

1. Partnership – The modeler will do more than provide information on request. The modeler will assume some ownership of helping stakeholders determine the right problems and identify and evaluate proposed solutions.
2. Functional Specification – A specification will be created at the beginning of the project to help define clear project objectives, deadlines, data, responsibilities, reporting needs, and other project aspects. This specification will be used as a guide throughout the project, especially when tradeoffs must be considered.
3. Prototype – All but the simplest projects will have a prototype to help stakeholders and the modeler communicate and visualize the project scope, approach, and outcomes. The prototype is often done as part of the functional specification.
4. Level of Detail – The model will be created at an appropriate level of detail to address the stated objectives. Too much or too little detail could lead to an incomplete, misunderstood, or even useless model.
5. Phased Approach – The project will be divided into phases and the interim results should be shared with stakeholders. This allows problems in approach, detail, data, timeliness, or other areas to be discovered and addressed early and reduces the chance of an unfortunate surprise at the end of a project.
6. Timeliness – If a decision-making date has been clearly identified, usable results will be provided by that date. If project completion has been delayed, regardless of reason or fault, the model will be re-scoped so that the existing work can provide value and contribute to effective decision-making.
7. Agility – Modeling is a discovery process and often new directions will evolve over the course of the project. While observing the limitations of level of detail, timeliness, and other aspects of the functional specification, a modeler will attempt to adjust project direction appropriately to meet evolving needs.
8. Validated and Verified – The modeler will certify that the model conforms to the design in the functional specification and that the model appropriately represents the actual operation. If there is inadequate time for accuracy, there is inadequate time for the modeling effort.
9. Animation – Every model deserves at least simple animation to aid in verification and communication with stakeholders.
10. Clear, Accurate Results – The project results will be summarized and expressed in a form and terminology useful to stakeholders. Since simulation results are estimates, proper analysis will be done so that stakeholders are informed of the accuracy of those results (a small example of such an analysis follows this list).
11. Documentation – The model will be adequately documented both internally and externally to support both immediate objectives and long term model viability.
12. Integrity – The results and recommendations are based only on facts and analysis and are not influenced by politics, effort, or other inappropriate factors.
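As a minimal illustration of the analysis mentioned in item 10 (this example is mine, not part of the original list, and the replication values are made up), a simulation KPI is typically reported with a confidence interval computed across independent replications rather than as a single number:

```python
import statistics

# Hypothetical daily-throughput results from 10 independent replications.
throughput = [412, 398, 405, 421, 390, 407, 415, 401, 409, 396]

n = len(throughput)
mean = statistics.mean(throughput)
# 95% confidence half-width using the t value for n - 1 = 9 degrees of freedom.
half_width = 2.262 * statistics.stdev(throughput) / n ** 0.5
print(f"throughput: {mean:.1f} +/- {half_width:.1f} (95% confidence)")
```

Reporting the half-width alongside the mean tells stakeholders how much confidence the results actually deserve.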

Note that every set of rights comes with responsibilities. The associated stakeholder responsibilities are discussed as part of the Simulationist Bill of Rights.

Give these expectations careful consideration to improve the effectiveness and success of your next project.

Dave Sturrock
VP Products – Simio LLC

Why Simulation is Important in a Tough Economy

Everyone wants to cut costs. No one wants to spend unnecessarily. When budgets are tight, software and software projects are an easy place to cut. Staff positions like Industrial Engineers are sometimes easier to cut or redeploy than production jobs. I suggest that following this reasoning to eliminate simulation projects is often short-sighted and may end up costing much more than it saves. Here are a few reasons why it may make sense to increase your simulation work now.

1) Minimize your spending. Cash is tight. You cannot afford to waste a single dollar. But how do you really know what is a good investment? Simulate to ensure that you really need what you are purchasing. A frequent result of simulations intended to justify purchases is to find that the purchases are NOT justified and that, in fact, the objectives can be met by making better use of existing equipment. A simulation may save hundreds of times its cost with immediate payback.

2) Optimize use of what you have. Could you use a reduction in cost? Would it be useful to improve customer satisfaction? I assume that your answer would always be yes, but even more so in difficult times. But how can you get better, particularly with minimal investment? Simulation is a proven way to find bottlenecks and identify often low-cost opportunities to improve your operation.

3) Control change. In a down economy you are often using your facilities in new and creative ways – perhaps running lean or producing products in new ways or in new places. But how do you know these new and creative endeavors will actually work? How do you know they will not cost you even more than you save? Simulation helps you discover hidden interactions that can cause big problems. Different is not always better. Simulate first to avoid costly mistakes.

4) Retain/improve your talent pool. Some people who might otherwise be laid off may have the skills to be part of a simulation SWAT team. If you let them participate in simulation projects, they will likely achieve enough cost reductions and productivity improvements to more than pay for themselves. As an added bonus, the team will learn much about your systems, the people, and communication – knowledge which will improve their value and contributions long after the project is complete.

5) Reduce risk. You are often forced to make changes. How do you know they are the right changes? Will a little more, a little less, or a different approach yield better results? How do you measure? A strength of simulation is its ability to objectively assess various approaches and configurations. Substitute objective criteria for a “best guess”, and, in turn, reduce the risk associated with those changes. In a down economy it is more important than ever that you don’t make mistakes.

In summary, rather than thinking of the cost of simulation, you should think of what the investment in simulation today will save you today, tomorrow and every day following. Simulation is not a cost, it is an investment that may return one of the best ROIs available in a tough economy.

Dave Sturrock
VP Products – Simio LLC

Can Simulations Model Chaos?

Can chaotic systems be predicted? I guess we first need to agree on exactly what a chaotic system is.

BusinessDictionary.com defines it as a
“Complex system that shows sensitivity to initial conditions, such as an economy, a stockmarket, or weather. In such systems any uncertainty (no matter how small) in the beginning will produce rapidly escalating and compounding errors in the prediction of the system’s future behavior.”

It is hard to imagine a complex system that does not show sensitivity to initial conditions. If the follow-on statement is true, then there is little point in ever trying to model or predict the behavior of such a system, because it is not predictable. But it is not hard to find counter-examples, even among the examples they provided. Meteorologists do a reasonable job predicting the weather; it depends on your standards of accuracy. They can certainly predict fairly accurately the likelihood of a 90-degree day in January in Canada, or anticipate the path of a tropical storm over the next 12 hours.

A less technical but perhaps more useful definition comes from membrane.com:
“A chaotic system is one in which a tiny change can have a huge effect.”
That leads us toward a more practical definition for our purposes.

For the types of systems we normally model, I would propose yet another definition.
A chaotic system is one in which it is likely that seemingly trivial changes in the initial conditions would cause significant changes in the predicted results, over the time frame being considered.

This definition, while not technically rigorous, acknowledges that most of us rarely have the opportunity or the need to deal in absolutes. We live in a world where the majority of decisions are made subjectively (“Joe has 20 years’ experience and he says…”) or with gross simplification (“Of course I can model that in a spreadsheet…”). In this world, being able to base a decision on a simulation model with better accuracy and objectivity can help realize tremendous savings, even if it is still only an approximation and only useful within specified parameters.
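To make the “tiny change, huge effect” idea concrete, here is a small sketch (my addition, not from the original article) using the logistic map, a textbook chaotic system: two starting values that differ by one part in a billion soon follow completely different trajectories.

```python
# Iterate the logistic map x -> r*x*(1-x) from two nearly identical starts.
r = 4.0
x_a, x_b = 0.200000000, 0.200000001   # differ only in the ninth decimal place
for step in range(1, 51):
    x_a = r * x_a * (1.0 - x_a)
    x_b = r * x_b * (1.0 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f}")
```

By roughly step 30 the two trajectories share nothing, which is why long-range point predictions of such systems are hopeless even though short-range or aggregate predictions can still be useful.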

Can we accurately predict true chaotic systems? By strict definition clearly not. And even by my definition, there will be some systems that are just too chaotic to allow any predictions to be useful.

But can we provide useful predictions of most common systems, even those with some chaotic aspects? Absolutely yes. Every model is an approximation of a real or intended system. Part of our job as modelers is to ensure that the model is close enough to provide useful insight. A touch of chaos just makes that more interesting. 🙂

Dave Sturrock
VP Products – Simio LLC

Predicting Process Variability

Systems rarely perform exactly as predicted. A person doing a task may take six minutes one time and eight minutes the next. Sometimes variability is due to outside forces, like materials that behave differently based on ambient humidity. Some variability is fairly predictable, such as a tool that cuts more slowly as it dulls with use. Other variability seems much more random, such as a machine that fails every now and then. Collectively we will refer to these as process variability.

How good are you at predicting the impact of process variability? Most people feel that they are fairly good at it.

For example, if someone asked you the probability of rolling a three in one roll of a common six-sided die, you could probably correctly answer one in six (17%). Likewise, you could probably answer the likelihood of flipping a coin twice and having it come up heads both times: one in four (25%).

But what about even slightly more complex systems? Say you have a single teller at a bank who always serves customers in exactly 55 seconds and customers come in exactly 60 seconds apart. Can you predict the average customer waiting time? I am always surprised at how many professionals get even this simple prediction wrong. (If you want to check your answer, look to the comment attached to this article.)

But let’s say that those times above are variable as they might be in a more typical system. Assume that they are average processing times (using exponential distributions for simplicity). Does that make a difference? Does that change your answer? Do you think the average customer would wait at all? Would he wait less than a minute? Less than 2 minutes? Less than 5 minutes? Less than 10 minutes? I have posed this problem many times to many groups and in an average group of 40 professionals, it is rare for even one person to answer these questions correctly.

This is not a tough problem. In fact this problem is trivial compared to even the smallest, simplest manufacturing system. And yet those same people will look at a work group or line containing five machines and feel confident that they can predict how a random downtime will impact overall system performance. Now extend that out to a typical system with all its variability in processing times, equipment failures, repair times, material arrivals, and all the other common variability. Can anyone predict its performance? Can anyone predict the impact of a change?

With the help of simulation, you can.

This simple problem can be easily solved with either queuing theory or a simple model in your favorite simulation program. More complex problems will require simulation. After using your intuition to guess the answer, I’d suggest that you determine the correct answer for yourself. If you want to check your answer look at the comment attached to this article.
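If you would rather compute than look up the answer, here is a minimal sketch in Python (my addition, not a Simio model, and only a stand-in for a proper simulation study) that runs the teller example both ways: constant times versus exponential times with the same means.

```python
import random

def avg_wait(service, interarrival, n=500_000, seed=42):
    """Average time in queue for a single teller via the Lindley recursion.
    `service` and `interarrival` are callables returning a sample in seconds."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(n):
        total += wait
        # Next customer's wait: leftover wait plus service, less the arrival gap.
        wait = max(0.0, wait + service(rng) - interarrival(rng))
    return total / n

# Case 1: every service takes exactly 55 s, customers arrive exactly 60 s apart.
print("constant times   :", round(avg_wait(lambda r: 55.0, lambda r: 60.0)), "s")

# Case 2: same means, but both times are exponentially distributed.
print("exponential times:", round(avg_wait(lambda r: r.expovariate(1 / 55),
                                            lambda r: r.expovariate(1 / 60))), "s")
```

Queuing theory gives the same two answers analytically (the variable case is a standard M/M/1 waiting-time calculation), so you can cross-check the simulation before trusting your intuition about anything larger.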

And the next time you or someone you know is tempted to predict system performance, I hope you will remember how well you did at predicting performance of a trivial system. Then use simulation for an accurate answer.

Dave Sturrock
VP Products – Simio LLC