Today, discrete event simulation (DES) software is used across a majority of industries to simplify business operations, make predictions, and gain insight into complex processes. But before modern simulation software such as Simio could be used to create polished models and execute real-time simulations, earlier technologies formed the foundation that modern simulation software is built upon. As you can probably tell, there is a story behind the evolution of simulation software, and today that story is being told.
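To ground the term before diving into the history: at its core, a discrete event simulator keeps a clock and a time-ordered queue of future events, repeatedly jumping the clock to the next event and letting each event schedule new ones. The single-server queue below is a minimal illustrative sketch in Python under those assumptions, not the design of any particular commercial package; the function name and parameters are hypothetical.

```python
import heapq
import random

def mm1_mean_wait(n_customers=2000, arrival_rate=1.0, service_rate=1.25, seed=1):
    """Toy event-driven single-server queue: pop the earliest event,
    advance the clock to its timestamp, and possibly schedule new events."""
    random.seed(seed)
    events = []            # min-heap of (time, kind, customer_id)
    waiting = []           # FIFO of (customer_id, arrival_time)
    heapq.heappush(events, (random.expovariate(arrival_rate), "arrive", 0))
    busy = False
    arrived = 0
    total_wait = 0.0
    while events:
        clock, kind, cid = heapq.heappop(events)   # jump to the next event
        if kind == "arrive":
            arrived += 1
            if arrived < n_customers:              # schedule the next arrival
                heapq.heappush(events, (clock + random.expovariate(arrival_rate),
                                        "arrive", cid + 1))
            if busy:
                waiting.append((cid, clock))       # join the line
            else:
                busy = True                        # start service immediately
                heapq.heappush(events, (clock + random.expovariate(service_rate),
                                        "depart", cid))
        else:  # "depart": server frees up, pull the next customer if any
            if waiting:
                next_cid, t_arr = waiting.pop(0)
                total_wait += clock - t_arr
                heapq.heappush(events, (clock + random.expovariate(service_rate),
                                        "depart", next_cid))
            else:
                busy = False
    return total_wait / n_customers

print(f"mean wait: {mm1_mean_wait():.2f}")
```

Everything the later eras added, from GPSS to Simio, elaborates on this same loop: richer events, graphics, debugging, and scale.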
To tell this story accurately, the evolution must be arranged in chronological order. The traditional ordering in use today is the one outlined by R.E. Nance in 1995. That chronology will be used here, with slight edits to accommodate both the earliest days of simulation software and the strides currently being made. This is because the widely referenced 1995 chronology did not take into account the efforts of John von Neumann and Stanislaw Ulam, who used simulation to analyze the behavior of neutrons in 1946.
R.E. Nance’s 1995 chronology also could not account for the recent paradigm shifts in DES software. This understandable omission will likewise be highlighted and included here. Therefore, this post on discrete event simulation should be seen as an update to the history and evolution of DES software.
The Early Years (1930 – 1950)
Before discrete event simulation came to prominence, early mathematicians made use of statistical sampling to estimate uncertainties and model complex processes. The process was time-consuming and error-prone, which led to the earliest simulation techniques, known as Monte Carlo methods. The earliest recorded simulation was the Buffon needle experiment, used by Georges-Louis Leclerc, Comte de Buffon, to estimate the value of Pi by dropping needles on a floor made of parallel, equidistant strips. Although this method was successful, simulation software as we know it got its origin in 1946.
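Buffon's experiment is easy to reproduce on a computer: for a needle of length L dropped on strips spaced d apart (L ≤ d), the probability of crossing a line is 2L/(πd), so counting crossings lets you back out Pi. The sketch below is a modern illustration of the idea; the function name and parameters are our own.

```python
import math
import random

def buffon_pi(n_drops=200_000, needle_len=1.0, strip_gap=1.0, seed=0):
    """Estimate Pi by simulating needle drops on a lined floor.
    A needle crosses a line when the distance from its centre to the
    nearest line is <= (L/2)*sin(theta); P(cross) = 2L / (pi * d)."""
    random.seed(seed)
    crossings = 0
    for _ in range(n_drops):
        centre = random.uniform(0, strip_gap / 2)   # distance to nearest line
        theta = random.uniform(0, math.pi / 2)      # needle angle
        if centre <= (needle_len / 2) * math.sin(theta):
            crossings += 1
    # Invert P(cross) = 2L / (pi * d) to solve for pi.
    return (2 * needle_len * n_drops) / (strip_gap * crossings)

print(buffon_pi())  # close to 3.14
```

Buffon, of course, had to drop real needles; the point of the later history is precisely that machines made this kind of repetition cheap.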
Sometime in the fall of '46, two mathematicians were faced with the problem of understanding the behavioral pattern of neutrons. To understand how neutrons behaved, John von Neumann and Stanislaw Ulam developed the roulette wheel technique to handle such simulations. The light-bulb moment came to Ulam while playing a game of Solitaire. Ulam estimated the odds of winning at Solitaire by laying out hundreds of games and counting the wins.
After estimating a few games by hand, he realized it would take years to manually observe enough hands. This realization led Ulam to enlist John von Neumann to build a program that simulated multiple hands of solitaire on the Electronic Numerical Integrator and Computer (ENIAC). And so the first simulation software was written.
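Ulam's insight, play many random games and count the wins rather than derive the probability analytically, is the essence of Monte Carlo estimation. A full solitaire simulator is too long to show here, so the sketch below applies the same sampling idea to a simpler stand-in game of our own choosing: shuffle a deck and call it a "win" when no card lands in its original position (a derangement, whose true probability approaches 1/e ≈ 0.368).

```python
import random

def estimate_no_match(n_trials=100_000, deck_size=52, seed=7):
    """Ulam-style estimation: play many random games and count wins.
    The toy 'game' is won when a shuffled deck is a derangement,
    i.e. no card sits in its original position."""
    random.seed(seed)
    wins = 0
    deck = list(range(deck_size))
    for _ in range(n_trials):
        random.shuffle(deck)
        if all(card != pos for pos, card in enumerate(deck)):
            wins += 1
    return wins / n_trials

print(estimate_no_match())  # close to 1/e ≈ 0.368
```

The estimate converges toward the analytical answer as the trial count grows, which is exactly why ENIAC's speed mattered: more hands per hour means tighter estimates.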
The Period of Search (1955 – 1960)
The success of both mathematicians in simulating neutron behavior placed the spotlight on simulation and encouraged government agencies to explore its uses in the military. As with all technological processes, the growth of discrete event simulation software could only match the computing hardware available at the time, and in those years analog and early digital computers were the springboard for development.
Around 1952, John McLeod and a few colleagues at the Naval Air Missile Test Center took on the task of defining simulation concepts and developing the algorithms and routines needed for standardized simulation software. In the background, John Backus and his team were developing a high-level language for computers. The efforts of these teams, working independently of one another, led to the first simulation languages and software from which DES software would evolve.
It also highlights how technological advancement generally occurs: through progress in diverse, interrelated fields.
The Advent (1960 – 1965)
By 1961, John Backus and his team at IBM had successfully developed FORTRAN, the first widely used high-level programming language. Its success led to the creation of a general-purpose simulation language built on FORTRAN: SIMSCRIPT, implemented in 1962 by Harry Markowitz.
Other general-purpose simulation software and systems also sprang up within this period as competing contractors continued to develop simulation languages and systems. By the tail end of 1965, packages and languages such as SIMULA (built on ALGOL), the General Purpose Simulation System (GPSS), and the General Activity Simulation Program (GASP) had been developed. IBM and the BUNCH (Burroughs, UNIVAC, NCR, Control Data Corporation, and Honeywell) were building ever more powerful computers to handle complex simulations.
One of the highlights of this period was IBM's successful design of the Gordon Simulator, which the Federal Aviation Administration used to distribute weather information to stakeholders in the aviation industry. This marked the first time simulation was used in aviation.
Here again, increasing processing speed and the emergence of a new term, computer-aided design, played a role in advancing simulation software. At this stage, early simulation packages and languages were still used predominantly by the government and a few corporations. Meanwhile, ease of use, intuitiveness, and responsiveness were slowly being built into simulation software such as GPSS, which had become popular in the '60s.
The Formative Years (1966 – 1970)
The formative years were defined by the development of simulation software for commercial use. By this time, businesses had begun to understand simulation and the role it plays in simplifying business processes and solving complex problems. The success of systems such as the Gordon Simulator also got industry actors interested in the diverse ways DES software could be employed.
Recognizing the need to apply simulation to industrial processes, the first organization solely dedicated to simulation was formed in 1967, and the first conference was held at the Hotel Roosevelt in New York. At the second conference, 78 papers on discrete event simulation and DES software development were submitted. Surprisingly, some of the questions asked at the 1968 conference remain relevant to this day, including:
- The difficulties in convincing top management about simulation software
- How simulation can be applied to manufacturing, transportation, human behavior, urban systems, and more
The Expansion Period (1971 – 1978)
The expansion period was dedicated to simplifying the modeling process in simulation software and introducing its use in classrooms. By this stage, diverse industries had begun to understand the uses and benefits of simulation software for their respective fields, which in turn led to discussions about preparing students for a world that integrates simulation.
Also, technological advances such as the introduction and widespread use of the personal computer made the case for developing simulation software for dedicated operating systems. This led to the development of GPSS/H for IBM mainframes and personal computers. GPSS/H also introduced interactive debugging to the simulation process and ran approximately 20 times faster than previous simulation packages. On the technical side, GASP IV introduced the use of time events during simulations, highlighting the growth in simulation software available to industries at the time.
By the fifth simulation conference, by then named the Winter Simulation Conference, in 1971, tutorials on simulation packages such as GASP II and SIMSCRIPT had become available to the public. The growing popularity of simulation also led to increased commercial opportunities, and by 1978 simulation software could be purchased for less than $50,000.
The Consolidation and Regeneration (1979 – 1986)
The consolidation age was defined by the rise of the desktop and personal computer, which led to the widespread development of simulation software for personal computers. Simulation software also saw upgrades through the development of the Simulation Language for Alternative Modeling (SLAM). SLAM made it possible to combine diverse modeling capabilities and obtain multiple modeling perspectives when handling complex processes.
These developments made simulation for production planning possible, and the manufacturing industry began to take a keen interest in simulation software. The increase in computing and storage capacity also led to the creation of factory management systems such as CAM-I, which effectively became the first simulation software used solely for closed-loop control of activities and processes on the shop floor.
By 1983, SLAM II had been developed, an industrial-grade simulation package ahead of its time. SLAM II provided three different modeling approaches that could also be combined as needed: a discrete event modeling approach, network modeling, and the integration of discrete and network modeling within a single simulation model. More importantly, SLAM II cost approximately $900, which was relatively cheap at the time. This can be marked as the moment discrete event simulation came into its own, as commercial software options for discrete event simulation modeling became available to the general public.
The Growth and Animation Phase (1987 – 2000)
The '90s witnessed a consolidation of the strides made in earlier years, and many interrelated technologies and processes came of age within this period. The era focused on simplicity, the development of interactive user interfaces, and making simulation available to everyone, including non-technical users.
By the mid-nineties, simulation software was being used to solve even more complex problems, such as simulating every event and process in a large-scale facility. The Universal Data System example was a first in those days: the company was tasked with converting its entire plant to a hybrid flow shop to enhance production. Using GPSS, the entire process was modeled and simulated within 30 days, and the end result was a successful flow that enhanced daily operations.
In 1998, vendors began to add data collection features to simulation software, including automated data collection and the use of 3D graphics and animation to make the simulation process friendlier for non-technical users. Needless to say, advancements in animation, modeling, graphics design, and UI building all played roles in enhancing simulation software during this period.
The Flexibility and Scalability Phase (2000 – 2019…)
Finally, we come to the latest evolutionary phase of DES software as we know it. Once again, advancements in interrelated technologies have made it possible to scale simulation and speed up the process. The evolution that came with the new millennium saw DES vendors leverage cloud computing, AI, and high-performance computing to take simulation to greater heights.
Another change within these decades was the evolution from production-based scheduling to simulation-based scheduling. This shift allowed for real-time simulation scheduling, processing, and decision-making, and it coincides with the fourth industrial revolution, where data collection, automation, and interconnectivity rule. Simulation software of this generation has evolved into tools capable of digitization and the development of digital twins.
Discrete event simulation software such as Simio exemplifies the comprehensive simulation technology needed to drive Industry 4.0. New-age DES software must be able to collect and store its own data, model accurate 3D graphics and animation, manage real-time scheduling, and support digitization. It must also be equipped with features that make it possible to leverage cloud computing, integrate enterprise resource planning applications, and tap high-performance computing. These features work together to ensure the most complex simulations deliver accurate answers and insights when applied in professional settings.
The future of discrete event simulation is by no means set in stone, as the experiences of previous eras have shown. This means that, with continued advancement in interrelated technologies and simulation software, more industrial concepts and business models will be disrupted in the coming decade.