At this year’s Simio Sync: Digital Transformation conference (May 4 – 5), Simio is
planning a bigger event covering all things digital by increasing the number of
programs and expert speakers compared to last year’s event. The goal is to
cover the expanding ecosystem around digital transformation while providing a
platform for attendees to experience practical examples of its application
across every human endeavor.
To this end, we are enlisting great speakers with years of hands-on experience in
digital transformation to lead expansive sessions on application, strategy, and
charting a course using digital technologies. Today, Simio is excited to
announce two great speakers who will be sharing their experiences with
digitally transforming supply chains and the manufacturing industry.
We are happy to announce Martin Barkman, Senior Vice President and
Global Head of Solutions Management for Digital Supply Chain at SAP. Martin leads
the strategy and go-to-market for SAP’s Digital Supply Chain solution
portfolio, which encompasses software for R&D, engineering, supply chain
planning, manufacturing, logistics, and asset management.
He will be speaking on the
role digital transformation plays in enhancing supply chain management and
implementation strategies. His session will also provide practical examples for
enterprises interested in driving their supply chain strategy using technology.
These practical examples will draw on the 12 years of experience he gained
providing on-premise and cloud-based solutions for optimizing supply chains at SAP.
We’re also pleased to announce Indranil
Sircar, CTO for Manufacturing Industry at Microsoft. Indranil has
considerable experience with logistics and supply chain management using
disruptive technologies. He has helped big businesses develop intelligent
supply chain strategies using the Internet of Things (IoT), artificial
intelligence (AI), virtual reality, and the digital twin.
At Simio Sync, this veteran
with over 30 years of experience at Hewlett-Packard and Microsoft will be
speaking about the use of digital transformation to set and drive manufacturing
visions or strategies. He will also share practical examples about using
high-tech and edge solutions to accelerate digital transformation, as well as
how they bring value to enterprises.
The distinguished speakers
will also be available to take questions from the audience, giving you the
opportunity to raise the specific challenges you have faced with digital
transformation.
You do not want to miss this and the other events we have lined up for you, including the networking dinner with industry stakeholders from Lockheed Martin, Exxon Mobil, BAE Systems, and others, as well as the Simio Spouses Agenda. Get your early bird tickets today.
Many managers don’t understand exactly what risk analysis is. We have put together responses to some of the most common questions.
What does the risk percentage mean?
The risk percentage approximates
the on-time probability for an order with appropriate consideration of the
number of replications or “experiments.”
It tells the user how confident they can be in meeting the due date
given how many trials they have conducted.
How does Simio calculate the on-time probability?
Simio adjusts from a base rate of 50% with each risk
replication. If an order is on time in
an individual replication, Simio updates the probability, increasing it closer
to 100%. If the order is late, Simio
decreases the probability closer to 0%.
Each replication is an experiment that provides new information about
the likelihood of success or failure.
More experiments mean more confidence in the answer.
Why is the base rate 50%?
Before any plan is generated or any activity is simulated,
there is no information about the order other than the possible outcomes. Because there are only two outcomes that
matter (on time or not), the base rate is set to 50%.
I have an overdue order in my system. Why is it not always 0%?
Because the calculation is an adjustment of a base rate of
50%, Simio needs a lot of evidence before it will guarantee that an order will
be late (or on time for that matter). If
the user runs 1000 replications, and the result is late in all of them, Simio
will reflect a 0% on time probability.
What formula does Simio use to calculate the probability?
For the statistics experts, Simio uses a binomial proportion
confidence interval formula known as the Wilson Score. We report the midpoint of the confidence
interval as the risk measure.
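As a rough illustration, the midpoint can be computed directly. This is a sketch, not Simio’s internal code; z = 1.96 corresponds to the default 95% confidence level.

```python
# Sketch of the Wilson score interval midpoint (not Simio's internal code).
# z = 1.96 corresponds to the default 95% confidence level.
def wilson_midpoint(on_time, n, z=1.96):
    """Midpoint of the Wilson score interval for on_time successes in n replications."""
    if n == 0:
        return 0.5                        # base rate before any replications
    p_hat = on_time / n
    return (p_hat + z * z / (2 * n)) / (1 + z * z / n)

# One replication moves the estimate from the 50% base rate to 60% or 40%:
print(round(wilson_midpoint(1, 1), 2))    # 0.6  (on time)
print(round(wilson_midpoint(0, 1), 2))    # 0.4  (late)
print(round(wilson_midpoint(10, 10), 2))  # 0.86 (all 10 default replications on time)
```

The same formula reproduces the worked example later in this FAQ: 60%/40% after one replication, 67%/33% after two, 78%/22% after five, and so on.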
Why not just report the outcome of the replications as
the probability (e.g., if 9 of 10 are on time, report 90% on time probability)?
This was the original implementation. However, it gives a false sense of confidence
and can be misleading. A single
replication would always yield either 100% on time or 0% on time. We wanted the answer to also give decision
makers a sense of how confident they could be in the answer. Using the Wilson Score, a single replication will
yield a result of 60% at best and 40% at worst (using 95% confidence level). This helps the decision maker identify that
they have a very small sample of data and encourages them to run additional replications.
Can you give me an example of how this works?
Risk analysis can be demonstrated using any scheduling
example. It is best viewed in the Entity
Gantt. In the screenshots below, we’ve
included 2 orders from the Candy Manufacturing Scheduling example. One of the orders is overdue (will be late
always), and the other has plenty of time (will be on time always).
The base rate is 50%.
After 1 replication, Simio updates the probabilities. Order 1 now has a 60% on time
probability. Order 2 has a 40% on time probability.
After 2 replications, 67% and 33%:
After 5 replications, 78% and 22%:
After 100 replications, 98% and 2%:
Finally, after 1000 replications, 100% and 0%:
How many replications should I run?
By default, we suggest 10 replications (and 95% confidence
level). With these settings, a risk
measure of 86% is a good sign, while 14% is a bad one. Beyond the default settings, there are
several additional factors which are dependent on the situation and use case. One of these factors is slack time (the time
between estimated completion and due date).
On the Gantt, slack time is the distance between the grey marker and the
green marker. If the slack time is
large, a single replication may suffice.
If the slack time is small, additional replications will help identify
if the order is in trouble or not.
Now that I know my risk, what can I do about it?
Depending on your position in the organization (and
therefore your decision rights), you can change either the design or operation
of the system. Example design changes include things like adding another
assembly line or buying another forklift. These changes are long term and
may require approvals for capital expenditure (which the model facilitates by
quantifying the impact of the expenditure). Example operational changes
include things like adding overtime, expediting a material, or changing order
priorities, quantities, due dates etc. Bridging the gap between design
and operation are the dispatching rules, which relate to overall business
objectives. They are also flexible parameters which control how Simio
chooses the next job from a queue (e.g., earliest due date, least setup,
critical ratio, etc.). All of these parameters influence risk and can be
changed, provided that the user has the authority to change them.
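A minimal sketch of how such dispatching rules pick the next job from a queue. The job data and field layout here are invented for illustration; only the rule names come from the text.

```python
# Toy queue of jobs: (name, due_date, remaining_work_hours, setup_hours).
# The data is invented to show how different dispatching rules disagree.
jobs = [
    ("J1", 12, 8, 3),
    ("J2", 15, 10, 1),
    ("J3", 30, 4, 5),
]
now = 5

# Earliest due date: pick the job whose due date comes first.
earliest_due = min(jobs, key=lambda j: j[1])

# Least setup: pick the job with the smallest changeover time.
least_setup = min(jobs, key=lambda j: j[3])

# Critical ratio: (time until due) / (remaining work); lowest ratio is most urgent.
critical_ratio = min(jobs, key=lambda j: (j[1] - now) / j[2])

print(earliest_due[0], least_setup[0], critical_ratio[0])
```

Note that the three rules can select different jobs from the same queue, which is exactly why the choice of rule is a tunable parameter that influences risk.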
Will Simio choose the best design and operation for me?
Decision rights and business processes have far reaching
consequences. A floor manager can probably authorize overtime if the
schedule looks risky. He probably cannot buy a piece of equipment.
To change a priority or a due date, he probably needs to consult with the
commercial team and/or account managers. To expedite a material, he
probably needs to communicate with the procurement team. To make a
capital expenditure (i.e., change system design), he probably needs
executive/financial approval. Our solution respects those
boundaries. We treat priorities, due dates, etc. as inputs rather than
outputs. Any of these parameters can be changed by the appropriate
decision maker. They should not be changed by the tool without
consent. Simio assists the decision maker (at any level in the
organization) by exposing the true consequences.
With so many choices, how can I quickly explore the
consequences across multiple scenarios?
The experiment runner is used to explore consequences (which
we call Responses) across multiple scenarios where a user can influence the
parameters mentioned above (which we call Controls). If the solution
space is very large (i.e., there are many controls with a wide range of
acceptable values), we recommend using OptQuest to automate the search of the
solution space based on single or multiple objectives (e.g., low cost and high
service level). OptQuest uses a Tabu search which learns how the control
values influence the objectives as it explores the solution space.
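Conceptually, an experiment evaluates Responses over combinations of Controls. The brute-force sketch below is a stand-in for the experiment runner (OptQuest’s Tabu search is far smarter about navigating large spaces); all names and the cost model are invented.

```python
from itertools import product

# Controls: the parameters a user can influence (values invented for illustration).
overtime_hours = [0, 4, 8]
extra_forklifts = [0, 1]

def responses(ot, fl):
    """Toy model mapping controls to Responses (cost, service level)."""
    cost = 1000 + 50 * ot + 300 * fl
    service = min(1.0, 0.80 + 0.015 * ot + 0.05 * fl)
    return cost, service

# Objective: lowest cost among scenarios that hit a 95% service level.
best = min(
    (s for s in product(overtime_hours, extra_forklifts) if responses(*s)[1] >= 0.95),
    key=lambda s: responses(*s)[0],
    default=None,
)
print(best)
```

With many controls and wide value ranges, exhaustive enumeration like this becomes infeasible, which is where a guided search such as OptQuest earns its keep.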
How often should I run these types of experiments?
Experiments are most relevant to design choices.
Operational decisions have many hard constraints which cannot be easily
influenced. For example, though Simio will allow you to adjust material
receipt dates of critical materials and show you the impact on the schedule,
many of them are inflexible and outside the control of the planner or even the
business. If you ask OptQuest how much inventory you would like to have,
it will tell you, but this information adds no value because it is not
actionable in the short term. The planners need to work with what they
have and make the best of it. In practical application, we recommend
running large experiments to explore design decisions on a monthly or quarterly basis.
In today’s world, companies compete not only on price and
quality, but on their ability to reliably deliver product on time. A good production schedule, therefore, influences
a company’s throughput, sales and customer satisfaction. Although companies have invested millions in
information technology for Enterprise Resource Planning (ERP) and Manufacturing
Execution Systems (MES), the investment has fallen short on detailed production
scheduling, causing most companies to fall back on manual methods involving Excel
and planning boards. Meanwhile, industry
trends towards reduced inventory, shorter lead times, increased product
customization, SKU proliferation, and flexible manufacturing are making the
task more complicated. Creating a
feasible plan requires simultaneous consideration of materials, labor,
equipment, and demand. This bar is
simply too high for any manual planning method.
The challenge of creating a reliable plan requires a digital
transformation which can support automated and reliable scheduling.
Central to the idea of effective factory scheduling is the
concept of an actionable schedule.
An actionable schedule is one that fully accounts for the detailed
constraints and operating rules in the system and can therefore be executed in
the factory by the production staff. An
issue with many scheduling solutions is that they ignore one or more detailed
constraints, and therefore cannot be executed as specified on the factory
floor. A non-actionable schedule
requires the operators to step in and override the planned schedule to
accommodate the actual constraints of the system. At this point the schedule is no longer
being followed, and local decisions are being made that impact the system KPIs
in ways that are not visible to the operators.
A second central idea of effective scheduling is properly
accounting for variability and unplanned events in the factory and the corresponding
detrimental impact on throughput and on-time delivery. Most scheduling approaches completely ignore
this critical element of the system, and therefore produce optimistic schedules
that cannot be met in practice. What
starts off looking like a feasible schedule degrades over time as machines
break, workers call off sick, materials arrive late, rework is required,
etc. The optimistic promises that were
made cannot be kept.
A third consideration is the effect of an infeasible
schedule on the supply chain plan. Factory
scheduling is only the final step in the production planning process, which
begins with supply chain planning based on actual and/or forecast demand. The supply chain planning process generates
production orders and typically establishes material requirements for each planning
period across the entire production network.
The production orders that are generated for each factory in the network
during this process are based on a rough-cut model of the production capacity. The supply chain planning process has very
limited visibility of the true constraints of the factory, and the resulting
production requirements often overestimate the capacity of the factory. Subsequently, the factory schedulers must
develop a detailed plan to meet these production requirements given the actual
constraints of the equipment, workforce, etc.
The factory adjustments to make the plan actionable will not be
transparent to the supply chain planners.
This creates a disconnect in a core business planning function where
enormous spending occurs.
In this paper we will discuss the solution to these
challenges, the Process Digital Twin, and the path to get there. The Simio Digital Twin solution is built on
the patented Simio Risk-based Planning and Scheduling (RPS) software. We
will begin by describing and comparing the three common approaches to factory
scheduling. We will then discuss in
detail the advantages of a process Digital Twin for factory scheduling built on Simio RPS.
Factory Scheduling Approaches
Let’s begin by discussing the three most common approaches
to solving the scheduling problem in use today:
1) manual methods using planning boards or spreadsheets, 2) resource
models, and 3) process Digital Twin.
The most common method in use today for factory scheduling
is the manual method, typically augmented with spreadsheets or planning
boards. The use of manual scheduling is
typically not the company’s first choice but rather the result of failed
attempts with automated systems.
Manually generating a schedule for a complex factory is a
very challenging task, requiring a detailed understanding of all the equipment,
workforce, and operational constraints. Five
of the most frustrating drawbacks include:
1. It is difficult for a scheduler to consider all the critical constraints. While schedulers can typically focus on primary constraints, they are often unaware of – or must ignore – secondary constraints, and these omissions lead to a schedule that cannot be executed as planned.
2. Manual scheduling typically takes hours to complete, and the moment any change occurs the schedule becomes outdated.
3. The quality of the schedule is entirely dependent on the knowledge and skill of the scheduler. If the scheduler retires or is out for vacation or illness, the backup scheduler may be less skilled and the KPIs may degrade.
4. It is virtually impossible for the scheduler to account for the degrading effect of variation on the schedule and therefore provide confident completion times for orders.
5. As critical jobs become late, manual schedulers resort to bumping other jobs to accommodate these “hot” jobs, disrupting the flow and creating more “hot” jobs. The flow becomes erratic and the system dissolves into firefighting.
Companies that utilize an automated method for factory
scheduling typically use an approach based on a resource model of the
factory. A resource model is comprised
of a list of critical resources with time slots allocated to tasks that must be
processed by the resource based on estimated task times. The
resource list includes machines, fixtures, workers, etc., that are required for
production. The following is a Gantt
chart depicting a simple resource model with four resources (A, B, C, D) and two
jobs (blue, red). The blue job has task
sequence A, D, and B, and the red job has task sequence A and B.
The resources in a resource model are defined by a state
that can be busy, idle, or off-shift.
When a resource is busy with one task or off-shift, other tasks must
wait to be allocated to the resource (e.g., red waits for blue on resource A). The scheduling tools that are based on a
resource model all share this same representation of the factory capacity and
differ only in how tasks are assigned to the resources.
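The blue/red example above can be sketched with a toy resource model in which each resource is reduced to a “next free time” and tasks are allocated one job at a time. Task durations are invented; this is an illustration of the representation, not Simio’s engine.

```python
# Toy resource model: each resource is just a "next free time",
# and tasks are allocated to a resource's next free slot in job order.
task_times = {"A": 2, "B": 3, "D": 1}      # hours per task (invented)
jobs = {"blue": ["A", "D", "B"],           # task sequences from the text
        "red":  ["A", "B"]}

resource_free = {r: 0 for r in "ABCD"}     # next free time per resource
schedule = []

for job in ["blue", "red"]:                # one-job-at-a-time, forward in time
    t = 0                                  # job-ready time
    for task in jobs[job]:
        start = max(t, resource_free[task])  # wait if the resource is busy
        end = start + task_times[task]
        schedule.append((job, task, start, end))
        resource_free[task] = end
        t = end

for row in schedule:
    print(row)
```

Running this, red’s first task on resource A cannot start until blue releases A, reproducing the “red waits for blue” behavior in the Gantt chart. Every resource-model tool shares this busy/idle representation; they differ only in the allocation logic.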
The problem that all these tools share is an overly
simplistic constraint model. Although this
model may work in some simple applications, there are many constraints in
factories that can’t be represented by a simple busy, idle, off-shift state for
a resource. Consider the following examples:
A system has two cranes (A and B) on a runway
that are used to move aircraft components to workstations. Although crane A is currently idle, it is
blocked by crane B and therefore cannot be assigned the task.
A workstation on production line 1 is currently
idle and ready to begin a new task.
However, this workstation has only limited availability when a complex
operation is underway on adjacent line 2.
An assembly operator is required for completing
assembly. There are assembly operators
currently idle, but the same operator that was assigned to the previous task
must also be used on this task, and that operator is currently busy.
A setup operator is required for this task. The operator is idle but is in the adjacent
building and must travel to this location before setup can start.
The tasks involve the flow of fluid through
pipes, valves, and storage/mixing tanks, and the flow is limited by complex
routing and capacity constraints.
A job requires treatment in an oven; the oven is
idle but not currently at the required temperature.
These are just a few examples of typical constraints for which
a simple busy, idle, off-shift resource model is inadequate. Every factory has its own set of such
constraints that limit the capacity of the facility.
The scheduling tools that utilize a simple resource model
allocate tasks to the resources using one of three basic approaches:
heuristics, optimization, and simulation.
One common heuristic is job sequencing, which begins with the
highest-priority job, assigns all tasks for that job, and repeats this
process for each job until all jobs are scheduled (in the previous example blue
is sequenced, then red). This simple
approach to job sequencing can be done in either a forward direction starting
with the release date, or a backward direction starting with the due date. Note that backward sequencing (while useful
in master planning) is typically problematic in detailed scheduling because the
resulting schedule is fragile and any disruption in the flow of work will
create a tardy job. This simple one-job-at-a-time
sequencing heuristic cannot accommodate complex operating rules such as
minimizing changeovers or running production campaigns based on attributes such
as size or color. However, there have
been many different heuristics developed over time to accommodate special
application requirements. Examples of
scheduling tools that utilize heuristics include Preactor from Siemens and
PP/DS from SAP.
The second approach to assigning tasks to resources in the
resource model is optimization, in which the task assignment problem is
formulated as a set of sequencing constraints that must be satisfied while meeting
an objective such as minimizing tardiness or cost. The mathematical formulation is then “solved”
using a Constraint Programming (CP) solver.
The CP solver uses heuristic rules for searching for possible task
assignments that meet the sequencing constraints and improve the objective. Note that there is no algorithm that can
optimize the mathematical formulation of the task assignment for the resource
model in a reasonable time (this problem is technically classified as NP Hard),
and hence the available CP solvers rely on heuristics to find a “practical” but
not optimal solution. In practice, the
optimization approach has limited application because often long run times (hours)
are required to get to a good solution.
Although PP/DS incorporates the CP solver from ILOG to assign tasks to
resources, most installations of PP/DS rely on the available heuristics for task assignment.
The third approach to assigning tasks in the simple resource
model is a simulation approach. In this
case we simulate the flow of jobs through the resource model of the factory and
assign tasks to available resources using dispatching rules such as smallest
changeover or earliest completion. This
approach has several advantages over the optimization approach. First, it executes much faster, producing a
schedule in minutes instead of hours. Another
key advantage is that it can support custom decision logic for allocating tasks
to resources. An example of a tool that
utilizes this approach is Preactor 400 from Siemens.
Regardless of which approach is used to assign tasks to
resources, the resulting schedule assumes away all random events and variation
in the system. Hence the resulting
schedules are optimistic and lead to overpromising of delivery times to
customers. These tools provide no
mechanism for assessing the risk associated with the schedule.
The third and latest approach to factory scheduling is a
process Digital Twin of the factory. A
Digital Twin is a digital replica of the processes, equipment, people, and devices
that make up the factory and can be used for both system design and operation. The resources in the system not only have a
busy, idle, and off-shift state, but they are objects that have behaviors and
can move around the system and interact with the other objects in the model to
replicate the behavior and detailed constraints of the real factory. The
Digital Twin brings a new level of fidelity to scheduling that is not available
in the existing resource-based modeling tools.
Simio Digital Twin
The Simio Digital Twin is an object-based, data driven, 3D
animated model of the factory that is connected to real time data from the ERP,
MES, and related data sources. We will
now summarize the key advantages of the Simio Digital Twin as a factory scheduling solution.
Dual Use: System Design and Operation
Although the focus here is on enhancing throughput and
on-time delivery by better scheduling using the existing factory design, unlike
traditional scheduling tools, the Simio Digital Twin can also be used to
optimize the factory design. The same
Simio model that is used for factory scheduling can be used to test out changes
to the facility such as adding new equipment, changing staffing levels,
consolidating production steps, adding buffer inventory, etc.
A basic requirement of any scheduling solution is that it
provide actionable schedules that can be implemented in the real factory. If a non-actionable production schedule is
sent to the factory floor, the production staff have no choice but to ignore the
schedule and make their own decisions based on local information.
For a schedule to be actionable, it must capture all the
detailed constraints of the system. Since
the foundation of the Simio Digital Twin is an object-based modeling tool, the
factory model can capture all these constraints in as much detail as necessary. This includes complex constraints such as
material handling devices, complex equipment, workers with different skill sets,
and complex sequencing requirements.
In many systems there are operating rules that have been developed
over time to control the production processes.
These operating rules are just as important to capture as the key system
constraints; any schedule that ignores these operating rules is non-actionable. The Simio modeling framework has flexible rule-based
decision logic for implementing these operating rules. The result is an actionable schedule that respects
both the physical constraints of the system as well as the standard operating procedures.
In most organizations, the useful life of a schedule is
short because unplanned events and variation occur that make the current
schedule invalid. When this occurs, a new
schedule must be regenerated and distributed as quickly as possible, to
keep the production running smoothly. A
manual or optimization-based approach to schedule regeneration that takes hours
to complete is not practical; in this case the shop floor operators will take
over and implement their own local scheduling decisions that may not be aligned
with the system-wide KPIs. When random
events occur, the Simio Digital Twin can quickly respond and generate and
distribute a new actionable schedule. Schedule
regeneration can either be manually triggered by the scheduler, or
automatically triggered by events in the system.
3D Animated Model and Schedule
In other scheduling systems the only graphical view of the
model and schedule is the resource Gantt chart.
In contrast, the Simio Digital Twin provides a powerful communication
and visualization of both the model structure and resulting schedule. Ideally, anyone in the organization – from
the shop floor to the top floor – should be able to view and understand the model
well enough to validate its structure. A
good solution improves not only the ability to generate an actionable schedule,
but to visualize it and explain it across all levels of the organization.
The Simio Gantt chart has a direct link to the 3D animated facility;
right click on a resource along the time scale in the Gantt view and you
instantly jump to an animated view of that portion of facility – showing the machines,
workers, and work in process at that point in time in the schedule. From that point you can simulate forward in
time and watch the schedule unfold as it will in the real system. The benefits of the Simio Digital Twin begin
with its accurate and fast generation of an actionable schedule. But the benefits culminate in the Digital
Twin’s ability to communicate its structure, its model logic, and its resulting
schedules to anyone that needs to know.
One of the key shortcomings of scheduling tools is their
inability to deal with unplanned events and variation. In contrast, the Simio Digital Twin can
accurately model these unplanned events and variations to not only provide a
detailed schedule, but also analyze the risk associated with the schedule.
When generating a schedule, the random events/variations are
automatically disabled to generate a deterministic schedule. Like other deterministic schedules it is
optimistic in terms of on time completions.
However, once this schedule is generated, the same model is executed
multiple times with the events/variation enabled, to generate a random sampling
of multiple schedules based on the uncertainty in the system. The set of randomly generated schedules is
then used to derive risk measures – such as the likelihood that each order will
ship on time. These risk measures are
directly displayed on the Gantt chart and in related reports. This lets the scheduler know in advance
which orders are risky and take action to make sure important orders have a
high likelihood of shipping on time.
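Conceptually, the risk replications amount to a Monte Carlo experiment like the sketch below. The task times and the variability model are invented for illustration; Simio’s actual model is far richer.

```python
import random

# Monte Carlo sketch of risk analysis: rerun a schedule with variation
# enabled and count how often an order finishes by its due date.
random.seed(42)                          # reproducible illustration
due = 20.0
base_durations = [4, 6, 5]               # deterministic task times -> finishes at 15

def replicate():
    """One risk replication: inflate each task time by a random delay factor."""
    t = 0.0
    for d in base_durations:
        t += d * random.uniform(0.9, 1.6)
    return t <= due                      # True if the order ships on time

n = 1000
on_time = sum(replicate() for _ in range(n))
print(f"on-time in {on_time} of {n} replications")
```

The deterministic schedule finishes with slack to spare, yet a meaningful fraction of replications miss the due date, which is precisely the optimism that deterministic schedules hide.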
It’s not uncommon that the supply chain planning process, which
is based on a rough-cut capacity model of the factory, sends more work to a
production facility than can be easily produced given the true capacity and
operational constraints of the facility.
When this occurs, the resulting detailed schedule will have one or more
late jobs and/or jobs with a high risk of being late. The question then arises as to what actions
can be taken by the scheduler to ensure that the important jobs are all delivered on time.
Although other scheduling approaches generate a schedule,
the Simio Digital Twin goes one step further by also providing a constraint
analysis detailing all the non-value added (NVA) time that is spent by each job
in the system. This includes time
waiting for a machine, an operator, material, a material handling device, or
any other constraint that is impeding the production of the item. Hence if the schedule shows that an item is
going to be late, the constraint analysis shows what actions might be taken to
reduce the NVA time and ship the product on time. For example, if the item spends a significant
time waiting for a setup operation, scheduling overtime for that operator may resolve the issue.
Although scheduling within the four walls of a discrete production
facility is an important application area, there are many scheduling
applications beyond discrete manufacturing.
Many manufacturing applications involve fluid flows with storage/mixing
tanks, batch processing, as well as discrete part production. In contrast to other scheduling tools that are
limited in scope to discrete manufacturing, the Simio Digital Twin has been
applied across many different application areas including mixed-mode manufacturing,
and areas outside of manufacturing such as logistics and healthcare. These applications are made possible by the
flexible modeling framework of Simio RPS.
A process Digital Twin is a detailed simulation model that
is directly connected to real time system data. Traditional simulation modeling
tools have limited ability to connect to real time data from ERP, MES, and
other data sources. In contrast, Simio
RPS is designed from the ground up with data integration as a primary design objective.
Simio RPS supports a Digital Twin implementation by
providing a flexible relational in-memory data set that can directly map to both
model components and to external data sources.
This approach allows for direct integration with a wide range of data
sources while enabling fast execution of the Simio RPS model.
Data Generated Models
In global applications there are typically multiple
production facilities located around the world that produce the same
products. Although each facility has its
own unique layout, there is typically significant overlap in terms of resources
(equipment, workers, etc.) and processes.
In this case Simio RPS provides special features to allow the Digital
Twin for each facility to be automatically generated from data tables that map
to modeling components that describe the resources and processes. This greatly simplifies the development of
multiple Digital Twins across the enterprise and also supports the reconfiguring
of each Digital Twin via data table edits to accommodate ongoing changes in
resources and/or processes.
Simio is a forward scheduling simulation engine. We do not support backwards scheduling. We have found the backwards scheduling
approach fails to represent reality, thus generating an infeasible plan that is
unhelpful to planners. Many of our customers
have learned this lesson the hard way.
The underlying principle of forward scheduling is
feasibility first. A schedule is built
looking forwards considering all the constraints and conditions of the system
(e.g., resource availability, inventory levels, work in progress, etc.). The schedule is optimized in run time while
only considering the set of feasible choices available at that time. Decisions are made according to user
specified dispatching rules (the same as backwards scheduling). The output is a detailed schedule that
reflects what is possible and tells the planner how to achieve it. As in real life, a planner can only choose
when to start an operation. Completion date
is an outcome, not a user specified input.
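The forward-vs-backward contrast can be sketched with a toy three-operation order. The durations are invented; this is an illustration of the two principles, not Simio’s engine.

```python
# Toy order: three sequential operations (hours, invented for illustration).
durations = [4, 6, 5]
now, due = 0, 10          # current time and due date

# Forward scheduling: start now; the completion date is an *outcome*.
t = now
for d in durations:
    t += d
completion = t            # the order finishes late, but the plan is executable

# Backward scheduling: fix the due date; the start date is the outcome --
# and it can land in the past ("start yesterday"), i.e., an infeasible plan.
t = due
for d in reversed(durations):
    t -= d
required_start = t

print(completion, required_start)
```

Here forward scheduling honestly reports a completion of 15 (five hours late), something the planner can act on; backward scheduling demands a start time of −5, a plan that cannot be executed.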
The most salient technical difference between the two
approaches is material availability (both raw and intermediate manufactured
materials). A forward-looking schedule
makes no assumptions. If materials are
available, a finished good can be produced.
Otherwise, it cannot. If the
materials must be ordered or manufactured, the system will order them or
manufacture them before the finished good can start. A backwards schedule plans the last operation
first, assuming that materials will be available (we have yet to find an environment
where future inventory can be accurately forecast). If the materials must be produced or
purchased, it will try to schedule or order them prior, hoping that the start
date isn’t yesterday. If the clock is
wound backwards from due date all the way to present time, the resulting
schedule shows the planner what their current stockpile and on-order inventory
would have to be to execute the idealized plan.
It does not tell the planner what they could do with their actual
stockpile and on-order inventory.
Next consider a situation where demand exceeds plant
capacity (this is reality for most of our customers). The plant cannot produce everything that the
planner wants. The planner must choose
amongst the alternatives and face the tradeoffs. Forward scheduling deals with this situation
by continuing to schedule into the future, past the due date, showing the
planner which orders will be late. By
adjusting the dispatching rules, priorities, and the release dates, the planner
can improve the schedule until they reach a satisfactory alternative. Every alternative is a valid choice and feasible
for execution. Backwards scheduling
deals with this situation by continuing to schedule into the past, showing the
planner which orders should have been produced yesterday. The planner must tweak and adjust dispatching
rules and due dates until finding a feasible alternative. In our experience, the planner can make the
best decision by comparing multiple feasible plans, rather than searching for a single feasible one.
Any complete scheduling solution must also be capable of
rescheduling. Rescheduling can be
triggered by any number of random events that occur daily. In rescheduling, the output must respect work
in progress. Forward scheduling loads
WIP first, making the resource unavailable until the WIP is complete. Backwards scheduling loads WIP last, if at
all. Imagine building a weekly schedule
backwards in time, hoping that the “ending” point exactly equals current plant
WIP. The result is often infeasible.
In terms of feasibility, the advantages of forward
scheduling are clear. But we also get
questions about optimization, particularly around JIT delivery. A quick Google search on forward scheduling
reveals literature and blog posts that describe forward scheduling “As early as
possible” (meaning a forward schedule starts an operation as soon as a resource
is available, regardless of when the order is due). This is false. Forward scheduling manages the inventory of
finished goods the same way the plant does.
A planner specifies a release date as a function of due date (or in some
cases specifies individual release dates for each order). In forward scheduling, no order is started
prior to release date. The power of this
approach is experimentation. Changing
lead time is as easy as typing in a different integer and rescheduling. As above, the result is a different feasible
alternative which makes the tradeoff transparent. Shorter lead times minimize inventory of
finished goods but increase late deliveries and vice versa. We have found many customers focus on short
lead times based on financial goals rather than operational goals. Inventory ties up cash. Typically, the decision to focus on cash is
made without quantifying the tradeoff.
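The release-date mechanism described above reduces to simple arithmetic. A toy sketch (order names and day units are hypothetical) showing how changing one lead-time integer shifts every start and makes the inventory/lateness tradeoff explicit:

```python
def release_date(due_date, lead_time):
    """No order is started before its release date; the planner sets
    release as a function of due date via a lead-time offset (in days)."""
    return due_date - lead_time

orders = {"SO-101": 30, "SO-102": 45}  # hypothetical order -> due day

# Changing lead time is as easy as typing in a different integer:
for lead in (14, 7):
    releases = {o: release_date(due, lead) for o, due in orders.items()}
    print(lead, releases)
```

Rescheduling with the shorter lead time starts every order a week later, which holds less finished-goods inventory at the cost of more risk of lateness; each run is a feasible alternative to compare.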
We provide decision makers with clear cut differences between
operational strategies so that they can choose based on complete information.
Forward scheduling is reality. It properly represents material flows and
constraints, plant capacity, and work in progress. It manages the plant the same way a planner
does. Accordingly, it generates sets of
feasible alternatives that quantify tradeoffs for planners and executive
decision makers alike. It answers the
question “What should the plant do next?” as opposed to “What should the plant
have done before?” We’ve found the
feasibility first approach is the most helpful to a planner and therefore the
most valuable to a business.
The digital transformation of
traditional business processes and the assets that run them has become one of
the hottest topics of the moment. Forbes-backed research highlights just how
popular digital transformation and the tools needed to accomplish
it have become: the share of businesses intending to adopt
digitization strategies grew from 55% in 2018 to 91% in 2019, underlining just how
widespread this transformation has become.
The reasons for its rising
adoption rate are the ease it brings to managing business operations, the
growth it facilitates, and the healthy return on investments made in digital
transformation. The numbers from the 2019 digital business survey bear these benefits
out: 35% of organizations have experienced revenue
growth, while 67% believe it has helped them deliver better services to
customers. But despite its popularity, the adoption of digital transformation
raises a multitude of questions many enterprises still struggle to answer.
This post will answer some of the more important questions with special
emphasis on facility management and efficiency.
What is Digital Transformation?
Digital transformation refers to
the integration of digital technologies into business operations to change how
an enterprise operates and delivers value to its customers or clients. Digital
technologies generally refer to devices and tools that enable access to the
internet; their use allows organizations to bring operational processes online.
The above definition is a simple
version of what digital transformation is about, but because digital
transformation looks different for every company and industrial niche, other
definitions exist. In terms of enhancing equipment and facility efficiency
levels, the definition by the Agile Elephant better encapsulates its
meaning. Here, digital transformation is defined as digital practices that
‘involve a change in leadership thinking, the encouragement of innovation and
new business models, incorporating digitization of assets, and increased use of
technology to improve an organization’s entire operations.’
In facility management, assets
refer to the equipment, tools, and operation stations within the facility while
new business models and innovation refer to the integration of digital
technology concepts. These concepts can be the digital twin, discrete event
simulation or predictive analysis.
What is Overall Equipment and Facility Effectiveness?
Productivity within manufacturing
facilities and warehouses are generally measured using the overall equipment
effectiveness (OEE) concept. This concept measures the maximum output machines
can achieve and compares actual output to that optimal value. In cases
where the machine or equipment falls short, the OEE falls below 100% and the
production cycle may be termed unproductive.
The OEE is calculated using three
separate components within facilities, and these are:
Availability – This
refers to the percentage of scheduled time an operation is actually available to
run.
Performance – This
refers to the speed at which a work center runs compared to the speed it was
designed to achieve.
Quality – This refers
to the number of goods produced that meet quality standards compared to the optimal level.
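The three components above multiply into a single score, which is the standard OEE formula; the example numbers below are purely illustrative:

```python
def oee(run_time, planned_time, actual_rate, ideal_rate, good_units, total_units):
    """OEE = Availability x Performance x Quality."""
    availability = run_time / planned_time  # share of scheduled time actually running
    performance = actual_rate / ideal_rate  # actual speed vs design speed
    quality = good_units / total_units      # good output vs total output
    return availability * performance * quality

# 7 of 8 scheduled hours running, 90 units/hr against an ideal 100,
# 95 of 100 units meeting spec:
print(round(oee(7, 8, 90, 100, 95, 100), 3))  # 0.748
```

Because the three factors multiply, a modest shortfall in each compounds: 87.5% availability, 90% performance, and 95% quality yield an OEE of only about 75%.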
Although the OEE process is quite
popular and has proved useful, a critical analysis shows that it does
not take some important metrics into consideration. OEE calculations do not
include the state of the shop floor, material handling processes, and
connections to both upstream and downstream performance. This is why its effectiveness as a measuring tool has been criticized by
a plethora of manufacturers with skin in the game.
Criticisms of OEE as a performance
measurement tool include its inability to break down or access granular
information in facilities and its lack of multi-dimensionality. The fact that
it struggles to identify the real areas that require improvement within
facilities also limits its usefulness for analyzing factory
performance. And this is where digital transformation comes into play.
Digital Transformation and its
Ability to Enhance Facility Efficiency
The ability to digitize assets
within manufacturing shop floors has created an environment where granular
data can be collected from the deepest parts of today’s facilities. With the
data collected through digital transformation, a clearer picture of how a
facility functions can be obtained. But the digitization of traditional
manufacturing processes and operations has also been a source of debate among
diverse professionals due to certain difficulties. These difficulties include
accessing data from legacy or dumb assets, managing communications across
diverse supply chains, and bringing captured data together to make sense of
complex facility operations.
To manage these challenges, diverse
emerging technologies have been built around each of them. In terms of capturing
data from legacy assets, the use of smart edge technologies that can be
attached to assets is currently eliminating this challenge, while standards and
communication protocols such as those from the OPC Foundation are solving the
issue of communication across both smart and dumb assets. Finally, to make
sense of the captured data in order to enhance shop floor activities, digital twin technology provides a streamlined
approach to monitoring and managing facilities using captured data.
With these emerging technologies,
detailed insight at the granular level can be accessed about a particular
facility. More importantly, these technologies attached to digital
transformation can be used to enhance operational processes by delivering
real-time scheduling, analyzing complex processes, and simulating applicable
solutions to manufacturing shortcomings.
Discrete Event Simulation and
Enhancing Facility Efficiency
Discrete event simulation (DES)
tools such as Simio are some of the emerging technologies that play important
roles in transforming traditional factory or facility processes. The
introduction of DES can help with mapping out previous event schedules to
create optimized scheduling templates that can speed up production processes.
DES tools or software can analyze
both minor processes that are subsets of a larger one, as well as the entire
complex system, to produce schedules that optimize these processes. An example
of this was the integration of Simio by Diamond-Head Associates,
a steel tubing manufacturer. The company faced
challenges meeting production schedules due to a very complex
production process with hundreds of production variables.
With the aid of Simio simulation
software and the digital transformation it brings, Diamond-Head Associates was
able to utilize the large data sets produced by the varying production
processes. With this simulation model, optimized schedules were built for its
manufacturing processes, and this helped with making real-time
business decisions. The steel tubing manufacturer successfully reduced the
time it took to make a decision from an hour and a half to approximately 10
minutes.
This case study highlights how
digital transformation can be used to enhance facility efficiency in diverse
ways. These ways include optimizing scheduling procedures and drastically
reducing the time needed to come up with accurate solutions to complex
manufacturing-related scheduling processes.
Enhancing Facility Productivity with
the Digital Twin
Another aspect of digital
transformation is the use of digital twin technologies to develop digital
representations of physical objects and processes. It is important to note that
the digital twin does more than a 3D scanner, which simply recreates physical
objects as digital models. With the digital twin, complex systems can be
represented in digital form including the capture of data produced by assets
within the system.
The digital twin ecosystem can also
be used to conduct simulations that drive machine and facility performance,
real-time scheduling, and predictive analytical processes. This highlights
how digital transformation provides a basis for business insights
that change how an organization’s leadership thinks and makes decisions.
An example that highlights the
application of digital twin technology to enhance productivity or facility
efficiency is that of CKE Holdings Inc. CKE Holdings is the parent company of
restaurants such as Hardee’s and Carl’s Jr. Earlier this year, the enterprise
was interested in providing efficient shop floors or restaurant spaces for its
employees to increase productivity levels, train new employees, and deliver
better services to its customers. To achieve its aims, the organization turned
to the digital twin and augmented reality to aid its efforts.
Once again, it is worth noting that
both the digital twin and augmented reality are digital transformation
tools. And with these tools, CKE Holdings Inc. succeeded in developing
optimized restaurants with shop floor plans that played to the strengths of its
employees. The digital twin was also used to test and implement new products at
a much faster rate than the traditional processes previously employed by the company.
The end result was a user-friendly
kitchen layout that delivered innovation in how CKE Holdings restaurants
function. The use of augmented reality also added another dimension to the
training of new employees. The technology ensured new employees learnt
through live practical involvement without any of the consequences attached to
failure. This also reduced the hours experienced workers spent getting new
employees up to speed within the restaurants, highlighting another aspect
in which digital transformation can be applied to drive facility efficiency levels.
The Benefits of Digital
Transformation to Manufacturing and Production-Based Facilities
The examples outlined already spell
out the benefits of digital transformation and its role in enhancing overall
equipment and facility effectiveness levels. But, it is only right to compare
and highlight what digital transformation brings to the table against the
traditional OEE calculations still used within many shop floors.
A Complete Picture –
Unlike OEE calculations, which rely solely on manufacturing data produced by
equipment and tools, digital transformation technologies can capture every
aspect of the production process. This includes capturing data from the diverse
algorithms, scheduling details, assets, sub-systems, and events that occur
within the shop floor. This makes the level of detail provided by digital twin
environments superior for analyzing and enhancing facility productivity.
Improved Customer Strategy – Digital transformation enables the capture of data
highlighting customer satisfaction with end products. This information can
be fed back into the manufacturing cycle to ensure customers get nothing but
the best service. This means that with digital transformation, the feedback of
customers and employees can be used to enhance production facility processes.
Improved Employee Retention Strategy – The manufacturing industry is notorious for its high
employee turnover rate due to diverse factors that make it unattractive to the
new generation of workers. The integration of digital transformation can
enhance workplace layout, as well as bring a more modern and captivating
process to manufacturing. These enhancements can reduce the turnover rate and
get the younger generation interested in manufacturing.
Enabling Innovation –
The increased adoption rate of Industry 4.0 business concepts and models in
manufacturing means businesses must adapt if they intend to retain their
competitive edge. Digital transformation offers a pathway to innovating legacy
business processes and increasing an enterprise’s ability to stay competitive in
a changing manufacturing industry.
The Next Steps
The advantages digital
transformation brings to enhancing facility efficiency come with a ripple
effect that touches leadership, innovation, and problem-solving activities.
Although the integration process involves technical knowledge of applying
digital twin technologies and simulation software, these skills can be acquired
with a little effort.
The Simio Fundamentals Course offers businesses and
other organizations the opportunity to train staff in digital
transformation and its specific techniques. You can also choose to register employees to participate in
the upcoming Simio Sync Digital Transformation 2020 Conference to learn more
about digitally transforming your business processes and how to reap the rewards.
The Digital Twin is reminiscent of the early days of
the personal computer
in many ways. Initially, creating a digital twin required excessive computing
power and multiple engineers working round the clock to develop digital
representations of physical models. And just like the personal computer,
technological advancement led to the creation of cloud-based digital twin
solutions, which made it possible for everyone to explore digital transformation
and its benefits.
The digital twin market is expected to grow exponentially, and this growth is
being driven by the approximately 20 billion sensors and endpoints around the world.
Advancements in IoT and IIoT have also played a role in increasing the adoption
rate of digital twin technology which are some of the reasons why digital
representations of almost any entity or process can be created today.
The benefits of the digital twin include the ability to make real-time decisions,
receive insight from complex processes or systems, and plan better for the
future. You can explore how the digital twin can help your enterprise or individual
pursuits by reading relevant case studies here. Now, to reap these benefits, a digital twin of a
chosen process, object, facility, or system must be created, which is what this
post is all about. Thus, if you have ever wondered what it takes to develop a
digital twin, bookmarking this post is recommended.
What You Should Know About Creating a Digital Twin
The task of creating a digital twin may sound daunting, but like most activities,
diving in headfirst without overthinking simplifies the process. Once you have
the required tools needed to create a digital twin, supporting technologies
such as Simio provide you with the prompts and interactive information needed to
complete the process. To successfully create a digital twin, here is what you
need to know and the resources you need to have:
Defining the System – The
first step to creating a digital twin is defining the system, process or object
to be digitized. To do this, an understanding of the entity is required and
this can only be achieved through data capture. Thus, data defines the system
to be digitized and introduced into the digital space.
The data capture process is
generally fluid in nature and depends on the entity or system being considered.
For manufacturing facilities, the data that defines a system or process can be
obtained from assets within a facility; these assets include original equipment,
shop floor layouts, workstations, and IoT devices. Data from these assets are
captured using smart edge devices, RFIDs, human-machine interfaces, and other
technologies that drive data collection.
With physical objects such as
vehicles, data capture is done through sensors, actuators, controllers, and
other smart edge devices within the system. 3D scanners can also be used to
extract point clouds when digitizing small to medium-sized objects. The
successful capture of the data a system or object produces defines the system
and is the first step to creating a digital twin.
Identity of Things
– One of the benefits of a digital twin is the ability to automate processes
and develop simulations that analyze how a system will operate under diverse
constraints. This means the system or facility to be digitized must have its
own unique identity which ensures its actions are autonomous when it is
introduced into a system.
To achieve this, many digital
twin platforms make use of decentralized identifiers, which verify the digital
identity of a self-sovereign object or facility. For example, when developing a
digital twin of a facility, the entire system will have its own unique identity,
and assets within the facility are verified with unique identities to ensure
their actions are autonomous when executing simulations within the digital twin
environment.
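One way to picture the identity requirement is tagging every asset with its own unique identifier. This is a toy sketch using random UUIDs; real platforms use cryptographically verifiable decentralized identifiers, and the class and field names here are hypothetical:

```python
import uuid

class TwinAsset:
    """An asset inside a digital twin with its own unique identity."""
    def __init__(self, name):
        self.name = name
        self.twin_id = uuid.uuid4()  # distinct per asset, so its actions can be attributed

# Hypothetical facility: each asset carries its own identity within the twin.
facility = [TwinAsset("press-1"), TwinAsset("oven-2"), TwinAsset("agv-3")]

# No two assets share an identifier, so simulated actions stay attributable:
ids = {a.twin_id for a in facility}
print(len(ids) == len(facility))  # True
```

With distinct identities in place, a simulation run can log which asset performed which action, which is what makes autonomous behavior within the twin traceable.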
An Intuitive Digital Twin
Interface – Another
important element or choice to make when creating a digital twin is selecting a
technology or software that can help you achieve your goals. You must be clear
about how the technology can help you achieve your goals of a digital twin.
Some things you need to consider when choosing a digital twin software or
platform include how the software handles the flow of data from the IoT devices or facility and
other enterprise systems needed to understand and digitize the chosen process.
You also need to understand how the software recreates physical objects or assets in
its digital ecosystem. Some technologies support the use of 3D models and
animations when recreating entities, while others do not deliver that level of
detail. When digitizing complex systems with hundreds of variables that produce large data
sets, the computing resources needed to create and manage a digital twin
increase. This makes computing power and resources a key consideration when choosing
a digital twin platform or solution. The best options are scalable digital twin
technologies that leverage the cloud to deliver their services.
An intuitive digital twin solution also simplifies the process of creating digital
representations of physical assets. The technology should also be able to
understand the data produced across the life-cycle of an asset, or at least
integrate the tools that can manage the identity of assets within the digital
twin. Another key consideration is the functions you expect the digital twin to perform. If
it is to serve as a monitoring tool for facilities or for predictive
maintenance, a limited digital twin software can be used, while for simulations
and scheduling a more advanced technology will be required.
Starting Small with Implementation –
When taking on the implementation of digital twin technology, it is recommended
you start small. This means monitoring the performance of simple components or
a single IoT device within a system and expanding with time. This hands-on
approach is the best way to understand how the digital twin functions and how
it can be used to manage larger systems according to your requirements.
With this knowledge, you can
then choose to explore the more sophisticated aspects or functions the digital
twin offers such as running complex discrete event simulations and scheduling
tasks. A step-by-step approach to implementing or creating a digital twin
provides more learning opportunities than initiating a rip-and-replace approach
when developing one.
Understanding the Security Considerations
– According to Gartner, there will be 50 billion
connected devices and 215 trillion stable IoT connections in 2020. As stated
earlier, the increased adoption of digital twin technology and the growth of
connected systems around the world bring up security challenges. These security
considerations also affect the digital twin due to the constant transfer of
data from the physical asset or process to the digital twin ecosystems.
When creating a digital twin, a plan
must be in place to handle secure communication channels across networks and
other vulnerabilities. To effectively do this, an understanding of the
different communication protocols used within a system is required. This is why
when choosing a digital twin technology, security challenges and how the
platform mitigates risk must also be considered.
Creating a Digital Twin with Simio
Simio’s digital twin technology provides an extensive framework for creating digital
twins of physical processes and facilities. The key considerations such as 3D
representation, animation, scaling up functions, and simulation can be achieved
within Simio’s environment.
Once properly created, the digital twin can be used to drive data analytics
initiatives, predictive maintenance, design layouts, and simulate diverse
working scenarios. Thus, any individual or enterprise can explore the benefits of
the digital twin using Simio to create digital representations of complex
systems or simpler ones. You can learn more about using Simio to create digital
twin representations by registering for the Simio Fundamentals Course.
Simio LLC is delighted to announce once again that the opportunity to learn more
about the state of simulation and digital twin technologies is here. And yet
again, this promises to be one of the biggest simulation and digital twin events of
the year. The Simio Sync Digital Transformation Conference will focus on digital
transformation technologies and how enterprises can tap into Simio to unleash
the digital potential of business processes.
The event will be taking place on the 4th and 5th of May 2020
in Pittsburgh, with advanced training on using Simio from the 6th
to the 8th of May. The first event will introduce you to Simio, the
recent updates in Simio 12, and its application across industries. Keynote
speeches and event programs will consist of Simio use cases and application
across industries of interest. But before delving into the opportunities of the
2020 conference, here is a recap of 2019.
Simio Sync 2019 – A Recap
The 2019 Simio Sync conference was the third annual event on
simulation and digital transformation built around Simio technologies and
solutions. The event brought together an ensemble of experienced speakers to
inspire the crowd on the role of simulation and digital transformation in the
real-world. Speakers included Chris Tonn from Spirit AeroSystems, Ian
Shillinger from McKinsey & Company, Antonio Rodriguez from the National
Institutes of Health (NIH), and Dusan Sormaz from Ohio University, among
others. Each speaker presented case studies highlighting the application of Simio within the
aviation industry, manufacturing, healthcare, education, hospitality, and
simulation engineering. These events proved inspiring to participants from
varying industries and opened up new possibilities about applying Simio within
their specific industries.
According to Jarred Thome from USPS, his first Simio Sync event was an eye-opener in
many ways. He said, “This was my first year attending the conference and I was
blown away by the extent to which the folks at Simio went to ensure it was a
success. The content, presenters and networking opportunities were all top-notch
and the Simio staff was always accessible and willing to chat. I will
definitely be coming back.”
The 2019 event was one of a kind, and next year’s event is expected to take things
up a notch. The Simio Sync Digital Transformation Conference will consist of
speakers from Fortune 500 companies willing to share their experiences with
digital transformation using Simio with you. The event will also serve as a
networking arena for stakeholders within the simulation and digital
transformation community and participants.
Everything you need to know about the Simio Sync Digital Transformation Conference for
2020 will be highlighted here. In the meantime, you can sign up with Simio
to receive conference updates and to register as a participant today.
The fourth annual Simio Sync conference gives you the chance to learn, no matter
the role you play in your company’s digitization efforts. At the end of the
day, you will walk away with the knowledge to help you and your company refine
your digital transformation strategy to reap the rewards digitization brings.
If networking is your thing, how about coming to listen and catch up with
individuals from Fortune 500 companies, among others. Through the years, Simio
conferences have been fertile grounds for communicating and building
relationships and this year’s event will be no different.
Simio Sync Digital Transformation provides an excellent opportunity to learn about
simulation and its application in the real-world. Attending the event can help
kick start your company’s digital transformation or refine transformation
strategies to meet your defined goals. Thus, everyone is invited to attend,
network, get inspired, and create fun memories while learning about digital
transformation. Simio Sync Digital Transformation conferences are safe spaces created for everyone
interested in the digital twin, simulation, scheduling, and digitization. The
event is open to everyone and the conference areas are safe, inviting, and
accessible. Simio representatives are also available in every location to ensure your
participation is a memorable one. If you are in need of answers to
Simio-related questions or event-related questions, you can reach out to a
representative and your questions will be answered.
With the increasing number of participants, the golden rule of mutual understanding
also applies. This will help you build better networks and truly take
advantage of the different sessions and labs that are part of the event.
Registration is now open. You can now take advantage of early bird tickets to become one of
the first individuals or organizations to register for the Simio Sync Digital
Transformation Conference for 2020. There are a plethora of excellent hotels
and lodging areas around the event venue in Pittsburgh, and the earlier you
register, the quicker you can get about planning your travel and relaxation.
To register, visit the Simio Sync event page and go through the registration
process. The process is quite straightforward and intuitive to accomplish. You
will also have the choice of registering for the conference event and adding
advanced training options to your registration form.
The Simio Sync session catalog is the ultimate guide you need to navigate through
the conference while bookmarking areas you are particularly interested in. The
session catalog is currently live and you can browse through it while
registering. This year, there are approximately eight unique sessions divided across networking
breaks to ensure you take advantage of your participation. The sessions include
diverse keynote speeches from leading digital transformation experts and Simio
engineers. To get a real-world feel of the application of Simio and digital
transformation processes, case study sessions and presentations are also part
of the event catalog.
Other exciting events of note which you are welcome to participate in include
the Simio Pittsburgh exploration event, where you and your spouse can line up
with Simio Spouses to explore the historic city of Pittsburgh. If you are a
running enthusiast, you can also choose to participate with Simio in the
Pittsburgh marathon before the conference begins. These are part of the fun activities
lined up for you!
Advanced Training and Hands-on Labs
The advanced training event will take place from the 6th to the 8th
of May. This training focuses on the application of Simio in the real world.
Thus, you will be introduced to the different features of Simio and how they
can be applied to drive digital transformations, simplify discrete event
scheduling, and build digital twins of physical processes.
The advanced training program will be a boon for organizations currently using
Simio and others who are interested in using it to drive digital transformation
strategies. Individuals interested in digital transformation are also welcome.
The hands-on lab integrates the use of case studies and the Simio interface to
ensure you understand every aspect of the digital transformation process with
Simio. A number of industry-leading organizations and individuals have already reserved their
spots for the Simio Sync Digital Transformation conference. And come the 4th
of May 2020, you too can pick the brains of your favorite personalities within
the aviation, hospitality, education, healthcare, automotive, automation,
manufacturing, and pharmaceutical industries.
Representatives from Lockheed Martin, Air Canada, Boeing, BAE Systems, Ohio University, United
States Postal Service, Exxon Mobil, Roche, FedEx, Honeywell, American Airlines,
and more will be there. The networking dinner and entertainment sessions at 6pm
create an excellent opportunity to build interpersonal relationships for the
future. To get the best out of the Simio Sync Digital Transformation Conference, we
recommend that you participate in at least one of the following programs:
Participate in at least one hands-on training covering the use of Simio.
Attend and participate in a keynote session highlighting the use of Simio for your
industry or a related industry.
comfortable shoes and cover grounds during the networking dinner and
entertainment opportunities within the conference.
Digital twins are digital representations of people, processes, and things. They are used to analyze operations and gain insight into complex processes. As 2019 comes to an end, the need to define digital twin technology still exists, and hopefully, by this time next year, its growth and popularity will make such definitions unnecessary.
In 2018, digital twins were included as a top technology trend by the big names covering the tech industry. According to Orbis Research, the digital twin market is expected to grow by 35% within a 5-year time frame, and 2020 is right in the middle of this period. But before highlighting the trends to expect in 2020, it is only right to recap the year so far and note whether earlier predictions have come to pass before mapping the future.
In terms of popularity, coverage of digital twins is definitely on the right track, as continuous studies by Gartner and other publications show. Today, many professionals across the technical and non-technical divide understand the digital twin concept and how it can be used to drive business processes. This is why many industries are currently integrating digital twins to bolster business insight and make sense of data.
Geographically, the biggest adopter of digital twin technology remains North America. Enterprises within the US and Canada currently lead the way in adopting digital twin technology, with North America accounting for approximately 59% of the digital twin market; Europe and Asia Pacific come next.
The very nature of the digital twin and simulation, as well as the solutions they provide, makes them attractive business tools for the manufacturing industry, and this fact is backed up by data. The manufacturing industry’s affinity for digital twins is powered by Industry 4.0 and the varied processes that take place on shop floors. The use of smart edge devices, equipment, robots, AI, and automation also fits nicely into the digital twin concept, making it attractive to manufacturers.
In 2019, manufacturers account for approximately 36% of the digital twin market. Other industries, such as energy and power, aerospace, automotive, and oil & gas, complete the top five industries that use the digital twin to enhance operations. This trend highlights how important digital twins are to simplifying complex processes in which hundreds or thousands of variables and relationships must come together successfully.
Is the Digital Twin Only for Production-based Industries?
Although the oil & gas industry, as well as the energy and power sector, are not tagged as manufacturing industries, a case can be made for treating them as such. Many may therefore assume or wonder whether digital twin technology is only useful within production-based industries where discrete or process manufacturing takes place. The answer is no.
The digital twin is also being used in other industry verticals, such as hospitality and restaurants. One example is the use of Simio by CKE Holdings Inc. to ease workloads in its Carl’s Jr. and Hardee’s restaurants. The digital twin is also being used to support discrete event simulations in hotels, real estate, and tolling facilities.
The use of interconnected devices and automation within service and hospitality businesses is the driving force behind the adoption of the digital twin across a variety of industries. The coming year is expected to witness continued growth as more industries and professionals understand what the digital twin brings to the table.
Trends for Digital Twin and Simulation Technology in 2020
Interrelated Technologies will Boost Adoption Rate
– The growth and maturity of interrelated technologies such as 3D printing,
metal printing, and mapping will play a part in accelerating the adoption rate
of digital twins in 2020. This is because of the need to monitor and
consistently improve these technologies and the systems that drive them.
Taking 3D printing as an example, many manufacturing outfits currently make use of 3D printing clusters to speed up production. 3D printing clusters, or farms, are facilities where hundreds of 3D printers function simultaneously to manufacture physical items. Although these clusters have dedicated software for managing the printing process, material delivery, scheduling, and supply chain management within these facilities are still handled manually.
Digital twin solutions can eliminate the manual management and handling processes in 3D printing farms to great effect. If properly executed, a digital twin of a large-scale 3D printing cluster will provide a data-driven approach to optimizing supply, scheduling, and the manufacturing process. This will reduce expenditure, including the energy expended in 3D printing cluster facilities.
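As a rough sketch of the scheduling side of this idea, the toy Python below balances a queue of print jobs across a small farm using a greedy least-loaded heuristic. The job names, durations, and the heuristic itself are illustrative assumptions, not a description of any real farm-management software:

```python
import heapq

def assign_print_jobs(job_hours, printer_count):
    """Greedily assign each print job to the printer that frees up
    earliest, taking the longest jobs first (the classic LPT heuristic).
    Returns (makespan, {printer_id: [job, ...]})."""
    # Min-heap of (committed_hours, printer_id): least-loaded printer pops first.
    printers = [(0.0, pid) for pid in range(printer_count)]
    heapq.heapify(printers)
    assignment = {pid: [] for pid in range(printer_count)}
    for job, hours in sorted(job_hours.items(), key=lambda kv: -kv[1]):
        load, pid = heapq.heappop(printers)
        assignment[pid].append(job)
        heapq.heappush(printers, (load + hours, pid))
    # Makespan: the finish time of the busiest printer.
    makespan = max(load for load, _ in printers)
    return makespan, assignment

# Hypothetical print queue (hours per part) spread over two printers.
jobs = {"bracket": 6.0, "housing": 4.0, "gear": 3.0, "clip": 2.0, "mount": 5.0}
makespan, plan = assign_print_jobs(jobs, printer_count=2)
```

A digital twin would go further by replacing the static job list with live telemetry, such as actual remaining print times and failure events, and re-running the assignment as conditions change.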
Industry 4.0 will
Continue to Drive Adoption
– The growth of Industry 4.0, along with the devices and communication channels driving the smart factory, is expected to increase the adoption of digital twin solutions. In 2019, Industry 4.0 witnessed the creation of new standards from the OPC Foundation that support the collection of data from the deepest corners of brownfield facilities. These data are collected from dumb equipment with legacy technologies using smart edge and embedded devices.
The success of this approach means that digital twin technology can now integrate the data collected from dumb or legacy equipment when developing digital representations. This increases the accuracy of the representations, thereby enhancing simulation results and scheduling plans. Increasingly accurate digital twin ecosystems and results will, in turn, create more use cases that drive the adoption of digital twins in 2020.
IoT and IIoT to
Drive Digital Twin Adoption Rate
– The move to more interconnected environments across both manufacturing and service-based industries also has a role to play in 2020. As stated earlier, Industry 4.0 will enhance the adoption of digital twin technologies, and this is also true for the industrial internet of things (IIoT). The widespread adoption of IoT and IIoT devices has created a race to develop the best management solution for monitoring interconnected activities.
This creates an avenue that digital twin service providers are currently taking advantage of and will continue to exploit in 2020. The ability of the digital twin to create digital representations of IIoT devices and integrate the data they produce opens up multiple use cases that enterprises will explore in the coming years. These include running simulations in complex interconnected facilities to produce accurate results and assessing processes that involve IIoT technologies.
Digital Twin for Cybersecurity
Challenges – With
every passing decade, the cybersecurity challenges enterprises face keep changing. The millennium brought Trojan horses and other viruses, which were effectively stopped with antivirus software; by 2010, attackers had pivoted to phishing attacks and malware. Today, ransomware, spyware, DDoS, and business email compromise attacks are the new challenges enterprises face, highlighting the ever-changing landscape of cyber threats.
To counter these threats and attacks, enterprises will enlist digital twin solutions in 2020. In this scenario, the digital twin will be used as a penetration-testing tool to simulate the effects of successful data breaches or ransomware on an organization’s business processes. Within the digital twin environment, attacks on core equipment can be simulated, and the result will be a response pattern that ensures the crippled equipment does not lead to extended downtime.
The coming year is also expected to witness an increase in the cybersecurity threats facing cloud-based digital twin solutions. Thus, more secure communication protocols and standards regulating data use will be developed to protect enterprises making use of digital twin technology. This means developers and service providers will have an increased role to play in securing digital twin solutions.
The drive to deliver real-time scheduling is expected to continue in 2020 as
enterprises seek more accuracy in managing business processes. The need for real-time scheduling is also driven by how enterprises intend to apply simulation and digital twin tools. Examples include the need to make business decisions in real time, handle unforeseen occurrences such as machine downtime, and reschedule operations.
These challenges fall into the category of issues discrete event simulation (DES) software can handle. Once the required data is accessible, DES and digital twin applications can conduct simulations in real time and provide accurate solutions to changing scenarios, also in real time. This will drastically reduce downtime and enhance performance within facilities. Although some DES software offers real-time simulation scheduling, many packages are still process-based scheduling applications, and this is set to change in 2020.
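As a toy illustration of rescheduling around machine downtime, the sketch below schedules jobs on a single machine and pushes any work that collides with a breakdown window past it. The jobs and breakdown are hypothetical, and production DES tools model this with far more fidelity:

```python
def reschedule(jobs, breakdown_start, breakdown_len):
    """Sequentially schedule jobs on one machine, pushing any work that
    would overlap an unforeseen downtime window past it.
    jobs: list of (name, duration) pairs; returns {name: (start, finish)}."""
    schedule = {}
    clock = 0.0
    breakdown_end = breakdown_start + breakdown_len
    for name, duration in jobs:
        start, finish = clock, clock + duration
        if start < breakdown_end and finish > breakdown_start:
            # Work collides with the downtime: finish the remainder after it.
            done_before = max(0.0, breakdown_start - start)
            if done_before == 0.0:
                start = breakdown_end  # job could not even begin until repair
            finish = breakdown_end + (duration - done_before)
        schedule[name] = (start, finish)
        clock = finish  # the next job starts when this one ends
    return schedule

# A 2-hour breakdown at t=5 delays job B and everything behind it.
plan = reschedule([("A", 3.0), ("B", 4.0), ("C", 2.0)],
                  breakdown_start=5.0, breakdown_len=2.0)
```

Here job B, interrupted two hours in, finishes at t=9 instead of t=7, and C slides to the (9, 11) slot. A real-time scheduler does exactly this, but triggered by live machine signals rather than a known window.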
Quantum Computing – If real-time simulation, scheduling, and process management are to be achieved, then digital twin solutions must take advantage of the speed, scalability, and high performance that quantum computing offers. Today, digital twin solutions leverage the cloud to provide stable and scalable services to enterprises, and only a few integrate high-performance computers to enhance or manage really complex simulations. In 2020, further strides will be made to speed up simulations within digital twin environments using high-performance computers. The success of this initiative will speed up real-time scheduling and complex process management.
The benefits of the digital twin have played an important role in ensuring its adoption across diverse industries, and the trends expected in 2020 will continue the increased adoption that came with 2019. Although digital twin solutions have become more interactive and intuitive to use, enterprises still require the assistance of experienced professionals to get the best out of their digital twin environment, and this is where Simio can help.
Managers, cybersecurity experts, and project managers can take advantage of the Simio Fundamentals Course to learn more about simulation and digital twin technology, including its application in real-life scenarios.
Today, discrete event simulation (DES) software and the benefits it provides are being used across a majority of industries to simplify business operations, make predictions, and gain insight into complex processes. But before modern simulation software such as Simio could be used to create shiny models and execute real-time simulations, there were earlier technologies that formed the foundation modern simulation software is built upon. As you can probably tell, there is definitely a story behind the evolution of simulation software, and today, that story is being told.
To accurately tell this story, the evolution must be arranged in chronological order. The traditional ordering in use today is the one outlined by R.E. Nance in 1995. That chronology will be used here, but with slight edits to accommodate both the earliest memories of simulation software and the current strides being made. This is because the most-referenced 1995 ordering did not take into account the efforts of John von Neumann and Stanislaw Ulam, who used simulation to analyze the behavior of neutrons in 1946.
R.E. Nance’s chronology, written in 1995, also could not account for the recent paradigm shifts in DES software. This understandable omission will likewise be highlighted and included in this post. Therefore, this post on discrete event simulation should be seen as an update of the history and evolution of DES software.
The Early Years (1930 – 1950)
Before discrete simulation came to prominence, early mathematicians made use of manual statistical sampling to estimate uncertainties and model complex processes. This process was time-consuming and error-prone, which led to the early sampling techniques now known as Monte Carlo simulations. The earliest recorded simulation was the Buffon needle technique, used by Georges-Louis Leclerc, Comte de Buffon, to estimate the value of Pi by dropping needles on a floor made of parallel, equidistant strips. Although this method was successful, simulation software as we know it got its origin in 1946.
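Buffon’s experiment translates directly into a few lines of code. The short Python sketch below estimates Pi from simulated needle drops; the drop count and seed are arbitrary choices made for reproducibility:

```python
import math
import random

def buffon_needle_pi(drops, needle_len=1.0, strip_gap=1.0):
    """Estimate Pi by simulating needle drops on a floor of parallel
    strips spaced `strip_gap` apart (requires needle_len <= strip_gap)."""
    crossings = 0
    for _ in range(drops):
        center = random.uniform(0.0, strip_gap / 2.0)  # distance to nearest line
        angle = random.uniform(0.0, math.pi / 2.0)     # acute angle to the lines
        if center <= (needle_len / 2.0) * math.sin(angle):
            crossings += 1  # the needle crosses a strip line
    # P(cross) = 2L / (pi * d), so pi is roughly 2L * drops / (d * crossings).
    return (2.0 * needle_len * drops) / (strip_gap * crossings)

random.seed(1946)  # the year of the ENIAC experiments, for flavor
estimate = buffon_needle_pi(200_000)
```

The same sample-and-count pattern underpins the Monte Carlo methods that von Neumann and Ulam formalized on the computer.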
Sometime in the fall of ’46, two mathematicians were faced with the problem of understanding the behavioral pattern of neutrons. To understand how neutrons behaved, John von Neumann and Stanislaw Ulam developed the roulette wheel technique to handle discrete event simulations. The light-bulb moment came to Ulam while playing a game of Solitaire: he successfully estimated the number of times he could win at Solitaire by studying hundreds of plays.
After estimating a few games, he realized it would take years to manually observe and tally winning games for every hand. This realization led Ulam to enlist John von Neumann to build a program to simulate multiple hands of Solitaire on the Electronic Numerical Integrator and Computer (ENIAC). And thus the first simulation software was born.
The Period of Search (1955 – 1960)
The success of both mathematicians in simulating neutron behavioral patterns placed the spotlight on simulation and encouraged government agencies to explore its uses in the military. As with all technological processes, the growth of discrete simulation software could only match the computing units available at the time, when analog and barely-digital computers were the springboard for development.
Around 1952, John McLeod and a
couple of his buddies in the Naval Air Missile Test Center undertook the
responsibility of defining simulation concepts and the development of
algorithms and routines to facilitate the design of standardized simulation
software. In the background, John Backus and his team were also developing a
high-level language for computers. The efforts of the multiple teams, working independently of one another, led to the development of the first simulation languages and software, from which DES software would evolve.
It also highlights the general theme of how technological advancement and software evolution occur: through advances in diverse, interrelated fields.
The Advent (1960 – 1965)
By 1961, John Backus and his team
at IBM had successfully developed FORTRAN, the first high-level programming
language for everyday use. The success of FORTRAN led to the creation of a
general-purpose simulation language based on FORTRAN. This language was SIMSCRIPT
which was successfully implemented in 1962 by Harry Markowitz.
Other general-purpose simulation
software and systems also sprang up within this period as competing contractors
continued to develop simulation languages and systems. At the tail end of 1965,
programs and packages such as ALGOL, General Purpose Simulation System (GPSS),
and General Activity Simulation Program (GASP) had been developed. IBM
and the BUNCH crew, consisting of Burroughs, UNIVAC, NCR, Control Data Corporation, and Honeywell, were developing more powerful computers to handle these simulation workloads.
One of the highlights of this
period was the successful design of the Gordon Simulator by IBM. The Gordon
Simulator was used by the Federal Aviation Administration to distribute weather
information to stakeholders in the aviation industry, marking the first time simulation was used in that industry.
Here again, the increase in processing speed and the prominent entry of a new term, computer-aided design, would play a role in advancing the development of simulation software. At this stage, early simulation packages and languages were still used predominantly by the government and a few corporations. Ease of use, intuitiveness, and responsiveness were also slowly being built into simulation software such as GPSS, which had become popular in the ’60s.
The Formative Years (1966 – 1970)
The formative years were defined by
the development of simulation software for commercial use. By this time,
businesses had begun to understand simulation and the role it plays in
simplifying business processes and solving complex problems. The success of
systems such as the Gordon Simulator also got industry actors interested in the
diverse ways DES software could be employed.
Recognizing the need to apply simulation in industrial processes, the first organization solely dedicated to simulation was formed in 1967, and the first conference was held in New York at the Hotel Roosevelt. At the second conference, 78 papers on discrete event simulation and the development of DES software were submitted. Surprisingly, some of the questions asked at the 1968 conference remain relevant to this day. These included:
The difficulty of convincing top management about simulation software
How simulation can be applied in manufacturing, transportation, human behavior, urban systems, etc.
The Expansion Period (1971 – 1978)
The expansion period was dedicated to simplifying the modeling process when using simulation software and to introducing its use in classrooms. At this stage, diverse industries had begun to understand the use and benefits of simulation software in their respective fields. This, in turn, led to discussions about the need to prepare students for a world that integrates simulation.
Also, advancements in technology, such as the introduction and widespread use of the personal computer, made the case for developing simulation software for dedicated operating systems. This led to the development of GPSS/H for IBM mainframes and personal computers. GPSS/H also introduced interactive debugging to the simulation process and made it approximately 20 times faster than previous simulation packages. In terms of technological evolution, GASP IV also introduced the use of time events during simulations, highlighting the growth in simulation software available to industries at that time.
By the fifth simulation conference, tagged the ‘Winter Simulation Conference’ of 1971, diverse tutorials on using simulation packages such as GASP2 and SIMSCRIPT had become available to the public. The growing popularity of simulation also led to increased commercial opportunities, and by 1978, simulation software could be purchased for less than
The Consolidation and Regeneration
(1979 – 1986)
The consolidation age was defined
by the rise of the desktop and personal computer which led to the widespread
development of simulation software for the personal computer. Simulation
software also witnessed upgrades through the development of the Simulation Language for Alternative Modeling (SLAM). The SLAM concept made it possible to combine diverse modeling capabilities and obtain multiple modeling perspectives when handling complex processes.
These upgrades made simulation for production planning possible, and the manufacturing industry began to take a keen interest in simulation software. The increase in computing and storage capacity also led to the creation of factory management systems such as CAM-I, which effectively became the first simulation software used solely for closed-loop control of activities and processes on shop floors.
By 1983, SLAM II had been developed, an industrial-grade simulation package ahead of its time. SLAM II provided three different modeling approaches that could also be combined as needed: discrete event modeling, network modeling, and the integration of discrete and network modeling in a single simulation model. More importantly, SLAM II cost approximately $900, which was relatively cheap at that time. This can be marked as the moment discrete event simulation came into its own, as commercial options for discrete event simulation modeling became available to the general public.
The Growth and Animation Phase (1987 – 2000)
The ’90s witnessed a consolidation of the strides made in earlier years, and many interrelated technologies and processes also came of age within this decade. This era focused on simplicity, the development of interactive user interfaces, and making simulation available to everyone, including non-technical individuals.
In the mid-nineties, simulation software was being used to solve even more complex problems, such as simulating every event and process in large-scale facilities. The Universal Data System example was a first in those days: the company was tasked with converting its entire plant to a hybrid flow shop to enhance production. To achieve this, the company made use of GPSS, and the end result was a successful flow that enhanced daily operations; the entire process was modeled and simulated within 30 days.
In 1998, vendors began to add data collection features to simulation software. These features included the automation of data collection processes and the use of 3D graphics and animation to make simulation more user-friendly for non-technical users. Needless to say, the technological advancements in animation, modeling, graphic design, and UI building played roles in enhancing simulation software during this period.
The Flexibility and Scalability Phase (2000 – Present)
Finally, we come to the latest evolutionary phase of DES software as we know it. Once again, advancements in interrelated technologies have made scaling simulation and speeding up its processes possible. The evolution that came with the new millennium saw DES vendors leverage cloud computing, AI, and high-performance computing to take simulation to greater heights.
Another change that came within these decades was the evolution of production-based scheduling into simulation-based scheduling. This shift allowed for real-time simulation scheduling, processing, and decision-making, and it dovetails with the fourth industrial revolution, where data collection, automation, and interconnectivity rule. Simulation software of this generation has evolved into tools capable of digitization and the development of digital twins.
Discrete event simulation software such as Simio exemplifies the comprehensive simulation technology needed to drive Industry 4.0. New-age DES software must be able to collect and store its own data, model accurate 3D graphics and animation, manage real-time scheduling, and support digitization. It must also be equipped with features that make it possible to leverage cloud computing, integrate enterprise resource planning applications, and use high-performance computing. These features all work together to ensure the most complex simulations are executed to deliver accurate answers and insights in professional settings.
The future of
discrete event simulation is by no means set in stone as the experiences from
previous eras have shown. This means that, with continued advancements in interrelated technologies and simulation software, more industrial concepts and business models will be disrupted in the coming decade.
The INFORMS Annual Meeting for 2019 has come and gone, with multiple keynotes shared, workshop activities held, and hundreds of excellent companies sharing their experiences and solutions from brightly colored booths. Once again, Simio was in the thick of things, evangelizing the benefits of simulation and digital twin technology to the world. As with all annual meetings, the focus was on the strides being made in operations research and analytics, and the meeting provided the chance to explore emerging technologies and their applications across all ‘walks of life’.
The term ‘all walks of life’ isn’t used lightly, as sessions ranging from social media analytics and e-learning to applying analytics against human trafficking were explored. As for Simio, our role was somewhere in the middle; as stated earlier, our participation was centered on the digital twin. But before going into the details of how the event panned out and Simio’s role, here is an outline of what the INFORMS Annual Meeting is about for interested individuals.
INFORMS, which stands for the Institute for Operations Research and the Management Sciences, is an umbrella organization for professionals plying their trade in operations research and analytics. INFORMS currently boasts approximately 12,500 members across the globe, which highlights its international reach. The organization also sets standards and guidelines to ensure research and analytics within its field are done ethically. To bring its thousands of members together under one roof, the INFORMS Annual Meeting was created, and it is held once a year. The event features keynote sessions, workshops, publication presentations, and an exhibition area for members and enterprises to showcase their wares.
The INFORMS Annual Meeting also coincides with the organization’s community service drive to assist non-profit organizations with meeting their obligations. This is done through the INFORMS Pro Bono Analytics arm of the organization. If you are wondering why information about Pro Bono Analytics is included here, then I ask for patience, as it will all make sense in the end. Now, on to the annual meeting of 2019!
The 2019 INFORMS Annual Meeting ran from the 20th to the 23rd of October. This year’s event was definitely a success, as more than 5,000 people breezed through the different sessions, exhibition areas, workshops, and lunch areas. The convention center buzzed with activity throughout, and we are proud to say Simio capitalized on these activities in different ways. Our participation included a dedicated Simio booth highlighting the use of Simio digital twin technology, a session handled by Jason, and workshop presentations from Renee.
Sessions and Workshops of Note at the INFORMS Annual Meeting
Sessions covering operations research and analytics ran throughout the event, far too many to mention and discuss individually. So, the focus here will be on the simulation, digital transformation, cloud computing, and digital twin sessions.
One of the exciting sessions in this category covered the computational infrastructure for operations research (COIN-OR) initiative. The IBM project focuses on providing open-source technologies solely for computational operations research. The end goal is an open-source library of tools that ensures researchers do not have to start from scratch when handling complex research, creating a foundation that will be built on and maintained by researchers over the years.
The session ‘Robust Optimization and Learning Under Uncertainty’ was also interesting, as it discussed the challenges stakeholders face with decision-making and policy creation. Han Yu, a PhD student at the University of California, spoke about how data collection and an understanding of history should drive real-world decision-making. The session also discussed how modeling and robust optimization techniques can enhance the decision-making process.
Other notable sessions highlighted or raised questions about the role the digital twin and simulation could play in enhancing agriculture and the healthcare industry. According to Greg from Syngenta, AI, computer vision, and bioinformatics modeling currently assist Syngenta with making data-driven seed selections and breeding decisions. This raises the question of the digital twin’s role in agriculture, which may be explored in other blog posts.
During the healthcare modeling session, Dr. Zlatana Nenova spoke about the role modeling and data analytics play in improving healthcare. Her talk also touched on the use of digital technology to analyze medical care policies for both off-site and on-site healthcare delivery. In terms of on-site healthcare, there are certainly diverse ways the industry can benefit from digital twin technology. Although this was not covered at this year’s event, it highlights the possibilities of applying the digital twin to healthcare.
Events at the INFORMS Annual Meeting
Back to Simio and our role at the INFORMS conference. At last year’s event, Ms. Renee Thiesing, the VP of Strategic Alliances, spoke excellently on the role Simio plays in driving discrete event simulation and the digital transformation of brownfield systems as the move to Industry 4.0 continues. She also highlighted the importance of real-time event scheduling and how Simio can help enterprises solve real-time scheduling challenges.
At this year’s event, Ms. Renee built on that foundation by focusing on the digital twin capabilities of Simio and their application in diverse industries. Her session, titled ‘New Innovations: Cloud Computing, Real-Time Scheduling, Industry 4.0 and more’, discussed how Simio leverages cloud computing to deliver high-performing scheduling and simulations.
During the session, she discussed how Simio leverages the computing power of Microsoft Azure to support complex applications. Simio’s compatibility with Schneider Electric’s Wonderware was also discussed in detail, including leveraging Wonderware to achieve detailed production scheduling in real time and manage real-time risk analysis. New Simio features such as the Simio cloud portal and OptQuest were also covered during her workshop. She highlighted OptQuest’s ability to optimize scheduling and simulation with the aim of delivering optimal solutions to complex business problems.
Jason Ceresoli also spoke on the benefits of using Simio’s 3D modeling capabilities to solve real-world problems. His presentation covered Simio’s features for
system design and operation. Practical examples of how Simio’s rapid 3D
modeling, planning and scheduling, and optimization capabilities can be used by
enterprises were also discussed by Jason. Finally, his session highlighted the
difference between Simio and other simulation tools with a focus on how
professional researchers and analysts can use these features.
Our participation at the INFORMS Annual Meeting would not be complete without recounting our experiences at booth 28 in the exhibition area. The targeted message used at the Simio booth drew its own audience of professionals, entrepreneurs, and business representatives interested in the digital twin. This gave us the opportunity to showcase Simio’s features and real-world applications to interested individuals. We can categorically say our booth played a role in the sales leads and opportunities we got from the event.
The Pro Bono Analytics Event
Remember the introduction to INFORMS Pro Bono Analytics earlier? Well, here we are! This year, Pro Bono Analytics partnered with a Seattle-based non-profit organization, FareStart, to assist individuals interested in careers in food service and the culinary arts. At this year’s INFORMS Annual Meeting, Simio, alongside other participants, made donations to the FareStart initiative.
The event was a success. Elise Tinseth, Community Engagement Manager with FareStart, shared: “The INFORMS Pro Bono micro-volunteer opportunity of creating hygiene kits is impactful to eliminating barriers our students who are experiencing poverty and homelessness have to getting jobs in the food service industry.” She also thanked everyone who made time and donated resources to help FareStart meet its goals.
The INFORMS Annual Meeting Awards and the Future
And lastly, the INFORMS Annual Meeting Awards. Although Simio did not bag any of the awards, the pomp and pageantry, as well as the strides made by researchers, are worthy of mention in this post. Hopefully, the prize for teaching operations research and management science may one day be ours. That being said, the 2019 event was a success, and Simio will continue to be a part of the INFORMS Annual Meeting for the foreseeable future.