I constructed a simple model with a single server where I used a rate table for arrivals. The server block capacity was fixed at 2.
When I run this model by hand, the maximum holding time for the server's input buffer comes out to about 24 minutes every time (in fact, exactly the same number on every run), just as it would if the arrival rate were constant throughout the 24-hour period.
However, when I run an experiment with Server1.InputBuffer.Contents.MaximumTimeWaiting defined as the response variable, the result for the scenario with a server capacity of 2 is 40 minutes. (I am using version 5.91.)
I am wondering:
1) whether I am confusing two different quantities (maybe the maximum holding time result is averaged by dividing by the capacity, but that would be a bug, because the capacity should have no effect on the input buffer), or
2) whether the randomization seed is the same for every run I do by hand. That must be the case, because when I reduce the required replication count of the experiment to 1, I get exactly the same result as the single run I do by hand. If that is the true reason, then where do I change the randomization seed for each interactive run?
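For what it's worth, here is how I picture the seed question, as a minimal sketch in SimPy rather than Simio (the parameter values, function names, and arrival/service distributions are made up for illustration, not taken from my model): with a fixed seed the per-run maximum waiting time is identical every time, while letting the seed vary per replication makes the maximum vary, so an experiment that summarizes those per-replication maxima can report a different number than a single fixed-seed interactive run.

```python
import random
import simpy

def run_once(seed, sim_hours=24, capacity=2,
             mean_interarrival=5.0, mean_service=9.0):
    """One replication; returns the maximum time an entity waited for the server (minutes)."""
    rng = random.Random(seed)
    env = simpy.Environment()
    server = simpy.Resource(env, capacity=capacity)
    max_wait = 0.0

    def entity(env):
        nonlocal max_wait
        arrive = env.now
        with server.request() as req:
            yield req                                    # time spent here is the "input buffer" wait
            max_wait = max(max_wait, env.now - arrive)
            yield env.timeout(rng.expovariate(1.0 / mean_service))

    def source(env):
        while True:
            yield env.timeout(rng.expovariate(1.0 / mean_interarrival))
            env.process(entity(env))

    env.process(source(env))
    env.run(until=sim_hours * 60)                        # all times in minutes
    return max_wait

# Same seed every time -> identical maximum, like my repeated interactive runs.
print([round(run_once(seed=1), 1) for _ in range(3)])

# Different seed per replication -> the maximum varies, and the mean of the
# per-replication maxima can differ from the single fixed-seed value.
maxima = [run_once(seed=s) for s in range(10)]
print(round(sum(maxima) / len(maxima), 1))
```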