Opened 7 years ago
Last modified 3 years ago
#4656 new defect
Small delays give badly wrong results
Reported by: | Francesco Casella | Owned by: | Willi Braun
---|---|---|---
Priority: | critical | Milestone: |
Component: | Run-time | Version: |
Keywords: | | Cc: | Lennart Ochel, Patrick Täuber
Description
Detailed transmission line models require the use of cascaded pure time delays. High-resolution models require a large number of very small cascaded delays. Unfortunately, in this situation the way OMC handles delays is totally unsatisfactory.
The following model demonstrates the problem:
model TestDelay
  parameter Integer N = 100;
  parameter Real Tau = 10;
  parameter Real dt = Tau/N;
  Real y[N+1](start = zeros(N+1));
  Real out = y[N+1];
  Real x(start = 0, fixed = true);
equation
  y[1] = if time < 1 then 0 else 1;
  for j in 2:N+1 loop
    y[j] = delay(y[j-1], dt);
  end for;
  der(x) = 1 - x;
  annotation(
    experiment(StartTime = 0, StopTime = 20, Tolerance = 1e-06, Interval = 0.02));
end TestDelay;
This model apparently works fine: the variable out follows y[1] with a delay of 10 seconds. The output interval in this case is 20 ms, and each individual delay is 100 ms, which is an exact multiple of it. If Interval is changed so that each individual delay is no longer an exact multiple of it, e.g. 30 ms, the output is distorted. If Interval is larger than the individual delay, e.g. 120 ms, the result is severely wrong. If noEquidistantTimeGrid is activated, the result is even worse, apparently because many steps (i.e., intervals in this case) are large compared to the individual time delays.
Dymola provides the correct results in all these cases.
In general, the correctness of the final state of a simulation should not depend on the length of the communication interval, at least if variable step-size solvers are used for the continuous-time part. In particular, it seems odd that ensuring the correctness of the delay() operator output requires a communication interval much shorter than the delay itself.
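For illustration, here is a minimal sketch of one plausible failure mechanism; this is an assumption about the implementation, not a reading of the OMC sources. It assumes the delayed value is reconstructed by linear interpolation in a buffer of (time, value) samples that is filled only at communication points, so a step in the input gets smeared over a whole communication interval instead of being shifted by the delay:

```c
/* Minimal sketch, NOT OMC's actual code: the delay buffer is filled only at
 * communication points and read back by linear interpolation. */
#include <stdio.h>

#define NBUF 1024
static double tbuf[NBUF], ybuf[NBUF];
static int nbuf = 0;

static void storeSample(double t, double y) {
    if (nbuf < NBUF) { tbuf[nbuf] = t; ybuf[nbuf] = y; nbuf++; }
}

static double delayedValue(double t, double dt) {
    double tq = t - dt;                       /* time at which to read the input */
    if (nbuf == 0) return 0.0;
    if (tq <= tbuf[0]) return ybuf[0];
    for (int i = 1; i < nbuf; i++) {
        if (tbuf[i] >= tq) {                  /* linear interpolation between samples */
            double w = (tq - tbuf[i-1]) / (tbuf[i] - tbuf[i-1]);
            return ybuf[i-1] + w * (ybuf[i] - ybuf[i-1]);
        }
    }
    return ybuf[nbuf-1];
}

int main(void) {
    double interval = 0.12;  /* communication interval, larger than ... */
    double dt = 0.10;        /* ... the individual delay                */
    /* the input steps from 0 to 1 at t = 1, but is sampled only every 0.12 s */
    for (int k = 0; k * interval <= 2.0; k++)
        storeSample(k * interval, k * interval < 1.0 ? 0.0 : 1.0);
    /* exact delayed value at t = 1.14 is 1 (step at t = 1 shifted by 0.1 s);
     * interpolating between the stored samples at 0.96 and 1.08 gives ~0.67 */
    printf("delay at t = 1.14: %g (exact: 1)\n", delayedValue(1.14, dt));
    return 0;
}
```

Running this prints roughly 0.67 where the exact delayed value is 1, i.e. the same kind of distortion observed with Interval = 0.12 in the model above.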
Can you please fix the implementation of the delay() operator so as to avoid this problem?
Change History (7)
comment:2 by , 7 years ago
One idea might be to store the data points in the delay queues at the actual ODE/DAE solver steps, rather than only at the communication time points.
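A hedged sketch of how that might look; DelayBuffer and onAcceptedSolverStep below are hypothetical names invented for illustration, not OMC's actual runtime API:

```c
/* Illustrative sketch only; the names below are hypothetical, not OMC runtime API. */
#include <string.h>

#define DELAY_BUF_CAP 4096

typedef struct {
    double t[DELAY_BUF_CAP];   /* sample times  for one delay() expression */
    double y[DELAY_BUF_CAP];   /* sample values for one delay() expression */
    size_t len;
    double maxDelay;           /* largest delay ever read from this buffer */
} DelayBuffer;

/* drop samples that no lookup in [tNow - maxDelay, tNow] can reach,
   keeping one sample at or before tNow - maxDelay for interpolation */
static void delayBufferTrim(DelayBuffer *b, double tNow) {
    size_t drop = 0;
    while (drop + 1 < b->len && b->t[drop + 1] <= tNow - b->maxDelay)
        drop++;
    if (drop > 0) {
        b->len -= drop;
        memmove(b->t, b->t + drop, b->len * sizeof b->t[0]);
        memmove(b->y, b->y + drop, b->len * sizeof b->y[0]);
    }
}

/* the key point: this hook runs once per *accepted ODE/DAE solver step*
   (and at event points), not only once per communication/output point,
   so the buffer resolution follows the solver's step-size control */
static void onAcceptedSolverStep(DelayBuffer *b, double tNow, double yNow) {
    delayBufferTrim(b, tNow);
    if (b->len < DELAY_BUF_CAP) {
        b->t[b->len] = tNow;
        b->y[b->len] = yNow;
        b->len++;
    }
}
```

Lookups at t - dt would then interpolate on a grid whose resolution follows the solver's step-size control, so the delayed signal can be resolved even when the communication interval is longer than the individual delay.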
comment:4 by , 5 years ago
Milestone: | 1.14.0 → 1.16.0
---|---
Releasing 1.14.0, which is stable and has many improvements w.r.t. 1.13.2. This issue is rescheduled to 1.16.0.
comment:6 by , 4 years ago
Milestone: | 1.17.0 → 1.18.0
---|---
Retargeted to 1.18.0 because of the 1.17.0 timed release.
BTW, this may also affect TLM simulations badly.