Opened 5 years ago
Last modified 5 years ago
#5559 new defect
Issues with large parameter arrays
Reported by: Adrian Pop
Owned by: Lennart Ochel
Priority: high
Milestone: Future
Component: Backend
Version:
Keywords:
Cc: Karim Abdelhak, Andreas Heuermann
Description
See forum post:
https://www.openmodelica.org/forum/default-topic/2734-ram-problem-with-very-large-arrays-used-as-parameters-in-a-model
I have observed this myself as well for the old front-end.
With -d=newInst the front-end is much better, but the
back-end still uses a lot of memory and time to generate the code.
model test
public
  parameter Integer Nmax = 100;
protected
  parameter Real x[Nmax](each fixed=false) "abscissa";
  parameter Real y[Nmax](each fixed=false) "y=f(x)";
  // any variable for my calculation
initial algorithm
  x := linspace(0.0, 14.5, Nmax);
  // In this example f(x) = -1 and there would be no need for an array;
  // obviously I fill y with a more complicated algorithm, not a simple formula...
  y := fill(-1.0, Nmax);
equation
  // any equation for my calculation
end test;
Change History (5)
comment:1 by , 5 years ago
comment:2 by , 5 years ago
I'm not 100% sure, but isn't this the same as #1451, or at least related?
comment:3 by , 5 years ago
@casella: not sure if it is the same as #1451, but it does seem so.
Running valgrind to profile now with Nmax=2000.
For execstat I get:
Notification: Performance of Backend phase and start with SimCode phase: time 0.01732/91.98, allocations: 53.55 kB / 123.6 MB, free: 2.516 MB / 31.91 MB
Notification: Performance of simCode: created initialization part: time 2908/3000, allocations: 17 GB / 17.12 GB, free: 351.9 MB / 1.821 GB
comment:4 by , 5 years ago
While building the incidence matrix we seem to generate a second copy with absolute values for no good reason; do the people working on the backend have any idea why?
This is the actual problem for this ticket.
I disabled the generation via the PR: https://github.com/OpenModelica/OpenModelica/pull/295
and about 99 models fail as a result.
I guess we could just apply intAbs to the values in the incidence matrix when needed (when we read them), and not generate a new matrix with absolute values all the time.
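The trade-off suggested above can be sketched as follows. This is an illustrative Python model only, not the actual OpenModelica backend code: it assumes the incidence matrix is stored as rows of signed integer indices, where only the magnitude identifies the variable and the sign carries extra information, and contrasts materializing a full absolute-value copy with applying abs lazily at each read.

```python
def abs_copy(matrix):
    """The approach being disabled: build a complete second matrix of
    absolute values, costing extra memory proportional to the whole
    incidence matrix."""
    return [[abs(v) for v in row] for row in matrix]

def abs_on_read(matrix, i, j):
    """The suggested alternative (intAbs at the point of use): keep a
    single signed matrix and take the absolute value only when an
    entry is actually read."""
    return abs(matrix[i][j])

# Tiny signed incidence matrix for demonstration.
matrix = [[1, -2], [-3, 4]]

# Both approaches yield the same values for every consumer that only
# needs the magnitudes, so reading lazily avoids the duplicate matrix.
copied = abs_copy(matrix)
assert copied == [[1, 2], [3, 4]]
assert all(abs_on_read(matrix, i, j) == copied[i][j]
           for i in range(2) for j in range(2))
```

The point of the sketch is that the lazy variant trades a cheap abs per read for not holding a second matrix alive, which matters when the matrix is large (as with big parameter arrays).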
comment:5 by , 5 years ago
Cc: added
For example, with Nmax=3000 it takes 74 seconds and 5 GB with master and the flags:
-d=newInst,initialization,execstat
It seems most of the time (70 s) is spent in: simCode: created initialization part.
I remember encountering this before, but with the current front-end, and mostly because constant evaluation and simplification took forever and consumed all the memory (basically trying over and over to determine whether the array is constant).