- Index
- » Programming
- » Modelica Language
- » RAM problem with very large arrays...
RAM problem with very large arrays used as parameters in a model
Good morning,
I need to solve a problem with Modelica. I have reduced the code to a minimum, keeping only the part that causes the trouble.
------------------------------------------------------------------
model test
  parameter Integer Nmax = 100;
protected
  parameter Real x[Nmax](each fixed = false) "abscissa";
  parameter Real y[Nmax](each fixed = false) "y = f(x)";
  // any variables for my calculation
initial algorithm
  x := linspace(0.0, 14.5, Nmax);
  // In this example f(x) = -1, so an array would not even be needed;
  // in reality I fill y with a more complicated algorithm, not a simple formula.
  y := fill(-1.0, Nmax);
equation
  // any equations for my calculation
end test;
------------------------------------------------------------------
In a model I would like to build, I need a function given by x-y points as a parameter, and there are many points. The problem is that up to Nmax = 1000 the program runs, but already with 10000 points OMEdit occupies 22 GB of RAM and then stops. Is this normal? Two arrays of 10000 Reals cannot possibly occupy that much RAM. It is as if thousands and thousands of temporary instances were being created. Is there no way to allocate them the way other languages do with "static" storage (shared by all instances), or is it a compiler bug? I use OMEdit v1.12.0 (64-bit) on Windows 7 SP1.
If this is normal behavior, is there a way to do this without wasting all that RAM?
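For reference, one formulation sometimes worth trying is to replace the `fixed = false` parameters and the `initial algorithm` section with ordinary binding equations, computing y through a function. Whether this avoids the memory blow-up described above is not verified here; the model and function names are illustrative only.
------------------------------------------------------------------
model test2
  parameter Integer Nmax = 10000;
protected
  // bindings instead of fixed=false + initial algorithm
  parameter Real x[Nmax] = linspace(0.0, 14.5, Nmax);
  parameter Real y[Nmax] = fillY(Nmax);

  function fillY "stand-in for the real, more complicated algorithm"
    input Integer n;
    output Real y[n];
  algorithm
    y := fill(-1.0, n);
  end fillY;
end test2;
------------------------------------------------------------------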
Thanks to everyone for the help.
Paolo Ferraresi, Italy.
Re: RAM problem with very large arrays used as parameters in a model
This is most likely a bug. I opened a ticket about it:
https://trac.openmodelica.org/OpenModelica/ticket/5559
- adrpo