The backend segfaults when it is given an array of constants with dimension
larger than 32.
To trigger the segfault, run model checking from the GUI or invoke
verifyta -N array_of_constants.xml array_of_constants.q
Created attachment 90
A tiny model containing an array with dimension 33
I can confirm the segfault on the attached model. However, it does not segfault with a one-dimensional
array of dimension > 32, but with an array bigger than 32x32 (a multi-dimensional array), which is what
the attached model contains.
Further clarification: it also crashes with an array of non-constants.
A multi-dimensional array is not necessary to trigger the problem: an array of integers of size 1021 is
enough. It also turns out that an initialiser is not required to cause the crash.
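For reference, a global declaration along these lines should be enough to reproduce the crash. This is an illustrative sketch of such a declaration (the identifier `a` is invented), not the attached model itself:

```c
// Sufficient to trigger the crash: a large integer array,
// no initialiser needed.
int a[1021];
```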
More info: only on the latest trunk versions does a model with a large array of integers and no
initialiser cause a crash. The problem seems to be a stack overflow in the virtual machine: the initialiser
(either given by the user or auto-generated) is evaluated and the result is stored on the VM stack. Two things
should be done:
1. The stack should be enlarged to cope with larger arrays (1020 is too small).
2. Detection of stack overflow must be added.
I have added code to estimate the correct stack size. This should solve the problem.