Why is there a significant performance drop when representing a 16-bit signed integer variable as an interval (-32768..32767) compared to fixed-width bit arrays (word types)?
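For reference, a minimal model showing the two declarations I am comparing (module and variable names are just placeholders):

    MODULE main
    VAR
      -- interval type: appears to be converted to an enumeration during preprocessing
      x : -32768..32767;
      -- fixed-width word type: encoded as 16 Boolean variables in the BDD
      w : signed word[16];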
Inspecting the pre-processed NuSMV/NuXMV model, one can observe that the interval types are converted to enumerations.
The BDD statistics, however, do not show any relevant information.