According to the SPARK 2014 documentation, one is not allowed to handle exceptions in SPARK code.
With verification, most run-time errors can be proven absent from a program, but exceptions like Storage_Error cannot be excluded.
Since Storage_Error can occur on any procedure/function call, or when dynamically allocating memory using new (please correct me if I am wrong about that), catching and handling this exception in a code section with SPARK_Mode => Off would only be valid at the highest level of the program (its entry point). I really don't like this approach, since you lose nearly every possibility to react to such an exception.
What I would like to do:
Assume I create an unbounded tree with a procedure Add. Inside this procedure I would like to check whether there is enough space on the heap to create a new node in the tree.
If there is, memory for the node is allocated and it is added to the tree; otherwise an out parameter could be set to some kind of failure flag.
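What I have in mind is something like this hypothetical spec (the names Tree, Add and Success are mine, just to illustrate the idea):

```ada
--  Hypothetical interface sketch: Tree is assumed to be an
--  unbounded tree type defined elsewhere.
procedure Add
  (T       : in out Tree;
   Value   : in     Integer;
   Success :    out Boolean);
--  Success = False would mean that there was not enough heap
--  memory to allocate a new node, and T is left unchanged.
```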
I have searched the SPARK User's Guide, but could not find how this should be handled; it only says that the programmer has to make sure that enough memory is available, not how one could do that.
How could one handle this kind of exception in SPARK?
SPARK indeed cannot prove (guarantee) the absence of storage errors, as these errors are raised from outside the scope of the program. This holds both for a failing heap allocation and for exhausted stack space.
One may, however, cheat a little by excluding the allocation routine from SPARK analysis, as shown in the example below. The allocation function New_Integer has postconditions that SPARK can use when analyzing the pointer, but the function's body is excluded from analysis. This allows a Storage_Error to be handled. Of course, care must now be taken that the body actually conforms to its specification: the Ptr field must not be null when Valid is true. SPARK only assumes this is true; it does not verify it.

Note: All pointer dereferences and the absence of memory leaks can be proven using GNAT CE 2021. However, it would have been nice to actually set the Valid discriminant to False during Free and use a postcondition like Post => P.Valid = False. This, unfortunately, makes SPARK complain about possible discriminant check failures.

Update (3-jun-2021)
I updated the example based on the hints of @YannickMoy (see below).
Free now ensures that the weak pointer's Valid discriminant is set to False on return.

main.adb
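The original listing is not reproduced here; a minimal main program consistent with the description might look like this (the Put_Line messages are illustrative):

```ada
with Ada.Text_IO; use Ada.Text_IO;
with Test;        use Test;

procedure Main with SPARK_Mode is
   P : Weak_Int_Ptr := New_Integer (42);
begin
   if P.Valid then
      --  The postcondition of New_Integer guarantees Ptr /= null here.
      Put_Line ("Allocated:" & Integer'Image (P.Ptr.all));
      Free (P);
   else
      --  Allocation failed; react gracefully instead of crashing.
      Put_Line ("Heap exhausted, no value allocated.");
   end if;
end Main;
```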
test.ads
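As the listing itself is missing here, the following is a sketch of a spec matching the description: a discriminated Weak_Int_Ptr record whose Ptr component only exists when Valid is True, plus the postconditions SPARK relies on.

```ada
package Test with SPARK_Mode is

   type Int_Ptr is access Integer;

   --  "Weak" pointer: the Ptr component exists only when Valid is True.
   type Weak_Int_Ptr (Valid : Boolean := False) is record
      case Valid is
         when False => null;
         when True  => Ptr : Int_Ptr;
      end case;
   end record;

   --  Returns Valid = False instead of raising Storage_Error.
   function New_Integer (Number : Integer) return Weak_Int_Ptr
     with Post => (if New_Integer'Result.Valid
                   then New_Integer'Result.Ptr /= null);

   --  Deallocates (if needed) and sets Valid to False on return.
   procedure Free (P : in out Weak_Int_Ptr)
     with Post => P.Valid = False;

end Test;
```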
test.adb
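Again a sketch rather than the original listing: the package body is excluded from analysis with SPARK_Mode => Off, which makes the exception handler legal while SPARK keeps using the postconditions from the spec.

```ada
with Ada.Unchecked_Deallocation;

package body Test with SPARK_Mode => Off is
   --  Exception handlers are not allowed in SPARK code, so the handler
   --  below must be hidden from the analyzer.

   function New_Integer (Number : Integer) return Weak_Int_Ptr is
   begin
      return (Valid => True, Ptr => new Integer'(Number));
   exception
      when Storage_Error =>
         return (Valid => False);
   end New_Integer;

   procedure Free (P : in out Weak_Int_Ptr) is
      procedure Deallocate is new Ada.Unchecked_Deallocation
        (Object => Integer, Name => Int_Ptr);
   begin
      if P.Valid then
         Deallocate (P.Ptr);
      end if;
      P := (Valid => False);   --  fulfil the postcondition
   end Free;

end Test;
```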
output (gnatprove)
Summary (added by OP)
The provided code helps to prevent Storage_Error from dynamic allocation using the new keyword. As the absence of infinite recursion can already be proven by SPARK (as mentioned in the comments; see here), the only remaining issue that can result in a Storage_Error is a program that needs more stack during normal execution than is available. This, however, can be monitored and also determined at compile time by tools like GNATstack (also mentioned in the comments; see here).