Absolute worst case stack size based on automatic variables


In a C99 program, under the (theoretical) assumption that I'm not using variable-length arrays, and each of my automatic variables can only exist once at a time in the whole stack (by forbidding circular function calls and explicit recursion), if I sum up all the space they are consuming, could I declare that this is the maximal stack size that can ever happen?

A bit of context here: I told a friend that I wrote a program not using dynamic memory allocation ("malloc") and allocate all memory static (by modeling all my state variables in a struct, which I then declared global). He then told me that if I'm using automatic variables, I still make use of dynamic memory. I argued that my automatic variables are not state variables but control variables, so my program is still to be considered static. We then discussed that there has to be a way to make a statement about the absolute worst-case behaviour about my program, so I came up with the above question.

Bonus question: If the assumptions above hold, I could simply declare all automatic variables static and would end up with a "truly" static program?

3 Answers

Accepted answer

Even if array sizes are constant, a C implementation could allocate arrays and even structures dynamically. I'm not aware of any implementation that does, and it would appear quite unhelpful, but the C Standard makes no such guarantee.

There is also (almost certainly) some further overhead in the stack frame (the data pushed onto the stack on call and released on return). You would need to declare all your functions as taking no parameters and returning void to ensure no program variables end up on the stack. Finally, the return address (where execution continues after the function returns) is pushed onto the stack, at least logically.

So, having moved all parameters, automatic variables and return values into your 'state' struct, there will still (probably) be something going onto the stack.

I say probably because I'm aware of a (non-standard) embedded C compiler that forbids recursion and can determine the maximum stack size by examining the call tree of the whole program, identifying the call chain that reaches the peak stack size.

You could achieve this with a monstrous pile of goto statements (some conditional, where a function is logically called from two places), or by duplicating code.

It's often important in embedded code on devices with tiny memory to avoid any dynamic memory allocation and know that any 'stack-space' will never overflow.

I'm happy this is a theoretical discussion. What you suggest is a mad way to write code: it would throw away most of the (ultimately limited) services C provides as infrastructure for procedural coding (pretty much the call stack).

Footnote: See the comment below about the 8-bit PIC architecture.

Another answer

Bonus question: If the assumptions above hold, I could simply declare all automatic variables static and would end up with a "truly" static program?

No. This would change the behaviour of the program, because static variables are initialized only once. Compare these two functions:

int canReturn0Or1(void)
{
  static unsigned a=0;
  a++;
  if(a>1)
  {
    return 1;
  }
  return 0;
}

int willAlwaysReturn0(void)
{
  unsigned a=0;
  a++;
  if(a>1)
  {
    return 1;
  }
  return 0;
}
Another answer

In a C99 program, under the (theoretical) assumption that I'm not using variable-length arrays, and each of my automatic variables can only exist once at a time in the whole stack (by forbidding circular function calls and explicit recursion), if I sum up all the space they are consuming, could I declare that this is the maximal stack size that can ever happen?

No, because of function pointers. Read n1570 (a draft of the C standard).

Consider the following code, where rand(3) is some pseudo-random number generator (it could also be some input from a sensor):

#include <stdlib.h>   /* for rand */

typedef int foosig(int);

int foo(int x)
{
   foosig *fptr = (x > rand()) ? &foo : NULL;
   if (fptr)
     return (*fptr)(x);
   else
     return x + rand();
}

An optimizing compiler (such as a recent GCC suitably invoked with enough optimizations) would turn (*fptr)(x) into a tail call; some other compiler won't.

Depending on how you compile that code, it would use a bounded stack or could produce a stack overflow. With some ABIs and calling conventions, both the argument and the result could go through a processor register and would not consume any stack space.

Experiment with a recent GCC (e.g. on Linux/x86-64, some GCC 10 in 2020) invoked as gcc -O2 -fverbose-asm -S foo.c then look inside foo.s. Change the -O2 to a -O0.

Observe that the naive recursive factorial function can be compiled into iterative machine code by a good enough C compiler and optimizer. In practice, GCC 10 on Linux compiling the code below:

int fact(int n)
{
  if (n<1) return 1;
  else return n*fact(n-1);
}

as gcc -O3 -fverbose-asm tmp/fact.c -S -o tmp/fact.s produces the following assembler code:

        .type   fact, @function
fact:
.LFB0:
        .cfi_startproc
        endbr64
# tmp/fact.c:3:   if (n<1) return 1;
        movl    $1, %eax    #, <retval>
        testl   %edi, %edi  # n
        jle     .L1         #,
        .p2align 4,,10
        .p2align 3
.L2:
        imull   %edi, %eax  # n, <retval>
        subl    $1, %edi    #, n
        jne     .L2         #,
.L1:
# tmp/fact.c:5: }
        ret
        .cfi_endproc
.LFE0:
        .size   fact, .-fact
        .ident  "GCC: (Ubuntu 10.2.0-5ubuntu1~20.04) 10.2.0"

And you can observe that the call stack does not grow in the code above: the recursion has become a loop.

If you have serious and documented arguments against GCC, please submit a bug report.

BTW, you could write your own GCC plugin which would randomly choose whether to apply such an optimization. I believe it would stay conforming to the C standard.

The above optimization is essential for many compilers generating C code, such as Chicken Scheme or Bigloo.

A related theorem is Rice's theorem. See also this draft report funded by the CHARIOT project.

See also the Compcert project.