Do space leaks consume memory differently than valid lazy algorithms?


A space leak is usually defined as a program execution that consumes more space than necessary. Is such a definition compatible with algorithms that require lazy evaluation for amortized time efficiency?

I wonder if the two tend to exhibit different patterns in their thunk structures. For example, a trivial space leak might build a chain of thunks like this:

1 + (2 + 3 + ...)
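
For comparison, here is a minimal sketch of how such a chain can build up (assuming an unoptimised build, since GHC's strictness analysis may remove this particular leak). It uses the classic lazy left fold, which nests the additions the other way around but leaks in the same way:

    import Data.List (foldl')

    -- Lazy left fold: the accumulator grows into a chain of unevaluated
    -- additions, (((0 + 1) + 2) + 3) + ..., proportional to the list length,
    -- even though the final result is a single Integer.
    leaky :: Integer
    leaky = foldl (+) 0 [1 .. 1000000]

    -- Forcing the accumulator at every step keeps the space constant.
    strict :: Integer
    strict = foldl' (+) 0 [1 .. 1000000]

    main :: IO ()
    main = print (leaky, strict)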

Would it be normal for a lazy algorithm like tree search to produce thunk structures of similar size?
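
To make that concrete, here is a hypothetical sketch of the kind of lazy tree search I have in mind: the tree is built on demand from a made-up children function, and only the part of it that the search actually visits is ever forced.

    import Control.Applicative ((<|>))

    data Tree a = Node a [Tree a]

    -- Build a conceptually unbounded tree on demand from a children function.
    expand :: (a -> [a]) -> a -> Tree a
    expand children x = Node x (map (expand children) (children x))

    -- Depth-bounded depth-first search; unvisited subtrees remain thunks.
    dfs :: (a -> Bool) -> Int -> Tree a -> Maybe a
    dfs _    0     _           = Nothing
    dfs goal depth (Node x ts)
      | goal x    = Just x
      | otherwise = foldr (\t acc -> dfs goal (depth - 1) t <|> acc) Nothing ts

    -- Example: search the infinite tree of numbers reachable via *2 and *2+1.
    main :: IO ()
    main = print (dfs (== 12) 10 (expand (\n -> [2 * n, 2 * n + 1]) (1 :: Int)))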

I suspect that proper lazy evaluation patterns would tend to look different, making actual leaks easier to spot. For example, the structure could look like this:

[...] : a : b : c

Here the [...] part is a prefix with no remaining references, so it could already have been garbage collected. In that case the lazy algorithm could well run in O(1) space and there cannot be a space leak, which makes the distinction very clear.
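
As a concrete illustration of that shape (just a sketch, not a benchmark), producing and consuming a long list one element at a time lets the garbage collector reclaim the already-consumed prefix, so the traversal runs in constant space:

    -- Each cons cell is produced on demand and becomes unreachable right
    -- after it is inspected, so the prefix can be collected as we go.
    lastSquare :: Integer
    lastSquare = last (map (^ 2) [1 .. 1000000])

    main :: IO ()
    main = print lastSquare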

I wonder whether that is common, or whether there is a wider spectrum of trade-offs between space and time efficiency in lazy languages.

EDIT: A possible counter-example: persistent structures. Are they easy to distinguish from space leaks?
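
To make that counter-example concrete, here is a rough sketch in the style of Okasaki's banker's queue (not a tuned implementation, just the shape of the idea): suspended list reversals are deliberately retained to get amortized O(1) operations even when old versions of the queue are reused, so a heap profile would show long-lived thunks that are not a leak.

    -- Invariant: the rear list is never longer than the front list.
    data Queue a = Queue Int [a] Int [a]   -- front length, front, rear length, rear

    empty :: Queue a
    empty = Queue 0 [] 0 []

    -- Restore the invariant; 'f ++ reverse r' is a suspended computation
    -- that is only paid for gradually as the front is consumed.
    check :: Queue a -> Queue a
    check q@(Queue lf f lr r)
      | lr <= lf  = q
      | otherwise = Queue (lf + lr) (f ++ reverse r) 0 []

    snoc :: Queue a -> a -> Queue a
    snoc (Queue lf f lr r) x = check (Queue lf f (lr + 1) (x : r))

    uncons :: Queue a -> Maybe (a, Queue a)
    uncons (Queue _  []      _  _) = Nothing
    uncons (Queue lf (x : f) lr r) = Just (x, check (Queue (lf - 1) f lr r))

The retained suspensions here look a lot like a leak in a profile, which is exactly why I am asking whether the two are distinguishable in practice.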
