Memory accumulation on backtracking

I’m searching a large graph in a depth-first fashion, and the memory usage accumulates until I run out of memory. I don’t really understand why. Here is some simple code for a depth-first search to a finite depth:

% Walk the graph depth-first; succeed with the node reached when the
% depth budget hits zero, never revisiting a node already on the path.
walk(0, [Node|_], Node) :- !.
walk(N0, [Node|T], Final) :-
    N is N0 - 1,
    connect(Node, Next),
    \+ memberchk(Next, [Node|T]),   % checking [Node|T] also blocks self-loops
    walk(N, [Next,Node|T], Final).

which I invoke like this, with the aim of counting all the final nodes:

aggregate_all(count, walk(64, [Start], Final), Count).
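For reference, a tiny connect/2 graph of the kind I test with (made-up node names; the real graph is far larger):

connect(a, b).
connect(a, c).
connect(b, c).
connect(c, a).
connect(c, d).

With these facts, walk(2, [a], Final) should yield Final = c and Final = d on backtracking; the real runs use a much greater depth, which is where the memory goes.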

Why does it run out of memory? How can I write it so that it does not?

Would you mind providing a short example of how I could count the unique final nodes using mode-directed tabling?

An alternative is to avoid backtracking altogether, putting cuts all over the place and managing the stack myself, but that seems to defeat the point of using Prolog…

OK, managing the stack myself looks neater to be honest, so I’ll just do that. I also strongly suspect that tabling would blow up the memory anyway.

How would I do that with the walk predicate above to count the unique final states? A final state is a function of the path that led to it. How would I avoid tabling the paths themselves, so that memory does not blow up?
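For what it’s worth, my rough idea is an explicit stack of Depth-Path frames, with the finals accumulated in an ordered set so that only the set of final nodes is retained, never the paths. This is an untested sketch, and count_finals/3 and pop_all/3 are names I just made up:

:- use_module(library(ordsets)).

% Depth-first traversal over an explicit stack of Depth-Path frames.
count_finals(Depth, Start, Count) :-
    pop_all([Depth-[Start]], [], Finals),
    length(Finals, Count).

pop_all([], Finals, Finals).
pop_all([0-[Node|_]|Frames], Acc0, Finals) :-
    !,
    ord_union(Acc0, [Node], Acc),        % record a unique final node
    pop_all(Frames, Acc, Finals).
pop_all([N0-[Node|T]|Frames], Acc, Finals) :-
    N is N0 - 1,
    findall(N-[Next,Node|T],
            ( connect(Node, Next),
              \+ memberchk(Next, [Node|T]) ),
            Children),
    append(Children, Frames, Rest),      % prepend children: depth-first order
    pop_all(Rest, Acc, Finals).

Does something like this actually avoid the blow-up, or am I just moving the problem into the frames on the stack?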

Marvelous, thank you. I’ll try it.