Performance cost of dictionaries?

The set of required paths is a subset of the power set L of the given set of links, and L is too big for an exhaustive search. If I remember correctly, an exhaustive search for rect(3, 3) did not terminate within one hour.
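
For illustration only, here is a minimal sketch of that kind of power-set search, assuming the links are given as a plain list of A-B pairs; the predicate names are mine, not from library(pac/zdd/vecter-frontier.pl). It only shows why the approach blows up: all 2^|Links| subsets are generated and tested.

link_subset([], []).
link_subset([L|Ls], [L|Ss]) :- link_subset(Ls, Ss).
link_subset([_|Ls], Ss)     :- link_subset(Ls, Ss).

% A subset is a simple S-T path iff its links can be ordered into a chain
% from S to T that uses every link exactly once and never revisits a node.
simple_path(Links, S, T) :-
    once(chain(S, T, [S], Links)).

chain(T, T, _, []).
chain(S, T, Seen, Links) :-
    (   select(S-U, Links, Rest)
    ;   select(U-S, Links, Rest)
    ),
    \+ memberchk(U, Seen),
    chain(U, T, [U|Seen], Rest).

% Count simple S-T paths by brute force over all non-empty link subsets.
naive_path_count(Links, S, T, Count) :-
    aggregate_all(count,
                  ( link_subset(Links, Sub),
                    Sub \== [],
                    simple_path(Sub, S, T)
                  ),
                  Count).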

Plotting the memory size for rect(n, n) does not seem difficult using the code in library(pac/zdd/vecter-frontier.pl). For now, for rect(4,4), each line N-S below means that when node N is added, the number of entries in the "main vector" is S.

% ?- time(rect_path_count(rect(4,4), C)).
%@ 25-1
%@ 24-2
%@ 23-5
%@ 22-11
%@ 21-23
%@ 20-47
%@ 19-87
%@ 18-175
%@ 17-291
%@ 16-454
%@ 15-608
%@ 14-586
%@ 13-722
%@ 12-851
%@ 11-984
%@ 10-1044
%@ 9-1006
%@ 8-1348
%@ 7-1270
%@ 6-1522
%@ 5-1246
%@ 4-1271
%@ 3-1123
%@ 2-1132
%@ 1-988
%@ done
%@ % 3,773,368 inferences, 0.888 CPU in 0.960 seconds (93% CPU, 4248135 Lips)
%@ C = 8512.
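
If one wanted to plot those N-S pairs, a sketch like the following could dump them to a CSV file for an external plotting tool; it assumes the pairs have been collected into a list, and write_size_csv/2 is a hypothetical helper, not part of the library.

% write_size_csv(+Pairs, +File): write each N-S pair as an "N,S" line.
write_size_csv(Pairs, File) :-
    setup_call_cleanup(
        open(File, write, Out),
        forall(member(N-S, Pairs),
               format(Out, "~d,~d~n", [N, S])),
        close(Out)).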

BTW, this run also suggests that the frequent manual calls to garbage_collect in my code are working, though I am not sure whether physical memory is actually freed.
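
For what it is worth, one way to observe the effect of such a call in SWI-Prolog is to compare statistics/2 before and after garbage_collect/0; note that this reports Prolog global-stack usage, not memory returned to the operating system (report_gc/0 is just an illustrative name).

% Report global-stack usage around an explicit garbage collection.
report_gc :-
    statistics(globalused, Before),
    garbage_collect,
    statistics(globalused, After),
    Freed is Before - After,
    format("global stack: ~D -> ~D bytes (freed ~D)~n",
           [Before, After, Freed]).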