Prolog texts used in Garbage Collection benchmarking

Now that it is Ozzy Osbourne's birthday, we can pick
his songs from YT and go down memory lane.

What about the Prolog texts used in Garbage Collection
benchmarking? Are some of the Prolog texts mentioned in
this paper available somewhere:


https://www.swi-prolog.org/download/publications/lifegc.pdf

I am having a hard time finding Back52, k123 and pgolf.

I do have Back52 lying around. The license is pretty unclear though and the code contains copies of some Quintus libraries, so I’m afraid I can’t share it. I must have had the other two, but I can’t find them on my current machine.

Maybe it's the same as this one here?

BACK - Berlin Advanced Computational Knowledge Representation System
Version 5.2 - September 1993
https://www.cs.cmu.edu/afs/cs.cmu.edu/project/ai-repository-9/ai/util/areas/kr/systems/kl_one/back/

The link points to a folder containing back52.tgz.

It definitely has the same origin. We may have different versions. I used the test suite as one of the tests for a long time. It is a better test for the dynamic database than for GC :slight_smile:

Is there a specific paper recommended for SWI-Prolog that discusses the
dynamic database? Maybe also one with a practical part, i.e., some
benchmarking? My favorite test case so far:

:- dynamic(foo/2).

test :-
   retractall(foo(_,_)),
   warehouse(1000000, 1, 18884).

% Assert a clause keyed on the Y stream and retract one keyed on the
% X stream (if present), keeping the clause set highly volatile.
warehouse(0, _, _) :- !.
warehouse(N, X, Y) :-
   assertz(foo(Y, bar)),
   (retract(foo(X, bar)) -> true; true),
   zx81(X, Z), zx81(Y, T),
   M is N-1,
   warehouse(M, Z, T).

% Simple linear congruential pseudo random generator (ZX81-style).
zx81(X, Y) :- Y is (X*75+74) mod 65537.

SWI-Prolog is quite fast:

/* SWI-Prolog 9.3.14 */
?- time(test).
% 7,001,001 inferences, 1.203 CPU in 1.246 seconds (97% CPU, 5819014 Lips)
true.

Unlike Scryer Prolog:

/* Scryer Prolog 0.9.4-210 */
?- time(test).
   % CPU time: 93.184s, 1_047_505_534 inferences
   true.

No.

That is probably a reason: the SWI-Prolog compiler is written in C, so we have a lot fewer inferences :slight_smile: The other aspect is that the VM is designed such that we can use the compiler for dynamic clauses. De-compilation (clause/2, retract/1) is fairly cheap and we get the whole indexing machinery for free. The only difference between dynamic and static code is that we skip optimization when compiling as dynamic and do not look for specific patterns for fast indexing (e.g., two clauses where one has [] and the other [_|_] as the first argument are handled specially). And, of course, there is a flag that prevents modifying static code.
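
As a small illustration of the decompilation point (my own sketch, not from the post; scale/2 is a made-up example predicate): a clause added with assertz/1 is compiled, and clause/2 decompiles it back into head and body terms, so asserted code goes through the same machinery as static code.

/* SWI-Prolog; assertz/1 makes scale/2 dynamic and compiles the clause */
?- assertz((scale(X, Y) :- Y is X*10)).
true.

/* clause/2 decompiles the compiled clause back into a source term */
?- clause(scale(A, B), Body).
Body = (B is A*10).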

Edit: there are issues, though. The dynamic DB is first of all optimized to deal with large clause sets with relatively few modifications. Using it with highly volatile clause sets may lead to poor performance if the asynchronous clause garbage collector cannot keep up with the amount of garbage clauses created. In that case, accessing the dynamic predicate finds a lot of garbage clauses that it needs to skip. Using global variables or mutable structures is much faster in these cases.
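
To make the last remark concrete, here is a rough sketch of my own (not from the post; counter_db/1, counter_gvar/1 and the count key are made-up names) contrasting a highly volatile dynamic predicate with SWI-Prolog's global variables, nb_setval/2 and nb_getval/2:

:- dynamic(count/1).

% Dynamic-database version: every update retracts one clause and asserts
% another, so each iteration leaves a garbage clause for clause GC.
counter_db(N) :-
    retractall(count(_)),
    assertz(count(0)),
    forall(between(1, N, _),
           (   retract(count(C)),
               C1 is C+1,
               assertz(count(C1))
           )).

% Global-variable version: a single slot updated in place, no clause garbage.
counter_gvar(N) :-
    nb_setval(count, 0),
    forall(between(1, N, _),
           (   nb_getval(count, C),
               C1 is C+1,
               nb_setval(count, C1)
           )).

For this access pattern every iteration of counter_db/1 leaves a dead clause behind for clause GC to reclaim, while counter_gvar/1 updates one slot destructively and creates no clause garbage at all.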