From dynamic to static predicates and vice versa?

Hello,

The following is a template for the program I am developing.

:- dynamic special_predicate/some_arity.

program :-
    do_little_work_that_involves_using_assert_retract_over_the_special_predicate,
    compile_predicates([special_predicate/some_arity]),
    do_a_lot_of_work_that_does_not_need_the_special_predicate_to_be_dynamic,
    % make special_predicate/some_arity dynamic again
    do_little_work_that_involves_using_assert_retract_over_the_special_predicate.

Is there any way to make a predicate dynamic after it has been compiled? If not, is there a better approach to structuring my program to maximize performance by sandwich-compiling my “special_predicate” instead of interpreting it during the heavy-duty evaluation part?

Thanks,
Amine.

For the last part: AFAIK, once you compile a predicate you cannot make it dynamic again.

For the first part: I have never taken a dynamic predicate and then used compile_predicates/1 on it, but I don’t see why that would not work.

You might be able to put the predicates into some sort of container (thread, trie, stack, module) and then have some means to control access to the desired predicates in the container.

See: Database, Multithreaded applications, modules

A similar concept used in the Lambda Calculus that might be of value is the De Bruijn index. If that term is unfamiliar, don’t follow up on it, as it will probably cause more confusion than help; but if you do understand it, it might give you an insight into how to solve your problem.

Why not have two predicates – one dynamic and one compiled?
One clause of the dynamic predicate just calls the compiled predicate (or vice versa).
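A minimal sketch of this two-predicate idea (the names rule/1 and fast_rule/1 are made up for illustration): the static part is compiled normally, and the dynamic predicate keeps one clause that delegates to it.

```prolog
% Static (compiled) part of the knowledge:
fast_rule(a).
fast_rule(b).

% Dynamic part; one clause bridges to the compiled predicate.
:- dynamic rule/1.
rule(X) :- fast_rule(X).

% At run time you can assert/retract clauses of rule/1 freely:
% ?- assertz(rule(c)), rule(c).
```

The heavy-duty code calls rule/1 throughout; only the small asserted set is dynamic, while the bulk of the clauses stays compiled.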

One way of creating a predicate and then compiling it (I haven’t verified this): write the clauses to a temp file, abolish the predicate, then consult the file. Even better, can you split things into two separate programs? (E.g.: program1 generates predicates that program2 uses. This has the advantage that you only run program1 when its inputs change.)
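The temp-file round trip could look roughly like this. This is an unverified sketch, as noted above; recompile/1 is a hypothetical helper, and it assumes SWI-Prolog’s tmp_file_stream/3 and portray_clause/2.

```prolog
% Dump the current (dynamic) clauses of Name/Arity to a temp file,
% abolish the predicate, and reload it as static compiled code.
recompile(Name/Arity) :-
    functor(Head, Name, Arity),
    tmp_file_stream(text, File, Out),
    forall(clause(Head, Body),
           portray_clause(Out, (Head :- Body))),
    close(Out),
    abolish(Name/Arity),
    consult(File).
```

After consult/1 the predicate is ordinary static code, so assert/retract on it will raise an error until it is declared dynamic again.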

In the current implementation of SWI-Prolog there is internally no difference between dynamic and static code. Once upon a time dynamic predicates required locking for access by multiple threads. That has gone. Predicates, both dynamic and static, are now executed purely read-only. As a result, compile_predicates/1 is close to a no-op, simply removing the dynamic flag, and dynamic/1 does the opposite. Both can be used on an existing predicate, and the only effect is that retract/1 and assert/1 raise an exception on static code.
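A minimal demonstration of this toggling, assuming SWI-Prolog (fact/1 is just an example name):

```prolog
:- dynamic fact/1.

demo :-
    assertz(fact(1)),
    compile_predicates([fact/1]),   % clears the dynamic flag
    % assertz(fact(2)) would now raise a permission error
    dynamic(fact/1),                % restores the dynamic flag
    assertz(fact(2)).               % allowed again
```

So the “sandwich” from the original question works as written: flip the flag off for the heavy-duty phase and back on afterwards.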

Well, there is one small exception: the system will try to find dedicated clause-indexing schemas for static predicates with only a few clauses. The two most notable examples are predicates with exactly one clause and predicates switching on [] vs [_|_]. More patterns are likely to follow. Dynamic code still gets the normal JIT indexing aimed at many clauses, as well as simple first-argument indexing.
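For concreteness, the [] vs [_|_] switch pattern mentioned above is the shape of a typical list predicate such as this one (my example, not from the thread):

```prolog
% First argument is either [] or [_|_]; a dedicated index can jump
% straight to the matching clause without trying the other.
list_len([], 0).
list_len([_|T], N) :-
    list_len(T, M),
    N is M + 1.
```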

I’m not claiming there will never be differences. I don’t see any reason to re-introduce differences though.


Thanks for the help everybody. Your suggestions are priceless!

I conclude that the execution time of compiled SWI-Prolog code (for a given input vector) is not affected by whether or not predicates are declared dynamic, and therefore there is no point in trying to compile dynamic predicates during execution to enhance performance.


Jan W. wrote:

Well, there is one small exception: the system will try to
find dedicated clause indexing schemas for static predicates
with a few clauses only for static code.

I think this is measurable. For example, naive reverse shows a kind of statistical outlier: it is faster than the usual SWI-Prolog clauses. But I am not 100% sure. If you tell me that a dynamic directive switches off this optimization, I could compare the static and dynamic variants to get empirical data on whether these optimizations are worth the dime. Somehow the strategy is quite good: nobody cares if compiling static predicates takes more time than compiling dynamic predicates. On the other hand, a slower assertz would not be welcome.

Is there a way to figure out whether these special optimizations for static predicates with small clause sets were applied? Some clause property or predicate property? For example, is this code for the Ackermann function subject to the special optimization that only static predicates receive?

ack(0, N, X) :- !, X is N+1.
ack(M, 0, X) :- !, H is M-1, ack(H, 1, X).
ack(M, N, X) :- H is M-1, J is N-1, ack(M, J, Y), ack(H, Y, X).
?- vm_list(ack/3).

First call it to materialize the supervisor. This code, though, behaves exactly the same whether static or dynamic; there is no special optimization. Only when M \= 0 will it skip the first clause, thanks to first-argument indexing.
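A session sketch of that inspection (assuming SWI-Prolog; vm_list/1 shows the compiled VM code, and jiti_list/1, if your version provides it, lists the JIT-created index tables):

```prolog
?- ack(2, 2, X).
X = 7.

?- vm_list(ack/3).      % dump the virtual machine code
?- jiti_list(ack/3).    % show just-in-time index tables, if any
```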

Yeah, indexing also helps. But, as you already wrote, this is the same for static and dynamic code when nothing dedicated is done.

Somehow I have the feeling my own indexing is a kind of multiway trie. It is multiway since different call patterns lead to different indexes, and it is a trie since it uses prefixes. Prefixes are handled in my trie in that a hash-table entry can point to further hash tables. This is what I get JIT-ed for ack/3:

?- ack(2,2,X).
X = 7

?- dump.
-------- ack/3 ---------
length=3
at=0
  key=0, length=3
    at=1
      key=0, length=3
      nonguard, length=2
  nonguard, length=2
    at=1
      key=0, length=2
      nonguard, length=1

So if N \== 0 and M \== 0, it jumps directly to the third clause. But now I have other new ideas for how I could squeeze the lemon one step further: variable stealing during last-call optimization instead of only variable garbage collection.

In the second clause, the variables M and H will be garbage-collected away. And in the third clause it is even the variables M, N, H, J and Y that are garbage-collected during last-call optimization. But garbage collection means returning the variable to the pool.

We could give the variable space directly to the new clause without a detour through the pool. My pool is currently the Java heap, and by returning variables to the pool I keep the Java GC busy. With variable stealing, maybe something could be gained.

Does the SWI-Prolog dedicated index also do something with variables?