Is this a good pure way to deal with side-effects?

I was looking for a way to manage side-effects but hopefully without using non-logical predicates like assert and retractall. Basically, I’m trying to mimic this Python code:

x = 3
x = 'something-else'
x = 5

I’ve come up with this code in Prolog:


% get_value(Value, History)
% History tracks the values "assigned" so far.
% Values are wrapped in an arbitrary term t/1 so that an unbound logical
% variable can itself be stored in the list.
get_value(Val, List) :-
    nonvar(List),               % fail when nothing has been added yet
    nextto(t(Val), Var, List),  % t(Val) is immediately followed by ...
    var(Var),                   % ... the first free spot, so Val is the latest value
    !.

available_spot(X, History) :-
    once((member(X, History), var(X))).

add_value(Val, History) :-
    available_spot(X, History),
    t(Val) = X.

Some example outputs:

% add one value for variable and read it back:
?- add_value(abc, History), get_value(Current, History).
History = [t(abc), _5110|_5112],
Current = abc.

% works for multiple "re-assignments" to same variable:
?- add_value(3, History), get_value(FirstWas, History), add_value('my-symbol', History), get_value(SecondWas, History), add_value(2, History), add_value(AVar, History), get_value(LatestIs, History), LatestIs = xyz.
History = [t(3), t('my-symbol'), t(2), t(xyz), _1946|_1948],
FirstWas = 3,
SecondWas = 'my-symbol',
AVar = LatestIs, LatestIs = xyz.

% fails when no value was previously set (as intended)
?- get_value(X, History).
false.

So, it seems to be doing the trick! We can even go back to previous states if needed. But I don’t recall seeing anything like this from experts! So I’d like your advice:

  1. Does this idea make any sense?
  2. How do you manage state if you absolutely want to maintain logical purity?
  3. How would you improve this? (e.g., maybe member/2 isn’t the fastest)

I’ll keep this gist updated in the future: Pure way for tracking state of variables in Prolog · GitHub

Perhaps you have something like this in mind:

For compiling something like this nowadays, the code would typically be transformed into “static single assignment (SSA) form”, so internally it’d become

x1 = 3
x2 = 'something-else'
x3 = 5

(There’s also a φ operator for joining if-then-else’s and loops)

As a matter of good engineering practice, I’ve found that with imperative languages, keeping as close as possible to SSA form is beneficial (in fact, I recently fixed a subtle bug in some Python code that was due to reuse of a variable).

Without a clear purpose for the code you showed, it is impossible to give useful advice.

The knee-jerk response to this is, “don’t do it”. But you must have a reason to do it; so what is it? The code you show looks a bit like a stack?

What kind of side-effect do you have in mind? Setting a variable to a value is not strictly a side effect, unless a change in the value triggers a side effect of some kind?

Could it be that you would like to keep some state?

Have you tried backtrackable assignment using setarg/3 for your purpose?

Thanks Daniel. I’m not familiar with Logtalk but it seems to do the equivalent, right? Is that approach better than using member/2?

Thanks, Peter.

I’ve found that with imperative languages, keeping as close as possible to SSA form is beneficial

I only meant that as an example of side-effects from the user’s point of view.

Boris,

Without a clear purpose for the code you showed

Yeah I should have been more specific.
What I had in mind is something like implementing a memory simulator where you’d need to change variables’ values often (or something like implementing a virtual machine, ignoring memory efficiency).
I was hoping to find a pure logical way to handle such cases.

Have you tried using backtrackable assignment using setarg/3

I wasn’t aware of that. It says it’s extra-logical but supports backtracking. I’d like to follow the conventional wisdom of maintaining declarative and pure Prolog code. I will look into it today.

Hi,

I once rewrote the Logtalk code for SWI-Prolog – essentially by turning the object into a module, its equivalent.

re: member/2

Not sure how you mean to use member/2 – but, I guess, the variable manages the history in a list, so it’s likely similar.

Dan

Are those “variable values” locations in (addressable) memory? Or are they registers? Or a stack?

Yeah like registers and memory but please treat my example as a hypothetical.

:slight_smile: the issue with hypothetical questions is that you can’t really get concrete answers. Some people react by writing ad-hoc monographs, but I am not too fast at typing so that is too difficult for me.

I am not completely sure what you mean by “pure logical way”. Some would claim that using a data structure like the AVL-tree implementation in library(assoc) is purely logical. It is a key-value store, and you change a value at a key without too much typing. Take a look at it.
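
For example (a quick sketch; demo/1 is just an illustrative name), the Python assignments from the top of the thread could become successive versions of the tree:

:- use_module(library(assoc)).

% Each put_assoc/4 produces a new version of the store; the older versions
% stay available, much like the History list in the original post.
demo(V) :-
    empty_assoc(S0),
    put_assoc(x, S0, 3, S1),
    put_assoc(x, S1, 'something-else', S2),
    put_assoc(x, S2, 5, S3),
    get_assoc(x, S3, V).    % V = 5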

For an “array” which is addressed by an offset (not by an arbitrary term) I would prefer a flat term, and setarg/3 or even nb_setarg/3 or even nb_linkarg/3.
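
A tiny sketch of that idea (regs/4 here is just an illustrative term standing in for four registers):

?- Regs = regs(0, 0, 0, 0),
   setarg(2, Regs, 42),     % destructive update, undone on backtracking
   arg(2, Regs, V).
Regs = regs(0, 42, 0, 0),
V = 42.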

But this is all just guessing.

Or library(rbtrees). Or dicts. nb_setarg/3 is non-logical and is not undone on backtracking; setarg/3 is undone on backtracking. b_setval/2 is a backtrackable assignment to a global variable.

Rbtrees have worked fine for me (e.g., a symtab that I pass around as a “hidden” argument, using EDCGs) - the performance issues haven’t been in updating or lookup (so far). I’d avoid global variables, as a matter of good engineering practice.
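
A quick sketch of that kind of threaded store with library(rbtrees) (without the EDCG part; the predicate and key names are mine):

:- use_module(library(rbtrees)).

symtab_demo(V) :-
    rb_empty(T0),
    rb_insert(T0, x, 3, T1),    % add a binding for x
    rb_update(T1, x, 5, T2),    % "re-assign" x, yielding a new tree
    rb_lookup(x, V, T2).        % V = 5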

BTW, there’s an implementation of fast Fourier transforms on page 172 of this set of benchmarks (fft is by Richard O’Keefe, I think): https://apps.dtic.mil/sti/pdfs/ADA211444.pdf (there might be a better version of the code elsewhere).


Yes, there are quite a few different key-value store implementations available in SWI-Prolog.

Dicts are SWI-Prolog specific.

library(rbtrees) and library(nb_rbtrees) that goes with it are great and I use them. They are a bit difficult to discover by browsing the website; and the docs don’t have a polished feel.

Sounds a lot like fluents to me. A fluent is a fact that is true at some time or in some situation and not at others. So if we have an x fluent you could re-write your example as:

holds(x(3), line(1)).
holds(x(something_else), line(2)).
holds(x(5), line(3)).

Of course, you don’t want to be using lines to denote your time/situation, and there are other ways to write them, but I think this makes the idea clear. Usually you’d number discrete situations because dealing with continuous times is more complicated. Now you have 3 options for how you carry fluents around.

  • You can keep asserting them to the knowledge base, in which case you get fast look-ups and the history, but you need to track the current situation and you’ll either be asserting every fluent for every change of state or keeping a history term for look-ups into the past if the fluent isn’t asserted for the situation you want, slowing down the look-ups.

  • You can keep a collection of them in a data-structure, like the trees being discussed here, in which case you get reasonably fast look-ups but you lose the history if you’re not keeping all versions of the trees around. This is a popular choice because you’re mutation free and you can use this in a multi-threaded environment by passing the tree around.

  • Or you can keep the history of the events that changed the state around and query that with regard to which fluents hold (as per Golog in the next paragraph; there’s a small sketch right after this list). This gives the slowest look-ups, but you get the history to query (if you need it).
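
A minimal sketch of that third option (the assign/2 events and this particular holds/2 formulation are my own illustration, not taken from any of the calculi mentioned below):

% History is a list of events, newest first; the value of x is the one
% set by the most recent assignment to x.
holds(x(V), [assign(x, V)|_]).
holds(x(V), [E|Earlier]) :-
    E \= assign(x, _),
    holds(x(V), Earlier).

% ?- holds(x(V), [assign(y, 7), assign(x, 5), assign(x, 3)]).
% V = 5.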

For the logical side, people have been wanting to reason about changes in state and the events that cause these changes in AI for a long time. There are entire calculi dedicated to doing this, from Situation Calculus to Event Calculus and Fluent Calculus. None is perfect; each has its strengths. Raymond Reiter published a language called “Golog” which had a simple interpreter in Prolog that included a Situation Calculus reasoner. His book “Knowledge in Action” is invaluable if you want to get into this topic; “Commonsense Reasoning” is the book to get for Event Calculus. I’ve done a Situation Calculus reasoner in Logtalk, so you can use it with SWI (I usually do!). There’s RTEC, which is a work-in-progress towards an Event Calculus reasoner in Prolog, and I think there’s a non-realtime Prolog one too whose name I’ve forgotten.

The key additional benefit of using a calculus like these is that you also define how the state is updated (actions/events) and you get a language for querying the state. This mirrors what’s gone on in the database-world and industry with Event Sourcing and CQRS, but with a logical language rather than a collection of good ideas recognised through lessons learnt.

If you just want to have a set of ‘variables’ with their values changing, you can always use dicts:

State1 = state{ x: 1, y: "string" },
State2 = State1.put(y, "new string")

This lets you keep a state that is easy to update. To pass this state around implicitly you can always use a DCG:

:- use_module(library(clpfd)).   % for #=/2 (plain is/2 would also do here)
:- op(700, xfx, +=).             % += is not a predefined operator

eval(noop) --> [].

eval(Var += 1), [NewState] -->
  [State],
  { Value #= State.Var + 1,            % plain goals must go inside { }
    NewState = State.put(Var, Value) }.
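
An untested sketch of how this would be called, assuming the dict from the previous example:

?- phrase(eval(x += 1), [state{x: 1, y: "string"}], [NewState]).
% NewState should come out as state{x:2, y:"string"}.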

You can also look at the pack EDCG that lets you write DCGs that have more than one threaded value, meaning you could use it to have multiple ‘updatable’ registers. Just be careful about backtracking.

Looks like I need to read more about these. There is a lot to digest in this thread. Thanks all for your input.

What do you mean by that?

How does Picat treat this …

Is this internally unrolled into two logic variables – a before and after – like DCG …

I assume that the body can be backtracked over …

I see …

I didn’t know that Picat is simply a preprocessor for Prolog.

So, it’s in the category of languages like Logtalk – a language that “compiles” to Prolog – I thought it would have its own optimized VM or compiler.

Thanks.

While I understand why they chose this path as they develop the language – it’s a fast path to an implementation – including avoiding the “impedance mismatch” when compiling the code to the target.

I see two issues with that:

  1. performance – at this stage, Picat can at most be as fast as the underlying Prolog – so, if one motivation is to have a faster language, then this will come later.

  2. the simplicity of Prolog, which works with basically two constructs, :- and !, makes for a very simple yet powerful language.

I guess it will be an interesting research question whether usability is better served by providing more keywords (such as := above) or by appealing to programming idioms such as context arguments.

Perhaps, for adoption from imperative languages, an aligned syntax helps, but then it may also hinder understanding of what is really going on under the hood.

anyway, just some thoughts …

Dan

True – but I wonder what the Picat multi-paradigm language stands for – what kind of problems does it aim to solve – is there really software engineering value in, say, a backtrackable Python (to pick one or two of the paradigms from elsewhere) – does it help analyze problems and come up with elegant, understandable, and performant solutions.

Dan

As a Picat user I can perhaps answer some of these questions.

One target group for Picat is people like me, who want to use a programming language that has both the features of a logic programming language - especially backtracking and reversibility of predicates - and imperative features such as for/while loops and re-assignments, since some problems are much easier to state with for loops etc. (instead of creating predicates to recurse over). Picat is a “natural” development from B-Prolog’s loops etc., but is - IMHO - much better integrated.

An extra plus is that Picat also supports functions (though it’s not a fully featured functional language) and - especially - constraint modeling (CP, SAT, MIP, and SMT solvers). Constraint modeling (+ loops) was the very reason I actively tested Picat in 2013 (and onward) and is probably the feature that I use most in Picat; it’s well integrated in the language (though I miss some features that especially MiniZinc supports, such as element/3 written as z = x[y]). The SAT solver is quite fast now: it received some Silver medals in the last MiniZinc Challenge, and it will - hopefully - get even faster. The planner module is an extra plus. And, since version 3.0, it’s not very hard to port Prolog programs to Picat since it supports Horn clauses, univ, cut, etc., though such porting can take some time.

That said, Picat lacks certain (and for some, important/crucial) aspects of Prolog (especially SWI-Prolog with all its very great features, packages, etc.), such as support for op/3, and handling assert/clause is strange/hard to use. And one has to be careful when using function symbols, since they must be '$'-quoted, otherwise Picat thinks it’s a function which should return a value. For example, writing an ILP solver in Picat (or porting one from Prolog) is a little challenge; but see http://hakank.org/picat/hyper_v3.pi for my port of Bratko’s Hyper ILP solver. And that’s probably one of the easiest ILP solvers. I would definitely not try to port Aleph to Picat (hmm, but that’s a thought :-)).
