Is this a good pure way to deal with side-effects?

Thank you for providing further input.

Can you elaborate a bit on how the language features in particular helped you solve the kind of problems you are working on?

Also, what aspects of the language made the code particularly easy to write, understand and follow (e.g. by yourself later, or by others)?

thanks,

Dan

[Note: I’m not sure how tangential this discussion is to this SWI-Prolog site. Perhaps I should have messaged @grossdan instead.]

Here are some thoughts about your questions, and I hope they explain why I like Picat.

Many of the features of Picat that I like should be seen in comparison with how things are done in other languages/systems. As most systems are Turing-complete, it’s mostly a matter of how easily things are done (subjectively, for me), not that Picat is the only system that can do certain things.

  • Prototyping: Since Picat supports a high-level constraint language, I sometimes use it to prototype certain problems. An example: the other week I had to implement a certain variant of the minimum_except_0/1 constraint in Google’s OR-tools CP-SAT system, but I implemented it first in Picat to experiment with different approaches and different data. Doing these experiments in OR-tools CP-SAT is … well, not that easy.

  • Imperative constructs: As mentioned earlier, programs/algorithms that are better (or more easily) expressed with traditional imperative for/while loops (plus reassignment) are easier to write in Picat than to translate into recursive predicates (the standard Prolog way). Prolog’s maplist is sometimes great for this, but one still has to create predicates to use it. I know about SWI-Prolog’s yall library with lambdas, but these approaches are not very natural to me (at least not yet). A few toy sketches follow after this list.

    Some Prolog systems have loops, but I think Picat’s approach is the more natural way (for me at least).

    Some other imperative features in Picat that I like and use a lot:

    • (global) maps (dictionaries, hashtables) which I use a lot in other programming languages.
    • list / array comprehensions
    • functions, which make it possible to chain functions together
  • Constraints: Picat’s support for different solvers (CP, SAT, MIP, SMT) makes it quite easy to experiment with different solvers (see the small model after this list). However, MiniZinc has support for more solvers, so I also play with MiniZinc for this.

  • A feature of the CP solver (not supported by the other Picat solvers and very seldom by other CP systems) is that the solver evaluates the constraints directly, which can be used for debugging, and sometimes to improve the model by rearranging the constraints. Note: SWI-Prolog’s clpfd also has support for this.

  • Tabling and planning: If a problem is hard to solve with plain logic programming or CP, there might also be a solution using tabling to cache solutions, or one might even rephrase the problem as a planning problem (a tiny tabling example follows after this list).

  • Experimenting: As also mentioned before, I do quite a lot of constraint modelling in Picat. Compared to MiniZinc (which I really like), in Picat it’s much easier to experiment with different data sets or different sizes of a problem (e.g. via loops).

  • Porting Prolog programs: I am really amazed by what Prolog can do and what has been done since the 70s, so it’s probably no surprise that I have ported quite a few Prolog programs to Picat.

    For the first 5 or 6 years of Picat (since 2013), one had to translate Prolog programs to Picat using only =>/2 and ?=>/2, and then take care that the head didn’t unify any variables (since that doesn’t work the same way in Picat as in Prolog), which was quite a lot of work.

    But in version 3 many more Prolog constructs were included, especially support for Horn clauses using :-/2 with the same semantics as in Prolog. So porting is much easier, and in some cases one can use the code straight away (the append/3 sketch after this list shows the difference).

    Earlier this year I ported many of the programs in Bratko’s “Prolog Programming for Artificial Intelligence” and it was not very hard using the v3 features.
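To make some of the points above concrete, here are a few toy sketches (my own made-up examples, not taken from any particular program). First, the imperative constructs: a foreach loop with reassignment, a map used as a counter, a list comprehension, and a bit of function chaining:

    main =>
        % a foreach loop with reassignment instead of a recursive helper predicate
        Total = 0,
        foreach (I in 1..10)
            Total := Total + I*I
        end,
        println(Total),
        % a map (hash table) used as a counter
        Count = new_map(),
        foreach (W in ["a", "b", "a", "c", "a"])
            Count.put(W, Count.get(W, 0) + 1)
        end,
        println(Count),
        % list comprehension and function chaining
        println([I*I : I in 1..10, I mod 2 == 0]),
        println([3, 1, 2].sort().reverse()).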
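For the solver point, a classic toy model (SEND + MORE = MONEY, not one of my own models): the same constraints run with the CP or the SAT solver just by changing the import:

    import cp.   % change to "import sat." (or mip) to try another solver

    send_more_money =>
        Digits = [S,E,N,D,M,O,R,Y],
        Digits :: 0..9,
        all_different(Digits),
        S #> 0, M #> 0,
        1000*S + 100*E + 10*N + D + 1000*M + 100*O + 10*R + E #=
            10000*M + 1000*O + 100*N + 10*E + Y,
        solve(Digits),
        println(Digits).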
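For tabling, it is just a declaration; e.g. the naive Fibonacci function becomes efficient with it:

    table
    fib(0) = 1.
    fib(1) = 1.
    fib(N) = fib(N-1) + fib(N-2).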
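And for the porting point, here is append/3: in v3 the Prolog clauses can be kept as-is, whereas the pre-v3 style with => has to move the head unifications into the body (and then only works with the first argument instantiated):

    % Prolog-style Horn clauses, accepted by Picat v3 as-is
    app([], Ys, Ys).
    app([X|Xs], Ys, [X|Zs]) :- app(Xs, Ys, Zs).

    % pre-v3 style: pattern-matching heads don't unify, so the output
    % list has to be built in the body instead
    app2([], Ys, Zs) => Zs = Ys.
    app2([X|Xs], Ys, Zs) => Zs = [X|Zs1], app2(Xs, Ys, Zs1).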

See My Picat page for some of the stuff I’ve done in Picat so far.


Thank you for your detailed descriptions.

The language certainly sounds very interesting.

In a way I was advocating for something similar, but on the other side of the divide:

For example, to have the Prolog compiler recognize the context argument idiom, where a context argument is passed unchanged – mimicking a local variable – across recursions (a toy sketch follows below). It’s currently happening for optimized calls.
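To make that concrete, here is a toy sketch (my own example, plain Horn clauses): the Ctx argument is threaded through the recursion unchanged, so it behaves like a read-only local variable, and that is exactly the kind of thing a compiler could recognize:

    % Ctx is a context argument: passed unchanged through every
    % recursive call, mimicking a (read-only) local variable.
    lookup_all([], _Ctx, []).
    lookup_all([K|Ks], Ctx, [V|Vs]) :-
        member(K-V, Ctx),
        lookup_all(Ks, Ctx, Vs).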

An alternative approach is the Picat one – to elevate local variables into the language itself, and then implement them as logic variables.

In a way, the “imperative” language keywords and usages are then the idiomatic cues – for preprocessing them into standard Prolog – essentially, as we mentioned earlier.

If my thinking has merit, then it raises several questions – please see them as potentially interesting research questions, not a critique of the approach:

  1. What other logic programming idioms are there (as presented in the O’Keefe book, for example) that are not (yet?) captured in Picat but are essential tools of the trade in logic programming?

Should some of those also be turned into keywords of the language – e.g. an augmented context structure to reduce the number of arguments passed (essentially, lightweight objects/structs)?

  2. Is usability really served not only by increasing the number of keywords, but by providing a new (logic-based) semantics for them – essentially based on monads (quite like DCGs, hiding variables) – if I gather this correctly?

Prolog is high-level enough that programming it efficiently is challenging. With Picat this is taken steps further.

I am not asking a question about algorithm design here, but a software engineering one – where you are tasked to create systems that are correct and efficient, given human (e.g. cognitive), organizational, budget and machine constraints.

In any case, I very much appreciate our discussion – which helps me clarify my own understanding of Prolog and Picat. Picat is surely something I should look into at some point.

Daniel


Indeed – and I guess that having cues from the language keywords makes identifying the cases for optimization more straightforward.

Although, I can’t think right now of a case in Prolog that would be ambiguous or hard to identify, unless cued by the programmer.

I meant:

foreach … end are keyword cues to treat variables in the goal as context arguments.

The assignment operator := in Picat is a cue to treat the LHS as two logic variables.
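For example (a toy sketch of what I mean by “two logic variables”):

    % with the := cue
    demo1 =>
        X = 1,
        X := X + 1,
        println(X).    % prints 2

    % roughly what the preprocessor turns it into: a fresh logic variable
    demo2 =>
        X = 1,
        X1 = X + 1,
        println(X1).   % prints 2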

Yes, this is the idea.

The cues are for the programmer and for the preprocessor.

My apologies, I wasn’t clear.

I meant the keywords in Picat vs. identifying such idioms straight in Prolog.

I am not using Picat, but would love the Prolog compiler to identify the context variable idiom and optimize on that.

Or Picat, if translated directly into a VM – perhaps in the future – could do this optimization directly, based on the parsed keywords that cue the idiom used.

Let me try to be more precise – and again, my apologies for not being clear.

Today, if someone wants to optimize the use of context variables in a Prolog compiler, that someone needs to parse Prolog clauses to identify this idiomatic use in the code.

If that person parses Picat instead, then the parser hands the keywords/cues to the preprocessor – so it makes life easier for the person who parses the code to create the optimizations.

Yes, I think I meant something like this – essentially, a Picat compiler targeting the VM – which, I guess, may or may not be the long-term plan.

Perhaps a translation to standard Prolog is a specification of some kind, whereas optimization at the VM level is something that could/should work for both Picat and Prolog – as you mentioned earlier.

Here are some things that are hard, not possible, or weird to do in Picat compared to Prolog (and some other systems). Hopefully this is at the idiom level you are after.

  • meta-interpreters etc:
    Normally a meta-interpreter uses clause/2 (or a similar construct) to handle the head and body of a predicate. Picat doesn’t support clause/2 directly, but there is a way to connect to the underlying B-Prolog clause/2 (via bp.clause/2, where bp is the B-Prolog package/context).

    However - and this is the hard/weird part - using the bp context/package requires that the predicate is also in the bp context, which means that one either uses built-ins from bp.* (which covers most of the traditional Prolog built-ins) or creates the predicate in the bp context using assert*/1. This also means that one has to use the bp context to retrieve the predicate (via bp.clause/2 or via bp.<predicate_name>).

    This approach works (at least sometimes); see http://www.hakank.org/picat/meta_interpreter_v3.pi for a simple meta-interpreter, and http://www.hakank.org/picat/bp_test_v3.pi for some examples of this. (A minimal sketch of the shape follows below.)

    A more interesting example is Bratko’s ILP program HYPER: http://www.hakank.org/picat/hyper_v3.pi, where many predicates use the bp context. However, it doesn’t use clause/2 but a slightly different approach (from Bratko). It shows, for example, that findall/3 must be rewritten to instead use Picat’s function findall/2.

    Using assert/1 etc. is quite slow, so it’s not recommended.

    However, when working with simple facts (but not predicates), Picat has some substitutes: one can use cl_facts/1-2 or cl_facts_table/1-2 to load a list of facts so they can be used as if they were defined in the program. In the /2 versions, the second parameter is a list of index information for indexing the arguments.

  • Defining operators
    For readability and ease of programming, it’s quite common in Prolog to define new operators via op/3. However, Picat doesn’t support this at all (using bp.op/3 doesn’t work), so one has to use “plain” predicates instead, with Picat’s standard definition of the operators.

    This mostly works, but there are cases when porting a Prolog program where this is hard, since one cannot redefine the precedence or the type of a predicate/symbol.

  • Program transformation (term_expansion/4, goal_expansion/4 etc)
    These “hooks” don’t exist in Picat, and as far as I know there is no substitute for them.

The three limitations above are perhaps quite serious for some Prolog programmers.
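To illustrate the meta-interpreter point above, here is a minimal sketch of the shape (see meta_interpreter_v3.pi above for a real version); it assumes the interpreted predicates have been created in the bp context (e.g. with bp.assertz/1), as described above:

    % vanilla meta-interpreter, written with Prolog-style clauses,
    % where the clause lookup goes through the bp context
    prove(true).
    prove((A, B)) :- prove(A), prove(B).
    prove(Goal) :- bp.clause(Goal, Body), prove(Body).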

Here are some other things.

  • Functional programming
    As mentioned, Picat has support for functions, e.g.

    fact(0) = 1.
    fact(1) = 1.
    fact(N) = N*fact(N-1).
    

    This is quite nice since - among other things - it makes function chaining possible and one can do one-liners nicely, e.g. for solving Project Euler problem #3 (http://hakank.org/picat/euler3.pi ):

    600851475143.factors().max().println()

    Or simulate SNOBOL4’s string pattern chaining style (see http://hakank.org/picat/string_util_test.pi ):

    TT = "unabstractedness",
    % "un" . a" . "b" . "stra" . "c" . "tedness"
    R = TT.breaks("a",T1).span("a",T2).breaks("b",T3).span("b",T4).breaks("c",T5).span("c",T6),
    % ...
    

    However, Picat is not a full-fledged functional programming language. One thing I especially miss is that there is no support for lambda expressions, for example in map/2 expressions. List/array comprehensions can be used instead, but this is not as elegant as in a functional language (see the small sketch after this list).

  • regular expressions
    I’m originally a Perl guy (I worked with Perl for more than 20 years before I found Picat), and working with regular expressions was very natural for me.
    However, Picat doesn’t support regular expressions, which is the reason that I - still - use Perl when parsing logfiles etc.
    It would not be very hard for a programmer - with better C knowledge than mine - to implement this as a Picat package. (The implementation I did years ago was not performant at all.)
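As a small sketch of the workaround (a toy example; sq/1 is just a made-up helper): without lambdas one either passes a named function to map/2 or uses a comprehension:

    sq(X) = X*X.

    lambda_demo =>
        L = 1..5,
        println(map(sq, L)),        % needs a named helper function
        println([X*X : X in L]).    % a comprehension avoids the helper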

Note:
For simplicity in porting Prolog programs (or writing programs using the “Prolog approach”), I’ve ported most of B-Prolog’s predicates to Picat here: http://www.hakank.org/picat/v3_utils.pi, and added some important predicates (e.g. the maplist family) that B-Prolog doesn’t support. In Picat, many of Prolog’s predicates have been replaced with a function, e.g. Prolog’s length/2 is length/1 in Picat, but the idiom of generating lists using length/2 is so useful that I’ve ported it (using bp.length/2).


@j4n_bur53

Yes, for mathematical/constraint programming, one very important thing is how a system converts the model to a format that the SAT/MIP/SMT/CP solvers can handle. However, it still matters how well a user can build the model, including reading and preprocessing data. Picat is quite good at this, but - say - MiniZinc is quite bad at the preprocessing step.

And - at least from my perspective - Picat can be used for more than just writing constraint models, as shown earlier in this thread. Can it compete with Prolog at that? In certain cases, definitely not. In some cases - including writing logic programs - it’s more a matter of preference how to implement it.

Well, you missed my point with that example. Picat is not just a language for setting up constraint models; it is a general-purpose programming language.

Regarding the motivation to use =>/2 and ?=>/2, it’s probably better to ask Neng-Fa about that.

Thank you for your detailed description, it’s very interesting.

I would miss term expansion quite a bit – since I am using it to create syntax for domain-specific languages (DSLs) that are aligned with Prolog. For example, a syntax like this:

object(my_object). 
   attribute(a1, int).
   attribute(a2, int).
end_object. 

Can be quite easily created with term expansion and a processor for such clauses.

If I understand it correctly, Picat then has two levels – the Picat level and the expanded B-Prolog clause level – for power users, as it were – probably using a term expansion technique similar to the one I am using.

On another matter:

I am actually wondering why Picat decided to mimic imperative programming with logic variables – real multi-paradigm support would have suggested creating (non-backtrackable) real memory-based variables and references – so that one can program at this level as well – essentially like, say, Java or Rust (i.e. with or without garbage collection).

This is something I would have liked quite a bit in my work that is closer to system programming than application programming.

Dan

OK. So it’s essentially syntactic sugar to make the language more user-friendly.

Dan

Yeah – side effects are a PITA. Especially for unit testing.

What worked for me:

  • add new rules and facts to some module, m. Something like:
 m:assert( .... )
  • retract the entirety of the module in a Python context manager’s exit. Here’s what I have for pyswip:
    # assumes: from contextlib import contextmanager; from typing import Iterator;
    # from pyswip import Prolog. ModuleContext and abolish_module_contents are my own helpers.
    @staticmethod
    @contextmanager
    def module_context(module_name=None, prolog=None,
            finalizer=abolish_module_contents,
            **kw,
            ) -> Iterator[Prolog]:
        p = ModuleContext(module_name=module_name,
                          prolog=prolog,
                          **kw)
        try:
            yield p
        finally:
            if finalizer is not None:
                if callable(finalizer):
                    finalizer(module_name=p.module_name, prolog=p)
                else:
                    p.run(finalizer)

The yielded object is a Prolog object monkey-patched to wrap queries and asserts as: “{modulename}:({original_query})”.

And the finalizer does:

        prolog.query_all(f"forall(current_predicate({module_name}:P), "
                         f"(P\\={module_name}:pyrun/2, abolish({module_name}:P)))")

You’d use it:

with module_context('m') as prolog:
   prolog.query(s)    # Does m:s

Seemed to work well, except that many predicates need to be made aware of module prefixes in their arguments. For example, I had to modify a flatmap utility predicate:

:- meta_predicate
    flatmap(:,*,*).

flatmap(_, [], []).

% flatmap(P, [X0, X1, ...], Ys)   where call(P, Xi, Ysi) and Ys is the concatenation of the Ysi
flatmap(M:P, [H|T], Ls_):-
    call(M:P, H, Hs),
    append(Hs, Ls__, Ls_),
    flatmap(M:P, T, Ls__).

It’s worth it, to maintain idempotence.

I also used the same pattern to enable multi-threaded queries from Python by using one module name per thread.

Thank you.

I first noticed the loop construct in XSB Prolog – which I used a little, before switching (back) to SWI-Prolog.

I didn’t know how prevalent the proposals for such constructs are.

Dan

Thank you for clarifying.

If I gather this correctly, the Schrimpf approach is more performant, as it “unfolds” (using O’Keefe’s terminology, which I saw in his book) the loop into an accumulator idiom.

The Schrimpf version is then, I guess, tail-call optimized, including (hopefully) no duplicate stack pushes for the accumulator.

The use of meta-predicates is elegant but has the overhead of meta-calls.

Also, I think that if the compiler could recognize either expansion/unfolding and optimize it, that would be great.
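To make the correspondence concrete, here is a toy sketch of my own (using Picat’s foreach for concreteness): the loop version on top, and roughly the tail-recursive accumulator version it could be unfolded into:

    % loop version, as one would write it in Picat
    sum_squares_loop(N, Sum) =>
        S = 0,
        foreach (I in 1..N)
            S := S + I*I
        end,
        Sum = S.

    % roughly the unfolded accumulator version (tail recursive,
    % so last-call optimization applies)
    sum_squares_acc(N, Sum) =>
        sum_squares_acc(1, N, 0, Sum).
    sum_squares_acc(I, N, Acc, Sum), I > N => Sum = Acc.
    sum_squares_acc(I, N, Acc, Sum) =>
        sum_squares_acc(I + 1, N, Acc + I*I, Sum).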

Dan