Depth-first search not the default?

I assume you could. The system Stanford University developed, which I’m trying to implement, can represent just about any game or puzzle. There’s a tendency to make things game-specific, so we have very good chess players and the like. The idea of general game playing is to create a library which can read the rules of a game as a program and thereby play any game, which I find very cool, though difficult.

The frustrating part of Stanford’s General Game Playing system is that its leader, Michael Genesereth, created something called KIF instead of using plain vanilla Prolog, and the online course on it I did a couple of years ago came with an open-source Java application which hadn’t been updated in years.

A while ago, I managed to translate chess written in KIF into a working SWI-Prolog program, which involved a lot of shuffling the order of clauses to avoid using things like freeze(+Var, :Goal).

I imagine there might be times you need what other programming languages call promises in Prolog, but in the case of the KIF chess program, there seems to be some weird theory in academic logic programming that statement order doesn’t matter.

I’ve been thinking about this a couple of times. So far I haven’t seen a meaningful way to give these a place in Prolog. Any ideas? Then again, one could see freeze/2 as a promise, no?
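As a concrete reminder of that reading, a minimal SWI-Prolog sketch (the goal and variable names are just illustrative): the goal is suspended until the variable is bound, much like a promise that is forced when its value arrives.

% freeze/2 suspends the format/2 goal until X is bound.
?- freeze(X, format("got ~w~n", [X])), X = 42.
got 42
X = 42.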

It doesn’t in truly declarative languages. Traditional Prolog is a language that has a declarative reading, but one cannot claim that it is declarative. Tabling and constraints do create fully declarative sublanguages.

My most recent encounter with promises was redoing a really good MOOC given by Gregor Kiczales using Racket (a Scheme-family Lisp developed by the PLT group as a teaching language).

Even though I only completed the course a few months ago, I’ve already forgotten the details and which example it appeared in, but I do recall it was used to resolve a “chicken or egg” conundrum where two mutually recursive functions each need values the other hasn’t calculated yet.

Googling, I see Racket has a promise library for delayed evaluation, which I don’t recall Kiczales using. I think he simply used local declarations to solve the problem.

In the case of the KIF chess problem, the delayed evaluation is easily resolved by first getting the value of a variable before calling a predicate which expects to receive a ground term.
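A toy sketch of that reordering (square/2 is a made-up predicate that needs a ground first argument):

square(X, Y) :- Y is X * X.    % is/2 needs X to be ground

% freeze(X, square(X, Y)) would make goal order irrelevant,
% but simply binding X first achieves the same effect:
?- X = 3, square(X, Y).
X = 3,
Y = 9.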

I wish I could find the example Kiczales used to illustrate the need for delayed evaluation, because it’s not a problem I encounter much. (The term promise is used in web applications to mean non-blocking, as in carry on with other stuff while waiting for a picture or video to download, which I think is a bit of a misuse of jargon.)


In my latest version, I’ve decided to avoid global truths (asserting the bases of the current state as true(Base) for next(Base) etc. to query, and then retracting these before setting the next state). Instead, I rewrite these as next(State, Next), where both State and Next are lists of bases to iterate through, using memberchk(Base, State) on each. I haven’t tested for a speed difference, but it has helped me avoid bugs caused by confusion over what the current state is at any given moment.
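A minimal sketch of that shape, with a single made-up count/1 base:

% State and Next are lists of bases; no global true/1 assertions.
next(State, [count(N1)]) :-
    memberchk(count(N), State),    % look the base up in the current state
    N1 is N + 1.                   % and compute its successor

?- next([count(0)], Next).
Next = [count(1)].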

I think the reason is that, in relation(Parent, Child), the predicate symbol “relation” can be read as the label of an edge connecting the nodes Parent and Child on a graph. So it can be seen as a more terse notation for edges with labels, or arcs with labels if your graph is an automaton. It’s also a more general notation, because not every relation is the label of an arc in an automaton.
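A tiny illustration of the two notations (the facts are made up):

parent(tom, bob).         % label as functor: an edge tom -> bob labelled parent
arc(tom, parent, bob).    % the same edge, with the label reified as an argument of a general arc/3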

Binary relations (i.e. relations with two arguments) are also a natural way to represent fluents in the event calculus, and functions with one input and one output variable, so that may be an additional reason why Prolog textbooks prefer arity-two examples like relation/2 over arity-three-and-above examples like arc/3.

Finally, I think that a lot of early work on AI focused on predicates with one or two arguments. I think this was probably for efficiency reasons. I believe John McCarthy said something to the effect that you don’t really need anything else if you have binary relations (and arity-one “properties”, as I’d call them), but I’ve heard that second-hand and I don’t have a direct source, sorry.

That’s my understanding, anyway. Sorry if it’s completely irrelevant to your comment; I don’t know which Prolog textbooks you’ve read and I may be misunderstanding your point.

I don’t know which Prolog textbooks you’ve read and I may be misunderstanding your point.

Mainly Ivan Bratko’s Prolog: Programming for Artificial Intelligence (which I rate the best of a bad bunch). Richard O’Keefe’s The Craft of Prolog has section 2.13, “Returning the path of a solution” (p. 67 in the copy I own), but none of these give triples the primacy I believe they deserve.

Triples are core to the semantic web – e.g. RDF – which is ultimately also all about automata. The more I look, the more triples I find.

Regarding using the label as the functor: that would break the search. A key thing is that Parent and Child have the same type, while the label is a different type (input symbols, in automaton jargon), and a list of these labels is the ultimate result of the search that we want.

I don’t understand why you say that “using the label as the functor… would break the search”. It’s quite possible to do a depth-first search (DFS) over arbitrary Prolog terms. In Prolog (with apologies for stating the obvious, but as a reminder) DFS is most naturally implemented in the typical meta-interpreter, which indeed operates on arbitrary Prolog terms. It is also natural to modify that interpreter to add a depth bound (left as an exercise to the reader, but I think Markus Triska has examples on his page, The Power of Prolog). The skeleton for a meta-interpreter is easily adapted to any DFS application.
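For reference, a minimal sketch of such a depth-bounded meta-interpreter (a common textbook formulation, not taken from Triska’s page; it assumes the program is accessible via clause/2):

solve(_, true) :- !.
solve(D, (A, B)) :- !,
    solve(D, A),
    solve(D, B).
solve(D, Goal) :-
    D > 0,                 % give up once the depth bound is exhausted
    D1 is D - 1,
    clause(Goal, Body),    % fetch a matching clause body
    solve(D1, Body).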

With respect, I wonder if you are backing yourself into a corner with the specific implementation of DFS that you have coded. Is it based on a definition of the algorithm in a particular textbook that is not a Prolog textbook (i.e. not Bratko)? Would it make things simpler in that case if you rewrote your search based on a meta-interpreter skeleton?

I also don’t understand why it is a “key thing” that the Parent and Child must have the same type. Can you explain?

Essentially, the search function works something like:

arc(Node0, LabelX, Node1) :-
  ...
  arc(Node1, LabelY, Node2).

So if we made the first functor labelX and the second one labelY, it wouldn’t be easy to write a recursive predicate. If Node0, Node1, Node2, … weren’t the same type, they wouldn’t chain together into a transitive closure.
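A minimal sketch of that chaining, collecting the labels along the way (the facts are made up, and there is no cycle checking, which is fine for this acyclic toy graph):

arc(q0, a, q1).
arc(q1, b, q2).

% path(Start, End, Labels): the labels read along a walk from Start to End.
path(Node, Node, []).
path(Node0, Node, [Label|Labels]) :-
    arc(Node0, Label, Node1),
    path(Node1, Node, Labels).

?- path(q0, q2, Labels).
Labels = [a, b].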

I think I understand, but what I mean is that you don’t have to define an arc/3 predicate to walk over the “transitive closure” (as I think you mean it) of arc/3. You can instead use a typical meta-interpreter skeleton to do the same thing. It works like this:

dfs(leaf).                % a leaf: nothing left to expand
dfs((Node,Nodes)):-       % a tuple of nodes: traverse each in turn
    dfs(Node)
   ,dfs(Nodes).
dfs(Parent):-             % otherwise follow an arc and recurse
    arc(Parent,Child)
   ,dfs(Child).

Where Node, Parent and Child are arbitrary Prolog terms whose functors can be interpreted as arc labels, and Nodes is a “tree” thereof (tuples with ,/2 as the functor).

In short, instead of writing a special-case recursive predicate for each relation like arc/3 or parent/2, etc, you can use this general-purpose skeleton to walk over the transitive closures of arbitrary relations.
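For instance, with a couple of made-up arc/2 facts, the skeleton above walks the whole reachable graph:

arc(a, b).       % hypothetical facts driving the dfs/1 skeleton
arc(b, leaf).

?- dfs(a).
true.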

To clarify, I’m not proposing this as an alternative to your code. I assume that if you want to define an arc/3 predicate explicitly, then you have a good reason to do so. I’m offering this as an explanation of the fact that most Prolog textbooks don’t do what you have shown: they don’t explicitly define automata of various kinds. I’m guessing at an answer to the question that had you perplexed in your previous comments.

If I may be so bold as to make a reading suggestion: Prolog textbooks like Bratko and O’Keefe are great for understanding Prolog, i.e. their main subject. But they don’t really explain logic programming, and that hinders the understanding of Prolog. For instance, I read Bratko, O’Keefe, Clocksin & Mellish, Sterling & Shapiro and a few others during my degree when I first learned Prolog, but things only really started making sense when I read, of all things, Prolog and Natural-Language Analysis by Pereira and Shieber. That book did, for me, an excellent job of not only defining but also explaining Horn clauses, and it was only by reading it that I started to understand the first thing about logic programming. I’m pretty sure the same material is covered in Bratko etc. also, but somehow it didn’t “stick” and I only “got it” when I read Pereira and Shieber.

That’s not a fault of Bratko etc. or a success of Pereira and Shieber; it’s a failing of my studying style. But my experience has convinced me that a solid understanding of logic programming is necessary in order to understand Prolog, and Prolog textbooks. Without that, one is left constantly falling down holes (and I’m again speaking from personal experience: most of my time with Prolog has been like I imagine playing the E.T. Atari game must have been). Unfortunately, logic programming is a bigger and more complicated subject than Prolog itself, and that’s probably why most Prolog courses don’t spend too much time teaching it. But without understanding logic programming one is left holding a map without ever having set foot on the territory! I am convinced that the eternal wailing and gnashing of teeth of students in Prolog courses (I do some GTA work for my university) is exactly the result of not teaching the basics of the paradigm before jumping head-first into the syntax and practical matters of the language. Prolog makes no sense without knowledge of logic programming, and even if one becomes competent in the language, one is left thinking “so what? Why not just use C?”.

The classic source for logic programming is John W. Lloyd, “Foundations of Logic Programming”.

I don’t recommend Pereira and Shieber as a starting textbook on logic programming, since it’s primarily about, well, what it says in the title, but if you’re curious there is a free digital edition here:

http://www.mtome.com/Publications/PNLA/prolog-digital.pdf


All I can say is that the group of Bart Demoen wanted SWI-Prolog to be able to unify cyclic terms. I asked Bart how that could be done, and he shared what is still in the comments in pl-prims.c. If I recall well, his comment was that this trick was readily known among Prolog developers (with some variations) but was not published as far as he was aware. It is surely incorrect to claim me as the inventor, and probably Bart wasn’t the inventor either.

One day someone should write a book listing all the undocumented “tricks” in logic programming (and beyond).

Oh, absolutely. I’m just a little frustrated that it took me so much time to really get started with logic programming, even though I’d been coding in Prolog for several years by that time.

I remember, years ago, I left a comment on the old SWI mailing list and I just couldn’t explain what I wanted to know, because I didn’t have the terminology to say it. One of the contributors to SWI was annoyed by the vagueness and fudginess of my comment and called me out on it. They managed to help me with my specific query, but the discussion convinced me that I really need to understand what I’m talking about when I’m talking about Prolog and logic programming, or else I won’t even be able to ask for help with very practical matters when I need to. I see many of the students in the Prolog courses I help with struggle in the same way. I certainly think this happens more often than in other languages.

It is linear in the number of nodes (atomic and compound) used to represent the term. This may be less than the number of nodes of the term given to write/1 if the term is cyclic or subterms appear in more than one place in the tree (sharing). Of course, this is the worst case, when the unification succeeds. If a match fails, the unification fails immediately.
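For illustration, the cyclic case in SWI-Prolog, which the trick discussed above makes terminate:

?- X = f(X), Y = f(Y), X = Y.    % succeeds: X and Y denote the same rational tree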

Many thanks for the link, it looks very helpful.


For those who find Lloyd’s book too hard to digest, I know of two other books:
