LOL, I had decided against sending that message, thinking it might come off as unnecessarily caustic, and then saw your response this morning. Apparently my 3-year-old had other ideas.
That’s an interesting point about Picat. I didn’t know about them adopting a context-sensitive reader; I’ll have to read up. Given bad experiences with Lisp reader black magic, I’d tend to agree with you.
Also, I think the idea of adding warnings is a good one, and relatively easy. Indeed, the Logtalk linter has already caught a number of mistakes in my code.
I wouldn’t necessarily throw the baby out with the bathwater, though. For almost all uses of variables in FOL, HOL, or elsewhere, what is required is merely lexical scoping. This is true of all the forms discussed here, as well as of existential and universal quantification in logic. It doesn’t require a context-sensitive reader, just one that can deal with binding forms. I find the argument given by Qi persuasive:
A careful examination of the examples discussed above shows that even though the application domains are quite different, there is a common part to what needs to be treated with regard to binding in both cases. The important aspects of binding can in fact be uniformly captured by using the terms of the λ-calculus as a representational mechanism. For example, the concept of the scope of a binding is explicitly reflected in the structure of a λ-abstraction. Similarly, the recognition of the irrelevance of names for bound variables and the preservation of their scopes during substitution are manifest through the usual λ-conversion rules. Thus, the representation of formal objects relevant to different contexts can be accomplished by using λ-abstractions to capture the underlying binding structures and using constructors like in first-order abstract syntax representations to encode whatever context-specific semantics is relevant to the analysis. – Xiaochu Qi, “An Implementation of the Language Lambda Prolog Organized around Higher-Order Pattern Unification”
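To make Qi’s point concrete, here is a toy sketch (in Python, not λProlog; all the names are my own invention) of representing a quantifier’s scope by a λ-abstraction, here a host-language function. Because the binder lives in the meta-language, bound-variable names never appear in the term, and “alphabetic variants” of a formula are indistinguishable by construction:

```python
# Higher-order abstract syntax (HOAS) sketch: quantifier scope is a
# host-language function, so bound-variable names are irrelevant.

from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class App:          # first-order constructor application, e.g. p(t)
    head: str
    args: tuple

@dataclass
class Forall:       # the binding structure, captured by a function
    body: Callable  # body: term -> term

def p(x):
    return App("p", (x,))

# Two alphabetic variants of "forall x. p(x)" -- the parameter name is
# just a Python binder, not part of the term itself.
f1 = Forall(lambda x: p(x))
f2 = Forall(lambda y: p(y))

# Instantiating the scope is just function application; no hand-written
# capture-avoiding substitution is needed.
c = App("a", ())
assert f1.body(c) == f2.body(c) == App("p", (App("a", ()),))
```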
Of course this, as you’re well aware, is a can of worms, and you’re undoubtedly correct that, without a clear demonstration that there are no negative consequences for existing libraries, performance, or implementation complexity, it makes sense to stick to warnings.
The first version of Teyjus had an enormously complex extension to the WAM in order to make things work out. However, following Qi’s thesis, Teyjus was restructured and limited to the pattern fragment of higher-order unification, which simplified things tremendously. The pattern fragment is deterministic and not too difficult to implement.
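For the curious, the membership test for the pattern fragment is itself simple: a term is a higher-order pattern when every metavariable occurs applied only to distinct bound variables. A hypothetical sketch (my own toy term encoding, nothing to do with Teyjus internals):

```python
# Pattern-fragment check over a toy term representation:
#   ("bvar", i)            bound variable (de Bruijn index i)
#   ("app", head, args)    rigid application
#   ("meta", name, args)   metavariable applied to args
#   ("lam", body)          abstraction

def is_pattern(term):
    tag = term[0]
    if tag == "bvar":
        return True
    if tag == "lam":
        return is_pattern(term[1])
    if tag == "app":
        return all(is_pattern(a) for a in term[2])
    if tag == "meta":
        args = term[2]
        # every argument must be a bound variable, and all distinct
        if not all(a[0] == "bvar" for a in args):
            return False
        indices = [a[1] for a in args]
        return len(indices) == len(set(indices))
    raise ValueError(f"unknown tag {tag!r}")

# F applied to two distinct bound variables: inside the fragment,
# so unification is deterministic.
ok = ("lam", ("lam", ("meta", "F", (("bvar", 0), ("bvar", 1)))))

# F applied to a non-variable argument: outside the fragment, so an
# implementation would defer the constraint rather than solve it here.
bad = ("lam", ("meta", "F", (("app", "g", (("bvar", 0),)),)))

assert is_pattern(ok)
assert not is_pattern(bad)
```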
It seems that in Teyjus 2 the extensions to the WAM are limited to the representation of binding in terms (using de Bruijn indexing to keep comparison/unification fast and structural), a co-routining approach that defers unification when things fall outside the pattern fragment, and a routine for beta-reduction. Types are no longer relevant at runtime in Teyjus 2 except at constants, which is not important for a normal Prolog.
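The de Bruijn point is worth illustrating: alpha-equivalent terms become structurally identical, so plain syntactic comparison suffices, and beta-reduction is just index shifting plus substitution. A minimal sketch (again my own toy encoding, not Teyjus code):

```python
# De Bruijn terms: ("var", i), ("lam", body), ("app", f, a).

def shift(t, d, cutoff=0):
    """Add d to every free index (>= cutoff) in t."""
    tag = t[0]
    if tag == "var":
        return ("var", t[1] + d) if t[1] >= cutoff else t
    if tag == "lam":
        return ("lam", shift(t[1], d, cutoff + 1))
    return ("app", shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, j, s):
    """Substitute s for index j in t."""
    tag = t[0]
    if tag == "var":
        return s if t[1] == j else t
    if tag == "lam":
        return ("lam", subst(t[1], j + 1, shift(s, 1)))
    return ("app", subst(t[1], j, s), subst(t[2], j, s))

def beta(fun, arg):
    """One beta step: (lam body) arg."""
    assert fun[0] == "lam"
    return shift(subst(fun[1], 0, shift(arg, 1)), -1)

# \x. x and \y. y are the *same* tuple, so == is alpha-equivalence:
identity = ("lam", ("var", 0))
assert identity == ("lam", ("var", 0))

# (\x. x) c  reduces to  c
c = ("var", 41)  # some free variable
assert beta(identity, c) == c
```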
Since many Prologs implement some extension to unification (for CLP, for instance) and already have co-routining, the differences no longer appear that major. I intend to make a bit of an experiment of it and report back.