Max_arity status?

There is no better way to deep-dive into a language than attempting to help someone else with their problems… a simple question on SO brought me to my knees. I was suggesting to statically declare such arrays, likely with this code (undebugged, for the reasons that follow):

:- module(arrays, [fill_data/0, sum_a1_a2_a3_to_a4/0]).

:- initialization(alloc_arrays).

% cannot use maplist here (must be lambda_free, ok, I understand)
alloc_arrays :-          % array names inferred from the exported sum_a1_a2_a3_to_a4/0
    decl_array(a1),
    decl_array(a2),
    decl_array(a3),
    decl_array(a4).

decl_array(Name) :-
    length(Args, 1 000 000),
    Array =.. [Name|Args],
    assertz(Array). % this chokes on max_arity


I understand (I think) the need to keep the arity low.
It seems related to the generality of meta_predicate compilation (call/N).

But the documentation is lacking on this topic.

?- current_prolog_flag(max_arity,C).
C = unbounded.

?- set_prolog_flag(max_arity,1 000 000).
ERROR: No permission to modify flag `max_arity'

In SWI-Prolog, terms have no arity restriction (except for memory), so you can do functor(Array, a, 1 000 000). Predicates, however, have a max arity of 1024. This is a compile-time (#define) constant. It isn’t very expensive to raise that a little, but making it truly unbounded requires some redesign of the predicate calling mechanism. You can do

?- functor(Array, a, 1 000 000).
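Since only asserted predicates are limited, one workaround (my sketch, not from the thread; the predicate names and the use of nb_setval/2 are my own choice) is to keep the large compound in a global variable and access its cells with arg/3:

```prolog
% Sketch: store an arity-Size compound in a global variable instead of
% asserting it as a fact; terms themselves have no arity limit.
alloc_array(Name, Size) :-
    functor(Array, Name, Size),   % build the big compound term
    nb_setval(Name, Array).       % keep it in a backtrack-proof global

array_cell(Name, I, X) :-
    nb_getval(Name, Array),
    arg(I, Array, X).             % O(1) access to the I-th argument
```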

I could verify the latter constraint; the error happens during assertz/1:

/* SWI-Prolog 8.3.19 */
?- functor(_,f,10000).
true.

?- functor(F,f,10000), assertz(F).
ERROR: Cannot represent due to `max_arity' (limit is 1024, request = 10000)

But I don’t find a corresponding flag:

?- current_prolog_flag(max_arity, X).
X = unbounded.

Bug or feature? Should I look for another Prolog flag?

You’d have to check the ISO standard. AFAIK there is only one max_arity flag and I do not know whether it refers to compounds or predicates or possibly other roles in which a compound may appear? I don’t see much reason to act on this. If anything, it could be getting rid of the limit completely. That is probably mostly just for elegance. I do know at least one user who runs an application that needs a higher arity and changes the default before compiling the system.

I wasn't referring to the ISO core standard by “bug or feature”. I was doubting the integrity of the error message. The error message refers to max_arity, but max_arity is unbounded:

?- functor(F,f,10000), assertz(F).
ERROR: Cannot represent due to `max_arity' (limit is 1024, request = 10000)

?- current_prolog_flag(max_arity, X).
X = unbounded.

I suggest introducing a new flag, max_arity_compiled or some other name. I am considering introducing such a flag in my system as well. The new flag is inspired by the YAP error message:

YAP 6.3.3 (i686-mingw32): Sun Jan 20 18:27:56 GMTST 2013
 ?- functor(F,f,10000), assertz(F).
     ERROR at  clause 1 of prolog:'$assert_fact'/4 !!
     INTERNAL COMPILER ERROR- exceed maximum arity of compiled goal

I have noted here that the ISO core standard flag possibly only covers compound size and not predicate size, because the ISO core standard says:

Description: The maximum arity allowed for any compound term, or unbounded when the processor has no limit for the number of arguments for a compound term.

So there is a gap in the ISO core standard, predicate size is not covered.
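To make the gap concrete, a small probe (my sketch; assertable_arity/1 is a hypothetical helper, not part of any system) can test whether a clause head of a given arity survives assertz/1 by catching the representation error:

```prolog
% Hypothetical helper (not from the thread): succeeds iff a clause whose
% head has arity N can be asserted in the current system.
assertable_arity(N) :-
    functor(Head, '$arity_probe', N),
    catch(( assertz(Head), retract(Head) ),
          error(representation_error(max_arity), _),
          fail).
```

On the SWI-Prolog version shown above this should succeed for N = 1024 and fail for N = 1025, independent of what the max_arity flag reports.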

I would be more in favor of max_arity_procedure or max_procedure_arity. Compiled what? Some systems also have a notion of compiled vs. interpreted. It seems more systems have different limits for terms and predicates (or should we say procedures?). Testing ECLiPSe we get

out of range in assertz(...)

Otherwise I’m not against such a flag. Might even make it writable :slight_smile: On the other hand, once started it may be better to remove it …

vm_list/0. Or what does impose a limit? Compiling to a VM is also compiling.
Typically such a limit is used to allow shorter instructions inside a VM.

I mean you can compile anything to anything. What is constrained is compiling a term as (to?) a clause. That may be a VM restriction or something completely different. In this case the VM instructions can perfectly represent clauses with unbounded arity (limited by memory only). It is the actual VM implementation that needs to check there is enough space for the arguments before it knows the number of arguments. By limiting each predicate to have at most 1024 arguments we just check there is space for 1024 arguments. Now you can also see why raising the limit is cheap. The performance impact is zero, but it causes a local stack overflow to be raised (even more) too early.

max_arity_compiled doesn’t express that.

It expresses from SWI-Prolog:

COMMON(int)	compileClause(Clause *cp, Word head, Word body,
			      Procedure proc, Module module,
			      term_t warnings, int flags ARG_LD);


Or is this dead code? BTW I am now considering max_vm_arity. But I wonder whether my system has also max_jiti_arity, or maybe max_jiti_argument. Also a foreign function or a special builtin might have another maximum predicate size.

The flag would only reflect predicates that are defined by clauses.

Clearly these would all be Prolog flags that are not found in the ISO core standard. But they are there to inspect the Gestalt(*) of the underlying Prolog system. Possibly only useful when used together with the Prolog flag dialect.

For example DES could warn that Prolog JIT indexes are not available etc… etc…
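Such a capability probe could look like this (a sketch; max_procedure_arity is one of the hypothetical flag names discussed above, not an existing flag):

```prolog
% Hedged sketch: query a capability flag if the dialect provides it,
% falling back gracefully when the flag does not exist.
procedure_arity_limit(Limit) :-
    (   catch(current_prolog_flag(max_procedure_arity, Limit0), _, fail)
    ->  Limit = Limit0
    ;   Limit = unknown   % flag not supported by this dialect
    ).
```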

(*) Gestalt (Mac OS)

Gestalt was the name of a system call introduced into the Apple Macintosh operating system System Software 6.0.4 in 1989 to allow applications to dynamically query what capabilities were present in the running system configuration.

In my view it is the system that cannot handle procedures with more than N arguments. Whether that is due to the compiler, VM implementation or the location of Pluto in relation to Venus is in my view irrelevant to the user.

Describing the clause indexing using flags is an interesting idea. There are so many limits, thresholds, etc that I doubt this is meaningfully possible. Surely, I’m not going to try.

I think it's relevant, since this here is a nonsense error message:

?- functor(F,f,10000), assertz(F).
ERROR: Cannot represent due to `max_arity' (limit is 1024, request = 10000)

?- current_prolog_flag(max_arity, X).
X = unbounded.

A normal end-user might think the Prolog system is crazy. Inconsistent statements: `max_arity' (limit is 1024) and X = unbounded. Clearly a bug and not a feature.

Possibly a legacy error, from when max_arity and max_vm_arity were the same, or from before the ISO core standard. max_vm_arity is found in the source code of SWI-Prolog:

    Maximum arity of a predicate. May be enlarged further, but wastes stack (4 bytes for each argument) on machines that use malloc() for allocating the stack as the local and global stack need to be apart by this amount. Also, an interrupt skips this amount of stack.

But you are right, not everything needs to be moved into a Prolog flag. It's already good if there isn't a silent overflow and some error is thrown, hopefully with a relatable text.

Is the clause body also validated?
From pl-vmi.c, but I don't know what it validates:

/* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Now scan the argument vector of the goal 
and fill the arguments  of  the frame.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
  if ( arity > 0 )
  { if ( arity > MAXARITY )

Let's try. Nope, this works!

?- functor(F,f,10000), assertz(test:-F).
F = f(_15374

It's also not call/N; here I get a crash without an error:

?- functor(F,f,10000), call(F,a).
%%% crash
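Until that is mitigated, a defensive wrapper (my sketch; safe_call/2 and the hard-coded 1024 limit are assumptions, not part of any API) can reject over-arity goals before they reach the VM:

```prolog
% Hypothetical guard: refuse to extend a goal beyond the assumed
% 1024-argument predicate limit instead of letting the VM crash.
safe_call(Goal, Extra) :-
    functor(Goal, _, Arity0),
    Arity is Arity0 + 1,          % call/2 adds one argument
    (   Arity =< 1024
    ->  call(Goal, Extra)
    ;   throw(error(representation_error(max_arity), safe_call/2))
    ).
```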

See Some safety concerning the maximum allowed predicate arity · Issue #791 · SWI-Prolog/swipl-devel · GitHub, with a link to the patch that mitigates this.