Package pac updated to 1.6.5

I have updated package pac to 1.6.5. The log below should be reproducible.

Sample queries of path counting for rectangular grid graphs in library(pac(zdd/'minato-r2')) may take a long time. For example, rect(7, 7) for an 8x8 grid takes about 30 seconds. The usual Prolog programming with a list of visited nodes, I guess, would not terminate for that query in a reasonable number of days! However, my current ZDD is surely still at toy level. It might be time for me to ask true experts on ZDD for help.
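Just for illustration (this is not part of pac), the "usual" approach I mean could be sketched like this: a depth-first search that carries an explicit list of visited nodes.

% A toy sketch (not part of pac) of the "usual" approach: count simple
% paths between opposite corners of an (N+1)x(N+1) grid graph by DFS
% with an explicit list of visited nodes.
rect_paths_naive(N, Count) :-
    aggregate_all(count, naive_path(N, 0-0, [0-0]), Count).

naive_path(N, N-N, _) :- !.                 % reached the opposite corner
naive_path(N, Node, Visited) :-
    grid_neighbor(N, Node, Next),
    \+ memberchk(Next, Visited),            % keep the path simple
    naive_path(N, Next, [Next|Visited]).

grid_neighbor(N, X-Y, X1-Y) :- ( X1 is X+1 ; X1 is X-1 ), X1 >= 0, X1 =< N.
grid_neighbor(N, X-Y, X-Y1) :- ( Y1 is Y+1 ; Y1 is Y-1 ), Y1 >= 0, Y1 =< N.

% ?- rect_paths_naive(2, C).

For rect(2,2) (a 3x3 node grid) this should give 12, the same count as the ZDD version later in this thread, but the number of paths itself grows so fast that plain enumeration soon becomes hopeless.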

% swipl
?- pack_install(pac).
?- use_module(library(pac)).
?- use_module(util('ptq-fragment')).
?- module(ptqfrag).
ptqfrag:  ?- run_samples.

ptq(s,[john,is,a,man],[man(j),find(j,j)]).
Ans = true.
ptq(s,[every,man,is,john],[man(j),find(j,j)]).
Ans = true.
ptq(s,[every,man,is,john],[man(j),man(k)]).
Ans = false.
ptq(s,[every,man,finds,every,man],[man(j),man(k),find(j,j)]).
Ans = false.
ptq(pn,[john],[]).
Ans = [j].
ptq(np,[a,unicorn],[male(j),male(b),female(m),unicorn(u),find(m,u),walk(j),walk(b),walk(m)]).
Ans = [[b,j,m,u],[b,j,u],[b,m,u],[b,u],[j,m,u],[j,u],[m,u],[u]].
ptq(s,[a,unicorn,walks],[male(j),male(b),female(m),unicorn(u),find(m,u),walk(j),walk(b),walk(m)]).
Ans = false.
ptq(vp,[find,a,unicorn],[male(j),male(b),female(m),unicorn(u),find(m,u),walk(j),walk(b),walk(m)]).
Ans = [m].
ptq(s,[john,finds,a,unicorn],[male(j),male(b),female(m),unicorn(u),find(m,u),walk(j),walk(b),walk(m)]).
Ans = false.
ptq(tv,[find],[man(a),find(j,a),walk(j)]).
Ans = [j-a].
ptq(itv,[walk],[man(a),find(j,a),walk(j)]).
Ans = [j].
ptq(vp,[find,a,unicorn],[unicorn(a),find(j,a),walk(j)]).
Ans = [j].
ptq(s,[john,walks],[man(a),find(j,a),walk(j)]).
Ans = true.
ptq(s,[every,man,walks],[man(a),find(j,a),walk(j)]).
Ans = false.
ptq(s,[every,man,walks],[man(j),find(j,a),walk(j)]).
Ans = true.

true.

ptqfrag:  ?- use_module(util('pac-test')).
ptqfrag:  ?- module(pac_test).
pac_test:  ?- test.

	math:nCr(100, 50, A) ==> 100891344545564193334812497256.

	Sum=sum(0), for( 1 - 100, pred(Sum, [I]:- ( arg(1, Sum, S), Si is S+I, setarg(1, Sum, Si)))) ==> sum(5050).

test done.
true.

pac_test:  ?- use_module(zdd('minato-r2')).
pac_test:  ?- module(minato_r2).
minato_r2:  ?- time(test(rect(5,5), C)).
% 16,798,943 inferences, 1.074 CPU in 1.079 seconds (100% CPU, 15638314 Lips)
C = 1262816.

How can rect(5,5) only take 1.079 seconds when rect(4,4) takes
0.960 seconds as you showed recently here:

Maybe minato_r2 is an old version where 5 means 4? How fast is
rect(5,5) nowadays with your recent pac version?

How about a timing plot where the x-axis is N and the y-axis is the time spent?

As for pack library(pac/zdd), I have removed all sources older than 1.9.8 because of a big change: dropping shift/reset in favour of global variables. This also means I can't reproduce the minato-rn.pl series any more. Sorry for this inconvenience; I am not good at managing multiple histories of sources. I had to settle on the simpler 'frontier-vector.pl', which is conceptually simpler, along the lines of Knuth (mate) and Minato (ZDD).
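To give an idea of the global-variable style (a toy sketch only, not pac's actual code), here is the 1..100 sum from the pac-test log above, keeping the accumulator in a non-backtrackable global variable; the ZDD code keeps its state in a global variable in a similar spirit, read with b_getval/2 as in the add_links/4 snippet further down.

% Toy sketch: accumulate with a non-backtrackable global variable
% instead of threading state through shift/reset continuations.
sum_to(N, Sum) :-
    nb_setval(acc, 0),
    forall(between(1, N, I),
           (   nb_getval(acc, S0),
               S is S0 + I,
               nb_setval(acc, S)
           )),
    nb_getval(acc, Sum).

% ?- sum_to(100, S).
% S = 5050.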

My question, again, is why my "faithful implementation" in SWI-Prolog of the Knuth-Minato approach hopelessly fails for rect(n,n) with n>12. I would be glad if the cause were serious bugs in my code rather than SWI-Prolog being seen as an inappropriate language for such a mate-ZDD combination. I would like to keep going my way of using functor, setarg, and term_hash.
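To make that concrete, a toy sketch in that style (illustration only, not the actual pac data structures): a fixed-size bucket vector created with functor/3, destructively updated with setarg/3, and indexed via term_hash/2 (keys assumed ground).

% Toy hash table in the functor/setarg/term_hash style (illustration
% only, not the actual pac code). Keys are assumed to be ground.
ht_new(Size, ht(Vec)) :-
    functor(Vec, buckets, Size).            % buckets(_,_,...,_), unbound slots

ht_put(ht(Vec), Key, Value) :-
    ht_index(Vec, Key, I),
    (   arg(I, Vec, Slot), var(Slot)        % empty slot so far
    ->  Old = []
    ;   arg(I, Vec, Old)
    ),
    setarg(I, Vec, [Key-Value|Old]).        % destructive (trailed) update

ht_get(ht(Vec), Key, Value) :-
    ht_index(Vec, Key, I),
    arg(I, Vec, Slot),
    nonvar(Slot),
    memberchk(Key-Value, Slot).

ht_index(Vec, Key, I) :-
    functor(Vec, _, Size),
    term_hash(Key, H),                      % H is bound for ground keys
    I is H mod Size + 1.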

How about this? It was the only way for me to install the latest pac:

?- pack_install(pac, [url('http://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz')]).

Yeah, it works better; it's loading now. But you wrote that you are using the code in library(pac/zdd/vecter-frontier.pl). What does this mean? How do I run rect(5,5) step by step with the newest version, so that the result is comparable to this rect(4,4) result:

For example, this from the SWI pac documentation page doesn't work:

?- [library(pac)].
true.
?- module(pac).
true.
pac:  ?- time(rect_path_count(rect(5,5), C)).
ERROR: Unknown procedure: (:)/2 (DWIM could not correct goal)

And what is written in this thread doesn't work either:

pac:  ?- use_module(zdd('minato-r2')).
ERROR: source_sink `zdd('minato-r2')' does not exist

So how do I run rect(5,5) step by step after loading pac?

First, edit the second clause of add_links/4 in library(pac/'frontier-vector.pl')
as follows.

add_links(EF, [A-Ns|Ls], X, Y):-!,
	b_getval(zdd_vec, #(C, _)),     % <<<< insert this line
	writeln(A-C),					% <<<< insert this line
	bdd_cons(X0, A-A, X),
	add_links_(A, Ns, X0, X1),
	prune_by_frontier(EF, X1, X2, A),
	slim_gc(X2, X3),
	add_links(EF, Ls, X3, Y).

Then, I hope, this log is reproducible, as you requested.

% swipl
Welcome to SWI-Prolog (threaded, 64 bits, version 9.3.11-18-g6b876547a)
SWI-Prolog comes with ABSOLUTELY NO WARRANTY. This is free software.
Please run ?- license. for legal details.

For online help and background, visit https://www.swi-prolog.org
For built-in help, use ?- help(Topic). or ?- apropos(Word).

?- use_module(library(pac)).     % v 1.9.8
true.

?- use_module(zdd(zdd)).
true.

?- zdd.   % initialization for zdd to run.

?- use_module(zdd('frontier-vector')).
true.

?- module(frtvec).
true.

frtvec:  ?-  rect_path_count(rect(2,2), X).
9
8
7
6
5
4
3
2
1
done
X = 12.

What are these interspersed numbers? They don't make any sense to me.
If I remove the interspersed numbers, I get:

% 10,655 inferences, 0.000 CPU in 0.009 seconds (0% CPU, Infinite Lips)
% 97,767 inferences, 0.000 CPU in 0.023 seconds (0% CPU, Infinite Lips)
% 646,833 inferences, 0.016 CPU in 0.060 seconds (26% CPU, 41397312 Lips)
% 3,773,343 inferences, 0.016 CPU in 0.285 seconds (5% CPU, 241493952 Lips)
% 20,547,077 inferences, 1.172 CPU in 1.598 seconds (73% CPU, 17533506 Lips)
% 105,853,352 inferences, 8.625 CPU in 9.167 seconds (94% CPU, 12272852 Lips)
% 531,968,574 inferences, 45.359 CPU in 52.321 seconds (87% CPU, 11727864 Lips)
Etc.

Which gives this “plot” so far:

[plot: wall-clock time vs. n for rect(n,n), logarithmic y-axis]

Since I used a logarithmic y-axis, this algorithm looks exponential.
What does the space(N) plot of your algorithm look like?

I used them as a "progress bar" to know how many nodes remain to be added.

I see. Somehow I had the impression that each step from n to n+1 of the rectangle size needs about 6 times longer.
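A quick check of that impression against the wall-clock seconds posted above:

% Ratios of consecutive wall-clock times copied from the timing log above.
growth_ratios(Ratios) :-
    Times = [0.009, 0.023, 0.060, 0.285, 1.598, 9.167, 52.321],
    findall(R, ( nextto(T1, T2, Times), R is T2/T1 ), Ratios).

% ?- growth_ratios(Rs).
% Rs is roughly [2.6, 2.6, 4.8, 5.6, 5.7, 5.7]

So the later steps indeed grow by roughly a factor of 5-6 per increment of n, consistent with the exponential look of the plot.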

I have no idea. However, according to Activity Monitor, for rect(n, n) the VM size seems to stop increasing (it becomes stable) in the middle of a long computation, which seemed just strange to me.

If you look at the page, something odd is going on. Whereas all packs give a history of versions and their hashes, this one does not. I'll try to figure out why, but it may take a while …

From a security point of view the website is outdated.
It should be migrated to HTTPS:

list:1 Mixed Content: The site at ‘https://www.swi-prolog.org/’ was loaded over a secure connection, but the file at ‘https://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz’ was redirected through an insecure connection. This file should be served over HTTPS. See Chromium Blog: Protecting users from insecure downloads in Google Chrome for more details.

So in Chrome you cannot click on the .tar and download it.
It only worked for me with an alternative browser.
Maybe the HTTP client used by the SWI-Prolog pack manager has
meanwhile also implemented such security policies?

I think SWI-Prolog should move on, and also maybe
warn about unsafe websites. The same goes for the Prolog FAQ
on comp.lang.prolog; it still has HTTP links. Also, many
servers are missing automatic HTTP-to-HTTPS promotion.

Which is also missing for the *.jp site; I can try HTTP:

http://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz

And get this in Chrome:
[Chrome screenshot]

Downloading packs over http is not dramatic. The first registration causes the server to download the pack for analysis and it computes a hash. Any subsequent attempt to install the pack will compare the hash and print a warning if the hashes are inconsistent.
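For what it's worth, a minimal sketch of how one could re-check such a hash locally (assuming the listed value is a SHA-1 of the archive file, which its 40 hex digits suggest; I don't know the server's exact procedure):

:- use_module(library(crypto)).

% Compare a locally computed digest of a downloaded pack archive with
% the hash shown on the pack page. Assumption: the listed hash is a
% SHA-1 of the archive file; the server's actual procedure may differ.
verify_pack_archive(File, Expected) :-
    crypto_file_hash(File, Hash, [algorithm(sha1)]),
    (   Hash == Expected
    ->  format('hash matches: ~w~n', [Hash])
    ;   format('HASH MISMATCH~nexpected: ~w~ncomputed: ~w~n', [Expected, Hash])
    ).

% ?- verify_pack_archive('pac-1.9.8.tgz',
%        '526129e98f3910766eace5d63eaf7097739a7c5b').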

I think the best option for distributing packs is to host them as a git repository. That gives users full control and easy inspection of the versions while bulk git hosters do more and more to ban malicious users and code. In addition, they provide stable and probably fairly secure hosting.

One of the problems with github is that release tarballs seem to be created on the fly and (thus) their hash is not consistent :frowning:

SWI-Prolog can install from a github repo in two ways. By default, it uses git. If git is not installed, it will download the indicated version as a tar image and unpack it using its library(archive).
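For completeness, the two install routes look roughly like this (the repository URL below is hypothetical, just to show the shape of the call):

% Hypothetical repository URL, for illustration only.
% With git installed, pack_install/1 clones the repository:
?- pack_install('https://github.com/someuser/somepack.git').

% The registered-pack route used earlier in this thread downloads the
% archive given in the url/1 option:
?- pack_install(pac, [url('http://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz')]).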

Git also produces an HTTPS warning and does not allow plain HTTP at all;
it works like an HTTP client that has redirect(true):

$ git clone http://www.foo.ch/bar/.git
Cloning into 'dogelog'...
warning: redirecting to https://www.foo.ch/bar/.git/
Fetching objects: 4613, done.

The warning tells me I have a security glitch by initially using HTTP,
and that I should use an HTTPS link from the start.

Your SWI-Prolog git library might suppress this warning, which is not OK
in my opinion. To avoid the warning, I directly publish an HTTPS link:

For binary files and source code initially:
git clone https://www.foo.ch/bar/.git

But it's up to the server to provide HTTPS at all.

Since spoofing git content is so easy and non-sandboxed Prolog code is a rather sensitive thing, I guess this is why bothering with HTTPS and an HSTS (HTTP Strict Transport Security) policy could be important. SWI-Prolog packs are non-sandboxed, unlike SWISH notebooks, right?

Here is what ChatGPT says:

An HTTP to HTTPS redirect vulnerability occurs when an insecure HTTP connection is used to redirect users to a secure HTTPS connection, but the initial HTTP request is not adequately protected. Here’s how this vulnerability might be exploited:

  1. Man-in-the-Middle Attack (MitM): Since HTTP is unencrypted, an attacker intercepting the initial HTTP request could manipulate the redirection process before the user reaches the secure HTTPS site. This could involve:
  • Redirecting the user to a malicious site that looks identical to the intended destination.
  • Modifying the content in transit, such as injecting malicious scripts.
  2. Downgrade Attacks: Attackers could attempt to keep users on an HTTP connection instead of redirecting them to HTTPS, leaving communication vulnerable to eavesdropping or tampering.

The severity of an HTTP to HTTPS redirect vulnerability can vary depending on the context, but it is generally considered moderate to high, depending on the following factors:

  • Moderate: For non-sensitive sites where the main risk is traffic manipulation (e.g., content modification or ads injection) without significant consequences.
  • High: For sites handling sensitive user data (e.g., financial services, medical information), especially when users are likely to connect over insecure networks like public Wi-Fi.

Packs are difficult when it comes to security. You first of all have to trust the person(s)/organization that created the pack. This includes both trusting that there is no malicious intent and that the coding is secure, which depends on what the pack does.

The server analyses the pack in a sandboxed environment, giving some idea of its content. The download counts and history are another indication of trust. Notably, the analysis could probably do more, e.g., does the pack use any file, network, etc. access predicates? Does it have install scripts?

Anyway, once you trust the pack, you can download and install it. The installer keeps track of installed content hashes and reports if the same version was installed with different hashes. That covers most of the issues with downloading from insecure locations or over insecure connections.

So yes, best is to use https, but in this case the risk is not too bad as we mitigate against modified content using hashes. I'm happy to add an extra warning to the installer when downloading using http.

P.s. installation of pac works again. There was a broken version of pac that caused no version to be announced.

> swipl pack install pac
% Contacting server at https://www.swi-prolog.org/pack/query ... ok
Installation plan:
  Install pac at version 1.9.8 from http://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz (downloaded 241 times)
Download packs? Y/n?

Man-in-the-middle attacks are typically characterized by the involvement of a further entity. Neither you as a client nor the server that houses the pack can help it if your connection security is compromised.

You open the door wide to such a vulnerability as soon as you use insecure HTTP, i.e. HTTP without the S, where the S stands for secure, even if it's only the small window of a redirect, not to speak of a download.

Any content can be spoofed and hashes can be recreated; that's all pretty easy. What is stored on https://www.swi-prolog.org/ can be easily compromised.

Why does it show the HTTP URL http://web.sfc.keio.ac.jp/ here:

It doesn't make any sense; that is the definition of insecure.
Disclaimer: there are also people who claim that HTTPS itself
isn't secure enough, for various reasons.

Edit 20.09.2024:
It has absolutely nothing to do with some malicious intent
of the server that houses the pack. The vulnerability is
caused by some man-in-the-middle.

Of course it requires a little imagination and paranoia
to visualize such a scenario. Maybe you were never
sitting in an old-style internet cafe 30 years ago,
where the owner had installed a keyboard sniffer, and suddenly
all your servers had vermin trying to acquire a root
password, and you got phone calls from the ISP?

Now that these old-style internet cafes are gone,
the methods differ a little bit, but the malicious goals
are the same. And the intent doesn't come from the
server with the pack.

For example www.swi-prolog.org doesn’t use DNSSec:

https://en.internet.nl/site/www.swi-prolog.org/2964601/

Not sure how important this is for preventing rogue internet addresses.

But don't worry. I am currently inspired by SWI-Prolog, trying
to bring the final_url http result option to my HTTP clients. Pity
there is no modern PIP for an HTTP client in Prolog.

I am working on a realization just now. Maybe that's why I am currently
sensitive to HTTPS issues. What's also funny: the version http
result option is not available for JavaScript fetch(), right?
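As a concrete illustration of that option in SWI-Prolog (a minimal sketch; the URL is the pack download link from this thread, and whether it redirects is of course up to the server):

:- use_module(library(http/http_open)).

% Report where a URL actually ends up after redirects, using the
% final_url/1 option of http_open/3. A HEAD request is enough here.
check_redirect(URL) :-
    setup_call_cleanup(
        http_open(URL, In, [method(head), final_url(Final)]),
        format('requested: ~w~nfinal:     ~w~n', [URL, Final]),
        close(In)).

% ?- check_redirect('http://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz').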

You keep forgetting the content hash check. That is harder to bypass, as the https problems indicated are lack of HSTS (which is irrelevant because the package manager talks directly to https) and accepting phased-out cyphers. Phased-out cyphers are irrelevant if the client negotiates a good cypher, which should be the case if SWI-Prolog is not compiled against an ancient version of OpenSSL. So, the pack manager has pretty reliable access to the registered content hash for some pack at some version.

Finally, how the packager delivers the pack is up to them. That should be part of your assessment of whether you trust the package. That said, I'm not against forcing packagers to host their packs on an HTTPS-accessible site.

DNSSEC is provided on the backends, but apparently not on the CDN. The backends also have no outdated cyphers. They come to a 94% score, lack of HSTS probably being the major issue.

That's correct, I get:

?- setting(prolog_pack:server, ServerBase).
ServerBase = 'https://www.swi-prolog.org/pack/'.

The pack server could nevertheless act as a multiplier of
malicious software. For example, if we look at supply chain
attacks, then the weakest link determines the overall security.

How do you initially compute the hash? @kuniaki.mukai's
page doesn't have HTTP-to-HTTPS promotion, and
here he has published an HTTP URL:

Package “pac”
1.9.8 526129e98f3910766eace5d63eaf7097739a7c5b 3 http://web.sfc.keio.ac.jp/~mukai/pac-1.9.8.tgz

https://www.swi-prolog.org/pack/list?p=pac

And the hash is listed side by side with an HTTP URL, which
doesn't make much sense to me, since it's not an HTTPS URL.
A hacker can use this as a gateway to distribute a tampered
.tar that automatically has a tampered hash.

And it's not a blockchain and/or distributed: you compute the hash from the
downloaded .tar alone on the client side, and what is computed on the client
side is identical to the server side, so there is no additional
security. Or maybe there is additional security? How is pack
upload realized on the packager side? I don't know…

I have tested the command line, and it works as posted! Thanks.

Although I'm too old (though not in heaven yet) to give strict attention to net security
at the level required nowadays, I could remove my tgz of the library(pac) source from the net if security comes to matter at some level. That said, I would like to take the fact that the command line works to mean that it is still allowed.

Thanks for the valuable comment. So far I did not pay attention to the difference between http and https
in the download option. I will use https for the download URL option from the next version.