Linear and logistic regression - Reply 3

@friguzzi

Another option is to use Python’s sklearn via Janus, which also offers many other common data science algorithms.

Of course it is. I think it’s helpful to have a Prolog implementation though, as one can tinker with it very easily to try things out (in my case, experimenting with combinations of logistic and noisy-or regression).
Using the Prolog matrix library will always limit scalability, so locally I usually replace it with a C implementation called via the FLI (which I had set up before Janus existed).
The pure Prolog version runs on SWISH though, which is a big plus.

Would you mind saying a bit about what you use noisy-or for in the context of Prolog?

Sure! It’s not exactly about Prolog though, rather about probabilistic logic programming.
In a probabilistic logic program, every (grounding of a) clause fires with a certain probability. So if you have clauses

0.2 :: alarm :- earthquake.
0.5 :: alarm :- burglary.

that express that a burglary is 50% likely to trigger an alarm and an earthquake is 20% likely to trigger it, then in a situation of a burglary during an earthquake the probability of an alarm is 1 − (1 − 0.2)(1 − 0.5) = 0.6, which is computed by treating the two causes as independent. This is the “noisy-or” of 0.2 and 0.5.
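The arithmetic can be sketched in a few lines of Python (just illustrating the combination rule; the probabilities 0.2 and 0.5 are the ones from the clauses above):

```python
def noisy_or(probs):
    """Probability that at least one of several independent causes fires."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p  # probability that this cause does NOT fire
    return 1.0 - p_none

# Burglary (0.5) during an earthquake (0.2):
p_alarm = noisy_or([0.2, 0.5])  # = 1 - (1 - 0.2) * (1 - 0.5), i.e. 0.6
```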
Learning and inference for such probabilistic logic programs is implemented in Prolog by Fabrizio Riguzzi’s cplint suite.
I am currently working in a hybrid domain of both Boolean and continuous variables for an application, and one way (of several) of adapting this framework is to make the probability annotation of a clause a function of the continuous body atom. The logistic function is an obvious choice for that, which leads me to fitting such a mixed noisy-or-logistic regression.
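A minimal Python sketch of what I mean (the clause parameters and feature values here are made up for illustration; the point is just that each clause’s probability annotation becomes a logistic function of its continuous body atom, and the head probability is the noisy-or of those):

```python
import math

def logistic(z):
    """Standard logistic (sigmoid) function."""
    return 1.0 / (1.0 + math.exp(-z))

def noisy_or(probs):
    """Noisy-or combination of independent clause probabilities."""
    p_none = 1.0
    for p in probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

# Each clause i contributes probability logistic(w_i * x_i + b_i),
# where x_i is the value of the continuous atom in its body and
# (w_i, b_i) are the parameters to be fitted. Values are illustrative.
clauses = [(1.5, -0.5, 2.0),   # (w, b, x) for clause 1
           (0.8,  0.0, -1.0)]  # (w, b, x) for clause 2

p_head = noisy_or([logistic(w * x + b) for (w, b, x) in clauses])
```

Fitting then means choosing the `(w, b)` pairs so that `p_head` matches the observed frequencies of the head atom, e.g. by gradient ascent on the likelihood.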
