Building out A Rest Client in Prolog - ChatGPT

So my immediate blocker right now in building out the REST API is having no clue where to put a JSON body into a POST request. There is zero documentation that I can find anywhere in the standard library, and the source code doesn’t give many clues either.

I mean, Prolog can be an awesome, practical scripting language on par with Python if we want it to be. Where is that edge? Let’s push ourselves to it!

On one side, I just want to solve my problem. It shouldn’t have taken an entire afternoon without any success. I may not be the most savvy HTTP-slinging protocol wonk, but I’ve made enough request-response calls in other languages to know that it shouldn’t be this opaque. So I open this up as a challenge: help me help us write up an awesome tutorial (or series) that shows others how this language we love so much can be used for practical things, while also being the most declarative kid on the block. That’s the real nub.

Now for the practical side that I’m stuck on:

The simple request call works just fine; I had this going in no time. But it’s not very interesting to look at a list of models. I’m caching them so that we don’t make repeat calls. I wonder what folks think of this technique?

:- dynamic model_data/1.

models(Models) :-
    (   model_data(Models)          % if model_data is in the database
    ->  true                        % we are done
    ;   % otherwise make a request call
        base(Base), headers(Headers), api_version(V),
        URL = '~w/openai/models?~w' $ [Base, V],
        http_open(URL, In, Headers),
        json_read_dict(In, Res),
        close(In),
        Models = Res.data,
        assertz(model_data(Models)) % and store in the local db
    ).
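Since the cache lives in the dynamic database, it helps to have a way to invalidate it. A small helper could look like this (the name clear_models/0 is my own invention, not from the post above):

```prolog
:- dynamic model_data/1.

% Drop the cached model list so the next models/1 call refetches it.
clear_models :-
    retractall(model_data(_)).
```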

But the real issue is sending a POST request with the JSON object. This is the approach that has gotten me the furthest so far… I’m getting a 411 status code back, “Length Required”.

completion(Prompt, Model, Response) :-
    base(Base), req_headers(Headers), api_version(V),
    URL = '~w/openai/deployments/~w/completions?~w' $ [Base, Model, V],
    Data = json([prompt=Prompt]),
    atom_json_term(A, Data, []),       % serialise JSON to atom A (unused below)
    http_open(URL, Out, [method(post)
                         | Headers]),
    json_write(Out, Data),             % write JSON onto the connection
    close(Out),
    http_read_data(Response, _, [json_object(dict)]).

Oh, by the way, don’t mind the $ above… it’s just a function-application metapredicate from Mndrix’s awesome little library(func) that I use to do atom concatenation more succinctly.
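For readers who don’t have library(func) installed, the same atom construction can be done with plain format/3; model_url/3 here is just an illustrative name, not part of the code above:

```prolog
% Equivalent of '~w/openai/models?~w' $ [Base, V] without library(func).
model_url(Base, V, URL) :-
    format(atom(URL), '~w/openai/models?~w', [Base, V]).
```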

Yeah. The organization of the libraries is quite ok, but it is hard to assemble what you need from the docs :frowning: Here is a minimal server and client:

:- use_module(library(http/http_server)).
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).

% A simple server that echoes the JSON POST request

server :-
    http_server([port(8082)]).

:- http_handler(root(trip), trip, []).

trip(Request) :-
    http_read_data(Request, JSON, []),
    reply_json(JSON).

% The client

client(Data) :-
    http_get('http://localhost:8082/trip',
             Reply,
             [ post(json(Data)),
               json_object(dict),
               value_string_as(atom)
             ]),
    print_term(Reply, []).

You can also use http_open/3 rather than http_get/3 and read the result yourself. Posting works using the same options.

The post option has proved to be a bit confusing. The outer functor (json(_)) defines the data type; the argument term must be data in a format acceptable to that type. Multiple formats are supported by means of the hook http:post_data_hook/3. That is why we need library(http/http_json), which extends the hook to support JSON.

On the receiving side, http_get/3 uses the hook http_client:http_convert_data/4 to convert the wire data, based on the HTTP Content-Type, to a suitable Prolog term. Again, library(http/http_json) extends this hook to support JSON. The json_object(dict) and value_string_as(atom) options are passed to the JSON parser to tweak the output format.
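As a sketch of the http_open/3 route (assuming the echo server above is running on port 8082; client_open/2 is an illustrative name), posting and parsing the reply by hand looks like:

```prolog
:- use_module(library(http/http_open)).
:- use_module(library(http/http_json)).  % installs the JSON post hook
:- use_module(library(http/json)).       % json_read_dict/2

% POST Data as JSON and parse the JSON reply ourselves.
client_open(Data, Reply) :-
    http_open('http://localhost:8082/trip', In,
              [ post(json(Data)) ]),
    call_cleanup(json_read_dict(In, Reply), close(In)).
```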

http_get/3 is a thin layer on top of http_open/3, so you can use all the http_open/3 options (e.g., for authentication).
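For instance, http_open/3 authentication options pass straight through http_get/3. This sketch shows the Azure-style api-key request header and, in a comment, the authorization(bearer(Token)) option; the environment-variable name and client_auth/2 are made up for illustration:

```prolog
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).

% POST with authentication options passed through http_get/3.
client_auth(Data, Reply) :-
    getenv('API_KEY', Key),                      % placeholder env var
    http_get('http://localhost:8082/trip', Reply,
             [ post(json(Data)),
               json_object(dict),
               request_header('api-key'=Key)     % or: authorization(bearer(Key))
             ]).
```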

An example run:

swipl post.pl
...
?- server.
% Started server at http://localhost:8082/
true.
?- client("test").
test
true.
?- client(#{prompt:"hello world!"}).
_{prompt:'hello world!'}

Note that the dict tag is ignored. If I am not interested in the tag, I tend to use #{} these days. I used to use _{}, but that creates a non-ground term while this is ground data.
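The difference is easy to check at the top level (assuming an SWI-Prolog version that accepts the #{...} syntax):

```prolog
?- ground(_{prompt:"hi"}).   % fresh-variable tag makes the dict non-ground
false.

?- ground(#{prompt:"hi"}).   % the '#' tag is an atom, so this is ground
true.
```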


Have you seen https://github.com/RdR1024/prolog2gpt/blob/3a85c004f941bd82372e284fca32195650378c33/src/prolog/prolog2gpt.pl#L291 ?

Thread: Prolog to GPT API


Cool, this looks helpful and interesting. I still need something that talks to Azure’s OpenAI service, since I have some credits there, so I will still need to build out my own client a bit more. Thanks for sending!

There’s also a simple server here: GitHub - kamahen/swipl-server-js-client: Sample SWI-Prolog server with JavaScript client … it’s REST-ish, in that the server acts synchronously for each request-reply (it’s multi-threaded, so it can handle multiple requests at a time); the requests and replies use JSON.

I wrote this when I realized that the existing tutorials had become out of date.

completions(Prompt, Model,  Response) :-
    base(Base), req_headers(Headers), api_version(V),
    URL = '~w/openai/deployments/~w/completions?~w' $ [Base, Model, V],
    Data = #{model: Model, prompt: Prompt},
    http_get(URL, Response, [post(json(Data)),
                             json_object(dict),
                             value_string_as(atom)
                            | Headers]),
    print_term(Response, []).

It’s still not quite working.

This is how the current predicate looks. It’s failing, and I’m not sure exactly why. How should I be thinking about packaging up JSON in the POST? And what does json_object(dict) do? value_string_as(atom)?

Appreciate the feedback :bowing_man:

As in false or an error? If so, which error? You have two options: trace it, or use

?- debug(http(_)).

which should print a lot of messages about what is being exchanged. Make sure not to use swipl -O, as optimized mode removes the debug statements.

json_object(dict): get the response back as a dict rather than the old json([Key=Value, ...]) representation.

value_string_as(atom): map {"key":"value"} to _{key:value} rather than _{key:"value"}.
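Putting those answers together, a hedged sketch of what completions/2 might become (base/1, req_headers/1, and api_version/1 are the asker’s own predicates; the endpoint shape and the use of plain format/3 instead of $ are assumptions, not a verified Azure call):

```prolog
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).

completions(Prompt, Model, Response) :-
    base(Base), req_headers(Headers), api_version(V),
    format(atom(URL), '~w/openai/deployments/~w/completions?~w',
           [Base, Model, V]),
    http_get(URL, Response,
             [ post(json(#{model:Model, prompt:Prompt})),
               json_object(dict)
             | Headers ]).
```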
