Prolog to GPT API Reply 1

Nice pack. Does it still work? Testing an example yields:

12 ?- gpt_completions('text-davinci-003','My favourite animal is ',Result,_,[]).
ERROR: url `'https://api.openai.com/v1/chat/completions'' does not exist (status(404,Not Found))
ERROR: In:
ERROR:   [16] throw(error(existence_error(url,'https://api.openai.com/v1/chat/completions'),context(_934,...)))
ERROR:   [14] http_open:try_http_proxy(direct,'<garbage_collected>','<garbage_collected>','<garbage_collected>') at c:/program files/swi-prolog 8.5.8/library/http/http_open.pl:490
ERROR:   [12] http_client:http_get('https://api.openai.com/v1/chat/completions',_1004,[post(...),...|...]) at c:/program files/swi-prolog 8.5.8/library/http/http_client.pl:139
ERROR:   [10] prolog2gpt:gpt_completions('text-davinci-003','My favourite animal is ',_1056,_1058,[]) at c:/users/fernan/desktop/gpt prolog/prolog2gpt.pl:296
ERROR:    [9] toplevel_call('<garbage_collected>') at c:/program files/swi-prolog 8.5.8/boot/toplevel.pl:1162
ERROR:
ERROR: Note: some frames are missing due to last-call optimization.
ERROR: Re-run your program in debug mode (:- debug.) to get more detail.
13 ?-

However, another test does work:

13 ?- gpt_models(Models).
Models = ['dall-e-2', 'text-embedding-ada-002', 'text-embedding-3-large', 'babbage-002', 'o1-mini', 'davinci-002', 'o1-mini-2024-09-12', 'whisper-1', 'dall-e-3'|...].

Tests run in Windows 11 and SWI-Prolog 8.5.8.

Answering myself after trying different goals: it turns out that you must specify a model that is actually available to your OpenAI API key. In my case, gpt-3.5-turbo:

6 ?- gpt_completions('gpt-3.5-turbo','My favourite animal is ',Result,_,[]).

Result = 'a dog. They are loyal, loving, and provide great companionship. They are always happy to see you and show unconditional love. Dogs also have unique personalities and each one is special in their own way. I love spending time with dogs, whether it\'s going for walks, playing fetch, or just cuddling on the couch. They bring so much joy and happiness into my life.' ;
false.

Otherwise, you get that error, which misleadingly appears to point to a non-existent URL, when the real problem is the model name.
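For anyone who wants to sanity-check the endpoint without going through the pack, here is a minimal sketch of the same call using only the standard SWI-Prolog HTTP libraries. The predicate name chat_completion/3 is mine, not part of prolog2gpt, and it assumes your key is in the OPENAI_API_KEY environment variable:

```prolog
:- use_module(library(http/http_open)).
:- use_module(library(http/http_json)).  % enables post(json(Dict))
:- use_module(library(http/json)).

%! chat_completion(+Model, +Prompt, -Text) is det.
%
%  Sketch only: POST directly to the chat completions endpoint
%  with a model that is available to your key.
chat_completion(Model, Prompt, Text) :-
    getenv('OPENAI_API_KEY', Key),
    atom_concat('Bearer ', Key, Auth),
    Payload = _{ model: Model,
                 messages: [ _{role: user, content: Prompt} ] },
    setup_call_cleanup(
        http_open('https://api.openai.com/v1/chat/completions', In,
                  [ post(json(Payload)),
                    request_header('Authorization'=Auth)
                  ]),
        json_read_dict(In, Reply),
        close(In)),
    [Choice|_] = Reply.choices,
    Text = Choice.message.content.
```

With a valid key this should behave like the working gpt_completions/5 call above, e.g. `?- chat_completion('gpt-3.5-turbo', 'My favourite animal is ', T).` With a model your key cannot use, you should see the same 404 from http_open, which supports the diagnosis that the "url does not exist" message is really the API rejecting the model.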


Why only an API, and not a TensorFlow DSL?

With these use cases:

  • Run the TensorFlow DSL locally, interpreted in your Prolog system.
  • Run the TensorFlow DSL locally, compiled in your Prolog system.
  • Run the TensorFlow DSL locally on your TPU.
  • Run the TensorFlow DSL remotely on a compute server.
  • What else?

Maybe also support the ONNX file format?