Prompts for LLMs (Large Language Models) such as ChatGPT, related to Prolog

These topics collect ChatGPT prompts for understanding the strengths and weaknesses of using ChatGPT for programming in SWI-Prolog.

Discourse (the software for this forum) currently limits a post to 32,000 characters. The entire set of examples is several times that size. We could increase the limit, but there is really no need; posts exceeding it are rare. A table of contents is provided on the right to make navigation easier.

For those new to Prolog, be aware that many of the replies from ChatGPT may contain flaws, inaccuracies, and/or hallucinations. It is advisable to critically evaluate each answer and spend time noting both its strengths and weaknesses. Trust but verify.

From ChatGPT, regarding emails, links and other actions:

You should always verify any information or claims that ChatGPT makes with other sources, and do not rely on it for any critical or sensitive decisions or actions. ChatGPT is not a substitute for human judgment, expertise, or responsibility.

Understanding ChatGPT

ChatGPT Glossary
  1. Artificial Intelligence - The ability of machines to perform tasks that typically require human intelligence, such as learning, problem-solving, and decision making.

  2. Neural Network - A set of algorithms and statistical models that are used to recognize patterns and relationships in data, based on the structure and function of the human brain.

  3. Artificial Neural Network (ANN) - A machine learning model that is designed to simulate the functioning of the human brain.

  4. Large Language Model (LLM) - An AI model that is trained on a large corpus of text data and is capable of generating human-like text.

  5. Natural Language Processing (NLP) - A subfield of AI that deals with the interaction between computers and human language.

  6. Prompt - A text-based input provided to a natural language processing system such as ChatGPT, which is then used to generate a corresponding response.

  7. Prompt Engineering - The process of designing and refining prompts to elicit accurate and useful responses from an AI model.

  8. Completion - The response generated by a natural language processing system such as ChatGPT in response to a prompt.

  9. Transformer Architecture - A type of neural network architecture used in large language models that allows for parallel processing of data.

  10. Pre-training - The process of training an AI model on a large corpus of text data before fine-tuning it on a specific task.

  11. Fine-tuning - The process of adjusting the pre-trained model to perform a specific task, such as language translation or sentiment analysis.

  12. Generative Model - An AI model that is capable of generating new data, such as images or text.

  13. Training Data - The data used to train an AI model, typically consisting of large amounts of text or other input data.

  14. GPT - stands for Generative Pre-trained Transformer, which is the type of neural network architecture. GPT models have been pre-trained on large amounts of text data, which allows them to generate coherent and contextually-appropriate responses to a wide range of prompts.

  15. ChatGPT - A large language model developed by OpenAI that is capable of generating human-like text in response to prompts.

  16. Hallucination - A phenomenon that can occur when using natural language processing, where the model generates text that is not relevant or coherent with the given prompt or the context of the conversation. This can be caused by the model “hallucinating”, i.e. generating responses that are not grounded in reality, such as producing fictitious or nonsensical information. It is a common challenge in natural language generation and can be addressed through techniques such as fine-tuning the model or incorporating human-in-the-loop feedback.

  17. Token - A single unit of meaning in text or speech. This could be a word, punctuation mark, or other individual element that can be processed by a machine learning algorithm. Tokens are often used to represent the input data that is used to train language models, and are a key component of natural language processing.

  18. Deterministic - An algorithm or model that will always produce the same output given the same input. This is in contrast to probabilistic models, which may produce different outputs for the same input due to randomness or uncertainty. Deterministic models are often used in situations where consistency and reproducibility are important, such as in scientific simulations or financial modeling.

  19. Edit - The process of modifying or revising an existing text input using a deep learning model known as a transformer. The model takes an existing text input and provides suggestions for how to modify it, such as correcting grammar or rephrasing sentences.

  20. Zero-shot prompt - A prompt that has no examples.

  21. One-shot prompt - A prompt that has one example.

  22. Few-shot prompt - A prompt that has more than one example.

  23. Capability - The ability of an AI system to perform a specific task or set of tasks. This includes the ability to learn from data, recognize patterns, make predictions, and take actions. It is a measure of the system’s capacity to produce outputs that are accurate, reliable and efficient.

  24. Alignment - The process of ensuring that an AI system's behavior matches the intentions and values of its designers and users. In the context of language models, alignment involves techniques such as fine-tuning with human feedback so that responses are helpful, accurate, and appropriate to the context in which they are used. Good alignment is essential for producing safe and effective AI models.

  25. Emergent Ability - A novel behavior or capability that arises from the complex interactions of multiple components in a system, without any explicit programming or design. (ref) Examples:

  • The ability of GPT-3 to translate languages it has not been explicitly trained on, due to its ability to generate coherent and contextually appropriate text.
  • The emergence of collective intelligence in swarm robotics, where individual robots coordinate their behavior to perform complex tasks such as collective transport or mapping.
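To make the zero-shot, one-shot, and few-shot distinctions above concrete, here is a minimal sketch in Python that assembles each kind of prompt from an instruction and optional worked examples. The task and example text are purely illustrative, not taken from any real ChatGPT session:

```python
def build_prompt(instruction, examples=()):
    """Assemble a prompt from an instruction and zero or more worked examples."""
    parts = [instruction]
    for given, expected in examples:
        # Each worked example shows the model an input and the desired output.
        parts.append(f"Input: {given}\nOutput: {expected}")
    parts.append("Input:")  # the model completes from here
    return "\n\n".join(parts)

instruction = "Translate each Prolog fact into an English sentence."

# Zero-shot: no examples, just the instruction.
zero_shot = build_prompt(instruction)

# One-shot: a single worked example.
one_shot = build_prompt(
    instruction, [("parent(tom, bob).", "Tom is a parent of Bob.")])

# Few-shot: more than one worked example.
few_shot = build_prompt(instruction, [
    ("parent(tom, bob).", "Tom is a parent of Bob."),
    ("likes(mary, wine).", "Mary likes wine."),
])
```

In practice, a few well-chosen examples often steer the completion more reliably than a longer instruction alone.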

The ChatGPT Glossary above was initially created with a prompt like

Create a glossary, for a high school graduate, of the terms needed to understand AI, Generative Pre-trained Transformers and Large Language Models, in a logical order

It was then manually organized and over time more entries were created using a prompt like

For a high school graduate in the context of AI such as Generative Pre-trained Transformer and Large Language Models create a one or two line glossary entry for the term: <Term>.
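That per-term prompt can be generated mechanically. Here is a small sketch, assuming a plain template string whose wording mirrors the prompt above (the terms passed in are just examples):

```python
TEMPLATE = ("For a high school graduate in the context of AI such as "
            "Generative Pre-trained Transformer and Large Language Models "
            "create a one or two line glossary entry for the term: {term}.")

def glossary_prompt(term):
    """Fill the glossary-entry template for a single term."""
    return TEMPLATE.format(term=term)

# Generate one prompt per glossary term still to be written.
prompts = [glossary_prompt(t) for t in ["Token", "Alignment", "Emergent Ability"]]
```

Keeping the template fixed and varying only the term helps keep the resulting glossary entries consistent in tone and length.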

ChatGPT - technical background
  • “Attention Is All You Need”
    by Ashish Vaswani et al. (2017) - (pdf)

  • “Training language models to follow instructions with human feedback”
    by Long Ouyang et al. (2022) - (pdf)

  • “Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context”
    by Zihang Dai et al. (2019) - (pdf)

  • “The Illustrated Transformer”
    by Jay Alammar (2018) - (site)

  • by Xavier Amatriain (2023) - (pdf)

  • “Let’s build GPT: from scratch, in code, spelled out”
    by Andrej Karpathy - (YouTube)

Asking ChatGPT for research papers will often produce hallucinations. Sometimes an entire entry is a hallucination; sometimes only parts of it are, such as the title, author(s), links to external sources, DOI, etc.

The list above was created by hand the old-fashioned way, as that is faster than trying to identify the hallucinations.

Helpful hints about using ChatGPT


ChatGPT Strengths, Weaknesses and Limitations


ChatGPT Prompt Engineering


Prolog FAQ (created with ChatGPT)


ChatGPT prompts - Programming - Non-coding


ChatGPT prompts - Programming - Coding - Source Code


ChatGPT prompts - Programming - Data manipulation tasks


ChatGPT - answers to questions from elsewhere for comparison


ChatGPT prompts - Prolog - Basics


ChatGPT prompts - Prolog - any implementation


ChatGPT prompts - SWI-Prolog

  1. Link
  2. Link

ChatGPT prompts - s(CASP)