Prolog program is much slower than Haskell -- why?

The small logict example claims that

grandparent :: String -> Logic String
grandparent grandchild = do (p, c) <- choose parents
                            (c', g) <- choose parents
                            guard (c == c')
                            guard (g == grandchild)
                            pure p

is equivalent to

grandparent(Person, Grandchild) :- 
    parent(Person, X), 
    parent(X, Grandchild).

but it actually seems to be equivalent to

grandparent(Person, Grandchild) :- 
    parent(Person, X), 
    parent(Y, Grandchild),
    X == Y.

which has almost the same semantics but is less efficient. (It’s also far less readable.)
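
For anyone who wants to try it, here is a self-contained version of the Haskell snippet. The parents table and the choose helper are my own guesses at what the example assumes, since they aren’t shown:

import Control.Applicative (empty, (<|>))
import Control.Monad (guard)
import Control.Monad.Logic (Logic, observeAll)

-- Hypothetical parent facts as (parent, child) pairs.
parents :: [(String, String)]
parents = [("Sarah", "John"), ("Arnold", "John"), ("John", "Anne")]

-- Backtrack over each element of a list in turn.
choose :: [a] -> Logic a
choose = foldr ((<|>) . pure) empty

grandparent :: String -> Logic String
grandparent grandchild = do (p, c) <- choose parents
                            (c', g) <- choose parents
                            guard (c == c')
                            guard (g == grandchild)
                            pure p

main :: IO ()
main = print (observeAll (grandparent "Anne"))   -- prints ["Sarah","Arnold"]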

Prolog is more than just backtracking – it also has logical variables. That’s why Haskell has ++ (append) as a built-in: functional languages can’t make append tail-recursive, but Prolog can (at a cost, of course: an extra “ref” cell in the generated list, but that seems acceptable). [And the lack of logical variables is why my translation of the Haskell code to Prolog uses ==/2 instead of =/2.]
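
For reference, list append in Haskell is ordinary structural recursion on its first argument, so the recursive call sits under the (:) constructor rather than in tail position (this is roughly the Prelude’s (++), renamed here):

-- Roughly the Prelude definition of (++), renamed. The recursive call
-- is an argument of (:), so it is not a tail call; laziness means the
-- cons cell is returned before the recursive call is forced. Prolog's
-- append/3 can put the recursive call last precisely because the tail
-- of the new list cell is an unbound logical variable filled in later.
append :: [a] -> [a] -> [a]
append []     ys = ys
append (x:xs) ys = x : append xs ys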

Here’s another example of doing something like a backtracking search in Haskell: Escape from Zurg: An Exercise in Logic Programming. I think it’s similar to Graham Hutton’s code in that it appears to build a lazy list of possible solutions and then filter it. (It also doesn’t run under GHC 8.8.4, possibly because the Haskell libraries have changed.)
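
That lazy generate-then-filter style shows up even with plain lists; here’s a made-up toy example (not the Zurg code) where the list monad does the backtracking and guard does the filtering:

import Control.Monad (guard)

-- The list monad enumerates candidate triples lazily; guard discards
-- the ones that fail the test, so solutions are only produced as they
-- are demanded.
pythagorean :: Int -> [(Int, Int, Int)]
pythagorean n = do
  a <- [1 .. n]             -- choose a
  b <- [a .. n]             -- then b >= a
  c <- [b .. n]             -- then c >= b
  guard (a*a + b*b == c*c)  -- keep only Pythagorean triples
  pure (a, b, c)

main :: IO ()
main = print (take 3 (pythagorean 50))  -- [(3,4,5),(5,12,13),(6,8,10)]
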
I’ve read that lazy foldl can have memory usage issues (it builds up a long chain of unevaluated thunks, which is why foldl' exists) – I suppose it’s similar to what happens with Prolog doing recursion that creates choicepoints.
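
A minimal sketch of that foldl issue, using nothing beyond Data.List:

import Data.List (foldl')

-- foldl (+) 0 suspends one addition per element before anything is
-- evaluated, so the lazy version can exhaust memory on a long list;
-- foldl' forces the accumulator at each step and runs in constant space.
sumLazy, sumStrict :: [Int] -> Int
sumLazy   = foldl  (+) 0
sumStrict = foldl' (+) 0

main :: IO ()
main = print (sumStrict [1 .. 10000000])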

I feel like I’m missing something important about Haskell, but I don’t know what it is. (A long time ago, in the days of Miranda, SASL, etc. and before Haskell, I took a graduate course that did lots of pure functional programming – point-free notation, combinators, currying, etc. – but somehow it just feels a lot more difficult than logic programming.)