abathologist 17 hours ago

Anyone know how Curry (which has a Haskell-like syntax extended to support prologish features) compares with Mercury (which has a Prolog-like syntax extended to support Haskellish features)?

  • sterlind 16 hours ago

    Mercury feels like if the Ada people wrote Prolog. it's very verbose. you have to declare signatures and determinism modes in separate files. grounding is strictly enforced. it's statically typed. there's no REPL, remarkably.

    in exchange, the compiler catches a lot of bugs and the code is blazing fast.

    Curry is a superset of Haskell. it takes Haskell's pattern matching and makes it extremely general (full unification), extends it to non-determinism with choice points. it does have a REPL, like ghci.

    Like Haskell, Curry is lazy. Mercury (like Prolog) uses mostly eager, depth-first evaluation (SLDNF resolution). Clause order doesn't matter in Curry, which uses a strategy of "needed narrowing" - variables are narrowed when they need to be.

    Unlike Mercury (and Prolog), and like Haskell and other FP languages, Curry draws a distinction between function inputs and outputs. You can do relational programming via guards and pattern matching, but it doesn't feel as Prolog-y.

    Curry is more niche than Mercury, which is at least being used to build Souffle (a static analysis language built on Datalog), which is actually being used in industry somewhat. But it's a shame because Curry has a lot to offer, especially to Haskellers. They're both worth checking out though.

    • cess11 a few seconds ago

      Last I dabbled in Mercury it generated C and compiled that, which I expect makes a REPL harder to achieve compared with a ghci-adjacent environment.

  • hyperbrainer 16 hours ago

    Not a technical difference, but I think Mercury is somewhat more "commercial" in that it's out of active development and can be used in real projects, compared to Curry, which is very much still in development.

badmonster 18 hours ago

How does Curry manage ambiguity in non-deterministic computations—especially when multiple valid instantiations exist for a free variable?

  • pjmlp 18 hours ago

    Probably like Prolog, we get to generate all possible variations.

taeric 12 hours ago

That example for "some permutation" is not at all easy for me to understand. I'm assuming I'm just not familiar with the general style?

  • idle_zealot 12 hours ago

    I'm unfamiliar as well, but my best guess is that it relies on non-determinism. i.e. both definitions of 'insert' might be valid, and the runtime chooses which to use at random, resulting in either x or y being prepended to the returned list.

    • sterlind 11 hours ago

      it's not random. it tries definitions in declaration order until one succeeds. it's then yielded as an assignment of variables and control returns to the caller. if that assignment gets contradicted it will backtrack and try the second definition, so on and so forth. it's more like coroutining.
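      to make the order concrete, here's a rough Python sketch of the insert/perm idea, with generators standing in for choice points (my own analogy for illustration, not how Curry actually implements narrowing):

```python
# Each generator "definition" yields the results of one clause in order;
# exhausting a generator and moving on is the analogue of backtracking.

def insert(x, xs):
    # first clause: put x at the front
    yield [x] + xs
    # second clause: keep the head, insert x somewhere in the tail
    if xs:
        for rest in insert(x, xs[1:]):
            yield [xs[0]] + rest

def perm(xs):
    if not xs:
        yield []
    else:
        for rest in perm(xs[1:]):
            yield from insert(xs[0], rest)

# asking for "more answers" walks the choice points in declaration order:
print(list(perm([1, 2, 3])))
```

      note how the first answer out is the identity permutation, exactly because the first clause of insert is tried first.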

      • taeric 11 hours ago

        Does this definition somehow cause all random permutations of a given list? The definition of "Some Permutation" seems to imply it can be used any place you need to try any/all permutations, one at a time? At the least, repeated calls to this would be different permutations?

        • sterlind 8 hours ago

          quick Prolog example because I'm not as familiar with Curry:

            % This generates or recognizes any palindrome:
            pal --> [_].
            pal --> [X], pal, [X].

            % Here we try it out and press ; to generate more answers.
            ?- phrase(pal, P).
            P = [A] ;
            P = [B,A,B] ;
            ...

            % Here we plug in a value and it fails with [A], fails with
            % [B,A,B], etc., until it gets to [D,C,B,A,B,C,D], which can
            % be unified with "racecar".
            ?- phrase(pal, "racecar").
            true.

          Another example is just (X=a;X=b),(Y=b;Y=a),X=Y. This has two answers: X=a, Y=a, and X=b,Y=b. What happens is that it first tries X=a, then moves onto the second clause and tries Y=b, then moves onto the third clause and fails, because a≠b! So we backtrack to the last choicepoint, and try Y=a, which succeeds. If we tell Prolog we want more answers (by typing ;) we have exhausted both options of Y, so we'll go back to the first clause and try X=b, then start afresh with Y again (Y=b), and we get the second solution.

          Prolog goes in order, and goes deep. This is notoriously problematic, because it's incomplete. Curry only evaluates choicepoints that a function's output depends on, and only when that output is needed. Curry does have disjunctions (using ? rather than Prolog's ;) and unification (by =:= rather than =), it uses pattern guards rather than clause heads, and the evaluation strategy is different because of laziness, but in terms of the fundamentals this is what "non-determinism" means in logic programming. it doesn't mean random, it means decisions are left to the machine to satisfy your constraints.
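          the same trace, re-enacted in Python with nested loops playing the two choice points (a toy analogy for illustration; there's no real unification here, just an equality test):

```python
# (X=a ; X=b), (Y=b ; Y=a), X=Y
# each loop is a choice point tried in declaration order;
# `continue` is the failed goal X=Y forcing backtracking.

def solutions():
    for x in ("a", "b"):        # X=a ; X=b
        for y in ("b", "a"):    # Y=b ; Y=a
            if x != y:          # X=Y fails: backtrack
                continue
            yield (x, y)

print(list(solutions()))  # [('a', 'a'), ('b', 'b')]
```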

          • YeGoblynQueenne 2 hours ago

            >> ?- phrase(pal, "racecar") true.

            Off the top of my head but I think that should be backticks, not double quotes? So that `racecar` is read as a list of characters? I might try it later.

            >> Prolog goes in order, and goes deep. This is notoriously problematic, because it's incomplete.

            Yes, because it can get stuck in left-recursive loops. On the upside that makes it fast and light-weight in terms of memory use. Tabled execution with memoization (a.k.a. SLG-Resolution) avoids incompleteness but trades off time for space so you now risk running out of RAM. There's no perfect solution.
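            A loose Python analogy for the trade-off, with a visited set as a crude stand-in for a call table (this is only loop checking, not full tabling, but it shows the space-for-termination trade):

```python
# reachable over a cyclic graph: naive DFS would loop a -> b -> a -> ...
# Remembering calls in progress (extra memory) restores termination.

edges = {"a": ["b"], "b": ["a", "c"], "c": []}

def reachable(x, y, seen=None):
    seen = set() if seen is None else seen
    if x == y:
        return True
    if x in seen:
        return False  # this call is already being explored: fail, don't loop
    seen.add(x)
    return any(reachable(z, y, seen) for z in edges[x])

print(reachable("a", "c"))  # True
```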

            Welcome to classical AI. Note the motto over the threshold: "Soundness, completeness, efficiency: choose two".

            • cess11 3 minutes ago

              To me --> looks like DCG, and " should work if whatever the flag is called is set. Scryer has it set by default now, I think. At least, it was some time ago that I had to look it up and set it myself.

otherayden 15 hours ago

Imagine having your first and last names turn into two separate programming languages lol

pmarreck 18 hours ago

As is usual with any language that is new to me, would love a comparison of this language, in terms of a number of commonly-valued dimensions, with other languages:

speed, code samples of small algorithms, any notable dependencies, features (immutable data, static typing, etc.), and so on.

  • TypingOutBugs 17 hours ago

    Fwiw, Curry is 30 years old! It looks newer than it is from the site

johnnyjeans 15 hours ago

The comparisons they're making don't make sense to me. I don't think I've ever even seen a logic language without nested expressions. Also VERY weird they give non-determinism as a feature of logic programming. Prolog is the only one off the top of my head that allows for it. Even most Prolog derivatives drop the cut and negation operations. In the broader scope of logic languages, most aren't even turing complete, like Datalog or CLIPS.

I really feel like Prolog and its horn clause syntax are underappreciated. For as much as lispers will rant and rave about macros, how their code is data, it always struck me as naive cope. How can you say that code is data (outside of the obvious von neumann meaning), but still require a special atomic operation to distinguish the two? In Prolog, there is no such thing as a quote. It literally doesn't make sense as a concept. Code is just data. There is no distinguishing between the two, they're fully unified as concepts (pun intended). It's a special facet of Prolog that only makes sense in its exotic execution model that doesn't even have a concept of a "function".

For that reason, I tend to have a pessimistic outlook on things like Curry. Static types are nice, though they don't work well with horn clauses (without abusing atoms/terms as a kind of type system), and they're really not relevant enough to the paradigm that replacing beautiful horn clauses with ISWIM/ML syntax makes sense to me. Quite frankly, I have great disdain even for Elixir, which trades the beautiful Prolog-derived syntax of Erlang for a pseudo-Ruby.

One thing I really would like to see is further development of the abstract architectures used for logic programming systems. The WAM is cool, but it's absolutely ancient and theory has progressed lightyears since it was designed. The interaction calculus, or any graph reduction architecture, promises huge boons for a neo-Prolog system. GHC has incidentally paved the way for a brand new generation of logic programming. Sometimes I feel crazy for being the only one who sees it.

  • YeGoblynQueenne 11 hours ago

    Curry is very recognisably the functional programmer's conception of what logic programming is, which is the way it's described in the SICP book. Nothing to do with Resolution, Horn clauses, or even unification; instead it's all about DFS with backtracking. Sometimes dictionaries (!) have something to do with it [1].

    I'm speaking from personal experience here. DFS with backtracking has always featured very prominently in discussions I've had with functional programming folks about logic programming and Prolog and for a while I didn't understand why. Well it's because they have an extremely simplified, reductive model of logic programming in mind. As a consequence there's a certain tendency to dismiss logic programming as overly simplistic. I remember a guy telling me the simplest exercise in some or other of the classic functional programming books is implementing Prolog in (some kind of) Lisp and it's so simple! I told him the simplest exercise in Prolog is implementing Prolog in Prolog but I don't think he got what I meant because what the hell is a Prolog meta-interpreter anyway [2]?

    I've also noticed that functional programmers are scared of unification - weird pattern matching on both sides, why would anyone ever need that? They're also freaked out by the concept of logic variables and what they call "patterns with holes" like [a,b,C,D,_,E], which are magickal and mysterious, presumably because you have to jump through hoops to do something like that in Lisp. Like you have to jump through hoops to treat your code as data, as you say.
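    For the curious, the core of unification is tiny. A toy Python sketch of it (my own, for illustration: uppercase-initial strings are variables, lists are the only compound terms, no occurs check), matching "patterns with holes" in both directions:

```python
def is_var(t):
    # convention for this sketch: uppercase-initial strings are variables
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # follow a variable's bindings to its current value
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, list) and isinstance(b, list) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None  # constant clash: unification fails

# holes on both sides, bound in one pass:
print(unify(["a", "b", "C", "D"], ["A", "b", "c", "d"], {}))
# {'A': 'a', 'C': 'c', 'D': 'd'}
```

    The "both sides" part is the whole trick: the same call binds C and D going one way and A going the other.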

    And of course if you drop Resolution, you drop SLD-Resolution, and if you drop SLD-Resolution you drop the Horn clauses, whose big advantage is that they make SLD-Resolution a piece of cake. Hence the monstrous abomination of "logic programming" languages that look like ... Haskell. Or sometimes like Scheme.

    Beh, rant over. It's late. Go to sleep grandma. yes yes you did it all with Horn clauses in your time yadda yadda...

    ___________

    [1] Like in this MIT lecture by H. Abelson, I believe with G. Sussman looking on:

    https://youtu.be/rCqMiPk1BJE?si=VBOWeS-K62qeWax8

    [2] It's a Prolog interpreter written in Prolog. Like this:

      prove(true):-
        !. %OMG
      prove((Literal,Literals)):-
        prove(Literal)
       ,prove(Literals).
      prove(Literal):-
        Literal \= (_,_)
       ,clause(Literal,Body)
       ,prove(Body).
    
    Doubles as a programmatic definition of SLD-Resolution.

    • johnnyjeans 9 hours ago

      a prolog wizard crossing the path is an exceedingly rare and brilliant event, im compelled to make a wish upon this shooting star :3

      > I remember a guy telling me the simplest exercise in some or other of the classic functional programming books is implementing Prolog in (some kind of) Lisp and it's so simple!

      it's really easy to underestimate just how well engineered prolog's grammar is, because it's so deceptively simple. the only way you're getting simpler is like, assembly. and it's a turing-equivalent kind of machine, but because it kind of looks procedural if you squint your eyes, people can fool themselves into satisfaction that they "get" it, without actually getting it.

      but the moment NAF and resolution as a concept clicks, it's like you brushed up against the third rail of the universe. it's insane to me we let these paradigms rot in the stuffy archives of history. the results this language pulls with natural language processing should raise any sensible person's alarm bells to maximum volume: something is Very Different here. if lisp comes from another planet, prolog came from an alternate dimension. technological zenith will be reached when we push a prolog machine into an open time-like curve and make our first hypercomputation.

      • YeGoblynQueenne 2 hours ago

        >> a prolog wizard crossing the path is an exceedingly rare and brilliant event, im compelled to make a wish upon this shooting star :3

        Well, hello fellow traveler :)

        >> but the moment NAF and resolution as a concept clicks, it's like you brushed up against the third rail of the universe.

        I know, it's mind blowing. Maybe one day there will be a resurgence.

  • spencerflem 6 hours ago

    What's your take on Finite Choice Logic Programming / Dusa btw?

    Been messing with it & Answer Set Programming recently and still trying to work out my own thoughts on it