Propositions as Types (2014) [pdf]

(homepages.inf.ed.ac.uk)

107 points | by nill0 2 days ago ago

75 comments

  • dang an hour ago ago

    Related (I think) - others?

    Propositions as Types: Explained (and Debunked) - https://news.ycombinator.com/item?id=38894706 - Jan 2024 (1 comment)

    Propositions as Types - https://news.ycombinator.com/item?id=17947379 - Sept 2018 (1 comment)

    Propositions as Types (2014) [pdf] - https://news.ycombinator.com/item?id=10553766 - Nov 2015 (6 comments)

    On Propositions as Types [video] - https://news.ycombinator.com/item?id=10154349 - Sept 2015 (3 comments)

  • elikoga 2 days ago ago

    In my opinion it's a tragedy that there are so few resources that use "Propositions as Types"/the "Curry–Howard correspondence"[0] didactically, teaching structured proof-writing in tandem with functional programming.

    Many students do not feel comfortable with proof-writing and cannot dispatch/discharge implications or quantifications correctly when writing proofs and I believe that a structured approach using the Curry-Howard correspondence could help.

    [0]: https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon...

    • jerf 2 days ago ago

      While I am first in line to say that programming is math whether you like it or not, it is certainly the case that few programmers are using that sort of math in their heads when they are programming. I suspect that if, oh, 60-70 years of trying to change that has made no headway, there is little reason to expect that to change in the next 30.

      But you can sort of backdoor the idea with something like this: https://jerf.org/iri/post/2025/fp_lessons_types_as_assertion... I won't claim it is exactly the same, but it is certainly similar. Tell a programmer, "Hey, you can build a type that carries certain guarantees with it, so that the rest of the program doesn't need to be constantly validating it", a problem any 5+ year dev ought to be very familiar with, and I think you make more progress in the direction of this notion than a paper full of logic diagrams in a notation even most computer scientist undergrads will never encounter.

      (I'm not even certain if I directly encountered that in my Master's degree. It's been a while. I think I did, but I think it was only one class, and not a huge part of it. I'm not sure because I know I did more with it on my own, so I can't remember if I was formally taught it or if I picked it up entirely myself.)

      • skybrian 2 days ago ago

        While I generally agree with defining new types to assert that validation has been done, I think your blog post could have explained more about what kinds of validation are practical to do. For example:

        > Address that represents a “street address” that has been validated by your street address to exist

        What does it even mean to verify that a street address exists? Verifying real-world relationships is complex and error-prone. I’m reminded of the genre of articles starting with “Falsehoods programmers believe about names” [1].

        In practice, the rules that can be enforced are often imposed by the system itself. It simply doesn’t accept data that isn’t in the correct format or isn’t consistent with other data it already has. And we need to be cautious about what harm might be caused by these rejections.

        Having the validation logic in one place will certainly help when fixing mistakes, but then what do you do about data that’s already been accepted, but is no longer “valid?” This sort of thing makes long-running systems like databases hard to maintain.

        [1] https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-...

        • jerf 2 days ago ago

          > I think your blog post could have explained more about what kinds of validation are practical to do

          Perhaps following the two links with the word "valid" in them will answer your concerns: https://jerf.org/iri/post/2023/value_validity/

          Note that article does explicitly have the sentence "Let’s forget the Umptydozen Falsehoods Programmers Believe About Addresses for the sake of argument and stipulate a perfect such function." These are examples. Trying to write "here's how to validate everything that could ever happen, and also here's a breakdown of all the falsehoods, and also here's how it interacts with all your other logic" is not so much a blog post as a book series. It turns out that even if you completely ignore the Umptydozen falsehoods of all the various kinds, you still have plenty of validation problems to talk about!

          However, the in-a-nutshell answer to "how do you handle time invalidating things" is that you treat your database as an untrusted store and validate things as needed. I'm actually an 80/20 guy on using databases to maintain integrity for much this reason; I love me some foreign keys and I use them extensively but the truth is that that is only a partial solution to the data validity problem no matter how clever you get, and temporal inconsistency is a big one. Once you have any source of inconsistencies or errors in your DB, a whole whackload of need for validation and care basically comes dropping in all at once, or, to put it another way, if you're not 100.00000% successful at maintaining data integrity, the next practical option is 95%. There is no practical in-between, because even that .001% will end up driving you every bit as crazy as 5% being wrong in most ways.

          But that's also out-of-scope for blog posts targeted at people who are only doing ad-hoc validation whenever they are forced to. Learn how to validate properly at all, then get better when you have a time-based problem.

          • skybrian 2 days ago ago

            Good article. Yeah, I wouldn’t expect a full explanation, just some kind of “here be dragons” caveat. Perhaps a hyperlink alone is a bit too subtle since readers aren’t always going to dereference it. (And there’s some irony there, given the subject of the linked article.)

            The types in Go’s template/html package are a pretty interesting example of using types tactically to indicate validity. The HTML type is used to turn off HTML escaping when it’s already been done. It’s using a type as a loophole. It’s still very useful to have a type like that when reviewing code for security bugs, because you know where to look for problems. Unsafe sections in Rust serve a similar purpose.

            Types are about creating trust, and this trust is often short-lived. When data crosses a security boundary, the validation has to be done again.

        • fmbb 2 days ago ago

          Far too many programmers forget that time passes.

          • daxfohl 2 days ago ago

            Yeah, the issue is that proofs are harder than people think, even for trivial things (try a few easy leetcode problems in Lean with a proof of correctness), and less useful than people think, especially once you get past low-level algorithms and into domain logic (your point exactly). They also don't serialize well, so a database or API call with a "proof" field would be susceptible to fudging, or the definition could change between deployments. They're also easy to make incompatible: one library requires bounds proofs for signed ints, but your app uses unsigned ints, so you have to either rewrite your app, rewrite the library, or cast, which your type system then has to handle like a "checked exception", propagating that possibility throughout your whole type domain and app logic.

            I'm pretty convinced there's a good reason that while "propositions as types" is cool in theory, it's unlikely we'll ever see it catch on in practice.

            • mietek 10 hours ago ago

              > …especially when you get past low level algorithms and into domain logic (your point exactly). They also don't serialize well, so a database or API call with a "proof" field would be susceptible to fudging…

              Nonsense.

              Proving things about low-level algorithms that are full of complicated behaviors involving shared mutable state is often more difficult than proving things about high-level domain logic; regardless, people still do it, and if you use any modern CPU, especially from the ARM family, you benefit from their work.

              A proof serializes just as well as any other kind of program — after all, the first step in checking a proof involves deserializing the proof, or in other words, reading and parsing its source code.

              As for “fudging”; proofs are guaranteed to be correct up to the correctness of the proof checker, which can be used to recheck the proof at any time.

              For more information, see Geuvers (2009) “Proof assistants: History, ideas and future”.

              https://sci-hub.st/https://link.springer.com/article/10.1007...

            • Tainnor 2 days ago ago

              > try a few easy leetcode problems in Lean with a proof of correctness

              This might actually be harder than, say, proving some undergraduate math theorems, because reasoning about mutable state (especially when you want it to be performant) is tricky. My guess is that it could be easier to use a model checker such as TLA+ for that (although that can only verify algorithm descriptions, not implementations), because such tools seem to be built more with these things in mind, but I lack the experience with it to be certain.

      • immibis 2 days ago ago

        This has nothing to do with C-H though.

        • jerf 2 days ago ago

          It isn't exactly the same. I believe I said that up front.

          But it provides a bridge. Both things have "You can assert Property X about all things of type Y" built into them. Trying to jump people straight to C-H without giving any hint whatsoever of why this may be useful in any real code... well, that's a non-trivial percentage of why this approach has completely failed to capture any significant mindspace, though I leave it as an exercise to the reader to determine the exact percentage in their minds.

          • immibis 2 days ago ago

            C-H has nothing to do with "property X is true about all things of type Y". It says "if you can find an object of type Y, you can prove global property Z."

            For example, the property corresponding to the type "(Either a a) -> a" is "(A or A) implies A" which doesn't tell you any properties of actual objects of the type "(Either a a) -> a"

            The property corresponding to the type "(Either a b) -> a" is "(A or B) implies A" which is not provable, so you can't find any object of this type.
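            The two examples above can be written out directly in Haskell; a sketch, with made-up function names:

```haskell
-- Under Curry-Howard, a total term inhabiting a type is a proof of the
-- corresponding proposition. "(A or A) implies A" is provable, and
-- Either a a -> a is inhabited (by two distinct terms, in fact):
collapse :: Either a a -> a
collapse (Left x)  = x
collapse (Right x) = x

-- "(A or B) implies A" is not provable; correspondingly, no total
-- function of type Either a b -> a can be written, because in the
-- Right case there is no 'a' to return. Any attempted definition
-- must cheat with something like 'undefined'.
```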

    • mietek 2 days ago ago

      One of the best modern resources is "Programming language foundations in Agda", co-authored by Wen Kokke and the same Philip Wadler, and used for teaching at multiple universities.

      https://plfa.github.io/

    • js8 2 days ago ago

      I would recommend https://lean-lang.org/theorem_proving_in_lean4/, especially the first few chapters, if you're willing to use Lean.

    • ndriscoll 2 days ago ago

      I've been thinking the same thing and mentioned it the other day[0]. When I was learning proofs, I saw people struggle with the idea of needing to choose a "generic" element to prove a forall statement, for example. I suspect it might make more sense if you taught things in terms of needing to write a function x=>P(x). I think in some cases, thinking in terms of programming might also change how we think about structuring things. e.g. define a data structure for a "point with error tolerance" (x, epsilon), then continuity says given a toleranced-point y* at f(x), I can find a toleranced-point x* at x such that f(x*) is "compatible" (equal at the base point and within tolerance) with y*. This factoring lets you avoid shocking new students with quantifier soup. Likewise the chain rule is just straightforward composition when you define a "derivative at a point" data structure Df((x,m))=(f(x), m*f'(x)).
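      The "derivative at a point" structure at the end is essentially forward-mode dual numbers; a minimal Haskell sketch, with all names invented for illustration:

```haskell
-- A base point paired with a slope (tangent) at that point.
data D = D { basePoint :: Double, slope :: Double } deriving (Eq, Show)

-- Lift a function f with known derivative f' to act on D, following
-- Df (x, m) = (f x, m * f' x) from the comment above.
liftD :: (Double -> Double) -> (Double -> Double) -> D -> D
liftD f f' (D x m) = D (f x) (m * f' x)
```

The chain rule is then literally composition: `liftD f f' . liftD g g'` agrees with lifting the composite `f . g` with derivative `\x -> f' (g x) * g' x`, with no quantifier soup in sight.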

      This is not at all new, but I suppose it's currently a lot to ask students to learn proof mechanics while also unwinding multiple layers of definitions for abstractions. Computers can help do the unwinding (or lifting) automatically to make it easier to make small "quality of life" definitions that otherwise wouldn't be hugely useful when pen-and-paper-proofs would always be unwinding them anyway.

      Basically, math education could look a lot like software engineering education. The concerns and solution patterns are basically the same. e.g. typeclasses are pretty much how mathematics does polymorphism, and are probably usually the right way to do it in programming too.

      [0] https://news.ycombinator.com/item?id=43875101

      • immibis 2 days ago ago

        I don't think the C-H correspondence is necessary for this. It would be a useful way to think even if the C-H correspondence were false.

    • talkingtab 2 days ago ago

      The connection to intuitionism (see https://en.wikipedia.org/wiki/Intuitionism) gives this kind of thinking much broader appeal and application. It seems to me we live in a time so dominated by analytical thinking that we completely ignore other useful and effective modes.

      • johnnyjeans 2 days ago ago

        Intuitionism in this context just means the proofs have to be constructive. (no proof-by-contradiction, or other half-assed academic hacks)

        • AnimalMuppet 2 days ago ago

          Why do you say that proof-by-contradiction is a "half-assed academic hack"?

          In particular, if someone isn't already an intuitionist or constructivist, what reason can you give that would be valid in their frame of reference?

          • zozbot234 2 days ago ago

            As others have explained, there is indeed nothing wrong with "proof by contradiction" of a negative statement. Intuitionistic logic does not view statements phrased in the "positive" and in the "negative" as fully equivalent, as classical logic does. (There are ways of formalizing this point of view quite rigorously, such as in so-called "ecumenical logics" where classical and constructive/intuitionistic reasoning can in fact coexist and interoperate, but statements derived from classical reasoning can only be translated in the negative as seen within constructive reasoning.)

            • AnimalMuppet 2 days ago ago

              So I asked specifically for you to assume that I don't already agree with intuitionist logic. Now, assuming that, what is wrong with proof by contradiction of a positive statement?

              Both you and johnnyjeans gave me answers that already assumed intuitionism. I don't assume that. Can you give me any reasons that start from where I am and show my why I should adopt intuitionism?

              • auggierose 2 days ago ago

                You should not. Platonism is the only defensible mathematical philosophical position.

                • AnimalMuppet 2 days ago ago

                  That doesn't make platonism a "half-assed academic hack", though (which was the original claim I was questioning).

                  • auggierose 2 days ago ago

                    I fully agree. Platonism isn't a hack, it is the only philosophy of math that makes sense to me. Something is either true, or it isn't. There is no third case. This is because mathematical objects are real, not just something our minds make up. To quote Arnold: Mathematics is the part of physics where experiments are cheap.

                    That said, I don't have a problem with intuitionistic logic, but I think about it as a platonist, for example via Kripke models. I also don't have a problem thinking about it in terms of a certain restricted class of proofs, that people call constructive proofs.

          • johnnyjeans 2 days ago ago

            Because they explain nothing and aren't useful for building new insights. It's not a direct verification of the existence or non-existence of a thing with given properties. It relies on rules-lawyering in a very specific logic to say something with basically no substance. Allowing it causes more problems for automated proofs than it solves, so long as your domain only deals with finites (which all real-world domains do).

            They're also generally much easier to create than constructive proofs, which makes them a convenient go-to for lazy academics.

            • ndriscoll 2 days ago ago

              Proof by contradiction is fine for lack-of-existence (indeed, ¬P is defined in type theory as P -> false). I think also if I got my types right, you can do

                  def eliminateTripleNot(f: ¬¬¬P): ¬P = {p: P => f({g: ¬P => g(p)})}
              
              for a constructive proof of ¬¬¬P => ¬P? So it's really just ¬¬P => P that causes trouble. It's not clear to me whether LEM is actually okay in the "fast and loose reasoning is morally correct" sense (i.e. whether it's okay as long as you don't cheat and do something like use a non-terminating loop or exception to make your ¬¬P). Are there cases where you didn't "obviously" cheat but it's still "wrong" to use it? In some sense, I see ¬¬P => P as a "cast" from an opaque function type to specifically be a closure {f => f(p)} for some p: P.
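              The sketch above type-checks essentially verbatim in Haskell, rendering ¬P as a function into the empty type Void; the names here are invented:

```haskell
import Data.Void (Void)

type Not p = p -> Void

-- The constructive proof of "not not not P implies not P",
-- mirroring eliminateTripleNot above:
tripleNegElim :: Not (Not (Not p)) -> Not p
tripleNegElim f p = f (\g -> g p)

-- Double-negation *introduction* is also fine constructively:
dni :: p -> Not (Not p)
dni p g = g p

-- Double-negation *elimination*, Not (Not p) -> p, has no total
-- definition: from f :: Not (Not p) alone there is no way to
-- produce a p.
```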
              • johnnyjeans a day ago ago

                Sorry, I'm rusty enough on my logic that I'll only embarrass myself if I try to match the depth of your post.

                > Are there cases where you didn't "obviously" cheat where it's "wrong" to use it?

                In particular, it prevents us from adapting the proof to non-binary valued logics full stop.

                > I see ¬¬P => P as a "cast" from an opaque function type to specifically be a closure {f => f(p)} for some p: P.

                Now this is an interesting thought.

              • ndriscoll a day ago ago

                Partially answering my own question: a starting point for synthetic differential geometry is to define a collection of infinitesimals which aren't not zero, but also zero isn't the only infinitesimal. So there are interesting/non-contrived objects that live in the space left by not assuming LEM.

              • cmrx64 2 days ago ago

                that isn’t proof by contradiction, that’s plain old proof of negation. proof of negation is Neg : (P → False) → ¬P, proof by contradiction is PBC : (¬P → False) → P. a subtle yet crucial distinction

                • ndriscoll 2 days ago ago

                  Ah, right. It's been long enough that I've forgotten what the words mean, though I think with my ninja edit, PBC is actually still constructively valid as a method to prove a negation?

            • undefined 2 days ago ago
              [deleted]
            • TimorousBestie 2 days ago ago

              This is needlessly aggressive. There’s nothing wrong with assuming the axiom of choice to prove a result, just as there’s nothing wrong with trying to develop a proof that doesn’t rely on it. Saying a nonconstructive proof “explains nothing” is myopic at best. Insights come from everywhere.

            • Tainnor 2 days ago ago

              > They're also generally much easier to create than constructive proofs

              This is generally seen as a good thing by most "lazy academics". I guess your priorities are just different.

              Constructivism is also not at all opposed to infinite quantities (that would be finitism). "Real-world domains" deal with infinite spaces (e.g. of all functions R->R) quite regularly across all scientific domains.

              • johnnyjeans a day ago ago

                My priorities are indeed different. Apologies for the inflammatory language. My remark WRT constructive proofs is more an observation I've made that most proofs which deal with non-finite values seem to be non-constructive. Not necessarily as a hard and fast rule, the two just don't seem to roll well together. Could be sampling bias, but poking and prodding with mathematician friends more or less confirmed it. Not well read enough to have more interesting things to say on it.

    • boxfire 2 days ago ago

      There's a book that's explicitly about this, "Program = Proof", and though it's not beginner and needs maybe a light version for earlier learners, is an excellent example.

    • ngruhn 2 days ago ago

      100% agree. I did not understand induction until I learned Coq. It really shows how mechanical proving can be.

    • mrkeen 2 days ago ago

      This may not go over as well as you'd think.

      I took Haskell in the same uni course with FSAs, Turing machines, Lambda calculus, natural deduction, Hoare logic and structural induction.

      So I was exposed to the "base case & step case" of structural induction at the same time as learning about it in Haskell. And for whatever reason it didn't leave a good impression. Implementing it formally was harder and more error-prone than shooting from the hip in an IDE. What was the point!?

      Now I smash out Haskell code daily as the quickest way to end up with little binaries that just-work-if-they-compile. It took me a while to realise that the upside of all this formality/mathiness is that other people did the proofs so that you don't need to. I get to take for granted that a boolean is true or false (and not some brilliant third value!) and that (map f . map g) is (map (f . g)).
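      That last law can at least be spot-checked, though the point is precisely that the proof (the functor law for map) means you never have to:

```haskell
-- Map fusion: (map f . map g) equals map (f . g) for all f, g, and
-- input lists; checked here only on sample values.
fused, unfused :: [Int] -> [Int]
unfused = map (* 2) . map (+ 1)
fused   = map ((* 2) . (+ 1))
```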

      • edef 2 days ago ago

        > I get to take for granted that a boolean is true or false (and not some brilliant third value!)

          [True, False, undefined] :: [Bool]
        • immibis 2 days ago ago

          The traditional third value is actually FileNotFound: https://thedailywtf.com/articles/what_is_truth_0x3f_

          but in Haskell, yes, it's undefined. Which isn't a real value! For example, infinite loops are undefined. Theorists like to call it a value of every type, but in practical terms, it's more like a computation that never produces a value. The builtin "undefined" can be written as an infinite loop ("undefined = undefined") and many other pure infinite loops can also act as undefined values. The runtime is able to catch some and crash instead of hanging, but not others.

        • mrkeen 2 days ago ago

          Nah.

            check :: Bool -> a
            check  True = undefined
            check False = undefined
            check     _ = undefined
          
          
            7:1: warning: [GHC-53633] [-Woverlapping-patterns]
                Pattern match is redundant
                In an equation for ‘check’: check _ = ...
              |
            7 | check     _ = undefined
              | ^^^^^^^^^^^^^^^^^^^^^^^
        • ddellacosta 2 days ago ago

          See, you can even do quantum computing very naturally in Haskell.

      • AnimalMuppet 2 days ago ago

        “A man is rich in proportion to the number of things which he can afford to let alone.” ― Henry David Thoreau, Walden or, Life in the Woods

        If I have to do the proof, then as you say, it's probably harder and (in many cases) more error prone than if I didn't use a proof. If the language can do it for me, and I get the lack of errors without having to do the work (and without the chance of me making the mistakes)? Yeah, now we're talking. (Yes, I am aware that I still have to design the types to fit what the program is doing.)

        If a tool introduces complexity, it has to make up for it by eliminating at least that much complexity somewhere else. If it doesn't, the tool isn't worth using.

      • mpweiher 2 days ago ago

        > other people did the proofs so that you don't need to.

        So?

        You don't need to either.

        There simply is no evidence that having such proofs has any significant effect on software quality.

        https://blog.metaobject.com/2014/06/the-safyness-of-static-t...

  • indyjo 2 days ago ago

    As much as the concept blew me away when I first heard of it, I can't shake the feeling that the Curry-Howard correspondence is somehow mis-marketed as something that would immediately cater to programmers. The idea to encode propositions into types sounds enticing for programmers, because there are indeed a lot of cases where something needs to be conveyed that can't be conveyed using type systems (or other features) of common programming languages.

    Examples include the famous "the caller of this function guarantees that the argument is a non-empty list" but also "the caller guarantees that the argument object has been properly locked against concurrent access before using the function".

    However, in my experience, the community is more interested in mathematics than in programming, and I don't know if anybody is really working to provide propositions-as-types to mainstream programmers. This is certainly because it is hard enough to prove soundness for a strict functional programming language such as Agda or Rocq, let alone for anything imperative, stateful, "real-world", non-strict, non-pure, concurrent, ill-defined, you name it.

    So, for me the promise of "showing programmers that what they do is actually mathematics" is not really kept as long as the definition of "programmer" is so narrow that it excludes the vast majority of people who define themselves as programmers.

    What I'm trying to say: I wish there were more efforts to bring more powerful formal methods, especially as powerful as dependent types, to existing mainstream programming languages where they could be useful in an industrial context. Or at least try to come up with new programming languages that are more pragmatic and do not force the programmer into some narrow paradigm.

    Am I wrong? I hope so :)

    • ndriscoll 2 days ago ago

      Programmers regularly do this stuff under various names/phrases. "Make illegal states unrepresentable", "parse, don't validate", "resource acquisition is initialization", etc. It's all about encoding the fact that your structure isn't just data, but also represents some proof of provenance that guarantees some property. Scala is an industrial language that gives more tools than you might usually find for this while also letting you be pragmatic and integrate with Java's massive ecosystem (though you quickly learn you'd rather not because Java code tends to not do a good job of leveraging types in a sane way).

    • AnimalMuppet 2 days ago ago

      No, you're not wrong. I mean, in some circles, it's a battle to get programmers to use types at all. And, while not every proposition can currently be usefully encoded in a type, every type currently encodes a proposition. So if we can get the people who don't use types to start using them, that's probably the lowest-hanging fruit.

      From there, every step to improve the expressiveness of types allows you to encode more within the type system. For example, one could think about encoding that non-empty requirement in C++ or Java collection types. It would be nontrivial - a number of things would need adjusting - but you could think about doing it. (Or rather, it would be nontrivial to do with static types. You could more easily make it throw if the condition was not satisfied.)
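      For what it's worth, Haskell already ships this particular encoding in its standard library as Data.List.NonEmpty; a sketch of the shape being described:

```haskell
import           Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NE

-- NE.head can never fail, so callers need no runtime emptiness check.
firstItem :: NonEmpty a -> a
firstItem = NE.head

-- The single check lives at the boundary; everything downstream gets
-- the non-emptiness guarantee from the type, not from a throw.
fromRaw :: [a] -> Maybe (NonEmpty a)
fromRaw = NE.nonEmpty
```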

      Your "properly locked" example is much harder. It would require telling the type system what the definition of "properly locked" is. Even at that, I can't see any way to do it as a static type. And that's a bummer, because I far prefer static types. I prefer my proofs to fail at compile time.

      But my main point is, "proving" is not binary. Every incremental advance in what types can check proves more things, and therefore leaves the programmers having to prove less in their heads.

      • daxfohl 2 days ago ago

        And it is continually improving. Rust borrow checker is an example of this.

        But as far as jumping into the deep end of dependent types, that's a whole other ball of wax. Like, imagine (or try!) writing leetcode solutions in Lean or Idris, with a type that proves they are correct and (bonus points) run in the specified time complexity. Even stating the hypothesis is nontrivial. It's all doable, but takes orders of magnitude longer. And with dependent types you have to do it, because the next function you call may require a proof that the leet function returns the thing it's supposed to.

        That's just for algorithmic leet problems. Now imagine having to write proofs, in a complex multithreaded system, that your code is not making out-of-bounds array accesses or leaking memory, while integrating libraries that each have slightly different semantics for doing the same, or that use unsigned types that make all the proofs incompatible, etc. At that point, you have to basically give up on your borrow checker and fall back to runtime checks anyway. And even if you did get your system into a fully proved state, that only applies to that one binary; it makes no guarantees about distributed interactions, rollouts and rollbacks, or any of the other things that are the more frequent cause of bugs in production systems. In fact, it may encourage more "distributed spaghetti" just to work around having to prove everything.

        There's an analogy with how checked exceptions work in Java: cool thought, but mostly get in the way of what you're really trying to do after a while, or break things when new types of exceptions get added to the function, so everyone ends up just wrapping them with unchecked exceptions anyway. This is what would end up happening with full dependent types too, except it would pervade the entire type system and every function you write. The eventual outcome would likely be everyone just works around them (every single function, even fairly trivial ones like divide, would return an Option that the caller would have to handle or bubble up), and the actual code would be even less type safe than it would be with a simpler type system.

        So, ultimately the way Rust is going, where some key things, like borrow checking, are built into the language, seems to be the better approach.

    • BalinKing 2 days ago ago

      I think Idris (https://www.idris-lang.org/) is primarily focused on using dependent types for programming, and even Lean has been expanding in this direction (cf. the Functional Programming in Lean book, https://lean-lang.org/functional_programming_in_lean/).

    • immibis 2 days ago ago

      C-H isn't useful to programmers at all. The programs that represent proofs end up being useless programs, most of the time, and the proofs represented by real-world programs end up being useless proofs, most of the time.

      Most programs deal with several objects of the same type - your program probably contains more than one integer (shocking, right?). Since C-H maps each unique type to a unique proposition, the same type always maps to the same proposition. A function that calculates the greatest common factor of two numbers proves that (A∧A)→A, or "(A and A) implies A", which is... uninteresting.

      In the reverse direction, the program generated by (A∧A)→A is either the first or second entry selector from a tuple, and doesn't calculate the greatest common factor of two integers. (A∨A)→A is slightly more interesting, taking Either<int, int> and returning int.

      It's true that C-H of logical disjunction gives you Either types, conjunction gives you tuple types, and implication gives you function types. Kinda fascinating, enough to write a blog post about, but still without any practical purpose.
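      Spelling out that dictionary in Haskell makes the "uninteresting" point concrete; a sketch with invented names:

```haskell
-- Conjunction corresponds to pairs, disjunction to Either, and
-- implication to functions. The proofs of "(A and A) implies A" are
-- exactly the tuple selectors, neither of which computes a gcd:
proofFst, proofSnd :: (a, a) -> a
proofFst = fst
proofSnd = snd

-- The slightly more interesting "(A or A) implies A":
proofCollapse :: Either a a -> a
proofCollapse = either id id
```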

      Practical types with additional requirements (such as non-empty lists) have nothing to do with C-H and could exist even if C-H were false. Same for dependent types. As I said, C-H is completely useless in actual programming.

      I do wish we had more formal methods though. Formal methods of programming have nothing to do with C-H either. Static typing has nothing to do with C-H. The Rust borrow checker has nothing to do with C-H. Complete formal verification (as in seL4) has nothing to do with C-H.

    • zmgsabst 2 days ago ago

      Programmers use it all the time!

      They turn diagrams representing the logic of the business domain into expressions in their type system of choice.

      I think it’s a failure of pedagogy to focus on the abstract “prove stuff” sense, rather than the applied “domain diagram logic to types” at the core of what SDEs actually do. But turning a business-domain diagram into a systems diagram into code is translating one category into another via an embedding, before translating into a type theory.

      My opinion is that you need to start with that core and explain how the other features make that better, e.g., how do polynomial monads address the weirdness between logging and errors (and errors in logging errors, etc.)?

  • devlovstad 2 days ago ago

    Slightly OT: I'm a master's student in computer science who focuses mostly on machine learning. Still, the best course I've ever taken was one on semantics and types, presenting many of the ideas in this article. Learning proof-writing using natural deduction with a ruthlessly rigorous teacher has made me much more precise when I write proofs in general, and learning about theory of computation and logic has given me a good mental model of how my programs execute.

    While the course is an elective mostly focused on students interested in programming languages, I think all computer scientists can benefit from taking such a course. In a time where everyone wants to do AI, the course only had around 12 students out of a class of maybe 200 students.

    Even more OT: Phil Wadler gave a talk at the programming language section of my university not too long ago, which I was very excited to see. Sadly, he chose to give a vague, pop-sciency talk on AI which felt quite a bit outside his expertise.

    • somethingsome a day ago ago

      Hey, do you have interesting slides/homeworks to share? I would be interested in taking a look

      • devlovstad a day ago ago

        The course did not use slides but instead wrote everything on a whiteboard. The lecture notes are not public afaik.

        The lecturer did suggest the following supplementary material:

        - Michael Huth and Mark Ryan. Logic in Computer Science: Modelling and Reasoning about Systems (2nd ed.). Cambridge University Press, 2004. (Mainly chapters 1, 2)

        - Glynn Winskel. The Formal Semantics of Programming Languages: An Introduction. MIT Press, 1993. (Mainly chapters 1, 2, 3, 6, 11)

        - Benjamin C. Pierce. Types and Programming Languages. MIT Press, 2002. (Mainly chapters 5, 8, 9, 11, 12)

  • marcusb 2 days ago ago

    The author has given a few talks built around the same concept: https://www.youtube.com/watch?v=IOiZatlZtGU

  • js8 2 days ago ago

    On this topic I really recommend reference [56]: M. H. Sørensen and P. Urzyczyn, Lectures on the Curry-Howard Isomorphism, Elsevier, 2006. I found it online.

    It is very self-contained and explains some interesting topics such as the relation between intuitionistic and classical logic.

  • emorning3 2 days ago ago

    I don't want to prove that my code is correct.

    I would prefer to create specifications in logic and have the code derived for me.

    Other than something like Prolog, are there tools for working this way?

    • Gajurgensen 2 days ago ago

      Program synthesis is of course very difficult in general, especially if you want it to be entirely automated. One option to make it more practical is to have the user drive synthesis from specification to implementation via something which looks like a sequence of tactics.

      (I'll add a plug for some stuff we are working on at Kestrel: https://www.cs.utexas.edu/~moore/acl2/manuals/latest/index.h.... We've used the APT library to do stepwise refinements from ACL2 specs to C code. Each step is something like "make this function tail recursive" or "switch to a new, isomorphic shape of the data".)

      By the way, Curry-Howard offers a compelling insight here: deriving programs from specifications (i.e. types+propositions) may be the same process as deriving proofs from propositions. So the two processes can in principle work the exact same way.

    • emorning3 2 days ago ago

      PS...

      I've been waiting for such tools for 40 years.

      I just now realized that vibe coding is probably the closest we'll get in my lifetime.

      It's a start I guess.

    • Tainnor 2 days ago ago

      This is impossible to solve in the general case due to Gödel's first incompleteness theorem.

      • YeGoblynQueenne a day ago ago

        A compiler is a program synthesizer that takes in a specification in a high-level language and produces an equivalent program in machine code.

        Given that observation, can you explain what you mean that this is impossible to solve?

        • Gajurgensen a day ago ago

          Think of higher level specifications which do not imply any details of the implementation.

          For instance, consider a sorting function. One could write a bubble sort and call that a spec, but that is far too much detail, much of which you don't actually care about. A much better specification would be: "the function takes a list 'l' and produces a sorted list which is also a permutation of 'l'." This is the sort of specification we want, but it leaves more work to fill in the implementation details.

          This can get arbitrarily difficult if your specification logic is sufficiently expressive. Imagine the spec is something like "solve this unproven mathematical conjecture."
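
          Written as an executable predicate, the sorted-permutation spec looks something like this (a TypeScript sketch over number lists; the names are mine):

```typescript
// "The output is sorted" — each element is >= its predecessor.
const isSorted = (xs: number[]): boolean =>
  xs.every((x, i) => i === 0 || xs[i - 1] <= x);

// "The output is a permutation of the input" — same multiset of elements.
const isPermutationOf = (xs: number[], ys: number[]): boolean =>
  xs.length === ys.length &&
  [...xs].sort((a, b) => a - b).join(",") ===
    [...ys].sort((a, b) => a - b).join(",");

// The whole spec: any correct sort (bubble sort, mergesort, ...) satisfies it,
// which is exactly why it underdetermines the implementation.
const meetsSpec = (input: number[], output: number[]): boolean =>
  isSorted(output) && isPermutationOf(input, output);

console.log(meetsSpec([3, 1, 2], [1, 2, 3])); // true
console.log(meetsSpec([3, 1, 2], [1, 2]));    // false
```

          Synthesis is the problem of going from `meetsSpec` back to a function that always satisfies it.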

          • YeGoblynQueenne a day ago ago

            >> Think of higher level specifications which do not imply any details of the implementation.

            In that case you're talking about inductive program synthesis, i.e. program synthesis from incomplete specifications. Program synthesis from complete specifications is what we call deductive program synthesis.

            Both are quite possible within the limits of undecidability of sufficiently expressive languages. My question to the OP was why did they think it's not possible at all, but I may have misread their comment.

        • Tainnor a day ago ago

          OP specifically asked for a program that would derive code from logical specifications, not a run-of-the-mill compiler where you have to write the code yourself (albeit on a higher level).

          If your specification logic is strong enough to include basic addition and multiplication (and if it's not, it's probably not very useful), it will be affected by incompleteness. So you can write down the Gödel sentence corresponding to your program synthesiser's inference rules and it won't be able to find either a proof (i.e. an implementation) or a disproof (i.e. failing explicitly), so it will have to loop forever.

          Another way to see it is that the set of valid theorems in FOL is uncomputable. To take a concrete example, if you write down the specification for "find the smallest number that is a counterexample to the Goldbach conjecture", finding an implementation for it would be equivalent to answering the conjecture in the negative while getting an error message would mean that the conjecture is true, but obviously nobody has solved it yet.
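
          To make the Goldbach example concrete: checking any single even number is easy and decidable, but the program the spec demands has to search forever if the conjecture is true (a TypeScript sketch; the names are mine):

```typescript
// Trial-division primality test, fine for small n.
const isPrime = (n: number): boolean => {
  if (n < 2) return false;
  for (let d = 2; d * d <= n; d++) if (n % d === 0) return false;
  return true;
};

// Goldbach holds for even n > 2 if some prime p has a prime complement n - p.
const goldbachHolds = (n: number): boolean => {
  for (let p = 2; p <= n / 2; p++)
    if (isPrime(p) && isPrime(n - p)) return true;
  return false;
};

// The program the spec asks for: if Goldbach is true, this never returns.
const smallestCounterexample = (): number => {
  for (let n = 4; ; n += 2) if (!goldbachHolds(n)) return n;
};

// Only bounded checks are safe to actually run:
console.log([4, 6, 8, 100].every(goldbachHolds)); // true
```

          A synthesiser handed this spec would be in the same position: producing a terminating implementation would settle the conjecture.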

          • emorning3 6 hours ago ago

            >>Another way to see it is that the set of valid theorems in FOL is uncomputable.<<

            I've never seen it put that way exactly. And I've never learned the gory details of the incompleteness proofs and I've never made the explicit connection between incompleteness and computation before. That blew my tiny pea brain :-).

            Now that I've read the Wikipedia page, I guess I only understood incompleteness as the 2nd theorem.

            https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_...

            Thanks!

          • YeGoblynQueenne a day ago ago

            I'm sorry but I think that when I replied to your comment it didn't have the "in the general case". I might have missed that, but I thought you meant it's impossible to do it at all.

            Btw:

            >> OP specifically asked for a program that would derive code from logical specifications, not a run-of-the-mill compiler where you have to write the code yourself (albeit on a higher level).

            There's nothing "run-of-the-mill" about compilers, and if your high-level language is e.g. FOL, then a compiler that transforms it to low-level machine code is, still, a program synthesizer, and vice versa.

            • Tainnor 14 hours ago ago

              I didn't edit my original comment.

              Compilers are totally beside the point. Python, Java and even Prolog aren't FOL; they lack its expressive power.

              • YeGoblynQueenne 2 hours ago ago

                Then I misunderstood your comment, but at the same time your comment is correct but irrelevant to the OP. "In the general case" there are undecidable problems, and there may even be an infinite number of undecidable problems, but there are most likely an infinite number of decidable problems also.

                That is to say, undecidability is not like intractability. One can work around undecidability, and one can always find useful problems that can be solved without falling down undecidable holes. So for example, the Halting Problem may be undecidable, but we have computers we use all the time to do a whole bunch of very useful things. FOL is undecidable, but elementary Boolean logic is decidable and definite logic is semi-decidable. Any formal system powerful enough to represent integer arithmetic is undecidable but we use arithmetic, and higher-level maths based on arithmetic, all the time. And so on, and so forth.

                The OP is asking for a program synthesiser where they can enter a specification in a logic language and get back a program in some other language. That is perfectly doable in practice; there's an entire field of research, Program Synthesis, that produces work along those lines. Modern LLMs can to some extent do that, very unreliably, and of course, like I say, a compiler is nothing but a (deductive) program synthesiser, your unwillingness to accept that well-understood truism notwithstanding.

                >> Compilers are totally besides the point. Python, Java and even Prolog aren't FOL, they lack its expressive power.

                That's incorrect about Prolog. Prolog is SLD-Resolution, i.e. Linear Resolution with a Selection rule restricted to Definite clauses. Definite logic is a restriction of the first order predicate calculus that nevertheless retains its full expressive power, and is semi-decidable (all true statements in definite logic are provable). Most implementations of Prolog executed by Depth-First Search are incomplete because they get stuck in loops, but Prolog executed by SLG-Resolution (a.k.a. tabling) is refutation-complete.

                Prolog is probably the closest practical programming language that achieves what the OP is looking for (although not strictly in program synthesis terms) but the OP may also want to check out things like Z notation:

                https://en.wikipedia.org/wiki/Z_notation

  • immibis 2 days ago ago

    The Curry-Howard correspondence seems like one of those things that's weird and unexpected but not actually very useful?

    Most of the types that correspond to propositions, and programs that correspond to proofs, aren't of much utility in actual programming. And most of the useful programs and their types don't correspond to interesting proofs and propositions.

    The paper relates it to other connections between fields, such as Cartesian coordinates linking geometry and algebra. This allows you to bring the power of algebra into geometric problems and geometric intuition into algebra problems. But does the Curry-Howard correspondence bring similar powers?

    --

    This kind of rearrangement without really changing anything reminds me of the equivalence between first-order logic terms and sets. Consider [P] to be the set of models where statement P is true - then [A∧B] = [A]∩[B], [A∨B] = [A]∪[B] and so on. But this doesn't lead to any new insights. You basically just wrote the same thing with different symbols.
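
    Spelled out for a toy two-variable propositional language, that [P]-as-set-of-models reading looks like this (a TypeScript sketch; the whole encoding is mine):

```typescript
// Models are truth assignments to two propositional variables p and q.
type Model = { p: boolean; q: boolean };

const allModels: Model[] = [
  { p: false, q: false },
  { p: false, q: true },
  { p: true, q: false },
  { p: true, q: true },
];

// Encode a model as a string key so models can live in a Set.
const key = (m: Model) => `${m.p},${m.q}`;

// [P] = the set of models in which P holds.
const sem = (pred: (m: Model) => boolean): Set<string> =>
  new Set(allModels.filter(pred).map(key));

const intersect = (a: Set<string>, b: Set<string>): Set<string> =>
  new Set([...a].filter((x) => b.has(x)));

// [A ∧ B] = [A] ∩ [B] — the same fact written with different symbols:
const lhs = sem((m) => m.p && m.q);
const rhs = intersect(sem((m) => m.p), sem((m) => m.q));
console.log([...lhs].sort().join(";") === [...rhs].sort().join(";")); // true
```

    Which makes the point: the equation is true by construction, a change of notation rather than a new theorem.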

    In some contexts (proving soundness/completeness) it can allow you to step down one turtle, but you still have aleph-null turtles below you so that doesn't seem that useful either.

    • colbyn 2 days ago ago

      To play devil's advocate:

      I'm not a mathematician, but I've heard that the topics that led to modern cryptography were once considered absolutely useless. They say that for centuries, number theory (especially areas like prime numbers, modular arithmetic, and whatnot) was seen as the peak of "pure" math with no real-world utility.

      Personally, I'm all for static analysis and formal verification in software, particularly the kind where properties can be automatically verified by a computer, and to my understanding this field is on the bleeding edge of what's possible.

      From a big-picture perspective, our world is dependent on software, and lives can be at risk when software fails, so I think it's worthwhile to explore ideas that may one day lead to inherently more robust software even if their commercial utility isn't clear.

      • immibis a day ago ago

        Perhaps it leads to something down the line, but for now it's more useful for proving (and still not very) than for programming.

  • gitroom 2 days ago ago

    I feel like every time I try to do super formal proofs for code, it ends up way more brutal than I expect and slower than just fixing stuff as I go. Still love types catching my mess-ups for me though.

    • cbdumas 2 days ago ago

      This is the eternal tradeoff of testing, brought to its logical (no pun intended) conclusion. Testing (and formal verification) doesn't get you anything for free: writing detailed specs or tests is at least as difficult as writing a correct implementation. In fact, given a sufficiently detailed spec, a program conforming to the spec can be derived automatically.

  • undefined 2 days ago ago
    [deleted]
  • aatd86 2 days ago ago

    The article could have gone further into set theory and the set-theoretic interpretation of types.

    A proposition being a set intension.