Ada, Its Design, and the Language That Built the Languages

(iqiipi.com)

133 points | by mpweiher 4 hours ago ago

72 comments

  • YesThatTom2 an hour ago ago

    Ada was also ignored because the typical compiler cost tens of thousands of dollars. No open source or free compiler existed during the decades when popular languages could be had for free.

    I think that is the biggest factor of all.

    • twoodfin an hour ago ago

      Ada’s failure to escape its niche is overdetermined.

      Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980s microcomputers. Intel built the i432 “mainframe on a chip” with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.

      And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.

      • shrubble 8 minutes ago ago

        The first validated compiler for Ada that ran on the IBM PC was released in 1983.

        The third validated compiler ran on the Western Digital “Pascal MicroEngine” running the UCSD p-system with 64K memory. The MicroEngine executed the byte code from the p-system natively, which was an interesting approach.

        I think more research is warranted by you on this subject.

      • michaelcampbell an hour ago ago

        I used it a bit at Uni and remember enjoying it, but can you say what was slow about it: compilation, runtime, or all of it?

        • twoodfin 30 minutes ago ago

          I’ve never directly played with Ada but my understanding is that it was very much both.

          Ada includes a number of critical abstractions that require either dynamic runtime code (slow runtime) or the proverbial sufficiently smart compiler (slow compile-time).

          These were for good reasons, like safety and the need to define concurrent systems within the language. But they were too heavyweight for the commodity hardware of the era.

          Nowadays, languages like Go, C++, Java, Rust, … have no trouble with similar abstractions because optimizers have gotten really good (particularly with inlining) and the hardware has cycles to spare.
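
          To give a rough idea of what I mean (an illustrative sketch of mine, not something from the article): a constrained subtype implies range checks on assignments unless the compiler can prove them away, and a task pulls run-time scheduling support into even a trivial program.

              with Ada.Text_IO;

              procedure Demo is
                 --  A constrained subtype: every assignment gets a range check
                 --  unless the compiler can prove the value is in range.
                 subtype Percent is Integer range 0 .. 100;
                 P : Percent := 10;

                 --  A task: concurrency lives in the language itself, so even a
                 --  tiny program needs run-time scheduling support.
                 task Worker;
                 task body Worker is
                 begin
                    Ada.Text_IO.Put_Line ("hello from the task");
                 end Worker;
              begin
                 P := P + 5;   --  checked at run time, or optimized away
                 Ada.Text_IO.Put_Line (Integer'Image (P));
              end Demo;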

          • sidewndr46 12 minutes ago ago

            I had to take some course that was something like "Programming Language Theory". As a result I had to look at the specifications for dozens of different programming languages. I remember looking at the features of some languages and scratching my head trying to figure out how some of this would ever be practically implemented by a compiler. Later on I found out lots of stuff is just implemented by a runtime anyway, which led me to realize that those fancy language features are often better as a library.

    • acomjean 22 minutes ago ago

      A huge factor. I used Ada for years, and the fact that everyone I worked with did hobby projects in other languages didn’t help it. And most of us liked Ada.

      It had other warts: the string handling wasn’t great, which was a huge problem. It was also slow, in a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not going through the OS, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.

    • shrubble 20 minutes ago ago

      The GNU Ada compiler was first released in 1995: https://en.wikipedia.org/wiki/GNAT

    • rhubarbtree 17 minutes ago ago

      Strange comment. GNAT?

    • eager_learner 32 minutes ago ago

      This. Nothing can compete with free.

  • donatj an hour ago ago

    I like the article overall but the continually repeated 'Language X didn't have that until <YEAR>' is very grating after the first ten or so.

    I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put some side by side comparisons!

    • microtherion 21 minutes ago ago

      You could do the same in reverse as well. Many of the features listed in the first paragraph existed before in other languages, though probably not all of them in a single language. In fact, I believe the design process (sensibly) favored best practices of existing languages rather than completely new and unproven mechanisms.

      So there was considerable borrowing from PASCAL, CLU, MODULA(-2), CSP. It's possible that the elaborate system for specifying machine representations of numbers was truly novel, but I'm not sure how much of a success that was.

    • mcdonje 20 minutes ago ago

      I imagine an Ada dev would find the pattern grating over the decades, so it reads like an expression of that experience.

  • coldcode 37 minutes ago ago

    The US Air Force intended to use Ada, but had to use JOVIAL instead because Ada took so long to be developed. Most people have never heard of JOVIAL but it still exists in the USAF as a legacy.

    I worked with JOVIAL as part of my first project as a programmer in 1981, even though we didn't even have a full JOVIAL compiler there yet (it existed elsewhere). I remember all the talk about the future being Ada, but at the time it was only an incomplete specification.

    • adrian_b 4 minutes ago ago

      JOVIAL had been in use within the US Air Force for more than a decade before the first initiative for designing a single military programming language, which resulted in Ada.

      JOVIAL had been derived from IAL (December 1958), the predecessor of ALGOL 60. However, JOVIAL was defined before the final version of ALGOL 60 (May 1960), so it did not incorporate some of the changes that occurred between IAL and ALGOL 60.

      The timeline of Ada development was marked by increasingly specific documents drawn up by anonymous employees of the Department of Defense, containing the requirements that had to be satisfied by the competing programming language designs:

      1975-04: the STRAWMAN requirements

      1975-08: the WOODENMAN requirements

      1976-01: the TINMAN requirements

      1977-01: the IRONMAN requirements

      1977-07: the IRONMAN requirements (revised)

      1978-06: the STEELMAN requirements

      1979-06: "Preliminary Ada Reference Manual" (after winning the competition)

      Already the STRAWMAN requirements from 1975 contained some features taken from JOVIAL, which the US Air Force used and liked, so they wanted the replacement language to keep them.

      However, starting with the IRONMAN requirements, some features originally taken directly from JOVIAL were replaced by greatly improved original ones. For example, function parameters specified as in JOVIAL were replaced by the requirement to specify the behavior of the parameters regardless of how the compiler implements them: the programmer specifies behaviors like "in", "out" and "in out", and the compiler chooses freely how to pass the parameters, e.g. by value or by reference, depending on which method is more efficient.

      This is a huge improvement over how parameters are specified in languages like C or C++ and in all their descendants. Some of the most important defects of C++, which caused low performance for several decades and are responsible for much of its current complexity, stem from the inability of C++ to distinguish between "out" parameters and "in out" parameters. This misfeature is the reason for a lot of unnecessary machinery in C++: constructors as something different from normal functions, which cannot signal errors other than by exceptions; copy constructors distinct from assignment; the "move" semantics introduced in C++11 to solve the performance problems that had plagued C++ previously; and so on.
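
      As a minimal sketch of my own (the names are hypothetical, not from the article): the Ada signature states only the behavior of each parameter, and the compiler remains free to pass each one by copy or by reference.

          --  Hypothetical sketch: the signature states behavior; the compiler
          --  picks the passing mechanism (copy or reference).
          procedure Update
            (Input   : in     Integer;   --  readable inside Update, not writable
             Result  :    out Integer;   --  Update must produce it
             Counter : in out Integer)   --  read, then modified
          is
          begin
             Result  := Input * 2;
             Counter := Counter + 1;
          end Update;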

  • alyls 3 hours ago ago

    The Twitter account is from April 2026:

    https://xcancel.com/Iqiipi_Essays

    There is no named public author. Truly amazing productivity for such a short time period, and the author generously takes no credit.

    • Geezus_42 2 hours ago ago

      No author because it's a bot.

      • IAmBroom an hour ago ago

        Yes, that was the point.

  • askUq 3 hours ago ago

    From the main page of this website:

    "These are not positions. They are proposals — structures through which a subject might be examined rather than verdicts about it."

    The entire site is AI written.

    • graemep 2 hours ago ago

      How is that evidence that the site was AI written?

      • twoodfin an hour ago ago

        The evidence is that the article’s writing is terrible. It repeats the same rhetorical devices over and over, dressing up a series of facts in false profundity, because there’s no actual authorial insight here. It’s just “write a well-researched article that demonstrates how ahead of its time the Ada language was” + matmul.

        • boxed an hour ago ago

          Humans are not gods of writing that will please all audiences and make no mistakes.

          • twoodfin 39 minutes ago ago

            Neither of those standards is what I’m talking about.

            Obviously this article was highly pleasing to the hn audience as it’s currently sitting at #1. It’s still garbage, because it doesn’t have any interesting ideas behind it. Certainly not commensurate with its length.

      • zozbot234 20 minutes ago ago

        The combination of emdashes and inane non-sequiturs in "These are not X. They're Y" style is pretty damning.

      • quietbritishjim an hour ago ago

        I think the quoted word salad is plenty of evidence.

  • shminge an hour ago ago

    I really don't want this to be AI writing because I enjoyed it, but as other commenters have pointed out, the rate of publishing (according to the linked Twitter account) is very rapid. I'm worried that I can't tell.

    • aeve890 38 minutes ago ago

      >the rate of publishing (according to the linked Twitter account) is very rapid.

      I've written almost 50 blog posts in the last 3 years. All in draft, never published, mostly because of a crippling imposter syndrome and fear of criticism. But every now and then I wake up full of confidence and think "this is it. Today I'll click publish, I don't give a fuck. All in". It never happens. Maybe this author was in the same boat until a month ago. I know there's a high chance it's just a bot, but I can understand if it's not, and how devastating it has to be to overcome the fear of showing your thoughts to the world and then be labeled a bot. If it's not already obvious, English is not my first language and I've used LLMs to check my grammar and improve the style. Maybe all my posts smell like ChatGPT now and this just adds to the fear of being dismissed as slop.

      • twoodfin 21 minutes ago ago

        LLMs do not currently improve the style of typical HN writing. Maybe someday they will; this article is less painfully bad than those of a few months ago.

        The main problem with this article is that it appears to have been written basically out of whole cloth by the LLM; there’s no novel insight here about Ada beyond what you could fit in a short prompt plus the Wikipedia article.

  • tomekw 44 minutes ago ago

    Ada is underrated. I am spending lots of my time writing tons of open source software in Ada, mostly for myself, though.

  • tromp 2 hours ago ago

    > Every language that has added sum types in the past twenty years has added, with its own syntax, what Ada's designers put in the original standard.

    While true, that doesn't mean that other languages' sum types originated in Ada. As [1] states,

    > NPL and Hope are notable for being the first languages with call-by-pattern evaluation and algebraic data types

    and a modern language like Haskell has origins in Hope (from 1980) through Miranda.

    [1] https://en.wikipedia.org/wiki/Hope_(programming_language)

    • adrian_b an hour ago ago

      The origin of all sum types is in "Definition of new data types in ALGOL x", published by John McCarthy in October 1964, who introduced the keyword UNION for such types (he proposed "union" for sum types, "cartesian" for product types, and also operator overloading for custom types).

      John McCarthy, the creator of LISP, also made many major contributions to ALGOL 60 and to its successors (e.g. he introduced recursive functions in ALGOL 60, a major difference from most existing languages at the time, requiring the use of a stack for local variables, while most previous languages used only statically-allocated variables).

      The "union" of McCarthy and of the languages derived from his proposal is not the "union" of the C language, which has used the McCarthy keyword, but with the behavior of FORTRAN "EQUIVALENCE".

      The concept of "union" as proposed by McCarthy was first implemented in the language ALGOL 68, then, as you mention, some functional languages, like Hope and Miranda, have used it extensively, with different syntactic variations.

      • tialaramex an hour ago ago

        Definitely, if you don't have the C "union" user-defined type, you should use this keyword for your sum types. Many languages don't have this feature - which is an extremely sharp blade intended only for experts - and that's fine. You don't need an Abrams tank to take the kids to school, beginners should not learn to fly in the F-35A, and the language for writing your CRUD app does not need C-style unions.

        If Rust didn't have (C-style) unions then its enum should be named union instead. But it does, so they needed a different name. As we work our way through the rough edges of Rust maybe this will stick up more and annoy me, but given Rust 1.95 just finally stabilized core::range::RangeInclusive, the fix for the wonky wheel that is core::ops::RangeInclusive we're not going to get there any time soon.

  • mkovach an hour ago ago

    I've written a few small projects in Ada, and it's a better language than it gets credit for.

    Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.

    In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.

    It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.

    Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.

  • adrian_b 2 hours ago ago

    Ada is a language that had a lot of useful features much earlier than any of the languages that are popular today, and some of those features are still missing from the languages easily available today.

    In the beginning, Ada was criticized mainly for two reasons: it was claimed to be too complex, and it was criticized for being too verbose.

    Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada. In many cases this is because they started as simpler languages to which extra features were added later; since the need for such features had not been anticipated during the initial language design, adding them later was difficult and increased the complexity of the updated language.

    The criticism about verbosity is correct, but it could easily be addressed by preserving the abstract Ada syntax and just replacing many tokens with less verbose symbols. This can easily be done with a source preprocessor, but it is avoided in most places because the source programs then have a non-standard appearance.

    It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax alongside the classic syntax. This would not have been unusual, because several old languages specified abbreviated and non-abbreviated syntactic alternatives, including IBM PL/I and ALGOL 68. Even C had a more verbose syntactic alternative (trigraphs), which was almost never used, yet all C compilers had to support both the standard syntax and its trigraph alternative.

    However, the real defect of Ada was neither complexity nor verbosity, but expensive compilers and software tools, which ensured its replacement by the free C/C++.

    The so-called complexity of Ada has always been mitigated by the fact that, besides the reference specification document, Ada has always had an accompanying design rationale document. The rationale explained the reasons for the choices made when designing the language.

    Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.

    When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.

    Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.

    The parent article mentions that Ada includes the handling of concurrent tasks in the language specification, instead of delegating such things to a system library ("task" is the term used by IBM since 1964 for what is now normally called a "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).

    However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent application around the Ada mechanism of task rendezvous, but I think that this concept is a little too high-level.

    It combines two lower-level actions, and for the highest efficiency it is sometimes necessary to have access to those lower-level actions separately. This means that sometimes using a system library for the communication between concurrent threads may provide higher performance than the built-in Ada concurrency primitives.
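
    For readers who have not seen it, a rendezvous looks roughly like this (a minimal sketch of mine, not from the article). The caller blocks on the entry call until the task reaches the matching accept; the two then synchronize and exchange data in a single combined operation:

        with Ada.Text_IO;

        procedure Rendezvous_Demo is
           task Buffer is
              entry Put (Item : in Integer);
           end Buffer;

           task body Buffer is
              Value : Integer;
           begin
              accept Put (Item : in Integer) do
                 Value := Item;   --  executes while the caller is blocked
              end Put;
              Ada.Text_IO.Put_Line ("received" & Integer'Image (Value));
           end Buffer;
        begin
           Buffer.Put (42);        --  caller waits here until the accept
        end Rendezvous_Demo;

    The synchronization and the data transfer are fused into one construct here, which is exactly the pairing of lower-level actions mentioned above.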

    • init1 an hour ago ago

      Verbosity is a feature, not a bug. Programming is a human activity and thus should use human language and avoid encoded forms that require decoding to understand. The use of abbreviations should be avoided, as it obfuscates the meaning and purpose of code for the reader.

      • adrian_b an hour ago ago

        The programming community is strongly divided between those who believe that verbosity is a feature, not a bug, and those who believe that it is a bug, not a feature.

        A reconciliation between these 2 camps appears impossible. Therefore I think that the ideal programming language should admit 2 equivalent representations, to satisfy both kinds of people.

        The pro-verbose camp argues that they cannot remember many different symbols, so they prefer long texts using keywords resembling a natural language.

        The anti-verbose camp, to which I belong, argues that they can remember mathematical and other such symbols, and that for them it is much more important to see as much of the program on the screen as possible, to avoid having to move back and forth through the source text.

        Both camps claim that their preference produces the easiest-to-read source programs, and this must indeed be true for themselves.

        So it seems that it is impossible to choose rules that can ensure the best readability for all program readers or maintainers.

        My opinion is that source programs should be stored and edited not as text but as abstract syntax trees. The program source editors and viewers should implement multiple kinds of views of the same source program, according to the taste of the user.

        • init1 34 minutes ago ago

          It is not that I cannot remember the symbols - I don't want to; I want the language to plainly explain itself to me. Furthermore, every language has its own set of unique symbols, so as a new reader of a language you first have to familiarize yourself with them. I remember my first few times reading Rust... It still makes my head spin. I had to keep looking up what everything did. If the plain keyword doesn't directly tell you what it's doing, at least it hints at it.

          To be clear, Ada talks about all of this in the Introduction of the Ada reference manual. It was specifically designed for readers as opposed to writers, for very good reasons, and it explains why. It's exactly one of the features other languages will eventually learn they need and will independently "discover" some number of years in the future.

          • zozbot234 10 minutes ago ago

            Rust has complex semantics, not complicated syntax. The syntax was explicitly chosen to be quite C/C++-like while streamlining some aspects of it (e.g. the terrible type-ascription syntax, replaced with `let name: type`).

            • init1 a minute ago ago

              It has both a complicated and terrible syntax, which it inherited and extended from C++, and complicated semantics.

      • zozbot234 15 minutes ago ago

        Verbosity is a feature for small self-contained programs, and a bug for everything else. As long as you're using recognizable mnemonics, and not just ASCII line noise or weird unreadable runes (as with APL), terseness is no obstacle at all for a good programmer.

    • Raphael_Amiard an hour ago ago

      > Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada

      I don’t think you really understand what you’re saying here. I have worked on an Ada compiler for the best part of a decade. It’s one of the most complex languages there is, up there with C++ and C#, and probably Rust.

      • leoc an hour ago ago

        Mind you, that suggests that the sentence is at least half-true even if "much more complex" is a big overstatement, since Rust, "modern" C++ and the later evolutions of C# are all relatively recent. (What would have compared to Ada in complexity back in the day? Common Lisp, Algol 68?)

        As a matter of general interest, what features or elements of Ada make it particularly hard to compile, or compile well? (And are there parts which look like they might be difficult to manage but aren't?)

      • microtherion 17 minutes ago ago

        I imagine Swift is also a very difficult language to compile.

  • jazzypants an hour ago ago

    > JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers.

    What?

    #1 JavaScript doesn't have formal types. What would "representation" even mean here?

    #2 You can just define a variable and not export it. You can't import a variable that isn't exported.

    There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.

  • timschmidt 4 hours ago ago

    It'd be a neat trick to have a single unified language which could bridge the gap between software and hardware description languages.

    • adrian_b 43 minutes ago ago

      The hardware description languages, even if they have a single language specification, are divided into 2 distinct subsets, one used for synthesis, i.e. for hardware design, and one used for simulation, i.e. for hardware verification.

      The subset required for hardware synthesis/design cannot be completely unified with a programming language, because it needs different semantics, though the syntax can be made somewhat similar, as with VHDL, which was derived from Ada, while Verilog was derived from C. However, the subset used for simulation/verification, outside the proper hardware blocks, can be pretty much identical to a programming language.

      So in principle one could have a pair of harmonized languages, one a more or less typical programming language used for verification and a dedicated hardware description language used only for synthesis.

      The current state is not too far from this, because many simulators have interfaces between HDLs and some programming languages, so you can do much verification work in something like C++, instead of SystemVerilog or VHDL. For instance, using C++ for all verification tasks is possible when using Verilator to simulate the hardware blocks.

      I am not aware of any simulator that would allow synthesis in VHDL coupled with writing test benches in Ada, which would be a better fit for VHDL than C++ is, but it could be done.

    • lioeters 2 hours ago ago

      It's an intriguing idea. Having experience with software but almost none (only hobbyist) in hardware, I imagine it'd require a strong type system and mathematical foundation. Perhaps something like Agda, a language that is a proof assistant and theorem prover, with which one can write executable programs. https://en.wikipedia.org/wiki/Agda_(programming_language)

      • timschmidt an hour ago ago

        I wonder if an escape hatch like Rust's unsafe{} would be enough... a hardware{}. The real complexity likely lies in how to integrate the synthesis tools with the compiler and debugger. The timing model. A memory model like Rust's would certainly aid in assuring predictable behavior, but I'm not certain it would be sufficient.

  • ramon156 3 hours ago ago

    off-topic, this article has almost the same theme as dawnfox/dayfox which I love. It fits nicely with my terminal on the left. Cool stuff

  • spinningslate 3 hours ago ago

    Wonderful article and a good fit with HN’s motto of “move slowly and preserve things” as opposed to Silicon Valley’s jingoistic “move fast and break things”.

    It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and instead restart? In doing so, we often make mistakes that could have been avoided if we’d taken the time, or had the curiosity and humility, to learn from others. This seems particularly prevalent in software: “standing on the feet of giants” is the default rather than the exception.

    That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. Resolved to find and read the referenced “steelman” and language design rationale papers.

    • smitty1e 2 hours ago ago

      > Why do we, as a species, ignore hard-won experience and instead restart?

      Humanity moves from individual to society, not the reverse.

      Some knowledge moves from the plural to the singular, top to bottom, but the regular existential mode is bottom-up, a point The Famous Article (TFA) makes in the context of programming languages.

      Children and ideas grow from babe to adult. They do not spring full grown from the brow of Zeus other than in myth.

      • spinningslate 2 hours ago ago

        Thanks, that’s helpful. My wife is a teacher and talks about knowledge being recreated, not relearned: IOW it’s new to the learner even if known by the teacher. Hadn’t put those things together before.

    • cgadski 2 hours ago ago

      Does anyone understand how/why old HN accounts become mouthpieces for language models?

      • spinningslate 2 hours ago ago

        Erm, well, the comment wasn’t AI-generated; it was by me - a warts-and-all human. The sibling comments say TFA is AI-generated and I’ll be the first to admit I didn’t spot that. Still found it interesting though.

      • projektfu 2 hours ago ago

        That seems uncharitable.

  • bananaflag 3 hours ago ago

    I am wondering what the Ada equivalent of affine types is. What is the feature that solves the problem that affine types solve in Rust?

  • mhd 2 hours ago ago

    No mention of Algol? Or Mesa?

  • turtleyacht 4 hours ago ago

    The next language ought to ensure memory-safe conditions across the network.

    • yvdriess 3 hours ago ago

      AmbientTalk did this. I used it for a demo where I dragged an mp3 player's UI button to another machine, where pressing play would play it back on the originator's speakers. Proper actor programming in the vein of E and Erlang.

      https://soft.vub.ac.be/amop/

    • csrse 3 hours ago ago

      Already exists since way back: https://github.com/mozart/mozart2 (for example)

    • gostsamo 3 hours ago ago

      The article states that the language can have extensions for different domains, so that is also an option.

    • derleyici 4 hours ago ago

      And the answer is… Rust.

      • anthk 4 hours ago ago

        Or Algol 68, which is making a comeback.

        • pjmlp 3 hours ago ago

          Or even ESPOL and its evolution, NEWP, which never went away; it is only available to Unisys customers that treat security as a top deployment priority.

          • EvanAnderson 2 hours ago ago

            I wish more people knew about the Burroughs Large Systems[0] machines. I haven't written any code for them, but I got turned on to them by a financial Customer who ran a ClearPath Series A MCP system (and later one of the NT-based ClearPath machines with the SCAMP processor on a card) back in the late 90s, and later by a fellow contractor who did ALGOL programming for Unisys in the mid-70s and early 80s. It seems like an architecture with an uncompromising attitude toward security, and an utterly parallel universe to what the rest of the industry does (except for, perhaps, the IBM AS/400, at least in the sense of being uncompromising on design ideals).

            [0] https://en.wikipedia.org/wiki/Burroughs_Large_Systems

            • pjmlp 2 hours ago ago

              Yes, IBM i and z/OS are the other survivors.

  • phplovesong 33 minutes ago ago

    I would never work on projects that Ada is used for.

    1. Would never work on "missile tech" or other "kills people" tech.

    2. Would never work on (civ) aircraft tech, as I would probably burn out from the stress of messing something up and having an airplane crash.

    That said, I'm sure it's also used for stuff that does not kill people, or does not have a high stress level.

  • DeathArrow an hour ago ago

    It looks like OpenClaw started blogging. :D