I mean… an ABI is more like an agreement. It isn't a specification. It'd be nice if everything were neatly specified and sound. But as the author notes near the end… there are a lot of platforms and they all have their own quirks.
There has to be an ABI agreed upon by everyone; otherwise there wouldn't be any interoperability. And if we didn't have the System V ABI, what would we use instead? Prepare for a long debate as every language author, operating system designer, and platform under the sun argues for their respective demands and proposals. And as sure as the sun rises in the East someone, somewhere, would write an article such as this one decrying that blessed ABI.
The System V ABI shouldn't be the be-all and end-all, IMO. But progress should be incremental. Because a lingua franca loses its primary feature and utility when we all return to our own fiefdoms and stop talking to one another in the common tongue.
It’s a pain in the metaphorical butt. But it’s better, IMO, than the alternatives. It’s kind of neat that SystemV works so well let alone at all.
This article isn't about languages. It's about the protocol for two or more languages to talk to each other. There is no specification for this.
The System V ABI is as close as we get to an actual specification, but not everyone uses it, and in any case it only covers a small part of the protocol.
> Anyone who spends much time trying to parse C(++) headers very quickly says “ah, actually, fuck that” and asks a C(++) compiler to do it.
That's exactly my case. For my programming language I wrote a tool for converting C headers using libclang. Even with the help of this library it wasn't that easy; I found a lot of caveats while trying to convert headers like <windows.h>.
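For the curious, a minimal sketch of what "asking a C compiler to do it" looks like with libclang's C API. The header name example.h, the lack of diagnostics handling, and the missing include paths are my simplifications, not the commenter's tool:

    #include <clang-c/Index.h>   /* link with -lclang */
    #include <stdio.h>

    /* Print every function declaration found in a header, with its full type. */
    static enum CXChildVisitResult visit(CXCursor c, CXCursor parent, CXClientData data) {
        (void)parent; (void)data;
        if (clang_getCursorKind(c) == CXCursor_FunctionDecl) {
            CXString name = clang_getCursorSpelling(c);
            CXString type = clang_getTypeSpelling(clang_getCursorType(c));
            printf("%s : %s\n", clang_getCString(name), clang_getCString(type));
            clang_disposeString(name);
            clang_disposeString(type);
        }
        return CXChildVisit_Recurse;
    }

    int main(void) {
        CXIndex idx = clang_createIndex(0, 0);
        CXTranslationUnit tu = clang_parseTranslationUnit(
            idx, "example.h", NULL, 0, NULL, 0, CXTranslationUnit_None);
        if (!tu) return 1;
        clang_visitChildren(clang_getTranslationUnitCursor(tu), visit, NULL);
        clang_disposeTranslationUnit(tu);
        clang_disposeIndex(idx);
        return 0;
    }

Even this only scratches the surface: macros, bitfields and implementation-defined layout still need the compiler's answers, which is exactly the point.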
Does someone pay for anti-C propaganda?? All these logic-breaking accusations...
E.g., here, from memory:
> ...you want to read 32 bits from a file but OH NOOES long is 64 bits! The language! The impossibility!
But when you read something or deserialize some format, you just need to know, based on the format's schema or domain knowledge. Simple and straightforward like that! You don't do some "reflection" on what the language standard provides and then expect someone to send you exactly that!!
So that anti-C "movement" is mostly based on brainless examples.
Not saying C is perfect.
But it is very good, and I bet IBM and other big corps will keep selling things written and actively developed in C/C++, plus hefty consulting fees.
In the meantime the proles have been advised to move to cpu-cycle-eating inferior languages and layers upon layers of cycle-burning infra in zero-privacy, guaranteed-data-leak clouds.
Oh, by the way, that famous Java "bean" is just an object with, usually, language-provided "basic types"... How should that poor programmer from the article know what to read from disk when he only has the types Java provides?? How? Or maybe he should use some domain knowledge or a schema for the problem he is trying to solve??
And in a "scripting language" with automatic ints, how do you even know how many bits the runtime/VM actually uses? Maybe some reflection to check the type? But again, how does that even help if there is no knowledge in the brain of how many bits should be read?? Calling some cycle-burning reflection, or virtual and as-indirect-as-possible things, is what the fat tigers love the most :)
We didn't do it to annoy you or to foist bad APIs on you. We did it because it was the best language for writing machine code at the time. By miles. Not understanding why this is true will lead you to make all the same mistakes the languages "bested" by C made.
I seem to recall reading this before; I must not have noticed the furry stuff because I would have ignored it.
My take was that this is a rustacean who was having trouble porting something from C, went down a deep rabbit hole of traditional C software, and instead of recognizing that perhaps they are in way over their head, concluded that the issue was with C, that it's all wrong, and that therefore their mission of porting stuff to Rust is even more virtuous.
This is just an ad hominem attack. Doesn't seem like the author is "in over their head"; they seem to have a pretty solid grasp of actual identifiable gaps between implementations and various specs, and the article was written with the same kind of "chastising" tone as you would see from any grey-bearded hacker who's unsatisfied with the way things are.
I hate to make judgments without a mountain of evidence, but a cursory glance of their about page honestly made me think this person likely suffers from a multitude of mental health issues.
I think we can be accommodating of a wide array of diagnosable mental conditions in software. I'm thinking of Terry Davis's TempleOS, Kenneth Reitz's Requests.
Authors being upfront about it dispels a lot of the potential tension and substantially changes the way we interact. I understand there may be a conflict and not everyone will want to advertise their diagnosis, but in my experience, once it becomes clear that's what's going on, it helps all the parties involved.
I always thought that C was a stepping stone to learn other languages. Like Pascal, it was educational to learn. My Comp Sci courses in 1986-1990 used Turbo Pascal and Turbo C.
I think so too; for most devs C is like Latin, or Roman Law: not something we develop and use, but rather something we learn for context and to understand future developments.
There are some people who still develop in C, for sure, but it's limited to FOSS and embedded at this point, with low-level proprietary systems having mostly migrated to C++ or Rust.
I agree with the main thesis that C isn't a language like the others, something that we practice, that it's mostly an ancient but highly influential language, and it's an API/ABI.
What I disagree with is that 'critiquing' C is productive in the same way that critiquing Roman Law or Latin or Plato is productive. The horse is dead; one might think oneself clever or novel for finding flaws in the dead horse, but it's more often a defense mechanism to justify having a hard time learning the decades of backwards compatibility, edge cases and warts that have accumulated.
It's easier to think of the previous generation as being dumb and having made mistakes that could have been fixed, and that it all could be simpler, rather than recognize that engineering is super complex and that we might as well dedicate our full life to learning this craft and still not make a dent.
I applaud the new generation for taking on this challenge and giving the revolution their best shot, but I'm personally thinking of bridging the next-next generation and the previous generation of devs. The historical complexity of the field will increase linearly with time, and I think if we pace ourselves we can keep the complexity down; the more times we hop onto a revolution that dismisses the previous generation as dumb, the bigger the complexity is going to be.
There is also still a lot of low-level proprietary code developed in C. I would guess far more than what is developed in Rust.
I fully agree with your last point. The proposed solutions to some of the deficiencies of C are sometimes worse than the disease, while their benefits are often exaggerated, at the same time adding unnecessary layers of complexity that will haunt us for decades. In contrast, my hope would be to carefully revise the things we have, but this takes time and patience.
None of the alternatives have stability. What was exemplary & idiomatic rust pre-pandemic would be churlish & rejected now and the toolchain use would be different anyway.
Carpenters, plumbers, masons & electricians work on houses 3-300 yrs old, navigate the range of legacy styles & tech they encounter, and predictably get good outcomes.
Only C has, as yet, given us that level of serviceability. C99, baby, why pay more?
When there’s an alternative that can compete with that sort of real-world durability, C will get displaced.
Having just finished renovating a 140-year-old home with solid brick walls that was slowly collapsing and deteriorating due to the aforementioned professionals’ application of modern and chemically incompatible materials to it… I’m not sure I agree. It’s also why a lot of the UK’s building stock is slowly rotting with black mould right now. Literally none of the professionals I hired, before I trained them, knew how to properly repair a type of home that represents 30% of the UK building stock.
Outside UNIX clones, and the embedded space, where it is more a religious point of view than a matter of available compiler toolchains, C has already been displaced.
Even the most relevant C compilers are no longer written in C.
> Even the most relevant C compilers are no longer written in C.
Worth pointing out that most C compilers are also C++ compilers.
So the point is kind of distorted.
FWIW, the crabi project within Rust is trying to improve on some parts of it. But it is still built on (a subset of) the environment's C ABI. And it doesn't fix all the problems.
The replacement has already happened. It is HTTP and JSON for 99% of the software developed today. There are multiple reasons C stayed, but the most obvious ones for me are:
- People just stopped caring about operating systems research and systems programming after ~2005. Actual engineering implementations of the concepts largely stopped after the second half of the 90s. Most developers moved on to making websites or applications in higher level programming languages.
- C hit a perfect balance: a small enough language to grok, independent of the system manufacturers, reflecting the computer architecture of the 80s, actually small in syntax and code length, and quite easy to implement compilers for. This caused lots of legacy software to be built into the infrastructure that gave birth to the current popular OSes and, more importantly, the infrastructure of the Internet. Add in .com bubbles and other crises, and we basically have/had zero economic incentive to replace those systems.
- Culture changed. We cared more about stability, repairability and reusability. Computers were expensive. So are programmers and software. Now computers are cheap. Our culture is more consumerist than ever. The mentality of "move fast and break things" permeated so well with economic policy and the zeitgeist. With AI it will get worse. So trying to make a real alternative to C (as a generic low level OS protocol) has reduced cultural value / optics. It doesn't fill the CVs as well.
It doesn't mean that people haven't tried or even succeeded. Android was successful on multiple fronts in replacing C. Its "intents" and its low-level interface description language for hardware interfaces are a great replacement for the C ABI. Windows' COM is also a good replacement that gets rid of language dependence. There are still newer OSes trying, like Redox or Fuchsia.
I'd never thought I'd see the day that anyone praises COM.
If you read Don Box’s book on COM he goes through every decision they made and the rationale for it. It all seemed to make sense.
Unfortunately I think Don Box was the only person in the world who really understood it all.
As an idea it is great, and the tooling available in C++ Builder, Delphi, VB 6, and C++/CX (WinRT is basically COM with extras) is also great.
Using it from MFC, kind of alright.
Using it from .NET, depends if Framework, .NET Native, or modern, with various levels of usability.
Using it from ATL, WRL, or C++/WinRT is a mess, unfortunately.
Name me a stable binary interface that is not ints and arrays of ints
The C ABI is more than how ints and arrays of ints sit next to each other in memory.
It cares about calling conventions and about what you can store in registers versus what you cannot. There are multiple possible ways of doing an RPC call, and the C ABI only provides one way of doing it.
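A small illustration of the register-versus-memory point. The comments describe the System V AMD64 ABI specifically; other ABIs (Windows x64, 32-bit x86, ARM AAPCS) make different choices for the very same C source:

    #include <stdint.h>

    struct small { int32_t a, b; };      /* 8 bytes  */
    struct big   { int64_t a, b, c; };   /* 24 bytes */

    /* Fits in 16 bytes, so under System V AMD64 it comes back in a register (RAX). */
    struct small make_small(void) { struct small s = {1, 2}; return s; }

    /* Too large for registers: the caller passes a hidden pointer to storage and
     * the result is written through it. Same C, different calling protocol. */
    struct big make_big(void) { struct big b = {1, 2, 3}; return b; }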
COM, WinRT, XPC, AIDL.
Now you can move the goal posts and assert that any data serialized into a memory buffer is an array of ints.
> People just stopped caring about operating systems research and systems programming after ~2005.
and so it was that after that date, all development of
embedded systems
kernel drivers
digital audio workstations
video editors
codecs for audio and video
anything that involved actually controlling non-computer hardware
game engines
came to a grinding halt, and no further work was done.
On Windows, macOS and Android, most of that development on that list is done in C++, not C.
> It doesn't mean that people haven't tried or even succeeded. Android was successful on multiple fronts in replacing C. Its "intents" and its low-level interface description language for hardware interfaces are a great replacement for the C ABI. Windows' COM is also a good replacement that gets rid of language dependence. There are still newer OSes trying, like Redox or Fuchsia.
I am not sure I buy this from a system perspective, especially when taking this[1] into consideration.
______
1. Alexis King's reply to "Why do common Rust packages depend on C code?". Link: https://langdev.stackexchange.com/a/3237
In pieces of my code, I need to call setuserid() to manage some of the security that I designed in 2010.
There was no Rust at that point, and I used the most basic tool that could do it.
Could I have done this in Java with gymnastics of JNI, linking C into the JRE?
Definite maybe.
Yes, nowadays with Panama, and before Rust was around, JNA was already there so using JNI wasn't strictly necessary.
>Culture changed. We cared more about stability, repairability and reusability. Computers were expensive. So are programmers and software. Now computers are cheap. Our culture is more consumerist than ever. The mentality of "move fast and break things" permeated so well with economic policy and the zeitgeist. With AI it will get worse. So trying to make a real alternative to C (as a generic low level OS protocol) has reduced cultural value / optics. It doesn't fill the CVs as well.
IMO I do see this changing in the future as higher power computers become expensive once again, and I'm not just referring to the recent chip shortage.
> Only C has, as yet, given us that level of serviceability.
On the contrary, Lisp outshines C to a large degree here. Success has nothing to do with technical merit (if such a thing even exists); it's not a rational game.
What makes you say that?
Reduce is a Lisp library that has been in active use since 1968, making it older than C itself. We can point to GNU Emacs as an ancient and venerable self-contained Lisp tortoise with more wrinkles than are finitely enumerable, which is in fact a hosted Lisp operating system. Pulling it apart and working with it is admittedly a treat even if I loathe it as a text editor. Mezzano is a modern Lisp OS that you can play with in a VM, and it might give you an idea of why Lisp is such a great systems language.
In short: Lisp is semantic and geared towards a living system. The basic REPL is sh + cc + ld + db (and a few others) all in one. It's almost a little mind bending how nice these systems are put together, how cleanly they work. C is like pulling teeth in comparison.
I'm not even a fan of Lisp or sexpr languages. But it's the obvious heavyweight champion of longevity and an ultra-pragmatic service record... Yes, even in the systems domain.
C99, but with a million macros backporting features from newer language versions and compiler extensions (see the sketch after this list). Lovely features you don't get with ordinary C99:
free_sized
#embed
static_assert
Types for enum
Alignof, alignas, aligned_alloc
_Atomic
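A rough sketch of the kind of backporting macros meant here, leaning on GCC/Clang extensions; the macro names are illustrative, not from any particular library:

    /* static_assert for C99: the classic negative-array-size trick. */
    #define CONCAT_(a, b) a##b
    #define CONCAT(a, b)  CONCAT_(a, b)
    #define STATIC_ASSERT(cond, msg) \
        typedef char CONCAT(static_assert_, __LINE__)[(cond) ? 1 : -1]

    /* Alignment control for C99, via compiler extensions. */
    #define ALIGNOF(T)  __alignof__(T)               /* GCC/Clang */
    #define ALIGNAS(N)  __attribute__((aligned(N)))  /* GCC/Clang */

    STATIC_ASSERT(sizeof(int) == 4, "this code assumes 32-bit int");
    ALIGNAS(16) static float lanes[4];               /* 16-byte aligned storage */

    /* (#embed and free_sized have no clean macro equivalent; those need real
     *  compiler and libc support, which is part of the complaint.) */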
It’s weird how whiny this post is. Like there’s zero intellectual curiosity about why C got this way, and why C gets to be the foundation for how systems software is written.
I could write a whole essay about why, but now isn’t the time. I’m just going to enjoy the fact that TFA and the author don’t get it.
> why C gets to be the foundation for how systems software is written.
Is there an answer here more interesting than "it's what Unix and Windows were written in, so that's how programs talked to the OS, and once you have an interface, it's impossible to change"?
It wasn't a coincidence, or an accident. C was specifically designed to write Unix, by people who had experience with a lot of other computer languages, and had programmed other operating systems including Multics and some earlier versions of Unix. They knew exactly what they were doing, and exactly what they wanted.
They wanted to play and ignored other languages on purpose, that is all.
> Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for. All these languages influenced our work, but it was more fun to do things on our own.
https://www.nokia.com/bell-labs/about/dennis-m-ritchie/chist...
Pity that, in regard to secure programming practices in C, the community also ignores the decisions of the authors.
> Although the first edition of K&R described most of the rules that brought C's type structure to its present form, many programs written in the older, more relaxed style persisted, and so did compilers that tolerated it. To encourage people to pay more attention to the official language rules, to detect legal but suspicious constructions, and to help find interface mismatches undetectable with simple mechanisms for separate compilation, Steve Johnson adapted his pcc compiler to produce lint [Johnson 79b], which scanned a set of files and remarked on dubious constructions.
It should also be noted that on Plan 9 they attempted to replace C with Alef for userspace; while that experiment failed, they went with Limbo on Inferno, and also contributed to Go.
And that C compiler on Plan 9 is its own thing,
> The compiler implements ANSI C with some restrictions and extensions [ANSI90]. Most of the restrictions are due to personal preference, while most of the extensions were to help in the implementation of Plan 9. There are other departures from the standard, particularly in the libraries, that are beyond the scope of this paper.
https://doc.cat-v.org/plan_9/4th_edition/papers/compiler
I'm not sure what you mean by "coincidence" or "accident" here.
C is a pretty OK language for writing an OS in the 70s. UNIX got popular for reasons I think largely orthogonal to being written in C. UNIX was one of the first operating systems that was widely licensed to universities. Students were obliged to learn C to work with it.
If the Macintosh OS had come out first and taken over the world, we'd probably all be programming in Object Pascal.
When everyone wanted to program for the web, we all learned JavaScript regardless of its merits or lack thereof.
I don't think there's much very interesting about C beyond the fact that it rode a platform's coattails to popularity. If there is something interesting about it that I'm missing, I'd definitely like to know.
It is often said that C became popular just because Unix was popular, due to being free -- it just "rode its coattails" as you put it.
As if you could separate Unix from C. Without C there wouldn't have been any Unix to become popular, there wouldn't have been any coattails to ride.
C gave Unix some advantages that other operating systems of the 1970s and 80s didn't have:
Unix was ported to many different computers spanning a large range of cost and size, from microcomputers to mainframes.
In Unix both the operating system and the applications were written in the same language.
The original Unix and C developers wrote persuasive books that taught the C language and demonstrated how to do systems programming and application programming in C on Unix.
Unix wasn't the first operating system to be written in a high-level language. The Burroughs OS was written in Algol, Multics was written in PL/I, and much of VMS was written in BLISS. None of those languages became popular.
In the 1970s and 80s, Unix wasn't universal in universities. Other operating systems were also widely used: Tenex, TOPS-10, and TOPS-20 on DEC-10s and 20s, VMS on VAXes. But their systems languages and programming cultures did not catch on in the same way as C and Unix.
The original Macintosh OS of the 1980s was no competitor to Unix. It was a single user system without integrated network support. Apple replaced the original Macintosh OS with a system based on a Unix.
> Unix wasn't the first operating system to be written in a high-level language. The Burroughs OS was written in Algol, Multics was written in PL/I, and much of VMS was written in BLISS. None of those languages became popular.
Of course, they weren't available as free beer with source tapes.
> Apple replaced the original Macintosh OS with a system based on a Unix.
Only because they decided to buy NeXT instead of Be.
Had they bought Be, that would not have been true at all.
First to market is not necessarily the best. Case in point: many video sites existed before YouTube, including ones based on Apple QuickTime. But in the end Flash won.
To me it looks like there is a better way to do things and the better one eventually wins.
Repeating a previous comment of mine (https://news.ycombinator.com/item?id=32784959) about an article in Byte Magazine (August 1983) on the C programming language:
From page 52:
> Operating systems have to deal with some very unusual objects and events: interrupts; memory maps; apparent locations in memory that really represent devices, hardware traps and faults; and I/O controllers. It is unlikely that even a low-level model can adequately support all of these notions or new ones that come along in the future. So a key idea in C is that the language model be flexible; with escape hatches to allow the programmer to do the right thing, even if the language designer didn't think of it first.
This. This is the difference between C and Pascal. This is why C won and Pascal lost - because Pascal prohibited everything but what Wirth thought should be allowed, and Wirth had far too limited a vision of what people might need to do. Ritchie, in contrast, knew he wasn't smart enough to play that game, so he didn't try. As a result, in practice C was considerably more usable than Pascal. The closer you were to the metal, the greater C's advantage. And in those days, you were often pretty close to the metal...
Later, on page 60:
> Much of the C model relies on the programmer always being right, so the task of the language is to make it easy what is necessary... The converse model, which is the basis of Pascal and Ada, is that the programmer is often wrong, so the language should make it hard to say anything incorrect... Finally, the large amount of freedom provided in the language means that you can make truly spectacular errors, far exceeding the relatively trivial difficulties you encounter misusing, say, BASIC.
Also true. And it is true that the "Pascal model" of the programmer has quite a bit of truth to it. But programmers collectively chose freedom over restrictions, even restrictions that were intended to be for their own good.
The irony is that all wannabe C and C++ replacements are exactly the "Pascal model" brought back into the 21st century, go figure.
"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
-- C.A.R Hoare's "The 1980 ACM Turing Award Lecture"
> I'm not sure what you mean by "coincidence" or "accident" here.
I mean Unix had to be written in C, not in, say, Algol or PL/I or BLISS, high-level languages used to write other operating systems.
I also meant that the features of C were not put there by impulse or whim, they were the outcome of considered decisions guided by the specific needs of Unix.
No it had not,
> Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for. All these languages influenced our work, but it was more fun to do things on our own.
Yes and no. Clearly what you said is true, but the more profound reason is that C just minimally reflects how computers work. The rest is just convention.
More concretely, I think the magic lies in these two properties:
1. Conservation of mass: the amount of C code you put in will be pretty close to the amount of machine code you get out. Aside from the preprocessor, which is very obviously expanding macros, there are almost no features of C that will take a small amount of code and expand it to a large amount of output. This makes some things annoyingly verbose to code in C (e.g. string manipulation), but that annoyance reflects a true fact of machine code, which is that it cannot handle strings very easily.
2. Conservation of energy: the only work that will be performed is the code that you put into your program. There is no "supervisor" performing work on the side (garbage collection, stack checking, context switching) on your behalf. From a practical perspective, this means that the machine code produced by a C compiler is standalone, and can be called from any runtime without needing a special environment to be set up. This is what makes C such a good language for implementing garbage collection, stack checking, context switching, etc.
There are some exceptions to both of these principles. Auto-vectorizing compilers can produce large amounts of output from small amounts of input. Some C compilers do support stack checking (e.g. `-fstack-check`). Some implementations of C will perform garbage collection (e.g. Boehm, Fil-C). For dynamically linked executables, the PLT stubs will perform hash table lookups the first time you call a function. The point is that C makes it very possible to avoid all of these things, which has made it a great technology for programming close to the machine.
Some languages excel at one but not the other. Byte-code oriented languages generally do well at (1): for example, Java .class files are usually pretty lean, as the byte-code semantics are pretty close to the Java language. Go is also pretty good at (1). Languages like C++ or Rust are generally good at (2), but have much larger binaries on average than C thanks to generics, exceptions/panics, and other features. C is one of the few languages I've seen that does both (1) and (2) well.
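To make property (1) concrete, a tiny example. The assembly in the comment is roughly what a typical x86-64 compiler emits at -O2; it's an illustration, not a guaranteed output:

    /* One C expression in, a couple of instructions out: */
    int add(int a, int b) {
        return a + b;   /* ~ lea eax, [rdi+rsi] ; ret   (System V AMD64, -O2) */
    }

    /* Property (2) shows up here too: add() needs no runtime initialization,
     * so any language that can make a C call can use it directly. The flip
     * side is that anything the machine doesn't do for you (string concat,
     * bounds checks, cleanup) has to appear as explicit, visible code. */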
It minimally reflects PDP-11 assembly, which is not how modern computers work.
This is a meme which is repeated often, but not really true. If you disagree, please state specifically what property of the PDP-11 you think is different from how modern computers work, and where this affects C but not other languages.
It lacked SIMD instructions.
The things complained about in the article are not a minimal reflection of how computers work.
Take the "wobbly types" for example. It would have been more "minimal" to have types tied directly to their sizes instead of having short, int, long, etc.
There isn't any reason that compilers on the same platform have to disagree on the layout of the same basic type, but they do.
The complaints about parsing header files could potentially be solved by an IDL that could compile to C header files and FFI definitions for other languages. It could even be a subset of C that is easier to parse. But nothing like that has ever caught on.
How does C reflect how AVX works?
> C just minimally reflects how computers work. The rest is just convention.
This hasn't been true for decades. x86 assembly is now itself an abstraction over what the CPU is actually doing.
Microcode, speculative execution, etc.
It seems to be a meme on HN that C doesn't reflect hardware, now you're extending that to assembly. It seems silly to me. It was always an approximation of what happens under the hood, but I think the concepts of pointers, variable sizes and memory layout of structs all represent the machine at some level.
It's not a meme.
For example, C has pointer provenance, so pointers aren't just addresses. That's why type punning is such a mess. For a language that claims to be super close to the hardware, this seems like a very weird thing.
> the concepts of pointers, variable sizes and memory layout of structs all represent the machine at some level.
Exactly.
Everything in assembly is still one-to-one in terms of functional/stateful behavior to actual execution. Runtime hardware optimization (pinhole instruction decomposition and reordering, speculative branching, automated caching, etc.) give a performance boost but do not change the model. Doing so would mean it didn't work!
And C is still very close to the assembly, in terms of basic operations. Even if a compiler is able to map the same C operations to different instructions (i.e. regular, SIMD, etc.)
Let's play a game: name something ISO C can do that no other systems programming language has a similar feature for.
If language extensions to ISO C are allowed, then the same goes for my selection of competing systems languages.
I'm not sure I agree with "impossible to change".
It's 2026, and to this date I cannot use a standard library/API to open a file with a UTF-8 filename without a null-terminated string.
When you want to talk to the OS you constantly face the need to add unnecessary overhead (allocation due to string conversion, strlen).
The OS itself does not prevent anything from having those standard "no overhead" APIs.
However, it's pretty clear that nobody cares to define some new sane interfaces and nobody cares to deprecate the old ones.
That would include both API and ABI.
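A sketch of the overhead being described, assuming the caller holds a (pointer, length) string view; error handling is trimmed, and the helper name is mine:

    #include <fcntl.h>
    #include <stdlib.h>
    #include <string.h>

    /* POSIX open() wants a NUL-terminated char*, so a counted string has to be
     * copied just to append the terminator, even though the kernel copies the
     * path again on its side anyway. */
    int open_view(const char *name, size_t len, int flags) {
        char *path = malloc(len + 1);
        if (!path) return -1;
        memcpy(path, name, len);
        path[len] = '\0';
        int fd = open(path, flags);
        free(path);
        return fd;
    }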
Yes
Care to share the answer with the rest of the class?
I am not sure what Filip's view on this is. But I'd like to point out the article from Stephen Kell linked below, which explains why C is an incredibly useful tool for systems programming and what distinguishes it from all other languages.
https://dl.acm.org/doi/abs/10.1145/3133850.3133867
The author is upfront about their goals and motivations and explicitly acknowledges that other concerns exist. Calling it whiny is ungracious -- the author is letting some very human frustration peek through in their narrative.
Not everything has to be written with all the warmth and humanity of a UN subcommittee interim report on widget standardisation.
What is TFA?
The Fine/Fabulous/Fucking Article
Choose your own adjective
To further explain: it comes from 'RTFA' https://en.wiktionary.org/wiki/RTFA which was developed on Slashdot as a variation on 'RTFM'.
Featured
The writing is terrible and full of fluff. Maybe the cat logo should have been a warning.
The trouble with C as an API format is that there's no size info. That's asking for buffer overflows.
There's an argument for full type info at an API, but that gets complicated across languages. Things that do that degenerate into CORBA. Size info, though, is meaningful at the machine level, and ought to be there.
Apple originally had Pascal APIs for the Mac, which did carry along size info. But they caved and went with C APIs.
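To spell out the size-info point: at the C API boundary both of the functions below see only a pointer, and the only defense is the convention of passing the size as a separate argument. A minimal sketch using standard library calls:

    #include <stdio.h>
    #include <string.h>

    /* Nothing in the type says how big dst is. */
    void risky(char *dst, const char *src) {
        strcpy(dst, src);                  /* overflows if src is too long */
    }

    /* The size travels out-of-band, by convention only. */
    void safer(char *dst, size_t dstlen, const char *src) {
        snprintf(dst, dstlen, "%s", src);  /* truncates instead of overflowing */
    }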
C’s biggest sins (also inherited by C++):
- unspecified default type sizes. Should have had i8, u16, i32, u64, f32, f64 from the beginning.
- pointers being allowed to alias by default (i.e. non-aliasing should have been the default, with an alias keyword for the exceptions). Performance matters. All these benchmarks which show something beating C or C++ are mostly due to dealing with aliasing pointers. C++26 still doesn't have a standardised restrict keyword. (See the restrict sketch below.)
There are more but I understand the logic/usability/history behind them. The above points should have been addressed in the 80’s.
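For the aliasing point, a minimal example of what C99's restrict buys (and what standard C++ still lacks, outside compiler-specific __restrict extensions):

    /* Without the restrict qualifiers the compiler must assume out may alias
     * a or b and reload them on every iteration; with them it can keep values
     * in registers and vectorize more aggressively. */
    void add_arrays(float *restrict out,
                    const float *restrict a,
                    const float *restrict b,
                    int n) {
        for (int i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }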
Arrays decaying to pointers is probably the biggest non-platform specific design oversight.
As you said, it's easy to see where it came from, but it should've been fixed long ago.
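The oversight in a few lines, for anyone who hasn't been bitten yet; the pointer size printed is typically 8 on 64-bit platforms:

    #include <stdio.h>

    void takes_array(int a[10]) {     /* silently adjusted to: int *a */
        printf("%zu\n", sizeof a);    /* size of a pointer (e.g. 8), not 40 */
    }

    int main(void) {
        int xs[10];
        printf("%zu\n", sizeof xs);   /* 40: the length is only known here */
        takes_array(xs);              /* xs decays to &xs[0]; the length is lost */
        return 0;
    }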
Not the error "handling"? The array implementation? The weak type system? The barebones macro system? The nearly unusable standard library? The standard itself, a 750-page tome you have to memorize, lest C is allowed to erase your hard drive?
C is sin incarnated.
I really don't understand why people keep misunderstanding this post so badly. It's not a complaint about C as a programming language. It's a complaint that, due to so much infrastructure being implemented in C, anyone who wants to interact with that infrastructure is forced to deal with some of the constraints of C. C has moved beyond merely being a programming language and become the most common interface for in-process interoperability between languages[1], and that means everyone working at that level needs to care about C even if they have no intention of writing C.
It's understandable how we got here, but it's an entirely legitimate question - could things be better if we had an explicitly designed interoperability interface? Given my experiences with cgo, I'd be pretty solidly on the "Fuck yes" side of things.
(Of course, any such interface would probably end up being designed by committee and end up embodying chunks of ALGOL ABI or something instead, so this may not be the worst possible world but that doesn't mean we have to like it)
[1] I absolutely buy the argument that HTTP probably wins out for out-of-process interoperability.
> could things be better if we had an explicitly designed interoperability interface?
Yes, we could define a language-agnostic binary interoperability standard with its own interface definition language, or IDL. Maybe call it something neutral like the Component Object Model, or just COM[1]. :)
[1] https://en.wikipedia.org/wiki/Component_Object_Model
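For readers who haven't touched COM: the core idea, stripped of the registry and IDL tooling, is just an agreed binary layout - an interface is a struct whose first field points to a table of function pointers. A rough C sketch (illustrative names, not the real COM headers):

    #include <stdint.h>

    typedef struct ILogger ILogger;

    struct ILoggerVtbl {
        /* reference counting, as COM's IUnknown prescribes */
        uint32_t (*AddRef)(ILogger *self);
        uint32_t (*Release)(ILogger *self);
        /* the interface's own method */
        int32_t  (*Log)(ILogger *self, const char *message);
    };

    struct ILogger {
        const struct ILoggerVtbl *vtbl;   /* first field: pointer to the vtable */
    };

    /* Any language that can lay out these structs and call through a function
     * pointer can implement or consume the interface:
     *     obj->vtbl->Log(obj, "hello");
     */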
The general idea is sound. The implementation less so.
I don't see that as a problem. C has been the bedrock of computing since the 1970s because it is the most minimal way of speaking to the hardware in a mostly portable way. Anything can be done in C, from writing hardware drivers, to GUI applications and scientific computing. In fact I deplore the day people stopped using C for desktop applications and moved to bloated, sluggish Web frameworks to program desktop apps. Today's desktop apps are slower than Windows 95 era GUI programs because of that.
Of some computing platforms.
Ok you're still missing the point. This isn't about C being good or bad or suitable or unsuitable. It's about whether it's good that C has, through no deliberate set of choices, ended up embodying the interface that lets us build rust that can be called by go.
Yes, because C is, by virtue of its history and central role in the development of all mainstream operating systems, the lowest common denominator.
Also, if I remember correctly, the first Rust and Go compilers were written in C.
Yes! It's easy to see why we got here, but that doesn't mean it's the optimal outcome!
OCaml was used for rust.
> Yes, because C is, by virtue of its history
Sure, history is great and all, but in C it's hard to reliably say "this int is 64 bits wide", because of the wobbly type system. Plus the whole historical baggage of not having 128-bit-wide ints. Or sane strings (not null-terminated).
> in C it's hard to reliably say "this int is 64 bits wide"
That isn't really a problem any more (since C99); you can define it as uint64_t.
But we have a ton of existing APIs that are defined using the wobbly types, so we're kind of stuck with them. And even new APIs use the wobbly types, because the author didn't reach for the fixed-width ones for whatever reason.
But that is far from the only issue.
128-bit ints are definitely a problem though; you don't even get agreement between different compilers on the same OS on the same hardware.
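To illustrate just how little agreement there is, here's roughly what portable-ish code has to do today (a sketch, hedging on compiler and standard version):

    #include <stdint.h>

    #if defined(__SIZEOF_INT128__)
    typedef unsigned __int128 u128;            /* GCC/Clang extension; MSVC has no equivalent */
    #elif __STDC_VERSION__ >= 202311L
    typedef unsigned _BitInt(128) u128;        /* C23 spelling, where the implementation supports 128 bits */
    #else
    typedef struct { uint64_t lo, hi; } u128;  /* roll your own everywhere else */
    #endif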
> I don't see that as a problem.
It kinda is. Because it was made in the 1970s, and it shows (cough null-terminated strings uncough).
Or, you know, having a 64-bit-wide integer. Reliably.
You did read the article, right?
VHDL vs Verilog is a good parallel from the chip world. VHDL was designed from ground up.
Verilog is loosely based on C. Most designs are done in Verilog.
VHDL tends to reign in European hardware companies.
I mean… an ABI is more like an agreement. It isn’t a specification. It’d be nice if everything was neatly specified and sound. But as the author notes near the end… there’s a lot of platforms and they all have their own quirks.
There has to be an ABI that everyone agrees on. Otherwise there wouldn't be any interoperability. And if we didn't have the System V ABI, what would we use instead? Prepare for a long debate as every language author, operating system designer, and platform under the sun argues for their respective demands and proposals. And as sure as the sun rises in the East, someone, somewhere, would write an article such as this one decrying that blessed ABI.
SystemV shouldn’t be the be all and end all, IMO. But progress should be incremental. Because a lingua franca loses its primary feature and utility when we all return to our own fiefdoms and stop talking to one another in the common tongue.
It’s a pain in the metaphorical butt. But it’s better, IMO, than the alternatives. It’s kind of neat that SystemV works so well let alone at all.
What languages does this author like that have a written spec?
This article isn't about languages. It's about the protocol for two or more languages to talk to each other. There is no specification for this.
The System V ABI is as close as we get to an actual specification but not everyone uses it and in any case it only covers a small part of the protocol.
> Anyone who spends much time trying to parse C(++) headers very quickly says “ah, actually, fuck that” and asks a C(++) compiler to do it.
That's exactly my case. For my programming language I wrote a tool to convert C headers using libclang. Even with the help of this library it wasn't that easy; I found a lot of caveats trying to convert headers like <windows.h>.
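In case it helps anyone walking the same path: the "ask the compiler" approach can start surprisingly small with libclang's C API. A minimal sketch ("header.h" is a placeholder; real use needs include paths and far more cursor kinds):

    #include <clang-c/Index.h>
    #include <stdio.h>

    static enum CXChildVisitResult visit(CXCursor c, CXCursor parent, CXClientData data) {
        (void)parent; (void)data;
        if (clang_getCursorKind(c) == CXCursor_FunctionDecl) {
            CXString name = clang_getCursorSpelling(c);
            printf("function: %s\n", clang_getCString(name));  /* print each function declaration found */
            clang_disposeString(name);
        }
        return CXChildVisit_Recurse;
    }

    int main(void) {
        CXIndex idx = clang_createIndex(0, 0);
        CXTranslationUnit tu = clang_parseTranslationUnit(
            idx, "header.h", NULL, 0, NULL, 0, CXTranslationUnit_None);
        if (!tu)
            return 1;
        clang_visitChildren(clang_getTranslationUnitCursor(tu), visit, NULL);
        clang_disposeTranslationUnit(tu);
        clang_disposeIndex(idx);
        return 0;
    }

The hard part, as you found, is everything the walk reveals: macros, bitfields, packed structs, calling-convention attributes, and the rest of <windows.h>.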
It's not that it was made this way to annoy you, it was all we had.
The whole world shouldn't "need to be fixed" because you won't spend the time to learn something.
Rust doesn't even have a stable internal ABI; that's why you have to recompile everything all the time.
Is someone paying for anti-C propaganda?? All these logic-breaking accusations...
E.g. here, from memory:
> ...you want to read 32 bits from a file but OH NOOES, long is 64 bits! The language! The impossibility!
But when you read something or deserialize some format, you just need to know the width from the format's schema or from domain knowledge. Simple and straightforward like that! You don't do "reflection" on what the language standard provides and then expect someone to send you exactly that!!
So that anti-C "movement" is mostly built on brainless examples.
Not saying C is perfect.
But it is very good, and I bet IBM and the other big corps will keep selling things written and actively developed in C/C++, plus hefty consulting fees.
In the meantime the proles have been advised to move to CPU-cycle-eating inferior languages, and to layers upon layers of cycle-burning infra in zero-privacy, guaranteed-data-leak clouds.
Oh, by the way, that famous Java "bean" is just an object, usually made of language-provided "basic types"... How is that poor programmer from the article supposed to know what to read from disk when all he has are the types Java provides?? How? Or maybe he should use some domain knowledge, or a schema for the problem he is trying to solve??
And in a "scripting language" with automatic ints - how do you even know how many bits the runtime/VM actually uses? Maybe some reflection to check the type? But again, how does that even help if there is no knowledge in your brain of how many bits should be read?? Still, calling some cycle-burning reflection, or the most virtual and indirect thing possible, is what the fat tigers love the most :)
> C was elevated to a role of prestige and power
We didn't do it to annoy you or to foist bad APIs on you. We did it because it was the best language for writing machine code at the time. By miles. Not understanding why this is true will lead you to make all the same mistakes the languages "bested" by C made.
>Phantomderp and I
>Furry avatar
>"About Me I am Aria Desires, Gankra, and a cat."
I seem to recall reading this before, I must have not noticed this furry stuff because I would have ignored it.
My take was that this is a Rustacean who was having trouble porting something to C, then went down a deep rabbit hole of traditional C software, and instead of recognizing that perhaps they were in way over their head, concluded that the problem was C itself, that it's all wrong, and that therefore their mission of porting stuff to Rust is even more virtuous.
This is just an ad hominem attack. Doesn't seem like the author is "in over their head"; they seem to have a pretty solid grasp of actual identifiable gaps between implementations and various specs, and the article was written with the same kind of "chastising" tone as you would see from any grey-bearded hacker who's unsatisfied with the way things are.
I hate to make judgments without a mountain of evidence, but a cursory glance of their about page honestly made me think this person likely suffers from a multitude of mental health issues.
I think we can be accommodating of a wide array of diagnosable mental conditions in software. I'm thinking of Terry Davis's TempleOS, or Ken Reitz's Requests.
Being upfront about it by authors dispels a lot of the potential tension and substantially changes the way we interact. I understand there may be a conflict and not everyone will want to advertise their diagnosis, but in my experience once it becomes clear that's what's going on, it helps all the parties involved.
It is exactly as you describe.
I try not to put much stock in black-and-white opinions because I think the answer is rarely that simple.
I always thought that C was a stepping stone to learn other languages. Like Pascal, it was educational to learn. My Comp Sci courses in 1986-1990 used Turbo Pascal and Turbo C.
C was never a gateway to any flavor of Pascal, a "police state language".
It's not a coincidence that Rust was invented in 1984 by some animals on a farm! Stay in thy lane, programmer!
I think so too; for most devs C is like Latin, or Roman Law: not something we develop and use, but rather something we learn for context and to understand future developments.
There are some people who still develop in C for sure, but it's limited to FOSS and embedded at this point, low-level proprietary systems having mostly migrated to C++ or Rust.
I agree with the main thesis that C isn't a language like the others, something that we actively practice; it's mostly an ancient but highly influential language, and an API/ABI.
What I disagree with is that 'critiquing' C is productive in the same way that critiquing Roman Law or Latin or Plato is productive, the horse is dead, one might think they are being clever or novel for finding flaws in the dead horse, but it's more often a defense mechanism to justify having a hard time learning the decades of backwards compatibility, edge cases and warts that have been developed.
It's easier to think of the previous generation as being dumb and having made mistakes that could have been fixed, and that it all could be simpler, rather than recognize that engineering is super complex and that we might as well dedicate our full life to learning this craft and still not make a dent.
I applaud the new generation for taking on this challenge and giving their best shot at the revolution, but I'm personally thinking of bridging the next-next generation and the previous generation of devs. The historical complexity of the field will increase linearly with time, and I think if we pace ourselves we can keep the complexity down; the more times we hop onto a revolution that disregards the previous generation as dumb, the bigger the complexity is going to be.
There is also still a lot of low-level proprietary code developed in C. I would guess far more than what is developed in Rust.
I fully agree with your last point. The proposed solutions to some of the deficiencies of C are sometimes worse than the disease, while their benefits are often exaggerated, all while adding unnecessary layers of complexity that will haunt us for decades. In contrast, my hope would be to carefully revise the things we have, but this takes time and patience.
TL;DR: the author loves Rust, and writing a subset of a C compiler is hard.
Except the author has moved on from Rust, and is still fighting the ABI hellscape in a different language: Swift.
Which has a standard ABI.