I recently started a pet project using modules in MSVC, the compiler that at present has the best support for modules, and ran into a compiler bug where it didn't know how to compile my code and asked me to "change the code around this line".
So no, modules aren't even here, let alone to stay.
Never mind using modules in an actual project when I could repro a bug so easily. The people preaching modules must not be using them seriously, or otherwise I simply do not understand what weed they are smoking. I would be happy to stand corrected, however.
I still hope that modules become mature and safe for production code. Initially I coded in C/C++, and this header #include/#ifndef approach seemed OK at that time. But after using other programming languages, this approach started to feel too archaic. No sane programming language should require a duplication in order to export something (for example, the full function and its prototype); you should write something once and export it easily.
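For what it's worth, here is a minimal sketch of the duplication in question versus the module form (the add function, the math module name, and the .ixx extension, which is an MSVC convention, are all invented for illustration):

    // add.h -- the prototype, repeated for every consumer
    int add(int a, int b);

    // add.cpp -- the same signature again, plus the body
    #include "add.h"
    int add(int a, int b) { return a + b; }

    // math.ixx -- with a C++20 module you write it once and mark it export
    export module math;
    export int add(int a, int b) { return a + b; }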
> No sane programming language should require a duplication in order to export something (for example, the full function and its prototype)
You are spoiled by the explosive growth of open source and the ease of accessing source code. Lots of closed source commercial libraries provide some .h files and a .so file. And even when open source, when you install a library from a package from a distribution or just a tarball, it usually installs some .h files and a .so file.
The separation between interface and implementation into separate files was a good idea. The idea seemed to be going out of vogue but it’s still a good idea.
> Lots of closed source commercial libraries provide some .h files and a .so file.
I'm mostly talking about modules for internal implementation, which is likely to be the bulk of the exports. Yes, it's understandable that exporting something from dll / so files for external executables is more complicated, also because of ABI compatibility concerns (we use things like extern "C"). So yes, the header approach might be justified in that case, but as I stated, such exports are probably a fraction of all exports (if they are needed at all). I'll still prefer modules wherever headers can be avoided.
> The separation between interface and implementation into separate files was a good idea. The idea seemed to be going out of vogue but it’s still a good idea.
However, as soon as you do C++, that goes away. With C++ you need the implementation of templates available to the consumer (except in cases with a limited set of types where you can extern-instantiate them), and in many cases you get many small functions (basic operator implementations, begin()/end() for iterators in all variations, etc.) which benefit from inlining and thus need to be in the header.
Oh, and did I mention class declarations and the class size? Or, more generally, and even with plain C: as soon as the client should know the size of a type (to be able to allocate it, have an array of those, etc.) you can't provide the size by itself; you have to provide the full type declaration, with all types down the rabbit hole, until somewhere you introduce a pointer-to-opaque-type indirection.
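A minimal sketch of that pointer-to-opaque-type indirection, with an invented widget type; the client only ever sees a pointer, never the size:

    // widget.h -- clients see only an incomplete type
    struct widget;
    widget* widget_create(void);
    void    widget_destroy(widget* w);

    // widget.cpp -- the full definition (and thus the size) stays here
    struct widget { int id; double data[64]; };
    widget* widget_create(void)       { return new widget{}; }
    void    widget_destroy(widget* w) { delete w; }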
And then there are macros ...
Modules attempt to do that better, by providing just the interface in a file. But hey, the C++ standard doesn't "know" about those, so module interface files aren't a portable thing ...
In most situations, auto-generating the equivalent of .h files for a library based on export statements in the source code would be fine and a useful simplification.
I think everyone hopes/hoped for a sane and useful version of modules, one that would provide substantial improvements to compilation speed and make things like packaging libraries and dealing with dependencies a lot more sane.
The version of modules that got standardized is anything but that. It's an incredibly convoluted mess that requires an enormous amount of effort for little benefit.
> It's an incredibly convoluted mess that requires an enormous amount of effort for little benefit.
I'd say C++ as a whole is a complete mess. While it's powerful (including OOP), it's a complicated and inconsistent language with a lot of historical baggage (40+ years). That's why people and companies still search for (or already use) viable replacements for C++, such as Rust, Zig, etc.
Modules are still in the early adopter phase, despite 3 years. There are unfortunately bugs, and we still need people to write the "best practices for C++ modules" books. Everyone who has used them says overall that they are good things and worth learning, but there is a lot about using them well that we haven't figured out.
Best practice for C++ modules: avoid.
(Buy my book)
Modules have been working reasonably well in clang for a while now but MSVC support is indeed buggy.
They are using modules in the MS Office team:
https://devblogs.microsoft.com/cppblog/integrating-c-header-...
This is untrue. The MS Office team is using a non-standard MSVC compiler flag that turns standard #include into header units, which treats those header files in a way similar to precompiled header files. This requires no changes to source code, except for some corner cases they mention in that very blog post to work around some compiler quirks.
That is not the same as using modules, which they have not done.
Here’s the thing I don’t get about module partitions: They only seem to allow one level of encapsulation.
whereas in module systems that support module visibility, like Rust's, you can decompose your program at multiple abstraction levels. Maybe I am missing something, but it seems like you will have to rely on discipline and documentation to enforce clean code layering in C++. Rust's re-exports also let you design your public module structure separately from your internal structure.
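For concreteness, a rough sketch of that single level, with invented module and partition names (partitions cannot contain further partitions and are not importable from outside their module):

    // net.ixx -- primary module interface
    export module net;
    export import :sockets;        // re-export the partition to consumers

    // net-sockets.ixx -- interface partition; this is the only nesting level available
    export module net:sockets;
    export int open_socket();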
Like most languages with modules.
Rust, Modula-2 and Ada are probably the only ones with module nesting.
Notably many languages in ML family have first class modules.
Only Standard ML and OCaml, as far as I am aware.
However, these are a different kind of module, being part of the type system and manipulated via functors.
I don't think you're missing something. The standards committee made a bad call with "no submodules", ran into insurmountable problems, and doubled down on the bad call via partitions.
"Just one more level bro, I swear. One more".
I fully expect to sooner or later see a retcon on why really, two is the right number.
Yeah, I'm salty about this. "Submodules encourage dependency messes" is just trying to fix substandard engineering across many teams via enforcement of somewhat arbitrary rules. That has never worked in the history of programming. "The determined Real Programmer can write FORTRAN programs in any language" is still true.
The C++ committee tries to design features with room for future extension. They believe that whatever you want from sub-modules is still possible in the future - but better to have a small (as if modules were small) thing now than to try for perfection. We can argue about submodules once we have the easy cases working and hopefully better understand the actual limitations.
Not to put too fine a point on it: The world has 35 years of experience with submodules. It's not rocket science. The committee just did what committees do.
And sure, "future extension" is nice. But not if the future arrives at an absolutely glacial pace and is technically more like the past.
This may be inevitable given the wide spread of the language, but it's also what's dooming the language to be the next COBOL. (On the upside, that means C++ folks can write themselves a yacht in retirement ;)
From the outside looking in, this all feels like too little too late. Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements… one day, and the committees seem unable to keep everyone happy or to disappoint even one stakeholder. C++ will be around forever, but will it be primarily legacy?
Yes. Unfortunately the committee has completely abandoned safety at this point. Even memory/thread safety profiles have been indefinitely postponed. The latest ghost safety lifetimes thing is completely unimplementable.
There literally isn't a plan or direction in place to add any way to compete with Rust in the safety space currently. They've got maybe until C++29 to standardise lifetimes, and then C++ will transition to a legacy language.
https://arewemodulesyet.org/ gives you an overview of which libraries already provide a module version.
Wow, the way this data is presented is hilarious.
Log scale: Less than 3% done, but it looks like over 50%.
Estimated completion date: 10 March 2195
It would be less funny if they used an exponential model for the completion date to match the log scale.
Yeah, my personal opinion is that modules are dead on arrival, but I won't waste my time arguing with C++ enthusiasts on that.
C++ templates and metaprogramming are fundamentally incompatible with the idea of your code being treated as modules.
The current solution chosen by compilers is to basically have a copy of your code for every dependency that wants to specialize something.
For template heavy code, this is a combinatorial explosion.
D has best-in-class templates and metaprogramming, and modules. It works fine.
I think that SFINAE and, to a lesser extent, concepts are fundamentally a bit odd when multiple translation units are involved, but otherwise I don’t see the problem.
It’s regrettable that the question of whether a type meets the requirements to call some overload or to branch in a particular if constexpr expression, etc, can depend on what else is in scope.
It has worked perfectly fine while using VC++, minus the usual ICEs that still come up.
It works perfectly when it comes to `import std` and making things a bit easier.
It does not work very well at all if your goal is to port your current large codebase to incrementally use modules to save on compile time and intermediate code size.
Office has made a couple of talks about their modules migration, which is exactly that use case.
Can someone using modules chime in on whether they’ve seen build times improve?
We did see build time improvements from deploying modules at Meta.
import std; is an order of magnitude faster than using the STL individually, if that's evidence enough for you. It's faster than #include <iostream> alone.
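For anyone who hasn't tried it, the usage being measured is just this (it needs C++23 and a compiler/standard-library combination that actually ships the std module):

    import std;   // the entire standard library, as one module

    int main() {
        std::cout << "hello modules\n";   // no #include <iostream> needed
        return 0;
    }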
Chuanqi says "The data I have obtained from practice ranges from 25% to 45%, excluding the build time of third-party libraries, including the standard library."[1]
[1]: https://chuanqixu9.github.io/c++/2025/08/14/C++20-Modules.en...
Yeah, but now compare this to pre-compiled headers. Maybe we should be happy with getting a standard way to have pre-compiled std headers, but now my build has a "scanning" phase which takes up some time.
The fact that precompiled headers are nearly as good for a much smaller investment tells you most of what you need to know, imo.
why use modules if PCH on your diagram is not much worse in compile times?
Macro hygiene, static initialization ordering, control over symbol export (no more detail namespaces), slightly higher ceiling for compile-time and optimization performance.
If these aren't compelling, there's no real reason.
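A small sketch of the symbol-export point, with invented names: anything not marked export in a module simply isn't visible to importers, where a header-based library would have needed a detail namespace by convention.

    // mylib.ixx -- invented module for illustration
    export module mylib;

    int helper() { return 42; }            // not exported: importers can't name it
    export int public_api() { return helper(); }

    // consumer.cpp
    import mylib;
    int ok = public_api();                 // fine
    // int no = helper();                  // error: helper is not visible here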
Having implemented PCH for C and C++, it is an uuugly hack, which is why D has modules instead.
Modules are the future and the rules for them are well thought out. Every compiler has its own version of PCH and they all work differently in annoying ways.
Modules are the future... and will always be the future.
I can’t deploy C++ modules to any of the hardware I use in the shop. Probably won’t change in the near-to-mid future.
It seems likely I’ll have to move away from C++, or perhaps more accurately it’s moving away from me.
If your tools are not updated, that isn't the fault of C++. You will feel the same about Rust when forced to use a 15-year-old version too (as I write this, Rust 1.0 is only 10 years old). Don't whine to me about these problems, whine to your vendors until they give you the new stuff.
> If your tools are not updated, that isn't the fault of C++.
It kinda is. The C++ committee has been getting into a bad habit of dumping lots of not-entirely-working features into the standard and ignoring implementer feedback along the way. See https://wg21.link/p3962r0 for the incipient implementer revolt going on.
Even some much simpler things are extremely half baked. For example, here’s one I encountered recently:

    alignas(16) char buf[128];

What type is buf? What alignment does that type have? What alignment does buf have? Does the standard even say that alignof(buf) is a valid expression? The answers barely make sense.

Given that this is the recommended replacement for aligned_storage, it’s kind of embarrassing that it works so poorly. My solution is to wrap it in a struct so that at least one aligned type is involved and so that static_assert can query it.
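A sketch of that wrap-it-in-a-struct workaround (names are illustrative): wrapping the buffer gives you a type whose alignment alignof and static_assert can actually query.

    struct AlignedBuf {
        alignas(16) char data[128];
    };

    static_assert(alignof(AlignedBuf) >= 16, "member alignas propagates to the struct");
    static_assert(sizeof(AlignedBuf) == 128, "no padding: 128 is a multiple of 16");

    AlignedBuf buf;   // buf.data is 16-byte aligned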
It's happening again with contracts. Implementers are raising implementability objections that are being completely ignored. Senders and receivers are claimed to work great on a GPU, but without significant testing (there's only one super basic CUDA implementation), and even a basic examination shows that they won't work well.
So many features are starting to land that feel increasingly DoA; we seriously need a language fork.
It actually is, when one of the main arguments people use to stick with C++ is that it "runs everywhere". After all, what use is there for a C++ where the vast majority of the library ecosystem only works with a handful of major compilers? If compatibility with a broad legacy ecosystem isn't important, there are far more attractive languages these days!
Just like Python was to blame for the horrible 2-to-3 switch, C++ is to blame for the poor handling of modules. They shouldn't have pushed through a significant backwards-incompatible change if the wide variety of vendor toolchains wasn't willing to adopt it.
Nobody is "whining" to you. Nobody is mentioning Rust. Your tone is way too sharp for this discussion.
My experience with vendor toolchains is that they generally suck anyway. In a recent bare metal project I chose not to use the vendor's IDE and toolchain (which is just an old version of GCC with some questionable CMake scripts around it) and instead just cross compile with Rust manually. And so far it's been a really good decision.
Yep, this aligns with my experience. I’ve yet to take the plunge into cross compiling with rust though, might have to try that.
> whine to your vendors until they give you the new stuff.
How well does this usually work, by the way?
If C++ libraries eschew backward compatibility to chase after build time improvements, that’s their design decision. I’ll see an even greater build time improvement than they do (because I won’t be able to build their code at all).
This is not an argument against modules. This is an argument against allowing areas that don’t upgrade to hold modern C++ back.
> auto main() -> int {
Dude…
In my opinion this syntax is super good: it allows all function/method names to start at the same column, which makes the code way easier to read, a huge readability improvement imo. Sadly nobody uses this, and you still have the classic way, so there are multiple ways to do the same thing…
This style is used in {fmt} and is great for documentation, especially on smaller screens: https://fmt.dev/12.0/api/#format_to_n
This has been valid C++ since C++11.
As someone who quit c++ over 15 years ago it's been comical to watch what this language has become.
i was sincerely hoping i could get

to work, but alas it seems c++ threw pre-ansi argument type declarations out.

> c++ threw pre-ansi argument type declarations out

they never were in C++.
It's like calling a Ford Mustang Mach-E the "Model T++."
It's been the go-to syntax for 15 years now
Go-to? I've never seen a project use it, I've only ever seen examples online.
Same here
Now I haven't touched C++ in probably 15 years but the definition of main() looks confused:
> auto main() -> int
Isn't that declaring the return type twice, once as auto and the other as int?
No. The auto there is doing some lifting so that you can declare the type afterwards. The return type is only defined once.
There is, however, a return type auto-deduction in recent standards iirc, which is especially useful for lambdas.
https://en.cppreference.com/w/cpp/language/auto.html
auto f() -> int; // OK: f returns int
auto g() { return 0.0; } // OK since C++14: g returns double
auto h(); // OK since C++14: h’s return type will be deduced when it is defined
What about
auto g() -> auto { return 0.0; }
I really wish they had used func instead; it would have saved this confusion and allowed "auto type deduction" to be a smaller, more self-contained feature.
the standard c++ committee is extremely resistant to introducing new keywords such as "func", so as not to break reams of existing code.
And their code example doesn't actually return a value!
For main it's explicitly allowed by the standard, and no return is equivalent to return 0.
> C includes show its age.

But C++ is stagnating not because there is a "++" there but because there is a "C".