C++ Modules Are Here to Stay

Posted by faresahmed 5 days ago


Comments

Comment by yunnpp 5 hours ago

I recently started a pet project using modules in MSVC, the compiler that at present has the best support for modules, and ran into a compiler bug where it didn't know how to compile my code and asked me to "change the code around this line".

So no, modules aren't even here, let alone to stay.

Never mind using modules in an actual project when I could repro a bug so easily. The people preaching modules must not be using them seriously, or otherwise I simply do not understand what weed they are smoking. I would very much appreciate to stand corrected, however.

Comment by senfiaj 4 hours ago

I still hope that modules become mature and safe for production code. Initially I coded in C/C++ and this header #include/#ifndef approach seemed OK at that time. But after using other programming languages, this approach started to feel too boilerplate and archaic. No sane programming language should require a duplication in order to export something (for example, the full function and its prototype), you should write something once and easily export.

Comment by Maxatar 3 hours ago

I think everyone hopes/hoped for a sane and useful version of modules, one that would provide substantial improvements to compilation speed and make things like packaging libraries and dealing with dependencies a lot more sane.

The version of modules that got standardized is anything but that. It's an incredibly convoluted mess that requires an enormous amount of effort for little benefit.

Comment by senfiaj 2 hours ago

> It's an incredibly convoluted mess that requires an enormous amount of effort for little benefit.

I'd say C++ as a whole is a complete mess. While it's powerful (including OOP), it's a complicated and inconsistent language with a lot of historical baggage (40+ years). That's why people and companies still search for (or even already use) viable replacements for C++, such as Rust, Zig, etc.

Comment by kccqzy 3 hours ago

> No sane programming language should require a duplication in order to export something (for example, the full function and its prototype)

You are spoiled by the explosive growth of open source and the ease of accessing source code. Lots of closed source commercial libraries provide some .h files and a .so file. And even when open source, when you install a library from a package from a distribution or just a tarball, it usually installs some .h files and a .so file.

The separation between interface and implementation into separate files was a good idea. The idea seemed to be going out of vogue but it’s still a good idea.

Comment by senfiaj 3 hours ago

> Lots of closed source commercial libraries provide some .h files and a .so file.

I'm mostly talking about modules for internal implementation, which is likely to be the bulk of the exports. Yes, it's understandable that exporting something from dll / so files for external executables is more complicated, partly because of ABI compatibility concerns (we use things like extern "C"). So yes, the header approach might be justified in that case, but as I stated, such exports are probably a fraction of all exports (if they are needed at all). I'll still prefer modules wherever it's possible to avoid headers.

Comment by AgentME 3 hours ago

In most situations, auto-generating the equivalent of .h files for a library based on export statements in the source code would be fine and a useful simplification.

Comment by johannes1234321 2 hours ago

> The separation between interface and implementation into separate files was a good idea. The idea seemed to be going out of vogue but it’s still a good idea.

However, as soon as you do C++ that goes away. With C++ you need the implementation of templates available to the consumer (except cases with a limited set of types, where you can extern them), and in many cases you get many small functions (basic operator implementations, begin()/end() for iterators in all variations, etc.) which benefit from inlining and thus need to be in the header.

Oh, and did I mention clients needing class declarations just to know the class size? Or, more generally, and even with plain C: as soon as the client should know the size of a type (for being able to allocate it, have an array of those, etc.) you can't provide the size by itself; you have to provide the full type declaration, with all its member types down the rabbit hole, until you somewhere introduce a pointer-to-opaque-type indirection.

And then there are macros ...

Modules attempt to do that better, by providing just the interface in a file. But hey, C++ standard doesn't "know" about those, so module interface files aren't a portable thing ...

Comment by bluGill 3 hours ago

Modules are still in the early adopter phase, despite 3 years. There are unfortunately bugs, and we still need people to write the "best practices for C++ modules" books. Everyone who has used them says overall they are good things and worth learning, but there is a lot about using them well that we haven't figured out.

Comment by alextingle 3 hours ago

Best practice for C++ modules: avoid.

(Buy my book)

Comment by malfmalf 3 hours ago

They are using modules in the MS Office team:

https://devblogs.microsoft.com/cppblog/integrating-c-header-...

Comment by Maxatar 3 hours ago

This is untrue. The MS Office team is using a non-standard MSVC compiler flag that turns standard #include into header units, which treats those header files in a way similar to precompiled header files. This requires no changes to source code, except for some corner cases they mention in that very blog post to work around some compiler quirks.

That is not the same as using modules, which they have not done.

Comment by starfreakclone 54 minutes ago

There's nothing non-standard happening there. The compiler is allowed to translate #include -> import. Here's the standardese expressing that: https://eel.is/c%2B%2Bdraft/cpp.include#10.

I do agree, it's not _exactly_ the same as using _named modules_, but header units share an almost identical piece of machinery in the compiler as named modules. This makes the (future planned) transition to named modules a lot easier since we know the underlying machinery works.

The actual blocker for named modules is not MSVC, it's other compilers catching up--which clang and gcc are doing quite quickly!

Comment by vitaut 3 hours ago

Modules have been working reasonably well in clang for a while now but MSVC support is indeed buggy.

Comment by throw_sepples 1 hour ago

I'm afraid things will continue very much sucking for a long time and will still be less-than even when they become broadly supported since sepples programmers, being real programmers™, are not entitled to have nice things.

Comment by rienbdj 5 hours ago

From the outside looking in, this all feels like too little too late. Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements… one day, and the committee seems unable to keep everyone happy without disappointing some stakeholder. C++ will be around forever, but will it be primarily legacy?

Comment by 20k 3 hours ago

Yes. Unfortunately the committee has completely abandoned safety at this point. Even memory/thread safety profiles have been indefinitely postponed. The latest ghost safety lifetimes thing is completely unimplementable

There literally isn't a plan or direction in place to add any way to compete with Rust in the safety space currently. They've got maybe until c++29 to standardise lifetimes, and then C++ will transition to a legacy language

Comment by direwolf20 1 hour ago

Using containers and std::string for everything eliminates the majority of safety bugs.

Comment by pornel 35 minutes ago

The safety bar is way way higher.

The C++ WG keeps looking down at C and the old C++ sins, sees their unsafety, and still thinks that's the problem to fix.

Rust looks the same way at modern C++. The std collections and smart pointers already existed before the Rust project was started. Modern C++ is the safety failure that motivated the creation of Rust.

Comment by reactjs_ 6 hours ago

Here’s the thing I don’t get about module partitions: They only seem to allow one level of encapsulation.

    Program
    - Module
      - Module Partition
whereas in module systems that support module visibility, like Rust’s, you can decompose your program at multiple abstraction levels:

    Program
    - Private Module
      - Private Module
        - Private Module
        - Public Module
      - Public Module
Maybe I am missing something. It seems like you will have to rely on discipline and documentation to enforce clean code layering in C++.
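For concreteness, the flat structure looks roughly like this in C++20 syntax (a sketch, not buildable without a modules-aware toolchain; the file names and contents are illustrative):

```cpp
// math.ixx -- primary module interface unit
export module math;
export import :geometry;    // surface the partition to importers

// math-geometry.ixx -- a partition; partitions cannot themselves
// contain partitions, so the hierarchy stops at this one level.
export module math:geometry;
export int area(int w, int h) { return w * h; }
```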

Comment by pdpi 5 hours ago

Rust's re-exports also allow you to design your public module structure separate from your internal structure.

Comment by pjmlp 4 hours ago

Like most languages with modules.

Rust, Modula-2 and Ada are probably the only ones with module nesting.

Comment by IsTom 4 hours ago

Notably many languages in ML family have first class modules.

Comment by pjmlp 3 hours ago

Only Standard ML and OCaml, as far as I am aware.

However this is a different kind of modules, with them being present on the type system, and manipulated via functors.

Comment by groby_b 5 hours ago

I don't think you're missing something. The standards committee made a bad call with "no submodules", ran into insurmountable problems, and doubled down on the bad call via partitions.

"Just one more level bro, I swear. One more".

I fully expect to sooner or later see a retcon on why really, two is the right number.

Yeah, I'm salty about this. "Submodules encourage dependency messes" is just trying to fix substandard engineering across many teams via enforcement of somewhat arbitrary rules. That has never worked in the history of programming. "The determined Real Programmer can write FORTRAN programs in any language" is still true.

Comment by bluGill 3 hours ago

The C++ committee tries to design features with room for future extension. They believe that whatever you want from sub-modules is still possible in the future, but it's better to have a small (as if modules are small) thing now than to try for perfection. We can argue about submodules once we have the easy cases working and hopefully better understand the actual limitations.

Comment by pornel 1 hour ago

Just getting to this barely-working state took C++ longer than it took to create all of Rust, including a redesign of Rust's own module system.

Comment by groby_b 2 hours ago

Not to put too fine a point on it: The world has 35 years of experience with submodules. It's not rocket science. The committee just did what committees do.

And sure, "future extension" is nice. But not if the future arrives at an absolutely glacial pace and is technically more like the past.

This may be inevitable given the wide spread of the language, but it's also what's dooming the language to be the next COBOL. (On the upside, that means C++ folks can write themselves a yacht in retirement ;)

Comment by pklausler 2 hours ago

FWIW, Fortran does have submodules.

Comment by groby_b 1 hour ago

I suppose we shall amend to "The determined Real Programmer will fix FORTRAN" ;)

But, for the folks who didn't grow up with the Real Programmer jokes, this is rooted in a context of FORTRAN 77. Which was, uh, not famous for its readability or modularity. (But got stuff done, so there's that)

Comment by pklausler 42 minutes ago

I'm so old, those jokes were about me.

Comment by w4rh4wk5 5 hours ago

https://arewemodulesyet.org/ gives you an overview which libraries already provide a module version.

Comment by srcreigh 5 hours ago

Wow, the way this data is presented is hilarious.

Log scale: Less than 3% done, but it looks like over 50%.

Estimated completion date: 10 March 2195

It would be less funny if they used an exponential model for the completion date to match the log scale.

Comment by w4rh4wk5 3 hours ago

Yeah, my personal opinion is that modules are dead on arrival, but I won't waste my time arguing with C++ enthusiasts on that.

Comment by mcdeltat 2 hours ago

Nah, I'm a C++ (ex?) enthusiast and modules are cool, but there's only so many decades you can wait for a feature other languages have had from day 1, and then another decade for compilers to actually implement it in a usable manner.

Comment by w4rh4wk5 1 hour ago

I am fine with waiting for a feature and using it when it's here. But at this point, I feel like C++ modules are a ton of complexity for users, tools, and compilers to wrangle... for what? Slightly faster compile times than PCH? Less preprocessor code in your C++.. maybe? Doesn't seem worth it to me in comparison.

Comment by ziml77 1 hour ago

I would think they don't want to hear that because of how badly they want modules to happen. Don't kill their hope!

Comment by fasterik 2 hours ago

I get by without modules or header files in my C++ projects by using the following guidelines:

- Single translation unit (main.cpp)

- Include all other cpp files in main

- Include files in dependency order (no forward declarations)

- No circular dependencies between files

- Each file has its own namespace (e.g. namespace draw in draw.cpp)

This works well for small to medium sized projects (on the order of 10k lines). I suspect it will scale to 100k-1M line projects as long as there is minimal use of features that kill compile times (e.g. templates).

Comment by zabzonk 2 hours ago

This might be OK for someone using your files (a bit like a header-only library) but not so great for team development.

Comment by w4rh4wk5 1 hour ago

You still organize the big file into sections to keep things together that are semantically related. For Git it mostly doesn't matter whether it's 100 small files or a single big one.

Comment by indil 2 hours ago

I believe that's called a unity build. Really nice speedup.

Comment by zabzonk 2 hours ago

SQLite calls this an "amalgamation". It is easy and convenient for users (not developers) of SQLite code.

https://sqlite.org/amalgamation.html

Comment by fooker 5 hours ago

C++ templates and metaprogramming is fundamentally incompatible with the idea of your code being treated in modules.

The current solution chosen by compilers is to basically have a copy of your code for every dependency that wants to specialize something.

For template heavy code, this is a combinatorial explosion.

Comment by WalterBright 4 hours ago

D has best-in-class templates and metaprogramming, and modules. It works fine.

Comment by amluto 3 hours ago

I think that SFINAE and, to a lesser extent, concepts is fundamentally a bit odd when multiple translation units are involved, but otherwise I don’t see the problem.

It’s regrettable that the question of whether a type meets the requirements to call some overload or to branch in a particular if constexpr expression, etc, can depend on what else is in scope.

Comment by direwolf20 1 hour ago

This is one of those wicked language design problems that comes up again and again across languages, and they solve it in different ways.

In Haskell, you can't ever check that a type doesn't implement a type class.

In Golang, a type can only implement an interface if the implementation is defined in the same module as the type.

In C++, in typical C++ style, it's the wild west: the compiler doesn't put guard rails on, and it does what you would expect if you think about how the compiler works, which probably isn't what you want.

I don't know what Rust does.

Comment by pornel 16 minutes ago

Rust's generics are entirely type-based, not syntax-based. They must declare all the traits (concepts) they need. The type system has restrictions that prevent violating ODR. It's very reliable, but some use-cases that would be basic in C++ (numeric code) can be tedious to define.

Generic code is stored in libraries as MIR, which is halfway between an AST and LLVM IR. It's still monomorphized and slow to optimize, but at least it doesn't pay the reparsing cost.

Comment by direwolf20 2 minutes ago

How does it handle an implementation of a trait being in scope in one compilation unit and out of scope in another? That's the wicked problem.

Comment by direwolf20 1 hour ago

The compiler is supposed to put the template IR into the compiled module file, isn't it?

Comment by fooker 1 hour ago

Exactly, that's no better than #including transitive dependencies to compile large translation units.

Comment by pjmlp 4 hours ago

It has worked perfectly fine while using VC++, minus the usual ICEs that still come up.

Comment by fooker 4 hours ago

It works perfectly when it comes to `import std` and making things a bit easier.

It does not work very well at all if your goal is to port your current large codebase to incrementally use modules to save on compile time and intermediate code size.

Comment by pjmlp 3 hours ago

Office has made a couple of talks about their modules migration, which is exactly that use case.

Comment by jokoon 2 hours ago

I am curious to know if that 8.6x speedup is consistent.

I don't see many "fair" benchmarks about this, but I guess it is probably difficult to properly benchmark module compilation, as the results can depend a lot on the case.

If modules can reach that sort of speedup consistently, it's obviously great news.

Comment by cmovq 6 hours ago

Can someone using modules chime in on whether they’ve seen build times improve?

Comment by nickelpro 5 hours ago

import std; is an order of magnitude faster than using the STL individually, if that's evidence enough for you. It's faster than #include <iostream> alone.

Chuanqi says "The data I have obtained from practice ranges from 25% to 45%, excluding the build time of third-party libraries, including the standard library."[1]

[1]: https://chuanqixu9.github.io/c++/2025/08/14/C++20-Modules.en...

Comment by luke5441 5 hours ago

Yeah, but now compare this to pre-compiled headers. Maybe we should be happy with getting a standard way to have pre-compiled std headers, but now my build has a "scanning" phase which takes up some time.

Comment by direwolf20 1 hour ago

Modules are a lot like precompiled headers, but done properly and not as a hack.

Comment by vitaut 3 hours ago

We did see build time improvements from deploying modules at Meta.

Comment by Night_Thastus 5 hours ago

The fact that precompiled headers are nearly as good for a much smaller investment tells you most of what you need to know, imo.

Comment by feelamee 6 hours ago

why use modules if PCH on your diagram is not much worse in compile times?

Comment by nickelpro 5 hours ago

Macro hygiene, static initialization ordering, control over symbol export (no more detail namespaces), slightly higher ceiling for compile-time and optimization performance.

If these aren't compelling, there's no real reason.

Comment by WalterBright 4 hours ago

Having implemented PCH for C and C++, it is an uuugly hack, which is why D has modules instead.

Comment by bluGill 5 hours ago

modules are the future and the rules for them are well thought out. Every compiler has its own version of PCH and they all work differently in annoying ways.

Comment by Maxatar 3 hours ago

Modules are the future... and will always be the future.

Comment by TimorousBestie 5 hours ago

I can’t deploy C++ modules to any of the hardware I use in the shop. Probably won’t change in the near-to-mid future.

It seems likely I’ll have to move away from C++, or perhaps more accurately it’s moving away from me.

Comment by bluGill 5 hours ago

If your tools are not updated, that isn't the fault of C++. You will feel the same about Rust when forced to use a 15-year-old version too (as I write this, Rust 1.0 is only 10 years old). Don't whine to me about these problems; whine to your vendors until they give you the new stuff.

Comment by jcranmer 4 hours ago

> If you tools are not updated that isn't the fault of C++.

It kinda is. The C++ committee has been getting into a bad habit of dumping lots of not-entirely-working features into the standard and ignoring implementer feedback along the way. See https://wg21.link/p3962r0 for the incipient implementer revolt going on.

Comment by 20k 3 hours ago

It's happening again with contracts. Implementers are raising implementability objections that are being completely ignored. Senders and receivers are claimed to work great on a GPU, but without significant testing (there's only one super basic CUDA implementation), and even a basic examination shows that they won't work well.

So many features are starting to land which feel increasingly DoA, we seriously need a language fork

Comment by direwolf20 1 hour ago

Please make one.

Comment by amluto 3 hours ago

Even some much simpler things are extremely half baked. For example, here’s one I encountered recently:

    alignas(16) char buf[128];
What type is buf? What alignment does that type have? What alignment does buf have? Does the standard even say that alignof(buf) is a valid expression? The answers barely make sense.

Given that this is the recommended replacement for aligned_storage, it’s kind of embarrassing that it works so poorly. My solution is to wrap it in a struct so that at least one aligned type is involved and so that static_assert can query it.

Comment by crote 4 hours ago

When one of the main arguments people use to stick to C++ is that it "runs everywhere", it actually is. After all, what use is there for a C++ where the vast majority of the library ecosystem only works with the handful of major compilers? If compatibility with a broad legacy ecosystem isn't important, there are far more attractive languages these days!

Just like Python was to blame for the horrible 2-to-3 switch, C++ is to blame for the poor handling of modules. They shouldn't have pushed through a significant backwards-incompatible change if the wide variety of vendor toolchains wasn't willing to adopt it.

Comment by krior 5 hours ago

Nobody is "whining" to you. Nobody is mentioning rust. Your tone is way too sharp for this discussion.

Comment by juliangmp 5 hours ago

My experience with vendor toolchains is that they generally suck anyway. In a recent bare metal project I chose not to use the vendor's IDE and toolchain (which is just an old version of GCC with some questionable cmake scripts around it) and instead just cross compile with rust manually. And so far its been a really good decision.

Comment by TimorousBestie 5 hours ago

Yep, this aligns with my experience. I’ve yet to take the plunge into cross compiling with rust though, might have to try that.

Comment by Joker_vD 5 hours ago

> whine to your vendors until they give you the new stuff.

How well does this usually work, by the way?

Comment by TimorousBestie 5 hours ago

If C++ libraries eschew backward compatibility to chase after build time improvements, that’s their design decision. I’ll see an even greater build time improvement than they do (because I won’t be able to build their code at all).

Comment by direwolf20 1 hour ago

Nobody uses all features of C++.

But you might not be able to use libraries that insist upon modules. There won't be many until modules are widespread.

Comment by maccard 4 hours ago

This is not an argument against modules. This is an argument against allowing areas that don’t upgrade hold modern c++ back.

Comment by whobre 6 hours ago

> auto main() -> int {

Dude…

Comment by cocoto 3 hours ago

In my opinion this syntax is super good: it lets all function/method names start at the same column, which makes code way easier to read, a huge readability improvement imo. Sadly nobody uses this, and you still have the classic way, so multiple ways to do the same thing…

Comment by vitaut 3 hours ago

This style is used in {fmt} and is great for documentation, especially on smaller screens: https://fmt.dev/12.0/api/#format_to_n

Comment by rovingeye 3 hours ago

This has been valid C++ since C++ 11

Comment by direwolf20 1 hour ago

It's unusual. Some (unusual) style guides require it. It's useful in some cases, even necessary in some, which is why it was introduced, but not for a simple "int".

Comment by sethops1 3 hours ago

As someone who quit c++ over 15 years ago it's been comical to watch what this language has become.

Comment by webdevver 4 hours ago

i was sincerely hoping i could get

    auto main(argc, argv) -> int
         int argc;
         char **argv;
to work, but alas it seems C++ threw pre-ANSI argument type declarations out.

Comment by zabzonk 3 hours ago

> c++ threw pre-ansi argument type declarations out

they never were in C++.

Comment by CamperBob2 6 hours ago

It's like calling a Ford Mustang Mach-E the "Model T++."

Comment by on_the_train 6 hours ago

It's been the go-to syntax for 15 years now

Comment by Night_Thastus 5 hours ago

Go-to? I've never seen a project use it, I've only ever seen examples online.

Comment by whobre 5 hours ago

Same here

Comment by cpburns2009 5 hours ago

Now I haven't touched C++ in probably 15 years but the definition of main() looks confused:

> auto main() -> int

Isn't that declaring the return type twice, once as auto and the other as int?

Comment by yunnpp 5 hours ago

No. The auto there is doing some lifting so that you can declare the type afterwards. The return type is only defined once.

There is, however, a return type auto-deduction in recent standards iirc, which is especially useful for lambdas.

https://en.cppreference.com/w/cpp/language/auto.html

    auto f() -> int;          // OK: f returns int
    auto g() { return 0.0; }  // OK since C++14: g returns double
    auto h();                 // OK since C++14: h's return type will be deduced when it is defined

Comment by maccard 4 hours ago

What about

auto g() -> auto { return 0.0; }

Comment by yunnpp 2 hours ago

0.0 is a double, so I would assume the return type of g is deduced to be double, if that is what you're asking.

Comment by maccard 4 hours ago

I really wish they had used func instead, it would have saved this confusion and allowed for “auto type deduction” to be a smaller more self contained feature

Comment by zabzonk 3 hours ago

the standard c++ committee is extremely resistant to introducing new keywords such as "func", so as not to break reams of existing code.

Comment by few 5 hours ago

And their code example doesn't actually return a value!

Comment by Davidbrcz 5 hours ago

For main it's explicitly allowed by the standard, and no return is equal to return 0

Comment by direwolf20 1 hour ago

which is super weird. If they can tell the compiler to allow no return, only for main, they can also tell it to pretend void return is int return of 0, only for main.

Comment by up2isomorphism 4 hours ago

“C includes show its age.” But C++ is stagnating not because there is a “++” there, but because there is a “C”.

Comment by direwolf20 1 hour ago

Decades-old parts of the ++ also contribute.
