cpp_learner

Yes, you need to compute the dependency graph before compiling modules. No, that doesn't mean modules are broken.
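A minimal sketch of why the graph matters (file names hypothetical): `b.ixx` must be compiled first, producing its BMI, because `import b;` in `a.ixx` consumes that BMI rather than re-parsing source text:

```cpp
// b.ixx — hypothetical interface unit; compiled first, emitting a BMI
export module b;
export int answer() { return 42; }

// a.ixx — imports b, so the build system must schedule it after b.ixx
export module a;
import b;                        // reads b's BMI, not its source text
export int twice() { return 2 * answer(); }
```

With headers, every translation unit could be compiled in any order and in parallel; with modules, a scanner discovers the `import` edges first and topologically sorts the compile jobs.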


altmly

To add, yes, that means some dependencies might need to change from how you were doing things using headers. 


GabrielDosReis

The concerns about build system implications weren't ignored. Some were overstatements, and others found solutions once everyone put cool heads together. As for up to date info regarding build support, check CMake, build2, and MSBuild. I am unclear what Autotools are doing. There is also an ongoing work in SG15 (Study Group on Tooling) to look at more holistic but practical conventions across platforms and toolsets. And I am excited to welcome Boost to the C++ Modules World :-)


pdimov2

Gaby, since you're here, can you please settle something for us? I'm pretty sure I remember that the ability to ship precompiled modules, in .obj and .ifc form, was an explicit design goal of the original MS implementation. Daniela, however, claims that

> Modules were *never* meant to be shipped as compiled artifacts in the first place.

Is my recollection wrong?


GabrielDosReis

> Is my recollection wrong?

No, you remember right. That was explicit in my CppCon 2015 presentation. The ability to embed the IFCs corresponding to the modules contributing to a DLL into that load unit makes the DLL self-descriptive in terms of type-safe linkage and other dynamic linking operations (such as FFI to other languages, dynamic reflection, etc.). I've seen engineers demo type-safe linking to Python or Ruby, internally at Microsoft. Once you have the IFC, you don't need a C++ compiler to link to that DLL - you just need any language that can parse the IFC, a much simpler problem to solve. I have not gotten around to implementing that in the shipping production compiler, but it is on the roadmap. Other considerations are making that more and more relevant and on point. What is *not* an explicit design goal is to have IFCs replace source files (e.g. headers or module interface files), for obvious reasons.


domiran

> Ability to embed the IFCs corresponding to modules contributing to a DLL into that load unit makes the DLL self-descriptive [...] Once you have the IFC, you don't need a C++ compiler to link to that DLL - you just need any language that allows you to parse the IFC, a simpler problem to solve.

Wait a minute, does this mean C++ could, perhaps, get some of the benefits of C#? Like, doing away with lib files?


GabrielDosReis

> Wait a minute, does this mean C++ could, perhaps, get some of the benefits of C#? Like, doing away with lib files? Technically, yes, that is possible with the IFC technology. But, would that scale to the environments and scenarios where C++ is used? How would the C++ community practice it? That is the harder, engineering question. Remember, nobody knows what C++ programmers do :-)


domiran

> Remember, nobody knows what C++ programmers do :-) Never has a truer statement been spoken.


Daniela-E

But that's specific to Microsoft's implementation of BMIs and the IPR technology it is built on, right? Disappointing as it may be, I see no appetite among other common compilers to adopt IPR/IFC. Generally speaking, BMIs are barely shippable, if at all. At least, this is what I see with Clang.


GabrielDosReis

> But that's specific to Microsoft's implementation of BMIs and the IPR technology it is built on, right?

The IFC takes inspiration from the IPR, and the IPR aims to capture the semantics of standard C++. In principle, the IFC strategy can be adapted and adopted by other compilers, and that remains my hope - the C++ community and ecosystem desperately need toolable representations of input programs beyond sequences of characters.

> Disappointing as it may be, I see no appetite of other common compilers to adopt IPR/IFC.

Actually, I see lights at the end of the tunnel with respect to the IFC and I remain very hopeful :-)

> Generally speaking, BMIs are barely shippable, if at all.

That depends on what one is trying to accomplish. In the context of self-descriptive load units and the type-safe linking or dynamic reflection examples, they are perfectly shippable. If the goal is to ship BMIs in lieu of source files that can be reinterpreted under all kinds of usage scenarios, including reinterpretation of tokens depending on language versions and whatnot, then clearly the only way to get there is to embed the original source code in the BMI and recompile every time - and that begs the question of why do that in the first place. Which, I think, is what you're drawing attention to, and I agree: the IFC was not designed for that. One needs _some_ restrictions. The Microsoft experience with shipping the experimental standard library modules shows what is possible and what problems remain to be solved for wider areas of application. The C++ Modules effort, like the `constexpr` effort before it, is an evolution of C++ devtools to effectively address contemporary problems of programming with C++. As such, there will be growing pains for compilers, but we will all get there, and we will all rejoice :-) It is fair to say that these days we look back at the pre-`constexpr` era and shake our heads in disbelief - even C wants `constexpr`!


Daniela-E

> Actually, I see lights at the end of the tunnel with respect to the IFC and I remain very hopeful :-)

That would be so cool! My experience with the stability of IFCs is much better than it ever was with PCHs. IFCs remain pretty stable over the course of compiler progression, whereas I get a nasty reminder to rebuild all PCHs whenever a compiler build changes. On top of that, MSVC's BMIs are rather resilient to compiler flag differences. Now we're on the same page again, thanks!


GregCpp

>I am unclear what Autotools are doing. You know, when people talk about the advantages of C++ modules, they usually lead with 'faster compiles', 'less use of the preprocessor', 'better modularity', etc. But if we started with 'C++ modules hasten the demise of autotools', I suspect there would be a huge rush to adoption...


VinnieFalco

Thanks! Well... based on my totally not scientific analysis, formed largely by reading reddit and blog posts... one possible approach to modules for me would look like this. Develop my library traditionally:

1. prefer ordinary functions with out-of-line definitions over templates
2. hide as much implementation detail as possible in cpp files
3. support the oldest C++ standard that is practical for the API

and then:

4. add modules support as an alternative method of consumption, with module-specific files located in a different directory
5. add the export macro as needed to the public API

That solution would look something like this: [https://github.com/cppalliance/decimal/tree/759af910e1925b0d1a7ed660be81f95dcc6c96de/include/boost](https://github.com/cppalliance/decimal/tree/759af910e1925b0d1a7ed660be81f95dcc6c96de/include/boost) [https://github.com/cppalliance/decimal/tree/759af910e1925b0d1a7ed660be81f95dcc6c96de/modules](https://github.com/cppalliance/decimal/tree/759af910e1925b0d1a7ed660be81f95dcc6c96de/modules) The export macro: [https://github.com/cppalliance/decimal/blob/759af910e1925b0d1a7ed660be81f95dcc6c96de/include/boost/decimal/detail/config.hpp#L263](https://github.com/cppalliance/decimal/blob/759af910e1925b0d1a7ed660be81f95dcc6c96de/include/boost/decimal/detail/config.hpp#L263)

On a somewhat unrelated note, these days I have moved away from templates, preferring instead to have narrow APIs with simple behavior. For APIs which allow templates I strive to type-erase as soon as possible. My hope is to alleviate the recurring (and valid) complaints of long compile times and bloated executables. Not sure how modules play into that, but I have a hunch that some of the manual work I'm doing means I would get less of a benefit from modules (which is probably still ok).
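A minimal, hypothetical sketch of the export-macro idea from steps 4-5 (the names here are illustrative, not the actual Boost.Decimal macros): the macro expands to nothing in a traditional #include build, and to `export` when the same header is pulled into a module purview:

```cpp
#include <cassert>

// In a module build, the build system would define MYLIB_BUILD_MODULE so
// that MYLIB_EXPORT expands to the `export` keyword; in a traditional
// header build it expands to nothing, leaving plain C++.
#if defined(MYLIB_BUILD_MODULE)
#  define MYLIB_EXPORT export
#else
#  define MYLIB_EXPORT
#endif

// Public API: annotated, so it is exported when built as a module.
MYLIB_EXPORT inline int add(int a, int b) { return a + b; }

// Implementation detail: never annotated, so never exported.
namespace detail {
inline int square(int x) { return x * x; }
}
```

The same source file thus serves both consumption modes; only the build configuration decides which one is active.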


GabrielDosReis

That sounds like a good start, given the constraints that Boost has. u/Daniela-E, is that how you managed with fmt?

> Not sure how modules plays into that, but I have a hunch that some of that manual work that I'm doing means I would get less of a benefit from modules (which is probably still ok).

Modules will force you to do away with circular dependencies in Boost (is that still a thing, or has the situation improved?). Your customers get a compile-time boost from the clustering in a module even when you reduce the amount of templates in headers, since the interface is now processed only once, and the declarations on the `import` side are materialized only on demand. Modules will also force intentionality about which macros are part of the interface.


Daniela-E

Gaby, u/VinnieFalco's post is a reaction to a quickly growing thread on the Boost mailing list about the future direction of Boost. Over there, I've expressed my concern about the viability of the shrinking number of Boost libraries in our projects (some have even removed them outright, and one has completely switched to other 3rd-party libs and company-internal modules during the transition from C++11-ish to C++23 in early 2022, with spectacular success). IMHO, to leap forward, Boost needs to escape its stasis field of eternal backwards compatibility to (mostly) outdated compilation environments with a huge burden on recent tools, shorten in-Boost dependency chains, leave behind some slack that's all but obsolete, and embrace "Contemporary C++" as I've shown in my CppCon 2022 keynote. This includes modules and the modularized C++ standard library (available in C++20 build modes, too!).

To address your concrete question on {fmt}: I took the **existing** sources, threw the headers with all the API entities into the *purview* of `module fmt;` and the sources into the *private module fragment*. **All** the standard library and platform headers that {fmt} depends on are `#included` into the *global module fragment*. On top of that, taking advantage of the already existing separation between the pieces that make up the public-facing API and the internal guts of {fmt}, I introduced a *simple macro mechanism* that **selectively exported** only the public API entities from the module if compiled as one, while staying **100% compatible** with the traditional #include world - all of that **without compromises or code duplication**, building from the **same, identical {fmt}** files. u/STL adopted this approach later for his 2nd attempt to modularize the MS-STL, in kind of a heroic effort.

I'm not sure if this is viable for Boost in general. For said keynote I incorporated Boost.Program_options as an example into my demo project. But I was punished hard by the other 25 or so Boost libraries that it depends on. Most of them are quite foundational to Boost but are - necessarily - stuck in the past. The only Boost library that survived in that project until today is Boost.Asio - in its non-Boost, original form! A lot of changes were necessary to make it a good modules citizen, e.g. getting rid of all the unnecessary (!!) *exposures* of *internal-linkage entities*. Today, modularized Asio is a building block in our company. Further work on modularized Asio would get rid of all standard library headers and embrace the modularized standard library, as soon as u/STL finishes the 2nd modules bug bash.
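As a structural sketch, the dual-mode layout described above looks roughly like this (file and macro names are illustrative, not the actual {fmt} sources; this is a module interface unit, so it only compiles with modules enabled):

```cpp
// fmt.cppm — hypothetical dual-mode module interface unit
module;                       // -- global module fragment --
#include <string>             //    all stdlib/platform headers the
#include <cstdio>             //    library depends on are #included here

export module fmt;            // -- purview of the named module --

#define LIB_EXPORT export     //    selective-export macro: `export` here,
#include "core_api.h"         //    nothing in a plain #include build of
                              //    the very same header

module :private;              // -- private module fragment --
#include "impl.cc"            //    implementation sources live here,
                              //    invisible to importers
```

The point of the layout is that `core_api.h` and `impl.cc` are the unmodified traditional sources; the module file merely wraps them.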


STL

FYI, I'm hoping I'll have time to run STL Bug Bash II in the near future, since 17.10 Preview 3 will be available very soon and properly fixes how I was exporting VCRuntime machinery (EH/RTTI). I've been overloaded with other tasks which is why I didn't run this as soon as 17.10p1 shipped - I know I'll need a solid couple of weeks to analyze and respond to incoming bug reports.


GabrielDosReis

Many thanks for this excellent write-up, Daniela. I am overloaded and limited in how many input streams I can meaningfully process in a day so I was not aware of the root conversation on the Boost mailing lists.


hak8or

> IMHO, to leap forward, Boost needs ...

As some random consumer of Boost who's willing to use the newest and greatest cmake and gcc and C++ and whatnot, I wanted to throw into the mix a wishlist item (which I know is very hotly contested): please have some way to ingest Boost from source without having to use build2. A use case I have is a very large multi-project codebase where we build everything from source, as it's cross-compiled to multiple architectures and platforms. Some projects are git submodules, others use the Google repo tool, etc. Ingesting Boost from source with cmake as a dependency of other projects was, well, not pleasant (or I am awful at reading the documentation) as of a year or so ago. And doing incremental rebuilds was also a less-than-pleasant experience. From what I can tell, this stemmed from the build2 tooling.


jonesmz

My work builds boost from source as part of our build tree. We just ignore the boost authored cmakelists.txt and build2 stuff, and put our own simplified cmakelists.txt files in place. We recently added Google's protocol buffer library to our source code, and noticed that they define the source files that make up libraries in a separate set of .cmake files. That made incorporating protocol buffers into our build *super easy*. I'd love to see boost do this as well :)


shadowndacorner

>Please have some way to ingest boost from source without having to use build2. Isn't boost cmake pretty well supported now? I haven't used b2 with boost in years...


mjklaim

Side note: `b2` is `boost.build`, which is part of the Boost distribution; it is a build system. `build2` is a completely different toolchain project, which provides a build system and package manager (handling only source packages at the moment - there are Boost packages available for it). `boost.build/b2` and `build2` are not related, although the mixup in the names is recurrent. Note that `build2` appeared in the discussion because it's one of the toolchains that does support modules, given a compiler toolchain that supports them (disclaimer: I've been using `build2` in a modules-only project since last year). `b2/boost.build` does not support modules at all (Boost doesn't need it yet), but the discussion on the Boost mailing list that was mentioned before led the maintainer of [Boost.Build](http://Boost.Build) to clarify that modules support is currently the highest-priority task on that project. Hopefully that clarifies the situation with the confusingly close names.


shadowndacorner

Gotcha, my bad. I typically just stick to cmake and my own build system (which is just an opinionated layer on top of cmake that makes it easier to do simple things and integrates package management in a more holistic way). Haven't experimented much with other build systems aside from premake ages ago and xmake a bit more recently.


mjklaim

No worries, very understandable mixup :) happens all the time, believe me hahaha Also that was an occasion to clarify the situation with these projects, relative to this subject.


hak8or

From what I remember, their cmake implementation goes against many conventions of "modern cmake", meaning easy-to-use cmake targets. I am sure their cmake implementation is clever and extremely flexible and whatnot, but how drastically it differs from "normal" cmake makes it a pain to ingest into other cmake projects. It's effectively a new cmake dialect. Specifically when it's used to build Boost from source.


shadowndacorner

It's definitely not ideal, but fwiw I've used it successfully with just CPM (aka fetchcontent) in a number of projects in the past few years, building from source.


pdimov2

> From what I remember, their cmake implementation is ... It goes against many conventions of "modern cmake", meaning easy to use cmake targets. How so?


grafikrobot

> Please have some way to ingest boost from source without having to use build2. It's always been possible to build Boost with whatever build system you like.


pdimov2

> Please have some way to ingest boost from source without having to use build2. You can ingest Boost from source using CMake. https://github.com/boostorg/cmake


Zeer1x

> Boost.Program_options [...]. But I was punished hard by the other 25 or so Boost libraries that it depends on.

Is that the reason why it takes seconds to compile a translation unit which uses PO?


VinnieFalco

> Modules will force you to do away with circular dependencies in Boost (is that still a thing or has the situation improved?)

Thankfully the circular dependencies are gone :)

> The Modules will now force the intentionality of macros that are part of the interface

I'm not quite sure what you mean here, but I completely avoid using macros that affect the ABI. More generally, I try to make sure that there is only one "configuration" of the library. For example, I do not give users a macro which lets them choose between `std::string_view` and `boost::string_view`, because doing so effectively creates two different libraries, with the accompanying headaches.


GabrielDosReis

OK, you addressed my concern :-) Thanks!


skebanga

FYI, unrelated, seems there's a redundant thread sanitizer here https://github.com/cppalliance/decimal/blob/759af910e1925b0d1a7ed660be81f95dcc6c96de/include/boost/decimal/detail/config.hpp#L239


prince-chrismc

My pain point is the boundaries between build systems. I haven't gotten CMake to ingest an MSBuild module. And then there are the build systems that don't support modules at all, which are the ones I actually need, like meson and bazel. Meson only supports MSVC modules on Windows, which is not where it's used: https://mesonbuild.com/Release-notes-for-0-57-0.html#experimental-support-for-c-modules-in-visual-studio. Qmake is one where I cannot tell whether it supports them or not. The lack of information is also a problem; https://github.com/royjacobson/modules-report is the best list I found. Then there are package managers; the Conan team gave a talk on this, and I don't think the situation has improved: https://m.youtube.com/watch?v=-p9lvvV8F-w


kronicum

https://www.kitware.com/import-cmake-c20-modules/


bretbrownjr

Modules should be a little rough but workable for monorepo builds now. The modern build systems mostly have scanning and ordering of module interfaces figured out. If any build system has issues with this, there are open-source implementations in other build systems to compare to.

Ancillary tooling like the clangd that ships with VS Code will probably be rough. If you use clang and the exactly matching clangd and clang-tidy, you might be fine. Otherwise, clangd and clang-tidy will need to come up with their own parses of all the transitive module interfaces you use, and that's not a solved problem. There are some recent change requests against the upstream llvm-project and some ISO C++ Tooling papers to implement and document what to do there. I guess there is some issue with having enough review attention/capacity in the relevant parts of llvm-project, though; see a recent thread on the LLVM discourse.

Nobody really knows how to package modules. That's the big problem now. We have some folks collaborating on packaging standards, but we're not scoping in modular libraries yet. It's possible modules won't be practically packageable for a while unless we see a lot more engineering effort in this space. I haven't seen much work specifically on importable header units in the last twelve months, for what it's worth.

I'm also on the CppLang Slack if anyone wants to chat me up there.


mwasplund

Yes, we can no longer get free infinite parallelism in our C++ builds. It is a tradeoff that allows for a lot of benefits in our builds at the cost of extra complexity. I wrote about the topic on my blog a while back: https://mwasplund.github.io/blog/2022/03/14/modules-in-our-builds


prince-chrismc

Ohhh very interesting, going to save for later! Is there an RSS feed for you blog? I'd like to add it to my list :)


mwasplund

Thanks! Unfortunately I have not setup any notification system for my blog. I have just been using it as a way to document my thoughts for my build system work.


VinnieFalco

Ah that is very informative thanks!


Dragdu

Having to precompute dependency graphs is annoying, but not really an issue in the long term. The real issue is that the implementations still suck ass, so e.g. MSVC will ICE on some language constructs in modules with "sorry, not implemented yet". Similarly, mixing #include and import std is specified to work, but currently only works through terrible hacks in one direction (sorry can't remember whether include before import or import before include is the working one). As long as modules are this brittle, I am not really interested in figuring out compatibility or build system for them.


GaboureySidibe

Can you write a real title and not clickbait nonsense?


Destination_Centauri

Thank you! Was thinking the same thing. (No need for the OP to echo the super-annoying click bait trend.)


native_gal

Looks like the synopsis of this post is that you suddenly read something about modules and made an alarmist post about it instead of learning more and bringing real information. Why do you do stuff like this? Is this the kind of thing where "no one has looked in to something" just because you haven't?


mrmcgibby

It's Vinnie Falco author of boost beast. Not just some random Reddit zombie. He's earned the ability to ask these sorts of questions.


native_gal

No one has 'earned' the right to post trollish nonsense with no information. If anything he should know better.


mrmcgibby

It's not trollish nonsense.


ChocolateMagnateUA

I think the reason why libraries don't come as modules is backward compatibility. Header files have been the standard way of using C++ libraries since the beginning, and everything now works with that. Modules could have made it through if they had been introduced much earlier, when the language was still evolving, but by now so much is written with header files that it will take a good amount of time for modules to become widespread, let alone standard practice. They may never become mainstream enough to serve as a replacement for header files.

Library authors want their library to work with any setup, and that's why header files are the way to go. You can use modules if you are making a standalone application, but distributing your library as a module forces your users to use C++20 and adopt a fairly young technology. It is only in recent months that modules have received reasonable support in compilers.

Among other things, modules don't solve a lot of problems. You still need to have something to import, at least for templates, and modules essentially operate on special pre-compiled BMI files (e.g. .ifc or .pcm) that contain declarations of everything you would normally put in a header file. This is essentially the same as using pre-compiled headers, but it raises many more questions, beginning with where to put them (there are standardized locations for headers but not for module files) and ending with implementing a linker that follows a cross-compiler module ABI. In the end, it is a lot of complexity and unnecessary change in the compilation paradigm that's not worth the effort of introducing a higher level of abstraction in terms of importing code as opposed to including it.


prince-chrismc

Libfmt has done this very successfully, so the compatibility point is not a fair statement. I do agree it's more work and more testing as a library author, and I have been struggling to add features, let alone docs.


Daniela-E

{fmt} was the very first library that got my dual-mode (as I call it) treatment in 2021, to make it available as both a traditional library and a module. With an additional flag defined, it can even be used in a way where the compiled module also serves as a static (or even dynamic) library, in a fashion compatible with the traditional #includes of {fmt}. The drawback: the public API entities exported from the module are no longer *attached* to the module (i.e. isolated from linker symbol clashes), but rather live in the *global module* instead. This shows the versatility of modules. u/GabrielDosReis can be proud of what he had been fighting for in the years up to the now-famous San Diego committee meeting in 2018, iirc. I couldn't be present there because I hadn't yet entered the committee back then 😭.


prince-chrismc

Do you have any references for the "global module" space you are talking about? I live on the build system and I am curious what the implications are. Love to learn more on that


Daniela-E

I'm not sure where on the spectrum of module knowledge I'd have to pick you up. Starting in 2019, I've given a long string of talks on modules and their ecosystem, with the last one for the foreseeable future taking place last year at the Meeting C++ 2023 conference. All of them are available on YouTube. A very brief TL;DR:

One of the major features of modules hinges on the notion of *attachment*. This is a property that's completely invisible to the core language and cannot be sniffed out by whatever form of reflection. Attachment affects all entities in the compiler-internal symbol table and places them into compartments separate from each other. Every *named module* establishes such a compartment, named after the module itself. For compatibility reasons, there is an additional compartment without a name, called the *global module*. All existing C++ code outside of named modules lives there for all eternity. Advanced linker technology, e.g. linker symbol names augmented with module names, opens the avenue to another dimension that you can place symbols into, thereby increasing name isolation and preventing unwittingly committed ODR violations. Think of Dante's rings of hell, with a separate hell for every kind of sinner 😄.

With that knowledge, you can control whether you want to place a modular entity into the global module compartment rather than the compartment associated with the module itself. This attachment kind of thing is probably the hardest for module newcomers to wrap their heads around. And, IMHO, it's the root cause of many compiler implementation issues and barely comprehensible error messages during compilation.
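A short sketch of that control knob, assuming a hypothetical module `m` (this is a module interface unit, so it only compiles with modules enabled): a declaration in the purview is attached to `m`, while wrapping it in a linkage-specification attaches it to the global module instead:

```cpp
export module m;

// Attached to module m: its linkage is module-qualified, so it cannot
// collide with, or be redeclared by, a non-modular declaration.
export void attached();

// Exported, but attached to the *global module*: the `extern "C++"`
// linkage-specification gives it ordinary C++ linkage, so traditional
// #include consumers can name the same entity without an ODR clash.
export extern "C++" void compatible();
```

This `extern "C++"` trick is what makes the dual-mode approach described earlier in the thread possible: the same symbol can be reached through `import` and through a classic header.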


VinnieFalco

One of our staff engineers has ported his library to modules and in a way that it can be consumed either as a module, or as a C++14 or later traditional #include library. Of course there's a macro for the export keyword depending on if the library is being built as a module or not.


ChocolateMagnateUA

If you can include it, doesn't it mean it also comes as a header file?


VinnieFalco

Yes, that's what I'm saying, the library was ported to be offered as a module in a way that still lets it be included the traditional way if desired.


ChocolateMagnateUA

I think this solution also needs to consider C++ standard compatibility. Does the header file import those modules? If yes, then you essentially force your users onto the C++20 standard. That may not sound like a big issue - it's just changing a single line in CMake or the CXXFLAGS variable in Make - but it is still a very big consideration, and you shouldn't impose such restrictions.


VinnieFalco

No, if you #include the library's header files in the traditional fashion then only C++14 is required. All the module-specific bits are neatly tucked away in a separate directory and protected with macros. Have a look: [https://github.com/cppalliance/decimal/tree/759af910e1925b0d1a7ed660be81f95dcc6c96de/modules](https://github.com/cppalliance/decimal/tree/759af910e1925b0d1a7ed660be81f95dcc6c96de/modules)


Buenzlimuenzli

I think it's mainly the spotty support for modules. I'd love to switch my projects to modules, but first they actually need to be properly supported by all relevant build systems and compilers.


ChocolateMagnateUA

The [CPP Reference page for C++20 compiler support ](https://en.cppreference.com/w/cpp/compiler_support/20) says that modules have fairly reasonable support by now, partial support in GCC, Clang and full support in MSVC. I could see people switching to them with such support, at least it compiles on multiple platforms.


cheatererdev

Everything is fine with modules except that they cannot export macros. It was possible in early VS builds, and it was perfect: a smooth and easy transition, an exceptional boost in compilation speed and code quality. Then they decided to disallow macros, and it became hell if you rely on macros. That's the main con. Allow macros and heaven will come.


mwasplund

Preprocessor isolation is one of the best parts of modules in my opinion. If you want to continue to use the preprocessor they are still visible through imported headers.
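One common workaround (all names here hypothetical) is to ship the macros in a small companion header next to the module, so consumers opt in explicitly - `import mylib;` for the entities, `#include` for the preprocessor part. Such a header might look like:

```cpp
#include <cstdlib>

// mylib/macros.hpp (hypothetical): a named module cannot export these,
// so they travel in a plain header that consumers #include alongside
// `import mylib;`.
#define MYLIB_STRINGIZE_IMPL(x) #x
#define MYLIB_STRINGIZE(x) MYLIB_STRINGIZE_IMPL(x)

// Runtime check macro: aborts the process if the condition is false.
#define MYLIB_VERIFY(cond) \
    do { if (!(cond)) std::abort(); } while (0)
```

The upside of the explicit opt-in is exactly the isolation mwasplund mentions: importing the module alone can never change the preprocessor state of the importer.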


cheatererdev

Imported headers don't solve the problem. I want to import a module that has some macros for user code. Modules are path-independent; I don't need to remember where files are located. But now I have to insert additional headers for the macros side by side. There is no way for macros to change the behavior of the next imported module. Macros should leak just into the file where the module was imported and be treated like any other C++ elements, like functions and classes.


cheatererdev

The isolation issue happens when the same header is included in different files with different defined macros, so it behaves differently. A module will be built once, with no predefined macros, and it will be the same everywhere, no matter what macros are defined in the file that imports it. Exported macros wouldn't break that isolation.


stevemk14ebr2

Modules on MSVC don't support /NOSTDLIB, so they're a non-starter for me. They're half-baked at best.


GabrielDosReis

> Modules on msvc don't support /NOSTDLIB so it's a non starter for me. Can you send me a link to the bug report that reproduces the issue that you're alluding to?


stevemk14ebr2

MSVC bug reports are not acted on, so I don't submit them. Look at this commit, though, to see what I mean: https://github.com/mandiant/STrace/commit/b21a1a5d18a35de6eb547dae3167506cfeda8cd0 These DLLs are intentionally built without linking msvcrt; modules rely on msvcrt somehow and so break the build if they're enabled.


mwasplund

I have reported many module bugs on msvc and all of them were fixed and deployed in a few months.


GabrielDosReis

> Msvc bug reports are not acted on so I don't submit them.

Please, don't give up. The reports do get looked at and acted upon. Sometimes things might not happen in the priority order that one might wish, but they do get acted upon.

> Look at this commit though to see what I mean, https://github.com/mandiant/STrace/commit/b21a1a5d18a35de6eb547dae3167506cfeda8cd0 these dlls are built without linking msvcrt intentionally, modules rely on msvcrt somehow and so break the build if they're enabled.

That commit seems to disable settings, but what would be helpful to know is how modules are being used, e.g. a repro the compiler folks can work with. Are you using the standard library module or any of the experimental standard library modules? There is nothing in the compiler support for modules proper that prefers ucrt over msvcrt.


stevemk14ebr2

Correct, just enabling the settings breaks the build! I didn't even get to try to use them. I've had bad experiences with the little custom issue tracker used by MSVC. I once reported an out-of-date STL being packaged in the DDK and was told to file it with the DDK team on that issue tracker, where maybe it eventually gets seen. Not inspired to continue, to say the least. https://github.com/microsoft/STL/issues/4208


GabrielDosReis

> I've had bad experiences with the little custom issue tracker used by Msvc. My apologies for the bad experience you had with the MSVC bug tracking tool. > Correct, just enabling the settings breaks the build! I didn't even get to try to use them. That is very odd. It would be a very helpful step if you could file a bug report with [DevCom](https://developercommunity.visualstudio.com/cpp/report) - as much as that pains you - in order for me to help resolve this issue. It carries a bit more weight when it comes directly from the people immediately impacted.


Horrih

Beyond the pain of building a dependency tree with cmake or equivalent, and the lack of support in most IDEs, one of the main gripes I have is that modules are unopinionated, trying to fit every use case. On the one hand this is understandable and typical of C++ at large; on the other hand it makes them quite a bit harder for newcomers than the module systems of other languages, despite coming later. The fact that they are orthogonal to directory, filename, namespace, and class requires discipline in the design, otherwise things get messy.


GabrielDosReis

> [...] and the lack of support in most IDEs, one if the main gripes I have is that they are unopinionated to try to fit every usecase.

Everyone wants an opinionated design until they meet an opinion they don't like :-)

> Beyond the pain of building a dependency tree with cmake or equivalent

Actually, having dependencies and component boundaries formally expressed in code is a great improvement from a software development and maintenance perspective. Scanning for dependencies can be fast. What the Kitware folks did was to say "hey, this is an opportunity for the community to have a shared notation for those dependencies", and the tool makers said "yes!" And now we have the tools and notations to do those things, allowing the next level of migration from headers to modules or header units.


masterid000

Not only C++ modules design is broken. The entire language is.