__builtin_constant_p(vec) is not asking whether the contents of vec are constant. The compilers are not being fickle; the expression simply does not ask the question the developer intended.
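For concreteness, here is a minimal C sketch of the distinction (the static `vec` array is hypothetical, and the printed values depend on the compiler and optimization level):

    #include <stdio.h>

    static const int vec[4] = {1, 2, 3, 4};

    int main(void)
    {
        /* Asks: is the value of the expression `vec` (a pointer, after
         * array-to-pointer decay) known to be a compile-time constant?
         * It does NOT ask whether the bytes vec points at are constant. */
        printf("vec:    %d\n", __builtin_constant_p(vec));

        /* Asks about an element's value instead; with optimization GCC
         * and Clang may fold the load and report 1, without it you will
         * likely get 0. Either way it is a different question from the
         * one the developer meant to ask about the array's contents. */
        printf("vec[0]: %d\n", __builtin_constant_p(vec[0]));
        return 0;
    }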
When it becomes beneficial to game a system (fame, power, money), people will learn to game the system. "Weird Nerds" weren't drawn to their interests with the intent to game the system. Sometimes "weird nerds" can achieve more success if they find a political animal who recognizes the benefit of teaming with them.
The article implies that he was drunk, but every eyewitness refers to sleeping pills. It seems the alcohol claim comes solely from his demeanor in the infamous outtakes, without knowing the cause of the slurred speech. People assume that he was drunk, so the few articles written about this event duly report that he was drunk.
It is obvious he partied all night after a late shoot. "I took a sleeping pill and it didn't kick in until now" is just a polite way of saying that partying was more important to him than the commercial. The man was a known alcoholic.
I once drank a 2-liter bottle of Mountain Dew, and then fell asleep about 2-3 hours later.
A few hours later (about six hours after finishing the Mountain Dew), I pooped. As everything moved through my insides, the caffeine finally hit me all at once.
So it's completely possible for a sleeping pill to sit in Welles's stomach for a few hours before he finally absorbed it. I wouldn't attribute any ill intent to him here.
It's not about purity, it's about options. The ClangBuiltLinux community advocated that Linux should not be dependent upon a single compiler. But when Rust came along, many of the same people suddenly decided that a single compiler was okay.
The Rust community is not perfect. Neither is the LLVM community, nor the GCC community, nor any open source community. Consider the recent drama / growing pains that have occurred within the Rust community. Everyone has biases and conflicts of interest. Anyone who doesn't recognize the benefit of alternatives will learn the hard way.
The Rust community's rationalization that it doesn't need or want alternatives, for whatever "reason", is self-serving and all about control. I don't care if someone is the BDFL; they aren't right 100% of the time, and they aren't always doing things for altruistic reasons. The Rust community has imbued the leadership with some godlike omniscience and altruism because it makes them feel good, not because it's sound policy.
There's a long distance between "perfect" and "conflicts of interest"; the latter bears more resemblance to corruption than to imperfection.
Of course the Rust leadership has made and will continue to make mistakes. I'm confident every BDFL ever has done the same, and the same goes for the 'herd' of clang/gcc/msvc/... and the herd of browser makers. The target is not perfection, but merely being better than the alternative.
I think that, in the absence of conflicts of interest, single-source-of-truth models (e.g. what Rust/Python/Java/... do) are likely to do better at making a good language than herd-of-competing-implementations models (C/C++/JavaScript). The latter probably does better at working despite conflicts of interest, but that's not a problem for most programming languages, where there is relatively little opportunity (compared to the browser ecosystem) for a powerful corporation to push its interests to the detriment of others.
I think the Rust community is quite clear that the Rust leadership is flawed, but that's not very interesting without a way to make the leadership better. If you can convince people you have that way, you'll get a lot of interest.
Based on that logic, why did the LLVM community develop Clang, Clang++, libc++, etc. instead of continuing with DragonEgg? There were already GCC, G++, and libstdc++, as well as the EDG C++ front end.
GCC, Clang, MSVC, and other compilers complement each other, serve different purposes, and serve different markets. They also ensure that the language is robust and conforms to a specification, not to whatever quirks a single implementation happens to provide. And multiple implementations avoid the dangers of relying on a single implementation, which could have future problems with security, governance, etc.
The GNU toolchain, the LLVM Project, and the Rust project have all experienced issues, and it's good not to rely on a single point of failure. Redundancy and anti-fragility are your friends.
LLVM saw growth for a number of reasons, but none of them had to do with it actually being beneficial for the C++ ecosystem:
* A C++ codebase. At the time GCC was written in C, which slowed development (it's now a C++ codebase, adopting the lessons LLVM provided).
* It had a friendlier license than GCC, which had switched to GPLv3, and thus Google and Apple moved their compiler teams to work on LLVM over time.
* libc++ was a combination of the friendlier license plus avoiding the hot garbage that was (maybe still is?) libstdc++ (e.g. there were design decisions baked into libstdc++ that inhibited implementing the C++ spec, like SSO). There were also build-time improvements, if I recall correctly.
* LLVM provided a fresher architecture, which made it more convenient as a research platform (indeed, most compiler academics target LLVM rather than GCC for new research ideas).
Basically, the reason LLVM was invested in instead of DragonEgg was a mixture of the license and the GCC community being quite difficult to work with, which drove huge amounts of investment by industry and academia into LLVM. Once those projects took off, even after GCC fixed its community issues it still had the license problem, and the LLVM community was by then strongly independent.
Compilers don't typically generate security issues, so that's not a problem. There are questions of governance, but due to the permissive license that Rust uses, governance problems can be rectified by forking without building a new compiler from the ground up (e.g. what happened with NodeJS until the governance issues were resolved and the fork was reabsorbed).
It's funny you mention the different C++ compilers, considering that Clang is well on its way to becoming the dominant compiler. It's already aiming to be a full drop-in replacement for MSVC, and it's fully capable of replacing GCC on Linux unless you're on a rarer platform (GCC has a bit richer history in the embedded space). I think over the long term GCC is likely to die, and it's entirely possible that MSVC will abandon their in-house compiler and instead use Clang (the same as they did abandoning IE and adopting Blink). It will be interesting to see if ICC starts porting their optimizations to LLVM and making them freely available; I can't imagine ICC licenses really bring in enough money to justify things.
I think I read somewhere that recent ICC is LLVM based, just not freely available because of course the LLVM licence doesn’t require that. Can’t remember the source though, so take it with a pinch of salt.
> It's funny you mention the different C++ compilers, considering that Clang is well on its way to becoming the dominant compiler.
I'd be interested to know what metric you're basing this on.
Outside the Apple ecosystem, LLVM seems to have made very few inroads that I can see. No major Linux distribution that I'm aware of uses LLVM as its default compiler, and Windows is still dominated by MSVC.
> GCC has a bit richer history in the embedded space
Yeah, as someone who works in the embedded space, I do wish that LLVM had more microcontroller support. Maybe one day! Just as a recent example, it was weirdly hard to get up and running with LLVM on an STM32L4, whereas GCC was easy as. Apparently I can pay ARM for a specific LLVM implementation?
I doubt very much that using C slowed development. To my recollection it was more that, for years, the FSF pushed for a GCC architecture that resisted plugins, to avoid proprietary extensions. LLVM showed the advantages of a more extensible architecture, and GCC followed suit to remain competitive.