
Sometimes it feels like I am living in an alternate reality. After I started using Arch Linux as my main OS I was fully prepared for it to break every few months. But it's been more than 2 years (maybe even 3 or 4?) and not one breakage. Also keep in mind that I update it every single day. From time to time I check the Arch Linux news site to see if anything needs manual intervention. So far I haven't needed to do anything.


> Also keep in mind that I update it every single day.

I can't remember what I was looking at, but within the past week I ran across a comment that said Arch Linux only supports constant upgrades such as that, and any delays that result in skipping a version are what risk causing breakages. The commenter was very surprised that they were even thinking about supporting a version jump on whatever the thread was about (pretty sure it was somewhere here on HN).
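For context, this matches the Arch wiki's own position: only full system upgrades are supported, and "partial upgrades" (refreshing the package database but upgrading only some packages) are explicitly unsupported because a newly installed package may link against newer libraries than the rest of the system has. A minimal sketch of the distinction (the package name is illustrative):

```shell
# Supported: sync the package databases AND upgrade the whole system together
sudo pacman -Syu

# Unsupported "partial upgrade": refreshes the databases, then installs a
# single package built against libraries newer than what's installed.
# This is the pattern that risks breakage.
sudo pacman -Sy some-package   # hypothetical package name
```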


I also have another VM that I use from time to time. I think the longest I've gone between updates there was about a year. I don't remember having any problems like that.


can confirm, went to Arch Linux due to constant breakage under Debian & Ubuntu; it's been so much more pleasant since then. No more everything exploding left and right as soon as I need a newer GCC or libav version.


Changing the system default GCC version, like it seems you did, is not good practice in environments with conservative package policies.

There's also no need to do that, since nobody prevents users from installing newer versions alongside old ones, and invoking them directly.
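On Debian/Ubuntu this is straightforward because compiler packages are versioned and coexist. A minimal sketch, assuming a recent Ubuntu release where `gcc-12` is packaged (the version number is just an example):

```shell
# Install a newer GCC alongside the distro default; the plain 'gcc'
# command and the system default are left untouched
sudo apt install gcc-12 g++-12

# Invoke the newer version explicitly, e.g. for a single compile...
gcc-12 -O2 -o hello hello.c

# ...or for a whole build, via the standard CC/CXX variables
CC=gcc-12 CXX=g++-12 make
```

This keeps the default toolchain intact for distro-built packages while letting individual projects opt into the newer compiler.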

Even setting aside the fact that different GCC versions can coexist, complaining about this breakage in absolute terms doesn't make much sense. Releases that change the default compiler version inevitably cause more breakage in packages that haven't been updated to be compatible with newer GCC versions. So ultimately it's a matter of choosing a distribution with the appropriate release model, not a matter of Ubuntu/Debian breaking stuff.

Libav has also been deprecated in Ubuntu long ago, so it's not clear what you refer to. If you happen to refer to the transition from and back to ffmpeg, that's very old history.


> Libav has also been deprecated in Ubuntu long ago, so it's not clear what you refer to

I'm referring to the libav* libraries which are part of the ffmpeg project (not the horribly-named libav fork), and to external Debian repos such as debian-multimedia providing updated versions of those (for better codec support in media players, etc.)


That's funny because I've been running Debian Sid for 15 years as my main OS doing weekly updates and the two single cases of breakage I've seen were glibc6 transition (which was announced and expected) and proprietary video card drivers. You must be thinking of Ubuntu specifically.


To keep the anecdotes going, I've been using a mix of Debian and Ubuntu, both stable/LTS and testing/biannual, for pretty much exactly the same 15 years as you've been running Sid and I've never had any breakage that wasn't caused by me fucking around with binary drivers.

ndiswrapper was the main cause back in the day; shockingly, giving Windows drivers access to the Linux kernel can cause problems. The most recent time was when VDPAU was new and I was trying to get HD video playback working on a mini-PC with an nVidia Ion GPU by running a version of the nVidia driver much newer than Ubuntu packaged. Now that I think about it, that must have been around a full decade ago.


I'm not old enough to be able to run a Linux distribution for 15 years. And I also never run Debian on any of my personal computers. Thus, my anecdotes are definitely way less convincing. In other words, you can stop reading here.

The only two major distributions that I used for a sufficient amount of time are Ubuntu and Arch. Arch's "unstableness" is exactly what I want most of the time on my personal computer, as it's my go-to Petri dish. "Stability" would mean that it's harder for me to break it apart and make a Frankenstein out of it. That's exactly what I have been experiencing with Ubuntu LTS releases --- stability.

Most of the time, I want to have all the available LLVM versions alongside all the GCC versions, with all the available binutils (and QEMU, Docker, Oracle VBox, etc.) versions, on the latest kernel full of my monkey-patched printk's. When I finally manage to break its back, I dive into the Wiki for a few hours to restore it.

I can imagine a non-office, hacking desktop OS that follows the Arch packaging strategy being highly successful.

I also maintain a few compute servers for 10-20 people. They are on Ubuntu LTS. The packages that I need there are always the ones that just work and don't let anyone do anything "cutting edge".


This is the proper response, imo.

Arch is rolling release and you take the good with the bad. The ones who try to defend arch as some paragon of stability miss the point that Arch's model is inherently unstable, but it comes with other benefits.

I'm the one who kicked off this entire conversation pointing out that arch is unstable, and it cracks me up watching silly people scramble to try and defend Arch as being some paragon of stability.

No, it's not. That's baked into its identity.


Aside from patching the kernel I have done everything GP said. Just because the Arch wiki says it is unstable doesn't mean it always is. It just means Arch Linux can do breaking changes (systemd) without worrying about backwards compatibility. And FYI I wasn't defending Arch Linux. It just seems strange to me that everyone is having instability problems and I can't even reproduce it.

I also agree that you shouldn't run your production database on Arch Linux. It isn't made for workloads like that. But personally I find maintaining Arch Linux+"custom packages"(with AUR) easier than Debian+"latest packages"+"custom packages".
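For what it's worth, the AUR workflow this refers to is fairly mechanical: each AUR entry is a git repo containing a PKGBUILD recipe, which `makepkg` turns into a normal pacman package. A sketch (the package name is illustrative):

```shell
# Fetch the build recipe (PKGBUILD) for an AUR package -- name is illustrative
git clone https://aur.archlinux.org/some-package.git
cd some-package

# Review the PKGBUILD before building, since AUR content is user-submitted
less PKGBUILD

# Build and install it as a regular pacman package:
#   -s  resolve build dependencies via pacman
#   -i  install the resulting package
#   -r  remove build-only dependencies afterwards
makepkg -sir
```

Because the result is a real pacman package, it is tracked, upgraded, and removed like anything from the official repos, which is what makes mixing custom and official packages relatively painless.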


That is only if you use the Debian-specific definition of "stable", which is "does not change". The rest of the world thinks "fewer bugs" when they think of stable software.


yes, because randomly declaring the other person as using a different definition somehow adds to the conversation and changes their point.

Back here in reality, rolling release is less stable because more bugs in the software get through. And this is a reasonable expectation and not some magical fairyland where bugs never get written so being right up against the dev branch is as stable as being on the stable branch.


> Back here in reality, rolling release is less stable because more bugs in the software get through.

We really live in two different software worlds. For every piece of software I use, the number of bugs is a purely decreasing function of time, especially on the "main" paths and use cases.


Your statement is a logical paradox.

If it were true, it would mean there were no bugs in the first place, because they would never have gotten written. The very fact that the bugs got written implies new bugs can, and will, be introduced.


I've been running Debian testing since the 4.0 beta and have never had an unbootable system or anything broken by normal updates. I've only reinstalled the system once, to migrate it to 64-bit, since there was no proven procedure to migrate a running 32-bit system to 64-bit.

There were rough patches along the way, but all distros had the same problems in one way or another (vdpau, fglrx, multi-GPU support, ndiswrapper & wireless stuff, etc.). Otherwise it's been a set-it-and-forget-it affair for a very, very long time.

...and obligatory xkcd: https://xkcd.com/963/


No, I only ran Ubuntu a few times. My worst breakages were on debian (generally testing). I remember the mysql packages completely killing my apt as well as grub updates wreaking havoc as two individual examples.



