I put my boomers on Fedora with GNOME a couple of years ago and there haven’t been any issues with it. Especially now that a lot of stuff that used to be desktop apps has moved to the browser, it’s more viable than ever.
This, and other lies I tell myself…
I love Linux, don’t get me wrong, and I have used desktop versions of Mint and Arch and Ubuntu and Zorin and Pop! and OpenSuSE and Parrot and Kali and a bunch of other small ones way back in the day (currently I have Mint and Zorin on a side laptop), and both rackmount servers in my bathroom are running Ubuntu…
But I still use a Windows desktop for gaming and a MacOS laptop for dev work as my main drivers.
I actually think that if OEMs put it on hardware they sold at Best Buy it’d sell like hotcakes, but that level of support is never going to happen, and without the ability to bring it to Geek Squad and have them either fix it or RMA it, Linux is never going to be “right” for the average person.
I set up a Linux computer for my dad just this week. As others have said, the biggest hurdle is support, mostly the part where reliable automated update mechanisms don’t exist. The system I put together has a WM and a panel and is even simpler than Chrome OS, but there is no way my dad could deal with a message where apt asks him whether he wants to update to the maintainer’s version of grub’s config. Writing an update script is very easy, but whether you use Debian stable, Ubuntu, or Arch, eventually some kind of intervention beyond typing a sudo password is going to be required…
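For what it’s worth, here’s a minimal sketch of the kind of update script I mean, assuming a Debian/Ubuntu-based system and a root cron job or systemd timer; it only uses the standard apt/dpkg non-interactive options, nothing specific to my dad’s machine:

```python
#!/usr/bin/env python3
"""Unattended-update sketch for a Debian/Ubuntu-based desktop (assumption:
apt is the package manager and this runs as root on a timer)."""
import os
import subprocess

# Never stop to ask questions on the console.
ENV = dict(os.environ, DEBIAN_FRONTEND="noninteractive")

# --force-confdef: take the package default when a config file changed;
# --force-confold: otherwise keep the locally modified file (e.g. grub's),
# which silences the "install the maintainer's version?" prompt.
DPKG_OPTS = [
    "-o", "Dpkg::Options::=--force-confdef",
    "-o", "Dpkg::Options::=--force-confold",
]

def run(cmd):
    # Raise on failure so cron/systemd logs or mails the error.
    subprocess.run(cmd, env=ENV, check=True)

if __name__ == "__main__":
    run(["apt-get", "update"])
    run(["apt-get", "-y", *DPKG_OPTS, "dist-upgrade"])
    run(["apt-get", "-y", "autoremove"])
```

But that only papers over prompts like the grub one by always keeping the old config; sooner or later a held package or a config that genuinely needs the new version will require a human, which is exactly the intervention problem.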
Though truth be told, Windows has a lot of garbage problems as well (perhaps even more than Linux); it’s just that people have gotten used to them after 20 years of NT on the desktop. On Linux, at least, most common problems can be solved without reinstalling the OS; on Windows, the rot sometimes feels like built-in planned obsolescence, because plenty of people just keep updating their PCs instead of ever reinstalling Windows.
When a Linux desktop environment breaks, it breaks hard. I’ve lost whole days of work debugging stupid nonsense: not being able to get past the login screen until I switched from GDM to LightDM, not being able to open Settings in GNOME until I realized it was a proprietary display driver issue, or a previously working secondary display suddenly rendering a distorted image. And these are things that happened after installing routine updates the OS itself prompted for! The investigations and fixes were full of deep dives into configuration files and all sorts of CLI shenanigans, and searching for solutions mostly turned up inapplicable suggestions from six distro versions ago.
Windows and MacOS certainly have their issues, but they’ve never broken like that for me. I still use Linux on my work machine, but anecdotally speaking, I don’t think it’ll ever be daily-driver ready for “most people.”
The distros mentioned in the article are meant to be used without changing anything else…
If you meant Arch, I agree with your concern. But Arch isn’t designed for beginners in the first place. It was designed to be built.
Any operating system will break if you tinker too much with how it’s built. On Windows, if you mess around in regedit too much, it’ll start to misbehave or, worse, blue screen…