

Would be neat if it wasn’t necessary to do that though, of course. 👍 I didn’t need to do that with i3, for example, for some reason. Feels inconsistent.


Whatever it does, it’s something that must be added manually in your compositor’s startup commands, or in your shell’s init file.


It’s also very annoying that you need some weird systemd command incantations to make it detectable via environment variables when you’re using Wayland compositors or X11 window managers that aren’t full desktop environments, in combination with XDG portals.
I struggled so much getting dark mode running well with darkman. And screen sharing with Niri. Hhhrg.
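For reference, the incantation I mean is roughly the sketch below. The exact variable list is an assumption on my part; it depends on your compositor and which portals you run.

```sh
# Rough sketch for a non-DE Wayland session (e.g. niri or sway); adjust the
# variable list to whatever your compositor and portals actually need.
# Push the compositor's environment into the systemd user session...
systemctl --user import-environment WAYLAND_DISPLAY XDG_CURRENT_DESKTOP
# ...and into the D-Bus activation environment, so xdg-desktop-portal
# (and things like darkman) see the right session when they get activated.
dbus-update-activation-environment --systemd WAYLAND_DISPLAY XDG_CURRENT_DESKTOP
```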
Also any year that someone switches their main desktop to Linux is the year of the Linux Desktop for them personally.
Agreed. I’ve made the same argument myself, fully on board. I had my year of the Linux Desktop two decades ago. 😄
I just… As long as we don’t actually take it seriously that “this might be the year of the Linux Desktop” and get disappointed that it isn’t — by some unknown metric (seriously, what’s even the definition of the “year of the Linux Desktop”?) — then it’s fine. 😊👍
I love the enthusiasm but I think we should honestly stop chasing “The Year Of The Linux Desktop™️”. I think it only serves as a depressant.
Let’s just keep focusing on making Linux great for experienced and new users alike. 👀


Image of “Entering Anaconda” sign taken here:
https://maps.app.goo.gl/9zSYnPMwwLsZE7i97?g_st=ac
Felt like geo finding for a second. 😅
Cons
- None
I like how they spelled “Reduced accuracy”.
Who could say. Not I. 😁
I don’t mind development being a little slower if it means the software is more stable and performant.
Now, that said, I can’t really speak to the fact that Rust is more performant or stable than some other language X, as I don’t know enough to make such statements. 😅
I’m just saying.
I like the way you phrase things there, pal. 👌


Is it the goal of Linux to be usable by the average person? Just asking.
I consider myself an average person. I’m a completely self-taught Linux user; I did learn a bunch more at uni, but that was a small fraction of what I know now, and it came after I’d already started using Linux.
I just followed the installation guide and searched the internet when I couldn’t figure something out myself, just like I’d expect from the average schmuck. Especially a gamer schmuck, who might know a thing or two more than the average schmuck who barely uses computers at all.
You know what I mean? Like are we expecting Linux to do Windows levels of handholding?
I know a lot of gamers who will happily drop into the firmware of their motherboard and tweak the timings of their RAM, but they can’t be expected to learn a few command-line commands? Read some documentation?


Just as with anything you do in life, take action with a healthy side of precaution.
This is a life lesson. I’ve learned to be careful around the oven. I’ve also learned to be careful running volatile commands. 😅


lol yeah, now that you mention it, the order seems to be random, or based on some metric totally unrelated to the charts. 🥴


TIL Arch is a footgun. 🤡 cope. 😉
But yeah, I agree: if the package maintainers had been astute there, a warning would probably have been good somehow. Not sure pacman supports pre-install warnings. Maybe? It does support warning about installing a renamed/moved package, but the naming would’ve had to be really weird for everyone involved for that warning to be clear in this case.
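(For what it’s worth, pacman packages can ship an .install scriptlet, and maintainers do sometimes use those hooks to echo warnings during an upgrade. A hypothetical sketch, not what the actual nvidia package does:)

```sh
# Hypothetical nvidia.install scriptlet; pacman runs these functions during
# the transaction, so anything echoed here shows up in the upgrade output.
pre_upgrade() {
  echo "WARNING: this release drops support for older GPU generations."
  echo "If you have one of those cards, install the legacy driver package instead."
}
```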


It would be a very out-of-scope feature for a Linux package manager to do a GPU hardware check and a kernel-module usage check to see whether you’re actually using the installed driver, and then somehow detect, in the downloaded, about-to-be-installed binary, that it will indeed remove support for your hardware.
It just seems very difficult to begin with, but especially not the responsibility of a general package manager as found on Linux.
On Windows, surely the Nvidia software should perform this detection and prevent the upgrade. That would be its responsibility. But it’s just not how it is done on Linux.
It’s not the package itself that “auto updates”. The package manager just updates all the packages that have updates available, that’s it.
But still, the system doesn’t really “break”; all you have to do is downgrade the package, then add a rule preventing it from being updated until Nvidia/the Arch package maintainers add a new package that tracks only that legacy driver’s latest version, which won’t be upgraded again.
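Concretely, on Arch that workaround looks roughly like this. The package names and the cached filename are illustrative placeholders, not the exact ones from this incident:

```sh
# Downgrade to the last working driver from pacman's local package cache
# (the filename/version below is just a placeholder).
sudo pacman -U /var/cache/pacman/pkg/nvidia-utils-<previous-version>-x86_64.pkg.tar.zst

# Then pin it so a plain `pacman -Syu` skips it, by adding this to /etc/pacman.conf:
#   IgnorePkg = nvidia nvidia-utils
```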


Windows doesn’t drop to CLI and break if the graphics driver is missing.
Okay. Kind of a matter of definition of “breaking” but sure.
But also GPU driver updates are not forced on you just by updating the system.
Right. But on Linux they happen automatically when upgrading the rest of your system, is what I was saying.


You don’t have to update your drivers though.
Not sure if you’re on Windows or Linux but, on Linux, we have to actively take explicit action not to upgrade something when we’re upgrading the rest of the system. It takes real effort to prevent a specific package from upgrading, especially when a breaking change sneaks in like this one and is hard to spot from the version number alone.
On Windows you’d be in a situation like “oh, I forgot to update the drivers for three years, well that was lucky.”


Not really a problem of Arch, but of the driver release model, then, IMO. You’d have this issue on Windows too if you just upgraded blindly, right? It’s Nvidia’s fault for not versioning/naming their drivers in a way that indicates which set of architectures they support, instead of just incrementing a number willy-nilly.
I mean, I agree completely. I chose i3 back in the day to get tiling windows, but also specifically to get my system as lean as possible to only run the things I needed. So I definitely made a conscious choice. As I did with Niri recently.
I just feel like it should be easier. Like you start the dbus daemon or whatever, and then that’s it. That should be it. There shouldn’t be anything left to do after that IMO. 😅
But I mean, I know how to search shit on the Internet and read docs. I made it work. Others might struggle though, coming from Windows, for example, like I did decades ago. I have a big head start.