All my experiences with amd gpus have been fantastic: their drivers work beautifully, and when there's even a slight issue it's been a problem with protocol adoption and whatnot, never the driver.
It's such a contrast to the dogshit experience I've had with all the nvidia gpus I use; I really can't think of a reason except cuda for a Linux user to get an nvidia device.
I hope that when the people adopting Linux now build their next pc, demand for nvidia noticeably shrinks. Or maybe they'd be bankrupt by then because of the ai bubble crash.
yeah i mean ofc if you also put everyone in the world that the datacentre is serving into a human datacentre, I'm sure it'd consume tons of power too (in the form of food)
also that's definitely not going to have adequate performance; you'd need something like Looking Glass, and that requires a spare gpu or SR-IOV/GVT-g. it's probably easier to set up as a standalone vm
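(if anyone does go the Looking Glass route: iirc the main libvirt-side addition is an IVSHMEM shared-memory device in the domain XML, roughly like the sketch below, with 32M being enough for 1080p and higher resolutions needing more, per the upstream docs, plus the looking-glass-host app running inside the guest:

    <shmem name='looking-glass'>
      <model type='ivshmem-plain'/>
      <size unit='M'>32</size>
    </shmem>

that goes inside <devices>, and then the looking-glass-client on the host attaches to the same shared memory)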
security you don't understand is security you don't have. Windows' exploit mitigations don't work because the average user doesn't understand them and can easily be guided into disabling them.
the weakest link in the attack surface is the stupidity of the user, and that's not gonna change however much you try to make your os secure
I don't really agree with the baked-in firmware vs. firmware-loaded-at-runtime distinction the fsf makes, so I don't really see the point of avoiding wifi cards that need proprietary firmware (like the Intel ones), as awesome as ath9k can be
depending on how you manage non-free js, these distros are feasible for daily driving though
Regarding vibe coding, Torvalds described himself as "fairly positive" – but not for kernel development. Computers have become more complicated than when he learned to code and was "typing in programs from computer magazines." Vibe coding, he said, is a great way for people to "get computers to do something that maybe they couldn't do otherwise."
This is despite the fact that vibe coding "may be a horrible, horrible idea from a maintenance standpoint."
if you have it always plugged in, the battery doesn't get any cycling wear. I think it's a common misconception that it'd be "charged" and "discharged" at the same time, but that's just not how batteries work.
It gets charged to the 100% mark once, and then topped up to stay at 100% as the battery naturally self-discharges.