This...actually seems like a good use of AI? I generally think AI is being shoehorned into a lot of use cases where it doesn't belong, but this seems like a proper place to use it. It's serving a specific and defined purpose rather than trying to handle unfiltered customer input or do overly generic tasks.
I don't really see why they would hire him to achieve this goal. He had already quit as maintainer, so he was out of the picture, unless he resigned specifically because he accepted an offer from NVIDIA. But if that were the case and they wanted nouveau stopped, why is he now contributing a huge patchset? If they hired him and he quit nouveau, they could've had him work on the proprietary driver or their own open out-of-tree kernel driver, but instead they specifically had him (or at least allowed him) to keep working on nouveau.
Also, if they really wanted to EEE nouveau into oblivion, they would need to get every single prominent nouveau, nova, and NVK developer on the payroll simultaneously before silencing any of them, because once one gets silenced, why would any of the others even consider an NVIDIA offer? Especially those already employed at Red Hat. It doesn't really make sense to me as an EEE tactic.
What has been apparent over the past few years is that NVIDIA seems to be relaxing their iron grip on their hardware. They were the only ones who could expose reclocking in a way that an open source driver could use, and they did exactly that: they moved the functionality they wanted to keep hidden into firmware. They had to have known that doing this would let nouveau use it too.
Also, they're hopping on this bandwagon now that NVK is showing promise as a truly viable gaming and general-purpose driver. Looking at the AMD side of things, they did the same thing back when they first started supporting Mesa directly: they released some documentation, let the community get a minimally viable driver working, and then poured official resources into making it better. I believe the same thing happened with the Freedreno driver, with Qualcomm eventually contributing patches officially. ARM also announced their support of the Panfrost driver for non-Android Linux use cases only after it had been functionally viable for some time. Maybe it's a case of "if you can't beat them, join them", but we've seen this pattern several times before: companies dragging their feet for years before finally helping out on the open drivers.
I'm cautiously optimistic. While I could see NVIDIA hiring him to stifle nouveau development, it doesn't really seem worth it when he had already quit as maintainer and Red Hat is already working on nova, a replacement for nouveau.

I got into Linux with Ubuntu 6.06 and remember the situation then. NVIDIA and ATI both had proprietary drivers and little open source support, at least for their most recent chipsets at the time. I was planning on building a new PC and going with an NVIDIA card because ATI's drivers were the hottest of garbage and I had a dreadful experience going from a GeForce 4 MX420 to a Radeon X1600 Pro. However, when AMD acquired ATI, they released a bunch of documentation. They didn't immediately start paying people to write FOSS Radeon drivers, but the community (including third party commercial contributors) started writing drivers from those documents, and Radeon support quickly got way better. Only after there was a good foundation in place do I remember seeing news about officially AMD-funded contributors to the Mesa drivers. I hope that's what we're now seeing with NVIDIA: they released "documentation" in the form of their open kernel modules for their proprietary userspace, reworked features into GSP firmware to make them easier to access, and now that the community-supported driver is maturing, they see it as viable enough to contribute to directly.
I think the same may have happened with Freedreno and Panfrost projects too.
That's the source of my cautious optimism. I hope they follow the path the others took rather than use this to stifle the nouveau project. Besides, stifling one nouveau dev would mean no other nouveau/nova/Mesa devs would accept future offers from them. They can't shut down the open driver at this point, and the GSP changes look like they purposely enabled this work to begin with. They could've just kept the firmware locked down and nouveau would've stayed essentially dead indefinitely.
In college I was on the robotics team. We used several different controllers to drive various robots. I made a little tank steering robot that was remote controlled from a PC with an Xbox 360 controller. I later rebuilt it to use a Raspberry Pi and added a pan/tilt mount for the camera controlled from the controller's D-pad. We also used a Wiimote to control our competition robot, using the accelerometer for steering which was pretty cool. This was in like 2010 when motion controls were still a relatively new and cool thing.
I prefer the USB port to be on the bottom, but very few phones (at least in the smartphone era) even tried to move the USB port; headphone jacks were frequently on top. I like the USB port centered on the bottom so the phone can sit on a stand with a cutout in the middle (a pretty common design).
Most gaming laptops these days don't support true GPU switching as it requires a hardware mux to switch the display between the GPUs. Every gaming laptop I've used from the past decade has been muxless and only used render offloading.
I think it's the other way around. NVIDIA's marketing name for render offloading (muxless) GPU laptops is NVIDIA Optimus so when the Mesa people were creating the open source version they called it PRIME.
Most gaming laptops these days don't do GPU switching anyways. They do render offloading, where the laptop display is permanently connected to the integrated GPU only. When you want to use the discrete GPU to play a game, it renders the game frames into a framebuffer on the discrete GPU and then copies each completed frame over PCIe into a framebuffer on the iGPU, which outputs it to the display. On Linux (Mesa), this feature is known as PRIME. If you have two GPUs and you run DRI_PRIME=1 <command>, the command runs on the second GPU, at least for OpenGL applications. Vulkan seems to default to the discrete GPU no matter what. My laptop has an AMD iGPU and an NVIDIA dGPU and I've been testing the new NVK Mesa driver. Render offloading seems to work as expected. I would assume the AMD Mesa driver works just as well for render offloading in a dual-AMD setup.
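For anyone who wants to try it, here's a quick sketch (assuming glxinfo is installed, from the mesa-utils or mesa-demos package; the renderer names printed will vary by machine):

```shell
# Show which GPU Mesa picks by default vs. with PRIME render offload
DRI_PRIME=0 glxinfo | grep "OpenGL renderer"   # integrated GPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # discrete GPU

# Run an OpenGL application on the discrete GPU via render offload
DRI_PRIME=1 glxgears
```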
Hopefully they can find a new home. I am ashamed of GitLab. I used to love it but they get worse and worse by the day. Maybe Codeberg would be a better home. Nintendo can't kill this, there will always be new places to host software and it's open source.
It's absolutely ridiculous they took it down even though Nintendo didn't DMCA the Suyu project directly. Shitty corporate cover-our-ass behavior at its finest.
I'm just using a Dell PC monitor (21" 1080p) from like 2010. It supports HDMI but I don't know about CEC. Either way, it could just put the monitor to sleep and that would be fine; that doesn't require CEC. I'm just not sure of a way to trigger it manually when I'm done using it.
The AMD radv driver is the best for gaming at the moment, IMO. If you're stuck with NVIDIA hardware then yes, the proprietary driver is the best for gaming since the open source driver is quite slow, but the good news is that this is rapidly changing after being stagnant for 5+ years. NVK is the new open source NVIDIA Vulkan driver in Mesa, and it just recently left experimental status to be included officially in the next Mesa release. Also, NVIDIA's GSP firmware changes mean the open source nouveau kernel driver can finally reclock NVIDIA GPUs to high performance clocks/power states, so with enough optimization it could reach performance parity with the proprietary driver. On my RTX 3070 laptop it's still significantly slower and some games don't work yet, but there's none of the flickering or tearing I experience with the proprietary driver. Unfortunately for GTX 10 series users, those cards don't use GSP firmware and still have no means of reclocking, so they'll be stuck with the proprietary drivers for the foreseeable future.
I'm not sure. I don't know how or when DSC gets used. My new monitor is a 4K 144Hz display connected over DisplayPort and my GPU is a Radeon RX 7800XT. I don't think DSC is being used in this setup but I don't know for sure. I also used this display with an Arc A770 and GNOME VRR worked just fine there too, though I had to comment out a line in a udev rule that excluded VRR support on Intel GPUs for some reason.
I just set up a bedroom "TV" which is just an old monitor and Raspberry Pi. I installed Kodi and some addons for TV sources. Works OK, just wish there was an easy way to turn the monitor off from the Pi on command so I don't have to walk over to it and shut it off manually.
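In case it helps anyone with a similar setup, two approaches I'd try (a sketch, assuming Raspberry Pi OS: xset needs an X session, and cec-client comes from the cec-utils package; neither works if the monitor ignores DPMS/CEC):

```shell
# Put the display into DPMS sleep (works over plain HDMI, no CEC needed)
xset dpms force off

# Wake it back up
xset dpms force on

# If the display does support HDMI-CEC, cec-utils can power it off properly
echo 'standby 0' | cec-client -s -d 1
```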
Can't wait to try out the official version of GNOME VRR after using the patched mutter-vrr for several years now. It's a very solid VRR implementation and I feel it's better than KDE's. It's about time it made it into an actual GNOME release. Just wish they would've fully committed and added the VRR toggle in settings rather than hide it behind an experimental flag. Hopefully GNOME 47 moves it out of experimental.
Both sides suck here but I have to side with Reddit over patent trolls. Nokia, what a disgrace you are these days if you have to resort to patent trolling. You used to be cool. That said, if this hurts Reddit's IPO then I'll be happy anyways.
I mostly use Linux but have a Mac Mini as a TV PC. I use the same browser everywhere - LibreWolf. It's Firefox but with Mozilla's bullshit adware/sponsored garbage removed and some extra privacy-focused features/default settings. Firefox has become adware itself, with its home page having sponsored garbage and suggested stories from partners. I generally love what Mozilla is doing and we need competition in the browser space, but I don't want Mozilla spamming up my homepage with their "suggestions".
Not on my 1080Ti. I have serious flickering on certain apps when using the latest NVIDIA proprietary drivers on Arch Linux with GNOME Wayland. Steam flickers and sometimes seems to fail to redraw properly. Had some issues on Discord as well.