The ability to read, and maybe watch a video, plus persistence for the trial and error you will run into. All the skills you need can be picked up with the above.
I turn mine off to save power when I'm not actively using it, but I also have a small 65-watt server that stays on all the time. It has currently been up for 3 months or so.
I started on GNOME and used it for most of my Linux life. However, after some memory and performance issues, I decided to try KDE. That was about 3 years ago, and every machine I use with a GUI that handles KDE well has since been moved to it.
I am unsure if the specs bear this out, but my personal experience has been that RDP's compression and encoding lead to much smoother interactions with the remote machine, especially when there are a lot of windows or visuals on screen. That said, my bandwidth utilization has been lower on VNC.
Using RDP I also meet CMMC guidelines, which is probably doable with VNC, but not as easily, and not without some additional work on my end to prove compliance. It's also easier to convince my clients to allow me to work off-site using RDP as a trusted, secure protocol. Less headache.
I have some RHEL machines at work. They are used as VM hosts for Windows VMs (CAD software). I set them up, but I also have a huge list of other apps and servers that I manage, develop, and support, so the person who wanted these machines wanted professional services as an option if I am out or busy with other projects. Plus it allows us to offload liability for security if need be, whereas when I do it, there isn't anyone else to blame, legally speaking. (Although so far we have not had a breach on my watch, knocks on wood.)
I just use Fedora at home; I find they are about the same, and I personally wouldn't pay for the additional services. The package manager is different, but that's about it.
I have a folder for my projects at root, and within it each project has its own directory, named the same as the project, containing its GitHub repo.
If I am learning something, I have a folder for the topic I am learning, and a Logseq file with all of my notes. Then I have folders for my book references, one for video or audio references, and then a folder for my practice projects.
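Roughly, with made-up names (this is just an illustrative sketch, not my actual paths):

```bash
# Placeholder names only -- "someproject" and "sometopic" are invented.
mkdir -p /projects/someproject/someproject            # repo dir named the same as the project (creating a folder at / needs root)
mkdir -p ~/learning/sometopic/{books,media,practice}  # topic folder with references and practice projects
touch ~/learning/sometopic/notes.md                   # Logseq notes (Logseq stores pages as Markdown)
```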
I manage the few Linux servers at my company. I use a Windows laptop to ssh into my servers. Windows for me is fine, but I do very little on it outside of ssh or email. However, I would never use Windows outside of this.
Fair point. I hadn't been following recently, but that suggestion makes sense. I would personally buy used, but I totally understand others not wanting to, and buying the newer chips would make the most sense there.
You could go either way. But with the shit going on with the 13th and 14th gen Intel chips, I personally would rather go the AMD route. I would actually probably go with a Ryzen 5000 series chip with DDR4 RAM for the savings. It would probably still be a huge upgrade for me, and it would be an overall much cheaper upgrade. If you are gaming primarily, the 5800X3D is still an amazing chip when it comes to price to performance.
I run a Fedora server.
All of my apps are in Docker containers set to restart unless stopped by me.
Then a cron job scheduled at like 3 or 4 am runs docker pull for every image and restarts the containers. Then it runs all system updates and restarts the server.
Every week or so I just spot check to make sure it is still working. This has been my process for like 6 months without issue.
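For anyone curious, here is a minimal sketch of that nightly job, assuming a Fedora host with dnf and a root crontab; the script path and schedule are just examples, not my exact setup:

```bash
#!/usr/bin/env bash
# Hypothetical /usr/local/bin/nightly-refresh.sh -- rough sketch of the routine above.
# Assumes containers were started with --restart unless-stopped so they come
# back up on their own after the reboot.
# Scheduled from root's crontab with something like:
#   30 3 * * * /usr/local/bin/nightly-refresh.sh >> /var/log/nightly-refresh.log 2>&1
set -euo pipefail

# Re-pull the image behind every running container, then restart them all.
docker ps --format '{{.Image}}' | sort -u | xargs -r -n1 docker pull
docker ps -q | xargs -r docker restart

# Apply all pending system updates, then reboot the host.
dnf -y upgrade
systemctl reboot
```

One caveat with this sketch: docker restart keeps the existing container on its old image, so to actually run the freshly pulled images you would recreate the containers (for example with docker compose up -d).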