On an enterprise-imaged Windows laptop, neither you nor your coworkers would normally have administrator privileges, precisely to keep you from doing stuff like deleting core Windows dependencies. Maybe your company does hand out full administrative access, but if you deleted the Program Files folder to save time you'd be blamed by pretty much everyone.
You guys obviously have root privileges, or you wouldn't have been able to delete the system's core Python 2 installation. And frankly, you must have deleted it manually, because the package manager would have told you exactly what havoc you were about to unleash and made you confirm it anyway.
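For what it's worth, here's roughly what I mean, assuming a Debian/Ubuntu-style box. The package manager lists every dependent package it would rip out and waits for an explicit yes; bypassing it is the only "silent" way to nuke Python 2:

```
# apt shows the full list of dependent packages it would remove
# and asks for confirmation before touching anything
sudo apt remove python2.7

# whereas this asks nothing and breaks everything downstream
sudo rm -rf /usr/lib/python2.7   # don't do this
```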
But what's even weirder to me is that most Python devs I know, myself included, use virtual environments (venv) to juggle different versions and keep pip's package bloat contained, all nice and neat.
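Just as a sketch of how cheap that is (the package name here is only an example):

```
# create an isolated environment next to your project
python3 -m venv .venv

# activate it; python and pip now point inside .venv, not at the system
source .venv/bin/activate

# install whatever you want without touching system packages
pip install requests

# done? just drop back out
deactivate
```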
If you wanted python3 to be the default on Windows, you'd have to change the PATH yourself, or, if you don't know what you're doing, I guess reinstall whichever Python with an .msi installer and hope it does it for you.
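Before touching PATH it's worth checking what Windows is actually resolving. A quick sketch from a Command Prompt, assuming the python.org installer (which also ships the py launcher):

```
:: the first hit on PATH is what a bare "python" runs
where python

:: the py launcher sidesteps PATH and picks a version explicitly
py -2 --version
py -3 --version
```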
Meanwhile, on Linux you can just use the alternatives system to pick your preferred version, and it takes care of the symlinks for you.
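On Debian/Ubuntu that's update-alternatives, and it looks something like this (the exact interpreter paths are just examples from my own box):

```
# register both interpreters as alternatives for "python",
# giving python3 the higher priority
sudo update-alternatives --install /usr/bin/python python /usr/bin/python2.7 1
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.8 2

# interactively pick which one /usr/bin/python points at
sudo update-alternatives --config python
```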
And the HDMI issue? You must not all be running the same graphics drivers: whoever is on the proprietary driver won't have the issues you've described, and whoever is on the open source version will, because GPU vendors are shitty about opening up their proprietary closed standards.
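That part is at least easy to settle: have everyone check which kernel driver their GPU is actually using (lspci comes from pciutils, glxinfo from mesa-utils on most distros):

```
# "Kernel driver in use:" will say e.g. nouveau vs nvidia
lspci -k | grep -EA3 'VGA|3D'

# the renderer string also gives away open source vs proprietary stacks
glxinfo | grep 'OpenGL renderer'
```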
Which brings up another point. You say you all use the same laptop model and OS but you don't all use the same drivers? There's no baseline? There's no control?
This sounds like a Hell of your own making. This is why users in general should never have full administrative privileges; access should be tailored down to just what each person needs. Especially if they haven't yet learned the basics of the OS they're using, because then they're at best a danger to themselves and at worst a vulnerable laptop inside the network.
People forget XP was pretty bad at first, just like Windows 98, and as with Windows 98, people only became less critical after a round of major fixes: for Windows 98 that was 98SE, and for XP it was SP2 (and eventually SP3).
Both Vista and 7 had problems that took a while to get fixed. The most common issue I remember was UAC: every vendor just told you to turn it off to install and run their software and games. There were also a bunch of breaking Win32 API changes, and a lot of software written for XP simply stopped working on Vista and later.
People mostly remember these releases as they were after the fixes, except Vista, because 7 came out fairly quickly (less than three years later). Microsoft doesn't have a good track record with initial Windows releases, but eventually everyone forgets, and even some of the bad ones get remembered as the good ones.