Windows 10 end of life could prompt torrent of e-waste as 240 million devices set for scrapheap::As Windows 10 end of life approaches, analysts are concerned that millions of devices will be scrapped due to incompatibility
Of course, no mention of upcycling these with Linux and getting them into needy hands. With all the solid-state hardware now, many of these machines are perfectly functional, and will be for some time. It's the batteries that likely need looking at.
No, personal computers can only ever work with Windows. I just love how the common thinking process has simply accepted that problems, especially IT problems, can only ever be solved by 5 gigacorps.
BTW, a lot of these won't even be laptops, and I imagine they won't need much. If Windows were a proper system, they could still be supplied with security updates by third parties.
Also, I’ve seen Rufus claiming to be able to remove the TPM requirement from the installer. I didn’t test it though.
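For reference, the check Rufus disables corresponds to the widely reported LabConfig registry override, which you can also apply by hand during Setup (Shift+F10, then regedit). A sketch of the equivalent .reg fragment, based on community documentation rather than my own testing:

```
Windows Registry Editor Version 5.00

; Creates the LabConfig key the Windows 11 installer checks;
; each value disables one hardware requirement check.
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\LabConfig]
"BypassTPMCheck"=dword:00000001
"BypassSecureBootCheck"=dword:00000001
"BypassRAMCheck"=dword:00000001
```

Whether this keeps working across feature updates is another question, so treat it as a workaround, not a supported path.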
Let's go back to 1995: you're a corporate IT manager or C-level executive, responsible for deploying desktops and laptops to 10,000+ employees (I worked for or with several companies like this at the time).
You need directory services, email, app deployment. You also need common office apps, like word processing, spreadsheets, etc.
Your end users are finance folks, regulatory compliance teams (i.e. legal), marketing, etc., who've been working with systems that are purpose-built for their roles (mainframe/IBM AS series for finance, print layout systems for marketing, etc.), with not everyone really using email much.
Suddenly you have an opportunity to migrate everyone to a general purpose system that’s pretty easy to understand, and many people already have some familiarity with. You can eliminate sending handbooks to everyone by building your own intranet which people can access with IE. Your HR systems (which are still on mainframes/AS-400) can now be accessed by IE from anywhere in the company, so time entry, vacation, benefits changes, etc, reduce time and paper consumption dramatically.
There's a million reasons why companies embraced Windows back then. A standardized UI for everything massively improved support capability. Being able to take output from legacy systems and present it better, either via IE or custom-built apps, made for significant training reduction, could even reduce password management difficulties, and increased password compliance/security for the legacy systems (I saw one custom app in 1996 that presaged SSO by managing logins to 11 backend systems).
There was nothing in the *nix world at the time that could compete with the whole package Windows/Exchange offered for user management and end-user ease of use. Especially since you could retain your legacy systems, use Windows both as Windows and as a terminal if needed, and provide app flexibility for end users.
Then there's the productivity side: there were already tons of Windows apps available, with many more on the way. And people knew how to use them, thanks to the standard interface. Also, many people were using Windows at home or school, so they were already familiar with it.
Just compare Word to WordPerfect at the time. I'm not sure WP even had a GUI yet (I forget when they added it). So legal folks were fast as hell with WP, but your average user wasn't, and it had a bit of a learning curve. Compare that to the menu-driven, WYSIWYG Word.
Now look at the SMB space, where money is even tighter. It's much easier to deploy and manage an Exchange/Windows setup for 50 users than, what, set up a Unix system? I could teach someone to do day-to-day Exchange admin in a few hours, because a GUI is way easier than the command line for people who are new/inexperienced: it reveals the concepts/paradigms. And Exchange ran on fairly generic hardware. Again, they didn't have to buy something like an AS/400.
Unix folks just didn't see what was coming, for some reason. I remember Unix admins disparaging Windows as a "toy" in the early/mid '90s. Even today I couldn't imagine selling a Linux setup to most companies, as mature as it's gotten.
High-end corporate laptops from 5-10 years ago make excellent cheap and powerful Linux machines today (given a reconditioned battery, assuming you want to run them off mains, and a new SSD several times larger than the hard drive they came with). Witness all the sticker-festooned ThinkPads at conferences that spent the first few years of their lives handling executive email and PowerPoint presentations, now living their best lives.
I’ve always wanted to do this.
What’s a good source to buy them?
What models do you recommend?
I've always gotten them from eBay.
The T and X series are the high-end ones. Between those it mostly depends on what size of laptop you're looking for. It's worth checking a guide for how to replace the SSD/RAM/battery - some of the newer ones have these soldered in place, which means you're stuck with whatever the machine originally came with.
Personally, I think the sweet spot is around 4 years old. By that point they're pretty cheap (maybe 10% of the original RRP), and going for older ones doesn't save you much more money. I recently got an X390 and it's doing everything I need from a laptop.
So what I’m hearing is, free Linux servers?
Where does the assumption that owners of these devices care about updates come from? I regularly see people still using Windows 7, willing to use sketchy workarounds to keep it going. We all wish this would mean The Year Of The Linux Desktop in 2025, but that would require users to suddenly start caring about their OS.
All these machines will continue to run, so if their owners aren't going to upgrade to Win 11 and buy a new machine, what does it matter? They'll just use a Win 10 machine with no updates forever. Security concerns aside, obviously.
Positive take: Lots of great Linux laptops on their way to eBay.
I'm still on a T430 🤷♀️
2013 Dell XPS i7-4470 up in this jawn.
Toshiba L745 i5 2430M here, still roaring
I’ve kept a Windows 10 install on a separate SSD for the programs that stubbornly refuse to run on Linux (games, in my case). However, I won’t be upgrading that to Windows 11. I’ll just reclaim that SSD for other purposes and use Linux exclusively.
I’m one of those maniacs who went to the trouble of setting up a GPU passthrough VM instead of dual booting, and I have no intention of switching it from Win10 to Win11. If it gets infected, it can’t do jack or shit to the important parts of my system, and I can either roll back to a snapshot or nuke it.
I swear, I can read the first part of your first sentence just fine, but I don’t understand what it means, lol!
I tried to look it up, and as far as I understood it, it’s a technique that allows a virtual machine to access a physical GPU directly. I guess that means that even if your VM is elsewhere (a server or wherever) it can still use the GPU you have. But the more relevant part is that since your Win10 install is on a VM, it can’t do shit on the rest of your system, and the GPU access is just there so that it won’t run as slow as shit when gaming, right?
But the more relevant part is that since your Win10 install is on a VM, it can’t do shit on the rest of your system, and the GPU access is just there so that it won’t run as slow as shit when gaming, right?
Pretty much
I tried to look it up, and as far as I understood it, it’s a technique that allows a virtual machine to access a physical GPU directly. I guess that means that even if your VM is elsewhere (a server or wherever) it can still use the GPU you have.
So, to get more technical: there's a motherboard technology called IOMMU, which controls which memory each device can access directly (one motivation was containing malware that has infected device firmware). What Linux has is a kernel module that allows an IOMMU group to be isolated from the host operating system and connected to a virtual machine as if it were real hardware. On an expensive motherboard, you get a different IOMMU group for each PCIe slot, each M.2 socket, each cluster of USB ports, etc. On a cheap one, you get one group for each type of device; maybe the PCIe slots are divided into two groups.
So the fun part, and why we do this: when you have two GPUs in different IOMMU groups, one can remain on the host and keep the graphics drivers, desktop environment, etc. loaded, while the other is connected to the VM and used entirely for gaming (theoretically, you could game on both systems at once). Thankfully, shit secondary GPUs aren't expensive (I was once on a 710, ditched that and its many driver issues for a 1050, and my main remains a 980 Ti), but setting up the main GPU to switch between its proper drivers and "vfio-pci", the stub driver that has to be bound before the passthrough can occur, can be a pain.
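To see what groups your own board hands out, you can walk the standard sysfs layout (`/sys/kernel/iommu_groups/<N>/devices/<pci-address>`). A minimal sketch; the helper name and the parameterized root are mine, the paths are the kernel's:

```python
from pathlib import Path

def list_iommu_groups(sysfs_root="/sys/kernel/iommu_groups"):
    """Map IOMMU group number -> sorted list of PCI addresses in it.

    Returns an empty dict if the kernel exposes no IOMMU groups
    (IOMMU disabled in firmware or on the kernel command line).
    """
    root = Path(sysfs_root)
    groups = {}
    if not root.is_dir():
        return groups
    # Each group is a numbered directory containing a devices/ subdir.
    for group_dir in sorted(root.iterdir(), key=lambda p: int(p.name)):
        devices = sorted(d.name for d in (group_dir / "devices").iterdir())
        groups[int(group_dir.name)] = devices
    return groups

if __name__ == "__main__":
    for num, devs in list_iommu_groups().items():
        print(f"IOMMU group {num}: {', '.join(devs)}")
```

If your GPU shares a group with other devices you'd rather keep on the host, that's the cheap-motherboard problem described above, and passthrough gets complicated.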
Thanks for the explanation. Prior to our exchange, I didn't even know such a thing was possible. It's wonderful, though to be honest, being as technologically klutzy as I am, I might find it easier to just buy a separate set of hardware for Win10 to use, if ever, and disable any networking capabilities (because if it's no longer supported, it needs to be taken offline).
Again, thanks!
Where can I get this waste? My Linux penguin will love it 🤩. But it saddens me that people rely on Windows so much.
This is corporate talk; no one's workstation is going to be running Linux anytime soon.
Oh yeah? Everyone, tell me where you work with Linux.
Nobody’s Steam Deck is gonna run Windows any time soon.
I am fully on Linux - daily usage, gaming and working.
Large ISP, in the global operations computing department. I am an exception to the rule, though. I mostly touch network gear and *nix servers, so I'm not limited to Linux, but I will say most of our *nix stuff is RHEL now and doesn't even boot past runlevel 3, so it's all CLI.
Cool, I need some cheap Linux servers to build out my home lab