Well, I guess we're only going to invent the microchip once.
In any case, now that I'm... you know, not a kid and can look at 90s tech with some historical perspective, it wasn't quite as straightforward as we remember. But then it also didn't have memory prices spiking by 300% overnight because everybody knows a fake AI bubble is going to pop, so manufacturers want to sell them a bunch of crap before all the money evaporates but won't ramp up production because that money is clearly going to evaporate.
That's 21st century through and through. 20th century tech would just keep slapping the same Z80 and Motorola 68000 chips on things for a decade. Z80 too slow as your CPU? Maybe it can be your sound chip now. Listen, we have warehouses of these, we'll keep selling you new ones well into the 2000s, it's barely a clump of wires held together with chewing gum.
Did you know the 68000 was 10 years old by the time it made it to the Genesis/Megadrive as a CPU? That'd be like the PS5 launching with a single core 2GHz Athlon from 2005. I didn't know that as a kid.
I guess there are two types of approach to this season, the "yay, nothing is getting done in the next three weeks" and the "oh, crap, nothing is getting done in the next three weeks".
It takes a prodigious amount of entitlement to look at things that way. The leap of logic from a tangible action to... some other thing that happened requires keeping the loosest possible tally and looking at international politics strictly from the lens of how it affects your worldview, rather than the actual impact on the ground.
No, my dear online performative leftist, the US deciding to reverse their policy and cut tens of billions of international aid is not "the same" as whatever war, political stance or act of interference you vaguely remember being mad about a decade ago. They can both be bad without both being the same.
I mean, never mind that the Iraq and Afghanistan wars were started by Bush, in turn the proto-Trump who opened the door for the fascist base to encroach on the US right; the fact that those things happened doesn't mean that the new, different thing Trump did, which none of his predecessors did, isn't worse than what those predecessors were doing. The people that relied on US aid relied on US aid, independently of whatever US tanks were doing thousands of kilometers away.
Trump also reversed policy regarding Israel, incidentally, by recognizing Jerusalem as the capital and moving the embassy there. That's the type of false equivalence that led to him being in power in the first place. Because man, I was not on board with Biden's stance on Palestine, and I am sure Harris would have been way too lenient with Israel for my tastes, but if you think that's the same as openly suggesting mass displacement for the sake of turning Gaza into a tourist resort, and that it made no diplomatic difference in how fast a ceasefire could be attained or what type, you're out of your mind.
And this is the last I say about it. I have zero patience for this type of willful ignorance in general, but I also have no energy to be angry. Thanks for the reminder that leaving even a tiny crack for US politics, even if it's coming from the left, is way too much. The entire thing is toxic. Malign actor indeed.
It takes deliberate ignorance to reach that conclusion. Forget the cuts in aid, which an article two posts up the chain here directly linked to a worsening of cholera in South Sudan: the notion that anybody in Brazil or Mexico is going "I can't tell the difference" is ludicrous. Your mileage may vary on whether Trump invading Venezuela is a good or a bad thing, but I'm pretty sure the regime there isn't going "same thing, really".
I guess the Argentinian government would say things are better now, considering they just got bailed out in what amounts to buying a midterm election. In that case I'd wager it's the opposition who wouldn't say things were just as bad a couple of years ago.
What the hell do you have to be on to think only Europe has noticed open fascists being in charge in the US. This is why I've been taking a break from this place, holy crap.
The problem is that the way he got banned also locks him out of his shared auth, which in turn blocks purchases and device functionality:
The Damage: I effectively have over $30,000 worth of previously-active “bricked” hardware. My iPhone, iPad, Watch, and Macs cannot sync, update, or function properly. I have lost access to thousands of dollars in purchased software and media. Apple representatives claim that only the “Media and Services” side of my account is blocked, but now my devices have signed me out of iMessage (and I can’t sign back in), and I can’t even sign out of the blocked iCloud account because… it’s barred from the sign-out API, as far as I can tell.
Seriously, it's like a one page blog. You could have read it in the time it took you to make me read it for you.
Agreed 100%. I think it's understandable to feel schadenfreude at someone this deeply embedded being bitten by the arbitrary business practices of big corpo in a worst-case-scenario type of situation.
But the problem is the business practices, not the person being affected. The guy's job feeding Apple's gargantuan content engine doesn't make this alright.
Please do not use recipes from AI summaries without clicking through to check the actual amounts.
I mean, hell, I won't use a recipe I find on a recipe site without cross-referencing at least a couple other recipes from other sites, because some of those are janky as hell. But the best-guess of a summarization LLM on what all those numbers and ratios are meant to be? Yeah, no. Not by itself at all, unless you're just trying to jog your memory on something you already know and can recognize the correct values if you see them.
Man, that's a big lateral move. Honestly, my experience is spotty as well (using mostly Bazzite), and I'm sticking to dual booting for the foreseeable future. I definitely would not change hardware for this reason. Especially because, having some installs on AMD-based hardware, my confidence that the issues would drop to zero is... very low.
If you can't try before you buy I'd only swap to get a meaningful upgrade. Of course when a "meaningful upgrade" will be affordable or accessible in the future is anybody's guess. Because isn't tech in the 21st century fun?
I mean, "now" is doing a lot of load-bearing work in that sentence, but it's also hard to argue it isn't markedly worse than it used to be.
I guess unless you're Russia and their circle. I'm starting to wonder about China, too, considering the geopolitical wedgie they're giving the US right now.
Because it was a 500 dollar transaction and the card they purchased was an Apple-branded product at a major retailer.
It was a 500 dollar transaction because this guy is a pro developer in Apple's ecosystem and apparently uses a 6TB plan for both personal and professional storage.
The Trigger: The only recent activity on my account was a recent attempt to redeem a $500 Apple Gift Card to pay for my 6TB iCloud+ storage plan. The code failed. The vendor suggested that the card number was likely compromised and agreed to reissue it. Shortly after, my account was locked.
An Apple Support representative suggested that this was the cause of the issue: indicating that something was likely untoward about this card.
The card was purchased from a major brick-and-mortar retailer (Australians, think Woolworths scale; Americans, think Walmart scale), so if I cannot rely on the provenance of that, and have no recourse, what am I meant to do? We have even sent the receipt, indicating the card’s serial number and purchase location to Apple.
Much as I do think mixing pro and personal accounts is a mistake, as a person who has to pay several major corpos for subscription plans for professional software that include cloud storage, I admit I get it. Receiving spam about how full your free personal Google Drive is kinda sucks extra when you're already paying plenty for an enterprise account with a ton of storage on the side.
Not an "apple fan", an apple-focuse software dev deeply embedded in their dev community.
Which I suppose goes a long way to explain them being multiple terabytes in the hole inside Apple's ecosystem, and also why even having a separate backup would definitely not fix their problem in the first place.
That's not true at all. Synology will sell you 24-bay rack-mounted devices and 12-bay towers, as well as expansion modules for both, with more bays you can daisy-chain to them.
Granted, I believe those are technically marketed as enterprise solutions, but you can buy a 12-bay unit off Amazon for like two grand diskless, so... I mean, it's a thing.
Not saying you should, and it's definitely less cost-effective (and less powerful, depending on what you have lying around) than reusing old hardware, but it does exist.
I'm currently running some stuff off an old laptop which I also have tucked away somewhere and just... remote desktop in for most of the same functionality. And even if you can't be bothered to flip it open on the rare occasion the OS gets into a state where it won't let you remote in, there are workarounds for that these days. And of course the solution to "can't hook it up to a keyboard and mouse" in that case is that the thing comes with both (and its own built-in UPS) out of the box.
Nobody is saying that server-grade solutions aren't functional or convenient. They exist for a reason. The argument is that a home/family server you don't need to run at scale can work perfectly fine without them, losing only minor quality-of-life features, and that it's a perfectly valid way to upcycle old or discarded consumer hardware.
I think the self-hosting community needs to be more honest with itself about treating self-hosting and building server hardware at home as two separate hobbies.
You absolutely don't need server-grade hardware for a home/family server, but I do see building a proper server as a separate activity, kinda like building a ship in a bottle.
That calculation changes a bit if you're trying to host some publicly available service at home, but even that is a bit of a separate thing unless you're running a hosting business, at which point it's not really a home server anyway, even if it happens to sit inside your house.
I mean... my old PC burns through 50-100W, even at idle and even without a bunch of spinning hard drives. My actual NAS barely breaks that under load with all bays full.
I could scrounge up enough SATA ports on it to make for a decent NAS if I didn't care about that, and I could still run a few other services with the spare cycles, but... maybe not the best use of power.
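Quick back-of-the-envelope on what that draw actually costs, with made-up numbers: 75W as the midpoint of that idle range, and a stand-in $0.30/kWh rate (plug in your own tariff):

```python
# Rough annual cost of leaving the old PC on 24/7.
# Both inputs are assumptions: 75 W average draw (midpoint of the
# 50-100 W idle range above) and a stand-in $0.30/kWh tariff.
watts = 75
rate_per_kwh = 0.30

kwh_per_year = watts / 1000 * 24 * 365    # ~657 kWh
cost_per_year = kwh_per_year * rate_per_kwh

print(f"{kwh_per_year:.0f} kWh/year, ~${cost_per_year:.0f}/year")
# -> 657 kWh/year, ~$197/year
```

Call it a couple hundred a year just to keep the thing humming, before it does anything useful.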
I am genuinely considering turning it into a backup box that powers on under automation, runs a backup, and shuts itself down on completion. That's feasible and would do quite well, as opposed to paying for a dedicated backup unit.
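For the curious, a minimal sketch of that automation, assuming the box has Wake-on-LAN enabled in firmware and SSH access set up; the MAC address, hostname, and paths below are all placeholders, not a real setup:

```python
#!/usr/bin/env python3
"""Wake the backup box, push a backup to it, power it back off.
All identifiers below are placeholders for illustration."""
import socket
import subprocess
import time

MAC = "aa:bb:cc:dd:ee:ff"       # placeholder: backup box's MAC
HOST = "backup-box.lan"         # placeholder: its hostname
SRC = "/volume1/data/"          # placeholder: data to back up
DEST = f"{HOST}:/backups/nas/"  # placeholder: destination path

def wake(mac: str) -> None:
    """Broadcast a standard Wake-on-LAN magic packet (6x 0xFF + MAC x16)."""
    payload = bytes.fromhex("ff" * 6 + mac.replace(":", "") * 16)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(payload, ("255.255.255.255", 9))

def wait_for_ssh(host: str, timeout: int = 300) -> None:
    """Poll port 22 until the box answers or the timeout runs out."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, 22), timeout=5):
                return
        except OSError:
            time.sleep(10)
    raise TimeoutError(f"{host} never came up")

wake(MAC)
wait_for_ssh(HOST)
# Incremental copy, then tell the box to turn itself back off.
subprocess.run(["rsync", "-a", "--delete", SRC, DEST], check=True)
subprocess.run(["ssh", HOST, "sudo", "poweroff"], check=True)
```

Hang that off a cron job or systemd timer on the always-on machine and the old PC only draws power for as long as the backup actually takes.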
I had that laptop before I tried to move it to Linux and I'm not buying a new one. It does work under Windows.
This is not my laptop not supporting Linux, this is Linux not supporting my laptop. Because I already own the laptop. If people weren't trying to cheerlead for their preferred OS for other reasons than... you know, whether it's good or not, this wouldn't even be a discussion. In fact, half the "Windows sucks" angles these days are down to "Windows 11 doesn't support specific pieces of pre-existing hardware". Which, you know, is the exact problem I'm having here.
Now, would ASUS finally paying attention to the ecosystem make it easier for a whole bunch of people to move over? Sure. Of course. But that doesn't contradict my previous statements.
I have an ASUS laptop that maps its multiple speakers incorrectly under Linux, it's been killing me for months and I'm now considering it. I was not prepared for the realization that the Linux path forward would be to just pay by the bug fix.