Abstraction is not very compatible with concurrency, so as well as your beautiful abstract API, you also need some 'cut through the layers' functions to return the underlying classes you need to synchronise on. Now you have a right mess that's incredibly hard to understand, infuriating to debug, and impossible to refactor. Best you can do is put another layer of abstraction on top. Repeat every six months.
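Something like this little sketch (Python, all names invented for illustration) - a tidy wrapper that ends up having to expose its own lock so callers can do compound operations atomically, at which point the abstraction is mostly decorative:

```python
import threading
from collections import deque

class JobQueue:
    """The nice abstract API: callers just submit() and take()."""

    def __init__(self):
        self._items = deque()
        self._lock = threading.Lock()

    def submit(self, job):
        with self._lock:
            self._items.append(job)

    def take(self):
        with self._lock:
            return self._items.popleft() if self._items else None

    # The 'cut through the layers' function: callers who need
    # multi-step operations to be atomic have to grab the internal
    # lock, and now every layer above this one knows it exists.
    def unsafe_lock(self):
        return self._lock


q = JobQueue()
with q.unsafe_lock():
    # A peek-then-modify has to happen under the same lock, so the
    # caller reaches straight past the 'abstract' API into the deque.
    if q._items and q._items[0].get("priority") == "low":
        q._items.popleft()
```

And once two or three layers each grow their own unsafe_lock(), working out which lock actually protects what is where the refactoring goes to die.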
5G is for spreading the woke gay mind virus. Collecting all of your personal information is the Jewish space lasers. Fortunately, a tinfoil hat stops both.
Google Stadia wasn't exactly a resounding success...
From a previous job in hydraulics, the computational fluid dynamics / finite element analysis that we used to do would eat all your compute resource and ask for more. Split your design into tiny cubes, simulate all the flow / mass balance / temperature exchange / material stress calculations for each one, gain an understanding of how the part would perform in the real world. Very easily parallelizable, a great fit for GPU calculation. However, it's a 'hundreds of millions of dollars' industry, and the AI bubble is currently 'tens of trillions' deep.
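For anyone who hasn't seen that kind of workload, the per-cell update looks roughly like the toy sketch below (a heat-diffusion step in Python/NumPy, not our actual solver): each cell's next value depends only on its neighbours, so millions of cells can be updated independently, which is exactly what a GPU is built for.

```python
import numpy as np

def diffusion_step(temp, alpha=0.1):
    """One explicit time step of heat diffusion on a 3D grid of cells.

    Each cell's new temperature depends only on its six neighbours,
    so every cell can be computed independently - trivially parallel.
    """
    t = temp
    lap = (
        np.roll(t, 1, 0) + np.roll(t, -1, 0) +
        np.roll(t, 1, 1) + np.roll(t, -1, 1) +
        np.roll(t, 1, 2) + np.roll(t, -1, 2) -
        6.0 * t
    )
    return t + alpha * lap

# A part split into a million tiny cubes, with one hot corner
grid = np.zeros((100, 100, 100))
grid[0, 0, 0] = 1000.0
for _ in range(50):
    grid = diffusion_step(grid)
```

Libraries like CuPy mirror the NumPy array API closely enough that this style of kernel ports to the GPU almost verbatim, which is why the workload maps onto the hardware so nicely.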
Yes, they can be used for other tasks. But we've just no use for the amount that's been purchased - there's tens of thousands of times as much as makes any sense.
We've had multiple instances of AI slop being automatically released to production without any human review, and some of our customers are very angry about broken workflows and downtime, and the execs are still all-in on it. Maybe the tune is changing to, "well, maybe we should have some guardrails", but very slowly.
Unfortunately, server RAM and GPUs aren't compatible with desktops. Also, Nvidia have committed to releasing a new GPU every year, making the existing ones worth much less. So unless you're planning to build your own data centre with slightly out-of-date gear - which would be folly, since the existing ones will be desperate to recoup their investment and will be selling capacity cheap - it's all just destined to become a mountain of e-waste.
100% of supercomputers, 80% of mobile devices (as Android), 4 or 5% of desktops depending on whether you count ChromeOS. Desktop share is a few percent higher if you just count gaming PCs, eg. the Steam survey, since it's more widely used at home than on business machines.
The rate of adoption is accelerating, too - slowly but steadily.
Oof. The second-hand market is full of stuff that businesses are throwing out since they won't run Win11, but which run Linux perfectly well. I've just recently replaced my NAS / home server with a £20 core i5 mini-PC that if anything is a bit overpowered for the job. Runs Mint desktop very nicely.
I'd imagine that if you're spending a hundred times as much, then you don't just have "web and office" in mind, though...
Given that all lottery draws are equally likely, you'd want your own to be numbers that no-one else has chosen, so that if you win you won't have to share it so many ways. Don't choose numbers that are birthdays, for instance. But "generate six numbers completely at random" is such a bad ask for an LLM that I can't even - it's likely to pick ones it's seen before in its training data, which is the worst possible selection.
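For contrast, actually generating six numbers at random is a few lines of code; a sketch in Python, assuming a 6-from-59 draw (adjust to your lottery):

```python
import secrets

def lottery_pick(count=6, highest=59):
    """Draw `count` distinct numbers from 1..highest with a proper RNG.

    Every combination is equally likely - no bias towards 'memorable'
    numbers, birthdays, or anything seen in training data.
    """
    pool = list(range(1, highest + 1))
    picks = []
    for _ in range(count):
        picks.append(pool.pop(secrets.randbelow(len(pool))))
    return sorted(picks)

print(lottery_pick())
```

random.sample(range(1, 60), 6) does much the same with the standard PRNG, if you don't care about the cryptographic bit.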
As a "caps lock is another control" enjoyer, I know that pain. Don't need to take your fingers off the home keys to type ^[ , whereas the proper escape key is a bit of a stretch.
Well, having not played the Xbox version... ;-) Once you've got it running, it remains one of the finest games of all time.
Getting it running is the real sands of time, tho. It has a particular hatred of multi-core CPUs, requires a graphics card that supports hardware transform & lighting but also truly ancient versions of DirectX, and is obstinately not-widescreen. You'll be wanting a fan patch; last time I tried one, it was a bit of a crash-fest (it wasn't, back in the day) and some of the SFX looked plain wrong.
Graphics still held up perfectly - the art style is very strong - and the story remains charming. All I wanted from a remake was the damned thing to start up in a modern screen resolution, and it seems they've managed to spend years on it without even managing that.
"If you make noise in real life, then the alien will hear you in game.". As if A:I needed to be any more terrifying than it is.
Still - it's a very expensive bit of hardware to implement the microphone feature that eg. the Famicom had, and the 'tracking' functionality only benefits a couple of games. Bizarre decision to make it mandatory as part of the console.
AmigaOS is still available and able to run all your favourite Linux applications as well as 'classic Amiga software', except of course it requires you to be running a PPC processor. Plus it costs money. So you'd have to invest £lots in 'most of a new PC' to see whether it even works for you.
Now, if we could open-source it and get it running on x64, I'd love to be running workbench again. It was ahead of its time.
Saw them at a festival a couple of years back. They know they're a bit cheesy and play into it, but they're a tight band and can still smash out all their hits.
Now, the fact that the festival could hardly afford anyone else because Filth were headlining, that was a problem, but it did mean that a few lesser-known bands got to play a decent set, so it's all good I suppose.
What is my cat doing on your mat, when she has a perfectly good beanbag to stretch out on? At least I know what she's up to when she sneaks outside, now...
They've a lot of canals, the ladders are custom, they'll need to be coated to stop them corroding, and that'll be the installed price - so that's a small team driving round, barriering off bits of the canal while the work is done.
If anything, seems cheap for a council job. My town would probably spend ten times that on the desk study to decide where they'll go and to get the paperwork together.
Dark Souls 3 is a great game to play at SL1. You've got quite a selection of weapons and armour that you can equip, plus one spell, so it's a bit of a puzzler to find optimum combinations of stuff to beat all the bosses.
Dark Souls 1 is okay to play at SL1. You're limited to being a pyromancer, with a good selection of flame spells to cast, but the weapons you can actually wield have fairly boring movesets, and you'll be doing a lot of running back to Blighttown to get pyromancies and level up your flame.
Dark Souls 2 is goddamned brutal to play at SL1. Your dodging is tied to your agility, which means you're a sitting duck until you get some stat boosting gear. Start the game by murdering Cale for his hat of +3 dexterity, grab the work hook and the ladle to swap out in your off-hand for their small stat boosts, and get yourself to Tseldora to grind the peasant set for its small adaptability bonus. I hope you're good at beating end-game bosses with a rapier, no shield, and bad rolls - maximum four in a row due to your low stamina, which makes throne watcher / defender hellish.
Scholar obviously has all of the pain of 2, plus you can't rush into the DLC areas for their high-powered rings. By the time you get the ring of the embedded for its massive SL1 stat boost, you'll have most certainly earned it.
Yes, I did play through all four at SL1 in preparation for the release of Elden Ring. DS3 is fun at SL1, but I also do not recommend the others to anyone. Elden Ring is quite good at RL1 - it still allows some quite varied builds, and it forces you to learn the bosses rather than just "DPS race" them like you do normally.