For perspective, this is 4% of the revenue they pretend they'll make. The trillion-dollar fantasy is obviously not happening - but if even an embarrassing sliver of that happens for real, this was just an investment.
The haters are so mad you got the robot to do some of the things it's for. Anything beyond 'I click the button and it draws a pretty lady! I'm a artist!' really fucks with their absolutism. That's a threat to ingroup solidarity for the ones who've made opposition part of their identity.
Text generation is the least you can do. You can still fire up Photoshop and feed in a half-finished image. Diffusion turns whatever you have into whatever you describe. If it does decent scratches on metal, but won't put them exactly where you want, then select them and move them, and the robot will smooth it over.
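That select-it, move-it, let-the-robot-heal-the-seam workflow bottoms out in masked compositing. A toy numpy sketch of just the compositing half - no actual diffusion model, a feathered mask standing in for the "smooth it over" step, and the function name is mine:

```python
import numpy as np

def feathered_paste(base, patch, top, left, feather=3):
    """Paste `patch` into `base` at (top, left), blending the edges.

    A crude stand-in for what an inpainting model does better: the
    mask ramps from 0 to 1 over `feather` pixels so the seam isn't a
    hard edge. Grayscale float images in [0, 1].
    """
    h, w = patch.shape
    # Distance from each pixel to the nearest patch edge, scaled so
    # the mask fades in over `feather` pixels on every side.
    ramp_y = np.minimum(np.arange(h), np.arange(h)[::-1]) / feather
    ramp_x = np.minimum(np.arange(w), np.arange(w)[::-1]) / feather
    mask = np.clip(np.minimum.outer(ramp_y, ramp_x), 0.0, 1.0)

    out = base.copy()
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = mask * patch + (1 - mask) * region
    return out
```

A real inpainting pass replaces the linear blend with a model resynthesizing the masked region, but the select-and-composite plumbing around it looks like this.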
The very first article I read about Stable Diffusion, three years ago, had the author doodling mountains and flipping a spaceship. All the image-to-video stuff demands you provide art, as an input. Prompts alone are just a tech demo gone feral.
Frustrating part A is that we have a universal binary format... and it's HTML5. Frustrating part B is that nobody with a purchasing department wants to admit it. Slack ships with its own browser like you don't have one. Modern web games can run on a sufficiently fancy Amiga, yet there have been Electron apps without a Linux version. That Amiga's gonna suffer overhead and unstable performance, but I mean, so do native Unreal 5 games.
The good ending from here would be a period of buck-wild development. RISC-V, MIPS, finally doing that guy's Mill CPU. I was gonna say that neural networks might finally get high parallelism taken seriously, but no, optimized matrix algebra will stay relegated to specialty hardware. Somewhere between a GPU and an FPU. There's server chips with a hundred cores and it still hasn't revived Tilera. They're just running more stuff, at normal speed.
The few things that need to happen quickly instead of a lot will probably push FPGAs toward the mainstream. The finance-bro firehose of money barely splashed it, when high-frequency trading was the hot new thing. Oh yeah: I guess some exchanges put in a few hundred entire microseconds of coiled fiber, to keep the market comprehensible. Anyway, big FPGAs at sane prices would be great for experimentation, as the hardware market splinters into anything with an LLVM back-end. Also nice for anything you need to happen a zillion times a second on one AA battery, but neural networks will probably cover that as well, anywhere accuracy is negotiable.
Sheer quantity of memory will be a deciding factor for a while. Phones and laptops put us in a weird place where 6 GB was considered plenty, for over a decade. DRAM sucks battery and SRAM is priced like it's hand-etched by artisanal craftsmen. Now this AI summer has produced guides like 'If you only have 96 GB of VRAM, set it to FP8. Peasant.' Then again - with SSDs, maybe anything that's not state is just cache. Occasionally your program hitches for an entire millisecond. Even a spinning disk makes a terabyte of swap dirrrt cheap. That and patience will run any damn thing.
This was inevitable once ports looked the same and ran the same. Doubling your customer base, without developing the whole game twice? Obvious choice for any third party. First-party developers have taken longer, because their parent companies primarily own them to promote a hardware business. Microsoft's hardware business has become vestigial. It always was, to some extent; the Xbox project was a 1990s scheme to PC-ify the console market. It worked.
Consoles don't exist anymore. Do you want the green AMD laptop, or the blue AMD laptop? Even Nintendo rebadged an Android tablet. You can release some crazy new hardware unlike anything else, but the only third-party games will be multiplatform hits that run like garbage. Like on early PS3. The Helldivers 2 PSN fiasco sure looks like Sony found out how profitable they'd be as just another publisher and the answer scared the shit out of them. Without that service, they don't have a platform, anymore. They sell a popular model of an IBM compatible. Asterisk on the compatible.
Nintendo can get away with that shit forever, because they own Pokemon. I don't know how much longer you can cosplay that sort of first-party importance, on the strength of Horizon and... Death Stranding.
Nobody can "get" exclusives. They don't exist anymore. There's first-party games - there's games the first party funded into existence - and everything else runs on whatever customers have.
Developers want to sell games... to people. Hardware is an obstacle. They don't want to care which color of deliberately incompatible generic computer you bought. Shipping five near-identical versions of the same damn game is a button they push in Unreal 5, and that's why they put up with Unreal 5.
Fortunately a lot of early Windows shit runs in Wine, since the most stable Linux API is Win32. Anything older than that either works in 86Box or was broken to begin with. Okay, that's not fair - WineVDM is necessary to bridge the gap for the dozen Windows 3.1 programs that matter. I am never allowed to write those off when one of them is Castle Of The Winds.
What Intel learned with Itanium is that compatibility is god. They thought their big thing was good chip design and modern foundries. They were stupid. AMD understood that what kept Intel relevant was last year's software running better this year. This was evident back in the 486 days, when AMD was kicking their ass in cycles per operation - hard enough that network benchmarks finished in under a millisecond and started throwing division-by-zero errors in their own timing code.
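That bug class is easy to reproduce: any benchmark that divides work done by elapsed ticks blows up the moment the hardware outruns the clock's resolution. A sketch - integer milliseconds standing in for the coarse timers of the era, function names mine:

```python
def ops_per_ms_naive(ops_done, elapsed_ms):
    # The era's bug: on a fast enough chip the whole benchmark
    # finishes inside one clock tick, elapsed_ms reads 0, and the
    # division faults.
    return ops_done // elapsed_ms

def ops_per_ms_safe(ops_done, elapsed_ms):
    # One obvious fix: clamp elapsed time to the timer's resolution
    # (here, 1 ms), accepting some inaccuracy instead of a crash.
    return ops_done // max(elapsed_ms, 1)

# Yesterday's hardware: fine either way.
assert ops_per_ms_naive(10_000, 5) == 2_000

# Faster hardware: the naive version dies.
try:
    ops_per_ms_naive(10_000, 0)
except ZeroDivisionError:
    pass  # exactly the failure those old benchmarks hit

assert ops_per_ms_safe(10_000, 0) == 10_000
```

The deeper fix is to time a fixed wall-clock window and count iterations, but plenty of shipped software chose the clamp.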
But software has won.
The open architecture of RISC-V is feasible mostly because architecture doesn't fucking matter. People are running Steam on their goddamn phones. It's not because ARM is amazing; it's because machine code is irrelevant. Intermediate formats can be forced upon even proprietary native programs. Macs get one last gasp of custom bullshit, with Metal just barely predating Vulkan, and if they try anything unique after that then it's a deliberate waste of everyone's time. We are entering an era where all software for major platforms should Just Work.
I mean... I've had a lot of genuine interactions with real people which I would not rate highly.
A sort of reddit holodeck would be obviously desirable, sometimes, but any concerns about solipsism or narcissism come far behind the expectation you'd want to run that shit locally. What is the value of disappearing into your own little world if you don't even control it?
Ehhh. The pattern of 'put me in charge because powerless weaklings will ruin civilization if we don't kill them first' is at least as old as Rome's Social War, c. 91 BC. Rome's Italian allies demanded equal citizenship, and the Republic went to war rather than grant it - then granted it anyway to end the fighting.
"Just" read documentation, says someone assuming past documentation is accurate, comprehensible, and relevant.
I taught myself QBASIC from the help files. I still found Open Watcom's documentation frankly terrible, bordering on useless. There's comments in the original Doom source code lamenting how shite the dead-tree books were.
Octopussen.