Software path-tracing has been on my bucket list, mostly to test a concept: physically based instant radiosity. If an eye-ray goes camera, A, B, C, then the light C->B forms an anisotropic point source. The material at B scatters light from C directly into onscreen geometry. This allows cheating akin to photon mapping, where you assume nearby pixels are also visible to B. Low-frequency lighting should look decent at much less than one sample per pixel.
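A minimal sketch of the idea, not a working renderer: treat the second bounce vertex B as a virtual point light carrying whatever arrived from C, then shade nearby on-screen points from it while assuming visibility (the photon-mapping-style cheat). The class and function names, the Lambertian BRDF, and numpy are all my own assumptions for illustration - with a diffuse BRDF the "point source" is only cosine-shaped; a glossy BRDF at B is what makes it properly anisotropic.

```python
# Sketch: build a virtual point light (VPL) at eye-path vertex B and reuse it
# to light nearby visible points. All names here are illustrative assumptions.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def lambertian_brdf(albedo):
    """Constant diffuse BRDF: albedo / pi, independent of directions."""
    return albedo / np.pi

class VirtualPointLight:
    """Point source at B carrying the light that arrived along C -> B.

    Its output is direction-dependent because it's shaped by B's BRDF and
    surface normal - even a diffuse BRDF gives a cosine lobe around the normal.
    """
    def __init__(self, position_b, normal_b, albedo_b, radiance_from_c, dir_b_to_c):
        self.p = position_b
        self.n = normal_b
        self.albedo = albedo_b
        self.L_in = radiance_from_c           # radiance arriving along C -> B
        self.wi = dir_b_to_c                  # unit vector from B toward C

    def emitted_toward(self, point):
        """Radiant intensity the VPL sends toward `point` (before 1/r^2 falloff)."""
        wo = normalize(point - self.p)        # direction B -> shaded point
        cos_out = max(np.dot(self.n, wo), 0.0)
        cos_in = max(np.dot(self.n, self.wi), 0.0)
        return lambertian_brdf(self.albedo) * self.L_in * cos_in * cos_out

def shade_from_vpl(point, normal, albedo, vpl):
    """Direct lighting at a visible surface point from one VPL.

    The cheat: visibility between the VPL and nearby pixels is assumed,
    much like photon mapping reuses nearby photons without re-tracing.
    """
    to_vpl = vpl.p - point
    dist2 = np.dot(to_vpl, to_vpl)
    wi = to_vpl / np.sqrt(dist2)
    cos_p = max(np.dot(normal, wi), 0.0)
    return lambertian_brdf(albedo) * vpl.emitted_toward(point) * cos_p / dist2

if __name__ == "__main__":
    # Eye path camera -> A -> B -> C: B receives light from C and becomes a VPL.
    vpl = VirtualPointLight(
        position_b=np.array([0.0, 1.0, 0.0]),
        normal_b=np.array([0.0, -1.0, 0.0]),
        albedo_b=np.array([0.8, 0.8, 0.8]),
        radiance_from_c=np.array([5.0, 5.0, 5.0]),
        dir_b_to_c=normalize(np.array([1.0, -1.0, 0.0])),
    )
    # Reuse the same VPL for a neighboring on-screen point near A.
    print(shade_from_vpl(np.array([0.2, 0.0, 0.1]),
                         np.array([0.0, 1.0, 0.0]),
                         np.array([0.5, 0.5, 0.5]), vpl))
```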
Sony pulls this shit with every new PlayStation, through the ingenious and difficult process of not making enough. "PS3 sold out at launch! New shipment sold out again! And again!" Meanwhile they'd moved fewer total units than the 360 in the same timeframe, but Microsoft made one big shipment instead of three small ones.
Copyright made sense when it was a decade or two. Industrial patents seem basically functional. Trademark's mostly truth-in-advertising for consumer choice.
But software patents aren't about how you do something - they claim the entire concept, in the broadest possible terms, and kill it. Straight-up murder that potential. The concept is denied the iterative competition that turns dogshit first implementations into must-have features, and by the time the patent expires in twenty years, nobody's gonna care.
Entire hardware form-factors have come and gone in a single decade. Can you imagine if swipe keyboards were still single-vendor, and still worked like they did in 2009? Or if Apple had bought them, and endlessly bragged about how Android can't do [blank], because fifty thousand dollars changed hands in the 3G era?
How many games would not exist, if Nintendo had decided they own sidescrollers? A whole genre, wiped out, because a piece of paper says those mechanics are theft.
Yeesh. Dunno if that's the right move, versus grabbing the cash for what is unmistakably a cash grab.
Like hey, congrats on clearing the rights and all, and it looks like you've somehow coerced the original engine into something like modernity, but how in god's name did you make the ugliest Unreal 1 game feel uglier?
D'aww, that's downright charming. Like when viruses would explode across whole networks just to randomly toggle capslock.
You couldn't make the movie Hackers today. Not because it's a time capsule of commercialized cyberpunk cheese, from an era where neither audiences nor screenwriters actually understood computers... but because when the protagonist coldly brags 'I'm a cybercriminal,' grandma in the front row would yell "Hang him by the nuts!"
AI is only useful right now as a stress test: it reveals how hollow adolescent work has become. If it pushes schools toward offering work with relevance, impact, and agency, and away from hopeless busywork (“When will I ever use this?”), that is a win.
Vile misunderstanding of why we have kids practice what they're learning. Math is trivial to do with a calculator, but learning how to use, y'know, numbers, involves a lot of repetitious demonstration. You will obviously need skills to read and process boring minutiae or infuriating nonsense. The fact we've calculator-ized reading a page of text is not some damning critique of essay questions for reading comprehension. No more than 'a train leaving Chicago at 8:30--' can be answered by looking up the timetable. We need proof that people know what concepts mean.
A person who only learned math with a calculator can only do math with a calculator. What do you think is the real impact on a person who only learned to think with a chatbot?
If AI doesn't make you more productive you're using it wrong, end of story.
It's refreshing to see people acknowledge, "holy shit, the robot can do what?" when Lemmy feels dogmatically opposed to demonstrable results. (For diffusion tools, detractors are reduced to "well that doesn't count.") But the robot's still dumb as hell. A guy claiming to run Claude in a while-yes loop to create a new programming language named his script after Ralph Wiggum. And he still had to git-revert every so often.
If you ask it to do something, the machine will bumble onward, but you can't trust it with any damn thing.
Turning non-programmers into programmers is my life’s work. It is the thing of which I am most proud as a college professor.
Getting non-programmers to program, without becoming programmers, is an important branch of computer science which this early paragraph obscures. Using a computer should not take a college education. That includes making your own programs, for whatever you want the computer to do.
This is the stated purpose of several important programming languages. It is the de facto purpose of several more. BASIC is rightly mocked, but any child who reads '10 print "butts"; 20 goto 10' instantly understands control flow. Python's visible indentation is its structure... with an invisible invitation to the tabs/spaces holy war. (It's what tabs are for, god dammit!) The languages mathematicians use are so cursed with their brain-patterns that some of them begin arrays at 1.
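For comparison, the same endless loop as that BASIC two-liner in Python, where the structure is literally whatever you can see indented under the `while` line:

```python
# Same loop as '10 print "butts"; 20 goto 10' - the body is just
# whatever sits indented beneath the while.
while True:
    print("butts")
```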
Anyway.
Someone once realized you can open executables in Notepad, and tried saving 'show a blue circle bouncing on a black screen' as myprogram.exe. Vibe coding damn near makes that work. It's not especially good at programming... but it's better than a novice. And it gives novices results they can tweak and learn from, unlike 'code-free' approaches.
If you want to know what a compiler is doing, write an NES game, because cc65 is blazing fast and dumb as hell. A glorified assembler macro. If you want to know what an assembler is doing, fire up an Apple I emulator and key something in through Wozmon in bare 6502 machine code. I've done both things - and I'd never say either is necessary to start people writing programs. We shouldn't lament people glancing at generated code and saying 'that looks about right,' when we don't do even that, for generated machine code.
The author does full-throatedly endorse the tech for the thing it does... eventually. But I'm tired of the million ways people say 'oh it's so terrible... anyway here's what I use it for.' They're seeing the jack-of-all-trades robot as a lazy novice within their field of expertise, and then letting it challenge them in all the fields where they are themselves a lazy novice, and they refuse to square these into a consistent judgement of its utility.
Why not have it just write C? Or hell, why not x86 assembly?
Human readability is the only way to check when the LLM is wrong, versus when your instructions are wrong. Even cc65 has a few bizarre edge cases, and that's a script collection written by experts for exactly one architecture. An LLM as a compiler has some small chance of switching from x86-64 to knock-knock jokes. The closer the output stays to human-readable, the more chance that a human will actually read it. And once you're in that habit you might as well fix little things yourself. Or at least write better punchlines.