

assuming you recently took a warm shower, maybe letting cold water run a little bit might help? cold water might cause the plug to shrink a bit and be easier to release
ah yes, voting, famously known for being the only way humans have ever changed their material circumstances.
now that voting has failed us, the only thing left to do is nothing!
“America in 2016 was perfect until orange man ruined it, we simply need to return to 2016 and everything will be ok”
a surprisingly disappointing article from ars, i expect better from them.
the author appears to be confusing “relay attacks” with “cloning” and doesn’t really explain the flow of the attack that well.
really this just sounds like a complicated MitM attack, using the victim’s phone as the “middle” component between the victim’s physical card and the attacker’s rooted phone.
the whole “cloning the UID” attack at the end of the article is irrelevant; NFC payment cards don’t authenticate with a static UID, they use dynamic cryptograms, so a copied UID gets you nothing.
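to make the “phone as the middle component” point concrete, here’s a minimal sketch of that relay structure, with plain TCP sockets standing in for the two NFC links (the victim’s phone plays the role of this relay, shuttling APDUs between the physical card and the attacker’s device). All names and ports here are illustrative, not anything from the actual malware:

```python
# Sketch of a relay ("MitM") that splices two connections together, assuming
# TCP sockets stand in for the NFC links. Ports and names are made up.
import socket
import threading

def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until the source side closes."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)  # signal end-of-stream to the other side
    except OSError:
        pass  # peer may already be gone

def run_relay(listen_port: int, card_host: str, card_port: int) -> None:
    """Accept one 'attacker terminal' connection and splice it to the 'card'."""
    with socket.socket() as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", listen_port))
        srv.listen(1)
        terminal, _ = srv.accept()
        card = socket.create_connection((card_host, card_port))
        with terminal, card:
            a = threading.Thread(target=pump, args=(terminal, card))
            b = threading.Thread(target=pump, args=(card, terminal))
            a.start(); b.start()
            a.join(); b.join()
```

the relay never needs to understand or break the crypto, it just forwards traffic in both directions, which is exactly why it’s a relay attack and not cloning.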
20 years? more like 5
if you’re talking about that recent pic of him floating around with a chain and a bread, that was an AI-doctored photo
you’re so close. why exactly do you think people are using it for all these things it’s not meant for?
because every company, every CEO, every VP, is pushing every sector of their companies to adopt AI no matter what.
most actual people understand the limitations you list, but it’s the capitalists at the table that are making AI show up where it’s not wanted
TLS doesn’t encrypt the hostname of the URLs you visit (it’s sent in cleartext in the SNI field of the handshake, unless ECH is in use), and DNS traffic is insanely easy to sniff even if you aren’t using your ISP’s resolver.
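you can see the DNS half of this for yourself: a standard DNS query is just a small plaintext datagram with the hostname sitting in it as length-prefixed labels. Here’s a quick hand-built query (header fields per RFC 1035; “example.com” is just a stand-in name):

```python
# Demonstration that an ordinary DNS query carries the hostname in plaintext.
import struct

def build_dns_query(hostname: str) -> bytes:
    header = struct.pack(">HHHHHH",
                         0x1234,       # transaction ID (arbitrary)
                         0x0100,       # flags: standard query, recursion desired
                         1, 0, 0, 0)   # 1 question, no answer/authority/additional
    # QNAME: each label is length-prefixed, terminated by a zero byte
    qname = b"".join(bytes([len(p)]) + p.encode() for p in hostname.split(".")) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("example.com")
# the labels appear verbatim in the datagram, no encryption anywhere
print(b"example" in packet and b"com" in packet)  # True
```

anyone on the path (or running tcpdump on your network) sees exactly those bytes, which is the whole motivation for DoH/DoT.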
babe wake up, a new bone-apple-tea just dropped
just a guess, but for an LLM to generate or draw anything it needs source material in the form of training data. For copyrighted characters that would mean OpenAI knowingly feeding copyrighted images into its model, which would likely open them up to legal action.
open source software getting backdoored by nefarious committers is not a point in favor of closed source software in any way. the backdoor was discovered by a Microsoft employee because of its effect on CPU usage and the faults it introduced in valgrind, neither of which required the source to discover.
the only thing this proves is that you should never fully trust any external dependencies.
yeah silly me for supporting artists with my money but also downloading drm-free copies of things so I can actually exercise a semblance of ownership. but sure, keelhaul me so you can keep your sense of smug superiority.
AI is a tool that is fundamentally based on the concept of theft and plagiarism. The LLM training data comes from artists and creators that did not consent to their work being plagiarized by a hallucinating machine.
a decentralized community that correctly prioritizes security would absolutely be using signed commits and other web-of-trust security practices to prevent this sort of problem
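for anyone unfamiliar, the signed-commit workflow being pointed at looks roughly like this; key IDs and messages below are placeholders, not anything from a real project:

```shell
# Sketch of a GPG-signed commit/tag workflow (placeholders throughout).
git config --global user.signingkey <your-key-id>   # point git at your signing key
git config --global commit.gpgsign true             # sign every commit by default

git commit -S -m "fix: bounds check in parser"      # explicitly signed commit
git verify-commit HEAD                              # verify the signature locally
git log --show-signature -1                         # show signature status in the log

git tag -s v1.2.3 -m "release v1.2.3"               # signed release tag
git verify-tag v1.2.3
```

signing alone doesn’t help unless maintainers actually verify keys against a web of trust and reject unsigned or unverifiable commits, which is the part most projects skip.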
you understand there’s more than one way to have an economy, right? that there’s more than one way for labor to be rewarded for its output?
saying “our economic system needs to end” has nothing to do with what you wrote
they wrote two documents, held two meetings, and moved a task to the “started” column?
we’re cooked bro