It really comes down to what kind of speed you want. You can run some LLMs on older hardware "just fine," and many models will run without a dedicated GPU at all. The problem is that the time it takes to generate responses gets to be crazy.
I ran DeepSeek on an old R410 for shits and giggles a while back, and it worked. It just took multiple minutes to actually give me a complete response.
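To put rough numbers on why CPU-only boxes feel that slow, here's a back-of-envelope sketch. The tokens-per-second figures are illustrative assumptions, not benchmarks of any particular model or machine:

```python
# Rough back-of-envelope: why CPU-only LLM inference feels slow.
# Rates below are illustrative assumptions, not measured benchmarks.

def response_time_seconds(response_tokens: float, tokens_per_second: float) -> float:
    """Time to stream a full response at a given generation rate."""
    return response_tokens / tokens_per_second

# A midrange GPU might stream a few dozen tokens per second;
# an old CPU-only server might manage only one or two.
gpu_time = response_time_seconds(500, 40)   # ~12.5 seconds
cpu_time = response_time_seconds(500, 1.5)  # ~333 seconds, over 5 minutes

print(f"GPU-ish: {gpu_time:.0f} s, old CPU: {cpu_time / 60:.1f} min")
```

Same 500-token answer, wildly different wait: seconds on a GPU, "go make a coffee" territory on an old server.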
Y’know, I read that headline as a Canadian and the first thing that came to mind was why your primary language would affect the levels of cadmium in your food.
I… I don’t think it’s children creating deepfake nudes of people and posting them to onlyfans.
I do get your point that most of the time people aren’t training a model on a plethora of images, but it isn’t that difficult a thing to do. It’s more complicated than asking Grok to “take this single shot and make them naked,” but it’s something you can figure out in under a day of research if your mind is set to it.
I’m not sure I get what your comment is referencing. If you feed enough data into an “ai” meant to generate lifelike images, you’re going to get a relatively close approximation of the person. Will it be exact? Absolutely not. But again, it will be close enough to put your job in danger in many situations.
I feel the need to point out that enough shots from enough angles in anything other than multiple layers or sweats is going to essentially result in an “xray” effect. Yeah, it won’t know the exact hue of your nipples, or the precise angle of your dangle, but it’s going to be close enough to end a career.
You realize the microplastics are still released from a capped landfill, right?…
I’m not necessarily saying this is a better option, but you’re talking like a capped landfill is the best solution out there and hand-waving away any argument against it.
There’s also the problem of landfills taking decades to fill, with UV breaking down the topmost layer, heat breaking down internal layers, and water flowing through (and needing to be treated in the best case scenario).
Whelp, skipping this one now. Thank you for bringing that up.