Oh ok, because in the country + area that YOU live in there's a wealth of printers you can apparently find in dumpsters, and because YOU have both the physical ability and time available to go diving in dumpsters, THE ENTIRE WORLD does not need a new printer??
I don't really think the world is free from ever needing a new printer; what kind of take is that? This project promises 3D-printable components that run on a Pi? Exactly how does that contribute to global e-waste, if that's your point? PLA plastic biodegrades, Pis can be reused for other stuff, and inkjet cartridges can be refilled. What possible issue could you find with a project that's open to the public and that you can construct yourself??
I was curious, so I looked up the public records from the criminal court of Connecticut. Looks like the next court date is in March; not much has happened besides the crazy stepmother continually requesting that she be informed of the location and new alias of the person she (allegedly) abused.
[A trans person] joined the Discord server and made a big deal out of their pronouns […] because they put their pronouns in their nickname and made a big deal out of them because people were referring to them as “he” [misgendering them], which, on the Internet, let’s be real, is the default. And so, one of the moderators changed the pronouns in their nickname to “who/cares”. […] Let’s be real, this isn’t like, calling someone the N-word or something.
I hate Jeeps as much as the next guy, but glass-transmission Subarus being at the top of the list for reliability, as well as the fact that the rating is based on "user predicted reliability", tells me everything I need to know about this list.
I've never personally had issues with 8x7b refusing requests, but I guess I haven't really plumbed the depths of what it might agree or disagree to. I have run it through the ordinary gamut (pretend to be public figure x, make dangerous item y, say untrue thing about company z) and it hasn't given me any problems, but sure, whatever works for you.
Actually, that's not 100% true: you can offload a portion of the model into RAM to save VRAM, so you can skip buying a crazy GPU and still run a decent model; it just takes a bit longer. I personally can wait a minute for a detailed answer instead of needing it in 5 seconds, but of course YMMV.
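To make the trade-off concrete, here's a rough back-of-the-envelope sketch of the split. The layer count, per-layer size, and VRAM budget below are made-up illustrative numbers, not measurements from any particular model; the idea mirrors how runners like llama.cpp let you choose how many layers stay on the GPU while the rest spill to system RAM.

```python
def plan_offload(total_layers: int, layer_size_gb: float, vram_budget_gb: float) -> dict:
    """Estimate how a model splits between GPU VRAM and system RAM.

    Keeps as many layers on the GPU as the VRAM budget allows;
    the remainder are offloaded to (slower) system RAM.
    """
    gpu_layers = min(total_layers, int(vram_budget_gb // layer_size_gb))
    cpu_layers = total_layers - gpu_layers
    return {
        "gpu_layers": gpu_layers,
        "cpu_layers": cpu_layers,
        "vram_gb": round(gpu_layers * layer_size_gb, 2),
        "ram_gb": round(cpu_layers * layer_size_gb, 2),
    }

# Hypothetical example: a 32-layer, ~8 GB quantized model on a 6 GB card.
print(plan_offload(total_layers=32, layer_size_gb=0.25, vram_budget_gb=6.0))
```

With those made-up numbers, 24 of the 32 layers fit in the 6 GB of VRAM and the remaining 8 layers (~2 GB) live in RAM, which is exactly the "slower but cheaper" trade-off described above.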
You have to know about open source AI models, then you have to know what fine tuning is, then you have to know where to go to get software that runs the models, and then finally you need to know what models are compatible with both AMD and Nvidia graphics cards.
There are plenty of open source models that don't really have any restrictions; you just have to host them yourself (which you can do on your own computer if you have a decent GPU).
I need some of whatever you are smoking