

Thanks, looking forward to it
Thanks. I don’t see the content of the blogs in the feed, just the title - but maybe that’s a problem with my reader (I use Capy on Android). I’ll try a couple of other readers to see if it works
You got an RSS feed for me?
Your blog is awesome. I have always wanted someone to break down RF homelabbing for me and I think as your blog progresses I will find such content.
I’m also looking for blogs/material on OS hardening (Linux/*nix), do you plan to write on that (and any recommendations)?
Coming back to this thread, I do think some of your comments were inflammatory. If you were to receive a ban, it should have been for trying to pick fights in the comments (but even that is ambiguous at best). I agree that a ban for a single comment was too much; an admin shouldn’t conflate one such action with overall behaviour. As for “repeated bad-faith behaviour”, though, I don’t think a ban on those grounds would be far-fetched. People should be responsible for their own actions.
I went through the list. Google and FairPhone should definitely be moved to “Safe for now”, whilst OnePlus should be moved to “Requires an online account/sacrifice”, as they limited their unbrick utility, which means no more custom ROMs for new OnePlus phones.
I honestly don’t understand why Chinese companies do this. They would fare much better against their American counterparts (including Samsung) if they allowed for more open hardware. Goes to show that MBAs at the top of these companies have utter dung between their ears
Mod is biased. There’s no 2 ways around it
Ban was unjustified. db0 needs to at least point to someone accountable, seeing that he is still the benevolent dictator
That means it’s likely a problem with DNS.
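A quick way to confirm it’s DNS, by the way (the hostname below is a placeholder, swap in whichever domain is failing for you):

```python
import socket

def resolve(hostname):
    """Return the sorted list of addresses the resolver gives for hostname, or None."""
    try:
        infos = socket.getaddrinfo(hostname, None)
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return None

# "your.domain.example" is a placeholder -- use the hostname that's failing.
print(resolve("localhost"))            # should list 127.0.0.1 and/or ::1
print(resolve("your.domain.example"))  # None here means DNS can't resolve it
```

If `localhost` resolves but your domain doesn’t, the resolver itself is fine and the problem is with that domain’s records or your configured DNS server.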
Send the link to the discussion and the screenshot of your comment
Yeah I’m not going to run them on CPUs, that’s not going to be very good. I’ll buy the GPUs when I can.
Yes, I just thought you could check that the correct ports are open. For example: is port 443 open for NGINX on Unraid? Is NGINX forwarding traffic to the correct backend port? Is the backend configured to accept traffic for your domain (or all domains) if it is handling HTTPS?
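Here’s a small helper for checking each hop (the IP and ports are made-up placeholders; substitute your Unraid host and backend):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholders -- substitute your Unraid IP and your backend's port.
print(port_open("192.168.1.10", 443))   # is NGINX reachable on 443?
print(port_open("192.168.1.10", 8080))  # is the backend reachable directly?
```

If 443 answers but the backend port doesn’t, the problem is between NGINX and the backend rather than with your port forwarding.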
Has anybody here actually worked for a company that uses Qubes OS as their corporate OS? I think Qubes is awesome, and with some work it can definitely be used for corporate work, but most people don’t know it, don’t care, or are scared off by security.
Thank you, that makes sense. Yes, I will look to create templates using AI that I like. Thanks again for the help
Thanks for the edit. You have a very intriguing idea: a second LLM running in the background that maintains a summary of the conversation, plus static context, might improve performance a lot. I don’t know if anyone has implemented it, or how one could DIY it with Kobold/Ollama. I think it’s an amazing idea for code assistants too, if you’re doing a long coding session.
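Not sure anyone has packaged this, but the prompt-building side seems simple enough to sketch (everything here is made up for illustration; `summarize` is a stand-in for a call to the second, smaller LLM):

```python
def summarize(turns):
    # Placeholder: a real implementation would send `turns` to a small
    # background LLM (e.g. via Kobold's or Ollama's HTTP API) and return its summary.
    return "Summary of %d earlier turns." % len(turns)

def build_prompt(static_context, history, keep_recent=4):
    """Compact the prompt: static context + summary of old turns + recent turns verbatim."""
    older, recent = history[:-keep_recent], history[-keep_recent:]
    parts = [static_context]
    if older:
        parts.append(summarize(older))  # replace old turns with their summary
    parts.extend(recent)                # keep the latest turns word-for-word
    return "\n".join(parts)
```

The main model’s context then stays roughly constant in size no matter how long the session runs, which is where the performance win would come from.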
Better be AGPL or she’s never getting cloned on my PC, that’s for sure!
I see. Thanks for the note. I think beyond 48GB of VRAM diminishing returns set in very quickly so I’ll likely stick to that limit. I wouldn’t want to use models hosted in the cloud so that’s out of the question.
Absolutely. TheBloke’s fine-tuned models with their guardrails removed are the only conversational models I will run. I get enraged looking at AI telling me to curb my speech.
I do use Python, but I haven’t touched AI yet, so it’s going to be a learning curve if I go down that route. I am hoping to get fine-tuned models OOTB for this kind of stuff, but I know it’s a hard ask.
I was going to buy 2-3 used GPUs/new budget GPUs like the B580, but with the tariffs the prices of these are INFLATED beyond what I can afford to pay for them. Once something changes (financially speaking) I’ll probably throw enough VRAM at it to at least get the 8B models (probably not FP16, but maybe quantised to 4-bit/8-bit) running smoothly.
Thanks for the reminder. I have wanted to use character AI for so long but couldn’t bear to give away my thought patterns to them (look at my hypocrisy: I’m giving it all away anyway when everyone is free to scrape Lemmy). I guess I’m an idiot.
I was going to buy the ARC B580s when they come back down in price, but with the tariffs I don’t think I’ll ever see them at MSRP. Even the used market is very expensive. I’ll probably hold off on buying GPUs for a few more months till I can afford the higher prices/something changes. Thanks for the Lexi V2 suggestion
Thanks man, that would be much appreciated