Ollama is a big thing. Do you want it to be fast? Then you will need a GPU. How large is the model you will be running? 7/8B on CPU is not as fast, but no problem; 13B is slow on CPU but possible.
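If you want to try it quickly, here's a minimal sketch using Ollama's Python client (my assumptions: the Ollama server is already running locally and you've pulled an 8B model such as llama3:8b):

```python
# Minimal sketch: chat with a local 8B model through Ollama's Python client.
# Assumes the Ollama server is running and `ollama pull llama3:8b` was done.
import ollama

response = ollama.chat(
    model="llama3:8b",  # a 7/8B model runs fine on CPU, just slower than GPU
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```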
They probably also do some OCR on that and then let something else run over it to see if the text makes sense (basically letting another AI grade the output, which is commonly done to judge what's a good dataset and what isn't) and then just feed the AI again. Today there's a shortage of data since the internet is too small (yes, I know it sounds crazy), so I wouldn't be surprised if they actually tried to use pictures and OCR to gather a bit more usable data.
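Roughly how I imagine such a pipeline, just as a sketch: pytesseract for the OCR step and a local model via Ollama as the judge are my own stand-ins for illustration, not what the labs actually use.

```python
# Sketch of an OCR -> AI-judge pipeline for filtering scanned text into a dataset.
# pytesseract and an Ollama-hosted judge model are illustrative stand-ins.
import pytesseract
from PIL import Image
import ollama

def ocr_page(path: str) -> str:
    """Extract raw text from a scanned page image."""
    return pytesseract.image_to_string(Image.open(path))

def judge_quality(text: str) -> bool:
    """Ask a second model to grade whether the OCR output makes sense."""
    response = ollama.chat(
        model="llama3:8b",  # hypothetical choice of judge model
        messages=[{
            "role": "user",
            "content": "Answer only YES or NO: is the following text "
                       f"coherent, readable English?\n\n{text[:2000]}",
        }],
    )
    return response["message"]["content"].strip().upper().startswith("YES")

# Keep only pages whose OCR output the judge considers usable training data.
pages = ["scan_001.png", "scan_002.png"]  # hypothetical filenames
dataset = [t for t in (ocr_page(p) for p in pages) if judge_quality(t)]
```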
That is true. I mean, I'm mostly the only one using my homelab, apart from some game servers that I am running. And you are totally right: the only reason I want to run Proxmox, or in general why I have a homelab, is to learn more about servers and self-hosting. I am currently in the first year of my apprenticeship and I have learned so much since I got my server up and running 😄 and I think I can learn a lot more when I am using Proxmox.
I am currently in the same boat and thinking about how I can migrate my stuff over without a month of downtime.
EDIT: After reading all the comments I'm still not sure if I should do it or, like I said, even how. I love my Unraid, it fits me well, but I think I have also fallen in love with Proxmox.
Meh, that sucks. I even have perfectly working DDNS. I mean, I know I don't get something like a PTR record, but I wish mail hosters would allow for more self-hosting options.
Oh yeah, I heard about this and saw that Mutahar (Some Ordinary Gamers) did it once on Windows with a 4090.
I would love to do that on my GPU and then split it between my host and my VM
That was my idea too, and it looks just like I imagined it 😂