I got an old machine off eBay (see pic). I only run models that are 8B parameters or less.
I put Ubuntu Server on it, then Docker running on that. In Docker I have Ollama, Open WebUI, Jellyfin, and a game server. No issues running any of that.
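Not my exact config, but a minimal docker-compose sketch of a stack like that (the image names are the official ones; the ports are the defaults; volumes, media path, and the game server are placeholders you'd swap for your own):

```yaml
# Hypothetical docker-compose.yml for an Ollama + Open WebUI + Jellyfin stack.
# Game server omitted since that depends on the game.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama        # model storage
    ports:
      - "11434:11434"               # Ollama API

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # point the UI at the ollama container
    ports:
      - "3000:8080"                 # browse to http://<host>:3000
    depends_on:
      - ollama

  jellyfin:
    image: jellyfin/jellyfin
    volumes:
      - jellyfin-config:/config
      - /path/to/media:/media       # placeholder: your media directory
    ports:
      - "8096:8096"

volumes:
  ollama:
  jellyfin-config:
```

Then `docker compose up -d` brings the whole thing up, and you can pull an 8B model with something like `docker exec -it <ollama-container> ollama pull llama3:8b`.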
Edit: if you want something that can run better LLMs, I recommend more RAM and a better GPU.
Yakub is that you???