hot-takes
Distributed Ollama LLM Infrastructure
Achievement unlocked: running some grunt LLM work across five local Ollama servers, each tied to an NVIDIA GPU and running in a Docker container, with an nginx server in front. Rails and RubyLLM handle the job processing and API calls like a dream.
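The nginx side of a setup like this can be sketched as a simple upstream pool that round-robins Ollama's HTTP API across the five backends. This is a minimal illustration, not the exact config from this setup: the hostnames (`ollama1` through `ollama5`) and the listen port are assumptions, and Ollama's default port 11434 is used for each backend.

```nginx
# Hypothetical sketch: load-balance five Ollama containers behind one endpoint.
# Backend hostnames are assumed; Ollama listens on 11434 by default.
upstream ollama_pool {
    server ollama1:11434;
    server ollama2:11434;
    server ollama3:11434;
    server ollama4:11434;
    server ollama5:11434;
}

server {
    listen 80;

    location / {
        proxy_pass http://ollama_pool;
        proxy_http_version 1.1;
        # LLM generation can be slow; allow long-running responses.
        proxy_read_timeout 600s;
        # Don't buffer streamed token responses.
        proxy_buffering off;
    }
}
```

With this in place, the Rails jobs only need to know one base URL; nginx distributes requests across the GPUs, and a failed backend is retried on the next server in the pool by default.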