nibbles

Aider and Local LLMs

I got Aider (AI coding friend), LM Studio (local LLM manager/interface), and Devstral Small (a small, self-hostable, coding-aware LLM) running on my dual RTX 3090 (honkin' GPU) machine. The results of running a local model with a coding assistant are really interesting.
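For the curious, the plumbing is simple: LM Studio exposes an OpenAI-compatible server on localhost, and Aider (or anything else) just points at it. Here's a minimal Python sketch of that connection, assuming LM Studio's default port (1234) and a loaded Devstral Small build; the model name below is a placeholder, so use whatever identifier LM Studio reports for the model you loaded.

```python
# Minimal sketch: hit LM Studio's OpenAI-compatible local server directly.
# Assumes LM Studio is serving on its default port (1234) with a model loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local server
    api_key="lm-studio",                  # any non-empty string works locally
)

response = client.chat.completions.create(
    model="devstral-small",  # placeholder; match the name LM Studio shows
    messages=[
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
)

print(response.choices[0].message.content)
```

With that server running, a coding tool like Aider only needs the same base URL and model name to work entirely off the local GPUs.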

I’m bullish on local LLM coding tools becoming competent.

Original LinkedIn post