Local models work best when you meet them halfway ...
A local LLM makes better sense for serious work ...
This desktop app for hosting and running LLMs locally is rough in a few spots, but it's still useful right out of the box.