Point LINQPad AI helper to Ollama
Thanks for the AI helper! I'd like to see that helper
able to point to a model hosted by a local instance
of Ollama. Ollama serves on localhost:11434 and
accepts/returns JSON documents. Ollama would
take some additional work for a developer to set up,
but in return their queries would be free and private.
-
Daniel Wagner commented
You can configure Ollama in LINQPad using the OpenAI or Anthropic compatibility endpoints, see
https://docs.ollama.com/api/openai-compatibility
https://docs.ollama.com/api/anthropic-compatibility
In the default Ollama setup no authentication parameters are needed, but you have to configure each of your pulled models by hand.
As the API also provides a models endpoint, it would be a nice feature to assist configuration by querying that endpoint and prepopulating the model configuration with the models already pulled.
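The prepopulation idea could look roughly like this: fetch the model listing from the OpenAI-compatible GET /v1/models endpoint and extract the ids. The sample JSON below is illustrative of that response shape, not a captured server response.

```python
import json

# Illustrative sample of the listing that GET /v1/models returns;
# the model ids here are examples, not a real server's inventory.
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "llama3.2", "object": "model", "owned_by": "library"},
    {"id": "qwen2.5-coder", "object": "model", "owned_by": "library"}
  ]
}
""")

def model_ids(listing: dict) -> list[str]:
    """Pull the model ids out of an OpenAI-style model listing."""
    return [m["id"] for m in listing.get("data", [])]

print(model_ids(sample))  # -> ['llama3.2', 'qwen2.5-coder']
```

A client could run this once at configuration time and offer the returned ids in a dropdown instead of requiring each model to be typed in by hand.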