Class: LLM::Ollama
- Includes:
- Format
- Defined in:
- lib/llm/providers/ollama.rb,
lib/llm/providers/ollama/format.rb,
lib/llm/providers/ollama/error_handler.rb,
lib/llm/providers/ollama/response_parser.rb
Overview
The Ollama class implements a provider for [Ollama](https://ollama.ai).
Defined Under Namespace
Modules: Format, ResponseParser Classes: ErrorHandler
Constant Summary
- HOST =
  "localhost"
- DEFAULT_PARAMS =
  {model: "llama3.2", stream: false}.freeze
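The defaults above are merged beneath caller-supplied options in #complete (`DEFAULT_PARAMS.merge(params)`), so caller values take precedence and the frozen constant is never mutated. A minimal sketch of that merge behavior:

```ruby
# Sketch of how caller options override the provider defaults.
# Hash#merge gives precedence to the argument hash, and the frozen
# DEFAULT_PARAMS constant itself is left untouched.
DEFAULT_PARAMS = {model: "llama3.2", stream: false}.freeze

caller_params = {model: "mistral", temperature: 0.7}
merged = DEFAULT_PARAMS.merge(caller_params)
# => {model: "mistral", stream: false, temperature: 0.7}
```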
Instance Method Summary
- #complete(prompt, role = :user, **params) ⇒ LLM::Response::Completion
- #initialize(secret) ⇒ Ollama (constructor)
  A new instance of Ollama.
Methods included from Format
Methods inherited from Provider
#chat, #chat!, #embed, #inspect
Methods included from HTTPClient
Constructor Details
Instance Method Details
#complete(prompt, role = :user, **params) ⇒ LLM::Response::Completion
# File 'lib/llm/providers/ollama.rb', line 27

def complete(prompt, role = :user, **params)
  req = Net::HTTP::Post.new ["/api", "chat"].join("/")
  messages = [*(params.delete(:messages) || []), LLM::Message.new(role, prompt)]
  params = DEFAULT_PARAMS.merge(params)
  body = {messages: messages.map(&:to_h)}.merge!(params)
  req = preflight(req, body)
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
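The method prepends any caller-supplied `:messages` (earlier conversation turns) before the new prompt, then serializes each message for the request body. A standalone sketch of that assembly step, using a stand-in `Message` struct rather than the real `LLM::Message` class:

```ruby
# Stand-in for LLM::Message; the real class lives in the llm gem.
Message = Struct.new(:role, :content)

def assemble(prompt, role = :user, **params)
  # Earlier turns passed via :messages come first; the new prompt
  # is appended as the final message, mirroring #complete above.
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  {messages: messages.map(&:to_h)}.merge!(params)
end

body = assemble("Hello", :user,
                messages: [Message.new(:system, "Reply briefly.")],
                model: "llama3.2")
# body[:messages] => [{role: :system, content: "Reply briefly."},
#                     {role: :user, content: "Hello"}]
```

In use, `ollama.complete("Hello", :user, model: "llama3.2")` would post the assembled body to the `/api/chat` endpoint and wrap the result in an `LLM::Response::Completion`.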