Class: LLM::Ollama

Inherits:
Provider show all
Includes:
Format
Defined in:
lib/llm/providers/ollama.rb,
lib/llm/providers/ollama/format.rb,
lib/llm/providers/ollama/error_handler.rb,
lib/llm/providers/ollama/response_parser.rb

Overview

The Ollama class implements a provider for [Ollama](https://ollama.ai).

Defined Under Namespace

Modules: Format, ResponseParser Classes: ErrorHandler

Constant Summary collapse

HOST =
"localhost"
DEFAULT_PARAMS =
{model: "llama3.2", stream: false}.freeze

Instance Method Summary collapse

Methods included from Format

#format

Methods inherited from Provider

#chat, #chat!, #embed, #inspect

Methods included from HTTPClient

#request

Constructor Details

#initialize(secret) ⇒ Ollama

Returns a new instance of Ollama.

Parameters:

  • secret (String)

    The secret key for authentication



# File 'lib/llm/providers/ollama.rb', line 18

def initialize(secret, **)
  super(secret, host: HOST, port: 11434, ssl: false, **)
end
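The constructor pins the connection to Ollama's default local endpoint (`localhost`, port 11434, no SSL) and forwards any remaining keywords to `Provider#initialize`. Because the caller's splatted keywords appear after the literal ones, they take precedence in Ruby. A minimal stand-in, independent of the gem, sketching this forwarding pattern (the `connect` helper below is hypothetical, not part of the library):

```ruby
# Hypothetical stand-in for Provider#initialize: collects connection options.
def connect(secret, host:, port: 443, ssl: true, **rest)
  {secret: secret, host: host, port: port, ssl: ssl}
end

# Mirrors Ollama#initialize: fixed local defaults, caller keywords win
# because the double-splat comes after the literal keyword arguments.
def ollama_connect(secret, **kw)
  connect(secret, host: "localhost", port: 11434, ssl: false, **kw)
end

ollama_connect("key")
# => {secret: "key", host: "localhost", port: 11434, ssl: false}
ollama_connect("key", port: 8080)
# => {secret: "key", host: "localhost", port: 8080, ssl: false}
```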

Instance Method Details

#complete(prompt, role = :user, **params) ⇒ LLM::Response::Completion

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::Response::Completion)

See Also:



# File 'lib/llm/providers/ollama.rb', line 27

def complete(prompt, role = :user, **params)
  req = Net::HTTP::Post.new ["/api", "chat"].join("/")
  messages = [*(params.delete(:messages) || []), LLM::Message.new(role, prompt)]
  params = DEFAULT_PARAMS.merge(params)
  body = {messages: messages.map(&:to_h)}.merge!(params)
  req = preflight(req, body)
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
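The request body built above can be sketched in isolation: any `:messages` already present in params are prepended, the new prompt is appended as a message, and caller-supplied params override `DEFAULT_PARAMS` via `Hash#merge`. A self-contained approximation, where the `Message` struct stands in for `LLM::Message` (an assumption about its shape, not the gem's actual class):

```ruby
# Stand-in for LLM::Message: a role/content pair serializable with #to_h.
Message = Struct.new(:role, :content) do
  def to_h
    {role: role, content: content}
  end
end

DEFAULT_PARAMS = {model: "llama3.2", stream: false}.freeze

# Mirrors the body construction in Ollama#complete.
def build_body(prompt, role = :user, **params)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  {messages: messages.map(&:to_h)}.merge!(DEFAULT_PARAMS.merge(params))
end

build_body("Hello", :user, model: "mistral")
# => {messages: [{role: :user, content: "Hello"}], model: "mistral", stream: false}
```

Note that `DEFAULT_PARAMS.merge(params)` lets a caller swap the model or enable streaming, while `:messages` is pulled out of params first so prior conversation history lands in the messages array rather than the top-level body.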