Class: LLM::Provider

Inherits:
Object
Includes:
HTTPClient
Defined in:
lib/llm/provider.rb

Overview

The Provider class is an abstract base class for LLM (Large Language Model) providers

Direct Known Subclasses

Anthropic, Gemini, Ollama, OpenAI, VoyageAI

Instance Method Summary

Methods included from HTTPClient

#request

Constructor Details

#initialize(secret, host:, port: 443, timeout: 60, ssl: true) ⇒ Provider

Returns a new instance of Provider.

Parameters:

  • secret (String)

    The secret key for authentication

  • host (String)

    The host address of the LLM provider

  • port (Integer) (defaults to: 443)

    The port number

  • timeout (Integer) (defaults to: 60)

    The number of seconds to wait for a response

  • ssl (Boolean) (defaults to: true)

    Whether to use SSL/TLS for the connection



# File 'lib/llm/provider.rb', line 19

def initialize(secret, host:, port: 443, timeout: 60, ssl: true)
  @secret = secret
  @http = Net::HTTP.new(host, port).tap do |http|
    http.use_ssl = ssl
    http.read_timeout = timeout
  end
end
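The constructor stores the secret and prepares a Net::HTTP client without opening a connection. A minimal stand-alone sketch of the same setup using only Ruby's stdlib (the host name is purely illustrative; no request is made):

```ruby
require "net/http"

# Build an HTTP client the same way the constructor does:
# SSL enabled and a 60-second read timeout.
http = Net::HTTP.new("api.example.com", 443).tap do |h|
  h.use_ssl = true
  h.read_timeout = 60
end

puts http.use_ssl?      # => true
puts http.read_timeout  # => 60
```

Because Net::HTTP.new only configures the client, connection errors (bad host, TLS failures) surface later, when a request is first issued.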

Instance Method Details

#assistant_roleString

Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Returns:

  • (String)

    Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 82

def assistant_role
  raise NotImplementedError
end
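Subclasses such as OpenAI or Gemini are expected to override this method. A sketch of the abstract-method pattern with stand-in classes (not the real gem):

```ruby
# Stand-in base class mirroring LLM::Provider's pattern:
# the base raises NotImplementedError, subclasses override.
class AbstractProvider
  def assistant_role
    raise NotImplementedError
  end
end

class ExampleProvider < AbstractProvider
  def assistant_role
    "assistant"
  end
end

puts ExampleProvider.new.assistant_role  # => assistant

begin
  AbstractProvider.new.assistant_role
rescue NotImplementedError
  puts "abstract method: subclass must implement"
end
```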

#chat(prompt, role = :user, **params) ⇒ LLM::LazyConversation

Starts a new lazy conversation

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::LazyConversation)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 64

def chat(prompt, role = :user, **params)
  LLM::LazyConversation.new(self, params).chat(prompt, role)
end

#chat!(prompt, role = :user, **params) ⇒ LLM::Conversation

Starts a new conversation

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::Conversation)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 74

def chat!(prompt, role = :user, **params)
  LLM::Conversation.new(self, params).chat(prompt, role)
end
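The #chat/#chat! pair maps onto LLM::LazyConversation versus LLM::Conversation. A hypothetical sketch of the distinction, assuming the lazy variant buffers prompts until a response is actually needed while the eager variant dispatches each one immediately (stand-in class, not the gem's implementation):

```ruby
# Stand-in conversation: lazy mode buffers prompts, eager mode
# "sends" them at once (here we merely record the dispatch).
class SketchConversation
  attr_reader :pending, :sent

  def initialize(lazy:)
    @lazy = lazy
    @pending = []
    @sent = []
  end

  def chat(prompt)
    @lazy ? @pending << prompt : @sent << prompt
    self # returning self allows chained calls
  end

  # Flush buffered prompts, as a lazy conversation would
  # when a response is finally requested.
  def flush!
    @sent.concat(@pending)
    @pending.clear
    self
  end
end

lazy = SketchConversation.new(lazy: true).chat("hi").chat("there")
puts lazy.pending.size  # => 2
lazy.flush!
puts lazy.sent.size     # => 2
```

Returning self from chat is what makes chained calls like `.chat("hi").chat("there")` possible.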

#complete(prompt, role = :user, **params) ⇒ LLM::Response::Completion

Completes a given prompt using the LLM

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::Response::Completion)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 54

def complete(prompt, role = :user, **params)
  raise NotImplementedError
end

#embed(input, **params) ⇒ LLM::Response::Embedding

Generates an embedding for the given input

Parameters:

  • input (String, Array<String>)

    The input to embed

Returns:

  • (LLM::Response::Embedding)

Raises:

  • (NotImplementedError)

    When the method is not implemented by a subclass



# File 'lib/llm/provider.rb', line 41

def embed(input, **params)
  raise NotImplementedError
end

#inspectString

Note:

The secret key is redacted in inspect for security reasons

Returns an inspection of the provider object

Returns:

  • (String)


# File 'lib/llm/provider.rb', line 31

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} @secret=[REDACTED] @http=#{@http.inspect}>"
end
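Redacting the secret in #inspect keeps credentials out of logs, consoles, and error reports. A minimal stand-alone illustration of the same idea:

```ruby
# Stand-in client that redacts its secret in #inspect,
# as LLM::Provider does.
class SketchClient
  def initialize(secret)
    @secret = secret
  end

  def inspect
    "#<#{self.class.name}:0x#{object_id.to_s(16)} @secret=[REDACTED]>"
  end
end

client = SketchClient.new("s3cr3t")
puts client.inspect.include?("s3cr3t")    # => false
puts client.inspect.include?("REDACTED")  # => true
```

Without the override, Ruby's default #inspect would print every instance variable, including @secret.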

#modelsHash<String, LLM::Model>

Returns a hash of available models

Returns:

  • (Hash<String, LLM::Model>)

    Returns a hash of available models

Raises:

  • (NotImplementedError)


# File 'lib/llm/provider.rb', line 89

def models
  raise NotImplementedError
end