Class: LLM::Anthropic

Inherits:
Provider
Includes:
Format
Defined in:
lib/llm/providers/anthropic.rb,
lib/llm/providers/anthropic/format.rb,
lib/llm/providers/anthropic/error_handler.rb,
lib/llm/providers/anthropic/response_parser.rb

Overview

The Anthropic class implements a provider for [Anthropic](https://www.anthropic.com).

Defined Under Namespace

Modules: Format, ResponseParser Classes: ErrorHandler

Constant Summary

HOST =
"api.anthropic.com"

Instance Method Summary

Methods included from Format

#format

Methods inherited from Provider

#chat, #chat!, #inspect

Methods included from HTTPClient

#request

Constructor Details

#initialize(secret) ⇒ Anthropic

Returns a new instance of Anthropic.

Parameters:

  • secret (String)

    The secret key for authentication



# File 'lib/llm/providers/anthropic.rb', line 17

def initialize(secret, **)
  super(secret, host: HOST, **)
end
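The constructor forwards the secret plus a fixed `host:` up to Provider, with any remaining keyword options travelling through the splat. A self-contained sketch of the pattern, using stand-in classes (not llm.rb itself); the real code uses the anonymous `**` form, which requires Ruby 3.2+:

```ruby
# Stand-in classes illustrating the constructor's forwarding
# pattern: the secret and a fixed host go to the parent, and any
# extra keyword options pass through the splat untouched.
class Provider
  attr_reader :secret, :host, :options

  def initialize(secret, host:, **options)
    @secret  = secret
    @host    = host
    @options = options
  end
end

class Anthropic < Provider
  HOST = "api.anthropic.com"

  def initialize(secret, **options)
    super(secret, host: HOST, **options)
  end
end

llm = Anthropic.new("sk-secret", timeout: 30)
llm.host    # => "api.anthropic.com"
llm.options # => {timeout: 30}
```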

Instance Method Details

#assistant_role ⇒ String

Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Returns:

  • (String)

    Returns the role of the assistant in the conversation. Usually “assistant” or “model”



# File 'lib/llm/providers/anthropic.rb', line 51

def assistant_role
  "assistant"
end

#complete(prompt, role = :user, **params) ⇒ LLM::Response::Completion

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

Returns:

  • (LLM::Response::Completion)



# File 'lib/llm/providers/anthropic.rb', line 40

def complete(prompt, role = :user, **params)
  params   = {max_tokens: 1024, model: "claude-3-5-sonnet-20240620"}.merge!(params)
  req      = Net::HTTP::Post.new("/v1/messages", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  req.body = JSON.dump({messages: format(messages)}.merge!(params))
  res      = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
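The body above can be traced without any HTTP: caller params override the defaults, a `:messages` key (if given) is pulled out and prepended before the new prompt, and everything else merges into the JSON payload. A plain-Ruby sketch, where `Message` is a hypothetical stand-in for the gem's message object and the literal mapping stands in for `format`:

```ruby
# Plain-Ruby trace (no network) of how #complete assembles its
# request body.
Message = Struct.new(:role, :content)

defaults = {max_tokens: 1024, model: "claude-3-5-sonnet-20240620"}
params   = defaults.merge(
  model: "claude-3-opus-20240229",                       # caller override wins
  messages: [Message.new(:system, "Reply in one word.")] # prior history
)

# Prior history (if any) comes first, then the new prompt.
messages = [*(params.delete(:messages) || []), Message.new(:user, "Hello")]

# The real code calls format(messages); a literal mapping here.
body = {messages: messages.map { |m| {role: m.role, content: m.content} }}
         .merge(params)

body[:model]      # => "claude-3-opus-20240229"
body[:max_tokens] # => 1024
```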

#embed(input, token:, **params) ⇒ LLM::Response::Embedding

Provides an embedding via VoyageAI per [Anthropic's recommendation](https://docs.anthropic.com/en/docs/build-with-claude/embeddings).

Parameters:

  • token (String)

    Valid token for the VoyageAI API

  • params (Hash)

    Additional parameters to pass to the API

  • input (String, Array<String>)

    The input to embed

Returns:

  • (LLM::Response::Embedding)

# File 'lib/llm/providers/anthropic.rb', line 30

def embed(input, token:, **params)
  llm = LLM.voyageai(token)
  llm.embed(input, **params)
end
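Anthropic does not expose an embeddings endpoint of its own, so `#embed` hands the input off to a VoyageAI client wholesale. A sketch of that delegation, where `FakeVoyageAI` is a hypothetical stand-in for the client returned by `LLM.voyageai` and only the hand-off pattern is real:

```ruby
# Delegation sketch: build a second-provider client from the
# caller's token, then forward input and params unchanged.
class FakeVoyageAI
  def initialize(token)
    @token = token
  end

  # Echoes its arguments back so the pass-through is visible.
  def embed(input, **params)
    {input: Array(input), params: params}
  end
end

def embed(input, token:, **params)
  llm = FakeVoyageAI.new(token)
  llm.embed(input, **params)
end

result = embed("hello", token: "vg-secret", model: "voyage-2")
result[:input]  # => ["hello"]
result[:params] # => {model: "voyage-2"}
```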

#models ⇒ Hash<String, LLM::Model>

Returns a hash of available models

Returns:

  • (Hash<String, LLM::Model>)

    Returns a hash of available models



# File 'lib/llm/providers/anthropic.rb', line 57

def models
  @models ||= load_models!("anthropic")
end
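The `||=` here memoizes: `load_models!` runs on the first call and later calls return the cached hash. A self-contained sketch of the pattern (`Catalog` and its loader are hypothetical, with a call counter added to make the caching visible):

```ruby
# Memoization sketch mirroring #models: the expensive loader runs
# once, and subsequent calls hit the instance-level cache.
class Catalog
  attr_reader :loads

  def initialize
    @loads = 0
  end

  def models
    @models ||= load_models!
  end

  private

  def load_models!
    @loads += 1
    {"claude-3-5-sonnet-20240620" => :model}
  end
end

catalog = Catalog.new
catalog.models
catalog.models
catalog.loads # => 1
```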