Class: LLM::Anthropic
- Includes:
- Format
- Defined in:
- lib/llm/providers/anthropic.rb,
lib/llm/providers/anthropic/format.rb,
lib/llm/providers/anthropic/models.rb,
lib/llm/providers/anthropic/error_handler.rb,
lib/llm/providers/anthropic/response_parser.rb
Overview
The Anthropic class implements a provider for [Anthropic](https://www.anthropic.com).
Defined Under Namespace
Modules: Format, ResponseParser
Classes: ErrorHandler, Models
Constant Summary
- HOST =
"api.anthropic.com"
Instance Method Summary
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#complete(prompt, role = :user, model: default_model, max_tokens: 1024, **params) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API.
-
#default_model ⇒ String
Returns the default model for chat completions.
-
#embed(input, token:, model: "voyage-2", **params) ⇒ LLM::Response::Embedding
Provides an embedding via VoyageAI per [Anthropic’s recommendation](https://docs.anthropic.com/en/docs/build-with-claude/embeddings).
-
#initialize(secret) ⇒ Anthropic
constructor
A new instance of Anthropic.
-
#models ⇒ LLM::Anthropic::Models
Provides an interface to Anthropic’s models API.
Methods included from Format
Methods inherited from Provider
#audio, #chat, #chat!, #files, #images, #inspect, #respond, #respond!, #responses, #schema
Constructor Details
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually “assistant” or “model”.
# File 'lib/llm/providers/anthropic.rb', line 72

def assistant_role
  "assistant"
end
#complete(prompt, role = :user, model: default_model, max_tokens: 1024, **params) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API
# File 'lib/llm/providers/anthropic.rb', line 52

def complete(prompt, role = :user, model: default_model, max_tokens: 1024, **params)
  params = {max_tokens:, model:}.merge!(params)
  req = Net::HTTP::Post.new("/v1/messages", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: format(messages)}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
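For illustration, the JSON request body that #complete assembles for POST /v1/messages can be sketched in plain Ruby. The parameter names follow the code above; the message hash shape and model name here are only an approximation of what the provider's Message formatting produces:

```ruby
require "json"

# Sketch of the request body #complete builds: caller-supplied params are
# merged over the defaults, and the new prompt is appended to any prior
# messages before everything is serialized to JSON.
params = {max_tokens: 1024, model: "claude-3-5-sonnet-20240620"}
messages = [{role: "user", content: "Hello, Claude"}]
body = JSON.dump({messages: messages}.merge(params))
puts body
```

Because `params` is merged last, a caller-supplied `:model` or `:max_tokens` overrides the defaults shown in the signature.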
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/anthropic.rb', line 80

def default_model
  "claude-3-5-sonnet-20240620"
end
#embed(input, token:, model: "voyage-2", **params) ⇒ LLM::Response::Embedding
Provides an embedding via VoyageAI per [Anthropic’s recommendation](https://docs.anthropic.com/en/docs/build-with-claude/embeddings).
# File 'lib/llm/providers/anthropic.rb', line 34

def embed(input, token:, model: "voyage-2", **params)
  llm = LLM.voyageai(token)
  llm.embed(input, **params.merge(model:))
end
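The delegation above merges the resolved model into the remaining options before forwarding them to VoyageAI. A minimal sketch of that merge, using the same Ruby 3.1+ shorthand keyword syntax as the source (`forward_options` is a hypothetical name, not part of the library):

```ruby
# Sketch of how #embed forwards options: the resolved model keyword
# (the "voyage-2" default, or whatever the caller passed) is merged
# into the remaining params hash before delegation.
def forward_options(model: "voyage-2", **params)
  params.merge(model:) # {model:} is shorthand for {model: model}
end

p forward_options(input_type: "query")
p forward_options(model: "voyage-3", input_type: "document")
```

Because `:model` is a named keyword rather than part of `**params`, the merge can never be clobbered by a stray `:model` entry left inside `params`.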
#models ⇒ LLM::Anthropic::Models
Provides an interface to Anthropic’s models API
# File 'lib/llm/providers/anthropic.rb', line 66

def models
  LLM::Anthropic::Models.new(self)
end