Class: LLM::Anthropic
- Includes:
- Format
- Defined in:
- lib/llm/providers/anthropic.rb,
lib/llm/providers/anthropic/format.rb,
lib/llm/providers/anthropic/models.rb,
lib/llm/providers/anthropic/error_handler.rb,
lib/llm/providers/anthropic/stream_parser.rb
Overview
The Anthropic class implements a provider for [Anthropic](https://www.anthropic.com).
Defined Under Namespace
Modules: Format, Response
Classes: ErrorHandler, Models, StreamParser
Constant Summary collapse
- HOST =
"api.anthropic.com"
Instance Method Summary collapse
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API.
-
#default_model ⇒ String
Returns the default model for chat completions.
-
#initialize ⇒ Anthropic
constructor
A new instance of Anthropic.
-
#models ⇒ LLM::Anthropic::Models
Provides an interface to Anthropic’s models API.
Methods included from Format
Methods inherited from Provider
#audio, #chat, #chat!, #embed, #files, #images, #inspect, #moderations, #respond, #respond!, #responses, #schema, #vector_stores, #with
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually "assistant" or "model".
# File 'lib/llm/providers/anthropic.rb', line 65
def assistant_role
  "assistant"
end
#complete(prompt, params = {}) ⇒ LLM::Response
Provides an interface to the chat completions API
# File 'lib/llm/providers/anthropic.rb', line 42
def complete(prompt, params = {})
  params = {role: :user, model: default_model, max_tokens: 1024}.merge!(params)
  params = [params, format_tools(params)].inject({}, &:merge!).compact
  role, stream = params.delete(:role), params.delete(:stream)
  params[:stream] = true if stream.respond_to?(:<<) || stream == true
  req = Net::HTTP::Post.new("/v1/messages", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: [format(messages)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = execute(request: req, stream:)
  LLM::Response.new(res).extend(LLM::Anthropic::Response::Completion)
end
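The body of #complete merges caller-supplied params over a set of defaults, extracts the role, and serializes the message list into the JSON payload sent to POST /v1/messages. A minimal standalone sketch of that assembly, using plain hashes in place of LLM::Message objects (the model name override below is purely illustrative):

```ruby
require "json"

# Defaults mirror those in #complete; the caller's params win on conflict.
defaults = {role: :user, model: "claude-sonnet-4-20250514", max_tokens: 1024}
params   = defaults.merge(model: "claude-3-5-haiku-latest")

# The role is pulled out of params and attached to the new message instead.
role     = params.delete(:role)
messages = [{role: role, content: "Hello, Claude"}]

# Remaining params (model, max_tokens, ...) are merged into the request body.
body   = JSON.dump({messages: messages}.merge(params))
parsed = JSON.parse(body)
```

Note that streaming is requested the same way: if the caller passes a stream option that is `true` or IO-like (responds to `#<<`), the method sets `params[:stream] = true` before serializing.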
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/anthropic.rb', line 73
def default_model
  "claude-sonnet-4-20250514"
end
#models ⇒ LLM::Anthropic::Models
Provides an interface to Anthropic’s models API
# File 'lib/llm/providers/anthropic.rb', line 59
def models
  LLM::Anthropic::Models.new(self)
end