Class: LLM::Chat
- Inherits: Object
- Defined in: lib/llm/chat.rb
Overview
LLM::Chat provides a chat object that maintains a thread of messages that acts as context throughout a conversation. A conversation can use the chat completions API that most LLM providers support or the responses API that a select few LLM providers support.
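For illustration, a minimal sketch of a conversation. The LLM.openai constructor, the key: argument, and LLM::Message#content are assumptions based on typical llm.rb usage and are not defined on this page; adapt them to your provider.

require "llm"

# Assumption: LLM.openai(key: ...) returns a provider object; the exact
# constructor can differ between providers and gem versions.
llm  = LLM.openai(key: ENV["OPENAI_API_KEY"])
chat = LLM::Chat.new(llm).lazy

# Messages accumulate in the conversation thread; in lazy mode nothing
# is sent to the provider until a message is read back.
chat.chat "You are a helpful assistant", :system
chat.chat "What is 2 + 2?", :user
puts chat.last_message.content   # Assumption: messages expose #content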
Instance Attribute Summary
- #messages ⇒ Array<LLM::Message> readonly
Instance Method Summary
- #chat(prompt, role = :user, **params) ⇒ LLM::Chat
  Maintain a conversation via the chat completions API.
- #initialize(provider, model: provider.default_model, **params) ⇒ Chat constructor
  A new instance of Chat.
- #inspect ⇒ Object
- #last_message(role: @provider.assistant_role) ⇒ LLM::Message (also: #recent_message, #read_response)
  The last message in the conversation.
- #lazy ⇒ LLM::Chat
  Enables lazy mode for the conversation.
- #lazy? ⇒ Boolean
  Returns true if the conversation is lazy.
- #respond(prompt, role = :user, **params) ⇒ LLM::Chat
  Maintain a conversation via the responses API.
Constructor Details
#initialize(provider, model: provider.default_model, **params) ⇒ Chat
Returns a new instance of Chat.
# File 'lib/llm/chat.rb', line 34

def initialize(provider, model: provider.default_model, **params)
  @provider = provider
  @params = params.merge!(model:)
  @lazy = false
  @messages = []
end
Instance Attribute Details
#messages ⇒ Array<LLM::Message> (readonly)
# File 'lib/llm/chat.rb', line 25

def messages
  @messages
end
Instance Method Details
#chat(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the chat completions API
# File 'lib/llm/chat.rb', line 47

def chat(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    completion = complete!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
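A short usage sketch: in eager (non-lazy) mode each #chat call performs a request immediately and appends both the prompt and the reply to #messages. The temperature: keyword is a hypothetical example of an extra parameter; any keyword arguments are merged into the request parameters and their meaning depends on the provider.

bot = LLM::Chat.new(llm)   # `llm` is a provider instance, as in the overview sketch
bot.chat("Summarize RFC 2119 in one sentence", :user, temperature: 0.2)
bot.chat("Now in five words", :user)
bot.messages.each { |m| puts "#{m.role}: #{m.content}" }  # Assumption: messages expose #content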
#inspect ⇒ Object
# File 'lib/llm/chat.rb', line 109

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end
#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response
The `read_response` and `recent_message` methods are aliases of the `last_message` method, and you can choose the name that best fits your context or code style.
The last message in the conversation.
# File 'lib/llm/chat.rb', line 85

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
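A usage sketch, continuing from a chat built as in the overview example:

chat.chat "Name three Ruby web servers", :user
chat.last_message                 # the most recent assistant message (default role)
chat.read_response                # alias of last_message
chat.recent_message               # another alias
chat.last_message(role: :user)    # the most recent user message instead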
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
# File 'lib/llm/chat.rb', line 94

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
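A sketch of lazy mode, assuming (as the lazy branch of #chat implies) that queued messages are held in the LLM::Buffer and the request is only performed when a message is read back:

chat = LLM::Chat.new(llm)              # `llm` is a provider instance
chat.lazy                              # switches @messages to an LLM::Buffer
chat.chat "You are terse", :system     # queued, no request yet
chat.chat "Define memoization", :user  # still queued
chat.last_message                      # reading from the buffer performs the queued request(s)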
#lazy? ⇒ Boolean
Returns true if the conversation is lazy
# File 'lib/llm/chat.rb', line 105

def lazy?
  @lazy
end
#respond(prompt, role = :user, **params) ⇒ LLM::Chat
Not all LLM providers support this API
Maintain a conversation via the responses API
# File 'lib/llm/chat.rb', line 65

def respond(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    @response = respond!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end
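A sketch of the responses API, for providers that support it. The :developer role is an assumption borrowed from OpenAI's responses API conventions, and #content on the returned message is also an assumption; substitute whatever roles your provider accepts.

chat = LLM::Chat.new(llm).lazy         # `llm` is a provider instance
chat.respond "You are a helpful assistant", :developer
chat.respond "What is the capital of France?", :user
puts chat.read_response.content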