Class: LLM::Chat
- Inherits: Object
- Defined in: lib/llm/chat.rb
Overview
LLM::Chat provides a chat object that maintains a thread of messages that acts as context throughout a conversation. A conversation can use the chat completions API that most LLM providers support or the responses API that a select few LLM providers support.
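The pattern can be sketched in a few lines: each turn appends to a message array, so every later turn carries the whole history as context. The names below (`Message`, `ConversationSketch`, the stubbed reply) are stand-ins for illustration, not llm.rb's API.

```ruby
# Minimal stand-ins; llm.rb's real classes are richer than this.
Message = Struct.new(:role, :content)

class ConversationSketch
  attr_reader :messages

  def initialize
    @messages = []
  end

  # Each call appends the user prompt and a (stubbed) assistant reply,
  # so the next call sees the entire history as context.
  def chat(prompt, role = :user)
    @messages << Message.new(role, prompt)
    @messages << Message.new(:assistant, "reply to: #{prompt}")
    self
  end
end

convo = ConversationSketch.new
convo.chat("hello").chat("and then?")
convo.messages.size # => 4
```

Because `chat` returns `self`, calls chain naturally, which is also how the real `#chat` and `#respond` behave.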
Instance Attribute Summary
- #messages ⇒ Array<LLM::Message> readonly
Instance Method Summary
-
#chat(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the chat completions API.
-
#initialize(provider, model: provider.default_model, schema: nil, **params) ⇒ Chat
constructor
A new instance of Chat.
- #inspect ⇒ Object
-
#last_message(role: @provider.assistant_role) ⇒ LLM::Message
(also: #recent_message, #read_response)
The last message in the conversation.
-
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
-
#lazy? ⇒ Boolean
Returns true if the conversation is lazy.
-
#respond(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the responses API.
Constructor Details
#initialize(provider, model: provider.default_model, schema: nil, **params) ⇒ Chat
Returns a new instance of Chat.
# File 'lib/llm/chat.rb', line 36

def initialize(provider, model: provider.default_model, schema: nil, **params)
  @provider = provider
  @params = params.merge!(model:, schema:)
  @lazy = false
  @messages = []
end
Instance Attribute Details
#messages ⇒ Array<LLM::Message> (readonly)
# File 'lib/llm/chat.rb', line 25

def messages
  @messages
end
Instance Method Details
#chat(prompt, role = :user, **params) ⇒ LLM::Chat
Maintain a conversation via the chat completions API.

# File 'lib/llm/chat.rb', line 49

def chat(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    completion = complete!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
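The eager-vs-lazy branch above can be sketched with the provider round trip stubbed out. `ChatSketch`, the tuple shape, and the stubbed reply are simplified stand-ins, not llm.rb internals.

```ruby
# Sketch of the dispatch in #chat: lazy mode queues the prompt for
# later, eager mode "sends" immediately and records both sides.
class ChatSketch
  attr_reader :messages

  def initialize
    @lazy = false
    @messages = []
  end

  def lazy?
    @lazy
  end

  def lazy
    tap { @lazy = true }
  end

  def chat(prompt, role = :user, **params)
    if lazy?
      # Lazy: queue the prompt (plus params and a tag) without sending.
      @messages << [[role, prompt], params, :complete]
    else
      # Eager: "send" now and record both the prompt and the reply.
      @messages << [role, prompt] << [:assistant, "stubbed reply"]
    end
    self
  end
end
```

An eager chat grows by two messages per turn (prompt and reply), while a lazy chat grows by one queued tuple per turn until the buffer is drained.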
#inspect ⇒ Object
# File 'lib/llm/chat.rb', line 111

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end
#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response
The `read_response` and `recent_message` methods are aliases of the `last_message` method, and you can choose the name that best fits your context or code style.
The last message in the conversation.
# File 'lib/llm/chat.rb', line 87

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
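The reverse scan used by `#last_message` can be shown with plain Ruby. The `Msg` struct below is a stand-in for illustration, not llm.rb's `LLM::Message`.

```ruby
# Stand-in message type with a string role, mirroring the role.to_s
# comparison in last_message.
Msg = Struct.new(:role, :content)

messages = [
  Msg.new("user", "hi"),
  Msg.new("assistant", "hello"),
  Msg.new("user", "bye"),
  Msg.new("assistant", "goodbye")
]

# Scan from the end and return the first message with the wanted role,
# i.e. the most recent one in the conversation.
last = messages.reverse_each.find { _1.role == "assistant" }
last.content # => "goodbye"
```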
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
# File 'lib/llm/chat.rb', line 96

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
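`lazy` uses `tap` so it always returns the chat itself and flips the flag at most once. A minimal sketch of that pattern, with a bare stand-in class (`LazySketch` is ours, and the `LLM::Buffer` swap is omitted):

```ruby
class LazySketch
  def initialize
    @lazy = false
  end

  def lazy?
    @lazy
  end

  # tap yields self and returns self, so the method is chainable and
  # a second call is a no-op thanks to the early `next`.
  def lazy
    tap do
      next if lazy?
      @lazy = true
    end
  end
end

chat = LazySketch.new
chat.lazy.lazy.equal?(chat) # => true
```

Returning the receiver is what lets `chat.lazy.chat("...")`-style chains work in the real class.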
#lazy? ⇒ Boolean
Returns true if the conversation is lazy.

# File 'lib/llm/chat.rb', line 107

def lazy?
  @lazy
end
#respond(prompt, role = :user, **params) ⇒ LLM::Chat
Not all LLM providers support this API.
Maintain a conversation via the responses API.

# File 'lib/llm/chat.rb', line 67

def respond(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    @response = respond!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end