Class: LLM::Chat
- Inherits: Object
- Defined in: lib/llm/chat.rb
Overview
LLM::Chat provides a chat object that maintains a thread of messages that acts as context throughout a conversation. A conversation can use the chat completions API that most LLM providers support or the responses API that a select few LLM providers support.
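A minimal usage sketch, assuming a provider object is available; the LLM.openai constructor and its key: option shown here are illustrative and may differ between versions of the library:

require "llm"

llm  = LLM.openai(key: ENV["OPENAI_API_KEY"]) # assumed provider constructor
chat = LLM::Chat.new(llm)
chat.chat "You are a concise assistant", role: :system
chat.chat "What is the capital of France?"
puts chat.last_message.content                # assumes LLM::Message#content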
Instance Attribute Summary
- #messages ⇒ Array<LLM::Message> readonly
Instance Method Summary
- #chat(prompt, params = {}) ⇒ LLM::Chat
  Maintain a conversation via the chat completions API.
- #functions ⇒ Array<LLM::Function>
  Returns an array of functions that have yet to be called.
- #initialize(provider, params = {}) ⇒ Chat (constructor)
  A new instance of Chat.
- #inspect ⇒ String
- #last_message(role: @provider.assistant_role) ⇒ LLM::Message (also: #recent_message, #read_response)
  The last message in the conversation.
- #lazy ⇒ LLM::Chat
  Enables lazy mode for the conversation.
- #lazy? ⇒ Boolean
  Returns true if the conversation is lazy.
- #respond(prompt, params = {}) ⇒ LLM::Chat
  Maintain a conversation via the responses API.
Constructor Details
#initialize(provider, params = {}) ⇒ Chat
Returns a new instance of Chat.
# File 'lib/llm/chat.rb', line 36

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @lazy = false
  @messages = [].extend(Array)
end
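The params hash can override the provider's default model or attach a schema; for example (the model name below is a placeholder):

chat = LLM::Chat.new(llm, model: "gpt-4o-mini") # placeholder model name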
Instance Attribute Details
#messages ⇒ Array<LLM::Message> (readonly)
# File 'lib/llm/chat.rb', line 24

def messages
  @messages
end
Instance Method Details
#chat(prompt, params = {}) ⇒ LLM::Chat
Maintain a conversation via the chat completions API
# File 'lib/llm/chat.rb', line 48

def chat(prompt, params = {})
  params = {role: :user}.merge!(params)
  if lazy?
    role = params.delete(:role)
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    role = params[:role]
    completion = complete!(prompt, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
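A short sketch of typical calls; the role defaults to :user, and each call returns the chat itself so calls can be chained:

chat.chat("You are a helpful assistant", role: :system)
    .chat("Summarize the text below", role: :user)
    .chat("Then translate the summary to French") # role: :user is implied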
#functions ⇒ Array<LLM::Function>
Returns an array of functions that have yet to be called
# File 'lib/llm/chat.rb', line 126

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .reject(&:called?)
end
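A sketch of handling pending tool calls; LLM::Function#name and #call are assumed method names here, and how a result is fed back into the conversation depends on the provider:

chat.functions.each do |function|
  puts "pending tool call: #{function.name}" # assumes Function#name
  chat.chat function.call                    # assumes Function#call returns a sendable result
end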
#inspect ⇒ String
# File 'lib/llm/chat.rb', line 117

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end
#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response
The `read_response` and `recent_message` methods are aliases of the `last_message` method; choose whichever name best fits your context or code style.
The last message in the conversation.
# File 'lib/llm/chat.rb', line 91

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
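For example:

chat.last_message               # most recent assistant message (default role)
chat.last_message(role: :user)  # most recent user message
chat.read_response              # alias, reads the same assistant message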
#lazy ⇒ LLM::Chat
Enables lazy mode for the conversation.
# File 'lib/llm/chat.rb', line 100

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
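In lazy mode, #chat and #respond queue messages in an LLM::Buffer instead of sending a request immediately; the request is deferred until the buffered messages are read. A sketch, assuming reading via #last_message triggers the deferred request:

chat = LLM::Chat.new(llm).lazy
chat.chat "You are a helpful assistant", role: :system
chat.chat "Hello!"              # queued, no request sent yet
puts chat.last_message.content  # reading the buffer performs the request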
#lazy? ⇒ Boolean
Returns true if the conversation is lazy
# File 'lib/llm/chat.rb', line 111

def lazy?
  @lazy
end
#respond(prompt, params = {}) ⇒ LLM::Chat
Note: not all LLM providers support this API.
Maintain a conversation via the responses API
# File 'lib/llm/chat.rb', line 68

def respond(prompt, params = {})
  params = {role: :user}.merge!(params)
  if lazy?
    role = params.delete(:role)
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    role = params[:role]
    @response = respond!(prompt, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end
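Usage mirrors #chat, but the conversation is driven through the responses API; a sketch (provider support varies):

chat = LLM::Chat.new(llm)
chat.respond "You are a coding assistant", role: :system
chat.respond "Write a one-line Ruby FizzBuzz"
puts chat.last_message.content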