Class: LLM::Conversation

Inherits:
Object
Defined in:
lib/llm/conversation.rb

Overview

LLM::Conversation provides a conversation object that maintains a thread of messages that act as the context of the conversation.

Examples:

llm = LLM.openai(key)
bot = llm.chat("What is the capital of France?")
bot.chat("What should we eat in Paris?")
bot.chat("What is the weather like in Paris?")
p bot.messages.map { [_1.role, _1.content] }

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Conversation

Returns a new instance of Conversation.

Parameters:

  • provider (LLM::Provider)

    The provider that completions are requested from

  • params (Hash) (defaults to: {})

    Default parameters to include with every request


# File 'lib/llm/conversation.rb', line 23

def initialize(provider, params = {})
  @provider = provider
  @params = params
  @messages = []
end

Instance Attribute Details

#messagesArray<LLM::Message> (readonly)

Returns:

  • (Array<LLM::Message>)

    The messages in the conversation

# File 'lib/llm/conversation.rb', line 18

def messages
  @messages
end

Instance Method Details

#chat(prompt, role = :user, **params) ⇒ LLM::Conversation

Returns:

  • (LLM::Conversation)

    Returns self, so that calls to #chat can be chained

# File 'lib/llm/conversation.rb', line 32

def chat(prompt, role = :user, **params)
  tap do
    completion = @provider.complete(prompt, role, **@params.merge(params.merge(messages:)))
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
  end
end

#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message

Returns the last message for the given role.

Parameters:

  • role (#to_s) (defaults to: @provider.assistant_role)

    The role of the last message. Defaults to the LLM's assistant role (e.g. "assistant" or "model")

Returns:

  • (LLM::Message)

    The last message for the given role

# File 'lib/llm/conversation.rb', line 45

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
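The lookup walks the thread from newest to oldest: `reverse_each` enumerates the messages in reverse without copying the array, and `find` stops at the first match, i.e. the most recent message with the given role. A self-contained sketch with a hypothetical thread:

```ruby
Message = Struct.new(:role, :content)

# A hypothetical conversation thread, oldest message first.
messages = [
  Message.new("user", "What is the capital of France?"),
  Message.new("assistant", "Paris."),
  Message.new("user", "What should we eat there?"),
  Message.new("assistant", "Try a croissant.")
]

# Enumerate in reverse; find returns the first (i.e. most recent) match.
last_assistant = messages.reverse_each.find { _1.role == "assistant" }
p last_assistant.content # => "Try a croissant."
```

Because `role.to_s` is compared against `Message#role`, the method accepts either a Symbol or a String for the role keyword.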