Class: LLM::LazyConversation

Inherits:
Object
Defined in:
lib/llm/lazy_conversation.rb

Overview

LLM::LazyConversation provides a conversation object that allows input prompts to be queued and only sent to the LLM when a response is needed.

Examples:

llm = LLM.openai(key)
bot = llm.chat("Be a helpful weather assistant", :system)
bot.chat("What's the weather like in Rio?")
bot.chat("What's the weather like in Algiers?")
bot.messages.each do |message|
  # A single request is made at this point
end

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ LazyConversation

Returns a new instance of LazyConversation.

Parameters:

  • provider (LLM::Provider)

    An LLM provider

  • params (Hash) (defaults to: {})

    Other parameters to send along with each queued message
# File 'lib/llm/lazy_conversation.rb', line 27

def initialize(provider, params = {})
  @provider = provider
  @params = params
  @messages = LLM::MessageQueue.new(provider)
end
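
A conversation can also be constructed directly from a provider. A minimal sketch, assuming the params hash carries default request options such as a model name (the model: key and its value are illustrative, not confirmed by this page):

llm = LLM.openai(key)
bot = LLM::LazyConversation.new(llm, model: "gpt-4o-mini")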

Instance Attribute Details

#messages ⇒ LLM::MessageQueue (readonly)

Returns:

  • (LLM::MessageQueue)
# File 'lib/llm/lazy_conversation.rb', line 22

def messages
  @messages
end
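
Enumerating the queue is what triggers the single request (see the overview example). Once evaluated, it can be treated like any enumerable; this sketch assumes each element is an LLM::Message with the #role reader used by #last_message below:

bot.messages.select { |message| message.role == "assistant" }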

Instance Method Details

#chat(prompt, role = :user, **params) ⇒ LLM::Conversation

Returns:

  • (LLM::Conversation)

    Returns self (via tap), so calls to #chat can be chained
# File 'lib/llm/lazy_conversation.rb', line 36

def chat(prompt, role = :user, **params)
  tap { @messages << [prompt, role, @params.merge(params)] }
end
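
Because the method returns self, prompts can be queued in a chain without sending any requests:

bot.chat("What's the weather like in Rio?")
   .chat("What's the weather like in Algiers?")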

#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message

Returns the last message for the given role.

Parameters:

  • role (#to_s) (defaults to: @provider.assistant_role)

    The role of the last message. Defaults to the LLM's assistant role (e.g. "assistant" or "model")

Returns:

  • (LLM::Message)

    The last message for the given role
# File 'lib/llm/lazy_conversation.rb', line 46

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
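
For example, once the queue has been evaluated, the most recent assistant reply can be read back through the recent_message alias. The #content reader on LLM::Message is an assumption; this page does not document the message object's interface:

bot.messages.each { |message| } # evaluate the queue
bot.recent_message.content      # assumes LLM::Message#content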