Class: LLM::LazyConversation

Inherits: Object
Defined in:
lib/llm/lazy_conversation.rb

Overview

LLM::LazyConversation provides a conversation object that allows input prompts to be queued and only sent to the LLM when a response is needed.

Examples:

llm = LLM.openai(key)
bot = llm.chat("Be a helpful weather assistant", :system)
bot.chat("What's the weather like in Rio?")
bot.chat("What's the weather like in Algiers?")
bot.messages.each do |message|
  # A single request is made at this point
end
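The lazy pattern behind this example can be sketched with a stub. `LazyChat` below is a hypothetical stand-in, not the real llm.rb API: prompts accumulate in a pending queue, and the single batched request only fires when the messages are enumerated.

```ruby
# Minimal sketch of lazy prompt queueing (hypothetical names, assumed behavior).
class LazyChat
  def initialize(&request)
    @pending = []      # queued [prompt, role] pairs, not yet sent
    @history = []      # messages already exchanged with the provider
    @request = request # callable that performs the one batched request
  end

  # Queue a prompt; no network request happens here.
  def chat(prompt, role = :user)
    tap { @pending << [prompt, role] }
  end

  # Flush the queue with a single request, then yield every message.
  def messages
    unless @pending.empty?
      @history.concat(@request.call(@pending))
      @pending.clear
    end
    @history.each { |m| yield m }
  end
end
```

Enumerating `messages` a second time with nothing pending makes no further request, mirroring the "a single request is made at this point" comment above.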

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider) ⇒ LazyConversation

Returns a new instance of LazyConversation.

Parameters:

  • provider — the LLM provider instance used to send queued messages
# File 'lib/llm/lazy_conversation.rb', line 27

def initialize(provider)
  @provider = provider
  @messages = LLM::MessageQueue.new(provider)
end

Instance Attribute Details

#messages ⇒ LLM::MessageQueue (readonly)

Returns:

  • (LLM::MessageQueue) — the queue of messages
# File 'lib/llm/lazy_conversation.rb', line 22

def messages
  @messages
end

Instance Method Details

#chat(prompt, role = :user, **params) ⇒ LLM::Conversation

Returns:

  • (LLM::Conversation)
# File 'lib/llm/lazy_conversation.rb', line 35

def chat(prompt, role = :user, **params)
  tap { @messages << [prompt, role, params] }
end
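The `tap` call is what makes `#chat` chainable: `Kernel#tap` yields the receiver to the block and then returns the receiver itself, so the method queues the prompt yet still hands back the conversation. `Chainable` below is a toy class illustrating the same idiom:

```ruby
# Toy illustration of the Kernel#tap idiom used by #chat above.
class Chainable
  attr_reader :items

  def initialize
    @items = []
  end

  def add(item)
    tap { @items << item } # returns self, not the result of <<
  end
end
```

Usage: `Chainable.new.add(1).add(2).items` yields `[1, 2]`, because each `add` returns the same object.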