Class: LLM::Chat

Inherits:
Object
Defined in:
lib/llm/chat.rb

Overview

LLM::Chat provides a chat object that maintains a thread of messages that serves as shared context throughout a conversation. A conversation can use the chat completions API, which most LLM providers support, or the responses API, which a select few LLM providers support.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Chat.new(llm).lazy
bot.chat("Your task is to answer all of my questions", :system)
bot.chat("Your answers should be short and concise", :system)
bot.chat("What is 5 + 7 ?", :user)
bot.chat("Why is the sky blue ?", :user)
bot.chat("Why did the chicken cross the road ?", :user)
bot.messages.map { print "[#{_1.role}]", _1.content, "\n" }
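A conversation that uses the responses API looks much the same; a minimal sketch, assuming an OpenAI key in ENV["KEY"] and that the chosen provider supports the responses API:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Chat.new(llm).lazy
# Same thread-of-messages model, but each turn goes through #respond
bot.respond("Your answers should be short and concise", :system)
bot.respond("Why is the sky blue ?", :user)
bot.messages.map { print "[#{_1.role}]", _1.content, "\n" }
```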

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, model: provider.default_model, schema: nil, **params) ⇒ Chat

Returns a new instance of Chat.

Parameters:

  • provider (LLM::Provider)

    A provider

  • schema (#to_json) (defaults to: nil)

    The JSON schema to maintain throughout the conversation

  • model (String) (defaults to: provider.default_model)

    The model to maintain throughout the conversation

  • params (Hash)

    Other parameters to maintain throughout the conversation



# File 'lib/llm/chat.rb', line 36

def initialize(provider, model: provider.default_model, schema: nil, **params)
  @provider = provider
  @params = params.merge!(model:, schema:)
  @lazy = false
  @messages = [].extend(Array)
end
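For illustration, the constructor can pin a model and extra parameters for the entire conversation; a sketch, where the model name and temperature: value are assumptions and any values the provider accepts would work:

```ruby
#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
# "gpt-4o-mini" and temperature: 0.2 are hypothetical; they are merged
# into @params and sent with every request in the conversation.
bot = LLM::Chat.new(llm, model: "gpt-4o-mini", temperature: 0.2)
bot.chat("Hello", :user)
```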

Instance Attribute Details

#messages ⇒ Array<LLM::Message> (readonly)

Returns:

  • (Array<LLM::Message>)

# File 'lib/llm/chat.rb', line 25

def messages
  @messages
end

Instance Method Details

#chat(prompt, role = :user, **params) ⇒ LLM::Chat

Maintain a conversation via the chat completions API

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 49

def chat(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    completion = complete!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
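In eager (non-lazy) mode each #chat call issues a request immediately and appends both the prompt and the provider's first choice to the thread; a sketch, assuming llm is a configured provider:

```ruby
bot = LLM::Chat.new(llm)
bot.chat("What is 5 + 7 ?", :user)
# The thread now holds the user prompt followed by the assistant reply
puts bot.messages.last.content
```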

#functions ⇒ Array<LLM::Function>

Returns an array of functions that have yet to be called

Returns:

  • (Array<LLM::Function>)

# File 'lib/llm/chat.rb', line 122

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .reject(&:called?)
end
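A sketch of inspecting pending tool calls after the assistant requests one; it assumes only that each LLM::Function reports its name, since invoking a function and returning its result to the provider is outside the scope of this method:

```ruby
bot.chat("What is the weather in Tokyo ?", :user)
bot.functions.each do |fn|
  # Each fn is an LLM::Function the assistant requested but that has
  # not been called yet; fn.name is an assumed accessor for illustration.
  puts "pending function: #{fn.name}"
end
```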

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/chat.rb', line 113

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end

#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response

Note:

The `read_response` and `recent_message` methods are aliases of the `last_message` method, and you can choose the name that best fits your context or code style.

The last message in the conversation.

Parameters:

  • role (#to_s) (defaults to: @provider.assistant_role)

    The role of the last message.

Returns:

  • (LLM::Message)

# File 'lib/llm/chat.rb', line 87

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
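For example, after an exchange the assistant's reply (the default role) or the most recent message of another role can be read back; recent_message and read_response behave identically:

```ruby
bot.chat("What is 5 + 7 ?", :user)
puts bot.last_message.content                # assistant reply (default role)
puts bot.last_message(role: :user)&.content  # most recent user message
```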

#lazy ⇒ LLM::Chat

Enables lazy mode for the conversation.

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 96

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
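A sketch of the difference lazy mode makes; lazy? reflects only the local flag, and in lazy mode prompts queue in an LLM::Buffer until the messages are actually read:

```ruby
bot = LLM::Chat.new(llm)
bot.lazy?   # => false
bot.lazy
bot.lazy?   # => true
# No request has been sent yet; it is deferred until @messages is read,
# e.g. via bot.messages.each or bot.last_message
bot.chat("Why is the sky blue ?", :user)
```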

#lazy? ⇒ Boolean

Returns true if the conversation is lazy

Returns:

  • (Boolean)

    Returns true if the conversation is lazy



# File 'lib/llm/chat.rb', line 107

def lazy?
  @lazy
end

#respond(prompt, role = :user, **params) ⇒ LLM::Chat

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 67

def respond(prompt, role = :user, **params)
  if lazy?
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    @response = respond!(prompt, role, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end