Class: LLM::Chat

Inherits:
Object
Defined in:
lib/llm/chat.rb

Overview

LLM::Chat provides a chat object that maintains a thread of messages acting as context throughout a conversation. A conversation can use the chat completions API that most LLM providers support, or the responses API that a select few LLM providers support.

Examples:

#!/usr/bin/env ruby
require "llm"

llm = LLM.openai(ENV["KEY"])
bot = LLM::Chat.new(llm).lazy
bot.chat("Provide short and concise answers", role: :system)
bot.chat("What is 5 + 7 ?", role: :user)
bot.chat("Why is the sky blue ?", role: :user)
bot.chat("Why did the chicken cross the road ?", role: :user)
bot.messages.map { print "[#{_1.role}]", _1.content, "\n" }

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(provider, params = {}) ⇒ Chat

Returns a new instance of Chat.

Parameters:

  • provider (LLM::Provider)

    A provider

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Options Hash (params):

  • :model (String)

    Defaults to the provider’s default model

  • :schema (#to_json, nil)

    Defaults to nil

  • :tools (Array<LLM::Function>, nil)

    Defaults to nil



# File 'lib/llm/chat.rb', line 36

def initialize(provider, params = {})
  @provider = provider
  @params = {model: provider.default_model, schema: nil}.compact.merge!(params)
  @lazy = false
  @messages = [].extend(Array)
end
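The constructor's default-merging can be seen in isolation: `compact` drops the nil `schema` default, then caller-supplied params win on key collisions. A minimal sketch (the model names here are illustrative, not the provider's actual defaults):

```ruby
# Mirrors the merge in #initialize: nil defaults are dropped by #compact,
# then the caller's params override the remaining defaults.
defaults = {model: "gpt-4o-mini", schema: nil}.compact
params   = defaults.merge!({model: "o1-mini", tools: []})
puts params[:model]       # o1-mini
puts params.key?(:schema) # false
```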

Instance Attribute Details

#messages ⇒ Array<LLM::Message> (readonly)

Returns:

  • (Array<LLM::Message>)

# File 'lib/llm/chat.rb', line 24

def messages
  @messages
end

Instance Method Details

#chat(prompt, params = {}) ⇒ LLM::Chat

Maintain a conversation via the chat completions API

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 48

def chat(prompt, params = {})
  params = {role: :user}.merge!(params)
  if lazy?
    role = params.delete(:role)
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :complete]
    self
  else
    role = params[:role]
    completion = complete!(prompt, params)
    @messages.concat [Message.new(role, prompt), completion.choices[0]]
    self
  end
end
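In lazy mode nothing is sent immediately: each call queues the message together with its params and a tag, and the request only fires when the messages are read. A rough pure-Ruby sketch of that queueing (the names are illustrative, not the gem's internals):

```ruby
# Sketch of the lazy branch: calls accumulate in a buffer instead of
# triggering a network request per message.
queue = []
enqueue = lambda do |role, content|
  # The tag (:complete here) records which API to use when the buffer drains.
  queue << [{role: role, content: content}, :complete]
end
enqueue.call(:system, "Provide short and concise answers")
enqueue.call(:user, "What is 5 + 7 ?")
puts queue.length # 2  (no request has been sent yet)
```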

#functions ⇒ Array<LLM::Function>

Returns an array of functions that have yet to be called

Returns:

  • (Array<LLM::Function>)

# File 'lib/llm/chat.rb', line 126

def functions
  messages
    .select(&:assistant?)
    .flat_map(&:functions)
    .reject(&:called?)
end
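The chain above filters assistant messages for tool calls that have not run yet. A self-contained sketch with stand-in structs (LLM::Function's real interface may differ):

```ruby
# Stand-ins for LLM::Message and LLM::Function, just enough to
# demonstrate the select / flat_map / reject pipeline.
Msg = Struct.new(:role, :functions) do
  def assistant?
    role == :assistant
  end
end
Fn = Struct.new(:name, :called) do
  def called?
    called
  end
end

messages = [
  Msg.new(:user, []),
  Msg.new(:assistant, [Fn.new("weather", false), Fn.new("time", true)])
]
pending = messages.select(&:assistant?).flat_map(&:functions).reject(&:called?)
puts pending.map(&:name).join(",") # weather
```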

#inspect ⇒ String

Returns:

  • (String)


# File 'lib/llm/chat.rb', line 117

def inspect
  "#<#{self.class.name}:0x#{object_id.to_s(16)} " \
  "@provider=#{@provider.class}, @params=#{@params.inspect}, " \
  "@messages=#{@messages.inspect}, @lazy=#{@lazy.inspect}>"
end

#last_message(role: @provider.assistant_role) ⇒ LLM::Message Also known as: recent_message, read_response

Note:

The `read_response` and `recent_message` methods are aliases of `last_message`; choose the name that best fits your context or code style.

The last message in the conversation.

Parameters:

  • role (#to_s) (defaults to: @provider.assistant_role)

    The role of the last message.

Returns:

  • (LLM::Message)

# File 'lib/llm/chat.rb', line 91

def last_message(role: @provider.assistant_role)
  messages.reverse_each.find { _1.role == role.to_s }
end
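`reverse_each.find` walks the conversation from the end, so the most recent message with the given role wins. For example:

```ruby
# Find the latest message for a given role, scanning from the tail.
messages = [
  {role: "user", content: "What is 5 + 7 ?"},
  {role: "assistant", content: "12"},
  {role: "user", content: "Why is the sky blue ?"}
]
last_user = messages.reverse_each.find { _1[:role] == "user" }
puts last_user[:content] # Why is the sky blue ?
```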

#lazy ⇒ LLM::Chat

Enables lazy mode for the conversation.

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 100

def lazy
  tap do
    next if lazy?
    @lazy = true
    @messages = LLM::Buffer.new(@provider)
  end
end
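The tap/next idiom makes #lazy both chainable and idempotent: it returns self either way, and only swaps in the buffer on the first call. A stripped-down sketch of the idiom (Demo is a stand-in class, not the gem's):

```ruby
class Demo
  def initialize
    @lazy = false
  end

  def lazy?
    @lazy
  end

  def lazy
    tap do
      next if lazy? # already enabled: skip the setup, still return self
      @lazy = true
    end
  end
end

d = Demo.new
puts d.lazy.equal?(d) # true  (chainable)
puts d.lazy?          # true  (second call is a no-op)
```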

#lazy? ⇒ Boolean

Returns true if the conversation is lazy

Returns:

  • (Boolean)

    Returns true if the conversation is lazy



# File 'lib/llm/chat.rb', line 111

def lazy?
  @lazy
end

#respond(prompt, params = {}) ⇒ LLM::Chat

Note:

Not all LLM providers support this API

Maintain a conversation via the responses API

Parameters:

  • prompt (String)

    The input prompt to be completed

  • params (Hash) (defaults to: {})

    The parameters to maintain throughout the conversation. Any parameter the provider supports may be included, not only those listed here.

Returns:

  • (LLM::Chat)

# File 'lib/llm/chat.rb', line 68

def respond(prompt, params = {})
  params = {role: :user}.merge!(params)
  if lazy?
    role = params.delete(:role)
    @messages << [LLM::Message.new(role, prompt), @params.merge(params), :respond]
    self
  else
    role = params[:role]
    @response = respond!(prompt, params)
    @messages.concat [Message.new(role, prompt), @response.outputs[0]]
    self
  end
end