Class: LLM::OpenAI

Inherits:
Provider
Includes:
Format
Defined in:
lib/llm/providers/openai.rb,
lib/llm/providers/openai/audio.rb,
lib/llm/providers/openai/files.rb,
lib/llm/providers/openai/format.rb,
lib/llm/providers/openai/images.rb,
lib/llm/providers/openai/models.rb,
lib/llm/providers/openai/responses.rb,
lib/llm/providers/openai/error_handler.rb,
lib/llm/providers/openai/response_parser.rb

Overview

The OpenAI class implements a provider for [OpenAI](https://platform.openai.com/).

Defined Under Namespace

Modules: Format, ResponseParser Classes: Audio, ErrorHandler, Files, Images, Models, Responses

Constant Summary

HOST =
"api.openai.com"

Instance Method Summary

Methods included from Format

#format

Methods inherited from Provider

#chat, #chat!, #inspect, #respond, #respond!, #schema

Constructor Details

#initialize(secret) ⇒ OpenAI

Returns a new instance of OpenAI.

Parameters:

  • secret (String)

    The secret key for authentication



# File 'lib/llm/providers/openai.rb', line 22

def initialize(secret, **)
  super(secret, host: HOST, **)
end
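
A minimal construction sketch. The `LLM.openai` convenience method appears in the #complete example below; the environment variable name is an assumption:

```ruby
require "llm"

# Construct the provider directly with a secret key; the host
# defaults to HOST ("api.openai.com") via the constructor above.
llm = LLM::OpenAI.new(ENV["OPENAI_SECRET"])

# Equivalent convenience constructor exposed by the gem:
llm = LLM.openai(ENV["OPENAI_SECRET"])
```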

Instance Method Details

#assistant_role ⇒ String

Returns the role of the assistant in the conversation. Usually “assistant” or “model”

Returns:

  • (String)

    Returns the role of the assistant in the conversation. Usually “assistant” or “model”



# File 'lib/llm/providers/openai.rb', line 109

def assistant_role
  "assistant"
end

#audio ⇒ LLM::OpenAI::Audio

Provides an interface to OpenAI’s audio generation API

Returns:

  • (LLM::OpenAI::Audio)

See Also:



# File 'lib/llm/providers/openai.rb', line 87

def audio
  LLM::OpenAI::Audio.new(self)
end

#complete(prompt, role = :user, model: default_model, schema: nil, **params) ⇒ LLM::Response::Completion

Provides an interface to the chat completions API

Examples:

llm = LLM.openai(ENV["KEY"])
messages = [
  {role: "system", content: "Your task is to answer all of my questions"},
  {role: "system", content: "Your answers should be short and concise"},
]
res = llm.complete("Hello. What is the answer to 5 + 2 ?", :user, messages:)
print "[#{res.choices[0].role}]", res.choices[0].content, "\n"

Parameters:

  • prompt (String)

    The input prompt to be completed

  • role (Symbol) (defaults to: :user)

    The role of the prompt (e.g. :user, :system)

  • model (String) (defaults to: default_model)

    The model to use for the completion

  • schema (#to_json, nil) (defaults to: nil)

    The schema that describes the expected response format

  • params (Hash)

    Other completion parameters

Returns:

  • (LLM::Response::Completion)

Raises:

See Also:



# File 'lib/llm/providers/openai.rb', line 54

def complete(prompt, role = :user, model: default_model, schema: nil, **params)
  params = {model:}
             .merge!(expand_schema(schema))
             .merge!(params)
             .compact
  req = Net::HTTP::Post.new("/v1/chat/completions", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: format(messages, :complete)}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
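
A sketch of the `schema:` keyword. Per the parameter docs, any object responding to `#to_json` is accepted; the JSON Schema-style hash below is an assumption about the expected shape, not a documented format:

```ruby
require "llm"

llm = LLM.openai(ENV["KEY"])

# The schema describes the expected response format.
# This hash shape is an assumption; anything with #to_json works.
schema = {type: "object", properties: {answer: {type: "integer"}}}

res = llm.complete("What is 5 + 2?", :user, schema:)
print res.choices[0].content, "\n"
```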

#default_model ⇒ String

Returns the default model for chat completions

Returns:

  • (String)

See Also:



# File 'lib/llm/providers/openai.rb', line 117

def default_model
  "gpt-4o-mini"
end

#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response::Embedding

Generates an embedding for the given input

Parameters:

  • input (String, Array<String>)

    The input to embed

  • model (String) (defaults to: "text-embedding-3-small")

    The embedding model to use

  • params (Hash)

    Other embedding parameters

Returns:

  • (LLM::Response::Embedding)

Raises:

See Also:



# File 'lib/llm/providers/openai.rb', line 34

def embed(input, model: "text-embedding-3-small", **params)
  req = Net::HTTP::Post.new("/v1/embeddings", headers)
  req.body = JSON.dump({input:, model:}.merge!(params))
  res = request(@http, req)
  Response::Embedding.new(res).extend(response_parser)
end
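
A usage sketch. The input may be a single string or an array of strings per the parameter docs; the attributes exposed by `LLM::Response::Embedding` are not documented here, so none are assumed:

```ruby
require "llm"

llm = LLM.openai(ENV["KEY"])

# Embed several inputs in one request with the default model.
res = llm.embed(["hello", "world"], model: "text-embedding-3-small")

# res is an LLM::Response::Embedding extended with the provider's
# response parser; inspect it to see the vectors it exposes.
p res
```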

#files ⇒ LLM::OpenAI::Files

Provides an interface to OpenAI’s files API

Returns:

  • (LLM::OpenAI::Files)

See Also:



# File 'lib/llm/providers/openai.rb', line 95

def files
  LLM::OpenAI::Files.new(self)
end

#images ⇒ LLM::OpenAI::Images

Provides an interface to OpenAI’s image generation API

Returns:

  • (LLM::OpenAI::Images)

See Also:



# File 'lib/llm/providers/openai.rb', line 79

def images
  LLM::OpenAI::Images.new(self)
end

#models ⇒ LLM::OpenAI::Models

Provides an interface to OpenAI’s models API

Returns:

  • (LLM::OpenAI::Models)

See Also:



# File 'lib/llm/providers/openai.rb', line 103

def models
  LLM::OpenAI::Models.new(self)
end

#responses ⇒ LLM::OpenAI::Responses

Provides an interface to OpenAI’s responses API

Returns:

  • (LLM::OpenAI::Responses)

See Also:



# File 'lib/llm/providers/openai.rb', line 71

def responses
  LLM::OpenAI::Responses.new(self)
end