Class: LLM::OpenAI
- Includes:
- Format
- Defined in:
- lib/llm/providers/openai.rb,
lib/llm/providers/openai/audio.rb,
lib/llm/providers/openai/files.rb,
lib/llm/providers/openai/format.rb,
lib/llm/providers/openai/images.rb,
lib/llm/providers/openai/models.rb,
lib/llm/providers/openai/responses.rb,
lib/llm/providers/openai/error_handler.rb,
lib/llm/providers/openai/response_parser.rb
Overview
The OpenAI class implements a provider for [OpenAI](https://platform.openai.com/).
Defined Under Namespace
Modules: Format, ResponseParser
Classes: Audio, ErrorHandler, Files, Images, Models, Responses
Constant Summary collapse
- HOST =
"api.openai.com"
Instance Method Summary collapse
-
#assistant_role ⇒ String
Returns the role of the assistant in the conversation.
-
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI’s audio generation API.
-
#complete(prompt, role = :user, model: default_model, schema: nil, **params) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API.
-
#default_model ⇒ String
Returns the default model for chat completions.
-
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response::Embedding
Generates an embedding for the given input.
-
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI’s files API.
-
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI’s image generation API.
-
#initialize(secret) ⇒ OpenAI
constructor
A new instance of OpenAI.
-
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI’s models API.
-
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI’s responses API.
Methods included from Format
Methods inherited from Provider
#chat, #chat!, #inspect, #respond, #respond!, #schema
Constructor Details
Instance Method Details
#assistant_role ⇒ String
Returns the role of the assistant in the conversation. Usually “assistant” or “model”
# File 'lib/llm/providers/openai.rb', line 109

def assistant_role
  "assistant"
end
#audio ⇒ LLM::OpenAI::Audio
Provides an interface to OpenAI’s audio generation API
# File 'lib/llm/providers/openai.rb', line 87

def audio
  LLM::OpenAI::Audio.new(self)
end
#complete(prompt, role = :user, model: default_model, schema: nil, **params) ⇒ LLM::Response::Completion
Provides an interface to the chat completions API
# File 'lib/llm/providers/openai.rb', line 54

def complete(prompt, role = :user, model: default_model, schema: nil, **params)
  params = {model:}
           .merge!((schema))
           .merge!(params)
           .compact
  req = Net::HTTP::Post.new("/v1/chat/completions", headers)
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  body = JSON.dump({messages: format(messages, :complete)}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(@http, req)
  Response::Completion.new(res).extend(response_parser)
end
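The body assembly in #complete can be sketched in isolation: any caller-supplied :messages are pulled out of the params and the new prompt is appended as the final message before serialization. In the sketch below, `Message` is a stub standing in for the library's message class, and `build_chat_body` is a hypothetical helper name, not part of the library:

```ruby
require "json"

# Stub standing in for LLM::Message, for illustration only.
Message = Struct.new(:role, :content)

# Hypothetical helper mirroring how #complete builds its request body:
# prior :messages (if any) are removed from params and the new prompt
# is appended as the final message.
def build_chat_body(prompt, role = :user, model:, **params)
  params = {model: model}.merge!(params).compact
  messages = [*(params.delete(:messages) || []), Message.new(role, prompt)]
  JSON.dump({messages: messages.map { |m| {role: m.role, content: m.content} }}.merge!(params))
end

puts build_chat_body("Hello!", model: "gpt-4o-mini")
```

Because :messages is deleted from params before the final merge, prior conversation history ends up inside the messages array rather than as a stray top-level key.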
#default_model ⇒ String
Returns the default model for chat completions
# File 'lib/llm/providers/openai.rb', line 117

def default_model
  "gpt-4o-mini"
end
#embed(input, model: "text-embedding-3-small", **params) ⇒ LLM::Response::Embedding
Generates an embedding for the given input
# File 'lib/llm/providers/openai.rb', line 34

def embed(input, model: "text-embedding-3-small", **params)
  req = Net::HTTP::Post.new("/v1/embeddings", headers)
  req.body = JSON.dump({input:, model:}.merge!(params))
  res = request(@http, req)
  Response::Embedding.new(res).extend(response_parser)
end
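As the listing shows, #embed posts a small JSON body to /v1/embeddings: the input, the model (defaulting as above), plus any extra provider parameters. A sketch of just that body construction, where `embed_body` is an illustrative helper name rather than a library method:

```ruby
require "json"

# Hypothetical helper mirroring the body #embed serializes.
# The input may be a string or an array of strings, as the
# embeddings endpoint accepts either.
def embed_body(input, model: "text-embedding-3-small", **params)
  JSON.dump({input: input, model: model}.merge!(params))
end

puts embed_body("hello, world")
```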
#files ⇒ LLM::OpenAI::Files
Provides an interface to OpenAI’s files API
# File 'lib/llm/providers/openai.rb', line 95

def files
  LLM::OpenAI::Files.new(self)
end
#images ⇒ LLM::OpenAI::Images
Provides an interface to OpenAI’s image generation API
# File 'lib/llm/providers/openai.rb', line 79

def images
  LLM::OpenAI::Images.new(self)
end
#models ⇒ LLM::OpenAI::Models
Provides an interface to OpenAI’s models API
# File 'lib/llm/providers/openai.rb', line 103

def models
  LLM::OpenAI::Models.new(self)
end
#responses ⇒ LLM::OpenAI::Responses
Provides an interface to OpenAI’s responses API
# File 'lib/llm/providers/openai.rb', line 71

def responses
  LLM::OpenAI::Responses.new(self)
end