Class: LLM::OpenAI::Responses
- Inherits: Object
- Includes: Format
- Defined in: lib/llm/providers/openai/responses.rb
Overview
The LLM::OpenAI::Responses class provides a responses object for interacting with [OpenAI's Responses API](https://platform.openai.com/docs/guides/conversation-state?api-mode=responses). The Responses API is similar to the chat completions API, but it can maintain conversation state across multiple requests. This is useful when you want to save bandwidth and/or avoid maintaining the message thread yourself.
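As an illustrative sketch of server-side conversation state (assuming the gem's `LLM.openai` constructor takes an API key, and noting that `previous_response_id` is an OpenAI API parameter forwarded through `**params` rather than a method of this class):

```ruby
llm = LLM.openai(ENV["OPENAI_API_KEY"])
res1 = llm.responses.create("Hello! My name is Alice.")
# Continue the conversation server-side by passing the previous
# response's ID instead of resending the whole message thread.
res2 = llm.responses.create("What is my name?", previous_response_id: res1.id)
# Delete the stored response when the conversation is finished.
llm.responses.delete(res2)
```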
Instance Method Summary

- #create(prompt, role = :user, model: "gpt-4o-mini", **params) ⇒ LLM::Response::Output
  Create a response.
- #delete(response) ⇒ OpenStruct
  Deletes a response.
- #get(response, **params) ⇒ LLM::Response::Output
  Get a response.
- #initialize(provider) ⇒ LLM::OpenAI::Responses (constructor)
  Returns a new Responses object.
Methods included from Format
Constructor Details
#initialize(provider) ⇒ LLM::OpenAI::Responses
Returns a new Responses object
```ruby
# File 'lib/llm/providers/openai/responses.rb', line 26

def initialize(provider)
  @provider = provider
end
```
Instance Method Details
#create(prompt, role = :user, model: "gpt-4o-mini", **params) ⇒ LLM::Response::Output
Create a response.

```ruby
# File 'lib/llm/providers/openai/responses.rb', line 39

def create(prompt, role = :user, model: "gpt-4o-mini", **params)
  params = {model:}.merge!(params)
  req = Net::HTTP::Post.new("/v1/responses", headers)
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  req.body = JSON.dump({input: format(messages, :response)}.merge!(params))
  res = request(http, req)
  LLM::Response::Output.new(res).extend(response_parser)
end
```
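The first line of the method body merges the `model:` keyword default with the caller's extra parameters, so any key the caller supplies wins over the default. A minimal standalone sketch of that merge behavior (`build_params` is a hypothetical helper, not part of the gem):

```ruby
# Hypothetical helper mirroring how #create assembles request parameters:
# the model keyword supplies a default, and extra keyword arguments are
# merged on top of it.
def build_params(model: "gpt-4o-mini", **params)
  {model: model}.merge!(params)
end

build_params(temperature: 0.7) # => {model: "gpt-4o-mini", temperature: 0.7}
build_params(model: "gpt-4o")  # => {model: "gpt-4o"}
```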
#delete(response) ⇒ OpenStruct
Deletes a response.

```ruby
# File 'lib/llm/providers/openai/responses.rb', line 68

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = request(http, req)
  OpenStruct.from_hash JSON.parse(res.body)
end
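Note that `#delete` (like `#get`) accepts either a response object or a bare ID string; the first line of the body duck-types on `#id`. A small standalone sketch of that check (`response_id_for` is a hypothetical helper extracted for illustration):

```ruby
require "ostruct"

# Mirrors the duck-typing used by #delete and #get: accept a response
# object that responds to #id, or a plain ID string.
def response_id_for(response)
  response.respond_to?(:id) ? response.id : response
end

response_id_for("resp_123")                     # => "resp_123"
response_id_for(OpenStruct.new(id: "resp_456")) # => "resp_456"
```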
#get(response, **params) ⇒ LLM::Response::Output
Get a response.

```ruby
# File 'lib/llm/providers/openai/responses.rb', line 54

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = request(http, req)
  LLM::Response::Output.new(res).extend(response_parser)
end
```
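Any extra keyword arguments to `#get` become a URL query string via Ruby's standard `URI.encode_www_form`. A quick sketch of what that produces (the parameter names here are only examples, not a statement of what the API accepts):

```ruby
require "uri"

# Extra params are encoded as application/x-www-form-urlencoded pairs
# and appended to the request path.
query = URI.encode_www_form(stream: false, starting_after: 5)
path  = "/v1/responses/resp_123?#{query}"
# query => "stream=false&starting_after=5"
```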