Class: LLM::OpenAI::Responses
- Inherits:
  Object
- Includes:
- Format
- Defined in:
- lib/llm/providers/openai/responses.rb
Overview
The LLM::OpenAI::Responses class provides a responses object for interacting with [OpenAI's responses API](https://platform.openai.com/docs/guides/conversation-state?api-mode=responses). The responses API is similar to the chat completions API, but it can maintain conversation state across multiple requests. This is useful when you want to save bandwidth and/or avoid maintaining the message thread yourself.
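As a rough illustration (not taken from the library itself), a responses API payload resembles a chat-completions payload but uses an `input` array: earlier input items (or a previous response id) carry the conversation state, and the new turn is appended. The hypothetical snippet below builds such a payload by hand; the model name and message contents are made up:

```ruby
require "json"

# Hypothetical sketch of a responses API request body. Conversation
# state travels in the "input" array; the new user turn is appended.
previous_input = [{role: "developer", content: "Answer in one word."}]
new_turn       = {role: "user", content: "What color is the sky?"}
payload = {
  model: "example-model",            # placeholder model name
  input: [*previous_input, new_turn]
}
body = JSON.dump(payload)
puts body
```

This mirrors what #create assembles before the request is sent: prior input plus the new message, serialized with JSON.dump.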
Instance Method Summary
- #create(prompt, params = {}) ⇒ LLM::Response::Output
  Create a response.
- #delete(response) ⇒ OpenStruct
  Deletes a response.
- #get(response, **params) ⇒ LLM::Response::Output
  Get a response.
- #initialize(provider) ⇒ LLM::OpenAI::Responses (constructor)
  Returns a new Responses object.
Methods included from Format
Constructor Details
#initialize(provider) ⇒ LLM::OpenAI::Responses
Returns a new Responses object.
# File 'lib/llm/providers/openai/responses.rb', line 40

def initialize(provider)
  @provider = provider
end
Instance Method Details
#create(prompt, params = {}) ⇒ LLM::Response::Output
Create a response
# File 'lib/llm/providers/openai/responses.rb', line 53

def create(prompt, params = {})
  params = {role: :user, model: @provider.default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role = params.delete(:role)
  req = Net::HTTP::Post.new("/v1/responses", headers)
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  body = JSON.dump({input: [format(messages, :response)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = request(http, req)
  LLM::Response::Respond.new(res).extend(response_parser)
end
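The first two lines of #create establish defaults and then let caller-supplied params win. A small standalone sketch of that defaulting step (the model name and extra parameter are placeholders, not values from the library):

```ruby
# The first hash supplies defaults; Hash#merge! favors the caller's
# params, so a caller-supplied :role overrides the :user default.
defaults = {role: :user, model: "example-model"}   # placeholder model
params   = defaults.merge!(role: :developer, temperature: 0.2)

role = params.delete(:role)
# role is now :developer; params no longer contains :role,
# but still carries the default :model plus :temperature.
```

The `:role` key is deleted because it belongs to the message being built, not to the request parameters that are serialized alongside `input`.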
#delete(response) ⇒ OpenStruct
Deletes a response
# File 'lib/llm/providers/openai/responses.rb', line 85

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = request(http, req)
  OpenStruct.from_hash JSON.parse(res.body)
end
#get(response, **params) ⇒ LLM::Response::Output
Get a response
# File 'lib/llm/providers/openai/responses.rb', line 71

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = request(http, req)
  LLM::Response::Respond.new(res).extend(response_parser)
end
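Both #get and #delete accept either a response object or a raw id string; the `respond_to?(:id)` line implements that duck-typing. A minimal standalone demonstration, using a stand-in Struct rather than a real response class (the helper name and id value are invented for illustration):

```ruby
# Stand-in for a response object that exposes #id.
FakeResponse = Struct.new(:id)

# Same duck-typing idiom as in #get and #delete: use #id when the
# argument responds to it, otherwise treat the argument as the id.
def extract_id(response)
  response.respond_to?(:id) ? response.id : response
end

extract_id(FakeResponse.new("resp_123"))  # => "resp_123"
extract_id("resp_123")                    # => "resp_123"
```

This lets callers pass a previously returned response straight back in, without first pulling out its id.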