Class: LLM::OpenAI::Responses
- Inherits: Object
- Includes: Format
- Defined in: lib/llm/providers/openai/responses.rb
Overview
The LLM::OpenAI::Responses class provides an interface to [OpenAI’s Responses API](https://platform.openai.com/docs/guides/conversation-state?api-mode=responses).
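A minimal usage sketch. The top-level `LLM.openai` constructor and the provider's `responses` accessor are assumptions drawn from the wider gem, not from this page, and a real API key is required at runtime:

```ruby
require "llm"

# Obtain an OpenAI provider, then its Responses interface
# (assumed entry points; see the gem's README for the canonical setup)
llm = LLM.openai(key: ENV["OPENAI_API_KEY"])
res = llm.responses.create("Hello, world")
```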
Instance Method Summary collapse
-
#create(prompt, params = {}) ⇒ LLM::Response
Create a response.
-
#delete(response) ⇒ LLM::Object
Deletes a response.
-
#get(response, **params) ⇒ LLM::Response
Get a response.
-
#initialize(provider) ⇒ LLM::OpenAI::Responses
constructor
Returns a new Responses object.
Constructor Details
#initialize(provider) ⇒ LLM::OpenAI::Responses
Returns a new Responses object
```ruby
# File 'lib/llm/providers/openai/responses.rb', line 24

def initialize(provider)
  @provider = provider
end
```
Instance Method Details
#create(prompt, params = {}) ⇒ LLM::Response
Create a response
```ruby
# File 'lib/llm/providers/openai/responses.rb', line 37

def create(prompt, params = {})
  params = {role: :user, model: @provider.default_model}.merge!(params)
  params = [params, format_schema(params), format_tools(params)].inject({}, &:merge!).compact
  role = params.delete(:role)
  req = Net::HTTP::Post.new("/v1/responses", headers)
  # Combine any caller-supplied :input messages with the new prompt
  messages = [*(params.delete(:input) || []), LLM::Message.new(role, prompt)]
  body = JSON.dump({input: [format(messages, :response)].flatten}.merge!(params))
  set_body_stream(req, StringIO.new(body))
  res = execute(request: req)
  LLM::Response.new(res).extend(LLM::OpenAI::Response::Responds)
end
```
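The method merges caller-supplied params over defaults of `role: :user` and the provider's default model, so both can be overridden per call. A hedged sketch (the `LLM.openai` constructor and `responses` accessor are assumed from the wider gem; requires an API key at runtime):

```ruby
llm = LLM.openai(key: ENV["OPENAI_API_KEY"])

# Override the default model and role; any other keys in params
# are passed through to the request body
res = llm.responses.create("Be terse.", role: :developer, model: "gpt-4o-mini")
```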
#delete(response) ⇒ LLM::Object
Deletes a response
```ruby
# File 'lib/llm/providers/openai/responses.rb', line 69

def delete(response)
  response_id = response.respond_to?(:id) ? response.id : response
  req = Net::HTTP::Delete.new("/v1/responses/#{response_id}", headers)
  res = execute(request: req)
  LLM::Response.new(res)
end
```
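Because the method calls `response.id` when the argument responds to it, you can pass either a previously returned response object or a raw id string. A sketch (the `resp_abc123` id is hypothetical, and `LLM.openai`/`responses` are assumed entry points):

```ruby
llm = LLM.openai(key: ENV["OPENAI_API_KEY"])

res = llm.responses.create("Hello")
llm.responses.delete(res)            # pass the response object, or …
llm.responses.delete("resp_abc123")  # … a raw response id (hypothetical)
```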
#get(response, **params) ⇒ LLM::Response
Get a response
```ruby
# File 'lib/llm/providers/openai/responses.rb', line 55

def get(response, **params)
  response_id = response.respond_to?(:id) ? response.id : response
  query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new("/v1/responses/#{response_id}?#{query}", headers)
  res = execute(request: req)
  LLM::Response.new(res).extend(LLM::OpenAI::Response::Responds)
end
```
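Like #delete, #get accepts a response object or a raw id, and any keyword arguments are URL-encoded onto the query string. A sketch (the id is hypothetical, and `LLM.openai`/`responses` are assumed entry points; which query params the endpoint honors is defined by OpenAI's API, not this class):

```ruby
llm = LLM.openai(key: ENV["OPENAI_API_KEY"])

saved = llm.responses.get("resp_abc123")                   # by raw id (hypothetical)
saved = llm.responses.get("resp_abc123", stream: false)    # extra keywords become ?stream=false
```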