Module: LLM
- Defined in:
- lib/llm.rb,
lib/llm/bot.rb,
lib/llm/error.rb,
lib/llm/buffer.rb,
lib/llm/client.rb,
lib/llm/message.rb,
lib/llm/version.rb,
lib/llm/response.rb,
lib/llm/eventhandler.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/zai.rb,
lib/llm/providers/gemini.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb
Defined Under Namespace

Modules: Client, EventStream, Utils

Classes: Anthropic, Bot, Buffer, Builder, DeepSeek, Error, EventHandler, File, Function, Gemini, LlamaCpp, Message, Mime, Multipart, Object, Ollama, OpenAI, Provider, Response, ResponseError, Schema, ServerTool, Tool, XAI, ZAI
Constant Summary collapse

- RateLimitError =
  HTTPTooManyRequests.
  Class.new(ResponseError)

- ServerError =
  HTTPServerError.
  Class.new(ResponseError)

- NoImageError =
  When no images are found in a response.
  Class.new(ResponseError)

- FormatError =
  When given an input object that is not understood.
  Class.new(Error)

- PromptError =
  When given a prompt object that is not understood.
  Class.new(FormatError)

- VERSION =
  "2.0.1"
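The error constants above are built with Ruby's `Class.new`: an anonymous subclass takes the name of the constant it is assigned to. A standalone sketch of the same pattern (the `MyLib` module below is illustrative, not part of the gem):

```ruby
# Standalone sketch of the Class.new error-hierarchy pattern used above:
# each constant assignment names an anonymous subclass, so rescuing an
# ancestor class also catches its descendants.
module MyLib
  Error = Class.new(RuntimeError)
  ResponseError = Class.new(Error)
  RateLimitError = Class.new(ResponseError)

  def self.request
    raise RateLimitError, "too many requests"
  end
end

begin
  MyLib.request
rescue MyLib::ResponseError => e
  # A RateLimitError is caught by rescuing its ancestor, ResponseError.
  puts e.class   # MyLib::RateLimitError
  puts e.message # too many requests
end
```

Because `PromptError = Class.new(FormatError)` above follows the same convention, `rescue LLM::FormatError` would also catch a `PromptError`.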
Class Method Summary collapse

- .anthropic ⇒ Anthropic
  A new instance of Anthropic.
- .deepseek ⇒ LLM::DeepSeek
- .File(obj) ⇒ LLM::File
- .function(key, &b) ⇒ LLM::Function
  Define a function.
- .gemini ⇒ Gemini
  A new instance of Gemini.
- .llamacpp(key: nil) ⇒ LLM::LlamaCpp
- .lock(name) ⇒ void
  Provides a thread-safe lock.
- .ollama(key: nil) ⇒ Ollama
  A new instance of Ollama.
- .openai ⇒ OpenAI
  A new instance of OpenAI.
- .xai ⇒ XAI
  A new instance of XAI.
- .zai ⇒ ZAI
  A new instance of ZAI.
Class Method Details
.anthropic ⇒ Anthropic

Returns a new instance of Anthropic.

# File 'lib/llm.rb', line 34
def anthropic(**)
  lock(:require) { require_relative "llm/providers/anthropic" unless defined?(LLM::Anthropic) }
  LLM::Anthropic.new(**)
end
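Each provider constructor follows the same lazy-require idiom: a `defined?` guard skips the require once the class exists, and the lock serializes first-time loading across threads. A standalone sketch of that idiom (using the stdlib `set` as a stand-in for a provider file, since the gem itself is not loaded here):

```ruby
require "monitor"

# Standalone sketch of the lazy-require pattern shown above: the
# defined? guard avoids re-requiring once the constant exists, and the
# Monitor serializes the first load across threads.
LOAD_LOCK = Monitor.new

def load_provider
  LOAD_LOCK.synchronize do
    # "set" stands in for a provider file such as llm/providers/anthropic.
    require "set" unless defined?(Set)
  end
  Set.new
end

puts load_provider.class # Set
```

The `defined?` check makes repeat calls cheap: after the first load, the method never touches the require machinery again.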
.deepseek ⇒ LLM::DeepSeek

# File 'lib/llm.rb', line 66
def deepseek(**)
  lock(:require) { require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek) }
  LLM::DeepSeek.new(**)
end
.File(obj) ⇒ LLM::File

# File 'lib/llm/file.rb', line 82
def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
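LLM.File follows the `Kernel#Array` / `Kernel#Integer` convention: a method named like a class that coerces several input types into one. A standalone sketch of that convention (the `Document` class below is illustrative, not part of the gem):

```ruby
module MyLib
  # Illustrative value class; not part of the gem.
  class Document
    attr_reader :path

    def initialize(path)
      @path = path
    end
  end

  # Coercion method named after the class, as with LLM.File(obj):
  # pass through existing instances, wrap strings, reject everything else.
  def self.Document(obj)
    case obj
    when Document then obj
    when String then Document.new(obj)
    else raise TypeError, "don't know how to handle #{obj.class} objects"
    end
  end
end

doc = MyLib.Document("notes.txt")
puts doc.path                        # notes.txt
puts MyLib.Document(doc).equal?(doc) # true
```

Note how LLM.File additionally closes an open `::File` handle and recurses on its path, so callers can hand over either a path string or a file object interchangeably.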
.function(key, &b) ⇒ LLM::Function

Define a function.

# File 'lib/llm.rb', line 112
def function(key, &b)
  LLM::Function.new(key, &b)
end
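LLM.function is a thin wrapper that pairs a name with a block. A standalone sketch of a block-backed function object in that style (`MiniFunction` is illustrative, not the gem's `LLM::Function` API):

```ruby
# Standalone sketch of a block-backed function object: the constructor
# captures a name and a block, and #call forwards its arguments to the
# block. MiniFunction is an assumption, not the gem's class.
class MiniFunction
  attr_reader :name

  def initialize(name, &block)
    @name = name
    @block = block
  end

  def call(*args)
    @block.call(*args)
  end
end

def function(key, &b) = MiniFunction.new(key, &b)

add = function(:add) { |a, b| a + b }
puts add.name       # add
puts add.call(2, 3) # 5
```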
.gemini ⇒ Gemini

Returns a new instance of Gemini.

# File 'lib/llm.rb', line 42
def gemini(**)
  lock(:require) { require_relative "llm/providers/gemini" unless defined?(LLM::Gemini) }
  LLM::Gemini.new(**)
end
.llamacpp(key: nil) ⇒ LLM::LlamaCpp

# File 'lib/llm.rb', line 58
def llamacpp(key: nil, **)
  lock(:require) { require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp) }
  LLM::LlamaCpp.new(key:, **)
end
.lock(name) ⇒ void

This method returns an undefined value.

Provides a thread-safe lock.

# File 'lib/llm.rb', line 121
def lock(name, &) = @monitors[name].synchronize(&)
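The one-liner above synchronizes on a per-name monitor, so unrelated critical sections (e.g. `:require` vs. some other name) don't block each other. A standalone sketch, assuming `@monitors` is a Hash that lazily builds one Monitor per key (that default block is an assumption, not shown on this page):

```ruby
require "monitor"

# Standalone sketch of per-name locks. The Hash default block that
# lazily creates one Monitor per key is an assumption about how
# @monitors is initialized; the lock method itself mirrors the source.
class Registry
  def initialize
    @monitors = Hash.new { |h, k| h[k] = Monitor.new }
  end

  def lock(name, &) = @monitors[name].synchronize(&)
end

registry = Registry.new
results = []
threads = 4.times.map do
  # Each thread reads the current size and appends it under the lock,
  # so the reads and writes cannot interleave.
  Thread.new { registry.lock(:require) { results << results.size } }
end
threads.each(&:join)
puts results.inspect # [0, 1, 2, 3]
```

Because each read-then-append runs atomically inside `synchronize`, the result is `[0, 1, 2, 3]` regardless of thread scheduling.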
.ollama(key: nil) ⇒ Ollama

Returns a new instance of Ollama.

# File 'lib/llm.rb', line 50
def ollama(key: nil, **)
  lock(:require) { require_relative "llm/providers/ollama" unless defined?(LLM::Ollama) }
  LLM::Ollama.new(key:, **)
end
.openai ⇒ OpenAI

Returns a new instance of OpenAI.

# File 'lib/llm.rb', line 74
def openai(**)
  lock(:require) { require_relative "llm/providers/openai" unless defined?(LLM::OpenAI) }
  LLM::OpenAI.new(**)
end