Module: LLM
- Defined in:
- lib/llm.rb,
lib/llm/bot.rb,
lib/llm/error.rb,
lib/llm/buffer.rb,
lib/llm/client.rb,
lib/llm/message.rb,
lib/llm/version.rb,
lib/llm/response.rb,
lib/llm/eventhandler.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/gemini.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb
Defined Under Namespace
Modules: Client, EventStream, Utils
Classes: Anthropic, Bot, Buffer, DeepSeek, Error, EventHandler, File, Function, Gemini, LlamaCpp, Message, Mime, Multipart, Object, Ollama, OpenAI, Provider, Response, ResponseError, Schema, Tool, XAI
Constant Summary

- RateLimitError = Class.new(ResponseError)
  HTTPTooManyRequests
- ServerError = Class.new(ResponseError)
  HTTPServerError
- NoImageError = Class.new(ResponseError)
  When no images are found in a response
- FormatError = Class.new(Error)
  When given an input object that is not understood
- PromptError = Class.new(FormatError)
  When given a prompt object that is not understood
- VERSION = "0.16.3"
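The `Class.new(...)` idiom above builds the error hierarchy without `class` keywords, so each error is an anonymous subclass assigned to a constant. A standalone sketch of the same pattern (these top-level constants are stand-ins; in the gem they live under the `LLM` namespace):

```ruby
# Mirror of the Class.new(...) hierarchy above, outside the LLM namespace.
Error          = Class.new(StandardError)
ResponseError  = Class.new(Error)
RateLimitError = Class.new(ResponseError) # e.g. HTTP 429
ServerError    = Class.new(ResponseError) # e.g. HTTP 5xx
FormatError    = Class.new(Error)
PromptError    = Class.new(FormatError)

# Rescuing the parent class catches the whole subtree:
begin
  raise RateLimitError, "too many requests"
rescue ResponseError => e
  puts "caught: #{e.message}"
end
```

Because each subclass inherits from a shared parent, callers can rescue broadly (`ResponseError`) or narrowly (`RateLimitError`) as needed.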
Class Method Summary
- .anthropic ⇒ Anthropic
  A new instance of Anthropic.
- .deepseek ⇒ LLM::DeepSeek
- .File(obj) ⇒ LLM::File
- .function(name, &b) ⇒ LLM::Function
  Define or get a function.
- .functions ⇒ Hash<String,LLM::Function>
  Returns all known functions.
- .gemini ⇒ Gemini
  A new instance of Gemini.
- .llamacpp(key: nil) ⇒ LLM::LlamaCpp
- .ollama(key: nil) ⇒ Ollama
  A new instance of Ollama.
- .openai ⇒ OpenAI
  A new instance of OpenAI.
- .xai ⇒ XAI
  A new instance of XAI.
Class Method Details
.anthropic ⇒ Anthropic
Returns a new instance of Anthropic.
# File 'lib/llm.rb', line 30

def anthropic(**)
  @mutex.synchronize { require_relative "llm/providers/anthropic" unless defined?(LLM::Anthropic) }
  LLM::Anthropic.new(**)
end
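Each constructor lazy-loads its provider file under a shared mutex, so concurrent first calls cannot race the `require_relative`. A minimal self-contained sketch of that guard, with a cheap `const_set` standing in for the file load (`Registry` and its `Client` class are hypothetical, not part of llm.rb):

```ruby
module Registry
  @mutex = Mutex.new

  # Analogue of `require_relative "..." unless defined?(LLM::Anthropic)`:
  # the one-time definition runs under the lock, while construction
  # itself happens outside the critical section.
  def self.client
    @mutex.synchronize do
      unless const_defined?(:Client)
        const_set(:Client, Class.new { def ping; :pong; end })
      end
    end
    const_get(:Client).new
  end
end
```

Repeated or concurrent calls to `Registry.client` define `Client` exactly once, which is the same guarantee the gem needs for its provider requires.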
.deepseek ⇒ LLM::DeepSeek
# File 'lib/llm.rb', line 62

def deepseek(**)
  @mutex.synchronize { require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek) }
  LLM::DeepSeek.new(**)
end
.File(obj) ⇒ LLM::File
# File 'lib/llm/file.rb', line 82

def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else
    raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
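`LLM.File` normalizes several input types into one file abstraction: open `File` handles are closed and re-dispatched by path, strings are treated as paths, and existing wrappers pass through. The same case/when coercion, sketched standalone (`FileRef` and `CoerceFile` are hypothetical stand-ins for `LLM::File` and `LLM.File`):

```ruby
FileRef = Struct.new(:path)

# Mirrors the dispatch above: File handles are closed then re-coerced by
# path, strings become references, references pass through unchanged,
# and anything else raises TypeError.
def CoerceFile(obj)
  case obj
  when File
    obj.close unless obj.closed?
    CoerceFile(obj.path)
  when FileRef then obj
  when String  then FileRef.new(obj)
  else raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
```

Closing the handle before re-dispatching by path means the rest of the pipeline only ever sees path-backed references, never live file descriptors.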
.function(name, &b) ⇒ LLM::Function
Define or get a function
# File 'lib/llm.rb', line 99

def function(name, &b)
  if block_given?
    functions[name.to_s] = LLM::Function.new(name, &b)
  else
    functions[name.to_s]
  end
end
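`function` doubles as definer and getter: with a block it registers under the stringified name, without one it looks the name up in the memoized `functions` hash. A self-contained sketch of that define-or-get pattern (`FnRegistry` is illustrative; the gem stores `LLM::Function` objects rather than raw blocks):

```ruby
module FnRegistry
  # With a block: register under name.to_s. Without: look up.
  def self.function(name, &b)
    if block_given?
      functions[name.to_s] = b
    else
      functions[name.to_s]
    end
  end

  # Memoized registry, as in LLM.functions.
  def self.functions
    @functions ||= {}
  end
end

FnRegistry.function(:add) { |a, b| a + b }
FnRegistry.function(:add).call(2, 3) # => 5
```

Stringifying the key means symbol and string names address the same entry, so `function(:add)` and `function("add")` resolve identically.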
.functions ⇒ Hash<String,LLM::Function>
Returns all known functions
# File 'lib/llm.rb', line 110

def functions
  @functions ||= {}
end
.gemini ⇒ Gemini
Returns a new instance of Gemini.
# File 'lib/llm.rb', line 38

def gemini(**)
  @mutex.synchronize { require_relative "llm/providers/gemini" unless defined?(LLM::Gemini) }
  LLM::Gemini.new(**)
end
.llamacpp(key: nil) ⇒ LLM::LlamaCpp
# File 'lib/llm.rb', line 54

def llamacpp(key: nil, **)
  @mutex.synchronize { require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp) }
  LLM::LlamaCpp.new(key:, **)
end
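`llamacpp` takes an optional `key` plus arbitrary provider options and forwards both to the provider class. The shape of that signature, sketched with an explicit options splat for clarity (`Backend` is a hypothetical stand-in for `LLM::LlamaCpp`):

```ruby
Backend = Struct.new(:key, :opts)

# `key:` defaults to nil so a local llama.cpp server needs no credential;
# every other keyword rides through untouched in `opts`.
def llamacpp(key: nil, **opts)
  Backend.new(key, opts)
end

llamacpp(host: "localhost", port: 8080)
```

Defaulting the key to `nil` rather than requiring it is what distinguishes the local-server constructors (`llamacpp`, `ollama`) from the hosted-API ones above.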
.ollama(key: nil) ⇒ Ollama
Returns a new instance of Ollama.
# File 'lib/llm.rb', line 46

def ollama(key: nil, **)
  @mutex.synchronize { require_relative "llm/providers/ollama" unless defined?(LLM::Ollama) }
  LLM::Ollama.new(key:, **)
end