class LLM::GeneralAdapter

Superclass hierarchy:
- LLM::GeneralAdapter
- Reference
- Object
Overview
Adapter for OpenAI-compatible chat APIs (LLM::General).
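A minimal usage sketch. The constructor arguments for `LLM::General` and `LLM::GeneralAdapter` shown here are assumptions for illustration, not documented on this page:

```crystal
require "llm"

# Hypothetical construction: the exact constructor signatures are assumptions.
client  = LLM::General.new(base_url: "https://api.example.com/v1", model: "gpt-4o-mini")
adapter = LLM::GeneralAdapter.new(client)

# Send a single prompt; the format parameter defaults to "json".
answer = adapter.request("Summarize the plan in one sentence.")
puts answer
```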
Included Modules
- LLM::Adapter
Defined in:
llm/adapter.cr

Constructors
Instance Method Summary
- #client : LLM::General
- #request(prompt : String, format : String = "json") : String
  Send a single prompt and get a response as a String.
- #request_messages(messages : Messages, format : String = "json") : String
  Send chat-style messages (system/user) and get a response as a String.
- #request_messages_with_tools(messages : Messages, tools : String) : String
  Request the next step using provider-native tool definitions.
- #supports_native_tool_calling? : Bool
  Whether this adapter can leverage provider-native tool-calling.
Instance methods inherited from module LLM::Adapter:
- close : Nil
- request(prompt : String, format : String = "json") : String
- request_messages(messages : Messages, format : String = "json") : String
- request_messages_with_tools(messages : Messages, _tools : String) : String
- request_with_context(system : String | Nil, user : String, format : String = "json", cache_key : String | Nil = nil) : String
- supports_context? : Bool
- supports_native_tool_calling? : Bool
Constructor Detail
Instance Method Detail

def request(prompt : String, format : String = "json") : String
Description copied from module LLM::Adapter
Send a single prompt and get a response as a String.
def request_messages(messages : Messages, format : String = "json") : String
Description copied from module LLM::Adapter
Send chat-style messages (system/user) and get a response as a String.
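A sketch of a chat-style call, assuming an `adapter` is already constructed and that `Messages` is a collection of role/content pairs with an `add` method. Those names and the construction API are assumptions inferred from the system/user wording above:

```crystal
# Hypothetical Messages API: the .new/.add shape is an assumption.
messages = LLM::Messages.new
messages.add(role: "system", content: "You are a terse assistant.")
messages.add(role: "user", content: "List three Crystal web frameworks.")

reply = adapter.request_messages(messages, format: "json")
```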
def request_messages_with_tools(messages : Messages, tools : String) : String
Description copied from module LLM::Adapter
Request the next step using provider-native tool definitions. Implementations that do not support this can fall back to regular JSON-mode requests.
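The `tools` parameter is a plain String, presumably a JSON payload of tool definitions. A hedged sketch, assuming an `adapter` and `messages` are in scope and borrowing the OpenAI-style function-calling shape for the JSON (an assumption, not documented on this page):

```crystal
# Assumed tool-definition format; the provider may expect a different shape.
tools = <<-JSON
[
  {
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Look up current weather for a city",
      "parameters": {
        "type": "object",
        "properties": { "city": { "type": "string" } },
        "required": ["city"]
      }
    }
  }
]
JSON

next_step = adapter.request_messages_with_tools(messages, tools)
```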
def supports_native_tool_calling? : Bool
Description copied from module LLM::Adapter
Whether this adapter can leverage provider-native tool-calling.
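Since the module documentation says implementations without native tool-calling can fall back to regular JSON-mode requests, callers can branch on this predicate. A sketch, assuming `adapter`, `messages`, and a `tools` JSON string are already in scope (hypothetical names):

```crystal
response =
  if adapter.supports_native_tool_calling?
    adapter.request_messages_with_tools(messages, tools)
  else
    # Fall back to a plain JSON-mode request; the caller would need to
    # describe the tools inside the messages themselves in this branch.
    adapter.request_messages(messages, format: "json")
  end
```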