class LLM::GeneralAdapter

Overview

Adapter for OpenAI-compatible chat APIs (LLM::General).

Included Modules

LLM::Adapter

Defined in:

llm/adapter.cr

Constructors

.new(client : LLM::General, native_tool_calling_enabled : Bool = true)

Instance Method Summary

#client : LLM::General
#request(prompt : String, format : String = "json") : String
#request_messages(messages : Messages, format : String = "json") : String
#request_messages_with_tools(messages : Messages, tools : String) : String
#supports_native_tool_calling? : Bool

Instance methods inherited from module LLM::Adapter

close : Nil
request(prompt : String, format : String = "json") : String
request_messages(messages : Messages, format : String = "json") : String
request_messages_with_tools(messages : Messages, _tools : String) : String
request_with_context(system : String | Nil, user : String, format : String = "json", cache_key : String | Nil = nil) : String
supports_context? : Bool
supports_native_tool_calling? : Bool

Constructor Detail

def self.new(client : LLM::General, native_tool_calling_enabled : Bool = true)

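A minimal construction sketch. The `LLM::General.new` arguments shown here (base URL, model, API key) are assumptions for illustration; this page does not document that constructor.

```crystal
require "llm"

# Hypothetical client construction: the actual LLM::General.new
# signature is not documented on this page.
client = LLM::General.new(
  base_url: "http://localhost:11434/v1",
  model: "llama3",
  api_key: ENV["LLM_API_KEY"]?
)

# Wrap the client in the adapter; native tool calling defaults to true.
adapter = LLM::GeneralAdapter.new(client)

# Or opt out of provider-native tool calling explicitly.
plain_adapter = LLM::GeneralAdapter.new(client, native_tool_calling_enabled: false)
```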

Instance Method Detail

def client : LLM::General

def request(prompt : String, format : String = "json") : String
Description copied from module LLM::Adapter

Send a single prompt and get a response as a String.


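A single-prompt usage sketch, assuming `adapter` is an already-constructed `LLM::GeneralAdapter`. The alternative `"text"` format value is an assumption; only the `"json"` default appears on this page.

```crystal
# One-shot prompt; the response comes back as a String.
answer = adapter.request("List three Crystal web frameworks.")
puts answer

# Assumed non-default format value for plain-text output.
plain = adapter.request("Say hello.", format: "text")
```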
def request_messages(messages : Messages, format : String = "json") : String
Description copied from module LLM::Adapter

Send chat-style messages (system/user) and get a response as a String.


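A chat-style usage sketch, assuming `adapter` is an already-constructed `LLM::GeneralAdapter`. The shape of `Messages` is not documented on this page; an OpenAI-style array of role/content pairs is an assumption.

```crystal
# Assumed Messages layout: role/content pairs in OpenAI chat style.
messages = [
  {role: "system", content: "You are a terse assistant."},
  {role: "user", content: "Summarize RFC 2119 in one sentence."},
]

reply = adapter.request_messages(messages, format: "json")
```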
def request_messages_with_tools(messages : Messages, tools : String) : String
Description copied from module LLM::Adapter

Request the next step using provider-native tool definitions. Implementations that do not support this can fall back to regular JSON-mode requests.


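A tool-calling sketch, assuming `adapter` and `messages` are already set up as above. `tools` is passed as a `String`; an OpenAI-style JSON tool definition is a plausible encoding, but the exact schema expected here is an assumption.

```crystal
# Assumed OpenAI-style tool definition; the schema the adapter
# actually expects is not documented on this page.
tools = <<-JSON
[{"type": "function",
  "function": {"name": "get_weather",
               "parameters": {"type": "object",
                              "properties": {"city": {"type": "string"}}}}}]
JSON

step = adapter.request_messages_with_tools(messages, tools)
```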
def supports_native_tool_calling? : Bool
Description copied from module LLM::Adapter

Whether this adapter can leverage provider-native tool-calling.


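A capability-check sketch, assuming `adapter`, `messages`, and `tools` are set up as above. This mirrors the fallback behaviour described for `request_messages_with_tools`: use native tool calling when available, otherwise a regular JSON-mode request.

```crystal
# Branch on capability: fall back to JSON mode when the adapter
# cannot use provider-native tool definitions.
response =
  if adapter.supports_native_tool_calling?
    adapter.request_messages_with_tools(messages, tools)
  else
    adapter.request_messages(messages, format: "json")
  end
```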