class Llama::Error

Inheritance hierarchy:
- Llama::Error
- Exception
- Reference
- Object
Direct Known Subclasses
- Llama::Batch::Error
- Llama::Context::Error
- Llama::Context::TokenizationError
- Llama::KvCache::Error
- Llama::Model::Error
- Llama::Sampler::Error
- Llama::State::Error
Defined in:
llama/error.cr

Constant Summary
- ERROR_MESSAGES =
    {
       -1 => "General error",
       -2 => "Memory allocation error",
       -3 => "Batch processing error",
       -4 => "Context creation error",
       -5 => "Model loading error",
       -6 => "Tokenization error",
       -7 => "KV cache error",
       -8 => "State management error",
       -9 => "Sampling error",
      -10 => "Invalid parameter error",
      -11 => "File I/O error",
      -12 => "Network error",
      -13 => "GPU error",
      -14 => "Timeout error",
      -15 => "Unsupported operation error",
    }
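
ERROR_MESSAGES maps numeric error codes to human-readable descriptions. The sketch below is only an illustration of reading the table directly; it assumes the shard is loaded with require "llama" and uses nothing beyond the constant shown above:

    require "llama"

    # Direct lookups against the constant; values come from the table above.
    Llama::Error::ERROR_MESSAGES[-5]  # => "Model loading error"
    Llama::Error::ERROR_MESSAGES[-7]  # => "KV cache error"

    # Codes outside the table are absent, so guard the lookup with a default.
    Llama::Error::ERROR_MESSAGES.fetch(-99, "Unknown error (-99)")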
Class Method Summary
- .error_message(code : Int32) : String
- .format_error(message : String, code : Int32 | Nil = nil, context : String | Nil = nil) : String
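
Judging only from its signature, .error_message presumably resolves a code against ERROR_MESSAGES; the following is a hedged sketch of that assumed behaviour, not documented semantics:

    require "llama"

    code = -6
    # Assumed behaviour: returns the ERROR_MESSAGES entry for a known code.
    message = Llama::Error.error_message(code)  # expected: "Tokenization error"

    # Llama::Error inherits Exception, so the string can be wrapped and raised.
    begin
      raise Llama::Error.new(message)
    rescue ex : Llama::Error
      puts ex.message
    end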
Class Method Detail
def self.format_error(message : String, code : Int32 | Nil = nil, context : String | Nil = nil) : String
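
No description accompanies .format_error in this section, so the example below only exercises the signature; the commented output is an assumption about a plausible format, not the documented one:

    require "llama"

    # Compose an error string from a base message, an optional code, and an
    # optional context string (code and context default to nil per the signature).
    plain    = Llama::Error.format_error("failed to load model")
    detailed = Llama::Error.format_error("failed to load model", -5, "weights path")

    puts plain
    puts detailed  # assumed to append the code (and/or "Model loading error") and the context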