class MCProtocol::CreateMessageRequestParams

Inheritance
- MCProtocol::CreateMessageRequestParams
- Reference
- Object
Included Modules
- JSON::Serializable
Defined in:
mcprotocol/create_message_request.cr

Constructors
- .new(maxTokens : Int64, messages : Array(SamplingMessage), includeContext : CreateMessageRequestParamsIncludeContext | Nil = Nil, metadata : JSON::Any | Nil = Nil, modelPreferences : ModelPreferences | Nil = Nil, stopSequences : Array(String) | Nil = Nil, systemPrompt : String | Nil = Nil, temperature : Float64 | Nil = Nil)
- .new(pull : JSON::PullParser)
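
Only maxTokens and messages are required; every other argument defaults to Nil. Below is a minimal construction sketch. Note that the SamplingMessage and TextContent constructors are not documented on this page, so their named arguments here are assumptions based on the MCP schema rather than confirmed signatures.

```crystal
require "mcprotocol"

# Assumed helper constructors: SamplingMessage and TextContent are not
# documented on this page, so the named arguments below are assumptions.
message = MCProtocol::SamplingMessage.new(
  role: MCProtocol::Role::User,
  content: MCProtocol::TextContent.new(text: "Summarize the open files.")
)

# Only maxTokens and messages are required; the rest default to Nil.
params = MCProtocol::CreateMessageRequestParams.new(
  maxTokens: 256,
  messages: [message],
  systemPrompt: "You are a concise assistant.", # the client MAY modify or omit this
  temperature: 0.7
)
```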
Instance Method Summary
- #includeContext : CreateMessageRequestParamsIncludeContext | Nil
  A request to include context from one or more MCP servers (including the caller), to be attached to the prompt.
- #maxTokens : Int64
  The maximum number of tokens to sample, as requested by the server.
- #messages : Array(SamplingMessage)
- #metadata : JSON::Any | Nil
  Optional metadata to pass through to the LLM provider.
- #modelPreferences : ModelPreferences | Nil
  The server's preferences for which model to select.
- #stopSequences : Array(String) | Nil
- #systemPrompt : String | Nil
  An optional system prompt the server wants to use for sampling.
- #temperature : Float64 | Nil
Constructor Detail
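The second overload, .new(pull : JSON::PullParser), is the constructor JSON::Serializable uses when decoding, which also makes .from_json and #to_json available. A small round-trip sketch, assuming the JSON keys match the camelCase property names shown in the summary above:

```crystal
require "json"
require "mcprotocol"

# JSON::Serializable plus the pull-parser constructor give this class
# .from_json and #to_json. Keys are assumed to match the camelCase
# property names listed in the method summary.
json = %({"maxTokens": 128, "messages": []})

params = MCProtocol::CreateMessageRequestParams.from_json(json)
params.maxTokens # => 128
params.to_json   # => JSON string with the same fields (nil fields omitted)
```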
Instance Method Detail

#includeContext : CreateMessageRequestParamsIncludeContext | Nil
A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request.

#maxTokens : Int64
The maximum number of tokens to sample, as requested by the server. The client MAY choose to sample fewer tokens than requested.

#metadata : JSON::Any | Nil
Optional metadata to pass through to the LLM provider. The format of this metadata is provider-specific.

#modelPreferences : ModelPreferences | Nil
The server's preferences for which model to select. The client MAY ignore these preferences.

#systemPrompt : String | Nil
An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt.
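
Because every optional property types as ... | Nil and the spec's MAY language makes these values non-binding hints, a client typically falls back to its own defaults when a field is absent. A hedged sketch; the helper names and the 1.0 default are illustrative and not part of this API:

```crystal
require "mcprotocol"

# Hypothetical client-side helpers (not part of this API): each optional
# property is Nil when the server omitted it, and the client MAY override
# or ignore the server's hints.
def effective_temperature(params : MCProtocol::CreateMessageRequestParams) : Float64
  params.temperature || 1.0 # 1.0 is an illustrative client default
end

def include_context?(params : MCProtocol::CreateMessageRequestParams) : Bool
  !params.includeContext.nil?
end
```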