struct Anthropic::GeneratedMessage
- Anthropic::GeneratedMessage
- Struct
- Value
- Object
Included Modules
- Anthropic::Resource
- JSON::Serializable
Extended Modules
- JSON::Schema
Defined in:
generated_message.cr
Constructors
Instance Method Summary
- #content : Array(MessageContent)
  This is an array of content blocks, each of which has a #type that determines its shape.
- #id : String
  Unique object identifier.
- #message_thread : Array(Message)
  Messages that were passed to the
- #model : String
  The model that handled the request.
- #role : Message::Role
  Conversational role of the generated message.
- #stop_reason : StopReason | Nil
  The reason that we stopped.
- #stop_sequence : String | Nil
  Which custom stop sequence was generated, if any.
- #to_message
- #to_s(io) : Nil
- #type : String
  Object type. For Messages, this is always "message".
- #usage : Usage
  Billing and rate-limit usage.
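Since this struct includes JSON::Serializable, a response body can be deserialized directly from JSON. The toy struct below is a simplified sketch (it models only a few of the fields listed above, with placeholder values), not the real Anthropic::GeneratedMessage:

```crystal
require "json"

# Simplified sketch of how a struct that includes JSON::Serializable
# (as Anthropic::GeneratedMessage does) deserializes an API response.
# ToyMessage and the sample values below are illustrative only.
struct ToyMessage
  include JSON::Serializable

  getter id : String
  getter model : String
  getter stop_sequence : String?
end

msg = ToyMessage.from_json(%({"id": "msg_123", "model": "claude-3-opus-20240229", "stop_sequence": null}))
puts msg.id            # => msg_123
puts msg.stop_sequence # => (nil prints as an empty line)
```

The nilable stop_sequence field maps a JSON null onto Crystal's nil, which is why the real struct declares it as String | Nil.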
Constructor Detail
Instance Method Detail
This is an array of content blocks, each of which has a #type that determines its shape. Currently, the only #type in responses is "text".
Example:
[{ "type": "text", "text": "Hi, I'm Claude." }]
If the request input messages ended with an assistant turn, then the response #content will continue directly from that last turn. You can use this to constrain the model's output.
For example, if the input messages were:
[
{
"role": "user",
"content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun"
},
{ "role": "assistant", "content": "The best answer is (" }
]
Then the response #content might be:
[{ "type": "text", "text": "B)" }]
Conversational role of the generated message. This will always be :assistant.
The reason that we stopped. This may be one of the following values:
- :end_turn: the model reached a natural stopping point
- :max_tokens: we exceeded the requested max_tokens or the model's maximum
- :stop_sequence: one of your provided custom stop_sequences was generated
In non-streaming mode this value is always non-nil. In streaming mode, it is nil in the MessageStart event and non-nil otherwise.
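The branching above can be sketched in Crystal. Note this is a minimal, self-contained illustration: the toy enum below only mirrors the member names implied by this page, and describe is a hypothetical helper, not part of the library:

```crystal
# Toy stand-in for the library's StopReason enum, with the three
# members documented above. Crystal auto-generates the question
# methods (end_turn?, max_tokens?, stop_sequence?) for enum members.
enum StopReason
  EndTurn
  MaxTokens
  StopSequence
end

# Hypothetical helper showing how a caller might branch on the
# nilable stop_reason of a generated message.
def describe(reason : StopReason?) : String
  # In streaming mode, stop_reason is nil in the MessageStart event.
  return "still streaming (MessageStart event)" if reason.nil?

  case reason
  when .end_turn?      then "the model reached a natural stopping point"
  when .max_tokens?    then "the token limit was reached"
  when .stop_sequence? then "a custom stop sequence was generated"
  else                      "unknown"
  end
end

puts describe(StopReason::MaxTokens) # => the token limit was reached
puts describe(nil)                   # => still streaming (MessageStart event)
```

The early return on nil narrows the union type, so the case expression only ever sees a StopReason; this mirrors why the real field is typed StopReason | Nil.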
Which custom stop sequence was generated, if any. This value will be a non-nil string if one of your custom stop sequences was generated.
Billing and rate-limit usage.
Anthropic's API bills and rate-limits by token counts, as tokens represent the underlying cost to our systems.
Under the hood, the API transforms requests into a format suitable for the model. The model's output then goes through a parsing stage before becoming an API response. As a result, the token counts in #usage will not match one-to-one with the exact visible content of an API request or response.
For example, output_tokens will be non-zero, even for an empty string response from Claude.
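For illustration, the #usage object in a raw API response is a pair of token counts shaped like the following (the values here are placeholders, and additional billing fields may appear depending on the API features used):
{ "input_tokens": 2095, "output_tokens": 503 }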