CreateMessageRequestParams

data class CreateMessageRequestParams(val maxTokens: Int, val messages: List<SamplingMessage>, val modelPreferences: ModelPreferences? = null, val systemPrompt: String? = null, val includeContext: IncludeContext? = null, val temperature: Double? = null, val stopSequences: List<String>? = null, val metadata: JsonObject? = null, val meta: RequestMeta? = null) : RequestParams

Parameters for a sampling/createMessage request.

Constructors

constructor(maxTokens: Int, messages: List<SamplingMessage>, modelPreferences: ModelPreferences? = null, systemPrompt: String? = null, includeContext: IncludeContext? = null, temperature: Double? = null, stopSequences: List<String>? = null, metadata: JsonObject? = null, meta: RequestMeta? = null)
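
A minimal construction sketch; it assumes the SDK's SamplingMessage, TextContent, Role, ModelPreferences, and ModelHint types mirror the corresponding MCP specification fields:

// Sketch only: SamplingMessage, TextContent, Role, ModelPreferences, and
// ModelHint are assumed to follow the MCP specification field names.
val params = CreateMessageRequestParams(
    maxTokens = 512,
    messages = listOf(
        SamplingMessage(
            role = Role.user,
            content = TextContent(text = "Summarize the release notes in three bullet points.")
        )
    ),
    systemPrompt = "You are a concise technical assistant.",
    temperature = 0.3,
    stopSequences = listOf("END"),
    modelPreferences = ModelPreferences(
        hints = listOf(ModelHint(name = "claude-3-5-sonnet")),
        costPriority = null,
        speedPriority = 0.2,
        intelligencePriority = 0.8
    )
)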

Properties

val includeContext: IncludeContext?

A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request.

val maxTokens: Int

The requested maximum number of tokens to sample (to prevent runaway completions). The client MAY choose to sample fewer tokens than the requested maximum.

val messages: List<SamplingMessage>

The messages to use as context for sampling. Typically includes conversation history and the current user message.

@SerialName(value = "_meta")
open override val meta: RequestMeta?

Optional metadata for this request. May include a progressToken for out-of-band progress notifications.

val metadata: JsonObject?

Optional metadata to pass through to the LLM provider. The format of this metadata is provider-specific.

val modelPreferences: ModelPreferences?

The server's preferences for which model to select. The client MAY ignore these preferences and choose any model.

val stopSequences: List<String>?

Optional list of sequences that will stop generation if encountered.

val systemPrompt: String?

An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt.

val temperature: Double?

Optional temperature parameter for sampling (typically 0.0-2.0). Higher values make output more random, lower values make it more deterministic.
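
Because metadata is a plain kotlinx.serialization JsonObject, provider-specific fields can be attached with buildJsonObject; the keys shown here are purely illustrative:

import kotlinx.serialization.json.buildJsonObject
import kotlinx.serialization.json.put

// Illustrative, provider-specific keys; only the LLM provider that receives
// them gives them meaning.
val providerMetadata = buildJsonObject {
    put("user_id", "session-1234")
    put("log_prompt", false)
}

val params = CreateMessageRequestParams(
    maxTokens = 256,
    messages = emptyList(), // a real request would carry at least one SamplingMessage
    metadata = providerMetadata
)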