Create Prompt
bedrockagent_create_prompt
Description
Creates a prompt in your prompt library that you can add to a flow. For more information, see Prompt management in Amazon Bedrock, Create a prompt using Prompt management, and Prompt flows in Amazon Bedrock in the Amazon Bedrock User Guide.
Usage
bedrockagent_create_prompt(clientToken, customerEncryptionKeyArn,
  defaultVariant, description, name, tags, variants)
Arguments
clientToken
A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
customerEncryptionKeyArn
The Amazon Resource Name (ARN) of the KMS key to encrypt the prompt.
defaultVariant
The name of the default variant for the prompt. This value must match the
name
field in the relevant PromptVariant object.description
A description for the prompt.
name
[required] A name for the prompt.
tags
Any tags that you want to attach to the prompt. For more information, see Tagging resources in Amazon Bedrock.
variants
A list of objects, each containing details about a variant of the prompt.
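As an illustration of how these arguments fit together, the following sketch builds a minimal variants entry for a TEXT template. The variant name ("draft"), the model ID, the template text, and the inference settings are placeholders, not required values; defaultVariant must repeat the variant's name exactly.

# Sketch of a single TEXT variant; every literal below is a placeholder.
draft_variant <- list(
  name = "draft",
  modelId = "<model-id>",
  templateType = "TEXT",
  templateConfiguration = list(
    text = list(
      text = "Summarize the following text: {{input}}",
      inputVariables = list(
        list(name = "input")
      )
    )
  ),
  inferenceConfiguration = list(
    text = list(
      maxTokens = 512,
      temperature = 0.2
    )
  )
)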
Value
A list with the following syntax:
list(
  arn = "string",
  createdAt = as.POSIXct(
    "2015-01-01"
  ),
  customerEncryptionKeyArn = "string",
  defaultVariant = "string",
  description = "string",
  id = "string",
  name = "string",
  updatedAt = as.POSIXct(
    "2015-01-01"
  ),
  variants = list(
    list(
      additionalModelRequestFields = list(),
      genAiResource = list(
        agent = list(
          agentIdentifier = "string"
        )
      ),
      inferenceConfiguration = list(
        text = list(
          maxTokens = 123,
          stopSequences = list(
            "string"
          ),
          temperature = 123.0,
          topP = 123.0
        )
      ),
      metadata = list(
        list(
          key = "string",
          value = "string"
        )
      ),
      modelId = "string",
      name = "string",
      templateConfiguration = list(
        chat = list(
          inputVariables = list(
            list(
              name = "string"
            )
          ),
          messages = list(
            list(
              content = list(
                list(
                  text = "string"
                )
              ),
              role = "user"|"assistant"
            )
          ),
          system = list(
            list(
              text = "string"
            )
          ),
          toolConfiguration = list(
            toolChoice = list(
              any = list(),
              auto = list(),
              tool = list(
                name = "string"
              )
            ),
            tools = list(
              list(
                toolSpec = list(
                  description = "string",
                  inputSchema = list(
                    json = list()
                  ),
                  name = "string"
                )
              )
            )
          )
        ),
        text = list(
          inputVariables = list(
            list(
              name = "string"
            )
          ),
          text = "string"
        )
      ),
      templateType = "TEXT"|"CHAT"
    )
  ),
  version = "string"
)
Request syntax
svc$create_prompt(
  clientToken = "string",
  customerEncryptionKeyArn = "string",
  defaultVariant = "string",
  description = "string",
  name = "string",
  tags = list(
    "string"
  ),
  variants = list(
    list(
      additionalModelRequestFields = list(),
      genAiResource = list(
        agent = list(
          agentIdentifier = "string"
        )
      ),
      inferenceConfiguration = list(
        text = list(
          maxTokens = 123,
          stopSequences = list(
            "string"
          ),
          temperature = 123.0,
          topP = 123.0
        )
      ),
      metadata = list(
        list(
          key = "string",
          value = "string"
        )
      ),
      modelId = "string",
      name = "string",
      templateConfiguration = list(
        chat = list(
          inputVariables = list(
            list(
              name = "string"
            )
          ),
          messages = list(
            list(
              content = list(
                list(
                  text = "string"
                )
              ),
              role = "user"|"assistant"
            )
          ),
          system = list(
            list(
              text = "string"
            )
          ),
          toolConfiguration = list(
            toolChoice = list(
              any = list(),
              auto = list(),
              tool = list(
                name = "string"
              )
            ),
            tools = list(
              list(
                toolSpec = list(
                  description = "string",
                  inputSchema = list(
                    json = list()
                  ),
                  name = "string"
                )
              )
            )
          )
        ),
        text = list(
          inputVariables = list(
            list(
              name = "string"
            )
          ),
          text = "string"
        )
      ),
      templateType = "TEXT"|"CHAT"
    )
  )
)
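A minimal end-to-end sketch, assuming the client is created with paws::bedrockagent() and reusing the placeholder draft_variant from the Arguments section above; the prompt name and description are likewise illustrative:

# Sketch only: names and values are placeholders, not required settings.
library(paws)

svc <- bedrockagent()

resp <- svc$create_prompt(
  name = "summarizer",                 # placeholder prompt name
  description = "Summarizes a block of input text.",
  defaultVariant = "draft",            # must match the variant's name
  variants = list(draft_variant)
)

# The response carries the new prompt's identifier, ARN, and version.
resp$id
resp$arn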