Anthropic: Chat AI action

The Anthropic Chat AI action connects your automations to the Anthropic chat models available through Amazon Bedrock. This action enables automations to engage in natural, informative, and context-aware conversations with users, providing a more personalized and engaging automation experience.

Prerequisites

  • You must have the Bot creator role to use the Anthropic Chat AI action in a bot.
  • Ensure that you have the necessary credentials to send a request. For more information on acquiring the credentials, see Amazon Bedrock: Authenticate action.

This example shows how to send a natural language message using the Anthropic Chat AI action and get an appropriate response.

Procedure

  1. In the Control Room, navigate to the Actions pane, select Generative AI > Amazon Bedrock, drag the Anthropic: Chat AI action, and place it in the canvas.
  2. Enter or select the following fields:

    Anthropic Chat AI

    1. Enter the Region. For information on Region, see Amazon Bedrock GA regions.
    2. Select a large language model (LLM) to use for your prompt from the Model dropdown. You can select the following models:
      • Claude Instant v1.2
      • Claude v1.3
      • Claude v2
      • Claude v2.1
      • Claude 3 Sonnet v1
      • Claude 3 Haiku v1
      • Other supported version to enter the model ID of another supported model.
      Note: When you select Claude 3 Sonnet v1 or Claude 3 Haiku v1, a text box appears for entering a System Prompt (optional). A system prompt in Claude 3 is a way to provide context, instructions, and guidelines to the large language model before it interacts with you. It acts as a setup for the conversation, letting Claude 3 know what you expect from it. For more information on system prompts, see Use system prompts and Anthropic Claude Messages API.

    3. Enter a chat Message for the model to use to generate a response.
      Note: Chat actions retain the results of previous chat actions within the same session. If you call chat actions consecutively, the model can relate subsequent messages to the previous messages. However, all chat history is deleted after the session ends.
    4. Enter the Maximum length.
      If you do not enter a value, the maximum length is set automatically so that the prompt and the generated response stay within the maximum context length of the selected model.
    5. Enter a Temperature. This value controls the randomness of the response. As the temperature approaches zero, the response becomes more focused and deterministic; the higher the value, the more random the response.
    6. Enter Default as the Session name to limit the chat history to the current session.
    7. To manage the optional parameters, click Show more options and select Yes. If you select Yes, you can add other parameters such as Top P, Top K, Add instructions, Stop sequences, or an Anthropic version. For information about these optional parameters, see Learn Models. For a sketch of how these fields map to the underlying API request, see the example after this procedure.
      Note: Claude 3 models accept a System Prompt, not Add instructions. Unlike traditional instructions, system prompts provide a structured way to guide Claude 3. This is because Claude 3 is trained to understand the intent behind your prompt and generate responses that fulfill that goal, rather than simply following a set of commands.
    8. Save the response to a variable.
      In this example, the response is saved to str_Anthropic_chatResponse.
  3. Click Run to start the bot.
    You can read the value by printing the response with a Message box action. In this example, the Message box prints str_Anthropic_chatResponse. You can add additional chat requests to get additional responses.
    Tip: To maintain multiple chats in the same bot, you will need to create multiple sessions with different names or variables.
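
The Anthropic Chat AI action builds and sends the API request for you. For reference only, the following is a minimal sketch of a roughly equivalent call for a Claude 3 model, written with the AWS SDK for Python (boto3) against the Anthropic Claude Messages API on Amazon Bedrock. The Region, model ID, prompt text, and parameter values are illustrative assumptions, not values produced by the action.

# Minimal sketch (not the action's internal implementation): a roughly
# equivalent Amazon Bedrock call for a Claude 3 model using the Anthropic
# Messages API. Region, model ID, and parameter values are assumptions.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")   # Region field

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "system": "You are a concise, helpful assistant.",              # System Prompt (optional, Claude 3)
    "messages": [
        {"role": "user", "content": "Summarize our leave policy in two sentences."}   # Message
    ],
    "max_tokens": 512,                  # Maximum length
    "temperature": 0.5,                 # Temperature
    "top_p": 0.9,                       # Top P (optional)
    "top_k": 250,                       # Top K (optional)
    "stop_sequences": ["\n\nHuman:"],   # Stop sequences (optional)
}

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",               # Model dropdown
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
# The generated text is what the action saves to a variable such as
# str_Anthropic_chatResponse.
print(result["content"][0]["text"])

Older Claude models also accept the Anthropic Text Completions request format (a Human:/Assistant: prompt string with max_tokens_to_sample), whereas Claude 3 models require the Messages API format shown above; this difference matters only if you reproduce the call outside the action.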