Anthropic: Prompt AI action

The Anthropic Prompt AI action connects your automations to Anthropic prompt AI functionality on Amazon Bedrock. This action enables automations to generate human-quality text, translate languages, write different kinds of creative content, and answer questions in an informative way, all based on user-defined prompts.

Prerequisites

  • You must have the Bot creator role to use the Anthropic Prompt AI action in a bot.
  • Ensure that you have the necessary credentials to send a request. For more information on acquiring the credentials, see Amazon Bedrock: Authenticate action.

This example shows how to send a natural language message using the Anthropic Prompt AI action and get an appropriate response.

Procedure

  1. In the Automation Anywhere Control Room, navigate to the Actions pane, select Generative AI > Amazon Bedrock, drag Anthropic: Prompt AI and place it in the canvas.
  2. Enter or select the following fields:

    Anthropic Prompt AI

    1. Enter the Region. For information on Region, see Amazon Bedrock GA regions.
    2. Select a large language model (LLM) to use for your prompt from the Model dropdown. You can select the following models:
      • Claude Instant v1.2
      • Claude v1.3
      • Claude v2
      • Claude v2.1
      • Claude 3 Sonnet v1
      • Claude 3 Haiku v1
      • Other supported version to enter any other supported model.
      Note: If you select Claude 3 Sonnet v1 or Claude 3 Haiku v1, a text box appears where you can enter a System Prompt (optional). A system prompt in Claude 3 is a way to provide context, instructions, and guidelines to the large language model before it interacts with the user. It acts as a setup for the conversation, telling Claude 3 what the user expects. For more information about system prompts, see System Prompts and Anthropic Claude Messages API.

    3. Enter a prompt Message for the model to use when generating a response.
    4. Enter the Maximum length.
      If you do not enter a value, the maximum length is set automatically so that the prompt and the generated response together stay within the maximum context length of the selected model.
    5. Enter a Temperature. This value refers to the randomness of the response. As the temperature approaches zero, the response becomes more focused and deterministic. The higher the value, the more random the response.
    6. Enter Default as the session name to limit the session to the current session.
    7. To manage the optional parameters, click Show more options and select Yes. If you select Yes, you can add other parameters such as Top P, Top K, and Stop sequences, or enter an Anthropic version. For information about these optional parameters, see Learn Models.
    8. Save the response to a variable.
      In this example, the response is saved to str_Anthropic_promptResponse.
  3. Click Run to start the bot.
    You can read the value of the variable by printing the response in a Message box action. In this example, str_Anthropic_promptResponse prints the response.
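
Behind the scenes, the fields in the steps above map onto the JSON request body that the Amazon Bedrock InvokeModel API expects for Claude models. The following Python sketch shows how those fields could be assembled into an Anthropic Messages API payload for the Claude 3 models; the helper name and default values are illustrative assumptions, not part of the action itself.

```python
import json


def build_claude_messages_body(message, system_prompt=None, max_tokens=512,
                               temperature=0.5, top_p=None, top_k=None,
                               stop_sequences=None):
    """Assemble a Messages API request body for Claude 3 on Amazon Bedrock.

    Parameters mirror the action's fields: message -> Message,
    max_tokens -> Maximum length, temperature -> Temperature, and the
    optional Top P, Top K, and Stop sequences under Show more options.
    """
    body = {
        # Version string Bedrock requires for the Anthropic Messages API.
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "temperature": temperature,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": message}]}
        ],
    }
    if system_prompt:
        # Corresponds to the System Prompt (optional) text box shown for
        # Claude 3 Sonnet v1 and Claude 3 Haiku v1.
        body["system"] = system_prompt
    if top_p is not None:
        body["top_p"] = top_p
    if top_k is not None:
        body["top_k"] = top_k
    if stop_sequences:
        body["stop_sequences"] = stop_sequences
    return json.dumps(body)


# With AWS credentials configured, the body could be sent like this
# (requires boto3; the model ID is an assumption, check your Bedrock console):
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   resp = client.invoke_model(
#       modelId="anthropic.claude-3-haiku-20240307-v1:0",
#       body=build_claude_messages_body("Summarize this invoice in one line."),
#   )
#   text = json.loads(resp["body"].read())["content"][0]["text"]
```

Note that the older models in the list (Claude Instant v1.2, Claude v1.3, Claude v2, Claude v2.1) use Anthropic's earlier Text Completions request format rather than the Messages API, which is why the system prompt text box only appears for the Claude 3 models.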