Azure OpenAI: Chat AI action

The Azure OpenAI: Chat AI action uses the ChatGPT and GPT-4 models available through Microsoft Azure OpenAI to generate text in a chat-like format.
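
For orientation only, the action corresponds to a request against the Azure OpenAI chat completions API. The following is a minimal sketch of an equivalent direct call, assuming the openai Python package (v1-style AzureOpenAI client); the endpoint, API key, API version, and deployment name shown are placeholders, not values from this procedure.

    # Hedged sketch: the kind of chat completions request the action wraps.
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
        api_key="<your-api-key>",                                   # placeholder
        api_version="2024-02-01",                                   # example version
    )

    response = client.chat.completions.create(
        model="<deployment-id>",   # the Deployment ID of your model deployment
        messages=[{"role": "user", "content": "Summarize our refund policy in two sentences."}],
        max_tokens=256,            # comparable to Max tokens
        temperature=0.7,           # comparable to Temperature
    )

    # Comparable to the value saved in str_chatai-response later in this example.
    print(response.choices[0].message.content)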

Prerequisites

  • You must have the Bot creator role to use the Azure OpenAI: Chat AI action in a bot.
  • Ensure that you have the necessary credentials to send a request, and include the Azure OpenAI: Authenticate action before calling any Microsoft Azure OpenAI actions.

This example shows how to send a natural language message using the Azure OpenAI: Chat AI action and get an appropriate response.

Procedure

  1. In the Automation Anywhere Control Room, navigate to the Actions pane, select Generative AI > Microsoft Azure OpenAI, drag Azure OpenAI: Chat AI, and place it on the canvas.
  2. Enter or select the following fields:

    Azure Chat AI

    1. For Authenticate, select Latest version to use the Azure OpenAI: Authenticate action with the API Key.
      If you select To be deprecated, you can authenticate using the API Key without calling the Authenticate action.
      Note: The To be deprecated option will be deprecated in an upcoming release.
    2. Enter Default as the session name to confine the chat to the current session.
    3. Enter the Deployment ID. The Deployment ID is linked to the large language model (LLM) you want to use for your prompt.
    4. Enter a chat Message for the model to use when generating a response.
      Note: Chat actions retain the results of previous chat actions within the same session. If you call chat actions consecutively, the model can relate subsequent messages to the earlier exchanges in that session. However, all chat history is deleted when the session ends. (A sketch of this multi-turn behavior follows the procedure.)
    5. Enter the maximum number of tokens (Max tokens) to generate. If you do not enter a value, the action automatically limits the length of the generated response so that the prompt and the response together stay within the maximum context length of the selected model.
    6. Enter a Temperature. This value controls the randomness of the response. As the temperature approaches zero, the response becomes more focused and deterministic; the higher the value, the more random the response.
    7. To manage the optional parameters, click Show more options and select Yes. If you select Yes, you can add other parameters such as Stop, Presence Penalty, Frequency Penalty, Logit bias, and User (see the optional-parameters sketch after this procedure). For information about these optional parameters, see Azure Open AI chat completions.
    8. Save the response to a variable. In this example, the response is saved to str_chatai-response.
  3. Click Run to start the bot. You can view the response by printing the variable in a Message box action. In this example, str_chatai-response contains the response.
    Tip: To maintain multiple chats in the same bot, create multiple sessions with different session names or variables.
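
The note in the Message step says that chat history is retained within a session so that follow-up messages can refer to earlier ones. The sketch below illustrates that multi-turn behavior in plain Python; it reuses the placeholder AzureOpenAI client from the earlier sketch and is not the action's internal implementation.

    # Illustration only: how accumulated chat history lets the model resolve
    # references to earlier turns within one session (for example, "Default").
    history = []  # one list per session name

    def chat(message: str) -> str:
        """Send a message and keep the exchange in the session history."""
        history.append({"role": "user", "content": message})
        response = client.chat.completions.create(
            model="<deployment-id>",   # placeholder Deployment ID
            messages=history,          # earlier turns give the model context
        )
        reply = response.choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        return reply

    chat("What is the capital of France?")
    chat("And what is its population?")  # "its" resolves because history is sent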
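
The optional parameters behind Show more options map to fields of the same chat completions request. Continuing with the placeholder client and deployment from the sketches above, the values below are illustrative only, not recommended settings.

    # Hedged sketch of the optional parameters from Show more options.
    response = client.chat.completions.create(
        model="<deployment-id>",
        messages=[{"role": "user", "content": "List three onboarding tips."}],
        stop=["\n\n"],                # Stop: sequences that end generation
        presence_penalty=0.5,         # Presence Penalty: discourage repeated topics
        frequency_penalty=0.5,        # Frequency Penalty: discourage repeated tokens
        logit_bias={"50256": -100},   # Logit bias: token ID mapped to a bias value
        user="bot-runner-01",         # User: identifier passed for abuse monitoring
    )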