Azure OpenAI: Prompt AI action

The Azure OpenAI: Prompt AI action uses the Microsoft Azure OpenAI completions endpoint to generate text. This endpoint gives you access to the model's text-in, text-out interface: you give the model a text prompt in natural language, and it generates a completion.
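Under the hood, the action calls the Azure OpenAI completions REST endpoint. As a rough sketch of that request (the URL format and payload fields follow the public Azure OpenAI REST reference; the resource name, deployment ID, and API version below are placeholder values, and the API key is supplied separately):

```python
# Sketch of the HTTP request the Prompt AI action issues against the
# Azure OpenAI completions endpoint. Resource name, deployment ID, and
# API version are placeholders -- substitute your own values.
RESOURCE = "my-resource"          # placeholder Azure OpenAI resource name
DEPLOYMENT_ID = "my-deployment"   # placeholder deployment ID (selects the LLM)
API_VERSION = "2023-05-15"        # example api-version value

url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT_ID}/completions?api-version={API_VERSION}"
)

headers = {
    "api-key": "<YOUR-API-KEY>",   # provided via the Authenticate action
    "Content-Type": "application/json",
}

payload = {
    "prompt": "Write a one-line greeting for a new customer.",
    "max_tokens": 50,     # cap on the number of generated tokens
    "temperature": 0.2,   # low value -> more focused, deterministic output
}

print(url)
```

The fields in `payload` map directly onto the fields you fill in on the action: Prompt, maximum tokens, and Temperature.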

Prerequisites

  • You must have the Bot creator role to use the Azure OpenAI: Prompt AI action in a bot.
  • Ensure that you have the necessary credentials to send a request, and include the Azure OpenAI: Authenticate action before calling any Microsoft Azure OpenAI actions.

This example shows how to send a natural language prompt using the Azure OpenAI: Prompt AI action and get an appropriate response.

Procedure

  1. In the Automation Anywhere Control Room, navigate to the Actions pane, select Generative AI > Microsoft, and drag Azure OpenAI: Prompt AI onto the canvas.
  2. Enter or select the following fields:

    Azure Prompt AI

    1. For Authenticate, select Latest version to use the Azure OpenAI: Authenticate action with the API Key.
      If you select To be deprecated, you can authenticate using the API Key without calling the Authenticate action.
      Note: The To be deprecated option will be deprecated in the upcoming release.
    2. Enter Default as the session name to limit the authentication session to the current session.
    3. Enter the Deployment ID. The Deployment ID is linked to the large language model (LLM) you want to use for your prompt.
    4. Enter a Prompt to use by the model to generate a response.
    5. Enter the maximum number of tokens to generate. If you do not enter a value, the number of generated tokens is automatically capped so that the prompt and the response together fit within the maximum context length of the selected model.
    6. Enter a Temperature. This value controls the randomness of the response. As the temperature approaches zero, the response becomes more focused and deterministic; the higher the value, the more random the response.
    7. To manage the optional parameters, click Show more options and select Yes. You can then set other parameters such as: API version, Suffix, Top P, N, Logprobs, Echo, Stop, Presence Penalty, Frequency Penalty, Best of, Logit bias, and User. For more information about these optional parameters, see Azure OpenAI completions.
    8. Save the response to a variable. In this example, the response is saved to str_promptai_response.
  3. Click Run to start the bot. The output of the prompt is a list; if you set the optional parameter N, the model generates multiple completions for each input. To read the response, print it in a Message box action. In this example, the first element of the list, str_promptai_response[0], prints the response because N is set to 1.