Connect your own generative AI services

If your organization’s security protocols or data privacy policies require documents to be processed within your own infrastructure and prohibit sharing data or documents with vendors other than your approved cloud providers (such as Amazon, Google, or Microsoft), connect your own generative AI service account to Document Automation.

Prerequisites

Note: For any queries or issues with your third-party provider license, contact your third-party provider directly.
Ensure you are logged in to the Control Room as an administrator, or as a user with either the AAE_Locker Admin role or a user-created role that has the Manage my credentials and lockers permission, so that you can configure lockers, add credentials, and grant access to other users.
Supported services

• Anthropic: Anthropic models are supported only when provided through Amazon Bedrock or GCP Vertex AI; the native Anthropic service is not supported.
• Azure OpenAI

When creating a learning instance, you can select a generative AI service, such as Azure OpenAI or Anthropic. By default, Document Automation uses the Automation Anywhere service account, but if you need greater control over data flow, you can connect your own credentials within the extraction bot of the learning instance. While the same credentials can be used for multiple learning instances, each instance must be explicitly configured. The exact procedure for this is detailed below.

Note: As an administrator, you can enable or disable generative AI services and select the region used for these services in Administration > Settings > Document Automation Settings. However, these settings only apply when using the default Automation Anywhere service account and are not applicable when you configure your own credentials.

To securely store credentials and share them with other users or Bot Runners, create and save the credentials, store them in a locker, and select the users who can access them.
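The create-credential and create-locker steps below can also be scripted against the Control Room REST API. The endpoint paths and payload field names in this sketch (`/v1/authentication`, `/v2/credentialvault/credentials`, `/v2/credentialvault/lockers`, and the attribute fields) are assumptions based on the public Control Room API and may differ by version; verify them against your Control Room's API reference before use. The Control Room URL shown is a placeholder.

```python
# Sketch: creating a credential and a locker via the Control Room REST API.
# Endpoint paths and payload shapes are assumptions -- confirm against your
# Control Room version's API documentation.
import json
from urllib import request

CONTROL_ROOM = "https://your-control-room.example.com"  # placeholder URL


def build_credential_payload(name: str, attribute_names: list[str]) -> dict:
    """Body for POST /v2/credentialvault/credentials (assumed shape)."""
    return {
        "name": name,
        "attributes": [
            # masked/passwordFlag keep the stored key hidden in the UI
            {"name": attr, "userProvided": False, "masked": True, "passwordFlag": True}
            for attr in attribute_names
        ],
    }


def build_locker_payload(name: str, credential_ids: list[str]) -> dict:
    """Body for POST /v2/credentialvault/lockers (assumed shape)."""
    return {"name": name, "credentials": [{"id": cid} for cid in credential_ids]}


def post(path: str, token: str, payload: dict) -> dict:
    """POST a JSON payload to the Control Room and return the JSON response."""
    req = request.Request(
        CONTROL_ROOM + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "X-Authorization": token},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Example flow (requires a live Control Room; shown for illustration only):
#   token = post("/v1/authentication", "", {"username": "...", "password": "..."})["token"]
#   cred = post("/v2/credentialvault/credentials", token,
#               build_credential_payload("bedrock-keys", ["access_key", "secret_access_key"]))
#   post("/v2/credentialvault/lockers", token,
#        build_locker_payload("genai-locker", [cred["id"]]))
```

The payload builders are kept separate from the HTTP call so the request bodies can be reviewed (or audited) before anything is sent to the Control Room.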

Procedure

  1. Log in to your Control Room.
  2. Navigate to Manage > Credentials.
  3. On the Credentials page, click the plus (+) icon to create a new credential.
  4. In the Credential name field, enter a name for the credential.
  5. In the Attribute name field, enter a name for the attribute.
  6. Select the Standard input option.
  7. Depending on the generative AI service you are using, store the keys to access your account:
    • Anthropic Amazon Bedrock: Copy your access key and paste it into the Value field. Then repeat steps 5 and 6 to add a second attribute and store your secret access key.
    • Anthropic GCP Vertex AI: Copy your access key and paste it in the Value field.
    • Azure OpenAI: Copy your API key for GPT and paste it into the Value field.
    Note: For Azure OpenAI, you can use a GPT LLM and the ADA embedding model. If both models are on the same account, a single account API key covers both, although each model is accessed through a different URL. If the models are on different accounts, repeat steps 5 through 7 to create and store the API key for the other account.
  8. Click Create credential.
    Now, create a locker to store the created credentials and provide access to users.
  9. Navigate to the Lockers tab and click the plus (+) icon to create a new locker.
  10. In the Locker name field, enter a name for the locker.
  11. Select the credential to be stored in this locker and click the right arrow to move the credential to the Selected column.
  12. In the Consumers tab, select the roles that will consume this locker and credential and click the right arrow to move the roles to the Selected column.
  13. Click Create locker.
  14. Configure Additional settings for the generative AI service that you want to use in the Extract data action of the learning instance. See Extract data action.
  15. Click Save.
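The per-service keys from step 7 can be summarized in a small lookup table. The attribute names in this sketch are illustrative placeholders (they must match whatever names you entered in step 5); the helper simply checks that every required key has been stored before you move on:

```python
# Keys each generative AI service needs, per step 7 above.
# Attribute names are illustrative -- use the names you entered when
# creating the credential attributes in step 5.
REQUIRED_ATTRIBUTES = {
    "anthropic-amazon-bedrock": ["access_key", "secret_access_key"],
    "anthropic-gcp-vertex-ai": ["access_key"],
    # One account API key can serve both the GPT LLM and the ADA embedding
    # model, but each model is reached through a different URL.
    "azure-openai": ["api_key"],
}


def missing_attributes(service: str, stored: set[str]) -> list[str]:
    """Return the attribute names not yet stored for the given service."""
    return [name for name in REQUIRED_ATTRIBUTES[service] if name not in stored]
```

For example, `missing_attributes("anthropic-amazon-bedrock", {"access_key"})` reports that the secret access key still needs to be stored as a second attribute.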
After your credentials are configured and connected, the learning instance uses your selected generative AI service account for document processing.