Create Fine tuned Model connections

Create Fine tuned Model connections by customizing the supported foundational models through testing and fine-tuning, and make them available to Pro Developers for use in the AI Skills they create.

Connect to your instances of the foundational models from hyperscaler vendors such as Amazon Bedrock, Google Vertex AI, Azure OpenAI, or OpenAI and customize their models by fine-tuning them and saving them with a specific name.

The Automation Admin creates and tests the Fine tuned Model connections and makes these available to the Pro Developers, who can connect to them when creating AI Skills. The Model connections are used in AI Skills to send prompts and receive responses from the models.

The Automation Admin creates custom roles, assigns these Fine tuned Model connections to those roles, and then assigns the roles to users to grant them access to the Model connections.

This feature gives you the option to create your own custom models based on your specific use case. As these Fine tuned Model connections are customized and trained using your data, within your organization's environment, you can govern and monitor their use as per your compliance and governance policies.

Additionally, when creating Fine tuned Model connections in Automation Anywhere, you can use Fine tuned models created with Amazon Bedrock in AWS Services and with Google Vertex AI in Google Data Store.

Create Fine tuned Model connections

A few things to consider

To use Fine tuned Model connections, you must first create the fine-tuned models via the tooling provided by the hyperscaler vendors. These Fine tuned models are supported out-of-the-box by AI Agent Studio in Automation Anywhere.

Refer to the following information to create your custom Fine tuned models for each foundational model:

Amazon Bedrock:
  1. Create your Fine tuned model in Amazon Bedrock Services. See Fine-tuned models for Amazon Bedrock.
  2. In the Amazon Bedrock console, click Custom Models.
  3. Find and select your Fine tuned model.
  4. Make a note of the Model ARN and the Region where the model is deployed.
  5. Use the Model ARN and Region values procured from Amazon Bedrock to connect to your Fine tuned model when creating a Model connection in Automation Anywhere.
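The Bedrock steps above yield a Model ARN and a Region. As a rough sketch of how that pair is typically used when calling the model directly with the AWS SDK, the helper below assembles the arguments for a `bedrock-runtime` `invoke_model` call. The ARN and the request-body schema here are hypothetical examples, not values from this document; the body format depends on the base model your custom model was tuned from.

```python
import json

def build_bedrock_request(model_arn: str, region: str, prompt: str) -> dict:
    """Assemble arguments for a bedrock-runtime invoke_model call.

    The Model ARN identifies the fine-tuned custom model; the Region must
    match where the model is deployed. The body below follows the Amazon
    Titan text schema as an example -- adjust it to your base model.
    """
    return {
        "region_name": region,            # used when creating the boto3 client
        "modelId": model_arn,             # fine-tuned models are addressed by ARN
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps({"inputText": prompt}),
    }

# The request would then be sent with boto3 (not imported here):
#   client = boto3.client("bedrock-runtime", region_name=request.pop("region_name"))
#   response = client.invoke_model(**request)
request = build_bedrock_request(
    "arn:aws:bedrock:us-east-1:123456789012:custom-model/example",  # hypothetical ARN
    "us-east-1",
    "Summarize this invoice.",
)
```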
Google Vertex AI:
  1. Create a custom model in Google Vertex AI. See How to create Fine-tuned models for Google Vertex AI.
  2. Next, navigate to Vertex AI > Model Garden.
  3. Click View My Endpoints & Models.
  4. Note the ModelID and the Region values for your Fine tuned model.
  5. Procure the ProjectId for this model from the landing page after you log in to Google Vertex AI; it is also displayed in the top left corner.
  6. Use the procured ProjectId, ModelID, and Region values to connect to your Fine tuned model when creating a Model connection in Automation Anywhere.
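To illustrate how the ProjectId, Region, and ModelID fit together, the sketch below builds the standard Vertex AI REST prediction path from those three values. This is an assumption about the URL pattern, not part of the product; the exact resource path (models vs. endpoints) depends on how your tuned model is deployed, so verify it against your own deployment.

```python
def vertex_predict_url(project_id: str, region: str, model_id: str) -> str:
    """Build a Vertex AI prediction URL from the values noted above.

    Pattern assumed:
    https://{region}-aiplatform.googleapis.com/v1/projects/{p}/locations/{region}/models/{m}:predict
    """
    return (
        f"https://{region}-aiplatform.googleapis.com/v1/"
        f"projects/{project_id}/locations/{region}/"
        f"models/{model_id}:predict"
    )

# Hypothetical values for illustration only.
url = vertex_predict_url("my-project", "us-central1", "1234567890")
```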
Azure OpenAI:
  1. To create a custom model in Azure OpenAI, see Customize a model with fine-tuning.
  2. Then create a deployment using this custom model. Make a note of the Model name and Deployment name.
  3. Use the Model name and Deployment name values to connect to your Fine tuned model when creating a Model connection in Automation Anywhere.
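For Azure OpenAI, the Deployment name is what addresses your fine-tuned model at the REST level. The sketch below shows how a chat-completions URL is typically assembled from an Azure OpenAI resource name and a Deployment name; the resource name and the `api-version` value are assumptions for illustration and must match your own service.

```python
def azure_openai_url(resource: str, deployment: str,
                     api_version: str = "2024-02-01") -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment.

    `resource` is your Azure OpenAI resource name and `deployment` is the
    Deployment name noted in the steps above. The api-version shown is an
    example value -- use the one supported by your service.
    """
    return (
        f"https://{resource}.openai.azure.com/openai/"
        f"deployments/{deployment}/chat/completions"
        f"?api-version={api_version}"
    )

# Hypothetical resource and deployment names.
url = azure_openai_url("my-resource", "my-ft-deployment")
```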
OpenAI:
  1. Create your Fine tuned model in OpenAI. See Create a Fine-tuned model with OpenAI.
  2. Procure the Model ID that was generated for the Fine tuned model.
  3. Use this Model ID value when creating a Model connection in Automation Anywhere.
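For OpenAI, the procured Model ID is simply passed as the `model` field of a standard Chat Completions request. The sketch below builds such a request body; the `ft:`-prefixed Model ID shown is a hypothetical example of the format OpenAI returns for fine-tuned models, not a value from this document.

```python
def openai_chat_payload(fine_tuned_model_id: str, user_prompt: str) -> dict:
    """Build a Chat Completions request body targeting a fine-tuned model.

    Fine-tuned Model IDs from OpenAI use an "ft:" prefix, e.g.
    "ft:gpt-4o-mini:my-org::abc123" (hypothetical). The payload is sent
    to the standard /v1/chat/completions endpoint.
    """
    return {
        "model": fine_tuned_model_id,   # the Model ID procured in step 2
        "messages": [{"role": "user", "content": user_prompt}],
    }

payload = openai_chat_payload("ft:gpt-4o-mini:my-org::abc123",
                              "Classify this ticket.")
```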

Prerequisites

The Automation Admin creates the Model connections and requires these roles and permissions to manage Model connections for their business organization.
  • Role: AAE_Basic, Automation Admin custom role
  • Permission: Attended Bot Runner
  • Settings: The Automation Admin must enable AI Data Management and select the Allow users to disable logs on AI Skills check box, which allows users with the Bot Creator license to disable data logging when using AI Skills.

See Roles and permissions for AI Tools for the Automation Admin custom role permissions.

Other requirements:
  • You must first create a Fine tuned model and save it with a specific name so that you can use it when creating a Model connection. Refer to the section above to see how to create Fine tuned models for each foundational model vendor.
  • If you want to store authentication details in a credential vault, have that information handy. See Secure credential store through Credential Vault.
  • To test a Model connection, you must be connected to Bot Agent version 22.60.10 or later. As part of the test, you run the bot on your desktop, so ensure the Bot Agent is configured for your user. If you have to switch your connection to a different Control Room for this task, see Switch device registration between Control Room instances.
  • You need access to the Recorder package and the AI Skills package to test the connection successfully. A test prompt is executed to verify the Model connection.

Follow these steps to create a Fine tuned Model connection.

Procedure

  1. In your Control Room environment, navigate to AI > Model connections > Create model connection.
  2. On the Create model connection screen, configure these Connection settings:
    1. Model connection name: Provide a name for easy identification of the Model connection.
    2. Description (optional): Add a meaningful short description defining the connection.
    3. Choose a vendor: Choose a foundational model vendor from the supported list of vendors such as Amazon Bedrock, Google Vertex AI, Azure OpenAI, or OpenAI.
    4. Choose a type: Choose Fine tuned from the drop-down list.
    5. Choose a model: Choose a model from the drop-down list. The list displays relevant models as per your vendor selection.
      Note: Do not select any of the Anthropic Claude models. Amazon does not allow fine-tuning the Anthropic Claude models, so these options should not appear in the drop-down list; they will be removed in a future release.
    6. Fine-tuned model name: Enter the specific name of the Fine tuned custom model you created earlier.
    7. Click Next to proceed to the Authentication details section.
  3. In the Authentication details section, configure the settings as per values procured from the foundational model services for Amazon Bedrock, Google Vertex AI, Azure OpenAI, or OpenAI.
    Note: For details on setting up the Authentication details for each model vendor, see Authenticate Model connections.
  4. Click Test connection to make sure all connection details have been defined correctly and check if the connection is working.
    This is a desktop operation using a Bot Agent. Use Bot Agent 22.60.10 and later for successful testing.
    • If the connection works as expected, the system processes the request and you get a system-generated success message.
    • If the connection does not work as expected, you get a system-generated message stating the reason for the connection failure. For example, if you have not downloaded the supported foundational model package to your workspace, you get an error message; download the package and then retest the Model connection.
    • If testing the Model connection is unsuccessful or if you leave the task incomplete, the Model connection is not saved and you have to restart the process of creating the Model connection.
  5. Click Next to proceed to the Invite roles section to begin assigning custom roles to users.
    The Automation Admin creates custom roles and assigns the Model connection to a role, which can then be assigned to users. Only users assigned to this custom role can use the Model connection.
  6. Assign access to the Pro Developer through a custom role (using RBAC) so that they can use this Model connection to create an AI Skill.
  7. Click Create model connection to complete creating the Model connection.
    After successfully creating the Model connection, the Pro Developer can use it to create an AI Skill.

    See: Create AI Skills.

Next steps

After creating and testing the Model connection, assign it to the Pro Developers, who use this connection to create AI Skills.

See Create AI Skills.

Note: When you create or test an AI Skill in the AI Skill editor, the success or failure details along with the model responses can be viewed in these navigation screens:
  • Administration > AI governance > AI Prompt log
  • Administration > AI governance > Event log
  • Administration > Audit log

See AI governance.

As the next step in your sequence of tasks, go to Create AI Skills to create an AI Skill and connect it to a Fine tuned Model connection for eventual use in an automation.