Create Fine tuned Model connections
- Updated: 2024/10/10
Create Fine tuned Model connections by customizing the supported foundational models through fine-tuning and testing, and make them available to Pro Developers to connect to the AI Skills they create.
Connect to your instances of the foundational models from hyperscaler vendors such as Amazon Bedrock, Google Vertex AI, Azure OpenAI, or OpenAI and customize their models by fine-tuning them and saving them with a specific name.
The Automation Admin creates and tests the Fine tuned Model connections and makes these available to the Pro Developers, who can connect to them when creating AI Skills. The Model connections are used in AI Skills to send prompts and receive responses from the models.
The Automation Admin also creates custom roles, assigns these Fine tuned Model connections to those roles, and assigns the roles to users to give them access to the Model connections.
This feature gives you the option to create your own custom models based on your specific use case. As these Fine tuned Model connections are customized and trained using your data, within your organization's environment, you can govern and monitor their use as per your compliance and governance policies.
Additionally, you can use the Fine tuned models created with Amazon Bedrock in AWS Services, and Google Vertex AI in Google Data Store when creating Fine tuned Model connections in Automation Anywhere.
A few things to consider
To use Fine tuned Model connections, you first create the fine-tuned models via the tooling provided by the hyperscaler vendors. These Fine tuned models are supported out-of-the-box by AI Agent Studio in Automation Anywhere.
Refer to the following information to create your custom, Fine tuned models for each foundational model:
- Create your Fine tuned model in Amazon Bedrock Services. See Fine-tuned models for Amazon Bedrock.
- Next, click Custom Models.
- Find your Fine tuned model and select it.
- Make a note of the Region where the model is deployed.
- Use the Model ARN and Region values procured from Amazon Bedrock to connect to your Fine tuned model when creating a Model connection in Automation Anywhere.
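Optionally, you can verify the Model ARN and Region outside Automation Anywhere before entering them. The following is a minimal sketch using the AWS boto3 SDK, which is not part of the product; the Region shown is a placeholder, and your AWS credentials are assumed to be configured in the environment.

```python
# Minimal sketch: list the custom (fine-tuned) models in a Region with boto3
# to confirm the Model ARN you will enter in the Model connection.
# "us-east-1" is a placeholder; use the Region noted in the Bedrock console.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.list_custom_models()
for model in response.get("modelSummaries", []):
    # modelArn is the value Automation Anywhere asks for as the Model ARN.
    print(model["modelName"], model["modelArn"])
```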
- Create a custom model in Google Vertex AI. See How to create Fine-tuned models for Google Vertex AI.
- Next, navigate to .
- Click View My Endpoints & Models.
- You should see the ModelID and the relevant Region values for your Fine tuned model.
- Procure the ProjectId for this model from the first page after you log in to Google Vertex AI; it is also displayed in the top left corner of the console.
- Use the procured ModelID and Region values to connect to your Fine tuned model when creating a Model connection in Automation Anywhere.
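Optionally, you can confirm the ProjectId, Region, and ModelID with the google-cloud-aiplatform SDK before creating the Model connection. This is a minimal sketch, not part of Automation Anywhere; the project and location values are placeholders, and Google Cloud credentials are assumed to be available in the environment.

```python
# Minimal sketch: confirm the ProjectId, Region (location), and ModelID
# values before entering them in the Model connection.
# "my-gcp-project" and "us-central1" are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")

# List the models visible in this project and Region.
for model in aiplatform.Model.list():
    # model.resource_name is the full path ending in the numeric ModelID.
    print(model.display_name, model.resource_name)
```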
- To create a custom model in Azure OpenAI, see Customize a model with fine-tuning.
- Then create a deployment using this custom model. Make a note of the Model name and Deployment name.
- Use the Model name and Deployment name values to connect to your Fine tuned model when creating a Model connection in Automation Anywhere.
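Optionally, you can send a quick test prompt to the deployment with the openai Python SDK to confirm that the Deployment name works before creating the Model connection. A minimal sketch follows; the endpoint, API version, environment variable, and deployment name are placeholders for your own values.

```python
# Minimal sketch: call the fine-tuned Azure OpenAI deployment directly to
# confirm the Deployment name before creating the Model connection.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-finetuned-deployment",  # the Deployment name, not the Model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```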
- Create your Fine tuned model in OpenAI. See Create a Fine-tuned model with OpenAI.
- Procure the Model ID that was generated for the Fine tuned model.
- Use this Model ID value when creating a Model connection in Automation Anywhere.
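Optionally, you can confirm the fine-tuned Model ID with the openai Python SDK before creating the Model connection. This is a minimal sketch; the API key is assumed to be set in the OPENAI_API_KEY environment variable, and the ft: model ID shown is a placeholder.

```python
# Minimal sketch: look up the fine-tuned Model ID and send a test prompt
# before entering the Model ID in the Model connection.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# List recent fine-tuning jobs; fine_tuned_model is the Model ID to use.
for job in client.fine_tuning.jobs.list(limit=5):
    print(job.id, job.status, job.fine_tuned_model)

# Optional: send a quick test prompt to the fine-tuned model.
response = client.chat.completions.create(
    model="ft:gpt-4o-mini-2024-07-18:my-org::abc123",  # placeholder Model ID
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```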
Prerequisites
- Role: AAE_Basic, Automation Admin custom role
- Permission: Attended Bot Runner
- Settings: AI Data Management must be enabled by the Automation Admin, and the Allow users to disable logs on AI Skills check box must be selected. This allows users with the Bot Creator license to disable data logging when using AI Skills.
See Roles and permissions for AI Tools for the Automation Admin custom role permissions.
- You would first create a Fine tuned model and save it with a specific name so that you can use it when creating a Model connection. Refer to the section above to see how to create Fine tuned models for each foundational model vendor.
- If you want to store authentication details in a credential vault, have that information handy. See Secure credential store through Credential Vault.
- To test a Model connection, you must be connected to Bot Agent version 22.60.10 or later. As part of the test, you run the bot on your desktop, so ensure the Bot Agent is configured for your user. If you have to switch the connection to a different Control Room for this task, see: Switch device registration between Control Room instances.
- You need access to the Recorder package and the AI Skills package to test the connection successfully. A test Prompt is executed to test the Model connection.
Follow these steps to create a Fine tuned Model connection.
Procedure
Next steps
See Create AI Skills.
See AI governance.
As the next step in your sequence of tasks, go to Create AI Skills to create an AI Skill and connect it to a Fine tuned Model connection for eventual use in an automation.