Adding Models
This section explains the steps to add AWS Bedrock models and configure the required access controls.

1. Navigate to AWS Bedrock Models in AI Gateway
From the TrueFoundry dashboard, navigate to AI Gateway > Models and select AWS Bedrock.
Navigate to AWS Bedrock Models
2. Add AWS Bedrock Account Name and Collaborators
Give the Bedrock account a unique name; it will be used later to refer to the models in this account, in the form @providername/@modelname. Add collaborators to your account: you can decide which users/teams have access to the models in the account (User role) and who can add/edit/remove models in this account (Manager role). You can read more about access control here.
AWS Bedrock Model Account Form
3. Add Region and Authentication
Select the default AWS region for the models in this account. The account-level region serves as the default for all models unless explicitly overridden at the model level. Provide the authentication details for how the gateway can access the Bedrock models. TrueFoundry supports both AWS Access Key/Secret Key and Assume Role-based authentication. You can read below on how to generate the access/secret keys or roles.
Get AWS Authentication Details
Required IAM Policy

First, create the IAM policy that grants permission to invoke Bedrock models. This policy can be attached to either an IAM user (for access key authentication) or an IAM role (for assumed role authentication). The following policy grants permission to invoke all models in your available regions (to check the list of available regions for different models, refer to AWS Bedrock).
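The policy described above might look like the following sketch. The `Sid` is arbitrary, and you may want to scope `Resource` down to specific model ARNs or regions instead of `"*"`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "InvokeBedrockModels",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```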
Using AWS Access Key and Secret
- Create an IAM user (or choose an existing IAM user) following these steps.
- Attach the IAM policy created above to this user.
- Create an access key for this user as per this doc.
- Use this access key and secret while adding the provider account to authenticate requests to the Bedrock model.
Using Assumed Role

Alternatively, you can directly specify a role that can be assumed by the service account attached to the pods running AI Gateway. To set this up:
- Create an IAM role and attach the IAM policy created above to this role.
- Configure the trust policy for the role as shown in the image below:

Trust Policy
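The trust policy shown in the image above might look like this sketch, where `<CONTROL_PLANE_IAM_ROLE_ARN>` is a placeholder for the Control Plane IAM Role ARN:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "<CONTROL_PLANE_IAM_ROLE_ARN>"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```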
Replace <CONTROL_PLANE_IAM_ROLE_ARN> with the ARN of the Control Plane IAM Role. You can find the Control Plane IAM Role in the AWS IAM console; it is usually of the format <unique-name>-truefoundry-deps.
- Read more about how assumed roles work here.
4. Add Models
Select the models from the list that you want to add. You can use Select All to select all the models. If the model you are looking for is not present in the options, you can add it using + Add Model at the end of the list. TrueFoundry AI Gateway supports all text and image models in Bedrock. The complete list of models supported by Bedrock can be found here.
Inference
After adding the models, you can perform inference using an OpenAI-compatible API via the Playground or integrate with your own application.
Infer Model in Playground or Get Code Snippet to integrate in your application
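As a sketch of what the code-snippet integration might look like, the request below calls the gateway's OpenAI-compatible chat completions endpoint using only the Python standard library. The base URL, API key, and model name are placeholders you would replace with values from your deployment:

```python
import json
import urllib.request


def qualified_model(account: str, model: str) -> str:
    """Build the gateway model reference from the account and model names."""
    return f"{account}/{model}"


def chat_completion(base_url: str, api_key: str, model: str, prompt: str) -> str:
    """Send a chat request to the gateway's OpenAI-compatible endpoint."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]


# Example usage (placeholder URL, key, and model name):
# print(chat_completion(
#     "https://<your-gateway-host>/api/llm",
#     "<your-truefoundry-api-key>",
#     qualified_model("my-bedrock-account", "anthropic.claude-3-sonnet"),
#     "Hello!",
# ))
```

The same endpoint works with any OpenAI-compatible client library by pointing its base URL at the gateway.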
FAQ:
How to override the default cost of models?
In case you have custom pricing for your models, you can override the default cost by clicking the Edit Model button and then choosing the Private Cost Metric option.
Edit Model

Set custom cost metric
Can I add models from different regions in a single Bedrock integration?
Yes, you can add models from different regions. You can provide a top level default region for the account and also override it at the model level.

How to integrate Bedrock cross-region inference model?
AWS Bedrock Cross-Region Inference is a feature that lets you use models across regions; it can automatically route a request to the best region based on factors such as current load and capacity in each region, network latency and performance, and regional availability and health. You can read more about it here, and see which models support cross-region inference here.
To use cross-region inference models in TrueFoundry, use the Inference Profile ID instead of the model ID.
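System-defined inference profile IDs typically prepend a geography code to the underlying model ID (for example `us.`, `eu.`, or `apac.`; treat this exact prefix list as an assumption and check the Bedrock docs for your regions). A small sketch of distinguishing a profile ID from a plain model ID:

```python
# Geography prefixes used by system-defined inference profiles
# (assumed list; verify against the Bedrock documentation).
_GEO_PREFIXES = ("us.", "eu.", "apac.")


def is_cross_region_profile(model_or_profile_id: str) -> bool:
    """Heuristic check: inference profile IDs prepend a geography
    code to the underlying model ID."""
    return model_or_profile_id.startswith(_GEO_PREFIXES)


# When adding the model in TrueFoundry, put the inference profile ID
# where the model ID would normally go (example ID shown):
profile_id = "us.anthropic.claude-3-5-sonnet-20240620-v1:0"
print(is_cross_region_profile(profile_id))  # True
```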
When using system-defined inference profiles for cross-region routing, make sure you grant model access in all destination regions to the role or access key/secret key provided in TrueFoundry during the integration. Otherwise, inference requests will fail whenever Bedrock routes a request to a region other than the source region where access is missing.