Adding Models
This section explains the steps to add AWS Bedrock models and configure the required access controls.
Navigate to AWS Bedrock Models in AI Gateway
Go to AI Gateway > Models and select AWS Bedrock.
Add AWS Bedrock Account Name and Collaborators
Enter a name for the account; models in this account are addressed as @providername/@modelname. Add collaborators to your account. You can decide which users/teams have access to the models in the account (User Role) and who can add/edit/remove models in this account (Manager Role). You can read more about access control here.
Add Region and Authentication
Get AWS Authentication Details
- Create an IAM user (or choose an existing IAM user) following these steps.
- Attach the IAM policy created above to this user.
- Create an access key for this user as per this doc.
- Use this access key and secret while adding the provider account to authenticate requests to the Bedrock model.
- Create an IAM role in your AWS account that has access to Bedrock. Attach the IAM policy with Bedrock permissions (shown above) to this role.
- Configure the trust policy for this role to allow the gateway role to assume it. Use the appropriate role ARN based on your deployment:
- Gateway role ARN:
arn:aws:iam::416964291864:role/tfy-ctl-production-ai-gateway-deps
- Your gateway role ARN will look like:
arn:aws:iam::<your-aws-account-id>:role/<account-prefix>-truefoundry-deps

Trust Policy
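A sketch of the trust policy, assuming the TrueFoundry-managed gateway role ARN shown above (for self-hosted deployments, substitute your own gateway role ARN):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::416964291864:role/tfy-ctl-production-ai-gateway-deps"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```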
- Read more about how assumed roles work here.
Add Models
Select the models you want to add. Use Select All to select all the models, or use + Add Model at the end of the list to add a model that isn't listed.
Inference
After adding the models, you can perform inference using an OpenAI-compatible API via the Playground or integrate with your own application.
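As a minimal sketch, the request below shows the shape of an OpenAI-compatible chat completion call. The gateway URL, API key, and model ID are illustrative placeholders; copy the exact values from the code snippet shown in the gateway UI.

```python
import json
import urllib.request

# Placeholder values -- substitute your gateway URL, API key, and model ID.
GATEWAY_URL = "https://your-gateway-host/api/llm/chat/completions"
API_KEY = "your-gateway-api-key"

# Models added through a Bedrock account are addressed as account/model-name.
payload = {
    "model": "my-bedrock-account/claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    GATEWAY_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment to send the request against a live gateway:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```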
FAQ:
How to override the default cost of models?
Edit the model and set a custom cost using the Private Cost Metric option.
Can I add models from different regions in a single bedrock integration?
Each Bedrock integration is configured with a single region, so models from different regions require separate integrations. Alternatively, use cross-region inference profiles (see the next question) to route requests across regions from a single integration.
How to integrate Bedrock cross-region inference model?
- Regular Model ID: anthropic.claude-3-5-sonnet-20240620-v1:0 (single region)
- Inference Profile ID: us.anthropic.claude-3-5-sonnet-20240620-v1:0 (cross-region routing)
- System-defined geographic profiles: Use geographic prefixes (us., eu., apac.) followed by the model ID (e.g., us.anthropic.claude-3-5-sonnet-20240620-v1:0). The prefix indicates routing within that geography.
- Custom inference profiles: Use the full ARN format (e.g., arn:aws:bedrock:us-east-1:123456789012:inference-profile/my-profile).
When adding the model, select the inference profile ID (e.g., us.anthropic.claude-3-5-sonnet-20240620-v1:0) instead of the regular model ID. If it's not in the dropdown, use + Add Model and enter it manually.
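For reference, a minimal IAM policy sketch for cross-region inference, following AWS's documented resource-ARN pattern (YOUR-AWS-ACCOUNT-ID is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/*",
        "arn:aws:bedrock:*:YOUR-AWS-ACCOUNT-ID:inference-profile/*"
      ]
    }
  ]
}
```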
Use * in the region position of the resource ARN to allow access across all regions, and replace YOUR-AWS-ACCOUNT-ID in the policy above with your actual AWS account ID.
Request fails with 'Access Denied' error
- Ensure your IAM policy grants Bedrock permissions across all regions (use * in the region part of the ARN)
- For geographic profiles, grant permissions in both source and destination regions
- Check if Service Control Policies (SCPs) are blocking access to certain regions
Requests always go to the same region
Make sure you're using the inference profile ID (e.g., us.anthropic.claude-3-5-sonnet-20240620-v1:0) instead of the regular model ID.
Further reading:
- Cross-Region inference overview - How cross-region inference works
- Geographic cross-Region inference - Geographic boundary routing and IAM policy requirements
- Using inference profiles - How to use inference profiles in API calls
- Supported models and regions - List of models that support cross-region inference