TrueFoundry offers a secure and efficient gateway to seamlessly integrate various Large Language Models (LLMs) into your applications, including models hosted on AWS Bedrock.

Adding Models

This section explains the steps to add AWS Bedrock models and configure the required access controls.

  • From the TrueFoundry dashboard, navigate to AI Gateway > Models and select AWS Bedrock.
  • Click Add AWS Bedrock Account

Navigate to AWS Bedrock Models

  • Give a unique name to your AWS Bedrock account and complete the form with your AWS authentication details (Access Key + Secret or Assume Role ID) and select the default AWS region. Learn more about authentication options here.
  • Add collaborators to your account to grant other users or teams access to it. Learn more about access control here.
  • Select the model from the list. For models that appear in the list of checkboxes, cost tracking based on their public pricing is supported.

(Optional) If the model you are looking for is not present in the options, you can add it by clicking + Add Model at the end of the list (scroll down to see the option) and filling out the form.

You can specify a different region for the model in this form to override the default region specified at the account level. The account-level region serves as the default for all models unless explicitly overridden at the model level.

AWS Bedrock Model Account Form

Supported Models

A list of models supported by AWS Bedrock, along with their corresponding model IDs, can be found here: View Full List

The TrueFoundry AI Gateway supports all text and image models in Bedrock. We are also working on adding support for additional modalities, including speech, in the near future.

Inference

After adding the models, you can perform inference through an OpenAI-compatible API via the TrueFoundry AI Gateway; for example, you can use the official OpenAI client library directly.

If you want to test out the added models on our playground, you can do so by clicking on the Try in Playground button next to the model you want to test.

Test in Playground

from openai import OpenAI

# Replace {controlPlaneUrl} with your TrueFoundry control plane URL
client = OpenAI(
    api_key="Enter your API Key here",
    base_url="https://{controlPlaneUrl}/api/inference/openai",
)
response = client.chat.completions.create(
    messages=[
        {"role": "system", "content": "You are an AI bot."},
        {"role": "user", "content": "Enter your prompt here"},
    ],
    model="bedrock-provider/llama-70b",
)
print(response.choices[0].message.content)
