When you create an agent, the choice of LLM determines how effectively the agent can complete your workflow. The SaaS and Developer Edition variants of the watsonx Orchestrate platform include all LLM chat models available within watsonx.ai, along with Groq models. These models are provided as part of your entitlement [1].

Available Models

| Model | Description |
| --- | --- |
|  | A Groq model optimized for high-speed inference and tool calling, designed for multilingual and system-prompt flexibility. |
| watsonx/meta-llama/llama-3-2-90b-vision-instruct | Llama-3-2-90b-vision-instruct is an auto-regressive language model that uses an optimized transformer architecture. |
| watsonx/meta-llama/llama-3-405b-instruct | Llama-3-405b-instruct is Meta's largest open-sourced foundation model to date, with 405 billion parameters, optimized for dialogue use cases. |
  • GPT-OSS models require special considerations. For more information, see Special considerations.
  • GPT-OSS-120b is a non-IBM product governed by a third-party license that may impose use restrictions and other obligations. By using this model you agree to the terms. Read the terms.

Preferred Models

While all models are available, only a subset is marked as preferred. Preferred models appear with a ★ next to their name when you run the orchestrate models list command. In the UI, only preferred models are shown as supported on the Manage Agents page. Preferred models have undergone evaluation and are optimized for use with the watsonx Orchestrate platform. On-premises models hosted through watsonx.ai Inference Frameworks Manager (IFM) within watsonx Orchestrate also appear in this list. Groq models, such as groq/gpt-oss-120b, can be enabled locally by setting the GROQ_API_KEY in the .env file.
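For example, to enable the Groq models locally, add the key to the Developer Edition .env file (the key value below is a placeholder):

```
# Developer Edition .env - placeholder value, substitute your own Groq API key
GROQ_API_KEY=<your-groq-api-key>
```

After restarting the local server, run orchestrate models list to confirm the Groq models appear.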

AI Gateway

Virtual Models

The watsonx Orchestrate platform includes the AI Gateway, which lets you expose models from your preferred provider to watsonx Orchestrate. These models can be added to watsonx Orchestrate via the ADK and are automatically made available to all users of the watsonx Orchestrate platform. [2] While it may be possible to add any model in this way, only models that support tool calling will work with watsonx Orchestrate. Unlike the preferred models from watsonx.ai, models added via the AI Gateway are not validated for compatibility with watsonx Orchestrate. Note that using a virtual model may incur additional cost from the upstream provider.
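As a sketch, adding an external model through the ADK looks roughly like the following. The model name and commands are illustrative; consult the ADK help output for the exact flags supported by your version:

```
# Sketch only - run `orchestrate models --help` for the exact flags in your ADK version
orchestrate models add --name virtual-model/openai/gpt-4o
orchestrate models list
```

Once added, the virtual model can be selected for agents the same way as the built-in watsonx.ai models.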

Virtual Policies

In addition to adding models, you can configure complex routing rules for your LLMs. Use virtual-policies to establish pseudo-LLM names that can load-balance traffic between models or define fallback behavior for when a provider experiences an outage.
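A virtual policy is defined declaratively. The sketch below is illustrative only (the field names are assumptions, not the authoritative schema); it shows a pseudo-LLM name that load-balances across two models and retries on provider errors:

```
# Illustrative only - the actual model-policy schema may differ
spec_version: v1
kind: model_policy
name: my_balanced_llm            # pseudo-LLM name that agents reference
policy:
  strategy:
    mode: loadbalance            # distribute traffic across the listed models
  retry:
    attempts: 2
    on_status_codes: [503]       # retry elsewhere when a provider is unavailable
  models:
    - model: watsonx/meta-llama/llama-3-405b-instruct
    - model: groq/gpt-oss-120b
```

Agents then reference the policy name as if it were a single model, and the gateway handles routing.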
[1] In the Developer Edition of watsonx Orchestrate, when a WO_INSTANCE and WO_API_KEY are provided in the user's .env file and that instance is a SaaS instance, all LLM requests are proxied through watsonx Orchestrate without the user needing an additional watsonx.ai entitlement. If the instance is CPD and a WO_INSTANCE and WO_API_KEY are provided, only models deployed via IFM are available via the watsonx/ prefix. If neither of the above applies, either provide a WATSONX_SPACE_ID and WATSONX_APIKEY for a watsonx.ai account hosted in us-south, or add a model to the AI Gateway running within your local Developer Edition server.

[2] virtual-models and virtual-policies added to a SaaS or CPD instance of watsonx Orchestrate are not automatically available within the Developer Edition. They must be added manually by the user.
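For instance, a Developer Edition .env file pointing at a SaaS instance might contain the following entries (all values are placeholders):

```
# Placeholders - substitute your own instance URL and API key
WO_INSTANCE=<your-watsonx-orchestrate-instance-url>
WO_API_KEY=<your-api-key>
```

With these set, LLM requests from the local server are proxied through the SaaS instance as described in footnote [1].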

Next steps