Monitoring your LLMs with Langfuse
Observability refers to the ability to monitor and understand the internal state of an LLM application, including its inputs and outputs.
watsonx Orchestrate provides observability through a native integration with Langfuse, an open-source observability platform.
Enabling Langfuse on SaaS
You can configure your own hosted Langfuse instance for observability directly from the CLI:
Flags:
| Argument | Description |
|---|---|
| `--url` / `-u` | The URL of the Langfuse instance (required if not specified in `--config-file`). |
| `--api-key` | The Langfuse API key (required if not specified in `--config-file`). |
| `--health-uri` | The health URI of the Langfuse instance (required if not specified in `--config-file`). |
| `--config-file` | A configuration file for the Langfuse integration (can be fetched using `orchestrate settings`). |
| `--config-json` | A JSON configuration object for the Langfuse integration; this object should contain your Langfuse `public_key`. |
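A sketch of the configure command using the flags above. The `orchestrate settings observability langfuse configure` subcommand path, host, and endpoint paths are assumptions for illustration, not verbatim from this page:

```bash
# Illustrative sketch — replace the host and key placeholders with your own values.
orchestrate settings observability langfuse configure \
  --url "https://<your-langfuse-host>/api/public/otel" \
  --health-uri "https://<your-langfuse-host>/api/public/health" \
  --api-key "<your-langfuse-api-key>"
```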
Or using an external file in YAML format:
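The key names below are an assumed shape that mirrors the CLI flags in the table above; they are not a verbatim schema:

```yaml
# langfuse.yaml — assumed key names mirroring the CLI flags above
url: "https://<your-langfuse-host>/api/public/otel"
health_uri: "https://<your-langfuse-host>/api/public/health"
api_key: "<your-langfuse-api-key>"
config_json:
  public_key: "<your-langfuse-public-key>"
```

The file can then be passed to the same configure command via `--config-file` (subcommand path assumed as above):

```bash
orchestrate settings observability langfuse configure --config-file ./langfuse.yaml
```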
Enabling Langfuse on watsonx Orchestrate Developer Edition
To enable Langfuse in the watsonx Orchestrate Developer Edition, add the `--with-langfuse` (or `-l`) option to the `server start` command:
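For example, assuming the Developer Edition's environment file is passed with `--env-file` (the `.env` path here is illustrative):

```bash
orchestrate server start --env-file=.env --with-langfuse
```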