mirror of
https://github.com/trycua/computer.git
synced 2026-01-06 05:20:02 -06:00
Merge pull request #404 from lmnr-ai/docs-observability
add docs on observability with Laminar
This commit is contained in:
@@ -1,4 +1,7 @@
 {
   "title": "Integrations",
-  "pages": ["hud"]
+  "pages": [
+    "hud",
+    "observability"
+  ]
 }
|
||||
62
docs/content/docs/agent-sdk/integrations/observability.mdx
Normal file
@@ -0,0 +1,62 @@
---
title: Observability
description: Trace CUA execution steps and sessions
---

## Observability

CUA has a native integration with [Laminar](https://laminar.sh/), an open-source platform for tracing, evals, and labeling of autonomous AI agents. Read more about Laminar in the [Laminar docs](https://docs.lmnr.ai/).

## Setup

Register on [Laminar Cloud](https://laminar.sh/) or spin up a [local instance](https://github.com/lmnr-ai/lmnr), then get an API key from your project settings and set it as the `LMNR_PROJECT_API_KEY` environment variable.

```bash
pip install 'lmnr[all]'
export LMNR_PROJECT_API_KEY=your-key
```

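Before launching an agent, it can help to fail fast when a required environment variable is missing. A minimal stdlib-only sketch (the variable names follow the Usage example below; adapt the list to your setup):

```python
import os


def missing_env(required):
    """Return the names of required environment variables that are unset."""
    return [name for name in required if not os.environ.get(name)]


# Variables used by the tracing setup and the cloud computer (see Usage below).
missing = missing_env(["LMNR_PROJECT_API_KEY", "CUA_API_KEY", "CUA_CONTAINER_NAME"])
if missing:
    print("Missing environment variables:", ", ".join(missing))
```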
## Usage

Then, initialize Laminar at the entry point of your application and register the Laminar LiteLLM callback; all CUA execution steps will be traced automatically.

```python
import asyncio
import os

import litellm

from agent import ComputerAgent
from computer import Computer
from lmnr import Laminar, LaminarLiteLLMCallback # [!code highlight]

Laminar.initialize() # [!code highlight]
litellm.callbacks.append(LaminarLiteLLMCallback()) # [!code highlight]

computer = Computer(
    os_type="linux",
    provider_type="cloud",
    name=os.getenv("CUA_CONTAINER_NAME"),
    api_key=os.getenv("CUA_API_KEY"),
)

agent = ComputerAgent(
    model="openai/computer-use-preview",
    tools=[computer],
)

async def main():
    async for step in agent.run("Create a new file called 'test.txt' in the current directory"):
        print(step["output"])

if __name__ == "__main__":
    asyncio.run(main())
```
## Viewing traces

You can view traces in the Laminar UI by going to the traces tab in your project. When you select a trace, you will see all of the agent's execution steps, including computer actions, LLM calls, and screenshots.

For each step, you will see the LLM call and the corresponding computer action; computer actions are highlighted in yellow in the timeline.

<img src="/docs/img/laminar_trace_example.png" alt="Example trace in Laminar showing the litellm.response span and its output." width="800px" />
@@ -17,4 +17,4 @@
 "---[CodeXml]API Reference---",
 "...libraries"
 ]
 }
 }
BIN
docs/public/img/laminar_trace_example.png
Normal file
Size: 628 KiB