Update docs to use os.environ instead of export in Python examples

Replace shell export statements with Python os.environ in all Python code blocks.
This provides a more Pythonic way to set environment variables.

Changes:
- docs/content/docs/get-started/quickstart.mdx: Update Cloud Sandbox, CUA VLM Router, and BYOK examples
- docs/content/docs/agent-sdk/supported-model-providers/cua-vlm-router.mdx: Update migration examples
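As a minimal sketch of the pattern this commit applies (the key value is the docs' placeholder, not a real credential):

```python
import os

# Shell form used previously in the docs:
#   export CUA_API_KEY="sk_cua-api01_..."
# Python equivalent; affects only the current process and its children:
os.environ["CUA_API_KEY"] = "sk_cua-api01_..."

# The value is visible to later lookups in the same process
print(os.environ["CUA_API_KEY"])
```

Note that unlike `export` in a shell profile, an `os.environ` assignment does not persist beyond the process, and it must run before the SDK first reads the variable.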

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
f-trycua
2025-11-19 20:13:03 +01:00
parent da63030505
commit d2b24d3359
2 changed files with 17 additions and 22 deletions

docs/content/docs/agent-sdk/supported-model-providers/cua-vlm-router.mdx

@@ -344,8 +344,9 @@ Switching from direct provider access (BYOK) to CUA VLM Router is simple:
 **Before (Direct Provider Access with BYOK):**
 ```python
+import os
 # Required: Provider-specific API key
-export ANTHROPIC_API_KEY="sk-ant-..."
+os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."
 agent = ComputerAgent(
     model="anthropic/claude-sonnet-4-5-20250929",
@@ -355,8 +356,9 @@ agent = ComputerAgent(
 **After (CUA VLM Router - Cloud Service):**
 ```python
+import os
 # Required: CUA API key only (no provider keys needed)
-export CUA_API_KEY="sk_cua-api01_..."
+os.environ["CUA_API_KEY"] = "sk_cua-api01_..."
 agent = ComputerAgent(
     model="cua/anthropic/claude-sonnet-4.5", # Add "cua/" prefix

docs/content/docs/get-started/quickstart.mdx

@@ -141,15 +141,13 @@ Connect to your Cua computer and perform basic interactions, such as taking scre
 <Tabs items={['Cloud Sandbox', 'Linux on Docker', 'macOS Sandbox', 'Windows Sandbox', 'Your host desktop']}>
 <Tab value="Cloud Sandbox">
-Set your CUA API key (same key used for model inference):
-```bash
-export CUA_API_KEY="sk_cua-api01_..."
-```
-Then connect to your sandbox:
+Set your CUA API key (same key used for model inference) and connect to your sandbox:
 ```python
+import os
 from computer import Computer
+os.environ["CUA_API_KEY"] = "sk_cua-api01_..."
 computer = Computer(
     os_type="linux", # or "windows" or "macos"
     provider_type="cloud",
@@ -346,15 +344,13 @@ Choose how you want to access vision-language models for your agent:
 Use CUA's inference API to access multiple model providers with a single API key (same key used for sandbox access). CUA VLM Router provides intelligent routing and cost optimization.
-**Set your CUA API key:**
-```bash
-export CUA_API_KEY="sk_cua-api01_..."
-```
 **Use the agent with CUA models:**
 ```python
+import os
 from agent import ComputerAgent
+os.environ["CUA_API_KEY"] = "sk_cua-api01_..."
 agent = ComputerAgent(
     model="cua/anthropic/claude-sonnet-4.5", # CUA-routed model
     tools=[computer],
@@ -383,19 +379,16 @@ Choose how you want to access vision-language models for your agent:
 Use your own API keys from model providers like Anthropic, OpenAI, or others.
-**Set your provider API key:**
-```bash
-# For Anthropic
-export ANTHROPIC_API_KEY="sk-ant-..."
-# For OpenAI
-export OPENAI_API_KEY="sk-..."
-```
 **Use the agent with your provider:**
 ```python
+import os
 from agent import ComputerAgent
+# Set your provider API key
+os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..." # For Anthropic
+# OR
+os.environ["OPENAI_API_KEY"] = "sk-..." # For OpenAI
 agent = ComputerAgent(
     model="anthropic/claude-sonnet-4-5-20250929", # Direct provider model
     tools=[computer],