Merge pull request #497 from AdityaBavadekar/fix/ci-cd-prettier

Fix: Prettier conflicts between pnpm and pre-commit
James Murdza
2025-10-22 14:38:34 -07:00
committed by GitHub
6 changed files with 4 additions and 10 deletions

View File

@@ -30,7 +30,7 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v4
with:
- python-version: 3.11
+ python-version: 3.12
- name: Install Python dependencies
run: |

View File

@@ -6,6 +6,7 @@ repos:
name: Prettier (TS/JS/JSON/Markdown/YAML)
entry: prettier --write
language: node
+ additional_dependencies: ["prettier@3.6.2"]
files: \.(ts|tsx|js|jsx|json|md|yaml|yml)$
- repo: local

View File

@@ -216,14 +216,12 @@ Each response contains:
Let's break down the main components of our system and how they work together:
1. **The Virtual Machine (VM)**
- Think of this as a safe playground for our AI
- It's a complete macOS system running inside your computer
- Anything the AI does stays inside this VM, keeping your main system safe
- We use `lume` to create and manage this VM
2. **The Computer Interface (CUI)**
- This is how we control the VM
- It can move the mouse, type text, and take screenshots
- Works like a remote control for the VM (a rough usage sketch follows below)
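Putting the two pieces together, the sketch below shows what driving the VM through the computer interface could look like. The `Computer` import path, its constructor arguments, and the `interface` method names are assumptions drawn from the Cua docs, not something this diff guarantees:

```python
# Rough sketch only: the import path, constructor arguments, and the
# interface method names below are assumptions based on the Cua docs.
import asyncio

from computer import Computer  # assumed to come from the cua-computer package


async def main() -> None:
    # Start (or attach to) the lume-managed macOS VM.
    async with Computer(os_type="macos") as computer:  # assumed signature
        # Everything below happens inside the VM, never on the host.
        await computer.interface.screenshot()
        await computer.interface.left_click(100, 200)
        await computer.interface.type_text("hello from the CUI")


asyncio.run(main())
```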
@@ -303,7 +301,6 @@ This design keeps everything organized and safe. The AI can only interact with t
```
**Important Storage Notes:**
- Initial download requires 80GB of free space
- After the first run, disk usage drops to ~30GB thanks to macOS's sparse file system
- VMs are stored in `~/.lume` (a quick way to check both numbers is sketched below)
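For reference, here is a small stand-alone check of the numbers above (not part of the framework): it reports free disk space for the initial download and the apparent size of `~/.lume`:

```python
# Stand-alone storage check using only the Python standard library.
import shutil
from pathlib import Path

free_gb = shutil.disk_usage(Path.home()).free / 1024**3
print(f"Free space: {free_gb:.1f} GB (the initial download needs ~80 GB)")

lume_dir = Path.home() / ".lume"
if lume_dir.exists():
    # st_size reports apparent size; sparse files may occupy less on disk.
    used_gb = sum(
        f.stat().st_size for f in lume_dir.rglob("*") if f.is_file()
    ) / 1024**3
    print(f"~/.lume apparent size: {used_gb:.1f} GB")
else:
    print("~/.lume does not exist yet")
```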

View File

@@ -77,7 +77,6 @@ Before running any code examples, let's set up a proper environment:
```
**Option B: Using Anaconda Navigator UI**
- Open Anaconda Navigator
- Click on "Environments" in the left sidebar
- Click the "Create" button at the bottom
@@ -119,7 +118,6 @@ Before running any code examples, let's set up a proper environment:
```
**Option B: Use VS Code notebooks**
- Open VS Code
- Install the Python extension if you haven't already
- Create a new file with a `.ipynb` extension (e.g., `cua_agent_tutorial.ipynb`); a possible first cell is sketched below
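A possible first cell for that notebook is sketched here; it only smoke-tests the environment, and the module names it probes (`computer`, `agent`) are assumptions based on the Cua docs:

```python
# Smoke-test cell: confirm the kernel works and check whether the (assumed)
# Cua modules are importable before going further.
import importlib.util
import sys

print(f"Python: {sys.version.split()[0]}")

for module in ("computer", "agent"):  # assumed module names
    found = importlib.util.find_spec(module) is not None
    print(f"{module}: {'installed' if found else 'missing'}")
```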
@@ -345,7 +343,6 @@ One of the most powerful features of the framework is the ability to use local m
**How to run this example:**
1. First, you'll need to install Ollama for running local models:
- Visit [ollama.com](https://ollama.com) and download the installer for your OS
- Follow the installation instructions
- Pull the Gemma 3 model:
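Once Ollama is installed and a Gemma 3 model has been pulled, one quick way to confirm the local server responds is to call its REST API directly. A minimal sketch, assuming the default endpoint `http://localhost:11434` and a model tag of `gemma3` (adjust the tag to whatever `ollama list` reports):

```python
# Minimal request against Ollama's local REST API; standard library only.
import json
import urllib.request

payload = {
    "model": "gemma3",           # assumed tag; use whatever `ollama list` shows
    "prompt": "Say hello in one short sentence.",
    "stream": False,             # return one JSON object instead of a stream
}

request = urllib.request.Request(
    "http://localhost:11434/api/generate",   # Ollama's default local endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    print(json.loads(response.read())["response"])
```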

View File

@@ -154,8 +154,8 @@ export default async function Page(props: { params: Promise<{ slug?: string[] }>
{link.includes('python')
? 'Python'
: link.includes('typescript')
? 'TypeScript'
: `Source ${index + 1}`}
? 'TypeScript'
: `Source ${index + 1}`}
<ExternalLink className="w-4 h-4 ml-auto" />
</a>
))}

View File

@@ -19,7 +19,6 @@ This example demonstrates how to control a Cua Cloud Sandbox using the OpenAI `c
2. **Set up environment variables:**
Create a `.env` file with the following variables (the sketch after this list shows one way to load and verify them):
- `OPENAI_API_KEY` — your OpenAI API key
- `CUA_API_KEY` — your Cua Cloud API key
- `CUA_CONTAINER_NAME` — the name of your provisioned sandbox
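As a sanity check, the snippet below loads the `.env` file and reports which of these variables are set. It assumes the `python-dotenv` package is installed; the example itself may already handle this for you:

```python
# Quick check that the .env values are visible to Python; assumes the
# python-dotenv package is installed (pip install python-dotenv).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

for name in ("OPENAI_API_KEY", "CUA_API_KEY", "CUA_CONTAINER_NAME"):
    print(f"{name}: {'set' if os.getenv(name) else 'MISSING'}")
```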