Mirror of https://github.com/apidoorman/doorman.git (synced 2026-02-08 18:18:46 -06:00)
Minor updates
191 README.md
@@ -9,7 +9,7 @@
# Doorman API Gateway

A lightweight, Python-based API gateway for managing REST, SOAP, GraphQL, gRPC, and AI APIs. No low-level language expertise required.
Lightweight Python API gateway for REST, SOAP, GraphQL, gRPC, and AI APIs.

@@ -21,171 +21,88 @@ A lightweight, Python-based API gateway for managing REST, SOAP, GraphQL, gRPC,
- **Caching & Storage**: Redis caching, MongoDB integration, or in memory
- **Validation**: Request payload validation and logging

## One‑Command Demo
## Quick Demo

### Prerequisites
- Docker installed

### Run with Docker Compose
Run a local demo instance in seconds.

```bash
# First time (build the demo image to include frontend proxy config)
# Clone and launch instantly
docker compose -f docker-compose.yml -f docker-compose.demo.yml up --build

# Next runs (no rebuild needed)
docker compose -f docker-compose.yml -f docker-compose.demo.yml up
```

Defaults (demo‑only):
- Admin: `demo@doorman.dev` / `DemoPassword123!`
- Web UI: `http://localhost:3000`
- API: `http://localhost:3001`
- Mode: in‑memory (no Redis/Mongo); no seed data created
- **Web UI**: [http://localhost:3000](http://localhost:3000)
- **Admin**: `demo@doorman.dev` / `DemoPassword123!`
- **Mode**: Memory mode (no external DB)

## Quick Start
---

### Prerequisites
- Docker installed
- Environment file (`.env`) at repo root (start from `./.env.example`)
## Self-Hosting

### Run with Docker Compose
Deploy with Docker. Production mode requires Redis and MongoDB.

### 1. Environment Configuration
Copy the template and set your secrets.
```bash
# 1) Prepare env (first time)
cp .env.example .env
# Edit .env and set: DOORMAN_ADMIN_EMAIL, DOORMAN_ADMIN_PASSWORD, JWT_SECRET_KEY

# 2) Start (builds automatically)
docker compose up
# Set: DOORMAN_ADMIN_EMAIL, DOORMAN_ADMIN_PASSWORD, JWT_SECRET_KEY
```

When ready:
- Web UI: `http://localhost:3000`
- Gateway API: `http://localhost:3001`
- Data & logs persist in Docker volumes (`doorman-generated`, `doorman-logs`).

## Frontend Gateway Configuration

The web client needs to know the backend gateway URL. Set `NEXT_PUBLIC_GATEWAY_URL` in the root `.env` file:
### 2. Choose Storage
- Memory (default): development and tests.
- Redis + MongoDB: production. Note: SQLite is not supported.

### 3. Launch
```bash
# For Docker Compose (default - both services in same container)
NEXT_PUBLIC_GATEWAY_URL=http://localhost:3001

# For production reverse proxy (frontend and API on same domain)
# Leave unset - frontend will use same origin
```

**Behavior:**
- If `NEXT_PUBLIC_GATEWAY_URL` is set → uses that URL for API calls
- If not set → uses same origin (for reverse proxy deployments where frontend and API share the same domain)

### Run in Background

```bash
# Start detached
# Standard launch
docker compose up -d

# View logs
docker compose logs -f

# Stop services
docker compose down
```

### Data & Logs

- By default, Compose stores generated data and logs in Docker volumes, not in the repo folders:
  - Volume `doorman-generated` → `/app/backend-services/generated`
  - Volume `doorman-logs` → `/app/backend-services/logs`
- To inspect inside the container: `docker compose exec doorman sh`
- To reset data: `docker compose down -v` (removes volumes)

## Configuration

### Required Environment Variables
- `DOORMAN_ADMIN_EMAIL` — initial admin user email
- `DOORMAN_ADMIN_PASSWORD` — initial admin password (12+ characters required)
- `JWT_SECRET_KEY` — secret key for JWT tokens (32+ chars)

Optional (recommended in some setups):
- `NEXT_PUBLIC_GATEWAY_URL` — frontend → gateway base URL (see “Frontend Gateway Configuration”)

### High Availability Setup

For production/HA with Redis and MongoDB via Docker Compose:

```bash
# In .env (compose service names inside the network)
MEM_OR_EXTERNAL=REDIS
MONGO_DB_HOSTS=mongo:27017
MONGO_DB_USER=doorman_admin
MONGO_DB_PASSWORD=changeme # set a stronger password in real deployments
REDIS_HOST=redis

# Start with production profile (brings up Redis + MongoDB)
# Production launch (Redis + MongoDB)
docker compose --profile production up -d
```

Notes:
- Ensure `MONGO_DB_USER`/`MONGO_DB_PASSWORD` match the values in `docker-compose.yml` (defaults are provided for convenience; change in production).
- When running under Compose, use `mongo` and `redis` service names (not `localhost`).
---

### Alternative: Manual Docker Commands
## Configuration

If you prefer not to use Docker Compose:
### Core Environment Variables
| Variable | Required | Description |
| :--- | :--- | :--- |
| `DOORMAN_ADMIN_EMAIL` | Yes | Initial administrator email |
| `DOORMAN_ADMIN_PASSWORD` | Yes | Admin password (min 12 chars) |
| `JWT_SECRET_KEY` | Yes | Secret for signing access tokens |
| `NEXT_PUBLIC_GATEWAY_URL` | No | Frontend API target (Defaults to same origin) |

```bash
# Build the image
docker build -t doorman:latest .
### Persistence & Performance
- Redis: set `MEM_OR_EXTERNAL=REDIS` to enable caching/rate limiting.
- MongoDB: set `MONGO_DB_HOSTS=mongo:27017` (and credentials) to persist configurations and users.
- Volumes: Docker-managed volumes (`doorman-generated`, `doorman-logs`). Use `docker compose down -v` to reset.

# Run the container
docker run --rm --name doorman \
  -p 3001:3001 -p 3000:3000 \
  --env-file .env \
  doorman:latest
---

## Repository Structure

```text
doorman/
├── backend-services/   # Python Gateway Engine (FastAPI)
├── web-client/         # Next.js Dashboard
├── user-docs/          # Technical Guides & Runbooks
├── scripts/            # Build & Maintenance tools
└── ops/                # Infrastructure & Docker config
```

## Documentation

- User docs live in `user-docs/` with:
  - `01-getting-started.md` for setup and first API
  - `02-configuration.md` for environment variables
  - `03-security.md` for hardening
  - `04-api-workflows.md` for end-to-end examples
  - `05-operations.md` for production ops and runbooks
  - `06-tools.md` for diagnostics and the CORS checker

## Repository Structure

```
doorman/
├── backend-services/   # Python gateway core, routes, services, tests
├── web-client/         # Next.js frontend
├── docker/             # Container entrypoint and scripts
├── user-docs/          # Documentation and guides
├── scripts/            # Helper scripts (preflight, coverage, maintenance)
└── generated/          # Local development artifacts
```

## Security Notes

- Frontend only exposes `NEXT_PUBLIC_*` variables to the browser
- Never pass secrets to frontend build args
- Backend loads environment at runtime from `--env-file` or `/env/*.env`
- Platform/injected env variables take precedence over repo files

## License

Copyright Doorman Dev, LLC

Licensed under the Apache License 2.0 - see [LICENSE](https://www.apache.org/licenses/LICENSE-2.0)

## Disclaimer

Use at your own risk. By using this software, you agree to the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0) and any annotations in the source code.
Deep-dive into our guides for advanced setups:
- [Getting Started Guide](user-docs/01-getting-started.md)
- [Security & Hardening](user-docs/03-security.md)
- [API Workflows (gRPC/SOAP)](user-docs/04-api-workflows.md)
- [Production Operations](user-docs/05-operations.md)

---

**We welcome contributors and testers!**
## License

**Copyright © Doorman Dev, LLC**
Licensed under the **Apache License 2.0**.

Review the [Security Hardening Guide](user-docs/03-security.md) before production deployment.

@@ -1,45 +1,89 @@

# Doorman Backend Services
# Doorman Gateway Engine

The core gateway engine for Doorman. Handles protocol translation, authentication, rate limiting, and observability for REST, GraphQL, gRPC, and SOAP.
The core high-performance gateway engine for Doorman. Handles protocol translation, security enforcement, rate limiting, and observability for REST, GraphQL, gRPC, and SOAP APIs.

## Features
- **Multiprotocol**: First-class support for REST, GraphQL, gRPC, and SOAP.
- **Auth Engine**: Built-in JWT management, RBAC, and User/Group/Role isolation.
- **Zero-Dependency Mode**: Run entirely in-memory for local dev.
- **Production Mode**: Connect to Redis (caching) and MongoDB (persistence) for scale.
- **Security First**: Integrated XXE protection (defusedxml), path traversal guards, and secure gRPC generation.
## 🚀 Key Features

## Quick Start (Instant Dev Mode)
- **Multi-Protocol Gateway**: First-class support for REST, SOAP 1.1/1.2, GraphQL, and gRPC (with auto-generation).
- **Security & RBAC**: Integrated JWT management, Role-Based Access Control, and User/Group isolation.
- **Traffic Control**: Granular rate limiting (fixed window), throttling, and credit-based quotas.
- **Storage Flexibility**:
  - **Memory Mode**: Zero-dependency mode for local development and CI/CD.
  - **Production Mode**: Scalable architecture using Redis (caching/rate-limits) and MongoDB (persistence).
- **Robustness**: Built-in XXE protection, path traversal guards, and secure gRPC compilation.
- **Observability**: Structured logging, request tracing, and aggregated metrics.

1. **Install Dependencies**
```bash
pip install -r requirements.txt
```
---

2. **Run with Memory Storage**
```bash
# No Redis or MongoDB required
export DOORMAN_MEMORY_MODE=true
python doorman.py
```
## 🛠 Setup & Development

3. **Check Health**
```bash
curl http://localhost:8000/health
```

## Configuration
Configure via environment variables or a `config.yaml` file. Key variables:
- `DOORMAN_REDIS_URL`: Connection string for Redis (default: localhost:6379)
- `DOORMAN_MONGO_URL`: Connection string for MongoDB (default: localhost:27017)
- `JWT_SECRET`: Secret key for signing tokens.

## Testing
We use `pytest` for comprehensive integration and unit testing.
### 1. Instant Memory Mode (No Database)
Perfect for testing or local development.
```bash
# Run all tests (requires venv)
./venv/bin/pytest -q
# Set up environment
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

# Run instantly (uses in-memory storage)
export DOORMAN_MEMORY_MODE=true
python doorman.py
```

### 2. Production/HA Mode
Requires Redis and MongoDB.
```bash
# Configure persistence
export MEM_OR_EXTERNAL=REDIS
export REDIS_HOST=localhost
export MONGO_DB_HOSTS=localhost:27017

# Run server
python doorman.py
```

### 3. One-Command Docker Demo
Run the full gateway + dashboard stack:
```bash
docker compose -f docker-compose.yml -f docker-compose.demo.yml up --build
```

---
Built by Doorman Dev, LLC. Licensed under Apache 2.0.

## ⚙️ Configuration

| Variable | Default | Description |
| :--- | :--- | :--- |
| `MEM_OR_EXTERNAL` | `MEM` | `MEM` for in-memory, `REDIS` or `EXTERNAL` for production. |
| `REDIS_HOST` | `localhost` | Redis server hostname. |
| `MONGO_DB_HOSTS` | `localhost:27017` | MongoDB connection string. |
| `JWT_SECRET_KEY` | - | **Required**. Secret for signing tokens. |
| `DOORMAN_ADMIN_PASSWORD` | - | **Required**. Admin password (min 12 chars). |
| `LOGS_DIR` | `./logs` | Directory for structured logs. |
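
For orientation, a minimal sketch of reading these settings at process start, using the names and defaults from the table above (the gateway's actual loader is not shown in this commit):

```python
# Minimal sketch: reading the documented settings from the environment.
# Variable names and defaults follow the table above; the real loader may differ.
import os


def load_settings() -> dict:
    settings = {
        'mem_or_external': os.getenv('MEM_OR_EXTERNAL', 'MEM'),
        'redis_host': os.getenv('REDIS_HOST', 'localhost'),
        'mongo_db_hosts': os.getenv('MONGO_DB_HOSTS', 'localhost:27017'),
        'jwt_secret_key': os.getenv('JWT_SECRET_KEY'),
        'admin_password': os.getenv('DOORMAN_ADMIN_PASSWORD'),
        'logs_dir': os.getenv('LOGS_DIR', './logs'),
    }
    # The two required values have no defaults; fail fast if they are missing.
    missing = [k for k in ('jwt_secret_key', 'admin_password') if not settings[k]]
    if missing:
        raise RuntimeError(f'Missing required settings: {missing}')
    return settings


if __name__ == '__main__':
    print(load_settings())
```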
---

## 🧪 Testing

We maintain high stability with over 480 integration tests.
```bash
# Run all tests
./venv/bin/pytest -q tests/

# Run specific suite
./venv/bin/pytest tests/test_gateway_soap.py
```

---

## 📂 Repository Structure

- `routes/`: API endpoint definitions (RESTful).
- `services/`: Core logic for protocol handling (REST, SOAP, GraphQL, gRPC).
- `middleware/`: Security, analytics, and body size limiters.
- `models/`: Pydantic models for request/response validation.
- `utils/`: Shared utilities (auth, database, metrics, encryption).

---

Built by **Doorman Dev, LLC**. Licensed under **Apache License 2.0**.

@@ -2221,20 +2221,6 @@ doorman.include_router(grpc_router, tags=['gRPC Discovery'])
doorman.include_router(mfa_router, tags=['MFA'])


@doorman.on_event('startup')
async def startup_event():
    """Run startup checks"""
    try:
        from utils.redis_client import get_redis_client
        redis = get_redis_client()
        redis.ping()
        gateway_logger.info('Startup check: Redis connection successful')
    except Exception as e:
        gateway_logger.error(f'Startup check failed: Redis unavailable - {e}')
        # In strict mode we might exit, but for resilience we log error
        # sys.exit(1)

def start() -> None:
    if os.path.exists(PID_FILE):
        gateway_logger.info('doorman is already running!')

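Per the hunk header (20 lines become 6), this `@doorman.on_event('startup')` Redis check is being removed. For reference, the same check expressed with FastAPI's lifespan hook, the modern replacement for `on_event`; whether the commit rewires it this way is not shown here:

```python
# Sketch only: the same Redis reachability check written against FastAPI's
# lifespan hook instead of the deprecated @app.on_event('startup').
from contextlib import asynccontextmanager

from fastapi import FastAPI


@asynccontextmanager
async def lifespan(app: FastAPI):
    try:
        from utils.redis_client import get_redis_client  # same helper as in the diff
        get_redis_client().ping()
        print('Startup check: Redis connection successful')
    except Exception as exc:
        # Log and continue rather than exiting, mirroring the removed handler.
        print(f'Startup check failed: Redis unavailable - {exc}')
    yield  # application serves requests while the context is open


doorman = FastAPI(lifespan=lifespan)
```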
@@ -86,7 +86,16 @@ async def _restore_snapshot(snapshot_id: str = None):
    else:
        # Get latest
        cursor = coll.find().sort('timestamp', -1).limit(1)
        snapshot = await cursor.to_list(length=1)
        # Handle both async (Motor) and sync (InMemory/PyMongo) cursors gracefully
        if hasattr(cursor, 'to_list'):
            # Check if it's an awaitable (Motor)
            import inspect
            if inspect.iscoroutinefunction(cursor.to_list) or inspect.isawaitable(cursor.to_list(length=1)):
                snapshot = await cursor.to_list(length=1)
            else:
                snapshot = cursor.to_list(length=1)
        else:
            snapshot = list(cursor)[:1]
    snapshot = snapshot[0] if snapshot else None

    if not snapshot:

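The new branch above probes whether `cursor.to_list(...)` returns an awaitable (Motor) or a plain list (in-memory/PyMongo-style) before deciding to `await`. A self-contained sketch of that pattern with stand-in cursor classes, not the gateway's real database layer:

```python
# Self-contained sketch of the async/sync cursor handling used above.
import asyncio
import inspect


class AsyncCursor:          # Motor-style: to_list() returns an awaitable
    async def to_list(self, length):
        return [{'timestamp': 2}]


class SyncCursor:           # in-memory style: to_list() returns a plain list
    def to_list(self, length):
        return [{'timestamp': 1}]


async def latest(cursor):
    result = cursor.to_list(length=1)
    docs = await result if inspect.isawaitable(result) else result
    return docs[0] if docs else None


async def main():
    print(await latest(AsyncCursor()))   # {'timestamp': 2}
    print(await latest(SyncCursor()))    # {'timestamp': 1}


asyncio.run(main())
```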
@@ -266,7 +266,7 @@ async def generate_report(request: Request, start: str, end: str):
    import datetime as _dt

    def _to_date_time(ts: int):
        dt = _dt.datetime.utcfromtimestamp(ts)
        dt = _dt.datetime.fromtimestamp(ts, _dt.timezone.utc)
        return dt.strftime('%Y-%m-%d'), dt.strftime('%H:%M')

    start_date, start_time_str = _to_date_time(start_ts)

@@ -423,7 +423,7 @@ async def generate_report(request: Request, start: str, end: str):
    w.writerow(['Bandwidth (per day, UTC)'])
    w.writerow(['date', 'bytes_in', 'bytes_out', 'total'])
    for day_ts in sorted(daily_bw.keys()):
        date_str = _dt.datetime.utcfromtimestamp(day_ts).strftime('%Y-%m-%d')
        date_str = _dt.datetime.fromtimestamp(day_ts, _dt.timezone.utc).strftime('%Y-%m-%d')
        bi = int(daily_bw[day_ts]['in'])
        bo = int(daily_bw[day_ts]['out'])
        w.writerow([date_str, bi, bo, bi + bo])

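Both report hunks swap the deprecated `datetime.utcfromtimestamp` for the timezone-aware `datetime.fromtimestamp(..., timezone.utc)`. A quick standalone check (not repo code) showing the two agree on the formatted UTC time while only the replacement carries tzinfo:

```python
# The deprecated call returns a naive datetime; the replacement is aware.
# Both represent the same UTC instant, so the formatted strings match.
from datetime import datetime, timezone

ts = 1_700_000_000  # 2023-11-14 22:13:20 UTC

naive = datetime.utcfromtimestamp(ts)             # deprecated since Python 3.12
aware = datetime.fromtimestamp(ts, timezone.utc)  # timezone-aware replacement

assert naive.strftime('%Y-%m-%d %H:%M') == aware.strftime('%Y-%m-%d %H:%M')
print(naive.tzinfo)   # None
print(aware.tzinfo)   # UTC
```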
@@ -369,7 +369,7 @@ Response:
"""


@user_router.get('/me', description='Get user by username', response_model=UserModelResponse)
@user_router.get('/me', description='Get user by username', response_model=ResponseModel)
async def get_user_by_username(request: Request):
    request_id = str(uuid.uuid4())
    start_time = time.time() * 1000

@@ -415,7 +415,7 @@ Response:
"""


@user_router.get('/all', description='Get all users', response_model=list[UserModelResponse])
@user_router.get('/all', description='Get all users', response_model=ResponseModel)
async def get_all_users(
    request: Request, page: int = Defaults.PAGE, page_size: int = Defaults.PAGE_SIZE
):

@@ -482,7 +482,7 @@ Response:


@user_router.get(
    '/{username}', description='Get user by username', response_model=UserModelResponse
    '/{username}', description='Get user by username', response_model=ResponseModel
)
async def get_user_by_username(username: str, request: Request):
    request_id = str(uuid.uuid4())

@@ -612,7 +612,7 @@ async def get_user_by_email(email: str, request: Request):


@user_router.get(
    '', description='Get all users (base path)', response_model=list[UserModelResponse]
    '', description='Get all users (base path)', response_model=ResponseModel
)
async def get_all_users_base(
    request: Request, page: int = Defaults.PAGE, page_size: int = Defaults.PAGE_SIZE

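These handlers swap their `response_model` to the shared `ResponseModel`. As a generic FastAPI illustration (the models below are hypothetical, not Doorman's), `response_model` controls both the OpenAPI schema and the filtering of the returned payload:

```python
# Generic illustration of response_model filtering; UserOut/Envelope are
# hypothetical models, not Doorman's UserModelResponse/ResponseModel.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Envelope(BaseModel):
    status_code: int
    response: dict


@app.get('/users/demo', response_model=Envelope)
async def get_demo_user():
    # Extra keys are dropped and the /docs schema follows Envelope.
    return {
        'status_code': 200,
        'response': {'username': 'demo', 'email': 'demo@doorman.dev'},
        'debug': 'hidden',
    }
```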
@@ -9,7 +9,7 @@ import json
import logging
import os
import re
from datetime import datetime
from datetime import datetime, timezone
from typing import Any

from fastapi import HTTPException

@@ -317,9 +317,9 @@ class LoggingService:
        rec = json.loads(s)
        timestamp_str = rec.get('time') or rec.get('timestamp')
        try:
            timestamp = timestamp_str or datetime.utcnow().isoformat()
            timestamp = timestamp_str or datetime.now(timezone.utc).isoformat()
        except Exception:
            timestamp = datetime.utcnow().isoformat()
            timestamp = datetime.now(timezone.utc).isoformat()
        message = rec.get('message', '')
        name = rec.get('name', '')
        level = rec.get('level', '')

@@ -346,7 +346,7 @@ class LoggingService:
        try:
            timestamp = datetime.strptime(timestamp_str, '%Y-%m-%d %H:%M:%S')
        except ValueError:
            timestamp = datetime.utcnow()
            timestamp = datetime.now(timezone.utc)
        message_parts = full_message.split(' | ', 1)
        request_id = message_parts[0] if len(message_parts) > 1 else None
        message = message_parts[1] if len(message_parts) > 1 else full_message

@@ -33,7 +33,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.post(
            '/platform/authorization',
            data=small_payload,
            content=small_payload,
            headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'},
        )

@@ -67,7 +67,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.post(
            '/api/rest/test/v1/endpoint',
            data=large_payload,
            content=large_payload,
            headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'},
        )

@@ -86,7 +86,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.post(
            '/api/soap/test/v1/service',
            data=medium_payload,
            content=medium_payload,
            headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'text/xml'},
        )

@@ -105,7 +105,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.post(
            '/platform/authorization',
            data=large_payload,
            content=large_payload,
            headers={'Content-Type': 'application/json'},
        )

@@ -124,7 +124,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.post(
            '/platform/authorization',
            data=large_payload,
            content=large_payload,
            headers={
                'Transfer-Encoding': 'chunked',
                'Content-Length': '100',

@@ -154,7 +154,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.put(
            '/platform/user/testuser',
            data=large_payload,
            content=large_payload,
            headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'},
        )

@@ -172,7 +172,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.patch(
            '/platform/user/testuser',
            data=large_payload,
            content=large_payload,
            headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'},
        )

@@ -190,7 +190,7 @@ class TestChunkedEncodingBodyLimit:

        response = client.post(
            '/api/graphql/test',
            data=large_query.encode(),
            content=large_query.encode(),
            headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'},
        )

@@ -217,7 +217,7 @@ class TestChunkedEncodingBodyLimit:
        for route in routes:
            response = client.post(
                route,
                data=large_payload,
                content=large_payload,
                headers={'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json'},
            )

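The test hunks above switch raw request bodies from `data=` to `content=`. With the httpx-backed TestClient, `content=` is the parameter for raw bytes or text, while `data=` is intended for form fields and warns when handed a raw string. A minimal sketch against a throwaway app, not the Doorman suite:

```python
# Minimal sketch: posting a raw JSON body with content= versus form fields
# with data= using Starlette's httpx-based TestClient.
from fastapi import FastAPI, Request
from fastapi.testclient import TestClient

app = FastAPI()


@app.post('/echo')
async def echo(request: Request):
    body = await request.body()
    return {'length': len(body)}


client = TestClient(app)

# Raw bytes/str body: use content= (passing it via data= is deprecated in httpx).
raw = client.post('/echo', content=b'{"hello": "world"}',
                  headers={'Content-Type': 'application/json'})

# Form fields: data= takes a mapping and sends application/x-www-form-urlencoded.
form = client.post('/echo', data={'field': 'value'})

print(raw.json(), form.json())
```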
@@ -237,9 +237,9 @@ async def test_monitor_report_csv(monkeypatch, authed_client):
    monkeypatch.setattr(gs.httpx, 'AsyncClient', _FakeAsyncClient)
    await authed_client.get(f'/api/rest/{name}/{ver}/r')

    from datetime import datetime
    from datetime import datetime, timezone

    now = datetime.utcnow()
    now = datetime.now(timezone.utc)
    start = now.strftime('%Y-%m-%dT%H:%M')
    end = start
    csvr = await authed_client.get(f'/platform/monitor/report?start={start}&end={end}')

43 backend-services/tests/test_path_validation_deep.py (new file)
@@ -0,0 +1,43 @@
import os
import pytest
from pathlib import Path
from routes.proto_routes import validate_path, PROJECT_ROOT


def test_validate_path_success():
    # Valid path within project root
    target = PROJECT_ROOT / "test_file.proto"
    assert validate_path(PROJECT_ROOT, target) is True


def test_validate_path_traversal():
    # Dangerous path attempting to go up
    target = PROJECT_ROOT / "../../../etc/passwd"
    assert validate_path(PROJECT_ROOT, target) is False


def test_validate_path_outside_allowed():
    # Path outside project and temp
    import tempfile
    outside = Path("/usr/bin/local")
    assert validate_path(PROJECT_ROOT, outside) is False


def test_validate_path_temp_dir():
    import tempfile
    temp_dir = Path(tempfile.gettempdir())
    target = temp_dir / "safe_temp.proto"
    assert validate_path(temp_dir, target) is True


def test_validate_path_complex_traversal():
    # Attempt to use symlink-like trickery or redundant separators
    target = PROJECT_ROOT / "subdir" / ".." / ".." / "etc" / "passwd"
    assert validate_path(PROJECT_ROOT, target) is False


def test_validate_path_same_dir():
    assert validate_path(PROJECT_ROOT, PROJECT_ROOT) is True


def test_validate_path_prefix_attack():
    # PROJECT_ROOT = /foo/bar
    # target = /foo/bar_extra/secret.txt
    # Simple startswith would fail here, but commonpath should handle it.
    parent = PROJECT_ROOT.parent
    sibling = parent / (PROJECT_ROOT.name + "_extra")
    target = sibling / "secret.txt"
    assert validate_path(PROJECT_ROOT, target) is False

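`validate_path` itself is not part of this diff. Judging from the prefix-attack comment above, an implementation along these lines, built on `os.path.commonpath`, would satisfy the tests; this is an assumption about its shape, not the code in `routes/proto_routes.py`:

```python
# Assumed shape of validate_path: resolve both paths and require the target to
# sit under the allowed base, using commonpath so that sibling directories like
# /foo/bar_extra do not pass a naive startswith check.
import os
from pathlib import Path


def validate_path(base: Path, target: Path) -> bool:
    base = Path(base).resolve()
    target = Path(target).resolve()
    try:
        return os.path.commonpath([base, target]) == str(base)
    except ValueError:
        # Different drives (Windows) or mixed absolute/relative inputs.
        return False


if __name__ == '__main__':
    root = Path.cwd()
    print(validate_path(root, root / 'test_file.proto'))          # True
    print(validate_path(root, Path(str(root) + '_extra') / 'x'))  # False
```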
@@ -4,7 +4,7 @@ import os
import random
import string
import uuid
from datetime import datetime, timedelta
from datetime import datetime, timedelta, timezone

from utils import password_util
from utils.database import (

@@ -327,7 +327,7 @@ def seed_user_credits(usernames: list[str], credit_groups: list[str]) -> None:
        users_credits[g] = {
            'tier_name': _rand_choice(['basic', 'pro', 'enterprise']),
            'available_credits': random.randint(10, 10000),
            'reset_date': (datetime.utcnow() + timedelta(days=random.randint(1, 30))).strftime(
            'reset_date': (datetime.now(timezone.utc) + timedelta(days=random.randint(1, 30))).strftime(
                '%Y-%m-%d'
            ),
            'user_api_key': encrypt_value(uuid.uuid4().hex),

@@ -364,7 +364,7 @@ def seed_logs(n: int, usernames: list[str], apis: list[tuple[str, str]]) -> None
    log_path = os.path.join(logs_dir, 'doorman.log')
    methods = ['GET', 'POST', 'PUT', 'DELETE', 'PATCH']
    uris = ['/status', '/list', '/items', '/items/123', '/search?q=test', '/export', '/metrics']
    now = datetime.now()
    now = datetime.now(timezone.utc)
    with open(log_path, 'a', encoding='utf-8') as lf:
        for _ in range(n):
            api = _rand_choice(apis) if apis else ('demo', 'v1')

@@ -414,7 +414,7 @@ message StatusReply {{


def seed_metrics(usernames: list[str], apis: list[tuple[str, str]], minutes: int = 400) -> None:
    now = datetime.utcnow()
    now = datetime.now(timezone.utc)
    for i in range(minutes, 0, -1):
        minute_start = int(((now - timedelta(minutes=i)).timestamp()) // 60) * 60
        b = MinuteBucket(start_ts=minute_start)

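The `minute_start` expression floors each timestamp to the start of its minute so every sample lands in a fixed one-minute bucket. A quick standalone check of that arithmetic:

```python
# Integer-dividing a Unix timestamp by 60 and multiplying back floors it
# to the minute boundary, which is how seed_metrics picks bucket starts.
from datetime import datetime, timezone

ts = datetime(2024, 1, 1, 12, 34, 56, tzinfo=timezone.utc).timestamp()
minute_start = int(ts // 60) * 60

print(datetime.fromtimestamp(minute_start, timezone.utc))  # 2024-01-01 12:34:00+00:00
```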
@@ -13,7 +13,7 @@ import json
import logging
import threading
from collections import deque
from datetime import datetime
from datetime import datetime, timezone
from typing import Deque, List, Optional

@@ -53,7 +53,7 @@ class MemoryLogHandler(logging.Handler):
        try:
            payload = {
                # Align keys with JSONFormatter in doorman.py
                "time": datetime.utcfromtimestamp(record.created).strftime("%Y-%m-%dT%H:%M:%S"),
                "time": datetime.fromtimestamp(record.created, timezone.utc).strftime("%Y-%m-%dT%H:%M:%S"),
                "name": record.name,
                "level": record.levelname,
                "message": self.format(record) if self.formatter else record.getMessage(),

@@ -25,7 +25,11 @@ async def subscription_required(request: Request):
    username = payload.get('sub')
    if not username:
        raise HTTPException(status_code=401, detail='Invalid token')
    # All users (including admins) must have a subscription unless the API is public
    # Admin bypass: users with admin role skip subscription checks
    if await is_admin_user(username):
        return payload

    # All users (non-admins) must have a subscription unless the API is public
    full_path = request.url.path
    if full_path.startswith('/api/rest/'):
        prefix = '/api/rest/'

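The new ordering is: resolve the username from the JWT, let admins through immediately, then apply the public-API and subscription checks to everyone else. A condensed sketch of that control flow, with stub helpers standing in for `is_admin_user` and the real subscription lookup:

```python
# Condensed sketch of the subscription_required control flow after this change.
# The helpers here are stubs; the real lookups live elsewhere in the gateway.
from fastapi import HTTPException


async def is_admin_user(username: str) -> bool:                # stub
    return username == 'admin'


async def has_subscription(username: str, api: str) -> bool:   # stub
    return False


async def subscription_required(username: str | None, api: str, api_is_public: bool) -> None:
    if not username:
        raise HTTPException(status_code=401, detail='Invalid token')
    if await is_admin_user(username):    # admins skip subscription checks
        return
    if api_is_public:                    # public APIs need no subscription
        return
    if not await has_subscription(username, api):
        raise HTTPException(status_code=403, detail='Subscription required')
```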