Compare commits

26 Commits

Author SHA1 Message Date
Eli Bosley
71a5214a03 Merge 2cb6eaeb12 into 4e945f5f56 2025-09-02 12:50:33 -04:00
Eli Bosley
2cb6eaeb12 refactor: enhance clipboard functionality and UI responsiveness in HeaderOsVersion component
- Integrated a clipboard support check to ensure OS and API version copying only occurs when supported.
- Updated button states to reflect clipboard capability, improving user experience.
- Adjusted icon styles for better visual consistency and responsiveness in the component layout.
2025-09-02 12:50:19 -04:00
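A minimal sketch of the clipboard gate this commit describes, assuming hypothetical helper names (`isClipboardSupported`, `copyVersionText`) rather than the actual HeaderOsVersion code:

```ts
// Hypothetical helper: detect whether the async Clipboard API is usable here.
// navigator.clipboard is only exposed in secure contexts (HTTPS/localhost).
export function isClipboardSupported(): boolean {
  return (
    typeof navigator !== 'undefined' &&
    !!navigator.clipboard &&
    typeof navigator.clipboard.writeText === 'function'
  );
}

// Copy the OS/API version only when the capability check passes; the copy
// button can stay disabled whenever isClipboardSupported() returns false.
export async function copyVersionText(version: string): Promise<boolean> {
  if (!isClipboardSupported()) return false;
  try {
    await navigator.clipboard.writeText(version);
    return true;
  } catch {
    return false;
  }
}
```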
Eli Bosley
b1590ee609 refactor: improve button click handling and class management in components
- Updated button click handlers in Notifications and RClone components to pass parameters directly, enhancing clarity and functionality.
- Introduced a computed property for button classes in UserProfile component, streamlining class management based on item properties.
- Refactored navigation logic in UpdateOs component for better readability and maintainability.
2025-09-02 10:24:33 -04:00
Eli Bosley
51ebe77d09 refactor: update component structure and improve tooltip content
- Removed unnecessary margin from the ConnectSettings component for cleaner layout.
- Enhanced the Sidebar component's tooltip to include a more descriptive label for editing notification settings, improving user clarity.
2025-09-01 20:10:48 -04:00
Eli Bosley
e719780ee8 refactor: enhance component styles and introduce responsive modal
- Updated CSS variables and utility classes for improved theme integration and style consistency across components.
- Introduced a new responsive modal component to enhance user experience on various screen sizes.
- Refined button and badge styles to ensure better visual hierarchy and interaction feedback.
- Adjusted component imports and structure for better modularity and maintainability.
- Removed deprecated styles and streamlined CSS for improved performance and clarity.
2025-09-01 20:06:48 -04:00
Eli Bosley
67a6a2e7c8 refactor: enhance CSS variable management and layer structure for improved theme integration
- Introduced overrides for Tailwind v4 global styles to utilize webgui variables, ensuring better compatibility and theming.
- Scoped border colors and other styles to specific components, preventing unintended style leakage.
- Updated layer definitions in main.css to prioritize webgui styles effectively, enhancing overall style management.
- Added new Tailwind v4 color variables for utility classes in the theme store, improving customization options.
2025-08-31 15:23:17 -04:00
Eli Bosley
3faa637d97 refactor: streamline CSS patching and layer management for improved style isolation
- Simplified the CSS patching function to wrap styles in a single `@layer`, enhancing control over cascade order and ensuring Tailwind styles can override as needed.
- Removed the previous exclusion selectors logic, focusing on a more efficient layer-based approach to prevent style conflicts with webgui elements.
- Updated the Nuxt configuration to eliminate the postcssPrefixPlugin, reflecting the shift towards layer management for CSS class handling.
- Enhanced the main.css file to define layer order explicitly, ensuring that webgui styles are overridden by Tailwind utilities effectively.
2025-08-31 13:06:00 -04:00
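A rough sketch of the single-`@layer` wrapping approach from this commit; the layer name `webgui`, the helper names, and the file path are assumptions, not the real patch script:

```ts
import { readFileSync, writeFileSync } from 'node:fs';

// Wrap an entire stylesheet in one cascade layer so that rules declared in a
// later layer (e.g. Tailwind utilities) win without needing !important.
function wrapInLayer(css: string, layerName = 'webgui'): string {
  return `@layer ${layerName} {\n${css}\n}`;
}

// Patch a webgui CSS file in place, skipping files that were already wrapped.
function patchCssFile(path: string): void {
  const original = readFileSync(path, 'utf8');
  if (original.trimStart().startsWith('@layer')) return; // already patched
  writeFileSync(path, wrapInLayer(original));
}

patchCssFile('/path/to/webgui/styles/example.css'); // illustrative path only
```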
Eli Bosley
83107a743c feat: implement CSS class prefixing and exclusion logic for improved style management
- Introduced a new Vite plugin to prefix CSS classes, ensuring that styles from the webgui do not interfere with our components.
- Enhanced the CSS patching script to apply exclusion selectors, preventing style leakage from `.unapi` containers.
- Updated the Nuxt configuration to integrate the new postcssPrefixPlugin, allowing for better control over CSS class names.
- Modified the Vue app mounting logic to add the `.unapi` class for improved style isolation and backward compatibility with `.unraid-reset`.
2025-08-31 12:37:53 -04:00
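A simplified sketch of the isolation classes applied during mounting, per this commit; `mountWithIsolation` is a hypothetical name, not the actual vue-mount-app.ts export:

```ts
import { createApp, type Component } from 'vue';

// Hypothetical mount wrapper: tag the host element so prefixed styles and
// exclusion selectors can target it. Keeps `.unraid-reset` for backward
// compatibility alongside the newer `.unapi` container class.
export function mountWithIsolation(component: Component, selector: string) {
  const host = document.querySelector<HTMLElement>(selector);
  if (!host) return null; // nothing to mount on this page
  host.classList.add('unapi', 'unraid-reset');
  const app = createApp(component);
  app.mount(host);
  return app;
}
```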
Eli Bosley
aa9648105f refactor: update CSS patching for improved compatibility and specificity
- Modified the CSS patching script to enhance compatibility by updating echo statements and adjusting the patching logic.
- Removed unnecessary layer wrapping in the CSS content, simplifying the structure while maintaining style specificity.
- Updated comments for clarity on the purpose of the compatibility patch and its impact on CSS management.
2025-08-31 10:00:07 -04:00
Eli Bosley
fdacc21d0e refactor: improve CSS layer management for better style precedence
- Updated the CSS layer definitions in both the plugin and main.css to ensure a clear hierarchy, preventing style conflicts and enhancing specificity.
- Revised comments to clarify the intended layer order and its impact on style application, ensuring better maintainability and understanding of the CSS structure.
2025-08-31 09:43:54 -04:00
Eli Bosley
7ec4874680 refactor: enhance CSS backup and restoration logic in deployment script
- Added functionality to restore existing CSS backups before creating new ones, ensuring a clean state during deployment.
- Updated comments for clarity on the backup process and the creation of the backup directory.
- Improved the handling of CSS imports in `main.css` to prevent global resets and enhance style specificity.
2025-08-31 09:29:27 -04:00
Eli Bosley
771dcef4f7 fix: update PHP path in plugin and enhance deployment script for web components
- Changed the PHP executable path in the plugin from `/bin/php` to `/usr/bin/php` for better compatibility.
- Improved the `deploy-dev.sh` script by ensuring proper quoting in the rsync command and adding a check to create the remote directory for standalone apps, enhancing deployment reliability.
2025-08-31 09:10:41 -04:00
Eli Bosley
2594df7e9c refactor: enhance CSS layering and specificity for improved style management
- Updated the CSS patching script to wrap styles in a new `@layer` structure, ensuring better isolation and priority management for webgui and unraid-api styles.
- Refined the `.unraid-reset` class in `main.css` to utilize CSS layers, enhancing specificity and preventing style conflicts with webgui elements.
- Improved the handling of CSS content during patching to ensure all styles are correctly wrapped and prioritized.
2025-08-31 09:03:40 -04:00
Eli Bosley
9dcd05748e refactor: enhance CSS patching and restoration logic in deployment script
- Added a new installation script to patch webgui CSS files, ensuring that styles for specific elements are wrapped with `:not(.unraid-reset)` to prevent style leakage.
- Implemented a backup and restoration mechanism for original CSS files, allowing for easy recovery after patching.
- Improved the handling of CSS directories and added warnings for missing directories to enhance robustness during deployment.
2025-08-31 09:00:40 -04:00
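A simplified sketch of the selector rewrite this commit describes; the guarded element list and the regex are illustrative assumptions:

```ts
// Append :not(.unraid-reset) to bare element selectors (button, input, ...)
// so webgui element rules stop applying inside isolated component roots.
const GUARDED_ELEMENTS = ['button', 'input', 'select', 'textarea'];

export function guardSelectors(css: string): string {
  const pattern = new RegExp(
    `(^|[,{}\\s])(${GUARDED_ELEMENTS.join('|')})(?=[\\s,{.:#\\[])`,
    'g'
  );
  return css.replace(
    pattern,
    (_match, prefix: string, tag: string) => `${prefix}${tag}:not(.unraid-reset)`
  );
}

// guardSelectors('button { color: red }') -> 'button:not(.unraid-reset) { color: red }'
```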
Eli Bosley
3cd5c0e8fd refactor: improve CSS reset strategy and deployment script logic
- Refined the `.unraid-reset` class in `main.css` to create a CSS layer for resets, enhancing style isolation and preventing leakage from webgui styles.
- Updated the deployment script `deploy-dev.sh` to improve checks for the existence of web components and standalone apps, ensuring accurate deployment and error handling.
2025-08-30 22:40:30 -04:00
Eli Bosley
c60f7b7204 refactor: enhance CSS isolation and z-index management for modals
- Updated the `.unraid-reset` class to apply isolation to non-modal components, preventing style leakage.
- Added z-index rules to ensure modals and their backdrops appear above all other content, improving UI layering.
- Refined button styles within the `.unraid-reset` class to reset inherited properties for better consistency.
2025-08-30 22:28:35 -04:00
Eli Bosley
ce67257526 refactor: enhance CSS reset and improve Vue app mounting logic
- Updated the `.unraid-reset` class in `main.css` to include additional properties for better styling consistency across Unraid components.
- Refined the `autoMountComponent` function in `vue-mount-app.ts` to check for element existence before mounting, improving robustness and preventing errors during the mounting process.
2025-08-30 22:19:17 -04:00
Eli Bosley
4a39cd9862 refactor: enhance manifest validation and Vue app mounting logic
- Improved validation in `WebComponentsExtractor` to log errors for missing standalone-app entries and file keys, ensuring better error handling during manifest processing.
- Updated CSS content retrieval in `vite.standalone.config.ts` to include a fallback mechanism for missing Nuxt CSS files, enhancing robustness.
- Simplified modal component mounting in `standalone-mount.ts` by utilizing a dedicated function for better readability and maintainability.
- Refined `mountVueApp` logic in `vue-mount-app.ts` to differentiate between the main app and clones, optimizing the mounting process for multiple targets.
2025-08-30 22:06:25 -04:00
Eli Bosley
f7ad582436 refactor: streamline standalone app deployment and manifest generation
- Removed redundant modal div from `test-standalone.html`, simplifying the structure for Vue component mounting.
- Added a check in `add-timestamp-standalone-manifest.js` to ensure the existence of the standalone apps directory before manifest generation, improving error handling.
- Updated `deploy-dev.sh` to enhance the rsync command for standalone apps, ensuring proper synchronization and cleanup of old files during deployment.
2025-08-30 22:02:02 -04:00
Eli Bosley
7c59c03786 refactor: improve standalone app manifest handling and Vue app mounting
- Updated `WebComponentsExtractor` to iterate over all manifest files, ensuring valid standalone-app entries are processed and preventing duplicate script loading.
- Enhanced `mountVueApp` to manage multiple clones and their respective shadow-root containers, improving cleanup and organization of mounted Vue apps.
- Modified deployment script to capture exit codes from standalone app synchronization, ensuring accurate error reporting during deployment.
2025-08-30 21:58:24 -04:00
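A sketch of the manifest iteration and duplicate-script guard described above; the manifest shape and the plugin web path are assumptions:

```ts
// Sketch: walk every manifest file, validate standalone-app entries, and load
// each script at most once.
interface ManifestEntry {
  file?: string;
}

const loadedScripts = new Set<string>();

export function loadStandaloneApps(manifests: Record<string, ManifestEntry>[]): void {
  for (const manifest of manifests) {
    for (const [name, entry] of Object.entries(manifest)) {
      if (!entry?.file) {
        console.error(`Standalone app entry "${name}" is missing a file key; skipping`);
        continue;
      }
      if (loadedScripts.has(entry.file)) continue; // prevent duplicate script loading
      loadedScripts.add(entry.file);
      const script = document.createElement('script');
      script.type = 'module';
      script.src = `/plugins/dynamix.my.servers/unraid-components/standalone/${entry.file}`;
      document.head.appendChild(script);
    }
  }
}
```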
Eli Bosley
eeed20215f feat: add prop parsing from HTML attributes in Vue app mounting
- Introduced a helper function `parsePropsFromElement` to extract props from HTML attributes, enhancing the flexibility of prop handling.
- Updated `mountVueApp` to utilize parsed props for both the main app and additional targets, allowing for dynamic prop assignment based on the HTML structure.
- Improved overall integration of props with Vue components, ensuring a more seamless mounting process.
2025-08-30 21:52:43 -04:00
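A sketch of what `parsePropsFromElement` could look like; the `data-prop-*` attribute convention here is an assumption for illustration, not necessarily the one used in vue-mount-app.ts:

```ts
// Hypothetical shape of the helper: read data-prop-* attributes from the mount
// element, convert kebab-case names to camelCase, and JSON-parse values where
// possible so numbers and booleans arrive as real types.
export function parsePropsFromElement(el: HTMLElement): Record<string, unknown> {
  const props: Record<string, unknown> = {};
  for (const attr of Array.from(el.attributes)) {
    if (!attr.name.startsWith('data-prop-')) continue;
    const key = attr.name
      .slice('data-prop-'.length)
      .replace(/-([a-z])/g, (_, c: string) => c.toUpperCase()); // kebab -> camelCase
    try {
      props[key] = JSON.parse(attr.value); // numbers, booleans, objects
    } catch {
      props[key] = attr.value; // plain strings stay strings
    }
  }
  return props;
}
```

Under this convention, `<div id="header-os-version" data-prop-show-copy="true"></div>` would yield `{ showCopy: true }`.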
Eli Bosley
41f11b0f8d refactor: enhance CSS content retrieval in Vite config
- Updated the CSS content retrieval logic in `vite.standalone.config.ts` to dynamically find and read entry CSS files from specified directories, improving flexibility and maintainability.
- Removed hardcoded CSS file paths in favor of a directory-based approach, allowing for easier updates and better organization of CSS assets.
2025-08-30 21:28:09 -04:00
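A sketch of the directory-based CSS lookup described above, assuming the entry stylesheet is simply the first `.css` file found in a candidate output directory:

```ts
import { existsSync, readdirSync, readFileSync } from 'node:fs';
import { join } from 'node:path';

// Locate an entry CSS file in a list of candidate output directories instead
// of hard-coding hashed filenames. Directory names below are assumptions.
export function findEntryCss(candidateDirs: string[]): string {
  for (const dir of candidateDirs) {
    if (!existsSync(dir)) continue;
    const cssFile = readdirSync(dir).find((f) => f.endsWith('.css'));
    if (cssFile) return readFileSync(join(dir, cssFile), 'utf8');
  }
  return ''; // fallback when no Nuxt CSS output is present
}

// e.g. findEntryCss(['.nuxt/dist/client/_nuxt', 'dist/assets'])
```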
Eli Bosley
32bc79b93d fix: update CSS validation patterns for Tailwind classes
- Enhanced regex patterns in `validate-custom-elements-css.js` to accommodate minified CSS formats, ensuring accurate detection of Tailwind utility classes and other CSS properties.
- Adjusted patterns for flex, margin, padding, color, background utilities, CSS custom properties, and responsive breakpoints to support both spaced and non-spaced formats.
2025-08-30 21:26:49 -04:00
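A sketch of whitespace-tolerant validation patterns in the spirit of this commit; the specific utilities checked are assumptions, not the full list in `validate-custom-elements-css.js`:

```ts
// `\s*` around ":" and "{" lets both `display:flex` (minified) and
// `display: flex` (pretty-printed) match the same pattern.
const patterns: Record<string, RegExp> = {
  flexUtility: /\.flex\s*\{\s*display\s*:\s*flex/,
  paddingUtility: /\.p-\d+\s*\{\s*padding\s*:/,
  customProperty: /--[\w-]+\s*:\s*[^;{}]+/,
  responsiveBreakpoint: /@media\s*\(\s*min-width\s*:\s*[\d.]+(px|rem|em)\s*\)/,
};

export function findMissingPatterns(css: string): string[] {
  return Object.entries(patterns)
    .filter(([, re]) => !re.test(css))
    .map(([name]) => name); // names of checks that did not match
}
```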
Eli Bosley
b9632b9774 style: add unraid-reset class and CSS rules for component styling
- Introduced a new CSS class `.unraid-reset` to reset inherited styles for Unraid components, ensuring consistent styling across the application.
- Updated `vue-mount-app.ts` to apply the `.unraid-reset` class to mount targets, preventing webgui styles from leaking into Unraid components.
2025-08-30 21:18:49 -04:00
Eli Bosley
a1d91a0b4d fix: update artifact path and manifest validation logic
- Changed the artifact path in the GitHub Actions workflow to point to the new standalone apps directory.
- Enhanced the manifest file validation to include support for standalone.manifest.json, allowing for more flexible manifest file requirements.
2025-08-30 21:03:15 -04:00
Eli Bosley
85b250eb80 feat: mount vue apps, not web components 2025-08-30 20:49:31 -04:00
701 changed files with 17807 additions and 53829 deletions

View File

@@ -1,3 +1,123 @@
{
"permissions": {}
"permissions": {
"allow": [
"# Development Commands",
"Bash(pnpm install)",
"Bash(pnpm dev)",
"Bash(pnpm build)",
"Bash(pnpm test)",
"Bash(pnpm test:*)",
"Bash(pnpm lint)",
"Bash(pnpm lint:fix)",
"Bash(pnpm type-check)",
"Bash(pnpm codegen)",
"Bash(pnpm storybook)",
"Bash(pnpm --filter * dev)",
"Bash(pnpm --filter * build)",
"Bash(pnpm --filter * test)",
"Bash(pnpm --filter * lint)",
"Bash(pnpm --filter * codegen)",
"# Git Commands (read-only)",
"Bash(git status)",
"Bash(git diff)",
"Bash(git log)",
"Bash(git branch)",
"Bash(git remote -v)",
"# Search Commands",
"Bash(rg *)",
"# File System (read-only)",
"Bash(ls)",
"Bash(ls -la)",
"Bash(pwd)",
"Bash(find . -name)",
"Bash(find . -type)",
"# Node/NPM Commands",
"Bash(node --version)",
"Bash(pnpm --version)",
"Bash(npx --version)",
"# Environment Commands",
"Bash(echo $*)",
"Bash(which *)",
"# Process Commands",
"Bash(ps aux | grep)",
"Bash(lsof -i)",
"# Documentation Domains",
"WebFetch(domain:tailwindcss.com)",
"WebFetch(domain:github.com)",
"WebFetch(domain:reka-ui.com)",
"WebFetch(domain:nodejs.org)",
"WebFetch(domain:pnpm.io)",
"WebFetch(domain:vitejs.dev)",
"WebFetch(domain:nuxt.com)",
"WebFetch(domain:nestjs.com)",
"# IDE Integration",
"mcp__ide__getDiagnostics",
"# Browser MCP (for testing)",
"mcp__browsermcp__browser_navigate",
"mcp__browsermcp__browser_click",
"mcp__browsermcp__browser_screenshot"
],
"deny": [
"# Dangerous Commands",
"Bash(rm -rf)",
"Bash(chmod 777)",
"Bash(curl)",
"Bash(wget)",
"Bash(ssh)",
"Bash(scp)",
"Bash(sudo)",
"Bash(su)",
"Bash(pkill)",
"Bash(kill)",
"Bash(killall)",
"Bash(python)",
"Bash(python3)",
"Bash(pip)",
"Bash(npm)",
"Bash(yarn)",
"Bash(apt)",
"Bash(brew)",
"Bash(systemctl)",
"Bash(service)",
"Bash(docker)",
"Bash(docker-compose)",
"# File Modification (use Edit/Write tools instead)",
"Bash(sed)",
"Bash(awk)",
"Bash(perl)",
"Bash(echo > *)",
"Bash(echo >> *)",
"Bash(cat > *)",
"Bash(cat >> *)",
"Bash(tee)",
"# Git Write Commands (require explicit user action)",
"Bash(git add)",
"Bash(git commit)",
"Bash(git push)",
"Bash(git pull)",
"Bash(git merge)",
"Bash(git rebase)",
"Bash(git checkout)",
"Bash(git reset)",
"Bash(git clean)",
"# Package Management Write Commands",
"Bash(pnpm add)",
"Bash(pnpm remove)",
"Bash(pnpm update)",
"Bash(pnpm upgrade)"
]
},
"enableAllProjectMcpServers": false
}

View File

@@ -241,3 +241,4 @@ const pinia = createTestingPinia({
- Set initial state for focused testing
- Test computed properties by accessing them directly
- Verify state changes by updating the store

View File

@@ -1,201 +0,0 @@
name: Build Artifacts
on:
workflow_call:
inputs:
ref:
type: string
required: false
description: "Git ref to checkout (commit SHA, branch, or tag)"
version_override:
type: string
required: false
description: "Override version (for manual releases)"
outputs:
build_number:
description: "Build number for the artifacts"
value: ${{ jobs.build-api.outputs.build_number }}
secrets:
VITE_ACCOUNT:
required: true
VITE_CONNECT:
required: true
VITE_UNRAID_NET:
required: true
VITE_CALLBACK_KEY:
required: true
UNRAID_BOT_GITHUB_ADMIN_TOKEN:
required: false
jobs:
build-api:
name: Build API
runs-on: ubuntu-latest
outputs:
build_number: ${{ steps.buildnumber.outputs.build_number }}
defaults:
run:
working-directory: api
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.ref || github.ref }}
fetch-depth: 0
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential
version: 1.0
- name: PNPM Install
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile
- name: Get Git Short Sha and API version
id: vars
run: |
GIT_SHA=$(git rev-parse --short HEAD)
IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '')
PACKAGE_LOCK_VERSION=$(jq -r '.version' package.json)
API_VERSION=${{ inputs.version_override && format('"{0}"', inputs.version_override) || '${PACKAGE_LOCK_VERSION}' }}
if [ -z "${{ inputs.version_override }}" ] && [ -z "$IS_TAGGED" ]; then
API_VERSION="${PACKAGE_LOCK_VERSION}+${GIT_SHA}"
fi
export API_VERSION
echo "API_VERSION=${API_VERSION}" >> $GITHUB_ENV
echo "PACKAGE_LOCK_VERSION=${PACKAGE_LOCK_VERSION}" >> $GITHUB_OUTPUT
- name: Generate build number
id: buildnumber
uses: onyxmueller/build-tag-number@v1
with:
token: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN || github.token }}
prefix: ${{ inputs.version_override || steps.vars.outputs.PACKAGE_LOCK_VERSION }}
- name: Build
run: |
pnpm run build:release
tar -czf deploy/unraid-api.tgz -C deploy/pack/ .
- name: Upload tgz to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-api
path: ${{ github.workspace }}/api/deploy/unraid-api.tgz
build-unraid-ui-webcomponents:
name: Build Unraid UI Library (Webcomponent Version)
defaults:
run:
working-directory: unraid-ui
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.ref || github.ref }}
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential
version: 1.0
- name: Install dependencies
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/ui
- name: Lint
run: pnpm run lint
- name: Build
run: pnpm run build:wc
- name: Upload Artifact to Github
uses: actions/upload-artifact@v4
with:
name: unraid-wc-ui
path: unraid-ui/dist-wc/
build-web:
name: Build Web App
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.ref || github.ref }}
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ secrets.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ secrets.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ secrets.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ secrets.VITE_CALLBACK_KEY }} >> .env
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: PNPM Install
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/web --filter @unraid/ui
- name: Build Unraid UI
run: |
cd ${{ github.workspace }}/unraid-ui
pnpm run build
- name: Lint files
run: pnpm run lint
- name: Type Check
run: pnpm run type-check
- name: Build
run: pnpm run build
- name: Upload build to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-wc-rich
path: web/dist

View File

@@ -27,15 +27,6 @@ on:
type: string
required: true
description: "Build number for the plugin builds"
ref:
type: string
required: false
description: "Git ref (commit SHA, branch, or tag) to checkout"
TRIGGER_PRODUCTION_RELEASE:
type: boolean
required: false
default: false
description: "Whether to automatically trigger the release-production workflow (default: false)"
secrets:
CF_ACCESS_KEY_ID:
required: true
@@ -45,8 +36,6 @@ on:
required: true
CF_ENDPOINT:
required: true
UNRAID_BOT_GITHUB_ADMIN_TOKEN:
required: false
jobs:
build-plugin:
name: Build and Deploy Plugin
@@ -58,19 +47,23 @@ jobs:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.ref }}
fetch-depth: 0
- name: Install Node
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
- name: Get API Version
id: vars
@@ -78,23 +71,17 @@ jobs:
GIT_SHA=$(git rev-parse --short HEAD)
IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '')
PACKAGE_LOCK_VERSION=$(jq -r '.version' package.json)
# For release builds, trust the release tag version to avoid stale checkouts
if [ "${{ inputs.RELEASE_CREATED }}" = "true" ] && [ -n "${{ inputs.RELEASE_TAG }}" ]; then
TAG_VERSION="${{ inputs.RELEASE_TAG }}"
TAG_VERSION="${TAG_VERSION#v}" # trim leading v if present
if [ "$TAG_VERSION" != "$PACKAGE_LOCK_VERSION" ]; then
echo "::warning::Release tag version ($TAG_VERSION) does not match package.json version ($PACKAGE_LOCK_VERSION). Using tag version for TXZ naming."
fi
API_VERSION="$TAG_VERSION"
else
API_VERSION=$([[ -n "$IS_TAGGED" ]] && echo "$PACKAGE_LOCK_VERSION" || echo "${PACKAGE_LOCK_VERSION}+${GIT_SHA}")
fi
API_VERSION=$([[ -n "$IS_TAGGED" ]] && echo "$PACKAGE_LOCK_VERSION" || echo "${PACKAGE_LOCK_VERSION}+${GIT_SHA}")
echo "API_VERSION=${API_VERSION}" >> $GITHUB_OUTPUT
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install dependencies
run: |
cd ${{ github.workspace }}
@@ -110,7 +97,7 @@ jobs:
uses: actions/download-artifact@v5
with:
pattern: unraid-wc-rich
path: ${{ github.workspace }}/plugin/source/dynamix.unraid.net/usr/local/emhttp/plugins/dynamix.my.servers/unraid-components/standalone
path: ${{ github.workspace }}/plugin/source/dynamix.unraid.net/usr/local/emhttp/plugins/dynamix.my.servers/unraid-components/nuxt
merge-multiple: true
- name: Download Unraid API
uses: actions/download-artifact@v5
@@ -160,12 +147,12 @@ jobs:
done
- name: Workflow Dispatch and wait
if: inputs.RELEASE_CREATED == 'true' && inputs.TRIGGER_PRODUCTION_RELEASE == true
if: inputs.RELEASE_CREATED == 'true'
uses: the-actions-org/workflow-dispatch@v4.0.0
with:
workflow: release-production.yml
inputs: '{ "version": "v${{ steps.vars.outputs.API_VERSION }}" }'
token: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
inputs: '{ "version": "${{ steps.vars.outputs.API_VERSION }}" }'
token: ${{ secrets.WORKFLOW_TRIGGER_PAT }}
- name: Upload to Cloudflare
if: inputs.RELEASE_CREATED == 'false'
@@ -194,40 +181,3 @@ jobs:
```
${{ inputs.BASE_URL }}/tag/${{ inputs.TAG }}/dynamix.unraid.net.plg
```
- name: Clean up old preview builds
if: inputs.RELEASE_CREATED == 'false' && github.event_name == 'push'
continue-on-error: true
env:
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
echo "🧹 Cleaning up old preview builds (keeping last 7 days)..."
# Calculate cutoff date (7 days ago)
CUTOFF_DATE=$(date -d "7 days ago" +"%Y.%m.%d")
echo "Deleting builds older than: ${CUTOFF_DATE}"
# List and delete old timestamped .txz files
OLD_FILES=$(aws s3 ls "s3://${{ secrets.CF_BUCKET_PREVIEW }}/unraid-api/" \
--endpoint-url ${{ secrets.CF_ENDPOINT }} --recursive | \
grep -E "dynamix\.unraid\.net-[0-9]{4}\.[0-9]{2}\.[0-9]{2}\.[0-9]{4}\.txz" | \
awk '{print $4}' || true)
DELETED_COUNT=0
if [ -n "$OLD_FILES" ]; then
while IFS= read -r file; do
if [[ $file =~ ([0-9]{4}\.[0-9]{2}\.[0-9]{2})\.[0-9]{4}\.txz ]]; then
FILE_DATE="${BASH_REMATCH[1]}"
if [[ "$FILE_DATE" < "$CUTOFF_DATE" ]]; then
echo "Deleting old build: $(basename "$file")"
aws s3 rm "s3://${{ secrets.CF_BUCKET_PREVIEW }}/${file}" \
--endpoint-url ${{ secrets.CF_ENDPOINT }} || true
((DELETED_COUNT++))
fi
fi
done <<< "$OLD_FILES"
fi
echo "✅ Deleted ${DELETED_COUNT} old builds"

.github/workflows/claude-code-review.yml (new file, 103 lines)
View File

@@ -0,0 +1,103 @@
name: Claude Code Review
on:
pull_request:
types: [opened, synchronize]
# Skip reviews for non-code changes
paths-ignore:
- "**/*.md"
- "**/package-lock.json"
- "**/pnpm-lock.yaml"
- "**/.gitignore"
- "**/LICENSE"
- "**/*.config.js"
- "**/*.config.ts"
- "**/tsconfig.json"
- "**/.github/workflows/*.yml"
- "**/docs/**"
jobs:
claude-review:
# Skip review for bot PRs and WIP/skip-review PRs
# Only run if changes are significant (>10 lines)
if: |
(github.event.pull_request.additions > 10 || github.event.pull_request.deletions > 10) &&
!contains(github.event.pull_request.title, '[skip-review]') &&
!contains(github.event.pull_request.title, '[WIP]') &&
!endsWith(github.event.pull_request.user.login, '[bot]') &&
github.event.pull_request.user.login != 'dependabot' &&
github.event.pull_request.user.login != 'renovate'
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 1
- name: Run Claude Code Review
id: claude-review
uses: anthropics/claude-code-action@beta
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# Optional: Specify model (defaults to Claude Sonnet 4, uncomment for Claude Opus 4)
# model: "claude-opus-4-20250514"
# Direct prompt for automated review (no @claude mention needed)
direct_prompt: |
IMPORTANT: Review ONLY the DIFF/CHANGESET - the actual lines that were added or modified in this PR.
DO NOT review the entire file context, only analyze the specific changes being made.
Look for HIGH-PRIORITY issues in the CHANGED LINES ONLY:
1. CRITICAL BUGS: Logic errors, null pointer issues, infinite loops, race conditions
2. SECURITY: SQL injection, XSS, authentication bypass, exposed secrets, unsafe operations
3. BREAKING CHANGES: API contract violations, removed exports, changed function signatures
4. DATA LOSS RISKS: Destructive operations without safeguards, missing data validation
DO NOT comment on:
- Code that wasn't changed in this PR
- Style, formatting, or documentation
- Test coverage (unless tests are broken by the changes)
- Minor optimizations or best practices
- Existing code issues that weren't introduced by this PR
If you find no critical issues in the DIFF, respond with: "✅ No critical issues found in changes"
Keep response under 10 lines. Reference specific line numbers from the diff when reporting issues.
# Optional: Use sticky comments to make Claude reuse the same comment on subsequent pushes to the same PR
use_sticky_comment: true
# Context-aware review based on PR characteristics
# Uncomment to enable different review strategies based on context
# direct_prompt: |
# ${{
# (github.event.pull_request.additions > 500) &&
# 'Large PR detected. Focus only on architectural issues and breaking changes. Skip minor issues.' ||
# contains(github.event.pull_request.title, 'fix') &&
# 'Bug fix PR: Verify the fix addresses the root cause and check for regression risks.' ||
# contains(github.event.pull_request.title, 'deps') &&
# 'Dependency update: Check for breaking changes and security advisories only.' ||
# contains(github.event.pull_request.title, 'refactor') &&
# 'Refactor PR: Verify no behavior changes and check for performance regressions.' ||
# contains(github.event.pull_request.title, 'feat') &&
# 'New feature: Check for security issues, edge cases, and integration problems only.' ||
# 'Standard review: Check for critical bugs, security issues, and breaking changes only.'
# }}
# Optional: Add specific tools for running tests or linting
# allowed_tools: "Bash(npm run test),Bash(npm run lint),Bash(npm run typecheck)"
# Optional: Skip review for certain conditions
# if: |
# !contains(github.event.pull_request.title, '[skip-review]') &&
# !contains(github.event.pull_request.title, '[WIP]')

.github/workflows/claude.yml (new file, 64 lines)
View File

@@ -0,0 +1,64 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@beta
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Specify model (defaults to Claude Sonnet 4, uncomment for Claude Opus 4)
# model: "claude-opus-4-20250514"
# Optional: Customize the trigger phrase (default: @claude)
# trigger_phrase: "/claude"
# Optional: Trigger when specific user is assigned to an issue
# assignee_trigger: "claude-bot"
# Optional: Allow Claude to run specific commands
# allowed_tools: "Bash(npm install),Bash(npm run build),Bash(npm run test:*),Bash(npm run lint:*)"
# Optional: Add custom instructions for Claude to customize its behavior for your project
# custom_instructions: |
# Follow our coding standards
# Ensure all new code has tests
# Use TypeScript for new files
# Optional: Custom environment variables for Claude
# claude_env: |
# NODE_ENV: test

View File

@@ -0,0 +1,82 @@
name: Update API Documentation
on:
push:
branches:
- main
paths:
- 'api/docs/**'
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
# Add permissions for GITHUB_TOKEN
permissions:
contents: write
pull-requests: write
jobs:
create-docs-pr:
runs-on: ubuntu-latest
steps:
- name: Checkout source repository
uses: actions/checkout@v5
with:
path: source-repo
- name: Checkout docs repository
uses: actions/checkout@v5
with:
repository: unraid/docs
path: docs-repo
token: ${{ secrets.DOCS_PAT_UNRAID_BOT }}
- name: Copy and process docs
run: |
if [ ! -d "source-repo/api/docs" ]; then
echo "Source directory does not exist!"
exit 1
fi
# Remove old API docs but preserve other folders
rm -rf docs-repo/docs/API/
mkdir -p docs-repo/docs/API
# Copy all markdown files and maintain directory structure
cp -r source-repo/api/docs/public/. docs-repo/docs/API/
# Copy images to Docusaurus static directory
mkdir -p docs-repo/static/img/api
# Copy images from public/images if they exist
if [ -d "source-repo/api/docs/public/images" ]; then
cp -r source-repo/api/docs/public/images/. docs-repo/static/img/api/
fi
# Also copy any images from the parent docs/images directory
if [ -d "source-repo/api/docs/images" ]; then
cp -r source-repo/api/docs/images/. docs-repo/static/img/api/
fi
# Update image paths in markdown files
# Replace relative image paths with absolute paths pointing to /img/api/
find docs-repo/docs/API -name "*.md" -type f -exec sed -i 's|!\[\([^]]*\)\](\./images/\([^)]*\))|![\1](/img/api/\2)|g' {} \;
find docs-repo/docs/API -name "*.md" -type f -exec sed -i 's|!\[\([^]]*\)\](images/\([^)]*\))|![\1](/img/api/\2)|g' {} \;
find docs-repo/docs/API -name "*.md" -type f -exec sed -i 's|!\[\([^]]*\)\](../images/\([^)]*\))|![\1](/img/api/\2)|g' {} \;
- name: Create Pull Request
uses: peter-evans/create-pull-request@v7
with:
token: ${{ secrets.DOCS_PAT_UNRAID_BOT }}
path: docs-repo
commit-message: 'docs: update API documentation'
title: 'Update API Documentation'
body: |
This PR updates the API documentation based on changes from the main repository.
Changes were automatically generated from api/docs/* directory.
@coderabbitai ignore
reviewers: ljm42, elibosley
branch: update-api-docs
base: main
delete-branch: true

View File

@@ -22,17 +22,16 @@ jobs:
- name: Checkout
uses: actions/checkout@v5
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '22.18.0'
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Setup Node.js
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
@@ -66,7 +65,7 @@ jobs:
- name: Comment PR with deployment URL
if: github.event_name == 'pull_request'
uses: actions/github-script@v8
uses: actions/github-script@v7
with:
script: |
github.rest.issues.createComment({

View File

@@ -1,210 +0,0 @@
name: Generate Release Notes
on:
workflow_call:
inputs:
version:
description: 'Version number (e.g., 4.25.3)'
required: true
type: string
target_commitish:
description: 'Commit SHA or branch (leave empty for current HEAD)'
required: false
type: string
release_notes:
description: 'Custom release notes (leave empty to auto-generate)'
required: false
type: string
outputs:
release_notes:
description: 'Generated or provided release notes'
value: ${{ jobs.generate.outputs.release_notes }}
secrets:
UNRAID_BOT_GITHUB_ADMIN_TOKEN:
required: true
jobs:
generate:
name: Generate Release Notes
runs-on: ubuntu-latest
outputs:
release_notes: ${{ steps.generate_notes.outputs.release_notes }}
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.target_commitish || github.ref }}
fetch-depth: 0
token: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Generate Release Notes
id: generate_notes
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
TAG_NAME="v${{ inputs.version }}"
VERSION="${{ inputs.version }}"
if [ -n "${{ inputs.release_notes }}" ]; then
NOTES="${{ inputs.release_notes }}"
else
CHANGELOG_PATH="api/CHANGELOG.md"
if [ -f "$CHANGELOG_PATH" ]; then
echo "Extracting release notes from CHANGELOG.md for version ${VERSION}"
NOTES=$(awk -v ver="$VERSION" '
BEGIN {
found=0; capture=0; output="";
gsub(/\./, "\\.", ver);
}
/^## \[/ {
if (capture) exit;
if ($0 ~ "\\[" ver "\\]") {
found=1;
capture=1;
}
}
capture {
if (output != "") output = output "\n";
output = output $0;
}
END {
if (found) print output;
else exit 1;
}
' "$CHANGELOG_PATH") || EXTRACTION_STATUS=$?
if [ ${EXTRACTION_STATUS:-0} -eq 0 ] && [ -n "$NOTES" ]; then
echo "✓ Found release notes in CHANGELOG.md"
else
echo "⚠ Version ${VERSION} not found in CHANGELOG.md, generating with conventional-changelog"
PREV_TAG=$(git describe --tags --abbrev=0 HEAD^ 2>/dev/null || echo "")
CHANGELOG_GENERATED=false
if [ -n "$PREV_TAG" ]; then
echo "Generating changelog from ${PREV_TAG}..HEAD using conventional-changelog"
npm install -g conventional-changelog-cli
TEMP_NOTES=$(mktemp)
conventional-changelog -p conventionalcommits \
--release-count 1 \
--output-unreleased \
> "$TEMP_NOTES" 2>/dev/null || true
if [ -s "$TEMP_NOTES" ]; then
NOTES=$(cat "$TEMP_NOTES")
if [ -n "$NOTES" ]; then
echo "✓ Generated changelog with conventional-changelog"
CHANGELOG_GENERATED=true
TEMP_CHANGELOG=$(mktemp)
{
if [ -f "$CHANGELOG_PATH" ]; then
head -n 1 "$CHANGELOG_PATH"
echo ""
echo "$NOTES"
echo ""
tail -n +2 "$CHANGELOG_PATH"
else
echo "# Changelog"
echo ""
echo "$NOTES"
fi
} > "$TEMP_CHANGELOG"
mv "$TEMP_CHANGELOG" "$CHANGELOG_PATH"
echo "✓ Updated CHANGELOG.md with generated notes"
else
echo "⚠ conventional-changelog produced empty output, using GitHub auto-generation"
NOTES=$(gh api repos/${{ github.repository }}/releases/generate-notes \
-f tag_name="${TAG_NAME}" \
-f target_commitish="${{ inputs.target_commitish || github.sha }}" \
-f previous_tag_name="${PREV_TAG}" \
--jq '.body')
fi
else
echo "⚠ conventional-changelog failed, using GitHub auto-generation"
NOTES=$(gh api repos/${{ github.repository }}/releases/generate-notes \
-f tag_name="${TAG_NAME}" \
-f target_commitish="${{ inputs.target_commitish || github.sha }}" \
-f previous_tag_name="${PREV_TAG}" \
--jq '.body')
fi
rm -f "$TEMP_NOTES"
else
echo "⚠ No previous tag found, using GitHub auto-generation"
NOTES=$(gh api repos/${{ github.repository }}/releases/generate-notes \
-f tag_name="${TAG_NAME}" \
-f target_commitish="${{ inputs.target_commitish || github.sha }}" \
--jq '.body' || echo "Release ${VERSION}")
fi
if [ "$CHANGELOG_GENERATED" = true ]; then
BRANCH_OR_SHA="${{ inputs.target_commitish || github.ref }}"
if git show-ref --verify --quiet "refs/heads/${BRANCH_OR_SHA}"; then
echo ""
echo "=========================================="
echo "CHANGELOG GENERATED AND COMMITTED"
echo "=========================================="
echo ""
git config user.name "github-actions[bot]"
git config user.email "github-actions[bot]@users.noreply.github.com"
BEFORE_SHA=$(git rev-parse HEAD)
git add "$CHANGELOG_PATH"
git commit -m "chore: add changelog for version ${VERSION}"
git push origin "HEAD:${BRANCH_OR_SHA}"
AFTER_SHA=$(git rev-parse HEAD)
echo "✓ Changelog committed and pushed successfully"
echo ""
echo "Previous SHA: ${BEFORE_SHA}"
echo "New SHA: ${AFTER_SHA}"
echo ""
echo "⚠️ CRITICAL: A new commit was created, but github.sha is immutable."
echo "⚠️ github.sha = ${BEFORE_SHA} (original workflow trigger)"
echo "⚠️ The release tag must point to ${AFTER_SHA} (with changelog)"
echo ""
echo "Re-run this workflow to create the release with the correct commit."
echo ""
exit 1
else
echo "⚠ Target is a commit SHA, not a branch. Cannot push changelog updates."
echo "Changelog was generated but not committed."
fi
fi
fi
else
echo "⚠ CHANGELOG.md not found, using GitHub auto-generation"
PREV_TAG=$(git describe --tags --abbrev=0 HEAD^ 2>/dev/null || echo "")
if [ -n "$PREV_TAG" ]; then
NOTES=$(gh api repos/${{ github.repository }}/releases/generate-notes \
-f tag_name="${TAG_NAME}" \
-f target_commitish="${{ inputs.target_commitish || github.sha }}" \
-f previous_tag_name="${PREV_TAG}" \
--jq '.body')
else
NOTES="Release ${VERSION}"
fi
fi
fi
echo "release_notes<<EOF" >> $GITHUB_OUTPUT
echo "$NOTES" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT

View File

@@ -6,15 +6,29 @@ on:
branches:
- main
permissions:
contents: write
pull-requests: write
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
cancel-in-progress: true
jobs:
release-please:
name: Release Please
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- name: Checkout
uses: actions/checkout@v5
# Only run release-please on pushes to main
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
- id: release
uses: googleapis/release-please-action@v4
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
outputs:
releases_created: ${{ steps.release.outputs.releases_created || 'false' }}
tag_name: ${{ steps.release.outputs.tag_name || '' }}
test-api:
name: Test API
defaults:
@@ -24,25 +38,36 @@ jobs:
steps:
- name: Checkout repo
uses: actions/checkout@v5
- name: Install Node
uses: actions/setup-node@v4
with:
fetch-depth: 0
node-version-file: ".nvmrc"
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential libvirt-daemon-system
version: 1.0
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
- uses: actions/cache@v4
name: Setup pnpm cache
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential libvirt-daemon-system php-cli
version: 1.0
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: PNPM Install
run: pnpm install --frozen-lockfile
@@ -92,113 +117,265 @@ jobs:
# Verify libvirt is running using sudo to bypass group membership delays
sudo virsh list --all || true
- name: Build UI Package First
run: |
echo "🔧 Building UI package for web tests dependency..."
cd ../unraid-ui && pnpm run build
- uses: oven-sh/setup-bun@v2
with:
bun-version: latest
- name: Run Tests Concurrently
run: |
set -e
# Run all tests in parallel with labeled output and coverage generation
# Run all tests in parallel with labeled output
echo "🚀 Starting API coverage tests..."
pnpm run coverage > api-test.log 2>&1 &
API_PID=$!
echo "🚀 Starting Connect plugin tests..."
(cd ../packages/unraid-api-plugin-connect && pnpm test --coverage 2>/dev/null || pnpm test) > connect-test.log 2>&1 &
(cd ../packages/unraid-api-plugin-connect && pnpm test) > connect-test.log 2>&1 &
CONNECT_PID=$!
echo "🚀 Starting Shared package tests..."
(cd ../packages/unraid-shared && pnpm test --coverage 2>/dev/null || pnpm test) > shared-test.log 2>&1 &
(cd ../packages/unraid-shared && pnpm test) > shared-test.log 2>&1 &
SHARED_PID=$!
echo "🚀 Starting Web package coverage tests..."
(cd ../web && (pnpm test --coverage || pnpm test)) > web-test.log 2>&1 &
WEB_PID=$!
echo "🚀 Starting UI package coverage tests..."
(cd ../unraid-ui && pnpm test --coverage 2>/dev/null || pnpm test) > ui-test.log 2>&1 &
UI_PID=$!
echo "🚀 Starting Plugin tests..."
(cd ../plugin && pnpm test) > plugin-test.log 2>&1 &
PLUGIN_PID=$!
# Wait for all processes and capture exit codes
wait $API_PID && echo "✅ API tests completed" || { echo "❌ API tests failed"; API_EXIT=1; }
wait $CONNECT_PID && echo "✅ Connect tests completed" || { echo "❌ Connect tests failed"; CONNECT_EXIT=1; }
wait $SHARED_PID && echo "✅ Shared tests completed" || { echo "❌ Shared tests failed"; SHARED_EXIT=1; }
wait $WEB_PID && echo "✅ Web tests completed" || { echo "❌ Web tests failed"; WEB_EXIT=1; }
wait $UI_PID && echo "✅ UI tests completed" || { echo "❌ UI tests failed"; UI_EXIT=1; }
wait $PLUGIN_PID && echo "✅ Plugin tests completed" || { echo "❌ Plugin tests failed"; PLUGIN_EXIT=1; }
# Display all outputs
echo "📋 API Test Results:" && cat api-test.log
echo "📋 Connect Plugin Test Results:" && cat connect-test.log
echo "📋 Shared Package Test Results:" && cat shared-test.log
echo "📋 Web Package Test Results:" && cat web-test.log
echo "📋 UI Package Test Results:" && cat ui-test.log
echo "📋 Plugin Test Results:" && cat plugin-test.log
# Exit with error if any test failed
if [[ ${API_EXIT:-0} -eq 1 || ${CONNECT_EXIT:-0} -eq 1 || ${SHARED_EXIT:-0} -eq 1 || ${WEB_EXIT:-0} -eq 1 || ${UI_EXIT:-0} -eq 1 || ${PLUGIN_EXIT:-0} -eq 1 ]]; then
if [[ ${API_EXIT:-0} -eq 1 || ${CONNECT_EXIT:-0} -eq 1 || ${SHARED_EXIT:-0} -eq 1 ]]; then
exit 1
fi
- name: Upload all coverage reports to Codecov
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: ./coverage/coverage-final.json,../web/coverage/coverage-final.json,../unraid-ui/coverage/coverage-final.json,../packages/unraid-api-plugin-connect/coverage/coverage-final.json,../packages/unraid-shared/coverage/coverage-final.json
fail_ci_if_error: false
build-artifacts:
name: Build All Artifacts
uses: ./.github/workflows/build-artifacts.yml
secrets:
VITE_ACCOUNT: ${{ secrets.VITE_ACCOUNT }}
VITE_CONNECT: ${{ secrets.VITE_CONNECT }}
VITE_UNRAID_NET: ${{ secrets.VITE_UNRAID_NET }}
VITE_CALLBACK_KEY: ${{ secrets.VITE_CALLBACK_KEY }}
UNRAID_BOT_GITHUB_ADMIN_TOKEN: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
release-please:
name: Release Please
build-api:
name: Build API
runs-on: ubuntu-latest
# Only run on pushes to main AND after tests pass
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
needs:
- test-api
- build-artifacts
permissions:
contents: write
pull-requests: write
steps:
- name: Checkout
uses: actions/checkout@v5
with:
fetch-depth: 0
- id: release
uses: googleapis/release-please-action@v4
outputs:
releases_created: ${{ steps.release.outputs.releases_created || 'false' }}
tag_name: ${{ steps.release.outputs.tag_name || '' }}
build_number: ${{ steps.buildnumber.outputs.build_number }}
defaults:
run:
working-directory: api
steps:
- name: Checkout repo
uses: actions/checkout@v5
- name: Install Node
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential
version: 1.0
- name: PNPM Install
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile
- name: Build
run: pnpm run build
- name: Get Git Short Sha and API version
id: vars
run: |
GIT_SHA=$(git rev-parse --short HEAD)
IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '')
PACKAGE_LOCK_VERSION=$(jq -r '.version' package.json)
API_VERSION=$([[ -n "$IS_TAGGED" ]] && echo "$PACKAGE_LOCK_VERSION" || echo "${PACKAGE_LOCK_VERSION}+${GIT_SHA}")
export API_VERSION
echo "API_VERSION=${API_VERSION}" >> $GITHUB_ENV
echo "PACKAGE_LOCK_VERSION=${PACKAGE_LOCK_VERSION}" >> $GITHUB_OUTPUT
- name: Generate build number
id: buildnumber
uses: onyxmueller/build-tag-number@v1
with:
token: ${{secrets.github_token}}
prefix: ${{steps.vars.outputs.PACKAGE_LOCK_VERSION}}
- name: Build
run: |
pnpm run build:release
tar -czf deploy/unraid-api.tgz -C deploy/pack/ .
- name: Upload tgz to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-api
path: ${{ github.workspace }}/api/deploy/unraid-api.tgz
build-unraid-ui-webcomponents:
name: Build Unraid UI Library (Webcomponent Version)
defaults:
run:
working-directory: unraid-ui
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
- name: Install Node
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential
version: 1.0
- name: Install dependencies
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/ui
- name: Lint
run: pnpm run lint
- name: Build
run: pnpm run build:wc
- name: Upload Artifact to Github
uses: actions/upload-artifact@v4
with:
name: unraid-wc-ui
path: unraid-ui/dist-wc/
build-web:
# needs: [build-unraid-ui]
name: Build Web App
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ secrets.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ secrets.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ secrets.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ secrets.VITE_CALLBACK_KEY }} >> .env
cat .env
- name: Install Node
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: PNPM Install
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/web --filter @unraid/ui
- name: Build Unraid UI
run: |
cd ${{ github.workspace }}/unraid-ui
pnpm run build
- name: Lint files
run: pnpm run lint
- name: Type Check
run: pnpm run type-check
- name: Test
run: pnpm run test:ci
- name: Build
run: pnpm run build
- name: Upload build to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-wc-rich
path: web/.nuxt/standalone-apps
build-plugin-staging-pr:
name: Build and Deploy Plugin
needs:
- build-artifacts
- release-please
- build-api
- build-web
- build-unraid-ui-webcomponents
- test-api
uses: ./.github/workflows/build-plugin.yml
with:
RELEASE_CREATED: 'false'
RELEASE_CREATED: false
TAG: ${{ github.event.pull_request.number && format('PR{0}', github.event.pull_request.number) || '' }}
BUCKET_PATH: ${{ github.event.pull_request.number && format('unraid-api/tag/PR{0}', github.event.pull_request.number) || 'unraid-api' }}
BASE_URL: "https://preview.dl.unraid.net/unraid-api"
BUILD_NUMBER: ${{ needs.build-artifacts.outputs.build_number }}
BUILD_NUMBER: ${{ needs.build-api.outputs.build_number }}
secrets:
CF_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
CF_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
@@ -210,19 +387,20 @@ jobs:
name: Build and Deploy Production Plugin
needs:
- release-please
- build-artifacts
- build-api
- build-web
- build-unraid-ui-webcomponents
- test-api
uses: ./.github/workflows/build-plugin.yml
with:
RELEASE_CREATED: 'true'
RELEASE_CREATED: true
RELEASE_TAG: ${{ needs.release-please.outputs.tag_name }}
TAG: ""
BUCKET_PATH: unraid-api
BASE_URL: "https://stable.dl.unraid.net/unraid-api"
BUILD_NUMBER: ${{ needs.build-artifacts.outputs.build_number }}
TRIGGER_PRODUCTION_RELEASE: true
BUILD_NUMBER: ${{ needs.build-api.outputs.build_number }}
secrets:
CF_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
CF_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
CF_BUCKET_PREVIEW: ${{ secrets.CF_BUCKET_PREVIEW }}
CF_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
UNRAID_BOT_GITHUB_ADMIN_TOKEN: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}

View File

@@ -1,239 +0,0 @@
name: Manual Release
on:
workflow_dispatch:
inputs:
version:
description: 'Version to release (e.g., 4.25.3)'
required: true
type: string
target_commitish:
description: 'Commit SHA or branch (leave empty for current HEAD)'
required: false
type: string
release_notes:
description: 'Release notes/changelog (leave empty to auto-generate from commits)'
required: false
type: string
prerelease:
description: 'Mark as prerelease'
required: false
type: boolean
default: false
permissions:
contents: write
pull-requests: write
jobs:
validate-version:
name: Validate and Update Package Versions
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.target_commitish || github.ref }}
fetch-depth: 0
token: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Check and Update Package Versions
run: |
EXPECTED_VERSION="${{ inputs.version }}"
MISMATCHES_FOUND=false
PACKAGE_JSONS=(
"package.json"
"api/package.json"
"web/package.json"
"unraid-ui/package.json"
"plugin/package.json"
"packages/unraid-shared/package.json"
"packages/unraid-api-plugin-health/package.json"
"packages/unraid-api-plugin-generator/package.json"
"packages/unraid-api-plugin-connect/package.json"
)
echo "Checking package.json versions against expected version: ${EXPECTED_VERSION}"
for pkg in "${PACKAGE_JSONS[@]}"; do
if [ -f "$pkg" ]; then
CURRENT_VERSION=$(node -p "require('./$pkg').version")
if [ "$CURRENT_VERSION" != "$EXPECTED_VERSION" ]; then
echo "❌ Version mismatch in $pkg: $CURRENT_VERSION != $EXPECTED_VERSION"
MISMATCHES_FOUND=true
# Detect indentation by checking the first property line
INDENT_SPACES=$(head -10 "$pkg" | grep '^ *"' | head -1 | sed 's/".*//g' | wc -c)
INDENT_SPACES=$((INDENT_SPACES - 1))
jq --indent "$INDENT_SPACES" --arg version "$EXPECTED_VERSION" '.version = $version' "$pkg" > "$pkg.tmp" && mv "$pkg.tmp" "$pkg"
echo "✓ Updated $pkg to version $EXPECTED_VERSION"
else
echo "✓ $pkg version matches: $CURRENT_VERSION"
fi
fi
done
if [ "$MISMATCHES_FOUND" = true ]; then
echo ""
echo "=========================================="
echo "Version mismatches found!"
echo "=========================================="
echo ""
BRANCH_OR_SHA="${{ inputs.target_commitish || github.ref }}"
if git show-ref --verify --quiet "refs/heads/${BRANCH_OR_SHA}"; then
echo "Creating commit with version updates and pushing to branch: ${BRANCH_OR_SHA}"
git config user.name "github-actions[bot]"
git config user.email "github-actions[bot]@users.noreply.github.com"
BEFORE_SHA=$(git rev-parse HEAD)
git add ${PACKAGE_JSONS[@]}
git commit -m "chore: update package versions to ${{ inputs.version }}"
git push origin "HEAD:${BRANCH_OR_SHA}"
AFTER_SHA=$(git rev-parse HEAD)
echo ""
echo "=========================================="
echo "WORKFLOW MUST BE RE-RUN"
echo "=========================================="
echo ""
echo "✓ Version updates committed and pushed successfully"
echo ""
echo "Previous SHA: ${BEFORE_SHA}"
echo "New SHA: ${AFTER_SHA}"
echo ""
echo "⚠️ CRITICAL: A new commit was created, but github.sha is immutable."
echo "⚠️ github.sha = ${BEFORE_SHA} (original workflow trigger)"
echo "⚠️ The release tag must point to ${AFTER_SHA} (with version updates)"
echo ""
echo "Re-run this workflow to create the release with the correct commit."
echo ""
exit 1
else
echo "Target is a commit SHA, not a branch. Cannot push version updates."
echo "Please update the package.json versions manually and re-run the workflow."
exit 1
fi
fi
echo ""
echo "✓ All package.json versions match the expected version: ${EXPECTED_VERSION}"
build-artifacts:
name: Build All Artifacts
needs:
- validate-version
uses: ./.github/workflows/build-artifacts.yml
with:
ref: ${{ inputs.target_commitish || github.ref }}
version_override: ${{ inputs.version }}
secrets:
VITE_ACCOUNT: ${{ secrets.VITE_ACCOUNT }}
VITE_CONNECT: ${{ secrets.VITE_CONNECT }}
VITE_UNRAID_NET: ${{ secrets.VITE_UNRAID_NET }}
VITE_CALLBACK_KEY: ${{ secrets.VITE_CALLBACK_KEY }}
UNRAID_BOT_GITHUB_ADMIN_TOKEN: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
generate-release-notes:
name: Generate Release Notes
needs:
- build-artifacts
uses: ./.github/workflows/generate-release-notes.yml
with:
version: ${{ inputs.version }}
target_commitish: ${{ inputs.target_commitish || github.ref }}
release_notes: ${{ inputs.release_notes }}
secrets:
UNRAID_BOT_GITHUB_ADMIN_TOKEN: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
create-release:
name: Create GitHub Release (Draft)
runs-on: ubuntu-latest
needs:
- generate-release-notes
outputs:
tag_name: ${{ steps.create_release.outputs.tag_name }}
release_notes: ${{ needs.generate-release-notes.outputs.release_notes }}
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
ref: ${{ inputs.target_commitish || github.ref }}
fetch-depth: 0
- name: Create or Update Release as Draft
id: create_release
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
TAG_NAME="v${{ inputs.version }}"
TARGET="${{ inputs.target_commitish || github.sha }}"
echo "tag_name=${TAG_NAME}" >> $GITHUB_OUTPUT
if gh release view "${TAG_NAME}" > /dev/null 2>&1; then
echo "Release ${TAG_NAME} already exists, updating as draft..."
gh release edit "${TAG_NAME}" \
--draft \
--notes "${{ needs.generate-release-notes.outputs.release_notes }}" \
${{ inputs.prerelease && '--prerelease' || '' }}
else
echo "Creating new draft release ${TAG_NAME}..."
git tag "${TAG_NAME}" "${TARGET}" || true
git push origin "${TAG_NAME}" || true
gh release create "${TAG_NAME}" \
--draft \
--title "${{ inputs.version }}" \
--notes "${{ needs.generate-release-notes.outputs.release_notes }}" \
--target "${TARGET}" \
${{ inputs.prerelease && '--prerelease' || '' }}
fi
build-plugin-production:
name: Build and Deploy Production Plugin
needs:
- create-release
- build-artifacts
uses: ./.github/workflows/build-plugin.yml
with:
RELEASE_CREATED: 'true'
RELEASE_TAG: ${{ needs.create-release.outputs.tag_name }}
TAG: ""
BUCKET_PATH: unraid-api
BASE_URL: "https://stable.dl.unraid.net/unraid-api"
BUILD_NUMBER: ${{ needs.build-artifacts.outputs.build_number }}
ref: ${{ inputs.target_commitish || github.ref }}
secrets:
CF_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
CF_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
CF_BUCKET_PREVIEW: ${{ secrets.CF_BUCKET_PREVIEW }}
CF_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
UNRAID_BOT_GITHUB_ADMIN_TOKEN: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
publish-release:
name: Publish Release
runs-on: ubuntu-latest
needs:
- create-release
- build-plugin-production
steps:
- name: Publish Release
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
TAG_NAME="${{ needs.create-release.outputs.tag_name }}"
echo "Publishing release ${TAG_NAME}..."
gh release edit "${TAG_NAME}" --draft=false --repo ${{ github.repository }}

View File

@@ -1,30 +0,0 @@
name: Publish GraphQL Schema
on:
push:
branches:
- main
paths:
- 'api/generated-schema.graphql'
workflow_dispatch:
jobs:
publish-schema:
name: Publish Schema to Apollo Studio
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
- name: Install Apollo Rover CLI
run: |
curl -sSL https://rover.apollo.dev/nix/latest | sh
echo "$HOME/.rover/bin" >> $GITHUB_PATH
- name: Publish schema to Apollo Studio
env:
APOLLO_KEY: ${{ secrets.APOLLO_KEY }}
run: |
rover graph publish Unraid-API@current \
--schema api/generated-schema.graphql

View File

@@ -1,9 +1,4 @@
name: Replace PR Plugin with Staging Redirect on Merge
# This workflow runs when a PR is merged and replaces the PR-specific plugin
# with a redirect version that points to the main staging URL.
# This ensures users who installed the PR version will automatically
# update to the staging version on their next update check.
name: Push Staging Plugin on PR Close
on:
pull_request:
@@ -22,13 +17,18 @@ on:
default: true
jobs:
push-staging-redirect:
push-staging:
if: (github.event_name == 'pull_request' && github.event.pull_request.merged == true) || (github.event_name == 'workflow_dispatch' && inputs.pr_merged == true)
runs-on: ubuntu-latest
permissions:
contents: read
actions: read
steps:
- name: Set Timezone
uses: szenius/set-timezone@v2.0
with:
timezoneLinux: "America/Los_Angeles"
- name: Set PR number
id: pr_number
run: |
@@ -45,12 +45,11 @@ jobs:
name: unraid-plugin-.*
path: connect-files
pr: ${{ steps.pr_number.outputs.pr_number }}
workflow: main.yml
workflow_conclusion: success
workflow_search: true
search_artifacts: true
if_no_artifact_found: fail
- name: Update Downloaded Plugin to Redirect to Staging
- name: Update Downloaded Staging Plugin to New Date
run: |
# Find the .plg file in the downloaded artifact
plgfile=$(find connect-files -name "*.plg" -type f | head -1)
@@ -61,82 +60,23 @@ jobs:
fi
echo "Found plugin file: $plgfile"
# Get current version and bump it with current timestamp
current_version=$(grep '<!ENTITY version' "${plgfile}" | sed -E 's/.*"(.*)".*/\1/')
echo "Current version: ${current_version}"
# Create new version with current timestamp (ensures it's newer)
new_version=$(date +"%Y.%m.%d.%H%M")
echo "New redirect version: ${new_version}"
# Update version to trigger update
sed -i -E "s#(<!ENTITY version \").*(\">)#\1${new_version}\2#g" "${plgfile}" || exit 1
version=$(date +"%Y.%m.%d.%H%M")
sed -i -E "s#(<!ENTITY version \").*(\">)#\1${version}\2#g" "${plgfile}" || exit 1
# Change the plugin url to point to staging - users will switch to staging on next update
# Change the plugin url to point to staging
url="https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.plg"
sed -i -E "s#(<!ENTITY plugin_url \").*?(\">)#\1${url}\2#g" "${plgfile}" || exit 1
echo "Modified plugin to redirect to: ${url}"
echo "Version bumped from ${current_version} to ${new_version}"
cat "${plgfile}"
mkdir -p pr-release
mv "${plgfile}" pr-release/dynamix.unraid.net.plg
- name: Clean up old PR artifacts from Cloudflare
- name: Upload to Cloudflare
uses: jakejarvis/s3-sync-action@v0.5.1
env:
AWS_S3_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
AWS_S3_BUCKET: ${{ secrets.CF_BUCKET_PREVIEW }}
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
# Delete all existing files in the PR directory first (txz, plg, etc.)
aws s3 rm s3://${{ secrets.CF_BUCKET_PREVIEW }}/unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}/ \
--recursive \
--endpoint-url ${{ secrets.CF_ENDPOINT }}
echo "✅ Cleaned up old PR artifacts"
- name: Upload PR Redirect Plugin to Cloudflare
env:
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
# Upload only the redirect plugin file
aws s3 cp pr-release/dynamix.unraid.net.plg \
s3://${{ secrets.CF_BUCKET_PREVIEW }}/unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}/dynamix.unraid.net.plg \
--endpoint-url ${{ secrets.CF_ENDPOINT }} \
--content-encoding none \
--acl public-read
echo "✅ Uploaded redirect plugin"
- name: Output redirect information
run: |
echo "✅ PR plugin replaced with staging redirect version"
echo "PR URL remains: https://preview.dl.unraid.net/unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}/dynamix.unraid.net.plg"
echo "Redirects users to staging: https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.plg"
echo "Users updating from this PR version will automatically switch to staging"
- name: Comment on PR about staging redirect
if: github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v3
with:
comment-tag: pr-closed-staging
mode: recreate
message: |
## 🔄 PR Merged - Plugin Redirected to Staging
This PR has been merged and the preview plugin has been updated to redirect to the staging version.
**For users testing this PR:**
- Your plugin will automatically update to the staging version on the next update check
- The staging version includes all merged changes from this PR
- No manual intervention required
**Staging URL:**
```
https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.plg
```
Thank you for testing! 🚀
AWS_REGION: "auto"
SOURCE_DIR: pr-release
DEST_DIR: unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}

View File

@@ -28,16 +28,16 @@ jobs:
with:
latest: true
prerelease: false
- uses: actions/setup-node@v5
- uses: actions/setup-node@v4
with:
node-version: 22.19.0
node-version: '22.18.0'
- run: |
cat << 'EOF' > release-notes.txt
${{ steps.release-info.outputs.body }}
EOF
- run: npm install html-escaper@2 xml2js
- name: Update Plugin Changelog
uses: actions/github-script@v8
uses: actions/github-script@v7
with:
script: |
const fs = require('fs');
@@ -124,22 +124,3 @@ jobs:
--no-guess-mime-type \
--content-encoding none \
--acl public-read
- name: Discord Webhook Notification
uses: tsickert/discord-webhook@v7.0.0
with:
webhook-url: ${{ secrets.PUBLIC_DISCORD_RELEASE_ENDPOINT }}
username: "Unraid API Bot"
avatar-url: "https://craftassets.unraid.net/uploads/logos/un-mark-gradient.png"
embed-title: "🚀 Unraid API ${{ inputs.version }} Released!"
embed-url: "https://github.com/${{ github.repository }}/releases/tag/${{ inputs.version }}"
embed-description: |
A new version of Unraid API has been released!
**Version:** `${{ inputs.version }}`
**Release Page:** [View on GitHub](https://github.com/${{ github.repository }}/releases/tag/${{ inputs.version }})
**📋 Changelog:**
${{ steps.release-info.outputs.body }}
embed-color: 16734296
embed-footer-text: "Unraid API • Automated Release"

71
.github/workflows/test-libvirt.yml vendored Normal file
View File

@@ -0,0 +1,71 @@
name: Test Libvirt
on:
push:
branches:
- main
paths:
- "libvirt/**"
pull_request:
paths:
- "libvirt/**"
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
build:
runs-on: ubuntu-latest
defaults:
run:
working-directory: ./libvirt
steps:
- uses: actions/checkout@v5
with:
submodules: recursive
- uses: actions/setup-python@v5
with:
python-version: "3.13.7"
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: libvirt-dev
version: 1.0
- name: Set Node.js
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
version: 10.15.0
run_install: false
- name: Get pnpm store directory
id: pnpm-cache
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
- uses: actions/cache@v4
name: Setup pnpm cache
with:
path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('libvirt/package.json') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: pnpm install
run: pnpm install --frozen-lockfile
- name: Build
run: pnpm run build
- name: test
run: pnpm run test

8
.gitignore vendored
View File

@@ -29,10 +29,6 @@ unraid-ui/node_modules/
# TypeScript v1 declaration files
typings/
# Auto-generated type declarations for Nuxt UI
auto-imports.d.ts
components.d.ts
# Optional npm cache directory
.npm
@@ -122,7 +118,3 @@ api/dev/Unraid.net/myservers.cfg
# local Mise settings
.mise.toml
# Compiled test pages (generated from Nunjucks templates)
web/public/test-pages/*.html

View File

@@ -1 +1 @@
{".":"4.27.2"}
{".":"4.17.0"}

View File

@@ -1,8 +1,7 @@
@custom-variant dark (&:where(.dark, .dark *));
/* Utility defaults for web components (when we were using shadow DOM) */
:host,
.unapi {
:host {
--tw-divide-y-reverse: 0;
--tw-border-style: solid;
--tw-font-weight: initial;
@@ -62,7 +61,7 @@
}
*/
.unapi {
body {
--color-alpha: #1c1b1b;
--color-beta: #f2f2f2;
--color-gamma: #999999;
@@ -74,25 +73,7 @@
--ring-shadow: 0 0 var(--color-beta);
}
.unapi button:not(:disabled),
.unapi [role='button']:not(:disabled) {
button:not(:disabled),
[role='button']:not(:disabled) {
cursor: pointer;
}
/* Font size overrides for SSO button component */
.unapi unraid-sso-button,
unraid-sso-button.unapi {
--text-xs: 0.75rem;
--text-sm: 0.875rem;
--text-base: 1rem;
--text-lg: 1.125rem;
--text-xl: 1.25rem;
--text-2xl: 1.5rem;
--text-3xl: 1.875rem;
--text-4xl: 2.25rem;
--text-5xl: 3rem;
--text-6xl: 3.75rem;
--text-7xl: 4.5rem;
--text-8xl: 6rem;
--text-9xl: 8rem;
}
}

View File

@@ -2,59 +2,9 @@
/* Light mode defaults */
:root {
/* Nuxt UI Color System - Primary (Orange for Unraid) */
--ui-color-primary-50: #fff7ed;
--ui-color-primary-100: #ffedd5;
--ui-color-primary-200: #fed7aa;
--ui-color-primary-300: #fdba74;
--ui-color-primary-400: #fb923c;
--ui-color-primary-500: #ff8c2f;
--ui-color-primary-600: #ea580c;
--ui-color-primary-700: #c2410c;
--ui-color-primary-800: #9a3412;
--ui-color-primary-900: #7c2d12;
--ui-color-primary-950: #431407;
/* Nuxt UI Color System - Neutral (True Gray) */
--ui-color-neutral-50: #fafafa;
--ui-color-neutral-100: #f5f5f5;
--ui-color-neutral-200: #e5e5e5;
--ui-color-neutral-300: #d4d4d4;
--ui-color-neutral-400: #a3a3a3;
--ui-color-neutral-500: #737373;
--ui-color-neutral-600: #525252;
--ui-color-neutral-700: #404040;
--ui-color-neutral-800: #262626;
--ui-color-neutral-900: #171717;
--ui-color-neutral-950: #0a0a0a;
/* Nuxt UI Default color shades */
--ui-primary: var(--ui-color-primary-500);
--ui-secondary: var(--ui-color-neutral-500);
/* Nuxt UI Design Tokens - Text */
--ui-text-dimmed: var(--ui-color-neutral-400);
--ui-text-muted: var(--ui-color-neutral-500);
--ui-text-toned: var(--ui-color-neutral-600);
--ui-text: var(--ui-color-neutral-700);
--ui-text-highlighted: var(--ui-color-neutral-900);
--ui-text-inverted: white;
/* Nuxt UI Design Tokens - Background */
--ui-bg: white;
--ui-bg-muted: var(--ui-color-neutral-50);
--ui-bg-elevated: var(--ui-color-neutral-100);
--ui-bg-accented: var(--ui-color-neutral-200);
--ui-bg-inverted: var(--ui-color-neutral-900);
/* Nuxt UI Design Tokens - Border */
--ui-border: var(--ui-color-neutral-200);
--ui-border-muted: var(--ui-color-neutral-200);
--ui-border-accented: var(--ui-color-neutral-300);
--ui-border-inverted: var(--ui-color-neutral-900);
/* Nuxt UI Radius */
--ui-radius: 0.5rem;
/* Override Tailwind v4 global styles to use webgui variables */
--ui-bg: var(--background-color) !important;
--ui-text: var(--text-color) !important;
--background: 0 0% 100%;
--foreground: 0 0% 3.9%;
@@ -66,7 +16,7 @@
--card-foreground: 0 0% 3.9%;
--border: 0 0% 89.8%;
--input: 0 0% 89.8%;
--primary: 24 100% 50%; /* Orange #ff8c2f in HSL */
--primary: 0 0% 9%;
--primary-foreground: 0 0% 98%;
--secondary: 0 0% 96.1%;
--secondary-foreground: 0 0% 9%;
@@ -74,7 +24,7 @@
--accent-foreground: 0 0% 9%;
--destructive: 0 84.2% 60.2%;
--destructive-foreground: 0 0% 98%;
--ring: 24 100% 50%; /* Orange ring to match primary */
--ring: 0 0% 3.9%;
--chart-1: 12 76% 61%;
--chart-2: 173 58% 39%;
--chart-3: 197 37% 24%;
@@ -84,30 +34,9 @@
/* Dark mode */
.dark {
/* Nuxt UI Default color shades - Dark mode */
--ui-primary: var(--ui-color-primary-400);
--ui-secondary: var(--ui-color-neutral-400);
/* Nuxt UI Design Tokens - Text (Dark) */
--ui-text-dimmed: var(--ui-color-neutral-500);
--ui-text-muted: var(--ui-color-neutral-400);
--ui-text-toned: var(--ui-color-neutral-300);
--ui-text: var(--ui-color-neutral-200);
--ui-text-highlighted: white;
--ui-text-inverted: var(--ui-color-neutral-900);
/* Nuxt UI Design Tokens - Background (Dark) */
--ui-bg: var(--ui-color-neutral-900);
--ui-bg-muted: var(--ui-color-neutral-800);
--ui-bg-elevated: var(--ui-color-neutral-800);
--ui-bg-accented: var(--ui-color-neutral-700);
--ui-bg-inverted: white;
/* Nuxt UI Design Tokens - Border (Dark) */
--ui-border: var(--ui-color-neutral-800);
--ui-border-muted: var(--ui-color-neutral-700);
--ui-border-accented: var(--ui-color-neutral-700);
--ui-border-inverted: white;
/* Override Tailwind v4 global styles to use webgui variables */
--ui-bg: var(--background-color) !important;
--ui-text: var(--text-color) !important;
--background: 0 0% 3.9%;
--foreground: 0 0% 98%;
@@ -119,15 +48,15 @@
--card-foreground: 0 0% 98%;
--border: 0 0% 14.9%;
--input: 0 0% 14.9%;
--primary: 24 100% 50%; /* Orange #ff8c2f in HSL */
--primary-foreground: 0 0% 98%;
--primary: 0 0% 98%;
--primary-foreground: 0 0% 9%;
--secondary: 0 0% 14.9%;
--secondary-foreground: 0 0% 98%;
--accent: 0 0% 14.9%;
--accent-foreground: 0 0% 98%;
--destructive: 0 62.8% 30.6%;
--destructive-foreground: 0 0% 98%;
--ring: 24 100% 50%; /* Orange ring to match primary */
--ring: 0 0% 83.1%;
--chart-1: 220 70% 50%;
--chart-2: 160 60% 45%;
--chart-3: 30 80% 55%;

View File

@@ -1,5 +1,6 @@
/* Tailwind Shared Styles - Single entry point for all shared CSS */
@import './css-variables.css';
@import './unraid-theme.css';
@import './theme-variants.css';
@import './base-utilities.css';
@import './sonner.css';
@import './reka-resets.css';

View File

@@ -0,0 +1,21 @@
/*
* Minimal resets for reka-ui components
* Only override the problematic webgui button styles
*/
/* Target all reka-ui buttons by their common attributes */
button[id^="reka-accordion-trigger"],
button[role="combobox"],
button[aria-haspopup="menu"],
[role="dialog"] button[type="button"] {
/* Only override the truly problematic styles */
font-family: inherit !important; /* Don't force clear-sans */
font-size: inherit !important; /* Don't force 1.1rem */
font-weight: normal !important; /* Don't force bold */
letter-spacing: normal !important; /* Don't force 1.8px spacing */
text-transform: none !important; /* Don't force uppercase */
min-width: auto !important; /* Don't force 86px minimum */
margin: 0 !important; /* Don't add 10px margins */
border: none !important; /* Remove forced border */
/* Let components handle their own padding through Tailwind classes */
}

665
@tailwind-shared/sonner.css Normal file
View File

@@ -0,0 +1,665 @@
/**------------------------------------------------------------------------------------------------
* SONNER.CSS
* This is a copy of Sonner's `style.css` as of commit a5b77c2df08d5c05aa923170176168102855533d
*
* This was necessary because I couldn't find a simple way to include Sonner's styles in Vite's
* CSS build output. They wouldn't show up even though the toaster was included, and vue-sonner
* currently doesn't export its stylesheet (it appears to be inlined, but styles weren't applied
* to the unraid-toaster component for some reason).
*------------------------------------------------------------------------------------------------**/
:where(html[dir='ltr']),
:where([data-sonner-toaster][dir='ltr']) {
--toast-icon-margin-start: -3px;
--toast-icon-margin-end: 4px;
--toast-svg-margin-start: -1px;
--toast-svg-margin-end: 0px;
--toast-button-margin-start: auto;
--toast-button-margin-end: 0;
--toast-close-button-start: 0;
--toast-close-button-end: unset;
--toast-close-button-transform: translate(-35%, -35%);
}
:where(html[dir='rtl']),
:where([data-sonner-toaster][dir='rtl']) {
--toast-icon-margin-start: 4px;
--toast-icon-margin-end: -3px;
--toast-svg-margin-start: 0px;
--toast-svg-margin-end: -1px;
--toast-button-margin-start: 0;
--toast-button-margin-end: auto;
--toast-close-button-start: unset;
--toast-close-button-end: 0;
--toast-close-button-transform: translate(35%, -35%);
}
:where([data-sonner-toaster]) {
position: fixed;
width: var(--width);
font-family: ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, Segoe UI, Roboto, Helvetica Neue, Arial,
Noto Sans, sans-serif, Apple Color Emoji, Segoe UI Emoji, Segoe UI Symbol, Noto Color Emoji;
--gray1: hsl(0, 0%, 99%);
--gray2: hsl(0, 0%, 97.3%);
--gray3: hsl(0, 0%, 95.1%);
--gray4: hsl(0, 0%, 93%);
--gray5: hsl(0, 0%, 90.9%);
--gray6: hsl(0, 0%, 88.7%);
--gray7: hsl(0, 0%, 85.8%);
--gray8: hsl(0, 0%, 78%);
--gray9: hsl(0, 0%, 56.1%);
--gray10: hsl(0, 0%, 52.3%);
--gray11: hsl(0, 0%, 43.5%);
--gray12: hsl(0, 0%, 9%);
--border-radius: 8px;
box-sizing: border-box;
padding: 0;
margin: 0;
list-style: none;
outline: none;
z-index: 999999999;
transition: transform 400ms ease;
}
:where([data-sonner-toaster][data-lifted='true']) {
transform: translateY(-10px);
}
@media (hover: none) and (pointer: coarse) {
:where([data-sonner-toaster][data-lifted='true']) {
transform: none;
}
}
:where([data-sonner-toaster][data-x-position='right']) {
right: max(var(--offset), env(safe-area-inset-right));
}
:where([data-sonner-toaster][data-x-position='left']) {
left: max(var(--offset), env(safe-area-inset-left));
}
:where([data-sonner-toaster][data-x-position='center']) {
left: 50%;
transform: translateX(-50%);
}
:where([data-sonner-toaster][data-y-position='top']) {
top: max(var(--offset), env(safe-area-inset-top));
}
:where([data-sonner-toaster][data-y-position='bottom']) {
bottom: max(var(--offset), env(safe-area-inset-bottom));
}
:where([data-sonner-toast]) {
--y: translateY(100%);
--lift-amount: calc(var(--lift) * var(--gap));
z-index: var(--z-index);
position: absolute;
opacity: 0;
transform: var(--y);
filter: blur(0);
/* https://stackoverflow.com/questions/48124372/pointermove-event-not-working-with-touch-why-not */
touch-action: none;
transition: transform 400ms, opacity 400ms, height 400ms, box-shadow 200ms;
box-sizing: border-box;
outline: none;
overflow-wrap: anywhere;
}
:where([data-sonner-toast][data-styled='true']) {
padding: 16px;
background: var(--normal-bg);
border: 1px solid var(--normal-border);
color: var(--normal-text);
border-radius: var(--border-radius);
box-shadow: 0px 4px 12px rgba(0, 0, 0, 0.1);
width: var(--width);
font-size: 13px;
display: flex;
align-items: center;
gap: 6px;
}
:where([data-sonner-toast]:focus-visible) {
box-shadow: 0px 4px 12px rgba(0, 0, 0, 0.1), 0 0 0 2px rgba(0, 0, 0, 0.2);
}
:where([data-sonner-toast][data-y-position='top']) {
top: 0;
--y: translateY(-100%);
--lift: 1;
--lift-amount: calc(1 * var(--gap));
}
:where([data-sonner-toast][data-y-position='bottom']) {
bottom: 0;
--y: translateY(100%);
--lift: -1;
--lift-amount: calc(var(--lift) * var(--gap));
}
:where([data-sonner-toast]) :where([data-description]) {
font-weight: 400;
line-height: 1.4;
color: inherit;
}
:where([data-sonner-toast]) :where([data-title]) {
font-weight: 500;
line-height: 1.5;
color: inherit;
}
:where([data-sonner-toast]) :where([data-icon]) {
display: flex;
height: 16px;
width: 16px;
position: relative;
justify-content: flex-start;
align-items: center;
flex-shrink: 0;
margin-left: var(--toast-icon-margin-start);
margin-right: var(--toast-icon-margin-end);
}
:where([data-sonner-toast][data-promise='true']) :where([data-icon]) > svg {
opacity: 0;
transform: scale(0.8);
transform-origin: center;
animation: sonner-fade-in 300ms ease forwards;
}
:where([data-sonner-toast]) :where([data-icon]) > * {
flex-shrink: 0;
}
:where([data-sonner-toast]) :where([data-icon]) svg {
margin-left: var(--toast-svg-margin-start);
margin-right: var(--toast-svg-margin-end);
}
:where([data-sonner-toast]) :where([data-content]) {
display: flex;
flex-direction: column;
gap: 2px;
}
[data-sonner-toast][data-styled='true'] [data-button] {
border-radius: 4px;
padding-left: 8px;
padding-right: 8px;
height: 24px;
font-size: 12px;
color: var(--normal-bg);
background: var(--normal-text);
margin-left: var(--toast-button-margin-start);
margin-right: var(--toast-button-margin-end);
border: none;
cursor: pointer;
outline: none;
display: flex;
align-items: center;
flex-shrink: 0;
transition: opacity 400ms, box-shadow 200ms;
}
:where([data-sonner-toast]) :where([data-button]):focus-visible {
box-shadow: 0 0 0 2px rgba(0, 0, 0, 0.4);
}
:where([data-sonner-toast]) :where([data-button]):first-of-type {
margin-left: var(--toast-button-margin-start);
margin-right: var(--toast-button-margin-end);
}
:where([data-sonner-toast]) :where([data-cancel]) {
color: var(--normal-text);
background: rgba(0, 0, 0, 0.08);
}
:where([data-sonner-toast][data-theme='dark']) :where([data-cancel]) {
background: rgba(255, 255, 255, 0.3);
}
[data-sonner-toast] [data-close-button] {
position: absolute;
left: var(--toast-close-button-start);
right: var(--toast-close-button-end);
top: 0;
height: 20px;
width: 20px;
display: flex;
justify-content: center;
align-items: center;
padding: 0;
color: hsl(var(--foreground));
border: 1px solid hsl(var(--border));
transform: var(--toast-close-button-transform);
border-radius: 50%;
cursor: pointer;
z-index: 1;
transition: opacity 100ms, background 200ms, border-color 200ms;
}
[data-sonner-toast] [data-close-button] {
background: hsl(var(--background));
}
:where([data-sonner-toast]) :where([data-close-button]):focus-visible {
box-shadow: 0px 4px 12px rgba(0, 0, 0, 0.1), 0 0 0 2px rgba(0, 0, 0, 0.2);
}
:where([data-sonner-toast]) :where([data-disabled='true']) {
cursor: not-allowed;
}
[data-sonner-toast]:hover [data-close-button]:hover {
background: hsl(var(--muted));
border-color: hsl(var(--border));
}
/* Leave a ghost div to avoid setting hover to false when swiping out */
:where([data-sonner-toast][data-swiping='true'])::before {
content: '';
position: absolute;
left: 0;
right: 0;
height: 100%;
z-index: -1;
}
:where([data-sonner-toast][data-y-position='top'][data-swiping='true'])::before {
/* y 50% needed to distribute the additional height evenly */
bottom: 50%;
transform: scaleY(3) translateY(50%);
}
:where([data-sonner-toast][data-y-position='bottom'][data-swiping='true'])::before {
/* y -50% needed to distribute the additional height evenly */
top: 50%;
transform: scaleY(3) translateY(-50%);
}
/* Leave a ghost div to avoid setting hover to false when transitioning out */
:where([data-sonner-toast][data-swiping='false'][data-removed='true'])::before {
content: '';
position: absolute;
inset: 0;
transform: scaleY(2);
}
/* Needed to avoid setting hover to false when in between toasts */
:where([data-sonner-toast])::after {
content: '';
position: absolute;
left: 0;
height: calc(var(--gap) + 1px);
bottom: 100%;
width: 100%;
}
:where([data-sonner-toast][data-mounted='true']) {
--y: translateY(0);
opacity: 1;
}
:where([data-sonner-toast][data-expanded='false'][data-front='false']) {
--scale: var(--toasts-before) * 0.05 + 1;
--y: translateY(calc(var(--lift-amount) * var(--toasts-before))) scale(calc(-1 * var(--scale)));
height: var(--front-toast-height);
}
:where([data-sonner-toast]) > * {
transition: opacity 400ms;
}
:where([data-sonner-toast][data-expanded='false'][data-front='false'][data-styled='true']) > * {
opacity: 0;
}
:where([data-sonner-toast][data-visible='false']) {
opacity: 0;
pointer-events: none;
}
:where([data-sonner-toast][data-mounted='true'][data-expanded='true']) {
--y: translateY(calc(var(--lift) * var(--offset)));
height: var(--initial-height);
}
:where([data-sonner-toast][data-removed='true'][data-front='true'][data-swipe-out='false']) {
--y: translateY(calc(var(--lift) * -100%));
opacity: 0;
}
:where([data-sonner-toast][data-removed='true'][data-front='false'][data-swipe-out='false'][data-expanded='true']) {
--y: translateY(calc(var(--lift) * var(--offset) + var(--lift) * -100%));
opacity: 0;
}
:where([data-sonner-toast][data-removed='true'][data-front='false'][data-swipe-out='false'][data-expanded='false']) {
--y: translateY(40%);
opacity: 0;
transition: transform 500ms, opacity 200ms;
}
/* Bump up the height to make sure hover state doesn't get set to false */
:where([data-sonner-toast][data-removed='true'][data-front='false'])::before {
height: calc(var(--initial-height) + 20%);
}
[data-sonner-toast][data-swiping='true'] {
transform: var(--y) translateY(var(--swipe-amount, 0px));
transition: none;
}
[data-sonner-toast][data-swiped='true'] {
user-select: none;
}
[data-sonner-toast][data-swipe-out='true'][data-y-position='bottom'],
[data-sonner-toast][data-swipe-out='true'][data-y-position='top'] {
animation: swipe-out 200ms ease-out forwards;
}
@keyframes swipe-out {
from {
transform: translateY(calc(var(--lift) * var(--offset) + var(--swipe-amount)));
opacity: 1;
}
to {
transform: translateY(calc(var(--lift) * var(--offset) + var(--swipe-amount) + var(--lift) * -100%));
opacity: 0;
}
}
@media (max-width: 600px) {
[data-sonner-toaster] {
position: fixed;
--mobile-offset: 16px;
right: var(--mobile-offset);
left: var(--mobile-offset);
width: 100%;
}
[data-sonner-toaster][dir='rtl'] {
left: calc(var(--mobile-offset) * -1);
}
[data-sonner-toaster] [data-sonner-toast] {
left: 0;
right: 0;
width: calc(100% - var(--mobile-offset) * 2);
}
[data-sonner-toaster][data-x-position='left'] {
left: var(--mobile-offset);
}
[data-sonner-toaster][data-y-position='bottom'] {
bottom: 20px;
}
[data-sonner-toaster][data-y-position='top'] {
top: 20px;
}
[data-sonner-toaster][data-x-position='center'] {
left: var(--mobile-offset);
right: var(--mobile-offset);
transform: none;
}
}
[data-sonner-toaster][data-theme='light'] {
--normal-bg: hsl(var(--background));
--normal-border: hsl(var(--border));
--normal-text: hsl(var(--foreground));
--success-bg: hsl(143, 85%, 96%);
--success-border: hsl(145, 92%, 91%);
--success-text: hsl(140, 100%, 27%);
--info-bg: hsl(208, 100%, 97%);
--info-border: hsl(221, 91%, 91%);
--info-text: hsl(210, 92%, 45%);
--warning-bg: hsl(49, 100%, 97%);
--warning-border: hsl(49, 91%, 91%);
--warning-text: hsl(31, 92%, 45%);
--error-bg: hsl(359, 100%, 97%);
--error-border: hsl(359, 100%, 94%);
--error-text: hsl(360, 100%, 45%);
}
[data-sonner-toaster][data-theme='light'] [data-sonner-toast][data-invert='true'] {
--normal-bg: hsl(0 0% 3.9%);
--normal-border: hsl(0 0% 14.9%);
--normal-text: hsl(0 0% 98%);
}
[data-sonner-toaster][data-theme='dark'] [data-sonner-toast][data-invert='true'] {
--normal-bg: hsl(0 0% 100%);
--normal-border: hsl(0 0% 89.8%);
--normal-text: hsl(0 0% 3.9%);
}
[data-sonner-toaster][data-theme='dark'] {
--normal-bg: hsl(var(--background));
--normal-border: hsl(var(--border));
--normal-text: hsl(var(--foreground));
--success-bg: hsl(150, 100%, 6%);
--success-border: hsl(147, 100%, 12%);
--success-text: hsl(150, 86%, 65%);
--info-bg: hsl(215, 100%, 6%);
--info-border: hsl(223, 100%, 12%);
--info-text: hsl(216, 87%, 65%);
--warning-bg: hsl(64, 100%, 6%);
--warning-border: hsl(60, 100%, 12%);
--warning-text: hsl(46, 87%, 65%);
--error-bg: hsl(358, 76%, 10%);
--error-border: hsl(357, 89%, 16%);
--error-text: hsl(358, 100%, 81%);
}
[data-rich-colors='true'][data-sonner-toast][data-type='success'] {
background: var(--success-bg);
border-color: var(--success-border);
color: var(--success-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='success'] [data-close-button] {
background: var(--success-bg);
border-color: var(--success-border);
color: var(--success-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='info'] {
background: var(--info-bg);
border-color: var(--info-border);
color: var(--info-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='info'] [data-close-button] {
background: var(--info-bg);
border-color: var(--info-border);
color: var(--info-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='warning'] {
background: var(--warning-bg);
border-color: var(--warning-border);
color: var(--warning-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='warning'] [data-close-button] {
background: var(--warning-bg);
border-color: var(--warning-border);
color: var(--warning-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='error'] {
background: var(--error-bg);
border-color: var(--error-border);
color: var(--error-text);
}
[data-rich-colors='true'][data-sonner-toast][data-type='error'] [data-close-button] {
background: var(--error-bg);
border-color: var(--error-border);
color: var(--error-text);
}
.sonner-loading-wrapper {
--size: 16px;
height: var(--size);
width: var(--size);
position: absolute;
inset: 0;
z-index: 10;
}
.sonner-loading-wrapper[data-visible='false'] {
transform-origin: center;
animation: sonner-fade-out 0.2s ease forwards;
}
.sonner-spinner {
position: relative;
top: 50%;
left: 50%;
height: var(--size);
width: var(--size);
}
.sonner-loading-bar {
animation: sonner-spin 1.2s linear infinite;
background: hsl(var(--muted-foreground));
border-radius: 6px;
height: 8%;
left: -10%;
position: absolute;
top: -3.9%;
width: 24%;
}
.sonner-loading-bar:nth-child(1) {
animation-delay: -1.2s;
transform: rotate(0.0001deg) translate(146%);
}
.sonner-loading-bar:nth-child(2) {
animation-delay: -1.1s;
transform: rotate(30deg) translate(146%);
}
.sonner-loading-bar:nth-child(3) {
animation-delay: -1s;
transform: rotate(60deg) translate(146%);
}
.sonner-loading-bar:nth-child(4) {
animation-delay: -0.9s;
transform: rotate(90deg) translate(146%);
}
.sonner-loading-bar:nth-child(5) {
animation-delay: -0.8s;
transform: rotate(120deg) translate(146%);
}
.sonner-loading-bar:nth-child(6) {
animation-delay: -0.7s;
transform: rotate(150deg) translate(146%);
}
.sonner-loading-bar:nth-child(7) {
animation-delay: -0.6s;
transform: rotate(180deg) translate(146%);
}
.sonner-loading-bar:nth-child(8) {
animation-delay: -0.5s;
transform: rotate(210deg) translate(146%);
}
.sonner-loading-bar:nth-child(9) {
animation-delay: -0.4s;
transform: rotate(240deg) translate(146%);
}
.sonner-loading-bar:nth-child(10) {
animation-delay: -0.3s;
transform: rotate(270deg) translate(146%);
}
.sonner-loading-bar:nth-child(11) {
animation-delay: -0.2s;
transform: rotate(300deg) translate(146%);
}
.sonner-loading-bar:nth-child(12) {
animation-delay: -0.1s;
transform: rotate(330deg) translate(146%);
}
@keyframes sonner-fade-in {
0% {
opacity: 0;
transform: scale(0.8);
}
100% {
opacity: 1;
transform: scale(1);
}
}
@keyframes sonner-fade-out {
0% {
opacity: 1;
transform: scale(1);
}
100% {
opacity: 0;
transform: scale(0.8);
}
}
@keyframes sonner-spin {
0% {
opacity: 1;
}
100% {
opacity: 0.15;
}
}
@media (prefers-reduced-motion) {
[data-sonner-toast],
[data-sonner-toast] > *,
.sonner-loading-bar {
transition: none !important;
animation: none !important;
}
}
.sonner-loader {
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
transform-origin: center;
transition: opacity 200ms, transform 200ms;
}
.sonner-loader[data-visible='false'] {
opacity: 0;
transform: scale(0.8) translate(-50%, -50%);
}

View File

@@ -1,47 +0,0 @@
/**
* Tailwind v4 Theme Variants
* Defines theme-specific CSS variables that can be switched via classes
* These are applied dynamically based on the theme selected in GraphQL
*/
/* Default/White Theme */
.Theme--white {
--color-border: #383735;
--color-alpha: #ff8c2f;
--color-beta: #1c1b1b;
--color-gamma: #ffffff;
--color-gamma-opaque: rgba(255, 255, 255, 0.3);
}
/* Black Theme */
.Theme--black,
.Theme--black.dark {
--color-border: #e0e0e0;
--color-alpha: #ff8c2f;
--color-beta: #f2f2f2;
--color-gamma: #1c1b1b;
--color-gamma-opaque: rgba(28, 27, 27, 0.3);
}
/* Gray Theme */
.Theme--gray {
--color-border: #383735;
--color-alpha: #ff8c2f;
--color-beta: #383735;
--color-gamma: #ffffff;
--color-gamma-opaque: rgba(255, 255, 255, 0.3);
}
/* Azure Theme */
.Theme--azure {
--color-border: #5a8bb8;
--color-alpha: #ff8c2f;
--color-beta: #e7f2f8;
--color-gamma: #336699;
--color-gamma-opaque: rgba(51, 102, 153, 0.3);
}
/* Dark Mode Overrides */
.dark {
--color-border: #383735;
}

View File

@@ -84,23 +84,23 @@
--color-primary-900: #7c2d12;
--color-primary-950: #431407;
/* Header colors - defaults will be overridden by theme */
--color-header-text-primary: var(--header-text-primary, #1c1c1c);
--color-header-text-secondary: var(--header-text-secondary, #999999);
--color-header-background: var(--header-background-color, #f2f2f2);
/* Header colors */
--color-header-text-primary: var(--header-text-primary);
--color-header-text-secondary: var(--header-text-secondary);
--color-header-background-color: var(--header-background-color);
/* Legacy colors - defaults (overridden by theme-variants.css) */
--color-alpha: #ff8c2f;
--color-beta: #f2f2f2;
--color-gamma: #999999;
--color-gamma-opaque: rgba(153, 153, 153, 0.5);
--color-customgradient-start: rgba(242, 242, 242, 0);
--color-customgradient-end: rgba(242, 242, 242, 0.85);
/* Legacy colors */
--color-alpha: var(--color-alpha);
--color-beta: var(--color-beta);
--color-gamma: var(--color-gamma);
--color-gamma-opaque: var(--color-gamma-opaque);
--color-customgradient-start: var(--color-customgradient-start);
--color-customgradient-end: var(--color-customgradient-end);
/* Gradients - defaults (overridden by theme-variants.css) */
--color-header-gradient-start: rgba(242, 242, 242, 0);
--color-header-gradient-end: rgba(242, 242, 242, 0.85);
--color-banner-gradient: none;
/* Gradients */
--color-header-gradient-start: var(--header-gradient-start);
--color-header-gradient-end: var(--header-gradient-end);
--color-banner-gradient: var(--banner-gradient);
/* Font sizes */
--font-10px: 10px;
@@ -167,27 +167,6 @@
--max-width-800px: 800px;
--max-width-1024px: 1024px;
/* Container sizes adjusted for 10px base font size (1.6x scale) */
--container-xs: 32rem;
--container-sm: 38.4rem;
--container-md: 44.8rem;
--container-lg: 51.2rem;
--container-xl: 57.6rem;
--container-2xl: 67.2rem;
--container-3xl: 76.8rem;
--container-4xl: 89.6rem;
--container-5xl: 102.4rem;
--container-6xl: 115.2rem;
--container-7xl: 128rem;
/* Extended width scale for max-w-* utilities */
--width-5xl: 102.4rem;
--width-6xl: 115.2rem;
--width-7xl: 128rem;
--width-8xl: 140.8rem;
--width-9xl: 153.6rem;
--width-10xl: 166.4rem;
/* Animations */
--animate-mark-2: mark-2 1.5s ease infinite;
--animate-mark-3: mark-3 1.5s ease infinite;

View File

@@ -7,16 +7,13 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
This is the Unraid API monorepo containing multiple packages that provide API functionality for Unraid servers. It uses pnpm workspaces with the following structure:
- `/api` - Core NestJS API server with GraphQL
- `/web` - Vue 3 frontend application
- `/web` - Nuxt.js frontend application
- `/unraid-ui` - Vue 3 component library
- `/plugin` - Unraid plugin package (.plg)
- `/packages` - Shared packages and API plugins
## Essential Commands
pnpm does not use `--` to pass additional arguments.
For example, to target a specific test file, `pnpm test <file>` is sufficient.
### Development
```bash
@@ -131,6 +128,9 @@ Enables GraphQL playground at `http://tower.local/graphql`
- **Use Mocks Correctly**: Mocks should be used as nouns, not verbs.
#### Vue Component Testing
- This is a Nuxt.js app, but we are testing with Vitest outside of the Nuxt environment
- Nuxt is currently set to auto-import, so some Vue files may need `computed` or `ref` imported explicitly
- Use pnpm when running terminal commands and stay within the `web` directory
- Tests are located under `web/__test__`; run them with `pnpm test`
- Use `mount` from Vue Test Utils for component testing (see the sketch below)
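A minimal sketch of such a test (the component name, file path, and `msg` prop are hypothetical, not taken from this repo):
```typescript
// web/__test__/components/HelloWorld.test.ts (hypothetical path)
import { describe, expect, it } from 'vitest';
import { mount } from '@vue/test-utils';

// Hypothetical component under test; real tests import actual components from web/components.
import HelloWorld from '../../components/HelloWorld.vue';

describe('HelloWorld', () => {
    it('renders the message passed as a prop', () => {
        const wrapper = mount(HelloWorld, {
            props: { msg: 'Hello Unraid' },
        });
        expect(wrapper.text()).toContain('Hello Unraid');
    });
});
```
Run it from the `web` directory with `pnpm test HelloWorld.test.ts` (no `--` needed, per the note above).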

View File

@@ -31,5 +31,3 @@ BYPASS_CORS_CHECKS=true
CHOKIDAR_USEPOLLING=true
LOG_TRANSPORT=console
LOG_LEVEL=trace
ENABLE_NEXT_DOCKER_RELEASE=true
SKIP_CONNECT_PLUGIN_CHECK=true

View File

@@ -42,10 +42,7 @@ export default tseslint.config(
'ignorePackages',
{
js: 'always',
mjs: 'always',
cjs: 'always',
ts: 'never',
tsx: 'never',
ts: 'always',
},
],
'no-restricted-globals': [

3
api/.gitignore vendored
View File

@@ -93,6 +93,3 @@ dev/local-session
# local OIDC config for testing - contains secrets
dev/configs/oidc.local.json
# local api keys
dev/keys/*

View File

@@ -1,294 +1,5 @@
# Changelog
## [4.27.2](https://github.com/unraid/api/compare/v4.27.1...v4.27.2) (2025-11-21)
### Bug Fixes
* issue with header flashing + issue with trial date ([64875ed](https://github.com/unraid/api/commit/64875edbba786a0d1ba0113c9e9a3d38594eafcc))
## [4.27.1](https://github.com/unraid/api/compare/v4.27.0...v4.27.1) (2025-11-21)
### Bug Fixes
* missing translations for expiring trials ([#1800](https://github.com/unraid/api/issues/1800)) ([36c1049](https://github.com/unraid/api/commit/36c104915ece203a3cac9e1a13e0c325e536a839))
* resolve header flash when background color is set ([#1796](https://github.com/unraid/api/issues/1796)) ([dc9a036](https://github.com/unraid/api/commit/dc9a036c73d8ba110029364e0d044dc24c7d0dfa))
## [4.27.0](https://github.com/unraid/api/compare/v4.26.2...v4.27.0) (2025-11-19)
### Features
* remove Unraid API log download functionality ([#1793](https://github.com/unraid/api/issues/1793)) ([e4a9b82](https://github.com/unraid/api/commit/e4a9b8291b049752a9ff59b17ff50cf464fe0535))
### Bug Fixes
* auto-uninstallation of connect api plugin ([#1791](https://github.com/unraid/api/issues/1791)) ([e734043](https://github.com/unraid/api/commit/e7340431a58821ec1b4f5d1b452fba6613b01fa5))
## [4.26.2](https://github.com/unraid/api/compare/v4.26.1...v4.26.2) (2025-11-19)
### Bug Fixes
* **theme:** Missing header background color ([e2fdf6c](https://github.com/unraid/api/commit/e2fdf6cadbd816559b8c82546c2bc771a81ffa9e))
## [4.26.1](https://github.com/unraid/api/compare/v4.26.0...v4.26.1) (2025-11-18)
### Bug Fixes
* **theme:** update theme class naming and scoping logic ([b28ef1e](https://github.com/unraid/api/commit/b28ef1ea334cb4842f01fa992effa7024185c6c9))
## [4.26.0](https://github.com/unraid/api/compare/v4.25.3...v4.26.0) (2025-11-17)
### Features
* add cpu power query & subscription ([#1745](https://github.com/unraid/api/issues/1745)) ([d7aca81](https://github.com/unraid/api/commit/d7aca81c60281bfa47fb9113929c1ead6ed3361b))
* add schema publishing to apollo studio ([#1772](https://github.com/unraid/api/issues/1772)) ([7e13202](https://github.com/unraid/api/commit/7e13202aa1c02803095bb72bb1bcb2472716f53a))
* add workflow_dispatch trigger to schema publishing workflow ([818e7ce](https://github.com/unraid/api/commit/818e7ce997059663e07efcf1dab706bf0d7fc9da))
* apollo studio readme link ([c4cd0c6](https://github.com/unraid/api/commit/c4cd0c63520deec15d735255f38811f0360fe3a1))
* **cli:** make `unraid-api plugins remove` scriptable ([#1774](https://github.com/unraid/api/issues/1774)) ([64eb9ce](https://github.com/unraid/api/commit/64eb9ce9b5d1ff4fb1f08d9963522c5d32221ba7))
* use persisted theme css to fix flashes on header ([#1784](https://github.com/unraid/api/issues/1784)) ([854b403](https://github.com/unraid/api/commit/854b403fbd85220a3012af58ce033cf0b8418516))
### Bug Fixes
* **api:** decode html entities before parsing notifications ([#1768](https://github.com/unraid/api/issues/1768)) ([42406e7](https://github.com/unraid/api/commit/42406e795da1e5b95622951a467722dde72d51a8))
* **connect:** disable api plugin if unraid plugin is absent ([#1773](https://github.com/unraid/api/issues/1773)) ([c264a18](https://github.com/unraid/api/commit/c264a1843cf115e8cc1add1ab4f12fdcc932405a))
* detection of flash backup activation state ([#1769](https://github.com/unraid/api/issues/1769)) ([d18eaf2](https://github.com/unraid/api/commit/d18eaf2364e0c04992c52af38679ff0a0c570440))
* re-add missing header gradient styles ([#1787](https://github.com/unraid/api/issues/1787)) ([f8a6785](https://github.com/unraid/api/commit/f8a6785e9c92f81acaef76ac5eb78a4a769e69da))
* respect OS safe mode in plugin loader ([#1775](https://github.com/unraid/api/issues/1775)) ([92af3b6](https://github.com/unraid/api/commit/92af3b61156cabae70368cf5222a2f7ac5b4d083))
## [4.25.3](https://github.com/unraid/unraid-api/compare/v4.25.2...v4.25.3) (2025-10-22)
### Bug Fixes
* flaky watch on boot drive's dynamix config ([ec7aa06](https://github.com/unraid/unraid-api/commit/ec7aa06d4a5fb1f0e84420266b0b0d7ee09a3663))
## [4.25.2](https://github.com/unraid/api/compare/v4.25.1...v4.25.2) (2025-09-30)
### Bug Fixes
* enhance activation code modal visibility logic ([#1733](https://github.com/unraid/api/issues/1733)) ([e57ec00](https://github.com/unraid/api/commit/e57ec00627e54ce76d903fd0fa8686ad02b393f3))
## [4.25.1](https://github.com/unraid/api/compare/v4.25.0...v4.25.1) (2025-09-30)
### Bug Fixes
* add cache busting to web component extractor ([#1731](https://github.com/unraid/api/issues/1731)) ([0d165a6](https://github.com/unraid/api/commit/0d165a608740505bdc505dcf69fb615225969741))
* Connect won't appear within Apps - Previous Apps ([#1727](https://github.com/unraid/api/issues/1727)) ([d73953f](https://github.com/unraid/api/commit/d73953f8ff3d7425c0aed32d16236ededfd948e1))
## [4.25.0](https://github.com/unraid/api/compare/v4.24.1...v4.25.0) (2025-09-26)
### Features
* add Tailwind scoping plugin and integrate into Vite config ([#1722](https://github.com/unraid/api/issues/1722)) ([b7afaf4](https://github.com/unraid/api/commit/b7afaf463243b073e1ab1083961a16a12ac6c4a3))
* notification filter controls pill buttons ([#1718](https://github.com/unraid/api/issues/1718)) ([661865f](https://github.com/unraid/api/commit/661865f97611cf802f239fde8232f3109281dde6))
### Bug Fixes
* enable auth guard for nested fields - thanks [@ingel81](https://github.com/ingel81) ([7bdeca8](https://github.com/unraid/api/commit/7bdeca8338a3901f15fde06fd7aede3b0c16e087))
* enhance user context validation in auth module ([#1726](https://github.com/unraid/api/issues/1726)) ([cd5eff1](https://github.com/unraid/api/commit/cd5eff11bcb4398581472966cb7ec124eac7ad0a))
## [4.24.1](https://github.com/unraid/api/compare/v4.24.0...v4.24.1) (2025-09-23)
### Bug Fixes
* cleanup leftover removed packages on upgrade ([#1719](https://github.com/unraid/api/issues/1719)) ([9972a5f](https://github.com/unraid/api/commit/9972a5f178f9a251e6c129d85c5f11cfd25e6281))
* enhance version comparison logic in installation script ([d9c561b](https://github.com/unraid/api/commit/d9c561bfebed0c553fe4bfa26b088ae71ca59755))
* issue with incorrect permissions on viewer / other roles ([378cdb7](https://github.com/unraid/api/commit/378cdb7f102f63128dd236c13f1a3745902d5a2c))
## [4.24.0](https://github.com/unraid/api/compare/v4.23.1...v4.24.0) (2025-09-18)
### Features
* improve dom content loading by being more efficient about component mounting ([#1716](https://github.com/unraid/api/issues/1716)) ([d8b166e](https://github.com/unraid/api/commit/d8b166e4b6a718e07783d9c8ac8393b50ec89ae3))
## [4.23.1](https://github.com/unraid/api/compare/v4.23.0...v4.23.1) (2025-09-17)
### Bug Fixes
* cleanup ini parser logic with better fallbacks ([#1713](https://github.com/unraid/api/issues/1713)) ([1691362](https://github.com/unraid/api/commit/16913627de9497a5d2f71edb710cec6e2eb9f890))
## [4.23.0](https://github.com/unraid/api/compare/v4.22.2...v4.23.0) (2025-09-16)
### Features
* add unraid api status manager ([#1708](https://github.com/unraid/api/issues/1708)) ([1d9ce0a](https://github.com/unraid/api/commit/1d9ce0aa3d067726c2c880929408c68f53e13e0d))
### Bug Fixes
* **logging:** remove colorized logs ([#1705](https://github.com/unraid/api/issues/1705)) ([1d2c670](https://github.com/unraid/api/commit/1d2c6701ce56b1d40afdb776065295e9273d08e9))
* no sizeRootFs unless queried ([#1710](https://github.com/unraid/api/issues/1710)) ([9714b21](https://github.com/unraid/api/commit/9714b21c5c07160b92a11512e8b703908adb0620))
* use virtual-modal-container ([#1709](https://github.com/unraid/api/issues/1709)) ([44b4d77](https://github.com/unraid/api/commit/44b4d77d803aa724968307cfa463f7c440791a10))
## [4.22.2](https://github.com/unraid/api/compare/v4.22.1...v4.22.2) (2025-09-15)
### Bug Fixes
* **deps:** pin dependency conventional-changelog-conventionalcommits to 9.1.0 ([#1697](https://github.com/unraid/api/issues/1697)) ([9a86c61](https://github.com/unraid/api/commit/9a86c615da2e975f568922fa012cc29b3f9cde0e))
* **deps:** update dependency filenamify to v7 ([#1703](https://github.com/unraid/api/issues/1703)) ([b80988a](https://github.com/unraid/api/commit/b80988aaabebc4b8dbf2bf31f0764bf2f28e1575))
* **deps:** update graphqlcodegenerator monorepo (major) ([#1689](https://github.com/unraid/api/issues/1689)) ([ba4a43a](https://github.com/unraid/api/commit/ba4a43aec863fc30c47dd17370d74daed7f84703))
* false positive on verify_install script being external shell ([#1704](https://github.com/unraid/api/issues/1704)) ([31a255c](https://github.com/unraid/api/commit/31a255c9281b29df983d0f5d0475cd5a69790a48))
* improve vue mount speed by 10x ([c855caa](https://github.com/unraid/api/commit/c855caa9b2d4d63bead1a992f5c583e00b9ba843))
## [4.22.1](https://github.com/unraid/api/compare/v4.22.0...v4.22.1) (2025-09-12)
### Bug Fixes
* set input color in SSO field rather than inside of the main.css ([01d353f](https://github.com/unraid/api/commit/01d353fa08a3df688b37a495a204605138f7f71d))
## [4.22.0](https://github.com/unraid/api/compare/v4.21.0...v4.22.0) (2025-09-12)
### Features
* improved update ui ([#1691](https://github.com/unraid/api/issues/1691)) ([a59b363](https://github.com/unraid/api/commit/a59b363ebc1e660f854c55d50fc02c823c2fd0cc))
### Bug Fixes
* **deps:** update dependency camelcase-keys to v10 ([#1687](https://github.com/unraid/api/issues/1687)) ([95faeaa](https://github.com/unraid/api/commit/95faeaa2f39bf7bd16502698d7530aaa590b286d))
* **deps:** update dependency p-retry to v7 ([#1608](https://github.com/unraid/api/issues/1608)) ([c782cf0](https://github.com/unraid/api/commit/c782cf0e8710c6690050376feefda3edb30dd549))
* **deps:** update dependency uuid to v13 ([#1688](https://github.com/unraid/api/issues/1688)) ([2fef10c](https://github.com/unraid/api/commit/2fef10c94aae910e95d9f5bcacf7289e2cca6ed9))
* **deps:** update dependency vue-sonner to v2 ([#1475](https://github.com/unraid/api/issues/1475)) ([f95ca9c](https://github.com/unraid/api/commit/f95ca9c9cb69725dcf3bb4bcbd0b558a2074e311))
* display settings fix for languages on less than 7.2-beta.2.3 ([#1696](https://github.com/unraid/api/issues/1696)) ([03dae7c](https://github.com/unraid/api/commit/03dae7ce66b3409593eeee90cd5b56e2a920ca44))
* hide reset help option when sso is being checked ([#1695](https://github.com/unraid/api/issues/1695)) ([222ced7](https://github.com/unraid/api/commit/222ced7518d40c207198a3b8548f0e024bc865b0))
* progressFrame white on black ([0990b89](https://github.com/unraid/api/commit/0990b898bd02c231153157c20d5142e5fd4513cd))
## [4.21.0](https://github.com/unraid/api/compare/v4.20.4...v4.21.0) (2025-09-10)
### Features
* add zsh shell detection to install script ([#1539](https://github.com/unraid/api/issues/1539)) ([50ea2a3](https://github.com/unraid/api/commit/50ea2a3ffb82b30152fb85e0fb9b0d178d596efe))
* **api:** determine if docker container has update ([#1582](https://github.com/unraid/api/issues/1582)) ([e57d81e](https://github.com/unraid/api/commit/e57d81e0735772758bb85e0b3c89dce15c56635e))
### Bug Fixes
* white on white login text ([ae4d3ec](https://github.com/unraid/api/commit/ae4d3ecbc417454ae3c6e02018f8e4c49bbfc902))
## [4.20.4](https://github.com/unraid/api/compare/v4.20.3...v4.20.4) (2025-09-09)
### Bug Fixes
* staging PR plugin fixes + UI issues on 7.2 beta ([b79b44e](https://github.com/unraid/api/commit/b79b44e95c65a124313814ab55b0d0a745a799c7))
## [4.20.3](https://github.com/unraid/api/compare/v4.20.2...v4.20.3) (2025-09-09)
### Bug Fixes
* header background color issues fixed on 7.2 - thanks Nick! ([73c1100](https://github.com/unraid/api/commit/73c1100d0ba396fe4342f8ce7561017ab821e68b))
## [4.20.2](https://github.com/unraid/api/compare/v4.20.1...v4.20.2) (2025-09-09)
### Bug Fixes
* trigger deployment ([a27453f](https://github.com/unraid/api/commit/a27453fda81e4eeb07f257e60516bebbbc27cf7a))
## [4.20.1](https://github.com/unraid/api/compare/v4.20.0...v4.20.1) (2025-09-09)
### Bug Fixes
* adjust header styles to fix flashing and width issues - thanks ZarZ ([4759b3d](https://github.com/unraid/api/commit/4759b3d0b3fb6bc71636f75f807cd6f4f62305d1))
## [4.20.0](https://github.com/unraid/api/compare/v4.19.1...v4.20.0) (2025-09-08)
### Features
* **disks:** add isSpinning field to Disk type ([#1527](https://github.com/unraid/api/issues/1527)) ([193be3d](https://github.com/unraid/api/commit/193be3df3672514be9904e3d4fbdff776470afc0))
### Bug Fixes
* better component loading to prevent per-page strange behavior ([095c222](https://github.com/unraid/api/commit/095c2221c94f144f8ad410a69362b15803765531))
* **deps:** pin dependencies ([#1669](https://github.com/unraid/api/issues/1669)) ([413db4b](https://github.com/unraid/api/commit/413db4bd30a06aa69d3ca86e793782854f822589))
* **plugin:** add fallback for unraid-api stop in deprecation cleanup ([#1668](https://github.com/unraid/api/issues/1668)) ([797bf50](https://github.com/unraid/api/commit/797bf50ec702ebc8244ff71a8ef1a80ea5cd2169))
* prepend 'v' to API version in workflow dispatch inputs ([f0cffbd](https://github.com/unraid/api/commit/f0cffbdc7ac36e7037ab60fe9dddbb2cab4a5e10))
* progress frame background color fix ([#1672](https://github.com/unraid/api/issues/1672)) ([785f1f5](https://github.com/unraid/api/commit/785f1f5eb1a1cc8b41f6eb502e4092d149cfbd80))
* properly override header values ([#1673](https://github.com/unraid/api/issues/1673)) ([aecf70f](https://github.com/unraid/api/commit/aecf70ffad60c83074347d3d6ec23f73acbd1aee))
## [4.19.1](https://github.com/unraid/api/compare/v4.19.0...v4.19.1) (2025-09-05)
### Bug Fixes
* custom path detection to fix setup issues ([#1664](https://github.com/unraid/api/issues/1664)) ([2ecdb99](https://github.com/unraid/api/commit/2ecdb99052f39d89af21bbe7ad3f80b83bb1eaa9))
## [4.19.0](https://github.com/unraid/api/compare/v4.18.2...v4.19.0) (2025-09-04)
### Features
* mount vue apps, not web components ([#1639](https://github.com/unraid/api/issues/1639)) ([88087d5](https://github.com/unraid/api/commit/88087d5201992298cdafa791d5d1b5bb23dcd72b))
### Bug Fixes
* api version json response ([#1653](https://github.com/unraid/api/issues/1653)) ([292bc0f](https://github.com/unraid/api/commit/292bc0fc810a0d0f0cce6813b0631ff25099cc05))
* enhance DOM validation and cleanup in vue-mount-app ([6cf7c88](https://github.com/unraid/api/commit/6cf7c88242f2f4fe9f83871560039767b5b90273))
* enhance getKeyFile function to handle missing key file gracefully ([#1659](https://github.com/unraid/api/issues/1659)) ([728b38a](https://github.com/unraid/api/commit/728b38ac11faeacd39ce9d0157024ad140e29b36))
* info alert docker icon ([#1661](https://github.com/unraid/api/issues/1661)) ([239cdd6](https://github.com/unraid/api/commit/239cdd6133690699348e61f68e485d2b54fdcbdb))
* oidc cache busting issues fixed ([#1656](https://github.com/unraid/api/issues/1656)) ([e204eb8](https://github.com/unraid/api/commit/e204eb80a00ab9242e3dca4ccfc3e1b55a7694b7))
* **plugin:** restore cleanup behavior for unsupported unraid versions ([#1658](https://github.com/unraid/api/issues/1658)) ([534a077](https://github.com/unraid/api/commit/534a07788b76de49e9ba14059a9aed0bf16e02ca))
* UnraidToaster component and update dialog close button ([#1657](https://github.com/unraid/api/issues/1657)) ([44774d0](https://github.com/unraid/api/commit/44774d0acdd25aa33cb60a5d0b4f80777f4068e5))
* vue mounting logic with tests ([#1651](https://github.com/unraid/api/issues/1651)) ([33774aa](https://github.com/unraid/api/commit/33774aa596124a031a7452b62ca4c43743a09951))
## [4.18.2](https://github.com/unraid/api/compare/v4.18.1...v4.18.2) (2025-09-03)
### Bug Fixes
* add missing CPU guest metrics to CPU responses ([#1644](https://github.com/unraid/api/issues/1644)) ([99dbad5](https://github.com/unraid/api/commit/99dbad57d55a256f5f3f850f9a47a6eaa6348065))
* **plugin:** raise minimum unraid os version to 6.12.15 ([#1649](https://github.com/unraid/api/issues/1649)) ([bc15bd3](https://github.com/unraid/api/commit/bc15bd3d7008acb416ac3c6fb1f4724c685ec7e7))
* update GitHub Actions token for workflow trigger ([4d8588b](https://github.com/unraid/api/commit/4d8588b17331afa45ba8caf84fcec8c0ea03591f))
* update OIDC URL validation and add tests ([#1646](https://github.com/unraid/api/issues/1646)) ([c7c3bb5](https://github.com/unraid/api/commit/c7c3bb57ea482633a7acff064b39fbc8d4e07213))
* use shared bg & border color for styled toasts ([#1647](https://github.com/unraid/api/issues/1647)) ([7c3aee8](https://github.com/unraid/api/commit/7c3aee8f3f9ba82ae8c8ed3840c20ab47f3cb00f))
## [4.18.1](https://github.com/unraid/api/compare/v4.18.0...v4.18.1) (2025-09-03)
### Bug Fixes
* OIDC and API Key management issues ([#1642](https://github.com/unraid/api/issues/1642)) ([0fe2c2c](https://github.com/unraid/api/commit/0fe2c2c1c85dcc547e4b1217a3b5636d7dd6d4b4))
* rm redundant emission to `$HOME/.pm2/logs` ([#1640](https://github.com/unraid/api/issues/1640)) ([a8e4119](https://github.com/unraid/api/commit/a8e4119270868a1dabccd405853a7340f8dcd8a5))
## [4.18.0](https://github.com/unraid/api/compare/v4.17.0...v4.18.0) (2025-09-02)
### Features
* **api:** enhance OIDC redirect URI handling in service and tests ([#1618](https://github.com/unraid/api/issues/1618)) ([4e945f5](https://github.com/unraid/api/commit/4e945f5f56ce059eb275a9576caf3194a5df8a90))
### Bug Fixes
* api key creation cli ([#1637](https://github.com/unraid/api/issues/1637)) ([c147a6b](https://github.com/unraid/api/commit/c147a6b5075969e77798210c4a5cfd1fa5b96ae3))
* **cli:** support `--log-level` for `start` and `restart` cmds ([#1623](https://github.com/unraid/api/issues/1623)) ([a1ee915](https://github.com/unraid/api/commit/a1ee915ca52e5a063eccf8facbada911a63f37f6))
* confusing server -&gt; status query ([#1635](https://github.com/unraid/api/issues/1635)) ([9d42b36](https://github.com/unraid/api/commit/9d42b36f74274cad72490da5152fdb98fdc5b89b))
* use unraid css variables in sonner ([#1634](https://github.com/unraid/api/issues/1634)) ([26a95af](https://github.com/unraid/api/commit/26a95af9539d05a837112d62dc6b7dd46761c83f))
## [4.17.0](https://github.com/unraid/api/compare/v4.16.0...v4.17.0) (2025-08-27)

View File

@@ -71,10 +71,6 @@ unraid-api report -vv
If you found this file, you're likely a developer. If you'd like to know more about the API and when it's available, please join [our discord](https://discord.unraid.net/).
## Internationalization
- Run `pnpm --filter @unraid/api i18n:extract` to scan the Nest.js source for translation helper usages and update `src/i18n/en.json` with any new keys. The extractor keeps existing translations intact and appends new keys with their English source text.
## License
Copyright Lime Technology Inc. All rights reserved.

View File

@@ -17,7 +17,6 @@ const config: CodegenConfig = {
URL: 'URL',
Port: 'number',
UUID: 'string',
BigInt: 'number',
},
scalarSchemas: {
URL: 'z.instanceof(URL)',
@@ -25,7 +24,6 @@ const config: CodegenConfig = {
JSON: 'z.record(z.string(), z.any())',
Port: 'z.number()',
UUID: 'z.string()',
BigInt: 'z.number()',
},
},
generates: {

View File

@@ -1,5 +1,5 @@
{
"version": "4.27.2",
"version": "4.17.0",
"extraOrigins": [],
"sandbox": true,
"ssoSubIds": [],

View File

@@ -1,6 +0,0 @@
timestamp=1730937600
event=Hashtag Test
subject=Warning [UNRAID] - #1 OS is cooking
description=Disk 1 temperature has reached #epic # levels of proportion
importance=warning

View File

@@ -1,6 +0,0 @@
timestamp=1730937600
event=Temperature Test
subject=Warning [UNRAID] - High disk temperature detected: 45&#8201;&#176;C
description=Disk 1 temperature has reached 45&#8201;&#176;C (threshold: 40&#8201;&#176;C)<br><br>Current temperatures:<br>Parity - 32&#8201;&#176;C [OK]<br>Disk 1 - 45&#8201;&#176;C [WARNING]<br>Disk 2 - 38&#8201;&#176;C [OK]<br>Cache - 28&#8201;&#176;C [OK]<br><br>Please check cooling system.
importance=warning

View File

@@ -1,247 +0,0 @@
# Feature Flags
Feature flags allow you to conditionally enable or disable functionality in the Unraid API. This is useful for gradually rolling out new features, A/B testing, or keeping experimental code behind flags during development.
## Setting Up Feature Flags
### 1. Define the Feature Flag
Feature flags are defined as environment variables and collected in `src/consts.ts`:
```typescript
// src/environment.ts
export const ENABLE_MY_NEW_FEATURE = process.env.ENABLE_MY_NEW_FEATURE === 'true';
// src/consts.ts
export const FeatureFlags = Object.freeze({
    ENABLE_NEXT_DOCKER_RELEASE,
    ENABLE_MY_NEW_FEATURE, // Add your new flag here
});
```
### 2. Set the Environment Variable
Set the environment variable when running the API:
```bash
ENABLE_MY_NEW_FEATURE=true unraid-api start
```
Or add it to your `.env` file:
```env
ENABLE_MY_NEW_FEATURE=true
```
## Using Feature Flags in GraphQL
### Method 1: @UseFeatureFlag Decorator (Schema-Level)
The `@UseFeatureFlag` decorator conditionally includes or excludes GraphQL fields, queries, and mutations from the schema based on feature flags. When a feature flag is disabled, the field won't appear in the GraphQL schema at all.
```typescript
import { UseFeatureFlag } from '@app/unraid-api/decorators/use-feature-flag.decorator.js';
import { Mutation, Query, Resolver, ResolveField } from '@nestjs/graphql';

@Resolver()
export class MyResolver {
    // Conditionally include a query
    @UseFeatureFlag('ENABLE_MY_NEW_FEATURE')
    @Query(() => String)
    async experimentalQuery() {
        return 'This query only exists when ENABLE_MY_NEW_FEATURE is true';
    }

    // Conditionally include a mutation
    @UseFeatureFlag('ENABLE_MY_NEW_FEATURE')
    @Mutation(() => Boolean)
    async experimentalMutation() {
        return true;
    }

    // Conditionally include a field resolver
    @UseFeatureFlag('ENABLE_MY_NEW_FEATURE')
    @ResolveField(() => String)
    async experimentalField() {
        return 'This field only exists when the flag is enabled';
    }
}
```
**Benefits:**
- Clean schema - disabled features don't appear in GraphQL introspection
- No runtime overhead for disabled features
- Clear feature boundaries
**Use when:**
- You want to completely hide features from the GraphQL schema
- The feature is experimental or in beta
- You're doing a gradual rollout
### Method 2: checkFeatureFlag Function (Runtime)
The `checkFeatureFlag` function provides runtime feature flag checking within resolver methods. It throws a `ForbiddenException` if the feature is disabled.
```typescript
import { checkFeatureFlag } from '@app/unraid-api/utils/feature-flag.helper.js';
import { FeatureFlags } from '@app/consts.js';
import { Args, Query, Resolver, ResolveField } from '@nestjs/graphql';

@Resolver()
export class MyResolver {
    @Query(() => String)
    async myQuery(
        @Args('useNewAlgorithm', { nullable: true }) useNewAlgorithm?: boolean
    ) {
        // Conditionally use new logic based on feature flag
        if (useNewAlgorithm) {
            checkFeatureFlag(FeatureFlags, 'ENABLE_MY_NEW_FEATURE');
            return this.newAlgorithm();
        }
        return this.oldAlgorithm();
    }

    @ResolveField(() => String)
    async dataField() {
        // Check flag at the start of the method
        checkFeatureFlag(FeatureFlags, 'ENABLE_MY_NEW_FEATURE');
        // Feature-specific logic here
        return this.computeExperimentalData();
    }
}
```
**Benefits:**
- More granular control within methods
- Can conditionally execute parts of a method
- Useful for A/B testing scenarios
- Good for gradual migration strategies
**Use when:**
- You need conditional logic within a method
- The field should exist but behavior changes based on the flag
- You're migrating from old to new implementation gradually
## Feature Flag Patterns
### Pattern 1: Complete Feature Toggle
Hide an entire feature behind a flag:
```typescript
@UseFeatureFlag('ENABLE_DOCKER_TEMPLATES')
@Resolver(() => DockerTemplate)
export class DockerTemplateResolver {
// All resolvers in this class are toggled by the flag
}
```
### Pattern 2: Gradual Migration
Migrate from old to new implementation:
```typescript
@Query(() => [Container])
async getContainers(@Args('version') version?: string) {
    if (version === 'v2') {
        checkFeatureFlag(FeatureFlags, 'ENABLE_CONTAINERS_V2');
        return this.getContainersV2();
    }
    return this.getContainersV1();
}
```
### Pattern 3: Beta Features
Mark features as beta:
```typescript
@UseFeatureFlag('ENABLE_BETA_FEATURES')
@ResolveField(() => BetaMetrics, {
    description: 'BETA: Advanced metrics (requires ENABLE_BETA_FEATURES flag)'
})
async betaMetrics() {
    return this.computeBetaMetrics();
}
```
### Pattern 4: Performance Optimizations
Toggle expensive operations:
```typescript
@ResolveField(() => Statistics)
async statistics() {
    const basicStats = await this.getBasicStats();
    try {
        checkFeatureFlag(FeatureFlags, 'ENABLE_ADVANCED_ANALYTICS');
        const advancedStats = await this.getAdvancedStats();
        return { ...basicStats, ...advancedStats };
    } catch {
        // Feature disabled, return only basic stats
        return basicStats;
    }
}
```
## Testing with Feature Flags
When writing tests for feature-flagged code, create a mock to control feature flag values:
```typescript
import { describe, it, vi } from 'vitest';

// Mock the entire consts module
vi.mock('@app/consts.js', async () => {
    const actual = await vi.importActual('@app/consts.js');
    return {
        ...actual,
        FeatureFlags: {
            ENABLE_MY_NEW_FEATURE: true, // Set your test value
            ENABLE_NEXT_DOCKER_RELEASE: false,
        },
    };
});

describe('MyResolver', () => {
    it('should execute new logic when feature is enabled', async () => {
        // Test new behavior with mocked flag
    });
});
```
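It is equally important to cover the disabled state. The sketch below is illustrative only; it assumes `checkFeatureFlag` throws synchronously, as described above:
```typescript
import { describe, expect, it, vi } from 'vitest';
import { ForbiddenException } from '@nestjs/common';

// Force the flag off for this test file (same mocking approach as above).
vi.mock('@app/consts.js', async () => {
    const actual = await vi.importActual('@app/consts.js');
    return {
        ...actual,
        FeatureFlags: { ENABLE_MY_NEW_FEATURE: false },
    };
});

describe('checkFeatureFlag (disabled state)', () => {
    it('throws ForbiddenException when the flag is off', async () => {
        const { checkFeatureFlag } = await import(
            '@app/unraid-api/utils/feature-flag.helper.js'
        );
        const { FeatureFlags } = await import('@app/consts.js');

        expect(() => checkFeatureFlag(FeatureFlags, 'ENABLE_MY_NEW_FEATURE')).toThrow(
            ForbiddenException
        );
    });
});
```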
## Best Practices
1. **Naming Convention**: Use `ENABLE_` prefix for boolean feature flags
2. **Environment Variables**: Always use uppercase with underscores
3. **Documentation**: Document what each feature flag controls
4. **Cleanup**: Remove feature flags once features are stable and fully rolled out
5. **Default State**: New features should default to `false` (disabled)
6. **Granularity**: Keep feature flags focused on a single feature or capability
7. **Testing**: Always test both enabled and disabled states
## Common Use Cases
- **Experimental Features**: Hide unstable features in production
- **Gradual Rollouts**: Enable features for specific environments first
- **A/B Testing**: Toggle between different implementations
- **Performance**: Disable expensive operations when not needed
- **Breaking Changes**: Provide migration path with both old and new behavior
- **Debug Features**: Enable additional logging or debugging tools
## Checking Active Feature Flags
To see which feature flags are currently active:
```typescript
// Log all feature flags on startup
console.log('Active Feature Flags:', FeatureFlags);
```
Or check via GraphQL introspection to see which fields are available based on current flags.
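As a purely illustrative sketch, such an introspection check could be scripted against the endpoint (the host name, API key variable, and field name below are placeholders):
```typescript
// Illustrative only: confirm whether a feature-gated field made it into the
// schema. Host, API key variable, and field name are placeholders.
const query = `
  query {
    __type(name: "Mutation") {
      fields { name }
    }
  }
`;

const response = await fetch('http://tower.local/graphql', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'x-api-key': process.env.UNRAID_API_KEY ?? '',
    },
    body: JSON.stringify({ query }),
});

const { data } = await response.json();
const exposed = data.__type.fields.some(
    (field: { name: string }) => field.name === 'experimentalMutation'
);
console.log('experimentalMutation exposed:', exposed);
```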

View File

@@ -0,0 +1,4 @@
{
    "label": "Unraid API",
    "position": 4
}

View File

@@ -0,0 +1,100 @@
# API Key Authorization Flow
This document describes the self-service API key creation flow for third-party applications.
## Overview
Applications can request API access to an Unraid server by redirecting users to a special authorization page where users can review requested permissions and create an API key with one click.
## Flow
1. **Application initiates request**: The app redirects the user to:
```
https://[unraid-server]/ApiKeyAuthorize?name=MyApp&scopes=docker:read,vm:*&redirect_uri=https://myapp.com/callback&state=abc123
```
2. **User authentication**: If not already logged in, the user is redirected to login first (standard Unraid auth)
3. **Consent screen**: User sees:
- Application name and description
- Requested permissions (with checkboxes to approve/deny specific scopes)
- API key name field (pre-filled)
- Authorize & Cancel buttons
4. **API key creation**: Upon authorization:
- API key is created with approved scopes
- Key is displayed to the user
- If `redirect_uri` is provided, user is redirected back with the key
5. **Callback**: App receives the API key:
```
https://myapp.com/callback?api_key=xxx&state=abc123
```
## Query Parameters
- `name` (required): Name of the requesting application
- `description` (optional): Description of the application
- `scopes` (required): Comma-separated list of requested scopes
- `redirect_uri` (optional): URL to redirect after authorization
- `state` (optional): Opaque value for maintaining state
## Scope Format
Scopes follow the pattern: `resource:action`
### Examples:
- `docker:read` - Read access to Docker
- `vm:*` - Full access to VMs
- `system:update` - Update access to system
- `role:viewer` - Viewer role access
- `role:admin` - Admin role access
### Available Resources:
- `docker`, `vm`, `system`, `share`, `user`, `network`, `disk`, etc.
### Available Actions:
- `create`, `read`, `update`, `delete` or `*` for all
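For illustration, a client could sanity-check its scope list before building the authorization URL. The helper below is hypothetical and not part of the Unraid API:
```typescript
// Hypothetical client-side helper: split a comma-separated scope string
// (e.g. "docker:read,vm:*") into resource/action pairs and sanity-check them.
interface ParsedScope {
    resource: string;
    action: string;
}

function parseScopes(scopes: string): ParsedScope[] {
    return scopes.split(',').map((raw) => {
        const [resource, action] = raw.trim().split(':');
        if (!resource || !action) {
            throw new Error(`Invalid scope "${raw}"; expected "resource:action"`);
        }
        return { resource, action };
    });
}

// Example: [{ resource: 'docker', action: 'read' }, { resource: 'vm', action: '*' }]
console.log(parseScopes('docker:read,vm:*'));
```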
## Security Considerations
1. **HTTPS required**: Redirect URIs must use HTTPS (except localhost for development)
2. **User consent**: Users explicitly approve each permission
3. **Session-based**: Uses existing Unraid authentication session
4. **One-time display**: API keys are shown once and must be saved securely
## Example Integration
```javascript
// JavaScript example
// generateRandomState() and saveApiKey() are app-specific helpers you provide.
const unraidServer = 'tower.local';
const appName = 'My Docker Manager';
const scopes = 'docker:*,system:read';
const redirectUri = 'https://myapp.com/unraid/callback';
const state = generateRandomState();

// Store state for verification
sessionStorage.setItem('oauth_state', state);

// Redirect user to authorization page
window.location.href =
    `https://${unraidServer}/ApiKeyAuthorize?` +
    `name=${encodeURIComponent(appName)}&` +
    `scopes=${encodeURIComponent(scopes)}&` +
    `redirect_uri=${encodeURIComponent(redirectUri)}&` +
    `state=${encodeURIComponent(state)}`;

// Handle callback (runs on the redirect_uri page)
const urlParams = new URLSearchParams(window.location.search);
const apiKey = urlParams.get('api_key');
const returnedState = urlParams.get('state');
if (returnedState === sessionStorage.getItem('oauth_state')) {
    // Save API key securely
    saveApiKey(apiKey);
}
```

api/docs/public/cli.md Normal file
View File

@@ -0,0 +1,210 @@
---
title: CLI Reference
description: Complete reference for all Unraid API CLI commands
sidebar_position: 4
---
# CLI Commands
:::info[Command Structure]
All commands follow the pattern: `unraid-api <command> [options]`
:::
## 🚀 Service Management
### Start
```bash
unraid-api start [--log-level <level>]
```
Starts the Unraid API service.
Options:
- `--log-level`: Set logging level (trace|debug|info|warn|error|fatal)
Alternative: You can also set the log level using the `LOG_LEVEL` environment variable:
```bash
LOG_LEVEL=trace unraid-api start
```
### Stop
```bash
unraid-api stop [--delete]
```
Stops the Unraid API service.
- `--delete`: Optional. Delete the PM2 home directory
### Restart
```bash
unraid-api restart [--log-level <level>]
```
Restarts the Unraid API service.
Options:
- `--log-level`: Set logging level (trace|debug|info|warn|error|fatal)
Alternative: You can also set the log level using the `LOG_LEVEL` environment variable:
```bash
LOG_LEVEL=trace unraid-api restart
```
### Logs
```bash
unraid-api logs [-l <lines>]
```
View the API logs.
- `-l, --lines`: Optional. Number of lines to tail (default: 100)
## ⚙️ Configuration Commands
### Config
```bash
unraid-api config
```
Displays current configuration values.
### Switch Environment
```bash
unraid-api switch-env [-e <environment>]
```
Switch between production and staging environments.
- `-e, --environment`: Optional. Target environment (production|staging)
### Developer Mode
:::tip Web GUI Management
You can also manage developer options through the web interface at **Settings** → **Management Access** → **Developer Options**
:::
```bash
unraid-api developer # Interactive prompt for tools
unraid-api developer --sandbox true # Enable GraphQL sandbox
unraid-api developer --sandbox false # Disable GraphQL sandbox
unraid-api developer --enable-modal # Enable modal testing tool
unraid-api developer --disable-modal # Disable modal testing tool
```
Configure developer features for the API:
- **GraphQL Sandbox**: Enable/disable Apollo GraphQL sandbox at `/graphql`
- **Modal Testing Tool**: Enable/disable UI modal testing in the Unraid menu
## API Key Management
:::tip Web GUI Management
You can also manage API keys through the web interface at **Settings** → **Management Access** → **API Keys**
:::
### API Key Commands
```bash
unraid-api apikey [options]
```
Create and manage API keys via CLI.
Options:
- `--name <name>`: Name of the key
- `--create`: Create a new key
- `-r, --roles <roles>`: Comma-separated list of roles
- `-p, --permissions <permissions>`: Comma-separated list of permissions
- `-d, --description <description>`: Description for the key
## SSO (Single Sign-On) Management
:::info OIDC Configuration
For OIDC/SSO provider configuration, see the web interface at **Settings** → **Management Access** → **API** → **OIDC** or refer to the [OIDC Provider Setup](./oidc-provider-setup.md) guide.
:::
### SSO Base Command
```bash
unraid-api sso
```
#### Add SSO User
```bash
unraid-api sso add-user
# or
unraid-api sso add
# or
unraid-api sso a
```
Add a new user for SSO authentication.
#### Remove SSO User
```bash
unraid-api sso remove-user
# or
unraid-api sso remove
# or
unraid-api sso r
```
Remove a user (or all users) from SSO.
#### List SSO Users
```bash
unraid-api sso list-users
# or
unraid-api sso list
# or
unraid-api sso l
```
List all configured SSO users.
#### Validate SSO Token
```bash
unraid-api sso validate-token <token>
# or
unraid-api sso validate
# or
unraid-api sso v
```
Validates an SSO token and returns its status.
## Report Generation
### Generate Report
```bash
unraid-api report [-r] [-j]
```
Generate a system report.
- `-r, --raw`: Display raw command output
- `-j, --json`: Display output in JSON format
## Notes
1. Most commands require appropriate permissions to modify system state
2. Some commands require the API to be running or stopped
3. Store API keys securely as they provide system access
4. SSO configuration changes may require a service restart

View File

@@ -0,0 +1,255 @@
---
title: Using the Unraid API
description: Learn how to interact with your Unraid server through the GraphQL API
sidebar_position: 2
---
# Using the Unraid API
:::tip[Quick Start]
The Unraid API provides a powerful GraphQL interface for managing your server. This guide covers authentication, common queries, and best practices.
:::
The Unraid API provides a GraphQL interface that allows you to interact with your Unraid server. This guide will help you get started with exploring and using the API.
## 🎮 Enabling the GraphQL Sandbox
### Web GUI Method (Recommended)
:::info[Preferred Method]
Using the Web GUI is the easiest way to enable the GraphQL sandbox.
:::
1. Navigate to **Settings** → **Management Access** → **Developer Options**
2. Enable the **GraphQL Sandbox** toggle
3. Access the GraphQL playground by navigating to:
```txt
http://YOUR_SERVER_IP/graphql
```
### CLI Method
Alternatively, you can enable developer mode using the CLI:
```bash
unraid-api developer --sandbox true
```
Or use the interactive mode:
```bash
unraid-api developer
```
## 🔑 Authentication
:::warning[Required for Most Operations]
Most queries and mutations require authentication. Always include appropriate credentials in your requests.
:::
You can authenticate using:
1. **API Keys** - For programmatic access
2. **Cookies** - Automatic when signed into the WebGUI
3. **SSO/OIDC** - When configured with external providers
### Managing API Keys
<tabs>
<tabItem value="gui" label="Web GUI (Recommended)" default>
Navigate to **Settings** → **Management Access** → **API Keys** in your Unraid web interface to:
- View existing API keys
- Create new API keys
- Manage permissions and roles
- Revoke or regenerate keys
</tabItem>
<tabItem value="cli" label="CLI Method">
You can also use the CLI to create an API key:
```bash
unraid-api apikey --create
```
Follow the prompts to set:
- Name
- Description
- Roles
- Permissions
</tabItem>
</tabs>
### Using API Keys
The generated API key should be included in your GraphQL requests as a header:
```json
{
    "x-api-key": "YOUR_API_KEY"
}
```
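For example, a minimal authenticated request from a script might look like this (the server address, API key, and query are placeholders):
```typescript
// Minimal sketch: send an authenticated GraphQL query with an API key.
const response = await fetch('http://YOUR_SERVER_IP/graphql', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'x-api-key': 'YOUR_API_KEY',
    },
    body: JSON.stringify({
        query: '{ info { os { platform release } } }',
    }),
});

const result = await response.json();
console.log(result.data);
```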
## 📊 Available Schemas
The API provides access to various aspects of your Unraid server:
### System Information
- Query system details including CPU, memory, and OS information
- Monitor system status and health
- Access baseboard and hardware information
### Array Management
- Query array status and configuration
- Manage array operations (start/stop)
- Monitor disk status and health
- Perform parity checks
### Docker Management
- List and manage Docker containers
- Monitor container status
- Manage Docker networks
### Remote Access
- Configure and manage remote access settings
- Handle SSO configuration
- Manage allowed origins
### 💻 Example Queries
#### Check System Status
```graphql
query {
  info {
    os {
      platform
      distro
      release
      uptime
    }
    cpu {
      manufacturer
      brand
      cores
      threads
    }
  }
}
```
#### Monitor Array Status
```graphql
query {
  array {
    state
    capacity {
      disks {
        free
        used
        total
      }
    }
    disks {
      name
      size
      status
      temp
    }
  }
}
```
#### List Docker Containers
```graphql
query {
  dockerContainers {
    id
    names
    state
    status
    autoStart
  }
}
```
## 🏗️ Schema Types
The API includes several core types:
### Base Types
- `Node`: Interface for objects with unique IDs - please see [Object Identification](https://graphql.org/learn/global-object-identification/)
- `JSON`: For complex JSON data
- `DateTime`: For timestamp values
- `Long`: For 64-bit integers
### Resource Types
- `Array`: Array and disk management
- `Docker`: Container and network management
- `Info`: System information
- `Config`: Server configuration
- `Connect`: Remote access settings
### Role-Based Access
Available roles:
- `admin`: Full access
- `connect`: Remote access features
- `guest`: Limited read access
## ✨ Best Practices
:::tip[Pro Tips]
1. Use the Apollo Sandbox to explore the schema and test queries
2. Start with small queries and gradually add fields as needed
3. Monitor your query complexity to maintain performance
4. Use appropriate roles and permissions for your API keys
5. Keep your API keys secure and rotate them periodically
:::
## ⏱️ Rate Limiting
:::caution[Rate Limits]
The API implements rate limiting to prevent abuse. Ensure your applications handle rate limit responses appropriately.
:::
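A hedged sketch of client-side handling is shown below; it assumes rate-limited requests are reported with HTTP status 429, so adjust the check to whatever your server actually returns:
```typescript
// Illustrative retry loop. It assumes rate limiting surfaces as HTTP 429;
// adjust the status check to match the response your server actually sends.
async function queryWithRetry(body: string, attempts = 3): Promise<unknown> {
    for (let attempt = 1; attempt <= attempts; attempt++) {
        const res = await fetch('http://YOUR_SERVER_IP/graphql', {
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'x-api-key': 'YOUR_API_KEY',
            },
            body,
        });
        if (res.status !== 429) {
            return res.json();
        }
        // Back off before retrying (1s, 2s, 3s, ...).
        await new Promise((resolve) => setTimeout(resolve, attempt * 1000));
    }
    throw new Error('Rate limit not cleared after retries');
}
```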
## 🚨 Error Handling
The API returns standard GraphQL errors in the following format:
```json
{
  "errors": [
    {
      "message": "Error description",
      "locations": [...],
      "path": [...]
    }
  ]
}
```
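In practice, check the `errors` array before trusting `data`. A small helper along these lines (a sketch, not part of the API) keeps that consistent:
```typescript
// Sketch of a response helper: surface GraphQL errors instead of silently
// reading partial data. Types are minimal and illustrative.
interface GraphQLError {
    message: string;
    path?: (string | number)[];
}

interface GraphQLResponse<T> {
    data?: T;
    errors?: GraphQLError[];
}

function unwrap<T>(result: GraphQLResponse<T>): T {
    if (result.errors?.length) {
        const details = result.errors
            .map((err) => `${err.path?.join('.') ?? '<root>'}: ${err.message}`)
            .join('; ');
        throw new Error(`GraphQL request failed: ${details}`);
    }
    if (result.data === undefined) {
        throw new Error('GraphQL response contained no data');
    }
    return result.data;
}
```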
## 📚 Additional Resources
:::info[Learn More]
- Use the Apollo Sandbox's schema explorer to browse all available types and fields
- Check the documentation tab in Apollo Sandbox for detailed field descriptions
- Monitor the API's health using `unraid-api status`
- Generate reports using `unraid-api report` for troubleshooting
For more information about specific commands and configuration options, refer to the [CLI documentation](/cli) or run `unraid-api --help`.
:::

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

api/docs/public/index.md Normal file
View File

@@ -0,0 +1,94 @@
---
title: Welcome to Unraid API
description: The official GraphQL API for Unraid Server management and automation
sidebar_position: 1
---
# Welcome to Unraid API
:::tip[What's New]
Starting with Unraid OS v7.2, the API comes built into the operating system - no plugin installation required!
:::
The Unraid API provides a GraphQL interface for programmatic interaction with your Unraid server. It enables automation, monitoring, and integration capabilities.
## 📦 Availability
### ✨ Native Integration (Unraid OS v7.2+)
Starting with Unraid OS v7.2, the API is integrated directly into the operating system:
- No plugin installation required
- Automatically available on system startup
- Deep system integration
- Access through **Settings** → **Management Access** → **API**
### 🔌 Plugin Installation (Pre-7.2 and Advanced Users)
For Unraid versions prior to v7.2 or to access newer API features:
1. Install the Unraid Connect Plugin from Community Apps
2. [Configure the plugin](./how-to-use-the-api.md#enabling-the-graphql-sandbox)
3. Access API functionality through the [GraphQL Sandbox](./how-to-use-the-api.md)
:::info Important Notes
- The Unraid Connect plugin provides the API for pre-7.2 versions
- You do NOT need to sign in to Unraid Connect to use the API locally
- Installing the plugin on 7.2+ gives you access to newer API features before they're included in OS releases
:::
## 📚 Documentation Sections
<cards>
<card title="CLI Commands" icon="terminal" href="./cli">
Complete reference for all CLI commands
</card>
<card title="Using the API" icon="code" href="./how-to-use-the-api">
Learn how to interact with the GraphQL API
</card>
<card title="OIDC Setup" icon="shield" href="./oidc-provider-setup">
Configure SSO authentication providers
</card>
<card title="Upcoming Features" icon="rocket" href="./upcoming-features">
See what's coming next
</card>
</cards>
## 🌟 Key Features
:::info[Core Capabilities]
The API provides:
- **GraphQL Interface**: Modern, flexible API with strong typing
- **Authentication**: Multiple methods including API keys, session cookies, and SSO/OIDC
- **Comprehensive Coverage**: Access to system information, array management, and Docker operations
- **Developer Tools**: Built-in GraphQL sandbox configurable via web interface or CLI
- **Role-Based Access**: Granular permission control
- **Web Management**: Manage API keys and settings through the web interface
:::
## 🚀 Get Started
<tabs>
<tabItem value="v72" label="Unraid OS v7.2+" default>
1. The API is already installed and running
2. Access settings at **Settings** → **Management Access** → **API**
3. Enable the GraphQL Sandbox for development
4. Create your first API key
5. Start making GraphQL queries!
</tabItem>
<tabItem value="older" label="Pre-7.2 Versions">
1. Install the Unraid Connect plugin from Community Apps
2. No Unraid Connect login required for local API access
3. Configure the plugin settings
4. Enable the GraphQL Sandbox
5. Start exploring the API!
</tabItem>
</tabs>
For detailed usage instructions, see the [CLI Commands](./cli) reference.

View File

@@ -1 +0,0 @@
# All Content Here has been permanently moved to [Unraid Docs](https://github.com/unraid/docs)

View File

@@ -0,0 +1,420 @@
---
title: OIDC Provider Setup
description: Configure OIDC (OpenID Connect) providers for SSO authentication in Unraid API
sidebar_position: 3
---
# OIDC Provider Setup
:::info[What is OIDC?]
OpenID Connect (OIDC) is an authentication protocol that allows users to sign in using their existing accounts from providers like Google, Microsoft, or your corporate identity provider. It enables Single Sign-On (SSO) for seamless and secure authentication.
:::
This guide walks you through configuring OIDC (OpenID Connect) providers for SSO authentication in the Unraid API using the web interface.
## 🚀 Quick Start
<details open>
<summary><strong>Getting to OIDC Settings</strong></summary>
1. Navigate to your Unraid server's web interface
2. Go to **Settings** → **Management Access** → **API** → **OIDC**
3. You'll see tabs for different providers - click the **+** button to add a new provider
</details>
### OIDC Providers Interface Overview
![Login Page with SSO Options](./images/sso-with-options.png)
*Login page showing traditional login form with SSO options - "Login With Unraid.net" and "Sign in with Google" buttons*
The interface includes:
- **Provider tabs**: Each configured provider (Unraid.net, Google, etc.) appears as a tab
- **Add Provider button**: Click the **+** button to add new providers
- **Authorization Mode dropdown**: Toggle between "simple" and "advanced" modes
- **Simple Authorization section**: Configure allowed email domains and specific addresses
- **Add Item buttons**: Click to add multiple authorization rules
## Understanding Authorization Modes
The interface provides two authorization modes:
### Simple Mode (Recommended)
Simple mode is the easiest way to configure authorization. You can:
- Allow specific email domains (e.g., @company.com)
- Allow specific email addresses
- Configure who can access your Unraid server with minimal setup
**When to use Simple Mode:**
- You want to allow all users from your company domain
- You have a small list of specific users
- You're new to OIDC configuration
<details>
<summary><strong>Advanced Mode</strong></summary>
Advanced mode provides granular control using claim-based rules. You can:
- Create complex authorization rules based on JWT claims
- Use operators like equals, contains, endsWith, startsWith
- Combine multiple conditions with OR/AND logic
- Choose whether ANY rule must pass (OR mode) or ALL rules must pass (AND mode)
**When to use Advanced Mode:**
- You need to check group memberships
- You want to verify multiple claims (e.g., email domain AND verified status)
- You have complex authorization requirements
- You need fine-grained control over how rules are evaluated
</details>
## Authorization Rules
![Authorization Rules Configuration](./images/advanced-rules.png)
*Advanced authorization rules showing JWT claim configuration with email endsWith operator for domain-based access control*
### Simple Mode Examples
#### Allow Company Domain
In Simple Authorization:
- **Allowed Email Domains**: Enter `company.com`
- This allows anyone with @company.com email
#### Allow Specific Users
- **Specific Email Addresses**: Add individual emails
- Click **Add Item** to add multiple addresses
<details>
<summary><strong>Advanced Mode Examples</strong></summary>
#### Authorization Rule Mode
When using multiple rules, you can choose how they're evaluated:
- **OR Mode** (default): User is authorized if ANY rule passes
- **AND Mode**: User is authorized only if ALL rules pass
#### Email Domain with Verification (AND Mode)
To require both email domain AND verification:
1. Set **Authorization Rule Mode** to `AND`
2. Add two rules:
- Rule 1:
- **Claim**: `email`
- **Operator**: `endsWith`
- **Value**: `@company.com`
- Rule 2:
- **Claim**: `email_verified`
- **Operator**: `equals`
- **Value**: `true`
This ensures users must have both a company email AND a verified email address.
#### Group-Based Access (OR Mode)
To allow access to multiple groups:
1. Set **Authorization Rule Mode** to `OR` (default)
2. Add rules for each group:
- **Claim**: `groups`
- **Operator**: `contains`
- **Value**: `admins`
Or add another rule:
- **Claim**: `groups`
- **Operator**: `contains`
- **Value**: `developers`
Users in either `admins` OR `developers` group will be authorized.
#### Multiple Domains
- **Claim**: `email`
- **Operator**: `endsWith`
- **Values**: Add multiple domains (e.g., `company.com`, `subsidiary.com`)
#### Complex Authorization (AND Mode)
For strict security requiring multiple conditions:
1. Set **Authorization Rule Mode** to `AND`
2. Add multiple rules that ALL must pass:
- Email must be from company domain
- Email must be verified
- User must be in specific group
- Account must have 2FA enabled (if claim available)
</details>
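To make the OR/AND semantics concrete, here is an illustrative model of how such claim rules behave. It is not the Unraid API's actual implementation; it simply mirrors the documented operators and rule modes:
```typescript
// Illustrative model of how the advanced-mode rules above behave.
// This is NOT the Unraid API's actual implementation; it only mirrors the
// documented operators and the OR/AND rule modes.
type Operator = 'equals' | 'contains' | 'endsWith' | 'startsWith';

interface Rule {
    claim: string;
    operator: Operator;
    value: string;
}

function ruleMatches(claims: Record<string, unknown>, rule: Rule): boolean {
    const raw = claims[rule.claim];
    // Claims like `groups` may be arrays; normalize everything to strings.
    const values = Array.isArray(raw) ? raw.map(String) : [String(raw ?? '')];
    return values.some((value) => {
        switch (rule.operator) {
            case 'equals':
                return value === rule.value;
            case 'contains':
                return value.includes(rule.value);
            case 'endsWith':
                return value.endsWith(rule.value);
            case 'startsWith':
                return value.startsWith(rule.value);
            default:
                return false;
        }
    });
}

function isAuthorized(
    claims: Record<string, unknown>,
    rules: Rule[],
    mode: 'OR' | 'AND'
): boolean {
    return mode === 'AND'
        ? rules.every((rule) => ruleMatches(claims, rule))
        : rules.some((rule) => ruleMatches(claims, rule));
}

// Mirrors the AND-mode example above: company email AND verified address.
isAuthorized(
    { email: 'alice@company.com', email_verified: true },
    [
        { claim: 'email', operator: 'endsWith', value: '@company.com' },
        { claim: 'email_verified', operator: 'equals', value: 'true' },
    ],
    'AND'
); // => true
```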
<details>
<summary><strong>Configuration Interface Details</strong></summary>
### Provider Tabs
- Each configured provider appears as a tab at the top
- Click a tab to switch between provider configurations
- The **+** button on the right adds a new provider
### Authorization Mode Dropdown
- **simple**: Best for email-based authorization (recommended for most users)
- **advanced**: For complex claim-based rules using JWT claims
### Simple Authorization Fields
When "simple" mode is selected, you'll see:
- **Allowed Email Domains**: Enter domains without @ (e.g., `company.com`)
- Helper text: "Users with emails ending in these domains can login"
- **Specific Email Addresses**: Add individual email addresses
- Helper text: "Only these exact email addresses can login"
- **Add Item** buttons to add multiple entries
### Advanced Authorization Fields
When "advanced" mode is selected, you'll see:
- **Authorization Rule Mode**: Choose `OR` (any rule passes) or `AND` (all rules must pass)
- **Authorization Rules**: Add multiple claim-based rules
- **For each rule**:
- **Claim**: The JWT claim to check
- **Operator**: How to compare (equals, contains, endsWith, startsWith)
- **Value**: What to match against
### Additional Interface Elements
- **Enable Developer Sandbox**: Toggle to enable GraphQL sandbox at `/graphql`
- The interface uses a dark theme for better visibility
- Field validation indicators help ensure correct configuration
</details>
### Required Redirect URI
:::caution[Important Configuration]
All providers must be configured with this exact redirect URI format:
:::
```bash
http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback
```
:::tip
Replace `YOUR_UNRAID_IP` with your actual server IP address (e.g., `192.168.1.100` or `tower.local`).
:::
### Issuer URL Format
The **Issuer URL** field accepts both formats, but **base URL is strongly recommended** for security:
- **Base URL** (recommended): `https://accounts.google.com`
- **Full discovery URL**: `https://accounts.google.com/.well-known/openid-configuration`
**⚠️ Security Note**: Always use the base URL format when possible. The system automatically appends `/.well-known/openid-configuration` for OIDC discovery. Using the full discovery URL directly disables important issuer validation checks and is not recommended by the OpenID Connect specification.
**Examples of correct base URLs:**
- Google: `https://accounts.google.com`
- Microsoft/Azure: `https://login.microsoftonline.com/YOUR_TENANT_ID/v2.0`
- Keycloak: `https://keycloak.example.com/realms/YOUR_REALM`
- Authelia: `https://auth.yourdomain.com`
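Conceptually, discovery just appends the well-known path to the base issuer, as in this sketch:
```typescript
// Sketch: the discovery document lives at a well-known path under the issuer.
function discoveryUrl(issuer: string): string {
    return `${issuer.replace(/\/$/, '')}/.well-known/openid-configuration`;
}

discoveryUrl('https://accounts.google.com');
// => 'https://accounts.google.com/.well-known/openid-configuration'
```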
## ✅ Testing Your Configuration
![Login Page with SSO Buttons](./images/sso-with-options.png)
*Unraid login page displaying both traditional username/password authentication and SSO options with customized provider buttons*
1. Save your provider configuration
2. Log out (if logged in)
3. Navigate to the login page
4. Your configured provider button should appear
5. Click to test the login flow
## 🔧 Troubleshooting
### Common Issues
#### "Provider not found" error
- Ensure the Issuer URL is correct
- Check that the provider supports OIDC discovery (/.well-known/openid-configuration)
#### "Authorization failed"
- In Simple Mode: Check email domains are entered correctly (without @)
- In Advanced Mode:
- Verify claim names match exactly what your provider sends
- Check if Authorization Rule Mode is set correctly (OR vs AND)
- Ensure all required claims are present in the token
- Enable debug logging to see actual claims and rule evaluation
#### "Invalid redirect URI"
- Ensure the redirect URI in your provider matches exactly
- Include the correct port if using a non-standard configuration
- Verify the redirect URI protocol matches your server's configuration (HTTP or HTTPS)
#### Cannot see login button
- Check that at least one authorization rule is configured
- Verify the provider is enabled/saved
### Debug Mode
To troubleshoot issues:
1. Enable debug logging:
```bash
LOG_LEVEL=debug unraid-api start --debug
```
2. Check logs for:
- Received claims from provider
- Authorization rule evaluation
- Token validation errors
## 🔐 Security Best Practices
1. **Use Simple Mode for authorization** - Prevents overly permissive configurations and reduces the risk of misconfiguration
2. **Be specific with authorization** - Don't use overly broad rules
3. **Rotate secrets regularly** - Update client secrets periodically
4. **Test thoroughly** - Verify only intended users can access
## 💡 Need Help?
- Check provider's OIDC documentation
- Review Unraid API logs for detailed error messages
- Ensure your provider supports standard OIDC discovery
- Verify network connectivity between Unraid and provider
## 🏢 Provider-Specific Setup
### Unraid.net Provider
The Unraid.net provider is built-in and pre-configured. You only need to configure authorization rules in the interface.
**Configuration:**
- **Issuer URL**: Pre-configured (built-in provider)
- **Client ID/Secret**: Pre-configured (built-in provider)
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
:::tip[Redirect URI Protocol]
**Match the protocol to your server setup:** Use `http://` if accessing your Unraid server without SSL/TLS (typical for local network access). Use `https://` if you've configured SSL/TLS on your server. Some OIDC providers (like Google) require HTTPS and won't accept HTTP redirect URIs.
:::
Configure authorization rules using Simple Mode (allowed email domains/addresses) or Advanced Mode for complex requirements.
### Google
<details>
<summary><strong>📋 Setup Steps</strong></summary>
Set up OAuth 2.0 credentials in [Google Cloud Console](https://console.cloud.google.com/):
1. Go to **APIs & Services** → **Credentials**
2. Click **Create Credentials** → **OAuth client ID**
3. Choose **Web application** as the application type
4. Add your redirect URI to **Authorized redirect URIs**
5. Configure the OAuth consent screen if prompted
</details>
**Configuration:**
- **Issuer URL**: `https://accounts.google.com`
- **Client ID/Secret**: From your OAuth 2.0 client credentials
- **Required Scopes**: `openid`, `profile`, `email`
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
:::warning[Google Domain Requirements]
**Google requires valid domain names for OAuth redirect URIs.** Local IP addresses and `.local` domains are not accepted. To use Google OAuth with your Unraid server, you'll need:
- **Option 1: Reverse Proxy** - Set up a reverse proxy (like NGINX Proxy Manager or Traefik) with a valid domain name pointing to your Unraid API
- **Option 2: Tailscale** - Use Tailscale to get a valid `*.ts.net` domain that Google will accept
- **Option 3: Dynamic DNS** - Use a DDNS service to get a public domain name for your server
Remember to update your redirect URI in both Google Cloud Console and your Unraid OIDC configuration to use the valid domain.
:::
For Google Workspace domains, use Advanced Mode with the `hd` claim to restrict access to your organization's domain.
### Authelia
Configure an OIDC client in your Authelia `configuration.yml` with client ID `unraid-api`, and generate a hashed secret using the Authelia hash-password command.
**Configuration:**
- **Issuer URL**: `https://auth.yourdomain.com`
- **Client ID**: `unraid-api` (or as configured in Authelia)
- **Client Secret**: Your unhashed secret
- **Required Scopes**: `openid`, `profile`, `email`, `groups`
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
Use Advanced Mode with `groups` claim for group-based authorization.
### Microsoft/Azure AD
Register a new app in [Azure Portal](https://portal.azure.com/) under Azure Active Directory → App registrations. Note the Application ID, create a client secret, and note your tenant ID.
**Configuration:**
- **Issuer URL**: `https://login.microsoftonline.com/YOUR_TENANT_ID/v2.0`
- **Client ID**: Your Application (client) ID
- **Client Secret**: Generated client secret
- **Required Scopes**: `openid`, `profile`, `email`
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
Authorization rules can be configured in the interface using email domains or advanced claims.
### Keycloak
Create a new confidential client in Keycloak Admin Console with `openid-connect` protocol and copy the client secret from the Credentials tab.
**Configuration:**
- **Issuer URL**: `https://keycloak.example.com/realms/YOUR_REALM`
- **Client ID**: `unraid-api` (or as configured in Keycloak)
- **Client Secret**: From Keycloak Credentials tab
- **Required Scopes**: `openid`, `profile`, `email`
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
For role-based authorization, use Advanced Mode with `realm_access.roles` or `resource_access` claims.
### Authentik
Create a new OAuth2/OpenID Provider in Authentik, then create an Application and link it to the provider.
**Configuration:**
- **Issuer URL**: `https://authentik.example.com/application/o/<application_slug>/`
- **Client ID**: From Authentik provider configuration
- **Client Secret**: From Authentik provider configuration
- **Required Scopes**: `openid`, `profile`, `email`
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
Authorization rules can be configured in the interface.
### Okta
Create a new OIDC Web Application in Okta Admin Console and assign appropriate users or groups.
**Configuration:**
- **Issuer URL**: `https://YOUR_DOMAIN.okta.com`
- **Client ID**: From Okta application configuration
- **Client Secret**: From Okta application configuration
- **Required Scopes**: `openid`, `profile`, `email`
- **Redirect URI**: `http://YOUR_UNRAID_IP/graphql/api/auth/oidc/callback`
Authorization rules can be configured in the interface using email domains or advanced claims.

View File

@@ -0,0 +1,172 @@
---
title: Roadmap & Features
description: Current status and upcoming features for the Unraid API
sidebar_position: 10
---
# Roadmap & Features
:::info Development Status
This roadmap outlines completed and planned features for the Unraid API. Features and timelines may change based on development priorities and community feedback.
:::
## Feature Status Legend
| Status | Description |
|--------|-------------|
| ✅ **Done** | Feature is complete and available |
| 🚧 **In Progress** | Currently under active development |
| 📅 **Planned** | Scheduled for future development |
| 💡 **Under Consideration** | Being evaluated for future inclusion |
## Core Infrastructure
### Completed Features ✅
| Feature | Available Since |
|---------|-----------------|
| **API Development Environment Improvements** | v4.0.0 |
| **Include API in Unraid OS** | Unraid v7.2-beta.1 |
| **Separate API from Connect Plugin** | Unraid v7.2-beta.1 |
### Upcoming Features 📅
| Feature | Target Timeline |
|---------|-----------------|
| **Make API Open Source** | Q1 2025 |
| **Developer Tools for Plugins** | Q2 2025 |
## Security & Authentication
### Completed Features ✅
| Feature | Available Since |
|---------|-----------------|
| **Permissions System Rewrite** | v4.0.0 |
| **OIDC/SSO Support** | Unraid v7.2-beta.1 |
### In Development 🚧
- **User Interface Component Library** - Enhanced security components for the UI
## User Interface Improvements
### Planned Features 📅
| Feature | Target Timeline | Description |
|---------|-----------------|-------------|
| **New Settings Pages** | Q2 2025 | Modernized settings interface with improved UX |
| **Custom Theme Creator** | Q2-Q3 2025 | Allow users to create and share custom themes |
| **New Connect Settings Interface** | Q1 2025 | Redesigned Unraid Connect configuration |
## Array Management
### Completed Features ✅
| Feature | Available Since |
|---------|-----------------|
| **Array Status Monitoring** | v4.0.0 |
### Planned Features 📅
| Feature | Target Timeline | Description |
|---------|-----------------|-------------|
| **Storage Pool Creation Interface** | Q2 2025 | Simplified pool creation workflow |
| **Storage Pool Status Interface** | Q2 2025 | Real-time pool health monitoring |
## Docker Integration
### Completed Features ✅
| Feature | Available Since |
|---------|-----------------|
| **Docker Container Status Monitoring** | v4.0.0 |
### Planned Features 📅
| Feature | Target Timeline | Description |
|---------|-----------------|-------------|
| **New Docker Status Interface Design** | Q3 2025 | Modern container management UI |
| **New Docker Status Interface** | Q3 2025 | Implementation of new design |
| **Docker Container Setup Interface** | Q3 2025 | Streamlined container deployment |
| **Docker Compose Support** | TBD | Native docker-compose.yml support |
## Share Management
### Completed Features ✅
| Feature | Available Since |
|---------|-----------------|
| **Array/Cache Share Status Monitoring** | v4.0.0 |
### Under Consideration 💡
- **Storage Share Creation & Settings** - Enhanced share configuration options
- **Storage Share Management Interface** - Unified share management dashboard
## Plugin System
### Planned Features 📅
| Feature | Target Timeline | Description |
|---------|-----------------|-------------|
| **New Plugins Interface** | Q3 2025 | Redesigned plugin management UI |
| **Plugin Management Interface** | TBD | Advanced plugin configuration |
| **Plugin Development Tools** | TBD | SDK and tooling for developers |
## Notifications
### Completed Features ✅
| Feature | Available Since |
|---------|-----------------|
| **Notifications System** | v4.0.0 |
| **Notifications Interface** | v4.0.0 |
---
## Recent Releases
:::info Full Release History
For a complete list of all releases, changelogs, and download links, visit the [Unraid API GitHub Releases](https://github.com/unraid/api/releases) page.
:::
### Unraid v7.2-beta.1 Highlights
- 🎉 **API included in Unraid OS** - Native integration
- 🔐 **OIDC/SSO Support** - Enterprise authentication
- 📦 **Standalone API** - Separated from Connect plugin
### v4.0.0 Highlights
- 🛡️ **Permissions System Rewrite** - Enhanced security
- 📊 **Comprehensive Monitoring** - Array, Docker, and Share status
- 🔔 **Notifications System** - Real-time alerts and notifications
- 🛠️ **Developer Environment** - Improved development tools
## Community Feedback
:::tip Have a Feature Request?
We value community input! Please submit feature requests and feedback through:
- [Unraid Forums](https://forums.unraid.net)
- [GitHub Issues](https://github.com/unraid/api/issues) - API is open source!
:::
## Version Support
| Unraid Version | API Version | Support Status |
|----------------|-------------|----------------|
| Unraid v7.2-beta.1+ | Latest | ✅ Active |
| 7.0 - 7.1.x | v4.x via Plugin | ⚠️ Limited |
| 6.12.x | v4.x via Plugin | ⚠️ Limited |
| < 6.12 | Not Supported | ❌ EOL |
:::warning Legacy Support
Versions prior to Unraid 7.2 require the API to be installed through the Unraid Connect plugin. Some features may not be available on older versions.
:::
:::tip Pre-release Versions
You can always install the Unraid Connect plugin to access pre-release versions of the API and get early access to new features before they're included in Unraid OS releases.
:::

api/ecosystem.config.json Normal file
View File

@@ -0,0 +1,20 @@
{
    "$schema": "https://json.schemastore.org/pm2-ecosystem",
    "apps": [
        {
            "name": "unraid-api",
            "script": "./dist/main.js",
            "cwd": "/usr/local/unraid-api",
            "exec_mode": "fork",
            "wait_ready": true,
            "listen_timeout": 15000,
            "max_restarts": 10,
            "min_uptime": 10000,
            "watch": false,
            "interpreter": "/usr/local/bin/node",
            "ignore_watch": ["node_modules", "src", ".env.*", "myservers.cfg"],
            "log_file": "/var/log/graphql-api.log",
            "kill_timeout": 10000
        }
    ]
}

View File

@@ -139,9 +139,6 @@ type ArrayDisk implements Node {
"""ata | nvme | usb | (others)"""
transport: String
color: ArrayDiskFsColor
"""Whether the disk is currently spinning"""
isSpinning: Boolean
}
interface Node {
@@ -349,9 +346,6 @@ type Disk implements Node {
"""The partitions on the disk"""
partitions: [DiskPartition!]!
"""Whether the disk is spinning or not"""
isSpinning: Boolean!
}
"""The type of interface the disk uses to connect to the system"""
@@ -1050,19 +1044,6 @@ enum ThemeName {
white
}
type ExplicitStatusItem {
name: String!
updateStatus: UpdateStatus!
}
"""Update status of a container."""
enum UpdateStatus {
UP_TO_DATE
UPDATE_AVAILABLE
REBUILD_READY
UNKNOWN
}
type ContainerPort {
ip: String
privatePort: Port
@@ -1093,8 +1074,8 @@ type DockerContainer implements Node {
created: Int!
ports: [ContainerPort!]!
"""Total size of all files in the container (in bytes)"""
sizeRootFs: BigInt
"""Total size of all the files in the container"""
sizeRootFs: Int
labels: JSON
state: ContainerState!
status: String!
@@ -1102,8 +1083,6 @@ type DockerContainer implements Node {
networkSettings: JSON
mounts: [JSON!]
autoStart: Boolean!
isUpdateAvailable: Boolean
isRebuildReady: Boolean
}
enum ContainerState {
@@ -1134,7 +1113,6 @@ type Docker implements Node {
containers(skipCache: Boolean! = false): [DockerContainer!]!
networks(skipCache: Boolean! = false): [DockerNetwork!]!
organizer: ResolvedOrganizerV1!
containerUpdateStatuses: [ExplicitStatusItem!]!
}
type ResolvedOrganizerView {
@@ -1383,25 +1361,6 @@ type CpuLoad {
"""The percentage of time the CPU spent servicing hardware interrupts."""
percentIrq: Float!
"""The percentage of time the CPU spent running virtual machines (guest)."""
percentGuest: Float!
"""The percentage of CPU time stolen by the hypervisor."""
percentSteal: Float!
}
type CpuPackages implements Node {
id: PrefixedID!
"""Total CPU package power draw (W)"""
totalPower: Float!
"""Power draw per package (W)"""
power: [Float!]!
"""Temperature per package (°C)"""
temp: [Float!]!
}
type CpuUtilization implements Node {
@@ -1467,12 +1426,6 @@ type InfoCpu implements Node {
"""CPU feature flags"""
flags: [String!]
"""
Per-package array of core/thread pairs, e.g. [[[0,1],[2,3]], [[4,5],[6,7]]]
"""
topology: [[[Int!]!]!]!
packages: CpuPackages!
}
type MemoryLayout implements Node {
@@ -1673,8 +1626,8 @@ type PackageVersions {
"""npm version"""
npm: String
"""nodemon version"""
nodemon: String
"""pm2 version"""
pm2: String
"""Git version"""
git: String
@@ -2454,7 +2407,6 @@ type Mutation {
setDockerFolderChildren(folderId: String, childrenIds: [String!]!): ResolvedOrganizerV1!
deleteDockerEntries(entryIds: [String!]!): ResolvedOrganizerV1!
moveDockerEntriesToFolder(sourceEntryIds: [String!]!, destinationFolderId: String!): ResolvedOrganizerV1!
refreshDockerDigests: Boolean!
"""Initiates a flash drive backup using a configured remote."""
initiateFlashBackup(input: InitiateFlashBackupInput!): FlashBackupStatus!
@@ -2661,7 +2613,6 @@ type Subscription {
arraySubscription: UnraidArray!
logFile(path: String!): LogFileContent!
systemMetricsCpu: CpuUtilization!
systemMetricsCpuTelemetry: CpuPackages!
systemMetricsMemory: MemoryUtilization!
upsUpdates: UPSDevice!
}

View File

@@ -1257,7 +1257,7 @@ type Versions {
openssl: String
perl: String
php: String
nodemon: String
pm2: String
postfix: String
postgresql: String
python: String

View File

@@ -1,17 +0,0 @@
{
"watch": [
"dist/main.js"
],
"ignore": [
"node_modules",
"src",
".env.*"
],
"exec": "node $UNRAID_API_SERVER_ENTRYPOINT",
"signal": "SIGTERM",
"ext": "js,json",
"restartable": "rs",
"env": {
"NODE_ENV": "production"
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@unraid/api",
"version": "4.27.2",
"version": "4.17.0",
"main": "src/cli/index.ts",
"type": "module",
"corepack": {
@@ -30,8 +30,6 @@
"// GraphQL Codegen": "",
"codegen": "graphql-codegen --config codegen.ts",
"codegen:watch": "graphql-codegen --config codegen.ts --watch",
"// Internationalization": "",
"i18n:extract": "node ./scripts/extract-translations.mjs",
"// Code Quality": "",
"lint": "eslint --config .eslintrc.ts src/",
"lint:fix": "eslint --fix --config .eslintrc.ts src/",
@@ -58,7 +56,7 @@
"@as-integrations/fastify": "2.1.1",
"@fastify/cookie": "11.0.2",
"@fastify/helmet": "13.0.1",
"@graphql-codegen/client-preset": "5.0.0",
"@graphql-codegen/client-preset": "4.8.3",
"@graphql-tools/load-files": "7.0.1",
"@graphql-tools/merge": "9.1.1",
"@graphql-tools/schema": "10.0.25",
@@ -86,7 +84,7 @@
"bytes": "3.1.2",
"cache-manager": "7.2.0",
"cacheable-lookup": "7.0.0",
"camelcase-keys": "10.0.0",
"camelcase-keys": "9.1.3",
"casbin": "5.38.0",
"change-case": "5.4.4",
"chokidar": "4.0.3",
@@ -96,7 +94,7 @@
"command-exists": "1.2.9",
"convert": "5.12.0",
"cookie": "1.0.2",
"cron": "4.3.0",
"cron": "4.3.3",
"cross-fetch": "4.1.0",
"diff": "8.0.2",
"dockerode": "4.0.7",
@@ -105,7 +103,7 @@
"execa": "9.6.0",
"exit-hook": "4.0.0",
"fastify": "5.5.0",
"filenamify": "7.0.0",
"filenamify": "6.0.0",
"fs-extra": "11.3.1",
"glob": "11.0.3",
"global-agent": "3.0.0",
@@ -116,7 +114,6 @@
"graphql-subscriptions": "3.0.0",
"graphql-tag": "2.12.6",
"graphql-ws": "6.0.6",
"html-entities": "^2.6.0",
"ini": "5.0.0",
"ip": "2.0.1",
"jose": "6.0.13",
@@ -129,23 +126,22 @@
"nestjs-pino": "4.4.0",
"node-cache": "5.1.2",
"node-window-polyfill": "1.0.4",
"nodemon": "3.1.10",
"openid-client": "6.6.4",
"p-retry": "7.0.0",
"p-retry": "6.2.1",
"passport-custom": "1.1.1",
"passport-http-header-strategy": "1.1.0",
"path-type": "6.0.0",
"pino": "9.9.0",
"pino-http": "10.5.0",
"pino-pretty": "13.1.1",
"proper-lockfile": "^4.1.2",
"pm2": "6.0.8",
"reflect-metadata": "^0.1.14",
"rxjs": "7.8.2",
"semver": "7.7.2",
"strftime": "0.10.3",
"systeminformation": "5.27.8",
"undici": "7.15.0",
"uuid": "13.0.0",
"uuid": "11.1.0",
"ws": "8.18.3",
"zen-observable-ts": "1.1.0",
"zod": "3.25.76"
@@ -160,14 +156,14 @@
},
"devDependencies": {
"@eslint/js": "9.34.0",
"@graphql-codegen/add": "6.0.0",
"@graphql-codegen/cli": "6.0.0",
"@graphql-codegen/fragment-matcher": "6.0.0",
"@graphql-codegen/add": "5.0.3",
"@graphql-codegen/cli": "5.0.7",
"@graphql-codegen/fragment-matcher": "5.1.0",
"@graphql-codegen/import-types-preset": "3.0.1",
"@graphql-codegen/typed-document-node": "6.0.0",
"@graphql-codegen/typescript": "5.0.0",
"@graphql-codegen/typescript-operations": "5.0.0",
"@graphql-codegen/typescript-resolvers": "5.0.0",
"@graphql-codegen/typed-document-node": "5.1.2",
"@graphql-codegen/typescript": "4.1.6",
"@graphql-codegen/typescript-operations": "4.6.1",
"@graphql-codegen/typescript-resolvers": "4.5.1",
"@graphql-typed-document-node/core": "3.2.0",
"@ianvs/prettier-plugin-sort-imports": "4.6.3",
"@nestjs/testing": "11.1.6",
@@ -189,13 +185,12 @@
"@types/mustache": "4.2.6",
"@types/node": "22.18.0",
"@types/pify": "6.1.0",
"@types/proper-lockfile": "^4.1.4",
"@types/semver": "7.7.0",
"@types/sendmail": "1.4.7",
"@types/stoppable": "1.1.3",
"@types/strftime": "0.9.8",
"@types/supertest": "6.0.3",
"@types/uuid": "11.0.0",
"@types/uuid": "10.0.0",
"@types/ws": "8.18.1",
"@types/wtfnode": "0.10.0",
"@vitest/coverage-v8": "3.2.4",
@@ -205,11 +200,12 @@
"eslint-plugin-no-relative-import-paths": "1.6.1",
"eslint-plugin-prettier": "5.5.4",
"jiti": "2.5.1",
"nodemon": "3.1.10",
"prettier": "3.6.2",
"rollup-plugin-node-externals": "8.1.0",
"supertest": "7.1.4",
"tsx": "4.20.5",
"type-fest": "5.0.0",
"type-fest": "4.41.0",
"typescript": "5.9.2",
"typescript-eslint": "8.41.0",
"unplugin-swc": "1.5.7",

View File

@@ -7,7 +7,7 @@ import { exit } from 'process';
import type { PackageJson } from 'type-fest';
import { $, cd } from 'zx';
import { getDeploymentVersion } from '@app/../scripts/get-deployment-version.js';
import { getDeploymentVersion } from './get-deployment-version.js';
type ApiPackageJson = PackageJson & {
version: string;
@@ -94,7 +94,7 @@ try {
await writeFile('./deploy/pack/package.json', JSON.stringify(parsedPackageJson, null, 4));
// Copy necessary files to the pack directory
await $`cp -r dist README.md .env.* nodemon.json ./deploy/pack/`;
await $`cp -r dist README.md .env.* ecosystem.config.json ./deploy/pack/`;
// Change to the pack directory and install dependencies
cd('./deploy/pack');

View File

@@ -1,162 +0,0 @@
#!/usr/bin/env node
import { readFile, writeFile } from 'node:fs/promises';
import path from 'node:path';
import { glob } from 'glob';
import ts from 'typescript';
const projectRoot = process.cwd();
const sourcePatterns = 'src/**/*.{ts,js}';
const ignorePatterns = [
'**/__tests__/**',
'**/__test__/**',
'**/*.spec.ts',
'**/*.spec.js',
'**/*.test.ts',
'**/*.test.js',
];
const englishLocaleFile = path.resolve(projectRoot, 'src/i18n/en.json');
const identifierTargets = new Set(['t', 'translate']);
const propertyTargets = new Set([
'i18n.t',
'i18n.translate',
'ctx.t',
'this.translate',
'this.i18n.translate',
'this.i18n.t',
]);
function getPropertyChain(node) {
if (ts.isIdentifier(node)) {
return node.text;
}
if (ts.isPropertyAccessExpression(node)) {
const left = getPropertyChain(node.expression);
if (!left) return undefined;
return `${left}.${node.name.text}`;
}
return undefined;
}
function extractLiteral(node) {
if (ts.isStringLiteralLike(node)) {
return node.text;
}
if (ts.isNoSubstitutionTemplateLiteral(node)) {
return node.text;
}
return undefined;
}
function collectKeysFromSource(sourceFile) {
const keys = new Set();
function visit(node) {
if (ts.isCallExpression(node)) {
const expr = node.expression;
let matches = false;
if (ts.isIdentifier(expr) && identifierTargets.has(expr.text)) {
matches = true;
} else if (ts.isPropertyAccessExpression(expr)) {
const chain = getPropertyChain(expr);
if (chain && propertyTargets.has(chain)) {
matches = true;
}
}
if (matches) {
const [firstArg] = node.arguments;
if (firstArg) {
const literal = extractLiteral(firstArg);
if (literal) {
keys.add(literal);
}
}
}
}
ts.forEachChild(node, visit);
}
visit(sourceFile);
return keys;
}
async function loadEnglishCatalog() {
try {
const raw = await readFile(englishLocaleFile, 'utf8');
const parsed = raw.trim() ? JSON.parse(raw) : {};
if (typeof parsed !== 'object' || Array.isArray(parsed)) {
throw new Error('English locale file must contain a JSON object.');
}
return parsed;
} catch (error) {
if (error && error.code === 'ENOENT') {
return {};
}
throw error;
}
}
async function ensureEnglishCatalog(keys) {
const existingCatalog = await loadEnglishCatalog();
const existingKeys = new Set(Object.keys(existingCatalog));
let added = 0;
const combinedKeys = new Set([...existingKeys, ...keys]);
const sortedKeys = Array.from(combinedKeys).sort((a, b) => a.localeCompare(b));
const nextCatalog = {};
for (const key of sortedKeys) {
if (Object.prototype.hasOwnProperty.call(existingCatalog, key)) {
nextCatalog[key] = existingCatalog[key];
} else {
nextCatalog[key] = key;
added += 1;
}
}
const nextJson = `${JSON.stringify(nextCatalog, null, 2)}\n`;
const existingJson = JSON.stringify(existingCatalog, null, 2) + '\n';
if (nextJson !== existingJson) {
await writeFile(englishLocaleFile, nextJson, 'utf8');
}
return added;
}
async function main() {
const files = await glob(sourcePatterns, {
cwd: projectRoot,
ignore: ignorePatterns,
absolute: true,
});
const collectedKeys = new Set();
await Promise.all(
files.map(async (file) => {
const content = await readFile(file, 'utf8');
const sourceFile = ts.createSourceFile(file, content, ts.ScriptTarget.Latest, true);
const keys = collectKeysFromSource(sourceFile);
keys.forEach((key) => collectedKeys.add(key));
}),
);
const added = await ensureEnglishCatalog(collectedKeys);
if (added === 0) {
console.log('[i18n] No new backend translation keys detected.');
} else {
console.log(`[i18n] Added ${added} key(s) to src/i18n/en.json.`);
}
}
main().catch((error) => {
console.error('[i18n] Failed to extract backend translations.', error);
process.exitCode = 1;
});

View File

@@ -4,18 +4,23 @@ import {
getBannerPathIfPresent,
getCasePathIfPresent,
} from '@app/core/utils/images/image-file-helpers.js';
import { loadDynamixConfig } from '@app/store/index.js';
import { loadDynamixConfigFile } from '@app/store/actions/load-dynamix-config-file.js';
import { store } from '@app/store/index.js';
test('get case path returns expected result', async () => {
await expect(getCasePathIfPresent()).resolves.toContain('/dev/dynamix/case-model.png');
});
test('get banner path returns null (state unloaded)', async () => {
await expect(getBannerPathIfPresent()).resolves.toMatchInlineSnapshot('null');
});
test('get banner path returns the banner (state loaded)', async () => {
loadDynamixConfig();
await store.dispatch(loadDynamixConfigFile()).unwrap();
await expect(getBannerPathIfPresent()).resolves.toContain('/dev/dynamix/banner.png');
});
test('get banner path returns null when no banner (state loaded)', async () => {
loadDynamixConfig();
await store.dispatch(loadDynamixConfigFile()).unwrap();
await expect(getBannerPathIfPresent('notabanner.png')).resolves.toMatchInlineSnapshot('null');
});

View File

@@ -1,12 +1,11 @@
import { expect, test, vi } from 'vitest';
import { expect, test } from 'vitest';
import { store } from '@app/store/index.js';
import { FileLoadStatus, StateFileKey } from '@app/store/types.js';
import '@app/core/utils/misc/get-key-file.js';
import '@app/store/modules/emhttp.js';
vi.mock('fs/promises');
test('Before loading key returns null', async () => {
const { getKeyFile } = await import('@app/core/utils/misc/get-key-file.js');
const { status } = store.getState().registration;
@@ -49,70 +48,21 @@ test('Returns empty key if key location is empty', async () => {
await expect(getKeyFile()).resolves.toBe('');
});
test('Returns empty string when key file does not exist (ENOENT)', async () => {
const { readFile } = await import('fs/promises');
// Mock readFile to throw ENOENT error
const readFileMock = vi.mocked(readFile);
readFileMock.mockRejectedValueOnce(
Object.assign(new Error('ENOENT: no such file or directory'), { code: 'ENOENT' })
);
// Clear the module cache and re-import to get fresh module with mock
vi.resetModules();
const { getKeyFile } = await import('@app/core/utils/misc/get-key-file.js');
const { updateEmhttpState } = await import('@app/store/modules/emhttp.js');
const { store: freshStore } = await import('@app/store/index.js');
// Set key file location to a non-existent file
freshStore.dispatch(
updateEmhttpState({
field: StateFileKey.var,
state: {
regFile: '/boot/config/Pro.key',
},
})
);
// Should return empty string when file doesn't exist
await expect(getKeyFile()).resolves.toBe('');
// Clear mock
readFileMock.mockReset();
vi.resetModules();
});
test('Returns decoded key file if key location exists', async () => {
const { readFile } = await import('fs/promises');
// Mock a valid key file content
const mockKeyContent =
'hVs1tLjvC9FiiQsIwIQ7G1KszAcexf0IneThhnmf22SB0dGs5WzRkqMiSMmt2DtR5HOXFUD32YyxuzGeUXmky3zKpSu6xhZNKVg5atGM1OfvkzHBMldI3SeBLuUFSgejLbpNUMdTrbk64JJdbzle4O8wiQgkIpAMIGxeYLwLBD4zHBcfyzq40QnxG--HcX6j25eE0xqa2zWj-j0b0rCAXahJV2a3ySCbPzr1MvfPRTVb0rr7KJ-25R592hYrz4H7Sc1B3p0lr6QUxHE6o7bcYrWKDRtIVoZ8SMPpd1_0gzYIcl5GsDFzFumTXUh8NEnl0Q8hwW1YE-tRc6Y_rrvd7w==';
const binaryContent = Buffer.from(mockKeyContent, 'base64').toString('binary');
const readFileMock = vi.mocked(readFile);
readFileMock.mockResolvedValue(binaryContent);
// Clear the module cache and re-import to get fresh module with mock
vi.resetModules();
const { getKeyFile } = await import('@app/core/utils/misc/get-key-file.js');
const { loadStateFiles } = await import('@app/store/modules/emhttp.js');
const { loadRegistrationKey } = await import('@app/store/modules/registration.js');
const { store: freshStore } = await import('@app/store/index.js');
// Load state files into store
await freshStore.dispatch(loadStateFiles());
await freshStore.dispatch(loadRegistrationKey());
// Check if store has state files loaded
const { status } = freshStore.getState().registration;
expect(status).toBe(FileLoadStatus.LOADED);
const result = await getKeyFile();
expect(result).toBe(
'hVs1tLjvC9FiiQsIwIQ7G1KszAcexf0IneThhnmf22SB0dGs5WzRkqMiSMmt2DtR5HOXFUD32YyxuzGeUXmky3zKpSu6xhZNKVg5atGM1OfvkzHBMldI3SeBLuUFSgejLbpNUMdTrbk64JJdbzle4O8wiQgkIpAMIGxeYLwLBD4zHBcfyzq40QnxG--HcX6j25eE0xqa2zWj-j0b0rCAXahJV2a3ySCbPzr1MvfPRTVb0rr7KJ-25R592hYrz4H7Sc1B3p0lr6QUxHE6o7bcYrWKDRtIVoZ8SMPpd1_0gzYIcl5GsDFzFumTXUh8NEnl0Q8hwW1YE-tRc6Y_rrvd7w'
);
// Clear mock
readFileMock.mockReset();
vi.resetModules();
}, 10000);
test(
'Returns decoded key file if key location exists',
async () => {
const { getKeyFile } = await import('@app/core/utils/misc/get-key-file.js');
const { loadStateFiles } = await import('@app/store/modules/emhttp.js');
const { loadRegistrationKey } = await import('@app/store/modules/registration.js');
// Load state files into store
await store.dispatch(loadStateFiles());
await store.dispatch(loadRegistrationKey());
// Check if store has state files loaded
const { status } = store.getState().registration;
expect(status).toBe(FileLoadStatus.LOADED);
await expect(getKeyFile()).resolves.toMatchInlineSnapshot(
'"hVs1tLjvC9FiiQsIwIQ7G1KszAcexf0IneThhnmf22SB0dGs5WzRkqMiSMmt2DtR5HOXFUD32YyxuzGeUXmky3zKpSu6xhZNKVg5atGM1OfvkzHBMldI3SeBLuUFSgejLbpNUMdTrbk64JJdbzle4O8wiQgkIpAMIGxeYLwLBD4zHBcfyzq40QnxG--HcX6j25eE0xqa2zWj-j0b0rCAXahJV2a3ySCbPzr1MvfPRTVb0rr7KJ-25R592hYrz4H7Sc1B3p0lr6QUxHE6o7bcYrWKDRtIVoZ8SMPpd1_0gzYIcl5GsDFzFumTXUh8NEnl0Q8hwW1YE-tRc6Y_rrvd7w"'
);
},
{ timeout: 10000 }
);

View File

@@ -1,178 +0,0 @@
import { describe, expect, test } from 'vitest';
import {
iniBooleanOrAutoToJsBoolean,
iniBooleanToJsBoolean,
} from '@app/core/utils/parsers/ini-boolean-parser.js';
describe('iniBooleanToJsBoolean', () => {
describe('valid boolean values', () => {
test('returns false for "no"', () => {
expect(iniBooleanToJsBoolean('no')).toBe(false);
});
test('returns false for "false"', () => {
expect(iniBooleanToJsBoolean('false')).toBe(false);
});
test('returns true for "yes"', () => {
expect(iniBooleanToJsBoolean('yes')).toBe(true);
});
test('returns true for "true"', () => {
expect(iniBooleanToJsBoolean('true')).toBe(true);
});
});
describe('malformed values', () => {
test('handles "no*" as false', () => {
expect(iniBooleanToJsBoolean('no*')).toBe(false);
});
test('handles "yes*" as true', () => {
expect(iniBooleanToJsBoolean('yes*')).toBe(true);
});
test('handles "true*" as true', () => {
expect(iniBooleanToJsBoolean('true*')).toBe(true);
});
test('handles "false*" as false', () => {
expect(iniBooleanToJsBoolean('false*')).toBe(false);
});
test('returns undefined for "n0!" (cleans to "n" which is invalid)', () => {
expect(iniBooleanToJsBoolean('n0!')).toBe(undefined);
});
test('returns undefined for "y3s!" (cleans to "ys" which is invalid)', () => {
expect(iniBooleanToJsBoolean('y3s!')).toBe(undefined);
});
test('handles mixed case with extra chars "YES*" as true', () => {
expect(iniBooleanToJsBoolean('YES*')).toBe(true);
});
test('handles mixed case with extra chars "NO*" as false', () => {
expect(iniBooleanToJsBoolean('NO*')).toBe(false);
});
});
describe('default values', () => {
test('returns default value for invalid input when provided', () => {
expect(iniBooleanToJsBoolean('invalid', true)).toBe(true);
expect(iniBooleanToJsBoolean('invalid', false)).toBe(false);
});
test('returns default value for empty string when provided', () => {
expect(iniBooleanToJsBoolean('', true)).toBe(true);
expect(iniBooleanToJsBoolean('', false)).toBe(false);
});
});
describe('undefined fallback cases', () => {
test('returns undefined for invalid input without default', () => {
expect(iniBooleanToJsBoolean('invalid')).toBe(undefined);
});
test('returns undefined for empty string without default', () => {
expect(iniBooleanToJsBoolean('')).toBe(undefined);
});
test('returns undefined for numeric string without default', () => {
expect(iniBooleanToJsBoolean('123')).toBe(undefined);
});
});
});
describe('iniBooleanOrAutoToJsBoolean', () => {
describe('valid boolean values', () => {
test('returns false for "no"', () => {
expect(iniBooleanOrAutoToJsBoolean('no')).toBe(false);
});
test('returns false for "false"', () => {
expect(iniBooleanOrAutoToJsBoolean('false')).toBe(false);
});
test('returns true for "yes"', () => {
expect(iniBooleanOrAutoToJsBoolean('yes')).toBe(true);
});
test('returns true for "true"', () => {
expect(iniBooleanOrAutoToJsBoolean('true')).toBe(true);
});
});
describe('auto value', () => {
test('returns null for "auto"', () => {
expect(iniBooleanOrAutoToJsBoolean('auto')).toBe(null);
});
});
describe('malformed values', () => {
test('handles "no*" as false', () => {
expect(iniBooleanOrAutoToJsBoolean('no*')).toBe(false);
});
test('handles "yes*" as true', () => {
expect(iniBooleanOrAutoToJsBoolean('yes*')).toBe(true);
});
test('handles "auto*" as null', () => {
expect(iniBooleanOrAutoToJsBoolean('auto*')).toBe(null);
});
test('handles "true*" as true', () => {
expect(iniBooleanOrAutoToJsBoolean('true*')).toBe(true);
});
test('handles "false*" as false', () => {
expect(iniBooleanOrAutoToJsBoolean('false*')).toBe(false);
});
test('handles "n0!" as undefined fallback (cleans to "n" which is invalid)', () => {
expect(iniBooleanOrAutoToJsBoolean('n0!')).toBe(undefined);
});
test('handles "a1ut2o!" as null (removes non-alphabetic chars)', () => {
expect(iniBooleanOrAutoToJsBoolean('a1ut2o!')).toBe(null);
});
test('handles mixed case "AUTO*" as null', () => {
expect(iniBooleanOrAutoToJsBoolean('AUTO*')).toBe(null);
});
});
describe('fallback behavior', () => {
test('returns undefined for completely invalid input', () => {
expect(iniBooleanOrAutoToJsBoolean('invalid123')).toBe(undefined);
});
test('returns undefined for empty string', () => {
expect(iniBooleanOrAutoToJsBoolean('')).toBe(undefined);
});
test('returns undefined for numeric string', () => {
expect(iniBooleanOrAutoToJsBoolean('123')).toBe(undefined);
});
test('returns undefined for special characters only', () => {
expect(iniBooleanOrAutoToJsBoolean('!@#$')).toBe(undefined);
});
});
describe('edge cases', () => {
test('handles undefined gracefully', () => {
expect(iniBooleanOrAutoToJsBoolean(undefined as any)).toBe(undefined);
});
test('handles null gracefully', () => {
expect(iniBooleanOrAutoToJsBoolean(null as any)).toBe(undefined);
});
test('handles non-string input gracefully', () => {
expect(iniBooleanOrAutoToJsBoolean(123 as any)).toBe(undefined);
});
});
});

View File

@@ -0,0 +1,5 @@
/* eslint-disable no-undef */
// Dummy process for PM2 testing
setInterval(() => {
// Keep process alive
}, 1000);

View File

@@ -0,0 +1,216 @@
import { existsSync } from 'node:fs';
import { join } from 'node:path';
import { fileURLToPath } from 'node:url';
import { execa } from 'execa';
import pm2 from 'pm2';
import { afterAll, afterEach, beforeAll, beforeEach, describe, expect, it, vi } from 'vitest';
import { isUnraidApiRunning } from '@app/core/utils/pm2/unraid-api-running.js';
const __dirname = fileURLToPath(new URL('.', import.meta.url));
const PROJECT_ROOT = join(__dirname, '../../../../..');
const DUMMY_PROCESS_PATH = join(__dirname, 'dummy-process.js');
const CLI_PATH = join(PROJECT_ROOT, 'dist/cli.js');
const TEST_PROCESS_NAME = 'test-unraid-api';
// Shared PM2 connection state
let pm2Connected = false;
// Helper function to run CLI command (assumes CLI is built)
async function runCliCommand(command: string, options: any = {}) {
return await execa('node', [CLI_PATH, command], options);
}
// Helper to ensure PM2 connection is established
async function ensurePM2Connection() {
if (pm2Connected) return;
return new Promise<void>((resolve, reject) => {
pm2.connect((err) => {
if (err) {
reject(err);
return;
}
pm2Connected = true;
resolve();
});
});
}
// Helper to delete specific test processes (lightweight, reuses connection)
async function deleteTestProcesses() {
if (!pm2Connected) {
// No connection, nothing to clean up
return;
}
const deletePromise = new Promise<void>((resolve) => {
// Delete specific processes we might have created
const processNames = ['unraid-api', TEST_PROCESS_NAME];
let deletedCount = 0;
const deleteNext = () => {
if (deletedCount >= processNames.length) {
resolve();
return;
}
const processName = processNames[deletedCount];
pm2.delete(processName, (deleteErr) => {
// Ignore errors; the process might not exist
deletedCount++;
deleteNext();
});
};
deleteNext();
});
const timeoutPromise = new Promise<void>((resolve) => {
setTimeout(() => resolve(), 3000); // 3 second timeout
});
return Promise.race([deletePromise, timeoutPromise]);
}
// Helper to ensure PM2 is completely clean (heavy cleanup with daemon kill)
async function cleanupAllPM2Processes() {
// First delete test processes if we have a connection
if (pm2Connected) {
await deleteTestProcesses();
}
return new Promise<void>((resolve) => {
// Always connect fresh for daemon kill (in case we weren't connected)
pm2.connect((err) => {
if (err) {
// If we can't connect, assume PM2 is not running
pm2Connected = false;
resolve();
return;
}
// Kill the daemon to ensure fresh state
pm2.killDaemon((killErr) => {
pm2.disconnect();
pm2Connected = false;
// Small delay to let PM2 fully shut down
setTimeout(resolve, 500);
});
});
});
}
describe.skipIf(!!process.env.CI)('PM2 integration tests', () => {
beforeAll(async () => {
// Build the CLI if it doesn't exist (only for CLI tests)
if (!existsSync(CLI_PATH)) {
console.log('Building CLI for integration tests...');
try {
await execa('pnpm', ['build'], {
cwd: PROJECT_ROOT,
stdio: 'inherit',
timeout: 120000, // 2 minute timeout for build
});
} catch (error) {
console.error('Failed to build CLI:', error);
throw new Error(
'Cannot run CLI integration tests without built CLI. Run `pnpm build` first.'
);
}
}
// Only do a full cleanup once at the beginning
await cleanupAllPM2Processes();
}, 150000); // 2.5 minute timeout for setup
afterAll(async () => {
// Only do a full cleanup once at the end
await cleanupAllPM2Processes();
});
afterEach(async () => {
// Lightweight cleanup after each test - just delete our test processes
await deleteTestProcesses();
}, 5000); // 5 second timeout for cleanup
describe('isUnraidApiRunning function', () => {
it('should return false when PM2 is not running the unraid-api process', async () => {
const result = await isUnraidApiRunning();
expect(result).toBe(false);
});
it('should return true when PM2 has unraid-api process running', async () => {
// Ensure PM2 connection
await ensurePM2Connection();
// Start a dummy process with the name 'unraid-api'
await new Promise<void>((resolve, reject) => {
pm2.start(
{
script: DUMMY_PROCESS_PATH,
name: 'unraid-api',
},
(startErr) => {
if (startErr) return reject(startErr);
resolve();
}
);
});
// Give PM2 time to start the process
await new Promise((resolve) => setTimeout(resolve, 2000));
const result = await isUnraidApiRunning();
expect(result).toBe(true);
}, 30000);
it('should return false when unraid-api process is stopped', async () => {
// Ensure PM2 connection
await ensurePM2Connection();
// Start and then stop the process
await new Promise<void>((resolve, reject) => {
pm2.start(
{
script: DUMMY_PROCESS_PATH,
name: 'unraid-api',
},
(startErr) => {
if (startErr) return reject(startErr);
// Stop the process after starting
setTimeout(() => {
pm2.stop('unraid-api', (stopErr) => {
if (stopErr) return reject(stopErr);
resolve();
});
}, 1000);
}
);
});
await new Promise((resolve) => setTimeout(resolve, 1000));
const result = await isUnraidApiRunning();
expect(result).toBe(false);
}, 30000);
it('should handle PM2 connection errors gracefully', async () => {
// Set an invalid PM2_HOME to force connection failure
const originalPM2Home = process.env.PM2_HOME;
process.env.PM2_HOME = '/invalid/path/that/does/not/exist';
const result = await isUnraidApiRunning();
expect(result).toBe(false);
// Restore original PM2_HOME
if (originalPM2Home) {
process.env.PM2_HOME = originalPM2Home;
} else {
delete process.env.PM2_HOME;
}
}, 15000); // 15 second timeout to allow for the Promise.race timeout
});
});

View File

@@ -1,54 +0,0 @@
import { mkdtempSync, rmSync, writeFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';
import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from 'vitest';
describe('isUnraidApiRunning (nodemon pid detection)', () => {
let tempDir: string;
let pidPath: string;
beforeAll(() => {
tempDir = mkdtempSync(join(tmpdir(), 'unraid-api-'));
pidPath = join(tempDir, 'nodemon.pid');
});
afterAll(() => {
rmSync(tempDir, { recursive: true, force: true });
});
afterEach(() => {
vi.resetModules();
});
async function loadIsRunning() {
vi.doMock('@app/environment.js', async () => {
const actual =
await vi.importActual<typeof import('@app/environment.js')>('@app/environment.js');
return { ...actual, NODEMON_PID_PATH: pidPath };
});
const module = await import('@app/core/utils/process/unraid-api-running.js');
return module.isUnraidApiRunning;
}
it('returns false when pid file is missing', async () => {
const isUnraidApiRunning = await loadIsRunning();
expect(await isUnraidApiRunning()).toBe(false);
});
it('returns true when a live pid is recorded', async () => {
writeFileSync(pidPath, `${process.pid}`);
const isUnraidApiRunning = await loadIsRunning();
expect(await isUnraidApiRunning()).toBe(true);
});
it('returns false when pid file is invalid', async () => {
writeFileSync(pidPath, 'not-a-number');
const isUnraidApiRunning = await loadIsRunning();
expect(await isUnraidApiRunning()).toBe(false);
});
});

View File

@@ -1,29 +0,0 @@
import { join } from 'node:path';
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
describe('nodemon path configuration', () => {
const originalUnraidApiCwd = process.env.UNRAID_API_CWD;
beforeEach(() => {
vi.resetModules();
delete process.env.UNRAID_API_CWD;
});
afterEach(() => {
if (originalUnraidApiCwd === undefined) {
delete process.env.UNRAID_API_CWD;
} else {
process.env.UNRAID_API_CWD = originalUnraidApiCwd;
}
});
it('anchors nodemon paths to the package root by default', async () => {
const environment = await import('@app/environment.js');
const { UNRAID_API_ROOT, NODEMON_CONFIG_PATH, NODEMON_PATH, UNRAID_API_CWD } = environment;
expect(UNRAID_API_CWD).toBe(UNRAID_API_ROOT);
expect(NODEMON_CONFIG_PATH).toBe(join(UNRAID_API_ROOT, 'nodemon.json'));
expect(NODEMON_PATH).toBe(join(UNRAID_API_ROOT, 'node_modules', 'nodemon', 'bin', 'nodemon.js'));
});
});

View File

@@ -12,22 +12,7 @@ import {
UpdateRCloneRemoteDto,
} from '@app/unraid-api/graph/resolvers/rclone/rclone.model.js';
vi.mock('got', () => {
const mockPost = vi.fn();
const gotMock = {
post: mockPost,
};
return {
default: gotMock,
HTTPError: class HTTPError extends Error {
response?: any;
constructor(response?: any) {
super('HTTP Error');
this.response = response;
}
},
};
});
vi.mock('got');
vi.mock('execa');
vi.mock('p-retry');
vi.mock('node:fs', () => ({
@@ -51,8 +36,6 @@ vi.mock('@app/store/index.js', () => ({
}));
vi.mock('@app/environment.js', () => ({
ENVIRONMENT: 'development',
SUPPRESS_LOGS: false,
LOG_LEVEL: 'INFO',
environment: {
IS_MAIN_PROCESS: true,
},
@@ -77,7 +60,7 @@ vi.mock('@nestjs/common', async (importOriginal) => {
describe('RCloneApiService', () => {
let service: RCloneApiService;
let mockGotPost: any;
let mockGot: any;
let mockExeca: any;
let mockPRetry: any;
let mockExistsSync: any;
@@ -85,19 +68,19 @@ describe('RCloneApiService', () => {
beforeEach(async () => {
vi.clearAllMocks();
const got = await import('got');
const { default: got } = await import('got');
const { execa } = await import('execa');
const pRetry = await import('p-retry');
const { existsSync } = await import('node:fs');
const { fileExists } = await import('@app/core/utils/files/file-exists.js');
mockGotPost = vi.mocked(got.default.post);
mockGot = vi.mocked(got);
mockExeca = vi.mocked(execa);
mockPRetry = vi.mocked(pRetry.default);
mockExistsSync = vi.mocked(existsSync);
// Mock successful RClone API response for socket check
mockGotPost.mockResolvedValue({ body: { pid: 12345 } });
mockGot.post = vi.fn().mockResolvedValue({ body: { pid: 12345 } });
// Mock RClone binary exists check
vi.mocked(fileExists).mockResolvedValue(true);
@@ -114,10 +97,10 @@ describe('RCloneApiService', () => {
mockPRetry.mockResolvedValue(undefined);
service = new RCloneApiService();
await service.onApplicationBootstrap();
await service.onModuleInit();
// Reset the mock after initialization to prepare for test-specific responses
mockGotPost.mockClear();
mockGot.post.mockClear();
});
describe('getProviders', () => {
@@ -126,15 +109,15 @@ describe('RCloneApiService', () => {
{ name: 'aws', prefix: 's3', description: 'Amazon S3' },
{ name: 'google', prefix: 'drive', description: 'Google Drive' },
];
mockGotPost.mockResolvedValue({
mockGot.post.mockResolvedValue({
body: { providers: mockProviders },
});
const result = await service.getProviders();
expect(result).toEqual(mockProviders);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/config\/providers$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/config/providers',
expect.objectContaining({
json: {},
responseType: 'json',
@@ -147,7 +130,7 @@ describe('RCloneApiService', () => {
});
it('should return empty array when no providers', async () => {
mockGotPost.mockResolvedValue({ body: {} });
mockGot.post.mockResolvedValue({ body: {} });
const result = await service.getProviders();
@@ -158,15 +141,15 @@ describe('RCloneApiService', () => {
describe('listRemotes', () => {
it('should return list of remotes', async () => {
const mockRemotes = ['backup-s3', 'drive-storage'];
mockGotPost.mockResolvedValue({
mockGot.post.mockResolvedValue({
body: { remotes: mockRemotes },
});
const result = await service.listRemotes();
expect(result).toEqual(mockRemotes);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/config\/listremotes$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/config/listremotes',
expect.objectContaining({
json: {},
responseType: 'json',
@@ -179,7 +162,7 @@ describe('RCloneApiService', () => {
});
it('should return empty array when no remotes', async () => {
mockGotPost.mockResolvedValue({ body: {} });
mockGot.post.mockResolvedValue({ body: {} });
const result = await service.listRemotes();
@@ -191,13 +174,13 @@ describe('RCloneApiService', () => {
it('should return remote details', async () => {
const input: GetRCloneRemoteDetailsDto = { name: 'test-remote' };
const mockConfig = { type: 's3', provider: 'AWS' };
mockGotPost.mockResolvedValue({ body: mockConfig });
mockGot.post.mockResolvedValue({ body: mockConfig });
const result = await service.getRemoteDetails(input);
expect(result).toEqual(mockConfig);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/config\/get$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/config/get',
expect.objectContaining({
json: { name: 'test-remote' },
responseType: 'json',
@@ -214,7 +197,7 @@ describe('RCloneApiService', () => {
it('should return remote configuration', async () => {
const input: GetRCloneRemoteConfigDto = { name: 'test-remote' };
const mockConfig = { type: 's3', access_key_id: 'AKIA...' };
mockGotPost.mockResolvedValue({ body: mockConfig });
mockGot.post.mockResolvedValue({ body: mockConfig });
const result = await service.getRemoteConfig(input);
@@ -230,13 +213,13 @@ describe('RCloneApiService', () => {
parameters: { access_key_id: 'AKIA...', secret_access_key: 'secret' },
};
const mockResponse = { success: true };
mockGotPost.mockResolvedValue({ body: mockResponse });
mockGot.post.mockResolvedValue({ body: mockResponse });
const result = await service.createRemote(input);
expect(result).toEqual(mockResponse);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/config\/create$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/config/create',
expect.objectContaining({
json: {
name: 'new-remote',
@@ -260,13 +243,13 @@ describe('RCloneApiService', () => {
parameters: { access_key_id: 'NEW_AKIA...' },
};
const mockResponse = { success: true };
mockGotPost.mockResolvedValue({ body: mockResponse });
mockGot.post.mockResolvedValue({ body: mockResponse });
const result = await service.updateRemote(input);
expect(result).toEqual(mockResponse);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/config\/update$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/config/update',
expect.objectContaining({
json: {
name: 'existing-remote',
@@ -286,13 +269,13 @@ describe('RCloneApiService', () => {
it('should delete a remote', async () => {
const input: DeleteRCloneRemoteDto = { name: 'remote-to-delete' };
const mockResponse = { success: true };
mockGotPost.mockResolvedValue({ body: mockResponse });
mockGot.post.mockResolvedValue({ body: mockResponse });
const result = await service.deleteRemote(input);
expect(result).toEqual(mockResponse);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/config\/delete$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/config/delete',
expect.objectContaining({
json: { name: 'remote-to-delete' },
responseType: 'json',
@@ -313,13 +296,13 @@ describe('RCloneApiService', () => {
options: { delete_on: 'dst' },
};
const mockResponse = { jobid: 'job-123' };
mockGotPost.mockResolvedValue({ body: mockResponse });
mockGot.post.mockResolvedValue({ body: mockResponse });
const result = await service.startBackup(input);
expect(result).toEqual(mockResponse);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/sync\/copy$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/sync/copy',
expect.objectContaining({
json: {
srcFs: '/source/path',
@@ -340,13 +323,13 @@ describe('RCloneApiService', () => {
it('should return job status', async () => {
const input: GetRCloneJobStatusDto = { jobId: 'job-123' };
const mockStatus = { status: 'running', progress: 0.5 };
mockGotPost.mockResolvedValue({ body: mockStatus });
mockGot.post.mockResolvedValue({ body: mockStatus });
const result = await service.getJobStatus(input);
expect(result).toEqual(mockStatus);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/job\/status$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/job/status',
expect.objectContaining({
json: { jobid: 'job-123' },
responseType: 'json',
@@ -365,13 +348,13 @@ describe('RCloneApiService', () => {
{ id: 'job-1', status: 'running' },
{ id: 'job-2', status: 'finished' },
];
mockGotPost.mockResolvedValue({ body: mockJobs });
mockGot.post.mockResolvedValue({ body: mockJobs });
const result = await service.listRunningJobs();
expect(result).toEqual(mockJobs);
expect(mockGotPost).toHaveBeenCalledWith(
expect.stringMatching(/\/job\/list$/),
expect(mockGot.post).toHaveBeenCalledWith(
'http://unix:/tmp/rclone.sock:/job/list',
expect.objectContaining({
json: {},
responseType: 'json',
@@ -395,7 +378,7 @@ describe('RCloneApiService', () => {
},
};
Object.setPrototypeOf(httpError, HTTPError.prototype);
mockGotPost.mockRejectedValue(httpError);
mockGot.post.mockRejectedValue(httpError);
await expect(service.getProviders()).rejects.toThrow(
'Rclone API Error (config/providers, HTTP 500): Rclone Error: Internal server error'
@@ -412,7 +395,7 @@ describe('RCloneApiService', () => {
},
};
Object.setPrototypeOf(httpError, HTTPError.prototype);
mockGotPost.mockRejectedValue(httpError);
mockGot.post.mockRejectedValue(httpError);
await expect(service.getProviders()).rejects.toThrow(
'Rclone API Error (config/providers, HTTP 404): Failed to process error response body. Raw body:'
@@ -429,7 +412,7 @@ describe('RCloneApiService', () => {
},
};
Object.setPrototypeOf(httpError, HTTPError.prototype);
mockGotPost.mockRejectedValue(httpError);
mockGot.post.mockRejectedValue(httpError);
await expect(service.getProviders()).rejects.toThrow(
'Rclone API Error (config/providers, HTTP 400): Failed to process error response body. Raw body: invalid json'
@@ -438,108 +421,17 @@ describe('RCloneApiService', () => {
it('should handle non-HTTP errors', async () => {
const networkError = new Error('Network connection failed');
mockGotPost.mockRejectedValue(networkError);
mockGot.post.mockRejectedValue(networkError);
await expect(service.getProviders()).rejects.toThrow('Network connection failed');
});
it('should handle unknown errors', async () => {
mockGotPost.mockRejectedValue('unknown error');
mockGot.post.mockRejectedValue('unknown error');
await expect(service.getProviders()).rejects.toThrow(
'Unknown error calling RClone API (config/providers) with params {}: unknown error'
);
});
});
describe('checkRcloneBinaryExists', () => {
beforeEach(() => {
// Create a new service instance without initializing for these tests
service = new RCloneApiService();
});
it('should return true when rclone version is 1.70.0', async () => {
mockExeca.mockResolvedValueOnce({
stdout: 'rclone v1.70.0\n- os/version: darwin 14.0 (64 bit)\n- os/kernel: 23.0.0 (arm64)',
stderr: '',
} as any);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(true);
});
it('should return true when rclone version is newer than 1.70.0', async () => {
mockExeca.mockResolvedValueOnce({
stdout: 'rclone v1.75.2\n- os/version: darwin 14.0 (64 bit)\n- os/kernel: 23.0.0 (arm64)',
stderr: '',
} as any);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(true);
});
it('should return false when rclone version is older than 1.70.0', async () => {
mockExeca.mockResolvedValueOnce({
stdout: 'rclone v1.69.0\n- os/version: darwin 14.0 (64 bit)\n- os/kernel: 23.0.0 (arm64)',
stderr: '',
} as any);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(false);
});
it('should return false when rclone version is much older', async () => {
mockExeca.mockResolvedValueOnce({
stdout: 'rclone v1.50.0\n- os/version: darwin 14.0 (64 bit)\n- os/kernel: 23.0.0 (arm64)',
stderr: '',
} as any);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(false);
});
it('should return false when version cannot be parsed', async () => {
mockExeca.mockResolvedValueOnce({
stdout: 'rclone unknown version format',
stderr: '',
} as any);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(false);
});
it('should return false when rclone binary is not found', async () => {
const error = new Error('Command not found') as any;
error.code = 'ENOENT';
mockExeca.mockRejectedValueOnce(error);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(false);
});
it('should return false and log error for other exceptions', async () => {
mockExeca.mockRejectedValueOnce(new Error('Some other error'));
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(false);
});
it('should handle beta/rc versions correctly', async () => {
mockExeca.mockResolvedValueOnce({
stdout: 'rclone v1.70.0-beta.1\n- os/version: darwin 14.0 (64 bit)\n- os/kernel: 23.0.0 (arm64)',
stderr: '',
} as any);
const result = await (service as any).checkRcloneBinaryExists();
expect(result).toBe(true);
});
});
});

View File

@@ -211,7 +211,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": null,
"id": "ST18000NM000J-2TV103_ZR585CPY",
"idx": 0,
"isSpinning": true,
"name": "parity",
"numErrors": 0,
"numReads": 0,
@@ -236,7 +235,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": 4116003021,
"id": "ST18000NM000J-2TV103_ZR5B1W9X",
"idx": 1,
"isSpinning": true,
"name": "disk1",
"numErrors": 0,
"numReads": 0,
@@ -261,7 +259,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": 11904860828,
"id": "WDC_WD120EDAZ-11F3RA0_5PJRD45C",
"idx": 2,
"isSpinning": true,
"name": "disk2",
"numErrors": 0,
"numReads": 0,
@@ -286,7 +283,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": 6478056481,
"id": "WDC_WD120EMAZ-11BLFA0_5PH8BTYD",
"idx": 3,
"isSpinning": true,
"name": "disk3",
"numErrors": 0,
"numReads": 0,
@@ -311,7 +307,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": 137273827,
"id": "Samsung_SSD_850_EVO_250GB_S2R5NX0H643734Z",
"idx": 30,
"isSpinning": true,
"name": "cache",
"numErrors": 0,
"numReads": 0,
@@ -336,7 +331,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": null,
"id": "KINGSTON_SA2000M8250G_50026B7282669D9E",
"idx": 31,
"isSpinning": true,
"name": "cache2",
"numErrors": 0,
"numReads": 0,
@@ -361,7 +355,6 @@ test('After init returns values from cfg file for all fields', { timeout: 30000
"fsUsed": 851325,
"id": "Cruzer",
"idx": 32,
"isSpinning": true,
"name": "flash",
"numErrors": 0,
"numReads": 0,

View File

@@ -28,7 +28,6 @@ test('Returns parsed state file', async () => {
"fsUsed": null,
"id": "ST18000NM000J-2TV103_ZR585CPY",
"idx": 0,
"isSpinning": true,
"name": "parity",
"numErrors": 0,
"numReads": 0,
@@ -53,7 +52,6 @@ test('Returns parsed state file', async () => {
"fsUsed": 4116003021,
"id": "ST18000NM000J-2TV103_ZR5B1W9X",
"idx": 1,
"isSpinning": true,
"name": "disk1",
"numErrors": 0,
"numReads": 0,
@@ -78,7 +76,6 @@ test('Returns parsed state file', async () => {
"fsUsed": 11904860828,
"id": "WDC_WD120EDAZ-11F3RA0_5PJRD45C",
"idx": 2,
"isSpinning": true,
"name": "disk2",
"numErrors": 0,
"numReads": 0,
@@ -103,7 +100,6 @@ test('Returns parsed state file', async () => {
"fsUsed": 6478056481,
"id": "WDC_WD120EMAZ-11BLFA0_5PH8BTYD",
"idx": 3,
"isSpinning": true,
"name": "disk3",
"numErrors": 0,
"numReads": 0,
@@ -128,7 +124,6 @@ test('Returns parsed state file', async () => {
"fsUsed": 137273827,
"id": "Samsung_SSD_850_EVO_250GB_S2R5NX0H643734Z",
"idx": 30,
"isSpinning": true,
"name": "cache",
"numErrors": 0,
"numReads": 0,
@@ -153,7 +148,6 @@ test('Returns parsed state file', async () => {
"fsUsed": null,
"id": "KINGSTON_SA2000M8250G_50026B7282669D9E",
"idx": 31,
"isSpinning": true,
"name": "cache2",
"numErrors": 0,
"numReads": 0,
@@ -178,7 +172,6 @@ test('Returns parsed state file', async () => {
"fsUsed": 851325,
"id": "Cruzer",
"idx": 32,
"isSpinning": true,
"name": "flash",
"numErrors": 0,
"numReads": 0,

View File

@@ -1,25 +1,12 @@
import '@app/dotenv.js';
import { Logger } from '@nestjs/common';
import { appendFileSync } from 'node:fs';
import { CommandFactory } from 'nest-commander';
import { LOG_LEVEL, SUPPRESS_LOGS } from '@app/environment.js';
import { LogService } from '@app/unraid-api/cli/log.service.js';
const BOOT_LOG_PATH = '/var/log/unraid-api/boot.log';
const logToBootFile = (message: string): void => {
const timestamp = new Date().toISOString();
const line = `[${timestamp}] [cli] ${message}\n`;
try {
appendFileSync(BOOT_LOG_PATH, line);
} catch {
// Silently fail if we can't write to boot log
}
};
const getUnraidApiLocation = async () => {
const { execa } = await import('execa');
try {
@@ -39,8 +26,6 @@ const getLogger = () => {
const logger = getLogger();
try {
logToBootFile(`CLI started with args: ${process.argv.slice(2).join(' ')}`);
await import('json-bigint-patch');
const { CliModule } = await import('@app/unraid-api/cli/cli.module.js');
@@ -53,17 +38,10 @@ try {
nativeShell: { executablePath: await getUnraidApiLocation() },
},
});
logToBootFile('CLI completed successfully');
process.exit(0);
} catch (error) {
// Always log errors to boot file for boot-time debugging
const errorMessage = error instanceof Error ? error.stack || error.message : String(error);
logToBootFile(`CLI ERROR: ${errorMessage}`);
if (logger) {
logger.error('ERROR:', error);
} else {
console.error('ERROR:', error);
}
process.exit(1);
}

View File

@@ -1,12 +0,0 @@
import { existsSync } from 'node:fs';
/**
* Local filesystem and env checks stay synchronous so we can branch at module load.
* @returns True if the Connect Unraid plugin is installed, false otherwise.
*/
export const isConnectPluginInstalled = () => {
if (process.env.SKIP_CONNECT_PLUGIN_CHECK === 'true') {
return true;
}
return existsSync('/boot/config/plugins/dynamix.unraid.net.plg');
};

View File

@@ -2,7 +2,7 @@ import { join } from 'path';
import type { JSONWebKeySet } from 'jose';
import { ENABLE_NEXT_DOCKER_RELEASE, PORT } from '@app/environment.js';
import { PORT } from '@app/environment.js';
export const getInternalApiAddress = (isHttp = true, nginxPort = 80) => {
const envPort = PORT;
@@ -79,14 +79,3 @@ export const KEYSERVER_VALIDATION_ENDPOINT = 'https://keys.lime-technology.com/v
/** Set the max retries for the GraphQL Client */
export const MAX_RETRIES_FOR_LINEAR_BACKOFF = 100;
/**
* Feature flags are used to conditionally enable or disable functionality in the Unraid API.
*
* Keys are human readable feature flag names -- will be used to construct error messages.
*
* Values are boolean/truthy values.
*/
export const FeatureFlags = Object.freeze({
ENABLE_NEXT_DOCKER_RELEASE,
});

View File

@@ -16,30 +16,31 @@ const nullDestination = pino.destination({
});
export const logDestination =
process.env.SUPPRESS_LOGS === 'true'
? nullDestination
: pino.destination({ dest: PATHS_LOGS_FILE, mkdir: true });
// Since process output is piped directly to the log file, we should not colorize stdout
// to avoid ANSI escape codes in the log file
process.env.SUPPRESS_LOGS === 'true' ? nullDestination : pino.destination();
const localFileDestination = pino.destination({
dest: PATHS_LOGS_FILE,
sync: true,
});
const stream = SUPPRESS_LOGS
? nullDestination
: LOG_TYPE === 'pretty'
? pretty({
singleLine: true,
hideObject: false,
colorize: false, // No colors since logs are written directly to file
colorizeObjects: false,
colorize: true,
colorizeObjects: true,
levelFirst: false,
ignore: 'hostname,pid',
destination: logDestination,
translateTime: 'HH:mm:ss',
customPrettifiers: {
time: (timestamp: string | object) => `[${timestamp}`,
level: (_logLevel: string | object, _key: string, log: any, extras: any) => {
// Use label instead of labelColorized for non-colored output
const { label } = extras;
level: (logLevel: string | object, key: string, log: any, extras: any) => {
// Use labelColorized which preserves the colors
const { labelColorized } = extras;
const context = log.context || log.logger || 'app';
return `${label} ${context}]`;
return `${labelColorized} ${context}]`;
},
},
messageFormat: (log: any, messageKey: string) => {
@@ -97,7 +98,7 @@ export const keyServerLogger = logger.child({ logger: 'key-server' });
export const remoteAccessLogger = logger.child({ logger: 'remote-access' });
export const remoteQueryLogger = logger.child({ logger: 'remote-query' });
export const apiLogger = logger.child({ logger: 'api' });
export const pluginLogger = logger.child({ logger: 'plugin' });
export const pluginLogger = logger.child({ logger: 'plugin', stream: localFileDestination });
export const loggers = [
internalLogger,

View File

@@ -7,6 +7,8 @@ import { PubSub } from 'graphql-subscriptions';
const eventEmitter = new EventEmitter();
eventEmitter.setMaxListeners(30);
export { GRAPHQL_PUBSUB_CHANNEL as PUBSUB_CHANNEL };
export const pubsub = new PubSub({ eventEmitter });
/**

View File

@@ -1,66 +0,0 @@
import { afterEach, describe, expect, it, vi } from 'vitest';
import { isSafeModeEnabled } from '@app/core/utils/safe-mode.js';
import { store } from '@app/store/index.js';
import * as stateFileLoader from '@app/store/services/state-file-loader.js';
describe('isSafeModeEnabled', () => {
afterEach(() => {
vi.restoreAllMocks();
});
it('returns the safe mode flag already present in the store', () => {
const baseState = store.getState();
vi.spyOn(store, 'getState').mockReturnValue({
...baseState,
emhttp: {
...baseState.emhttp,
var: {
...(baseState.emhttp?.var ?? {}),
safeMode: true,
},
},
});
const loaderSpy = vi.spyOn(stateFileLoader, 'loadStateFileSync');
expect(isSafeModeEnabled()).toBe(true);
expect(loaderSpy).not.toHaveBeenCalled();
});
it('falls back to the synchronous loader when store state is missing', () => {
const baseState = store.getState();
vi.spyOn(store, 'getState').mockReturnValue({
...baseState,
emhttp: {
...baseState.emhttp,
var: {
...(baseState.emhttp?.var ?? {}),
safeMode: undefined as unknown as boolean,
} as typeof baseState.emhttp.var,
} as typeof baseState.emhttp,
} as typeof baseState);
vi.spyOn(stateFileLoader, 'loadStateFileSync').mockReturnValue({
...(baseState.emhttp?.var ?? {}),
safeMode: true,
} as any);
expect(isSafeModeEnabled()).toBe(true);
});
it('defaults to false when loader cannot provide state', () => {
const baseState = store.getState();
vi.spyOn(store, 'getState').mockReturnValue({
...baseState,
emhttp: {
...baseState.emhttp,
var: {
...(baseState.emhttp?.var ?? {}),
safeMode: undefined as unknown as boolean,
} as typeof baseState.emhttp.var,
} as typeof baseState.emhttp,
} as typeof baseState);
vi.spyOn(stateFileLoader, 'loadStateFileSync').mockReturnValue(null);
expect(isSafeModeEnabled()).toBe(false);
});
});

View File

@@ -16,22 +16,11 @@ export const getKeyFile = async function (appStore: RootState = store.getState()
const keyFileName = basename(emhttp.var?.regFile);
const registrationKeyFilePath = join(paths['keyfile-base'], keyFileName);
try {
const keyFile = await readFile(registrationKeyFilePath, 'binary');
return Buffer.from(keyFile, 'binary')
.toString('base64')
.trim()
.replace(/\+/g, '-')
.replace(/\//g, '_')
.replace(/=/g, '');
} catch (error) {
// Handle ENOENT error when Pro.key file doesn't exist
if (error instanceof Error && 'code' in error && error.code === 'ENOENT') {
// Return empty string when key file is missing (ENOKEYFILE state)
return '';
}
// Re-throw other errors
throw error;
}
const keyFile = await readFile(registrationKeyFilePath, 'binary');
return Buffer.from(keyFile, 'binary')
.toString('base64')
.trim()
.replace(/\+/g, '-')
.replace(/\//g, '_')
.replace(/=/g, '');
};

View File

@@ -1,86 +0,0 @@
import { type IniStringBoolean, type IniStringBooleanOrAuto } from '@app/core/types/ini.js';
/**
* Converts INI boolean string values to JavaScript boolean values.
* Handles malformed values by cleaning them of non-alphabetic characters.
*
* @param value - The string value to parse ("yes", "no", "true", "false", etc.)
* @returns boolean value or undefined if parsing fails
*/
export function iniBooleanToJsBoolean(value: string): boolean | undefined;
/**
* Converts INI boolean string values to JavaScript boolean values.
* Handles malformed values by cleaning them of non-alphabetic characters.
*
* @param value - The string value to parse ("yes", "no", "true", "false", etc.)
* @param defaultValue - Default value to return if parsing fails
* @returns boolean value or defaultValue if parsing fails (never undefined when defaultValue is provided)
*/
export function iniBooleanToJsBoolean(value: string, defaultValue: boolean): boolean;
export function iniBooleanToJsBoolean(value: string, defaultValue?: boolean): boolean | undefined {
if (value === 'no' || value === 'false') {
return false;
}
if (value === 'yes' || value === 'true') {
return true;
}
// Handle malformed values by cleaning them first
if (typeof value === 'string') {
const cleanValue = value.replace(/[^a-zA-Z]/g, '').toLowerCase();
if (cleanValue === 'no' || cleanValue === 'false') {
return false;
}
if (cleanValue === 'yes' || cleanValue === 'true') {
return true;
}
}
// Always return defaultValue when provided (even if undefined)
if (arguments.length >= 2) {
return defaultValue;
}
// Return undefined only when no default was provided
return undefined;
}
/**
* Converts INI boolean or auto string values to JavaScript boolean or null values.
* Handles malformed values by cleaning them of non-alphabetic characters.
*
* @param value - The string value to parse ("yes", "no", "auto", "true", "false", etc.)
* @returns boolean value for yes/no/true/false, null for auto, or undefined as fallback
*/
export const iniBooleanOrAutoToJsBoolean = (
value: IniStringBooleanOrAuto | string
): boolean | null | undefined => {
// Handle auto first
if (value === 'auto') {
return null;
}
// Try to parse as boolean
const boolResult = iniBooleanToJsBoolean(value as IniStringBoolean);
if (boolResult !== undefined) {
return boolResult;
}
// Handle malformed values like "auto*" by extracting the base value
if (typeof value === 'string') {
const cleanValue = value.replace(/[^a-zA-Z]/g, '').toLowerCase();
if (cleanValue === 'auto') {
return null;
}
if (cleanValue === 'no' || cleanValue === 'false') {
return false;
}
if (cleanValue === 'yes' || cleanValue === 'true') {
return true;
}
}
// Return undefined as fallback instead of throwing to prevent API crash
return undefined;
};

View File

@@ -0,0 +1,40 @@
export const isUnraidApiRunning = async (): Promise<boolean | undefined> => {
const { PM2_HOME } = await import('@app/environment.js');
// Set PM2_HOME if not already set
if (!process.env.PM2_HOME) {
process.env.PM2_HOME = PM2_HOME;
}
const pm2Module = await import('pm2');
const pm2 = pm2Module.default || pm2Module;
const pm2Promise = new Promise<boolean>((resolve) => {
pm2.connect(function (err) {
if (err) {
// Don't reject here; resolve with false since we can't connect to PM2
resolve(false);
return;
}
// Now try to describe unraid-api specifically
pm2.describe('unraid-api', function (err, processDescription) {
if (err || processDescription.length === 0) {
// Service not found or error occurred
resolve(false);
} else {
const isOnline = processDescription?.[0]?.pm2_env?.status === 'online';
resolve(isOnline);
}
pm2.disconnect();
});
});
});
const timeoutPromise = new Promise<boolean>((resolve) => {
setTimeout(() => resolve(false), 10000); // 10 second timeout
});
return Promise.race([pm2Promise, timeoutPromise]);
};
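A minimal hypothetical usage sketch, assuming the helper is imported from the same path the integration tests use:
import { isUnraidApiRunning } from '@app/core/utils/pm2/unraid-api-running.js';
// Hypothetical caller: logs whether PM2 reports the unraid-api process as online.
// Resolves to false when PM2 is unreachable or after the helper's 10 second timeout.
const reportStatus = async (): Promise<void> => {
    const running = await isUnraidApiRunning();
    console.log(running ? 'unraid-api is running under PM2' : 'unraid-api is not running');
};
void reportStatus();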

View File

@@ -1,23 +0,0 @@
import { readFile } from 'node:fs/promises';
import { fileExists } from '@app/core/utils/files/file-exists.js';
import { NODEMON_PID_PATH } from '@app/environment.js';
export const isUnraidApiRunning = async (): Promise<boolean> => {
try {
if (!(await fileExists(NODEMON_PID_PATH))) {
return false;
}
const pidText = (await readFile(NODEMON_PID_PATH, 'utf-8')).trim();
const pid = Number.parseInt(pidText, 10);
if (Number.isNaN(pid)) {
return false;
}
process.kill(pid, 0);
return true;
} catch {
return false;
}
};

View File

@@ -1,17 +0,0 @@
import { store } from '@app/store/index.js';
import { loadStateFileSync } from '@app/store/services/state-file-loader.js';
import { StateFileKey } from '@app/store/types.js';
export const isSafeModeEnabled = (): boolean => {
const safeModeFromStore = store.getState().emhttp?.var?.safeMode;
if (typeof safeModeFromStore === 'boolean') {
return safeModeFromStore;
}
const varState = loadStateFileSync(StateFileKey.var);
if (varState) {
return Boolean(varState.safeMode);
}
return false;
};

View File

@@ -2,7 +2,8 @@
// Non-function exports from this module are loaded into the NestJS Config at runtime.
import { readFileSync } from 'node:fs';
import { dirname, join } from 'node:path';
import { homedir } from 'node:os';
import { join } from 'node:path';
import { fileURLToPath } from 'node:url';
import type { PackageJson, SetRequired } from 'type-fest';
@@ -65,7 +66,6 @@ export const getPackageJsonDependencies = (): string[] | undefined => {
};
export const API_VERSION = process.env.npm_package_version ?? getPackageJson().version;
export const UNRAID_API_ROOT = dirname(getPackageJsonPath());
/** Controls how the app is built/run (i.e. in terms of optimization) */
export const NODE_ENV =
@@ -92,7 +92,6 @@ export const LOG_LEVEL = process.env.LOG_LEVEL
: process.env.ENVIRONMENT === 'production'
? 'INFO'
: 'DEBUG';
export const LOG_CASBIN = process.env.LOG_CASBIN === 'true';
export const SUPPRESS_LOGS = process.env.SUPPRESS_LOGS === 'true';
export const MOTHERSHIP_GRAPHQL_LINK = process.env.MOTHERSHIP_GRAPHQL_LINK
? process.env.MOTHERSHIP_GRAPHQL_LINK
@@ -100,24 +99,15 @@ export const MOTHERSHIP_GRAPHQL_LINK = process.env.MOTHERSHIP_GRAPHQL_LINK
? 'https://staging.mothership.unraid.net/ws'
: 'https://mothership.unraid.net/ws';
export const PM2_HOME = process.env.PM2_HOME ?? join(homedir(), '.pm2');
export const PM2_PATH = join(import.meta.dirname, '../../', 'node_modules', 'pm2', 'bin', 'pm2');
export const ECOSYSTEM_PATH = join(import.meta.dirname, '../../', 'ecosystem.config.json');
export const PATHS_LOGS_DIR =
process.env.PATHS_LOGS_DIR ?? process.env.LOGS_DIR ?? '/var/log/unraid-api';
export const PATHS_LOGS_FILE = process.env.PATHS_LOGS_FILE ?? '/var/log/graphql-api.log';
export const PATHS_NODEMON_LOG_FILE =
process.env.PATHS_NODEMON_LOG_FILE ?? join(PATHS_LOGS_DIR, 'nodemon.log');
export const NODEMON_PATH = join(UNRAID_API_ROOT, 'node_modules', 'nodemon', 'bin', 'nodemon.js');
export const NODEMON_CONFIG_PATH = join(UNRAID_API_ROOT, 'nodemon.json');
export const NODEMON_PID_PATH = process.env.NODEMON_PID_PATH ?? '/var/run/unraid-api/nodemon.pid';
export const NODEMON_LOCK_PATH = process.env.NODEMON_LOCK_PATH ?? '/var/run/unraid-api/nodemon.lock';
export const UNRAID_API_CWD = process.env.UNRAID_API_CWD ?? UNRAID_API_ROOT;
export const UNRAID_API_SERVER_ENTRYPOINT = join(UNRAID_API_CWD, 'dist', 'main.js');
export const PATHS_CONFIG_MODULES =
process.env.PATHS_CONFIG_MODULES ?? '/boot/config/plugins/dynamix.my.servers/configs';
export const PATHS_LOCAL_SESSION_FILE =
process.env.PATHS_LOCAL_SESSION_FILE ?? '/var/run/unraid-api/local-session';
/** feature flag for the upcoming docker release */
export const ENABLE_NEXT_DOCKER_RELEASE = process.env.ENABLE_NEXT_DOCKER_RELEASE === 'true';

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

View File

@@ -1 +0,0 @@
{}

Some files were not shown because too many files have changed in this diff.