Compare commits

..

3 Commits

Author SHA1 Message Date
Zack Spear
fbbda5926b refactor: update my servers include file to handle CSS values
This commit updates the `myservers1.php` file in the `dynamix.my.servers` plugin. It adds functionality to handle CSS values in addition to the existing file handling logic. The code now checks for CSS values in the local manifest and includes them in the rendered HTML if found.
2024-09-30 11:56:37 -07:00
mdatelle
8a0a053a80 test: add notification component directory 2024-09-27 15:39:23 -04:00
mdatelle
026d643b49 test: add poc drawer component 2024-09-27 15:35:54 -04:00
1680 changed files with 81412 additions and 186940 deletions

View File

@@ -1,3 +0,0 @@
{
"permissions": {}
}

View File

@@ -1,14 +0,0 @@
---
description:
globs: api/**/*,api/*
alwaysApply: false
---
* pnpm ONLY
* always run scripts from api/package.json unless requested
* prefer adding new files to the nest repo located at api/src/unraid-api/ instead of the legacy code
* Test suite is VITEST, do not use jest; run it with `pnpm --filter ./api test`
* Prefer to not mock simple dependencies
* For error testing, use `.rejects.toThrow()` without arguments - don't test exact error message strings unless the message format is specifically what you're testing

View File

@@ -1,8 +0,0 @@
---
description:
globs:
alwaysApply: true
---
Never add comments unless they are needed for clarity of function
Be CONCISE; keep replies shorter than a paragraph if at all possible.

View File

@@ -1,6 +0,0 @@
---
description:
globs:
alwaysApply: true
---
Never add comments for obvious things, and avoid commenting when starting and ending code blocks

View File

@@ -1,9 +0,0 @@
---
description:
globs: web/**/*
alwaysApply: false
---
* Always run `pnpm codegen` for GraphQL code generation in the web directory
* GraphQL queries must be placed in `.query.ts` files
* GraphQL mutations must be placed in `.mutation.ts` files
* All GraphQL files under `web/` must follow this naming convention
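For example, a hypothetical query file following this convention might look like the sketch below (the file path, query name, and fields are assumptions for illustration, not taken from the codebase):

```typescript
// web/components/Servers/servers.query.ts — hypothetical example
import { gql } from 'graphql-tag';

// Queries live in `.query.ts` files so `pnpm codegen` can pick them up.
export const GET_SERVERS_QUERY = gql`
  query GetServers {
    servers {
      id
      name
    }
  }
`;
```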

View File

@@ -1,244 +0,0 @@
---
description:
globs: **/*.test.ts,**/__test__/components/**/*.ts,**/__test__/store/**/*.ts,**/__test__/mocks/**/*.ts
alwaysApply: false
---
## General Testing Best Practices
- **Error Testing:** Use `.rejects.toThrow()` without arguments to test that functions throw errors. Don't test exact error message strings unless the message format is specifically what you're testing
- **Focus on Behavior:** Test what the code does, not implementation details like exact error message wording
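A minimal sketch of this error-testing guidance, using a hypothetical `fetchConfig` helper (the helper and URL are illustrative, not from the codebase):

```typescript
import { describe, expect, it, vi } from 'vitest';

// Hypothetical helper that rejects when the request fails.
async function fetchConfig(url: string): Promise<unknown> {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json();
}

describe('fetchConfig', () => {
  it('throws when the request fails', async () => {
    vi.stubGlobal('fetch', vi.fn().mockResolvedValue({ ok: false, status: 500 }));
    // Assert that it throws, without coupling the test to the exact message.
    await expect(fetchConfig('/api/config')).rejects.toThrow();
    vi.unstubAllGlobals();
  });
});
```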
## Vue Component Testing Best Practices
- This is a Nuxt.js app, but we are testing with Vitest outside of the Nuxt environment
- Nuxt is currently set to auto-import, so some Vue files may need `computed` or `ref` imported explicitly
- Use pnpm when running terminal commands and stay within the web directory.
- Tests live under `web/__test__`; to run them, just run `pnpm test`
### Setup
- Use `mount` from Vue Test Utils for component testing
- Stub complex child components that aren't the focus of the test
- Mock external dependencies and services
```typescript
import { mount } from '@vue/test-utils';
import { beforeEach, describe, expect, it, vi } from 'vitest';
import { createTestingPinia } from '@pinia/testing'
import { useSomeStore } from '@/stores/myStore'
import YourComponent from '~/components/YourComponent.vue';
// Mock dependencies
vi.mock('~/helpers/someHelper', () => ({
SOME_CONSTANT: 'mocked-value',
}));
describe('YourComponent', () => {
beforeEach(() => {
vi.clearAllMocks();
});
it('renders correctly', () => {
const wrapper = mount(YourComponent, {
global: {
plugins: [createTestingPinia()],
stubs: {
// Stub child components when needed
ChildComponent: true,
},
},
});
const store = useSomeStore() // uses the testing pinia!
// state can be directly manipulated
store.name = 'my new name'
// actions are stubbed by default, meaning they don't execute their code by default.
// See below to customize this behavior.
store.someAction()
expect(store.someAction).toHaveBeenCalledTimes(1)
// Assertions on components
expect(wrapper.text()).toContain('Expected content');
});
});
```
### Testing Patterns
- Test component behavior and output, not implementation details
- Verify that the expected elements are rendered
- Test component interactions (clicks, inputs, etc.)
- Check for expected prop handling and event emissions
- Use `createTestingPinia()` for mocking stores in components
### Finding Elements
- Use semantic queries like `find('button')` or `find('[data-test="id"]')`, but prefer not to use data test IDs
- Find components with `findComponent(ComponentName)`
- Use `findAll` to check for multiple elements
### Assertions
- Assert on rendered text content with `wrapper.text()`
- Assert on element attributes with `element.attributes()`
- Verify element existence with `expect(element.exists()).toBe(true)`
- Check component state through rendered output
### Component Interaction
- Trigger events with `await element.trigger('click')`
- Set input values with `await input.setValue('value')`
- Test emitted events with `wrapper.emitted()`
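A short sketch combining the finding, interaction, and emitted-event guidance above (the component, its event name, and payload are hypothetical):

```typescript
import { mount } from '@vue/test-utils';
import { describe, expect, it } from 'vitest';
import SearchInput from '~/components/SearchInput.vue'; // hypothetical component

describe('SearchInput', () => {
  it('emits a search event when the button is clicked', async () => {
    const wrapper = mount(SearchInput);
    // Set the input value, then trigger the button click.
    await wrapper.find('input').setValue('unraid');
    await wrapper.find('button').trigger('click');
    // Assert on the emitted payload rather than on implementation details.
    expect(wrapper.emitted('search')?.[0]).toEqual(['unraid']);
  });
});
```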
### Mocking
- Mock external services and API calls
- Prefer not using mocks whenever possible
- Use `vi.mock()` for module-level mocks
- Specify return values for component methods with `vi.spyOn()`
- Reset mocks between tests with `vi.clearAllMocks()`
- Frequently used mocks are stored under `web/__test__/mocks`
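As an illustration of `vi.spyOn()` with a reset between tests, here is a hedged sketch using a hypothetical service object (spying on an object method sidesteps module-level mocks entirely):

```typescript
import { beforeEach, describe, expect, it, vi } from 'vitest';

// Hypothetical service; in real tests this would be an imported dependency.
const statusService = {
  async fetchStatus() {
    return { online: false };
  },
};

describe('mocking sketch', () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  it('overrides a single method with vi.spyOn', async () => {
    const spy = vi.spyOn(statusService, 'fetchStatus').mockResolvedValue({ online: true });
    expect(await statusService.fetchStatus()).toEqual({ online: true });
    expect(spy).toHaveBeenCalledTimes(1);
  });
});
```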
### Async Testing
- Use `await nextTick()` for DOM updates
- Use `flushPromises()` for more complex promise chains
- Always await async operations before making assertions
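A minimal sketch of the async guidance, assuming a hypothetical component that fetches data when mounted and renders a server name:

```typescript
import { flushPromises, mount } from '@vue/test-utils';
import { describe, expect, it } from 'vitest';
import { nextTick } from 'vue';
import ServerList from '~/components/ServerList.vue'; // hypothetical component

describe('ServerList', () => {
  it('renders items after the fetch resolves', async () => {
    const wrapper = mount(ServerList);
    // Let pending promises (e.g. the onMounted fetch) settle...
    await flushPromises();
    // ...then wait for the resulting DOM update before asserting.
    await nextTick();
    expect(wrapper.text()).toContain('Tower'); // 'Tower' is an assumed fixture value
  });
});
```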
## Store Testing with Pinia
### Basic Setup
- When testing Store files use `createPinia` and `setActivePinia`
```typescript
import { createPinia, setActivePinia } from 'pinia';
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
import { useYourStore } from '~/store/your-store';
// Mock declarations must be at top level due to hoisting
const mockDependencyFn = vi.fn();
// Module mocks must use factory functions
vi.mock('~/store/dependency', () => ({
useDependencyStore: () => ({
someMethod: mockDependencyFn,
someProperty: 'mockValue'
})
}));
describe('Your Store', () => {
let store: ReturnType<typeof useYourStore>;
beforeEach(() => {
setActivePinia(createPinia());
store = useYourStore();
vi.clearAllMocks();
});
afterEach(() => {
vi.resetAllMocks();
});
it('tests some action', () => {
store.someAction();
expect(mockDependencyFn).toHaveBeenCalled();
});
});
```
### Important Guidelines
1. **Store Initialization**
- Use `createPinia()` instead of `createTestingPinia()` for most cases
- Only use `createTestingPinia` if you specifically need its testing features
- Let stores initialize with their natural default state instead of forcing initial state
- Do not mock the store we're actually testing in the test file. That's why we're using `createPinia()`
2. **Vue Reactivity**
- Ensure Vue reactivity imports are present in the original store files; they may be missing because Nuxt auto-import is enabled
- Don't rely on Nuxt auto-imports in tests
```typescript
// Required in store files, even with Nuxt auto-imports
import { computed, ref, watchEffect } from 'vue';
```
3. **Mocking Best Practices**
- Place all mock declarations at the top level
- Use factory functions for module mocks to avoid hoisting issues
```typescript
// ❌ Wrong - will cause hoisting issues
const mockFn = vi.fn();
vi.mock('module', () => ({ method: mockFn }));
// ✅ Correct - using factory function
vi.mock('module', () => {
const mockFn = vi.fn();
return { method: mockFn };
});
```
4. **Testing Actions**
- Test action side effects and state changes
- Verify actions are called with correct parameters
- Mock external dependencies appropriately
```typescript
it('should handle action correctly', () => {
store.yourAction();
expect(mockDependencyFn).toHaveBeenCalledWith(
expectedArg1,
expectedArg2
);
expect(store.someState).toBe(expectedValue);
});
```
5. **Common Pitfalls**
- Don't mix mock declarations and module mocks incorrectly
- Avoid relying on Nuxt's auto-imports in test environment
- Clear mocks between tests to ensure isolation
- Remember that `vi.mock()` calls are hoisted
### Testing State & Getters
- Test computed properties by accessing them directly
- Verify state changes after actions
- Test getter dependencies are properly mocked
```typescript
it('computes derived state correctly', () => {
store.setState('new value');
expect(store.computedValue).toBe('expected result');
});
```
### Testing Complex Interactions
- Test store interactions with other stores
- Verify proper error handling
- Test async operations completely
```typescript
it('handles async operations', async () => {
const promise = store.asyncAction();
expect(store.status).toBe('loading');
await promise;
expect(store.status).toBe('success');
});
```
### Testing Actions
- Verify actions are called with the right parameters
- Test action side effects if not stubbed
- Override specific action implementations when needed
```typescript
// Test action calls
store.yourAction(params);
expect(store.yourAction).toHaveBeenCalledWith(params);
// Test with real implementation
const pinia = createTestingPinia({
createSpy: vi.fn,
stubActions: false,
});
```
### Testing State & Getters
- Set initial state for focused testing
- Test computed properties by accessing them directly
- Verify state changes by updating the store
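As a sketch of seeding initial state for focused component-level testing (the store id, fields, and component are hypothetical):

```typescript
import { createTestingPinia } from '@pinia/testing';
import { mount } from '@vue/test-utils';
import { describe, expect, it, vi } from 'vitest';
import ServerBanner from '~/components/ServerBanner.vue'; // hypothetical component

describe('ServerBanner', () => {
  it('renders from a preset store state', () => {
    const wrapper = mount(ServerBanner, {
      global: {
        plugins: [
          createTestingPinia({
            createSpy: vi.fn,
            // Seed the hypothetical "server" store with a known state.
            initialState: { server: { name: 'Tower', online: true } },
          }),
        ],
      },
    });
    expect(wrapper.text()).toContain('Tower');
  });
});
```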

View File

@@ -1,45 +0,0 @@
---
name: Bug Report
about: Create a report to help us improve
title: ''
labels: bug
assignees: ''
---
<!--
IMPORTANT: If your issue is related to Unraid Connect features (Flash Backup, connect.myunraid.net, mothership errors with connectivity, etc.) please submit a ticket here: [LINK TO FRESHDESK FORM FOR CONNECT] and choose Unraid Connect in the dropdown.
-->
## Environment
**Unraid OS Version:**
<!-- Please specify your Unraid version (e.g. 7.0.0) -->
**Are you using a reverse proxy?**
<!-- Please answer Yes/No. If yes, have you tested the issue by accessing your server directly? -->
<!-- Note: Reverse proxies are not officially supported by Unraid and can cause issues with various components of Unraid OS -->
## Pre-submission Checklist
<!-- Please check all that apply by replacing [ ] with [x] -->
- [ ] I have verified that my Unraid OS is up to date
- [ ] I have tested this issue by accessing my server directly (not through a reverse proxy)
- [ ] This is not an Unraid Connect related issue (if it is, please submit via the support form instead)
## Issue Description
<!-- Please provide a clear and concise description of the issue -->
## Steps to Reproduce
1.
2.
3.
## Expected Behavior
<!-- What did you expect to happen? -->
## Actual Behavior
<!-- What actually happened? -->
## Additional Context
<!-- Add any other context, screenshots, or error messages about the problem here -->

View File

@@ -1,34 +0,0 @@
---
name: Feature Request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: ''
---
<!--
IMPORTANT: If your feature request is related to Unraid Connect features (Flash Backup, connect.myunraid.net, etc.) please submit it here: [LINK TO FRESHDESK FORM FOR CONNECT] and choose Unraid Connect in the dropdown.
-->
## Is your feature request related to a problem?
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
## Describe the solution you'd like
<!-- A clear and concise description of what you want to happen -->
## Describe alternatives you've considered
<!-- A clear and concise description of any alternative solutions or features you've considered -->
## Additional context
<!-- Add any other context, mockups, or screenshots about the feature request here -->
## Environment (if relevant)
**Unraid OS Version:**
<!-- Please specify your Unraid version (e.g. 7.0.0) if the feature request is version-specific -->
## Pre-submission Checklist
<!-- Please check all that apply by replacing [ ] with [x] -->
- [ ] I have searched existing issues to ensure this feature hasn't already been requested
- [ ] This is not an Unraid Connect related feature (if it is, please submit via the support form instead)
- [ ] I have provided clear examples or use cases for the feature

View File

@@ -1,41 +0,0 @@
---
name: Work Intent
about: Request approval for planned development work (must be approved before starting)
title: 'Work Intent: '
labels: work-intent, unapproved
assignees: 'elibosley'
---
<!--
IMPORTANT: This work intent must be approved by a core developer before beginning any development work.
The 'unapproved' label will be removed once approved.
-->
## Overview
<!-- Provide a high-level description of what you want to work on and why -->
## Technical Approach
<!-- Brief description of how you plan to implement this -->
## Scope
<!-- Check components that will be modified -->
- [ ] API
- [ ] Plugin
- [ ] Web UI
- [ ] Build/Deploy Process
- [ ] Documentation
## Timeline & Impact
<!-- Quick details about timing and effects -->
- Estimated time needed:
- Potential impacts:
## Pre-submission Checklist
<!-- Please check all that apply -->
- [ ] I have searched for similar work/issues
- [ ] I understand this needs approval before starting
- [ ] I am willing to make adjustments based on feedback
<!--
For Reviewers: Remove 'unapproved' label and add 'approved' label if accepted
-->

View File

@@ -1,49 +0,0 @@
# CodeQL Security Analysis for Unraid API
This directory contains custom CodeQL queries and configurations for security analysis of the Unraid API codebase.
## Overview
The analysis is configured to run:
- On all pushes to the main branch
- On all pull requests
- Weekly via scheduled runs
## Custom Queries
The following custom queries are implemented:
1. **API Authorization Bypass Detection**
Identifies API handlers that may not properly check authorization before performing operations.
2. **GraphQL Injection Detection**
Detects potential injection vulnerabilities in GraphQL queries and operations.
3. **Hardcoded Secrets Detection**
Finds potential hardcoded secrets or credentials in the codebase.
4. **Insecure Cryptographic Implementations**
Identifies usage of weak cryptographic algorithms or insecure random number generation.
5. **Path Traversal Vulnerability Detection**
Detects potential path traversal vulnerabilities in file system operations.
## Configuration
The CodeQL analysis is configured in:
- `.github/workflows/codeql-analysis.yml` - Workflow configuration
- `.github/codeql/codeql-config.yml` - CodeQL engine configuration
## Running Locally
To run these queries locally:
1. Install the CodeQL CLI: https://github.com/github/codeql-cli-binaries/releases
2. Create a CodeQL database:
```
codeql database create <db-name> --language=javascript --source-root=.
```
3. Run a query:
```
codeql query run .github/codeql/custom-queries/javascript/api-auth-bypass.ql --database=<db-name>
```

View File

@@ -1,16 +0,0 @@
name: "Unraid API CodeQL Configuration"
disable-default-queries: false
queries:
- name: Extended Security Queries
uses: security-extended
- name: Custom Unraid API Queries
uses: ./.github/codeql/custom-queries
query-filters:
- exclude:
problem.severity:
- warning
- recommendation
tags contain: security

View File

@@ -1,45 +0,0 @@
/**
* @name Potential API Authorization Bypass
* @description Functions that process API requests without verifying authorization may lead to security vulnerabilities.
* @kind problem
* @problem.severity error
* @precision medium
* @id js/api-auth-bypass
* @tags security
* external/cwe/cwe-285
*/
import javascript
/**
* Identifies functions that appear to handle API requests
*/
predicate isApiHandler(Function f) {
exists(f.getAParameter()) and
(
f.getName().regexpMatch("(?i).*(api|handler|controller|resolver|endpoint).*") or
exists(CallExpr call |
call.getCalleeName().regexpMatch("(?i).*(get|post|put|delete|patch).*") and
call.getArgument(1) = f
)
)
}
/**
* Identifies expressions that appear to perform authorization checks
*/
predicate isAuthCheck(DataFlow::Node node) {
exists(CallExpr call |
call.getCalleeName().regexpMatch("(?i).*(authorize|authenticate|isAuth|checkAuth|verifyAuth|hasPermission|isAdmin|canAccess).*") and
call.flow().getASuccessor*() = node
)
}
from Function apiHandler
where
isApiHandler(apiHandler) and
not exists(DataFlow::Node authCheck |
isAuthCheck(authCheck) and
authCheck.getEnclosingExpr().getEnclosingFunction() = apiHandler
)
select apiHandler, "API handler function may not perform proper authorization checks."

View File

@@ -1,77 +0,0 @@
/**
* @name Potential GraphQL Injection
* @description User-controlled input used directly in GraphQL queries may lead to injection vulnerabilities.
* @kind path-problem
* @problem.severity error
* @precision high
* @id js/graphql-injection
* @tags security
* external/cwe/cwe-943
*/
import javascript
import DataFlow::PathGraph
class GraphQLQueryExecution extends DataFlow::CallNode {
GraphQLQueryExecution() {
exists(string name |
name = this.getCalleeName() and
(
name = "execute" or
name = "executeQuery" or
name = "query" or
name.regexpMatch("(?i).*graphql.*query.*")
)
)
}
DataFlow::Node getQuery() {
result = this.getArgument(0)
}
}
class UserControlledInput extends DataFlow::Node {
UserControlledInput() {
exists(DataFlow::ParameterNode param |
param.getName().regexpMatch("(?i).*(query|request|input|args|variables|params).*") and
this = param
)
or
exists(DataFlow::PropRead prop |
prop.getPropertyName().regexpMatch("(?i).*(query|request|input|args|variables|params).*") and
this = prop
)
}
}
/**
* Holds if `node` is a string concatenation.
*/
predicate isStringConcatenation(DataFlow::Node node) {
exists(BinaryExpr concat |
concat.getOperator() = "+" and
concat.flow() = node
)
}
class GraphQLInjectionConfig extends TaintTracking::Configuration {
GraphQLInjectionConfig() { this = "GraphQLInjectionConfig" }
override predicate isSource(DataFlow::Node source) {
source instanceof UserControlledInput
}
override predicate isSink(DataFlow::Node sink) {
exists(GraphQLQueryExecution exec | sink = exec.getQuery())
}
override predicate isAdditionalTaintStep(DataFlow::Node pred, DataFlow::Node succ) {
// Add any GraphQL-specific taint steps if needed
isStringConcatenation(succ) and
succ.(DataFlow::BinaryExprNode).getAnOperand() = pred
}
}
from GraphQLInjectionConfig config, DataFlow::PathNode source, DataFlow::PathNode sink
where config.hasFlowPath(source, sink)
select sink.getNode(), source, sink, "GraphQL query may contain user-controlled input from $@.", source.getNode(), "user input"

View File

@@ -1,53 +0,0 @@
/**
* @name Hardcoded Secrets
* @description Hardcoded secrets or credentials in source code can lead to security vulnerabilities.
* @kind problem
* @problem.severity error
* @precision medium
* @id js/hardcoded-secrets
* @tags security
* external/cwe/cwe-798
*/
import javascript
/**
* Identifies variable declarations or assignments that may contain secrets
*/
predicate isSensitiveAssignment(DataFlow::Node node) {
exists(DataFlow::PropWrite propWrite |
propWrite.getPropertyName().regexpMatch("(?i).*(secret|key|password|token|credential|auth).*") and
propWrite.getRhs() = node
)
or
exists(VariableDeclarator decl |
decl.getName().regexpMatch("(?i).*(secret|key|password|token|credential|auth).*") and
decl.getInit().flow() = node
)
}
/**
* Identifies literals that look like secrets
*/
predicate isSecretLiteral(StringLiteral literal) {
// Match alphanumeric strings of moderate length that may be secrets
literal.getValue().regexpMatch("[A-Za-z0-9_\\-]{8,}") and
not (
// Skip likely non-sensitive literals
literal.getValue().regexpMatch("(?i)^(true|false|null|undefined|localhost|development|production|staging)$") or
// Skip URLs without credentials
literal.getValue().regexpMatch("^https?://[^:@/]+")
)
}
from DataFlow::Node source
where
isSensitiveAssignment(source) and
(
exists(StringLiteral literal |
literal.flow() = source and
isSecretLiteral(literal)
)
)
select source, "This assignment may contain a hardcoded secret or credential."

View File

@@ -1,90 +0,0 @@
/**
* @name Insecure Cryptographic Implementation
* @description Usage of weak cryptographic algorithms or improper implementations can lead to security vulnerabilities.
* @kind problem
* @problem.severity error
* @precision high
* @id js/insecure-crypto
* @tags security
* external/cwe/cwe-327
*/
import javascript
/**
* Identifies calls to crypto functions with insecure algorithms
*/
predicate isInsecureCryptoCall(CallExpr call) {
// Node.js crypto module uses
exists(string methodName |
methodName = call.getCalleeName() and
(
// Detect MD5 usage
methodName.regexpMatch("(?i).*md5.*") or
methodName.regexpMatch("(?i).*sha1.*") or
// Insecure crypto constructors
(
methodName = "createHash" or
methodName = "createCipheriv" or
methodName = "createDecipher"
) and
(
exists(StringLiteral algo |
algo = call.getArgument(0) and
(
algo.getValue().regexpMatch("(?i).*(md5|md4|md2|sha1|des|rc4|blowfish).*") or
algo.getValue().regexpMatch("(?i).*(ecb).*") // ECB mode
)
)
)
)
)
or
// Browser crypto API uses
exists(MethodCallExpr mce, string propertyName |
propertyName = mce.getMethodName() and
(
propertyName = "subtle" and
exists(MethodCallExpr subtleCall |
subtleCall.getReceiver() = mce and
subtleCall.getMethodName() = "encrypt" and
exists(ObjectExpr obj |
obj = subtleCall.getArgument(0) and
exists(Property p |
p = obj.getAProperty() and
p.getName() = "name" and
exists(StringLiteral algo |
algo = p.getInit() and
algo.getValue().regexpMatch("(?i).*(rc4|des|aes-cbc).*")
)
)
)
)
)
)
}
/**
* Identifies usage of Math.random() for security-sensitive operations
*/
predicate isInsecureRandomCall(CallExpr call) {
exists(PropertyAccess prop |
prop.getPropertyName() = "random" and
prop.getBase().toString() = "Math" and
call.getCallee() = prop
)
}
from Expr insecureExpr, string message
where
(
insecureExpr instanceof CallExpr and
isInsecureCryptoCall(insecureExpr) and
message = "Using potentially insecure cryptographic algorithm or mode."
) or (
insecureExpr instanceof CallExpr and
isInsecureRandomCall(insecureExpr) and
message = "Using Math.random() for security-sensitive operation. Consider using crypto.getRandomValues() instead."
)
select insecureExpr, message

View File

@@ -1,130 +0,0 @@
/**
* @name Path Traversal Vulnerability
* @description User-controlled inputs used in file operations may allow for path traversal attacks.
* @kind path-problem
* @problem.severity error
* @precision high
* @id js/path-traversal
* @tags security
* external/cwe/cwe-22
*/
import javascript
import DataFlow::PathGraph
/**
* Identifies sources of user-controlled input
*/
class UserInput extends DataFlow::Node {
UserInput() {
// HTTP request parameters
exists(DataFlow::ParameterNode param |
param.getName().regexpMatch("(?i).*(req|request|param|query|body|user|input).*") and
this = param
)
or
// Access to common request properties
exists(DataFlow::PropRead prop |
(
prop.getPropertyName() = "query" or
prop.getPropertyName() = "body" or
prop.getPropertyName() = "params" or
prop.getPropertyName() = "files"
) and
this = prop
)
}
}
/**
* Identifies fs module imports
*/
class FileSystemAccess extends DataFlow::CallNode {
FileSystemAccess() {
// Node.js fs module functions
exists(string name |
name = this.getCalleeName() and
(
name = "readFile" or
name = "readFileSync" or
name = "writeFile" or
name = "writeFileSync" or
name = "appendFile" or
name = "appendFileSync" or
name = "createReadStream" or
name = "createWriteStream" or
name = "openSync" or
name = "open"
)
)
or
// File system operations via require('fs')
exists(DataFlow::SourceNode fsModule, string methodName |
(fsModule.getAPropertyRead("promises") or fsModule).flowsTo(this.getReceiver()) and
methodName = this.getMethodName() and
(
methodName = "readFile" or
methodName = "writeFile" or
methodName = "appendFile" or
methodName = "readdir" or
methodName = "stat"
)
)
}
DataFlow::Node getPathArgument() {
result = this.getArgument(0)
}
}
/**
* Identifies sanitization of file paths
*/
predicate isPathSanitized(DataFlow::Node node) {
// Check for path normalization or validation
exists(DataFlow::CallNode call |
(
call.getCalleeName() = "resolve" or
call.getCalleeName() = "normalize" or
call.getCalleeName() = "isAbsolute" or
call.getCalleeName() = "relative" or
call.getCalleeName().regexpMatch("(?i).*(sanitize|validate|check).*path.*")
) and
call.flowsTo(node)
)
or
// Check for path traversal mitigation patterns
exists(DataFlow::CallNode call |
call.getCalleeName() = "replace" and
exists(StringLiteral regex |
regex = call.getArgument(0).(DataFlow::RegExpCreationNode).getSource().getAChildExpr() and
regex.getValue().regexpMatch("(\\.\\./|\\.\\.\\\\)")
) and
call.flowsTo(node)
)
}
/**
* Configuration for tracking flow from user input to file system operations
*/
class PathTraversalConfig extends TaintTracking::Configuration {
PathTraversalConfig() { this = "PathTraversalConfig" }
override predicate isSource(DataFlow::Node source) {
source instanceof UserInput
}
override predicate isSink(DataFlow::Node sink) {
exists(FileSystemAccess fileAccess |
sink = fileAccess.getPathArgument()
)
}
override predicate isSanitizer(DataFlow::Node node) {
isPathSanitized(node)
}
}
from PathTraversalConfig config, DataFlow::PathNode source, DataFlow::PathNode sink
where config.hasFlowPath(source, sink)
select sink.getNode(), source, sink, "File system operation depends on a user-controlled value $@.", source.getNode(), "user input"

.github/unraid.svg vendored
View File

@@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" version="1.2" viewBox="0 0 1000 1000"><defs><linearGradient id="a" x1="-900" x2="-100" y1="-100" y2="-900" gradientUnits="userSpaceOnUse"><stop offset="0" stop-color="#e32929"/><stop offset="1" stop-color="#ff8d30"/></linearGradient></defs><path fill="url(#a)" d="M1000 500.1v376.4c0 57.5-43.4 110.1-100 120.9-8.4 1.6-17.1 2.5-25.6 2.5-250.1.1-500.2.1-750.3.1-61.3 0-114.8-47-122.8-108q-.3-2.1-.6-4.1-.2-2-.3-4.1-.2-2-.3-4v-4.1C0 624.9 0 374.2 0 123.5 0 66 43.4 13.3 100 2.6 108.4 1 117.1.1 125.6.1 375.9 0 626.2 0 876.5 0 934 0 986.7 43.4 997.4 100c1.5 8.4 2.5 17.1 2.5 25.6.1 124.8.1 249.7.1 374.5z"/><path fill="#fff" d="M481.6 392.1h36.5v216.2h-36.5zm-356 0h36.5v216.2h-36.5zm178 242h36.5v82.5h-36.5zm-89.3-92.7h36.5v133.7h-36.5zm178 0h36.5V675h-36.5zm445.8-149.3h36.5v216.1h-36.5zm-178-107.8h36.5v82.6h-36.5zm89.3 41.5h36.5v133.1h-36.5zm-178.6 0h36.5v133h-36.5z"/></svg>

View File

@@ -1,209 +0,0 @@
name: Build Plugin Component
on:
workflow_call:
inputs:
RELEASE_CREATED:
type: string
required: true
description: "Whether a release was created"
RELEASE_TAG:
type: string
required: false
description: "Name of the tag when a release is created"
TAG:
type: string
required: false
description: "Tag for the build (e.g. PR number or version)"
BUCKET_PATH:
type: string
required: true
description: "Path in the bucket where artifacts should be stored"
BASE_URL:
type: string
required: true
description: "Base URL for the plugin builds"
BUILD_NUMBER:
type: string
required: true
description: "Build number for the plugin builds"
secrets:
CF_ACCESS_KEY_ID:
required: true
CF_SECRET_ACCESS_KEY:
required: true
CF_BUCKET_PREVIEW:
required: true
CF_ENDPOINT:
required: true
UNRAID_BOT_GITHUB_ADMIN_TOKEN:
required: false
jobs:
build-plugin:
name: Build and Deploy Plugin
defaults:
run:
working-directory: plugin
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
fetch-depth: 0
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Get API Version
id: vars
run: |
GIT_SHA=$(git rev-parse --short HEAD)
IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '')
PACKAGE_LOCK_VERSION=$(jq -r '.version' package.json)
API_VERSION=$([[ -n "$IS_TAGGED" ]] && echo "$PACKAGE_LOCK_VERSION" || echo "${PACKAGE_LOCK_VERSION}+${GIT_SHA}")
echo "API_VERSION=${API_VERSION}" >> $GITHUB_OUTPUT
- name: Install dependencies
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/connect-plugin
- name: Download Unraid UI Components
uses: actions/download-artifact@v5
with:
name: unraid-wc-ui
path: ${{ github.workspace }}/plugin/source/dynamix.unraid.net/usr/local/emhttp/plugins/dynamix.my.servers/unraid-components/uui
merge-multiple: true
- name: Download Unraid Web Components
uses: actions/download-artifact@v5
with:
pattern: unraid-wc-rich
path: ${{ github.workspace }}/plugin/source/dynamix.unraid.net/usr/local/emhttp/plugins/dynamix.my.servers/unraid-components/standalone
merge-multiple: true
- name: Download Unraid API
uses: actions/download-artifact@v5
with:
name: unraid-api
path: ${{ github.workspace }}/plugin/api/
- name: Extract Unraid API
run: |
mkdir -p ${{ github.workspace }}/plugin/source/dynamix.unraid.net/usr/local/unraid-api
tar -xzf ${{ github.workspace }}/plugin/api/unraid-api.tgz -C ${{ github.workspace }}/plugin/source/dynamix.unraid.net/usr/local/unraid-api
- name: Build Plugin and TXZ Based on Event and Tag
id: build-plugin
run: |
cd ${{ github.workspace }}/plugin
pnpm run build:txz --tag="${{ inputs.TAG }}" --base-url="${{ inputs.BASE_URL }}" --api-version="${{ steps.vars.outputs.API_VERSION }}" --build-number="${{ inputs.BUILD_NUMBER }}"
pnpm run build:plugin --tag="${{ inputs.TAG }}" --base-url="${{ inputs.BASE_URL }}" --api-version="${{ steps.vars.outputs.API_VERSION }}" --build-number="${{ inputs.BUILD_NUMBER }}"
- name: Ensure Plugin Files Exist
run: |
ls -al ./deploy
if [ ! -f ./deploy/*.plg ]; then
echo "Error: .plg file not found in plugin/deploy/"
exit 1
fi
if [ ! -f ./deploy/*.txz ]; then
echo "Error: .txz file not found in plugin/deploy/"
exit 1
fi
- name: Upload to GHA
uses: actions/upload-artifact@v4
with:
name: unraid-plugin-${{ github.run_id }}-${{ inputs.RELEASE_TAG }}
path: plugin/deploy/
- name: Upload Release Assets
if: inputs.RELEASE_CREATED == 'true'
env:
GITHUB_TOKEN: ${{ github.token }}
RELEASE_TAG: ${{ inputs.RELEASE_TAG }}
run: |
# For each file in release directory
for file in deploy/*; do
echo "Uploading $file to release..."
gh release upload "${RELEASE_TAG}" "$file" --clobber
done
- name: Workflow Dispatch and wait
if: inputs.RELEASE_CREATED == 'true'
uses: the-actions-org/workflow-dispatch@v4.0.0
with:
workflow: release-production.yml
inputs: '{ "version": "v${{ steps.vars.outputs.API_VERSION }}" }'
token: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
- name: Upload to Cloudflare
if: inputs.RELEASE_CREATED == 'false'
env:
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
# Sync the deploy directory to the Cloudflare bucket with explicit content encoding and public-read ACL
aws s3 sync deploy/ s3://${{ secrets.CF_BUCKET_PREVIEW }}/${{ inputs.BUCKET_PATH }} \
--endpoint-url ${{ secrets.CF_ENDPOINT }} \
--checksum-algorithm CRC32 \
--no-guess-mime-type \
--content-encoding none \
--acl public-read
- name: Comment URL
if: github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v3
with:
comment-tag: prlink
mode: recreate
message: |
This plugin has been deployed to Cloudflare R2 and is available for testing.
Download it at this URL:
```
${{ inputs.BASE_URL }}/tag/${{ inputs.TAG }}/dynamix.unraid.net.plg
```
- name: Clean up old preview builds
if: inputs.RELEASE_CREATED == 'false' && github.event_name == 'push'
continue-on-error: true
env:
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
echo "🧹 Cleaning up old preview builds (keeping last 7 days)..."
# Calculate cutoff date (7 days ago)
CUTOFF_DATE=$(date -d "7 days ago" +"%Y.%m.%d")
echo "Deleting builds older than: ${CUTOFF_DATE}"
# List and delete old timestamped .txz files
OLD_FILES=$(aws s3 ls "s3://${{ secrets.CF_BUCKET_PREVIEW }}/unraid-api/" \
--endpoint-url ${{ secrets.CF_ENDPOINT }} --recursive | \
grep -E "dynamix\.unraid\.net-[0-9]{4}\.[0-9]{2}\.[0-9]{2}\.[0-9]{4}\.txz" | \
awk '{print $4}' || true)
DELETED_COUNT=0
if [ -n "$OLD_FILES" ]; then
while IFS= read -r file; do
if [[ $file =~ ([0-9]{4}\.[0-9]{2}\.[0-9]{2})\.[0-9]{4}\.txz ]]; then
FILE_DATE="${BASH_REMATCH[1]}"
if [[ "$FILE_DATE" < "$CUTOFF_DATE" ]]; then
echo "Deleting old build: $(basename "$file")"
aws s3 rm "s3://${{ secrets.CF_BUCKET_PREVIEW }}/${file}" \
--endpoint-url ${{ secrets.CF_ENDPOINT }} || true
((DELETED_COUNT++))
fi
fi
done <<< "$OLD_FILES"
fi
echo "✅ Deleted ${DELETED_COUNT} old builds"

View File

@@ -1,103 +0,0 @@
name: Claude Code Review
on:
pull_request:
types: [opened, synchronize]
# Skip reviews for non-code changes
paths-ignore:
- "**/*.md"
- "**/package-lock.json"
- "**/pnpm-lock.yaml"
- "**/.gitignore"
- "**/LICENSE"
- "**/*.config.js"
- "**/*.config.ts"
- "**/tsconfig.json"
- "**/.github/workflows/*.yml"
- "**/docs/**"
jobs:
claude-review:
# Skip review for bot PRs and WIP/skip-review PRs
# Only run if changes are significant (>10 lines)
if: |
(github.event.pull_request.additions > 10 || github.event.pull_request.deletions > 10) &&
!contains(github.event.pull_request.title, '[skip-review]') &&
!contains(github.event.pull_request.title, '[WIP]') &&
!endsWith(github.event.pull_request.user.login, '[bot]') &&
github.event.pull_request.user.login != 'dependabot' &&
github.event.pull_request.user.login != 'renovate'
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 1
- name: Run Claude Code Review
id: claude-review
uses: anthropics/claude-code-action@beta
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# Optional: Specify model (defaults to Claude Sonnet 4, uncomment for Claude Opus 4)
# model: "claude-opus-4-20250514"
# Direct prompt for automated review (no @claude mention needed)
direct_prompt: |
IMPORTANT: Review ONLY the DIFF/CHANGESET - the actual lines that were added or modified in this PR.
DO NOT review the entire file context, only analyze the specific changes being made.
Look for HIGH-PRIORITY issues in the CHANGED LINES ONLY:
1. CRITICAL BUGS: Logic errors, null pointer issues, infinite loops, race conditions
2. SECURITY: SQL injection, XSS, authentication bypass, exposed secrets, unsafe operations
3. BREAKING CHANGES: API contract violations, removed exports, changed function signatures
4. DATA LOSS RISKS: Destructive operations without safeguards, missing data validation
DO NOT comment on:
- Code that wasn't changed in this PR
- Style, formatting, or documentation
- Test coverage (unless tests are broken by the changes)
- Minor optimizations or best practices
- Existing code issues that weren't introduced by this PR
If you find no critical issues in the DIFF, respond with: "✅ No critical issues found in changes"
Keep response under 10 lines. Reference specific line numbers from the diff when reporting issues.
# Optional: Use sticky comments to make Claude reuse the same comment on subsequent pushes to the same PR
use_sticky_comment: true
# Context-aware review based on PR characteristics
# Uncomment to enable different review strategies based on context
# direct_prompt: |
# ${{
# (github.event.pull_request.additions > 500) &&
# 'Large PR detected. Focus only on architectural issues and breaking changes. Skip minor issues.' ||
# contains(github.event.pull_request.title, 'fix') &&
# 'Bug fix PR: Verify the fix addresses the root cause and check for regression risks.' ||
# contains(github.event.pull_request.title, 'deps') &&
# 'Dependency update: Check for breaking changes and security advisories only.' ||
# contains(github.event.pull_request.title, 'refactor') &&
# 'Refactor PR: Verify no behavior changes and check for performance regressions.' ||
# contains(github.event.pull_request.title, 'feat') &&
# 'New feature: Check for security issues, edge cases, and integration problems only.' ||
# 'Standard review: Check for critical bugs, security issues, and breaking changes only.'
# }}
# Optional: Add specific tools for running tests or linting
# allowed_tools: "Bash(npm run test),Bash(npm run lint),Bash(npm run typecheck)"
# Optional: Skip review for certain conditions
# if: |
# !contains(github.event.pull_request.title, '[skip-review]') &&
# !contains(github.event.pull_request.title, '[WIP]')

View File

@@ -1,64 +0,0 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@beta
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Specify model (defaults to Claude Sonnet 4, uncomment for Claude Opus 4)
# model: "claude-opus-4-20250514"
# Optional: Customize the trigger phrase (default: @claude)
# trigger_phrase: "/claude"
# Optional: Trigger when specific user is assigned to an issue
# assignee_trigger: "claude-bot"
# Optional: Allow Claude to run specific commands
# allowed_tools: "Bash(npm install),Bash(npm run build),Bash(npm run test:*),Bash(npm run lint:*)"
# Optional: Add custom instructions for Claude to customize its behavior for your project
# custom_instructions: |
# Follow our coding standards
# Ensure all new code has tests
# Use TypeScript for new files
# Optional: Custom environment variables for Claude
# claude_env: |
# NODE_ENV: test

View File

@@ -1,40 +0,0 @@
name: "CodeQL Security Analysis"
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
schedule:
- cron: '0 0 * * 0' # Run weekly on Sundays
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'javascript', 'typescript' ]
steps:
- name: Checkout repository
uses: actions/checkout@v5
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/codeql-config.yml
queries: +security-and-quality
- name: Autobuild
uses: github/codeql-action/autobuild@v3
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3

View File

@@ -1,77 +0,0 @@
name: Deploy Storybook to Cloudflare Workers
permissions:
contents: read
pull-requests: write
issues: write
on:
push:
branches:
- main
paths:
- 'unraid-ui/**'
pull_request:
paths:
- 'unraid-ui/**'
workflow_dispatch:
jobs:
deploy:
runs-on: ubuntu-latest
name: Deploy Storybook
steps:
- name: Checkout
uses: actions/checkout@v5
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Setup Node.js
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential libvirt-daemon-system
version: 1.0
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Build Storybook
run: |
cd unraid-ui
pnpm build-storybook
- name: Deploy to Cloudflare Workers (Staging)
id: deploy_staging
if: github.event_name == 'pull_request'
uses: cloudflare/wrangler-action@v3
with:
apiToken: ${{ secrets.CLOUDFLARE_DEPLOY_TOKEN }}
command: deploy --env staging
workingDirectory: unraid-ui
- name: Deploy to Cloudflare Workers (Production)
if: github.ref == 'refs/heads/main' && github.event_name == 'push'
uses: cloudflare/wrangler-action@v3
with:
apiToken: ${{ secrets.CLOUDFLARE_DEPLOY_TOKEN }}
command: deploy
workingDirectory: unraid-ui
- name: Comment PR with deployment URL
if: github.event_name == 'pull_request'
uses: actions/github-script@v8
with:
script: |
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: `🚀 Storybook has been deployed to staging: ${{ steps.deploy_staging.outputs['deployment-url'] }}`
})

View File

@@ -0,0 +1,74 @@
name: Lint, Test, and Build Web Components
on:
workflow_dispatch:
jobs:
lint-web:
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ vars.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ vars.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ vars.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ vars.VITE_CALLBACK_KEY }} >> .env
cat .env
- name: Install node
uses: actions/setup-node@v4
with:
cache: "npm"
cache-dependency-path: "web/package-lock.json"
node-version-file: "web/.nvmrc"
- name: Installing node deps
run: npm install
- name: Lint files
run: npm run lint
build-web:
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
needs: [lint-web]
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ vars.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ vars.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ vars.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ vars.VITE_CALLBACK_KEY }} >> .env
cat .env
- name: Install node
uses: actions/setup-node@v4
with:
cache: "npm"
cache-dependency-path: "web/package-lock.json"
node-version-file: "web/.nvmrc"
- name: Installing node deps
run: npm install
- name: Build
run: npm run build
- name: Upload build to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-web
path: web/.nuxt/nuxt-custom-elements/dist/unraid-components

View File

@@ -1,387 +1,355 @@
name: CI - Main (API)
on:
pull_request:
push:
branches:
- main
permissions:
contents: write
pull-requests: write
tags:
- "v*"
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
cancel-in-progress: true
jobs:
test-api:
name: Test API
defaults:
run:
working-directory: api
release-please:
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- name: Checkout repo
uses: actions/checkout@v5
with:
fetch-depth: 0
- name: Install pnpm
uses: pnpm/action-setup@v4
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential libvirt-daemon-system php-cli
version: 1.0
- name: PNPM Install
run: pnpm install --frozen-lockfile
- name: Lint
run: pnpm run lint
- name: Type Check
run: pnpm run type-check
- name: Setup libvirt
run: |
# Create required groups (if they don't already exist)
sudo groupadd -f libvirt
sudo groupadd -f kvm
# Create libvirt user if not present, and add it to the kvm group
sudo useradd -m -s /bin/bash -g libvirt libvirt || true
sudo usermod -aG kvm libvirt || true
# Set up libvirt directories and permissions
sudo mkdir -p /var/run/libvirt /var/log/libvirt /etc/libvirt
sudo chown root:libvirt /var/run/libvirt /var/log/libvirt
sudo chmod g+w /var/run/libvirt /var/log/libvirt
# Configure libvirt by appending required settings
sudo tee -a /etc/libvirt/libvirtd.conf > /dev/null <<EOF
unix_sock_group = "libvirt"
unix_sock_rw_perms = "0770"
auth_unix_rw = "none"
EOF
# Add the current user to libvirt and kvm groups (note: this change won't apply to the current session)
sudo usermod -aG libvirt,kvm $USER
sudo mkdir -p /var/run/libvirt
sudo chown root:libvirt /var/run/libvirt
sudo chmod 775 /var/run/libvirt
# Start libvirtd in the background
sudo /usr/sbin/libvirtd --daemon
# Wait a bit longer for libvirtd to start
sleep 5
# Verify libvirt is running using sudo to bypass group membership delays
sudo virsh list --all || true
- name: Build UI Package First
run: |
echo "🔧 Building UI package for web tests dependency..."
cd ../unraid-ui && pnpm run build
- name: Run Tests Concurrently
run: |
set -e
# Run all tests in parallel with labeled output and coverage generation
echo "🚀 Starting API coverage tests..."
pnpm run coverage > api-test.log 2>&1 &
API_PID=$!
echo "🚀 Starting Connect plugin tests..."
(cd ../packages/unraid-api-plugin-connect && pnpm test --coverage 2>/dev/null || pnpm test) > connect-test.log 2>&1 &
CONNECT_PID=$!
echo "🚀 Starting Shared package tests..."
(cd ../packages/unraid-shared && pnpm test --coverage 2>/dev/null || pnpm test) > shared-test.log 2>&1 &
SHARED_PID=$!
echo "🚀 Starting Web package coverage tests..."
(cd ../web && (pnpm test --coverage || pnpm test)) > web-test.log 2>&1 &
WEB_PID=$!
echo "🚀 Starting UI package coverage tests..."
(cd ../unraid-ui && pnpm test --coverage 2>/dev/null || pnpm test) > ui-test.log 2>&1 &
UI_PID=$!
echo "🚀 Starting Plugin tests..."
(cd ../plugin && pnpm test) > plugin-test.log 2>&1 &
PLUGIN_PID=$!
# Wait for all processes and capture exit codes
wait $API_PID && echo "✅ API tests completed" || { echo "❌ API tests failed"; API_EXIT=1; }
wait $CONNECT_PID && echo "✅ Connect tests completed" || { echo "❌ Connect tests failed"; CONNECT_EXIT=1; }
wait $SHARED_PID && echo "✅ Shared tests completed" || { echo "❌ Shared tests failed"; SHARED_EXIT=1; }
wait $WEB_PID && echo "✅ Web tests completed" || { echo "❌ Web tests failed"; WEB_EXIT=1; }
wait $UI_PID && echo "✅ UI tests completed" || { echo "❌ UI tests failed"; UI_EXIT=1; }
wait $PLUGIN_PID && echo "✅ Plugin tests completed" || { echo "❌ Plugin tests failed"; PLUGIN_EXIT=1; }
# Display all outputs
echo "📋 API Test Results:" && cat api-test.log
echo "📋 Connect Plugin Test Results:" && cat connect-test.log
echo "📋 Shared Package Test Results:" && cat shared-test.log
echo "📋 Web Package Test Results:" && cat web-test.log
echo "📋 UI Package Test Results:" && cat ui-test.log
echo "📋 Plugin Test Results:" && cat plugin-test.log
# Exit with error if any test failed
if [[ ${API_EXIT:-0} -eq 1 || ${CONNECT_EXIT:-0} -eq 1 || ${SHARED_EXIT:-0} -eq 1 || ${WEB_EXIT:-0} -eq 1 || ${UI_EXIT:-0} -eq 1 || ${PLUGIN_EXIT:-0} -eq 1 ]]; then
exit 1
fi
- name: Upload all coverage reports to Codecov
uses: codecov/codecov-action@v5
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: ./coverage/coverage-final.json,../web/coverage/coverage-final.json,../unraid-ui/coverage/coverage-final.json,../packages/unraid-api-plugin-connect/coverage/coverage-final.json,../packages/unraid-shared/coverage/coverage-final.json
fail_ci_if_error: false
build-api:
name: Build API
runs-on: ubuntu-latest
- id: release
uses: googleapis/release-please-action@v4
outputs:
build_number: ${{ steps.buildnumber.outputs.build_number }}
releases_created: ${{ steps.release.outputs.releases_created }}
tag_name: ${{ steps.release.outputs.tag_name }}
start:
# This prevents a tag running twice as it'll have a "tag" and a "commit" event
# We only want the tag to run the action as it'll be able to create the release notes
if: (startsWith(github.event.ref, 'refs/heads/') && !startsWith(github.event.head_commit.message, 'chore(release)')) || (startsWith(github.event.ref, 'refs/tags/') && startsWith(github.event.head_commit.message, 'chore(release)'))
runs-on: ubuntu-latest
steps:
- name: Validate branch and tag
run: exit 0
lint-api:
continue-on-error: true
defaults:
run:
working-directory: api
steps:
- name: Checkout repo
uses: actions/checkout@v5
- uses: pnpm/action-setup@v4
name: Install pnpm
with:
run_install: false
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential
version: 1.0
- name: PNPM Install
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile
- name: Build
run: pnpm run build
- name: Get Git Short Sha and API version
id: vars
run: |
GIT_SHA=$(git rev-parse --short HEAD)
IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '')
PACKAGE_LOCK_VERSION=$(jq -r '.version' package.json)
API_VERSION=$([[ -n "$IS_TAGGED" ]] && echo "$PACKAGE_LOCK_VERSION" || echo "${PACKAGE_LOCK_VERSION}+${GIT_SHA}")
export API_VERSION
echo "API_VERSION=${API_VERSION}" >> $GITHUB_ENV
echo "PACKAGE_LOCK_VERSION=${PACKAGE_LOCK_VERSION}" >> $GITHUB_OUTPUT
- name: Generate build number
id: buildnumber
uses: onyxmueller/build-tag-number@v1
with:
token: ${{secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN}}
prefix: ${{steps.vars.outputs.PACKAGE_LOCK_VERSION}}
- name: Build
run: |
pnpm run build:release
tar -czf deploy/unraid-api.tgz -C deploy/pack/ .
- name: Upload tgz to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-api
path: ${{ github.workspace }}/api/deploy/unraid-api.tgz
build-unraid-ui-webcomponents:
name: Build Unraid UI Library (Webcomponent Version)
defaults:
run:
working-directory: unraid-ui
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
- uses: pnpm/action-setup@v4
name: Install pnpm
uses: actions/checkout@v4
with:
run_install: false
persist-credentials: false
- name: Reconfigure git to use HTTP authentication
run: >
git config --global url."https://github.com/".insteadOf
ssh://git@github.com/
- name: Install Node
uses: actions/setup-node@v5
- name: Install node
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
node-version-file: "api/.nvmrc"
- name: Cache APT Packages
uses: awalsh128/cache-apt-pkgs-action@v1.5.3
# - name: Get npm cache directory
# id: npm-cache
# run: echo "::set-output name=dir::$(npm config get cache)"
# - name: Load npm cache
# uses: actions/cache@v3
# with:
# path: ${{ steps.npm-cache.outputs.dir }}
# key: ${{ runner.os }}-npm-cache-${{ hashFiles('**/package-lock.json') }}
- name: Install libvirt-dev
run: sudo apt-get update && sudo apt-get install libvirt-dev
- name: Installing node deps
run: npm install
- name: Lint files
run: npm run lint
test-api:
defaults:
run:
working-directory: api
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
with:
packages: bash procps python3 libvirt-dev jq zstd git build-essential
version: 1.0
persist-credentials: false
- name: Install dependencies
- name: Reconfigure git to use HTTP authentication
run: >
git config --global url."https://github.com/".insteadOf
ssh://git@github.com/
- name: Build Docker Compose
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/ui
docker network create mothership_default
GIT_SHA=$(git rev-parse --short HEAD) IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '') docker compose build builder
- name: Lint
run: pnpm run lint
- name: Run Docker Compose
run: GIT_SHA=$(git rev-parse --short HEAD) IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match || echo '') docker compose run builder npm run coverage
- name: Build
run: pnpm run build:wc
- name: Upload Artifact to Github
uses: actions/upload-artifact@v4
with:
name: unraid-wc-ui
path: unraid-ui/dist-wc/
build-web:
# needs: [build-unraid-ui]
name: Build Web App
lint-web:
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v5
uses: actions/checkout@v4
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ secrets.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ secrets.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ secrets.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ secrets.VITE_CALLBACK_KEY }} >> .env
echo VITE_ACCOUNT=${{ vars.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ vars.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ vars.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ vars.VITE_CALLBACK_KEY }} >> .env
cat .env
- uses: pnpm/action-setup@v4
name: Install pnpm
- name: Install node
uses: actions/setup-node@v4
with:
run_install: false
cache: "npm"
cache-dependency-path: "web/package-lock.json"
node-version-file: "web/.nvmrc"
- name: Install Node
uses: actions/setup-node@v5
with:
node-version-file: ".nvmrc"
cache: 'pnpm'
- name: PNPM Install
run: |
cd ${{ github.workspace }}
pnpm install --frozen-lockfile --filter @unraid/web --filter @unraid/ui
- name: Build Unraid UI
run: |
cd ${{ github.workspace }}/unraid-ui
pnpm run build
- name: Installing node deps
run: npm install
- name: Lint files
run: pnpm run lint
run: npm run lint
- name: Type Check
run: pnpm run type-check
build-api:
defaults:
run:
working-directory: api
runs-on: ubuntu-latest
- name: Test
run: pnpm run test:ci
outputs:
API_VERSION: ${{ steps.build-pack-binary.outputs.API_VERSION }}
API_MD5: ${{ steps.set-hashes.outputs.API_MD5 }}
API_SHA256: ${{ steps.set-hashes.outputs.API_SHA256 }}
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Add SSH deploy key
uses: shimataro/ssh-key-action@v2
with:
key: ${{ secrets.UNRAID_BOT_SSH_KEY }}
known_hosts: ${{ secrets.KNOWN_HOSTS }}
- name: Install node
uses: actions/setup-node@v4
with:
node-version-file: "api/.nvmrc"
- name: Install libvirt-dev
run: sudo apt-get update && sudo apt-get install libvirt-dev
- name: Installing node deps
run: npm install
- name: Install pkg and node-prune
run: npm i -g pkg && curl -sf https://gobinaries.com/tj/node-prune | sh
# See https://github.com/apollographql/subscriptions-transport-ws/issues/433
- name: Patch subscriptions-transport-ws
run: npm run patch:subscriptions-transport-ws
- name: Build and Pack
id: build-pack-binary
run: WORKDIR=${{ github.workspace }} && npm run build-pkg
- name: Set Hashes
id: set-hashes
run: |
API_MD5=$(md5sum ${{ github.workspace }}/api/deploy/release/*.tgz | awk '{ print $1 }')
API_SHA256=$(sha256sum ${{ github.workspace }}/api/deploy/release/*.tgz | awk '{ print $1 }')
echo "::set-output name=API_MD5::${API_MD5}"
echo "::set-output name=API_SHA256::${API_SHA256}"
- name: Upload tgz to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-api
path: ${{ github.workspace }}/api/deploy/release/*.tgz
build-web:
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
environment:
name: production
needs: [lint-web]
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ vars.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ vars.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ vars.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ vars.VITE_CALLBACK_KEY }} >> .env
cat .env
- name: Install node
uses: actions/setup-node@v4
with:
cache: "npm"
cache-dependency-path: "web/package-lock.json"
node-version-file: "web/.nvmrc"
- name: Installing node deps
run: npm install
- name: Build
run: pnpm run build
run: npm run build
- name: Upload build to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-wc-rich
path: web/dist
name: unraid-web
path: web/.nuxt/nuxt-custom-elements/dist/unraid-components
release-please:
name: Release Please
build-plugin:
needs: [lint-api, lint-web, test-api, build-api, build-web]
defaults:
run:
working-directory: plugin
runs-on: ubuntu-latest
# Only run on pushes to main AND after tests pass
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
needs:
- test-api
- build-api
- build-web
- build-unraid-ui-webcomponents
permissions:
contents: write
pull-requests: write
steps:
- name: Checkout
uses: actions/checkout@v5
- name: Set Timezone
uses: szenius/set-timezone@v1.2
with:
timezoneLinux: "America/Los_Angeles"
- name: Checkout repo
uses: actions/checkout@v4
- name: Download unraid web components
uses: actions/download-artifact@v4
with:
name: unraid-web
path: ./plugin/source/dynamix.unraid.net/usr/local/emhttp/plugins/dynamix.my.servers/unraid-components
- name: Build Plugin
run: |
cd source/dynamix.unraid.net
export API_VERSION=${{needs.build-api.outputs.API_VERSION}}
export API_MD5=${{needs.build-api.outputs.API_MD5}}
export API_SHA256=${{needs.build-api.outputs.API_SHA256}}
bash ./pkg_build.sh s
bash ./pkg_build.sh p
- name: Upload binary txz and plg to Github artifacts
uses: actions/upload-artifact@v4
with:
name: connect-files
path: |
${{ github.workspace }}/plugin/archive/*.txz
${{ github.workspace }}/plugin/plugins/*.plg
retention-days: 5
if-no-files-found: error
- id: release
uses: googleapis/release-please-action@v4
outputs:
releases_created: ${{ steps.release.outputs.releases_created || 'false' }}
tag_name: ${{ steps.release.outputs.tag_name || '' }}
release-staging:
# Only release if this is a push to the main branch
if: startsWith(github.ref, 'refs/heads/main')
runs-on: ubuntu-latest
needs: [build-plugin]
build-plugin-staging-pr:
name: Build and Deploy Plugin
needs:
- build-api
- build-web
- build-unraid-ui-webcomponents
- test-api
uses: ./.github/workflows/build-plugin.yml
with:
RELEASE_CREATED: false
TAG: ${{ github.event.pull_request.number && format('PR{0}', github.event.pull_request.number) || '' }}
BUCKET_PATH: ${{ github.event.pull_request.number && format('unraid-api/tag/PR{0}', github.event.pull_request.number) || 'unraid-api' }}
BASE_URL: "https://preview.dl.unraid.net/unraid-api"
BUILD_NUMBER: ${{ needs.build-api.outputs.build_number }}
secrets:
CF_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
CF_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
CF_BUCKET_PREVIEW: ${{ secrets.CF_BUCKET_PREVIEW }}
CF_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
steps:
- name: Checkout repo
uses: actions/checkout@v4
build-plugin-production:
if: ${{ needs.release-please.outputs.releases_created == 'true' }}
name: Build and Deploy Production Plugin
needs:
- release-please
- build-api
uses: ./.github/workflows/build-plugin.yml
with:
RELEASE_CREATED: true
RELEASE_TAG: ${{ needs.release-please.outputs.tag_name }}
TAG: ""
BUCKET_PATH: unraid-api
BASE_URL: "https://stable.dl.unraid.net/unraid-api"
BUILD_NUMBER: ${{ needs.build-api.outputs.build_number }}
secrets:
CF_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
CF_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
CF_BUCKET_PREVIEW: ${{ secrets.CF_BUCKET_PREVIEW }}
CF_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
UNRAID_BOT_GITHUB_ADMIN_TOKEN: ${{ secrets.UNRAID_BOT_GITHUB_ADMIN_TOKEN }}
- name: Make Staging Release Folder
run: mkdir staging-release/
- name: Download unraid-api binary tgz
uses: actions/download-artifact@v4
with:
name: unraid-api
path: staging-release
- name: Download plugin binary tgz
uses: actions/download-artifact@v4
with:
name: connect-files
- name: Parse Changelog
id: changelog
uses: ocavue/changelog-parser-action@v1
with:
removeMarkdown: false
filePath: "./api/CHANGELOG.md"
- name: Run LS in unraid-api folder
run: |
cp archive/dynamix.unraid.net.staging-*.txz staging-release/
cp plugins/dynamix.unraid.net.staging.plg staging-release/
ls -al staging-release
- name: Upload Staging Plugin to DO Spaces
uses: BetaHuhn/do-spaces-action@v2
with:
access_key: ${{ secrets.DO_ACCESS_KEY }}
secret_key: ${{ secrets.DO_SECRET_KEY }}
space_name: ${{ secrets.DO_SPACE_NAME }}
space_region: ${{ secrets.DO_SPACE_REGION }}
source: staging-release
out_dir: unraid-api
- name: Upload Staging Plugin to Cloudflare Bucket
uses: jakejarvis/s3-sync-action@v0.5.1
env:
AWS_S3_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
AWS_S3_BUCKET: ${{ secrets.CF_BUCKET_PREVIEW }}
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_REGION: 'auto'
SOURCE_DIR: staging-release
DEST_DIR: unraid-api
create-draft-release:
# Only create new draft if this is a version tag
if: |
startsWith(github.ref, 'refs/tags/v')
runs-on: ubuntu-latest
needs: [build-plugin]
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Download unraid-api binary tgz
uses: actions/download-artifact@v4
with:
name: unraid-api
- name: Download plugin binary tgz
uses: actions/download-artifact@v4
with:
name: connect-files
- name: Create Github release
uses: softprops/action-gh-release@v1
with:
draft: true
prerelease: false
files: |
unraid-api-*.tgz
plugins/dynamix.unraid.net*
archive/dynamix.unraid.net*
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

82
.github/workflows/pull-request-web.yml vendored Normal file
View File

@@ -0,0 +1,82 @@
name: Pull Request Web
on:
pull_request:
paths:
- 'web/**'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}-web
cancel-in-progress: true
jobs:
lint-web:
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ vars.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ vars.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ vars.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ vars.VITE_CALLBACK_KEY }} >> .env
cat .env
- name: Install node
uses: actions/setup-node@v4
with:
cache: "npm"
cache-dependency-path: "web/package-lock.json"
node-version-file: "web/.nvmrc"
- name: Installing node deps
run: npm install
- name: Lint files
run: npm run lint
build-web:
defaults:
run:
working-directory: web
runs-on: ubuntu-latest
environment:
name: production
needs: [lint-web]
steps:
- name: Checkout repo
uses: actions/checkout@v4
- name: Create env file
run: |
touch .env
echo VITE_ACCOUNT=${{ vars.VITE_ACCOUNT }} >> .env
echo VITE_CONNECT=${{ vars.VITE_CONNECT }} >> .env
echo VITE_UNRAID_NET=${{ vars.VITE_UNRAID_NET }} >> .env
echo VITE_CALLBACK_KEY=${{ vars.VITE_CALLBACK_KEY }} >> .env
cat .env
- name: Install node
uses: actions/setup-node@v4
with:
cache: "npm"
cache-dependency-path: "web/package-lock.json"
node-version-file: "web/.nvmrc"
- name: Installing node deps
run: npm install
- name: Build
run: npm run build
- name: Upload build to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-web
path: web/.nuxt/nuxt-custom-elements/dist/unraid-components

183
.github/workflows/pull-request.yml vendored Normal file
View File

@@ -0,0 +1,183 @@
name: Pull Request
on:
pull_request:
paths:
- api/**
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
lint-api:
services:
registry: # Using a local registry is ~3x faster than exporting the image to docker agent
image: registry:2
ports:
- 5000:5000
continue-on-error: true
defaults:
run:
working-directory: api
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
with:
persist-credentials: true
- uses: docker/setup-buildx-action@v3
with:
# network=host driver-opt needed to push to local registry
driver-opts: network=host
- name: Build and push
uses: docker/build-push-action@v5
with:
context: api
target: builder
push: true
tags: localhost:5000/unraid-api:builder
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Lint
run: |
docker run localhost:5000/unraid-api:builder npm run lint
test-api:
services:
registry: # Using a local registry is ~3x faster than exporting the image to docker agent
image: registry:2
ports:
- 5000:5000
defaults:
run:
working-directory: api
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
with:
persist-credentials: true
- uses: docker/setup-buildx-action@v3
with:
# network=host driver-opt needed to push to local registry
driver-opts: network=host
- name: Build and push
uses: docker/build-push-action@v5
with:
context: api
target: builder
push: true
tags: localhost:5000/unraid-api:builder
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Test
run: |
docker run localhost:5000/unraid-api:builder npm run coverage
build-api:
services:
registry: # Using a local registry is ~3x faster than exporting the image to docker agent
image: registry:2
ports:
- 5000:5000
defaults:
run:
working-directory: api
runs-on: ubuntu-latest
outputs:
API_VERSION: ${{ steps.build-pack-binary.outputs.API_VERSION }}
API_MD5: ${{ steps.set-hashes.outputs.API_MD5 }}
API_SHA256: ${{ steps.set-hashes.outputs.API_SHA256 }}
steps:
- name: Checkout repo
uses: actions/checkout@v4
with:
persist-credentials: true
- uses: docker/setup-buildx-action@v3
with:
# network=host driver-opt needed to push to local registry
driver-opts: network=host
- name: Build and push
uses: docker/build-push-action@v5
with:
context: api
target: builder
push: true
tags: localhost:5000/unraid-api:builder
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Run Build
run: docker run -e GIT_SHA=$(git rev-parse --short HEAD) -e IS_TAGGED=$(git describe --tags --abbrev=0 --exact-match) -v $(pwd)/deploy:/app/deploy/ localhost:5000/unraid-api:builder npm run build-pkg
- name: Set Hashes
id: set-hashes
run: |
API_MD5=$(md5sum ${{ github.workspace }}/api/deploy/release/*.tgz | awk '{ print $1 }')
API_SHA256=$(sha256sum ${{ github.workspace }}/api/deploy/release/*.tgz | awk '{ print $1 }')
echo "::set-output name=API_MD5::${API_MD5}"
echo "::set-output name=API_SHA256::${API_SHA256}"
- name: Upload tgz to Github artifacts
uses: actions/upload-artifact@v4
with:
name: unraid-api
path: ${{ github.workspace }}/api/deploy/release/*.tgz
- name: Parse Changelog
id: changelog
uses: ocavue/changelog-parser-action@v1
with:
removeMarkdown: false
filePath: "./api/CHANGELOG.md"
- name: View release notes
run: |
escapedNotes=$(sed -e 's/[&\\/]/\\&/g; s/$/\\/' -e '$s/\\$//' <<<"${{steps.changelog.outputs.latestBody}}")
echo "${escapedNotes}"
build-plugin:
defaults:
run:
working-directory: plugin
runs-on: ubuntu-latest
needs: [lint-api, test-api, build-api]
steps:
- name: Set Timezone
uses: szenius/set-timezone@v1.2
with:
timezoneLinux: "America/Los_Angeles"
- name: Checkout repo
uses: actions/checkout@v4
- name: Build Plugin
run: |
cd source/dynamix.unraid.net
export API_VERSION=${{needs.build-api.outputs.API_VERSION}}
export API_MD5=${{needs.build-api.outputs.API_MD5}}
export API_SHA256=${{needs.build-api.outputs.API_SHA256}}
bash ./pkg_build.sh s
bash ./pkg_build.sh p
- name: Create release notes
run: |
LAST_RELEASE=$(git tag --list --sort=v:refname | tail -1)
echo ${LAST_RELEASE}
RELEASE_NOTES=$(git log "$LAST_RELEASE...HEAD" --pretty=format:"- %s [\`%h\`](http://github.com/$GITHUB_REPOSITORY/commit/%H)" --reverse)
echo "${RELEASE_NOTES}"
# escapedNotes=$(sed -e 's/[&\\/]/\\&/g; s/$/\\/' -e '$s/\\$//' <<<"${RELEASE_NOTES}")
# sed -i -z -E "s/<CHANGES>(.*)<\/CHANGES>/<CHANGES>\n${escapedNotes}\n<\/CHANGES>/g" "plugins/dynamix.unraid.net.staging.plg"
- name: Upload binary txz and plg to Github artifacts
uses: actions/upload-artifact@v4
with:
name: connect-files
path: |
${{ github.workspace }}/plugin/archive/*.txz
${{ github.workspace }}/plugin/plugins/*.plg
retention-days: 5
if-no-files-found: error

View File

@@ -1,142 +0,0 @@
name: Replace PR Plugin with Staging Redirect on Merge
# This workflow runs when a PR is merged and replaces the PR-specific plugin
# with a redirect version that points to the main staging URL.
# This ensures users who installed the PR version will automatically
# update to the staging version on their next update check.
on:
pull_request:
types:
- closed
workflow_dispatch:
inputs:
pr_number:
description: "PR number to test with"
required: true
type: string
pr_merged:
description: "Simulate merged PR"
required: true
type: boolean
default: true
jobs:
push-staging-redirect:
if: (github.event_name == 'pull_request' && github.event.pull_request.merged == true) || (github.event_name == 'workflow_dispatch' && inputs.pr_merged == true)
runs-on: ubuntu-latest
permissions:
contents: read
actions: read
steps:
- name: Set PR number
id: pr_number
run: |
if [ "${{ github.event_name }}" == "pull_request" ]; then
echo "pr_number=${{ github.event.pull_request.number }}" >> $GITHUB_OUTPUT
else
echo "pr_number=${{ inputs.pr_number }}" >> $GITHUB_OUTPUT
fi
- name: Download artifact
uses: dawidd6/action-download-artifact@v11
with:
name_is_regexp: true
name: unraid-plugin-.*
path: connect-files
pr: ${{ steps.pr_number.outputs.pr_number }}
workflow: main.yml
workflow_conclusion: success
search_artifacts: true
if_no_artifact_found: fail
- name: Update Downloaded Plugin to Redirect to Staging
run: |
# Find the .plg file in the downloaded artifact
plgfile=$(find connect-files -name "*.plg" -type f | head -1)
if [ ! -f "$plgfile" ]; then
echo "ERROR: .plg file not found in connect-files/"
ls -la connect-files/
exit 1
fi
echo "Found plugin file: $plgfile"
# Get current version and bump it with current timestamp
current_version=$(grep '<!ENTITY version' "${plgfile}" | sed -E 's/.*"(.*)".*/\1/')
echo "Current version: ${current_version}"
# Create new version with current timestamp (ensures it's newer)
new_version=$(date +"%Y.%m.%d.%H%M")
echo "New redirect version: ${new_version}"
# Update version to trigger update
sed -i -E "s#(<!ENTITY version \").*(\">)#\1${new_version}\2#g" "${plgfile}" || exit 1
# Change the plugin url to point to staging - users will switch to staging on next update
url="https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.plg"
sed -i -E "s#(<!ENTITY plugin_url \").*?(\">)#\1${url}\2#g" "${plgfile}" || exit 1
echo "Modified plugin to redirect to: ${url}"
echo "Version bumped from ${current_version} to ${new_version}"
mkdir -p pr-release
mv "${plgfile}" pr-release/dynamix.unraid.net.plg
- name: Clean up old PR artifacts from Cloudflare
env:
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
# Delete all existing files in the PR directory first (txz, plg, etc.)
aws s3 rm s3://${{ secrets.CF_BUCKET_PREVIEW }}/unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}/ \
--recursive \
--endpoint-url ${{ secrets.CF_ENDPOINT }}
echo "✅ Cleaned up old PR artifacts"
- name: Upload PR Redirect Plugin to Cloudflare
env:
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
run: |
# Upload only the redirect plugin file
aws s3 cp pr-release/dynamix.unraid.net.plg \
s3://${{ secrets.CF_BUCKET_PREVIEW }}/unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}/dynamix.unraid.net.plg \
--endpoint-url ${{ secrets.CF_ENDPOINT }} \
--content-encoding none \
--acl public-read
echo "✅ Uploaded redirect plugin"
- name: Output redirect information
run: |
echo "✅ PR plugin replaced with staging redirect version"
echo "PR URL remains: https://preview.dl.unraid.net/unraid-api/tag/PR${{ steps.pr_number.outputs.pr_number }}/dynamix.unraid.net.plg"
echo "Redirects users to staging: https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.plg"
echo "Users updating from this PR version will automatically switch to staging"
- name: Comment on PR about staging redirect
if: github.event_name == 'pull_request'
uses: thollander/actions-comment-pull-request@v3
with:
comment-tag: pr-closed-staging
mode: recreate
message: |
## 🔄 PR Merged - Plugin Redirected to Staging
This PR has been merged and the preview plugin has been updated to redirect to the staging version.
**For users testing this PR:**
- Your plugin will automatically update to the staging version on the next update check
- The staging version includes all merged changes from this PR
- No manual intervention required
**Staging URL:**
```
https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.plg
```
Thank you for testing! 🚀

View File

@@ -1,14 +1,11 @@
name: Publish Release
name: Publish Release to Digital Ocean
on:
workflow_dispatch:
inputs:
version:
description: 'Tag to release - will replace active release'
required: true
release:
types: [published]
jobs:
publish:
publish-to-digital-ocean:
runs-on: ubuntu-latest
steps:
@@ -19,7 +16,7 @@ jobs:
regex: true
token: ${{ secrets.GITHUB_TOKEN }}
target: "./"
version: ${{ inputs.version && format('tags/{0}', inputs.version) || 'latest' }}
version: "latest"
- uses: cardinalby/git-get-release-action@v1
id: release-info
@@ -28,118 +25,33 @@ jobs:
with:
latest: true
prerelease: false
- uses: actions/setup-node@v5
with:
node-version: 22.19.0
- run: |
cat << 'EOF' > release-notes.txt
- name: Get Release Changelog
run: |
notes=$(cat << EOF
${{ steps.release-info.outputs.body }}
EOF
- run: npm install html-escaper@2 xml2js
- name: Update Plugin Changelog
uses: actions/github-script@v8
)
escapedNotes=$(sed -e 's/[&\\/]/\\&/g; s/$/\\/' -e '$s/\\$//' <<<"$notes")
sed -i -z -E "s/<CHANGES>(.*)<\/CHANGES>/<CHANGES>\n${escapedNotes}\n<\/CHANGES>/g" "dynamix.unraid.net.plg"
sed -i -z -E "s/<CHANGES>(.*)<\/CHANGES>/<CHANGES>\n${escapedNotes}\n<\/CHANGES>/g" "dynamix.unraid.net.staging.plg"
- name: Upload All Release Files to DO Spaces
uses: BetaHuhn/do-spaces-action@v2
with:
script: |
const fs = require('fs');
const { escape } = require('html-escaper');
access_key: ${{ secrets.DO_ACCESS_KEY }}
secret_key: ${{ secrets.DO_SECRET_KEY }}
space_name: ${{ secrets.DO_SPACE_NAME }}
space_region: ${{ secrets.DO_SPACE_REGION }}
source: "."
out_dir: unraid-api
const releaseNotes = escape(fs.readFileSync('release-notes.txt', 'utf8'));
if (!releaseNotes) {
console.error('No release notes found');
process.exit(1);
}
// Read the plugin file
const pluginPath = 'dynamix.unraid.net.plg';
if (!fs.existsSync(pluginPath)) {
console.error('Plugin file not found:', pluginPath);
process.exit(1);
}
let pluginContent = fs.readFileSync(pluginPath, 'utf8');
// Replace the changelog section using CDATA
pluginContent = pluginContent.replace(
/<CHANGES>[\s\S]*?<\/CHANGES>/,
`<CHANGES>\n${releaseNotes}\n</CHANGES>`
);
// Validate the plugin file is valid XML
const xml2js = require('xml2js');
const parser = new xml2js.Parser({
explicitCharkey: true,
trim: true,
explicitRoot: true,
explicitArray: false,
attrkey: 'ATTR',
charkey: 'TEXT',
xmlnskey: 'XMLNS',
normalizeTags: false,
normalize: false,
strict: false // Try with less strict parsing
});
parser.parseStringPromise(pluginContent).then((result) => {
if (!result) {
console.error('Plugin file is not valid XML');
process.exit(1);
}
console.log('Plugin file is valid XML');
// Write back to file
fs.writeFileSync(pluginPath, pluginContent);
}).catch((err) => {
console.error('Plugin file is not valid XML', err);
process.exit(1);
});
- name: Cleanup Inline Scripts
run: |
rm -rf node_modules/
- name: Upload Release Files to DO Spaces
env:
AWS_ACCESS_KEY_ID: ${{ secrets.DO_ACCESS_KEY }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.DO_SECRET_KEY }}
AWS_DEFAULT_REGION: ${{ secrets.DO_SPACE_REGION }}
AWS_ENDPOINT_URL: https://${{ secrets.DO_SPACE_REGION }}.digitaloceanspaces.com
run: |
# Upload files with explicit content encoding and public-read ACL
aws s3 sync . s3://${{ secrets.DO_SPACE_NAME }}/unraid-api \
--checksum-algorithm CRC32 \
--no-guess-mime-type \
--content-encoding none \
--acl public-read
- name: Upload Release Files to Cloudflare Bucket
- name: Upload Staging Plugin to Cloudflare Bucket
uses: jakejarvis/s3-sync-action@v0.5.1
env:
AWS_S3_ENDPOINT: ${{ secrets.CF_ENDPOINT }}
AWS_S3_BUCKET: ${{ secrets.CF_BUCKET }}
AWS_ACCESS_KEY_ID: ${{ secrets.CF_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.CF_SECRET_ACCESS_KEY }}
AWS_DEFAULT_REGION: auto
AWS_ENDPOINT_URL: ${{ secrets.CF_ENDPOINT }}
run: |
# Upload files with explicit content encoding and public-read ACL
aws s3 sync . s3://${{ secrets.CF_BUCKET }}/unraid-api \
--checksum-algorithm CRC32 \
--no-guess-mime-type \
--content-encoding none \
--acl public-read
- name: Discord Webhook Notification
uses: tsickert/discord-webhook@v7.0.0
with:
webhook-url: ${{ secrets.PUBLIC_DISCORD_RELEASE_ENDPOINT }}
username: "Unraid API Bot"
avatar-url: "https://craftassets.unraid.net/uploads/logos/un-mark-gradient.png"
embed-title: "🚀 Unraid API ${{ inputs.version }} Released!"
embed-url: "https://github.com/${{ github.repository }}/releases/tag/${{ inputs.version }}"
embed-description: |
A new version of Unraid API has been released!
**Version:** `${{ inputs.version }}`
**Release Page:** [View on GitHub](https://github.com/${{ github.repository }}/releases/tag/${{ inputs.version }})
**📋 Changelog:**
${{ steps.release-info.outputs.body }}
embed-color: 16734296
embed-footer-text: "Unraid API • Automated Release"
AWS_REGION: 'auto'
SOURCE_DIR: "."
DEST_DIR: unraid-api

46
.gitignore vendored
View File

@@ -24,15 +24,10 @@ build/Release
# Dependency directories
node_modules/
jspm_packages/
unraid-ui/node_modules/
# TypeScript v1 declaration files
typings/
# Auto-generated type declarations for Nuxt UI
auto-imports.d.ts
components.d.ts
# Optional npm cache directory
.npm
@@ -56,20 +51,18 @@ components.d.ts
# Visual Studio Code workspace
.vscode/sftp.json
.history/
# Jetbrains
.idea
# OSX
.DS_Store
# Jetbrains Settings Files
.idea
# Temp dir for tests
test/__temp__/*
# Built files
dist
unraid-ui/storybook-static
# Typescript
typescript
@@ -80,10 +73,7 @@ typescript
# Github actions
RELEASE_NOTES.md
# Test backups
api/dev/configs/api.json.backup
# Docker Deploy Folder
deploy/*
!deploy/.gitkeep
@@ -95,31 +85,7 @@ deploy/*
.nitro
.cache
.output
.env*
!.env.example
fb_keepalive
# pnpm store
.pnpm-store
# Nix
result
result-*
.direnv/
.envrc
# Webgui sync script helpers
web/scripts/.sync-webgui-repo-*
# Activation code data
plugin/source/dynamix.unraid.net/usr/local/emhttp/plugins/dynamix.my.servers/data/activation-data.php
# Config file that changes between versions
api/dev/Unraid.net/myservers.cfg
# Claude local settings
.claude/settings.local.json
# local Mise settings
.mise.toml
fb_keepalive

View File

@@ -1,12 +0,0 @@
#!/bin/sh
if [ "$SKIP_SIMPLE_GIT_HOOKS" = "1" ]; then
echo "[INFO] SKIP_SIMPLE_GIT_HOOKS is set to 1, skipping hook."
exit 0
fi
if [ -f "$SIMPLE_GIT_HOOKS_RC" ]; then
. "$SIMPLE_GIT_HOOKS_RC"
fi
pnpm lint-staged

1
.npmrc
View File

@@ -1 +0,0 @@

1
.nvmrc
View File

@@ -1 +0,0 @@
22.18.0

View File

@@ -1 +0,0 @@
1.69.1

View File

@@ -1 +1 @@
{".":"4.23.1"}
{"api":"3.10.0","web":"3.10.0"}

View File

@@ -1,6 +1,8 @@
{
"recommendations": [
"natizyskunk.sftp",
"davidanson.vscode-markdownlint",
"bmewburn.vscode-intelephense-client",
"foxundermoon.shell-format",
"timonwong.shellcheck",
"esbenp.prettier-vscode"

34
.vscode/settings.json vendored Normal file
View File

@@ -0,0 +1,34 @@
{
"files.associations": {
"*.page": "php"
},
"editor.codeActionsOnSave": {
"source.fixAll": "never",
"source.fixAll.eslint": "explicit"
},
"workbench.colorCustomizations": {
"activityBar.activeBackground": "#78797d",
"activityBar.background": "#78797d",
"activityBar.foreground": "#e7e7e7",
"activityBar.inactiveForeground": "#e7e7e799",
"activityBarBadge.background": "#df9fac",
"activityBarBadge.foreground": "#15202b",
"commandCenter.border": "#e7e7e799",
"sash.hoverBorder": "#78797d",
"statusBar.background": "#5f6063",
"statusBar.foreground": "#e7e7e7",
"statusBarItem.hoverBackground": "#78797d",
"statusBarItem.remoteBackground": "#5f6063",
"statusBarItem.remoteForeground": "#e7e7e7",
"titleBar.activeBackground": "#5f6063",
"titleBar.activeForeground": "#e7e7e7",
"titleBar.inactiveBackground": "#5f606399",
"titleBar.inactiveForeground": "#e7e7e799"
},
"peacock.color": "#5f6063",
"i18n-ally.localesPaths": [
"locales"
],
"i18n-ally.keystyle": "flat",
"eslint.experimental.useFlatConfig": true,
}

21
.vscode/sftp-template.json vendored Normal file
View File

@@ -0,0 +1,21 @@
{
"_comment": "rename this file to .vscode/sftp.json and replace name/host/privateKeyPath for your system",
"name": "Tower",
"host": "Tower.local",
"protocol": "sftp",
"port": 22,
"username": "root",
"privateKeyPath": "C:/Users/username/.ssh/tower",
"remotePath": "/",
"context": "plugin/source/dynamix.unraid.net/",
"uploadOnSave": true,
"useTempFile": false,
"openSsh": false,
"ignore": [
"// comment: ignore dot files/dirs in root of repo",
".github",
".vscode",
".git",
".DS_Store"
]
}

View File

@@ -1,96 +0,0 @@
@custom-variant dark (&:where(.dark, .dark *));
/* Utility defaults for web components (when we were using shadow DOM) */
:host {
--tw-divide-y-reverse: 0;
--tw-border-style: solid;
--tw-font-weight: initial;
--tw-tracking: initial;
--tw-translate-x: 0;
--tw-translate-y: 0;
--tw-translate-z: 0;
--tw-rotate-x: rotateX(0);
--tw-rotate-y: rotateY(0);
--tw-rotate-z: rotateZ(0);
--tw-skew-x: skewX(0);
--tw-skew-y: skewY(0);
--tw-space-x-reverse: 0;
--tw-gradient-position: initial;
--tw-gradient-from: #0000;
--tw-gradient-via: #0000;
--tw-gradient-to: #0000;
--tw-gradient-stops: initial;
--tw-gradient-via-stops: initial;
--tw-gradient-from-position: 0%;
--tw-gradient-via-position: 50%;
--tw-gradient-to-position: 100%;
--tw-shadow: 0 0 #0000;
--tw-shadow-color: initial;
--tw-inset-shadow: 0 0 #0000;
--tw-inset-shadow-color: initial;
--tw-ring-color: initial;
--tw-ring-shadow: 0 0 #0000;
--tw-inset-ring-color: initial;
--tw-inset-ring-shadow: 0 0 #0000;
--tw-ring-inset: initial;
--tw-ring-offset-width: 0px;
--tw-ring-offset-color: #fff;
--tw-ring-offset-shadow: 0 0 #0000;
--tw-blur: initial;
--tw-brightness: initial;
--tw-contrast: initial;
--tw-grayscale: initial;
--tw-hue-rotate: initial;
--tw-invert: initial;
--tw-opacity: initial;
--tw-saturate: initial;
--tw-sepia: initial;
--tw-drop-shadow: initial;
--tw-duration: initial;
--tw-ease: initial;
}
/* Global border color - this is what's causing the issue! */
/* Commenting out since it affects all elements globally
*,
::after,
::before,
::backdrop,
::file-selector-button {
border-color: hsl(var(--border));
}
*/
body {
--color-alpha: #1c1b1b;
--color-beta: #f2f2f2;
--color-gamma: #999999;
--color-gamma-opaque: rgba(153, 153, 153, 0.5);
--color-customgradient-start: rgba(242, 242, 242, 0);
--color-customgradient-end: rgba(242, 242, 242, 0.85);
--shadow-beta: 0 25px 50px -12px rgba(242, 242, 242, 0.15);
--ring-offset-shadow: 0 0 var(--color-beta);
--ring-shadow: 0 0 var(--color-beta);
}
button:not(:disabled),
[role='button']:not(:disabled) {
cursor: pointer;
}
/* Font size overrides for SSO button component */
unraid-sso-button {
--text-xs: 0.75rem;
--text-sm: 0.875rem;
--text-base: 1rem;
--text-lg: 1.125rem;
--text-xl: 1.25rem;
--text-2xl: 1.5rem;
--text-3xl: 1.875rem;
--text-4xl: 2.25rem;
--text-5xl: 3rem;
--text-6xl: 3.75rem;
--text-7xl: 4.5rem;
--text-8xl: 6rem;
--text-9xl: 8rem;
}

View File

@@ -1,144 +0,0 @@
/* Hybrid theme system: Native CSS + Theme Store fallback */
/* Light mode defaults */
:root {
/* Nuxt UI Color System - Primary (Orange for Unraid) */
--ui-color-primary-50: #fff7ed;
--ui-color-primary-100: #ffedd5;
--ui-color-primary-200: #fed7aa;
--ui-color-primary-300: #fdba74;
--ui-color-primary-400: #fb923c;
--ui-color-primary-500: #ff8c2f;
--ui-color-primary-600: #ea580c;
--ui-color-primary-700: #c2410c;
--ui-color-primary-800: #9a3412;
--ui-color-primary-900: #7c2d12;
--ui-color-primary-950: #431407;
/* Nuxt UI Color System - Neutral (True Gray) */
--ui-color-neutral-50: #fafafa;
--ui-color-neutral-100: #f5f5f5;
--ui-color-neutral-200: #e5e5e5;
--ui-color-neutral-300: #d4d4d4;
--ui-color-neutral-400: #a3a3a3;
--ui-color-neutral-500: #737373;
--ui-color-neutral-600: #525252;
--ui-color-neutral-700: #404040;
--ui-color-neutral-800: #262626;
--ui-color-neutral-900: #171717;
--ui-color-neutral-950: #0a0a0a;
/* Nuxt UI Default color shades */
--ui-primary: var(--ui-color-primary-500);
--ui-secondary: var(--ui-color-neutral-500);
/* Nuxt UI Design Tokens - Text */
--ui-text-dimmed: var(--ui-color-neutral-400);
--ui-text-muted: var(--ui-color-neutral-500);
--ui-text-toned: var(--ui-color-neutral-600);
--ui-text: var(--ui-color-neutral-700);
--ui-text-highlighted: var(--ui-color-neutral-900);
--ui-text-inverted: white;
/* Nuxt UI Design Tokens - Background */
--ui-bg: white;
--ui-bg-muted: var(--ui-color-neutral-50);
--ui-bg-elevated: var(--ui-color-neutral-100);
--ui-bg-accented: var(--ui-color-neutral-200);
--ui-bg-inverted: var(--ui-color-neutral-900);
/* Nuxt UI Design Tokens - Border */
--ui-border: var(--ui-color-neutral-200);
--ui-border-muted: var(--ui-color-neutral-200);
--ui-border-accented: var(--ui-color-neutral-300);
--ui-border-inverted: var(--ui-color-neutral-900);
/* Nuxt UI Radius */
--ui-radius: 0.5rem;
--background: 0 0% 100%;
--foreground: 0 0% 3.9%;
--muted: 0 0% 96.1%;
--muted-foreground: 0 0% 45.1%;
--popover: 0 0% 100%;
--popover-foreground: 0 0% 3.9%;
--card: 0 0% 100%;
--card-foreground: 0 0% 3.9%;
--border: 0 0% 89.8%;
--input: 0 0% 89.8%;
--primary: 24 100% 50%; /* Orange #ff8c2f in HSL */
--primary-foreground: 0 0% 98%;
--secondary: 0 0% 96.1%;
--secondary-foreground: 0 0% 9%;
--accent: 0 0% 96.1%;
--accent-foreground: 0 0% 9%;
--destructive: 0 84.2% 60.2%;
--destructive-foreground: 0 0% 98%;
--ring: 24 100% 50%; /* Orange ring to match primary */
--chart-1: 12 76% 61%;
--chart-2: 173 58% 39%;
--chart-3: 197 37% 24%;
--chart-4: 43 74% 66%;
--chart-5: 27 87% 67%;
}
/* Dark mode */
.dark {
/* Nuxt UI Default color shades - Dark mode */
--ui-primary: var(--ui-color-primary-400);
--ui-secondary: var(--ui-color-neutral-400);
/* Nuxt UI Design Tokens - Text (Dark) */
--ui-text-dimmed: var(--ui-color-neutral-500);
--ui-text-muted: var(--ui-color-neutral-400);
--ui-text-toned: var(--ui-color-neutral-300);
--ui-text: var(--ui-color-neutral-200);
--ui-text-highlighted: white;
--ui-text-inverted: var(--ui-color-neutral-900);
/* Nuxt UI Design Tokens - Background (Dark) */
--ui-bg: var(--ui-color-neutral-900);
--ui-bg-muted: var(--ui-color-neutral-800);
--ui-bg-elevated: var(--ui-color-neutral-800);
--ui-bg-accented: var(--ui-color-neutral-700);
--ui-bg-inverted: white;
/* Nuxt UI Design Tokens - Border (Dark) */
--ui-border: var(--ui-color-neutral-800);
--ui-border-muted: var(--ui-color-neutral-700);
--ui-border-accented: var(--ui-color-neutral-700);
--ui-border-inverted: white;
--background: 0 0% 3.9%;
--foreground: 0 0% 98%;
--muted: 0 0% 14.9%;
--muted-foreground: 0 0% 63.9%;
--popover: 0 0% 3.9%;
--popover-foreground: 0 0% 98%;
--card: 0 0% 3.9%;
--card-foreground: 0 0% 98%;
--border: 0 0% 14.9%;
--input: 0 0% 14.9%;
--primary: 24 100% 50%; /* Orange #ff8c2f in HSL */
--primary-foreground: 0 0% 98%;
--secondary: 0 0% 14.9%;
--secondary-foreground: 0 0% 98%;
--accent: 0 0% 14.9%;
--accent-foreground: 0 0% 98%;
--destructive: 0 62.8% 30.6%;
--destructive-foreground: 0 0% 98%;
--ring: 24 100% 50%; /* Orange ring to match primary */
--chart-1: 220 70% 50%;
--chart-2: 160 60% 45%;
--chart-3: 30 80% 55%;
--chart-4: 280 65% 60%;
--chart-5: 340 75% 55%;
}
/* Alternative class-based dark mode support for specific Unraid themes */
.dark[data-theme='black'],
.dark[data-theme='gray'] {
--background: 0 0% 3.9%;
--foreground: 0 0% 98%;
--border: 0 0% 14.9%;
}

View File

@@ -1,5 +0,0 @@
/* Tailwind Shared Styles - Single entry point for all shared CSS */
@import './css-variables.css';
@import './unraid-theme.css';
@import './theme-variants.css';
@import './base-utilities.css';

View File

@@ -1,92 +0,0 @@
/**
* Tailwind v4 Theme Variants
* Defines theme-specific CSS variables that can be switched via classes
* These are applied dynamically based on the theme selected in GraphQL
*/
/* Default/White Theme */
:root,
.theme-white {
--header-text-primary: #ffffff;
--header-text-secondary: #999999;
--header-background-color: #1c1b1b;
--header-gradient-start: rgba(28, 27, 27, 0);
--header-gradient-end: rgba(28, 27, 27, 0.7);
--color-border: #383735;
--color-alpha: #ff8c2f;
--color-beta: #1c1b1b;
--color-gamma: #ffffff;
--color-gamma-opaque: rgba(255, 255, 255, 0.3);
}
/* Black Theme */
.theme-black,
.theme-black.dark {
--header-text-primary: #1c1b1b;
--header-text-secondary: #999999;
--header-background-color: #f2f2f2;
--header-gradient-start: rgba(242, 242, 242, 0);
--header-gradient-end: rgba(242, 242, 242, 0.7);
--color-border: #e0e0e0;
--color-alpha: #ff8c2f;
--color-beta: #f2f2f2;
--color-gamma: #1c1b1b;
--color-gamma-opaque: rgba(28, 27, 27, 0.3);
}
/* Gray Theme */
.theme-gray {
--header-text-primary: #ffffff;
--header-text-secondary: #999999;
--header-background-color: #1c1b1b;
--header-gradient-start: rgba(28, 27, 27, 0);
--header-gradient-end: rgba(28, 27, 27, 0.7);
--color-border: #383735;
--color-alpha: #ff8c2f;
--color-beta: #383735;
--color-gamma: #ffffff;
--color-gamma-opaque: rgba(255, 255, 255, 0.3);
}
/* Azure Theme */
.theme-azure {
--header-text-primary: #1c1b1b;
--header-text-secondary: #999999;
--header-background-color: #f2f2f2;
--header-gradient-start: rgba(242, 242, 242, 0);
--header-gradient-end: rgba(242, 242, 242, 0.7);
--color-border: #5a8bb8;
--color-alpha: #ff8c2f;
--color-beta: #e7f2f8;
--color-gamma: #336699;
--color-gamma-opaque: rgba(51, 102, 153, 0.3);
}
/* Dark Mode Overrides */
.dark {
--color-border: #383735;
}
/*
* Dynamic color variables for user overrides from GraphQL
* These are set via JavaScript and override the theme defaults
* Using :root with class for higher specificity to override theme classes
*/
:root.has-custom-header-text {
--header-text-primary: var(--custom-header-text-primary);
--color-header-text-primary: var(--custom-header-text-primary);
}
:root.has-custom-header-meta {
--header-text-secondary: var(--custom-header-text-secondary);
--color-header-text-secondary: var(--custom-header-text-secondary);
}
:root.has-custom-header-bg {
--header-background-color: var(--custom-header-background-color);
--color-header-background: var(--custom-header-background-color);
--header-gradient-start: var(--custom-header-gradient-start);
--header-gradient-end: var(--custom-header-gradient-end);
--color-header-gradient-start: var(--custom-header-gradient-start);
--color-header-gradient-end: var(--custom-header-gradient-end);
}

View File

@@ -1,280 +0,0 @@
@theme static {
/* Breakpoints */
--breakpoint-xs: 30rem;
--breakpoint-2xl: 100rem;
--breakpoint-3xl: 120rem;
/* Container settings */
--container-center: true;
--container-padding: 2rem;
--container-screen-2xl: 1400px;
/* Font families */
--font-sans:
clear-sans, ui-sans-serif, system-ui, sans-serif, 'Apple Color Emoji', 'Segoe UI Emoji',
'Segoe UI Symbol', 'Noto Color Emoji';
/* Grid template columns */
--grid-template-columns-settings: 35% 1fr;
/* Border color default */
--default-border-color: var(--color-border);
--ui-border-muted: hsl(var(--border));
--ui-radius: 0.5rem;
--ui-primary: var(--color-primary-500);
--ui-primary-hover: var(--color-primary-600);
--ui-primary-active: var(--color-primary-700);
/* Color palette */
--color-inherit: inherit;
--color-transparent: transparent;
--color-black: #1c1b1b;
--color-grey-darkest: #222;
--color-grey-darker: #606f7b;
--color-grey-dark: #383735;
--color-grey-mid: #999999;
--color-grey: #e0e0e0;
--color-grey-light: #dae1e7;
--color-grey-lighter: #f1f5f8;
--color-grey-lightest: #f2f2f2;
--color-white: #ffffff;
/* Unraid colors */
--color-yellow-accent: #e9bf41;
--color-orange-dark: #f15a2c;
--color-orange: #ff8c2f;
/* Unraid red palette */
--color-unraid-red: #e22828;
--color-unraid-red-50: #fef2f2;
--color-unraid-red-100: #ffe1e1;
--color-unraid-red-200: #ffc9c9;
--color-unraid-red-300: #fea3a3;
--color-unraid-red-400: #fc6d6d;
--color-unraid-red-500: #f43f3f;
--color-unraid-red-600: #e22828;
--color-unraid-red-700: #bd1818;
--color-unraid-red-800: #9c1818;
--color-unraid-red-900: #821a1a;
--color-unraid-red-950: #470808;
/* Unraid green palette */
--color-unraid-green: #63a659;
--color-unraid-green-50: #f5f9f4;
--color-unraid-green-100: #e7f3e5;
--color-unraid-green-200: #d0e6cc;
--color-unraid-green-300: #aad1a4;
--color-unraid-green-400: #7db474;
--color-unraid-green-500: #63a659;
--color-unraid-green-600: #457b3e;
--color-unraid-green-700: #396134;
--color-unraid-green-800: #314e2d;
--color-unraid-green-900: #284126;
--color-unraid-green-950: #122211;
/* Primary colors (orange) */
--color-primary-50: #fff7ed;
--color-primary-100: #ffedd5;
--color-primary-200: #fed7aa;
--color-primary-300: #fdba74;
--color-primary-400: #fb923c;
--color-primary-500: #ff6600;
--color-primary-600: #ea580c;
--color-primary-700: #c2410c;
--color-primary-800: #9a3412;
--color-primary-900: #7c2d12;
--color-primary-950: #431407;
/* Header colors - defaults will be overridden by theme */
--color-header-text-primary: var(--header-text-primary, #1c1c1c);
--color-header-text-secondary: var(--header-text-secondary, #999999);
--color-header-background: var(--header-background-color, #f2f2f2);
/* Legacy colors - defaults (overridden by theme-variants.css) */
--color-alpha: #ff8c2f;
--color-beta: #f2f2f2;
--color-gamma: #999999;
--color-gamma-opaque: rgba(153, 153, 153, 0.5);
--color-customgradient-start: rgba(242, 242, 242, 0);
--color-customgradient-end: rgba(242, 242, 242, 0.85);
/* Gradients - defaults (overridden by theme-variants.css) */
--color-header-gradient-start: rgba(242, 242, 242, 0);
--color-header-gradient-end: rgba(242, 242, 242, 0.85);
--color-banner-gradient: none;
/* Font sizes */
--font-10px: 10px;
--font-12px: 12px;
--font-14px: 14px;
--font-16px: 16px;
--font-18px: 18px;
--font-20px: 20px;
--font-24px: 24px;
--font-30px: 30px;
/* Spacing */
--spacing-4_5: 1.125rem;
--spacing--8px: -8px;
--spacing-2px: 2px;
--spacing-4px: 4px;
--spacing-6px: 6px;
--spacing-8px: 8px;
--spacing-10px: 10px;
--spacing-12px: 12px;
--spacing-14px: 14px;
--spacing-16px: 16px;
--spacing-20px: 20px;
--spacing-24px: 24px;
--spacing-28px: 28px;
--spacing-32px: 32px;
--spacing-36px: 36px;
--spacing-40px: 40px;
--spacing-64px: 64px;
--spacing-80px: 80px;
--spacing-90px: 90px;
--spacing-150px: 150px;
--spacing-160px: 160px;
--spacing-200px: 200px;
--spacing-260px: 260px;
--spacing-300px: 300px;
--spacing-310px: 310px;
--spacing-350px: 350px;
--spacing-448px: 448px;
--spacing-512px: 512px;
--spacing-640px: 640px;
--spacing-800px: 800px;
/* Width and Height values */
--width-36px: 36px;
--height-36px: 36px;
/* Min/Max widths */
--min-width-86px: 86px;
--min-width-160px: 160px;
--min-width-260px: 260px;
--min-width-300px: 300px;
--min-width-310px: 310px;
--min-width-350px: 350px;
--min-width-800px: 800px;
--max-width-86px: 86px;
--max-width-160px: 160px;
--max-width-260px: 260px;
--max-width-300px: 300px;
--max-width-310px: 310px;
--max-width-350px: 350px;
--max-width-640px: 640px;
--max-width-800px: 800px;
--max-width-1024px: 1024px;
/* Container sizes adjusted for 10px base font size (1.6x scale) */
--container-xs: 32rem;
--container-sm: 38.4rem;
--container-md: 44.8rem;
--container-lg: 51.2rem;
--container-xl: 57.6rem;
--container-2xl: 67.2rem;
--container-3xl: 76.8rem;
--container-4xl: 89.6rem;
--container-5xl: 102.4rem;
--container-6xl: 115.2rem;
--container-7xl: 128rem;
/* Extended width scale for max-w-* utilities */
--width-5xl: 102.4rem;
--width-6xl: 115.2rem;
--width-7xl: 128rem;
--width-8xl: 140.8rem;
--width-9xl: 153.6rem;
--width-10xl: 166.4rem;
/* Animations */
--animate-mark-2: mark-2 1.5s ease infinite;
--animate-mark-3: mark-3 1.5s ease infinite;
--animate-mark-6: mark-6 1.5s ease infinite;
--animate-mark-7: mark-7 1.5s ease infinite;
/* Radius */
--radius: 0.5rem;
/* Text Resizing */
--text-xs: 1.2rem; /* 12px at 10px base */
--text-sm: 1.4rem; /* 14px at 10px base */
--text-base: 1.6rem; /* 16px at 10px base */
--text-lg: 1.8rem; /* 18px at 10px base */
--text-xl: 2rem; /* 20px at 10px base */
--text-2xl: 2.4rem; /* 24px at 10px base */
--text-3xl: 3rem; /* 30px at 10px base */
--text-4xl: 3.6rem; /* 36px at 10px base */
--text-5xl: 4.8rem; /* 48px at 10px base */
--text-6xl: 6rem; /* 60px at 10px base */
--text-7xl: 7.2rem; /* 72px at 10px base */
--text-8xl: 9.6rem; /* 96px at 10px base */
--text-9xl: 12.8rem; /* 128px at 10px base */
--spacing: 0.4rem; /* 4px at 10px base */
}
/* Keyframes */
@keyframes mark-2 {
50% {
transform: translateY(-40px);
}
to {
transform: translateY(0);
}
}
@keyframes mark-3 {
50% {
transform: translateY(-62px);
}
to {
transform: translateY(0);
}
}
@keyframes mark-6 {
50% {
transform: translateY(40px);
}
to {
transform: translateY(0);
}
}
@keyframes mark-7 {
50% {
transform: translateY(62px);
}
to {
transform: translateY(0);
}
}
/* Theme colors that reference CSS variables */
@theme inline {
--color-background: hsl(var(--background));
--color-foreground: hsl(var(--foreground));
--color-muted: hsl(var(--muted));
--color-muted-foreground: hsl(var(--muted-foreground));
--color-popover: hsl(var(--popover));
--color-popover-foreground: hsl(var(--popover-foreground));
--color-card: hsl(var(--card));
--color-card-foreground: hsl(var(--card-foreground));
--color-border: hsl(var(--border));
--color-input: hsl(var(--input));
--color-primary: hsl(var(--primary));
--color-primary-foreground: hsl(var(--primary-foreground));
--color-secondary: hsl(var(--secondary));
--color-secondary-foreground: hsl(var(--secondary-foreground));
--color-accent: hsl(var(--accent));
--color-accent-foreground: hsl(var(--accent-foreground));
--color-destructive: hsl(var(--destructive));
--color-destructive-foreground: hsl(var(--destructive-foreground));
--color-ring: hsl(var(--ring));
--color-chart-1: hsl(var(--chart-1, 12 76% 61%));
--color-chart-2: hsl(var(--chart-2, 173 58% 39%));
--color-chart-3: hsl(var(--chart-3, 197 37% 24%));
--color-chart-4: hsl(var(--chart-4, 43 74% 66%));
--color-chart-5: hsl(var(--chart-5, 27 87% 67%));
}

160
CLAUDE.md
View File

@@ -1,160 +0,0 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
This is the Unraid API monorepo containing multiple packages that provide API functionality for Unraid servers. It uses pnpm workspaces with the following structure:
- `/api` - Core NestJS API server with GraphQL
- `/web` - Vue 3 frontend application
- `/unraid-ui` - Vue 3 component library
- `/plugin` - Unraid plugin package (.plg)
- `/packages` - Shared packages and API plugins
## Essential Commands
### Development
```bash
pnpm install # Install all dependencies
pnpm dev # Run all dev servers concurrently
pnpm build # Build all packages
pnpm build:watch # Watch mode with local plugin build
```
### Testing & Code Quality
```bash
pnpm test # Run all tests
pnpm lint # Run linting
pnpm lint:fix # Fix linting issues
pnpm type-check # TypeScript type checking
```
### API Development
```bash
cd api && pnpm dev # Run API server (http://localhost:3001)
cd api && pnpm test:watch # Run tests in watch mode
cd api && pnpm codegen # Generate GraphQL types
```
### Deployment
```bash
pnpm unraid:deploy <SERVER_IP> # Deploy all to Unraid server
```
### Developer Tools
```bash
unraid-api developer # Interactive prompt for tools
unraid-api developer --sandbox true # Enable GraphQL sandbox
unraid-api developer --sandbox false # Disable GraphQL sandbox
unraid-api developer --enable-modal # Enable modal testing tool
unraid-api developer --disable-modal # Disable modal testing tool
```
## Architecture Notes
### API Structure (NestJS)
- Modules: `auth`, `config`, `plugins`, `emhttp`, `monitoring`
- GraphQL API with Apollo Server at `/graphql`
- Redux store for state management in `src/store/`
- Plugin system for extending functionality
- Entry points: `src/index.ts` (server), `src/cli.ts` (CLI)
### Key Patterns
- TypeScript imports use `.js` extensions (ESM compatibility)
- NestJS dependency injection with decorators
- GraphQL schema-first approach with code generation
- API plugins follow specific structure (see `api/docs/developer/api-plugins.md`)
### Authentication
- API key authentication via headers
- Cookie-based session management
- Keys stored in `/boot/config/plugins/unraid-api/`
### Development Workflow
1. Work Intent required before starting development
2. Fork from `main` branch
3. Reference Work Intent in PR
4. No direct pushes to main
### Debug Mode
```bash
LOG_LEVEL=debug unraid-api start --debug
```
Enables GraphQL playground at `http://tower.local/graphql`
## Coding Guidelines
### General Rules
- Never add comments unless they are needed for clarity of function
- Never add comments for obvious things, and avoid commenting when starting and ending code blocks
- Be CONCISE, keep replies shorter than a paragraph if at all possible
### API Development Rules (`api/**/*`)
- Use pnpm ONLY for package management
- Always run scripts from api/package.json unless requested
- Prefer adding new files to the NestJS repo located at `api/src/unraid-api/` instead of the legacy code
- Test suite is VITEST, do not use jest
- Run tests with: `pnpm --filter ./api test`
- Prefer to not mock simple dependencies
### Web Development Rules (`web/**/*`)
- Always run `pnpm codegen` for GraphQL code generation in the web directory
- GraphQL queries must be placed in `.query.ts` files (see the sketch after this list)
- GraphQL mutations must be placed in `.mutation.ts` files
- All GraphQL under `web/` must follow this naming convention
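To illustrate the naming convention, a minimal sketch of a hypothetical query file (the component, operation, and fields are placeholders, and `graphql-tag` is assumed for the template literal):

```typescript
// web/components/ServerStatus/server-status.query.ts — hypothetical path
import gql from 'graphql-tag';

// Placeholder operation; real queries are typed via `pnpm codegen`
export const SERVER_STATUS_QUERY = gql`
  query ServerStatus {
    online
  }
`;
```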
### Testing Guidelines
#### General Testing Best Practices
- **Error Testing:** Use `.rejects.toThrow()` without arguments to test that functions throw errors. Don't test exact error message strings unless the message format is specifically what you're testing; see the sketch after this list
- **Focus on Behavior:** Test what the code does, not implementation details like exact error message wording
- **Avoid Brittleness:** Don't write tests that break when minor changes are made to error messages, log formats, or other non-essential details
- **Use Mocks Correctly**: Mocks should be used as nouns, not verbs.
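A minimal sketch of the error-testing guideline (the `loadConfig` helper is hypothetical):

```typescript
import { describe, expect, it } from 'vitest';

import { loadConfig } from '~/helpers/loadConfig'; // hypothetical helper

describe('loadConfig', () => {
  it('rejects when the file is missing', async () => {
    // Assert only that it throws; the exact message is not part of the contract
    await expect(loadConfig('/does/not/exist')).rejects.toThrow();
  });
});
```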
#### Vue Component Testing
- Use pnpm when running terminal commands and stay within the web directory
- Tests are located under `web/__test__`, run with `pnpm test`
- Use `mount` from Vue Test Utils for component testing
- Stub complex child components that aren't the focus of the test
- Mock external dependencies and services
- Test component behavior and output, not implementation details
- Use `createTestingPinia()` for mocking stores in components
- Find elements with semantic queries like `find('button')` rather than data-test IDs
- Use `await nextTick()` for DOM updates
- Always await async operations before making assertions
#### Store Testing with Pinia
- Use `createPinia()` and `setActivePinia` when testing Store files (see the sketch after this list)
- Only use `createTestingPinia` if you specifically need its testing features
- Let stores initialize with their natural default state
- Don't mock the store being tested
- Ensure Vue reactivity imports are added to store files (computed, ref, watchEffect)
- Place all mock declarations at the top level
- Use factory functions for module mocks to avoid hoisting issues
- Clear mocks between tests to ensure isolation
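A minimal sketch of the store-testing pattern, using a hypothetical `useCounterStore` (not an actual store in this repo):

```typescript
import { createPinia, setActivePinia } from 'pinia';
import { beforeEach, describe, expect, it } from 'vitest';

import { useCounterStore } from '~/stores/counter'; // hypothetical store

describe('useCounterStore', () => {
  beforeEach(() => {
    // Fresh Pinia per test so state never leaks between cases
    setActivePinia(createPinia());
  });

  it('initializes with its natural default state', () => {
    const store = useCounterStore();
    expect(store.count).toBe(0);
  });

  it('updates state through actions', () => {
    const store = useCounterStore();
    store.increment();
    expect(store.count).toBe(1);
  });
});
```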
## Development Memories
- We are using tailwind v4, so we do not need a tailwind config anymore
- always search the internet for tailwind v4 documentation when making tailwind-related style changes
- never run or restart the API server or web server. I will handle the lifecycle; simply wait and ask me to do this for you
- Never use the `any` type. Always prefer proper typing
- Avoid using casting whenever possible, prefer proper typing from the start
- **IMPORTANT:** cache-manager v7 expects TTL values in **milliseconds**, not seconds. Always use milliseconds when setting cache TTL (e.g., 600000 for 10 minutes, not 600)
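A minimal sketch of the TTL rule above, written against a cache with a `set(key, value, ttlMs)` method (the interface and key are illustrative, not actual repo code):

```typescript
// TTL is expressed in milliseconds, matching the cache-manager v7 note above
interface CacheLike {
  set(key: string, value: unknown, ttlMs?: number): Promise<void>;
}

export async function cacheContainerList(cache: CacheLike, containers: unknown[]): Promise<void> {
  const TEN_MINUTES_MS = 10 * 60 * 1000; // 600000 ms — not 600
  await cache.set('docker:containers', containers, TEN_MINUTES_MS);
}
```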

View File

@@ -1,80 +0,0 @@
# Contributing to Unraid Connect
Thank you for your interest in contributing to Unraid Connect! We want to make contributing to this project as easy and transparent as possible, whether it's:
- Reporting a bug
- Discussing the current state of the code
- Submitting a fix
- Proposing new features
## TypeScript Import Extensions in the API Directory
When working with the API directory, you'll notice that TypeScript files are imported with `.js` extensions (e.g., `import { something } from './file.js'`) even though the actual files have `.ts` extensions. This is because:
1. We use ECMAScript modules (ESM) in our TypeScript configuration
2. When TypeScript compiles `.ts` files to `.js`, the import paths in the compiled code need to reference `.js` files
3. TypeScript doesn't automatically change the extensions in import statements during compilation
4. Using `.js` extensions in imports ensures that both TypeScript during development and Node.js in production can resolve the modules correctly
This approach follows the [official TypeScript ESM recommendation](https://www.typescriptlang.org/docs/handbook/esm-node.html) and ensures compatibility across development and production environments.
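For example, even though the neighboring source file on disk is `date-utils.ts` (a hypothetical module used only to illustrate the rule), the import path references `date-utils.js`:

```typescript
// some-service.ts — hypothetical file
// The sibling source file is date-utils.ts, but the import uses .js so that
// both the compiled output and Node.js ESM resolution find the module.
import { formatUptime } from './date-utils.js';

export function describeUptime(seconds: number): string {
  return `Server up for ${formatUptime(seconds)}`;
}
```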
## Development Process
We use GitHub to host code, to track issues and feature requests, as well as accept pull requests.
### 1. Work Intent Process
**Before starting any development work**, you must submit a Work Intent and have it approved:
1. **Create a Work Intent**
- Go to [Issues → New Issue → Work Intent](https://github.com/unraid/api/issues/new?template=work_intent.md)
- Fill out the brief template describing what you want to work on
- The issue will be automatically labeled as `work-intent` and `unapproved`
2. **Wait for Approval**
- A core developer will review your Work Intent
- They may ask questions or suggest changes
- Once approved, the `unapproved` label will be removed
3. **Begin Development**
- Only start coding after your Work Intent is approved
- Follow the approach outlined in your approved Work Intent
- Reference the Work Intent in your future PR
### 2. Making Changes
1. Fork the repo and create your branch from `main`
2. Make your changes
3. Ensure your commits are clear and descriptive
4. Keep your changes focused - solve one thing at a time
### 3. Pull Request Process
1. Create a pull request from your fork to our `main` branch
2. Reference the approved Work Intent in your PR description
3. Ensure the PR description clearly describes the problem and solution
4. Include screenshots or examples if applicable
5. Wait for review from the core team
**Note:** Direct pushes to the main branch are not allowed. All changes must go through the PR process.
## Developer Documentation
For detailed information about development workflows, repository organization, and other technical details, please refer to our developer documentation:
- [Development Guide](api/docs/developer/development.md) - Setup, building, and debugging instructions
- [Development Workflows](api/docs/developer/workflows.md) - Detailed workflows for local development, building, and deployment
- [Repository Organization](api/docs/developer/repo-organization.md) - High-level architecture and project structure
## Bug Reports and Feature Requests
We use GitHub issues to track bugs and feature requests:
- **Bug Report**: Use the [Bug Report Template](https://github.com/unraid/api/issues/new?template=bug_report.md)
- **Feature Request**: Use the [Feature Request Template](https://github.com/unraid/api/issues/new?template=feature_request.md)
For Unraid Connect specific issues (Flash Backup, connect.myunraid.net, mothership connectivity), please submit through our support portal instead.
## License
By contributing, you agree that your contributions will be licensed under the same terms as the main project.

View File

@@ -1,352 +0,0 @@
Project License Notice
----------------------
This project is licensed under the terms of the GNU General Public License version 2,
**or (at your option) any later version** published by the Free Software Foundation.
The full text of the GNU GPL v2.0 is provided below for reference.
----------------------
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
<https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
Unraid API - Core API functionality for Unraid systems
Copyright (C) 2024 Lime Technology, Inc.
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License along
with this program; if not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
For questions about licensing or to report issues:
- Website: https://unraid.net
- Email: support@unraid.net
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
<signature of Moe Ghoul>, 1 April 1989
Moe Ghoul, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.

View File

@@ -1,10 +0,0 @@
{
"parsers": {
"**/*.ts": [
"@depcheck/parser-typescript",
{
"project": "tsconfig.json"
}
]
}
}

View File

@@ -1,24 +1,11 @@
PATHS_UNRAID_DATA=./dev/data # Where we store plugin data (e.g. permissions.json)
PATHS_STATES=./dev/states # Where .ini files live (e.g. vars.ini)
PATHS_AUTH_SESSIONS=./dev/sessions # Where user sessions live
PATHS_AUTH_KEY=./dev/keys # Auth key directory
PATHS_DYNAMIX_BASE=./dev/dynamix # Dynamix's data directory
PATHS_DYNAMIX_CONFIG_DEFAULT=./dev/dynamix/default.cfg # Dynamix's default config file, which ships with unraid
PATHS_DYNAMIX_CONFIG=./dev/dynamix/dynamix.cfg # Dynamix's config file
PATHS_MY_SERVERS_CONFIG=./dev/Unraid.net/myservers.cfg # My servers config file
PATHS_MY_SERVERS_FB=./dev/Unraid.net/fb_keepalive # My servers flashbackup timekeeper file
PATHS_KEYFILE_BASE=./dev/Unraid.net # Keyfile location
PATHS_MACHINE_ID=./dev/data/machine-id
PATHS_PARITY_CHECKS=./dev/states/parity-checks.log
PATHS_CONFIG_MODULES=./dev/configs
PATHS_ACTIVATION_BASE=./dev/activation
PATHS_PASSWD=./dev/passwd
PATHS_RCLONE_SOCKET=./dev/rclone-socket
PATHS_LOG_BASE=./dev/log # Where we store logs
PATHS_LOGS_FILE=./dev/log/graphql-api.log
PATHS_CONNECT_STATUS_FILE_PATH=./dev/connectStatus.json # Connect plugin status file
PATHS_OIDC_JSON=./dev/configs/oidc.local.json
PATHS_LOCAL_SESSION_FILE=./dev/local-session
ENVIRONMENT="development"
NODE_ENV="development"
PORT="3001"
@@ -27,8 +14,5 @@ INTROSPECTION=true
MOTHERSHIP_GRAPHQL_LINK="http://authenticator:3000/graphql"
NODE_TLS_REJECT_UNAUTHORIZED=0
BYPASS_PERMISSION_CHECKS=false
BYPASS_CORS_CHECKS=true
BYPASS_CORS_CHECKS=false
CHOKIDAR_USEPOLLING=true
LOG_TRANSPORT=console
LOG_LEVEL=trace
ENABLE_NEXT_DOCKER_RELEASE=true

View File

@@ -1,5 +0,0 @@
ENVIRONMENT="production"
NODE_ENV="production"
PORT="/var/run/unraid-api.sock"
MOTHERSHIP_GRAPHQL_LINK="https://mothership.unraid.net/ws"
PATHS_CONFIG_MODULES="/boot/config/plugins/dynamix.my.servers/configs"

View File

@@ -1,5 +0,0 @@
ENVIRONMENT="staging"
NODE_ENV="production"
PORT="/var/run/unraid-api.sock"
MOTHERSHIP_GRAPHQL_LINK="https://staging.mothership.unraid.net/ws"
PATHS_CONFIG_MODULES="/boot/config/plugins/dynamix.my.servers/configs"

View File

@@ -1,19 +1,11 @@
VERSION="THIS_WILL_BE_REPLACED_WHEN_BUILT"
PATHS_UNRAID_DATA=./dev/data # Where we store plugin data (e.g. permissions.json)
PATHS_STATES=./dev/states # Where .ini files live (e.g. vars.ini)
PATHS_AUTH_SESSIONS=./dev/sessions # Where user sessions live
PATHS_AUTH_KEY=./dev/keys # Auth key directory
PATHS_DYNAMIX_BASE=./dev/dynamix # Dynamix's data directory
PATHS_DYNAMIX_CONFIG_DEFAULT=./dev/dynamix/default.cfg # Dynamix's default config file, which ships with unraid
PATHS_DYNAMIX_CONFIG=./dev/dynamix/dynamix.cfg # Dynamix's config file
PATHS_MY_SERVERS_CONFIG=./dev/Unraid.net/myservers.cfg # My servers config file
PATHS_MY_SERVERS_FB=./dev/Unraid.net/fb_keepalive # My servers flashbackup timekeeper file
PATHS_KEYFILE_BASE=./dev/Unraid.net # Keyfile location
PATHS_MACHINE_ID=./dev/data/machine-id
PATHS_PARITY_CHECKS=./dev/states/parity-checks.log
PATHS_CONFIG_MODULES=./dev/configs
PATHS_ACTIVATION_BASE=./dev/activation
PATHS_PASSWD=./dev/passwd
PATHS_LOGS_FILE=./dev/log/graphql-api.log
PATHS_LOCAL_SESSION_FILE=./dev/local-session
PORT=5000
NODE_ENV="test"
NODE_ENV=test

47
api/.eslintrc.cjs Normal file
View File

@@ -0,0 +1,47 @@
/** @type {import('eslint').Linter.Config} */
module.exports = {
root: true,
plugins: [
'@typescript-eslint/eslint-plugin',
'unused-imports',
'eslint-plugin-unicorn',
],
ignorePatterns: ['src/graphql/generated/**/*.ts', '*.test.ts', 'tsup.config.ts', 'vite.config.ts'],
parser: '@typescript-eslint/parser',
rules: {
'@typescript-eslint/no-redundant-type-constituents': 'off',
'@typescript-eslint/no-unsafe-call': 'off',
'@typescript-eslint/naming-convention': 'off',
'@typescript-eslint/no-unsafe-assignment': 'off',
'@typescript-eslint/no-unsafe-return': 'off',
'@typescript-eslint/ban-types': 'off',
'@typescript-eslint/no-explicit-any': 'off',
'@typescript-eslint/consistent-type-imports': [
'warn',
{ fixStyle: 'inline-type-imports' },
],
'unicorn/numeric-separators-style': [
'error',
{ number: { minimumDigits: 0, groupLength: 3 } },
],
'import/no-cycle': 'off', // Change this to "error" to find circular imports
'@typescript-eslint/no-use-before-define': ['error'],
'no-multiple-empty-lines': ['error', { max: 1, maxBOF: 0, maxEOF: 1 }],
},
overrides: [
{
files: ['*.ts'],
extends: [
'eslint:recommended',
'plugin:@typescript-eslint/recommended',
],
parserOptions: {
project: true,
tsconfigRootDir: __dirname,
},
rules: {
'@typescript-eslint/no-explicit-any': 'off',
},
},
],
};

View File

@@ -1,62 +0,0 @@
import eslint from '@eslint/js';
import importPlugin from 'eslint-plugin-import';
import noRelativeImportPaths from 'eslint-plugin-no-relative-import-paths';
import prettier from 'eslint-plugin-prettier';
import tseslint from 'typescript-eslint';
export default tseslint.config(
eslint.configs.recommended,
...tseslint.configs.recommended,
{
ignores: ['src/graphql/generated/client/**/*', 'src/**/**/dummy-process.js'],
},
{
plugins: {
'no-relative-import-paths': noRelativeImportPaths,
prettier: prettier,
import: importPlugin,
},
rules: {
'@typescript-eslint/no-redundant-type-constituents': 'off',
'@typescript-eslint/no-unsafe-call': 'off',
'@typescript-eslint/naming-convention': 'off',
'@typescript-eslint/no-unsafe-assignment': 'off',
'@typescript-eslint/no-unsafe-return': 'off',
'@typescript-eslint/ban-types': 'off',
'@typescript-eslint/no-explicit-any': 'off',
'@typescript-eslint/no-empty-object-type': 'off',
'no-use-before-define': ['off'],
'no-multiple-empty-lines': ['error', { max: 1, maxBOF: 0, maxEOF: 1 }],
'@typescript-eslint/no-unused-vars': 'off',
'@typescript-eslint/no-unused-expressions': 'off',
'import/no-unresolved': 'off',
'import/no-absolute-path': 'off',
'import/prefer-default-export': 'off',
'no-relative-import-paths/no-relative-import-paths': [
'error',
{ allowSameFolder: false, rootDir: 'src', prefix: '@app' },
],
'prettier/prettier': 'error',
'import/extensions': [
'error',
'ignorePackages',
{
js: 'always',
ts: 'always',
},
],
'no-restricted-globals': [
'error',
{
name: '__dirname',
message: 'Use import.meta.url instead of __dirname in ESM',
},
{
name: '__filename',
message: 'Use import.meta.url instead of __filename in ESM',
},
],
'eol-last': ['error', 'always'],
},
}
);

98
api/.gitignore vendored
View File

@@ -1,98 +0,0 @@
# Logs
./logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
coverage-ts
# nyc test coverage
.nyc_output
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# TypeScript v1 declaration files
typings/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
# next.js build output
.next
# Visual Studio Code workspace
.vscode/*
!.vscode/extensions.json
# OSX
.DS_Store
# Temp dir for tests
test/__temp__/*
# Built files
dist
# Typescript
typescript
# Ultra runner
.ultra.cache.json
# Github actions
RELEASE_NOTES.md
# Docker Deploy Folder
deploy/*
!deploy/.gitkeep
# pkg cache
.pkg-cache
# IDE Settings Files
.idea
!**/*.login.*
# local api configs - don't need project-wide tracking
dev/connectStatus.json
dev/configs/*
# local status - doesn't need to be tracked
dev/connectStatus.json
# mock local session file
dev/local-session
# local OIDC config for testing - contains secrets
dev/configs/oidc.local.json
# local api keys
dev/keys/*

View File

@@ -1,5 +0,0 @@
{
schema: {
files: 'src/graphql/schema/types/**/*.graphql'
}
}

1
api/.nvmrc Normal file
View File

@@ -0,0 +1 @@
18.19.1

View File

@@ -1,7 +0,0 @@
!src/*
# Downloaded Fixtures (For File Modifications)
src/unraid-api/unraid-file-modifier/modifications/__fixtures__/downloaded/*
# Generated Types
src/graphql/generated/client/*.ts

View File

@@ -1,38 +0,0 @@
/**
* @see https://prettier.io/docs/en/configuration.html
* @type {import("prettier").Config}
*/
module.exports = {
trailingComma: 'es5',
tabWidth: 4,
semi: true,
singleQuote: true,
printWidth: 105,
plugins: ['@ianvs/prettier-plugin-sort-imports'],
// decorators-legacy lets the import sorter transform files with decorators
importOrderParserPlugins: ['typescript', 'decorators-legacy'],
importOrder: [
/**----------------------
* Nest.js & node.js imports
*------------------------**/
'<TYPES>^@nestjs(/.*)?$',
'^@nestjs(/.*)?$', // matches imports starting with @nestjs
'<TYPES>^(node:)',
'<BUILTIN_MODULES>', // Node.js built-in modules
'',
/**----------------------
* Third party packages
*------------------------**/
'<TYPES>',
'<THIRD_PARTY_MODULES>', // Imports not matched by other special words or groups.
'',
/**----------------------
* Application Code
*------------------------**/
'<TYPES>^@app(/.*)?$', // matches type imports starting with @app
'^@app(/.*)?$',
'',
'<TYPES>^[.]',
'^[.]', // relative imports
],
};

View File

@@ -1,11 +0,0 @@
{
"recommendations": [
"mikestead.dotenv",
"eamodio.gitlens",
"dbaeumer.vscode-eslint",
"antfu.goto-alias",
"bierner.markdown-mermaid",
"github.vscode-pull-request-github",
"bierner.markdown-preview-github-styles"
]
}

File diff suppressed because it is too large.

View File

@@ -1,7 +1,7 @@
###########################################################
# Development/Build Image
###########################################################
FROM node:22.18.0-bookworm-slim AS development
FROM node:18.19.1-bookworm-slim As development
# Install build tools and dependencies
RUN apt-get update -y && apt-get install -y \
@@ -19,18 +19,22 @@ WORKDIR /app
# Set app env
ENV NODE_ENV=development
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
# Install pnpm
RUN corepack enable && corepack prepare pnpm@8.15.4 --activate && npm i -g npm@latest
# Setup cache for pkg
ENV PKG_CACHE_PATH /app/.pkg-cache
RUN mkdir -p ${PKG_CACHE_PATH}
COPY tsconfig.json .eslintrc.ts .prettierrc.cjs .npmrc .env.production .env.staging package.json pnpm-lock.yaml .npmrc ./
COPY tsconfig.json tsup.config.ts .eslintrc.cjs .npmrc .env.production .env.staging ./
COPY package.json package-lock.json ./
# Install pkg
RUN npm i -g pkg zx
# Install deps
RUN --mount=type=cache,id=pnpm,target=/pnpm/store pnpm install --frozen-lockfile
RUN npm i
EXPOSE 3001
EXPOSE 4000
###########################################################
# Builder Image
@@ -38,8 +42,6 @@ EXPOSE 3001
FROM development AS builder
ENV NODE_ENV=production
COPY . .
CMD ["pnpm", "run", "build:release"]
CMD ["npm", "run", "build-pkg"]

View File

@@ -2,11 +2,7 @@
## Installation
Install the production plugin via the apps tab (search for "Unraid Connect")
Manual install can be done with the following routes:
[production](https://stable.dl.unraid.net/unraid-api/dynamix.unraid.net.plg)
[staging](https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.staging.plg)
Install the production plugin via the apps tab (search for "my servers") on Unraid 6.9.2 or later.
## CLI
@@ -17,11 +13,11 @@ root@Devon:~# unraid-api
Unraid API
Thanks for using the official Unraid API
Thanks for using the official Unraid API
Usage:
$ unraid-api command <options>
$ unraid-api command <options>
Commands:
@@ -29,48 +25,34 @@ Commands:
Options:
-h, --help Prints this usage guide.
-d, --debug Enabled debug mode.
-p, --port string Set the graphql port.
--environment production/staging/development Set the working environment.
--log-level ALL/TRACE/DEBUG/INFO/WARN/ERROR/FATAL/MARK/OFF Set the log level.
-h, --help Prints this usage guide.
-d, --debug Enabled debug mode.
-p, --port string Set the graphql port.
--environment production/staging/development Set the working environment.
--log-level ALL/TRACE/DEBUG/INFO/WARN/ERROR/FATAL/MARK/OFF Set the log level.
Copyright © 2024 Lime Technology, Inc.
Copyright © 2022 Lime Technology, Inc.
```
## Key
To create and work with Unraid API keys, used for the local API, run the following command to view all available options. These options may change over time.
```sh
unraid-api key --help
```
## Report
To view the current status of the unraid-api and its connection to mothership, run:
```sh
```
unraid-api report
```
To view verbose data (anonymized), run:
```sh
```
unraid-api report -v
```
To view non-anonymized verbose data, run:
```sh
```
unraid-api report -vv
```
## Secrets
If you found this file you're likely a developer. If you'd like to know more about the API and when it's available please join [our discord](https://discord.unraid.net/).
## License
Copyright Lime Technology Inc. All rights reserved.
Copyright 2019-2022 Lime Technology Inc. All rights reserved.

View File

@@ -1,49 +0,0 @@
import type { CodegenConfig } from '@graphql-codegen/cli';
const config: CodegenConfig = {
overwrite: true,
emitLegacyCommonJSImports: false,
verbose: true,
config: {
namingConvention: {
enumValues: 'change-case-all#upperCase',
transformUnderscore: true,
useTypeImports: true,
},
scalars: {
DateTime: 'string',
Long: 'number',
JSON: 'Record<string, any>',
URL: 'URL',
Port: 'number',
UUID: 'string',
BigInt: 'number',
},
scalarSchemas: {
URL: 'z.instanceof(URL)',
Long: 'z.number()',
JSON: 'z.record(z.string(), z.any())',
Port: 'z.number()',
UUID: 'z.string()',
BigInt: 'z.number()',
},
},
generates: {
// Generate Types for CLI Internal GraphQL Queries
'src/unraid-api/cli/generated/': {
documents: ['src/unraid-api/cli/queries/**/*.ts', 'src/unraid-api/cli/mutations/**/*.ts'],
schema: './generated-schema.graphql',
preset: 'client',
presetConfig: {
gqlTagName: 'gql',
},
config: {
useTypeImports: true,
withObjectType: true,
},
plugins: [{ add: { content: '/* eslint-disable */' } }],
},
},
};
export default config;

77
api/codegen.yml Normal file
View File

@@ -0,0 +1,77 @@
overwrite: true
emitLegacyCommonJSImports: false
verbose: true
require:
- ts-node/register
config:
namingConvention:
typeNames: './fix-array-type.cjs'
enumValues: 'change-case#upperCase'
useTypeImports: true
scalars:
DateTime: string
Long: number
JSON: "{ [key: string]: any }"
URL: URL
Port: number
UUID: string
generates:
src/graphql/generated/client/:
documents: './src/graphql/mothership/*.ts'
schema:
'${MOTHERSHIP_GRAPHQL_LINK}':
headers:
origin: 'https://forums.unraid.net'
preset: client
presetConfig:
gqlTagName: graphql
config:
useTypeImports: true
withObjectType: true
plugins:
- add: { content: '/* eslint-disable */' }
# Generate Types for the API Server
src/graphql/generated/api/types.ts:
schema:
- './src/graphql/types.ts'
- './src/graphql/schema/types/**/*.graphql'
plugins:
- typescript
- typescript-resolvers
- add: { content: '/* eslint-disable */' }
config:
contextType: '@app/graphql/schema/utils#Context'
useIndexSignature: true
# Generate Operations for any built in API Server Operations (ie report.ts)
src/graphql/generated/api/operations.ts:
documents: './src/graphql/client/api/*.ts'
schema:
- './src/graphql/types.ts'
- './src/graphql/schema/types/**/*.graphql'
preset: import-types
presetConfig:
typesPath: '@app/graphql/generated/api/types'
plugins:
- typescript-validation-schema
- typescript-operations
- typed-document-node
- add: { content: '/* eslint-disable */' }
config:
importFrom: '@app/graphql/generated/api/types'
strictScalars: false
schema: 'zod'
withObjectType: true
src/graphql/generated/client/validators.ts:
schema:
'${MOTHERSHIP_GRAPHQL_LINK}':
headers:
origin: 'https://forums.unraid.net'
plugins:
- typescript-validation-schema
- add: { content: '/* eslint-disable */'}
config:
importFrom: '@app/graphql/generated/client/graphql'
strictScalars: false
schema: 'zod'

View File

@@ -1 +0,0 @@
module.exports = { extends: ['@commitlint/config-conventional'] };

View File

@@ -1 +0,0 @@
Binary file not shown.

View File

@@ -1,20 +1,21 @@
[api]
version="4.4.1"
version="3.8.1+d06e215a"
extraOrigins="https://google.com,https://test.com"
[local]
sandbox="yes"
[notifier]
apikey="unnotify_30994bfaccf839c65bae75f7fa12dd5ee16e69389f754c3b98ed7d5"
[remote]
wanaccess="yes"
wanport="8443"
upnpEnabled="no"
apikey="_______________________BIG_API_KEY_HERE_________________________"
localApiKey="_______________________LOCAL_API_KEY_HERE_________________________"
email="test@example.com"
username="zspearmint"
avatar="https://via.placeholder.com/200"
regWizTime="1611175408732_0951-1653-3509-FBA155FA23C0"
accesstoken=""
idtoken=""
accesstoken=""
refreshtoken=""
dynamicRemoteAccessType="DISABLED"
ssoSubIds=""
[upc]
apikey="unupc_fab6ff6ffe51040595c6d9ffb63a353ba16cc2ad7d93f813a2e80a5810"

View File

@@ -1,20 +0,0 @@
[api]
version="4.4.1"
extraOrigins="https://google.com,https://test.com"
[local]
sandbox="yes"
[remote]
wanaccess="yes"
wanport="8443"
upnpEnabled="no"
apikey="_______________________BIG_API_KEY_HERE_________________________"
localApiKey="_______________________LOCAL_API_KEY_HERE_________________________"
email="test@example.com"
username="zspearmint"
avatar="https://via.placeholder.com/200"
regWizTime="1611175408732_0951-1653-3509-FBA155FA23C0"
accesstoken=""
idtoken=""
refreshtoken=""
dynamicRemoteAccessType="DISABLED"
ssoSubIds=""

View File

@@ -1,13 +0,0 @@
{
"code": "EXAMPLE_CODE_123",
"partnerName": "MyPartner Inc.",
"partnerUrl": "https://partner.example.com",
"serverName": "MyAwesomeServer",
"sysModel": "CustomBuild v1.0",
"comment": "This is a test activation code for development.",
"header": "#336699",
"headermetacolor": "#FFFFFF",
"background": "#F0F0F0",
"showBannerGradient": "yes",
"theme": "black"
}

View File

@@ -1 +0,0 @@
true

Binary file not shown.


View File

@@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8" ?>
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="442" height="221">
<defs>
<linearGradient id="gradient_0" gradientUnits="userSpaceOnUse" x1="608.84924" y1="48.058002" x2="447.47684" y2="388.15295">
<stop offset="0" stop-color="#ECC02F"/>
<stop offset="1" stop-color="#B8436B"/>
</linearGradient>
</defs>
<path fill="url(#gradient_0)" transform="scale(0.431641 0.431641)" d="M126.543 236.139C141.269 184.983 170.747 148.08 228.938 144.823C240.378 144.182 259.66 144.749 271.333 145.215C299.585 144.391 350.558 142.667 377.842 145.685C414.099 149.696 443.185 175.429 472.192 195.251L586.561 274.337C636.114 308.874 627.234 309.151 685.21 309.042L778.304 309.082C799.091 309.099 813.482 308.867 828.82 292.529C857.893 261.561 843.003 209.317 800.506 200.17C790.505 198.018 779.334 199.535 769.11 199.523L702.658 199.488C690.005 186.062 675.199 151.817 658.182 145.215L739.199 145.198C765.636 145.196 796.164 142.886 821.565 150.344C923.889 180.389 922.324 331.136 816.611 357.807C802.524 361.361 788.425 361.034 774.035 361.031L663.497 361.009C623.773 360.859 603.599 349.313 572.35 327.596L430.421 229.848C415.731 219.804 401.419 209.118 386.451 199.488C377.579 199.501 368.42 200.01 359.582 199.488L272.561 199.497C258.582 199.485 235.352 198.06 222.607 200.981C192.741 207.825 177.956 234.361 180.015 263.294C177.545 260.392 178.63 254.678 178.838 251.164C179.877 233.569 187.409 224.968 197.345 212.22C184.786 202.853 156.933 193.749 149.447 186.645C143.454 196.583 136.881 205.628 132.955 216.732C130.766 222.921 130.678 230.967 127.506 236.625L126.543 236.139Z"/>
<path fill="#308DAF" transform="scale(0.431641 0.431641)" d="M149.447 186.645C156.933 193.749 184.786 202.853 197.345 212.22C187.409 224.968 179.877 233.569 178.838 251.164C178.63 254.678 177.545 260.392 180.015 263.294C192.489 309.751 221.563 309.078 263.512 309.07L322.096 309.048C333.708 325.984 348.958 344.904 361.795 361.006L232.654 361.03C176.801 360.579 130.605 315.939 126.498 260.613C125.893 252.473 126.453 244.293 126.543 236.139L127.506 236.625C130.678 230.967 130.766 222.921 132.955 216.732C136.881 205.628 143.454 196.583 149.447 186.645Z"/>
<defs>
<linearGradient id="gradient_1" gradientUnits="userSpaceOnUse" x1="620.42566" y1="140.57172" x2="611.08759" y2="282.2207">
<stop offset="0" stop-color="#F5A22C"/>
<stop offset="1" stop-color="#E17543"/>
</linearGradient>
</defs>
<path fill="url(#gradient_1)" transform="scale(0.431641 0.431641)" d="M570.215 137.504C646.214 133.055 670.623 188.789 707.064 241.977L726.71 270.658C729.065 274.1 737.591 284.13 737.576 287.916L674.645 287.916C674.5 287.132 659.251 264.134 658.182 263.294C658.133 262.92 623.915 212.832 620.593 208.697C602.652 186.369 565.856 181.796 545.393 203.424C542.002 207.007 539.705 211.779 535.713 214.764C534.409 212.586 496.093 187.105 490.641 183.32C508.306 154.99 539.004 142.872 570.215 137.504Z"/>
<path fill="#308DAF" transform="scale(0.431641 0.431641)" d="M286.656 221.485L350.512 221.485C354.248 227.374 358.556 232.986 362.565 238.698L379.9 263.82C397.44 289.065 410.994 321.185 447.698 317.317C464.599 315.536 476.472 305.449 486.751 292.741C494.293 298.818 530.089 320.341 533.124 324.28C532.441 328.231 526.229 334.319 522.861 336.255C521.587 339.958 509.164 348.519 505.635 350.88C463.781 378.879 411.472 377.537 373.808 343.464C365.331 335.795 359.734 326.969 353.351 317.641L336.798 293.614C320.035 269.591 302.915 245.863 286.656 221.485Z"/>
</svg>


View File

@@ -1,34 +0,0 @@
# Development Configuration Files
This directory contains configuration files for local development.
## OIDC Configuration
### oidc.json
The default OIDC configuration file. This file is committed to git and should only contain non-sensitive test configurations.
### Using a Local Configuration (gitignored)
For local testing with real OAuth providers:
1. Create an `oidc.local.json` file based on `oidc.json`
2. Set the environment variable: `PATHS_OIDC_JSON=./dev/configs/oidc.local.json`
3. The API will load your local configuration instead of the default
Example:
```bash
PATHS_OIDC_JSON=./dev/configs/oidc.local.json pnpm dev
```
### Setting up OAuth Apps
#### Google
1. Go to [Google Cloud Console](https://console.cloud.google.com/)
2. Create a new project or select existing
3. Enable Google+ API
4. Create OAuth 2.0 credentials
5. Add authorized redirect URI: `http://localhost:3000/graphql/api/auth/oidc/callback`
#### GitHub
1. Go to GitHub Settings > Developer settings > OAuth Apps
2. Create a new OAuth App
3. Set Authorization callback URL: `http://localhost:3000/graphql/api/auth/oidc/callback`

View File

@@ -1,9 +0,0 @@
{
"version": "4.22.2",
"extraOrigins": [],
"sandbox": true,
"ssoSubIds": [],
"plugins": [
"unraid-api-plugin-connect"
]
}

View File

@@ -1,12 +0,0 @@
{
"wanaccess": true,
"wanport": 8443,
"upnpEnabled": false,
"apikey": "",
"localApiKey": "_______________________LOCAL_API_KEY_HERE_________________________",
"email": "test@example.com",
"username": "zspearmint",
"avatar": "https://via.placeholder.com/200",
"regWizTime": "1611175408732_0951-1653-3509-FBA155FA23C0",
"dynamicRemoteAccessType": "STATIC"
}

View File

@@ -1,22 +0,0 @@
{
"providers": [
{
"id": "unraid.net",
"name": "Unraid.net",
"clientId": "CONNECT_SERVER_SSO",
"issuer": "https://account.unraid.net",
"authorizationEndpoint": "https://account.unraid.net/sso/",
"tokenEndpoint": "https://account.unraid.net/api/oauth2/token",
"scopes": [
"openid",
"profile",
"email"
],
"authorizedSubIds": [
"297294e2-b31c-4bcc-a441-88aee0ad609f"
],
"buttonText": "Login With Unraid.net"
}
],
"defaultAllowedOrigins": []
}

View File

@@ -1 +0,0 @@
d0b5433294c110f1eed72bdb63910a9a

View File

@@ -0,0 +1,191 @@
{
"admin": {
"extends": "user",
"permissions": [
{
"resource": "apikey",
"action": "read:any",
"attributes": "*"
},
{
"resource": "array",
"action": "read:any",
"attributes": "*"
},
{
"resource": "cpu",
"action": "read:any",
"attributes": "*"
},
{
"resource": "device",
"action": "read:any",
"attributes": "*"
},
{
"resource": "device/unassigned",
"action": "read:any",
"attributes": "*"
},
{
"resource": "disk",
"action": "read:any",
"attributes": "*"
},
{
"resource": "disk/settings",
"action": "read:any",
"attributes": "*"
},
{
"resource": "display",
"action": "read:any",
"attributes": "*"
},
{
"resource": "docker/container",
"action": "read:any",
"attributes": "*"
},
{
"resource": "docker/network",
"action": "read:any",
"attributes": "*"
},
{
"resource": "info",
"action": "read:any",
"attributes": "*"
},
{
"resource": "license-key",
"action": "read:any",
"attributes": "*"
},
{
"resource": "machine-id",
"action": "read:any",
"attributes": "*"
},
{
"resource": "memory",
"action": "read:any",
"attributes": "*"
},
{
"resource": "notifications",
"action": "read:any",
"attributes": "*"
},
{
"resource": "online",
"action": "read:any",
"attributes": "*"
},
{
"resource": "os",
"action": "read:any",
"attributes": "*"
},
{
"resource": "parity-history",
"action": "read:any",
"attributes": "*"
},
{
"resource": "permission",
"action": "read:any",
"attributes": "*"
},
{
"resource": "servers",
"action": "read:any",
"attributes": "*"
},
{
"resource": "service",
"action": "read:any",
"attributes": "*"
},
{
"resource": "service/emhttpd",
"action": "read:any",
"attributes": "*"
},
{
"resource": "service/unraid-api",
"action": "read:any",
"attributes": "*"
},
{
"resource": "services",
"action": "read:any",
"attributes": "*"
},
{
"resource": "share",
"action": "read:any",
"attributes": "*"
},
{
"resource": "software-versions",
"action": "read:any",
"attributes": "*"
},
{
"resource": "unraid-version",
"action": "read:any",
"attributes": "*"
},
{
"resource": "user",
"action": "read:any",
"attributes": "*"
},
{
"resource": "var",
"action": "read:any",
"attributes": "*"
},
{
"resource": "vars",
"action": "read:any",
"attributes": "*"
},
{
"resource": "vm/domain",
"action": "read:any",
"attributes": "*"
},
{
"resource": "vm/network",
"action": "read:any",
"attributes": "*"
}
]
},
"user": {
"extends": "guest",
"permissions": [
{
"resource": "apikey",
"action": "read:own",
"attributes": "*"
},
{
"resource": "permission",
"action": "read:any",
"attributes": "*"
}
]
},
"guest": {
"permissions": [
{
"resource": "welcome",
"action": "read:any",
"attributes": "*"
}
]
}
}

View File

@@ -1 +0,0 @@
version="6.12.0-beta5"

Binary file not shown.


View File

@@ -1 +0,0 @@
case-model.png

Binary file not shown.


View File

@@ -1,80 +0,0 @@
[confirm]
down="1"
stop="1"
[display]
width=""
font=""
tty="15"
date="%c"
time="%R"
number=".,"
unit="C"
scale="-1"
resize="0"
wwn="0"
total="1"
banner=""
header=""
background=""
tabs="1"
users="Tasks:3"
usage="0"
text="1"
warning="70"
critical="90"
hot="45"
max="55"
hotssd="60"
maxssd="70"
power=""
theme="white"
locale=""
raw=""
rtl=""
headermetacolor=""
headerdescription="yes"
showBannerGradient="yes"
[parity]
mode="0"
hour="0 0"
dotm="1"
month="1"
day="0"
cron=""
write="NOCORRECT"
[notify]
display="0"
life="5"
date="d-m-Y"
time="H:i"
position="top-right"
path="/tmp/notifications"
system="*/1 * * * *"
entity="1"
normal="1"
warning="1"
alert="1"
unraid="1"
plugin="1"
docker_notify="1"
language_notify="1"
report="1"
unraidos=""
version=""
docker_update=""
language_update=""
status=""
[ssmtp]
root=""
RcptTo=""
SetEmailPriority="True"
Subject="Unraid Status: "
server="smtp.gmail.com"
port="465"
UseTLS="YES"
UseSTARTTLS="NO"
UseTLSCert="NO"
TLSCert=""
AuthMethod="login"
AuthUser=""
AuthPass=""

View File

@@ -1,42 +1,35 @@
[display]
date=%c
time=%I:%M %p
number=.,
scale=-1
tabs=1
users=Tasks:3
resize=0
wwn=0
total=1
usage=0
banner=image
dashapps=icons
theme=black
text=1
unit=C
warning=70
critical=90
hot=45
max=55
sysinfo=/Tools/SystemProfiler
header=336699
headermetacolor=FFFFFF
background=F0F0F0
showBannerGradient=yes
date="%c"
number=".,"
scale="-1"
tabs="1"
users="Tasks:3"
resize="0"
wwn="0"
total="1"
usage="0"
banner="image"
dashapps="icons"
theme="white"
text="1"
unit="C"
warning="70"
critical="90"
hot="45"
max="55"
sysinfo="/Tools/SystemProfiler"
[notify]
entity=1
normal=1
warning=1
alert=1
unraid=1
plugin=1
docker_notify=1
report=1
display=0
date=d-m-Y
time=H:i
position=top-right
path=./dev/notifications
system=*/1 * * * *
entity="1"
normal="1"
warning="1"
alert="1"
unraid="1"
plugin="1"
docker_notify="1"
report="1"
display="0"
date="d-m-Y"
time="H:i"
position="top-right"
path="/app/dev/notifications"
system="*/1 * * * *"

View File

@@ -1,36 +0,0 @@
# Generated settings:
NAME="Unraid"
timeZone="America/New_York"
COMMENT="Media server"
SECURITY="user"
WORKGROUP="WORKGROUP"
DOMAIN=""
DOMAIN_SHORT=""
hideDotFiles="no"
enableFruit="yes"
USE_NETBIOS="no"
localMaster="yes"
serverMultiChannel="no"
USE_WSD="yes"
WSD_OPT=""
WSD2_OPT=""
USE_NTP="yes"
NTP_SERVER1="time1.google.com"
NTP_SERVER2="time2.google.com"
NTP_SERVER3="time3.google.com"
NTP_SERVER4="time4.google.com"
DOMAIN_LOGIN="Administrator"
DOMAIN_PASSWD=""
SYS_MODEL="Custom"
SYS_ARRAY_SLOTS="24"
USE_SSL="yes"
PORT="80"
PORTSSL="8443"
LOCAL_TLD="local"
BIND_MGT="no"
USE_TELNET="no"
PORTTELNET="23"
USE_SSH="yes"
PORTSSH="22"
USE_UPNP="yes"
START_PAGE="Main"

View File

@@ -1 +0,0 @@
# custom log directory for tests & development

View File

@@ -0,0 +1,5 @@
timestamp=1683971161
event=Unraid Parity check
subject=Notice [UNRAID] - Parity check finished (0 errors)
description=Canceled
importance=warning

View File

@@ -3,4 +3,3 @@ event=Unraid Parity check
subject=Notice [UNRAID] - Parity check finished (0 errors)
description=Canceled
importance=warning
link=/

View File

@@ -1 +0,0 @@
unraid_login|i:1736523078;unraid_user|s:4:"root";locale|s:0:"";buildDate|s:8:"20241202";

View File

@@ -1,24 +1,24 @@
[api]
version="4.4.1"
version="3.8.1+d06e215a"
extraOrigins="https://google.com,https://test.com"
[local]
sandbox="yes"
[notifier]
apikey="unnotify_30994bfaccf839c65bae75f7fa12dd5ee16e69389f754c3b98ed7d5"
[remote]
wanaccess="yes"
wanaccess="no"
wanport="8443"
upnpEnabled="no"
apikey="_______________________BIG_API_KEY_HERE_________________________"
localApiKey="_______________________LOCAL_API_KEY_HERE_________________________"
email="test@example.com"
username="zspearmint"
avatar="https://via.placeholder.com/200"
regWizTime="1611175408732_0951-1653-3509-FBA155FA23C0"
accesstoken=""
idtoken=""
accesstoken=""
refreshtoken=""
dynamicRemoteAccessType="DISABLED"
ssoSubIds=""
allowedOrigins="/var/run/unraid-notifications.sock, /var/run/unraid-php.sock, /var/run/unraid-cli.sock, http://localhost:8080, https://localhost:4443, https://tower.local:4443, https://192.168.1.150:4443, https://tower:4443, https://192-168-1-150.thisisfourtyrandomcharacters012345678900.myunraid.net:4443, https://85-121-123-122.thisisfourtyrandomcharacters012345678900.myunraid.net:8443, https://10-252-0-1.hash.myunraid.net:4443, https://10-252-1-1.hash.myunraid.net:4443, https://10-253-3-1.hash.myunraid.net:4443, https://10-253-4-1.hash.myunraid.net:4443, https://10-253-5-1.hash.myunraid.net:4443, https://10-100-0-1.hash.myunraid.net:4443, https://10-100-0-2.hash.myunraid.net:4443, https://10-123-1-2.hash.myunraid.net:4443, https://221-123-121-112.hash.myunraid.net:4443, https://google.com, https://test.com, https://connect.myunraid.net, https://connect-staging.myunraid.net, https://dev-my.myunraid.net:4000, https://studio.apollographql.com"
dynamicRemoteAccessType="STATIC"
[upc]
apikey="unupc_fab6ff6ffe51040595c6d9ffb63a353ba16cc2ad7d93f813a2e80a5810"
[connectionStatus]
minigraph="PRE_INIT"
upnpStatus=""
minigraph="CONNECTED"

View File

@@ -1,30 +0,0 @@
[eth0]
DHCP_KEEPRESOLV="no"
DNS_SERVER1="1.1.1.1"
DNS_SERVER2="8.8.8.8"
DHCP6_KEEPRESOLV="no"
BONDING="yes"
BONDNAME=""
BONDNICS="eth0,eth1,eth2,eth3"
BONDING_MODE="1"
BONDING_MIIMON="100"
BRIDGING="yes"
BRNAME=""
BRNICS="bond0"
BRSTP="0"
BRFD="0"
DESCRIPTION:0=""
PROTOCOL:0=""
USE_DHCP:0="yes"
IPADDR:0="192.168.1.150"
NETMASK:0="255.255.255.0"
GATEWAY:0="192.168.1.1"
METRIC:0=""
USE_DHCP6:0=""
IPADDR6:0=""
NETMASK6:0=""
GATEWAY6:0=""
METRIC6:0=""
PRIVACY6:0=""
MTU=""
TYPE="access"

View File

@@ -1,190 +0,0 @@
["disk1"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk2"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk3"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk4"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk5"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk6"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk7"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk8"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk9"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk10"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk11"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk12"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk13"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk14"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk15"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk16"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk17"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk18"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk19"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk20"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk21"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["disk22"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["abc"]
export="e"
fruit="no"
caseSensitive="auto"
security="public"
readList=""
writeList=""
volsizelimit=""
["flash"]
export="e"
fruit="no"
security="public"
readList=""
writeList=""

View File

@@ -1,92 +0,0 @@
["disk1"]
export="-"
security="public"
hostList=""
["disk2"]
export="-"
security="public"
hostList=""
["disk3"]
export="-"
security="public"
hostList=""
["disk4"]
export="-"
security="public"
hostList=""
["disk5"]
export="-"
security="public"
hostList=""
["disk6"]
export="-"
security="public"
hostList=""
["disk7"]
export="-"
security="public"
hostList=""
["disk8"]
export="-"
security="public"
hostList=""
["disk9"]
export="-"
security="public"
hostList=""
["disk10"]
export="-"
security="public"
hostList=""
["disk11"]
export="-"
security="public"
hostList=""
["disk12"]
export="-"
security="public"
hostList=""
["disk13"]
export="-"
security="public"
hostList=""
["disk14"]
export="-"
security="public"
hostList=""
["disk15"]
export="-"
security="public"
hostList=""
["disk16"]
export="-"
security="public"
hostList=""
["disk17"]
export="-"
security="public"
hostList=""
["disk18"]
export="-"
security="public"
hostList=""
["disk19"]
export="-"
security="public"
hostList=""
["disk20"]
export="-"
security="public"
hostList=""
["disk21"]
export="-"
security="public"
hostList=""
["disk22"]
export="-"
security="public"
hostList=""
["abc"]
export="-"
security="public"
hostList=""

View File

@@ -1,102 +0,0 @@
["appdata"]
name="appdata"
nameOrig="appdata"
comment=""
allocator="highwater"
splitLevel=""
floor="0"
include=""
exclude=""
useCache="no"
cachePool="cache"
cow="auto"
color="yellow-on"
size="0"
free="9091184"
used="32831348"
luksStatus="0"
["domains"]
name="domains"
nameOrig="domains"
comment="saved VM instances"
allocator="highwater"
splitLevel="1"
floor="0"
include=""
exclude=""
useCache="prefer"
cachePool="cache"
cow="auto"
color="yellow-on"
size="0"
free="9091184"
used="32831348"
luksStatus="0"
["isos"]
name="isos"
nameOrig="isos"
comment="ISO images"
allocator="highwater"
splitLevel=""
floor="0"
include=""
exclude=""
useCache="yes"
cachePool="cache"
cow="auto"
color="yellow-on"
size="0"
free="9091184"
used="32831348"
luksStatus="0"
["system"]
name="system"
nameOrig="system"
comment="system data"
allocator="highwater"
splitLevel="1"
floor="0"
include=""
exclude=""
useCache="prefer"
cachePool="cache"
cow="auto"
color="yellow-on"
size="0"
free="9091184"
used="32831348"
luksStatus="0"
["system.with.periods"]
name="system.with.periods"
nameOrig="system.with.periods"
comment="system data with periods"
allocator="highwater"
splitLevel="1"
floor="0"
include=""
exclude=""
useCache="prefer"
cachePool="cache"
cow="auto"
color="yellow-on"
size="0"
free="9091184"
used="32831348"
luksStatus="0"
["system.with.🚀"]
name="system.with.🚀"
nameOrig="system.with.🚀"
comment="system data with 🚀"
allocator="highwater"
splitLevel="1"
floor="0"
include=""
exclude=""
useCache="prefer"
cachePool="cache"
cow="auto"
color="yellow-on"
size="0"
free="9091184"
used="32831348"
luksStatus="0"

View File

@@ -1,15 +0,0 @@
["root"]
idx="0"
name="root"
desc="Console and webGui login account"
passwd="yes"
["xo"]
idx="1"
name="xo"
desc=""
passwd="yes"
["test_user"]
idx="2"
name="test_user"
desc=""
passwd="no"

View File

@@ -87,7 +87,7 @@ shareAvahiSMBModel="Xserve"
shfs_logging="1"
safeMode="no"
startMode="Normal"
configValid="ineligible"
configValid="yes"
joinStatus="Not joined"
deviceCount="4"
flashGUID="0000-0000-0000-000000000000"
@@ -102,7 +102,6 @@ regTm="1833409182"
regTm2="0"
regExp=""
regGen="0"
regState="ENOKEYFILE"
sbName="/boot/config/super.dat"
sbVersion="2.9.13"
sbUpdated="1596079143"

View File

@@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 0 222.36 39.04"><defs><linearGradient id="header-logo" x1="47.53" y1="79.1" x2="170.71" y2="-44.08" gradientUnits="userSpaceOnUse"><stop offset="0" stop-color="#e32929"/><stop offset="1" stop-color="#ff8d30"/></linearGradient></defs><title>unraid.net</title><path d="M146.7,29.47H135l-3,9h-6.49L138.93,0h8l13.41,38.49h-7.09L142.62,6.93l-5.83,16.88h8ZM29.69,0V25.4c0,8.91-5.77,13.64-14.9,13.64S0,34.31,0,25.4V0H6.54V25.4c0,5.17,3.19,7.92,8.25,7.92s8.36-2.75,8.36-7.92V0ZM50.86,12v26.5H44.31V0h6.11l17,26.5V0H74V38.49H67.9ZM171.29,0h6.54V38.49h-6.54Zm51.07,24.69c0,9-5.88,13.8-15.17,13.8H192.67V0H207.3c9.18,0,15.06,4.78,15.06,13.8ZM215.82,13.8c0-5.28-3.3-8.14-8.52-8.14h-8.08V32.77h8c5.33,0,8.63-2.8,8.63-8.08ZM108.31,23.92c4.34-1.6,6.93-5.28,6.93-11.55C115.24,3.68,110.18,0,102.48,0H88.84V38.49h6.55V5.66h6.87c3.8,0,6.21,1.82,6.21,6.71s-2.41,6.76-6.21,6.76H98.88l9.21,19.36h7.53Z" fill="url(#header-logo)"/></svg>


View File

@@ -1,12 +1,28 @@
x-common: &common
version: '3.8'
x-volumes: &volumes
volumes:
- ./:/app
- pnpm-store:/pnpm/store
- ../libvirt:/libvirt
environment:
- IS_DOCKER=true
- GIT_SHA=${GIT_SHA:-unknown}
- IS_TAGGED=${IS_TAGGED:-false}
- ./dev:/app/dev
- ./src:/app/src
- ./patches:/app/patches
- ./package.json:/app/package.json
- ./package-lock.json:/app/package-lock.json
- ./tsconfig.json:/app/tsconfig.json
- ./tsup.config.ts:/app/tsup.config.ts
- ./vite.config.ts:/app/vite.config.ts
- ./dist/:/app/dist/
- ./deploy/:/app/deploy/
- ./README.md:/app/README.md
- ./scripts/:/app/scripts/
- ../.git/:/app/.git/
- ./.env.production:/app/.env.production
- ./.env.staging:/app/.env.staging
- ./.env.test:/app/.env.test
- ./.env.development:/app/.env.development
- ./.pkg-cache:/app/.pkg-cache
- ./codegen.yml:/app/codegen.yml
- ./fix-array-type.cjs:/app/fix-array-type.cjs
- /var/run/docker.sock:/var/run/docker.sock
services:
@@ -18,10 +34,14 @@ services:
context: .
target: development
dockerfile: Dockerfile
<<: *common
<<: *volumes
stdin_open: true
tty: true
entrypoint: /bin/bash
environment:
- IS_DOCKER=true
- GIT_SHA=${GIT_SHA:?err}
- IS_TAGGED=${IS_TAGGED}
profiles:
- builder
@@ -33,23 +53,24 @@ services:
context: .
target: development
dockerfile: Dockerfile
<<: *common
<<: *volumes
command: npm run start:dev
environment:
- IS_DOCKER=true
- GIT_SHA=${GIT_SHA:?err}
- IS_TAGGED=${IS_TAGGED}
profiles:
- builder
builder:
image: unraid-api:builder
environment:
- GIT_SHA=${GIT_SHA:?err}
- IS_TAGGED=${IS_TAGGED}
build:
context: .
target: builder
dockerfile: Dockerfile
<<: *common
<<: *volumes
profiles:
- builder
volumes:
pnpm-store:
name: "pnpm-store"
pnpm-cache:
name: "pnpm-cache"
- builder

View File

@@ -1,124 +0,0 @@
# Working with API plugins
Under the hood, API plugins (i.e. plugins to the `@unraid/api` project) are represented
as npm `peerDependencies`. This is npm's intended package plugin mechanism, and given that
peer dependencies are installed by default as of npm v7, it supports bi-directional plugin functionality,
where the API provides dependencies for the plugin while the plugin provides functionality to the API.
## Private Workspace plugins
### Adding a local workspace package as an API plugin
The challenge with local workspace plugins is that they aren't available via npm during production.
To solve this, we vendor them during the build process. Here's the complete process:
#### 1. Configure the build system
Add your workspace package to the vendoring configuration in `api/scripts/build.ts`:
```typescript
const WORKSPACE_PACKAGES_TO_VENDOR = {
'@unraid/shared': 'packages/unraid-shared',
'unraid-api-plugin-connect': 'packages/unraid-api-plugin-connect',
'your-plugin-name': 'packages/your-plugin-path', // Add your plugin here
} as const;
```
#### 2. Configure Vite
Add your workspace package to the Vite configuration in `api/vite.config.ts`:
```typescript
const workspaceDependencies = {
'@unraid/shared': 'packages/unraid-shared',
'unraid-api-plugin-connect': 'packages/unraid-api-plugin-connect',
'your-plugin-name': 'packages/your-plugin-path', // Add your plugin here
};
```
This ensures the package is:
- Excluded from Vite's optimization during development
- Marked as external during the build process
- Properly handled in SSR mode
#### 3. Configure the API package.json
Add your workspace package as a peer dependency in `api/package.json`:
```json
{
"peerDependencies": {
"unraid-api-plugin-connect": "workspace:*",
"your-plugin-name": "workspace:*"
},
"peerDependenciesMeta": {
"unraid-api-plugin-connect": {
"optional": true
},
"your-plugin-name": {
"optional": true
}
}
}
```
By marking the workspace dependency "optional", npm will not attempt to install it during development.
The "workspace:*" identifier will be invalid during build-time and run-time, but won't cause problems
because the package gets vendored instead.
#### 4. Plugin package setup
Your workspace plugin package should:
1. **Export types and main entry**: Set up proper `main`, `types`, and `exports` fields:
```json
{
"name": "your-plugin-name",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"type": "module",
"exports": {
".": {
"types": "./dist/index.d.ts",
"import": "./dist/index.js"
}
},
"files": ["dist"]
}
```
2. **Use peer dependencies**: Declare shared dependencies as peer dependencies to avoid duplication:
```json
{
"peerDependencies": {
"@nestjs/common": "^11.0.11",
"@nestjs/core": "^11.0.11",
"graphql": "^16.9.0"
}
}
```
3. **Include build script**: Add a build script that compiles TypeScript:
```json
{
"scripts": {
"build": "tsc",
"prepare": "npm run build"
}
}
```
#### 5. Build process
During production builds:
1. The build script (`api/scripts/build.ts`) will automatically pack and install your workspace package as a tarball
2. This happens after `npm install --omit=dev` in the pack directory
3. The vendored package becomes a regular node_modules dependency in the final build
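A rough sketch of what that vendoring step can look like (illustrative only — the real logic lives in `api/scripts/build.ts` and may differ in detail):
```typescript
// vendor-workspace-packages.ts — illustrative sketch, not the actual build script.
import { execSync } from 'node:child_process';
import path from 'node:path';

// Mirrors the WORKSPACE_PACKAGES_TO_VENDOR mapping shown earlier in this doc.
const WORKSPACE_PACKAGES_TO_VENDOR = {
    'unraid-api-plugin-connect': 'packages/unraid-api-plugin-connect',
} as const;

export function vendorWorkspacePackages(packDir: string, repoRoot: string): void {
    for (const [name, relativePath] of Object.entries(WORKSPACE_PACKAGES_TO_VENDOR)) {
        const packageDir = path.join(repoRoot, relativePath);
        // Pack the workspace package into a tarball; `npm pack` prints the tarball filename.
        const tarball = execSync('npm pack --pack-destination .', { cwd: packageDir })
            .toString()
            .trim();
        // Install the tarball into the pack directory so it becomes a regular
        // node_modules dependency instead of an unresolvable "workspace:*" version.
        execSync(`npm install --omit=dev --no-save ${path.join(packageDir, tarball)}`, {
            cwd: packDir,
        });
        console.log(`vendored ${name} from ${relativePath}`);
    }
}
```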
#### 6. Development vs Production
- **Development**: Vite resolves workspace packages directly from their source
- **Production**: Packages are vendored as tarballs in `node_modules`
This approach ensures that workspace plugins work seamlessly in both development and production environments.

View File

@@ -1,116 +0,0 @@
# Development
## Installation
Manual install of the staging and production plugins can be done with the following routes:
[production](https://stable.dl.unraid.net/unraid-api/dynamix.unraid.net.plg)
[staging](https://preview.dl.unraid.net/unraid-api/dynamix.unraid.net.staging.plg)
## Connecting to the API
### HTTP
This can be accessed by default via `http://tower.local/graphql`.
See <https://graphql.org/learn/serving-over-http/#http-methods-headers-and-body>
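If you prefer to hit the endpoint without the playground, here is a minimal sketch of a raw HTTP request (the endpoint, the `x-api-key` header, and the `welcome` query mirror the playground example later in this document; adjust them for your own server):
```typescript
// query-welcome.ts — minimal sketch; requires Node 18+ (built-in fetch) and an ESM context.
const endpoint = 'http://tower.local/graphql';
const apiKey = process.env.UNRAID_API_KEY ?? '__REPLACE_ME_WITH_API_KEY__';

const response = await fetch(endpoint, {
    method: 'POST',
    headers: {
        'content-type': 'application/json',
        'x-api-key': apiKey,
    },
    body: JSON.stringify({ query: 'query welcome { welcome { message } }' }),
});

console.log(JSON.stringify(await response.json(), null, 2));
```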
## Building in Docker
To get a development environment for testing, start by running these commands:
`npm run build:docker`
`npm run start:ddev`
which will give you an interactive shell inside the newly built Linux container.
To automatically build the plugin, run the command below:
`npm run build:docker`
The builder command builds the plugin into deploy/release, while the interactive shell lets you build the plugin or install node modules however you like.
## Logs
Logging can be configured via environment variables.
The log level can be set when the API starts via `LOG_LEVEL=all/trace/debug/info/warn/error/fatal/mark/off`.
Additional detail can be added to each log entry with `LOG_CONTEXT=true` (warning: this generates a lot of data).
By default, logs are sent to syslog. Set `LOG_TRANSPORT=file` to save logs to `/var/log/unraid-api/stdout.log`, or enable debug mode to view logs inline.
Examples:
- `unraid-api start`
- `LOG_LEVEL=debug unraid-api start --debug`
- `LOG_LEVEL=trace LOG_CONTEXT=true LOG_TRANSPORT=file unraid-api start`
## Viewing data sent to mothership
If the environment variable `LOG_MOTHERSHIP_MESSAGES=true` is set, any data the unraid-api sends to mothership will be saved in clear text in `/var/log/unraid-api/relay-messages.log`
Examples:
- `LOG_MOTHERSHIP_MESSAGES=true unraid-api start`
- `LOG_MOTHERSHIP_MESSAGES=true LOG_LEVEL=debug unraid-api start --debug`
## Debug Logging
To view debug logs, change the log level when starting the API, then run `unraid-api logs` to tail the logs.
Examples:
- `LOG_LEVEL=debug unraid-api start`
- `unraid-api logs`
## Switching between staging and production environments
1. Stop the api: `unraid-api stop`
2. Switch environments: `unraid-api switch-env`
3. Start the api: `unraid-api start`
4. Confirm the environment: `unraid-api report`
## Playground
The playground can be accessed via `http://tower.local/graphql` while in debug mode.
To get your API key, open a terminal on your server and run `cat /boot/config/plugins/dynamix.my.servers/myservers.cfg | grep apikey=\"unraid | cut -d '"' -f2`. Add that API key to the "HTTP headers" panel of the playground.
```json
{
"x-api-key": "__REPLACE_ME_WITH_API_KEY__"
}
```
Next add the query you want to run and hit the play icon.
```gql
query welcome {
welcome {
message
}
}
```
You should get something like this back.
```json
{
"data": {
"welcome": {
"message": "Welcome root to this Unraid 6.10.0 server"
}
}
}
```
Click the "Schema" and "Docs" button on the right side of the playground to learn more.
## Create a new release
To create a new version run `npm run release` and then run **ONLY** the `git push` section of the commands it returns.
To create a new prerelease run `npm run release -- --prerelease alpha`.
Pushing to this repo will cause an automatic "rolling" release to be built, which can be accessed via the page for the associated GitHub Actions run.
## Using a custom version (e.g. testing a new release)
Find the Pull Request you'd like to install; a comment on that PR will contain a link for installing the PR-specific version.


@@ -1,247 +0,0 @@
# Feature Flags
Feature flags allow you to conditionally enable or disable functionality in the Unraid API. This is useful for gradually rolling out new features, A/B testing, or keeping experimental code behind flags during development.
## Setting Up Feature Flags
### 1. Define the Feature Flag
Feature flags are defined as environment variables and collected in `src/consts.ts`:
```typescript
// src/environment.ts
export const ENABLE_MY_NEW_FEATURE = process.env.ENABLE_MY_NEW_FEATURE === 'true';
// src/consts.ts
import { ENABLE_MY_NEW_FEATURE, ENABLE_NEXT_DOCKER_RELEASE } from './environment.js'; // import path shown for illustration; adjust to your layout
export const FeatureFlags = Object.freeze({
ENABLE_NEXT_DOCKER_RELEASE,
ENABLE_MY_NEW_FEATURE, // Add your new flag here
});
```
### 2. Set the Environment Variable
Set the environment variable when running the API:
```bash
ENABLE_MY_NEW_FEATURE=true unraid-api start
```
Or add it to your `.env` file:
```env
ENABLE_MY_NEW_FEATURE=true
```
## Using Feature Flags in GraphQL
### Method 1: @UseFeatureFlag Decorator (Schema-Level)
The `@UseFeatureFlag` decorator conditionally includes GraphQL fields, queries, and mutations in the schema based on feature flags. When a feature flag is disabled, the field won't appear in the GraphQL schema at all.
```typescript
import { UseFeatureFlag } from '@app/unraid-api/decorators/use-feature-flag.decorator.js';
import { Mutation, Query, ResolveField, Resolver } from '@nestjs/graphql';
@Resolver()
export class MyResolver {
// Conditionally include a query
@UseFeatureFlag('ENABLE_MY_NEW_FEATURE')
@Query(() => String)
async experimentalQuery() {
return 'This query only exists when ENABLE_MY_NEW_FEATURE is true';
}
// Conditionally include a mutation
@UseFeatureFlag('ENABLE_MY_NEW_FEATURE')
@Mutation(() => Boolean)
async experimentalMutation() {
return true;
}
// Conditionally include a field resolver
@UseFeatureFlag('ENABLE_MY_NEW_FEATURE')
@ResolveField(() => String)
async experimentalField() {
return 'This field only exists when the flag is enabled';
}
}
```
**Benefits:**
- Clean schema - disabled features don't appear in GraphQL introspection
- No runtime overhead for disabled features
- Clear feature boundaries
**Use when:**
- You want to completely hide features from the GraphQL schema
- The feature is experimental or in beta
- You're doing a gradual rollout
### Method 2: checkFeatureFlag Function (Runtime)
The `checkFeatureFlag` function provides runtime feature flag checking within resolver methods. It throws a `ForbiddenException` if the feature is disabled.
```typescript
import { checkFeatureFlag } from '@app/unraid-api/utils/feature-flag.helper.js';
import { FeatureFlags } from '@app/consts.js';
import { Args, Query, ResolveField, Resolver } from '@nestjs/graphql';
@Resolver()
export class MyResolver {
@Query(() => String)
async myQuery(
@Args('useNewAlgorithm', { nullable: true }) useNewAlgorithm?: boolean
) {
// Conditionally use new logic based on feature flag
if (useNewAlgorithm) {
checkFeatureFlag(FeatureFlags, 'ENABLE_MY_NEW_FEATURE');
return this.newAlgorithm();
}
return this.oldAlgorithm();
}
@ResolveField(() => String)
async dataField() {
// Check flag at the start of the method
checkFeatureFlag(FeatureFlags, 'ENABLE_MY_NEW_FEATURE');
// Feature-specific logic here
return this.computeExperimentalData();
}
}
```
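For intuition, the helper itself can be thought of as a thin guard along these lines — an illustrative sketch, not the actual implementation of `feature-flag.helper.ts`:
```typescript
// Illustrative sketch — not the actual implementation of checkFeatureFlag.
import { ForbiddenException } from '@nestjs/common';

export function checkFeatureFlag(flags: Record<string, boolean>, flag: string): void {
    if (!flags[flag]) {
        throw new ForbiddenException(`Feature flag ${flag} is not enabled`);
    }
}
```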
**Benefits:**
- More granular control within methods
- Can conditionally execute parts of a method
- Useful for A/B testing scenarios
- Good for gradual migration strategies
**Use when:**
- You need conditional logic within a method
- The field should exist but behavior changes based on the flag
- You're migrating from old to new implementation gradually
## Feature Flag Patterns
### Pattern 1: Complete Feature Toggle
Hide an entire feature behind a flag:
```typescript
@UseFeatureFlag('ENABLE_DOCKER_TEMPLATES')
@Resolver(() => DockerTemplate)
export class DockerTemplateResolver {
// All resolvers in this class are toggled by the flag
}
```
### Pattern 2: Gradual Migration
Migrate from old to new implementation:
```typescript
@Query(() => [Container])
async getContainers(@Args('version') version?: string) {
if (version === 'v2') {
checkFeatureFlag(FeatureFlags, 'ENABLE_CONTAINERS_V2');
return this.getContainersV2();
}
return this.getContainersV1();
}
```
### Pattern 3: Beta Features
Mark features as beta:
```typescript
@UseFeatureFlag('ENABLE_BETA_FEATURES')
@ResolveField(() => BetaMetrics, {
description: 'BETA: Advanced metrics (requires ENABLE_BETA_FEATURES flag)'
})
async betaMetrics() {
return this.computeBetaMetrics();
}
```
### Pattern 4: Performance Optimizations
Toggle expensive operations:
```typescript
@ResolveField(() => Statistics)
async statistics() {
const basicStats = await this.getBasicStats();
try {
checkFeatureFlag(FeatureFlags, 'ENABLE_ADVANCED_ANALYTICS');
const advancedStats = await this.getAdvancedStats();
return { ...basicStats, ...advancedStats };
} catch {
// Feature disabled, return only basic stats
return basicStats;
}
}
```
## Testing with Feature Flags
When writing tests for feature-flagged code, create a mock to control feature flag values:
```typescript
import { vi } from 'vitest';
// Mock the entire consts module
vi.mock('@app/consts.js', async () => {
const actual = await vi.importActual('@app/consts.js');
return {
...actual,
FeatureFlags: {
ENABLE_MY_NEW_FEATURE: true, // Set your test value
ENABLE_NEXT_DOCKER_RELEASE: false,
}
};
});
describe('MyResolver', () => {
it('should execute new logic when feature is enabled', async () => {
// Test new behavior with mocked flag
});
});
```
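Because `vi.mock` is hoisted, the mock above fixes the flag for the whole test file. To also cover the disabled state, one option is to reset modules and re-mock with the flag off before dynamically importing the code under test — a sketch that assumes the module paths used in the examples above:
```typescript
import { describe, expect, it, vi } from 'vitest';

describe('MyResolver (feature disabled)', () => {
    it('rejects the experimental path when the flag is off', async () => {
        vi.resetModules();
        // vi.doMock is not hoisted, so it only affects modules imported after this call
        vi.doMock('@app/consts.js', async () => {
            const actual = await vi.importActual<typeof import('@app/consts.js')>('@app/consts.js');
            return {
                ...actual,
                FeatureFlags: { ...actual.FeatureFlags, ENABLE_MY_NEW_FEATURE: false },
            };
        });

        const { FeatureFlags } = await import('@app/consts.js');
        const { checkFeatureFlag } = await import(
            '@app/unraid-api/utils/feature-flag.helper.js'
        );

        expect(() => checkFeatureFlag(FeatureFlags, 'ENABLE_MY_NEW_FEATURE')).toThrow();
    });
});
```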
## Best Practices
1. **Naming Convention**: Use `ENABLE_` prefix for boolean feature flags
2. **Environment Variables**: Always use uppercase with underscores
3. **Documentation**: Document what each feature flag controls
4. **Cleanup**: Remove feature flags once features are stable and fully rolled out
5. **Default State**: New features should default to `false` (disabled)
6. **Granularity**: Keep feature flags focused on a single feature or capability
7. **Testing**: Always test both enabled and disabled states
## Common Use Cases
- **Experimental Features**: Hide unstable features in production
- **Gradual Rollouts**: Enable features for specific environments first
- **A/B Testing**: Toggle between different implementations
- **Performance**: Disable expensive operations when not needed
- **Breaking Changes**: Provide migration path with both old and new behavior
- **Debug Features**: Enable additional logging or debugging tools
## Checking Active Feature Flags
To see which feature flags are currently active:
```typescript
// Log all feature flags on startup
console.log('Active Feature Flags:', FeatureFlags);
```
Or check via GraphQL introspection to see which fields are available based on current flags.
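For instance, an introspection query for the top-level query fields can be sent like any other query — a sketch that assumes the API is reachable at the usual endpoint and that a valid API key is supplied:
```typescript
// Sketch: listing the query fields currently exposed by the schema.
// Fields hidden by @UseFeatureFlag are simply absent from the result.
const response = await fetch('http://tower.local/graphql', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'x-api-key': '__REPLACE_ME_WITH_API_KEY__',
    },
    body: JSON.stringify({
        query: '{ __schema { queryType { fields { name } } } }',
    }),
});

const { data } = await response.json();
console.log(data.__schema.queryType.fields.map((f: { name: string }) => f.name));
```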

Some files were not shown because too many files have changed in this diff.