Mirror of https://github.com/papra-hq/papra.git (synced 2025-12-19 12:19:37 -06:00)

Compare commits (35 commits)
| SHA1 |
|---|
| 5140a64c40 |
| 9ddb7d545d |
| 2a73551ca4 |
| 7be56455b0 |
| 1085bf079c |
| b13986e1e3 |
| d4462f942b |
| 2f2ad90fd3 |
| 2bbb68aa17 |
| 2b2827cdb3 |
| 4b4621e4d0 |
| fd0f79feb0 |
| b9c2448805 |
| 542225fabc |
| e4af2653ea |
| 4dd15527c0 |
| ae0f69043d |
| 79eafdb3ee |
| 979df5dad8 |
| 76c50dce6c |
| 8acd7de79e |
| a3bd2024c6 |
| 25c26e8dc0 |
| 07563dce5d |
| a0797beb14 |
| 0701a84973 |
| 0789ad3e9a |
| cb3c9c3194 |
| f9b02c4439 |
| 9afca3fd84 |
| faca409604 |
| fc973d20fe |
| 400541f0ce |
| f98b810bd4 |
| 091bd26fbc |
.github/ISSUE_TEMPLATE/config.yml (new file, +5, vendored)

@@ -0,0 +1,5 @@
+blank_issues_enabled: true
+contact_links:
+  - name: 💬 Discord Community
+    url: https://papra.app/discord
+    about: Join the Papra Discord community to get help, share your feedback, and stay updated on the project.

.github/papra-screenshot.png (binary, vendored): binary file not shown. Size before: 107 KiB, after: 151 KiB.
.gitignore (vendored, +1)

@@ -37,5 +37,6 @@ cache
*.sqlite

local-documents
+ingestion
.cursorrules
*.traineddata
README.md (33 lines changed)

@@ -22,17 +22,14 @@
  <span> • </span>
  <a href="https://github.com/orgs/papra-hq/projects/2">Roadmap</a>
  <span> • </span>
- <a href="https://discord.gg/8UPjzsrBNF">Discord</a>
+ <a href="https://papra.app/discord">Discord</a>
  <!-- <span> • </span>
  <a href="https://dashboard.papra.app">Managed instance</a> -->
</p>

## Introduction

-> [!IMPORTANT]
-> **Papra** is currently in active development and is not yet ready for production use or self-hosting.

-**Papra** is a minimalistic document management and archiving platform. It is designed to be simple to use and accessible to everyone. Papra is a plateform for long-term document storage and management, like a digital archive for your documents.
+**Papra** is a minimalistic document management and archiving platform. It is designed to be simple to use and accessible to everyone. Papra is a platform for long-term document storage and management, like a digital archive for your documents.

Forget about that receipt of that gift you bought for your friend last year, or that warranty for your new phone. With Papra, you can easily store, forget, and retrieve your documents whenever you need them.

@@ -40,6 +37,15 @@ A live demo of the platform is available at [demo.papra.app](https://demo.papra.

[](https://demo.papra.app)

+## Project Status
+
+Papra is currently in **beta**. The core functionality is stable and usable, but you may encounter occasional bugs or limitations. The project is actively developed, with new features being added regularly.
+
+- ✅ Core document management features are stable
+- ✅ Self-hosting is fully supported
+- 🚧 Some advanced features are still in development
+- 📝 Feedback and bug reports are highly appreciated
+
## Features

- **Document management**: Upload, store, and manage your documents in one place.

@@ -53,15 +59,28 @@ A live demo of the platform is available at [demo.papra.app](https://demo.papra.
- **Tags**: Organize your documents with tags.
- **Email ingestion**: Send/forward emails to a generated address to automatically import documents.
- **Content extraction**: Automatically extract text from images or scanned documents for search.
-- **i18n**: Support for multiple languages.
+- *In progress:* **i18n**: Support for multiple languages.
- *Coming soon:* **Tagging Rules**: Automatically tag documents based on custom rules.
- *Coming soon:* **Folder ingestion**: Automatically import documents from a folder.
- *Coming soon:* **SDK and API**: Build your own applications on top of Papra.
- *Coming soon:* **CLI**: Manage your documents from the command line.
- *Coming soon:* **Document sharing**: Share documents with others.
- *Coming soon:* **Document requests**: Generate upload links for people to add documents.
- *Coming maybe one day:* **Mobile app**: Access and upload documents on the go.
- *Coming maybe one day:* **Desktop app**: Access and upload documents from your computer.

## Self-hosting

Papra is dedicated to providing a simple yet highly configurable self-hosting experience. Our lightweight Docker image (<200MB) is compatible with multiple architectures including x86, ARM64, and ARMv7.

For a quick start, simply run the following command:

```bash
docker run -d --name papra -p 1221:1221 ghcr.io/papra-hq/papra:latest
```

Please refer to the [self-hosting documentation](https://docs.papra.app/self-hosting/using-docker) for more information and configuration options.

## Contributing

Contributions are welcome! Please refer to the [`CONTRIBUTING.md`](./CONTRIBUTING.md) file for guidelines on how to get started, report issues, and submit pull requests.

@@ -73,7 +92,7 @@ This project is licensed under the AGPL-3.0 License - see the [LICENSE](./LICENS

## Community

-Join the community on [Papra's Discord server](https://discord.gg/8UPjzsrBNF) to discuss the project, ask questions, or get help.
+Join the community on [Papra's Discord server](https://papra.app/discord) to discuss the project, ask questions, or get help.

## Credits
@@ -26,7 +26,10 @@ export default defineConfig({
    social: {
      github: 'https://github.com/papra-hq/papra',
      blueSky: 'https://bsky.app/profile/papra.app',
-     discord: 'https://discord.gg/8UPjzsrBNF',
+     discord: 'https://papra.app/discord',
    },
+   expressiveCode: {
+     themes: ['vitesse-black', 'vitesse-light'],
+   },
    editLink: {
      baseUrl: 'https://github.com/papra-hq/papra/edit/main/apps/docs/',
@@ -1,7 +1,7 @@
{
  "name": "@papra/docs",
  "type": "module",
- "version": "0.1.2",
+ "version": "0.3.0",
  "packageManager": "pnpm@9.15.4",
  "description": "Papra documentation website",
  "author": "Corentin Thomasset <corentinth@proton.me> (https://corentin.tech)",

@@ -20,7 +20,8 @@
  "@astrojs/starlight": "^0.31.0",
  "astro": "^5.1.5",
  "sharp": "^0.32.5",
- "starlight-theme-rapide": "^0.3.0"
+ "starlight-theme-rapide": "^0.3.0",
+ "zod-to-json-schema": "^3.24.5"
},
"devDependencies": {
  "@antfu/eslint-config": "^3.13.0",

@@ -29,7 +30,7 @@
  "@unocss/reset": "^0.64.0",
  "eslint": "^9.17.0",
  "eslint-plugin-astro": "^1.3.1",
- "figue": "^2.2.0",
+ "figue": "^2.2.2",
  "lodash-es": "^4.17.21",
  "marked": "^15.0.6",
  "typescript": "^5.7.3"
@@ -1,7 +1,10 @@
:root[data-theme='dark'] {
- --background-color: #050505!important;
+ --background-color: #0c0d0f!important;
  --accent-color: #fff!important;
- --foreground-color: #f6f6f6!important;
+ --foreground-color: #9ea3a2!important;
  --sl-color-white: #f3f7f6!important;

+ --surface: #0a0b0d!important;

  --sl-color-text: var(--foreground-color)!important;

@@ -13,6 +16,33 @@
  --sl-color-bg-sidebar: var(--background-color)!important;
}

+.sl-link-card {
+  background-color: var(--surface)!important;
+}
+
+.hero .sl-link-button {
+  background-color: var(--sl-color-text)!important;
+  border-color: transparent!important;
+  color: var(--sl-color-bg)!important;
+  border-radius: 0.8rem!important;
+  transition: opacity 0.2s ease-in-out!important;
+  font-weight: 500!important;
+}
+
+:root[data-theme='dark'] .hero .sl-link-button {
+  background-color: var(--accent-color)!important;
+  color: var(--background-color)!important;
+}
+
+.hero .sl-link-button:hover {
+  opacity: 0.8!important;
+}
+
+#_top {
+  padding-top: 10px!important;
+  padding-bottom: 30px!important;
+}
+
.site-title {
  color:inherit !important;
  gap: 0.5rem !important;
@@ -52,4 +52,34 @@ ${documentation}

`.trim()).join('\n\n---\n\n');

-export { mdSections };
+function wrapText(text: string, maxLength = 75) {
+  const words = text.split(' ');
+  const lines: string[] = [];
+  let currentLine = '';
+
+  words.forEach((word) => {
+    if ((currentLine + word).length + 1 <= maxLength) {
+      currentLine += (currentLine ? ' ' : '') + word;
+    } else {
+      lines.push(currentLine);
+      currentLine = word;
+    }
+  });
+
+  if (currentLine) {
+    lines.push(currentLine);
+  }
+
+  return lines.map(line => `# ${line}`);
+}
+
+const fullDotEnv = rows.map(({ env, defaultValue, documentation }) => {
+  const isEmptyDefaultValue = isNil(defaultValue) || (isArray(defaultValue) && isEmpty(defaultValue)) || defaultValue === '';
+
+  return [
+    ...wrapText(documentation),
+    `# ${env}=${isEmptyDefaultValue ? '' : defaultValue}`,
+  ].join('\n');
+}).join('\n\n');
+
+export { fullDotEnv, mdSections };
(deleted file)

@@ -1,61 +0,0 @@
---
title: Installing Papra using Docker
description: Self-host Papra using Docker.
slug: self-hosting/using-docker
---

Papra can be easily installed and run using Docker. This method is recommended for users who want a quick and straightforward way to deploy their own instance of Papra with minimal setup.

- Single lightweight image
- Only one container to manage
- Available for all platforms (arm64, arm/v7, x86_64)
- Root and Rootless image variants

## Prerequisites

Before you begin, ensure that you have Docker installed on your system. You can download and install Docker from the official [Docker website](https://www.docker.com/get-started).

Verify your installation:

```bash frame="none"
docker --version
```

## Installation

You can run Papra using the following command:

```bash frame="none"
docker run -d --name papra --restart unless-stopped -p 1221:1221 ghcr.io/papra-hq/papra
```

It will automatically download the latest image and start the container. The application will be available at [http://localhost:1221](http://localhost:1221).

## Root and Rootless installation

Papra can be installed in two different ways:

- **Rootless** (recommended): This method does not require root privileges to run. The images are suffixed with `-rootless` like `corentinth/papra:latest-rootless` or `corentinth/papra:1.0.0-rootless` and the default `:latest` tag points to the latest rootless image.
- **Root**: This is the default installation method. It requires root privileges to run. The images are suffixed with `-root` like `corentinth/papra:latest-root` or `corentinth/papra:1.0.0-root`.

## Image Sources

Papra Docker images are available on both **Docker Hub** and **GitHub Container Registry** (GHCR). You can choose the source that best suits your needs.

```bash frame="none"
# Using Docker Hub
docker pull corentinth/papra:latest
docker pull corentinth/papra:latest-rootless
docker pull corentinth/papra:latest-root

# Using GitHub Container Registry
docker pull ghcr.io/papra-hq/papra:latest
docker pull ghcr.io/papra-hq/papra:latest-rootless
docker pull ghcr.io/papra-hq/papra:latest-root
```

## Basic Usage

```bash frame="none"
docker run -d --name papra --restart unless-stopped -p 1221:1221 ghcr.io/papra-hq/papra:latest-root
```
apps/docs/src/content/docs/02-self-hosting/01-using-docker.mdx (new file, +111)

@@ -0,0 +1,111 @@
---
title: Installing Papra using Docker
description: Self-host Papra using Docker.
slug: self-hosting/using-docker
---
import { Steps } from '@astrojs/starlight/components';

Papra provides optimized Docker images for streamlined deployment. This method is recommended for users seeking a production-ready setup with minimal maintenance overhead.

- **Simplified management**: Single container handles all components
- **Lightweight**: Optimized image sizes across architectures
- **Cross-platform support**: Compatible with `arm64`, `arm/v7`, and `x86_64` systems
- **Security options**: Supports both rootless (recommended) and rootful configurations

## Prerequisites

Ensure Docker is installed on your host system. Official installation guides are available at:
[docker.com/get-started](https://www.docker.com/get-started)

Verify Docker installation with:

```bash
docker --version
```

## Quick Deployment

Launch Papra with default configuration using:

```bash
docker run -d \
  --name papra \
  --restart unless-stopped \
  -p 1221:1221 \
  ghcr.io/papra-hq/papra:latest
```

This command will:
1. Pull the latest rootless image from GitHub Container Registry
2. Expose the web interface on [http://localhost:1221](http://localhost:1221)
3. Configure automatic restarts for service continuity

## Image Variants

Choose between two security models based on your requirements:

- **Rootless**: Tagged as `latest`, `latest-rootless` or `<version>-rootless` (like `0.2.1-rootless`). Recommended for most users.
- **Root**: Tagged as `latest-root` or `<version>-root` (like `0.2.1-root`). Only use if you need to run Papra as the root user.

The `:latest` tag always references the latest rootless build.

## Persistent Data Configuration

For production deployments, mount host directories to preserve application data between container updates.

<Steps>

1. Create Storage Directories

   Create a directory for Papra data `./papra-data`, with `./papra-data/db` and `./papra-data/documents` subdirectories:

   ```bash
   mkdir -p ./papra-data/{db,documents}
   ```

2. Launch Container with Volume Binding

   ```bash
   docker run -d \
     --name papra \
     --restart unless-stopped \
     -p 1221:1221 \
     -v $(pwd)/papra-data:/app/app-data \
     --user $(id -u):$(id -g) \
     ghcr.io/papra-hq/papra:latest
   ```

   This configuration:
   - Maintains data integrity across container lifecycle events
   - Enforces proper file ownership without manual permission adjustments
   - Stores both database files and document assets persistently

</Steps>

## Image Registries

Papra images are distributed through multiple channels:

**Primary Source (GHCR):**
```bash
docker pull ghcr.io/papra-hq/papra:latest
docker pull ghcr.io/papra-hq/papra:latest-rootless
docker pull ghcr.io/papra-hq/papra:latest-root
```

**Community Mirror (Docker Hub):**
```bash
docker pull corentinth/papra:latest
docker pull corentinth/papra:latest-rootless
docker pull corentinth/papra:latest-root
```

## Updating Papra

Regularly pull updated images and recreate containers to receive security patches and feature updates.

```bash
docker pull ghcr.io/papra-hq/papra:latest
# Or
docker pull corentinth/papra:latest
```
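As a sketch of the full update flow described above (pull, then recreate), assuming the container was started with the volume-mount flags shown in the Persistent Data Configuration step; adapt the image and flags to your own setup:

```bash
# Pull the newest image (GHCR image and container name as used above)
docker pull ghcr.io/papra-hq/papra:latest

# Recreate the container so it runs the new image; data lives in the mounted volume
docker stop papra
docker rm papra
docker run -d \
  --name papra \
  --restart unless-stopped \
  -p 1221:1221 \
  -v $(pwd)/papra-data:/app/app-data \
  --user $(id -u):$(id -g) \
  ghcr.io/papra-hq/papra:latest
```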
@@ -5,18 +5,35 @@ slug: self-hosting/using-docker-compose

import { Steps } from '@astrojs/starlight/components';

-Docker Compose makes it easy to deploy Papra on your server or local machine. Follow these simple steps to get your instance up and running quickly.
+This guide covers how to deploy Papra using Docker Compose, ideal for users who prefer declarative configurations or plan to integrate Papra into a broader service stack.
+
+Using Docker Compose provides:
+- A single, versioned configuration file
+- Easy integration with volumes, networks, and service dependencies
+- Simplified updates and re-deployments
+
+This method supports both `rootless` and `rootful` Papra images; please refer to the [Docker](/self-hosting/using-docker) guide for more information about the difference between the two. The following example uses the recommended `rootless` setup.
+
+## Prerequisites
+
+Ensure Docker and Docker Compose are installed on your host system. Official installation guides are available at: [docker.com/get-started](https://www.docker.com/get-started)
+
+Verify Docker installation with:
+
+```bash
+docker --version
+docker compose version
+```

<Steps>

-1. Prepare your environment
+1. Initialize Project Structure

-   Make sure you have [Docker](https://www.docker.com/get-started) and [Docker Compose](https://docs.docker.com/compose/install/) installed on your system.
+   Create a working directory and persistent storage subdirectories:

-   Verify your installation:
-
-   ```bash frame="none"
-   docker --version
+   ```bash
+   mkdir -p papra/app-data/{db,documents} && cd papra
   ```

2. Create Docker Compose file

@@ -26,16 +43,14 @@ Docker Compose makes it easy to deploy Papra on your server or local machine. Fo
   ```yaml
   services:
     papra:
-      image: corentinth/papra:latest-rootless
-      ports:
-        - '1221:1221'
-      volumes:
-        - papra-data:/app/app-data
+      container_name: papra
+      image: ghcr.io/papra-hq/papra:latest
+      restart: unless-stopped
-
-  volumes:
-    papra-data:
-      driver: local
+      ports:
+        - "1221:1221"
+      volumes:
+        - ./app-data:/app/app-data
+      user: "${UID}:${GID}"
   ```

3. Start Papra

@@ -43,10 +58,10 @@
   From the directory containing your `docker-compose.yml` file, run:

   ```bash
-  docker compose up -d
+  UID=$(id -u) GID=$(id -g) docker compose up -d
   ```

-  This command downloads the latest Papra image, sets up the container, and starts the Papra service.
+  This command downloads the latest Papra image, sets up the container, and starts the Papra service. The `UID` and `GID` variables are used to set the user and group for the container, ensuring proper file ownership. If you don't want to use the `UID` and `GID` variables, you can replace the image with the rootful variant.

4. Access Papra
@@ -4,15 +4,85 @@ slug: self-hosting/configuration

---

-import { mdSections } from '../../../config.data.ts';
+import { mdSections, fullDotEnv } from '../../../config.data.ts';
import { marked } from 'marked';
+import { Tabs, TabItem } from '@astrojs/starlight/components';
+import { Aside } from '@astrojs/starlight/components';
+import { Code } from '@astrojs/starlight/components';

Configuring your self-hosted Papra allows you to customize the application to better suit your environment and requirements. This guide covers the key environment variables you can set to control various aspects of the application, including port settings, security options, and storage configurations.

+## Configuration files
+
+You can configure Papra using standard environment variables or configuration files.
+Papra uses [c12](https://github.com/unjs/c12) to load configuration files and [figue](https://github.com/CorentinTh/figue) to validate and merge environment variables and configuration files.
+
+[c12](https://github.com/unjs/c12) lets you use the file format you want. The configuration file should be named `papra.config.[ext]` and should be located in the root of the project, or in the `/app/app-data/` directory in the Docker container (this can be changed using the `PAPRA_CONFIG_DIR` environment variable).
+
+The supported formats are: `json`, `jsonc`, `json5`, `yaml`, `yml`, `toml`, `js`, `ts`, `cjs`, `mjs`.
+
+Example of configuration files:
+<Tabs>
+<TabItem label="papra.config.yaml">
+```yaml
+server:
+  baseUrl: https://papra.example.com
+  corsOrigins: *
+
+client:
+  baseUrl: https://papra.example.com
+
+auth:
+  secret: your-secret-key
+  isRegistrationEnabled: true
+# ...
+```
+</TabItem>
+
+<TabItem label="papra.config.json">
+```json
+{
+  "$schema": "https://docs.papra.com/papra-config-schema.json",
+  "server": {
+    "baseUrl": "https://papra.example.com"
+  },
+  "client": {
+    "baseUrl": "https://papra.example.com"
+  },
+  "auth": {
+    "secret": "your-secret-key",
+    "isRegistrationEnabled": true
+  }
+}
+```
+
+<Aside type="tip">
+When using an IDE, you can use the [papra-config-schema.json](/papra-config-schema.json) file to get autocompletion for the configuration file. Just add a `$schema` property to your configuration file and point it to the schema file.
+
+```json
+{
+  "$schema": "https://docs.papra.com/papra-config-schema.json",
+  // ...
+}
+```
+
+</Aside>
+
+</TabItem>
+</Tabs>
+
+You'll find the complete list of configuration variables, with their environment variable equivalents and their paths for configuration files, in the next section.
+
+## Complete .env
+
+Here is the full configuration file that you can use to configure Papra. The values shown are the defaults.
+
+<Code code={fullDotEnv} language="env" title=".env" />

## Configuration variables

Here is the complete list of configuration variables that you can use to configure Papra. You can set these variables in the `.env` file or pass them as environment variables when running the Docker container.

<Fragment set:html={marked.parse(mdSections)} />

-Coming soon.
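As a minimal sketch of the environment-variable route mentioned above (the variable shown is one that appears elsewhere in this changeset; substitute your own settings or a prepared `.env` file):

```bash
# Pass a single setting at container start, or load a whole file with --env-file
docker run -d \
  --name papra \
  -p 1221:1221 \
  -e INGESTION_FOLDER_IS_ENABLED=true \
  --env-file .env \
  ghcr.io/papra-hq/papra:latest
```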
@@ -55,11 +55,12 @@ By integrating Papra with OwlRelay, your instance will generate email addresses
# this is to authenticate that the emails are coming from OwlRelay
INTAKE_EMAILS_WEBHOOK_SECRET=a-random-key

-# Optional: This is the URL that OwlRelay will send the emails to,
+# [Optional]
+# This is the URL that OwlRelay will send the emails to,
# if not provided, the webhook will be inferred from the server URL.
# Can be relevant if you have multiple urls pointing to your Papra instance
# or when using tunnel services
-OWLRELAY_WEBHOOK_URL=https://your-instance.com/api/intake-emails/owlrelay
+OWLRELAY_WEBHOOK_URL=https://your-instance.com/api/intake-emails/ingest
```

4. **That's it!**
(new file, +105)

@@ -0,0 +1,105 @@
---
title: Setup Ingestion Folder
description: Step-by-step guide to set up an ingestion folder to automatically ingest documents into your Papra instance.
slug: guides/setup-ingestion-folder
---
import { Steps } from '@astrojs/starlight/components';
import { Aside } from '@astrojs/starlight/components';
import { FileTree } from '@astrojs/starlight/components';

The ingestion folder is a special folder that is watched by Papra for new files. When a new file is added to the ingestion folder, Papra will automatically import it.

## Multi-Organization Structure

Papra supports multiple organizations within a single instance, each requiring a dedicated ingestion folder. The ingestion system uses a hierarchical structure:

<FileTree>
- ingestion-folder
  - org_abc123
    - document.pdf
    - report.docx
  - org_def456
    - file.txt
  - foo.txt # Ignored as it's not in an organization
</FileTree>

This allows you to have a single instance of Papra watching multiple organizations' ingestion folders.

<Aside>
Files and folders that are within the `ingestion-root-folder` but not within an organization folder are ignored.
</Aside>

## Setup

Add the following to your `docker-compose.yml` file:

```yaml title="docker-compose.yml" ins={9,12}
services:
  papra:
    container_name: papra
    image: ghcr.io/papra-hq/papra:latest
    restart: unless-stopped
    ports:
      - "1221:1221"
    environment:
      - INGESTION_FOLDER_IS_ENABLED=true
    volumes:
      - ./app-data:/app/app-data
      - <your-ingestion-folder>:/app/ingestion
    user: "${UID}:${GID}"
```

Then add files to a folder named with the organization id (available in the Papra URL, e.g. `https://papra.example.com/organizations/<organization-id>`; the format is `org_<random>`).

```bash
mkdir -p <your-ingestion-folder>/<org_id>
touch <your-ingestion-folder>/<org_id>/hello.txt
```

## Post-processing

Once a file has been ingested into your Papra organization, you can configure what happens to it by setting the `INGESTION_FOLDER_POST_PROCESSING_STRATEGY` environment variable. There are two strategies:

- `delete`: The file is deleted from the ingestion folder (default strategy)
- `move`: The file is moved to the `INGESTION_FOLDER_POST_PROCESSING_MOVE_FOLDER_PATH` folder (default: `./ingestion-done`)

Note that the `INGESTION_FOLDER_POST_PROCESSING_MOVE_FOLDER_PATH` path is relative to the organization ingestion folder.

So with `INGESTION_FOLDER_POST_PROCESSING_MOVE_FOLDER_PATH=ingestion-done`, the file `<ingestion-folder>/<org_id>/file.pdf` will be moved to `<ingestion-folder>/<org_id>/ingestion-done/file.pdf` once ingested.
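As an illustrative sketch (the values are examples, reusing only variables named above), the post-processing settings can sit in the same compose `environment` block used in the Setup step:

```yaml
environment:
  - INGESTION_FOLDER_IS_ENABLED=true
  - INGESTION_FOLDER_POST_PROCESSING_STRATEGY=move
  - INGESTION_FOLDER_POST_PROCESSING_MOVE_FOLDER_PATH=ingestion-done
```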
## Safeguards

To avoid accidental data loss, if for some reason the ingestion fails, the file is moved to the `INGESTION_FOLDER_ERROR_FOLDER_PATH` folder (default: `./ingestion-error`).

<Aside>
As for the `INGESTION_FOLDER_POST_PROCESSING_MOVE_FOLDER_PATH`, the `INGESTION_FOLDER_ERROR_FOLDER_PATH` path is relative to the organization ingestion folder.
</Aside>

## Polling

By default, Papra uses native file watchers to detect changes in the ingestion folder. On some operating systems (like Windows), this can be flaky with Docker. To avoid this issue, you can enable polling by setting the `INGESTION_FOLDER_WATCHER_USE_POLLING` environment variable to `true`.

The default polling interval is 2 seconds; you can change it by setting the `INGESTION_FOLDER_WATCHER_POLLING_INTERVAL_MS` environment variable.

```yaml title="docker-compose.yml" ins={2-3}
environment:
  - INGESTION_FOLDER_WATCHER_USE_POLLING=true
  - INGESTION_FOLDER_WATCHER_POLLING_INTERVAL_MS=2000
```

## Configuration

You can find the list of all configuration options in the [configuration reference](/docs/configuration-reference); the related variables are prefixed with `INGESTION_FOLDER_`.

## Edge cases and behaviors

- The ingestion folder is watched recursively.
- Files in the ingestion folder's `done` and `error` folders are ignored.
- When a file from the ingestion folder is already present (and not in the trash) in the organization, no ingestion is done, but the file is post-processed (deleted or moved) as successfully ingested.
- When a file is moved to the "done" or "error" folder:
  - If a file with the same name and same content is present in the destination folder, the original file is deleted.
  - If a file with the same name but different content is present in the destination folder, the original file is moved and a timestamp is added to the filename.
- Some files are ignored by default (`.DS_Store`, `Thumbs.db`, `desktop.ini`, etc.); see [ingestion-folders.constants.ts](https://github.com/papra-hq/papra/blob/main/apps/papra-server/src/modules/ingestion-folders/ingestion-folders.constants.ts) for the list of ignored files and patterns. You can change this by setting the `INGESTION_FOLDER_IGNORED_PATTERNS` environment variable.
@@ -1,6 +1,19 @@
---
title: Papra documentation
description: Papra documentation.
+hero:
+  title: Papra Docs
+  tagline: Documentation for Papra, the minimalistic document archiving platform.
+  image:
+    alt: A glittering, brightly colored logo
+    dark: ../../assets/logo-dark.svg
+    light: ../../assets/logo-light.svg
+  actions:
+    - text: Self-hosting guide
+      link: /self-hosting/using-docker
+      icon: right-arrow
+      variant: primary

---

import { LinkCard } from '@astrojs/starlight/components';

@@ -38,12 +51,13 @@ In today's digital world, managing countless important documents efficiently and
- **Tags**: Organize your documents with tags.
- **Email ingestion**: Send/forward emails to a generated address to automatically import documents.
- **Content extraction**: Automatically extract text from images or scanned documents for search.
-- **i18n**: Support for multiple languages.
+- *In progress:* **i18n**: Support for multiple languages.
- *Coming soon:* **Tagging Rules**: Automatically tag documents based on custom rules.
- *Coming soon:* **Folder ingestion**: Automatically import documents from a folder.
- *Coming soon:* **SDK and API**: Build your own applications on top of Papra.
- *Coming soon:* **CLI**: Manage your documents from the command line.
- *Coming soon:* **Document sharing**: Share documents with others.
- *Coming soon:* **Document requests**: Generate upload links for people to add documents.

## Community & Open Source
@@ -26,6 +26,10 @@ export const sidebar: StarlightUserConfig['sidebar'] = [
      label: 'Setup intake emails with CF Email Workers',
      slug: 'guides/intake-emails-with-cloudflare-email-workers',
    },
+   {
+     label: 'Setup Ingestion Folder',
+     slug: 'guides/setup-ingestion-folder',
+   },
  ],
},
{
apps/docs/src/pages/papra-config-schema.json.ts (new file, +49)

@@ -0,0 +1,49 @@
import type { APIRoute } from 'astro';
import type { ConfigDefinition } from 'figue';
import { z } from 'astro:content';
import { mapValues } from 'lodash-es';
import { zodToJsonSchema } from 'zod-to-json-schema';
import { configDefinition } from '../../../papra-server/src/modules/config/config';

function buildConfigSchema({ configDefinition }: { configDefinition: ConfigDefinition }) {
  const schema: any = mapValues(configDefinition, (config) => {
    if (typeof config === 'object' && config !== null && 'schema' in config && 'doc' in config) {
      return config.schema;
    } else {
      return buildConfigSchema({
        configDefinition: config as ConfigDefinition,
      });
    }
  });

  return z.object(schema);
}

function stripRequired(schema: any) {
  if (schema.type === 'object') {
    schema.required = [];
    for (const key in schema.properties) {
      stripRequired(schema.properties[key]);
    }
  }
}

function addSchema(schema: any) {
  schema.properties.$schema = {
    type: 'string',
    description: 'The schema of the configuration file, to be used by IDEs to provide autocompletion and validation',
  };
}

function getConfigSchema() {
  const schema = buildConfigSchema({ configDefinition });
  const jsonSchema = zodToJsonSchema(schema, { pipeStrategy: 'output' });

  stripRequired(jsonSchema);
  addSchema(jsonSchema);
  return jsonSchema;
}

export const GET: APIRoute = () => {
  return new Response(JSON.stringify(getConfigSchema()));
};
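The route above serializes the generated JSON Schema on GET, so once the docs site is deployed the schema can be fetched directly; a sketch, with the host being an assumption (whatever the docs site is served from):

```bash
# Fetch the generated configuration schema (host is an assumption)
curl -s https://docs.papra.app/papra-config-schema.json
```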
apps/docs/src/pages/robots.txt.ts (new file, +15)

@@ -0,0 +1,15 @@
import type { APIRoute } from 'astro';

function getRobotsTxt(sitemapURL: URL) {
  return `
User-agent: *
Allow: /

Sitemap: ${sitemapURL.href}
`.trim();
}

export const GET: APIRoute = ({ site }) => {
  const sitemapURL = new URL('sitemap-index.xml', site);
  return new Response(getRobotsTxt(sitemapURL));
};
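For reference, the handler above emits a response of this shape; the sitemap host shown is an assumption (it depends on the `site` value configured in the Astro config):

```txt
User-agent: *
Allow: /

Sitemap: https://docs.papra.app/sitemap-index.xml
```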
@@ -1,7 +1,7 @@
{
  "name": "@papra/papra-app-client",
  "type": "module",
- "version": "0.1.2",
+ "version": "0.3.0",
  "packageManager": "pnpm@9.15.4",
  "description": "Papra frontend client",
  "author": "Corentin Thomasset <corentinth@proton.me> (https://corentin.tech)",
@@ -1,10 +1,87 @@
auth:
  request-password-reset:
    title: Reset your password
    description: Enter your email to reset your password.
    requested: If an account exists for this email, we've sent you an email to reset your password.
    back-to-login: Back to login
    form:
      email:
        label: Email
        placeholder: 'Example: ada@papra.app'
        required: Please enter your email address
        invalid: This email address is invalid
      submit: Request password reset

  reset-password:
    title: Reset your password
    description: Enter your new password to reset your password.
    reset: Your password has been reset.
    back-to-login: Back to login
    form:
      new-password:
        label: New password
        placeholder: 'Example: **********'
        required: Please enter your new password
        min-length: Password must be at least {{ minLength }} characters
        max-length: Password must be less than {{ maxLength }} characters
      submit: Reset password

  email-provider:
    open: Open {{ provider }}

  login:
    title: Login to Papra
    description: Enter your email or use social login to access your Papra account.
    login-with-provider: Login with {{ provider }}
    no-account: Don't have an account?
    register: Register
    form:
      email:
        label: Email
        placeholder: 'Example: ada@papra.app'
        required: Please enter your email address
        invalid: This email address is invalid
      password:
        label: Password
        placeholder: Set a password
        required: Please enter your password
      remember-me:
        label: Remember me
      forgot-password:
        label: Forgot password?
      submit: Login

  register:
    title: Register to Papra
    description: Enter your email or use social login to access your Papra account.
    register-with-email: Register with email
    register-with-provider: Register with {{ provider }}
    providers:
      google: Google
      github: GitHub
    have-account: Already have an account?
    login: Login
    registration-disabled:
      title: Registration is disabled
      description: The creation of new accounts is currently disabled on this instance of Papra. Only users with existing accounts can log in. If you think this is a mistake, please contact the administrator of this instance.
    form:
      email:
        label: Email
        placeholder: 'Example: ada@papra.app'
        required: Please enter your email address
        invalid: This email address is invalid
      password:
        label: Password
        placeholder: Set a password
        required: Please enter your password
        min-length: Password must be at least {{ minLength }} characters
        max-length: Password must be less than {{ maxLength }} characters
      name:
        label: Name
        placeholder: 'Example: Ada Lovelace'
        required: Please enter your name
        max-length: Name must be less than {{ maxLength }} characters
      submit: Register
    email-validation-required:
      title: Verify your email
      description: A verification email has been sent to your email address. Please verify your email address by clicking the link in the email.

@@ -24,12 +101,119 @@ layout:
  home: Home
  documents: Documents
  tags: Tags
  tagging-rules: Tagging rules
  integrations: Integrations
  deleted-documents: Deleted documents
  organization-settings: Organization settings

tagging-rules:
  field:
    name: document name
    content: document content
  operator:
    equals: equals
    not-equals: not equals
    contains: contains
    not-contains: not contains
    starts-with: starts with
    ends-with: ends with
  list:
    title: Tagging rules
    description: Manage your organization's tagging rules, to automatically tag documents based on conditions you define.
    demo-warning: 'Note: As this is a demo environment (with no server), tagging rules will not be applied to newly added documents.'
    no-tagging-rules:
      title: No tagging rules
      description: Create a tagging rule to automatically tag your added documents based on conditions you define.
      create-tagging-rule: Create tagging rule
    card:
      no-conditions: No conditions
      one-condition: 1 condition
      conditions: '{{ count }} conditions'
      delete: Delete rule
      edit: Edit rule
  create:
    title: Create tagging rule
    success: Tagging rule created successfully
    error: Failed to create tagging rule
    submit: Create rule
    form:
      name:
        label: Name
        placeholder: 'Example: Tag invoices'
        min-length: Please enter a name for the rule
        max-length: The name must be less than 64 characters
      description:
        label: Description
        placeholder: 'Example: Tag documents with "invoice" in the name'
        max-length: The description must be less than 256 characters
      conditions:
        label: Conditions
        description: Define the conditions that must be met for the rule to apply. All conditions must be met for the rule to apply.
        add-condition: Add condition
        no-conditions:
          title: No conditions
          description: You didn't add any conditions to this rule. This rule will apply its tags to all documents.
          confirm: Apply rule without conditions
          cancel: Cancel
        field:
          label: Field
        operator:
          label: Operator
        value:
          label: Value
          placeholder: 'Example: invoice'
          min-length: Please enter a value for the condition
      tags:
        label: Tags
        description: Select the tags to apply to the added documents that match the conditions
        min-length: At least one tag to apply is required
        add-tag: Create tag
      submit: Create rule
  update:
    title: Update tagging rule
    success: Tagging rule updated successfully
    error: Failed to update tagging rule
    submit: Update rule
    cancel: Cancel

demo:
  popup:
    description: This is a demo environment, all data is saved to your browser local storage.
    discord: Join the {{ discordLink }} to get support, propose features or just chat.
    discord-link-label: Discord server
    reset: Reset demo data
    hide: Hide

trash:
  delete-all:
    button: Delete all
    confirm:
      title: Permanently delete all documents?
      description: Are you sure you want to permanently delete all documents from the trash? This action cannot be undone.
      label: Delete
      cancel: Cancel
  delete:
    button: Delete
    confirm:
      title: Permanently delete document?
      description: Are you sure you want to permanently delete this document from the trash? This action cannot be undone.
      label: Delete
      cancel: Cancel
  deleted:
    success:
      title: Document deleted
      description: The document has been permanently deleted.

import-documents:
  title:
    error: '{{ count }} documents failed'
    success: '{{ count }} documents imported'
    pending: '{{ count }} / {{ total }} documents imported'
    none: Import documents
  no-import-in-progress: No document import in progress

api-errors:
  document.already_exists: The document already exists
  document.file_too_big: The document file is too big
  intake_email.limit_reached: The maximum number of intake emails for this organization has been reached. Please upgrade your plan to create more intake emails.
  user.max_organization_count_reached: You have reached the maximum number of organizations you can create; if you need to create more, please contact support.
  default: An error occurred while processing your request.
@@ -31,5 +31,7 @@ layout:
demo:
  popup:
    description: Ceci est un environnement de démo, toutes les données sont enregistrées dans le local storage de votre navigateur.
+   discord: Rejoignez le {{ discordLink }} pour obtenir de l'aide, proposer des fonctionnalités ou discuter avec l'équipe.
+   discord-link-label: Serveur Discord
    reset: Réinitialiser la démo
    hide: Masquer
@@ -1,3 +1,4 @@
+import { useI18n } from '@/modules/i18n/i18n.provider';
import { cn } from '@/modules/shared/style/cn';
import { Button } from '@/modules/ui/components/button';
import { type Component, type ComponentProps, splitProps } from 'solid-js';

@@ -246,6 +247,7 @@ export function getEmailProvider({ email }: { email?: string }) {

export const OpenEmailProvider: Component<{ email?: string } & ComponentProps<typeof Button>> = (props) => {
  const [local, rest] = splitProps(props, ['email', 'class']);
+ const { t } = useI18n();

  const { provider } = getEmailProvider({ email: local.email });

@@ -256,9 +258,7 @@ export const OpenEmailProvider: Component<{ email?: string } & ComponentProps<ty
  return (
    <Button as="a" href={provider.url} target="_blank" rel="noopener noreferrer" class={cn('w-full', local.class)} {...rest}>
      <div class="i-tabler-external-link mr-2 size-4" />
-     Open
-     {' '}
-     {provider.name}
+     {t('auth.email-provider.open', { provider: provider.name })}
    </Button>
  );
};
@@ -18,6 +18,7 @@ import { SsoProviderButton } from '../components/sso-provider-button.component';
export const EmailLoginForm: Component = () => {
  const navigate = useNavigate();
  const { config } = useConfig();
+ const { t } = useI18n();

  const { form, Form, Field } = createForm({
    onSubmit: async ({ email, password, rememberMe }) => {

@@ -35,12 +36,12 @@ export const EmailLoginForm: Component = () => {
      email: v.pipe(
        v.string(),
        v.trim(),
-       v.nonEmpty('Please enter your email address'),
-       v.email('This is not a valid email address'),
+       v.nonEmpty(t('auth.login.form.email.required')),
+       v.email(t('auth.login.form.email.invalid')),
      ),
      password: v.pipe(
-       v.string('Password is required'),
-       v.nonEmpty('Please enter your password'),
+       v.string(t('auth.login.form.password.required')),
+       v.nonEmpty(t('auth.login.form.password.required')),
      ),
      rememberMe: v.boolean(),
    }),

@@ -54,8 +55,8 @@
      <Field name="email">
        {(field, inputProps) => (
          <TextFieldRoot class="flex flex-col gap-1 mb-4">
-           <TextFieldLabel for="email">Email</TextFieldLabel>
-           <TextField type="email" id="email" placeholder="Eg. ada@papra.app" {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
+           <TextFieldLabel for="email">{t('auth.login.form.email.label')}</TextFieldLabel>
+           <TextField type="email" id="email" placeholder={t('auth.login.form.email.placeholder')} {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
            {field.error && <div class="text-red-500 text-sm">{field.error}</div>}
          </TextFieldRoot>
        )}

@@ -64,9 +65,9 @@
      <Field name="password">
        {(field, inputProps) => (
          <TextFieldRoot class="flex flex-col gap-1 mb-4">
-           <TextFieldLabel for="password">Password</TextFieldLabel>
+           <TextFieldLabel for="password">{t('auth.login.form.password.label')}</TextFieldLabel>

-           <TextField type="password" id="password" placeholder="Your password" {...inputProps} value={field.value} aria-invalid={Boolean(field.error)} />
+           <TextField type="password" id="password" placeholder={t('auth.login.form.password.placeholder')} {...inputProps} value={field.value} aria-invalid={Boolean(field.error)} />
            {field.error && <div class="text-red-500 text-sm">{field.error}</div>}
          </TextFieldRoot>
        )}

@@ -78,18 +79,18 @@
          <Checkbox class="flex items-center gap-2" defaultChecked={field.value}>
            <CheckboxControl inputProps={inputProps} />
            <CheckboxLabel class="text-sm font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70">
-             Remember me
+             {t('auth.login.form.remember-me.label')}
            </CheckboxLabel>
          </Checkbox>
        )}
      </Field>

      <Button variant="link" as={A} class="inline p-0! h-auto" href="/request-password-reset">
-       Forgot password?
+       {t('auth.login.form.forgot-password.label')}
      </Button>
    </div>

-   <Button type="submit" class="w-full">Login</Button>
+   <Button type="submit" class="w-full">{t('auth.login.form.submit')}</Button>

    <div class="text-red-500 text-sm mt-4">{form.response.message}</div>
@@ -1,5 +1,6 @@
import type { ssoProviders } from '../auth.constants';
import { useConfig } from '@/modules/config/config.provider';
+import { useI18n } from '@/modules/i18n/i18n.provider';
import { createForm } from '@/modules/shared/form/form';
import { Button } from '@/modules/ui/components/button';
import { Separator } from '@/modules/ui/components/separator';

@@ -16,7 +17,7 @@ import { SsoProviderButton } from '../components/sso-provider-button.component';
export const EmailRegisterForm: Component = () => {
  const { config } = useConfig();
  const navigate = useNavigate();
-
+ const { t } = useI18n();
  const { form, Form, Field } = createForm({
    onSubmit: async ({ email, password, name }) => {
      const { error } = await signUp.email({

@@ -41,19 +42,19 @@
      email: v.pipe(
        v.string(),
        v.trim(),
-       v.nonEmpty('Please enter an email address'),
-       v.email('This is not a valid email address'),
+       v.nonEmpty(t('auth.register.form.email.required')),
+       v.email(t('auth.register.form.email.invalid')),
      ),
      password: v.pipe(
-       v.string('Password is required'),
-       v.nonEmpty('Please enter a password'),
-       v.minLength(8, 'Password must be at least 8 characters'),
-       v.maxLength(128, 'Password must be at most 128 characters'),
+       v.string(),
+       v.nonEmpty(t('auth.register.form.password.required')),
+       v.minLength(8, t('auth.register.form.password.min-length', { minLength: 8 })),
+       v.maxLength(128, t('auth.register.form.password.max-length', { maxLength: 128 })),
      ),
      name: v.pipe(
-       v.string('Name is required'),
-       v.nonEmpty('Please enter a name'),
-       v.maxLength(64, 'Name must be at most 64 characters'),
+       v.string(t('auth.register.form.name.label')),
+       v.nonEmpty(t('auth.register.form.name.required')),
+       v.maxLength(64, t('auth.register.form.name.max-length', { maxLength: 64 })),
      ),
    }),
  });

@@ -63,8 +64,8 @@
    <Field name="email">
      {(field, inputProps) => (
        <TextFieldRoot class="flex flex-col gap-1 mb-4">
-         <TextFieldLabel for="email">Email</TextFieldLabel>
-         <TextField type="email" id="email" placeholder="Eg. ada@papra.app" {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
+         <TextFieldLabel for="email">{t('auth.register.form.email.label')}</TextFieldLabel>
+         <TextField type="email" id="email" placeholder={t('auth.register.form.email.placeholder')} {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
          {field.error && <div class="text-red-500 text-sm">{field.error}</div>}
        </TextFieldRoot>
      )}

@@ -73,8 +74,8 @@
    <Field name="name">
      {(field, inputProps) => (
        <TextFieldRoot class="flex flex-col gap-1 mb-4">
-         <TextFieldLabel for="name">Your full name</TextFieldLabel>
-         <TextField type="text" id="name" placeholder="Eg. Ada Lovelace" {...inputProps} value={field.value} aria-invalid={Boolean(field.error)} />
+         <TextFieldLabel for="name">{t('auth.register.form.name.label')}</TextFieldLabel>
+         <TextField type="text" id="name" placeholder={t('auth.register.form.name.placeholder')} {...inputProps} value={field.value} aria-invalid={Boolean(field.error)} />
          {field.error && <div class="text-red-500 text-sm">{field.error}</div>}
        </TextFieldRoot>
      )}

@@ -83,15 +84,15 @@
    <Field name="password">
      {(field, inputProps) => (
        <TextFieldRoot class="flex flex-col gap-1 mb-4">
-         <TextFieldLabel for="password">Password</TextFieldLabel>
+         <TextFieldLabel for="password">{t('auth.register.form.password.label')}</TextFieldLabel>

-         <TextField type="password" id="password" placeholder="Your password" {...inputProps} value={field.value} aria-invalid={Boolean(field.error)} />
+         <TextField type="password" id="password" placeholder={t('auth.register.form.password.placeholder')} {...inputProps} value={field.value} aria-invalid={Boolean(field.error)} />
          {field.error && <div class="text-red-500 text-sm">{field.error}</div>}
        </TextFieldRoot>
      )}
    </Field>

-   <Button type="submit" class="w-full">Register</Button>
+   <Button type="submit" class="w-full">{t('auth.register.form.submit')}</Button>

    <div class="text-red-500 text-sm mt-4">{form.response.message}</div>

@@ -101,6 +102,7 @@

export const RegisterPage: Component = () => {
  const { config } = useConfig();
+ const { t } = useI18n();

  if (!config.auth.isRegistrationEnabled) {
    return (

@@ -108,17 +110,17 @@
      <div class="flex items-center justify-center h-full p-6 sm:pb-32">
        <div class="max-w-sm w-full">
          <h1 class="text-xl font-bold">
-           Registration is disabled
+           {t('auth.register.registration-disabled.title')}
          </h1>
          <p class="text-muted-foreground mt-1 mb-4">
-           The creation of new accounts is currently disabled on this instance of Papra. Only users with existing accounts can log in. If you think this is a mistake, please contact the administrator of this instance.
+           {t('auth.register.registration-disabled.description')}
          </p>

          <p class="text-muted-foreground mt-4">
-           Already have an account?
+           {t('auth.register.have-account')}
            {' '}
            <Button variant="link" as={A} class="inline px-0" href="/login">
-             Login
+             {t('auth.register.login')}
            </Button>
          </p>

@@ -141,10 +143,10 @@
      <div class="flex items-center justify-center h-full p-6 sm:pb-32">
        <div class="max-w-sm w-full">
          <h1 class="text-xl font-bold">
-           Register to Papra
+           {t('auth.register.title')}
          </h1>
          <p class="text-muted-foreground mt-1 mb-4">
-           Enter your email or use social login to create your Papra account.
+           {t('auth.register.description')}
          </p>

          {getShowEmailRegister() || !getHasSsoProviders()

@@ -152,7 +154,7 @@
            : (
              <Button onClick={() => setShowEmailRegister(true)} class="w-full">
                <div class="i-tabler-mail mr-2 size-4.5" />
-               Register with email
+               {t('auth.register.register-with-email')}
              </Button>
            )}

@@ -162,17 +164,22 @@
            <div class="flex flex-col gap-2">
              <For each={getEnabledSsoProviderConfigs({ config })}>
                {provider => (
-                 <SsoProviderButton name={provider.name} icon={provider.icon} onClick={() => registerWithProvider(provider)} label={`Register with ${provider.name}`} />
+                 <SsoProviderButton
+                   name={provider.name}
+                   icon={provider.icon}
+                   onClick={() => registerWithProvider(provider)}
+                   label={t('auth.register.register-with-provider', { provider: t(`auth.register.providers.${provider.key}`) })}
+                 />
                )}
              </For>
            </div>
          </Show>

          <p class="text-muted-foreground mt-4">
-           Already have an account?
+           {t('auth.register.have-account')}
            {' '}
            <Button variant="link" as={A} class="inline px-0" href="/login">
-             Login
+             {t('auth.register.login')}
            </Button>
          </p>
@@ -1,4 +1,5 @@
import { useConfig } from '@/modules/config/config.provider';
+import { useI18n } from '@/modules/i18n/i18n.provider';
import { createForm } from '@/modules/shared/form/form';
import { Button } from '@/modules/ui/components/button';
import { TextField, TextFieldLabel, TextFieldRoot } from '@/modules/ui/components/textfield';

@@ -11,14 +12,16 @@ import { forgetPassword } from '../auth.services';
import { OpenEmailProvider } from '../components/open-email-provider.component';

export const ResetPasswordForm: Component<{ onSubmit: (args: { email: string }) => Promise<void> }> = (props) => {
+ const { t } = useI18n();
+
  const { form, Form, Field } = createForm({
    onSubmit: props.onSubmit,
    schema: v.object({
      email: v.pipe(
        v.string(),
        v.trim(),
-       v.nonEmpty('Please enter your email address'),
-       v.email('This is not a valid email address'),
+       v.nonEmpty(t('auth.request-password-reset.form.email.required')),
+       v.email(t('auth.request-password-reset.form.email.invalid')),
      ),
    }),
  });

@@ -28,15 +31,15 @@ export const ResetPasswordForm: Component<{ onSubmit: (args: { email: string })
    <Field name="email">
      {(field, inputProps) => (
        <TextFieldRoot class="flex flex-col gap-1 mb-4">
-         <TextFieldLabel for="email">Email</TextFieldLabel>
-         <TextField type="email" id="email" placeholder="Eg. ada@papra.app" {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
+         <TextFieldLabel for="email">{t('auth.request-password-reset.form.email.label')}</TextFieldLabel>
+         <TextField type="email" id="email" placeholder={t('auth.request-password-reset.form.email.placeholder')} {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
          {field.error && <div class="text-red-500 text-sm">{field.error}</div>}
        </TextFieldRoot>
      )}
    </Field>

    <Button type="submit" class="w-full">
-     Request password reset
+     {t('auth.request-password-reset.form.submit')}
    </Button>

    <div class="text-red-500 text-sm mt-2">{form.response.message}</div>

@@ -49,6 +52,7 @@ export const RequestPasswordResetPage: Component = () => {
  const [getHasPasswordResetBeenRequested, setHasPasswordResetBeenRequested] = createSignal(false);
  const [getEmail, setEmail] = createSignal<string | undefined>(undefined);

+ const { t } = useI18n();
  const { config } = useConfig();
  const navigate = useNavigate();

@@ -80,14 +84,14 @@
      <div class="flex items-center justify-center p-6 sm:pb-32">
        <div class="max-w-sm w-full">
          <h1 class="text-xl font-bold">
-           Reset your password
+           {t('auth.request-password-reset.title')}
          </h1>

          {getHasPasswordResetBeenRequested()
            ? (
              <>
                <div class="text-muted-foreground mt-1 mb-4">
-                 If an account exists for this email, we've sent you an email to reset your password.
+                 {t('auth.request-password-reset.requested')}
                </div>

                <OpenEmailProvider email={getEmail()} variant="secondary" class="w-full mb-4" />

@@ -96,7 +100,7 @@
            : (
              <>
                <p class="text-muted-foreground mt-1 mb-4">
-                 Enter your email to reset your password.
+                 {t('auth.request-password-reset.description')}
                </p>

                <ResetPasswordForm onSubmit={onPasswordResetRequested} />

@@ -105,7 +109,7 @@

          <Button as={A} href="/login" class="w-full" variant={getHasPasswordResetBeenRequested() ? 'default' : 'ghost'}>
            <div class="i-tabler-arrow-left mr-2 size-4" />
-           Back to login
+           {t('auth.request-password-reset.back-to-login')}
          </Button>
        </div>
      </div>

@@ -1,4 +1,5 @@
import { useConfig } from '@/modules/config/config.provider';
+import { useI18n } from '@/modules/i18n/i18n.provider';
import { createForm } from '@/modules/shared/form/form';
import { Button } from '@/modules/ui/components/button';
import { TextField, TextFieldLabel, TextFieldRoot } from '@/modules/ui/components/textfield';

@@ -10,14 +11,16 @@ import { AuthLayout } from '../../ui/layouts/auth-layout.component';
import { resetPassword } from '../auth.services';

export const ResetPasswordForm: Component<{ onSubmit: (args: { newPassword: string }) => Promise<void> }> = (props) => {
|
||||
const { t } = useI18n();
|
||||
|
||||
const { form, Form, Field } = createForm({
|
||||
onSubmit: props.onSubmit,
|
||||
schema: v.object({
|
||||
newPassword: v.pipe(
|
||||
v.string(),
|
||||
v.nonEmpty('Please enter your new password'),
|
||||
v.minLength(8, 'Password must be at least 8 characters long'),
|
||||
v.maxLength(128, 'Password must be at most 128 characters long'),
|
||||
v.nonEmpty(t('auth.reset-password.form.new-password.required')),
|
||||
v.minLength(8, t('auth.reset-password.form.new-password.min-length', { minLength: 8 })),
|
||||
v.maxLength(128, t('auth.reset-password.form.new-password.max-length', { maxLength: 128 })),
|
||||
),
|
||||
}),
|
||||
});
|
||||
@@ -27,15 +30,15 @@ export const ResetPasswordForm: Component<{ onSubmit: (args: { newPassword: stri
|
||||
<Field name="newPassword">
|
||||
{(field, inputProps) => (
|
||||
<TextFieldRoot class="flex flex-col gap-1 mb-4">
|
||||
<TextFieldLabel for="newPassword">New password</TextFieldLabel>
|
||||
<TextField type="password" id="newPassword" placeholder="Your new password" {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
|
||||
<TextFieldLabel for="newPassword">{t('auth.reset-password.form.new-password.label')}</TextFieldLabel>
|
||||
<TextField type="password" id="newPassword" placeholder={t('auth.reset-password.form.new-password.placeholder')} {...inputProps} autoFocus value={field.value} aria-invalid={Boolean(field.error)} />
|
||||
{field.error && <div class="text-red-500 text-sm">{field.error}</div>}
|
||||
</TextFieldRoot>
|
||||
)}
|
||||
</Field>
|
||||
|
||||
<Button type="submit" class="w-full">
|
||||
Reset password
|
||||
{t('auth.reset-password.form.submit')}
|
||||
</Button>
|
||||
|
||||
<div class="text-red-500 text-sm mt-2">{form.response.message}</div>
|
||||
@@ -49,6 +52,8 @@ export const ResetPasswordPage: Component = () => {
|
||||
const [searchParams] = useSearchParams();
|
||||
const token = searchParams.token;
|
||||
|
||||
const { t } = useI18n();
|
||||
|
||||
if (!token || typeof token !== 'string') {
|
||||
return <Navigate href="/login" />;
|
||||
}
|
||||
@@ -80,18 +85,18 @@ export const ResetPasswordPage: Component = () => {
|
||||
<div class="flex items-center justify-center p-6 sm:pb-32">
|
||||
<div class="max-w-sm w-full">
|
||||
<h1 class="text-xl font-bold">
|
||||
Reset your password
|
||||
{t('auth.reset-password.title')}
|
||||
</h1>
|
||||
|
||||
{getHasPasswordBeenReset()
|
||||
? (
|
||||
<>
|
||||
<div class="text-muted-foreground mt-1 mb-4">
|
||||
Your password has been reset.
|
||||
{t('auth.reset-password.reset')}
|
||||
</div>
|
||||
|
||||
<Button as={A} href="/login" class="w-full">
|
||||
Go to login
|
||||
{t('auth.reset-password.back-to-login')}
|
||||
<div class="i-tabler-login-2 ml-2 size-4" />
|
||||
</Button>
|
||||
</>
|
||||
@@ -99,7 +104,7 @@ export const ResetPasswordPage: Component = () => {
|
||||
: (
|
||||
<>
|
||||
<p class="text-muted-foreground mt-1 mb-4">
|
||||
Enter your new password.
|
||||
{t('auth.reset-password.description')}
|
||||
</p>
|
||||
|
||||
<ResetPasswordForm onSubmit={onPasswordResetRequested} />
|
||||
|
||||
@@ -2,7 +2,7 @@ import { get } from 'lodash-es';
|
||||
import { FetchError } from 'ofetch';
|
||||
import { createRouter } from 'radix3';
|
||||
import { defineHandler } from './demo-api-mock.models';
|
||||
import { documentFileStorage, documentStorage, organizationStorage, tagDocumentStorage, tagStorage } from './demo.storage';
|
||||
import { documentFileStorage, documentStorage, organizationStorage, tagDocumentStorage, taggingRuleStorage, tagStorage } from './demo.storage';
|
||||
import { findMany, getValues } from './demo.storage.models';
|
||||
|
||||
function assert(condition: unknown, { message = 'Error', status }: { message?: string; status?: number } = {}): asserts condition {
|
||||
@@ -130,10 +130,21 @@ const inMemoryApiMock: Record<string, { handler: any }> = {
|
||||
await documentFileStorage.setItem(key, await serializeFile(file));
|
||||
await documentStorage.setItem(key, document);
|
||||
|
||||
// Simulate a slow response
|
||||
await new Promise(resolve => setTimeout(resolve, 500));
|
||||
|
||||
return { document };
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/customer-portal',
|
||||
method: 'GET',
|
||||
handler: async () => {
|
||||
throw Object.assign(new FetchError('Not available in demo'), { status: 501 });
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/documents/statistics',
|
||||
method: 'GET',
|
||||
@@ -449,6 +460,90 @@ const inMemoryApiMock: Record<string, { handler: any }> = {
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/tagging-rules',
|
||||
method: 'GET',
|
||||
handler: async ({ params: { organizationId } }) => {
|
||||
const taggingRules = await findMany(taggingRuleStorage, taggingRule => taggingRule.organizationId === organizationId);
|
||||
|
||||
return { taggingRules };
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/tagging-rules',
|
||||
method: 'POST',
|
||||
handler: async ({ params: { organizationId }, body }) => {
|
||||
const taggingRule = {
|
||||
id: `tr_${Math.random().toString(36).slice(2)}`,
|
||||
organizationId,
|
||||
name: get(body, 'name'),
|
||||
description: get(body, 'description'),
|
||||
conditions: get(body, 'conditions'),
|
||||
actions: get(body, 'tagIds').map((tagId: string) => ({ tagId })),
|
||||
createdAt: new Date(),
|
||||
updatedAt: new Date(),
|
||||
};
|
||||
|
||||
await taggingRuleStorage.setItem(taggingRule.id, taggingRule);
|
||||
|
||||
return { taggingRule };
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/tagging-rules/:taggingRuleId',
|
||||
method: 'GET',
|
||||
handler: async ({ params: { taggingRuleId } }) => {
|
||||
const taggingRule = await taggingRuleStorage.getItem(taggingRuleId);
|
||||
|
||||
assert(taggingRule, { status: 404 });
|
||||
|
||||
return { taggingRule };
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/tagging-rules/:taggingRuleId',
|
||||
method: 'DELETE',
|
||||
handler: async ({ params: { taggingRuleId } }) => {
|
||||
await taggingRuleStorage.removeItem(taggingRuleId);
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/tagging-rules/:taggingRuleId',
|
||||
method: 'PUT',
|
||||
handler: async ({ params: { taggingRuleId }, body }) => {
|
||||
const taggingRule = await taggingRuleStorage.getItem(taggingRuleId);
|
||||
|
||||
assert(taggingRule, { status: 404 });
|
||||
|
||||
await taggingRuleStorage.setItem(taggingRuleId, Object.assign(taggingRule, body, { updatedAt: new Date() }));
|
||||
|
||||
return { taggingRule };
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/documents/trash',
|
||||
method: 'DELETE',
|
||||
handler: async ({ params: { organizationId } }) => {
|
||||
const documents = await findMany(documentStorage, document => document.organizationId === organizationId && Boolean(document.deletedAt));
|
||||
|
||||
await Promise.all(documents.map(document => documentStorage.removeItem(`${organizationId}:${document.id}`)));
|
||||
},
|
||||
}),
|
||||
|
||||
...defineHandler({
|
||||
path: '/api/organizations/:organizationId/documents/trash/:documentId',
|
||||
method: 'DELETE',
|
||||
handler: async ({ params: { organizationId, documentId } }) => {
|
||||
const key = `${organizationId}:${documentId}`;
|
||||
|
||||
await documentStorage.removeItem(key);
|
||||
},
|
||||
}),
|
||||
};
|
||||
|
||||
export const router = createRouter({ routes: inMemoryApiMock, strictTrailingSlash: false });
|
||||
|
||||
@@ -1,4 +1,4 @@
import { useNavigate } from '@solidjs/router';
import { A, useNavigate } from '@solidjs/router';
import { type Component, createSignal } from 'solid-js';
import { Portal } from 'solid-js/web';
import { buildTimeConfig } from '../config/config';
@@ -9,7 +9,7 @@ import { clearDemoStorage } from './demo.storage';
export const DemoIndicator: Component = () => {
  const [getIsMinified, setIsMinified] = createSignal(false);
  const navigate = useNavigate();
  const { t } = useI18n();
  const { t, te } = useI18n();

  const clearDemo = async () => {
    await clearDemoStorage();
@@ -33,6 +33,9 @@ export const DemoIndicator: Component = () => {
<p class="text-sm">
  {t('demo.popup.description')}
</p>
<p class="text-sm mt-2">
  {te('demo.popup.discord', { discordLink: <A href="https://papra.app/discord" target="_blank" rel="noopener noreferrer" class="underline font-bold">{t('demo.popup.discord-link-label')}</A> })}
</p>
<div class="flex justify-end mt-4 gap-2">
  <Button variant="secondary" onClick={clearDemo} size="sm" class="text-primary shadow-none">
    {t('demo.popup.reset')}

@@ -1,5 +1,6 @@
import type { Document } from '../documents/documents.types';
import type { Organization } from '../organizations/organizations.types';
import type { TaggingRule } from '../tagging-rules/tagging-rules.types';
import type { Tag } from '../tags/tags.types';
import { createStorage, prefixStorage } from 'unstorage';
import localStorageDriver from 'unstorage/drivers/localstorage';
@@ -14,6 +15,7 @@ export const documentStorage = prefixStorage<Document>(storage, 'documents');
export const documentFileStorage = prefixStorage(storage, 'documentFiles');
export const tagStorage = prefixStorage<Omit<Tag, 'documentsCount'>>(storage, 'tags');
export const tagDocumentStorage = prefixStorage<{ documentId: string; tagId: string; id: string }>(storage, 'tagDocuments');
export const taggingRuleStorage = prefixStorage<TaggingRule>(storage, 'taggingRules');

export async function clearDemoStorage() {
  await storage.clear();

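For orientation, a minimal, hedged sketch of exercising the new `taggingRuleStorage` bucket with the unstorage `setItem`/`getItem` API used in this file; the alias import paths, the rule object, and every id in it are invented for illustration:

```ts
// Hypothetical demo data only — ids, values, and alias paths are placeholders.
import type { TaggingRule } from '@/modules/tagging-rules/tagging-rules.types';
import { taggingRuleStorage } from '@/modules/demo/demo.storage';

async function seedDemoTaggingRule() {
  const rule: TaggingRule = {
    id: 'tr_demo',
    organizationId: 'org_demo',
    name: 'Invoices',
    description: 'Tag documents whose name contains "invoice"',
    conditions: [{ field: 'name', operator: 'contains', value: 'invoice' }],
    actions: [{ tagId: 'tag_invoices' }],
    createdAt: new Date(),
    updatedAt: new Date(),
  };

  await taggingRuleStorage.setItem(rule.id, rule);

  // Returns the stored rule, or null if the key is absent.
  return taggingRuleStorage.getItem('tr_demo');
}
```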
@@ -0,0 +1,208 @@
|
||||
import type { ParentComponent } from 'solid-js';
|
||||
import type { Document } from '../documents.types';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { promptUploadFiles } from '@/modules/shared/files/upload';
|
||||
import { useI18nApiErrors } from '@/modules/shared/http/composables/i18n-api-errors';
|
||||
import { cn } from '@/modules/shared/style/cn';
|
||||
import { Button } from '@/modules/ui/components/button';
|
||||
import { safely } from '@corentinth/chisels';
|
||||
import { A } from '@solidjs/router';
|
||||
import { throttle } from 'lodash-es';
|
||||
import { createContext, createSignal, For, Match, Show, Switch, useContext } from 'solid-js';
|
||||
import { Portal } from 'solid-js/web';
|
||||
import { invalidateOrganizationDocumentsQuery } from '../documents.composables';
|
||||
import { uploadDocument } from '../documents.services';
|
||||
|
||||
const DocumentUploadContext = createContext<{
|
||||
uploadDocuments: (args: { files: File[]; organizationId: string }) => Promise<void>;
|
||||
}>();
|
||||
|
||||
export function useDocumentUpload({ organizationId }: { organizationId: string }) {
|
||||
const context = useContext(DocumentUploadContext);
|
||||
|
||||
if (!context) {
|
||||
throw new Error('DocumentUploadContext not found');
|
||||
}
|
||||
|
||||
const { uploadDocuments } = context;
|
||||
|
||||
return {
|
||||
uploadDocuments: async ({ files }: { files: File[] }) => uploadDocuments({ files, organizationId }),
|
||||
promptImport: async () => {
|
||||
const { files } = await promptUploadFiles();
|
||||
|
||||
await uploadDocuments({ files, organizationId });
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
type TaskSuccess = {
|
||||
file: File;
|
||||
status: 'success';
|
||||
document: Document;
|
||||
};
|
||||
|
||||
type TaskError = {
|
||||
file: File;
|
||||
status: 'error';
|
||||
error: Error;
|
||||
};
|
||||
|
||||
type Task = TaskSuccess | TaskError | {
|
||||
file: File;
|
||||
status: 'pending' | 'uploading' ;
|
||||
};
|
||||
|
||||
export const DocumentUploadProvider: ParentComponent = (props) => {
|
||||
const throttledInvalidateOrganizationDocumentsQuery = throttle(invalidateOrganizationDocumentsQuery, 500);
|
||||
const { getErrorMessage } = useI18nApiErrors();
|
||||
const { t } = useI18n();
|
||||
|
||||
const [getState, setState] = createSignal<'open' | 'closed' | 'collapsed'>('closed');
|
||||
const [getTasks, setTasks] = createSignal<Task[]>([]);
|
||||
|
||||
const updateTaskStatus = (args: { file: File; status: 'success'; document: Document } | { file: File; status: 'error'; error: Error } | { file: File; status: 'pending' | 'uploading' }) => {
|
||||
setTasks(tasks => tasks.map(task => task.file === args.file ? { ...task, ...args } : task));
|
||||
};
|
||||
|
||||
const uploadDocuments = async ({ files, organizationId }: { files: File[]; organizationId: string }) => {
|
||||
setTasks(tasks => [...tasks, ...files.map(file => ({ file, status: 'pending' } as const))]);
|
||||
setState('open');
|
||||
|
||||
await Promise.all(files.map(async (file) => {
|
||||
updateTaskStatus({ file, status: 'uploading' });
|
||||
|
||||
const [result, error] = await safely(uploadDocument({ file, organizationId }));
|
||||
|
||||
if (error) {
|
||||
updateTaskStatus({ file, status: 'error', error });
|
||||
} else {
|
||||
const { document } = result;
|
||||
|
||||
updateTaskStatus({ file, status: 'success', document });
|
||||
}
|
||||
|
||||
await throttledInvalidateOrganizationDocumentsQuery({ organizationId });
|
||||
}));
|
||||
};
|
||||
|
||||
const getTitle = () => {
|
||||
if (getTasks().length === 0) {
|
||||
return t('import-documents.title.none');
|
||||
}
|
||||
|
||||
const successCount = getTasks().filter(task => task.status === 'success').length;
|
||||
const errorCount = getTasks().filter(task => task.status === 'error').length;
|
||||
const totalCount = getTasks().length;
|
||||
|
||||
if (errorCount > 0) {
|
||||
return t('import-documents.title.error', { count: errorCount });
|
||||
}
|
||||
|
||||
if (successCount === totalCount) {
|
||||
return t('import-documents.title.success', { count: successCount });
|
||||
}
|
||||
|
||||
return t('import-documents.title.pending', { count: successCount, total: totalCount });
|
||||
};
|
||||
|
||||
const close = () => {
|
||||
setState('closed');
|
||||
setTasks([]);
|
||||
};
|
||||
|
||||
return (
|
||||
<DocumentUploadContext.Provider value={{ uploadDocuments }}>
|
||||
{props.children}
|
||||
|
||||
<Portal>
|
||||
<Show when={getState() !== 'closed'}>
|
||||
<div class="fixed bottom-0 right-0 sm:right-20px w-full sm:w-400px bg-card border-l border-t border-r sm:rounded-t-xl shadow-lg">
|
||||
<div class="flex items-center gap-1 pl-6 pr-4 py-3 border-b">
|
||||
<h2 class="text-base font-bold flex-1">{getTitle()}</h2>
|
||||
|
||||
<Button variant="ghost" size="icon" onClick={() => setState(state => state === 'open' ? 'collapsed' : 'open')}>
|
||||
<div class={cn('i-tabler-chevron-down size-5 transition-transform', getState() === 'collapsed' && 'rotate-180')} />
|
||||
</Button>
|
||||
|
||||
<Button variant="ghost" size="icon" onClick={close}>
|
||||
<div class="i-tabler-x size-5"></div>
|
||||
</Button>
|
||||
|
||||
</div>
|
||||
|
||||
<Show when={getState() === 'open'}>
|
||||
<div class="flex flex-col overflow-y-auto h-[450px] pb-4">
|
||||
<For each={getTasks()}>
|
||||
{task => (
|
||||
|
||||
<Switch>
|
||||
<Match when={task.status === 'success'}>
|
||||
<A
|
||||
href={`/organizations/${(task as TaskSuccess).document.organizationId}/documents/${(task as TaskSuccess).document.id}`}
|
||||
class="text-sm truncate min-w-0 flex items-center gap-4 min-h-48px group hover:bg-muted/50 transition-colors px-6 border-b border-border/80"
|
||||
>
|
||||
<div class="flex-1 truncate">
|
||||
{task.file.name}
|
||||
</div>
|
||||
|
||||
<div class="flex-none">
|
||||
<div class="i-tabler-circle-check text-primary size-5.5 group-hover:hidden"></div>
|
||||
<div class="i-tabler-arrow-right text-muted-foreground size-5.5 hidden group-hover:block"></div>
|
||||
</div>
|
||||
</A>
|
||||
</Match>
|
||||
|
||||
<Match when={task.status === 'error'}>
|
||||
<div class="text-sm truncate min-w-0 flex items-center gap-4 min-h-48px px-6 border-b border-border/80">
|
||||
<div class="flex-1 truncate">
|
||||
<div class="flex-1 truncate">{task.file.name}</div>
|
||||
|
||||
<div class="text-xs text-muted-foreground truncate text-red-500">
|
||||
{getErrorMessage({ error: (task as TaskError).error })}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="flex-none">
|
||||
<div class="i-tabler-circle-x text-red-500 size-5.5"></div>
|
||||
</div>
|
||||
</div>
|
||||
</Match>
|
||||
|
||||
<Match when={['pending', 'uploading'].includes(task.status)}>
|
||||
<div class="text-sm truncate min-w-0 flex items-center gap-4 min-h-48px px-6 border-b border-border/80">
|
||||
<div class="flex-1 truncate">
|
||||
{task.file.name}
|
||||
</div>
|
||||
|
||||
<div class="flex-none">
|
||||
<div class="i-tabler-loader-2 animate-spin text-muted-foreground size-5.5"></div>
|
||||
</div>
|
||||
</div>
|
||||
</Match>
|
||||
</Switch>
|
||||
|
||||
)}
|
||||
</For>
|
||||
|
||||
<Show when={getTasks().length === 0}>
|
||||
<div class="flex flex-col items-center justify-center gap-2 h-full mb-10">
|
||||
<div class="flex flex-col items-center justify-center gap-2 ">
|
||||
<div class="i-tabler-file-import size-10 text-muted-foreground"></div>
|
||||
</div>
|
||||
|
||||
<div class="text-sm text-muted-foreground text-center mt-2">
|
||||
{t('import-documents.no-import-in-progress')}
|
||||
</div>
|
||||
</div>
|
||||
</Show>
|
||||
</div>
|
||||
|
||||
</Show>
|
||||
|
||||
</div>
|
||||
</Show>
|
||||
</Portal>
|
||||
</DocumentUploadContext.Provider>
|
||||
);
|
||||
};
|
||||
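As a usage note for the upload provider added above, a hedged sketch of how a page component could consume the context; it assumes the component renders somewhere under `DocumentUploadProvider`, and the organization id and file are placeholders:

```ts
// Inside an async handler of a component rendered under <DocumentUploadProvider>.
const { promptImport, uploadDocuments } = useDocumentUpload({ organizationId: 'org_123' });

// Open the file picker and upload whatever the user selects:
await promptImport();

// Or upload files obtained elsewhere (e.g. from a drop event):
await uploadDocuments({ files: [new File(['hello'], 'note.txt', { type: 'text/plain' })] });
```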
@@ -9,7 +9,7 @@ import { queryClient } from '../shared/query/query-client';
|
||||
import { createToast } from '../ui/components/sonner';
|
||||
import { deleteDocument, restoreDocument, uploadDocument } from './documents.services';
|
||||
|
||||
function invalidateOrganizationDocumentsQuery({ organizationId }: { organizationId: string }) {
|
||||
export function invalidateOrganizationDocumentsQuery({ organizationId }: { organizationId: string }) {
|
||||
return queryClient.invalidateQueries({
|
||||
queryKey: ['organizations', organizationId],
|
||||
});
|
||||
@@ -77,6 +77,34 @@ export function useRestoreDocument() {
|
||||
};
|
||||
}
|
||||
|
||||
function toastUploadError({ error, file }: { error: Error; file: File }) {
|
||||
if (isHttpErrorWithCode({ error, code: 'document.already_exists' })) {
|
||||
createToast({
|
||||
type: 'error',
|
||||
message: 'Document already exists',
|
||||
description: `The document ${file.name} already exists, it has not been uploaded.`,
|
||||
});
|
||||
|
||||
return;
|
||||
}
|
||||
|
||||
if (isHttpErrorWithCode({ error, code: 'document.file_too_big' })) {
|
||||
createToast({
|
||||
type: 'error',
|
||||
message: 'Document too big',
|
||||
description: `The document ${file.name} is too big, it has not been uploaded.`,
|
||||
});
|
||||
|
||||
return;
|
||||
}
|
||||
|
||||
createToast({
|
||||
type: 'error',
|
||||
message: 'Failed to upload document',
|
||||
description: error.message,
|
||||
});
|
||||
}
|
||||
|
||||
export function useUploadDocuments({ organizationId }: { organizationId: string }) {
|
||||
const uploadDocuments = async ({ files }: { files: File[] }) => {
|
||||
const throttledInvalidateOrganizationDocumentsQuery = throttle(invalidateOrganizationDocumentsQuery, 500);
|
||||
@@ -84,12 +112,8 @@ export function useUploadDocuments({ organizationId }: { organizationId: string
|
||||
await Promise.all(files.map(async (file) => {
|
||||
const [, error] = await safely(uploadDocument({ file, organizationId }));
|
||||
|
||||
if (isHttpErrorWithCode({ error, code: 'document.already_exists' })) {
|
||||
createToast({
|
||||
type: 'error',
|
||||
message: 'Document already exists',
|
||||
description: `The document ${file.name} already exists, it has not been uploaded.`,
|
||||
});
|
||||
if (error) {
|
||||
toastUploadError({ error, file });
|
||||
}
|
||||
|
||||
await throttledInvalidateOrganizationDocumentsQuery({ organizationId });
|
||||
|
||||
@@ -194,3 +194,17 @@ export async function getOrganizationDocumentsStats({ organizationId }: { organi

  return { organizationStats };
}

export async function deleteAllTrashDocuments({ organizationId }: { organizationId: string }) {
  await apiClient({
    method: 'DELETE',
    path: `/api/organizations/${organizationId}/documents/trash`,
  });
}

export async function deleteTrashDocument({ documentId, organizationId }: { documentId: string; organizationId: string }) {
  await apiClient({
    method: 'DELETE',
    path: `/api/organizations/${organizationId}/documents/trash/${documentId}`,
  });
}

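A short sketch of calling the two new trash services from client code; the ids are placeholders, and both calls target the DELETE endpoints added above:

```ts
// Empty the whole trash for an organization, then permanently delete a single document.
await deleteAllTrashDocuments({ organizationId: 'org_123' });
await deleteTrashDocument({ organizationId: 'org_123', documentId: 'doc_456' });
```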
@@ -1,14 +1,18 @@
|
||||
import type { Document } from '../documents.types';
|
||||
import { useConfig } from '@/modules/config/config.provider';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { useConfirmModal } from '@/modules/shared/confirm';
|
||||
import { timeAgo } from '@/modules/shared/date/time-ago';
|
||||
import { queryClient } from '@/modules/shared/query/query-client';
|
||||
import { Alert, AlertDescription } from '@/modules/ui/components/alert';
|
||||
import { Button } from '@/modules/ui/components/button';
|
||||
import { createToast } from '@/modules/ui/components/sonner';
|
||||
import { useParams } from '@solidjs/router';
|
||||
import { createQuery, keepPreviousData } from '@tanstack/solid-query';
|
||||
import { createMutation, createQuery, keepPreviousData } from '@tanstack/solid-query';
|
||||
import { type Component, createSignal, Show, Suspense } from 'solid-js';
|
||||
import { DocumentsPaginatedList } from '../components/documents-list.component';
|
||||
import { useRestoreDocument } from '../documents.composables';
|
||||
import { fetchOrganizationDeletedDocuments } from '../documents.services';
|
||||
import { deleteAllTrashDocuments, deleteTrashDocument, fetchOrganizationDeletedDocuments } from '../documents.services';
|
||||
|
||||
const RestoreDocumentButton: Component<{ document: Document }> = (props) => {
|
||||
const { getIsRestoring, restore } = useRestoreDocument();
|
||||
@@ -32,6 +36,113 @@ const RestoreDocumentButton: Component<{ document: Document }> = (props) => {
|
||||
);
|
||||
};
|
||||
|
||||
const PermanentlyDeleteTrashDocumentButton: Component<{ document: Document; organizationId: string }> = (props) => {
|
||||
const { confirm } = useConfirmModal();
|
||||
const { t } = useI18n();
|
||||
|
||||
const deleteMutation = createMutation(() => ({
|
||||
mutationFn: async () => {
|
||||
await deleteTrashDocument({ documentId: props.document.id, organizationId: props.organizationId });
|
||||
},
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries({ queryKey: ['organizations', props.organizationId, 'documents', 'deleted'] });
|
||||
|
||||
createToast({
|
||||
message: t('trash.deleted.success.title'),
|
||||
description: t('trash.deleted.success.description'),
|
||||
});
|
||||
},
|
||||
}));
|
||||
|
||||
const handleClick = async () => {
|
||||
if (!await confirm({
|
||||
title: t('trash.delete.confirm.title'),
|
||||
message: t('trash.delete.confirm.description'),
|
||||
confirmButton: {
|
||||
text: t('trash.delete.confirm.label'),
|
||||
variant: 'destructive',
|
||||
},
|
||||
cancelButton: {
|
||||
text: t('trash.delete.confirm.cancel'),
|
||||
},
|
||||
})) {
|
||||
return;
|
||||
}
|
||||
|
||||
deleteMutation.mutate();
|
||||
};
|
||||
|
||||
return (
|
||||
<Button
|
||||
variant="outline"
|
||||
size="sm"
|
||||
onClick={handleClick}
|
||||
isLoading={deleteMutation.isPending}
|
||||
class="text-red-500 hover:text-red-600"
|
||||
>
|
||||
{deleteMutation.isPending
|
||||
? (<>Deleting...</>)
|
||||
: (
|
||||
<>
|
||||
<div class="i-tabler-trash size-4 mr-2" />
|
||||
{t('trash.delete.button')}
|
||||
</>
|
||||
)}
|
||||
</Button>
|
||||
);
|
||||
};
|
||||
|
||||
const DeleteAllTrashDocumentsButton: Component<{ organizationId: string }> = (props) => {
|
||||
const { confirm } = useConfirmModal();
|
||||
const { t } = useI18n();
|
||||
|
||||
const deleteAllMutation = createMutation(() => ({
|
||||
mutationFn: async () => {
|
||||
await deleteAllTrashDocuments({ organizationId: props.organizationId });
|
||||
},
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries({ queryKey: ['organizations', props.organizationId, 'documents', 'deleted'] });
|
||||
},
|
||||
}));
|
||||
|
||||
const handleClick = async () => {
|
||||
if (!await confirm({
|
||||
title: t('trash.delete-all.confirm.title'),
|
||||
message: t('trash.delete-all.confirm.description'),
|
||||
confirmButton: {
|
||||
text: t('trash.delete-all.confirm.label'),
|
||||
variant: 'destructive',
|
||||
},
|
||||
cancelButton: {
|
||||
text: t('trash.delete-all.confirm.cancel'),
|
||||
},
|
||||
})) {
|
||||
return;
|
||||
}
|
||||
|
||||
deleteAllMutation.mutate();
|
||||
};
|
||||
|
||||
return (
|
||||
<Button
|
||||
variant="outline"
|
||||
size="sm"
|
||||
onClick={handleClick}
|
||||
isLoading={deleteAllMutation.isPending}
|
||||
class="text-red-500 hover:text-red-600"
|
||||
>
|
||||
{deleteAllMutation.isPending
|
||||
? (<>Deleting...</>)
|
||||
: (
|
||||
<>
|
||||
<div class="i-tabler-trash size-4 mr-2" />
|
||||
{t('trash.delete-all.button')}
|
||||
</>
|
||||
)}
|
||||
</Button>
|
||||
);
|
||||
};
|
||||
|
||||
export const DeletedDocumentsPage: Component = () => {
|
||||
const [getPagination, setPagination] = createSignal({ pageIndex: 0, pageSize: 100 });
|
||||
const params = useParams();
|
||||
@@ -77,6 +188,10 @@ export const DeletedDocumentsPage: Component = () => {
|
||||
</Show>
|
||||
|
||||
<Show when={query.data && query.data?.documents.length > 0}>
|
||||
<div class="flex items-center justify-end gap-2">
|
||||
<DeleteAllTrashDocumentsButton organizationId={params.organizationId} />
|
||||
</div>
|
||||
|
||||
<DocumentsPaginatedList
|
||||
documents={query.data?.documents ?? []}
|
||||
documentsCount={query.data?.documentsCount ?? 0}
|
||||
@@ -96,8 +211,9 @@ export const DeletedDocumentsPage: Component = () => {
|
||||
{
|
||||
id: 'actions',
|
||||
cell: data => (
|
||||
<div class="flex items-center justify-end">
|
||||
<div class="flex items-center justify-end gap-2">
|
||||
<RestoreDocumentButton document={data.row.original} />
|
||||
<PermanentlyDeleteTrashDocumentButton document={data.row.original} organizationId={params.organizationId} />
|
||||
</div>
|
||||
),
|
||||
},
|
||||
|
||||
@@ -138,8 +138,7 @@ export const DocumentPage: Component = () => {

<DocumentTagPicker
  organizationId={params.organizationId}
  documentId={params.documentId}
  tags={getDocument().tags}
  tagIds={getDocument().tags.map(tag => tag.id)}
  onTagAdded={async ({ tag }) => {
    await addTagToDocument({
      documentId: params.documentId,

@@ -30,7 +30,13 @@ export function findMatchingLocale({

export function createTranslator<Dict extends Record<string, string>>({ getDictionary }: { getDictionary: () => Dict }) {
  return (key: keyof Dict, args?: Record<string, string | number>) => {
    let translation: string = getDictionary()[key] ?? key;
    const translationFromDictionary = getDictionary()[key];

    if (!translationFromDictionary && import.meta.env.DEV) {
      console.warn(`Translation not found for key: ${String(key)}`);
    }

    let translation: string = translationFromDictionary ?? key;

    if (args) {
      for (const [key, value] of Object.entries(args)) {

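To make the added fallback explicit, a minimal sketch; the dictionary and keys are made up, and only the key fallback plus the dev-time warning come from the change above:

```ts
const t = createTranslator({ getDictionary: () => ({ 'demo.title': 'Demo' } as Record<string, string>) });

t('demo.title');   // -> 'Demo'
t('missing.key');  // -> 'missing.key', and in dev builds logs "Translation not found for key: missing.key"
```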
File diff suppressed because one or more lines are too long
@@ -1,4 +1,5 @@
import type { Organization } from '../organizations.types';
import { buildTimeConfig } from '@/modules/config/config';
import { useConfirmModal } from '@/modules/shared/confirm';
import { createForm } from '@/modules/shared/form/form';
import { getCustomerPortalUrl } from '@/modules/subscriptions/subscriptions.services';
@@ -89,7 +90,7 @@ export const SubscriptionCard: Component<{ organization: Organization }> = (prop
      Manage your billing, invoices and payment methods.
    </div>
  </div>
  <Button onClick={goToCustomerPortal} isLoading={getIsLoading()} class="flex-shrink-0">
  <Button onClick={goToCustomerPortal} isLoading={getIsLoading()} class="flex-shrink-0" disabled={buildTimeConfig.isDemoMode}>
    Manage subscription
  </Button>
</Card>

@@ -1,6 +1,6 @@
import type { FormProps, PartialValues } from '@modular-forms/solid';
import type { FormErrors, FormProps, PartialValues } from '@modular-forms/solid';
import type * as v from 'valibot';
import { createForm as createModularForm, valiForm } from '@modular-forms/solid';
import { createForm as createModularForm, FormError, valiForm } from '@modular-forms/solid';
import { createHook } from '../hooks/hooks';

export function createForm<Schema extends v.ObjectSchema<any, any>>({
@@ -18,7 +18,7 @@ export function createForm<Schema extends v.ObjectSchema<any, any>>({
    submitHook.on(onSubmit);
  }

  const [form, { Form, Field }] = createModularForm<v.InferInput<Schema>>({
  const [form, { Form, Field, FieldArray }] = createModularForm<v.InferInput<Schema>>({
    validate: valiForm(schema),
    initialValues,
  });
@@ -27,7 +27,9 @@ export function createForm<Schema extends v.ObjectSchema<any, any>>({
    form,
    Form: (props: Omit<FormProps<v.InferInput<Schema>, undefined>, 'of'>) => Form({ ...props, onSubmit: submitHook.trigger }),
    Field,
    FieldArray,
    onSubmit: submitHook.on,
    submit: submitHook.trigger,
    createFormError: ({ message, fields }: { message: string; fields?: FormErrors<v.InferInput<Schema>> }) => new FormError<v.InferInput<Schema>>(message, fields),
  };
}

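A hedged sketch of consuming the extended `createForm` wrapper; the schema and field names are illustrative, and throwing the value returned by `createFormError` from the submit handler is one plausible way to surface a form-level error through `@modular-forms/solid`:

```ts
// Hedged sketch — schema, field names, and the error scenario are made up.
import * as v from 'valibot';
import { createForm } from '@/modules/shared/form/form';

const { Form, Field, FieldArray, createFormError } = createForm({
  schema: v.object({
    name: v.pipe(v.string(), v.minLength(1, 'A name is required')),
    conditions: v.optional(v.array(v.object({ value: v.string() }))),
  }),
  onSubmit: async ({ name }) => {
    if (name === 'forbidden') {
      // Surfaces a form-level error via the FormError re-exported by the wrapper.
      throw createFormError({ message: 'This name is not allowed' });
    }
  },
});

// <Form>, <Field name="name"> and <FieldArray name="conditions"> can then be used in JSX,
// much as the tagging-rule form later in this changeset does.
```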
@@ -0,0 +1,29 @@
import type { LocaleKeys } from '@/modules/i18n/locales.types';
import { useI18n } from '@/modules/i18n/i18n.provider';
import { get } from 'lodash-es';

export function useI18nApiErrors({ t = useI18n().t }: { t?: ReturnType<typeof useI18n>['t'] } = {}) {
  const getTranslationFromApiErrorCode = ({ code }: { code: string }) => {
    return t(`api-errors.${code}` as LocaleKeys);
  };

  const getTranslationFromApiError = ({ error }: { error: unknown }) => {
    const code = get(error, 'data.error.code') ?? get(error, 'code');

    if (!code) {
      return t('api-errors.default');
    }

    return getTranslationFromApiErrorCode({ code });
  };

  return {
    getErrorMessage: (args: { error: unknown } | { code: string }) => {
      if ('error' in args) {
        return getTranslationFromApiError({ error: args.error });
      }

      return getTranslationFromApiErrorCode({ code: args.code });
    },
  };
}
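A brief usage sketch of the new composable; the error shapes and codes are illustrative:

```ts
// Inside a component — error objects and codes are made up for illustration.
const { getErrorMessage } = useI18nApiErrors();

getErrorMessage({ code: 'document.already_exists' });
// -> t('api-errors.document.already_exists')

getErrorMessage({ error: { data: { error: { code: 'document.file_too_big' } } } });
// -> the code is extracted with lodash `get`, then translated the same way

getErrorMessage({ error: new Error('network down') });
// -> no code found, falls back to t('api-errors.default')
```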
@@ -0,0 +1,253 @@
|
||||
import type { TaggingRule, TaggingRuleForCreation } from '../tagging-rules.types';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { useConfirmModal } from '@/modules/shared/confirm';
|
||||
import { createForm } from '@/modules/shared/form/form';
|
||||
import { DocumentTagPicker } from '@/modules/tags/components/tag-picker.component';
|
||||
import { CreateTagModal } from '@/modules/tags/pages/tags.page';
|
||||
import { Button } from '@/modules/ui/components/button';
|
||||
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from '@/modules/ui/components/select';
|
||||
import { Separator } from '@/modules/ui/components/separator';
|
||||
import { TextArea } from '@/modules/ui/components/textarea';
|
||||
import { TextField, TextFieldLabel, TextFieldRoot } from '@/modules/ui/components/textfield';
|
||||
import { insert, remove, setValue } from '@modular-forms/solid';
|
||||
import { A } from '@solidjs/router';
|
||||
import { type Component, For, Show } from 'solid-js';
|
||||
import * as v from 'valibot';
|
||||
import { TAGGING_RULE_FIELDS, TAGGING_RULE_FIELDS_LOCALIZATION_KEYS, TAGGING_RULE_OPERATORS, TAGGING_RULE_OPERATORS_LOCALIZATION_KEYS } from '../tagging-rules.constants';
|
||||
|
||||
export const TaggingRuleForm: Component<{
|
||||
onSubmit: (args: { taggingRule: TaggingRuleForCreation }) => Promise<void> | void;
|
||||
organizationId: string;
|
||||
taggingRule?: TaggingRule;
|
||||
submitButtonText?: string;
|
||||
}> = (props) => {
|
||||
const { t } = useI18n();
|
||||
const { confirm } = useConfirmModal();
|
||||
|
||||
const { form, Form, Field, FieldArray } = createForm({
|
||||
onSubmit: async ({ name, conditions = [], tagIds, description }) => {
|
||||
if (conditions.length === 0) {
|
||||
const confirmed = await confirm({
|
||||
title: t('tagging-rules.form.conditions.no-conditions.title'),
|
||||
message: t('tagging-rules.form.conditions.no-conditions.description'),
|
||||
confirmButton: {
|
||||
variant: 'default',
|
||||
text: t('tagging-rules.form.conditions.no-conditions.confirm'),
|
||||
},
|
||||
cancelButton: {
|
||||
text: t('tagging-rules.form.conditions.no-conditions.cancel'),
|
||||
},
|
||||
});
|
||||
|
||||
if (!confirmed) {
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
props.onSubmit({ taggingRule: { name, conditions, tagIds, description } });
|
||||
},
|
||||
schema: v.object({
|
||||
name: v.pipe(
|
||||
v.string(),
|
||||
v.minLength(1, t('tagging-rules.form.name.min-length')),
|
||||
v.maxLength(64, t('tagging-rules.form.name.max-length')),
|
||||
),
|
||||
description: v.pipe(
|
||||
v.string(),
|
||||
v.maxLength(256, t('tagging-rules.form.description.max-length')),
|
||||
),
|
||||
conditions: v.optional(
|
||||
v.array(v.object({
|
||||
field: v.picklist(Object.values(TAGGING_RULE_FIELDS)),
|
||||
operator: v.picklist(Object.values(TAGGING_RULE_OPERATORS)),
|
||||
value: v.pipe(
|
||||
v.string(),
|
||||
v.minLength(1, t('tagging-rules.form.conditions.value.min-length')),
|
||||
),
|
||||
})),
|
||||
),
|
||||
tagIds: v.pipe(
|
||||
v.array(v.string()),
|
||||
v.minLength(1, t('tagging-rules.form.tags.min-length')),
|
||||
),
|
||||
}),
|
||||
initialValues: {
|
||||
conditions: props.taggingRule?.conditions ?? [],
|
||||
tagIds: props.taggingRule?.actions.map(action => action.tagId) ?? [],
|
||||
name: props.taggingRule?.name,
|
||||
description: props.taggingRule?.description,
|
||||
},
|
||||
});
|
||||
|
||||
const getOperatorLabel = (operator: string) => {
|
||||
return t(TAGGING_RULE_OPERATORS_LOCALIZATION_KEYS[operator as keyof typeof TAGGING_RULE_OPERATORS_LOCALIZATION_KEYS]);
|
||||
};
|
||||
|
||||
const getFieldLabel = (field: string) => {
|
||||
return t(TAGGING_RULE_FIELDS_LOCALIZATION_KEYS[field as keyof typeof TAGGING_RULE_FIELDS_LOCALIZATION_KEYS]);
|
||||
};
|
||||
|
||||
return (
|
||||
<Form>
|
||||
<Field name="name">
|
||||
{(field, inputProps) => (
|
||||
<TextFieldRoot class="flex flex-col gap-1">
|
||||
<TextFieldLabel for="name">{t('tagging-rules.form.name.label')}</TextFieldLabel>
|
||||
<TextField
|
||||
type="text"
|
||||
id="name"
|
||||
placeholder={t('tagging-rules.form.name.placeholder')}
|
||||
{...inputProps}
|
||||
value={field.value}
|
||||
aria-invalid={Boolean(field.error)}
|
||||
/>
|
||||
{field.error && <div class="text-red-500 text-sm">{field.error}</div>}
|
||||
</TextFieldRoot>
|
||||
)}
|
||||
</Field>
|
||||
<Field name="description">
|
||||
{(field, inputProps) => (
|
||||
<TextFieldRoot class="flex flex-col gap-1 mt-6">
|
||||
<TextFieldLabel for="description">{t('tagging-rules.form.description.label')}</TextFieldLabel>
|
||||
<TextArea
|
||||
id="description"
|
||||
placeholder={t('tagging-rules.form.description.placeholder')}
|
||||
{...inputProps}
|
||||
value={field.value}
|
||||
/>
|
||||
{field.error && <div class="text-red-500 text-sm">{field.error}</div>}
|
||||
</TextFieldRoot>
|
||||
)}
|
||||
</Field>
|
||||
|
||||
<Separator class="my-6" />
|
||||
|
||||
<p class="mb-1 font-medium">{t('tagging-rules.form.conditions.label')}</p>
|
||||
<p class="mb-2 text-sm text-muted-foreground">{t('tagging-rules.form.conditions.description')}</p>
|
||||
|
||||
<FieldArray name="conditions">
|
||||
{fieldArray => (
|
||||
<div>
|
||||
<For each={fieldArray.items}>
|
||||
{(_, index) => (
|
||||
<div class="px-4 py-4 mb-1 flex gap-2 items-center bg-card border rounded-md">
|
||||
<div>When</div>
|
||||
|
||||
<Field name={`conditions.${index()}.field`}>
|
||||
{field => (
|
||||
<Select
|
||||
id="field"
|
||||
defaultValue={field.value}
|
||||
onChange={value => value && setValue(form, `conditions.${index()}.field`, value)}
|
||||
options={Object.values(TAGGING_RULE_FIELDS)}
|
||||
itemComponent={props => (
|
||||
<SelectItem item={props.item}>{getFieldLabel(props.item.rawValue)}</SelectItem>
|
||||
)}
|
||||
>
|
||||
<SelectTrigger class="w-[180px]">
|
||||
<SelectValue<string>>{state => getFieldLabel(state.selectedOption())}</SelectValue>
|
||||
</SelectTrigger>
|
||||
<SelectContent />
|
||||
</Select>
|
||||
)}
|
||||
</Field>
|
||||
|
||||
<Field name={`conditions.${index()}.operator`}>
|
||||
{field => (
|
||||
<Select
|
||||
id="operator"
|
||||
defaultValue={field.value}
|
||||
onChange={value => value && setValue(form, `conditions.${index()}.operator`, value)}
|
||||
options={Object.values(TAGGING_RULE_OPERATORS)}
|
||||
itemComponent={props => (
|
||||
<SelectItem item={props.item}>{getOperatorLabel(props.item.rawValue)}</SelectItem>
|
||||
)}
|
||||
>
|
||||
<SelectTrigger class="w-[140px]">
|
||||
<SelectValue<string>>{state => getOperatorLabel(state.selectedOption())}</SelectValue>
|
||||
</SelectTrigger>
|
||||
<SelectContent />
|
||||
</Select>
|
||||
)}
|
||||
</Field>
|
||||
|
||||
<Field name={`conditions.${index()}.value`}>
|
||||
{(field, inputProps) => (
|
||||
<TextFieldRoot class="flex flex-col gap-1 flex-1">
|
||||
<TextField
|
||||
id="value"
|
||||
{...inputProps}
|
||||
value={field.value}
|
||||
placeholder={t('tagging-rules.form.conditions.value.placeholder')}
|
||||
|
||||
/>
|
||||
{field.error && <div class="text-red-500 text-sm">{field.error}</div>}
|
||||
|
||||
</TextFieldRoot>
|
||||
)}
|
||||
</Field>
|
||||
|
||||
<Button variant="outline" size="icon" onClick={() => remove(form, 'conditions', { at: index() })}>
|
||||
<div class="i-tabler-x size-4"></div>
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
</For>
|
||||
{fieldArray.error && <div class="text-red-500 text-sm">{fieldArray.error}</div>}
|
||||
</div>
|
||||
)}
|
||||
</FieldArray>
|
||||
|
||||
<Button
|
||||
variant="outline"
|
||||
onClick={() => insert(form, 'conditions', { value: { field: 'name', operator: 'contains', value: '' } })}
|
||||
class="gap-2 mt-2"
|
||||
>
|
||||
<div class="i-tabler-plus size-4"></div>
|
||||
{t('tagging-rules.form.conditions.add-condition')}
|
||||
</Button>
|
||||
|
||||
<Separator class="my-6" />
|
||||
|
||||
<p class="mb-1 font-medium">{t('tagging-rules.form.tags.label')}</p>
|
||||
<p class="mb-2 text-sm text-muted-foreground">{t('tagging-rules.form.tags.description')}</p>
|
||||
|
||||
<Field name="tagIds" type="string[]">
|
||||
{field => (
|
||||
<>
|
||||
<div class="flex gap-2 sm:items-center sm:flex-row flex-col">
|
||||
<div class="flex-1">
|
||||
|
||||
<DocumentTagPicker
|
||||
organizationId={props.organizationId}
|
||||
tagIds={field.value ?? []}
|
||||
onTagsChange={({ tags }) => setValue(form, 'tagIds', tags.map(tag => tag.id))}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<CreateTagModal organizationId={props.organizationId}>
|
||||
{props => (
|
||||
<Button variant="outline" {...props}>
|
||||
<div class="i-tabler-plus size-4 mr-2"></div>
|
||||
{t('tagging-rules.form.tags.add-tag')}
|
||||
</Button>
|
||||
)}
|
||||
</CreateTagModal>
|
||||
</div>
|
||||
{field.error && <div class="text-red-500 text-sm">{field.error}</div>}
|
||||
</>
|
||||
)}
|
||||
</Field>
|
||||
|
||||
<div class="flex justify-end mt-6 gap-2">
|
||||
<Show when={props.taggingRule}>
|
||||
<Button variant="outline" as={A} href={`/organizations/${props.organizationId}/tagging-rules`}>
|
||||
{t('tagging-rules.update.cancel')}
|
||||
</Button>
|
||||
</Show>
|
||||
|
||||
<Button type="submit">{props.submitButtonText ?? t('tagging-rules.form.submit')}</Button>
|
||||
</div>
|
||||
</Form>
|
||||
);
|
||||
};
|
||||
@@ -0,0 +1,49 @@
|
||||
import type { Component } from 'solid-js';
|
||||
import type { TaggingRuleForCreation } from '../tagging-rules.types';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { createToast } from '@/modules/ui/components/sonner';
|
||||
import { useNavigate, useParams } from '@solidjs/router';
|
||||
import { createMutation } from '@tanstack/solid-query';
|
||||
import { TaggingRuleForm } from '../components/tagging-rule-form.component';
|
||||
import { createTaggingRule } from '../tagging-rules.services';
|
||||
|
||||
export const CreateTaggingRulePage: Component = () => {
|
||||
const { t } = useI18n();
|
||||
const params = useParams();
|
||||
const navigate = useNavigate();
|
||||
|
||||
const createTaggingRuleMutation = createMutation(() => ({
|
||||
mutationFn: async ({ taggingRule }: { taggingRule: TaggingRuleForCreation }) => {
|
||||
await createTaggingRule({ taggingRule, organizationId: params.organizationId });
|
||||
},
|
||||
onSuccess: () => {
|
||||
createToast({
|
||||
message: t('tagging-rules.create.success'),
|
||||
type: 'success',
|
||||
});
|
||||
navigate(`/organizations/${params.organizationId}/tagging-rules`);
|
||||
},
|
||||
onError: () => {
|
||||
createToast({
|
||||
message: t('tagging-rules.create.error'),
|
||||
type: 'error',
|
||||
});
|
||||
},
|
||||
}));
|
||||
|
||||
return (
|
||||
<div class="p-6 max-w-screen-md mx-auto mt-4">
|
||||
<div class="border-b mb-6 pb-4">
|
||||
<h1 class="text-xl font-bold">
|
||||
{t('tagging-rules.create.title')}
|
||||
</h1>
|
||||
</div>
|
||||
|
||||
<TaggingRuleForm
|
||||
onSubmit={({ taggingRule }) => createTaggingRuleMutation.mutate({ taggingRule })}
|
||||
organizationId={params.organizationId}
|
||||
submitButtonText={t('tagging-rules.create.submit')}
|
||||
/>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
@@ -0,0 +1,146 @@
|
||||
import type { TaggingRule } from '../tagging-rules.types';
|
||||
import { useConfig } from '@/modules/config/config.provider';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { queryClient } from '@/modules/shared/query/query-client';
|
||||
import { Alert } from '@/modules/ui/components/alert';
|
||||
import { Button } from '@/modules/ui/components/button';
|
||||
import { EmptyState } from '@/modules/ui/components/empty';
|
||||
import { A, useParams } from '@solidjs/router';
|
||||
import { createMutation, createQuery } from '@tanstack/solid-query';
|
||||
import { type Component, For, Match, Show, Switch } from 'solid-js';
|
||||
import { deleteTaggingRule, fetchTaggingRules } from '../tagging-rules.services';
|
||||
|
||||
const TaggingRuleCard: Component<{ taggingRule: TaggingRule }> = (props) => {
|
||||
const { t } = useI18n();
|
||||
|
||||
const getConditionsLabel = () => {
|
||||
const count = props.taggingRule.conditions.length;
|
||||
|
||||
if (count === 0) {
|
||||
return t('tagging-rules.list.card.no-conditions');
|
||||
}
|
||||
|
||||
if (count === 1) {
|
||||
return t('tagging-rules.list.card.one-condition');
|
||||
}
|
||||
|
||||
return t('tagging-rules.list.card.conditions', { count });
|
||||
};
|
||||
|
||||
const deleteTaggingRuleMutation = createMutation(() => ({
|
||||
mutationFn: async () => {
|
||||
await deleteTaggingRule({ organizationId: props.taggingRule.organizationId, taggingRuleId: props.taggingRule.id });
|
||||
},
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries({ queryKey: ['organizations', props.taggingRule.organizationId, 'tagging-rules'] });
|
||||
},
|
||||
}));
|
||||
|
||||
return (
|
||||
<div class="flex items-center gap-2 bg-card py-4 px-6 rounded-md border">
|
||||
<A href={`/organizations/${props.taggingRule.organizationId}/tagging-rules/${props.taggingRule.id}`}>
|
||||
<div class="i-tabler-list-check size-8 opacity-30 mr-2" />
|
||||
</A>
|
||||
|
||||
<div class="flex-1">
|
||||
<A href={`/organizations/${props.taggingRule.organizationId}/tagging-rules/${props.taggingRule.id}`} class="text-base font-bold">{props.taggingRule.name}</A>
|
||||
|
||||
<p class="text-xs text-muted-foreground">
|
||||
{[getConditionsLabel(), props.taggingRule.description].filter(Boolean).join(' - ')}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div class="flex items-center gap-2">
|
||||
<Button
|
||||
as={A}
|
||||
href={`/organizations/${props.taggingRule.organizationId}/tagging-rules/${props.taggingRule.id}`}
|
||||
variant="outline"
|
||||
size="icon"
|
||||
aria-label={t('tagging-rules.list.card.edit')}
|
||||
>
|
||||
<div class="i-tabler-edit size-4" />
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
variant="outline"
|
||||
size="icon"
|
||||
onClick={() => deleteTaggingRuleMutation.mutate()}
|
||||
disabled={deleteTaggingRuleMutation.isPending}
|
||||
aria-label={t('tagging-rules.list.card.delete')}
|
||||
>
|
||||
<div class="i-tabler-trash size-4" />
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export const TaggingRulesPage: Component = () => {
|
||||
const { t } = useI18n();
|
||||
const { config } = useConfig();
|
||||
const params = useParams();
|
||||
|
||||
const query = createQuery(() => ({
|
||||
queryKey: ['organizations', params.organizationId, 'tagging-rules'],
|
||||
queryFn: () => fetchTaggingRules({ organizationId: params.organizationId }),
|
||||
}));
|
||||
|
||||
return (
|
||||
<div class="p-6 max-w-screen-lg mx-auto mt-4">
|
||||
<div class="border-b mb-6 pb-4 flex items-center justify-between gap-4 sm:flex-row flex-col">
|
||||
<div>
|
||||
<h1 class="text-xl font-bold">
|
||||
{t('tagging-rules.list.title')}
|
||||
</h1>
|
||||
|
||||
<p class="text-muted-foreground mt-1">
|
||||
{t('tagging-rules.list.description')}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<Show when={query.data?.taggingRules.length}>
|
||||
<Button as={A} href={`/organizations/${params.organizationId}/tagging-rules/create`} class="flex items-center gap-2 flex-shrink-0 sm:w-auto w-full">
|
||||
<div class="i-tabler-plus size-4" />
|
||||
{t('tagging-rules.list.no-tagging-rules.create-tagging-rule')}
|
||||
</Button>
|
||||
</Show>
|
||||
</div>
|
||||
|
||||
<Show when={config.isDemoMode}>
|
||||
<Alert class="bg-primary text-primary-foreground mb-4">
|
||||
{t('tagging-rules.list.demo-warning')}
|
||||
</Alert>
|
||||
</Show>
|
||||
|
||||
<Switch>
|
||||
<Match when={query.data?.taggingRules.length === 0}>
|
||||
<div class="mt-16">
|
||||
<EmptyState
|
||||
title={t('tagging-rules.list.no-tagging-rules.title')}
|
||||
description={t('tagging-rules.list.no-tagging-rules.description')}
|
||||
class="pt-0"
|
||||
icon="i-tabler-list-check"
|
||||
cta={(
|
||||
<Button as={A} href={`/organizations/${params.organizationId}/tagging-rules/create`}>
|
||||
<div class="i-tabler-plus size-4 mr-2" />
|
||||
{t('tagging-rules.list.no-tagging-rules.create-tagging-rule')}
|
||||
</Button>
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
</Match>
|
||||
|
||||
<Match when={query.data?.taggingRules.length}>
|
||||
<div class="flex flex-col gap-2">
|
||||
<For each={query.data?.taggingRules}>
|
||||
{taggingRule => <TaggingRuleCard taggingRule={taggingRule} />}
|
||||
</For>
|
||||
</div>
|
||||
</Match>
|
||||
|
||||
</Switch>
|
||||
|
||||
</div>
|
||||
);
|
||||
};
|
||||
@@ -0,0 +1,63 @@
|
||||
import type { TaggingRuleForCreation } from '../tagging-rules.types';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { queryClient } from '@/modules/shared/query/query-client';
|
||||
import { createToast } from '@/modules/ui/components/sonner';
|
||||
import { useNavigate, useParams } from '@solidjs/router';
|
||||
import { createMutation, createQuery } from '@tanstack/solid-query';
|
||||
import { type Component, Show } from 'solid-js';
|
||||
import { TaggingRuleForm } from '../components/tagging-rule-form.component';
|
||||
import { getTaggingRule, updateTaggingRule } from '../tagging-rules.services';
|
||||
|
||||
export const UpdateTaggingRulePage: Component = () => {
|
||||
const { t } = useI18n();
|
||||
const params = useParams();
|
||||
const navigate = useNavigate();
|
||||
|
||||
const query = createQuery(() => ({
|
||||
queryKey: ['organizations', params.organizationId, 'tagging-rules', params.taggingRuleId],
|
||||
queryFn: () => getTaggingRule({ organizationId: params.organizationId, taggingRuleId: params.taggingRuleId }),
|
||||
}));
|
||||
|
||||
const updateTaggingRuleMutation = createMutation(() => ({
|
||||
mutationFn: async ({ taggingRule }: { taggingRule: TaggingRuleForCreation }) => {
|
||||
await updateTaggingRule({ organizationId: params.organizationId, taggingRuleId: params.taggingRuleId, taggingRule });
|
||||
},
|
||||
onSuccess: async () => {
|
||||
await queryClient.invalidateQueries({ queryKey: ['organizations', params.organizationId, 'tagging-rules'] });
|
||||
|
||||
createToast({
|
||||
message: t('tagging-rules.create.success'),
|
||||
type: 'success',
|
||||
});
|
||||
navigate(`/organizations/${params.organizationId}/tagging-rules`);
|
||||
},
|
||||
onError: () => {
|
||||
createToast({
|
||||
message: t('tagging-rules.update.error'),
|
||||
type: 'error',
|
||||
});
|
||||
},
|
||||
}));
|
||||
|
||||
return (
|
||||
<div class="p-6 max-w-screen-md mx-auto mt-4">
|
||||
<div class="border-b mb-6 pb-4">
|
||||
<h1 class="text-xl font-bold">
|
||||
{t('tagging-rules.update.title')}
|
||||
</h1>
|
||||
</div>
|
||||
|
||||
<Show when={query.data?.taggingRule}>
|
||||
{getTaggingRule => (
|
||||
<TaggingRuleForm
|
||||
onSubmit={({ taggingRule }) => updateTaggingRuleMutation.mutate({ taggingRule })}
|
||||
organizationId={params.organizationId}
|
||||
taggingRule={getTaggingRule()}
|
||||
submitButtonText={t('tagging-rules.update.submit')}
|
||||
/>
|
||||
)}
|
||||
</Show>
|
||||
|
||||
</div>
|
||||
);
|
||||
};
|
||||
@@ -0,0 +1,29 @@
import type { LocaleKeys } from '../i18n/locales.types';

export const TAGGING_RULE_OPERATORS = {
  EQUAL: 'equal',
  NOT_EQUAL: 'not_equal',
  CONTAINS: 'contains',
  NOT_CONTAINS: 'not_contains',
  STARTS_WITH: 'starts_with',
  ENDS_WITH: 'ends_with',
} as const;

export const TAGGING_RULE_FIELDS = {
  DOCUMENT_NAME: 'name',
  DOCUMENT_CONTENT: 'content',
} as const;

export const TAGGING_RULE_OPERATORS_LOCALIZATION_KEYS: Record<(typeof TAGGING_RULE_OPERATORS)[keyof typeof TAGGING_RULE_OPERATORS], LocaleKeys> = {
  [TAGGING_RULE_OPERATORS.EQUAL]: 'tagging-rules.operator.equals',
  [TAGGING_RULE_OPERATORS.NOT_EQUAL]: 'tagging-rules.operator.not-equals',
  [TAGGING_RULE_OPERATORS.CONTAINS]: 'tagging-rules.operator.contains',
  [TAGGING_RULE_OPERATORS.NOT_CONTAINS]: 'tagging-rules.operator.not-contains',
  [TAGGING_RULE_OPERATORS.STARTS_WITH]: 'tagging-rules.operator.starts-with',
  [TAGGING_RULE_OPERATORS.ENDS_WITH]: 'tagging-rules.operator.ends-with',
} as const;

export const TAGGING_RULE_FIELDS_LOCALIZATION_KEYS: Record<(typeof TAGGING_RULE_FIELDS)[keyof typeof TAGGING_RULE_FIELDS], LocaleKeys> = {
  [TAGGING_RULE_FIELDS.DOCUMENT_NAME]: 'tagging-rules.field.name',
  [TAGGING_RULE_FIELDS.DOCUMENT_CONTENT]: 'tagging-rules.field.content',
} as const;
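A minimal sketch of resolving display labels from these constants, assuming a `t` function from `useI18n` as in the tagging-rule form component earlier in this changeset; the condition value is invented:

```ts
// Resolving the localized labels for one condition (illustrative values).
const condition = { field: TAGGING_RULE_FIELDS.DOCUMENT_NAME, operator: TAGGING_RULE_OPERATORS.CONTAINS, value: 'invoice' };

const fieldLabel = t(TAGGING_RULE_FIELDS_LOCALIZATION_KEYS[condition.field]);           // t('tagging-rules.field.name')
const operatorLabel = t(TAGGING_RULE_OPERATORS_LOCALIZATION_KEYS[condition.operator]);  // t('tagging-rules.operator.contains')
```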
@@ -0,0 +1,43 @@
import type { TaggingRule, TaggingRuleForCreation } from './tagging-rules.types';
import { apiClient } from '../shared/http/api-client';

export async function fetchTaggingRules({ organizationId }: { organizationId: string }) {
  const { taggingRules } = await apiClient<{ taggingRules: TaggingRule[] }>({
    path: `/api/organizations/${organizationId}/tagging-rules`,
    method: 'GET',
  });

  return { taggingRules };
}

export async function createTaggingRule({ taggingRule, organizationId }: { taggingRule: TaggingRuleForCreation; organizationId: string }) {
  await apiClient({
    path: `/api/organizations/${organizationId}/tagging-rules`,
    method: 'POST',
    body: taggingRule,
  });
}

export async function deleteTaggingRule({ organizationId, taggingRuleId }: { organizationId: string; taggingRuleId: string }) {
  await apiClient({
    path: `/api/organizations/${organizationId}/tagging-rules/${taggingRuleId}`,
    method: 'DELETE',
  });
}

export async function getTaggingRule({ organizationId, taggingRuleId }: { organizationId: string; taggingRuleId: string }) {
  const { taggingRule } = await apiClient<{ taggingRule: TaggingRule }>({
    path: `/api/organizations/${organizationId}/tagging-rules/${taggingRuleId}`,
    method: 'GET',
  });

  return { taggingRule };
}

export async function updateTaggingRule({ organizationId, taggingRuleId, taggingRule }: { organizationId: string; taggingRuleId: string; taggingRule: TaggingRuleForCreation }) {
  await apiClient({
    path: `/api/organizations/${organizationId}/tagging-rules/${taggingRuleId}`,
    method: 'PUT',
    body: taggingRule,
  });
}
@@ -0,0 +1,25 @@
import type { TAGGING_RULE_FIELDS, TAGGING_RULE_OPERATORS } from './tagging-rules.constants';

export type TaggingRuleForCreation = {
  name: string;
  description: string;
  conditions: TaggingRuleCondition[];
  tagIds: string[];
};

export type TaggingRuleCondition = {
  field: (typeof TAGGING_RULE_FIELDS)[keyof typeof TAGGING_RULE_FIELDS];
  operator: (typeof TAGGING_RULE_OPERATORS)[keyof typeof TAGGING_RULE_OPERATORS];
  value: string;
};

export type TaggingRule = {
  id: string;
  name: string;
  description: string;
  conditions: TaggingRuleCondition[];
  actions: { tagId: string }[];
  organizationId: string;
  createdAt: Date;
  updatedAt: Date;
};
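Editor's note on how these shapes compose: a single condition is checked against one document field value. The sketch below is illustrative only (it is not the client or server implementation; the server applies rules through applyTaggingRules, and the migration later in this diff also stores an is_case_sensitive flag per condition). It assumes case-insensitive matching.

// Illustrative sketch, not the actual Papra implementation.
import type { TaggingRuleCondition } from './tagging-rules.types';
import { TAGGING_RULE_OPERATORS } from './tagging-rules.constants';

// Checks one condition against the value of the document field it targets,
// assuming case-insensitive comparison.
function doesConditionMatch({ condition, fieldValue }: { condition: TaggingRuleCondition; fieldValue: string }): boolean {
  const value = fieldValue.toLowerCase();
  const expected = condition.value.toLowerCase();

  switch (condition.operator) {
    case TAGGING_RULE_OPERATORS.EQUAL:
      return value === expected;
    case TAGGING_RULE_OPERATORS.NOT_EQUAL:
      return value !== expected;
    case TAGGING_RULE_OPERATORS.CONTAINS:
      return value.includes(expected);
    case TAGGING_RULE_OPERATORS.NOT_CONTAINS:
      return !value.includes(expected);
    case TAGGING_RULE_OPERATORS.STARTS_WITH:
      return value.startsWith(expected);
    case TAGGING_RULE_OPERATORS.ENDS_WITH:
      return value.endsWith(expected);
    default:
      return false;
  }
}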
@@ -7,13 +7,12 @@ import { Tag as TagComponent } from './tag.component';
|
||||
|
||||
export const DocumentTagPicker: Component<{
|
||||
organizationId: string;
|
||||
tags: Tag[];
|
||||
documentId: string;
|
||||
tagIds: string[];
|
||||
onTagsChange?: (args: { tags: Tag[] }) => void;
|
||||
onTagAdded?: (args: { tag: Tag }) => void;
|
||||
onTagRemoved?: (args: { tag: Tag }) => void;
|
||||
}> = (props) => {
|
||||
const [getSelectedTags, setSelectedTags] = createSignal<Tag[]>(props.tags);
|
||||
const [getSelectedTagIds, setSelectedTagIds] = createSignal<string[]>(props.tagIds);
|
||||
|
||||
const query = createQuery(() => ({
|
||||
queryKey: ['organizations', props.organizationId, 'tags'],
|
||||
@@ -22,6 +21,9 @@ export const DocumentTagPicker: Component<{
|
||||
|
||||
const options = () => query.data?.tags || [];
|
||||
|
||||
const getSelectedTags = () => query.data?.tags.filter(tag => getSelectedTagIds().includes(tag.id)) ?? [];
|
||||
const setSelectedTags = (tags: Tag[]) => setSelectedTagIds(tags.map(tag => tag.id));
|
||||
|
||||
return (
|
||||
<Combobox<Tag>
|
||||
options={options()}
|
||||
|
||||
@@ -104,7 +104,7 @@ const TagForm: Component<{
|
||||
);
|
||||
};
|
||||
|
||||
const CreateTagModal: Component<{
|
||||
export const CreateTagModal: Component<{
|
||||
children: (props: DialogTriggerProps) => JSX.Element;
|
||||
organizationId: string;
|
||||
}> = (props) => {
|
||||
|
||||
@@ -8,7 +8,7 @@ export const IntegrationsLayout: ParentComponent = (props) => {
|
||||
return (
|
||||
<div class="p-6 mt-4 pb-32 mx-auto max-w-5xl">
|
||||
<div class="border-b mb-6">
|
||||
<h1 class="text-xl font-bold ">
|
||||
<h1 class="text-xl font-bold">
|
||||
Integrations
|
||||
</h1>
|
||||
|
||||
|
||||
@@ -1,7 +1,8 @@
|
||||
import type { Organization } from '@/modules/organizations/organizations.types';
|
||||
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { DocumentUploadProvider } from '@/modules/documents/components/document-import-status.component';
|
||||
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { fetchOrganization, fetchOrganizations } from '@/modules/organizations/organizations.services';
|
||||
import { useNavigate, useParams } from '@solidjs/router';
|
||||
import { createQueries, createQuery } from '@tanstack/solid-query';
|
||||
@@ -38,6 +39,11 @@ const OrganizationLayoutSideNav: Component = () => {
|
||||
icon: 'i-tabler-tag',
|
||||
href: `/organizations/${params.organizationId}/tags`,
|
||||
},
|
||||
{
|
||||
label: t('layout.menu.tagging-rules'),
|
||||
icon: 'i-tabler-list-check',
|
||||
href: `/organizations/${params.organizationId}/tagging-rules`,
|
||||
},
|
||||
{
|
||||
label: t('layout.menu.integrations'),
|
||||
icon: 'i-tabler-link',
|
||||
@@ -161,9 +167,11 @@ export const OrganizationLayout: ParentComponent = (props) => {
|
||||
));
|
||||
|
||||
return (
|
||||
<SidenavLayout
|
||||
children={props.children}
|
||||
sideNav={OrganizationLayoutSideNav}
|
||||
/>
|
||||
<DocumentUploadProvider>
|
||||
<SidenavLayout
|
||||
children={props.children}
|
||||
sideNav={OrganizationLayoutSideNav}
|
||||
/>
|
||||
</DocumentUploadProvider>
|
||||
);
|
||||
};
|
||||
|
||||
@@ -3,10 +3,10 @@ import { signOut } from '@/modules/auth/auth.services';
|
||||
import { useCommandPalette } from '@/modules/command-palette/command-palette.provider';
|
||||
import { useConfig } from '@/modules/config/config.provider';
|
||||
|
||||
import { useDocumentUpload } from '@/modules/documents/components/document-import-status.component';
|
||||
import { GlobalDropArea } from '@/modules/documents/components/global-drop-area.component';
|
||||
import { useUploadDocuments } from '@/modules/documents/documents.composables';
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
|
||||
import { useI18n } from '@/modules/i18n/i18n.provider';
|
||||
import { cn } from '@/modules/shared/style/cn';
|
||||
import { useThemeStore } from '@/modules/theme/theme.store';
|
||||
import { Button } from '@/modules/ui/components/button';
|
||||
@@ -182,7 +182,7 @@ export const SidenavLayout: ParentComponent<{
|
||||
const { openCommandPalette } = useCommandPalette();
|
||||
const navigate = useNavigate();
|
||||
|
||||
const { promptImport, uploadDocuments } = useUploadDocuments({ organizationId: params.organizationId });
|
||||
const { promptImport, uploadDocuments } = useDocumentUpload({ organizationId: params.organizationId });
|
||||
|
||||
return (
|
||||
<div class="flex flex-row h-screen min-h-0">
|
||||
|
||||
@@ -19,6 +19,9 @@ import { OrganizationsSettingsPage } from './modules/organizations/pages/organiz
import { OrganizationsPage } from './modules/organizations/pages/organizations.page';
import { ComingSoonPage } from './modules/shared/pages/coming-soon.page';
import { NotFoundPage } from './modules/shared/pages/not-found.page';
import { CreateTaggingRulePage } from './modules/tagging-rules/pages/create-tagging-rule.page';
import { TaggingRulesPage } from './modules/tagging-rules/pages/tagging-rules.page';
import { UpdateTaggingRulePage } from './modules/tagging-rules/pages/update-tagging-rule.page';
import { TagsPage } from './modules/tags/pages/tags.page';
import { IntegrationsLayout } from './modules/ui/layouts/integrations.layout';
import { OrganizationLayout } from './modules/ui/layouts/organization.layout';
@@ -111,6 +114,18 @@ export const routes: RouteDefinition[] = [
        path: '/tags',
        component: TagsPage,
      },
      {
        path: '/tagging-rules',
        component: TaggingRulesPage,
      },
      {
        path: '/tagging-rules/create',
        component: CreateTaggingRulePage,
      },
      {
        path: '/tagging-rules/:taggingRuleId',
        component: UpdateTaggingRulePage,
      },
      {
        path: '/',
        component: IntegrationsLayout,
32 apps/papra-server/migrations/0002_tagging_rules.sql Normal file
@@ -0,0 +1,32 @@
CREATE TABLE `tagging_rule_actions` (
  `id` text PRIMARY KEY NOT NULL,
  `created_at` integer NOT NULL,
  `updated_at` integer NOT NULL,
  `tagging_rule_id` text NOT NULL,
  `tag_id` text NOT NULL,
  FOREIGN KEY (`tagging_rule_id`) REFERENCES `tagging_rules`(`id`) ON UPDATE cascade ON DELETE cascade,
  FOREIGN KEY (`tag_id`) REFERENCES `tags`(`id`) ON UPDATE cascade ON DELETE cascade
);
--> statement-breakpoint
CREATE TABLE `tagging_rule_conditions` (
  `id` text PRIMARY KEY NOT NULL,
  `created_at` integer NOT NULL,
  `updated_at` integer NOT NULL,
  `tagging_rule_id` text NOT NULL,
  `field` text NOT NULL,
  `operator` text NOT NULL,
  `value` text NOT NULL,
  `is_case_sensitive` integer DEFAULT false NOT NULL,
  FOREIGN KEY (`tagging_rule_id`) REFERENCES `tagging_rules`(`id`) ON UPDATE cascade ON DELETE cascade
);
--> statement-breakpoint
CREATE TABLE `tagging_rules` (
  `id` text PRIMARY KEY NOT NULL,
  `created_at` integer NOT NULL,
  `updated_at` integer NOT NULL,
  `organization_id` text NOT NULL,
  `name` text NOT NULL,
  `description` text,
  `enabled` integer DEFAULT true NOT NULL,
  FOREIGN KEY (`organization_id`) REFERENCES `organizations`(`id`) ON UPDATE cascade ON DELETE cascade
);
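Editor's note: a tagging rule belongs to an organization and owns its conditions and actions, each action pointing at a tag. A rough drizzle-style sketch of loading the enabled rules of an organization together with their conditions and actions follows; the query shape is an assumption for illustration, not the code of the repository added in this diff.

// Rough sketch only; not the actual tagging-rules repository implementation.
import { and, eq } from 'drizzle-orm';
import type { Database } from '../app/database/database.types';
import { taggingRuleActionsTable, taggingRuleConditionsTable, taggingRulesTable } from './tagging-rules.tables';

async function getEnabledTaggingRules({ db, organizationId }: { db: Database; organizationId: string }) {
  // Fetch the enabled rules of the organization.
  const rules = await db
    .select()
    .from(taggingRulesTable)
    .where(and(eq(taggingRulesTable.organizationId, organizationId), eq(taggingRulesTable.enabled, true)));

  // One round-trip per rule keeps the sketch simple; a join would avoid the N+1 queries.
  return Promise.all(rules.map(async rule => ({
    ...rule,
    conditions: await db.select().from(taggingRuleConditionsTable).where(eq(taggingRuleConditionsTable.taggingRuleId, rule.id)),
    actions: await db.select().from(taggingRuleActionsTable).where(eq(taggingRuleActionsTable.taggingRuleId, rule.id)),
  })));
}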
1443 apps/papra-server/migrations/meta/0002_snapshot.json Normal file (diff suppressed because it is too large)
@@ -15,6 +15,13 @@
|
||||
"when": 1743508401881,
|
||||
"tag": "0001_documents_fts",
|
||||
"breakpoints": true
|
||||
},
|
||||
{
|
||||
"idx": 2,
|
||||
"version": "6",
|
||||
"when": 1743938048080,
|
||||
"tag": "0002_tagging_rules",
|
||||
"breakpoints": true
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
{
|
||||
"name": "@papra/papra-app-server",
|
||||
"type": "module",
|
||||
"version": "0.1.2",
|
||||
"version": "0.3.0",
|
||||
"packageManager": "pnpm@9.15.4",
|
||||
"description": "Papra app server",
|
||||
"author": "Corentin Thomasset <corentinth@proton.me> (https://corentin.tech)",
|
||||
@@ -37,18 +37,25 @@
|
||||
"@crowlog/logger": "^1.1.0",
|
||||
"@hono/node-server": "^1.13.7",
|
||||
"@libsql/client": "^0.14.0",
|
||||
"@owlrelay/api-sdk": "^0.0.1",
|
||||
"@owlrelay/webhook": "^0.0.2",
|
||||
"@owlrelay/api-sdk": "^0.0.2",
|
||||
"@owlrelay/webhook": "^0.0.3",
|
||||
"@papra/lecture": "^0.0.4",
|
||||
"@paralleldrive/cuid2": "^2.2.2",
|
||||
"better-auth": "catalog:",
|
||||
"c12": "^3.0.2",
|
||||
"chokidar": "^4.0.3",
|
||||
"date-fns": "^4.1.0",
|
||||
"drizzle-kit": "^0.30.1",
|
||||
"drizzle-orm": "^0.38.3",
|
||||
"figue": "^2.2.0",
|
||||
"figue": "^2.2.3",
|
||||
"hono": "^4.6.15",
|
||||
"lodash-es": "^4.17.21",
|
||||
"mime-types": "^3.0.1",
|
||||
"node-cron": "^3.0.3",
|
||||
"p-limit": "^6.2.0",
|
||||
"p-queue": "^8.1.0",
|
||||
"picomatch": "^4.0.2",
|
||||
"posthog-node": "^4.11.1",
|
||||
"resend": "^4.1.2",
|
||||
"stripe": "^17.7.0",
|
||||
"tsx": "^4.19.2",
|
||||
@@ -59,11 +66,14 @@
|
||||
"@crowlog/pretty": "^1.1.1",
|
||||
"@total-typescript/ts-reset": "^0.6.1",
|
||||
"@types/lodash-es": "^4.17.12",
|
||||
"@types/mime-types": "^2.1.4",
|
||||
"@types/node": "^22.10.2",
|
||||
"@types/node-cron": "^3.0.11",
|
||||
"@types/picomatch": "^4.0.0",
|
||||
"@vitest/coverage-v8": "catalog:",
|
||||
"esbuild": "^0.24.2",
|
||||
"eslint": "catalog:",
|
||||
"memfs": "^4.17.0",
|
||||
"typescript": "catalog:",
|
||||
"vitest": "catalog:"
|
||||
}
|
||||
|
||||
@@ -1,18 +1,20 @@
|
||||
/* eslint-disable antfu/no-top-level-await */
|
||||
import process, { env } from 'node:process';
|
||||
import { serve } from '@hono/node-server';
|
||||
import { setupDatabase } from './modules/app/database/database';
|
||||
import { createServer } from './modules/app/server';
|
||||
import { parseConfig } from './modules/config/config';
|
||||
import { createIngestionFolderWatcher } from './modules/ingestion-folders/ingestion-folders.usecases';
|
||||
import { createLogger } from './modules/shared/logger/logger';
|
||||
import { createTaskScheduler } from './modules/tasks/task-scheduler';
|
||||
import { taskDefinitions } from './modules/tasks/tasks.defiitions';
|
||||
|
||||
const logger = createLogger({ namespace: 'app-server' });
|
||||
|
||||
const { config } = parseConfig({ env });
|
||||
const { config } = await parseConfig({ env });
|
||||
const { db, client } = setupDatabase(config.database);
|
||||
|
||||
const { app } = createServer({ config, db });
|
||||
const { app } = await createServer({ config, db });
|
||||
const { taskScheduler } = createTaskScheduler({ config, taskDefinitions, tasksArgs: { db } });
|
||||
|
||||
const server = serve(
|
||||
@@ -23,6 +25,15 @@ const server = serve(
|
||||
({ port }) => logger.info({ port }, 'Server started'),
|
||||
);
|
||||
|
||||
if (config.ingestionFolder.isEnabled) {
|
||||
const { startWatchingIngestionFolders } = createIngestionFolderWatcher({
|
||||
config,
|
||||
db,
|
||||
});
|
||||
|
||||
await startWatchingIngestionFolders();
|
||||
}
|
||||
|
||||
taskScheduler.start();
|
||||
|
||||
process.on('SIGINT', async () => {
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import type { ConfigDefinition } from 'figue';
|
||||
import { z } from 'zod';
|
||||
import { booleanishSchema } from '../../config/config.schemas';
|
||||
|
||||
export const authConfig = {
|
||||
secret: {
|
||||
@@ -10,59 +11,34 @@ export const authConfig = {
|
||||
},
|
||||
isRegistrationEnabled: {
|
||||
doc: 'Whether registration is enabled',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'true',
|
||||
schema: booleanishSchema,
|
||||
default: true,
|
||||
env: 'AUTH_IS_REGISTRATION_ENABLED',
|
||||
},
|
||||
isPasswordResetEnabled: {
|
||||
doc: 'Whether password reset is enabled',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'true',
|
||||
schema: booleanishSchema,
|
||||
default: true,
|
||||
env: 'AUTH_IS_PASSWORD_RESET_ENABLED',
|
||||
},
|
||||
isEmailVerificationRequired: {
|
||||
doc: 'Whether email verification is required',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'false',
|
||||
schema: booleanishSchema,
|
||||
default: false,
|
||||
env: 'AUTH_IS_EMAIL_VERIFICATION_REQUIRED',
|
||||
},
|
||||
showLegalLinksOnAuthPage: {
|
||||
doc: 'Whether to show Papra legal links on the auth pages (terms of service, privacy policy), useless for self-hosted instances',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'false',
|
||||
schema: booleanishSchema,
|
||||
default: false,
|
||||
env: 'AUTH_SHOW_LEGAL_LINKS',
|
||||
},
|
||||
providers: {
|
||||
github: {
|
||||
isEnabled: {
|
||||
doc: 'Whether Github OAuth is enabled',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'false',
|
||||
schema: booleanishSchema,
|
||||
default: false,
|
||||
env: 'AUTH_PROVIDERS_GITHUB_IS_ENABLED',
|
||||
},
|
||||
clientId: {
|
||||
@@ -81,13 +57,8 @@ export const authConfig = {
|
||||
google: {
|
||||
isEnabled: {
|
||||
doc: 'Whether Google OAuth is enabled',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'false',
|
||||
schema: booleanishSchema,
|
||||
default: false,
|
||||
env: 'AUTH_PROVIDERS_GOOGLE_IS_ENABLED',
|
||||
},
|
||||
clientId: {
|
||||
|
||||
48 apps/papra-server/src/modules/app/auth/auth.models.test.ts Normal file
@@ -0,0 +1,48 @@
|
||||
import type { Config } from '../../config/config.types';
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { getTrustedOrigins } from './auth.models';
|
||||
|
||||
describe('auth models', () => {
|
||||
describe('getTrustedOrigins', () => {
|
||||
test('by default the trusted origins are only the baseUrl', () => {
|
||||
const config = {
|
||||
client: {
|
||||
baseUrl: 'http://localhost:3000',
|
||||
},
|
||||
server: {
|
||||
trustedOrigins: [] as string[],
|
||||
},
|
||||
} as Config;
|
||||
|
||||
const { trustedOrigins } = getTrustedOrigins({ config });
|
||||
|
||||
expect(trustedOrigins).to.deep.equal(['http://localhost:3000']);
|
||||
});
|
||||
|
||||
test('if the user defined a list of trusted origins, it returns the client baseUrl and the trustedOrigins deduplicated', () => {
|
||||
const config = {
|
||||
client: {
|
||||
baseUrl: 'http://localhost:3000',
|
||||
},
|
||||
server: {
|
||||
trustedOrigins: [
|
||||
'http://localhost:3000',
|
||||
'http://localhost:3001',
|
||||
'http://localhost:3001',
|
||||
'http://localhost:3002',
|
||||
],
|
||||
},
|
||||
} as Config;
|
||||
|
||||
const { trustedOrigins } = getTrustedOrigins({ config });
|
||||
|
||||
expect(
|
||||
trustedOrigins,
|
||||
).to.deep.equal([
|
||||
'http://localhost:3000',
|
||||
'http://localhost:3001',
|
||||
'http://localhost:3002',
|
||||
]);
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -1,4 +1,6 @@
|
||||
import type { Config } from '../../config/config.types';
|
||||
import type { Context } from '../server.types';
|
||||
import { uniq } from 'lodash-es';
|
||||
import { createError } from '../../shared/errors/errors';
|
||||
|
||||
export function getUser({ context }: { context: Context }) {
|
||||
@@ -24,3 +26,12 @@ export function getSession({ context }: { context: Context }) {
|
||||
|
||||
return { session };
|
||||
}
|
||||
|
||||
export function getTrustedOrigins({ config }: { config: Config }) {
|
||||
const { baseUrl } = config.client;
|
||||
const { trustedOrigins } = config.server;
|
||||
|
||||
return {
|
||||
trustedOrigins: uniq([baseUrl, ...trustedOrigins]),
|
||||
};
|
||||
}
|
||||
|
||||
@@ -1,23 +1,37 @@
|
||||
import type { Config } from '../../config/config.types';
|
||||
import type { TrackingServices } from '../../tracking/tracking.services';
|
||||
import type { Database } from '../database/database.types';
|
||||
import type { AuthEmailsServices } from './auth.emails.services';
|
||||
import { betterAuth } from 'better-auth';
|
||||
import { drizzleAdapter } from 'better-auth/adapters/drizzle';
|
||||
import { createLogger } from '../../shared/logger/logger';
|
||||
import { usersTable } from '../../users/users.table';
|
||||
import { getTrustedOrigins } from './auth.models';
|
||||
import { accountsTable, sessionsTable, verificationsTable } from './auth.tables';
|
||||
|
||||
export type Auth = ReturnType<typeof getAuth>['auth'];
|
||||
|
||||
const logger = createLogger({ namespace: 'auth' });
|
||||
|
||||
export function getAuth({ db, config, authEmailsServices }: { db: Database; config: Config; authEmailsServices: AuthEmailsServices }) {
|
||||
export function getAuth({
|
||||
db,
|
||||
config,
|
||||
authEmailsServices,
|
||||
trackingServices,
|
||||
}: {
|
||||
db: Database;
|
||||
config: Config;
|
||||
authEmailsServices: AuthEmailsServices;
|
||||
trackingServices: TrackingServices;
|
||||
}) {
|
||||
const { secret } = config.auth;
|
||||
|
||||
const { trustedOrigins } = getTrustedOrigins({ config });
|
||||
|
||||
const auth = betterAuth({
|
||||
secret,
|
||||
baseURL: config.server.baseUrl,
|
||||
trustedOrigins: [config.client.baseUrl],
|
||||
trustedOrigins,
|
||||
logger: {
|
||||
disabled: false,
|
||||
log: (baseLevel, message) => {
|
||||
@@ -56,6 +70,17 @@ export function getAuth({ db, config, authEmailsServices }: { db: Database; conf
|
||||
},
|
||||
),
|
||||
|
||||
databaseHooks: {
|
||||
user: {
|
||||
create: {
|
||||
after: async (user) => {
|
||||
logger.info({ userId: user.id }, 'User signed up');
|
||||
trackingServices.captureUserEvent({ userId: user.id, event: 'User signed up' });
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
|
||||
advanced: {
|
||||
// Drizzle tables handle the id generation
|
||||
generateId: false,
|
||||
|
||||
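Editor's note: the databaseHooks block above is the only place in this hunk that calls the tracking services (captureUserEvent here, shutdown later in server.ts). For wiring up tests or a tracking-free deployment, a minimal no-op implementation satisfying those two calls could look like the sketch below; this is illustrative only, not the real module in modules/tracking/tracking.services (which the tests exercise through createDummyTrackingServices).

// Illustrative no-op sketch only; not the actual tracking module.
function createNoopTrackingServices() {
  return {
    // Matches the call shape used in the auth database hook above.
    captureUserEvent: (_args: { userId: string; event: string }) => {
      // Intentionally does nothing; the real service appears to be backed by posthog-node.
    },
    // Matches the shutdown call awaited in createServer's shutdown handler.
    shutdown: async () => {
      // Nothing to flush for the no-op variant.
    },
  };
}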
@@ -3,7 +3,7 @@ import { z } from 'zod';
|
||||
|
||||
export const databaseConfig = {
|
||||
url: {
|
||||
doc: 'The URL of the database',
|
||||
doc: 'The URL of the database (default as "file:./app-data/db/db.sqlite" when using docker)',
|
||||
schema: z.string().url(),
|
||||
default: 'file:./db.sqlite',
|
||||
env: 'DATABASE_URL',
|
||||
|
||||
@@ -3,6 +3,7 @@ import { documentsTable } from '../../documents/documents.table';
|
||||
import { intakeEmailsTable } from '../../intake-emails/intake-emails.tables';
|
||||
import { organizationMembersTable, organizationsTable } from '../../organizations/organizations.table';
|
||||
import { organizationSubscriptionsTable } from '../../subscriptions/subscriptions.tables';
|
||||
import { taggingRuleActionsTable, taggingRuleConditionsTable, taggingRulesTable } from '../../tagging-rules/tagging-rules.tables';
|
||||
import { documentsTagsTable, tagsTable } from '../../tags/tags.table';
|
||||
import { usersTable } from '../../users/users.table';
|
||||
import { setupDatabase } from './database';
|
||||
@@ -31,6 +32,9 @@ const seedTables = {
|
||||
documentsTags: documentsTagsTable,
|
||||
intakeEmails: intakeEmailsTable,
|
||||
organizationSubscriptions: organizationSubscriptionsTable,
|
||||
taggingRules: taggingRulesTable,
|
||||
taggingRuleConditions: taggingRuleConditionsTable,
|
||||
taggingRuleActions: taggingRuleActionsTable,
|
||||
} as const;
|
||||
|
||||
type SeedTablesRows = {
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import type { Database } from '../../database/database.types';
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { overrideConfig } from '../../../config/config.test-utils';
|
||||
import { createInMemoryDatabase } from '../../database/database.test-utils';
|
||||
import { createServer } from '../../server';
|
||||
|
||||
@@ -8,7 +9,7 @@ describe('health check routes e2e', () => {
|
||||
describe('the /api/health is a publicly accessible route that provides health information about the server', () => {
|
||||
test('when the database is healthy, the /api/health returns 200', async () => {
|
||||
const { db } = await createInMemoryDatabase();
|
||||
const { app } = createServer({ db });
|
||||
const { app } = await createServer({ db, config: overrideConfig() });
|
||||
|
||||
const response = await app.request('/api/health');
|
||||
|
||||
@@ -27,7 +28,7 @@ describe('health check routes e2e', () => {
|
||||
},
|
||||
} as unknown as Database;
|
||||
|
||||
const { app } = createServer({ db });
|
||||
const { app } = await createServer({ db, config: overrideConfig() });
|
||||
|
||||
const response = await app.request('/api/health');
|
||||
|
||||
|
||||
@@ -1,11 +1,12 @@
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { overrideConfig } from '../../../config/config.test-utils';
|
||||
import { createInMemoryDatabase } from '../../database/database.test-utils';
|
||||
import { createServer } from '../../server';
|
||||
|
||||
describe('ping routes e2e', () => {
|
||||
test('the /api/ping is a publicly accessible route that always returns a 200 with a status ok', async () => {
|
||||
const { db } = await createInMemoryDatabase();
|
||||
const { app } = createServer({ db });
|
||||
const { app } = await createServer({ db, config: overrideConfig() });
|
||||
|
||||
const response = await app.request('/api/ping');
|
||||
|
||||
|
||||
@@ -4,6 +4,7 @@ import { registerDocumentsPrivateRoutes } from '../documents/documents.routes';
|
||||
import { registerIntakeEmailsPrivateRoutes, registerIntakeEmailsPublicRoutes } from '../intake-emails/intake-emails.routes';
|
||||
import { registerOrganizationsPrivateRoutes } from '../organizations/organizations.routes';
|
||||
import { registerSubscriptionsPrivateRoutes, registerSubscriptionsPublicRoutes } from '../subscriptions/subscriptions.routes';
|
||||
import { registerTaggingRulesRoutes } from '../tagging-rules/tagging-rules.routes';
|
||||
import { registerTagsRoutes } from '../tags/tags.routes';
|
||||
import { registerUsersPrivateRoutes } from '../users/users.routes';
|
||||
import { createUnauthorizedError } from './auth/auth.errors';
|
||||
@@ -42,4 +43,5 @@ function registerPrivateRoutes(context: RouteDefinitionContext) {
|
||||
registerTagsRoutes(context);
|
||||
registerIntakeEmailsPrivateRoutes(context);
|
||||
registerSubscriptionsPrivateRoutes(context);
|
||||
registerTaggingRulesRoutes(context);
|
||||
}
|
||||
|
||||
@@ -5,6 +5,7 @@ import { parseConfig } from '../config/config';
|
||||
import { createEmailsServices } from '../emails/emails.services';
|
||||
import { createLoggerMiddleware } from '../shared/logger/logger.middleware';
|
||||
import { createSubscriptionsServices } from '../subscriptions/subscriptions.services';
|
||||
import { createTrackingServices } from '../tracking/tracking.services';
|
||||
import { createAuthEmailsServices } from './auth/auth.emails.services';
|
||||
import { getAuth } from './auth/auth.services';
|
||||
import { setupDatabase } from './database/database';
|
||||
@@ -14,11 +15,12 @@ import { createTimeoutMiddleware } from './middlewares/timeout.middleware';
|
||||
import { registerRoutes } from './server.routes';
|
||||
import { registerStaticAssetsRoutes } from './static-assets/static-assets.routes';
|
||||
|
||||
function createGlobalDependencies(partialDeps: Partial<GlobalDependencies>): GlobalDependencies {
|
||||
const config = partialDeps.config ?? parseConfig().config;
|
||||
async function createGlobalDependencies(partialDeps: Partial<GlobalDependencies>): Promise<GlobalDependencies> {
|
||||
const config = partialDeps.config ?? (await parseConfig()).config;
|
||||
const db = partialDeps.db ?? setupDatabase(config.database).db;
|
||||
const emailsServices = createEmailsServices({ config });
|
||||
const auth = partialDeps.auth ?? getAuth({ db, config, authEmailsServices: createAuthEmailsServices({ emailsServices }) }).auth;
|
||||
const trackingServices = createTrackingServices({ config });
|
||||
const auth = partialDeps.auth ?? getAuth({ db, config, authEmailsServices: createAuthEmailsServices({ emailsServices }), trackingServices }).auth;
|
||||
const subscriptionsServices = createSubscriptionsServices({ config });
|
||||
|
||||
return {
|
||||
@@ -27,12 +29,13 @@ function createGlobalDependencies(partialDeps: Partial<GlobalDependencies>): Glo
|
||||
auth,
|
||||
emailsServices,
|
||||
subscriptionsServices,
|
||||
trackingServices,
|
||||
};
|
||||
}
|
||||
|
||||
export function createServer(initialDeps: Partial<GlobalDependencies>) {
|
||||
const dependencies = createGlobalDependencies(initialDeps);
|
||||
const { config } = dependencies;
|
||||
export async function createServer(initialDeps: Partial<GlobalDependencies>) {
|
||||
const dependencies = await createGlobalDependencies(initialDeps);
|
||||
const { config, trackingServices } = dependencies;
|
||||
|
||||
const app = new Hono<ServerInstanceGenerics>({ strict: true });
|
||||
|
||||
@@ -48,5 +51,8 @@ export function createServer(initialDeps: Partial<GlobalDependencies>) {
|
||||
|
||||
return {
|
||||
app,
|
||||
shutdown: async () => {
|
||||
await trackingServices.shutdown();
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
@@ -2,6 +2,7 @@ import type { Context as BaseContext, Hono } from 'hono';
|
||||
import type { Config } from '../config/config.types';
|
||||
import type { EmailsServices } from '../emails/emails.services';
|
||||
import type { SubscriptionsServices } from '../subscriptions/subscriptions.services';
|
||||
import type { TrackingServices } from '../tracking/tracking.services';
|
||||
import type { Auth } from './auth/auth.services';
|
||||
import type { Database } from './database/database.types';
|
||||
|
||||
@@ -22,6 +23,7 @@ export type GlobalDependencies = {
|
||||
auth: Auth;
|
||||
emailsServices: EmailsServices;
|
||||
subscriptionsServices: SubscriptionsServices;
|
||||
trackingServices: TrackingServices;
|
||||
};
|
||||
|
||||
export type RouteDefinitionContext = { app: ServerInstance } & GlobalDependencies;
|
||||
|
||||
48 apps/papra-server/src/modules/config/config.schemas.test.ts Normal file
@@ -0,0 +1,48 @@
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { booleanishSchema, trustedOriginsSchema } from './config.schemas';
|
||||
|
||||
describe('config schemas', () => {
|
||||
describe('booleanishSchema', () => {
|
||||
test('a zod schema that validates and coerces a string to a boolean, used in the config where we accept env variables and pojo values', () => {
|
||||
expect(booleanishSchema.parse(true)).toBe(true);
|
||||
expect(booleanishSchema.parse('true')).toBe(true);
|
||||
expect(booleanishSchema.parse('TRUE')).toBe(true);
|
||||
expect(booleanishSchema.parse('True')).toBe(true);
|
||||
expect(booleanishSchema.parse(' True ')).toBe(true);
|
||||
expect(booleanishSchema.parse('1')).toBe(true);
|
||||
expect(booleanishSchema.parse(1)).toBe(true);
|
||||
|
||||
expect(booleanishSchema.parse('false')).toBe(false);
|
||||
expect(booleanishSchema.parse('FALSE')).toBe(false);
|
||||
expect(booleanishSchema.parse('False')).toBe(false);
|
||||
expect(booleanishSchema.parse(' False ')).toBe(false);
|
||||
expect(booleanishSchema.parse(false)).toBe(false);
|
||||
expect(booleanishSchema.parse('foo')).toBe(false);
|
||||
expect(booleanishSchema.parse('0')).toBe(false);
|
||||
expect(booleanishSchema.parse(-1)).toBe(false);
|
||||
expect(booleanishSchema.parse(0)).toBe(false);
|
||||
expect(booleanishSchema.parse(2)).toBe(false);
|
||||
});
|
||||
});
|
||||
|
||||
describe('trustedOriginsSchema', () => {
|
||||
test('this schema validates and coerces a comma separated string to an array of urls', () => {
|
||||
expect(trustedOriginsSchema.parse('http://localhost:3000')).toEqual(['http://localhost:3000']);
|
||||
expect(trustedOriginsSchema.parse('http://localhost:3000,http://localhost:3001')).toEqual([
|
||||
'http://localhost:3000',
|
||||
'http://localhost:3001',
|
||||
]);
|
||||
expect(trustedOriginsSchema.parse([
|
||||
'http://localhost:3000',
|
||||
'http://localhost:3001',
|
||||
])).toEqual([
|
||||
'http://localhost:3000',
|
||||
'http://localhost:3001',
|
||||
]);
|
||||
});
|
||||
|
||||
test('otherwise it throws an error', () => {
|
||||
expect(() => trustedOriginsSchema.parse('non-url')).toThrow();
|
||||
});
|
||||
});
|
||||
});
|
||||
14 apps/papra-server/src/modules/config/config.schemas.ts Normal file
@@ -0,0 +1,14 @@
import { z } from 'zod';

export const booleanishSchema = z
  .coerce
  .string()
  .trim()
  .toLowerCase()
  .transform(x => ['true', '1'].includes(x))
  .pipe(z.boolean());

export const trustedOriginsSchema = z.union([
  z.array(z.string().url()),
  z.string().transform(value => value.split(',')).pipe(z.array(z.string().url())),
]);
@@ -1,12 +1,12 @@
|
||||
import type { DeepPartial } from '@corentinth/chisels';
|
||||
import type { Config } from './config.types';
|
||||
import { merge } from 'lodash-es';
|
||||
import { parseConfig } from './config';
|
||||
import { loadDryConfig } from './config';
|
||||
|
||||
export { overrideConfig };
|
||||
|
||||
function overrideConfig(config: DeepPartial<Config>) {
|
||||
const { config: defaultConfig } = parseConfig();
|
||||
function overrideConfig(config: DeepPartial<Config> | undefined = {}) {
|
||||
const { config: defaultConfig } = loadDryConfig();
|
||||
|
||||
return merge({}, defaultConfig, config);
|
||||
}
|
||||
|
||||
@@ -1,19 +1,24 @@
|
||||
import type { ConfigDefinition } from 'figue';
|
||||
import process from 'node:process';
|
||||
import { safelySync } from '@corentinth/chisels';
|
||||
import { loadConfig } from 'c12';
|
||||
import { defineConfig } from 'figue';
|
||||
import { memoize } from 'lodash-es';
|
||||
import { z } from 'zod';
|
||||
import { authConfig } from '../app/auth/auth.config';
|
||||
import { databaseConfig } from '../app/database/database.config';
|
||||
import { documentsConfig } from '../documents/documents.config';
|
||||
import { documentStorageConfig } from '../documents/storage/document-storage.config';
|
||||
import { emailsConfig } from '../emails/emails.config';
|
||||
import { ingestionFolderConfig } from '../ingestion-folders/ingestion-folders.config';
|
||||
import { intakeEmailsConfig } from '../intake-emails/intake-emails.config';
|
||||
import { organizationsConfig } from '../organizations/organizations.config';
|
||||
import { organizationPlansConfig } from '../plans/plans.config';
|
||||
import { createLogger } from '../shared/logger/logger';
|
||||
import { subscriptionsConfig } from '../subscriptions/subscriptions.config';
|
||||
import { tasksConfig } from '../tasks/tasks.config';
|
||||
import { trackingConfig } from '../tracking/tracking.config';
|
||||
import { booleanishSchema, trustedOriginsSchema } from './config.schemas';
|
||||
|
||||
export const configDefinition = {
|
||||
env: {
|
||||
@@ -22,6 +27,20 @@ export const configDefinition = {
|
||||
default: 'development',
|
||||
env: 'NODE_ENV',
|
||||
},
|
||||
client: {
|
||||
baseUrl: {
|
||||
doc: 'The URL of the client',
|
||||
schema: z.string().url(),
|
||||
default: 'http://localhost:3000',
|
||||
env: 'CLIENT_BASE_URL',
|
||||
},
|
||||
oauthRedirectUrl: {
|
||||
doc: 'The URL to redirect to after OAuth',
|
||||
schema: z.string().url(),
|
||||
default: 'http://localhost:3000/confirm',
|
||||
env: 'CLIENT_OAUTH_REDIRECT_URL',
|
||||
},
|
||||
},
|
||||
server: {
|
||||
baseUrl: {
|
||||
doc: 'The base URL of the server',
|
||||
@@ -29,6 +48,12 @@
      default: 'http://localhost:1221',
      env: 'SERVER_BASE_URL',
    },
    trustedOrigins: {
      doc: 'A comma separated list of origins that are trusted to make requests to the server. The client baseUrl (CLIENT_BASE_URL) is automatically added by default, no need to add it to the list.',
      schema: trustedOriginsSchema,
      default: [],
      env: 'TRUSTED_ORIGINS',
    },
    port: {
      doc: 'The port to listen on when using node server',
      schema: z.coerce.number().min(1024).max(65535),
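Editor's note on how this option is meant to be consumed (a hypothetical usage sketch based on the schemas above, not an excerpt from the repository): a comma separated TRUSTED_ORIGINS value is parsed into an array of URLs on config.server.trustedOrigins, which getTrustedOrigins later merges and deduplicates with the client base URL.

// Hypothetical usage sketch, assuming the config definition above.
import { parseConfig } from './modules/config/config';

const { config } = await parseConfig({
  env: {
    CLIENT_BASE_URL: 'https://papra.example.com',
    TRUSTED_ORIGINS: 'https://papra.example.com,https://alt.example.com',
  },
});

// config.server.trustedOrigins -> ['https://papra.example.com', 'https://alt.example.com']
// getTrustedOrigins({ config }) then deduplicates it with config.client.baseUrl.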
@@ -51,52 +76,41 @@ export const configDefinition = {
|
||||
env: 'SERVER_CORS_ORIGINS',
|
||||
},
|
||||
servePublicDir: {
|
||||
doc: 'Whether to serve the public directory',
|
||||
schema: z
|
||||
.string()
|
||||
.trim()
|
||||
.toLowerCase()
|
||||
.transform(x => x === 'true')
|
||||
.pipe(z.boolean()),
|
||||
default: 'false',
|
||||
doc: 'Whether to serve the public directory (default as true when using docker)',
|
||||
schema: booleanishSchema,
|
||||
default: false,
|
||||
env: 'SERVER_SERVE_PUBLIC_DIR',
|
||||
},
|
||||
},
|
||||
client: {
|
||||
baseUrl: {
|
||||
doc: 'The URL of the client',
|
||||
schema: z.string().url(),
|
||||
default: 'http://localhost:3000',
|
||||
env: 'CLIENT_BASE_URL',
|
||||
},
|
||||
oauthRedirectUrl: {
|
||||
doc: 'The URL to redirect to after OAuth',
|
||||
schema: z.string().url(),
|
||||
default: 'http://localhost:3000/confirm',
|
||||
env: 'CLIENT_OAUTH_REDIRECT_URL',
|
||||
},
|
||||
},
|
||||
|
||||
database: databaseConfig,
|
||||
documents: documentsConfig,
|
||||
documentsStorage: documentStorageConfig,
|
||||
auth: authConfig,
|
||||
ingestionFolder: ingestionFolderConfig,
|
||||
tasks: tasksConfig,
|
||||
intakeEmails: intakeEmailsConfig,
|
||||
emails: emailsConfig,
|
||||
organizations: organizationsConfig,
|
||||
organizationPlans: organizationPlansConfig,
|
||||
subscriptions: subscriptionsConfig,
|
||||
tracking: trackingConfig,
|
||||
} as const satisfies ConfigDefinition;
|
||||
|
||||
const logger = createLogger({ namespace: 'config' });
|
||||
|
||||
export function parseConfig({ env = process.env }: { env?: Record<string, string | undefined> } = {}) {
|
||||
const [configResult, configError] = safelySync(() => defineConfig(
|
||||
configDefinition,
|
||||
{
|
||||
envSource: env,
|
||||
},
|
||||
));
|
||||
export async function parseConfig({ env = process.env }: { env?: Record<string, string | undefined> } = {}) {
|
||||
const { config: configFromFile } = await loadConfig({
|
||||
name: 'papra',
|
||||
rcFile: false,
|
||||
globalRc: false,
|
||||
dotenv: false,
|
||||
packageJson: false,
|
||||
envName: false,
|
||||
cwd: env.PAPRA_CONFIG_DIR ?? process.cwd(),
|
||||
});
|
||||
|
||||
const [configResult, configError] = safelySync(() => defineConfig(configDefinition, { envSource: env, defaults: configFromFile }));
|
||||
|
||||
if (configError) {
|
||||
logger.error({ error: configError }, `Invalid config: ${configError.message}`);
|
||||
@@ -107,3 +121,11 @@ export function parseConfig({ env = process.env }: { env?: Record<string, string
|
||||
|
||||
return { config };
|
||||
}
|
||||
|
||||
// Permit to load the default config, regardless of environment variables, and config files
|
||||
// memoized to avoid re-parsing the config definition
|
||||
export const loadDryConfig = memoize(() => {
|
||||
const { config } = defineConfig(configDefinition);
|
||||
|
||||
return { config };
|
||||
});
|
||||
|
||||
@@ -1,3 +1,3 @@
|
||||
import type { parseConfig } from './config';
|
||||
|
||||
export type Config = ReturnType<typeof parseConfig>['config'];
|
||||
export type Config = Awaited<ReturnType<typeof parseConfig>>['config'];
|
||||
|
||||
@@ -12,8 +12,15 @@ export const createDocumentIsNotDeletedError = createErrorFactory({
|
||||
statusCode: 400,
|
||||
});
|
||||
|
||||
export const DOCUMENT_ALREADY_EXISTS_ERROR_CODE = 'document.already_exists' as const;
|
||||
export const createDocumentAlreadyExistsError = createErrorFactory({
|
||||
message: 'Document already exists.',
|
||||
code: 'document.already_exists',
|
||||
code: DOCUMENT_ALREADY_EXISTS_ERROR_CODE,
|
||||
statusCode: 409,
|
||||
});
|
||||
|
||||
export const createDocumentNotDeletedError = createErrorFactory({
|
||||
message: 'Document is not deleted, cannot delete.',
|
||||
code: 'document.not_deleted',
|
||||
statusCode: 400,
|
||||
});
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { buildOriginalDocumentKey, joinStorageKeyParts } from './documents.models';
|
||||
import { buildOriginalDocumentKey, isDocumentSizeLimitEnabled, joinStorageKeyParts } from './documents.models';
|
||||
|
||||
describe('documents models', () => {
|
||||
describe('joinStorageKeyParts', () => {
|
||||
@@ -46,4 +46,13 @@ describe('documents models', () => {
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('isDocumentSizeLimitEnabled', () => {
|
||||
test('the user can disable the document size limit by setting the maxUploadSize to 0', () => {
|
||||
expect(isDocumentSizeLimitEnabled({ maxUploadSize: 0 })).to.eql(false);
|
||||
|
||||
expect(isDocumentSizeLimitEnabled({ maxUploadSize: 100 })).to.eql(true);
|
||||
expect(isDocumentSizeLimitEnabled({ maxUploadSize: 42 })).to.eql(true);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
@@ -19,3 +19,7 @@ export function buildOriginalDocumentKey({ documentId, organizationId, fileName
|
||||
export function generateDocumentId() {
|
||||
return generateId({ prefix: 'doc' });
|
||||
}
|
||||
|
||||
export function isDocumentSizeLimitEnabled({ maxUploadSize }: { maxUploadSize: number }) {
|
||||
return maxUploadSize !== 0;
|
||||
}
|
||||
|
||||
@@ -28,6 +28,7 @@ export function createDocumentsRepository({ db }: { db: Database }) {
|
||||
getExpiredDeletedDocuments,
|
||||
getOrganizationStats,
|
||||
getOrganizationDocumentBySha256Hash,
|
||||
getAllOrganizationTrashDocumentIds,
|
||||
},
|
||||
{ db },
|
||||
);
|
||||
@@ -225,20 +226,25 @@ async function softDeleteDocument({ documentId, organizationId, userId, db, now
|
||||
);
|
||||
}
|
||||
|
||||
async function restoreDocument({ documentId, organizationId, db }: { documentId: string;organizationId: string; db: Database }) {
|
||||
await db
|
||||
async function restoreDocument({ documentId, organizationId, name, userId, db }: { documentId: string;organizationId: string; name?: string; userId?: string; db: Database }) {
|
||||
const [document] = await db
|
||||
.update(documentsTable)
|
||||
.set({
|
||||
isDeleted: false,
|
||||
deletedBy: null,
|
||||
deletedAt: null,
|
||||
...(name ? { name, originalName: name } : {}),
|
||||
...(userId ? { createdBy: userId } : {}),
|
||||
})
|
||||
.where(
|
||||
and(
|
||||
eq(documentsTable.id, documentId),
|
||||
eq(documentsTable.organizationId, organizationId),
|
||||
),
|
||||
);
|
||||
)
|
||||
.returning();
|
||||
|
||||
return { document };
|
||||
}
|
||||
|
||||
async function hardDeleteDocument({ documentId, db }: { documentId: string; db: Database }) {
|
||||
@@ -303,3 +309,18 @@ async function getOrganizationStats({ organizationId, db }: { organizationId: st
|
||||
documentsSize: Number(documentsSize ?? 0),
|
||||
};
|
||||
}
|
||||
|
||||
async function getAllOrganizationTrashDocumentIds({ organizationId, db }: { organizationId: string; db: Database }) {
|
||||
const documents = await db.select({
|
||||
id: documentsTable.id,
|
||||
}).from(documentsTable).where(
|
||||
and(
|
||||
eq(documentsTable.organizationId, organizationId),
|
||||
eq(documentsTable.isDeleted, true),
|
||||
),
|
||||
);
|
||||
|
||||
return {
|
||||
documentIds: documents.map(document => document.id),
|
||||
};
|
||||
}
|
||||
|
||||
@@ -9,9 +9,12 @@ import { createPlansRepository } from '../plans/plans.repository';
|
||||
import { createError } from '../shared/errors/errors';
|
||||
import { validateFormData, validateParams, validateQuery } from '../shared/validation/validation';
|
||||
import { createSubscriptionsRepository } from '../subscriptions/subscriptions.repository';
|
||||
import { createTaggingRulesRepository } from '../tagging-rules/tagging-rules.repository';
|
||||
import { createTagsRepository } from '../tags/tags.repository';
|
||||
import { createDocumentIsNotDeletedError } from './documents.errors';
|
||||
import { isDocumentSizeLimitEnabled } from './documents.models';
|
||||
import { createDocumentsRepository } from './documents.repository';
|
||||
import { createDocument, ensureDocumentExists, getDocumentOrThrow } from './documents.usecases';
|
||||
import { createDocument, deleteAllTrashDocuments, deleteTrashDocument, ensureDocumentExists, getDocumentOrThrow } from './documents.usecases';
|
||||
import { createDocumentStorageService } from './storage/documents.storage.services';
|
||||
|
||||
export function registerDocumentsPrivateRoutes(context: RouteDefinitionContext) {
|
||||
@@ -22,16 +25,22 @@ export function registerDocumentsPrivateRoutes(context: RouteDefinitionContext)
|
||||
setupGetDeletedDocumentsRoute(context);
|
||||
setupGetOrganizationDocumentsStatsRoute(context);
|
||||
setupGetDocumentRoute(context);
|
||||
setupDeleteTrashDocumentRoute(context);
|
||||
setupDeleteAllTrashDocumentsRoute(context);
|
||||
setupDeleteDocumentRoute(context);
|
||||
setupGetDocumentFileRoute(context);
|
||||
}
|
||||
|
||||
function setupCreateDocumentRoute({ app, config, db }: RouteDefinitionContext) {
|
||||
function setupCreateDocumentRoute({ app, config, db, trackingServices }: RouteDefinitionContext) {
|
||||
app.post(
|
||||
'/api/organizations/:organizationId/documents',
|
||||
(context, next) => {
|
||||
const { maxUploadSize } = config.documentsStorage;
|
||||
|
||||
if (!isDocumentSizeLimitEnabled({ maxUploadSize })) {
|
||||
return next();
|
||||
}
|
||||
|
||||
const middleware = bodyLimit({
|
||||
maxSize: maxUploadSize,
|
||||
onError: () => {
|
||||
@@ -78,6 +87,8 @@ function setupCreateDocumentRoute({ app, config, db }: RouteDefinitionContext) {
|
||||
const documentsStorageService = await createDocumentStorageService({ config });
|
||||
const plansRepository = createPlansRepository({ config });
|
||||
const subscriptionsRepository = createSubscriptionsRepository({ db });
|
||||
const taggingRulesRepository = createTaggingRulesRepository({ db });
|
||||
const tagsRepository = createTagsRepository({ db });
|
||||
|
||||
const { document } = await createDocument({
|
||||
file,
|
||||
@@ -87,6 +98,9 @@ function setupCreateDocumentRoute({ app, config, db }: RouteDefinitionContext) {
|
||||
documentsStorageService,
|
||||
plansRepository,
|
||||
subscriptionsRepository,
|
||||
trackingServices,
|
||||
taggingRulesRepository,
|
||||
tagsRepository,
|
||||
});
|
||||
|
||||
return context.json({
|
||||
@@ -357,3 +371,54 @@ function setupGetOrganizationDocumentsStatsRoute({ app, db }: RouteDefinitionCon
|
||||
},
|
||||
);
|
||||
}
|
||||
|
||||
function setupDeleteTrashDocumentRoute({ app, config, db }: RouteDefinitionContext) {
|
||||
app.delete(
|
||||
'/api/organizations/:organizationId/documents/trash/:documentId',
|
||||
validateParams(z.object({
|
||||
organizationId: z.string().regex(organizationIdRegex),
|
||||
documentId: z.string(),
|
||||
})),
|
||||
async (context) => {
|
||||
const { userId } = getUser({ context });
|
||||
|
||||
const { organizationId, documentId } = context.req.valid('param');
|
||||
|
||||
const documentsRepository = createDocumentsRepository({ db });
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
const documentsStorageService = await createDocumentStorageService({ config });
|
||||
|
||||
await ensureUserIsInOrganization({ userId, organizationId, organizationsRepository });
|
||||
|
||||
await deleteTrashDocument({ documentId, organizationId, documentsRepository, documentsStorageService });
|
||||
|
||||
return context.json({
|
||||
success: true,
|
||||
});
|
||||
},
|
||||
);
|
||||
}
|
||||
|
||||
function setupDeleteAllTrashDocumentsRoute({ app, config, db }: RouteDefinitionContext) {
|
||||
app.delete(
|
||||
'/api/organizations/:organizationId/documents/trash',
|
||||
validateParams(z.object({
|
||||
organizationId: z.string().regex(organizationIdRegex),
|
||||
})),
|
||||
async (context) => {
|
||||
const { userId } = getUser({ context });
|
||||
|
||||
const { organizationId } = context.req.valid('param');
|
||||
|
||||
const documentsRepository = createDocumentsRepository({ db });
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
const documentsStorageService = await createDocumentStorageService({ config });
|
||||
|
||||
await ensureUserIsInOrganization({ userId, organizationId, organizationsRepository });
|
||||
|
||||
await deleteAllTrashDocuments({ organizationId, documentsRepository, documentsStorageService });
|
||||
|
||||
return context.body(null, 204);
|
||||
},
|
||||
);
|
||||
}
|
||||
|
||||
@@ -2,3 +2,6 @@ import type { Expand } from '@corentinth/chisels';
|
||||
import type { documentsTable } from './documents.table';
|
||||
|
||||
export type DbInsertableDocument = Expand<typeof documentsTable.$inferInsert>;
|
||||
export type DbSelectableDocument = Expand<typeof documentsTable.$inferSelect>;
|
||||
|
||||
export type Document = DbSelectableDocument;
|
||||
|
||||
@@ -5,6 +5,10 @@ import { ORGANIZATION_ROLES } from '../organizations/organizations.constants';
|
||||
import { createPlansRepository } from '../plans/plans.repository';
|
||||
import { collectReadableStreamToString } from '../shared/streams/readable-stream';
|
||||
import { createSubscriptionsRepository } from '../subscriptions/subscriptions.repository';
|
||||
import { createTaggingRulesRepository } from '../tagging-rules/tagging-rules.repository';
|
||||
import { createTagsRepository } from '../tags/tags.repository';
|
||||
import { documentsTagsTable } from '../tags/tags.table';
|
||||
import { createDummyTrackingServices } from '../tracking/tracking.services';
|
||||
import { createDocumentAlreadyExistsError } from './documents.errors';
|
||||
import { createDocumentsRepository } from './documents.repository';
|
||||
import { documentsTable } from './documents.table';
|
||||
@@ -24,6 +28,9 @@ describe('documents usecases', () => {
|
||||
const documentsStorageService = await createDocumentStorageService({ config: { documentsStorage: { driver: 'in-memory' } } as Config });
|
||||
const plansRepository = createPlansRepository({ config: { organizationPlans: { isFreePlanUnlimited: true } } as Config });
|
||||
const subscriptionsRepository = createSubscriptionsRepository({ db });
|
||||
const trackingServices = createDummyTrackingServices();
|
||||
const taggingRulesRepository = createTaggingRulesRepository({ db });
|
||||
const tagsRepository = createTagsRepository({ db });
|
||||
const generateDocumentId = () => 'doc_1';
|
||||
|
||||
const file = new File(['content'], 'file.txt', { type: 'text/plain' });
|
||||
@@ -39,6 +46,9 @@ describe('documents usecases', () => {
|
||||
generateDocumentId,
|
||||
plansRepository,
|
||||
subscriptionsRepository,
|
||||
trackingServices,
|
||||
taggingRulesRepository,
|
||||
tagsRepository,
|
||||
});
|
||||
|
||||
expect(document).to.include({
|
||||
@@ -66,7 +76,7 @@ describe('documents usecases', () => {
|
||||
expect(documentRecords).to.eql([document]);
|
||||
});
|
||||
|
||||
test('in the same organization, we should be able to have two documents with the same content, an error is raised if the document already exists', async () => {
|
||||
test('in the same organization, we should not be able to have two documents with the same content, an error is raised if the document already exists', async () => {
|
||||
const { db } = await createInMemoryDatabase({
|
||||
users: [{ id: 'user-1', email: 'user-1@example.com' }],
|
||||
organizations: [{ id: 'organization-1', name: 'Organization 1' }],
|
||||
@@ -77,6 +87,10 @@ describe('documents usecases', () => {
|
||||
const documentsStorageService = await createDocumentStorageService({ config: { documentsStorage: { driver: 'in-memory' } } as Config });
|
||||
const plansRepository = createPlansRepository({ config: { organizationPlans: { isFreePlanUnlimited: true } } as Config });
|
||||
const subscriptionsRepository = createSubscriptionsRepository({ db });
|
||||
const trackingServices = createDummyTrackingServices();
|
||||
const taggingRulesRepository = createTaggingRulesRepository({ db });
|
||||
const tagsRepository = createTagsRepository({ db });
|
||||
|
||||
let documentIdIndex = 1;
|
||||
const generateDocumentId = () => `doc_${documentIdIndex++}`;
|
||||
|
||||
@@ -93,6 +107,9 @@ describe('documents usecases', () => {
|
||||
plansRepository,
|
||||
subscriptionsRepository,
|
||||
generateDocumentId,
|
||||
trackingServices,
|
||||
taggingRulesRepository,
|
||||
tagsRepository,
|
||||
});
|
||||
|
||||
expect(document1).to.include({
|
||||
@@ -118,6 +135,9 @@ describe('documents usecases', () => {
|
||||
plansRepository,
|
||||
subscriptionsRepository,
|
||||
generateDocumentId,
|
||||
trackingServices,
|
||||
taggingRulesRepository,
|
||||
tagsRepository,
|
||||
}),
|
||||
).rejects.toThrow(
|
||||
createDocumentAlreadyExistsError(),
|
||||
@@ -132,6 +152,96 @@ describe('documents usecases', () => {
|
||||
).rejects.toThrow('File not found');
|
||||
});
|
||||
|
||||
test(`if the document already exists but is in the trash
|
||||
- we restore the document
|
||||
- update the existing record, setting the name to the new file name (same content does not mean same name)
|
||||
- pre-existing tags are removed
|
||||
- the tagging rules are applied to the restored document`, async () => {
|
||||
// this is the sha256 hash of 'hello world'
|
||||
const hash = 'b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9';
|
||||
|
||||
// The document is deleted and has the tag-1
|
||||
// A tagging rule that apply tag-2 if the content contains 'hello'
|
||||
// When restoring the document, the tagging rule should apply tag-2
|
||||
const { db } = await createInMemoryDatabase({
|
||||
users: [{ id: 'user-1', email: 'user-1@example.com' }],
|
||||
organizations: [{ id: 'organization-1', name: 'Organization 1' }],
|
||||
organizationMembers: [{ organizationId: 'organization-1', userId: 'user-1', role: ORGANIZATION_ROLES.OWNER }],
|
||||
tags: [
|
||||
{ id: 'tag-1', name: 'Tag 1', color: '#000000', organizationId: 'organization-1' },
|
||||
{ id: 'tag-2', name: 'Tag 2', color: '#000000', organizationId: 'organization-1' },
|
||||
],
|
||||
documents: [{
|
||||
id: 'document-1',
|
||||
organizationId: 'organization-1',
|
||||
originalSha256Hash: hash,
|
||||
isDeleted: true,
|
||||
mimeType: 'text/plain',
|
||||
originalStorageKey: 'organization-1/originals/document-1.txt',
|
||||
name: 'file-1.txt',
|
||||
originalName: 'file-1.txt',
|
||||
content: 'hello world',
|
||||
}],
|
||||
documentsTags: [{
|
||||
documentId: 'document-1',
|
||||
tagId: 'tag-1',
|
||||
}],
|
||||
taggingRules: [
|
||||
{ id: 'tagging-rule-1', organizationId: 'organization-1', name: 'Tagging Rule 1', enabled: true },
|
||||
],
|
||||
taggingRuleConditions: [
|
||||
{ id: 'tagging-rule-condition-1', taggingRuleId: 'tagging-rule-1', field: 'content', operator: 'contains', value: 'hello' },
|
||||
],
|
||||
taggingRuleActions: [
|
||||
{ id: 'tagging-rule-action-1', taggingRuleId: 'tagging-rule-1', tagId: 'tag-2' },
|
||||
],
|
||||
});
|
||||
|
||||
const documentsRepository = createDocumentsRepository({ db });
|
||||
const documentsStorageService = await createDocumentStorageService({ config: { documentsStorage: { driver: 'in-memory' } } as Config });
|
||||
const plansRepository = createPlansRepository({ config: { organizationPlans: { isFreePlanUnlimited: true } } as Config });
|
||||
const subscriptionsRepository = createSubscriptionsRepository({ db });
|
||||
const trackingServices = createDummyTrackingServices();
|
||||
const taggingRulesRepository = createTaggingRulesRepository({ db });
|
||||
const tagsRepository = createTagsRepository({ db });
|
||||
|
||||
// 3. Re-create the document
|
||||
const { document: documentRestored } = await createDocument({
|
||||
file: new File(['hello world'], 'file-2.txt', { type: 'text/plain' }),
|
||||
organizationId: 'organization-1',
|
||||
documentsRepository,
|
||||
documentsStorageService,
|
||||
plansRepository,
|
||||
subscriptionsRepository,
|
||||
trackingServices,
|
||||
taggingRulesRepository,
|
||||
tagsRepository,
|
||||
});
|
||||
|
||||
expect(documentRestored).to.deep.include({
|
||||
id: 'document-1',
|
||||
organizationId: 'organization-1',
|
||||
name: 'file-2.txt',
|
||||
originalName: 'file-2.txt',
|
||||
isDeleted: false,
|
||||
deletedBy: null,
|
||||
deletedAt: null,
|
||||
});
|
||||
|
||||
const documentsRecordsAfterRestoration = await db.select().from(documentsTable);
|
||||
|
||||
expect(documentsRecordsAfterRestoration.length).to.eql(1);
|
||||
|
||||
expect(documentsRecordsAfterRestoration[0]).to.eql(documentRestored);
|
||||
|
||||
const documentsTagsRecordsAfterRestoration = await db.select().from(documentsTagsTable);
|
||||
|
||||
expect(documentsTagsRecordsAfterRestoration).to.eql([{
|
||||
documentId: 'document-1',
|
||||
tagId: 'tag-2',
|
||||
}]);
|
||||
});
|
||||
|
||||
test('when there is an issue when inserting the document in the db, the file should not be saved in the storage', async () => {
|
||||
const { db } = await createInMemoryDatabase({
|
||||
users: [{ id: 'user-1', email: 'user-1@example.com' }],
|
||||
@@ -143,6 +253,9 @@ describe('documents usecases', () => {
|
||||
const documentsStorageService = await createDocumentStorageService({ config: { documentsStorage: { driver: 'in-memory' } } as Config });
|
||||
const plansRepository = createPlansRepository({ config: { organizationPlans: { isFreePlanUnlimited: true } } as Config });
|
||||
const subscriptionsRepository = createSubscriptionsRepository({ db });
|
||||
const trackingServices = createDummyTrackingServices();
|
||||
const taggingRulesRepository = createTaggingRulesRepository({ db });
|
||||
const tagsRepository = createTagsRepository({ db });
|
||||
const generateDocumentId = () => 'doc_1';
|
||||
|
||||
const file = new File(['content'], 'file.txt', { type: 'text/plain' });
|
||||
@@ -164,6 +277,9 @@ describe('documents usecases', () => {
|
||||
plansRepository,
|
||||
subscriptionsRepository,
|
||||
generateDocumentId,
|
||||
trackingServices,
|
||||
taggingRulesRepository,
|
||||
tagsRepository,
|
||||
}),
|
||||
).rejects.toThrow(new Error('Macron, explosion!'));
|
||||
|
||||
|
||||
@@ -1,16 +1,23 @@
import type { Database } from '../app/database/database.types';
import type { Config } from '../config/config.types';
import type { PlansRepository } from '../plans/plans.repository';
import type { Logger } from '../shared/logger/logger';
import type { SubscriptionsRepository } from '../subscriptions/subscriptions.repository';
import type { DocumentsRepository } from './documents.repository';
import type { DocumentStorageService } from './storage/documents.storage.services';
import type { Document } from './documents.types';
import { safely } from '@corentinth/chisels';
import { extractTextFromFile } from '@papra/lecture';
import pLimit from 'p-limit';
import { checkIfOrganizationCanCreateNewDocument } from '../organizations/organizations.usecases';
import { createPlansRepository, type PlansRepository } from '../plans/plans.repository';
import { createLogger } from '../shared/logger/logger';
import { createDocumentAlreadyExistsError, createDocumentNotFoundError } from './documents.errors';
import { createSubscriptionsRepository, type SubscriptionsRepository } from '../subscriptions/subscriptions.repository';
import { createTaggingRulesRepository, type TaggingRulesRepository } from '../tagging-rules/tagging-rules.repository';
import { applyTaggingRules } from '../tagging-rules/tagging-rules.usecases';
import { createTagsRepository, type TagsRepository } from '../tags/tags.repository';
import { createTrackingServices, type TrackingServices } from '../tracking/tracking.services';
import { createDocumentAlreadyExistsError, createDocumentNotDeletedError, createDocumentNotFoundError } from './documents.errors';
import { buildOriginalDocumentKey, generateDocumentId as generateDocumentIdImpl } from './documents.models';
import { createDocumentsRepository, type DocumentsRepository } from './documents.repository';
import { getFileSha256Hash } from './documents.services';
import { createDocumentStorageService, type DocumentStorageService } from './storage/documents.storage.services';

const logger = createLogger({ namespace: 'documents:usecases' });

@@ -35,6 +42,10 @@ export async function createDocument({
generateDocumentId = generateDocumentIdImpl,
plansRepository,
subscriptionsRepository,
trackingServices,
taggingRulesRepository,
tagsRepository,
logger = createLogger({ namespace: 'documents:usecases' }),
}: {
file: File;
userId?: string;
@@ -44,6 +55,10 @@ export async function createDocument({
generateDocumentId?: () => string;
plansRepository: PlansRepository;
subscriptionsRepository: SubscriptionsRepository;
trackingServices: TrackingServices;
taggingRulesRepository: TaggingRulesRepository;
tagsRepository: TagsRepository;
logger?: Logger;
}) {
const {
name: fileName,
@@ -64,10 +79,127 @@ export async function createDocument({
// Early check to avoid saving the file and then realizing it already exists with the db constraint
const { document: existingDocument } = await documentsRepository.getOrganizationDocumentBySha256Hash({ sha256Hash: hash, organizationId });

if (existingDocument) {
const { document } = existingDocument
? await handleExistingDocument({
existingDocument,
fileName,
organizationId,
documentsRepository,
tagsRepository,
logger,
})
: await createNewDocument({
file,
fileName,
size,
mimeType,
hash,
userId,
organizationId,
documentsRepository,
documentsStorageService,
generateDocumentId,
trackingServices,
logger,
});

await applyTaggingRules({ document, taggingRulesRepository, tagsRepository });

return { document };
}

export type CreateDocumentUsecase = Awaited<ReturnType<typeof createDocumentCreationUsecase>>;
export async function createDocumentCreationUsecase({
db,
config,
logger = createLogger({ namespace: 'documents:usecases' }),
generateDocumentId = generateDocumentIdImpl,
documentsStorageService,
}: {
db: Database;
config: Config;
logger?: Logger;
documentsStorageService?: DocumentStorageService;
generateDocumentId?: () => string;
}) {
const deps = {
documentsRepository: createDocumentsRepository({ db }),
documentsStorageService: documentsStorageService ?? await createDocumentStorageService({ config }),
plansRepository: createPlansRepository({ config }),
subscriptionsRepository: createSubscriptionsRepository({ db }),
trackingServices: createTrackingServices({ config }),
taggingRulesRepository: createTaggingRulesRepository({ db }),
tagsRepository: createTagsRepository({ db }),
generateDocumentId,
logger,
};

return async ({ file, userId, organizationId }: { file: File; userId?: string; organizationId: string }) => createDocument({
file,
userId,
organizationId,
...deps,
});
}

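A minimal usage sketch of the factory above, assuming `db` and `config` come from the application's bootstrap code (the dependency wiring itself is exactly what `createDocumentCreationUsecase` hides from callers):

```ts
// Build the usecase once, then reuse it for every upload or ingestion.
const createDocument = await createDocumentCreationUsecase({ db, config });

const { document } = await createDocument({
  file: new File(['hello'], 'hello.txt', { type: 'text/plain' }),
  organizationId: 'org_111111111111111111111111', // hypothetical ids
  userId: 'user-1',
});
```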
async function handleExistingDocument({
existingDocument,
fileName,
userId,
organizationId,
documentsRepository,
tagsRepository,
logger,
}: {
existingDocument: Document;
fileName: string;
userId?: string;
organizationId: string;
documentsRepository: DocumentsRepository;
tagsRepository: TagsRepository;
logger: Logger;
}) {
if (existingDocument && !existingDocument.isDeleted) {
throw createDocumentAlreadyExistsError();
}

logger.info({ documentId: existingDocument.id }, 'Document already exists, restoring for deduplication');

const [, { document: restoredDocument }] = await Promise.all([
tagsRepository.removeAllTagsFromDocument({ documentId: existingDocument.id }),
documentsRepository.restoreDocument({ documentId: existingDocument.id, organizationId, name: fileName, userId }),
]);

return { document: restoredDocument };
}

async function createNewDocument({
file,
fileName,
size,
mimeType,
hash,
userId,
organizationId,
documentsRepository,
documentsStorageService,
generateDocumentId,
trackingServices,
logger,
}: {
file: File;
fileName: string;
size: number;
mimeType: string;
hash: string;
userId?: string;
organizationId: string;
documentsRepository: DocumentsRepository;
documentsStorageService: DocumentStorageService;
generateDocumentId: () => string;
trackingServices: TrackingServices;
logger: Logger;
}) {
const documentId = generateDocumentId();

const { originalDocumentStorageKey } = buildOriginalDocumentKey({
@@ -94,19 +226,26 @@ export async function createDocument({
mimeType,
content: text,
originalSha256Hash: hash,
}),
);
}));

if (error) {
logger.error({ error }, 'Error while creating document');

// If the document is not saved, delete the file from the storage
await documentsStorageService.deleteFile({ storageKey: originalDocumentStorageKey });

logger.error({ error }, 'Stored document file deleted because of error');

throw error;
}

const { document } = result;
if (userId) {
trackingServices.captureUserEvent({ userId, event: 'Document created' });
}

return { document };
logger.info({ documentId, userId, organizationId }, 'Document created');

return { document: result.document };
}

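The error branch above follows a "store first, roll back on failure" pattern: the file is written to storage, and if the database insert fails the stored file is removed again so storage and database stay consistent. A generic, hedged sketch of that pattern, with hypothetical `storage` and `repository` stand-ins:

```ts
import { safely } from '@corentinth/chisels';

async function saveWithRollback({ file, storageKey, storage, repository }: {
  file: File;
  storageKey: string;
  storage: {
    saveFile: (args: { file: File; storageKey: string }) => Promise<unknown>;
    deleteFile: (args: { storageKey: string }) => Promise<void>;
  };
  repository: { insertDocument: (args: { storageKey: string }) => Promise<{ id: string }> };
}) {
  await storage.saveFile({ file, storageKey });

  // safely() resolves to a [result, error] tuple instead of throwing.
  const [document, error] = await safely(repository.insertDocument({ storageKey }));

  if (error) {
    // The insert failed: remove the orphaned file so storage and db stay in sync.
    await storage.deleteFile({ storageKey });
    throw error;
  }

  return { document };
}
```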
export async function getDocumentOrThrow({
@@ -148,7 +287,7 @@ export async function hardDeleteDocument({
documentsRepository: DocumentsRepository;
documentsStorageService: DocumentStorageService;
}) {
await Promise.all([
await Promise.allSettled([
documentsRepository.hardDeleteDocument({ documentId }),
documentsStorageService.deleteFile({ storageKey: documentId }),
]);
@@ -186,3 +325,47 @@ export async function deleteExpiredDocuments({
deletedDocumentsCount: documentIds.length,
};
}

export async function deleteTrashDocument({
documentId,
organizationId,
documentsRepository,
documentsStorageService,
}: {
documentId: string;
organizationId: string;
documentsRepository: DocumentsRepository;
documentsStorageService: DocumentStorageService;
}) {
const { document } = await documentsRepository.getDocumentById({ documentId, organizationId });

if (!document) {
throw createDocumentNotFoundError();
}

if (!document.isDeleted) {
throw createDocumentNotDeletedError();
}

await hardDeleteDocument({ documentId, documentsRepository, documentsStorageService });
}

export async function deleteAllTrashDocuments({
organizationId,
documentsRepository,
documentsStorageService,
}: {
organizationId: string;
documentsRepository: DocumentsRepository;
documentsStorageService: DocumentStorageService;
}) {
const { documentIds } = await documentsRepository.getAllOrganizationTrashDocumentIds({ organizationId });

// TODO: refactor to use batching and transaction

const limit = pLimit(10);

await Promise.all(
documentIds.map(documentId => limit(() => hardDeleteDocument({ documentId, documentsRepository, documentsStorageService }))),
);
}

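`deleteAllTrashDocuments` uses `p-limit` to cap how many hard deletions run at once. A minimal sketch of that bounded-concurrency pattern in isolation (the ids and the work function are placeholders):

```ts
import pLimit from 'p-limit';

// At most 10 tasks run concurrently; the rest wait in the queue.
const limit = pLimit(10);

const documentIds = ['doc_1', 'doc_2', 'doc_3']; // hypothetical ids

await Promise.all(
  documentIds.map(documentId =>
    limit(async () => {
      // Placeholder for hardDeleteDocument({ documentId, ... }).
      console.log(`deleting ${documentId}`);
    }),
  ),
);
```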
@@ -1,18 +1,18 @@
import type { ConfigDefinition } from 'figue';
import { z } from 'zod';
import { FS_STORAGE_DRIVER_NAME } from '../../documents/storage/drivers/fs/fs.storage-driver';
import { IN_MEMORY_STORAGE_DRIVER_NAME } from '../../documents/storage/drivers/memory/memory.storage-driver';
import { S3_STORAGE_DRIVER_NAME } from '../../documents/storage/drivers/s3/s3.storage-driver';
import { FS_STORAGE_DRIVER_NAME } from './drivers/fs/fs.storage-driver';
import { IN_MEMORY_STORAGE_DRIVER_NAME } from './drivers/memory/memory.storage-driver';
import { S3_STORAGE_DRIVER_NAME } from './drivers/s3/s3.storage-driver';

export const documentStorageConfig = {
maxUploadSize: {
doc: 'The maximum size in bytes for an uploaded file',
schema: z.coerce.number().int().positive(),
doc: 'The maximum size in bytes for an uploaded file. Set to 0 to disable the limit and allow uploading documents of any size.',
schema: z.coerce.number().int().nonnegative(),
default: 10 * 1024 * 1024, // 10MB
env: 'DOCUMENT_STORAGE_MAX_UPLOAD_SIZE',
},
driver: {
doc: 'The driver to use for document storage',
doc: `The driver to use for document storage, values can be one of: ${[FS_STORAGE_DRIVER_NAME, S3_STORAGE_DRIVER_NAME, IN_MEMORY_STORAGE_DRIVER_NAME].map(x => `\`${x}\``).join(', ')}`,
schema: z.enum([FS_STORAGE_DRIVER_NAME, S3_STORAGE_DRIVER_NAME, IN_MEMORY_STORAGE_DRIVER_NAME]),
default: FS_STORAGE_DRIVER_NAME,
env: 'DOCUMENT_STORAGE_DRIVER',
@@ -20,7 +20,7 @@ export const documentStorageConfig = {
drivers: {
filesystem: {
root: {
doc: 'The root directory to store documents in',
doc: 'The root directory to store documents in (default as "./app-data/documents" when using docker)',
schema: z.string(),
default: './local-documents',
env: 'DOCUMENT_STORAGE_FILESYSTEM_ROOT',

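For reference, this definition resolves to the shape the tests and usecases consume (for example `{ documentsStorage: { driver: 'in-memory' } } as Config`). A hedged sketch of the resolved defaults as a plain object — illustrative only, since the real `Config` type nests this under `documentsStorage` alongside other sections:

```ts
// Assumed resolved defaults, derived from the definition above.
const documentsStorageDefaults = {
  maxUploadSize: 10 * 1024 * 1024, // 10MB; 0 disables the limit
  driver: 'filesystem',
  drivers: {
    filesystem: {
      root: './local-documents',
    },
  },
};
```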
@@ -17,6 +17,6 @@ export type StorageDriver = {

export type StorageDriverFactory = (args: { config: Config }) => Promise<StorageDriver>;

export function defineStorageDriver(factory: StorageDriverFactory) {
export function defineStorageDriver<T extends StorageDriverFactory>(factory: T) {
return factory;
}

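With the widened generic signature, `defineStorageDriver` simply preserves the concrete type of the factory it is given. As a hedged sketch, a custom driver could be declared the same way the filesystem and in-memory drivers below are — the exact `StorageDriver` contract (`saveFile`, `getFileStream`, `deleteFile` and their return shapes) is assumed from those drivers:

```ts
import { defineStorageDriver } from './drivers.models';

// Hypothetical Map-backed driver, mirroring the in-memory driver of this changeset.
export const mapStorageDriverFactory = defineStorageDriver(async () => {
  const storage = new Map<string, File>();

  return {
    saveFile: async ({ file, storageKey }: { file: File; storageKey: string }) => {
      storage.set(storageKey, file);
    },
    getFileStream: async ({ storageKey }: { storageKey: string }) => {
      const file = storage.get(storageKey);

      if (!file) {
        throw new Error('File not found');
      }

      // Return shape assumed; the real drivers may wrap the stream differently.
      return { fileStream: file.stream() };
    },
    deleteFile: async ({ storageKey }: { storageKey: string }) => {
      storage.delete(storageKey);
    },
  };
});
```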
@@ -2,20 +2,10 @@ import fs from 'node:fs';
|
||||
import { dirname, join } from 'node:path';
|
||||
import stream from 'node:stream';
|
||||
import { get } from 'lodash-es';
|
||||
import { checkFileExists, deleteFile, ensureDirectoryExists } from '../../../../shared/fs/fs.services';
|
||||
import { defineStorageDriver } from '../drivers.models';
|
||||
import { createFileAlreadyExistsError } from './fs.storage-driver.errors';
|
||||
|
||||
function ensureDirectoryExists({ path }: { path: string }) {
|
||||
return fs.promises.mkdir(
|
||||
dirname(path),
|
||||
{ recursive: true },
|
||||
);
|
||||
}
|
||||
|
||||
function checkFileExists({ path }: { path: string }) {
|
||||
return fs.promises.access(path, fs.constants.F_OK).then(() => true).catch(() => false);
|
||||
}
|
||||
|
||||
export const FS_STORAGE_DRIVER_NAME = 'filesystem' as const;
|
||||
|
||||
export const fsStorageDriverFactory = defineStorageDriver(async ({ config }) => {
|
||||
@@ -36,7 +26,7 @@ export const fsStorageDriverFactory = defineStorageDriver(async ({ config }) =>
|
||||
throw createFileAlreadyExistsError();
|
||||
}
|
||||
|
||||
await ensureDirectoryExists({ path: storagePath });
|
||||
await ensureDirectoryExists({ path: dirname(storagePath) });
|
||||
|
||||
const writeStream = fs.createWriteStream(storagePath);
|
||||
stream.Readable.fromWeb(file.stream()).pipe(writeStream);
|
||||
@@ -69,7 +59,7 @@ export const fsStorageDriverFactory = defineStorageDriver(async ({ config }) =>
|
||||
const { storagePath } = getStoragePath({ storageKey });
|
||||
|
||||
try {
|
||||
await fs.promises.unlink(storagePath);
|
||||
await deleteFile({ filePath: storagePath });
|
||||
} catch (error) {
|
||||
if (get(error, 'code') === 'ENOENT') {
|
||||
throw new Error('File not found');
|
||||
|
||||
@@ -1,11 +1,10 @@
|
||||
import type { Config } from '../../../../config/config.types';
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { inMemoryStorageDriverFactory } from './memory.storage-driver';
|
||||
|
||||
describe('memory storage-driver', () => {
|
||||
describe('inMemoryStorageDriver', () => {
|
||||
test('saves, retrieves and delete a file', async () => {
|
||||
const inMemoryStorageDriver = await inMemoryStorageDriverFactory({ config: {} as Config });
|
||||
const inMemoryStorageDriver = await inMemoryStorageDriverFactory();
|
||||
|
||||
const file = new File(['lorem ipsum'], 'text-file.txt', { type: 'text/plain' });
|
||||
|
||||
@@ -28,5 +27,26 @@ describe('memory storage-driver', () => {
|
||||
|
||||
await expect(inMemoryStorageDriver.getFileStream({ storageKey: 'org_1/text-file.txt' })).rejects.toThrow('File not found');
|
||||
});
|
||||
|
||||
test('mainly for testing purposes, a _getStorage() method is available to access the internal storage map', async () => {
|
||||
const inMemoryStorageDriver = await inMemoryStorageDriverFactory();
|
||||
|
||||
await inMemoryStorageDriver.saveFile({
|
||||
file: new File(['lorem ipsum'], 'text-file.txt', { type: 'text/plain' }),
|
||||
storageKey: 'org_1/text-file.txt',
|
||||
});
|
||||
|
||||
const storage = inMemoryStorageDriver._getStorage();
|
||||
|
||||
expect(storage).to.be.a('Map');
|
||||
const entries = Array.from(storage.entries());
|
||||
|
||||
expect(entries).to.have.length(1);
|
||||
const [key, file] = entries[0];
|
||||
|
||||
expect(key).to.eql('org_1/text-file.txt');
|
||||
expect(file).to.be.a('File');
|
||||
expect(await file.text()).to.eql('lorem ipsum');
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
@@ -29,5 +29,7 @@ export const inMemoryStorageDriverFactory = defineStorageDriver(async () => {
|
||||
deleteFile: async ({ storageKey }) => {
|
||||
storage.delete(storageKey);
|
||||
},
|
||||
|
||||
_getStorage: () => storage,
|
||||
};
|
||||
});
|
||||
|
||||
@@ -1,9 +1,10 @@
import type { ConfigDefinition } from 'figue';
import { z } from 'zod';
import { booleanishSchema } from '../config/config.schemas';

export const emailsConfig = {
resendApiKey: {
doc: 'The API key for Resend',
doc: 'The API key for Resend (used to send emails)',
schema: z.string(),
default: 'set-me',
env: 'RESEND_API_KEY',
@@ -16,13 +17,8 @@ export const emailsConfig = {
},
dryRun: {
doc: 'Whether to run the email service in dry run mode',
schema: z
.string()
.trim()
.toLowerCase()
.transform(x => x === 'true')
.pipe(z.boolean()),
default: 'false',
schema: booleanishSchema,
default: false,
env: 'EMAILS_DRY_RUN',
},
} as const satisfies ConfigDefinition;

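`booleanishSchema` is imported from `../config/config.schemas` and replaces the inline string-to-boolean schema removed above. Its exact definition is not part of this diff; a plausible sketch that mirrors the removed inline schema while also accepting real booleans could look like this:

```ts
import { z } from 'zod';

// Assumed shape of booleanishSchema: accepts 'true'/'false' strings (case and
// whitespace insensitive) or booleans, and always outputs a boolean.
export const booleanishSchema = z
  .union([z.string(), z.boolean()])
  .transform(value => (typeof value === 'string' ? value.trim().toLowerCase() === 'true' : value))
  .pipe(z.boolean());
```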
@@ -0,0 +1,109 @@
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { addTimestampToFilename, isFileInDoneFolder, isFileInErrorFolder, normalizeFilePathToIngestionFolder } from './ingestion-folder.models';
|
||||
|
||||
describe('ingestion-folder models', () => {
|
||||
describe('normalizeFilePathToIngestionFolder', () => {
|
||||
test('get the path of a file relative to the ingestion folder', () => {
|
||||
expect(
|
||||
normalizeFilePathToIngestionFolder({
|
||||
filePath: '/home/foo/projects/papra/apps/papra-server/ingestion/org_1/yo.md',
|
||||
ingestionFolderPath: '/home/foo/projects/papra/apps/papra-server/ingestion',
|
||||
}),
|
||||
).to.eql({ relativeFilePath: 'org_1/yo.md' });
|
||||
});
|
||||
});
|
||||
|
||||
describe('isFileInErrorFolder', () => {
|
||||
test('the user can configure the error folder as a relative path, it is relative to the file organization ingestion folder', () => {
|
||||
// Here the error folder is /foo/bar/ingestion/org_1/error
|
||||
expect(
|
||||
isFileInErrorFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/yo.md',
|
||||
errorFolder: 'error',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(false);
|
||||
|
||||
expect(
|
||||
isFileInErrorFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/error/yo.md',
|
||||
errorFolder: 'error',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(true);
|
||||
});
|
||||
|
||||
test('the user can configure the error folder as an absolute path that can overlap with the organization ingestion folder', () => {
|
||||
expect(
|
||||
isFileInErrorFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/error/yo.md',
|
||||
errorFolder: '/foo/bar/ingestion/org_1/error',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(true);
|
||||
|
||||
expect(
|
||||
isFileInErrorFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/error/yo.md',
|
||||
errorFolder: '/error',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(false);
|
||||
});
|
||||
});
|
||||
|
||||
describe('isFileInDoneFolder', () => {
|
||||
test('the user can configure the done folder as a relative path, it is relative to the file organization ingestion folder', () => {
|
||||
// Here the done folder is /foo/bar/ingestion/org_1/done
|
||||
expect(
|
||||
isFileInDoneFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/yo.md',
|
||||
doneFolder: 'done',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(false);
|
||||
|
||||
expect(
|
||||
isFileInDoneFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/done/yo.md',
|
||||
doneFolder: 'done',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(true);
|
||||
});
|
||||
|
||||
test('the user can configure the done folder as an absolute path that can overlap with the organization ingestion folder', () => {
|
||||
expect(
|
||||
isFileInDoneFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/done/yo.md',
|
||||
doneFolder: '/foo/bar/ingestion/org_1/done',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(true);
|
||||
|
||||
expect(
|
||||
isFileInDoneFolder({
|
||||
filePath: '/foo/bar/ingestion/org_1/done/yo.md',
|
||||
doneFolder: '/done',
|
||||
organizationIngestionFolderPath: '/foo/bar/ingestion/org_1',
|
||||
}),
|
||||
).to.equal(false);
|
||||
});
|
||||
});
|
||||
|
||||
describe('addTimestampToFilename', () => {
|
||||
test('given a filename, it adds a timestamp to the name, before the extension', () => {
|
||||
expect(
|
||||
addTimestampToFilename({ fileName: 'yo.md', now: new Date('2024-04-14T12:00:00.000Z') }),
|
||||
).to.equal('yo_1713096000000.md');
|
||||
|
||||
expect(
|
||||
addTimestampToFilename({ fileName: '.config', now: new Date('2024-04-14T12:00:00.000Z') }),
|
||||
).to.equal('.config_1713096000000');
|
||||
|
||||
expect(
|
||||
addTimestampToFilename({ fileName: 'documents.models.tests.ts', now: new Date('2024-04-14T12:00:00.000Z') }),
|
||||
).to.equal('documents.models.tests_1713096000000.ts');
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -0,0 +1,47 @@
import { isAbsolute, join, parse, sep as pathSeparator, relative } from 'node:path';
import { organizationIdRegex } from '../organizations/organizations.constants';

export function normalizeFilePathToIngestionFolder({
filePath,
ingestionFolderPath,
}: {
filePath: string;
ingestionFolderPath: string;
}) {
const relativeFilePath = relative(ingestionFolderPath, filePath);

return { relativeFilePath };
}

export function getOrganizationIdFromFilePath({ relativeFilePath }: { relativeFilePath: string }) {
const [maybeOrganizationId] = relativeFilePath.split(pathSeparator);

if (!maybeOrganizationId || !organizationIdRegex.test(maybeOrganizationId)) {
return { organizationId: undefined };
}

return { organizationId: maybeOrganizationId };
}

export function addTimestampToFilename({ fileName, now = new Date() }: { fileName: string; now?: Date }): string {
const { name, ext } = parse(fileName);
const timestamp = now.getTime();

return `${name}_${timestamp}${ext}`;
}

export function getAbsolutePathFromFolderRelativeToOrganizationIngestionFolder({ path, organizationIngestionFolderPath }: { path: string; organizationIngestionFolderPath: string }) {
return isAbsolute(path) ? path : join(organizationIngestionFolderPath, path);
}

export function isFileInErrorFolder({ filePath, errorFolder, organizationIngestionFolderPath }: { filePath: string; errorFolder: string; organizationIngestionFolderPath: string }) {
const errorFolderPath = getAbsolutePathFromFolderRelativeToOrganizationIngestionFolder({ path: errorFolder, organizationIngestionFolderPath });

return filePath.startsWith(errorFolderPath);
}

export function isFileInDoneFolder({ filePath, doneFolder, organizationIngestionFolderPath }: { filePath: string; doneFolder: string; organizationIngestionFolderPath: string }) {
const doneFolderPath = getAbsolutePathFromFolderRelativeToOrganizationIngestionFolder({ path: doneFolder, organizationIngestionFolderPath });

return filePath.startsWith(doneFolderPath);
}

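A quick usage sketch of the helpers above, with made-up paths (the expected results follow from the unit tests earlier in this changeset):

```ts
import { addTimestampToFilename, getOrganizationIdFromFilePath, normalizeFilePathToIngestionFolder } from './ingestion-folder.models';

const { relativeFilePath } = normalizeFilePathToIngestionFolder({
  filePath: '/srv/papra/ingestion/org_111111111111111111111111/invoice.pdf',
  ingestionFolderPath: '/srv/papra/ingestion',
});
// relativeFilePath === 'org_111111111111111111111111/invoice.pdf'

const { organizationId } = getOrganizationIdFromFilePath({ relativeFilePath });
// organizationId === 'org_111111111111111111111111' (assuming it matches organizationIdRegex)

const renamed = addTimestampToFilename({ fileName: 'invoice.pdf', now: new Date('2024-04-14T12:00:00.000Z') });
// renamed === 'invoice_1713096000000.pdf'
```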
@@ -0,0 +1,68 @@
import type { ConfigDefinition } from 'figue';
import { z } from 'zod';
import { booleanishSchema } from '../config/config.schemas';
import { defaultIgnoredPatterns } from './ingestion-folders.constants';

export const ingestionFolderConfig = {
isEnabled: {
doc: 'Whether ingestion folders are enabled',
schema: booleanishSchema,
default: false,
env: 'INGESTION_FOLDER_IS_ENABLED',
},
folderRootPath: {
doc: 'The root directory in which ingestion folders for each organization are stored',
schema: z.string(),
default: './ingestion',
env: 'INGESTION_FOLDER_ROOT_PATH',
},
watcher: {
usePolling: {
doc: 'Whether to use polling for the ingestion folder watcher',
schema: booleanishSchema,
default: false,
env: 'INGESTION_FOLDER_WATCHER_USE_POLLING',
},
pollingInterval: {
doc: 'When polling is used, this is the interval at which the watcher checks for changes in the ingestion folder (in milliseconds)',
schema: z.number(),
default: 2_000,
env: 'INGESTION_FOLDER_WATCHER_POLLING_INTERVAL_MS',
},
},
processingConcurrency: {
doc: 'The number of files that can be processed concurrently by the server. Increasing this can improve processing speed, but it will also increase CPU and memory usage.',
schema: z.number().int().positive().min(1),
default: 1,
env: 'INGESTION_FOLDER_PROCESSING_CONCURRENCY',
},
errorFolder: {
doc: 'The folder to move the file when the ingestion fails, the path is relative to the organization ingestion folder (<ingestion root>/<organization id>)',
schema: z.string(),
default: './ingestion-error',
env: 'INGESTION_FOLDER_ERROR_FOLDER_PATH',
},
postProcessing: {
strategy: {
doc: 'The action done on the file after it has been ingested',
schema: z.enum(['delete', 'move']),
default: 'delete',
env: 'INGESTION_FOLDER_POST_PROCESSING_STRATEGY',
},
moveToFolderPath: {
doc: 'The folder to move the file when the post-processing strategy is "move", the path is relative to the organization ingestion folder (<ingestion root>/<organization id>)',
schema: z.string(),
default: './ingestion-done',
env: 'INGESTION_FOLDER_POST_PROCESSING_MOVE_FOLDER_PATH',
},
},
ignoredPatterns: {
doc: `Comma separated list of patterns to ignore when watching the ingestion folder. Note that if you update this variable, it'll override the default patterns, not merge them. Regarding the format and syntax, please refer to the [picomatch documentation](https://github.com/micromatch/picomatch/blob/bf6a33bd3db990edfbfd20b3b160eed926cd07dd/README.md#globbing-features)`,
schema: z.union([
z.string(),
z.array(z.string()),
]).transform(value => (typeof value === 'string' ? value.split(',') : value)),
default: defaultIgnoredPatterns,
env: 'INGESTION_FOLDER_IGNORED_PATTERNS',
},
} as const satisfies ConfigDefinition;
@@ -0,0 +1,13 @@
export const defaultIgnoredPatterns = [
// Files
'**/.DS_Store',
'**/.env',
'**/desktop.ini',
'**/Thumbs.db',

// Directories
'**/.git/**',
'**/.idea/**',
'**/.vscode/**',
'**/node_modules/**',
];
@@ -0,0 +1,10 @@
import { createError } from '../shared/errors/errors';

export function createInvalidPostProcessingStrategyError({ strategy }: { strategy: string }) {
return createError({
code: 'ingestion-folder.invalid-post-processing-strategy',
message: `The post-processing strategy "${strategy}" is invalid`,
statusCode: 501,
isInternal: true,
});
}
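Downstream code in this changeset detects specific errors by their `code` rather than by instance (see `isErrorWithCode` in the ingestion watcher). A hedged sketch of how a caller could check for this particular error:

```ts
import { safely } from '@corentinth/chisels';
import { isErrorWithCode } from '../shared/errors/errors';
import { createInvalidPostProcessingStrategyError } from './ingestion-folders.errors';

// Simulate a failing operation to illustrate the check; real callers would wrap
// postProcessFile or a similar usecase instead.
const [, error] = await safely(Promise.reject(createInvalidPostProcessingStrategyError({ strategy: 'unknown' })));

if (isErrorWithCode({ error, code: 'ingestion-folder.invalid-post-processing-strategy' })) {
  console.warn('Unsupported post-processing strategy, check INGESTION_FOLDER_POST_PROCESSING_STRATEGY');
}
```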
@@ -0,0 +1,35 @@
import { Buffer } from 'node:buffer';
import { describe, expect, test } from 'vitest';
import { getFile } from './ingestion-folders.services';

describe('ingestion-folders services', () => {
describe('getFile', () => {
const readFile = async () => Buffer.from('test');

test('reads a file from the fs and returns a File instance', async () => {
const { file } = await getFile({
filePath: 'test.txt',
fs: { readFile },
});

expect(file).instanceOf(File);
expect(file.type).to.equal('text/plain');
expect(file.name).to.equal('test.txt');
expect(file.size).to.equal(4);
expect(await file.text()).to.equal('test');
});

test('a file with a weird extension is considered an octet-stream', async () => {
const { file } = await getFile({
filePath: 'test.weird',
fs: { readFile },
});

expect(file).instanceOf(File);
expect(file.type).to.equal('application/octet-stream');
expect(file.name).to.equal('test.weird');
expect(file.size).to.equal(4);
expect(await file.text()).to.equal('test');
});
});
});
@@ -0,0 +1,24 @@
import type {
FsServices,
} from '../shared/fs/fs.services';
import { parse } from 'node:path';
import mime from 'mime-types';
import {
createFsServices,
} from '../shared/fs/fs.services';

export async function getFile({
filePath,
fs = createFsServices(),
}: {
filePath: string;
fs?: Pick<FsServices, 'readFile'>;
}) {
const buffer = await fs.readFile({ filePath });
// mime.lookup returns false when the type is unknown, hence the logical OR fallback
const mimeType = mime.lookup(filePath) || 'application/octet-stream';
const { base: fileName } = parse(filePath);

const file = new File([buffer], fileName, { type: mimeType });
return { file };
}
@@ -0,0 +1,642 @@
|
||||
import type { FsNative } from '../shared/fs/fs.services';
|
||||
import { safely } from '@corentinth/chisels';
|
||||
import { memfs } from 'memfs';
|
||||
import { describe, expect, test } from 'vitest';
|
||||
import { createInMemoryDatabase } from '../app/database/database.test-utils';
|
||||
import { overrideConfig } from '../config/config.test-utils';
|
||||
import { documentsTable } from '../documents/documents.table';
|
||||
import { createDocumentCreationUsecase } from '../documents/documents.usecases';
|
||||
import { inMemoryStorageDriverFactory } from '../documents/storage/drivers/memory/memory.storage-driver';
|
||||
import { createOrganizationsRepository } from '../organizations/organizations.repository';
|
||||
import { createInMemoryFsServices } from '../shared/fs/fs.in-memory';
|
||||
import { createFsServices } from '../shared/fs/fs.services';
|
||||
import { createTestLogger } from '../shared/logger/logger.test-utils';
|
||||
import { createInvalidPostProcessingStrategyError } from './ingestion-folders.errors';
|
||||
import { moveIngestionFile, processFile } from './ingestion-folders.usecases';
|
||||
|
||||
describe('ingestion-folders usecases', () => {
|
||||
describe('processFile', () => {
|
||||
describe('when a file is added to an organization ingestion folder', () => {
|
||||
test('if the post processing strategy is set to "move", the file is ingested and moved to the done folder', async () => {
|
||||
const { logger, getLogs } = createTestLogger();
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'move',
|
||||
moveToFolderPath: 'done',
|
||||
},
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const documentsStorageService = await inMemoryStorageDriverFactory();
|
||||
let documentIdIndex = 1;
|
||||
const generateDocumentId = () => `doc_${documentIdIndex++}`;
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: await createDocumentCreationUsecase({ db, config, logger, documentsStorageService, generateDocumentId }),
|
||||
});
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(1);
|
||||
expect(documents[0]).to.deep.include({
|
||||
id: 'doc_1',
|
||||
organizationId: 'org_111111111111111111111111',
|
||||
createdBy: null,
|
||||
name: 'hello.md',
|
||||
originalName: 'hello.md',
|
||||
originalSize: 11,
|
||||
});
|
||||
|
||||
// Check file storage
|
||||
const files = Array.from(documentsStorageService._getStorage().values());
|
||||
|
||||
expect(files).to.have.length(1);
|
||||
|
||||
const [file] = files;
|
||||
|
||||
expect(file.name).to.equal('hello.md');
|
||||
expect(file.size).to.equal(11);
|
||||
expect(file.type).to.equal('text/markdown');
|
||||
expect(await file.text()).to.equal('lorem ipsum');
|
||||
|
||||
// Check FS, ensure the file has been moved to the done folder
|
||||
expect(vol.toJSON()).to.deep.equal({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/done/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
// Check logs
|
||||
expect(getLogs({ excludeTimestampMs: true })).to.eql(
|
||||
[
|
||||
{
|
||||
data: {
|
||||
documentId: 'doc_1',
|
||||
organizationId: 'org_111111111111111111111111',
|
||||
userId: undefined,
|
||||
},
|
||||
level: 'info',
|
||||
message: 'Document created',
|
||||
namespace: 'test',
|
||||
},
|
||||
{
|
||||
data: {
|
||||
documentId: 'doc_1',
|
||||
},
|
||||
level: 'info',
|
||||
message: 'Document imported from ingestion folder',
|
||||
namespace: 'test',
|
||||
},
|
||||
{
|
||||
data: {
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
},
|
||||
level: 'info',
|
||||
message: 'File moved after ingestion',
|
||||
namespace: 'test',
|
||||
},
|
||||
],
|
||||
);
|
||||
});
|
||||
|
||||
test('if the post processing strategy is set to "delete", the file is ingested and deleted', async () => {
|
||||
const { logger, getLogs } = createTestLogger();
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'delete',
|
||||
},
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const documentsStorageService = await inMemoryStorageDriverFactory();
|
||||
let documentIdIndex = 1;
|
||||
const generateDocumentId = () => `doc_${documentIdIndex++}`;
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: await createDocumentCreationUsecase({ db, config, logger, documentsStorageService, generateDocumentId }),
|
||||
});
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(1);
|
||||
expect(documents[0]).to.deep.include({
|
||||
id: 'doc_1',
|
||||
organizationId: 'org_111111111111111111111111',
|
||||
createdBy: null,
|
||||
name: 'hello.md',
|
||||
originalName: 'hello.md',
|
||||
originalSize: 11,
|
||||
});
|
||||
|
||||
// Check file storage
|
||||
const files = Array.from(documentsStorageService._getStorage().values());
|
||||
|
||||
expect(files).to.have.length(1);
|
||||
|
||||
const [file] = files;
|
||||
|
||||
expect(file.name).to.equal('hello.md');
|
||||
expect(file.size).to.equal(11);
|
||||
expect(file.type).to.equal('text/markdown');
|
||||
expect(await file.text()).to.equal('lorem ipsum');
|
||||
|
||||
// Check FS, ensure the file has been moved to the done folder
|
||||
expect(vol.toJSON()).to.deep.equal({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111': null,
|
||||
});
|
||||
|
||||
// Check logs
|
||||
expect(getLogs({ excludeTimestampMs: true })).to.eql(
|
||||
[
|
||||
{
|
||||
data: {
|
||||
documentId: 'doc_1',
|
||||
organizationId: 'org_111111111111111111111111',
|
||||
userId: undefined,
|
||||
},
|
||||
level: 'info',
|
||||
message: 'Document created',
|
||||
namespace: 'test',
|
||||
},
|
||||
{
|
||||
data: {
|
||||
documentId: 'doc_1',
|
||||
},
|
||||
level: 'info',
|
||||
message: 'Document imported from ingestion folder',
|
||||
namespace: 'test',
|
||||
},
|
||||
{
|
||||
data: {
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
},
|
||||
level: 'info',
|
||||
message: 'File deleted after ingestion',
|
||||
namespace: 'test',
|
||||
},
|
||||
],
|
||||
);
|
||||
});
|
||||
|
||||
test('if the post processing strategy is not implemented, an error is thrown after the file has been ingested', async () => {
|
||||
const { logger } = createTestLogger();
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'unknown' as any,
|
||||
},
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const documentsStorageService = await inMemoryStorageDriverFactory();
|
||||
let documentIdIndex = 1;
|
||||
const generateDocumentId = () => `doc_${documentIdIndex++}`;
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
const [, error] = await safely(processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: await createDocumentCreationUsecase({ db, config, logger, documentsStorageService, generateDocumentId }),
|
||||
}));
|
||||
|
||||
expect(error).to.deep.equal(createInvalidPostProcessingStrategyError({ strategy: 'unknown' }));
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(1);
|
||||
expect(documents[0]).to.deep.include({
|
||||
id: 'doc_1',
|
||||
organizationId: 'org_111111111111111111111111',
|
||||
createdBy: null,
|
||||
name: 'hello.md',
|
||||
originalName: 'hello.md',
|
||||
originalSize: 11,
|
||||
});
|
||||
|
||||
// Check file storage
|
||||
const files = Array.from(documentsStorageService._getStorage().values());
|
||||
|
||||
expect(files).to.have.length(1);
|
||||
|
||||
const [file] = files;
|
||||
|
||||
expect(file.name).to.equal('hello.md');
|
||||
expect(file.size).to.equal(11);
|
||||
expect(file.type).to.equal('text/markdown');
|
||||
expect(await file.text()).to.equal('lorem ipsum');
|
||||
|
||||
// Check FS, ensure the file is still in the ingestion folder
|
||||
expect(vol.toJSON()).to.deep.equal({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
});
|
||||
|
||||
test('if for some reason the file cannot be read, a log is emitted and the processing stops', async () => {
|
||||
const { logger, getLogs } = createTestLogger();
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'unknown' as any,
|
||||
},
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const documentsStorageService = await inMemoryStorageDriverFactory();
|
||||
let documentIdIndex = 1;
|
||||
const generateDocumentId = () => `doc_${documentIdIndex++}`;
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: {
|
||||
...createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
readFile: async () => {
|
||||
throw new Error('File not found');
|
||||
},
|
||||
},
|
||||
createDocument: await createDocumentCreationUsecase({ db, config, logger, documentsStorageService, generateDocumentId }),
|
||||
});
|
||||
|
||||
// Check logs
|
||||
expect(getLogs({ excludeTimestampMs: true })).to.eql(
|
||||
[
|
||||
{
|
||||
data: {
|
||||
error: new Error('File not found'),
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
},
|
||||
level: 'error',
|
||||
message: 'Error reading file',
|
||||
namespace: 'test',
|
||||
},
|
||||
],
|
||||
);
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(0);
|
||||
});
|
||||
|
||||
test('if a file is located in the post-processed "done" folder or "error" folder, it is not processed nor ingested', async () => {
|
||||
const { logger, getLogs } = createTestLogger();
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'move',
|
||||
moveToFolderPath: 'done',
|
||||
},
|
||||
errorFolder: 'error',
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/done/hello.md': 'lorem ipsum',
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/error/world.md': 'dolor sit amet',
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/done/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: async () => expect.fail('Document should not be created'),
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/error/world.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: async () => expect.fail('Document should not be created'),
|
||||
});
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(0);
|
||||
|
||||
// Check logs
|
||||
expect(getLogs({ excludeTimestampMs: true })).to.eql(
|
||||
[
|
||||
{
|
||||
data: {
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/done/hello.md',
|
||||
},
|
||||
level: 'debug',
|
||||
message: 'File from post-processing folder, skipping',
|
||||
namespace: 'test',
|
||||
},
|
||||
{
|
||||
data: {
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/error/world.md',
|
||||
},
|
||||
level: 'debug',
|
||||
message: 'File from error folder, skipping',
|
||||
namespace: 'test',
|
||||
},
|
||||
],
|
||||
);
|
||||
});
|
||||
|
||||
test('when the document already exists in the database, it is not ingested, but the post-processing is still executed', async () => {
|
||||
const { logger, getLogs } = createTestLogger();
|
||||
|
||||
// This is the sha256 hash of the "lorem ipsum" text
|
||||
const loremIpsumSha256Hash = '5e2bf57d3f40c4b6df69daf1936cb766f832374b4fc0259a7cbff06e2f70f269';
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
documents: [{ id: 'doc_1', organizationId: 'org_111111111111111111111111', name: 'hello.md', originalName: 'hello.md', originalStorageKey: 'hello.md', originalSha256Hash: loremIpsumSha256Hash, mimeType: 'text/markdown' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const documentsStorageService = await inMemoryStorageDriverFactory();
|
||||
let documentIdIndex = 1;
|
||||
const generateDocumentId = () => `doc_${documentIdIndex++}`;
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'move',
|
||||
moveToFolderPath: 'done',
|
||||
},
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: await createDocumentCreationUsecase({ db, config, logger, documentsStorageService, generateDocumentId }),
|
||||
});
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(1);
|
||||
expect(documents[0].id).to.equal('doc_1');
|
||||
|
||||
// Check fs
|
||||
expect(vol.toJSON()).to.deep.equal({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/done/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
// Check logs
|
||||
expect(getLogs({ excludeTimestampMs: true })).to.eql(
|
||||
[
|
||||
{
|
||||
data: {
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
},
|
||||
level: 'info',
|
||||
message: 'Document not inserted because it already exists',
|
||||
namespace: 'test',
|
||||
},
|
||||
{
|
||||
data: {
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
},
|
||||
level: 'info',
|
||||
message: 'File moved after ingestion',
|
||||
namespace: 'test',
|
||||
},
|
||||
],
|
||||
);
|
||||
});
|
||||
|
||||
test('when there is an issue with the document creation, the file is moved to the error folder', async () => {
|
||||
const { logger, getLogs } = createTestLogger();
|
||||
|
||||
// This is the sha256 hash of the "lorem ipsum" text
|
||||
const loremIpsumSha256Hash = '5e2bf57d3f40c4b6df69daf1936cb766f832374b4fc0259a7cbff06e2f70f269';
|
||||
|
||||
const { db } = await createInMemoryDatabase({
|
||||
organizations: [{ id: 'org_111111111111111111111111', name: 'Org 1' }],
|
||||
documents: [{ id: 'doc_1', organizationId: 'org_111111111111111111111111', name: 'hello.md', originalName: 'hello.md', originalStorageKey: 'hello.md', originalSha256Hash: loremIpsumSha256Hash, mimeType: 'text/markdown' }],
|
||||
});
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
|
||||
const config = overrideConfig({
|
||||
ingestionFolder: {
|
||||
folderRootPath: '/apps/papra/ingestion',
|
||||
postProcessing: {
|
||||
strategy: 'move',
|
||||
moveToFolderPath: 'done',
|
||||
},
|
||||
errorFolder: 'error',
|
||||
},
|
||||
documentsStorage: {
|
||||
driver: 'in-memory',
|
||||
},
|
||||
});
|
||||
|
||||
const { vol } = memfs({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await processFile({
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
ingestionFolderPath: '/apps/papra/ingestion',
|
||||
config,
|
||||
organizationsRepository,
|
||||
logger,
|
||||
fs: createFsServices({ fs: vol.promises as unknown as FsNative }),
|
||||
createDocument: async () => {
|
||||
throw new Error('Document creation failed');
|
||||
},
|
||||
});
|
||||
|
||||
// Check database
|
||||
const documents = await db.select().from(documentsTable);
|
||||
|
||||
expect(documents).to.have.length(1);
|
||||
expect(documents[0].id).to.equal('doc_1');
|
||||
|
||||
// Check fs
|
||||
expect(vol.toJSON()).to.deep.equal({
|
||||
'/apps/papra/ingestion/org_111111111111111111111111/error/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
// Check logs
|
||||
expect(getLogs({ excludeTimestampMs: true })).to.eql(
|
||||
[
|
||||
{
|
||||
data: {
|
||||
error: new Error('Document creation failed'),
|
||||
filePath: '/apps/papra/ingestion/org_111111111111111111111111/hello.md',
|
||||
},
|
||||
level: 'error',
|
||||
message: 'Error creating document',
|
||||
namespace: 'test',
|
||||
},
|
||||
],
|
||||
);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('moveIngestionFile', () => {
|
||||
describe(`a file from the ingestion folder can be moved
|
||||
- either to the done folder, when the post-processing strategy is set to "move"
|
||||
- or to the error folder, when an error occurs during the ingestion
|
||||
so this can lead to data loss if not handled properly`, () => {
|
||||
test('in best case, the file is moved to the destination folder', async () => {
|
||||
const { fs, getFsState } = createInMemoryFsServices({
|
||||
'/apps/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await moveIngestionFile({
|
||||
filePath: '/apps/hello.md',
|
||||
moveToFolder: '/foo/destination',
|
||||
fs,
|
||||
});
|
||||
|
||||
expect(getFsState()).to.deep.equal({
|
||||
'/apps': null,
|
||||
'/foo/destination/hello.md': 'lorem ipsum',
|
||||
});
|
||||
});
|
||||
|
||||
test('if the destination file already exists, but has the same name and same content, the original file is deleted', async () => {
|
||||
const { fs, getFsState } = createInMemoryFsServices({
|
||||
'/apps/hello.md': 'lorem ipsum',
|
||||
'/foo/destination/hello.md': 'lorem ipsum',
|
||||
});
|
||||
|
||||
await moveIngestionFile({
|
||||
filePath: '/apps/hello.md',
|
||||
moveToFolder: '/foo/destination',
|
||||
fs,
|
||||
});
|
||||
|
||||
expect(getFsState()).to.deep.equal({
|
||||
'/apps': null,
|
||||
'/foo/destination/hello.md': 'lorem ipsum',
|
||||
});
|
||||
});
|
||||
|
||||
test('if the destination file already exists, but has the same name and different content, the original file is renamed with a timestamp', async () => {
|
||||
const { fs, getFsState } = createInMemoryFsServices({
|
||||
'/apps/hello.md': 'lorem ipsum',
|
||||
'/foo/destination/hello.md': 'dolor sit amet',
|
||||
});
|
||||
|
||||
await moveIngestionFile({
|
||||
filePath: '/apps/hello.md',
|
||||
moveToFolder: '/foo/destination',
|
||||
fs,
|
||||
now: new Date('2021-01-01'),
|
||||
});
|
||||
|
||||
expect(getFsState()).to.deep.equal({
|
||||
'/apps': null,
|
||||
'/foo/destination/hello.md': 'dolor sit amet',
|
||||
'/foo/destination/hello_1609459200000.md': 'lorem ipsum',
|
||||
});
|
||||
});
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -0,0 +1,277 @@
|
||||
import type { Stats } from 'node:fs';
|
||||
import type { Database } from '../app/database/database.types';
|
||||
import type { Config } from '../config/config.types';
|
||||
import type { CreateDocumentUsecase } from '../documents/documents.usecases';
|
||||
import type { FsServices } from '../shared/fs/fs.services';
|
||||
import type { Logger } from '../shared/logger/logger';
|
||||
import { isAbsolute, join, parse } from 'node:path';
|
||||
import { safely } from '@corentinth/chisels';
|
||||
import chokidar from 'chokidar';
|
||||
import { uniq } from 'lodash-es';
|
||||
import PQueue from 'p-queue';
|
||||
import picomatch from 'picomatch';
|
||||
import { DOCUMENT_ALREADY_EXISTS_ERROR_CODE } from '../documents/documents.errors';
|
||||
import { createDocumentCreationUsecase } from '../documents/documents.usecases';
|
||||
import { createOrganizationsRepository, type OrganizationsRepository } from '../organizations/organizations.repository';
|
||||
import { isErrorWithCode } from '../shared/errors/errors';
|
||||
import { createFsServices } from '../shared/fs/fs.services';
|
||||
import { createLogger } from '../shared/logger/logger';
|
||||
import { getRootDirPath } from '../shared/path';
|
||||
import { addTimestampToFilename, getAbsolutePathFromFolderRelativeToOrganizationIngestionFolder, getOrganizationIdFromFilePath, isFileInDoneFolder, isFileInErrorFolder, normalizeFilePathToIngestionFolder } from './ingestion-folder.models';
|
||||
import { createInvalidPostProcessingStrategyError } from './ingestion-folders.errors';
|
||||
import { getFile } from './ingestion-folders.services';
|
||||
|
||||
export function createIngestionFolderWatcher({
|
||||
config,
|
||||
logger = createLogger({ namespace: 'ingestion-folder-watcher' }),
|
||||
db,
|
||||
}: {
|
||||
config: Config;
|
||||
logger?: Logger;
|
||||
db: Database;
|
||||
}) {
|
||||
const { folderRootPath, watcher: { usePolling, pollingInterval }, processingConcurrency } = config.ingestionFolder;
|
||||
|
||||
const processingQueue = new PQueue({ concurrency: processingConcurrency });
|
||||
const cwd = getRootDirPath();
|
||||
const ingestionFolderPath = isAbsolute(folderRootPath) ? folderRootPath : join(cwd, folderRootPath);
|
||||
|
||||
return {
|
||||
startWatchingIngestionFolders: async () => {
|
||||
const organizationsRepository = createOrganizationsRepository({ db });
|
||||
const createDocument = await createDocumentCreationUsecase({ db, config, logger });
|
||||
|
||||
const ignored = await buildPathIgnoreFunction({ config, cwd, organizationsRepository });
|
||||
|
||||
chokidar
|
||||
.watch(
|
||||
folderRootPath,
|
||||
{
|
||||
persistent: true,
|
||||
followSymlinks: true,
|
||||
awaitWriteFinish: true,
|
||||
atomic: true,
|
||||
cwd,
|
||||
usePolling,
|
||||
interval: pollingInterval,
|
||||
ignored,
|
||||
},
|
||||
)
|
||||
.on('add', (fileMaybeCwdRelativePath) => {
|
||||
processingQueue.add(async () => {
|
||||
const filePath = isAbsolute(fileMaybeCwdRelativePath) ? fileMaybeCwdRelativePath : join(cwd, fileMaybeCwdRelativePath);
|
||||
|
||||
logger.info({ filePath }, 'Processing file');
|
||||
|
||||
const [, error] = await safely(processFile({ filePath, ingestionFolderPath, createDocument, logger, config, organizationsRepository }));
|
||||
|
||||
if (error) {
|
||||
logger.error({ filePath, error }, 'Error processing file');
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
logger.info(
|
||||
{
|
||||
folderRootPath,
|
||||
usePolling,
|
||||
pollingInterval,
|
||||
processingConcurrency,
|
||||
},
|
||||
'Ingestion folder watcher started',
|
||||
);
|
||||
},
|
||||
};
|
||||
}
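A minimal bootstrap sketch for the watcher above, assuming `db` and `config` come from the server's startup code:

```ts
const { startWatchingIngestionFolders } = createIngestionFolderWatcher({ db, config });

await startWatchingIngestionFolders();
```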
|
||||
|
||||
export async function processFile({
  filePath,
  ingestionFolderPath,
  logger,
  organizationsRepository,
  config,
  createDocument,
  fs = createFsServices(),
}: {
  filePath: string;
  ingestionFolderPath: string;
  logger: Logger;
  config: Config;
  createDocument: CreateDocumentUsecase;
  organizationsRepository: OrganizationsRepository;
  fs?: FsServices;
}) {
  const { postProcessing: { moveToFolderPath: doneFolder }, errorFolder } = config.ingestionFolder;

  // Get the file from the ingestion folder as a File instance
  const [getFileResult, getFileError] = await safely(getFile({ filePath, fs }));

  if (getFileError) {
    logger.error({ filePath, error: getFileError }, 'Error reading file');
    return;
  }

  const { file } = getFileResult;

  // The organization is inferred from the first path segment under the ingestion root
  const { organizationId } = await getFileOrganizationId({ filePath, ingestionFolderPath, organizationsRepository });

  if (!organizationId) {
    logger.warn({ filePath }, 'A file in the ingestion folder is not located in an organization ingestion folder, skipping');
    return;
  }

  const organizationIngestionFolderPath = join(ingestionFolderPath, organizationId);

  if (isFileInDoneFolder({ filePath, doneFolder, organizationIngestionFolderPath })) {
    logger.debug({ filePath }, 'File from post-processing folder, skipping');
    return;
  }

  if (isFileInErrorFolder({ filePath, errorFolder, organizationIngestionFolderPath })) {
    logger.debug({ filePath }, 'File from error folder, skipping');
    return;
  }

  const [result, error] = await safely(createDocument({ file, organizationId }));

  const isNotInsertedBecauseAlreadyExists = isErrorWithCode({ error, code: DOCUMENT_ALREADY_EXISTS_ERROR_CODE });

  if (error && !isNotInsertedBecauseAlreadyExists) {
    logger.error({ filePath, error }, 'Error creating document');
    const errorFolderPath = getAbsolutePathFromFolderRelativeToOrganizationIngestionFolder({ path: errorFolder, organizationIngestionFolderPath });

    await moveIngestionFile({ filePath, moveToFolder: errorFolderPath, fs });
    return;
  }

  if (isNotInsertedBecauseAlreadyExists) {
    logger.info({ filePath }, 'Document not inserted because it already exists');
  }

  if (result) {
    const { document } = result;

    logger.info({ documentId: document.id }, 'Document imported from ingestion folder');
  }

  await postProcessFile({ filePath, organizationIngestionFolderPath, logger, config, fs });
}
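
// Illustrative layout of the ingestion folder (folder names other than the organization
// id are configurable; `done` and `error` below are assumed example values for
// `postProcessing.moveToFolderPath` and `errorFolder`):
//
//   <folderRootPath>/
//     org_abc123/            <- organization ingestion folder, watched for new files
//       invoice.pdf          <- picked up and imported into org_abc123
//       done/                <- successfully processed files (when the strategy is 'move')
//       error/               <- files whose document creation failed
//     org_def456/
//       ...
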
async function postProcessFile({
  filePath,
  organizationIngestionFolderPath,
  logger,
  config,
  fs = createFsServices(),
}: {
  filePath: string;
  organizationIngestionFolderPath: string;
  logger: Logger;
  config: Config;
  fs?: FsServices;
}) {
  const { postProcessing: { strategy, moveToFolderPath } } = config.ingestionFolder;

  if (strategy === 'delete') {
    await fs.deleteFile({ filePath });
    logger.info({ filePath }, 'File deleted after ingestion');
    return;
  }

  if (strategy === 'move') {
    const path = getAbsolutePathFromFolderRelativeToOrganizationIngestionFolder({ path: moveToFolderPath, organizationIngestionFolderPath });

    await moveIngestionFile({ filePath, moveToFolder: path, fs });
    logger.info({ filePath }, 'File moved after ingestion');
    return;
  }

  throw createInvalidPostProcessingStrategyError({ strategy });
}
async function getFileOrganizationId({ filePath, ingestionFolderPath, organizationsRepository }: { filePath: string; ingestionFolderPath: string; organizationsRepository: OrganizationsRepository }) {
  const { relativeFilePath } = normalizeFilePathToIngestionFolder({ filePath, ingestionFolderPath });

  const { organizationId } = getOrganizationIdFromFilePath({ relativeFilePath });

  if (!organizationId) {
    return { organizationId: undefined };
  }

  // Optional chaining guards against folders named after unknown organization ids
  const { organization } = await organizationsRepository.getOrganizationById({ organizationId });

  return { organizationId: organization?.id };
}
async function buildPathIgnoreFunction({
  config,
  cwd = getRootDirPath(),
  organizationsRepository,
}: {
  config: Config;
  cwd?: string;
  organizationsRepository: OrganizationsRepository;
}) {
  const { ingestionFolder: { postProcessing: { strategy, moveToFolderPath }, errorFolder, ignoredPatterns, folderRootPath } } = config;

  const { organizationIds } = await organizationsRepository.getAllOrganizationIds();

  // Post-processing folders are either absolute paths or paths relative to each organization
  // ingestion folder; absolute paths are wrapped in an array so the spreads below always
  // operate on string arrays rather than spreading a string into characters
  const doneFolders = strategy === 'move'
    ? isAbsolute(moveToFolderPath) ? [moveToFolderPath] : uniq(organizationIds.map(id => join(cwd, folderRootPath, id, moveToFolderPath)))
    : [];

  const errorFolders = isAbsolute(errorFolder) ? [errorFolder] : uniq(organizationIds.map(id => join(cwd, folderRootPath, id, errorFolder)));

  const ignoredFolders = [...doneFolders, ...errorFolders];
  const matchExcludedPatterns = picomatch(ignoredPatterns);

  return (path: string, stats?: Stats) => {
    const normalizedPath = isAbsolute(path) ? path : join(cwd, path);

    return Boolean(stats?.isFile()) && (ignoredFolders.some(folder => normalizedPath.startsWith(folder)) || matchExcludedPatterns(normalizedPath));
  };
}
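
// Behavior sketch of the returned predicate (paths and patterns are illustrative;
// `ignoredPatterns` could for instance contain '**/.*' to skip hidden files):
//
//   const ignored = await buildPathIgnoreFunction({ config, cwd, organizationsRepository });
//
//   ignored('/data/ingestion/org_abc123/invoice.pdf', fileStats);       // false -> processed
//   ignored('/data/ingestion/org_abc123/done/invoice.pdf', fileStats);  // true  -> skipped (post-processing folder)
//   ignored('/data/ingestion/org_abc123/.DS_Store', fileStats);         // true if matched by ignoredPatterns
//   ignored('/data/ingestion/org_abc123', directoryStats);              // false -> directories are never ignored, so chokidar keeps descending
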
export async function moveIngestionFile({
  filePath,
  moveToFolder,
  fs = createFsServices(),
  now = new Date(),
}: {
  filePath: string;
  moveToFolder: string;
  fs?: FsServices;
  now?: Date;
}) {
  const { base } = parse(filePath);
  const newFilePath = join(moveToFolder, base);

  await fs.ensureDirectoryExists({ path: moveToFolder });

  // Check if the destination file already exists
  const destinationFileExists = await fs.checkFileExists({ path: newFilePath });

  if (destinationFileExists) {
    // Check if the files have the same content
    const sameContent = await fs.areFilesContentIdentical({
      file1: filePath,
      file2: newFilePath,
    });

    if (sameContent) {
      // If same content, no need to move - just delete the source file
      await fs.deleteFile({ filePath });
      return;
    } else {
      // If different content, generate a new filename with timestamp
      const newFileName = addTimestampToFilename({ fileName: base, now });
      const timestampedFilePath = join(moveToFolder, newFileName);

      await fs.moveFile({ sourceFilePath: filePath, destinationFilePath: timestampedFilePath });
      return;
    }
  }

  // Default case: no conflict, simple move
  await fs.moveFile({ sourceFilePath: filePath, destinationFilePath: newFilePath });
}
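
// Test sketch (illustrative): exercising the conflict handling of moveIngestionFile with a
// stubbed FsServices. The vitest setup and the './ingestion-folders.usecases' import path
// are assumptions; only the FsServices method names come from the code above.

import { describe, expect, test } from 'vitest';
import { moveIngestionFile } from './ingestion-folders.usecases';

describe('moveIngestionFile', () => {
  test('an identical file already present at the destination causes the source to be deleted, not moved', async () => {
    const operations: string[] = [];

    // Minimal FsServices stub covering only the methods moveIngestionFile touches
    const fs = {
      ensureDirectoryExists: async () => {},
      checkFileExists: async () => true,
      areFilesContentIdentical: async () => true,
      deleteFile: async ({ filePath }: { filePath: string }) => { operations.push(`delete ${filePath}`); },
      moveFile: async () => { operations.push('move'); },
    } as any;

    await moveIngestionFile({ filePath: '/ingestion/org_1/doc.pdf', moveToFolder: '/ingestion/org_1/done', fs });

    expect(operations).toEqual(['delete /ingestion/org_1/doc.pdf']);
  });
});
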
@@ -3,6 +3,7 @@ import type { Config } from '../../config/config.types';
 export type IntakeEmailsServices = {
   name: string;
   generateEmailAddress: () => Promise<{ emailAddress: string }>;
+  deleteEmailAddress: ({ emailAddress }: { emailAddress: string }) => Promise<void>;
 };
 
 export type IntakeEmailDriverFactory = (args: { config: Config }) => IntakeEmailsServices;

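// Driver-contract sketch (illustrative): a hypothetical in-memory driver showing the shape
// the updated IntakeEmailsServices contract expects, including the new deleteEmailAddress
// method. The 'in-memory' name, the 'example.com' domain, and the module this would live in
// are assumptions; defineIntakeEmailDriver and buildEmailAddress are the existing helpers
// used by the real drivers below.

import { buildEmailAddress } from '../../intake-emails.models';
import { defineIntakeEmailDriver } from '../intake-emails.drivers.models';

export const inMemoryIntakeEmailDriverFactory = defineIntakeEmailDriver(() => {
  const emailAddresses = new Set<string>();

  return {
    name: 'in-memory',
    generateEmailAddress: async () => {
      const emailAddress = buildEmailAddress({ username: `user-${emailAddresses.size + 1}`, domain: 'example.com' });
      emailAddresses.add(emailAddress);

      return { emailAddress };
    },
    deleteEmailAddress: async ({ emailAddress }) => {
      emailAddresses.delete(emailAddress);
    },
  };
});
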
@@ -1,11 +1,15 @@
-import { buildUrl } from '@corentinth/chisels';
+import { buildUrl, safely } from '@corentinth/chisels';
 import { generateId as generateHumanReadableId } from '@corentinth/friendly-ids';
 import { createClient } from '@owlrelay/api-sdk';
+import { createLogger } from '../../../shared/logger/logger';
+import { INTAKE_EMAILS_INGEST_ROUTE } from '../../intake-emails.constants';
 import { buildEmailAddress } from '../../intake-emails.models';
 import { defineIntakeEmailDriver } from '../intake-emails.drivers.models';
 
 export const OWLRELAY_INTAKE_EMAIL_DRIVER_NAME = 'owlrelay';
 
+const logger = createLogger({ namespace: 'intake-emails.drivers.owlrelay' });
+
 export const owlrelayIntakeEmailDriverFactory = defineIntakeEmailDriver(({ config }) => {
   const { baseUrl } = config.server;
   const { webhookSecret } = config.intakeEmails;
@@ -15,12 +19,12 @@ export const owlrelayIntakeEmailDriverFactory = defineIntakeEmailDriver(({ confi
     apiKey: owlrelayApiKey,
   });
 
-  const webhookUrl = configuredWebhookUrl ?? buildUrl({ baseUrl, path: '/api/intake-emails/owlrelay' });
+  const webhookUrl = configuredWebhookUrl ?? buildUrl({ baseUrl, path: INTAKE_EMAILS_INGEST_ROUTE });
 
   return {
     name: OWLRELAY_INTAKE_EMAIL_DRIVER_NAME,
     generateEmailAddress: async () => {
-      const { domain, username } = await client.createEmail({
+      const { domain, username, id: owlrelayEmailId } = await client.createEmail({
         username: generateHumanReadableId(),
         webhookUrl,
         webhookSecret,
@@ -28,9 +32,21 @@ export const owlrelayIntakeEmailDriverFactory = defineIntakeEmailDriver(({ confi
       const emailAddress = buildEmailAddress({ username, domain });
 
+      logger.info({ emailAddress, owlrelayEmailId }, 'Created email address in OwlRelay');
+
       return {
         emailAddress,
       };
     },
+    deleteEmailAddress: async ({ emailAddress }) => {
+      const [, error] = await safely(client.deleteEmail({ emailAddress }));
+
+      if (error) {
+        logger.error({ error }, 'Failed to delete email address in OwlRelay');
+        return;
+      }
+
+      logger.info({ emailAddress }, 'Deleted email address in OwlRelay');
+    },
   };
 });
 
@@ -15,5 +15,7 @@ export const randomUsernameIntakeEmailDriverFactory = defineIntakeEmailDriver(({
         emailAddress: `${randomUsername}@${domain}`,
       };
     },
+    // Deletion functionality is not required for this driver
+    deleteEmailAddress: async () => {},
   };
 });

Some files were not shown because too many files have changed in this diff.