Compare commits


59 Commits

Author SHA1 Message Date
Trenton H
38df71b71a Missing ijson 2026-01-29 10:39:11 -08:00
Trenton H
7bae6b7f6d Fixes the merge conflicts I missed 2026-01-29 10:35:10 -08:00
Trenton H
1c99e55069 Initial version hacked up by Opus 2026-01-29 10:06:02 -08:00
Trenton H
b44eea6508 The websocket included script 2026-01-29 09:39:52 -08:00
Trenton H
b8af971652 Merge remote-tracking branch 'origin/dev' into feature-migrator-application 2026-01-29 09:28:24 -08:00
Trenton H
66593ec660 Chore: Bulk backend updates (#11543) 2026-01-28 13:30:12 -08:00
GitHub Actions
5af0d1da26 Auto translate strings 2026-01-28 16:27:11 +00:00
shamoon
3281ec2401 Documentation: update duplicates note 2026-01-28 08:25:16 -08:00
shamoon
dc9061eb97 Chore: refactor zoom and editor mode to use enums 2026-01-28 08:25:16 -08:00
Trenton H
6859e7e3c2 Chore: Resolve more flaky tests (#11920) 2026-01-28 16:13:27 +00:00
Jan Kleine
3e645bd9e2 Tweak: increase minimum screen width before inserting padding (#11926) 2026-01-28 15:57:47 +00:00
GitHub Actions
09d39de200 Auto translate strings 2026-01-28 15:55:01 +00:00
Jan Kleine
94231dbb0f Enhancement: Add setting for default PDF Editor mode (#11927)
---------

Co-authored-by: shamoon <4887959+shamoon@users.noreply.github.com>
2026-01-28 15:53:14 +00:00
Trenton H
2f76350023 Chore: Push manually dispatched images to the registry (#11925) 2026-01-28 15:47:32 +00:00
Pierre Nédélec
4cbe56e3af Chore: Http interceptors refactor (#11923)
---------

Co-authored-by: shamoon <4887959+shamoon@users.noreply.github.com>
2026-01-28 07:18:48 -08:00
Trenton H
01b21377af Chore: Use a local http server instead of external to reduce flakiness (#11916) 2026-01-28 03:57:12 +00:00
Pierre Nédélec
56b5d838d7 Chore: remove deprecated Angular method (#11919)
---------

Co-authored-by: shamoon <4887959+shamoon@users.noreply.github.com>
2026-01-27 16:58:38 -08:00
shamoon
d294508982 Fixhancement: auto-queue llm index if needed (#11891) 2026-01-27 21:48:17 +00:00
Philipp Defner
02002620d2 Development: update devcontainer setup, add documentation for pre-commit, set uv cache dir (#11882) 2026-01-27 20:45:56 +00:00
shamoon
6d93ae93b4 Chore: fix session token strategy import deprecation (#11914) 2026-01-27 19:38:33 +00:00
Trenton H
c84f2f04b3 Chore: Switch to a local IMAP server instead of a real email service (#11913) 2026-01-27 11:35:12 -08:00
GitHub Actions
d9d83e3045 Auto translate strings 2026-01-27 18:57:11 +00:00
shamoon
1f074390e4 Feature: sharelink bundles (#11682) 2026-01-27 18:54:51 +00:00
Trenton H
50d676c592 Chore: Upgrade to Pytest 9 (#11898) 2026-01-27 17:01:13 +00:00
GitHub Actions
94b0f4e114 Auto translate strings 2026-01-27 07:25:45 +00:00
shamoon
045994042b Enhancement: user control of doc details fields (#11906) 2026-01-26 23:23:53 -08:00
shamoon
e1655045ca Ready
[ci skip]
2026-01-23 22:04:58 -08:00
shamoon
1a638d8cc0 drop, migrate, then import 2026-01-23 22:04:44 -08:00
shamoon
b21ff75a30 Run importer 2026-01-23 21:50:01 -08:00
shamoon
58f1a186d4 2.20.6 2026-01-23 21:38:02 -08:00
shamoon
2a1c06c047 Merge branch 'feature/migrator' of https://github.com/paperless-ngx/paperless-ngx into feature/migrator 2026-01-23 21:33:27 -08:00
shamoon
770dc02833 Add root URL redirect to migration home 2026-01-23 21:31:51 -08:00
shamoon
af9d75dfcf Fix static files again 2026-01-23 21:31:34 -08:00
shamoon
7b23cdc0c1 Opacify complete steps 2026-01-23 15:44:07 -08:00
shamoon
09892809f9 Tweak instructions 2026-01-23 15:37:53 -08:00
shamoon
94c6108006 Nice, upload button 2026-01-23 15:29:19 -08:00
shamoon
33c5d5bab0 Update migration_home.html
[ci skip]
2026-01-23 08:54:00 -08:00
shamoon
9beb508f1d Auto-step after transform
[ci skip]
2026-01-23 08:40:37 -08:00
shamoon
a290fcfe6f Sick, run transform as subprocess 2026-01-23 08:39:57 -08:00
shamoon
0846fe9845 Script deps 2026-01-23 08:33:12 -08:00
shamoon
910d16374b Stumpylog's current version of the transform script
[ci skip]

Co-Authored-By: Trenton H <797416+stumpylog@users.noreply.github.com>
2026-01-23 08:22:34 -08:00
shamoon
35d77b144d Small startup detection thing 2026-01-23 08:11:34 -08:00
shamoon
5987e35101 Dummy console thing
[ci skip]
2026-01-22 23:28:26 -08:00
shamoon
96259ce441 Export instructions 2026-01-22 23:27:30 -08:00
shamoon
283afb265d Update settings.py
[ci skip]
2026-01-22 23:12:47 -08:00
shamoon
67564dd573 more light mode shit 2026-01-22 23:12:47 -08:00
shamoon
046d65c2ba Just light mode
[ci skip]
2026-01-22 22:59:27 -08:00
shamoon
8761816635 Update urls.py 2026-01-22 22:55:30 -08:00
shamoon
a1cdc45f1a one-time code 2026-01-22 22:39:11 -08:00
shamoon
190e42e722 Oh nice, reuse existing 2026-01-22 22:14:09 -08:00
shamoon
75c6ffe01f fix export dir
[ci skip]
2026-01-22 22:09:59 -08:00
shamoon
2964b4b256 Update migration_home.html
[ci skip]
2026-01-22 22:07:12 -08:00
shamoon
f52f9dd325 Basic login styling 2026-01-22 21:59:13 -08:00
shamoon
5827a0ec25 Disable unusable buttons 2026-01-22 21:59:12 -08:00
shamoon
990ef05d99 Some prettiness 2026-01-22 21:59:12 -08:00
shamoon
9f48b8e6e1 Some styling 2026-01-22 21:40:08 -08:00
shamoon
42689070b3 Still support conf 2026-01-22 21:40:08 -08:00
shamoon
09f3cfdb93 Start in migrator 2026-01-22 21:40:07 -08:00
shamoon
84f408fa43 save this, it does work 2026-01-22 21:40:07 -08:00
100 changed files with 8972 additions and 3571 deletions


@@ -89,6 +89,18 @@ Additional tasks are available for common maintenance operations:
- **Migrate Database**: To apply database migrations.
- **Create Superuser**: To create an admin user for the application.
## Committing from the Host Machine
The DevContainer automatically installs pre-commit hooks during setup. However, these hooks are configured for use inside the container.
If you want to commit changes from your host machine (outside the DevContainer), you need to install pre-commit on the host as well. The following command installs it as a standalone tool:
```bash
uv tool install pre-commit && pre-commit install
```
After this, you can commit either from inside the DevContainer or from your host machine.
## Let's Get Started!
Follow the steps above to get your development environment up and running. Happy coding!


@@ -3,26 +3,30 @@
"dockerComposeFile": "docker-compose.devcontainer.sqlite-tika.yml",
"service": "paperless-development",
"workspaceFolder": "/usr/src/paperless/paperless-ngx",
"postCreateCommand": "/bin/bash -c 'rm -rf .venv/.* && uv sync --group dev && uv run pre-commit install'",
"containerEnv": {
"UV_CACHE_DIR": "/usr/src/paperless/paperless-ngx/.uv-cache"
},
"postCreateCommand": "/bin/bash -c 'rm -rf .venv/.* && uv sync --group dev && uv run pre-commit install'",
"customizations": {
"vscode": {
"extensions": [
"mhutchie.git-graph",
"ms-python.python",
"ms-vscode.js-debug-nightly",
"eamodio.gitlens",
"yzhang.markdown-all-in-one"
],
"settings": {
"python.defaultInterpreterPath": "/usr/src/paperless/paperless-ngx/.venv/bin/python",
"python.pythonPath": "/usr/src/paperless/paperless-ngx/.venv/bin/python",
"python.terminal.activateEnvInCurrentTerminal": true,
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,
"files.trimTrailingWhitespace": true
}
"extensions": [
"mhutchie.git-graph",
"ms-python.python",
"ms-vscode.js-debug-nightly",
"eamodio.gitlens",
"yzhang.markdown-all-in-one",
"pnpm.pnpm"
],
"settings": {
"python.defaultInterpreterPath": "/usr/src/paperless/paperless-ngx/.venv/bin/python",
"python.pythonPath": "/usr/src/paperless/paperless-ngx/.venv/bin/python",
"python.terminal.activateEnvInCurrentTerminal": true,
"editor.formatOnPaste": false,
"editor.formatOnSave": true,
"editor.formatOnType": true,
"files.trimTrailingWhitespace": true
}
}
},
"remoteUser": "paperless"
}
},
"remoteUser": "paperless"
}


@@ -174,12 +174,22 @@
{
"label": "Maintenance: Install Frontend Dependencies",
"description": "Install frontend (pnpm) dependencies",
"type": "pnpm",
"script": "install",
"path": "src-ui",
"type": "shell",
"command": "pnpm install",
"group": "clean",
"problemMatcher": [],
"detail": "install dependencies from package"
"options": {
"cwd": "${workspaceFolder}/src-ui"
},
"presentation": {
"echo": true,
"reveal": "always",
"focus": true,
"panel": "shared",
"showReuseMessage": false,
"clear": true,
"revealProblems": "onProblem"
}
},
{
"description": "Clean install frontend dependencies and build the frontend for production",


@@ -75,9 +75,6 @@ jobs:
env:
NLTK_DATA: ${{ env.NLTK_DATA }}
PAPERLESS_CI_TEST: 1
PAPERLESS_MAIL_TEST_HOST: ${{ secrets.TEST_MAIL_HOST }}
PAPERLESS_MAIL_TEST_USER: ${{ secrets.TEST_MAIL_USER }}
PAPERLESS_MAIL_TEST_PASSWD: ${{ secrets.TEST_MAIL_PASSWD }}
run: |
uv run \
--python ${{ steps.setup-python.outputs.python-version }} \


@@ -46,14 +46,13 @@ jobs:
id: ref
run: |
ref_name="${GITHUB_HEAD_REF:-$GITHUB_REF_NAME}"
# Sanitize by replacing / with - for cache keys
cache_ref="${ref_name//\//-}"
# Sanitize by replacing / with - for use in tags and cache keys
sanitized_ref="${ref_name//\//-}"
echo "ref_name=${ref_name}"
echo "cache_ref=${cache_ref}"
echo "sanitized_ref=${sanitized_ref}"
echo "name=${ref_name}" >> $GITHUB_OUTPUT
echo "cache-ref=${cache_ref}" >> $GITHUB_OUTPUT
echo "name=${sanitized_ref}" >> $GITHUB_OUTPUT
- name: Check push permissions
id: check-push
env:
@@ -62,12 +61,14 @@ jobs:
# should-push: Should we push to GHCR?
# True for:
# 1. Pushes (tags/dev/beta) - filtered via the workflow triggers
# 2. Internal PRs where the branch name starts with 'feature-' - filtered here when a PR is synced
# 2. Manual dispatch - always push to GHCR
# 3. Internal PRs where the branch name starts with 'feature-' or 'fix-'
should_push="false"
if [[ "${{ github.event_name }}" == "push" ]]; then
should_push="true"
elif [[ "${{ github.event_name }}" == "workflow_dispatch" ]]; then
should_push="true"
elif [[ "${{ github.event_name }}" == "pull_request" && "${{ github.event.pull_request.head.repo.full_name }}" == "${{ github.repository }}" ]]; then
if [[ "${REF_NAME}" == feature-* || "${REF_NAME}" == fix-* ]]; then
should_push="true"
@@ -139,9 +140,9 @@ jobs:
PNGX_TAG_VERSION=${{ steps.docker-meta.outputs.version }}
outputs: type=image,name=${{ env.REGISTRY }}/${{ steps.repo.outputs.name }},push-by-digest=true,name-canonical=true,push=${{ steps.check-push.outputs.should-push }}
cache-from: |
type=registry,ref=${{ env.REGISTRY }}/${{ steps.repo.outputs.name }}/cache/app:${{ steps.ref.outputs.cache-ref }}-${{ matrix.arch }}
type=registry,ref=${{ env.REGISTRY }}/${{ steps.repo.outputs.name }}/cache/app:${{ steps.ref.outputs.name }}-${{ matrix.arch }}
type=registry,ref=${{ env.REGISTRY }}/${{ steps.repo.outputs.name }}/cache/app:dev-${{ matrix.arch }}
cache-to: ${{ steps.check-push.outputs.should-push == 'true' && format('type=registry,mode=max,ref={0}/{1}/cache/app:{2}-{3}', env.REGISTRY, steps.repo.outputs.name, steps.ref.outputs.cache-ref, matrix.arch) || '' }}
cache-to: ${{ steps.check-push.outputs.should-push == 'true' && format('type=registry,mode=max,ref={0}/{1}/cache/app:{2}-{3}', env.REGISTRY, steps.repo.outputs.name, steps.ref.outputs.name, matrix.arch) || '' }}
- name: Export digest
if: steps.check-push.outputs.should-push == 'true'
run: |
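The branch-name sanitization in the `ref` step above relies on bash's global pattern substitution, `${var//pattern/replacement}`. A standalone sketch of the idiom:

```shell
#!/usr/bin/env bash
# ${var//\//-} replaces every "/" with "-", turning a nested branch name
# into a string that is valid in an image tag or registry cache key.
ref_name="feature/migrator"
sanitized_ref="${ref_name//\//-}"
echo "${sanitized_ref}"   # prints "feature-migrator"
```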

.gitignore vendored

@@ -40,6 +40,7 @@ htmlcov/
.coverage
.coverage.*
.cache
.uv-cache
nosetests.xml
coverage.xml
*,cover


@@ -37,7 +37,7 @@ repos:
- json
# See https://github.com/prettier/prettier/issues/15742 for the fork reason
- repo: https://github.com/rbubley/mirrors-prettier
rev: 'v3.6.2'
rev: 'v3.8.1'
hooks:
- id: prettier
types_or:
@@ -49,7 +49,7 @@ repos:
- 'prettier-plugin-organize-imports@4.1.0'
# Python hooks
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.14.5
rev: v0.14.14
hooks:
- id: ruff-check
- id: ruff-format
@@ -76,7 +76,7 @@ repos:
hooks:
- id: shellcheck
- repo: https://github.com/google/yamlfmt
rev: v0.20.0
rev: v0.21.0
hooks:
- id: yamlfmt
exclude: "^src-ui/pnpm-lock.yaml"


@@ -23,3 +23,24 @@ services:
container_name: tika
network_mode: host
restart: unless-stopped
greenmail:
image: greenmail/standalone:2.1.8
hostname: greenmail
container_name: greenmail
environment:
# Enable only IMAP for now (SMTP available via 3025 if needed later)
GREENMAIL_OPTS: >-
-Dgreenmail.setup.test.imap -Dgreenmail.users=test@localhost:test -Dgreenmail.users.login=test@localhost -Dgreenmail.verbose
ports:
- "3143:3143" # IMAP
restart: unless-stopped
nginx:
image: docker.io/nginx:1.29-alpine
hostname: nginx
container_name: nginx
ports:
- "8080:8080"
restart: unless-stopped
volumes:
- ../../docs/assets:/usr/share/nginx/html/assets:ro
- ./test-nginx.conf:/etc/nginx/conf.d/default.conf:ro


@@ -0,0 +1,14 @@
server {
listen 8080;
server_name localhost;
root /usr/share/nginx/html;
# Enable CORS for test requests
add_header 'Access-Control-Allow-Origin' '*' always;
add_header 'Access-Control-Allow-Methods' 'GET, HEAD, OPTIONS' always;
location / {
try_files $uri $uri/ =404;
}
}


@@ -8,6 +8,11 @@ echo "${log_prefix} Apply database migrations..."
cd "${PAPERLESS_SRC_DIR}"
if [[ "${PAPERLESS_MIGRATION_MODE:-0}" == "1" ]]; then
echo "${log_prefix} Migration mode enabled, skipping migrations."
exit 0
fi
# The whole migrate, with flock, needs to run as the right user
if [[ -n "${USER_IS_NON_ROOT}" ]]; then
exec s6-setlock -n "${data_dir}/migration_lock" python3 manage.py migrate --skip-checks --no-input


@@ -9,7 +9,15 @@ echo "${log_prefix} Running Django checks"
cd "${PAPERLESS_SRC_DIR}"
if [[ -n "${USER_IS_NON_ROOT}" ]]; then
python3 manage.py check
if [[ "${PAPERLESS_MIGRATION_MODE:-0}" == "1" ]]; then
python3 manage_migration.py check
else
python3 manage.py check
fi
else
s6-setuidgid paperless python3 manage.py check
if [[ "${PAPERLESS_MIGRATION_MODE:-0}" == "1" ]]; then
s6-setuidgid paperless python3 manage_migration.py check
else
s6-setuidgid paperless python3 manage.py check
fi
fi
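Both branches above gate on `${PAPERLESS_MIGRATION_MODE:-0}`. The `:-` form of parameter expansion substitutes the default when the variable is unset *or* set but empty, which matters when the flag comes from a compose file. A quick sketch:

```shell
#!/usr/bin/env bash
unset PAPERLESS_MIGRATION_MODE
echo "${PAPERLESS_MIGRATION_MODE:-0}"   # unset: prints the default "0"

PAPERLESS_MIGRATION_MODE=""
echo "${PAPERLESS_MIGRATION_MODE:-0}"   # empty: ":-" also substitutes "0"

PAPERLESS_MIGRATION_MODE="1"
echo "${PAPERLESS_MIGRATION_MODE:-0}"   # set: prints "1", enabling migration mode
```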

View File

@@ -13,8 +13,14 @@ if [[ -n "${PAPERLESS_FORCE_SCRIPT_NAME}" ]]; then
export GRANIAN_URL_PATH_PREFIX=${PAPERLESS_FORCE_SCRIPT_NAME}
fi
if [[ -n "${USER_IS_NON_ROOT}" ]]; then
exec granian --interface asginl --ws --loop uvloop "paperless.asgi:application"
if [[ "${PAPERLESS_MIGRATION_MODE:-0}" == "1" ]]; then
app_module="paperless.migration_asgi:application"
else
exec s6-setuidgid paperless granian --interface asginl --ws --loop uvloop "paperless.asgi:application"
app_module="paperless.asgi:application"
fi
if [[ -n "${USER_IS_NON_ROOT}" ]]; then
exec granian --interface asginl --ws --loop uvloop "${app_module}"
else
exec s6-setuidgid paperless granian --interface asginl --ws --loop uvloop "${app_module}"
fi


@@ -582,7 +582,7 @@ document.
### Detecting duplicates {#fuzzy_duplicate}
Paperless already catches and prevents upload of exactly matching documents,
Paperless-ngx already catches and warns of exactly matching documents,
however a new scan of an existing document may not produce an exact bit for bit
duplicate. But the content should be exact or close, allowing detection.


@@ -60,20 +60,6 @@ The REST api provides five different forms of authentication.
[here](advanced_usage.md#openid-connect-and-social-authentication) for more
information on social accounts.
## Model Context Protocol (MCP)
Paperless-ngx exposes an MCP endpoint powered by `django-mcp-server` so MCP
clients can query data collections, run full-text document search, and invoke
DRF-backed CRUD tools.
- Endpoint: `/mcp/`
- Authentication: identical to the REST API (Basic, Session, Token, or Remote
User depending on your configuration).
The MCP server uses existing DRF viewsets and permissions. It also exposes a
`query_data_collections` tool for structured querying across published models
and a `search_documents` tool for full-text search.
## Searching for documents
Full text searching is available on the `/api/documents/` endpoint. Two


@@ -1617,6 +1617,16 @@ processing. This only has an effect if
Defaults to `0 1 * * *`, once per day.
## Share links
#### [`PAPERLESS_SHARE_LINK_BUNDLE_CLEANUP_CRON=<cron expression>`](#PAPERLESS_SHARE_LINK_BUNDLE_CLEANUP_CRON) {#PAPERLESS_SHARE_LINK_BUNDLE_CLEANUP_CRON}
: Controls how often Paperless-ngx removes expired share link bundles (and their generated ZIP archives).
: If set to the string "disable", expired bundles are not cleaned up automatically.
Defaults to `0 2 * * *`, once per day at 02:00.
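For readers unfamiliar with the five cron fields (minute, hour, day of month, month, day of week), here is a minimal matcher — just enough to evaluate simple schedules like the `0 2 * * *` default; real cron also supports ranges, lists, and steps, which this sketch ignores:

```python
from datetime import datetime


def matches_simple_cron(expr: str, dt: datetime) -> bool:
    """Match a datetime against a cron expression whose fields are '*' or a number."""
    minute, hour, dom, month, dow = expr.split()
    checks = [
        (minute, dt.minute),
        (hour, dt.hour),
        (dom, dt.day),
        (month, dt.month),
        (dow, dt.isoweekday() % 7),  # cron convention: 0 = Sunday
    ]
    return all(field == "*" or int(field) == value for field, value in checks)


print(matches_simple_cron("0 2 * * *", datetime(2026, 1, 28, 2, 0)))   # True: 02:00
print(matches_simple_cron("0 2 * * *", datetime(2026, 1, 28, 14, 0)))  # False: 14:00
```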
## Binaries
There are a few external software packages that Paperless expects to


@@ -308,12 +308,14 @@ or using [email](#workflow-action-email) or [webhook](#workflow-action-webhook)
### Share Links
"Share links" are shareable public links to files and can be created and managed under the 'Send' button on the document detail screen.
"Share links" are public links to files (or an archive of files) and can be created and managed under the 'Send' button on the document detail screen or from the bulk editor.
- Share links do not require a user to login and thus link directly to a file.
- Share links do not require a user to login and thus link directly to a file or bundled download.
- Links are unique and are of the form `{paperless-url}/share/{randomly-generated-slug}`.
- Links can optionally have an expiration time set.
- After a link expires or is deleted users will be redirected to the regular paperless-ngx login.
- From the document detail screen you can create a share link for that single document.
- From the bulk editor you can create a **share link bundle** for any selection. Paperless-ngx prepares a ZIP archive in the background and exposes a single share link. You can revisit the "Manage share link bundles" dialog to monitor progress, retry failed bundles, or delete links.
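The slug format described above can be illustrated with Python's `secrets` module. This is a hypothetical sketch for illustration only — the actual slug generation inside Paperless-ngx may differ, and the URL is a placeholder:

```python
import secrets


def make_share_slug(nbytes: int = 24) -> str:
    # Hypothetical helper: produces a URL-safe random slug of the
    # documented {paperless-url}/share/{randomly-generated-slug} form.
    return secrets.token_urlsafe(nbytes)


print(f"https://paperless.example.com/share/{make_share_slug()}")
```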
!!! tip


@@ -19,14 +19,14 @@ dependencies = [
"azure-ai-documentintelligence>=1.0.2",
"babel>=2.17",
"bleach~=6.3.0",
"celery[redis]~=5.5.1",
"celery[redis]~=5.6.2",
"channels~=4.2",
"channels-redis~=4.2",
"concurrent-log-handler~=0.9.25",
"dateparser~=1.2",
# WARNING: django does not use semver.
# Only patch versions are guaranteed to not introduce breaking changes.
"django~=5.2.5",
"django~=5.2.10",
"django-allauth[mfa,socialaccount]~=65.13.1",
"django-auditlog~=3.4.1",
"django-cachalot~=2.8.0",
@@ -36,7 +36,6 @@ dependencies = [
"django-extensions~=4.1",
"django-filter~=25.1",
"django-guardian~=3.2.0",
"django-mcp-server~=0.5.7",
"django-multiselectfield~=1.0.1",
"django-soft-delete~=1.0.18",
"django-treenode>=0.23.2",
@@ -50,6 +49,8 @@ dependencies = [
"flower~=2.0.1",
"gotenberg-client~=0.13.1",
"httpx-oauth~=0.16",
"ijson",
"ijson~=3.3",
"imap-tools~=1.11.0",
"jinja2~=3.1.5",
"langdetect~=1.0.9",
@@ -73,6 +74,7 @@ dependencies = [
"rapidfuzz~=3.14.0",
"redis[hiredis]~=5.2.1",
"regex>=2025.9.18",
"rich~=14.1.0",
"scikit-learn~=1.7.0",
"sentence-transformers>=4.1",
"setproctitle~=1.3.4",
@@ -80,7 +82,7 @@ dependencies = [
"torch~=2.9.1",
"tqdm~=4.67.1",
"watchfiles>=1.1.1",
"whitenoise~=6.9",
"whitenoise~=6.11",
"whoosh-reloaded>=2.7.5",
"zxing-cpp~=2.3.0",
]
@@ -89,13 +91,13 @@ optional-dependencies.mariadb = [
"mysqlclient~=2.2.7",
]
optional-dependencies.postgres = [
"psycopg[c,pool]==3.2.12",
"psycopg[c,pool]==3.3",
# Direct dependency for proper resolution of the pre-built wheels
"psycopg-c==3.2.12",
"psycopg-c==3.3",
"psycopg-pool==3.3",
]
optional-dependencies.webserver = [
"granian[uvloop]~=2.5.1",
"granian[uvloop]~=2.6.0",
]
[dependency-groups]
@@ -115,15 +117,16 @@ testing = [
"daphne",
"factory-boy~=3.3.1",
"imagehash",
"pytest~=8.4.1",
"pytest~=9.0.0",
"pytest-cov~=7.0.0",
"pytest-django~=4.11.1",
"pytest-env",
"pytest-env~=1.2.0",
"pytest-httpx",
"pytest-mock",
"pytest-rerunfailures",
"pytest-mock~=3.15.1",
#"pytest-randomly~=4.0.1",
"pytest-rerunfailures~=16.1",
"pytest-sugar",
"pytest-xdist",
"pytest-xdist~=3.8.0",
]
lint = [
@@ -152,7 +155,7 @@ typing = [
]
[tool.uv]
required-version = ">=0.5.14"
required-version = ">=0.9.0"
package = false
environments = [
"sys_platform == 'darwin'",
@@ -162,8 +165,8 @@ environments = [
[tool.uv.sources]
# Markers are chosen to select these almost exclusively when building the Docker image
psycopg-c = [
{ url = "https://github.com/paperless-ngx/builder/releases/download/psycopg-bookworm-3.2.12/psycopg_c-3.2.12-cp312-cp312-linux_x86_64.whl", marker = "sys_platform == 'linux' and platform_machine == 'x86_64' and python_version == '3.12'" },
{ url = "https://github.com/paperless-ngx/builder/releases/download/psycopg-bookworm-3.2.12/psycopg_c-3.2.12-cp312-cp312-linux_aarch64.whl", marker = "sys_platform == 'linux' and platform_machine == 'aarch64' and python_version == '3.12'" },
{ url = "https://github.com/paperless-ngx/builder/releases/download/psycopg-trixie-3.3.0/psycopg_c-3.3.0-cp312-cp312-linux_x86_64.whl", marker = "sys_platform == 'linux' and platform_machine == 'x86_64' and python_version == '3.12'" },
{ url = "https://github.com/paperless-ngx/builder/releases/download/psycopg-trixie-3.3.0/psycopg_c-3.3.0-cp312-cp312-linux_aarch64.whl", marker = "sys_platform == 'linux' and platform_machine == 'aarch64' and python_version == '3.12'" },
]
zxing-cpp = [
{ url = "https://github.com/paperless-ngx/builder/releases/download/zxing-2.3.0/zxing_cpp-2.3.0-cp312-cp312-linux_x86_64.whl", marker = "sys_platform == 'linux' and platform_machine == 'x86_64' and python_version == '3.12'" },
@@ -261,11 +264,15 @@ write-changes = true
ignore-words-list = "criterias,afterall,valeu,ureue,equest,ure,assertIn,Oktober,commitish"
skip = "src-ui/src/locale/*,src-ui/pnpm-lock.yaml,src-ui/e2e/*,src/paperless_mail/tests/samples/*,src/documents/tests/samples/*,*.po,*.json"
[tool.pytest.ini_options]
minversion = "8.0"
pythonpath = [
"src",
]
[tool.pytest]
minversion = "9.0"
pythonpath = [ "src" ]
strict_config = true
strict_markers = true
strict_parametrization_ids = true
strict_xfail = true
testpaths = [
"src/documents/tests/",
"src/paperless/tests/",
@@ -276,6 +283,7 @@ testpaths = [
"src/paperless_remote/tests/",
"src/paperless_ai/tests",
]
addopts = [
"--pythonwarnings=all",
"--cov",
@@ -283,15 +291,26 @@ addopts = [
"--cov-report=xml",
"--numprocesses=auto",
"--maxprocesses=16",
"--quiet",
"--dist=loadscope",
"--durations=50",
"--durations-min=0.5",
"--junitxml=junit.xml",
"-o junit_family=legacy",
"-o",
"junit_family=legacy",
]
norecursedirs = [ "src/locale/", ".venv/", "src-ui/" ]
DJANGO_SETTINGS_MODULE = "paperless.settings"
markers = [
"live: Integration tests requiring external services (Gotenberg, Tika, nginx, etc)",
"nginx: Tests that make HTTP requests to the local nginx service",
"gotenberg: Tests requiring Gotenberg service",
"tika: Tests requiring Tika service",
"greenmail: Tests requiring Greenmail service",
]
[tool.pytest_env]
PAPERLESS_DISABLE_DBHANDLER = "true"
PAPERLESS_CACHE_BACKEND = "django.core.cache.backends.locmem.LocMemCache"
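Most pins in the dependency list above use PEP 440's compatible-release operator: `~=3.3` means `>=3.3, <4.0`, while `~=6.3.0` means `>=6.3.0, <6.4.0`. A stdlib-only sketch of that rule for plain dotted numeric versions (real resolvers such as pip and uv handle far more, including pre-releases and local versions):

```python
def compatible_release(version: str, spec: str) -> bool:
    """Return True if `version` satisfies `~=spec` (PEP 440 compatible release).

    `~=X.Y` accepts >=X.Y, <(X+1).0 and `~=X.Y.Z` accepts >=X.Y.Z, <X.(Y+1).0.
    The spec must have at least two components, per PEP 440.
    """
    v = [int(p) for p in version.split(".")]
    s = [int(p) for p in spec.split(".")]
    # Upper bound: drop the last spec component, then bump the new last one.
    upper = s[:-1]
    upper[-1] += 1
    width = max(len(v), len(s), len(upper))

    def pad(parts: list[int]) -> tuple[int, ...]:
        return tuple(parts + [0] * (width - len(parts)))

    return pad(s) <= pad(v) < pad(upper)


print(compatible_release("3.4.1", "3.3"))    # True:  ijson~=3.3 accepts 3.4.1
print(compatible_release("4.0.0", "3.3"))    # False: next major is excluded
print(compatible_release("6.3.2", "6.3.0"))  # True:  bleach~=6.3.0 accepts 6.3.2
```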

File diff suppressed because it is too large.


@@ -103,22 +103,6 @@
</div>
<div class="row mb-3">
<div class="col-md-3 col-form-label pt-0">
<span i18n>Items per page</span>
</div>
<div class="col">
<select class="form-select" formControlName="documentListItemPerPage">
<option [ngValue]="10">10</option>
<option [ngValue]="25">25</option>
<option [ngValue]="50">50</option>
<option [ngValue]="100">100</option>
</select>
</div>
</div>
<div class="row">
<div class="col-md-3 col-form-label pt-0">
<span i18n>Sidebar</span>
</div>
@@ -153,8 +137,28 @@
</button>
</div>
</div>
</div>
<div class="col-xl-6 ps-xl-5">
<h5 class="mt-3 mt-md-0" i18n>Global search</h5>
<div class="row">
<div class="col">
<pngx-input-check i18n-title title="Do not include advanced search results" formControlName="searchDbOnly"></pngx-input-check>
</div>
</div>
<h5 class="mt-3" id="update-checking" i18n>Update checking</h5>
<div class="row mb-3">
<div class="col-md-3 col-form-label pt-0">
<span i18n>Full search links to</span>
</div>
<div class="col mb-3">
<select class="form-select" formControlName="searchLink">
<option [ngValue]="GlobalSearchType.TITLE_CONTENT" i18n>Title and content search</option>
<option [ngValue]="GlobalSearchType.ADVANCED" i18n>Advanced search</option>
</select>
</div>
</div>
<h5 class="mt-3 mt-md-0" id="update-checking" i18n>Update checking</h5>
<div class="row mb-3">
<div class="col d-flex flex-row align-items-start">
<pngx-input-check i18n-title title="Enable update checking" formControlName="updateCheckingEnabled"></pngx-input-check>
@@ -179,11 +183,33 @@
<pngx-input-check i18n-title title="Show document counts in sidebar saved views" formControlName="sidebarViewsShowCount"></pngx-input-check>
</div>
</div>
</div>
<div class="col-xl-6 ps-xl-5">
<h5 class="mt-3 mt-md-0" i18n>Document editing</h5>
</div>
</ng-template>
</li>
<li [ngbNavItem]="SettingsNavIDs.Documents">
<a ngbNavLink i18n>Documents</a>
<ng-template ngbNavContent>
<div class="row">
<div class="col-xl-6 pe-xl-5">
<h5 i18n>Documents</h5>
<div class="row mb-3">
<div class="col-md-3 col-form-label pt-0">
<span i18n>Items per page</span>
</div>
<div class="col">
<select class="form-select" formControlName="documentListItemPerPage">
<option [ngValue]="10">10</option>
<option [ngValue]="25">25</option>
<option [ngValue]="50">50</option>
<option [ngValue]="100">100</option>
</select>
</div>
</div>
<h5 class="mt-3" i18n>Document editing</h5>
<div class="row">
<div class="col">
<pngx-input-check i18n-title title="Use PDF viewer provided by the browser" i18n-hint hint="This is usually faster for displaying large PDF documents, but it might not work on some browsers." formControlName="useNativePdfViewer"></pngx-input-check>
@@ -209,31 +235,32 @@
</div>
</div>
<div class="row mb-3">
<div class="row">
<div class="col">
<pngx-input-check i18n-title title="Show document thumbnail during loading" formControlName="documentEditingOverlayThumbnail"></pngx-input-check>
</div>
</div>
<h5 class="mt-3" i18n>Global search</h5>
<div class="row">
<div class="col">
<pngx-input-check i18n-title title="Do not include advanced search results" formControlName="searchDbOnly"></pngx-input-check>
</div>
</div>
<div class="row mb-3">
<div class="col-md-3 col-form-label pt-0">
<span i18n>Full search links to</span>
</div>
<div class="col mb-3">
<select class="form-select" formControlName="searchLink">
<option [ngValue]="GlobalSearchType.TITLE_CONTENT" i18n>Title and content search</option>
<option [ngValue]="GlobalSearchType.ADVANCED" i18n>Advanced search</option>
</select>
<div class="col">
<p class="mb-2" i18n>Built-in fields to show:</p>
@for (option of documentDetailFieldOptions; track option.id) {
<div class="form-check ms-3">
<input class="form-check-input" type="checkbox"
[id]="'documentDetailField-' + option.id"
[checked]="isDocumentDetailFieldShown(option.id)"
(change)="toggleDocumentDetailField(option.id, $event.target.checked)" />
<label class="form-check-label" [for]="'documentDetailField-' + option.id">
{{ option.label }}
</label>
</div>
}
<p class="small text-muted mt-1" i18n>Uncheck fields to hide them on the document details page.</p>
</div>
</div>
</div>
<div class="col-xl-6 ps-xl-5">
<h5 class="mt-3" i18n>Bulk editing</h5>
<div class="row mb-3">
<div class="col">
@@ -242,16 +269,27 @@
</div>
</div>
<h5 class="mt-3" i18n>PDF Editor</h5>
<div class="row">
<div class="col-md-3 col-form-label pt-0">
<span i18n>Default editing mode</span>
</div>
<div class="col">
<select class="form-select" formControlName="pdfEditorDefaultEditMode">
<option [ngValue]="PdfEditorEditMode.Create" i18n>Create new document(s)</option>
<option [ngValue]="PdfEditorEditMode.Update" i18n>Update existing document</option>
</select>
</div>
</div>
<h5 class="mt-3" i18n>Notes</h5>
<div class="row mb-3">
<div class="col">
<pngx-input-check i18n-title title="Enable notes" formControlName="notesEnabled"></pngx-input-check>
</div>
</div>
</div>
</div>
</ng-template>
</li>


@@ -201,9 +201,9 @@ describe('SettingsComponent', () => {
const navigateSpy = jest.spyOn(router, 'navigate')
const tabButtons = fixture.debugElement.queryAll(By.directive(NgbNavLink))
tabButtons[1].nativeElement.dispatchEvent(new MouseEvent('click'))
expect(navigateSpy).toHaveBeenCalledWith(['settings', 'permissions'])
expect(navigateSpy).toHaveBeenCalledWith(['settings', 'documents'])
tabButtons[2].nativeElement.dispatchEvent(new MouseEvent('click'))
expect(navigateSpy).toHaveBeenCalledWith(['settings', 'notifications'])
expect(navigateSpy).toHaveBeenCalledWith(['settings', 'permissions'])
const initSpy = jest.spyOn(component, 'initialize')
component.isDirty = true // mock dirty
@@ -213,8 +213,8 @@ describe('SettingsComponent', () => {
expect(initSpy).not.toHaveBeenCalled()
navigateSpy.mockResolvedValueOnce(true) // nav accepted even though dirty
tabButtons[1].nativeElement.dispatchEvent(new MouseEvent('click'))
expect(navigateSpy).toHaveBeenCalledWith(['settings', 'notifications'])
tabButtons[2].nativeElement.dispatchEvent(new MouseEvent('click'))
expect(navigateSpy).toHaveBeenCalledWith(['settings', 'permissions'])
expect(initSpy).toHaveBeenCalled()
})
@@ -226,7 +226,7 @@ describe('SettingsComponent', () => {
activatedRoute.snapshot.fragment = '#notifications'
const scrollSpy = jest.spyOn(viewportScroller, 'scrollToAnchor')
component.ngOnInit()
expect(component.activeNavID).toEqual(3) // Notifications
expect(component.activeNavID).toEqual(4) // Notifications
component.ngAfterViewInit()
expect(scrollSpy).toHaveBeenCalledWith('#notifications')
})
@@ -251,7 +251,7 @@ describe('SettingsComponent', () => {
expect(toastErrorSpy).toHaveBeenCalled()
expect(storeSpy).toHaveBeenCalled()
expect(appearanceSettingsSpy).not.toHaveBeenCalled()
expect(setSpy).toHaveBeenCalledTimes(30)
expect(setSpy).toHaveBeenCalledTimes(32)
// succeed
storeSpy.mockReturnValueOnce(of(true))
@@ -366,4 +366,22 @@ describe('SettingsComponent', () => {
settingsService.settingsSaved.emit(true)
expect(maybeRefreshSpy).toHaveBeenCalled()
})
it('should support toggling document detail fields', () => {
completeSetup()
const field = 'storage_path'
expect(
component.settingsForm.get('documentDetailsHiddenFields').value.length
).toEqual(0)
component.toggleDocumentDetailField(field, false)
expect(
component.settingsForm.get('documentDetailsHiddenFields').value.length
).toEqual(1)
expect(component.isDocumentDetailFieldShown(field)).toBeFalsy()
component.toggleDocumentDetailField(field, true)
expect(
component.settingsForm.get('documentDetailsHiddenFields').value.length
).toEqual(0)
expect(component.isDocumentDetailFieldShown(field)).toBeTruthy()
})
})


@@ -64,15 +64,16 @@ import { PermissionsGroupComponent } from '../../common/input/permissions/permis
import { PermissionsUserComponent } from '../../common/input/permissions/permissions-user/permissions-user.component'
import { SelectComponent } from '../../common/input/select/select.component'
import { PageHeaderComponent } from '../../common/page-header/page-header.component'
import { PdfEditorEditMode } from '../../common/pdf-editor/pdf-editor-edit-mode'
import { SystemStatusDialogComponent } from '../../common/system-status-dialog/system-status-dialog.component'
import { ZoomSetting } from '../../document-detail/document-detail.component'
import { ZoomSetting } from '../../document-detail/zoom-setting'
import { ComponentWithPermissions } from '../../with-permissions/with-permissions.component'
enum SettingsNavIDs {
General = 1,
Permissions = 2,
Notifications = 3,
SavedViews = 4,
Documents = 2,
Permissions = 3,
Notifications = 4,
}
const systemLanguage = { code: '', name: $localize`Use system language` }
@@ -81,6 +82,25 @@ const systemDateFormat = {
name: $localize`Use date format of display language`,
}
export enum DocumentDetailFieldID {
ArchiveSerialNumber = 'archive_serial_number',
Correspondent = 'correspondent',
DocumentType = 'document_type',
StoragePath = 'storage_path',
Tags = 'tags',
}
const documentDetailFieldOptions = [
{
id: DocumentDetailFieldID.ArchiveSerialNumber,
label: $localize`Archive serial number`,
},
{ id: DocumentDetailFieldID.Correspondent, label: $localize`Correspondent` },
{ id: DocumentDetailFieldID.DocumentType, label: $localize`Document type` },
{ id: DocumentDetailFieldID.StoragePath, label: $localize`Storage path` },
{ id: DocumentDetailFieldID.Tags, label: $localize`Tags` },
]
@Component({
selector: 'pngx-settings',
templateUrl: './settings.component.html',
@@ -144,8 +164,10 @@ export class SettingsComponent
defaultPermsEditGroups: new FormControl(null),
useNativePdfViewer: new FormControl(null),
pdfViewerDefaultZoom: new FormControl(null),
pdfEditorDefaultEditMode: new FormControl(null),
documentEditingRemoveInboxTags: new FormControl(null),
documentEditingOverlayThumbnail: new FormControl(null),
documentDetailsHiddenFields: new FormControl([]),
searchDbOnly: new FormControl(null),
searchLink: new FormControl(null),
@@ -176,6 +198,10 @@ export class SettingsComponent
public readonly ZoomSetting = ZoomSetting
public readonly PdfEditorEditMode = PdfEditorEditMode
public readonly documentDetailFieldOptions = documentDetailFieldOptions
get systemStatusHasErrors(): boolean {
return (
this.systemStatus.database.status === SystemStatusItemStatus.ERROR ||
@@ -292,6 +318,9 @@ export class SettingsComponent
pdfViewerDefaultZoom: this.settings.get(
SETTINGS_KEYS.PDF_VIEWER_ZOOM_SETTING
),
pdfEditorDefaultEditMode: this.settings.get(
SETTINGS_KEYS.PDF_EDITOR_DEFAULT_EDIT_MODE
),
displayLanguage: this.settings.getLanguage(),
dateLocale: this.settings.get(SETTINGS_KEYS.DATE_LOCALE),
dateFormat: this.settings.get(SETTINGS_KEYS.DATE_FORMAT),
@@ -336,6 +365,9 @@ export class SettingsComponent
documentEditingOverlayThumbnail: this.settings.get(
SETTINGS_KEYS.DOCUMENT_EDITING_OVERLAY_THUMBNAIL
),
documentDetailsHiddenFields: this.settings.get(
SETTINGS_KEYS.DOCUMENT_DETAILS_HIDDEN_FIELDS
),
searchDbOnly: this.settings.get(SETTINGS_KEYS.SEARCH_DB_ONLY),
searchLink: this.settings.get(SETTINGS_KEYS.SEARCH_FULL_TYPE),
}
@@ -458,6 +490,10 @@ export class SettingsComponent
SETTINGS_KEYS.PDF_VIEWER_ZOOM_SETTING,
this.settingsForm.value.pdfViewerDefaultZoom
)
this.settings.set(
SETTINGS_KEYS.PDF_EDITOR_DEFAULT_EDIT_MODE,
this.settingsForm.value.pdfEditorDefaultEditMode
)
this.settings.set(
SETTINGS_KEYS.DATE_LOCALE,
this.settingsForm.value.dateLocale
@@ -526,6 +562,10 @@ export class SettingsComponent
SETTINGS_KEYS.DOCUMENT_EDITING_OVERLAY_THUMBNAIL,
this.settingsForm.value.documentEditingOverlayThumbnail
)
this.settings.set(
SETTINGS_KEYS.DOCUMENT_DETAILS_HIDDEN_FIELDS,
this.settingsForm.value.documentDetailsHiddenFields
)
this.settings.set(
SETTINGS_KEYS.SEARCH_DB_ONLY,
this.settingsForm.value.searchDbOnly
@@ -587,6 +627,26 @@ export class SettingsComponent
this.settingsForm.get('themeColor').patchValue('')
}
isDocumentDetailFieldShown(fieldId: string): boolean {
const hiddenFields =
this.settingsForm.value.documentDetailsHiddenFields || []
return !hiddenFields.includes(fieldId)
}
toggleDocumentDetailField(fieldId: string, checked: boolean) {
const hiddenFields = new Set(
this.settingsForm.value.documentDetailsHiddenFields || []
)
if (checked) {
hiddenFields.delete(fieldId)
} else {
hiddenFields.add(fieldId)
}
this.settingsForm
.get('documentDetailsHiddenFields')
.setValue(Array.from(hiddenFields))
}
showSystemStatus() {
const modal: NgbModalRef = this.modalService.open(
SystemStatusDialogComponent,

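The `toggleDocumentDetailField` method in the diff above treats the stored hidden-field list as a set: checking a field removes it from the hidden list, unchecking adds it. The same logic as a standalone sketch (`toggleHidden` is an illustrative name, not part of the codebase):

```typescript
// checked = true  -> field becomes visible (remove from hidden list)
// checked = false -> field becomes hidden  (add to hidden list)
// A Set keeps the stored list de-duplicated.
function toggleHidden(
  hidden: string[],
  fieldId: string,
  checked: boolean
): string[] {
  const set = new Set(hidden)
  if (checked) {
    set.delete(fieldId)
  } else {
    set.add(fieldId)
  }
  return Array.from(set)
}
```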

@@ -248,7 +248,7 @@ main {
}
}
@media screen and (min-width: 366px) and (max-width: 768px) {
@media screen and (min-width: 376px) and (max-width: 768px) {
.navbar-toggler {
// compensate for 2 buttons on the right
margin-right: 45px;


@@ -0,0 +1,4 @@
export enum PdfEditorEditMode {
Update = 'update',
Create = 'create',
}


@@ -8,8 +8,11 @@ import { FormsModule } from '@angular/forms'
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap'
import { PDFDocumentProxy, PdfViewerModule } from 'ng2-pdf-viewer'
import { NgxBootstrapIconsModule } from 'ngx-bootstrap-icons'
import { SETTINGS_KEYS } from 'src/app/data/ui-settings'
import { DocumentService } from 'src/app/services/rest/document.service'
import { SettingsService } from 'src/app/services/settings.service'
import { ConfirmDialogComponent } from '../confirm-dialog/confirm-dialog.component'
import { PdfEditorEditMode } from './pdf-editor-edit-mode'
interface PageOperation {
page: number
@@ -19,11 +22,6 @@ interface PageOperation {
loaded?: boolean
}
export enum PdfEditorEditMode {
Update = 'update',
Create = 'create',
}
@Component({
selector: 'pngx-pdf-editor',
templateUrl: './pdf-editor.component.html',
@@ -39,12 +37,15 @@ export class PDFEditorComponent extends ConfirmDialogComponent {
public PdfEditorEditMode = PdfEditorEditMode
private documentService = inject(DocumentService)
private readonly settingsService = inject(SettingsService)
activeModal: NgbActiveModal = inject(NgbActiveModal)
documentID: number
pages: PageOperation[] = []
totalPages = 0
editMode: PdfEditorEditMode = PdfEditorEditMode.Create
editMode: PdfEditorEditMode = this.settingsService.get(
SETTINGS_KEYS.PDF_EDITOR_DEFAULT_EDIT_MODE
)
deleteOriginal: boolean = false
includeMetadata: boolean = true


@@ -0,0 +1,129 @@
<div class="modal-header">
<h4 class="modal-title">{{ title }}</h4>
<button type="button" class="btn-close" aria-label="Close" (click)="cancel()"></button>
</div>
<div class="modal-body">
@if (!createdBundle) {
<form [formGroup]="form" class="d-flex flex-column gap-3">
<div>
<p class="mb-1">
<ng-container i18n>Selected documents:</ng-container>
{{ selectionCount }}
</p>
@if (documentPreview.length > 0) {
<ul class="list-unstyled small mb-0">
@for (doc of documentPreview; track doc.id) {
<li>
<strong>{{ doc.title | documentTitle }}</strong>
</li>
}
@if (selectionCount > documentPreview.length) {
<li>
<ng-container i18n>+ {{ selectionCount - documentPreview.length }} more…</ng-container>
</li>
}
</ul>
}
</div>
<div class="d-flex align-items-center justify-content-between">
<div class="input-group">
<label class="input-group-text" for="expirationDays"><ng-container i18n>Expires</ng-container>:</label>
<select class="form-select" id="expirationDays" formControlName="expirationDays">
@for (option of expirationOptions; track option.value) {
<option [ngValue]="option.value">{{ option.label }}</option>
}
</select>
</div>
<div class="form-check form-switch w-100 ms-3">
<input
class="form-check-input"
type="checkbox"
role="switch"
id="shareArchiveSwitch"
formControlName="shareArchiveVersion"
aria-checked="{{ shareArchiveVersion }}"
/>
<label class="form-check-label" for="shareArchiveSwitch" i18n>Share archive version (if available)</label>
</div>
</div>
</form>
} @else {
<div class="d-flex flex-column gap-3">
<div class="alert alert-success mb-0" role="status">
<h6 class="alert-heading mb-1" i18n>Share link bundle requested</h6>
<p class="mb-0 small" i18n>
You can copy the share link below or open the manager to monitor progress. The link will start working once the bundle is ready.
</p>
</div>
<dl class="row mb-0 small">
<dt class="col-sm-4" i18n>Status</dt>
<dd class="col-sm-8">
<span class="badge text-bg-secondary text-uppercase">{{ statusLabel(createdBundle.status) }}</span>
</dd>
<dt class="col-sm-4" i18n>Slug</dt>
<dd class="col-sm-8"><code>{{ createdBundle.slug }}</code></dd>
<dt class="col-sm-4" i18n>Link</dt>
<dd class="col-sm-8">
<div class="input-group input-group-sm">
<input class="form-control" type="text" [value]="getShareUrl(createdBundle)" readonly>
<button
class="btn btn-outline-primary"
type="button"
(click)="copy(createdBundle)"
>
@if (copied) {
<i-bs name="clipboard-check"></i-bs>
}
@if (!copied) {
<i-bs name="clipboard"></i-bs>
}
<span class="visually-hidden" i18n>Copy link</span>
</button>
</div>
</dd>
<dt class="col-sm-4" i18n>Documents</dt>
<dd class="col-sm-8">{{ createdBundle.document_count }}</dd>
<dt class="col-sm-4" i18n>Expires</dt>
<dd class="col-sm-8">
@if (createdBundle.expiration) {
{{ createdBundle.expiration | date: 'short' }}
}
@if (!createdBundle.expiration) {
<span i18n>Never</span>
}
</dd>
<dt class="col-sm-4" i18n>File version</dt>
<dd class="col-sm-8">{{ fileVersionLabel(createdBundle.file_version) }}</dd>
@if (createdBundle.size_bytes !== undefined && createdBundle.size_bytes !== null) {
<dt class="col-sm-4" i18n>Size</dt>
<dd class="col-sm-8">{{ createdBundle.size_bytes | fileSize }}</dd>
}
</dl>
</div>
}
</div>
<div class="modal-footer">
<div class="d-flex align-items-center gap-2 w-100">
<div class="text-light fst-italic small">
<ng-container i18n>A zip file containing the selected documents will be created for this share link bundle. This process happens in the background and may take some time, especially for large bundles.</ng-container>
</div>
<button type="button" class="btn btn-outline-secondary btn-sm ms-auto" (click)="cancel()">{{ cancelBtnCaption }}</button>
@if (createdBundle) {
<button type="button" class="btn btn-outline-secondary btn-sm text-nowrap" (click)="openManage()" i18n>Manage share link bundles</button>
}
@if (!createdBundle) {
<button
type="button"
class="btn btn-primary btn-sm d-inline-flex align-items-center gap-2 text-nowrap"
(click)="submit()"
[disabled]="loading || !buttonsEnabled">
@if (loading) {
<span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"></span>
}
{{ btnCaption }}
</button>
}
</div>
</div>


@@ -0,0 +1,161 @@
import { Clipboard } from '@angular/cdk/clipboard'
import {
ComponentFixture,
TestBed,
fakeAsync,
tick,
} from '@angular/core/testing'
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap'
import { NgxBootstrapIconsModule, allIcons } from 'ngx-bootstrap-icons'
import { FileVersion } from 'src/app/data/share-link'
import {
ShareLinkBundleStatus,
ShareLinkBundleSummary,
} from 'src/app/data/share-link-bundle'
import { ToastService } from 'src/app/services/toast.service'
import { environment } from 'src/environments/environment'
import { ShareLinkBundleDialogComponent } from './share-link-bundle-dialog.component'
class MockToastService {
showInfo = jest.fn()
showError = jest.fn()
}
describe('ShareLinkBundleDialogComponent', () => {
let component: ShareLinkBundleDialogComponent
let fixture: ComponentFixture<ShareLinkBundleDialogComponent>
let clipboard: Clipboard
let toastService: MockToastService
let activeModal: NgbActiveModal
let originalApiBaseUrl: string
beforeEach(() => {
originalApiBaseUrl = environment.apiBaseUrl
toastService = new MockToastService()
TestBed.configureTestingModule({
imports: [
ShareLinkBundleDialogComponent,
NgxBootstrapIconsModule.pick(allIcons),
],
providers: [
NgbActiveModal,
{ provide: ToastService, useValue: toastService },
],
})
fixture = TestBed.createComponent(ShareLinkBundleDialogComponent)
component = fixture.componentInstance
clipboard = TestBed.inject(Clipboard)
activeModal = TestBed.inject(NgbActiveModal)
fixture.detectChanges()
})
afterEach(() => {
jest.clearAllTimers()
environment.apiBaseUrl = originalApiBaseUrl
})
it('builds payload and emits confirm on submit', () => {
const confirmSpy = jest.spyOn(component.confirmClicked, 'emit')
component.documents = [
{ id: 1, title: 'Doc 1' } as any,
{ id: 2, title: 'Doc 2' } as any,
]
component.form.setValue({
shareArchiveVersion: false,
expirationDays: 3,
})
component.submit()
expect(component.payload).toEqual({
document_ids: [1, 2],
file_version: FileVersion.Original,
expiration_days: 3,
})
expect(component.buttonsEnabled).toBe(false)
expect(confirmSpy).toHaveBeenCalled()
component.form.setValue({
shareArchiveVersion: true,
expirationDays: 7,
})
component.submit()
expect(component.payload).toEqual({
document_ids: [1, 2],
file_version: FileVersion.Archive,
expiration_days: 7,
})
})
it('ignores submit when bundle already created', () => {
component.createdBundle = { id: 1 } as ShareLinkBundleSummary
const confirmSpy = jest.spyOn(component, 'confirm')
component.submit()
expect(confirmSpy).not.toHaveBeenCalled()
})
it('limits preview to ten documents', () => {
const docs = Array.from({ length: 12 }).map((_, index) => ({
id: index + 1,
}))
component.documents = docs as any
expect(component.selectionCount).toBe(12)
expect(component.documentPreview).toHaveLength(10)
expect(component.documentPreview[0].id).toBe(1)
})
it('copies share link and resets state after timeout', fakeAsync(() => {
const copySpy = jest.spyOn(clipboard, 'copy').mockReturnValue(true)
const bundle = {
slug: 'bundle-slug',
status: ShareLinkBundleStatus.Ready,
} as ShareLinkBundleSummary
component.copy(bundle)
expect(copySpy).toHaveBeenCalledWith(component.getShareUrl(bundle))
expect(component.copied).toBe(true)
expect(toastService.showInfo).toHaveBeenCalled()
tick(3000)
expect(component.copied).toBe(false)
}))
it('generates share URLs based on API base URL', () => {
environment.apiBaseUrl = 'https://example.com/api/'
expect(
component.getShareUrl({ slug: 'abc' } as ShareLinkBundleSummary)
).toBe('https://example.com/share/abc')
})
it('opens manage dialog when callback provided', () => {
const manageSpy = jest.fn()
component.onOpenManage = manageSpy
component.openManage()
expect(manageSpy).toHaveBeenCalled()
})
it('falls back to cancel when manage callback missing', () => {
const cancelSpy = jest.spyOn(component, 'cancel')
component.onOpenManage = undefined
component.openManage()
expect(cancelSpy).toHaveBeenCalled()
})
it('maps status and file version labels', () => {
expect(component.statusLabel(ShareLinkBundleStatus.Processing)).toContain(
'Processing'
)
expect(component.fileVersionLabel(FileVersion.Archive)).toContain('Archive')
})
it('closes dialog when cancel invoked', () => {
const closeSpy = jest.spyOn(activeModal, 'close')
component.cancel()
expect(closeSpy).toHaveBeenCalled()
})
})


@@ -0,0 +1,118 @@
import { Clipboard } from '@angular/cdk/clipboard'
import { CommonModule } from '@angular/common'
import { Component, Input, inject } from '@angular/core'
import { FormBuilder, FormGroup, ReactiveFormsModule } from '@angular/forms'
import { NgxBootstrapIconsModule } from 'ngx-bootstrap-icons'
import { Document } from 'src/app/data/document'
import {
FileVersion,
SHARE_LINK_EXPIRATION_OPTIONS,
} from 'src/app/data/share-link'
import {
SHARE_LINK_BUNDLE_FILE_VERSION_LABELS,
SHARE_LINK_BUNDLE_STATUS_LABELS,
ShareLinkBundleCreatePayload,
ShareLinkBundleStatus,
ShareLinkBundleSummary,
} from 'src/app/data/share-link-bundle'
import { DocumentTitlePipe } from 'src/app/pipes/document-title.pipe'
import { FileSizePipe } from 'src/app/pipes/file-size.pipe'
import { ToastService } from 'src/app/services/toast.service'
import { environment } from 'src/environments/environment'
import { ConfirmDialogComponent } from '../confirm-dialog/confirm-dialog.component'
@Component({
selector: 'pngx-share-link-bundle-dialog',
templateUrl: './share-link-bundle-dialog.component.html',
imports: [
CommonModule,
ReactiveFormsModule,
NgxBootstrapIconsModule,
FileSizePipe,
DocumentTitlePipe,
],
providers: [],
})
export class ShareLinkBundleDialogComponent extends ConfirmDialogComponent {
private readonly formBuilder = inject(FormBuilder)
private readonly clipboard = inject(Clipboard)
private readonly toastService = inject(ToastService)
private _documents: Document[] = []
selectionCount = 0
documentPreview: Document[] = []
form: FormGroup = this.formBuilder.group({
shareArchiveVersion: true,
expirationDays: [7],
})
payload: ShareLinkBundleCreatePayload | null = null
readonly expirationOptions = SHARE_LINK_EXPIRATION_OPTIONS
createdBundle: ShareLinkBundleSummary | null = null
copied = false
onOpenManage?: () => void
readonly statuses = ShareLinkBundleStatus
constructor() {
super()
this.loading = false
this.title = $localize`Create share link bundle`
this.btnCaption = $localize`Create link`
}
@Input()
set documents(docs: Document[]) {
this._documents = docs.concat()
this.selectionCount = this._documents.length
this.documentPreview = this._documents.slice(0, 10)
}
submit() {
if (this.createdBundle) return
this.payload = {
document_ids: this._documents.map((doc) => doc.id),
file_version: this.form.value.shareArchiveVersion
? FileVersion.Archive
: FileVersion.Original,
expiration_days: this.form.value.expirationDays,
}
this.buttonsEnabled = false
super.confirm()
}
getShareUrl(bundle: ShareLinkBundleSummary): string {
const apiURL = new URL(environment.apiBaseUrl)
return `${apiURL.origin}${apiURL.pathname.replace(/\/api\/$/, '/share/')}${
bundle.slug
}`
}
copy(bundle: ShareLinkBundleSummary): void {
const success = this.clipboard.copy(this.getShareUrl(bundle))
if (success) {
this.copied = true
this.toastService.showInfo($localize`Share link copied to clipboard.`)
setTimeout(() => {
this.copied = false
}, 3000)
}
}
openManage(): void {
if (this.onOpenManage) {
this.onOpenManage()
} else {
this.cancel()
}
}
statusLabel(status: ShareLinkBundleSummary['status']): string {
return SHARE_LINK_BUNDLE_STATUS_LABELS[status] ?? status
}
fileVersionLabel(version: FileVersion): string {
return SHARE_LINK_BUNDLE_FILE_VERSION_LABELS[version] ?? version
}
}
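The `getShareUrl` helper above derives the public share URL from the configured API base URL by swapping the trailing `/api/` path segment for `/share/` and appending the bundle slug. A minimal standalone sketch of that derivation (`shareUrlFromApiBase` is an illustrative name, not part of the codebase):

```typescript
// Rewrite an API base URL such as 'https://example.com/api/' into the
// public share URL for a given slug. Only a trailing '/api/' segment is
// replaced, so base URLs served under a sub-path also work.
function shareUrlFromApiBase(apiBaseUrl: string, slug: string): string {
  const apiURL = new URL(apiBaseUrl)
  return `${apiURL.origin}${apiURL.pathname.replace(/\/api\/$/, '/share/')}${slug}`
}

// e.g. shareUrlFromApiBase('https://example.com/api/', 'abc')
//      -> 'https://example.com/share/abc'
```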


@@ -0,0 +1,156 @@
<div class="modal-header">
<h4 class="modal-title">{{ title }}</h4>
<button type="button" class="btn-close" aria-label="Close" (click)="close()"></button>
</div>
<div class="modal-body">
@if (loading) {
<div class="d-flex align-items-center gap-2">
<div class="spinner-border spinner-border-sm" role="status"></div>
<span i18n>Loading share link bundles…</span>
</div>
}
@if (!loading && error) {
<div class="alert alert-danger mb-0" role="alert">
{{ error }}
</div>
}
@if (!loading && !error) {
<div class="d-flex justify-content-between align-items-center mb-2">
<p class="mb-0 text-muted small">
<ng-container i18n>Status updates every few seconds while bundles are being prepared.</ng-container>
</p>
</div>
@if (bundles.length === 0) {
<p class="mb-0 text-muted fst-italic" i18n>No share link bundles currently exist.</p>
}
@if (bundles.length > 0) {
<div class="table-responsive">
<table class="table table-sm align-middle mb-0">
<thead>
<tr>
<th scope="col" i18n>Created</th>
<th scope="col" i18n>Status</th>
<th scope="col" i18n>Size</th>
<th scope="col" i18n>Expires</th>
<th scope="col" i18n>Documents</th>
<th scope="col" i18n>File version</th>
<th scope="col" class="text-end" i18n>Actions</th>
</tr>
</thead>
<tbody>
@for (bundle of bundles; track bundle.id) {
<tr>
<td>
<div>{{ bundle.created | date: 'short' }}</div>
@if (bundle.built_at) {
<div class="small text-muted">
<ng-container i18n>Built:</ng-container> {{ bundle.built_at | date: 'short' }}
</div>
}
</td>
<td>
<div class="d-flex align-items-center gap-2">
@if (bundle.status === statuses.Failed && bundle.last_error) {
<button
type="button"
class="btn btn-link p-0 text-danger"
[ngbPopover]="errorDetail"
popoverClass="popover-sm"
triggers="mouseover:mouseleave"
placement="auto"
aria-label="View error details"
i18n-aria-label
>
<span class="badge text-bg-warning text-uppercase me-2">{{ statusLabel(bundle.status) }}</span>
<i-bs name="exclamation-triangle-fill" class="text-warning"></i-bs>
</button>
<ng-template #errorDetail>
@if (bundle.last_error.timestamp) {
<div class="text-muted small mb-1">
{{ bundle.last_error.timestamp | date: 'short' }}
</div>
}
<h6>{{ bundle.last_error.exception_type || ($localize`Unknown error`) }}</h6>
@if (bundle.last_error.message) {
<pre class="text-muted small"><code>{{ bundle.last_error.message }}</code></pre>
}
</ng-template>
}
@if (bundle.status === statuses.Processing || bundle.status === statuses.Pending) {
<span class="spinner-border spinner-border-sm" role="status"></span>
}
@if (bundle.status !== statuses.Failed) {
<span class="badge text-bg-secondary text-uppercase">{{ statusLabel(bundle.status) }}</span>
}
</div>
</td>
<td>
@if (bundle.size_bytes !== undefined && bundle.size_bytes !== null) {
{{ bundle.size_bytes | fileSize }}
}
@if (bundle.size_bytes === undefined || bundle.size_bytes === null) {
<span class="text-muted">&mdash;</span>
}
</td>
<td>
@if (bundle.expiration) {
{{ bundle.expiration | date: 'short' }}
}
@if (!bundle.expiration) {
<span i18n>Never</span>
}
</td>
<td>{{ bundle.document_count }}</td>
<td>{{ fileVersionLabel(bundle.file_version) }}</td>
<td class="text-end">
<div class="btn-group btn-group-sm">
<button
type="button"
class="btn btn-outline-primary"
[disabled]="bundle.status !== statuses.Ready"
(click)="copy(bundle)"
title="Copy share link"
i18n-title
>
@if (copiedSlug === bundle.slug) {
<i-bs name="clipboard-check"></i-bs>
}
@if (copiedSlug !== bundle.slug) {
<i-bs name="clipboard"></i-bs>
}
<span class="visually-hidden" i18n>Copy share link</span>
</button>
@if (bundle.status === statuses.Failed) {
<button
type="button"
class="btn btn-outline-warning"
[disabled]="loading"
(click)="retry(bundle)"
>
<i-bs name="arrow-clockwise"></i-bs>
<span class="visually-hidden" i18n>Retry</span>
</button>
}
<pngx-confirm-button
buttonClasses="btn btn-sm btn-outline-danger"
[disabled]="loading"
(confirm)="delete(bundle)"
iconName="trash"
>
<span class="visually-hidden" i18n>Delete share link bundle</span>
</pngx-confirm-button>
</div>
</td>
</tr>
}
</tbody>
</table>
</div>
}
}
</div>
<div class="modal-footer">
<button type="button" class="btn btn-outline-secondary btn-sm" (click)="close()" i18n>Close</button>
</div>


@@ -0,0 +1,4 @@
:host ::ng-deep .popover {
min-width: 300px;
max-width: 400px;
}


@@ -0,0 +1,251 @@
import { Clipboard } from '@angular/cdk/clipboard'
import {
ComponentFixture,
TestBed,
fakeAsync,
tick,
} from '@angular/core/testing'
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap'
import { NgxBootstrapIconsModule, allIcons } from 'ngx-bootstrap-icons'
import { of, throwError } from 'rxjs'
import { FileVersion } from 'src/app/data/share-link'
import {
ShareLinkBundleStatus,
ShareLinkBundleSummary,
} from 'src/app/data/share-link-bundle'
import { ShareLinkBundleService } from 'src/app/services/rest/share-link-bundle.service'
import { ToastService } from 'src/app/services/toast.service'
import { environment } from 'src/environments/environment'
import { ShareLinkBundleManageDialogComponent } from './share-link-bundle-manage-dialog.component'
class MockShareLinkBundleService {
listAllBundles = jest.fn()
delete = jest.fn()
rebuildBundle = jest.fn()
}
class MockToastService {
showInfo = jest.fn()
showError = jest.fn()
}
describe('ShareLinkBundleManageDialogComponent', () => {
let component: ShareLinkBundleManageDialogComponent
let fixture: ComponentFixture<ShareLinkBundleManageDialogComponent>
let service: MockShareLinkBundleService
let toastService: MockToastService
let clipboard: Clipboard
let activeModal: NgbActiveModal
let originalApiBaseUrl: string
beforeEach(() => {
service = new MockShareLinkBundleService()
toastService = new MockToastService()
originalApiBaseUrl = environment.apiBaseUrl
service.listAllBundles.mockReturnValue(of([]))
service.delete.mockReturnValue(of(true))
service.rebuildBundle.mockReturnValue(of(sampleBundle()))
TestBed.configureTestingModule({
imports: [
ShareLinkBundleManageDialogComponent,
NgxBootstrapIconsModule.pick(allIcons),
],
providers: [
NgbActiveModal,
{ provide: ShareLinkBundleService, useValue: service },
{ provide: ToastService, useValue: toastService },
],
})
fixture = TestBed.createComponent(ShareLinkBundleManageDialogComponent)
component = fixture.componentInstance
clipboard = TestBed.inject(Clipboard)
activeModal = TestBed.inject(NgbActiveModal)
})
afterEach(() => {
component.ngOnDestroy()
fixture.destroy()
environment.apiBaseUrl = originalApiBaseUrl
jest.clearAllMocks()
})
const sampleBundle = (overrides: Partial<ShareLinkBundleSummary> = {}) =>
({
id: 1,
slug: 'bundle-slug',
created: new Date().toISOString(),
document_count: 1,
documents: [1],
status: ShareLinkBundleStatus.Pending,
file_version: FileVersion.Archive,
last_error: undefined,
...overrides,
}) as ShareLinkBundleSummary
it('loads bundles on init and polls periodically', fakeAsync(() => {
const bundles = [sampleBundle({ status: ShareLinkBundleStatus.Ready })]
service.listAllBundles.mockReset()
service.listAllBundles
.mockReturnValueOnce(of(bundles))
.mockReturnValue(of(bundles))
fixture.detectChanges()
tick()
expect(service.listAllBundles).toHaveBeenCalledTimes(1)
expect(component.bundles).toEqual(bundles)
expect(component.loading).toBe(false)
expect(component.error).toBeNull()
tick(5000)
expect(service.listAllBundles).toHaveBeenCalledTimes(2)
}))
it('handles errors when loading bundles', fakeAsync(() => {
service.listAllBundles.mockReset()
service.listAllBundles
.mockReturnValueOnce(throwError(() => new Error('load fail')))
.mockReturnValue(of([]))
fixture.detectChanges()
tick()
expect(component.error).toContain('Failed to load share link bundles.')
expect(toastService.showError).toHaveBeenCalled()
expect(component.loading).toBe(false)
tick(5000)
expect(service.listAllBundles).toHaveBeenCalledTimes(2)
}))
it('copies bundle links when ready', fakeAsync(() => {
jest.spyOn(clipboard, 'copy').mockReturnValue(true)
fixture.detectChanges()
tick()
const readyBundle = sampleBundle({
slug: 'ready-slug',
status: ShareLinkBundleStatus.Ready,
})
component.copy(readyBundle)
expect(clipboard.copy).toHaveBeenCalledWith(
component.getShareUrl(readyBundle)
)
expect(component.copiedSlug).toBe('ready-slug')
expect(toastService.showInfo).toHaveBeenCalled()
tick(3000)
expect(component.copiedSlug).toBeNull()
}))
it('ignores copy requests for non-ready bundles', fakeAsync(() => {
const copySpy = jest.spyOn(clipboard, 'copy')
fixture.detectChanges()
tick()
component.copy(sampleBundle({ status: ShareLinkBundleStatus.Pending }))
expect(copySpy).not.toHaveBeenCalled()
}))
it('deletes bundles and refreshes list', fakeAsync(() => {
service.listAllBundles.mockReturnValue(of([]))
service.delete.mockReturnValue(of(true))
fixture.detectChanges()
tick()
component.delete(sampleBundle())
tick()
expect(service.delete).toHaveBeenCalled()
expect(toastService.showInfo).toHaveBeenCalledWith(
expect.stringContaining('deleted.')
)
expect(service.listAllBundles).toHaveBeenCalledTimes(2)
expect(component.loading).toBe(false)
}))
it('handles delete errors gracefully', fakeAsync(() => {
service.listAllBundles.mockReturnValue(of([]))
service.delete.mockReturnValue(throwError(() => new Error('delete fail')))
fixture.detectChanges()
tick()
component.delete(sampleBundle())
tick()
expect(toastService.showError).toHaveBeenCalled()
expect(component.loading).toBe(false)
}))
it('retries bundle build and replaces existing entry', fakeAsync(() => {
service.listAllBundles.mockReturnValue(of([]))
const updated = sampleBundle({ status: ShareLinkBundleStatus.Ready })
service.rebuildBundle.mockReturnValue(of(updated))
fixture.detectChanges()
tick()
component.bundles = [sampleBundle()]
component.retry(component.bundles[0])
tick()
expect(service.rebuildBundle).toHaveBeenCalledWith(updated.id)
expect(component.bundles[0].status).toBe(ShareLinkBundleStatus.Ready)
expect(toastService.showInfo).toHaveBeenCalled()
}))
it('adds new bundle when retry returns unknown entry', fakeAsync(() => {
service.listAllBundles.mockReturnValue(of([]))
service.rebuildBundle.mockReturnValue(
of(sampleBundle({ id: 99, slug: 'new-slug' }))
)
fixture.detectChanges()
tick()
component.bundles = [sampleBundle()]
component.retry({ id: 99 } as ShareLinkBundleSummary)
tick()
expect(component.bundles.find((bundle) => bundle.id === 99)).toBeTruthy()
}))
it('handles retry errors', fakeAsync(() => {
service.listAllBundles.mockReturnValue(of([]))
service.rebuildBundle.mockReturnValue(throwError(() => new Error('fail')))
fixture.detectChanges()
tick()
component.retry(sampleBundle())
tick()
expect(toastService.showError).toHaveBeenCalled()
}))
it('maps helpers and closes dialog', fakeAsync(() => {
service.listAllBundles.mockReturnValue(of([]))
fixture.detectChanges()
tick()
expect(component.statusLabel(ShareLinkBundleStatus.Processing)).toContain(
'Processing'
)
expect(component.fileVersionLabel(FileVersion.Original)).toContain(
'Original'
)
environment.apiBaseUrl = 'https://example.com/api/'
const url = component.getShareUrl(sampleBundle({ slug: 'sluggy' }))
expect(url).toBe('https://example.com/share/sluggy')
const closeSpy = jest.spyOn(activeModal, 'close')
component.close()
expect(closeSpy).toHaveBeenCalled()
}))
})


@@ -0,0 +1,177 @@
import { Clipboard } from '@angular/cdk/clipboard'
import { CommonModule } from '@angular/common'
import { Component, OnDestroy, OnInit, inject } from '@angular/core'
import { NgbActiveModal, NgbPopoverModule } from '@ng-bootstrap/ng-bootstrap'
import { NgxBootstrapIconsModule } from 'ngx-bootstrap-icons'
import { Subject, catchError, of, switchMap, takeUntil, timer } from 'rxjs'
import { FileVersion } from 'src/app/data/share-link'
import {
SHARE_LINK_BUNDLE_FILE_VERSION_LABELS,
SHARE_LINK_BUNDLE_STATUS_LABELS,
ShareLinkBundleStatus,
ShareLinkBundleSummary,
} from 'src/app/data/share-link-bundle'
import { FileSizePipe } from 'src/app/pipes/file-size.pipe'
import { ShareLinkBundleService } from 'src/app/services/rest/share-link-bundle.service'
import { ToastService } from 'src/app/services/toast.service'
import { environment } from 'src/environments/environment'
import { LoadingComponentWithPermissions } from '../../loading-component/loading.component'
import { ConfirmButtonComponent } from '../confirm-button/confirm-button.component'
@Component({
selector: 'pngx-share-link-bundle-manage-dialog',
templateUrl: './share-link-bundle-manage-dialog.component.html',
styleUrls: ['./share-link-bundle-manage-dialog.component.scss'],
imports: [
ConfirmButtonComponent,
CommonModule,
NgbPopoverModule,
NgxBootstrapIconsModule,
FileSizePipe,
],
})
export class ShareLinkBundleManageDialogComponent
extends LoadingComponentWithPermissions
implements OnInit, OnDestroy
{
private readonly activeModal = inject(NgbActiveModal)
private readonly shareLinkBundleService = inject(ShareLinkBundleService)
private readonly toastService = inject(ToastService)
private readonly clipboard = inject(Clipboard)
title = $localize`Share link bundles`
bundles: ShareLinkBundleSummary[] = []
error: string | null = null
copiedSlug: string | null = null
readonly statuses = ShareLinkBundleStatus
readonly fileVersions = FileVersion
private readonly refresh$ = new Subject<boolean>()
ngOnInit(): void {
this.refresh$
.pipe(
switchMap((silent) => {
if (!silent) {
this.loading = true
}
this.error = null
return this.shareLinkBundleService.listAllBundles().pipe(
catchError((error) => {
if (!silent) {
this.loading = false
}
this.error = $localize`Failed to load share link bundles.`
this.toastService.showError(
$localize`Error retrieving share link bundles.`,
error
)
return of(null)
})
)
}),
takeUntil(this.unsubscribeNotifier)
)
.subscribe((results) => {
if (results) {
this.bundles = results
this.copiedSlug = null
}
this.loading = false
})
this.triggerRefresh(false)
timer(5000, 5000)
.pipe(takeUntil(this.unsubscribeNotifier))
.subscribe(() => this.triggerRefresh(true))
}
ngOnDestroy(): void {
super.ngOnDestroy()
}
getShareUrl(bundle: ShareLinkBundleSummary): string {
const apiURL = new URL(environment.apiBaseUrl)
return `${apiURL.origin}${apiURL.pathname.replace(/\/api\/$/, '/share/')}${
bundle.slug
}`
}
copy(bundle: ShareLinkBundleSummary): void {
if (bundle.status !== ShareLinkBundleStatus.Ready) {
return
}
const success = this.clipboard.copy(this.getShareUrl(bundle))
if (success) {
this.copiedSlug = bundle.slug
setTimeout(() => {
this.copiedSlug = null
}, 3000)
this.toastService.showInfo($localize`Share link copied to clipboard.`)
}
}
delete(bundle: ShareLinkBundleSummary): void {
this.error = null
this.loading = true
this.shareLinkBundleService.delete(bundle).subscribe({
next: () => {
this.toastService.showInfo($localize`Share link bundle deleted.`)
this.triggerRefresh(false)
},
error: (e) => {
this.loading = false
this.toastService.showError(
$localize`Error deleting share link bundle.`,
e
)
},
})
}
retry(bundle: ShareLinkBundleSummary): void {
this.error = null
this.shareLinkBundleService.rebuildBundle(bundle.id).subscribe({
next: (updated) => {
this.toastService.showInfo(
$localize`Share link bundle rebuild requested.`
)
this.replaceBundle(updated)
},
error: (e) => {
this.toastService.showError($localize`Error requesting rebuild.`, e)
},
})
}
statusLabel(status: ShareLinkBundleStatus): string {
return SHARE_LINK_BUNDLE_STATUS_LABELS[status] ?? status
}
fileVersionLabel(version: FileVersion): string {
return SHARE_LINK_BUNDLE_FILE_VERSION_LABELS[version] ?? version
}
close(): void {
this.activeModal.close()
}
private replaceBundle(updated: ShareLinkBundleSummary): void {
const index = this.bundles.findIndex((bundle) => bundle.id === updated.id)
if (index >= 0) {
this.bundles = [
...this.bundles.slice(0, index),
updated,
...this.bundles.slice(index + 1),
]
} else {
this.bundles = [updated, ...this.bundles]
}
}
private triggerRefresh(silent: boolean): void {
this.refresh$.next(silent)
}
}
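
The `getShareUrl` derivation above can be exercised outside Angular. A minimal standalone sketch, assuming `apiBaseUrl` always ends in `/api/` (the helper name `shareUrlFor` is hypothetical, not part of the patch):

```typescript
// Hypothetical helper mirroring getShareUrl(): swap the trailing "/api/"
// path segment of the API base URL for "/share/" and append the bundle slug.
function shareUrlFor(apiBaseUrl: string, slug: string): string {
  const api = new URL(apiBaseUrl)
  return `${api.origin}${api.pathname.replace(/\/api\/$/, '/share/')}${slug}`
}
```

Because the replacement is anchored to the end of `pathname`, a subpath install such as `https://host/base/api/` resolves to `https://host/base/share/<slug>` rather than clobbering the base path.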

View File

@@ -51,7 +51,7 @@
<div class="input-group w-100 mt-2">
<label class="input-group-text" for="addLink"><ng-container i18n>Expires</ng-container>:</label>
<select class="form-select fs-6" [(ngModel)]="expirationDays">
@for (option of EXPIRATION_OPTIONS; track option) {
@for (option of expirationOptions; track option) {
<option [ngValue]="option.value">{{ option.label }}</option>
}
</select>

View File

@@ -4,7 +4,11 @@ import { FormsModule, ReactiveFormsModule } from '@angular/forms'
import { NgbActiveModal } from '@ng-bootstrap/ng-bootstrap'
import { NgxBootstrapIconsModule } from 'ngx-bootstrap-icons'
import { first } from 'rxjs'
import { FileVersion, ShareLink } from 'src/app/data/share-link'
import {
FileVersion,
SHARE_LINK_EXPIRATION_OPTIONS,
ShareLink,
} from 'src/app/data/share-link'
import { ShareLinkService } from 'src/app/services/rest/share-link.service'
import { ToastService } from 'src/app/services/toast.service'
import { environment } from 'src/environments/environment'
@@ -21,12 +25,7 @@ export class ShareLinksDialogComponent implements OnInit {
private toastService = inject(ToastService)
private clipboard = inject(Clipboard)
EXPIRATION_OPTIONS = [
{ label: $localize`1 day`, value: 1 },
{ label: $localize`7 days`, value: 7 },
{ label: $localize`30 days`, value: 30 },
{ label: $localize`Never`, value: null },
]
readonly expirationOptions = SHARE_LINK_EXPIRATION_OPTIONS
@Input()
title = $localize`Share Links`

View File

@@ -146,16 +146,26 @@
<ng-template ngbNavContent>
<div>
<pngx-input-text #inputTitle i18n-title title="Title" formControlName="title" [horizontal]="true" [suggestion]="suggestions?.title" (keyup)="titleKeyUp($event)" [error]="error?.title"></pngx-input-text>
<pngx-input-number i18n-title title="Archive serial number" [error]="error?.archive_serial_number" [horizontal]="true" formControlName='archive_serial_number'></pngx-input-number>
@if (!isFieldHidden(DocumentDetailFieldID.ArchiveSerialNumber)) {
<pngx-input-number i18n-title title="Archive serial number" [error]="error?.archive_serial_number" [horizontal]="true" formControlName='archive_serial_number'></pngx-input-number>
}
<pngx-input-date i18n-title title="Date created" formControlName="created" [suggestions]="suggestions?.dates" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event)"
[error]="error?.created"></pngx-input-date>
<pngx-input-select [items]="correspondents" i18n-title title="Correspondent" formControlName="correspondent" [allowNull]="true" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.Correspondent)"
(createNew)="createCorrespondent($event)" [hideAddButton]="createDisabled(DataType.Correspondent)" [suggestions]="suggestions?.correspondents" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.Correspondent }"></pngx-input-select>
<pngx-input-select [items]="documentTypes" i18n-title title="Document type" formControlName="document_type" [allowNull]="true" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.DocumentType)"
(createNew)="createDocumentType($event)" [hideAddButton]="createDisabled(DataType.DocumentType)" [suggestions]="suggestions?.document_types" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.DocumentType }"></pngx-input-select>
<pngx-input-select [items]="storagePaths" i18n-title title="Storage path" formControlName="storage_path" [allowNull]="true" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.StoragePath)"
(createNew)="createStoragePath($event)" [hideAddButton]="createDisabled(DataType.StoragePath)" [suggestions]="suggestions?.storage_paths" i18n-placeholder placeholder="Default" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.StoragePath }"></pngx-input-select>
<pngx-input-tags #tagsInput formControlName="tags" [suggestions]="suggestions?.tags" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.Tag)" [hideAddButton]="createDisabled(DataType.Tag)" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.Tag }"></pngx-input-tags>
@if (!isFieldHidden(DocumentDetailFieldID.Correspondent)) {
<pngx-input-select [items]="correspondents" i18n-title title="Correspondent" formControlName="correspondent" [allowNull]="true" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.Correspondent)"
(createNew)="createCorrespondent($event)" [hideAddButton]="createDisabled(DataType.Correspondent)" [suggestions]="suggestions?.correspondents" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.Correspondent }"></pngx-input-select>
}
@if (!isFieldHidden(DocumentDetailFieldID.DocumentType)) {
<pngx-input-select [items]="documentTypes" i18n-title title="Document type" formControlName="document_type" [allowNull]="true" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.DocumentType)"
(createNew)="createDocumentType($event)" [hideAddButton]="createDisabled(DataType.DocumentType)" [suggestions]="suggestions?.document_types" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.DocumentType }"></pngx-input-select>
}
@if (!isFieldHidden(DocumentDetailFieldID.StoragePath)) {
<pngx-input-select [items]="storagePaths" i18n-title title="Storage path" formControlName="storage_path" [allowNull]="true" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.StoragePath)"
(createNew)="createStoragePath($event)" [hideAddButton]="createDisabled(DataType.StoragePath)" [suggestions]="suggestions?.storage_paths" i18n-placeholder placeholder="Default" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.StoragePath }"></pngx-input-select>
}
@if (!isFieldHidden(DocumentDetailFieldID.Tags)) {
<pngx-input-tags #tagsInput formControlName="tags" [suggestions]="suggestions?.tags" [showFilter]="true" [horizontal]="true" (filterDocuments)="filterDocuments($event, DataType.Tag)" [hideAddButton]="createDisabled(DataType.Tag)" *pngxIfPermissions="{ action: PermissionAction.View, type: PermissionType.Tag }"></pngx-input-tags>
}
@for (fieldInstance of document?.custom_fields; track fieldInstance.field; let i = $index) {
<div [formGroup]="customFieldFormFields.controls[i]">
@switch (getCustomFieldFromInstance(fieldInstance)?.data_type) {

View File

@@ -48,6 +48,7 @@ import {
} from 'src/app/data/filter-rule-type'
import { StoragePath } from 'src/app/data/storage-path'
import { Tag } from 'src/app/data/tag'
import { SETTINGS_KEYS } from 'src/app/data/ui-settings'
import { PermissionsGuard } from 'src/app/guards/permissions.guard'
import { CustomDatePipe } from 'src/app/pipes/custom-date.pipe'
import { DocumentTitlePipe } from 'src/app/pipes/document-title.pipe'
@@ -68,10 +69,8 @@ import { environment } from 'src/environments/environment'
import { ConfirmDialogComponent } from '../common/confirm-dialog/confirm-dialog.component'
import { PasswordRemovalConfirmDialogComponent } from '../common/confirm-dialog/password-removal-confirm-dialog/password-removal-confirm-dialog.component'
import { CustomFieldsDropdownComponent } from '../common/custom-fields-dropdown/custom-fields-dropdown.component'
import {
DocumentDetailComponent,
ZoomSetting,
} from './document-detail.component'
import { DocumentDetailComponent } from './document-detail.component'
import { ZoomSetting } from './zoom-setting'
const doc: Document = {
id: 3,
@@ -1015,7 +1014,7 @@ describe('DocumentDetailComponent', () => {
it('should display built-in pdf viewer if not disabled', () => {
initNormally()
component.document.archived_file_name = 'file.pdf'
jest.spyOn(settingsService, 'get').mockReturnValue(false)
settingsService.set(SETTINGS_KEYS.USE_NATIVE_PDF_VIEWER, false)
expect(component.useNativePdfViewer).toBeFalsy()
fixture.detectChanges()
expect(fixture.debugElement.query(By.css('pdf-viewer'))).not.toBeNull()
@@ -1024,7 +1023,7 @@ describe('DocumentDetailComponent', () => {
it('should display native pdf viewer if enabled', () => {
initNormally()
component.document.archived_file_name = 'file.pdf'
jest.spyOn(settingsService, 'get').mockReturnValue(true)
settingsService.set(SETTINGS_KEYS.USE_NATIVE_PDF_VIEWER, true)
expect(component.useNativePdfViewer).toBeTruthy()
fixture.detectChanges()
expect(fixture.debugElement.query(By.css('object'))).not.toBeNull()

View File

@@ -84,6 +84,7 @@ import { ToastService } from 'src/app/services/toast.service'
import { getFilenameFromContentDisposition } from 'src/app/utils/http'
import { ISODateAdapter } from 'src/app/utils/ngb-iso-date-adapter'
import * as UTIF from 'utif'
import { DocumentDetailFieldID } from '../admin/settings/settings.component'
import { ConfirmDialogComponent } from '../common/confirm-dialog/confirm-dialog.component'
import { PasswordRemovalConfirmDialogComponent } from '../common/confirm-dialog/password-removal-confirm-dialog/password-removal-confirm-dialog.component'
import { CustomFieldsDropdownComponent } from '../common/custom-fields-dropdown/custom-fields-dropdown.component'
@@ -105,16 +106,15 @@ import { TextComponent } from '../common/input/text/text.component'
import { TextAreaComponent } from '../common/input/textarea/textarea.component'
import { UrlComponent } from '../common/input/url/url.component'
import { PageHeaderComponent } from '../common/page-header/page-header.component'
import {
PDFEditorComponent,
PdfEditorEditMode,
} from '../common/pdf-editor/pdf-editor.component'
import { PdfEditorEditMode } from '../common/pdf-editor/pdf-editor-edit-mode'
import { PDFEditorComponent } from '../common/pdf-editor/pdf-editor.component'
import { ShareLinksDialogComponent } from '../common/share-links-dialog/share-links-dialog.component'
import { SuggestionsDropdownComponent } from '../common/suggestions-dropdown/suggestions-dropdown.component'
import { DocumentHistoryComponent } from '../document-history/document-history.component'
import { DocumentNotesComponent } from '../document-notes/document-notes.component'
import { ComponentWithPermissions } from '../with-permissions/with-permissions.component'
import { MetadataCollapseComponent } from './metadata-collapse/metadata-collapse.component'
import { ZoomSetting } from './zoom-setting'
enum DocumentDetailNavIDs {
Details = 1,
@@ -136,18 +136,6 @@ enum ContentRenderType {
TIFF = 'tiff',
}
export enum ZoomSetting {
PageFit = 'page-fit',
PageWidth = 'page-width',
Quarter = '.25',
Half = '.5',
ThreeQuarters = '.75',
One = '1',
OneAndHalf = '1.5',
Two = '2',
Three = '3',
}
@Component({
selector: 'pngx-document-detail',
templateUrl: './document-detail.component.html',
@@ -281,6 +269,8 @@ export class DocumentDetailComponent
public readonly DataType = DataType
public readonly DocumentDetailFieldID = DocumentDetailFieldID
@ViewChild('nav') nav: NgbNav
@ViewChild('pdfPreview') set pdfPreview(element) {
// this gets called when the component is added to or removed from the DOM
@@ -327,6 +317,12 @@ export class DocumentDetailComponent
return this.settings.get(SETTINGS_KEYS.DOCUMENT_EDITING_OVERLAY_THUMBNAIL)
}
isFieldHidden(fieldId: DocumentDetailFieldID): boolean {
return this.settings
.get(SETTINGS_KEYS.DOCUMENT_DETAILS_HIDDEN_FIELDS)
.includes(fieldId)
}
private getRenderType(mimeType: string): ContentRenderType {
if (!mimeType) return ContentRenderType.Unknown
if (mimeType === 'application/pdf') {

View File

@@ -0,0 +1,11 @@
export enum ZoomSetting {
PageFit = 'page-fit',
PageWidth = 'page-width',
Quarter = '.25',
Half = '.5',
ThreeQuarters = '.75',
One = '1',
OneAndHalf = '1.5',
Two = '2',
Three = '3',
}
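
Because `ZoomSetting` is a string enum, values persisted through UI settings come back as plain strings. A small guard, shown here as a sketch rather than part of the patch, can narrow a stored string before use (enum abbreviated for brevity):

```typescript
// Abbreviated copy of the string enum from zoom-setting.ts.
enum ZoomSetting {
  PageFit = 'page-fit',
  PageWidth = 'page-width',
  One = '1',
}

// Narrow an arbitrary stored string to ZoomSetting, falling back to a default.
function toZoomSetting(value: string, fallback = ZoomSetting.PageWidth): ZoomSetting {
  return (Object.values(ZoomSetting) as string[]).includes(value)
    ? (value as ZoomSetting)
    : fallback
}
```

Moving the enum into its own `zoom-setting.ts` module also lets `ui-settings.ts` use `ZoomSetting.PageWidth` as a typed default without importing the whole `DocumentDetailComponent`.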

View File

@@ -96,14 +96,36 @@
<button ngbDropdownItem (click)="mergeSelected()" [disabled]="!userCanAdd || list.selected.size < 2">
<i-bs name="journals"></i-bs>&nbsp;<ng-container i18n>Merge</ng-container>
</button>
@if (emailEnabled) {
<button ngbDropdownItem (click)="emailSelected()">
<i-bs name="envelope"></i-bs>&nbsp;<ng-container i18n>Email</ng-container>
</button>
}
</div>
</div>
</div>
<div class="btn-toolbar" ngbDropdown>
<button
class="btn btn-sm btn-outline-primary"
id="dropdownSend"
ngbDropdownToggle
[disabled]="disabled || list.selected.size === 0"
>
<i-bs name="send"></i-bs>
<div class="d-none d-sm-inline">
&nbsp;<ng-container i18n>Send</ng-container>
</div>
</button>
<div ngbDropdownMenu aria-labelledby="dropdownSend" class="shadow">
<button ngbDropdownItem (click)="createShareLinkBundle()">
<i-bs name="link"></i-bs>&nbsp;<ng-container i18n>Create a share link bundle</ng-container>
</button>
<button ngbDropdownItem (click)="manageShareLinkBundles()">
<i-bs name="list-ul"></i-bs>&nbsp;<ng-container i18n>Manage share link bundles</ng-container>
</button>
<div class="dropdown-divider"></div>
@if (emailEnabled) {
<button ngbDropdownItem (click)="emailSelected()">
<i-bs name="envelope"></i-bs>&nbsp;<ng-container i18n>Email</ng-container>
</button>
}
</div>
</div>
<div class="btn-group btn-group-sm">
<button class="btn btn-sm btn-outline-primary" [disabled]="awaitingDownload" (click)="downloadSelected()">
@if (!awaitingDownload) {

View File

@@ -3,6 +3,7 @@ import {
HttpTestingController,
provideHttpClientTesting,
} from '@angular/common/http/testing'
import { EventEmitter } from '@angular/core'
import { ComponentFixture, TestBed } from '@angular/core/testing'
import { By } from '@angular/platform-browser'
import { NgbModal, NgbModalRef } from '@ng-bootstrap/ng-bootstrap'
@@ -25,6 +26,7 @@ import {
SelectionData,
} from 'src/app/services/rest/document.service'
import { GroupService } from 'src/app/services/rest/group.service'
import { ShareLinkBundleService } from 'src/app/services/rest/share-link-bundle.service'
import { StoragePathService } from 'src/app/services/rest/storage-path.service'
import { TagService } from 'src/app/services/rest/tag.service'
import { UserService } from 'src/app/services/rest/user.service'
@@ -38,6 +40,8 @@ import { EditDialogMode } from '../../common/edit-dialog/edit-dialog.component'
import { StoragePathEditDialogComponent } from '../../common/edit-dialog/storage-path-edit-dialog/storage-path-edit-dialog.component'
import { TagEditDialogComponent } from '../../common/edit-dialog/tag-edit-dialog/tag-edit-dialog.component'
import { FilterableDropdownComponent } from '../../common/filterable-dropdown/filterable-dropdown.component'
import { ShareLinkBundleDialogComponent } from '../../common/share-link-bundle-dialog/share-link-bundle-dialog.component'
import { ShareLinkBundleManageDialogComponent } from '../../common/share-link-bundle-manage-dialog/share-link-bundle-manage-dialog.component'
import { BulkEditorComponent } from './bulk-editor.component'
const selectionData: SelectionData = {
@@ -72,6 +76,7 @@ describe('BulkEditorComponent', () => {
let storagePathService: StoragePathService
let customFieldsService: CustomFieldsService
let httpTestingController: HttpTestingController
let shareLinkBundleService: ShareLinkBundleService
beforeEach(async () => {
TestBed.configureTestingModule({
@@ -152,6 +157,15 @@ describe('BulkEditorComponent', () => {
}),
},
},
{
provide: ShareLinkBundleService,
useValue: {
createBundle: jest.fn(),
listAllBundles: jest.fn(),
rebuildBundle: jest.fn(),
delete: jest.fn(),
},
},
provideHttpClient(withInterceptorsFromDi()),
provideHttpClientTesting(),
],
@@ -168,6 +182,7 @@ describe('BulkEditorComponent', () => {
storagePathService = TestBed.inject(StoragePathService)
customFieldsService = TestBed.inject(CustomFieldsService)
httpTestingController = TestBed.inject(HttpTestingController)
shareLinkBundleService = TestBed.inject(ShareLinkBundleService)
fixture = TestBed.createComponent(BulkEditorComponent)
component = fixture.componentInstance
@@ -1454,4 +1469,130 @@ describe('BulkEditorComponent', () => {
`${environment.apiBaseUrl}documents/?page=1&page_size=100000&fields=id`
) // listAllFilteredIds
})
it('should create share link bundle and enable manage callback', () => {
jest.spyOn(permissionsService, 'currentUserCan').mockReturnValue(true)
jest
.spyOn(documentListViewService, 'documents', 'get')
.mockReturnValue([{ id: 5 }, { id: 7 }] as any)
jest
.spyOn(documentListViewService, 'selected', 'get')
.mockReturnValue(new Set([5, 7]))
const confirmClicked = new EventEmitter<void>()
const modalRef: Partial<NgbModalRef> = {
close: jest.fn(),
componentInstance: {
documents: [],
confirmClicked,
payload: {
document_ids: [5, 7],
file_version: 'archive',
expiration_days: 7,
},
loading: false,
buttonsEnabled: true,
copied: false,
},
}
const openSpy = jest.spyOn(modalService, 'open')
openSpy.mockReturnValueOnce(modalRef as NgbModalRef)
openSpy.mockReturnValueOnce({} as NgbModalRef)
;(shareLinkBundleService.createBundle as jest.Mock).mockReturnValueOnce(
of({ id: 42 })
)
const toastInfoSpy = jest.spyOn(toastService, 'showInfo')
component.createShareLinkBundle()
expect(openSpy).toHaveBeenNthCalledWith(
1,
ShareLinkBundleDialogComponent,
expect.objectContaining({ backdrop: 'static', size: 'lg' })
)
const dialogInstance = modalRef.componentInstance as any
expect(dialogInstance.documents).toEqual([{ id: 5 }, { id: 7 }])
confirmClicked.emit()
expect(shareLinkBundleService.createBundle).toHaveBeenCalledWith({
document_ids: [5, 7],
file_version: 'archive',
expiration_days: 7,
})
expect(dialogInstance.loading).toBe(false)
expect(dialogInstance.buttonsEnabled).toBe(false)
expect(dialogInstance.createdBundle).toEqual({ id: 42 })
expect(typeof dialogInstance.onOpenManage).toBe('function')
expect(toastInfoSpy).toHaveBeenCalledWith(
$localize`Share link bundle creation requested.`
)
dialogInstance.onOpenManage()
expect(modalRef.close).toHaveBeenCalled()
expect(openSpy).toHaveBeenNthCalledWith(
2,
ShareLinkBundleManageDialogComponent,
expect.objectContaining({ backdrop: 'static', size: 'lg' })
)
openSpy.mockRestore()
})
it('should handle share link bundle creation errors', () => {
jest.spyOn(permissionsService, 'currentUserCan').mockReturnValue(true)
jest
.spyOn(documentListViewService, 'documents', 'get')
.mockReturnValue([{ id: 9 }] as any)
jest
.spyOn(documentListViewService, 'selected', 'get')
.mockReturnValue(new Set([9]))
const confirmClicked = new EventEmitter<void>()
const modalRef: Partial<NgbModalRef> = {
componentInstance: {
documents: [],
confirmClicked,
payload: {
document_ids: [9],
file_version: 'original',
expiration_days: null,
},
loading: false,
buttonsEnabled: true,
},
}
const openSpy = jest
.spyOn(modalService, 'open')
.mockReturnValue(modalRef as NgbModalRef)
;(shareLinkBundleService.createBundle as jest.Mock).mockReturnValueOnce(
throwError(() => new Error('bundle failure'))
)
const toastErrorSpy = jest.spyOn(toastService, 'showError')
component.createShareLinkBundle()
const dialogInstance = modalRef.componentInstance as any
confirmClicked.emit()
expect(toastErrorSpy).toHaveBeenCalledWith(
$localize`Share link bundle creation is not available yet.`,
expect.any(Error)
)
expect(dialogInstance.loading).toBe(false)
expect(dialogInstance.buttonsEnabled).toBe(true)
openSpy.mockRestore()
})
it('should open share link bundle management dialog', () => {
const openSpy = jest.spyOn(modalService, 'open')
component.manageShareLinkBundles()
expect(openSpy).toHaveBeenCalledWith(
ShareLinkBundleManageDialogComponent,
expect.objectContaining({ backdrop: 'static', size: 'lg' })
)
openSpy.mockRestore()
})
})

View File

@@ -33,6 +33,7 @@ import {
SelectionDataItem,
} from 'src/app/services/rest/document.service'
import { SavedViewService } from 'src/app/services/rest/saved-view.service'
import { ShareLinkBundleService } from 'src/app/services/rest/share-link-bundle.service'
import { StoragePathService } from 'src/app/services/rest/storage-path.service'
import { TagService } from 'src/app/services/rest/tag.service'
import { SettingsService } from 'src/app/services/settings.service'
@@ -54,6 +55,8 @@ import {
} from '../../common/filterable-dropdown/filterable-dropdown.component'
import { ToggleableItemState } from '../../common/filterable-dropdown/toggleable-dropdown-button/toggleable-dropdown-button.component'
import { PermissionsDialogComponent } from '../../common/permissions-dialog/permissions-dialog.component'
import { ShareLinkBundleDialogComponent } from '../../common/share-link-bundle-dialog/share-link-bundle-dialog.component'
import { ShareLinkBundleManageDialogComponent } from '../../common/share-link-bundle-manage-dialog/share-link-bundle-manage-dialog.component'
import { ComponentWithPermissions } from '../../with-permissions/with-permissions.component'
import { CustomFieldsBulkEditDialogComponent } from './custom-fields-bulk-edit-dialog/custom-fields-bulk-edit-dialog.component'
@@ -87,6 +90,7 @@ export class BulkEditorComponent
private customFieldService = inject(CustomFieldsService)
private permissionService = inject(PermissionsService)
private savedViewService = inject(SavedViewService)
private readonly shareLinkBundleService = inject(ShareLinkBundleService)
tagSelectionModel = new FilterableDropdownSelectionModel(true)
correspondentSelectionModel = new FilterableDropdownSelectionModel()
@@ -908,6 +912,58 @@ export class BulkEditorComponent
return this.settings.get(SETTINGS_KEYS.EMAIL_ENABLED)
}
createShareLinkBundle() {
const modal = this.modalService.open(ShareLinkBundleDialogComponent, {
backdrop: 'static',
size: 'lg',
})
const dialog = modal.componentInstance as ShareLinkBundleDialogComponent
const selectedDocuments = this.list.documents.filter((d) =>
this.list.selected.has(d.id)
)
dialog.documents = selectedDocuments
dialog.confirmClicked
.pipe(takeUntil(this.unsubscribeNotifier))
.subscribe(() => {
dialog.loading = true
dialog.buttonsEnabled = false
this.shareLinkBundleService
.createBundle(dialog.payload)
.pipe(first())
.subscribe({
next: (result) => {
dialog.loading = false
dialog.buttonsEnabled = false
dialog.createdBundle = result
dialog.copied = false
dialog.payload = null
dialog.onOpenManage = () => {
modal.close()
this.manageShareLinkBundles()
}
this.toastService.showInfo(
$localize`Share link bundle creation requested.`
)
},
error: (error) => {
dialog.loading = false
dialog.buttonsEnabled = true
this.toastService.showError(
$localize`Share link bundle creation is not available yet.`,
error
)
},
})
})
}
manageShareLinkBundles() {
this.modalService.open(ShareLinkBundleManageDialogComponent, {
backdrop: 'static',
size: 'lg',
})
}
emailSelected() {
const allHaveArchiveVersion = this.list.documents
.filter((d) => this.list.selected.has(d.id))

View File

@@ -0,0 +1,53 @@
import { FileVersion } from './share-link'
export enum ShareLinkBundleStatus {
Pending = 'pending',
Processing = 'processing',
Ready = 'ready',
Failed = 'failed',
}
export type ShareLinkBundleError = {
bundle_id: number
message?: string
exception_type?: string
timestamp?: string
}
export interface ShareLinkBundleSummary {
id: number
slug: string
created: string // Date
expiration?: string // Date
documents: number[]
document_count: number
file_version: FileVersion
status: ShareLinkBundleStatus
built_at?: string
size_bytes?: number
last_error?: ShareLinkBundleError
}
export interface ShareLinkBundleCreatePayload {
document_ids: number[]
file_version: FileVersion
expiration_days: number | null
}
export const SHARE_LINK_BUNDLE_STATUS_LABELS: Record<
ShareLinkBundleStatus,
string
> = {
[ShareLinkBundleStatus.Pending]: $localize`Pending`,
[ShareLinkBundleStatus.Processing]: $localize`Processing`,
[ShareLinkBundleStatus.Ready]: $localize`Ready`,
[ShareLinkBundleStatus.Failed]: $localize`Failed`,
}
export const SHARE_LINK_BUNDLE_FILE_VERSION_LABELS: Record<
FileVersion,
string
> = {
[FileVersion.Archive]: $localize`Archive`,
[FileVersion.Original]: $localize`Original`,
}
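
The label maps above are `Record`s keyed by the enum, which keeps them exhaustive at compile time, while lookups in the dialog fall back to the raw value (`?? status`) so an unrecognized status from a newer backend still renders. A condensed, framework-free sketch of that pattern (plain strings stand in for `$localize` here):

```typescript
// Abbreviated status enum and label map, mirroring the data model above.
enum ShareLinkBundleStatus {
  Pending = 'pending',
  Ready = 'ready',
}

// Record keyed by the enum: adding a status without a label is a type error.
const STATUS_LABELS: Record<ShareLinkBundleStatus, string> = {
  [ShareLinkBundleStatus.Pending]: 'Pending',
  [ShareLinkBundleStatus.Ready]: 'Ready',
}

// Unknown values fall through unchanged instead of rendering blank.
function statusLabel(status: ShareLinkBundleStatus | string): string {
  return STATUS_LABELS[status as ShareLinkBundleStatus] ?? status
}
```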

View File

@@ -5,6 +5,18 @@ export enum FileVersion {
Original = 'original',
}
export interface ShareLinkExpirationOption {
label: string
value: number | null
}
export const SHARE_LINK_EXPIRATION_OPTIONS: ShareLinkExpirationOption[] = [
{ label: $localize`1 day`, value: 1 },
{ label: $localize`7 days`, value: 7 },
{ label: $localize`30 days`, value: 30 },
{ label: $localize`Never`, value: null },
]
export interface ShareLink extends ObjectWithPermissions {
created: string // Date

View File

@@ -1,3 +1,5 @@
import { PdfEditorEditMode } from '../components/common/pdf-editor/pdf-editor-edit-mode'
import { ZoomSetting } from '../components/document-detail/zoom-setting'
import { User } from './user'
export interface UiSettings {
@@ -70,8 +72,12 @@ export const SETTINGS_KEYS = {
'general-settings:document-editing:remove-inbox-tags',
DOCUMENT_EDITING_OVERLAY_THUMBNAIL:
'general-settings:document-editing:overlay-thumbnail',
DOCUMENT_DETAILS_HIDDEN_FIELDS:
'general-settings:document-details:hidden-fields',
SEARCH_DB_ONLY: 'general-settings:search:db-only',
SEARCH_FULL_TYPE: 'general-settings:search:more-link',
PDF_EDITOR_DEFAULT_EDIT_MODE:
'general-settings:document-editing:default-edit-mode',
EMPTY_TRASH_DELAY: 'trash_delay',
GMAIL_OAUTH_URL: 'gmail_oauth_url',
OUTLOOK_OAUTH_URL: 'outlook_oauth_url',
@@ -255,6 +261,11 @@ export const SETTINGS: UiSetting[] = [
type: 'boolean',
default: true,
},
{
key: SETTINGS_KEYS.DOCUMENT_DETAILS_HIDDEN_FIELDS,
type: 'array',
default: [],
},
{
key: SETTINGS_KEYS.SEARCH_DB_ONLY,
type: 'boolean',
@@ -288,11 +299,16 @@ export const SETTINGS: UiSetting[] = [
{
key: SETTINGS_KEYS.PDF_VIEWER_ZOOM_SETTING,
type: 'string',
default: 'page-width', // ZoomSetting from 'document-detail.component'
default: ZoomSetting.PageWidth,
},
{
key: SETTINGS_KEYS.AI_ENABLED,
type: 'boolean',
default: false,
},
{
key: SETTINGS_KEYS.PDF_EDITOR_DEFAULT_EDIT_MODE,
type: 'string',
default: PdfEditorEditMode.Create,
},
]
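
Each `SETTINGS` entry pairs a key with a type and a default, so a lookup can fall back to the declared default when nothing is stored. A framework-free sketch of that resolution (simplified types and a `'create'` default are assumptions of this sketch, not the actual `SettingsService`):

```typescript
interface UiSetting {
  key: string
  type: 'string' | 'boolean' | 'array'
  default: unknown
}

// Two entries from the table above, with an assumed 'create' edit-mode default.
const SETTINGS: UiSetting[] = [
  { key: 'general-settings:document-details:hidden-fields', type: 'array', default: [] },
  { key: 'general-settings:document-editing:default-edit-mode', type: 'string', default: 'create' },
]

// Return the stored value if present, otherwise the declared default.
function getSetting(stored: Map<string, unknown>, key: string): unknown {
  if (stored.has(key)) return stored.get(key)
  return SETTINGS.find((s) => s.key === key)?.default
}
```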

View File

@@ -1,30 +1,41 @@
import { HttpEvent, HttpRequest } from '@angular/common/http'
import {
HttpClient,
provideHttpClient,
withInterceptors,
} from '@angular/common/http'
import {
HttpTestingController,
provideHttpClientTesting,
} from '@angular/common/http/testing'
import { TestBed } from '@angular/core/testing'
import { of } from 'rxjs'
import { environment } from 'src/environments/environment'
import { ApiVersionInterceptor } from './api-version.interceptor'
import { withApiVersionInterceptor } from './api-version.interceptor'
describe('ApiVersionInterceptor', () => {
let interceptor: ApiVersionInterceptor
let httpClient: HttpClient
let httpMock: HttpTestingController
beforeEach(() => {
TestBed.configureTestingModule({
providers: [ApiVersionInterceptor],
providers: [
provideHttpClient(withInterceptors([withApiVersionInterceptor])),
provideHttpClientTesting(),
],
})
interceptor = TestBed.inject(ApiVersionInterceptor)
httpClient = TestBed.inject(HttpClient)
httpMock = TestBed.inject(HttpTestingController)
})
it('should add api version to headers', () => {
interceptor.intercept(new HttpRequest('GET', 'https://example.com'), {
handle: (request) => {
const header = request.headers['lazyUpdate'][0]
expect(header.name).toEqual('Accept')
expect(header.value).toEqual(
`application/json; version=${environment.apiVersion}`
)
return of({} as HttpEvent<any>)
},
})
httpClient.get('https://example.com').subscribe()
const request = httpMock.expectOne('https://example.com')
const header = request.request.headers['lazyUpdate'][0]
expect(header.name).toEqual('Accept')
expect(header.value).toEqual(
`application/json; version=${environment.apiVersion}`
)
request.flush({})
})
})

View File

@@ -1,27 +1,20 @@
import {
HttpEvent,
HttpHandler,
HttpInterceptor,
HttpHandlerFn,
HttpInterceptorFn,
HttpRequest,
} from '@angular/common/http'
import { Injectable } from '@angular/core'
import { Observable } from 'rxjs'
import { environment } from 'src/environments/environment'
@Injectable()
export class ApiVersionInterceptor implements HttpInterceptor {
constructor() {}
intercept(
request: HttpRequest<unknown>,
next: HttpHandler
): Observable<HttpEvent<unknown>> {
request = request.clone({
setHeaders: {
Accept: `application/json; version=${environment.apiVersion}`,
},
})
return next.handle(request)
}
export const withApiVersionInterceptor: HttpInterceptorFn = (
request: HttpRequest<unknown>,
next: HttpHandlerFn
): Observable<HttpEvent<unknown>> => {
request = request.clone({
setHeaders: {
Accept: `application/json; version=${environment.apiVersion}`,
},
})
return next(request)
}

View File

@@ -1,35 +1,52 @@
import { HttpEvent, HttpRequest } from '@angular/common/http'
import {
HttpClient,
provideHttpClient,
withInterceptors,
} from '@angular/common/http'
import {
HttpTestingController,
provideHttpClientTesting,
} from '@angular/common/http/testing'
import { TestBed } from '@angular/core/testing'
import { Meta } from '@angular/platform-browser'
import { CookieService } from 'ngx-cookie-service'
import { of } from 'rxjs'
import { CsrfInterceptor } from './csrf.interceptor'
import { withCsrfInterceptor } from './csrf.interceptor'
describe('CsrfInterceptor', () => {
let interceptor: CsrfInterceptor
let meta: Meta
let cookieService: CookieService
let httpClient: HttpClient
let httpMock: HttpTestingController
beforeEach(() => {
TestBed.configureTestingModule({
providers: [CsrfInterceptor, Meta, CookieService],
providers: [
Meta,
CookieService,
provideHttpClient(withInterceptors([withCsrfInterceptor])),
provideHttpClientTesting(),
],
})
meta = TestBed.inject(Meta)
cookieService = TestBed.inject(CookieService)
interceptor = TestBed.inject(CsrfInterceptor)
httpClient = TestBed.inject(HttpClient)
httpMock = TestBed.inject(HttpTestingController)
})
it('should get csrf token', () => {
meta.addTag({ name: 'cookie_prefix', content: 'ngx-' }, true)
const cookieServiceSpy = jest.spyOn(cookieService, 'get')
cookieServiceSpy.mockReturnValue('csrftoken')
interceptor.intercept(new HttpRequest('GET', 'https://example.com'), {
handle: (request) => {
expect(request.headers['lazyUpdate'][0]['name']).toEqual('X-CSRFToken')
return of({} as HttpEvent<any>)
},
})
httpClient.get('https://example.com').subscribe()
const request = httpMock.expectOne('https://example.com')
expect(request.request.headers['lazyUpdate'][0]['name']).toEqual(
'X-CSRFToken'
)
expect(cookieServiceSpy).toHaveBeenCalled()
request.flush({})
})
})

View File

@@ -1,36 +1,32 @@
import {
HttpEvent,
HttpHandler,
HttpInterceptor,
HttpHandlerFn,
HttpInterceptorFn,
HttpRequest,
} from '@angular/common/http'
import { inject, Injectable } from '@angular/core'
import { inject } from '@angular/core'
import { Meta } from '@angular/platform-browser'
import { CookieService } from 'ngx-cookie-service'
import { Observable } from 'rxjs'
@Injectable()
export class CsrfInterceptor implements HttpInterceptor {
private cookieService: CookieService = inject(CookieService)
private meta: Meta = inject(Meta)
export const withCsrfInterceptor: HttpInterceptorFn = (
request: HttpRequest<unknown>,
next: HttpHandlerFn
): Observable<HttpEvent<unknown>> => {
const cookieService: CookieService = inject(CookieService)
const meta: Meta = inject(Meta)
intercept(
request: HttpRequest<unknown>,
next: HttpHandler
): Observable<HttpEvent<unknown>> {
let prefix = ''
if (this.meta.getTag('name=cookie_prefix')) {
prefix = this.meta.getTag('name=cookie_prefix').content
}
let csrfToken = this.cookieService.get(`${prefix}csrftoken`)
if (csrfToken) {
request = request.clone({
setHeaders: {
'X-CSRFToken': csrfToken,
},
})
}
return next.handle(request)
let prefix = ''
if (meta.getTag('name=cookie_prefix')) {
prefix = meta.getTag('name=cookie_prefix').content
}
let csrfToken = cookieService.get(`${prefix}csrftoken`)
if (csrfToken) {
request = request.clone({
setHeaders: {
'X-CSRFToken': csrfToken,
},
})
}
return next(request)
}
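The interceptor above resolves an optional cookie-name prefix from a `<meta>` tag, reads the (possibly prefixed) `csrftoken` cookie, and attaches the `X-CSRFToken` header only when a token exists. The same decision logic can be sketched framework-agnostically in Python; `csrf_header` is a hypothetical helper, not part of either codebase:

```python
def csrf_header(cookies: dict, cookie_prefix: str = "") -> dict:
    """Framework-agnostic sketch of the CSRF interceptor's logic:
    look up the (optionally prefixed) csrftoken cookie and emit the
    X-CSRFToken header only when the cookie holds a non-empty value."""
    token = cookies.get(f"{cookie_prefix}csrftoken", "")
    if token:
        return {"X-CSRFToken": token}
    # No token: the request goes out unmodified.
    return {}

# Mirrors the spec above, where a 'cookie_prefix' meta tag of 'ngx-'
# makes the interceptor read the 'ngx-csrftoken' cookie.
headers = csrf_header({"ngx-csrftoken": "abc"}, cookie_prefix="ngx-")
```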

View File

@@ -0,0 +1,60 @@
import { HttpTestingController } from '@angular/common/http/testing'
import { TestBed } from '@angular/core/testing'
import { Subscription } from 'rxjs'
import { environment } from 'src/environments/environment'
import { commonAbstractPaperlessServiceTests } from './abstract-paperless-service.spec'
import { ShareLinkBundleService } from './share-link-bundle.service'
const endpoint = 'share_link_bundles'
commonAbstractPaperlessServiceTests(endpoint, ShareLinkBundleService)
describe('ShareLinkBundleService', () => {
let httpTestingController: HttpTestingController
let service: ShareLinkBundleService
let subscription: Subscription | undefined
beforeEach(() => {
httpTestingController = TestBed.inject(HttpTestingController)
service = TestBed.inject(ShareLinkBundleService)
})
afterEach(() => {
subscription?.unsubscribe()
httpTestingController.verify()
})
it('creates bundled share links', () => {
const payload = {
document_ids: [1, 2],
file_version: 'archive',
expiration_days: 7,
}
subscription = service.createBundle(payload as any).subscribe()
const req = httpTestingController.expectOne(
`${environment.apiBaseUrl}${endpoint}/`
)
expect(req.request.method).toBe('POST')
expect(req.request.body).toEqual(payload)
req.flush({})
})
it('rebuilds bundles', () => {
subscription = service.rebuildBundle(12).subscribe()
const req = httpTestingController.expectOne(
`${environment.apiBaseUrl}${endpoint}/12/rebuild/`
)
expect(req.request.method).toBe('POST')
expect(req.request.body).toEqual({})
req.flush({})
})
it('lists bundles with expected parameters', () => {
subscription = service.listAllBundles().subscribe()
const req = httpTestingController.expectOne(
`${environment.apiBaseUrl}${endpoint}/?page=1&page_size=1000&ordering=-created`
)
expect(req.request.method).toBe('GET')
req.flush({ results: [] })
})
})

View File

@@ -0,0 +1,41 @@
import { Injectable } from '@angular/core'
import { Observable } from 'rxjs'
import { map } from 'rxjs/operators'
import {
ShareLinkBundleCreatePayload,
ShareLinkBundleSummary,
} from 'src/app/data/share-link-bundle'
import { AbstractNameFilterService } from './abstract-name-filter-service'
@Injectable({
providedIn: 'root',
})
export class ShareLinkBundleService extends AbstractNameFilterService<ShareLinkBundleSummary> {
constructor() {
super()
this.resourceName = 'share_link_bundles'
}
createBundle(
payload: ShareLinkBundleCreatePayload
): Observable<ShareLinkBundleSummary> {
this.clearCache()
return this.http.post<ShareLinkBundleSummary>(
this.getResourceUrl(),
payload
)
}
rebuildBundle(bundleId: number): Observable<ShareLinkBundleSummary> {
this.clearCache()
return this.http.post<ShareLinkBundleSummary>(
this.getResourceUrl(bundleId, 'rebuild'),
{}
)
}
listAllBundles(): Observable<ShareLinkBundleSummary[]> {
return this.list(1, 1000, 'created', true).pipe(
map((response) => response.results)
)
}
}

View File

@@ -1,16 +1,16 @@
import {
APP_INITIALIZER,
enableProdMode,
importProvidersFrom,
inject,
provideAppInitializer,
provideZoneChangeDetection,
} from '@angular/core'
import { DragDropModule } from '@angular/cdk/drag-drop'
import { DatePipe, registerLocaleData } from '@angular/common'
import {
HTTP_INTERCEPTORS,
provideHttpClient,
withFetch,
withInterceptors,
withInterceptorsFromDi,
} from '@angular/common/http'
import { FormsModule, ReactiveFormsModule } from '@angular/forms'
@@ -151,15 +151,14 @@ import { AppComponent } from './app/app.component'
import { DirtyDocGuard } from './app/guards/dirty-doc.guard'
import { DirtySavedViewGuard } from './app/guards/dirty-saved-view.guard'
import { PermissionsGuard } from './app/guards/permissions.guard'
import { ApiVersionInterceptor } from './app/interceptors/api-version.interceptor'
import { CsrfInterceptor } from './app/interceptors/csrf.interceptor'
import { withApiVersionInterceptor } from './app/interceptors/api-version.interceptor'
import { withCsrfInterceptor } from './app/interceptors/csrf.interceptor'
import { DocumentTitlePipe } from './app/pipes/document-title.pipe'
import { FilterPipe } from './app/pipes/filter.pipe'
import { UsernamePipe } from './app/pipes/username.pipe'
import { SettingsService } from './app/services/settings.service'
import { LocalizedDateParserFormatter } from './app/utils/ngb-date-parser-formatter'
import { ISODateAdapter } from './app/utils/ngb-iso-date-adapter'
import { environment } from './environments/environment'
import localeAf from '@angular/common/locales/af'
import localeAr from '@angular/common/locales/ar'
@@ -237,11 +236,11 @@ registerLocaleData(localeUk)
registerLocaleData(localeZh)
registerLocaleData(localeZhHant)
function initializeApp(settings: SettingsService) {
return () => {
return settings.initializeSettings()
}
function initializeApp() {
const settings = inject(SettingsService)
return settings.initializeSettings()
}
const icons = {
airplane,
archive,
@@ -363,10 +362,6 @@ const icons = {
xLg,
}
if (environment.production) {
enableProdMode()
}
bootstrapApplication(AppComponent, {
providers: [
provideZoneChangeDetection(),
@@ -383,24 +378,9 @@ bootstrapApplication(AppComponent, {
DragDropModule,
NgxBootstrapIconsModule.pick(icons)
),
{
provide: APP_INITIALIZER,
useFactory: initializeApp,
deps: [SettingsService],
multi: true,
},
provideAppInitializer(initializeApp),
DatePipe,
CookieService,
{
provide: HTTP_INTERCEPTORS,
useClass: CsrfInterceptor,
multi: true,
},
{
provide: HTTP_INTERCEPTORS,
useClass: ApiVersionInterceptor,
multi: true,
},
FilterPipe,
DocumentTitlePipe,
{ provide: NgbDateAdapter, useClass: ISODateAdapter },
@@ -412,6 +392,10 @@ bootstrapApplication(AppComponent, {
CorrespondentNamePipe,
DocumentTypeNamePipe,
StoragePathNamePipe,
provideHttpClient(withInterceptorsFromDi(), withFetch()),
provideHttpClient(
withInterceptorsFromDi(),
withInterceptors([withCsrfInterceptor, withApiVersionInterceptor]),
withFetch()
),
],
}).catch((err) => console.error(err))

View File

@@ -13,6 +13,7 @@ from documents.models import PaperlessTask
from documents.models import SavedView
from documents.models import SavedViewFilterRule
from documents.models import ShareLink
from documents.models import ShareLinkBundle
from documents.models import StoragePath
from documents.models import Tag
from documents.tasks import update_document_parent_tags
@@ -184,6 +185,22 @@ class ShareLinksAdmin(GuardedModelAdmin):
return super().get_queryset(request).select_related("document__correspondent")
class ShareLinkBundleAdmin(GuardedModelAdmin):
list_display = ("created", "status", "expiration", "owner", "slug")
list_filter = ("status", "created", "expiration", "owner")
search_fields = ("slug",)
def get_queryset(self, request): # pragma: no cover
return (
super()
.get_queryset(request)
.select_related("owner")
.prefetch_related(
"documents",
)
)
class CustomFieldsAdmin(GuardedModelAdmin):
fields = ("name", "created", "data_type")
readonly_fields = ("created", "data_type")
@@ -215,6 +232,7 @@ admin.site.register(StoragePath, StoragePathAdmin)
admin.site.register(PaperlessTask, TaskAdmin)
admin.site.register(Note, NotesAdmin)
admin.site.register(ShareLink, ShareLinksAdmin)
admin.site.register(ShareLinkBundle, ShareLinkBundleAdmin)
admin.site.register(CustomField, CustomFieldsAdmin)
admin.site.register(CustomFieldInstance, CustomFieldInstancesAdmin)

View File

@@ -39,6 +39,7 @@ from documents.models import Document
from documents.models import DocumentType
from documents.models import PaperlessTask
from documents.models import ShareLink
from documents.models import ShareLinkBundle
from documents.models import StoragePath
from documents.models import Tag
@@ -796,6 +797,29 @@ class ShareLinkFilterSet(FilterSet):
}
class ShareLinkBundleFilterSet(FilterSet):
documents = Filter(method="filter_documents")
class Meta:
model = ShareLinkBundle
fields = {
"created": DATETIME_KWARGS,
"expiration": DATETIME_KWARGS,
"status": ["exact"],
}
def filter_documents(self, queryset, name, value):
ids = []
if value:
try:
ids = [int(item) for item in value.split(",") if item]
except ValueError:
return queryset.none()
if not ids:
return queryset
return queryset.filter(documents__in=ids).distinct()
class PaperlessTaskFilterSet(FilterSet):
acknowledged = BooleanFilter(
label="Acknowledged",

View File

@@ -501,9 +501,22 @@ class Command(BaseCommand):
stability_timeout_ms = int(stability_delay * 1000)
testing_timeout_ms = int(self.testing_timeout_s * 1000)
# Start with no timeout (wait indefinitely for first event)
# unless in testing mode
timeout_ms = testing_timeout_ms if is_testing else 0
# Calculate appropriate timeout for watch loop
# In polling mode, rust_timeout must be significantly longer than poll_delay_ms
# to ensure poll cycles can complete before timing out
if is_testing:
if use_polling:
# For polling: timeout must be at least 3x the poll interval to allow
# multiple poll cycles. This prevents timeouts from interfering with
# the polling mechanism.
min_polling_timeout_ms = poll_delay_ms * 3
timeout_ms = max(min_polling_timeout_ms, testing_timeout_ms)
else:
# For native watching, use short timeout to check stop flag
timeout_ms = testing_timeout_ms
else:
# Not testing, wait indefinitely for first event
timeout_ms = 0
self.stop_flag.clear()
@@ -543,8 +556,14 @@ class Command(BaseCommand):
# Check pending files at stability interval
timeout_ms = stability_timeout_ms
elif is_testing:
# In testing, use short timeout to check stop flag
timeout_ms = testing_timeout_ms
# In testing, use appropriate timeout based on watch mode
if use_polling:
# For polling: ensure timeout allows polls to complete
min_polling_timeout_ms = poll_delay_ms * 3
timeout_ms = max(min_polling_timeout_ms, testing_timeout_ms)
else:
# For native watching, use short timeout to check stop flag
timeout_ms = testing_timeout_ms
else: # pragma: nocover
# No pending files, wait indefinitely
timeout_ms = 0
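Both hunks above apply the same timeout-selection rule: outside testing, wait indefinitely (timeout 0); in testing with native watching, use the short testing timeout so the stop flag is checked; in testing with polling, floor the timeout at three poll intervals so poll cycles can complete. That rule can be isolated into a small function (`compute_timeout_ms` is a hypothetical name, not in the patch):

```python
def compute_timeout_ms(
    *,
    is_testing: bool,
    use_polling: bool,
    poll_delay_ms: int,
    testing_timeout_ms: int,
) -> int:
    """Sketch of the watch-loop timeout selection from the diff above.

    A return value of 0 means "wait indefinitely for the first event".
    In polling mode the timeout must be at least 3x the poll interval so
    timeouts do not interfere with the polling mechanism itself.
    """
    if not is_testing:
        return 0
    if use_polling:
        return max(poll_delay_ms * 3, testing_timeout_ms)
    return testing_timeout_ms
```

With a 1000 ms poll interval and a 500 ms testing timeout, the polling floor wins and the loop waits 3000 ms.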

View File

@@ -1,481 +0,0 @@
from __future__ import annotations
from django.db.models import Q
from django.http import QueryDict
from mcp_server import MCPToolset
from mcp_server import ModelQueryToolset
from mcp_server import drf_publish_create_mcp_tool
from mcp_server import drf_publish_destroy_mcp_tool
from mcp_server import drf_publish_list_mcp_tool
from mcp_server import drf_publish_update_mcp_tool
from rest_framework.response import Response
from documents.models import Correspondent
from documents.models import CustomField
from documents.models import Document
from documents.models import DocumentType
from documents.models import Note
from documents.models import SavedView
from documents.models import ShareLink
from documents.models import StoragePath
from documents.models import Tag
from documents.models import Workflow
from documents.models import WorkflowAction
from documents.models import WorkflowTrigger
from documents.permissions import get_objects_for_user_owner_aware
from documents.views import CorrespondentViewSet
from documents.views import CustomFieldViewSet
from documents.views import DocumentTypeViewSet
from documents.views import SavedViewViewSet
from documents.views import ShareLinkViewSet
from documents.views import StoragePathViewSet
from documents.views import TagViewSet
from documents.views import TasksViewSet
from documents.views import UnifiedSearchViewSet
from documents.views import WorkflowActionViewSet
from documents.views import WorkflowTriggerViewSet
from documents.views import WorkflowViewSet
VIEWSET_ACTIONS = {
"create": {"post": "create"},
"list": {"get": "list"},
"update": {"put": "update"},
"destroy": {"delete": "destroy"},
}
BODY_SCHEMA = {"type": "object", "additionalProperties": True}
VIEWSET_INSTRUCTIONS = {
CorrespondentViewSet: "Manage correspondents.",
TagViewSet: "Manage tags.",
UnifiedSearchViewSet: "Search and manage documents.",
DocumentTypeViewSet: "Manage document types.",
StoragePathViewSet: "Manage storage paths.",
SavedViewViewSet: "Manage saved views.",
ShareLinkViewSet: "Manage share links.",
WorkflowTriggerViewSet: "Manage workflow triggers.",
WorkflowActionViewSet: "Manage workflow actions.",
WorkflowViewSet: "Manage workflows.",
CustomFieldViewSet: "Manage custom fields.",
TasksViewSet: "List background tasks.",
}
class OwnerAwareQueryToolsetMixin:
permission: str
def get_queryset(self):
user = getattr(self.request, "user", None)
if not user or not user.is_authenticated:
return self.model.objects.none()
if user.is_superuser:
return self.model._default_manager.all()
return get_objects_for_user_owner_aware(user, self.permission, self.model)
class DocumentQueryToolset(ModelQueryToolset):
model = Document
search_fields = ["title", "content"]
def get_queryset(self):
user = getattr(self.request, "user", None)
if not user or not user.is_authenticated:
return Document.objects.none()
if user.is_superuser:
return Document.objects.all()
return get_objects_for_user_owner_aware(
user,
"documents.view_document",
Document,
)
class CorrespondentQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = Correspondent
permission = "documents.view_correspondent"
class TagQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = Tag
permission = "documents.view_tag"
class DocumentTypeQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = DocumentType
permission = "documents.view_documenttype"
class StoragePathQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = StoragePath
permission = "documents.view_storagepath"
class SavedViewQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = SavedView
permission = "documents.view_savedview"
class ShareLinkQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = ShareLink
permission = "documents.view_sharelink"
class WorkflowTriggerQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = WorkflowTrigger
permission = "documents.view_workflowtrigger"
class WorkflowActionQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = WorkflowAction
permission = "documents.view_workflowaction"
class WorkflowQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
model = Workflow
permission = "documents.view_workflow"
class NoteQueryToolset(ModelQueryToolset):
model = Note
def get_queryset(self):
user = getattr(self.request, "user", None)
if not user or not user.is_authenticated:
return Note.objects.none()
if user.is_superuser:
return Note.objects.all()
return Note.objects.filter(
document__in=get_objects_for_user_owner_aware(
user,
"documents.view_document",
Document,
),
)
class CustomFieldQueryToolset(ModelQueryToolset):
model = CustomField
def get_queryset(self):
user = getattr(self.request, "user", None)
base = CustomField.objects.all()
if not user or not user.is_authenticated:
return base.none()
if user.is_superuser:
return base
return base.filter(
Q(
fields__document__id__in=get_objects_for_user_owner_aware(
user,
"documents.view_document",
Document,
),
)
| Q(fields__document__isnull=True),
).distinct()
class DocumentSearchTools(MCPToolset):
def search_documents(
self,
query: str | None = None,
more_like_id: int | None = None,
fields: list[str] | None = None,
page: int | None = None,
page_size: int | None = None,
*,
full_perms: bool | None = None,
) -> dict:
"""Search documents using the full-text index."""
if not query and not more_like_id:
raise ValueError("Provide either query or more_like_id.")
request = self.request
if request is None:
raise ValueError("Request context is required.")
viewset = UnifiedSearchViewSet()
viewset.request = request
viewset.args = ()
viewset.kwargs = {}
viewset.action = "list"
viewset.format_kwarg = None
viewset.check_permissions(request)
query_params = QueryDict(mutable=True)
if query:
query_params["query"] = query
if more_like_id:
query_params["more_like_id"] = str(more_like_id)
if full_perms is not None:
query_params["full_perms"] = str(full_perms).lower()
if page:
query_params["page"] = str(page)
if page_size:
query_params["page_size"] = str(page_size)
if fields:
query_params.setlist("fields", fields)
request._request.GET = query_params
response = viewset.list(request)
if isinstance(response, Response):
return response.data
if hasattr(response, "data"):
return response.data
return {
"detail": getattr(response, "content", b"").decode() or "Search failed.",
}
drf_publish_create_mcp_tool(
CorrespondentViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
CorrespondentViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
)
drf_publish_update_mcp_tool(
CorrespondentViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
CorrespondentViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
)
drf_publish_create_mcp_tool(
TagViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
TagViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
)
drf_publish_update_mcp_tool(
TagViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
TagViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
)
drf_publish_list_mcp_tool(
UnifiedSearchViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[UnifiedSearchViewSet],
)
drf_publish_update_mcp_tool(
UnifiedSearchViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[UnifiedSearchViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
UnifiedSearchViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[UnifiedSearchViewSet],
)
drf_publish_create_mcp_tool(
DocumentTypeViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
DocumentTypeViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
)
drf_publish_update_mcp_tool(
DocumentTypeViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
DocumentTypeViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
)
drf_publish_create_mcp_tool(
StoragePathViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
StoragePathViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
)
drf_publish_update_mcp_tool(
StoragePathViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
StoragePathViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
)
drf_publish_create_mcp_tool(
SavedViewViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
SavedViewViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
)
drf_publish_update_mcp_tool(
SavedViewViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
SavedViewViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
)
drf_publish_create_mcp_tool(
ShareLinkViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
ShareLinkViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
)
drf_publish_update_mcp_tool(
ShareLinkViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
ShareLinkViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
)
drf_publish_create_mcp_tool(
WorkflowTriggerViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
WorkflowTriggerViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
)
drf_publish_update_mcp_tool(
WorkflowTriggerViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
WorkflowTriggerViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
)
drf_publish_create_mcp_tool(
WorkflowActionViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
WorkflowActionViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
)
drf_publish_update_mcp_tool(
WorkflowActionViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
WorkflowActionViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
)
drf_publish_create_mcp_tool(
WorkflowViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
WorkflowViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
)
drf_publish_update_mcp_tool(
WorkflowViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
WorkflowViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
)
drf_publish_create_mcp_tool(
CustomFieldViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
CustomFieldViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
)
drf_publish_update_mcp_tool(
CustomFieldViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
CustomFieldViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
)
drf_publish_list_mcp_tool(
TasksViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[TasksViewSet],
)

View File

@@ -0,0 +1,177 @@
# Generated by Django 5.2.9 on 2026-01-27 01:09
import django.db.models.deletion
import django.db.models.functions.text
import django.utils.timezone
from django.conf import settings
from django.contrib.auth.management import create_permissions
from django.contrib.auth.models import Group
from django.contrib.auth.models import Permission
from django.contrib.auth.models import User
from django.db import migrations
from django.db import models
def grant_share_link_bundle_permissions(apps, schema_editor):
# Ensure newly introduced permissions are created for all apps
for app_config in apps.get_app_configs():
app_config.models_module = True
create_permissions(app_config, apps=apps, verbosity=0)
app_config.models_module = None
add_document_perm = Permission.objects.filter(codename="add_document").first()
share_bundle_permissions = Permission.objects.filter(
codename__contains="sharelinkbundle",
)
users = User.objects.filter(user_permissions=add_document_perm).distinct()
for user in users:
user.user_permissions.add(*share_bundle_permissions)
groups = Group.objects.filter(permissions=add_document_perm).distinct()
for group in groups:
group.permissions.add(*share_bundle_permissions)
def revoke_share_link_bundle_permissions(apps, schema_editor):
share_bundle_permissions = Permission.objects.filter(
codename__contains="sharelinkbundle",
)
for user in User.objects.all():
user.user_permissions.remove(*share_bundle_permissions)
for group in Group.objects.all():
group.permissions.remove(*share_bundle_permissions)
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("documents", "0007_document_content_length"),
]
operations = [
migrations.CreateModel(
name="ShareLinkBundle",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
(
"created",
models.DateTimeField(
blank=True,
db_index=True,
default=django.utils.timezone.now,
editable=False,
verbose_name="created",
),
),
(
"expiration",
models.DateTimeField(
blank=True,
db_index=True,
null=True,
verbose_name="expiration",
),
),
(
"slug",
models.SlugField(
blank=True,
editable=False,
unique=True,
verbose_name="slug",
),
),
(
"file_version",
models.CharField(
choices=[("archive", "Archive"), ("original", "Original")],
default="archive",
max_length=50,
),
),
(
"status",
models.CharField(
choices=[
("pending", "Pending"),
("processing", "Processing"),
("ready", "Ready"),
("failed", "Failed"),
],
default="pending",
max_length=50,
),
),
(
"size_bytes",
models.PositiveIntegerField(
blank=True,
null=True,
verbose_name="size (bytes)",
),
),
(
"last_error",
models.JSONField(
blank=True,
null=True,
default=None,
verbose_name="last error",
),
),
(
"file_path",
models.CharField(
blank=True,
max_length=512,
verbose_name="file path",
),
),
(
"built_at",
models.DateTimeField(
blank=True,
null=True,
verbose_name="built at",
),
),
(
"documents",
models.ManyToManyField(
related_name="share_link_bundles",
to="documents.document",
verbose_name="documents",
),
),
(
"owner",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="share_link_bundles",
to=settings.AUTH_USER_MODEL,
verbose_name="owner",
),
),
],
options={
"ordering": ("-created",),
"verbose_name": "share link bundle",
"verbose_name_plural": "share link bundles",
},
),
migrations.RunPython(
grant_share_link_bundle_permissions,
reverse_code=revoke_share_link_bundle_permissions,
),
]
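The data migration's grant step follows one rule: every user or group that directly holds `add_document` also receives the new `sharelinkbundle` permissions. Stripped of the ORM, the propagation looks like this (hypothetical sketch; `holders_of` stands in for the `User`/`Group` permission queries):

```python
def propagate_permissions(holders_of: dict, anchor_perm: str, new_perms: set) -> dict:
    """Pure-Python sketch of grant_share_link_bundle_permissions: any
    principal holding the anchor permission (add_document) gains the new
    permission codenames in place."""
    for principal, perms in holders_of.items():
        if anchor_perm in perms:
            perms.update(new_perms)
    return holders_of

state = {"alice": {"add_document"}, "bob": {"view_document"}}
propagate_permissions(
    state,
    "add_document",
    {"add_sharelinkbundle", "view_sharelinkbundle"},
)
```

The reverse operation in the migration is broader: it strips the bundle permissions from every user and group, not only from those that received them, which is safe because the permissions did not exist before this migration.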

View File

@@ -766,6 +766,114 @@ class ShareLink(SoftDeleteModel):
return f"Share Link for {self.document.title}"
class ShareLinkBundle(models.Model):
class Status(models.TextChoices):
PENDING = ("pending", _("Pending"))
PROCESSING = ("processing", _("Processing"))
READY = ("ready", _("Ready"))
FAILED = ("failed", _("Failed"))
created = models.DateTimeField(
_("created"),
default=timezone.now,
db_index=True,
blank=True,
editable=False,
)
expiration = models.DateTimeField(
_("expiration"),
blank=True,
null=True,
db_index=True,
)
slug = models.SlugField(
_("slug"),
db_index=True,
unique=True,
blank=True,
editable=False,
)
owner = models.ForeignKey(
User,
blank=True,
null=True,
related_name="share_link_bundles",
on_delete=models.SET_NULL,
verbose_name=_("owner"),
)
file_version = models.CharField(
max_length=50,
choices=ShareLink.FileVersion.choices,
default=ShareLink.FileVersion.ARCHIVE,
)
status = models.CharField(
max_length=50,
choices=Status.choices,
default=Status.PENDING,
)
size_bytes = models.PositiveIntegerField(
_("size (bytes)"),
blank=True,
null=True,
)
last_error = models.JSONField(
_("last error"),
blank=True,
null=True,
default=None,
)
file_path = models.CharField(
_("file path"),
max_length=512,
blank=True,
)
built_at = models.DateTimeField(
_("built at"),
null=True,
blank=True,
)
documents = models.ManyToManyField(
"documents.Document",
related_name="share_link_bundles",
verbose_name=_("documents"),
)
class Meta:
ordering = ("-created",)
verbose_name = _("share link bundle")
verbose_name_plural = _("share link bundles")
def __str__(self):
return _("Share link bundle %(slug)s") % {"slug": self.slug}
@property
def absolute_file_path(self) -> Path | None:
if not self.file_path:
return None
return (settings.SHARE_LINK_BUNDLE_DIR / Path(self.file_path)).resolve()
def remove_file(self):
if self.absolute_file_path is not None and self.absolute_file_path.exists():
try:
self.absolute_file_path.unlink()
except OSError:
pass
def delete(self, using=None, *, keep_parents=False):
self.remove_file()
return super().delete(using=using, keep_parents=keep_parents)
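`ShareLinkBundle.delete` removes the on-disk archive before the row: the stored relative `file_path` is resolved against the bundle directory, and a failed unlink is swallowed so deleting the model never fails on a missing or locked file. A minimal stand-alone sketch of that behavior (`remove_bundle_file` is a hypothetical helper; `base_dir` stands in for `settings.SHARE_LINK_BUNDLE_DIR`):

```python
from pathlib import Path


def remove_bundle_file(base_dir: Path, file_path: str) -> None:
    """Sketch of ShareLinkBundle.remove_file: resolve the stored relative
    path against the bundle directory and unlink it, ignoring OSError so
    row deletion is never blocked by filesystem state."""
    if not file_path:
        # Bundles that were never built have an empty file_path.
        return
    absolute = (base_dir / Path(file_path)).resolve()
    if absolute.exists():
        try:
            absolute.unlink()
        except OSError:
            pass
```

Calling it twice is harmless: the second call finds no file and returns without raising.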
class CustomField(models.Model):
"""
Defines the name and type of a custom field

View File

@@ -4,6 +4,7 @@ import logging
import math
import re
from datetime import datetime
from datetime import timedelta
from decimal import Decimal
from typing import TYPE_CHECKING
from typing import Literal
@@ -25,6 +26,7 @@ from django.core.validators import integer_validator
from django.db.models import Count
from django.db.models import Q
from django.db.models.functions import Lower
from django.utils import timezone
from django.utils.crypto import get_random_string
from django.utils.dateparse import parse_datetime
from django.utils.text import slugify
@@ -62,6 +64,7 @@ from documents.models import PaperlessTask
from documents.models import SavedView
from documents.models import SavedViewFilterRule
from documents.models import ShareLink
from documents.models import ShareLinkBundle
from documents.models import StoragePath
from documents.models import Tag
from documents.models import UiSettings
@@ -2228,6 +2231,104 @@ class ShareLinkSerializer(OwnedObjectSerializer):
return super().create(validated_data)
class ShareLinkBundleSerializer(OwnedObjectSerializer):
document_ids = serializers.ListField(
child=serializers.IntegerField(min_value=1),
allow_empty=False,
write_only=True,
)
expiration_days = serializers.IntegerField(
required=False,
allow_null=True,
min_value=1,
write_only=True,
)
documents = serializers.PrimaryKeyRelatedField(
many=True,
read_only=True,
)
document_count = SerializerMethodField()
class Meta:
model = ShareLinkBundle
fields = (
"id",
"created",
"expiration",
"expiration_days",
"slug",
"file_version",
"status",
"size_bytes",
"last_error",
"built_at",
"documents",
"document_ids",
"document_count",
)
read_only_fields = (
"id",
"created",
"expiration",
"slug",
"status",
"size_bytes",
"last_error",
"built_at",
"documents",
"document_count",
)
def validate_document_ids(self, value):
unique_ids = set(value)
if len(unique_ids) != len(value):
raise serializers.ValidationError(
_("Duplicate document identifiers are not allowed."),
)
return value
def create(self, validated_data):
document_ids = validated_data.pop("document_ids")
expiration_days = validated_data.pop("expiration_days", None)
validated_data["slug"] = get_random_string(50)
if expiration_days:
validated_data["expiration"] = timezone.now() + timedelta(
days=expiration_days,
)
else:
validated_data["expiration"] = None
share_link_bundle = super().create(validated_data)
documents = list(
Document.objects.filter(pk__in=document_ids).only(
"pk",
),
)
documents_by_id = {doc.pk: doc for doc in documents}
missing = [
str(doc_id) for doc_id in document_ids if doc_id not in documents_by_id
]
if missing:
raise serializers.ValidationError(
{
"document_ids": _(
"Documents not found: %(ids)s",
)
% {"ids": ", ".join(missing)},
},
)
ordered_documents = [documents_by_id[doc_id] for doc_id in document_ids]
share_link_bundle.documents.set(ordered_documents)
share_link_bundle.document_total = len(ordered_documents)
return share_link_bundle
def get_document_count(self, obj: ShareLinkBundle) -> int:
return getattr(obj, "document_total", None) or obj.documents.count()
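The `create` method above fetches all requested documents in one query, reports any ids with no match, and then re-orders the results to match the client's `document_ids` order (a single `filter(pk__in=...)` query does not preserve request order). That lookup-and-reorder step in isolation (`resolve_documents` is a hypothetical helper; plain strings stand in for `Document` instances):

```python
def resolve_documents(requested_ids: list, found_by_id: dict) -> list:
    """Sketch of the lookup in ShareLinkBundleSerializer.create:
    `found_by_id` is the {pk: Document} mapping built from the queryset.
    Raises on missing ids; otherwise returns documents in request order."""
    missing = [str(i) for i in requested_ids if i not in found_by_id]
    if missing:
        raise ValueError("Documents not found: " + ", ".join(missing))
    return [found_by_id[i] for i in requested_ids]

docs = {1: "doc-a", 2: "doc-b", 3: "doc-c"}
ordered = resolve_documents([3, 1], docs)  # ['doc-c', 'doc-a']
```

Note that in the serializer this validation runs after `super().create(...)`, so a missing id raises only once the bundle row already exists; DRF surfaces it as a 400 either way.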
class BulkEditObjectsSerializer(SerializerWithPerms, SetPermissionsMixin):
objects = serializers.ListField(
required=True,


@@ -3,8 +3,10 @@ import hashlib
import logging
import os
import shutil
import uuid
import zipfile
from pathlib import Path
from tempfile import TemporaryDirectory
from tempfile import mkstemp
import tqdm
from celery import Task
@@ -22,6 +24,8 @@ from whoosh.writing import AsyncWriter
from documents import index
from documents import sanity_checker
from documents.barcodes import BarcodePlugin
from documents.bulk_download import ArchiveOnlyStrategy
from documents.bulk_download import OriginalsOnlyStrategy
from documents.caching import clear_document_caches
from documents.classifier import DocumentClassifier
from documents.classifier import load_classifier
@@ -39,6 +43,8 @@ from documents.models import CustomFieldInstance
from documents.models import Document
from documents.models import DocumentType
from documents.models import PaperlessTask
from documents.models import ShareLink
from documents.models import ShareLinkBundle
from documents.models import StoragePath
from documents.models import Tag
from documents.models import WorkflowRun
@@ -625,3 +631,117 @@ def update_document_in_llm_index(document):
@shared_task
def remove_document_from_llm_index(document):
llm_index_remove_document(document)
@shared_task
def build_share_link_bundle(bundle_id: int):
try:
bundle = (
ShareLinkBundle.objects.filter(pk=bundle_id)
.prefetch_related("documents")
.get()
)
except ShareLinkBundle.DoesNotExist:
logger.warning("Share link bundle %s no longer exists.", bundle_id)
return
bundle.remove_file()
bundle.status = ShareLinkBundle.Status.PROCESSING
bundle.last_error = None
bundle.size_bytes = None
bundle.built_at = None
bundle.file_path = ""
bundle.save(
update_fields=[
"status",
"last_error",
"size_bytes",
"built_at",
"file_path",
],
)
documents = list(bundle.documents.all().order_by("pk"))
temp_fd, temp_zip_path_str = mkstemp(suffix=".zip", dir=settings.SCRATCH_DIR)
os.close(temp_fd)  # mkstemp returns an open descriptor; only the path is needed
temp_zip_path = Path(temp_zip_path_str)
try:
strategy_class = (
ArchiveOnlyStrategy
if bundle.file_version == ShareLink.FileVersion.ARCHIVE
else OriginalsOnlyStrategy
)
with zipfile.ZipFile(temp_zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
strategy = strategy_class(zipf)
for document in documents:
strategy.add_document(document)
output_dir = settings.SHARE_LINK_BUNDLE_DIR
output_dir.mkdir(parents=True, exist_ok=True)
final_path = (output_dir / f"{bundle.slug}.zip").resolve()
if final_path.exists():
final_path.unlink()
shutil.move(temp_zip_path, final_path)
bundle.file_path = f"{bundle.slug}.zip"
bundle.size_bytes = final_path.stat().st_size
bundle.status = ShareLinkBundle.Status.READY
bundle.built_at = timezone.now()
bundle.last_error = None
bundle.save(
update_fields=[
"file_path",
"size_bytes",
"status",
"built_at",
"last_error",
],
)
logger.info("Built share link bundle %s", bundle.pk)
except Exception as exc:
logger.exception(
"Failed to build share link bundle %s: %s",
bundle_id,
exc,
)
bundle.status = ShareLinkBundle.Status.FAILED
bundle.last_error = {
"bundle_id": bundle_id,
"exception_type": exc.__class__.__name__,
"message": str(exc),
"timestamp": timezone.now().isoformat(),
}
bundle.save(update_fields=["status", "last_error"])
try:
temp_zip_path.unlink()
except OSError:
pass
raise
finally:
try:
temp_zip_path.unlink(missing_ok=True)
except OSError:
pass
@shared_task
def cleanup_expired_share_link_bundles():
now = timezone.now()
expired_qs = ShareLinkBundle.objects.filter(
expiration__isnull=False,
expiration__lt=now,
)
count = 0
for bundle in expired_qs.iterator():
count += 1
try:
bundle.delete()
except Exception as exc:
logger.warning(
"Failed to delete expired share link bundle %s: %s",
bundle.pk,
exc,
)
if count:
logger.info("Deleted %s expired share link bundle(s)", count)
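Expired bundles are only removed when this task actually runs; a minimal sketch of wiring it into a Celery beat schedule (the entry name and one-day cadence are illustrative assumptions, not part of this change — only the task path matches the shared task above):

```python
from datetime import timedelta

# Illustrative beat schedule entry; the entry name and cadence are
# assumptions for this sketch, the task path matches the @shared_task above.
CELERY_BEAT_SCHEDULE = {
    "cleanup-expired-share-link-bundles": {
        "task": "documents.tasks.cleanup_expired_share_link_bundles",
        "schedule": timedelta(days=1),  # run once per day
    },
}
```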


@@ -224,17 +224,18 @@ class TestDoubleSided(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
THEN:
- The collated file gets put into foo/bar
"""
# TODO: parameterize this instead
for path in [
Path("foo") / "bar" / "double-sided",
Path("double-sided") / "foo" / "bar",
]:
with self.subTest(path=str(path)):
# Ensure we get fresh directories for each run
self.tearDown()
self.setUp()
self.create_staging_file()
self.consume_file("double-sided-odd.pdf", Path(path) / "foo.pdf")
self.assertIsFile(
self.dirs.consumption_dir / "foo" / "bar" / "foo-collated.pdf",
)


@@ -114,6 +114,30 @@ def mock_supported_extensions(mocker: MockerFixture) -> MagicMock:
)
def wait_for_mock_call(
mock_obj: MagicMock,
timeout_s: float = 5.0,
poll_interval_s: float = 0.1,
) -> bool:
"""
Actively wait for a mock to be called.
Args:
mock_obj: The mock object to check (e.g., mock.delay)
timeout_s: Maximum time to wait in seconds
poll_interval_s: How often to check in seconds
Returns:
True if mock was called within timeout, False otherwise
"""
start_time = monotonic()
while monotonic() - start_time < timeout_s:
if mock_obj.called:
return True
sleep(poll_interval_s)
return False
class TestTrackedFile:
"""Tests for the TrackedFile dataclass."""
@@ -724,7 +748,7 @@ def start_consumer(
thread = ConsumerThread(consumption_dir, scratch_dir, **kwargs)
threads.append(thread)
thread.start()
sleep(2.0) # Give thread time to start
return thread
try:
@@ -767,7 +791,8 @@ class TestCommandWatch:
target = consumption_dir / "document.pdf"
shutil.copy(sample_pdf, target)
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=2.0)
if thread.exception:
raise thread.exception
@@ -788,9 +813,12 @@ class TestCommandWatch:
thread = start_consumer()
sleep(0.5)
target = consumption_dir / "document.pdf"
shutil.move(temp_location, target)
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=2.0)
if thread.exception:
raise thread.exception
@@ -816,7 +844,7 @@ class TestCommandWatch:
f.flush()
sleep(0.05)
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=2.0)
if thread.exception:
raise thread.exception
@@ -837,7 +865,7 @@ class TestCommandWatch:
(consumption_dir / "._document.pdf").write_bytes(b"test")
shutil.copy(sample_pdf, consumption_dir / "valid.pdf")
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=2.0)
if thread.exception:
raise thread.exception
@@ -868,11 +896,10 @@ class TestCommandWatch:
assert not thread.is_alive()
@pytest.mark.django_db
class TestCommandWatchPolling:
"""Tests for polling mode."""
def test_polling_mode_works(
self,
consumption_dir: Path,
@@ -882,7 +909,8 @@ class TestCommandWatchPolling:
) -> None:
"""
Test polling mode detects files.
Uses active waiting with timeout to handle CI delays and polling timing.
"""
# Use shorter polling interval for faster test
thread = start_consumer(polling_interval=0.5, stability_delay=0.1)
@@ -890,9 +918,9 @@ class TestCommandWatchPolling:
target = consumption_dir / "document.pdf"
shutil.copy(sample_pdf, target)
# Actively wait for consumption
# Polling needs: interval (0.5s) + stability (0.1s) + next poll (0.5s) + margin
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=5.0)
if thread.exception:
raise thread.exception
@@ -919,7 +947,8 @@ class TestCommandWatchRecursive:
target = subdir / "document.pdf"
shutil.copy(sample_pdf, target)
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=2.0)
if thread.exception:
raise thread.exception
@@ -948,7 +977,8 @@ class TestCommandWatchRecursive:
target = subdir / "document.pdf"
shutil.copy(sample_pdf, target)
wait_for_mock_call(mock_consume_file_delay.delay, timeout_s=2.0)
if thread.exception:
raise thread.exception


@@ -0,0 +1,51 @@
from documents.tests.utils import TestMigrations
class TestMigrateShareLinkBundlePermissions(TestMigrations):
migrate_from = "0007_document_content_length"
migrate_to = "0008_sharelinkbundle"
def setUpBeforeMigration(self, apps):
User = apps.get_model("auth", "User")
Group = apps.get_model("auth", "Group")
self.Permission = apps.get_model("auth", "Permission")
self.user = User.objects.create(username="user1")
self.group = Group.objects.create(name="group1")
add_document = self.Permission.objects.get(codename="add_document")
self.user.user_permissions.add(add_document.id)
self.group.permissions.add(add_document.id)
def test_share_link_permissions_granted_to_add_document_holders(self):
share_perms = self.Permission.objects.filter(
codename__contains="sharelinkbundle",
)
self.assertTrue(self.user.user_permissions.filter(pk__in=share_perms).exists())
self.assertTrue(self.group.permissions.filter(pk__in=share_perms).exists())
class TestReverseMigrateShareLinkBundlePermissions(TestMigrations):
migrate_from = "0008_sharelinkbundle"
migrate_to = "0007_document_content_length"
def setUpBeforeMigration(self, apps):
User = apps.get_model("auth", "User")
Group = apps.get_model("auth", "Group")
self.Permission = apps.get_model("auth", "Permission")
self.user = User.objects.create(username="user1")
self.group = Group.objects.create(name="group1")
add_document = self.Permission.objects.get(codename="add_document")
share_perms = self.Permission.objects.filter(
codename__contains="sharelinkbundle",
)
self.share_perm_ids = list(share_perms.values_list("id", flat=True))
self.user.user_permissions.add(add_document.id, *self.share_perm_ids)
self.group.permissions.add(add_document.id, *self.share_perm_ids)
def test_share_link_permissions_revoked_on_reverse(self):
self.assertFalse(
self.user.user_permissions.filter(pk__in=self.share_perm_ids).exists(),
)
self.assertFalse(
self.group.permissions.filter(pk__in=self.share_perm_ids).exists(),
)


@@ -0,0 +1,536 @@
from __future__ import annotations
import zipfile
from datetime import timedelta
from pathlib import Path
from unittest import mock
from django.conf import settings
from django.contrib.auth.models import User
from django.utils import timezone
from rest_framework import serializers
from rest_framework import status
from rest_framework.test import APITestCase
from documents.filters import ShareLinkBundleFilterSet
from documents.models import ShareLink
from documents.models import ShareLinkBundle
from documents.serialisers import ShareLinkBundleSerializer
from documents.tasks import build_share_link_bundle
from documents.tasks import cleanup_expired_share_link_bundles
from documents.tests.factories import DocumentFactory
from documents.tests.utils import DirectoriesMixin
class ShareLinkBundleAPITests(DirectoriesMixin, APITestCase):
ENDPOINT = "/api/share_link_bundles/"
def setUp(self):
super().setUp()
self.user = User.objects.create_superuser(username="bundle_admin")
self.client.force_authenticate(self.user)
self.document = DocumentFactory.create()
@mock.patch("documents.views.build_share_link_bundle.delay")
def test_create_bundle_triggers_build_job(self, delay_mock):
payload = {
"document_ids": [self.document.pk],
"file_version": ShareLink.FileVersion.ARCHIVE,
"expiration_days": 7,
}
response = self.client.post(self.ENDPOINT, payload, format="json")
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
bundle = ShareLinkBundle.objects.get(pk=response.data["id"])
self.assertEqual(bundle.documents.count(), 1)
self.assertEqual(bundle.status, ShareLinkBundle.Status.PENDING)
delay_mock.assert_called_once_with(bundle.pk)
def test_create_bundle_rejects_missing_documents(self):
payload = {
"document_ids": [9999],
"file_version": ShareLink.FileVersion.ARCHIVE,
"expiration_days": 7,
}
response = self.client.post(self.ENDPOINT, payload, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertIn("document_ids", response.data)
@mock.patch("documents.views.has_perms_owner_aware", return_value=False)
def test_create_bundle_rejects_insufficient_permissions(self, perms_mock):
payload = {
"document_ids": [self.document.pk],
"file_version": ShareLink.FileVersion.ARCHIVE,
"expiration_days": 7,
}
response = self.client.post(self.ENDPOINT, payload, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertIn("document_ids", response.data)
perms_mock.assert_called()
@mock.patch("documents.views.build_share_link_bundle.delay")
def test_rebuild_bundle_resets_state(self, delay_mock):
bundle = ShareLinkBundle.objects.create(
slug="rebuild-slug",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.FAILED,
)
bundle.documents.set([self.document])
bundle.last_error = {"message": "Something went wrong"}
bundle.size_bytes = 100
bundle.file_path = "path/to/file.zip"
bundle.save()
response = self.client.post(f"{self.ENDPOINT}{bundle.pk}/rebuild/")
self.assertEqual(response.status_code, status.HTTP_200_OK)
bundle.refresh_from_db()
self.assertEqual(bundle.status, ShareLinkBundle.Status.PENDING)
self.assertIsNone(bundle.last_error)
self.assertIsNone(bundle.size_bytes)
self.assertEqual(bundle.file_path, "")
delay_mock.assert_called_once_with(bundle.pk)
def test_rebuild_bundle_rejects_processing_status(self):
bundle = ShareLinkBundle.objects.create(
slug="processing-slug",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.PROCESSING,
)
bundle.documents.set([self.document])
response = self.client.post(f"{self.ENDPOINT}{bundle.pk}/rebuild/")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertIn("detail", response.data)
def test_create_bundle_rejects_duplicate_documents(self):
payload = {
"document_ids": [self.document.pk, self.document.pk],
"file_version": ShareLink.FileVersion.ARCHIVE,
"expiration_days": 7,
}
response = self.client.post(self.ENDPOINT, payload, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertIn("document_ids", response.data)
def test_download_ready_bundle_streams_file(self):
bundle_file = Path(self.dirs.media_dir) / "bundles" / "ready.zip"
bundle_file.parent.mkdir(parents=True, exist_ok=True)
bundle_file.write_bytes(b"binary-zip-content")
bundle = ShareLinkBundle.objects.create(
slug="readyslug",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.READY,
file_path=str(bundle_file),
)
bundle.documents.set([self.document])
self.client.logout()
response = self.client.get(f"/share/{bundle.slug}/")
content = b"".join(response.streaming_content)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response["Content-Type"], "application/zip")
self.assertEqual(content, b"binary-zip-content")
self.assertIn("attachment;", response["Content-Disposition"])
def test_download_pending_bundle_returns_202(self):
bundle = ShareLinkBundle.objects.create(
slug="pendingslug",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.PENDING,
)
bundle.documents.set([self.document])
self.client.logout()
response = self.client.get(f"/share/{bundle.slug}/")
self.assertEqual(response.status_code, status.HTTP_202_ACCEPTED)
def test_download_failed_bundle_returns_503(self):
bundle = ShareLinkBundle.objects.create(
slug="failedslug",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.FAILED,
)
bundle.documents.set([self.document])
self.client.logout()
response = self.client.get(f"/share/{bundle.slug}/")
self.assertEqual(response.status_code, status.HTTP_503_SERVICE_UNAVAILABLE)
def test_expired_share_link_redirects(self):
share_link = ShareLink.objects.create(
slug="expiredlink",
document=self.document,
file_version=ShareLink.FileVersion.ORIGINAL,
expiration=timezone.now() - timedelta(hours=1),
)
self.client.logout()
response = self.client.get(f"/share/{share_link.slug}/")
self.assertEqual(response.status_code, status.HTTP_302_FOUND)
self.assertIn("sharelink_expired=1", response["Location"])
def test_unknown_share_link_redirects(self):
self.client.logout()
response = self.client.get("/share/unknownsharelink/")
self.assertEqual(response.status_code, status.HTTP_302_FOUND)
self.assertIn("sharelink_notfound=1", response["Location"])
class ShareLinkBundleTaskTests(DirectoriesMixin, APITestCase):
def setUp(self):
super().setUp()
self.document = DocumentFactory.create()
def test_cleanup_expired_share_link_bundles(self):
expired_path = Path(self.dirs.media_dir) / "expired.zip"
expired_path.parent.mkdir(parents=True, exist_ok=True)
expired_path.write_bytes(b"expired")
active_path = Path(self.dirs.media_dir) / "active.zip"
active_path.write_bytes(b"active")
expired_bundle = ShareLinkBundle.objects.create(
slug="expired-bundle",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.READY,
expiration=timezone.now() - timedelta(days=1),
file_path=str(expired_path),
)
expired_bundle.documents.set([self.document])
active_bundle = ShareLinkBundle.objects.create(
slug="active-bundle",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.READY,
expiration=timezone.now() + timedelta(days=1),
file_path=str(active_path),
)
active_bundle.documents.set([self.document])
cleanup_expired_share_link_bundles()
self.assertFalse(ShareLinkBundle.objects.filter(pk=expired_bundle.pk).exists())
self.assertTrue(ShareLinkBundle.objects.filter(pk=active_bundle.pk).exists())
self.assertFalse(expired_path.exists())
self.assertTrue(active_path.exists())
def test_cleanup_expired_share_link_bundles_logs_on_failure(self):
expired_bundle = ShareLinkBundle.objects.create(
slug="expired-bundle",
file_version=ShareLink.FileVersion.ARCHIVE,
status=ShareLinkBundle.Status.READY,
expiration=timezone.now() - timedelta(days=1),
)
expired_bundle.documents.set([self.document])
with mock.patch.object(
ShareLinkBundle,
"delete",
side_effect=RuntimeError("fail"),
):
with self.assertLogs("paperless.tasks", level="WARNING") as logs:
cleanup_expired_share_link_bundles()
self.assertTrue(
any(
"Failed to delete expired share link bundle" in msg
for msg in logs.output
),
)
class ShareLinkBundleBuildTaskTests(DirectoriesMixin, APITestCase):
def setUp(self):
super().setUp()
self.document = DocumentFactory.create(
mime_type="application/pdf",
checksum="123",
)
self.document.archive_checksum = ""
self.document.save()
self.addCleanup(
setattr,
settings,
"SHARE_LINK_BUNDLE_DIR",
settings.SHARE_LINK_BUNDLE_DIR,
)
settings.SHARE_LINK_BUNDLE_DIR = (
Path(settings.MEDIA_ROOT) / "documents" / "share_link_bundles"
)
def _write_document_file(self, *, archive: bool, content: bytes) -> Path:
if archive:
self.document.archive_filename = f"{self.document.pk:07}.pdf"
self.document.save()
path = self.document.archive_path
else:
path = self.document.source_path
path.parent.mkdir(parents=True, exist_ok=True)
path.write_bytes(content)
return path
def test_build_share_link_bundle_creates_zip_and_sets_metadata(self):
self._write_document_file(archive=False, content=b"source")
archive_path = self._write_document_file(archive=True, content=b"archive")
bundle = ShareLinkBundle.objects.create(
slug="build-archive",
file_version=ShareLink.FileVersion.ARCHIVE,
)
bundle.documents.set([self.document])
build_share_link_bundle(bundle.pk)
bundle.refresh_from_db()
self.assertEqual(bundle.status, ShareLinkBundle.Status.READY)
self.assertIsNone(bundle.last_error)
self.assertIsNotNone(bundle.built_at)
self.assertGreater(bundle.size_bytes or 0, 0)
final_path = bundle.absolute_file_path
self.assertIsNotNone(final_path)
self.assertTrue(final_path.exists())
with zipfile.ZipFile(final_path) as zipf:
names = zipf.namelist()
self.assertEqual(len(names), 1)
self.assertEqual(zipf.read(names[0]), archive_path.read_bytes())
def test_build_share_link_bundle_overwrites_existing_file(self):
self._write_document_file(archive=False, content=b"source")
bundle = ShareLinkBundle.objects.create(
slug="overwrite",
file_version=ShareLink.FileVersion.ORIGINAL,
)
bundle.documents.set([self.document])
existing = settings.SHARE_LINK_BUNDLE_DIR / "overwrite.zip"
existing.parent.mkdir(parents=True, exist_ok=True)
existing.write_bytes(b"old")
build_share_link_bundle(bundle.pk)
bundle.refresh_from_db()
final_path = bundle.absolute_file_path
self.assertIsNotNone(final_path)
self.assertTrue(final_path.exists())
self.assertNotEqual(final_path.read_bytes(), b"old")
def test_build_share_link_bundle_failure_marks_failed(self):
self._write_document_file(archive=False, content=b"source")
bundle = ShareLinkBundle.objects.create(
slug="fail-bundle",
file_version=ShareLink.FileVersion.ORIGINAL,
)
bundle.documents.set([self.document])
with (
mock.patch(
"documents.tasks.OriginalsOnlyStrategy.add_document",
side_effect=RuntimeError("zip failure"),
),
mock.patch("pathlib.Path.unlink") as unlink_mock,
):
unlink_mock.side_effect = [OSError("unlink"), OSError("unlink-finally")] + [
None,
] * 5
with self.assertRaises(RuntimeError):
build_share_link_bundle(bundle.pk)
bundle.refresh_from_db()
self.assertEqual(bundle.status, ShareLinkBundle.Status.FAILED)
self.assertIsInstance(bundle.last_error, dict)
self.assertEqual(bundle.last_error.get("message"), "zip failure")
self.assertEqual(bundle.last_error.get("exception_type"), "RuntimeError")
scratch_zips = list(Path(settings.SCRATCH_DIR).glob("*.zip"))
self.assertTrue(scratch_zips)
for path in scratch_zips:
path.unlink(missing_ok=True)
def test_build_share_link_bundle_missing_bundle_noop(self):
# Should not raise when bundle does not exist
build_share_link_bundle(99999)
class ShareLinkBundleFilterSetTests(DirectoriesMixin, APITestCase):
def setUp(self):
super().setUp()
self.document = DocumentFactory.create()
self.document.checksum = "doc1checksum"
self.document.save()
self.other_document = DocumentFactory.create()
self.other_document.checksum = "doc2checksum"
self.other_document.save()
self.bundle_one = ShareLinkBundle.objects.create(
slug="bundle-one",
file_version=ShareLink.FileVersion.ORIGINAL,
)
self.bundle_one.documents.set([self.document])
self.bundle_two = ShareLinkBundle.objects.create(
slug="bundle-two",
file_version=ShareLink.FileVersion.ORIGINAL,
)
self.bundle_two.documents.set([self.other_document])
def test_filter_documents_returns_all_for_empty_value(self):
filterset = ShareLinkBundleFilterSet(
data={"documents": ""},
queryset=ShareLinkBundle.objects.all(),
)
self.assertCountEqual(filterset.qs, [self.bundle_one, self.bundle_two])
def test_filter_documents_handles_invalid_input(self):
filterset = ShareLinkBundleFilterSet(
data={"documents": "invalid"},
queryset=ShareLinkBundle.objects.all(),
)
self.assertFalse(filterset.qs.exists())
def test_filter_documents_filters_by_multiple_ids(self):
filterset = ShareLinkBundleFilterSet(
data={"documents": f"{self.document.pk},{self.other_document.pk}"},
queryset=ShareLinkBundle.objects.all(),
)
self.assertCountEqual(filterset.qs, [self.bundle_one, self.bundle_two])
def test_filter_documents_returns_queryset_for_empty_ids(self):
filterset = ShareLinkBundleFilterSet(
data={"documents": ","},
queryset=ShareLinkBundle.objects.all(),
)
self.assertCountEqual(filterset.qs, [self.bundle_one, self.bundle_two])
class ShareLinkBundleModelTests(DirectoriesMixin, APITestCase):
def test_absolute_file_path_handles_relative_and_absolute(self):
relative_path = Path("relative.zip")
bundle = ShareLinkBundle.objects.create(
slug="relative-bundle",
file_version=ShareLink.FileVersion.ORIGINAL,
file_path=str(relative_path),
)
self.assertEqual(
bundle.absolute_file_path,
(settings.SHARE_LINK_BUNDLE_DIR / relative_path).resolve(),
)
absolute_path = Path(self.dirs.media_dir) / "absolute.zip"
bundle.file_path = str(absolute_path)
self.assertEqual(bundle.absolute_file_path.resolve(), absolute_path.resolve())
def test_str_returns_translated_slug(self):
bundle = ShareLinkBundle.objects.create(
slug="string-slug",
file_version=ShareLink.FileVersion.ORIGINAL,
)
self.assertIn("string-slug", str(bundle))
def test_remove_file_deletes_existing_file(self):
bundle_path = settings.SHARE_LINK_BUNDLE_DIR / "remove.zip"
bundle_path.parent.mkdir(parents=True, exist_ok=True)
bundle_path.write_bytes(b"remove-me")
bundle = ShareLinkBundle.objects.create(
slug="remove-bundle",
file_version=ShareLink.FileVersion.ORIGINAL,
file_path=str(bundle_path.relative_to(settings.SHARE_LINK_BUNDLE_DIR)),
)
bundle.remove_file()
self.assertFalse(bundle_path.exists())
def test_remove_file_handles_oserror(self):
bundle_path = settings.SHARE_LINK_BUNDLE_DIR / "remove-error.zip"
bundle_path.parent.mkdir(parents=True, exist_ok=True)
bundle_path.write_bytes(b"remove-me")
bundle = ShareLinkBundle.objects.create(
slug="remove-error",
file_version=ShareLink.FileVersion.ORIGINAL,
file_path=str(bundle_path.relative_to(settings.SHARE_LINK_BUNDLE_DIR)),
)
with mock.patch("pathlib.Path.unlink", side_effect=OSError("fail")):
bundle.remove_file()
self.assertTrue(bundle_path.exists())
def test_delete_calls_remove_file(self):
bundle_path = settings.SHARE_LINK_BUNDLE_DIR / "delete.zip"
bundle_path.parent.mkdir(parents=True, exist_ok=True)
bundle_path.write_bytes(b"remove-me")
bundle = ShareLinkBundle.objects.create(
slug="delete-bundle",
file_version=ShareLink.FileVersion.ORIGINAL,
file_path=str(bundle_path.relative_to(settings.SHARE_LINK_BUNDLE_DIR)),
)
bundle.delete()
self.assertFalse(bundle_path.exists())
class ShareLinkBundleSerializerTests(DirectoriesMixin, APITestCase):
def setUp(self):
super().setUp()
self.document = DocumentFactory.create()
def test_validate_document_ids_rejects_duplicates(self):
serializer = ShareLinkBundleSerializer(
data={
"document_ids": [self.document.pk, self.document.pk],
"file_version": ShareLink.FileVersion.ORIGINAL,
},
)
self.assertFalse(serializer.is_valid())
self.assertIn("document_ids", serializer.errors)
def test_create_assigns_documents_and_expiration(self):
serializer = ShareLinkBundleSerializer(
data={
"document_ids": [self.document.pk],
"file_version": ShareLink.FileVersion.ORIGINAL,
"expiration_days": 3,
},
)
self.assertTrue(serializer.is_valid(), serializer.errors)
bundle = serializer.save()
self.assertEqual(list(bundle.documents.all()), [self.document])
expected_expiration = timezone.now() + timedelta(days=3)
self.assertAlmostEqual(
bundle.expiration,
expected_expiration,
delta=timedelta(seconds=10),
)
def test_create_raises_when_missing_documents(self):
serializer = ShareLinkBundleSerializer(
data={
"document_ids": [self.document.pk, 9999],
"file_version": ShareLink.FileVersion.ORIGINAL,
},
)
self.assertTrue(serializer.is_valid(), serializer.errors)
with self.assertRaises(serializers.ValidationError):
serializer.save(documents=[self.document])


@@ -50,6 +50,7 @@ from django.utils import timezone
from django.utils.decorators import method_decorator
from django.utils.timezone import make_aware
from django.utils.translation import get_language
from django.utils.translation import gettext_lazy as _
from django.views import View
from django.views.decorators.cache import cache_control
from django.views.decorators.csrf import ensure_csrf_cookie
@@ -70,6 +71,7 @@ from packaging import version as packaging_version
from redis import Redis
from rest_framework import parsers
from rest_framework import serializers
from rest_framework import status
from rest_framework.decorators import action
from rest_framework.exceptions import NotFound
from rest_framework.exceptions import ValidationError
@@ -120,6 +122,7 @@ from documents.filters import DocumentTypeFilterSet
from documents.filters import ObjectOwnedOrGrantedPermissionsFilter
from documents.filters import ObjectOwnedPermissionsFilter
from documents.filters import PaperlessTaskFilterSet
from documents.filters import ShareLinkBundleFilterSet
from documents.filters import ShareLinkFilterSet
from documents.filters import StoragePathFilterSet
from documents.filters import TagFilterSet
@@ -137,6 +140,7 @@ from documents.models import Note
from documents.models import PaperlessTask
from documents.models import SavedView
from documents.models import ShareLink
from documents.models import ShareLinkBundle
from documents.models import StoragePath
from documents.models import Tag
from documents.models import UiSettings
@@ -170,6 +174,7 @@ from documents.serialisers import PostDocumentSerializer
from documents.serialisers import RunTaskViewSerializer
from documents.serialisers import SavedViewSerializer
from documents.serialisers import SearchResultSerializer
from documents.serialisers import ShareLinkBundleSerializer
from documents.serialisers import ShareLinkSerializer
from documents.serialisers import StoragePathSerializer
from documents.serialisers import StoragePathTestSerializer
@@ -182,6 +187,7 @@ from documents.serialisers import WorkflowActionSerializer
from documents.serialisers import WorkflowSerializer
from documents.serialisers import WorkflowTriggerSerializer
from documents.signals import document_updated
from documents.tasks import build_share_link_bundle
from documents.tasks import consume_file
from documents.tasks import empty_trash
from documents.tasks import index_optimize
@@ -2435,7 +2441,7 @@ class BulkDownloadView(GenericAPIView):
follow_filename_format = serializer.validated_data.get("follow_formatting")
for document in documents:
if not has_perms_owner_aware(request.user, "change_document", document):
return HttpResponseForbidden("Insufficient permissions")
settings.SCRATCH_DIR.mkdir(parents=True, exist_ok=True)
@@ -2790,21 +2796,187 @@ class ShareLinkViewSet(ModelViewSet, PassUserMixin):
ordering_fields = ("created", "expiration", "document")
class ShareLinkBundleViewSet(ModelViewSet, PassUserMixin):
model = ShareLinkBundle
queryset = ShareLinkBundle.objects.all()
serializer_class = ShareLinkBundleSerializer
pagination_class = StandardPagination
permission_classes = (IsAuthenticated, PaperlessObjectPermissions)
filter_backends = (
DjangoFilterBackend,
OrderingFilter,
ObjectOwnedOrGrantedPermissionsFilter,
)
filterset_class = ShareLinkBundleFilterSet
ordering_fields = ("created", "expiration", "status")
def get_queryset(self):
return (
super()
.get_queryset()
.prefetch_related("documents")
.annotate(document_total=Count("documents", distinct=True))
)
def create(self, request, *args, **kwargs):
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
document_ids = serializer.validated_data["document_ids"]
documents_qs = Document.objects.filter(pk__in=document_ids).select_related(
"owner",
)
found_ids = set(documents_qs.values_list("pk", flat=True))
missing = sorted(set(document_ids) - found_ids)
if missing:
raise ValidationError(
{
"document_ids": _(
"Documents not found: %(ids)s",
)
% {"ids": ", ".join(str(item) for item in missing)},
},
)
documents = list(documents_qs)
for document in documents:
if not has_perms_owner_aware(request.user, "view_document", document):
raise ValidationError(
{
"document_ids": _(
"Insufficient permissions to share document %(id)s.",
)
% {"id": document.pk},
},
)
document_map = {document.pk: document for document in documents}
ordered_documents = [document_map[doc_id] for doc_id in document_ids]
bundle = serializer.save(
owner=request.user,
documents=ordered_documents,
)
bundle.remove_file()
bundle.status = ShareLinkBundle.Status.PENDING
bundle.last_error = None
bundle.size_bytes = None
bundle.built_at = None
bundle.file_path = ""
bundle.save(
update_fields=[
"status",
"last_error",
"size_bytes",
"built_at",
"file_path",
],
)
build_share_link_bundle.delay(bundle.pk)
bundle.document_total = len(ordered_documents)
response_serializer = self.get_serializer(bundle)
headers = self.get_success_headers(response_serializer.data)
return Response(
response_serializer.data,
status=status.HTTP_201_CREATED,
headers=headers,
)
@action(detail=True, methods=["post"])
def rebuild(self, request, pk=None):
bundle = self.get_object()
if bundle.status == ShareLinkBundle.Status.PROCESSING:
return Response(
{"detail": _("Bundle is already being processed.")},
status=status.HTTP_400_BAD_REQUEST,
)
bundle.remove_file()
bundle.status = ShareLinkBundle.Status.PENDING
bundle.last_error = None
bundle.size_bytes = None
bundle.built_at = None
bundle.file_path = ""
bundle.save(
update_fields=[
"status",
"last_error",
"size_bytes",
"built_at",
"file_path",
],
)
build_share_link_bundle.delay(bundle.pk)
bundle.document_total = (
getattr(bundle, "document_total", None) or bundle.documents.count()
)
serializer = self.get_serializer(bundle)
return Response(serializer.data)
class SharedLinkView(View):
authentication_classes = []
permission_classes = []
def get(self, request, slug):
share_link = ShareLink.objects.filter(slug=slug).first()
if share_link is not None:
if (
share_link.expiration is not None
and share_link.expiration < timezone.now()
):
return HttpResponseRedirect("/accounts/login/?sharelink_expired=1")
return serve_file(
doc=share_link.document,
use_archive=share_link.file_version == "archive",
disposition="inline",
)
bundle = ShareLinkBundle.objects.filter(slug=slug).first()
if bundle is None:
return HttpResponseRedirect("/accounts/login/?sharelink_notfound=1")
if bundle.expiration is not None and bundle.expiration < timezone.now():
return HttpResponseRedirect("/accounts/login/?sharelink_expired=1")
if bundle.status in {
ShareLinkBundle.Status.PENDING,
ShareLinkBundle.Status.PROCESSING,
}:
return HttpResponse(
_(
"The share link bundle is still being prepared. Please try again later.",
),
status=status.HTTP_202_ACCEPTED,
)
file_path = bundle.absolute_file_path
if bundle.status == ShareLinkBundle.Status.FAILED or file_path is None:
return HttpResponse(
_(
"The share link bundle is unavailable.",
),
status=status.HTTP_503_SERVICE_UNAVAILABLE,
)
response = FileResponse(file_path.open("rb"), content_type="application/zip")
short_slug = bundle.slug[:12]
download_name = f"paperless-share-{short_slug}.zip"
filename_normalized = (
normalize("NFKD", download_name)
.encode(
"ascii",
"ignore",
)
.decode("ascii")
)
filename_encoded = quote(download_name)
response["Content-Disposition"] = (
f'attachment; filename="{filename_normalized}"; '
f"filename*=utf-8''{filename_encoded}"
)
return response
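The filename handling above follows the RFC 6266 pattern: an ASCII-only fallback in `filename` for legacy clients plus an RFC 5987 `filename*` parameter carrying the UTF-8 name. A self-contained sketch of the same encoding:

```python
from unicodedata import normalize
from urllib.parse import quote


def content_disposition(download_name: str) -> str:
    # ASCII fallback: decompose accents (NFKD) and drop anything non-ASCII.
    ascii_name = (
        normalize("NFKD", download_name).encode("ascii", "ignore").decode("ascii")
    )
    # Extended parameter for clients that understand percent-encoded UTF-8 names.
    return (
        f'attachment; filename="{ascii_name}"; '
        f"filename*=utf-8''{quote(download_name)}"
    )

print(content_disposition("paperless-shäre.zip"))
# attachment; filename="paperless-share.zip"; filename*=utf-8''paperless-sh%C3%A4re.zip
```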
def serve_file(*, doc: Document, use_archive: bool, disposition: str):

File diff suppressed because it is too large Load Diff

View File

@@ -3,7 +3,12 @@ import os
import sys
if __name__ == "__main__":
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless.settings")
try:
from paperless_migration.detect import choose_settings_module
os.environ.setdefault("DJANGO_SETTINGS_MODULE", choose_settings_module())
except Exception:
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless.settings")
from django.core.management import execute_from_command_line

13
src/manage_migration.py Executable file
View File

@@ -0,0 +1,13 @@
#!/usr/bin/env python3
import os
import sys
if __name__ == "__main__":
os.environ.setdefault(
"DJANGO_SETTINGS_MODULE",
"paperless_migration.settings",
)
from django.core.management import execute_from_command_line
execute_from_command_line(sys.argv)

View File

@@ -3,7 +3,7 @@ from urllib.parse import quote
from allauth.account.adapter import DefaultAccountAdapter
from allauth.core import context
from allauth.headless.tokens.sessions import SessionTokenStrategy
from allauth.headless.tokens.strategies.sessions import SessionTokenStrategy
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from django.conf import settings
from django.contrib.auth.models import Group

View File

@@ -1,12 +1,18 @@
import os
try:
from paperless_migration.detect import choose_settings_module
os.environ.setdefault("DJANGO_SETTINGS_MODULE", choose_settings_module())
except Exception:
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless.settings")
from django.core.asgi import get_asgi_application
# Fetch Django ASGI application early to ensure AppRegistry is populated
# before importing consumers and AuthMiddlewareStack that may import ORM
# models.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless.settings")
django_asgi_app = get_asgi_application()
from channels.auth import AuthMiddlewareStack # noqa: E402

View File

@@ -1,82 +0,0 @@
from mcp_server import drf_publish_create_mcp_tool
from mcp_server import drf_publish_destroy_mcp_tool
from mcp_server import drf_publish_list_mcp_tool
from mcp_server import drf_publish_update_mcp_tool
from paperless.views import ApplicationConfigurationViewSet
from paperless.views import GroupViewSet
from paperless.views import UserViewSet
VIEWSET_ACTIONS = {
"create": {"post": "create"},
"list": {"get": "list"},
"update": {"put": "update"},
"destroy": {"delete": "destroy"},
}
BODY_SCHEMA = {"type": "object", "additionalProperties": True}
VIEWSET_INSTRUCTIONS = {
UserViewSet: "Manage Paperless users.",
GroupViewSet: "Manage Paperless groups.",
ApplicationConfigurationViewSet: "Manage application configuration.",
}
drf_publish_create_mcp_tool(
UserViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
UserViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
)
drf_publish_update_mcp_tool(
UserViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
UserViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
)
drf_publish_create_mcp_tool(
GroupViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
GroupViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
)
drf_publish_update_mcp_tool(
GroupViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
GroupViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
)
drf_publish_list_mcp_tool(
ApplicationConfigurationViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[ApplicationConfigurationViewSet],
)
drf_publish_update_mcp_tool(
ApplicationConfigurationViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[ApplicationConfigurationViewSet],
body_schema=BODY_SCHEMA,
)

View File

@@ -0,0 +1,7 @@
import os
from django.core.asgi import get_asgi_application
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless_migration.settings")
application = get_asgi_application()

View File

@@ -241,6 +241,17 @@ def _parse_beat_schedule() -> dict:
"expires": 23.0 * 60.0 * 60.0,
},
},
{
"name": "Cleanup expired share link bundles",
"env_key": "PAPERLESS_SHARE_LINK_BUNDLE_CLEANUP_CRON",
# Default daily at 02:00
"env_default": "0 2 * * *",
"task": "documents.tasks.cleanup_expired_share_link_bundles",
"options": {
# 1 hour before default schedule sends again
"expires": 23.0 * 60.0 * 60.0,
},
},
]
for task in tasks:
# Either get the environment setting or use the default
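The cleanup entry added above uses a standard five-field cron default (`0 2 * * *`, daily at 02:00) that the schedule parser turns into a celery `crontab`. A rough sketch of the field mapping only (the real parsing and validation live in `_parse_beat_schedule`):

```python
def cron_fields(expr: str) -> dict[str, str]:
    """Split a five-field cron expression into named fields."""
    minute, hour, day_of_month, month, day_of_week = expr.split()
    return {
        "minute": minute,
        "hour": hour,
        "day_of_month": day_of_month,
        "month_of_year": month,
        "day_of_week": day_of_week,
    }

print(cron_fields("0 2 * * *"))  # daily at 02:00
```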
@@ -279,6 +290,7 @@ MEDIA_ROOT = __get_path("PAPERLESS_MEDIA_ROOT", BASE_DIR.parent / "media")
ORIGINALS_DIR = MEDIA_ROOT / "documents" / "originals"
ARCHIVE_DIR = MEDIA_ROOT / "documents" / "archive"
THUMBNAIL_DIR = MEDIA_ROOT / "documents" / "thumbnails"
SHARE_LINK_BUNDLE_DIR = MEDIA_ROOT / "documents" / "share_link_bundles"
DATA_DIR = __get_path("PAPERLESS_DATA_DIR", BASE_DIR.parent / "data")
@@ -348,7 +360,6 @@ INSTALLED_APPS = [
"allauth.headless",
"drf_spectacular",
"drf_spectacular_sidecar",
"mcp_server",
"treenode",
*env_apps,
]
@@ -613,17 +624,6 @@ def _parse_remote_user_settings() -> str:
HTTP_REMOTE_USER_HEADER_NAME = _parse_remote_user_settings()
DJANGO_MCP_AUTHENTICATION_CLASSES = REST_FRAMEWORK["DEFAULT_AUTHENTICATION_CLASSES"]
DJANGO_MCP_GLOBAL_SERVER_CONFIG = {
"name": "paperless-ngx",
"instructions": (
"Use the MCP tools to search, query, and manage Paperless-ngx data. "
"Use `search_documents` for full-text search, and `query_data_collections` "
"for structured queries against available collections. "
"Write operations are exposed via DRF-backed tools for create/update/delete."
),
}
# X-Frame options for embedded PDF display:
X_FRAME_OPTIONS = "SAMEORIGIN"

View File

@@ -161,6 +161,7 @@ class TestCeleryScheduleParsing(TestCase):
EMPTY_TRASH_EXPIRE_TIME = 23.0 * 60.0 * 60.0
RUN_SCHEDULED_WORKFLOWS_EXPIRE_TIME = 59.0 * 60.0
LLM_INDEX_EXPIRE_TIME = 23.0 * 60.0 * 60.0
CLEANUP_EXPIRED_SHARE_BUNDLES_EXPIRE_TIME = 23.0 * 60.0 * 60.0
def test_schedule_configuration_default(self):
"""
@@ -212,6 +213,13 @@ class TestCeleryScheduleParsing(TestCase):
"expires": self.LLM_INDEX_EXPIRE_TIME,
},
},
"Cleanup expired share link bundles": {
"task": "documents.tasks.cleanup_expired_share_link_bundles",
"schedule": crontab(minute=0, hour=2),
"options": {
"expires": self.CLEANUP_EXPIRED_SHARE_BUNDLES_EXPIRE_TIME,
},
},
},
schedule,
)
@@ -271,6 +279,13 @@ class TestCeleryScheduleParsing(TestCase):
"expires": self.LLM_INDEX_EXPIRE_TIME,
},
},
"Cleanup expired share link bundles": {
"task": "documents.tasks.cleanup_expired_share_link_bundles",
"schedule": crontab(minute=0, hour=2),
"options": {
"expires": self.CLEANUP_EXPIRED_SHARE_BUNDLES_EXPIRE_TIME,
},
},
},
schedule,
)
@@ -322,6 +337,13 @@ class TestCeleryScheduleParsing(TestCase):
"expires": self.LLM_INDEX_EXPIRE_TIME,
},
},
"Cleanup expired share link bundles": {
"task": "documents.tasks.cleanup_expired_share_link_bundles",
"schedule": crontab(minute=0, hour=2),
"options": {
"expires": self.CLEANUP_EXPIRED_SHARE_BUNDLES_EXPIRE_TIME,
},
},
},
schedule,
)
@@ -345,6 +367,7 @@ class TestCeleryScheduleParsing(TestCase):
"PAPERLESS_EMPTY_TRASH_TASK_CRON": "disable",
"PAPERLESS_WORKFLOW_SCHEDULED_TASK_CRON": "disable",
"PAPERLESS_LLM_INDEX_TASK_CRON": "disable",
"PAPERLESS_SHARE_LINK_BUNDLE_CLEANUP_CRON": "disable",
},
):
schedule = _parse_beat_schedule()

View File

@@ -31,6 +31,7 @@ from documents.views import SavedViewViewSet
from documents.views import SearchAutoCompleteView
from documents.views import SelectionDataView
from documents.views import SharedLinkView
from documents.views import ShareLinkBundleViewSet
from documents.views import ShareLinkViewSet
from documents.views import StatisticsView
from documents.views import StoragePathViewSet
@@ -73,6 +74,7 @@ api_router.register(r"users", UserViewSet, basename="users")
api_router.register(r"groups", GroupViewSet, basename="groups")
api_router.register(r"mail_accounts", MailAccountViewSet)
api_router.register(r"mail_rules", MailRuleViewSet)
api_router.register(r"share_link_bundles", ShareLinkBundleViewSet)
api_router.register(r"share_links", ShareLinkViewSet)
api_router.register(r"workflow_triggers", WorkflowTriggerViewSet)
api_router.register(r"workflow_actions", WorkflowActionViewSet)
@@ -356,7 +358,6 @@ urlpatterns = [
],
),
),
path("", include("mcp_server.urls")),
# Root of the Frontend
re_path(
r".*",

View File

@@ -9,9 +9,14 @@ https://docs.djangoproject.com/en/1.10/howto/deployment/wsgi/
import os
from django.core.wsgi import get_wsgi_application
try:
from paperless_migration.detect import choose_settings_module
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless.settings")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", choose_settings_module())
except Exception:
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless.settings")
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()

View File

@@ -1,11 +1,14 @@
import logging
import shutil
from datetime import timedelta
from pathlib import Path
import faiss
import llama_index.core.settings as llama_settings
import tqdm
from celery import states
from django.conf import settings
from django.utils import timezone
from llama_index.core import Document as LlamaDocument
from llama_index.core import StorageContext
from llama_index.core import VectorStoreIndex
@@ -21,6 +24,7 @@ from llama_index.core.text_splitter import TokenTextSplitter
from llama_index.vector_stores.faiss import FaissVectorStore
from documents.models import Document
from documents.models import PaperlessTask
from paperless_ai.embedding import build_llm_index_text
from paperless_ai.embedding import get_embedding_dim
from paperless_ai.embedding import get_embedding_model
@@ -28,6 +32,29 @@ from paperless_ai.embedding import get_embedding_model
logger = logging.getLogger("paperless_ai.indexing")
def queue_llm_index_update_if_needed(*, rebuild: bool, reason: str) -> bool:
from documents.tasks import llmindex_index
has_running = PaperlessTask.objects.filter(
task_name=PaperlessTask.TaskName.LLMINDEX_UPDATE,
status__in=[states.PENDING, states.STARTED],
).exists()
has_recent = PaperlessTask.objects.filter(
task_name=PaperlessTask.TaskName.LLMINDEX_UPDATE,
date_created__gte=(timezone.now() - timedelta(minutes=5)),
).exists()
if has_running or has_recent:
return False
llmindex_index.delay(rebuild=rebuild, scheduled=False, auto=True)
logger.warning(
"Queued LLM index update%s: %s",
" (rebuild)" if rebuild else "",
reason,
)
return True
def get_or_create_storage_context(*, rebuild=False):
"""
Loads or creates the StorageContext (vector store, docstore, index store).
@@ -93,6 +120,10 @@ def load_or_build_index(nodes=None):
except ValueError as e:
logger.warning("Failed to load index from storage: %s", e)
if not nodes:
queue_llm_index_update_if_needed(
rebuild=vector_store_file_exists(),
reason="LLM index missing or invalid while loading.",
)
logger.info("No nodes provided for index creation.")
raise
return VectorStoreIndex(
@@ -250,6 +281,13 @@ def query_similar_documents(
"""
Runs a similarity query and returns top-k similar Document objects.
"""
if not vector_store_file_exists():
queue_llm_index_update_if_needed(
rebuild=False,
reason="LLM index not found for similarity query.",
)
return []
index = load_or_build_index()
# constrain only the node(s) that match the document IDs, if given

View File

@@ -3,11 +3,13 @@ from unittest.mock import MagicMock
from unittest.mock import patch
import pytest
from celery import states
from django.test import override_settings
from django.utils import timezone
from llama_index.core.base.embeddings.base import BaseEmbedding
from documents.models import Document
from documents.models import PaperlessTask
from paperless_ai import indexing
@@ -288,6 +290,36 @@ def test_update_llm_index_no_documents(
)
@pytest.mark.django_db
def test_queue_llm_index_update_if_needed_enqueues_when_idle_or_skips_recent():
# No existing tasks
with patch("documents.tasks.llmindex_index") as mock_task:
result = indexing.queue_llm_index_update_if_needed(
rebuild=True,
reason="test enqueue",
)
assert result is True
mock_task.delay.assert_called_once_with(rebuild=True, scheduled=False, auto=True)
PaperlessTask.objects.create(
task_id="task-1",
task_name=PaperlessTask.TaskName.LLMINDEX_UPDATE,
status=states.STARTED,
date_created=timezone.now(),
)
# Existing running task
with patch("documents.tasks.llmindex_index") as mock_task:
result = indexing.queue_llm_index_update_if_needed(
rebuild=False,
reason="should skip",
)
assert result is False
mock_task.delay.assert_not_called()
@override_settings(
LLM_EMBEDDING_BACKEND="huggingface",
LLM_BACKEND="ollama",
@@ -299,11 +331,15 @@ def test_query_similar_documents(
with (
patch("paperless_ai.indexing.get_or_create_storage_context") as mock_storage,
patch("paperless_ai.indexing.load_or_build_index") as mock_load_or_build_index,
patch(
"paperless_ai.indexing.vector_store_file_exists",
) as mock_vector_store_exists,
patch("paperless_ai.indexing.VectorIndexRetriever") as mock_retriever_cls,
patch("paperless_ai.indexing.Document.objects.filter") as mock_filter,
):
mock_storage.return_value = MagicMock()
mock_storage.return_value.persist_dir = temp_llm_index_dir
mock_vector_store_exists.return_value = True
mock_index = MagicMock()
mock_load_or_build_index.return_value = mock_index
@@ -332,3 +368,31 @@ def test_query_similar_documents(
mock_filter.assert_called_once_with(pk__in=[1, 2])
assert result == mock_filtered_docs
@pytest.mark.django_db
def test_query_similar_documents_triggers_update_when_index_missing(
temp_llm_index_dir,
real_document,
):
with (
patch(
"paperless_ai.indexing.vector_store_file_exists",
return_value=False,
),
patch(
"paperless_ai.indexing.queue_llm_index_update_if_needed",
) as mock_queue,
patch("paperless_ai.indexing.load_or_build_index") as mock_load,
):
result = indexing.query_similar_documents(
real_document,
top_k=2,
)
mock_queue.assert_called_once_with(
rebuild=False,
reason="LLM index not found for similarity query.",
)
mock_load.assert_not_called()
assert result == []

View File

@@ -1,129 +0,0 @@
from mcp_server import ModelQueryToolset
from mcp_server import drf_publish_create_mcp_tool
from mcp_server import drf_publish_destroy_mcp_tool
from mcp_server import drf_publish_list_mcp_tool
from mcp_server import drf_publish_update_mcp_tool
from documents.permissions import get_objects_for_user_owner_aware
from paperless_mail.models import MailAccount
from paperless_mail.models import MailRule
from paperless_mail.models import ProcessedMail
from paperless_mail.views import MailAccountViewSet
from paperless_mail.views import MailRuleViewSet
from paperless_mail.views import ProcessedMailViewSet
VIEWSET_ACTIONS = {
"create": {"post": "create"},
"list": {"get": "list"},
"update": {"put": "update"},
"destroy": {"delete": "destroy"},
}
BODY_SCHEMA = {"type": "object", "additionalProperties": True}
VIEWSET_INSTRUCTIONS = {
MailAccountViewSet: "Manage mail accounts.",
MailRuleViewSet: "Manage mail rules.",
ProcessedMailViewSet: "List processed mail.",
}
class MailAccountQueryToolset(ModelQueryToolset):
model = MailAccount
def get_queryset(self):
user = getattr(self.request, "user", None)
if not user or not user.is_authenticated:
return MailAccount.objects.none()
if user.is_superuser:
return MailAccount.objects.all()
return get_objects_for_user_owner_aware(
user,
"paperless_mail.view_mailaccount",
MailAccount,
)
class MailRuleQueryToolset(ModelQueryToolset):
model = MailRule
def get_queryset(self):
user = getattr(self.request, "user", None)
if not user or not user.is_authenticated:
return MailRule.objects.none()
if user.is_superuser:
return MailRule.objects.all()
return get_objects_for_user_owner_aware(
user,
"paperless_mail.view_mailrule",
MailRule,
)
class ProcessedMailQueryToolset(ModelQueryToolset):
model = ProcessedMail
def get_queryset(self):
user = getattr(self.request, "user", None)
if not user or not user.is_authenticated:
return ProcessedMail.objects.none()
if user.is_superuser:
return ProcessedMail.objects.all()
return get_objects_for_user_owner_aware(
user,
"paperless_mail.view_processedmail",
ProcessedMail,
)
drf_publish_create_mcp_tool(
MailAccountViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
MailAccountViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
)
drf_publish_update_mcp_tool(
MailAccountViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
MailAccountViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
)
drf_publish_create_mcp_tool(
MailRuleViewSet,
actions=VIEWSET_ACTIONS["create"],
instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
MailRuleViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
)
drf_publish_update_mcp_tool(
MailRuleViewSet,
actions=VIEWSET_ACTIONS["update"],
instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
MailRuleViewSet,
actions=VIEWSET_ACTIONS["destroy"],
instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
)
drf_publish_list_mcp_tool(
ProcessedMailViewSet,
actions=VIEWSET_ACTIONS["list"],
instructions=VIEWSET_INSTRUCTIONS[ProcessedMailViewSet],
)

View File

@@ -1,4 +1,3 @@
import os
from collections.abc import Generator
from pathlib import Path
@@ -70,20 +69,31 @@ def mail_parser() -> MailDocumentParser:
@pytest.fixture()
def live_mail_account() -> Generator[MailAccount, None, None]:
try:
account = MailAccount.objects.create(
name="test",
imap_server=os.environ["PAPERLESS_MAIL_TEST_HOST"],
username=os.environ["PAPERLESS_MAIL_TEST_USER"],
password=os.environ["PAPERLESS_MAIL_TEST_PASSWD"],
imap_port=993,
)
yield account
finally:
account.delete()
def greenmail_mail_account(db: None) -> Generator[MailAccount, None, None]:
"""
Create a mail account configured for local Greenmail server.
"""
account = MailAccount.objects.create(
name="Greenmail Test",
imap_server="localhost",
imap_port=3143,
imap_security=MailAccount.ImapSecurity.NONE,
username="test@localhost",
password="test",
character_set="UTF-8",
)
yield account
account.delete()
@pytest.fixture()
def mail_account_handler() -> MailAccountHandler:
return MailAccountHandler()
@pytest.fixture(scope="session")
def nginx_base_url() -> Generator[str, None, None]:
"""
The base URL for the nginx HTTP server we expect to be alive
"""
yield "http://localhost:8080"

View File

@@ -55,7 +55,7 @@ Content-Transfer-Encoding: 7bit
<p>Some Text</p>
<p>
<img src="cid:part1.pNdUSz0s.D3NqVtPg@example.de" alt="Has to be rewritten to work..">
<img src="https://docs.paperless-ngx.com/assets/logo_full_white.svg" alt="This image should not be shown.">
<img src="http://localhost:8080/assets/logo_full_white.svg" alt="This image should not be shown.">
</p>
<p>and an embedded image.<br>

View File

@@ -6,7 +6,7 @@
<p>Some Text</p>
<p>
<img src="cid:part1.pNdUSz0s.D3NqVtPg@example.de" alt="Has to be rewritten to work..">
<img src="https://docs.paperless-ngx.com/assets/logo_full_white.svg" alt="This image should not be shown.">
<img src="http://localhost:8080/assets/logo_full_white.svg" alt="This image should not be shown.">
</p>
<p>and an embedded image.<br>

View File

@@ -1,6 +1,3 @@
import os
import warnings
import pytest
from paperless_mail.mail import MailAccountHandler
@@ -9,53 +6,53 @@ from paperless_mail.models import MailAccount
from paperless_mail.models import MailRule
# Only run if the environment is setup
# And the environment is not empty (forks, I think)
@pytest.mark.skipif(
"PAPERLESS_MAIL_TEST_HOST" not in os.environ
or not len(os.environ["PAPERLESS_MAIL_TEST_HOST"]),
reason="Live server testing not enabled",
)
@pytest.mark.django_db()
class TestMailLiveServer:
def test_process_non_gmail_server_flag(
@pytest.mark.live
@pytest.mark.greenmail
@pytest.mark.django_db
class TestMailGreenmail:
"""
Mail tests using local Greenmail server
"""
def test_process_flag(
self,
mail_account_handler: MailAccountHandler,
live_mail_account: MailAccount,
):
greenmail_mail_account: MailAccount,
) -> None:
"""
Test processing mail with FLAG action.
"""
rule = MailRule.objects.create(
name="testrule",
account=greenmail_mail_account,
action=MailRule.MailAction.FLAG,
)
try:
rule1 = MailRule.objects.create(
name="testrule",
account=live_mail_account,
action=MailRule.MailAction.FLAG,
)
mail_account_handler.handle_mail_account(live_mail_account)
rule1.delete()
mail_account_handler.handle_mail_account(greenmail_mail_account)
except MailError as e:
pytest.fail(f"Failure: {e}")
except Exception as e:
warnings.warn(f"Unhandled exception: {e}")
finally:
rule.delete()
def test_process_non_gmail_server_tag(
def test_process_tag(
self,
mail_account_handler: MailAccountHandler,
live_mail_account: MailAccount,
):
greenmail_mail_account: MailAccount,
) -> None:
"""
Test processing mail with TAG action.
"""
rule = MailRule.objects.create(
name="testrule",
account=greenmail_mail_account,
action=MailRule.MailAction.TAG,
action_parameter="TestTag",
)
try:
rule2 = MailRule.objects.create(
name="testrule",
account=live_mail_account,
action=MailRule.MailAction.TAG,
)
mail_account_handler.handle_mail_account(live_mail_account)
rule2.delete()
mail_account_handler.handle_mail_account(greenmail_mail_account)
except MailError as e:
pytest.fail(f"Failure: {e}")
except Exception as e:
warnings.warn(f"Unhandled exception: {e}")
finally:
rule.delete()

View File

@@ -17,7 +17,7 @@ from paperless_mail.parsers import MailDocumentParser
def extract_text(pdf_path: Path) -> str:
"""
Using pdftotext from poppler, extracts the text of a PDF into a file,
then reads the file contents and returns it
then reads the file contents and returns it.
"""
with tempfile.NamedTemporaryFile(
mode="w+",
@@ -38,71 +38,107 @@ def extract_text(pdf_path: Path) -> str:
class MailAttachmentMock:
def __init__(self, payload, content_id):
def __init__(self, payload: bytes, content_id: str) -> None:
self.payload = payload
self.content_id = content_id
self.content_type = "image/png"
@pytest.mark.live
@pytest.mark.nginx
@pytest.mark.skipif(
"PAPERLESS_CI_TEST" not in os.environ,
reason="No Gotenberg/Tika servers to test with",
)
class TestUrlCanary:
class TestNginxService:
"""
Verify certain URLs are still available so testing is valid still
Verify the local nginx server is responding correctly.
These tests validate that the test infrastructure is working properly
before running the actual parser tests that depend on HTTP resources.
"""
def test_online_image_exception_on_not_available(self):
def test_non_existent_resource_returns_404(
self,
nginx_base_url: str,
) -> None:
"""
GIVEN:
- Fresh start
- Local nginx server is running
WHEN:
- nonexistent image is requested
- A non-existent resource is requested
THEN:
- An exception shall be thrown
"""
"""
A public image is used in the html sample file. We have no control
whether this image stays online forever, so here we check if we can detect if is not
available anymore.
- An HTTP 404 status code shall be returned
"""
resp = httpx.get(
"https://docs.paperless-ngx.com/assets/non-existent.png",
f"{nginx_base_url}/assets/non-existent.png",
timeout=5.0,
)
with pytest.raises(httpx.HTTPStatusError) as exec_info:
resp.raise_for_status()
assert exec_info.value.response.status_code == httpx.codes.NOT_FOUND
def test_is_online_image_still_available(self):
def test_valid_resource_is_available(
self,
nginx_base_url: str,
) -> None:
"""
GIVEN:
- Fresh start
- Local nginx server is running
WHEN:
- A public image used in the html sample file is requested
- A valid test fixture resource is requested
THEN:
- No exception shall be thrown
- The resource shall be returned with HTTP 200 status code
- The response shall contain the expected content type
"""
"""
A public image is used in the html sample file. We have no control
whether this image stays online forever, so here we check if it is still there
"""
# Now check the URL used in samples/sample.html
resp = httpx.get(
"https://docs.paperless-ngx.com/assets/logo_full_white.svg",
f"{nginx_base_url}/assets/logo_full_white.svg",
timeout=5.0,
)
resp.raise_for_status()
assert resp.status_code == httpx.codes.OK
assert "svg" in resp.headers.get("content-type", "").lower()
def test_server_connectivity(
self,
nginx_base_url: str,
) -> None:
"""
GIVEN:
- Local test fixtures server should be running
WHEN:
- A request is made to the server root
THEN:
- The server shall respond without connection errors
"""
try:
resp = httpx.get(
nginx_base_url,
timeout=5.0,
follow_redirects=True,
)
# We don't care about the status code, just that we can connect
assert resp.status_code in {200, 404, 403}
except httpx.ConnectError as e:
pytest.fail(
f"Cannot connect to nginx server at {nginx_base_url}. "
f"Ensure the nginx container is running via docker-compose.ci-test.yml. "
f"Error: {e}",
)
@pytest.mark.live
@pytest.mark.gotenberg
@pytest.mark.tika
@pytest.mark.nginx
@pytest.mark.skipif(
"PAPERLESS_CI_TEST" not in os.environ,
reason="No Gotenberg/Tika servers to test with",
)
class TestParserLive:
@staticmethod
def imagehash(file, hash_size=18):
def imagehash(file: Path, hash_size: int = 18) -> str:
return f"{average_hash(Image.open(file), hash_size)}"
def test_get_thumbnail(
@@ -112,14 +148,15 @@ class TestParserLive:
simple_txt_email_file: Path,
simple_txt_email_pdf_file: Path,
simple_txt_email_thumbnail_file: Path,
):
) -> None:
"""
GIVEN:
- Fresh start
- A simple text email file
- Mocked PDF generation returning a known PDF
WHEN:
- The Thumbnail is requested
- The thumbnail is requested
THEN:
- The returned thumbnail image file is as expected
- The returned thumbnail image file shall match the expected hash
"""
mock_generate_pdf = mocker.patch(
"paperless_mail.parsers.MailDocumentParser.generate_pdf",
@@ -134,22 +171,28 @@ class TestParserLive:
assert self.imagehash(thumb) == self.imagehash(
simple_txt_email_thumbnail_file,
), (
f"Created Thumbnail {thumb} differs from expected file {simple_txt_email_thumbnail_file}"
f"Created thumbnail {thumb} differs from expected file "
f"{simple_txt_email_thumbnail_file}"
)
def test_tika_parse_successful(self, mail_parser: MailDocumentParser):
def test_tika_parse_successful(self, mail_parser: MailDocumentParser) -> None:
"""
GIVEN:
- Fresh start
- HTML content to parse
- Tika server is running
WHEN:
- tika parsing is called
- Tika parsing is called
THEN:
- a web request to tika shall be done and the reply es returned
- A web request to Tika shall be made
- The parsed text content shall be returned
"""
html = '<html><head><meta http-equiv="content-type" content="text/html; charset=UTF-8"></head><body><p>Some Text</p></body></html>'
html = (
'<html><head><meta http-equiv="content-type" '
'content="text/html; charset=UTF-8"></head>'
"<body><p>Some Text</p></body></html>"
)
expected_text = "Some Text"
# Check successful parsing
parsed = mail_parser.tika_parse(html)
assert expected_text == parsed.strip()
@@ -160,14 +203,17 @@ class TestParserLive:
html_email_file: Path,
merged_pdf_first: Path,
merged_pdf_second: Path,
):
) -> None:
"""
GIVEN:
- Intermediary pdfs to be merged
- Intermediary PDFs to be merged
- An HTML email file
WHEN:
- pdf generation is requested with html file requiring merging of pdfs
- PDF generation is requested with HTML file requiring merging
THEN:
- gotenberg is called to merge files and the resulting file is returned
- Gotenberg shall be called to merge files
- The resulting merged PDF shall be returned
- The merged PDF shall contain text from both source PDFs
"""
mock_generate_pdf_from_html = mocker.patch(
"paperless_mail.parsers.MailDocumentParser.generate_pdf_from_html",
@@ -200,16 +246,17 @@ class TestParserLive:
html_email_file: Path,
html_email_pdf_file: Path,
html_email_thumbnail_file: Path,
):
) -> None:
"""
GIVEN:
- Fresh start
- An HTML email file
WHEN:
- pdf generation from simple eml file is requested
- PDF generation from the email file is requested
THEN:
- Gotenberg is called and the resulting file is returned and look as expected.
- Gotenberg shall be called to generate the PDF
- The archive PDF shall contain the expected content
- The generated thumbnail shall match the expected image hash
"""
util_call_with_backoff(mail_parser.parse, [html_email_file, "message/rfc822"])
# Check the archive PDF
@@ -217,7 +264,7 @@ class TestParserLive:
archive_text = extract_text(archive_path)
expected_archive_text = extract_text(html_email_pdf_file)
# Archive includes the HTML content, so use in
# Archive includes the HTML content
assert expected_archive_text in archive_text
# Check the thumbnail
@@ -227,9 +274,12 @@ class TestParserLive:
)
generated_thumbnail_hash = self.imagehash(generated_thumbnail)
# The created pdf is not reproducible. But the converted image should always look the same.
# The created PDF is not reproducible, but the converted image
# should always look the same
expected_hash = self.imagehash(html_email_thumbnail_file)
assert generated_thumbnail_hash == expected_hash, (
f"PDF looks different. Check if {generated_thumbnail} looks weird."
f"PDF thumbnail differs from expected. "
f"Generated: {generated_thumbnail}, "
f"Hash: {generated_thumbnail_hash} vs {expected_hash}"
)

View File

@@ -0,0 +1,6 @@
from django.apps import AppConfig
class PaperlessMigrationConfig(AppConfig):
default_auto_field = "django.db.models.BigAutoField"
name = "paperless_migration"

View File

@@ -0,0 +1,28 @@
"""ASGI application for migration mode with WebSocket support."""
from __future__ import annotations
import os
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter
from channels.routing import URLRouter
from channels.security.websocket import AllowedHostsOriginValidator
from django.core.asgi import get_asgi_application
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless_migration.settings")
# Initialize Django ASGI application early to ensure settings are loaded
django_asgi_app = get_asgi_application()
# Import routing after Django is initialized
from paperless_migration.routing import websocket_urlpatterns # noqa: E402
application = ProtocolTypeRouter(
{
"http": django_asgi_app,
"websocket": AllowedHostsOriginValidator(
AuthMiddlewareStack(URLRouter(websocket_urlpatterns)),
),
},
)

View File

@@ -0,0 +1,245 @@
"""WebSocket consumers for migration operations."""
from __future__ import annotations
import json
import logging
import os
import shutil
import tempfile
from pathlib import Path
from typing import Any
from channels.generic.websocket import AsyncWebsocketConsumer
from django.conf import settings
from paperless_migration.services.importer import ImportService
from paperless_migration.services.transform import TransformService
logger = logging.getLogger(__name__)
class MigrationConsumerBase(AsyncWebsocketConsumer):
"""Base consumer with common authentication and messaging logic."""
async def connect(self) -> None:
"""Authenticate and accept or reject the connection."""
user = self.scope.get("user")
session = self.scope.get("session", {})
if not user or not user.is_authenticated:
logger.warning("WebSocket connection rejected: not authenticated")
await self.close(code=4001)
return
if not user.is_superuser:
logger.warning("WebSocket connection rejected: not superuser")
await self.close(code=4003)
return
if not session.get("migration_code_ok"):
logger.warning("WebSocket connection rejected: migration code not verified")
await self.close(code=4002)
return
await self.accept()
logger.info("WebSocket connection accepted for user: %s", user.username)
async def disconnect(self, close_code: int) -> None:
"""Handle disconnection."""
logger.debug("WebSocket disconnected with code: %d", close_code)
async def receive(self, text_data: str | None = None, **kwargs: Any) -> None:
"""Handle incoming messages - triggers the operation."""
if text_data is None:
return
try:
data = json.loads(text_data)
except json.JSONDecodeError:
await self.send_error("Invalid JSON message")
return
action = data.get("action")
if action == "start":
await self.run_operation()
else:
await self.send_error(f"Unknown action: {action}")
async def run_operation(self) -> None:
"""Override in subclasses to run the specific operation."""
raise NotImplementedError
async def send_message(self, msg_type: str, **kwargs: Any) -> None:
"""Send a typed JSON message to the client."""
await self.send(text_data=json.dumps({"type": msg_type, **kwargs}))
async def send_log(self, message: str, level: str = "info") -> None:
"""Send a log message."""
await self.send_message("log", message=message, level=level)
async def send_progress(
self,
current: int,
total: int | None = None,
label: str = "",
) -> None:
"""Send a progress update."""
await self.send_message(
"progress",
current=current,
total=total,
label=label,
)
async def send_stats(self, stats: dict[str, Any]) -> None:
"""Send statistics update."""
await self.send_message("stats", **stats)
async def send_complete(
self,
duration: float,
*,
success: bool,
**kwargs: Any,
) -> None:
"""Send completion message."""
await self.send_message(
"complete",
success=success,
duration=duration,
**kwargs,
)
async def send_error(self, message: str) -> None:
"""Send an error message."""
await self.send_message("error", message=message)
class TransformConsumer(MigrationConsumerBase):
"""WebSocket consumer for transform operations."""
async def run_operation(self) -> None:
"""Run the transform operation."""
input_path = Path(settings.MIGRATION_EXPORT_PATH)
output_path = Path(settings.MIGRATION_TRANSFORMED_PATH)
frequency = settings.MIGRATION_PROGRESS_FREQUENCY
if not input_path.exists():
await self.send_error(f"Export file not found: {input_path}")
return
if output_path.exists():
await self.send_error(
f"Output file already exists: {output_path}. "
"Delete it first to re-run transform.",
)
return
await self.send_log("Starting transform operation...")
service = TransformService(
input_path=input_path,
output_path=output_path,
update_frequency=frequency,
)
try:
async for update in service.run_async():
match update["type"]:
case "progress":
await self.send_progress(
current=update["completed"],
label=f"{update['completed']:,} rows processed",
)
if update.get("stats"):
await self.send_stats({"transformed": update["stats"]})
case "complete":
await self.send_complete(
success=True,
duration=update["duration"],
total_processed=update["total_processed"],
stats=update["stats"],
speed=update["speed"],
)
case "error":
await self.send_error(update["message"])
case "log":
await self.send_log(
update["message"],
update.get("level", "info"),
)
except Exception as exc:
logger.exception("Transform operation failed")
await self.send_error(f"Transform failed: {exc}")
class ImportConsumer(MigrationConsumerBase):
"""WebSocket consumer for import operations."""
async def run_operation(self) -> None:
"""Run the import operation (wipe, migrate, import)."""
export_path = Path(settings.MIGRATION_EXPORT_PATH)
transformed_path = Path(settings.MIGRATION_TRANSFORMED_PATH)
imported_marker = Path(settings.MIGRATION_IMPORTED_PATH)
source_dir = export_path.parent
if not export_path.exists():
await self.send_error("Export file not found. Upload or re-check export.")
return
if not transformed_path.exists():
await self.send_error("Transformed file not found. Run transform first.")
return
await self.send_log("Preparing import operation...")
# Backup original manifest and swap in transformed version
backup_path: Path | None = None
try:
backup_fd, backup_name = tempfile.mkstemp(
prefix="manifest.v2.",
suffix=".json",
dir=source_dir,
)
os.close(backup_fd)
backup_path = Path(backup_name)
shutil.copy2(export_path, backup_path)
shutil.copy2(transformed_path, export_path)
await self.send_log("Manifest files prepared")
except Exception as exc:
await self.send_error(f"Failed to prepare import manifest: {exc}")
return
service = ImportService(
source_dir=source_dir,
imported_marker=imported_marker,
)
try:
async for update in service.run_async():
match update["type"]:
case "phase":
await self.send_log(f"Phase: {update['phase']}", level="info")
case "log":
await self.send_log(
update["message"],
update.get("level", "info"),
)
case "complete":
await self.send_complete(
success=update["success"],
duration=update["duration"],
)
case "error":
await self.send_error(update["message"])
except Exception as exc:
logger.exception("Import operation failed")
await self.send_error(f"Import failed: {exc}")
finally:
# Restore original manifest
if backup_path and backup_path.exists():
try:
shutil.move(str(backup_path), str(export_path))
except Exception as exc:
logger.warning("Failed to restore backup manifest: %s", exc)
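The consumers above speak a small discriminated-union JSON protocol: every frame is a flat object with a `"type"` key plus payload fields, exactly as `send_message` builds it. A minimal sketch of how a browser-side or test client might round-trip these frames (the helper names here are illustrative, not part of the module):

```python
import json
from typing import Any


def encode_message(msg_type: str, **kwargs: Any) -> str:
    # Mirrors MigrationConsumerBase.send_message: a flat JSON object
    # with a "type" discriminator plus arbitrary payload keys.
    return json.dumps({"type": msg_type, **kwargs})


def decode_message(text_data: str) -> tuple[str, dict[str, Any]]:
    # Split a frame back into its discriminator and payload.
    data = json.loads(text_data)
    return data.pop("type"), data


frame = encode_message("progress", current=250, total=None, label="250 rows")
kind, payload = decode_message(frame)
```

A client dispatches on `kind` ("log", "progress", "stats", "complete", "error") and treats the remaining keys as that message type's payload.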

View File

@@ -0,0 +1,150 @@
"""Lightweight detection to decide if we should boot migration mode."""
from __future__ import annotations
import logging
import os
import sqlite3
from pathlib import Path
from typing import Any
logger = logging.getLogger(__name__)
BASE_DIR = Path(__file__).resolve().parent.parent
_DOC_EXISTS_QUERY = "SELECT 1 FROM documents_document LIMIT 1;"
def _get_db_config() -> dict[str, Any]:
data_dir = Path(os.getenv("PAPERLESS_DATA_DIR", BASE_DIR.parent / "data")).resolve()
if not os.getenv("PAPERLESS_DBHOST"):
return {
"ENGINE": "sqlite",
"NAME": data_dir / "db.sqlite3",
}
engine = "mariadb" if os.getenv("PAPERLESS_DBENGINE") == "mariadb" else "postgres"
cfg = {
"ENGINE": engine,
"HOST": os.getenv("PAPERLESS_DBHOST"),
"PORT": os.getenv("PAPERLESS_DBPORT"),
"NAME": os.getenv("PAPERLESS_DBNAME", "paperless"),
"USER": os.getenv("PAPERLESS_DBUSER", "paperless"),
"PASSWORD": os.getenv("PAPERLESS_DBPASS", "paperless"),
}
return cfg
def _probe_sqlite(path: Path) -> bool:
if not path.exists():
return False
    conn = None
    try:
        conn = sqlite3.connect(path, timeout=1)
        cur = conn.cursor()
        cur.execute(_DOC_EXISTS_QUERY)
        cur.fetchone()
        return True
    except sqlite3.Error:
        return False
    finally:
        # conn stays None if connect() itself raised
        if conn is not None:
            conn.close()
def _probe_postgres(cfg: dict[str, Any]) -> bool:
try:
import psycopg
except ImportError: # pragma: no cover
logger.debug("psycopg not installed; skipping postgres probe")
return False
    conn = None
    try:
        conn = psycopg.connect(
            host=cfg["HOST"],
            port=cfg["PORT"],
            dbname=cfg["NAME"],
            user=cfg["USER"],
            password=cfg["PASSWORD"],
            connect_timeout=2,
        )
        with conn, conn.cursor() as cur:
            cur.execute(_DOC_EXISTS_QUERY)
            cur.fetchone()
        return True
    except Exception:
        return False
    finally:
        # conn stays None if connect() itself raised
        if conn is not None:
            conn.close()
def _probe_mariadb(cfg: dict[str, Any]) -> bool:
try:
import MySQLdb # type: ignore
except ImportError: # pragma: no cover
logger.debug("mysqlclient not installed; skipping mariadb probe")
return False
    try:
        conn = None
        conn = MySQLdb.connect(
            host=cfg["HOST"],
            port=int(cfg["PORT"] or 3306),
            user=cfg["USER"],
            passwd=cfg["PASSWORD"],
            db=cfg["NAME"],
            connect_timeout=2,
        )
        cur = conn.cursor()
        cur.execute(_DOC_EXISTS_QUERY)
        cur.fetchone()
        return True
    except Exception:
        return False
    finally:
        # conn stays None if connect() itself raised
        if conn is not None:
            conn.close()
def is_v2_database() -> bool:
cfg = _get_db_config()
if cfg["ENGINE"] == "sqlite":
return _probe_sqlite(cfg["NAME"])
if cfg["ENGINE"] == "postgres":
return _probe_postgres(cfg)
if cfg["ENGINE"] == "mariadb":
return _probe_mariadb(cfg)
return False
def choose_settings_module() -> str:
# ENV override
toggle = os.getenv("PAPERLESS_MIGRATION_MODE")
if toggle is not None:
chosen = (
"paperless_migration.settings"
if str(toggle).lower() in ("1", "true", "yes", "on")
else "paperless.settings"
)
os.environ["PAPERLESS_MIGRATION_MODE"] = "1" if "migration" in chosen else "0"
return chosen
# Auto-detect via DB probe
if is_v2_database():
logger.warning("Detected v2 schema; booting migration mode.")
os.environ["PAPERLESS_MIGRATION_MODE"] = "1"
return "paperless_migration.settings"
os.environ["PAPERLESS_MIGRATION_MODE"] = "0"
return "paperless.settings"
if __name__ == "__main__": # pragma: no cover
logger.info(
"v2 database detected" if is_v2_database() else "v2 database not detected",
)

View File

@@ -0,0 +1,13 @@
"""WebSocket URL routing for migration operations."""
from __future__ import annotations
from django.urls import path
from paperless_migration.consumers import ImportConsumer
from paperless_migration.consumers import TransformConsumer
websocket_urlpatterns = [
path("ws/migration/transform/", TransformConsumer.as_asgi()),
path("ws/migration/import/", ImportConsumer.as_asgi()),
]

View File

@@ -0,0 +1,186 @@
"""Import service for loading transformed data into v3 database."""
from __future__ import annotations
import subprocess
import sys
import time
from dataclasses import dataclass
from pathlib import Path
from typing import TYPE_CHECKING
from typing import TypedDict
if TYPE_CHECKING:
from collections.abc import AsyncGenerator
from collections.abc import Generator
class ProgressUpdate(TypedDict, total=False):
"""Progress update message structure."""
type: str
phase: str
message: str
level: str
success: bool
duration: float
return_code: int
@dataclass
class ImportService:
"""Service for importing transformed data into v3 database.
This service orchestrates the three-phase import process:
1. Wipe the existing database
2. Run Django migrations for v3 schema
3. Import the transformed data
"""
source_dir: Path
imported_marker: Path
manage_path: Path | None = None
def __post_init__(self) -> None:
if self.manage_path is None:
# Default to manage.py in the src directory
self.manage_path = (
Path(__file__).resolve().parent.parent.parent / "manage.py"
)
def _get_env(self) -> dict[str, str]:
"""Get environment variables for subprocess calls."""
import os
env = os.environ.copy()
env["DJANGO_SETTINGS_MODULE"] = "paperless.settings"
env["PAPERLESS_MIGRATION_MODE"] = "0"
return env
def _run_command(
self,
args: list[str],
label: str,
) -> Generator[ProgressUpdate, None, int]:
"""Run a command and yield log lines. Returns the return code."""
yield {"type": "log", "message": f"Running: {label}", "level": "info"}
process = subprocess.Popen(
args,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
bufsize=1,
text=True,
env=self._get_env(),
)
try:
if process.stdout:
for line in process.stdout:
yield {
"type": "log",
"message": line.rstrip(),
"level": "info",
}
process.wait()
return process.returncode
finally:
if process.poll() is None:
process.kill()
def run_sync(self) -> Generator[ProgressUpdate, None, None]:
"""Run the import synchronously, yielding progress updates.
This orchestrates:
1. Database wipe
2. Django migrations
3. Document import
"""
start_time = time.perf_counter()
# Phase 1: Wipe database
yield {"type": "phase", "phase": "wipe"}
wipe_cmd = [
sys.executable,
"-m",
"paperless_migration.services.wipe_db",
]
wipe_code = yield from self._run_command(wipe_cmd, "Database wipe")
if wipe_code != 0:
yield {
"type": "error",
"message": f"Database wipe failed with code {wipe_code}",
}
return
yield {"type": "log", "message": "Database wipe complete", "level": "info"}
# Phase 2: Run migrations
yield {"type": "phase", "phase": "migrate"}
migrate_cmd = [
sys.executable,
str(self.manage_path),
"migrate",
"--noinput",
]
migrate_code = yield from self._run_command(migrate_cmd, "Django migrations")
if migrate_code != 0:
yield {
"type": "error",
"message": f"Migrations failed with code {migrate_code}",
}
return
yield {"type": "log", "message": "Migrations complete", "level": "info"}
# Phase 3: Import data
yield {"type": "phase", "phase": "import"}
import_cmd = [
sys.executable,
str(self.manage_path),
"document_importer",
str(self.source_dir),
"--data-only",
]
import_code = yield from self._run_command(import_cmd, "Document import")
if import_code != 0:
yield {
"type": "error",
"message": f"Import failed with code {import_code}",
}
return
# Mark import as complete
try:
self.imported_marker.parent.mkdir(parents=True, exist_ok=True)
self.imported_marker.write_text("ok\n", encoding="utf-8")
except Exception as exc:
yield {
"type": "log",
"message": f"Warning: Could not write import marker: {exc}",
"level": "warning",
}
end_time = time.perf_counter()
duration = end_time - start_time
yield {
"type": "complete",
"success": True,
"duration": duration,
}
async def run_async(self) -> AsyncGenerator[ProgressUpdate, None]:
"""Run the import asynchronously, yielding progress updates.
This wraps the synchronous implementation to work with async consumers.
"""
import asyncio
for update in self.run_sync():
yield update
# Yield control to the event loop
await asyncio.sleep(0)

View File

@@ -0,0 +1,173 @@
"""Transform service for converting v2 exports to v3 format."""
from __future__ import annotations
import json
import time
from collections import Counter
from collections.abc import AsyncGenerator
from collections.abc import Callable
from collections.abc import Generator
from dataclasses import dataclass
from dataclasses import field
from typing import TYPE_CHECKING
from typing import Any
from typing import TypedDict
import ijson
if TYPE_CHECKING:
from pathlib import Path
class FixtureObject(TypedDict):
"""Structure of a Django fixture object."""
model: str
pk: int
fields: dict[str, Any]
class ProgressUpdate(TypedDict, total=False):
"""Progress update message structure."""
type: str
completed: int
stats: dict[str, int]
message: str
level: str
duration: float
total_processed: int
speed: float
TransformFn = Callable[[FixtureObject], FixtureObject]
def transform_documents_document(obj: FixtureObject) -> FixtureObject:
"""Transform a documents.document fixture object for v3 schema."""
fields: dict[str, Any] = obj["fields"]
fields.pop("storage_type", None)
content: Any = fields.get("content")
fields["content_length"] = len(content) if isinstance(content, str) else 0
return obj
# Registry of model-specific transforms
TRANSFORMS: dict[str, TransformFn] = {
"documents.document": transform_documents_document,
}
@dataclass
class TransformService:
"""Service for transforming v2 exports to v3 format.
This service processes JSON fixtures incrementally using ijson for
memory-efficient streaming, and yields progress updates suitable
for WebSocket transmission.
"""
input_path: Path
output_path: Path
update_frequency: int = 100
_stats: Counter[str] = field(default_factory=Counter, init=False)
_total_processed: int = field(default=0, init=False)
def validate(self) -> str | None:
"""Validate preconditions for transform. Returns error message or None."""
if not self.input_path.exists():
return f"Input file not found: {self.input_path}"
if self.output_path.exists():
return f"Output file already exists: {self.output_path}"
if self.input_path.resolve() == self.output_path.resolve():
return "Input and output paths cannot be the same file"
return None
def _process_fixture(self, obj: FixtureObject) -> FixtureObject:
"""Apply any registered transforms to a fixture object."""
model: str = obj["model"]
transform: TransformFn | None = TRANSFORMS.get(model)
if transform:
obj = transform(obj)
self._stats[model] += 1
return obj
def run_sync(self) -> Generator[ProgressUpdate, None, None]:
"""Run the transform synchronously, yielding progress updates.
This is the core implementation that processes the JSON file
and yields progress updates at regular intervals.
"""
error = self.validate()
if error:
yield {"type": "error", "message": error}
return
self._stats.clear()
self._total_processed = 0
start_time = time.perf_counter()
yield {"type": "log", "message": "Opening input file...", "level": "info"}
try:
with (
self.input_path.open("rb") as infile,
self.output_path.open("w", encoding="utf-8") as outfile,
):
outfile.write("[\n")
first = True
for i, obj in enumerate(ijson.items(infile, "item")):
fixture: FixtureObject = obj
fixture = self._process_fixture(fixture)
self._total_processed += 1
if not first:
outfile.write(",\n")
first = False
json.dump(fixture, outfile, ensure_ascii=False)
# Yield progress at configured frequency
if i > 0 and i % self.update_frequency == 0:
yield {
"type": "progress",
"completed": self._total_processed,
"stats": dict(self._stats),
}
outfile.write("\n]\n")
except Exception as exc:
# Clean up partial output on error
if self.output_path.exists():
self.output_path.unlink()
yield {"type": "error", "message": str(exc)}
return
end_time = time.perf_counter()
duration = end_time - start_time
speed = self._total_processed / duration if duration > 0 else 0
yield {
"type": "complete",
"duration": duration,
"total_processed": self._total_processed,
"stats": dict(self._stats),
"speed": speed,
}
async def run_async(self) -> AsyncGenerator[ProgressUpdate, None]:
"""Run the transform asynchronously, yielding progress updates.
This wraps the synchronous implementation to work with async consumers.
The actual I/O is done synchronously since ijson doesn't support async,
but we yield control periodically to keep the event loop responsive.
"""
import asyncio
for update in self.run_sync():
yield update
# Yield control to the event loop periodically
await asyncio.sleep(0)
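The per-model transform is a plain function over fixture dicts, which makes it easy to check in isolation. A sketch of what `transform_documents_document` does to a single fixture, with the stats counter updated the way `_process_fixture` does (re-implemented inline here for illustration):

```python
from collections import Counter
from typing import Any


def transform_document(obj: dict[str, Any]) -> dict[str, Any]:
    # Mirrors transform_documents_document: drop the retired
    # storage_type field and precompute content_length for v3.
    fields = obj["fields"]
    fields.pop("storage_type", None)
    content = fields.get("content")
    fields["content_length"] = len(content) if isinstance(content, str) else 0
    return obj


stats: Counter[str] = Counter()
fixture = {
    "model": "documents.document",
    "pk": 1,
    "fields": {"content": "hello", "storage_type": "unencrypted"},
}
out = transform_document(fixture)
stats[out["model"]] += 1
```

Fixtures for models with no registered transform pass through `_process_fixture` unchanged; only the counter is bumped.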

View File

@@ -0,0 +1,115 @@
"""Database wipe service for migration import process.
This module can be run as a script via:
python -m paperless_migration.services.wipe_db
It uses the paperless_migration settings to wipe all tables
before running v3 migrations.
"""
from __future__ import annotations
import logging
import sys
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from django.db.backends.base.base import BaseDatabaseWrapper
logger = logging.getLogger(__name__)
def _get_target_tables(connection: BaseDatabaseWrapper) -> list[str]:
"""Get list of tables to drop that exist in the database."""
from django.apps import apps
from django.db.migrations.recorder import MigrationRecorder
model_tables = {
model._meta.db_table for model in apps.get_models(include_auto_created=True)
}
model_tables.add(MigrationRecorder.Migration._meta.db_table)
existing_tables = set(connection.introspection.table_names())
return sorted(model_tables & existing_tables)
def _drop_sqlite_tables(connection: BaseDatabaseWrapper) -> int:
"""Drop tables for SQLite database. Returns count of tables dropped."""
tables = _get_target_tables(connection)
with connection.cursor() as cursor:
cursor.execute("PRAGMA foreign_keys=OFF;")
for table in tables:
cursor.execute(f'DROP TABLE IF EXISTS "{table}";')
cursor.execute("PRAGMA foreign_keys=ON;")
return len(tables)
def _drop_postgres_tables(connection: BaseDatabaseWrapper) -> int:
"""Drop tables for PostgreSQL database. Returns count of tables dropped."""
tables = _get_target_tables(connection)
if not tables:
return 0
with connection.cursor() as cursor:
for table in tables:
cursor.execute(f'DROP TABLE IF EXISTS "{table}" CASCADE;')
return len(tables)
def _drop_mysql_tables(connection: BaseDatabaseWrapper) -> int:
"""Drop tables for MySQL/MariaDB database. Returns count of tables dropped."""
tables = _get_target_tables(connection)
with connection.cursor() as cursor:
cursor.execute("SET FOREIGN_KEY_CHECKS=0;")
for table in tables:
cursor.execute(f"DROP TABLE IF EXISTS `{table}`;")
cursor.execute("SET FOREIGN_KEY_CHECKS=1;")
return len(tables)
def wipe_database() -> tuple[bool, str]:
"""Wipe all application tables from the database.
Returns:
Tuple of (success: bool, message: str)
"""
from django.db import connection
vendor = connection.vendor
logger.info("Wiping database for vendor: %s", vendor)
try:
match vendor:
case "sqlite":
count = _drop_sqlite_tables(connection)
case "postgresql":
count = _drop_postgres_tables(connection)
case "mysql":
count = _drop_mysql_tables(connection)
case _:
return False, f"Unsupported database vendor: {vendor}"
message = f"Dropped {count} tables from {vendor} database"
logger.info(message)
return True, message
except Exception as exc:
message = f"Failed to wipe database: {exc}"
logger.exception(message)
return False, message
def main() -> int:
"""Entry point when run as a script."""
import os
import django
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless_migration.settings")
django.setup()
success, message = wipe_database()
print(message) # noqa: T201
return 0 if success else 1
if __name__ == "__main__":
sys.exit(main())

View File

@@ -0,0 +1,245 @@
"""Settings for migration-mode Django instance."""
from __future__ import annotations
import logging
import os
from pathlib import Path
from typing import Any
from dotenv import load_dotenv
BASE_DIR = Path(__file__).resolve().parent.parent
DEBUG = os.getenv("PAPERLESS_DEBUG", "false").lower() == "true"
ALLOWED_HOSTS = ["*"]
# Tap paperless.conf if it's available
for path in [
os.getenv("PAPERLESS_CONFIGURATION_PATH"),
"../paperless.conf",
"/etc/paperless.conf",
"/usr/local/etc/paperless.conf",
]:
if path and Path(path).exists():
load_dotenv(path)
break
def __get_path(
key: str,
default: str | Path,
) -> Path:
if key in os.environ:
return Path(os.environ[key]).resolve()
return Path(default).resolve()
DATA_DIR = __get_path("PAPERLESS_DATA_DIR", BASE_DIR.parent / "data")
EXPORT_DIR = __get_path("PAPERLESS_EXPORT_DIR", BASE_DIR.parent / "export")
def _parse_redis_url() -> str:
"""Parse Redis URL from environment with sensible defaults."""
return os.getenv("PAPERLESS_REDIS_URL", "redis://localhost:6379")
def _parse_db_settings() -> dict[str, dict[str, Any]]:
databases: dict[str, dict[str, Any]] = {
"default": {
"ENGINE": "django.db.backends.sqlite3",
"NAME": DATA_DIR / "db.sqlite3",
"OPTIONS": {},
},
}
if os.getenv("PAPERLESS_DBHOST"):
databases["sqlite"] = databases["default"].copy()
databases["default"] = {
"HOST": os.getenv("PAPERLESS_DBHOST"),
"NAME": os.getenv("PAPERLESS_DBNAME", "paperless"),
"USER": os.getenv("PAPERLESS_DBUSER", "paperless"),
"PASSWORD": os.getenv("PAPERLESS_DBPASS", "paperless"),
"OPTIONS": {},
}
if os.getenv("PAPERLESS_DBPORT"):
databases["default"]["PORT"] = os.getenv("PAPERLESS_DBPORT")
if os.getenv("PAPERLESS_DBENGINE") == "mariadb":
engine = "django.db.backends.mysql"
options = {
"read_default_file": "/etc/mysql/my.cnf",
"charset": "utf8mb4",
"ssl_mode": os.getenv("PAPERLESS_DBSSLMODE", "PREFERRED"),
"ssl": {
"ca": os.getenv("PAPERLESS_DBSSLROOTCERT"),
"cert": os.getenv("PAPERLESS_DBSSLCERT"),
"key": os.getenv("PAPERLESS_DBSSLKEY"),
},
}
else:
engine = "django.db.backends.postgresql"
options = {
"sslmode": os.getenv("PAPERLESS_DBSSLMODE", "prefer"),
"sslrootcert": os.getenv("PAPERLESS_DBSSLROOTCERT"),
"sslcert": os.getenv("PAPERLESS_DBSSLCERT"),
"sslkey": os.getenv("PAPERLESS_DBSSLKEY"),
}
databases["default"]["ENGINE"] = engine
databases["default"]["OPTIONS"].update(options)
    if os.getenv("PAPERLESS_DB_TIMEOUT") is not None:
        timeout = int(os.environ["PAPERLESS_DB_TIMEOUT"])
if databases["default"]["ENGINE"] == "django.db.backends.sqlite3":
databases["default"]["OPTIONS"].update({"timeout": timeout})
else:
databases["default"]["OPTIONS"].update({"connect_timeout": timeout})
databases["sqlite"]["OPTIONS"].update({"timeout": timeout})
return databases
DATABASES = _parse_db_settings()
SECRET_KEY = os.getenv("PAPERLESS_SECRET_KEY")
AUTH_PASSWORD_VALIDATORS = [
{
"NAME": "django.contrib.auth.password_validation.UserAttributeSimilarityValidator",
},
{
"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator",
},
{
"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator",
},
{
"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator",
},
]
LANGUAGE_CODE = "en-us"
TIME_ZONE = "UTC"
USE_I18N = True
USE_TZ = True
CSRF_TRUSTED_ORIGINS: list[str] = []
INSTALLED_APPS = [
"django.contrib.auth",
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.staticfiles",
"channels",
"allauth",
"allauth.account",
"allauth.socialaccount",
"allauth.mfa",
"paperless_migration",
]
MIDDLEWARE = [
"django.middleware.security.SecurityMiddleware",
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.common.CommonMiddleware",
"django.middleware.csrf.CsrfViewMiddleware",
"django.contrib.auth.middleware.AuthenticationMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
"allauth.account.middleware.AccountMiddleware",
]
ROOT_URLCONF = "paperless_migration.urls"
TEMPLATES = [
{
"BACKEND": "django.template.backends.django.DjangoTemplates",
"DIRS": [
BASE_DIR / "paperless_migration" / "templates",
BASE_DIR / "documents" / "templates",
],
"APP_DIRS": True,
"OPTIONS": {
"context_processors": [
"django.template.context_processors.request",
"django.contrib.auth.context_processors.auth",
"django.contrib.messages.context_processors.messages",
],
},
},
]
# ASGI application for Channels
ASGI_APPLICATION = "paperless_migration.asgi.application"
# Channel layers configuration using Redis
REDIS_URL = _parse_redis_url()
CHANNEL_LAYERS = {
"default": {
"BACKEND": "channels_redis.core.RedisChannelLayer",
"CONFIG": {
"hosts": [REDIS_URL],
"capacity": 1500,
"expiry": 10,
},
},
}
# Keep WSGI for compatibility
WSGI_APPLICATION = "paperless_migration.wsgi.application"
AUTHENTICATION_BACKENDS = [
"django.contrib.auth.backends.ModelBackend",
"allauth.account.auth_backends.AuthenticationBackend",
]
STATIC_URL = "/static/"
STATICFILES_DIRS = [
BASE_DIR / ".." / "static",
BASE_DIR / "static",
BASE_DIR / "documents" / "static",
]
DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"
LOGIN_URL = "/accounts/login/"
LOGIN_REDIRECT_URL = "/migration/"
LOGOUT_REDIRECT_URL = "/accounts/login/?loggedout=1"
ACCOUNT_ADAPTER = "allauth.account.adapter.DefaultAccountAdapter"
ACCOUNT_AUTHENTICATED_LOGIN_REDIRECTS = False
SOCIALACCOUNT_ADAPTER = "allauth.socialaccount.adapter.DefaultSocialAccountAdapter"
SOCIALACCOUNT_ENABLED = False
SESSION_ENGINE = "django.contrib.sessions.backends.db"
MIGRATION_EXPORT_PATH = __get_path(
"PAPERLESS_MIGRATION_EXPORT_PATH",
EXPORT_DIR / "manifest.json",
)
MIGRATION_TRANSFORMED_PATH = __get_path(
"PAPERLESS_MIGRATION_TRANSFORMED_PATH",
EXPORT_DIR / "manifest.v3.json",
)
MIGRATION_IMPORTED_PATH = Path(EXPORT_DIR / "import.completed").resolve()
# Progress update frequency (rows between WebSocket updates)
MIGRATION_PROGRESS_FREQUENCY = int(
os.getenv("PAPERLESS_MIGRATION_PROGRESS_FREQUENCY", "100"),
)
# One-time access code required for migration logins; stable across autoreload
_code = os.getenv("PAPERLESS_MIGRATION_ACCESS_CODE")
if not _code:
import secrets
_code = secrets.token_urlsafe(12)
os.environ["PAPERLESS_MIGRATION_ACCESS_CODE"] = _code
MIGRATION_ACCESS_CODE = _code
if os.environ.get("PAPERLESS_MIGRATION_CODE_LOGGED") != "1":
logging.getLogger(__name__).warning(
"Migration one-time access code: %s",
MIGRATION_ACCESS_CODE,
)
os.environ["PAPERLESS_MIGRATION_CODE_LOGGED"] = "1"
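The access-code block above survives Django's autoreloader by stashing the generated code back into the environment, so the reloaded child process reads the same value instead of minting a new one. A sketch of that generate-once pattern, using an injected mapping in place of `os.environ` (the helper name is illustrative):

```python
import secrets
from typing import MutableMapping


def stable_access_code(env: MutableMapping[str, str]) -> str:
    # Mirrors the settings logic: generate once, stash in the
    # environment so an autoreloaded process sees the same code.
    code = env.get("PAPERLESS_MIGRATION_ACCESS_CODE")
    if not code:
        code = secrets.token_urlsafe(12)
        env["PAPERLESS_MIGRATION_ACCESS_CODE"] = code
    return code


env: dict[str, str] = {}
first = stable_access_code(env)
second = stable_access_code(env)
```

The companion `PAPERLESS_MIGRATION_CODE_LOGGED` flag uses the same trick to keep the warning from being logged twice across the reload.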

View File

@@ -0,0 +1,77 @@
{% load i18n static %}
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="author" content="Paperless-ngx project and contributors">
<meta name="robots" content="noindex,nofollow">
<meta name="color-scheme" content="light">
<title>{% translate "Paperless-ngx sign in" %}</title>
<link href="{% static 'bootstrap.min.css' %}" rel="stylesheet">
<link href="{% static 'base.css' %}" rel="stylesheet">
<style>
:root, body, .form-control, .form-floating {
color-scheme: light;
--bs-body-bg: #f5f5f5;
--bs-body-color: #212529;
--bs-body-color-rgb: 33, 37, 41;
--bs-border-color: #dee2e6;
--bs-link-color: #17541f;
--bs-link-color-rgb: 23, 84, 31;
}
@media (prefers-color-scheme: dark) { :root { color-scheme: light; } }
body {
min-height: 100vh;
background:
radial-gradient(circle at 20% 20%, #eef5ef, #f7fbf7),
linear-gradient(120deg, rgba(23, 84, 31, 0.05) 0%, rgba(0,0,0,0) 30%),
linear-gradient(300deg, rgba(15, 54, 20, 0.06) 0%, rgba(0,0,0,0) 40%);
}
</style>
</head>
<body class="d-flex align-items-center justify-content-center text-center p-3">
<main class="w-100" style="max-width: 360px;">
<form class="form-accounts p-4 rounded-4" id="form-account" method="post">
{% csrf_token %}
{% include "paperless-ngx/snippets/svg_logo.html" with extra_attrs="width='240' class='logo mb-3'" %}
<p class="text-uppercase fw-semibold mb-1 text-secondary small" style="letter-spacing: 0.12rem;">{% translate "Migration Mode" %}</p>
{% for message in messages %}
<div class="alert alert-{{ message.level_tag }} mb-2" role="alert">{{ message }}</div>
{% endfor %}
<p class="mb-3">{% translate "Login with a superuser account to proceed." %}</p>
{% if form.errors %}
<div class="alert alert-danger" role="alert">
{% for field, errors in form.errors.items %}
{% for error in errors %}
{{ error }}
{% endfor %}
{% endfor %}
</div>
{% endif %}
{% translate "Username" as i18n_username %}
{% translate "Password" as i18n_password %}
<div class="form-floating form-stacked-top">
<input type="text" name="login" id="inputUsername" placeholder="{{ i18n_username }}" class="form-control" autocorrect="off" autocapitalize="none" required autofocus>
<label for="inputUsername">{{ i18n_username }}</label>
</div>
<div class="form-floating form-stacked-middle">
<input type="password" name="password" id="inputPassword" placeholder="{{ i18n_password }}" class="form-control" required>
<label for="inputPassword">{{ i18n_password }}</label>
</div>
<div class="form-floating form-stacked-bottom">
<input type="text" name="code" id="inputCode" placeholder="One-time code" class="form-control" required>
<label for="inputCode">One-time code</label>
</div>
<p class="mt-2 small fst-italic">{% translate "Code can be found in the startup logs." %}</p>
<div class="d-grid mt-3">
<button class="btn btn-lg btn-primary" type="submit">{% translate "Sign in" %}</button>
</div>
</form>
</main>
</body>
</html>


@@ -0,0 +1,558 @@
<!doctype html>
{% load static %}
<html lang="en">
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<title>Paperless-ngx Migration Mode</title>
<link rel="stylesheet" href="{% static 'bootstrap.min.css' %}" />
<link rel="stylesheet" href="{% static 'base.css' %}" />
<style>
:root, .form-control {
color-scheme: light;
--bs-body-bg: #f5f5f5;
--bs-body-color: #212529;
--bs-body-color-rgb: 33, 37, 41;
--bs-border-color: #dee2e6;
--bs-link-color: var(--pngx-primary);
--bs-link-color-rgb: 23, 84, 31;
}
@media (prefers-color-scheme: dark) { :root { color-scheme: light; } }
.btn-primary:disabled {
--bs-btn-disabled-bg: #4d7352;
--bs-btn-disabled-border-color: #4d7352;
}
body {
background:
radial-gradient(circle at 20% 20%, #eef5ef, #f7fbf7),
linear-gradient(120deg, rgba(23, 84, 31, 0.05) 0%, rgba(0,0,0,0) 30%),
linear-gradient(300deg, rgba(15, 54, 20, 0.06) 0%, rgba(0,0,0,0) 40%);
min-height: 100vh;
}
svg.logo .text {
fill: #161616 !important;
}
.hero-card,
.card-step {
background: #fff;
backdrop-filter: blur(6px);
border: 1px solid rgba(23, 84, 31, 0.08);
box-shadow: 0 16px 40px rgba(0, 0, 0, 0.06);
border-radius: 18px;
}
.status-dot {
width: 10px;
height: 10px;
border-radius: 50%;
display: inline-block;
}
.card-step {
border-radius: 16px;
transition: transform 0.15s ease, box-shadow 0.15s ease;
}
.card-step.done-step {
opacity: 0.4;
}
.path-pill {
background: rgba(23, 84, 31, 0.08);
color: var(--bs-body-color);
border-radius: 12px;
padding: 0.4rem 0.75rem;
font-size: 0.9rem;
}
.step-rail {
position: relative;
height: 4px;
background: rgba(23, 84, 31, 0.12);
border-radius: 999px;
}
.step-rail .fill {
position: absolute;
left: 0;
top: 0;
bottom: 0;
width: calc({{ export_exists|yesno:'33,0' }}% + {{ transformed_exists|yesno:'33,0' }}% + {{ imported_exists|yesno:'34,0' }}%);
max-width: 100%;
background: linear-gradient(90deg, #17541f, #2c7a3c);
border-radius: 999px;
transition: width 0.3s ease;
}
.step-chip {
width: 38px;
height: 38px;
border-radius: 50%;
display: grid;
place-items: center;
font-weight: 700;
background: #fff;
border: 2px solid rgba(23, 84, 31, 0.25);
color: #17541f;
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.08);
}
.step-chip.done {
background: #17541f;
color: #fff;
border-color: #17541f;
}
.console-log {
background: #0f1a12;
color: #d1e7d6;
border-radius: 12px;
min-height: 180px;
max-height: 400px;
padding: 12px;
font-size: 0.85rem;
font-family: 'Consolas', 'Monaco', monospace;
overflow: auto;
white-space: pre-wrap;
word-break: break-word;
}
.console-log .log-error { color: #ff6b6b; }
.console-log .log-warning { color: #ffd93d; }
.console-log .log-success { color: #6bcb77; }
.console-log .log-info { color: #4d96ff; }
.progress-bar-container {
height: 24px;
background: rgba(23, 84, 31, 0.1);
border-radius: 12px;
overflow: hidden;
margin-bottom: 0.5rem;
}
.progress-bar-fill {
height: 100%;
background: linear-gradient(90deg, #17541f, #2c7a3c);
border-radius: 12px;
transition: width 0.3s ease;
display: flex;
align-items: center;
justify-content: center;
color: white;
font-size: 0.75rem;
font-weight: 600;
min-width: fit-content;
padding: 0 8px;
}
.stats-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(120px, 1fr));
gap: 0.5rem;
margin-top: 0.5rem;
}
.stat-item {
background: rgba(23, 84, 31, 0.05);
border-radius: 8px;
padding: 0.5rem;
text-align: center;
}
.stat-value {
font-size: 1.25rem;
font-weight: 700;
color: #17541f;
}
.stat-label {
font-size: 0.75rem;
color: #666;
}
.ws-status {
display: inline-flex;
align-items: center;
gap: 0.5rem;
padding: 0.25rem 0.75rem;
border-radius: 999px;
font-size: 0.8rem;
font-weight: 500;
}
.ws-status.connected { background: #d4edda; color: #155724; }
.ws-status.disconnected { background: #f8d7da; color: #721c24; }
.ws-status.connecting { background: #fff3cd; color: #856404; }
</style>
</head>
<body class="pb-4">
<div class="container py-4">
<div class="row justify-content-center mb-4">
<div class="col-lg-9">
<div class="hero-card p-4">
<div class="d-flex flex-wrap align-items-center justify-content-between gap-3">
<div class="d-flex align-items-center gap-3">
{% include "paperless-ngx/snippets/svg_logo.html" with extra_attrs="width='280' class='logo'" %}
<div class="ps-2">
<p class="text-uppercase fw-semibold mb-1 text-secondary" style="letter-spacing: 0.12rem;">Migration Mode</p>
<h1 class="h3 mb-2 text-primary">Paperless-ngx v2 to v3</h1>
<p class="text-muted mb-0">Migrate your data from Paperless-ngx version 2 to version 3.</p>
</div>
</div>
<div class="text-end">
<span class="badge bg-success-subtle text-success border border-success-subtle px-3 py-2">Online</span>
</div>
</div>
<div class="mt-4">
<div class="d-flex justify-content-between align-items-center mb-2">
<div class="d-flex align-items-center gap-2">
<span class="step-chip {% if export_exists %}done{% endif %}">1</span>
<div>
<div class="fw-semibold mb-0">Export</div>
<small class="text-muted">v2 data</small>
</div>
</div>
<div class="d-flex align-items-center gap-2">
<span class="step-chip {% if transformed_exists %}done{% endif %}">2</span>
<div>
<div class="fw-semibold mb-0">Transform</div>
<small class="text-muted">to v3 schema</small>
</div>
</div>
<div class="d-flex align-items-center gap-2">
<span class="step-chip {% if imported_exists %}done{% endif %}">3</span>
<div>
<div class="fw-semibold mb-0">Import</div>
<small class="text-muted">into v3</small>
</div>
</div>
</div>
<div class="step-rail">
<div class="fill"></div>
</div>
</div>
{% if messages %}
<div class="mt-4">
{% for message in messages %}
<div class="alert alert-{{ message.level_tag }} mb-2" role="alert">{{ message }}</div>
{% endfor %}
</div>
{% endif %}
<div class="row g-3 mt-2">
<div class="col-md-6">
<div class="d-flex align-items-center gap-2">
<span class="status-dot bg-{{ export_exists|yesno:'success,danger' }}"></span>
<div>
<div class="fw-semibold">Export file</div>
<div class="small text-muted">{{ export_exists|yesno:"Ready,Missing" }}</div>
</div>
</div>
<div class="path-pill mt-2 text-truncate" title="{{ export_path }}">{{ export_path }}</div>
</div>
<div class="col-md-6">
<div class="d-flex align-items-center gap-2">
<span class="status-dot bg-{{ transformed_exists|yesno:'success,warning' }}"></span>
<div>
<div class="fw-semibold">Transformed file</div>
<div class="small text-muted">{{ transformed_exists|yesno:"Ready,Pending" }}</div>
</div>
</div>
<div class="path-pill mt-2 text-truncate" title="{{ transformed_path }}">{{ transformed_path }}</div>
</div>
</div>
</div>
</div>
</div>
<div class="row gy-4 justify-content-center">
<div class="col-lg-3 col-md-4">
<div class="card card-step h-100 {% if export_exists %}done-step{% endif %}">
<div class="card-body d-flex flex-column gap-3">
<div>
<p class="text-uppercase text-muted mb-1 fw-semibold" style="letter-spacing: 0.08rem;">Step 1</p>
<h3 class="h5 mb-1">Export (v2)</h3>
<p class="small text-muted mb-0">Generate and upload the v2 export file.</p>
</div>
<div class="mt-auto d-grid gap-2">
<form method="post" enctype="multipart/form-data" class="d-flex gap-2 align-items-center">
{% csrf_token %}
<input class="form-control form-control-sm" type="file" name="export_file" accept=".json" {% if export_exists %}disabled{% endif %} required>
<button class="btn btn-outline-secondary btn-sm" type="submit" name="action" value="upload" {% if export_exists %}disabled aria-disabled="true"{% endif %}>Upload</button>
</form>
<form method="post">
{% csrf_token %}
<button class="btn btn-primary w-100" type="submit" name="action" value="check" {% if export_exists %}disabled aria-disabled="true"{% endif %}>Re-check export</button>
</form>
</div>
</div>
</div>
</div>
<div class="col-lg-3 col-md-4">
<div class="card card-step h-100 {% if transformed_exists %}done-step{% endif %}">
<div class="card-body d-flex flex-column gap-3">
<div>
<p class="text-uppercase text-muted mb-1 fw-semibold" style="letter-spacing: 0.08rem;">Step 2</p>
<h3 class="h5 mb-1">Transform</h3>
<p class="small text-muted mb-0">Convert the export into the v3-ready structure.</p>
</div>
<div class="mt-auto d-grid gap-2">
<form method="post">
{% csrf_token %}
<button
class="btn btn-outline-primary w-100"
type="submit"
name="action"
value="transform"
id="btn-transform"
{% if not export_exists or transformed_exists %}disabled aria-disabled="true"{% endif %}
>
Transform export
</button>
</form>
{% if transformed_exists %}
<form method="post">
{% csrf_token %}
<button class="btn btn-outline-danger btn-sm w-100" type="submit" name="action" value="reset_transform">
Reset transform
</button>
</form>
{% endif %}
</div>
</div>
</div>
</div>
<div class="col-lg-3 col-md-4">
<div class="card card-step h-100 {% if imported_exists %}done-step{% endif %}">
<div class="card-body d-flex flex-column gap-3">
<div>
<p class="text-uppercase text-muted mb-1 fw-semibold" style="letter-spacing: 0.08rem;">Step 3</p>
<h3 class="h5 mb-1">Import (v3)</h3>
<p class="small text-muted mb-0">Load the transformed data into your v3 instance.</p>
</div>
<div class="mt-auto">
<form method="post">
{% csrf_token %}
<button
class="btn btn-outline-secondary w-100"
type="submit"
name="action"
value="import"
id="btn-import"
{% if not transformed_exists or imported_exists %}disabled aria-disabled="true"{% endif %}
>
Import transformed data
</button>
</form>
</div>
</div>
</div>
</div>
</div>
<div class="row justify-content-center mt-4">
<div class="col-lg-9">
{% if not export_exists %}
<div class="alert alert-info mb-3">
<div class="fw-semibold mb-1">Export file not found</div>
<div class="small">
Run the v2 export from your Paperless instance, e.g.:
<code>docker run --rm ghcr.io/paperless-ngx/paperless-ngx:2.20.6 document_exporter --data-only</code>
(see <a href="https://docs.paperless-ngx.com/administration/#exporter" target="_blank" rel="noopener noreferrer">documentation</a>). Once the <code>manifest.json</code> is in place, upload it or (especially for larger files) place it directly at the expected location and click "Re-check export".
<p class="mt-2 mb-0 text-danger fst-italic">Warning: The export must be generated with Paperless-ngx v2.20.6.</p>
</div>
</div>
{% endif %}
<div class="card card-step">
<div class="card-body">
<div class="d-flex justify-content-between align-items-center mb-2">
<div class="fw-semibold">Migration console</div>
<span id="ws-status" class="ws-status disconnected">
<span class="status-dot"></span>
<span class="status-text">Ready</span>
</span>
</div>
<div id="progress-container" class="mb-3" style="display: none;">
<div class="progress-bar-container">
<div id="progress-bar" class="progress-bar-fill" style="width: 0%;">
<span id="progress-text">0 rows</span>
</div>
</div>
<div id="stats-container" class="stats-grid"></div>
</div>
<div id="migration-log" class="console-log">Ready to begin migration...</div>
</div>
</div>
</div>
</div>
</div>
<script>
(function() {
const logEl = document.getElementById('migration-log');
const wsStatusEl = document.getElementById('ws-status');
const progressContainer = document.getElementById('progress-container');
const progressBar = document.getElementById('progress-bar');
const progressText = document.getElementById('progress-text');
const statsContainer = document.getElementById('stats-container');
function setWsStatus(status, text) {
wsStatusEl.className = 'ws-status ' + status;
wsStatusEl.querySelector('.status-text').textContent = text;
}
function appendLog(message, level) {
const line = document.createElement('div');
line.className = 'log-' + (level || 'info');
line.textContent = message;
logEl.appendChild(line);
logEl.scrollTop = logEl.scrollHeight;
}
function clearLog() {
logEl.innerHTML = '';
}
function updateProgress(current, total, label) {
progressContainer.style.display = 'block';
const pct = total ? Math.min(100, (current / total) * 100) : 0;
progressBar.style.width = (total ? pct : 100) + '%';
progressText.textContent = label || (current.toLocaleString() + ' rows');
}
function updateStats(stats) {
if (!stats || Object.keys(stats).length === 0) {
statsContainer.innerHTML = '';
return;
}
let html = '';
for (const [key, value] of Object.entries(stats)) {
const label = key.replace('documents.', '').replace(/_/g, ' ');
html += '<div class="stat-item">' +
'<div class="stat-value">' + (typeof value === 'number' ? value.toLocaleString() : value) + '</div>' +
'<div class="stat-label">' + label + '</div>' +
'</div>';
}
statsContainer.innerHTML = html;
}
function formatDuration(seconds) {
if (seconds < 60) return seconds.toFixed(1) + 's';
const mins = Math.floor(seconds / 60);
const secs = (seconds % 60).toFixed(0);
return mins + 'm ' + secs + 's';
}
function startWebSocket(action) {
const protocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsUrl = protocol + '//' + window.location.host + '/ws/migration/' + action + '/';
clearLog();
appendLog('Connecting to ' + action + ' service...', 'info');
setWsStatus('connecting', 'Connecting...');
progressContainer.style.display = 'none';
statsContainer.innerHTML = '';
const ws = new WebSocket(wsUrl);
ws.onopen = function() {
setWsStatus('connected', 'Connected');
appendLog('Connected. Starting ' + action + '...', 'success');
ws.send(JSON.stringify({ action: 'start' }));
};
ws.onmessage = function(event) {
try {
const data = JSON.parse(event.data);
switch (data.type) {
case 'log':
appendLog(data.message, data.level || 'info');
break;
case 'progress':
updateProgress(data.current, data.total, data.label);
break;
case 'stats':
if (data.transformed) {
updateStats(data.transformed);
} else {
updateStats(data);
}
break;
case 'complete':
const status = data.success ? 'success' : 'error';
const msg = data.success
? 'Completed successfully in ' + formatDuration(data.duration)
: 'Operation failed';
appendLog(msg, status);
if (data.total_processed) {
appendLog('Total processed: ' + data.total_processed.toLocaleString() + ' rows', 'info');
}
if (data.speed) {
appendLog('Speed: ' + Math.round(data.speed).toLocaleString() + ' rows/sec', 'info');
}
if (data.stats) {
updateStats(data.stats);
}
setWsStatus('disconnected', 'Complete');
ws.close();
if (data.success) {
setTimeout(function() { window.location.reload(); }, 1500);
}
break;
case 'error':
appendLog('Error: ' + data.message, 'error');
setWsStatus('disconnected', 'Error');
break;
default:
appendLog(JSON.stringify(data), 'info');
}
} catch (e) {
appendLog('Received: ' + event.data, 'info');
}
};
ws.onerror = function(error) {
appendLog('WebSocket error occurred', 'error');
setWsStatus('disconnected', 'Error');
};
ws.onclose = function(event) {
if (event.code !== 1000) {
const reason = event.code === 4001 ? 'Not authenticated'
: event.code === 4002 ? 'Migration code not verified'
: event.code === 4003 ? 'Superuser access required'
: 'Connection closed (code: ' + event.code + ')';
appendLog(reason, 'error');
}
setWsStatus('disconnected', 'Disconnected');
};
}
// Check if we should auto-start a WebSocket action
{% if ws_action %}
startWebSocket('{{ ws_action }}');
{% endif %}
// Expose for manual triggering if needed
window.startMigrationWs = startWebSocket;
})();
</script>
</body>
</html>


@@ -0,0 +1,21 @@
"""URL configuration for migration mode."""
from __future__ import annotations
from django.conf import settings
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from django.urls import include
from django.urls import path
from paperless_migration import views
urlpatterns = [
path("accounts/login/", views.migration_login, name="account_login"),
path("accounts/", include("allauth.urls")),
path("migration/", views.migration_home, name="migration_home"),
# Redirect root to migration home
path("", views.migration_home, name="home"),
]
if settings.DEBUG:
urlpatterns += staticfiles_urlpatterns()


@@ -0,0 +1,132 @@
"""Views for migration mode web interface."""
from __future__ import annotations
from pathlib import Path
from typing import TYPE_CHECKING
from django.conf import settings
from django.contrib import messages
from django.contrib.auth import authenticate
from django.contrib.auth import login
from django.contrib.auth.decorators import login_required
from django.http import HttpResponseForbidden
from django.shortcuts import redirect
from django.shortcuts import render
from django.views.decorators.http import require_http_methods
if TYPE_CHECKING:
from django.http import HttpRequest
from django.http import HttpResponse
def _check_migration_access(request: HttpRequest) -> HttpResponse | None:
"""Check if user has migration access. Returns error response or None."""
if not request.session.get("migration_code_ok"):
return HttpResponseForbidden("Access code required")
if not request.user.is_superuser:
return HttpResponseForbidden("Superuser access required")
return None
@login_required
@require_http_methods(["GET", "POST"])
def migration_home(request: HttpRequest) -> HttpResponse:
"""Main migration dashboard view."""
error_response = _check_migration_access(request)
if error_response:
return error_response
export_path = Path(settings.MIGRATION_EXPORT_PATH)
transformed_path = Path(settings.MIGRATION_TRANSFORMED_PATH)
imported_marker = Path(settings.MIGRATION_IMPORTED_PATH)
if request.method == "POST":
action = request.POST.get("action")
if action == "check":
messages.success(request, "Checked export paths.")
elif action == "upload":
upload = request.FILES.get("export_file")
if not upload:
messages.error(request, "No file selected.")
else:
try:
export_path.parent.mkdir(parents=True, exist_ok=True)
with export_path.open("wb") as dest:
for chunk in upload.chunks():
dest.write(chunk)
messages.success(request, f"Uploaded to {export_path}.")
except Exception as exc:
messages.error(request, f"Failed to save file: {exc}")
elif action == "transform":
if imported_marker.exists():
imported_marker.unlink()
# Signal to start WebSocket connection for transform
request.session["start_ws_action"] = "transform"
messages.info(request, "Starting transform via WebSocket...")
elif action == "import":
# Signal to start WebSocket connection for import
request.session["start_ws_action"] = "import"
messages.info(request, "Starting import via WebSocket...")
elif action == "reset_transform":
if transformed_path.exists():
try:
transformed_path.unlink()
messages.success(request, "Transformed file deleted.")
except Exception as exc:
messages.error(request, f"Failed to delete transformed file: {exc}")
if imported_marker.exists():
try:
imported_marker.unlink()
except Exception:
pass
else:
messages.error(request, "Unknown action.")
return redirect("migration_home")
ws_action = request.session.pop("start_ws_action", None)
context = {
"export_path": export_path,
"export_exists": export_path.exists(),
"transformed_path": transformed_path,
"transformed_exists": transformed_path.exists(),
"imported_exists": imported_marker.exists(),
"ws_action": ws_action,
}
return render(request, "paperless_migration/migration_home.html", context)
@require_http_methods(["GET", "POST"])
def migration_login(request: HttpRequest) -> HttpResponse:
"""Migration-specific login view requiring access code."""
if request.method == "POST":
username = request.POST.get("login", "")
password = request.POST.get("password", "")
code = request.POST.get("code", "")
if not code or code != settings.MIGRATION_ACCESS_CODE:
messages.error(request, "One-time code is required.")
return redirect("account_login")
user = authenticate(request, username=username, password=password)
if user is None:
messages.error(request, "Invalid username or password.")
return redirect("account_login")
if not user.is_superuser:
messages.error(request, "Superuser access required.")
return redirect("account_login")
login(request, user)
request.session["migration_code_ok"] = True
return redirect(settings.LOGIN_REDIRECT_URL)
return render(request, "account/login.html")


@@ -0,0 +1,7 @@
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "paperless_migration.settings")

application = get_wsgi_application()

uv.lock (generated) — file diff suppressed because it is too large.