Compare commits
3 Commits
v2.8.3...sunset-rtd

Commits in this comparison:
- 15f4808fec
- d531805597
- 304cfc42a9
9  .build-config.json  Normal file
@@ -0,0 +1,9 @@
{
  "qpdf": {
    "version": "11.1.1"
  },
  "jbig2enc": {
    "version": "0.29",
    "git_tag": "0.29"
  }
}
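This new file is the single point of configuration for the third-party build versions the CI pipeline uses. A minimal sketch of how it is read (this mirrors the lookup done by .github/scripts/get-build-json.py further down in this compare):

# Minimal sketch: read the pinned versions from .build-config.json,
# as the CI helper script in this compare does.
import json
from pathlib import Path

build_config = json.loads(Path(".build-config.json").read_text())
print(build_config["qpdf"]["version"])      # -> 11.1.1
print(build_config["jbig2enc"]["git_tag"])  # -> 0.29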
28  .codecov.yml
@@ -1,28 +0,0 @@
codecov:
  require_ci_to_pass: true
# https://docs.codecov.com/docs/flags#recommended-automatic-flag-management
# Require each flag to have 1 upload before notification
flag_management:
  individual_flags:
    - name: backend
      paths:
        - src/
    - name: frontend
      paths:
        - src-ui/
# https://docs.codecov.com/docs/pull-request-comments
# codecov will only comment if coverage changes
comment:
  require_changes: true
coverage:
  status:
    project:
      default:
        # https://docs.codecov.com/docs/commit-status#threshold
        threshold: 1%
    patch:
      default:
        # For the changed lines only, target 75% covered, but
        # allow as low as 50%
        target: 75%
        threshold: 25%
@@ -1,3 +0,0 @@
[codespell]
write-changes = True
ignore-words-list = criterias,afterall,valeu,ureue,equest,ure
@@ -1,28 +1,21 @@
# Tool caches
**/__pycache__
**/.ruff_cache/
**/.mypy_cache/
# Virtual environment & similar
.venv/
./src-ui/node_modules
./src-ui/dist
# IDE folders
.idea/
.vscode/
./src-ui/.vscode
# VCS
/src-ui/.vscode
/src-ui/node_modules
/src-ui/dist
.git
# Test related
**/.pytest_cache
/export
/consume
/media
/data
/docs
.pytest_cache
/dist
/scripts
/resources
**/tests
**/*.spec.ts
**/htmlcov
# Local folders
./export
./consume
./media
./data
./docs
./dist
./scripts
./resources
/src/.pytest_cache
.idea
.venv/
.vscode/
1  .env
@@ -1 +1,2 @@
COMPOSE_PROJECT_NAME=paperless
export PROMPT="(pipenv-projectname)$P$G"
14  .github/DISCUSSION_TEMPLATE/feature-requests.yml  vendored
@@ -1,14 +0,0 @@
title: "[Feature Request] "
body:
  - type: textarea
    id: description
    attributes:
      label: Description
      description: A clear and concise description of what you would like to see.
    validations:
      required: true
  - type: textarea
    id: other
    attributes:
      label: Other
      description: Add any other context or information about the feature request here.
28  .github/ISSUE_TEMPLATE/bug-report.yml  vendored
@@ -6,21 +6,14 @@ body:
  - type: markdown
    attributes:
      value: |
        ### ⚠️ Please remember: issues are for *bugs*
        That is, something you believe affects every single user of Paperless-ngx, not just you. If you're not sure, start with one of the other options below.
        Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperless:adnidor.de).

        Also, note that **Paperless-ngx does not perform OCR itself**, that is handled by other tools. Problems with OCR of specific files should likely be raised 'upstream', see https://github.com/ocrmypdf/OCRmyPDF/issues or https://github.com/tesseract-ocr/tesseract/issues
  - type: markdown
    attributes:
      value: |
        #### Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperlessngx:matrix.org).
        Before opening an issue, please double check:

        #### Before opening an issue, please double check:

        - [The troubleshooting documentation](https://docs.paperless-ngx.com/troubleshooting/).
        - [The installation instructions](https://docs.paperless-ngx.com/setup/#installation).
        - [The troubleshooting documentation](https://paperless-ngx.readthedocs.io/en/latest/troubleshooting.html).
        - [The installation instructions](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation).
        - [Existing issues and discussions](https://github.com/paperless-ngx/paperless-ngx/search?q=&type=issues).
        - Disable any custom container initialization scripts, if using
        - Disable any customer container initialization scripts, if using any

        If you encounter issues while installing or configuring Paperless-ngx, please post in the ["Support" section of the discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=support).
  - type: textarea
@@ -102,14 +95,3 @@ body:
    attributes:
      label: Other
      description: Any other relevant details.
  - type: checkboxes
    id: required-checks
    attributes:
      label: Please confirm the following
      options:
        - label: I believe this issue is a bug that affects all users of Paperless-ngx, not something specific to my installation.
          required: true
        - label: I have already searched for relevant existing issues and discussions before opening this report.
          required: true
        - label: I have updated the title field above with a concise description.
          required: true
2  .github/ISSUE_TEMPLATE/config.yml  vendored
@@ -4,7 +4,7 @@ contact_links:
    url: https://github.com/paperless-ngx/paperless-ngx/discussions
    about: This issue tracker is not for support questions. Please refer to our Discussions.
  - name: 💬 Chat
    url: https://matrix.to/#/#paperlessngx:matrix.org
    url: https://matrix.to/#/#paperless:adnidor.de
    about: Want to discuss Paperless-ngx with others? Check out our chat.
  - name: 🚀 Feature Request
    url: https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=feature-requests
24  .github/PULL_REQUEST_TEMPLATE.md  vendored
@@ -8,11 +8,7 @@ Note: All PRs with code changes should be targeted to the `dev` branch, pure doc
Please include a summary of the change and which issue is fixed (if any) and any relevant motivation / context. List any dependencies that are required for this change. If appropriate, please include an explanation of how your proposed change can be tested. Screenshots and / or videos can also be helpful if appropriate.
-->

<!--
⚠️ Important: Pull requests that implement a new feature or enhancement *should almost always target an existing feature request* with evidence of community interest and discussion. This is in order to balance the work of implementing and maintaining new features / enhancements. If that is not currently the case, please open a feature request instead of this PR to gather feedback from both users and the project maintainers.
-->

Closes #(issue or discussion)
Fixes # (issue)

## Type of change

@@ -21,22 +17,16 @@ What type of change does your PR introduce to Paperless-ngx?
NOTE: Please check only one box!
-->

- [ ] Bug fix: non-breaking change which fixes an issue.
- [ ] New feature / Enhancement: non-breaking change which adds functionality. _Please read the important note above._
- [ ] Breaking change: fix or feature that would cause existing functionality to not work as expected.
- [ ] Documentation only.
- [ ] Other. Please explain:
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Other (please explain)

## Checklist:

<!--
NOTE: PRs that do not address the following will not be merged, please do not skip any relevant items.
-->

- [ ] I have read & agree with the [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).
- [ ] If applicable, I have included testing coverage for new code in this PR, for [backend](https://docs.paperless-ngx.com/development/#testing) and / or [front-end](https://docs.paperless-ngx.com/development/#testing-and-code-style) changes.
- [ ] If applicable, I have tested my code for new features & regressions on both mobile & desktop devices, using the latest version of major browsers.
- [ ] If applicable, I have checked that all tests pass, see [documentation](https://docs.paperless-ngx.com/development/#back-end-development).
- [ ] I have run all `pre-commit` hooks, see [documentation](https://docs.paperless-ngx.com/development/#code-formatting-with-pre-commit-hooks).
- [ ] If applicable, I have checked that all tests pass, see [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html#back-end-development).
- [ ] I have run all `pre-commit` hooks, see [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html#code-formatting-with-pre-commit-hooks).
- [ ] I have made corresponding changes to the documentation as needed.
- [ ] I have checked my modifications for any breaking changes.
44  .github/dependabot.yml  vendored
@@ -8,7 +8,7 @@ updates:
    target-branch: "dev"
    # Look for `package.json` and `lock` files in the `/src-ui` directory
    directory: "/src-ui"
    open-pull-requests-limit: 10
    # Check the npm registry for updates every month
    schedule:
      interval: "monthly"
    labels:
@@ -17,21 +17,6 @@ updates:
    # Add reviewers
    reviewers:
      - "paperless-ngx/frontend"
    groups:
      frontend-angular-dependencies:
        patterns:
          - "@angular*"
          - "@ng-*"
          - "ngx-*"
          - "ng2-pdf-viewer"
      frontend-jest-dependencies:
        patterns:
          - "@types/jest"
          - "jest*"
      frontend-eslint-dependencies:
        patterns:
          - "@typescript-eslint*"
          - "eslint"

  # Enable version updates for Python
  - package-ecosystem: "pip"
@@ -47,27 +32,8 @@ updates:
    # Add reviewers
    reviewers:
      - "paperless-ngx/backend"
    ignore:
      - dependency-name: "uvicorn"
    groups:
      development:
        patterns:
          - "*pytest*"
          - "black"
          - "ruff"
          - "mkdocs-material"
      django:
        patterns:
          - "*django*"
      major-versions:
        update-types:
          - "major"
      small-changes:
        update-types:
          - "minor"
          - "patch"

  # Enable updates for GitHub Actions
  # Enable updates for Github Actions
  - package-ecosystem: "github-actions"
    target-branch: "dev"
    directory: "/"
@@ -80,9 +46,3 @@ updates:
    # Add reviewers
    reviewers:
      - "paperless-ngx/ci-cd"
    groups:
      actions:
        update-types:
          - "major"
          - "minor"
          - "patch"
16  .github/release-drafter.yml  vendored
@@ -4,7 +4,6 @@ autolabeler:
      - '/^fix/'
    title:
      - "/^fix/i"
      - "/^Bugfix/i"
  - label: "enhancement"
    branch:
      - '/^feature/'
@@ -14,9 +13,6 @@ categories:
  - title: 'Breaking Changes'
    labels:
      - 'breaking-change'
  - title: 'Notable Changes'
    labels:
      - 'notable'
  - title: 'Features'
    labels:
      - 'enhancement'
@@ -24,8 +20,7 @@ categories:
    labels:
      - 'bug'
  - title: 'Documentation'
    labels:
      - 'documentation'
    label: 'documentation'
  - title: 'Maintenance'
    labels:
      - 'chore'
@@ -34,13 +29,12 @@ categories:
      - 'ci-cd'
  - title: 'Dependencies'
    collapse-after: 3
    labels:
      - 'dependencies'
    label: 'dependencies'
  - title: 'All App Changes'
    labels:
      - 'frontend'
      - 'backend'
    collapse-after: 1
    collapse-after: 0
include-labels:
  - 'enhancement'
  - 'bug'
@@ -52,10 +46,6 @@ include-labels:
  - 'frontend'
  - 'backend'
  - 'ci-cd'
  - 'breaking-change'
  - 'notable'
exclude-labels:
  - 'skip-changelog'
category-template: '### $TITLE'
change-template: '- $TITLE @$AUTHOR ([#$NUMBER]($URL))'
change-title-escapes: '\<*_&#@'
402  .github/scripts/cleanup-tags.py  vendored  Normal file
@@ -0,0 +1,402 @@
#!/usr/bin/env python3
import json
import logging
import os
import shutil
import subprocess
from argparse import ArgumentParser
from typing import Dict
from typing import Final
from typing import List
from typing import Optional

from common import get_log_level
from github import ContainerPackage
from github import GithubBranchApi
from github import GithubContainerRegistryApi

logger = logging.getLogger("cleanup-tags")


class DockerManifest2:
    """
    Data class wrapping the Docker Image Manifest Version 2.

    See https://docs.docker.com/registry/spec/manifest-v2-2/
    """

    def __init__(self, data: Dict) -> None:
        self._data = data
        # This is the sha256: digest string. Corresponds to the GitHub API name
        # if the package is an untagged package
        self.digest = self._data["digest"]
        platform_data_os = self._data["platform"]["os"]
        platform_arch = self._data["platform"]["architecture"]
        platform_variant = self._data["platform"].get(
            "variant",
            "",
        )
        self.platform = f"{platform_data_os}/{platform_arch}{platform_variant}"


class RegistryTagsCleaner:
    """
    This is the base class for the image registry cleaning. Given a package
    name, it will keep all images which are tagged and all untagged images
    referred to by a manifest. This results in only images which have been untagged
    and cannot be referenced except by their SHA being removed. None of these
    images should be referenced, so it is fine to delete them.
    """

    def __init__(
        self,
        package_name: str,
        repo_owner: str,
        repo_name: str,
        package_api: GithubContainerRegistryApi,
        branch_api: Optional[GithubBranchApi],
    ):
        self.actually_delete = False
        self.package_api = package_api
        self.branch_api = branch_api
        self.package_name = package_name
        self.repo_owner = repo_owner
        self.repo_name = repo_name
        self.tags_to_delete: List[str] = []
        self.tags_to_keep: List[str] = []

        # Get the information about all versions of the given package
        # These are active, not deleted, the default returned from the API
        self.all_package_versions = self.package_api.get_active_package_versions(
            self.package_name,
        )

        # Get a mapping from a tag like "1.7.0" or "feature-xyz" to the ContainerPackage
        # tagged with it. It makes certain lookups easy
        self.all_pkgs_tags_to_version: Dict[str, ContainerPackage] = {}
        for pkg in self.all_package_versions:
            for tag in pkg.tags:
                self.all_pkgs_tags_to_version[tag] = pkg
        logger.info(
            f"Located {len(self.all_package_versions)} versions of package {self.package_name}",
        )

        self.decide_what_tags_to_keep()

    def clean(self):
        """
        This method will delete image versions, based on the selected tags to delete
        """
        for tag_to_delete in self.tags_to_delete:
            package_version_info = self.all_pkgs_tags_to_version[tag_to_delete]

            if self.actually_delete:
                logger.info(
                    f"Deleting {tag_to_delete} (id {package_version_info.id})",
                )
                self.package_api.delete_package_version(
                    package_version_info,
                )

            else:
                logger.info(
                    f"Would delete {tag_to_delete} (id {package_version_info.id})",
                )
        else:
            logger.info("No tags to delete")

    def clean_untagged(self, is_manifest_image: bool):
        """
        This method will delete untagged images, that is those which are not named. It
        handles if the image tag is actually a manifest, which points to images that look otherwise
        untagged.
        """

        def _clean_untagged_manifest():
            """
            Handles the deletion of untagged images, but where the package is a manifest, ie a multi
            arch image, which means some "untagged" images need to exist still.

            Ok, bear with me, these are annoying.

            Our images are multi-arch, so the manifest is more like a pointer to a sha256 digest.
            These images are untagged, but pointed to, and so should not be removed (or every pull fails).

            So for each image getting kept, parse the manifest to find the digest(s) it points to. Then
            remove those from the list of untagged images. The final result is the untagged, not pointed to
            versions which should be safe to remove.

            Example:
                Tag: ghcr.io/paperless-ngx/paperless-ngx:1.7.1 refers to
                    amd64: sha256:b9ed4f8753bbf5146547671052d7e91f68cdfc9ef049d06690b2bc866fec2690
                    armv7: sha256:81605222df4ba4605a2ba4893276e5d08c511231ead1d5da061410e1bbec05c3
                    arm64: sha256:374cd68db40734b844705bfc38faae84cc4182371de4bebd533a9a365d5e8f3b
                each of which appears as an untagged image, but isn't really.

            So from the list of untagged packages, remove those digests. Once all tags which
            are being kept are checked, the remaining untagged packages are actually untagged
            with no referrals in a manifest to them.
            """
            # Simplify the untagged data, mapping name (which is a digest) to the version
            # At the moment, these are the images which APPEAR untagged.
            untagged_versions = {}
            for x in self.all_package_versions:
                if x.untagged:
                    untagged_versions[x.name] = x

            skips = 0

            # Parse manifests to locate digests pointed to
            for tag in sorted(self.tags_to_keep):
                full_name = f"ghcr.io/{self.repo_owner}/{self.package_name}:{tag}"
                logger.info(f"Checking manifest for {full_name}")
                try:
                    proc = subprocess.run(
                        [
                            shutil.which("docker"),
                            "manifest",
                            "inspect",
                            full_name,
                        ],
                        capture_output=True,
                    )

                    manifest_list = json.loads(proc.stdout)
                    for manifest_data in manifest_list["manifests"]:
                        manifest = DockerManifest2(manifest_data)

                        if manifest.digest in untagged_versions:
                            logger.info(
                                f"Skipping deletion of {manifest.digest},"
                                f" referred to by {full_name}"
                                f" for {manifest.platform}",
                            )
                            del untagged_versions[manifest.digest]
                            skips += 1

                except Exception as err:
                    self.actually_delete = False
                    logger.exception(err)
                    return

            logger.info(
                f"Skipping deletion of {skips} packages referred to by a manifest",
            )

            # Delete the untagged and not pointed at packages
            logger.info(f"Deleting untagged packages of {self.package_name}")
            for to_delete_name in untagged_versions:
                to_delete_version = untagged_versions[to_delete_name]

                if self.actually_delete:
                    logger.info(
                        f"Deleting id {to_delete_version.id} named {to_delete_version.name}",
                    )
                    self.package_api.delete_package_version(
                        to_delete_version,
                    )
                else:
                    logger.info(
                        f"Would delete {to_delete_name} (id {to_delete_version.id})",
                    )

        def _clean_untagged_non_manifest():
            """
            If the package is not a multi-arch manifest, images without tags are safe to delete.
            """

            for package in self.all_package_versions:
                if package.untagged:
                    if self.actually_delete:
                        logger.info(
                            f"Deleting id {package.id} named {package.name}",
                        )
                        self.package_api.delete_package_version(
                            package,
                        )
                    else:
                        logger.info(
                            f"Would delete {package.name} (id {package.id})",
                        )
                else:
                    logger.info(
                        f"Not deleting tag {package.tags[0]} of package {self.package_name}",
                    )

        logger.info("Beginning untagged image cleaning")

        if is_manifest_image:
            _clean_untagged_manifest()
        else:
            _clean_untagged_non_manifest()

    def decide_what_tags_to_keep(self):
        """
        This method holds the logic to decide what tags to keep and therefore
        what tags to delete.

        By default, any image with at least 1 tag will be kept
        """
        # By default, keep anything which is tagged
        self.tags_to_keep = list(set(self.all_pkgs_tags_to_version.keys()))


class MainImageTagsCleaner(RegistryTagsCleaner):
    def decide_what_tags_to_keep(self):
        """
        Overrides the default logic for deciding what images to keep. Images tagged as "feature-"
        will be removed, if the corresponding branch no longer exists.
        """

        # Default to everything gets kept still
        super().decide_what_tags_to_keep()

        # Locate the feature branches
        feature_branches = {}
        for branch in self.branch_api.get_branches(
            repo=self.repo_name,
        ):
            if branch.name.startswith("feature-"):
                logger.debug(f"Found feature branch {branch.name}")
                feature_branches[branch.name] = branch

        logger.info(f"Located {len(feature_branches)} feature branches")

        if not len(feature_branches):
            # Our work here is done, delete nothing
            return

        # Filter to packages which are tagged with feature-*
        packages_tagged_feature: List[ContainerPackage] = []
        for package in self.all_package_versions:
            if package.tag_matches("feature-"):
                packages_tagged_feature.append(package)

        # Map tags like "feature-xyz" to a ContainerPackage
        feature_pkgs_tags_to_versions: Dict[str, ContainerPackage] = {}
        for pkg in packages_tagged_feature:
            for tag in pkg.tags:
                feature_pkgs_tags_to_versions[tag] = pkg

        logger.info(
            f'Located {len(feature_pkgs_tags_to_versions)} versions of package {self.package_name} tagged "feature-"',
        )

        # All the feature tags minus all the feature branches leaves us feature tags
        # with no corresponding branch
        self.tags_to_delete = list(
            set(feature_pkgs_tags_to_versions.keys()) - set(feature_branches.keys()),
        )

        # All the tags minus the set of going to be deleted tags leaves us the
        # tags which will be kept around
        self.tags_to_keep = list(
            set(self.all_pkgs_tags_to_version.keys()) - set(self.tags_to_delete),
        )
        logger.info(
            f"Located {len(self.tags_to_delete)} versions of package {self.package_name} to delete",
        )


class LibraryTagsCleaner(RegistryTagsCleaner):
    """
    Exists for the off chance that someday, the installer library images
    will need their own logic
    """

    pass


def _main():
    parser = ArgumentParser(
        description="Using the GitHub API, locate and optionally delete container"
        " tags which no longer have an associated feature branch",
    )

    # Requires an affirmative command to actually do a delete
    parser.add_argument(
        "--delete",
        action="store_true",
        default=False,
        help="If provided, actually delete the container tags",
    )

    # When a tagged image is updated, the previous version remains, but it is no longer tagged
    # Add this option to remove them as well
    parser.add_argument(
        "--untagged",
        action="store_true",
        default=False,
        help="If provided, delete untagged containers as well",
    )

    # If given, the package is assumed to be a multi-arch manifest. Cache packages are
    # not multi-arch, all other types are
    parser.add_argument(
        "--is-manifest",
        action="store_true",
        default=False,
        help="If provided, the package is assumed to be a multi-arch manifest following schema v2",
    )

    # Allows configuration of log level for debugging
    parser.add_argument(
        "--loglevel",
        default="info",
        help="Configures the logging level",
    )

    # Get the name of the package being processed this round
    parser.add_argument(
        "package",
        help="The package to process",
    )

    args = parser.parse_args()

    logging.basicConfig(
        level=get_log_level(args),
        datefmt="%Y-%m-%d %H:%M:%S",
        format="%(asctime)s %(levelname)-8s %(message)s",
    )

    # Must be provided in the environment
    repo_owner: Final[str] = os.environ["GITHUB_REPOSITORY_OWNER"]
    repo: Final[str] = os.environ["GITHUB_REPOSITORY"]
    gh_token: Final[str] = os.environ["TOKEN"]

    # Find all branches named feature-*
    # Note: Only relevant to the main application, but simpler to
    # leave in for all packages
    with GithubBranchApi(gh_token) as branch_api:
        with GithubContainerRegistryApi(gh_token, repo_owner) as container_api:
            if args.package in {"paperless-ngx", "paperless-ngx/builder/cache/app"}:
                cleaner = MainImageTagsCleaner(
                    args.package,
                    repo_owner,
                    repo,
                    container_api,
                    branch_api,
                )
            else:
                cleaner = LibraryTagsCleaner(
                    args.package,
                    repo_owner,
                    repo,
                    container_api,
                    None,
                )

            # Set if actually doing a delete vs dry run
            cleaner.actually_delete = args.delete

            # Clean images with tags
            cleaner.clean()

            # Clean images which are untagged
            cleaner.clean_untagged(args.is_manifest)


if __name__ == "__main__":
    _main()
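Going by the argparse options and environment reads in the script above, a dry run could be driven like the following sketch; the owner, repository and token values are illustrative placeholders, and without --delete the script only logs what it would remove:

# Illustrative dry-run of the cleanup script above; values are placeholders.
import os
import subprocess

env = {
    **os.environ,
    "GITHUB_REPOSITORY_OWNER": "paperless-ngx",
    "GITHUB_REPOSITORY": "paperless-ngx/paperless-ngx",
    "TOKEN": "ghp_xxx",  # placeholder token, not a real credential
}
# No --delete flag: the script reports what *would* be deleted and exits.
subprocess.run(
    [
        "python",
        ".github/scripts/cleanup-tags.py",
        "--untagged",
        "--is-manifest",
        "--loglevel",
        "debug",
        "paperless-ngx",
    ],
    env=env,
    check=True,
)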
48  .github/scripts/common.py  vendored  Normal file
@@ -0,0 +1,48 @@
#!/usr/bin/env python3
import logging


def get_image_tag(
    repo_name: str,
    pkg_name: str,
    pkg_version: str,
) -> str:
    """
    Returns a string representing the normal image for a given package
    """
    return f"ghcr.io/{repo_name.lower()}/builder/{pkg_name}:{pkg_version}"


def get_cache_image_tag(
    repo_name: str,
    pkg_name: str,
    pkg_version: str,
    branch_name: str,
) -> str:
    """
    Returns a string representing the expected image cache tag for a given package

    Registry type caching is utilized for the builder images, to allow fast
    rebuilds, generally almost instant for the same version
    """
    return f"ghcr.io/{repo_name.lower()}/builder/cache/{pkg_name}:{pkg_version}"


def get_log_level(args) -> int:
    """
    Returns a logging level based on the parsed --loglevel argument,
    defaulting to INFO for unknown values
    """
    levels = {
        "critical": logging.CRITICAL,
        "error": logging.ERROR,
        "warn": logging.WARNING,
        "warning": logging.WARNING,
        "info": logging.INFO,
        "debug": logging.DEBUG,
    }
    level = levels.get(args.loglevel.lower())
    if level is None:
        level = logging.INFO
    return level
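To illustrate the two tag helpers above with made-up inputs (note that branch_name is accepted but does not appear in the returned tag):

# Quick illustration of the helpers above, with made-up inputs.
from common import get_cache_image_tag
from common import get_image_tag

print(get_image_tag("paperless-ngx/paperless-ngx", "qpdf", "11.1.1"))
# -> ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:11.1.1

# branch_name is a parameter of the function but unused in the result.
print(get_cache_image_tag("paperless-ngx/paperless-ngx", "qpdf", "11.1.1", "dev"))
# -> ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:11.1.1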
92  .github/scripts/get-build-json.py  vendored  Executable file
@@ -0,0 +1,92 @@
#!/usr/bin/env python3
"""
This is a helper script for the multi-stage Docker image builder.
It provides a single point of configuration for package version control.
The output JSON object is used by the CI workflow to determine what versions
to build and pull into the final Docker image.

Python package information is obtained from the Pipfile.lock. As this is
kept updated by dependabot, it usually will need no further configuration.
The sole exception currently is pikepdf, which has a dependency on qpdf,
and is configured here to use the latest version of qpdf built by the workflow.

Other package version information is configured directly below, generally by
setting the version and Git information, if any.
"""
import argparse
import json
import os
from pathlib import Path
from typing import Final

from common import get_cache_image_tag
from common import get_image_tag


def _main():
    parser = argparse.ArgumentParser(
        description="Generate a JSON object of information required to build the given package, based on the Pipfile.lock",
    )
    parser.add_argument(
        "package",
        help="The name of the package to generate JSON for",
    )

    PIPFILE_LOCK_PATH: Final[Path] = Path("Pipfile.lock")
    BUILD_CONFIG_PATH: Final[Path] = Path(".build-config.json")

    # Read the main config file
    build_json: Final = json.loads(BUILD_CONFIG_PATH.read_text())

    # Read the Pipfile.lock file
    pipfile_data: Final = json.loads(PIPFILE_LOCK_PATH.read_text())

    args: Final = parser.parse_args()

    # Read from environment variables set by GitHub Actions
    repo_name: Final[str] = os.environ["GITHUB_REPOSITORY"]
    branch_name: Final[str] = os.environ["GITHUB_REF_NAME"]

    # Default output values
    version = None
    extra_config = {}

    if args.package in pipfile_data["default"]:
        # Read the version from Pipfile.lock
        pkg_data = pipfile_data["default"][args.package]
        pkg_version = pkg_data["version"].split("==")[-1]
        version = pkg_version

        # Any extra/special values needed
        if args.package == "pikepdf":
            extra_config["qpdf_version"] = build_json["qpdf"]["version"]

    elif args.package in build_json:
        version = build_json[args.package]["version"]

    else:
        raise NotImplementedError(args.package)

    # The JSON object we'll output
    output = {
        "name": args.package,
        "version": version,
        "image_tag": get_image_tag(repo_name, args.package, version),
        "cache_tag": get_cache_image_tag(
            repo_name,
            args.package,
            version,
            branch_name,
        ),
    }

    # Add anything special a package may need
    output.update(extra_config)

    # Output the JSON info to stdout
    print(json.dumps(output))


if __name__ == "__main__":
    _main()
|
274
.github/scripts/github.py
vendored
Normal file
@@ -0,0 +1,274 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
This module contains some useful classes for interacting with the Github API.
|
||||
The full documentation for the API can be found here: https://docs.github.com/en/rest
|
||||
|
||||
Mostly, this focusses on two areas, repo branches and repo packages, as the use case
|
||||
is cleaning up container images which are no longer referred to.
|
||||
|
||||
"""
|
||||
import functools
|
||||
import logging
|
||||
import re
|
||||
import urllib.parse
|
||||
from typing import Dict
|
||||
from typing import List
|
||||
from typing import Optional
|
||||
|
||||
import httpx
|
||||
|
||||
logger = logging.getLogger("github-api")
|
||||
|
||||
|
||||
class _GithubApiBase:
|
||||
"""
|
||||
A base class for interacting with the Github API. It
|
||||
will handle the session and setting authorization headers.
|
||||
"""
|
||||
|
||||
def __init__(self, token: str) -> None:
|
||||
self._token = token
|
||||
self._client: Optional[httpx.Client] = None
|
||||
|
||||
def __enter__(self) -> "_GithubApiBase":
|
||||
"""
|
||||
Sets up the required headers for auth and response
|
||||
type from the API
|
||||
"""
|
||||
self._client = httpx.Client()
|
||||
self._client.headers.update(
|
||||
{
|
||||
"Accept": "application/vnd.github.v3+json",
|
||||
"Authorization": f"token {self._token}",
|
||||
},
|
||||
)
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
"""
|
||||
Ensures the authorization token is cleaned up no matter
|
||||
the reason for the exit
|
||||
"""
|
||||
if "Accept" in self._client.headers:
|
||||
del self._client.headers["Accept"]
|
||||
if "Authorization" in self._client.headers:
|
||||
del self._client.headers["Authorization"]
|
||||
|
||||
# Close the session as well
|
||||
self._client.close()
|
||||
self._client = None
|
||||
|
||||
def _read_all_pages(self, endpoint):
|
||||
"""
|
||||
Helper function to read all pages of an endpoint, utilizing the
|
||||
next.url until exhausted. Assumes the endpoint returns a list
|
||||
"""
|
||||
internal_data = []
|
||||
|
||||
while True:
|
||||
resp = self._client.get(endpoint)
|
||||
if resp.status_code == 200:
|
||||
internal_data += resp.json()
|
||||
if "next" in resp.links:
|
||||
endpoint = resp.links["next"]["url"]
|
||||
else:
|
||||
logger.debug("Exiting pagination loop")
|
||||
break
|
||||
else:
|
||||
logger.warning(f"Request to {endpoint} return HTTP {resp.status_code}")
|
||||
resp.raise_for_status()
|
||||
|
||||
return internal_data
|
||||
|
||||
|
||||
class _EndpointResponse:
|
||||
"""
|
||||
For all endpoint JSON responses, store the full
|
||||
response data, for ease of extending later, if need be.
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
self._data = data
|
||||
|
||||
|
||||
class GithubBranch(_EndpointResponse):
|
||||
"""
|
||||
Simple wrapper for a repository branch, only extracts name information
|
||||
for now.
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
super().__init__(data)
|
||||
self.name = self._data["name"]
|
||||
|
||||
|
||||
class GithubBranchApi(_GithubApiBase):
|
||||
"""
|
||||
Wrapper around branch API.
|
||||
|
||||
See https://docs.github.com/en/rest/branches/branches
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, token: str) -> None:
|
||||
super().__init__(token)
|
||||
|
||||
self._ENDPOINT = "https://api.github.com/repos/{REPO}/branches"
|
||||
|
||||
def get_branches(self, repo: str) -> List[GithubBranch]:
|
||||
"""
|
||||
Returns all current branches of the given repository owned by the given
|
||||
owner or organization.
|
||||
"""
|
||||
# The environment GITHUB_REPOSITORY already contains the owner in the correct location
|
||||
endpoint = self._ENDPOINT.format(REPO=repo)
|
||||
internal_data = self._read_all_pages(endpoint)
|
||||
return [GithubBranch(branch) for branch in internal_data]
|
||||
|
||||
|
||||
class ContainerPackage(_EndpointResponse):
|
||||
"""
|
||||
Data class wrapping the JSON response from the package related
|
||||
endpoints
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict):
|
||||
super().__init__(data)
|
||||
# This is a numerical ID, required for interactions with this
|
||||
# specific package, including deletion of it or restoration
|
||||
self.id: int = self._data["id"]
|
||||
|
||||
# A string name. This might be an actual name or it could be a
|
||||
# digest string like "sha256:"
|
||||
self.name: str = self._data["name"]
|
||||
|
||||
# URL to the package, including its ID, can be used for deletion
|
||||
# or restoration without needing to build up a URL ourselves
|
||||
self.url: str = self._data["url"]
|
||||
|
||||
# The list of tags applied to this image. Maybe an empty list
|
||||
self.tags: List[str] = self._data["metadata"]["container"]["tags"]
|
||||
|
||||
@functools.cached_property
|
||||
def untagged(self) -> bool:
|
||||
"""
|
||||
Returns True if the image has no tags applied to it, False otherwise
|
||||
"""
|
||||
return len(self.tags) == 0
|
||||
|
||||
@functools.cache
|
||||
def tag_matches(self, pattern: str) -> bool:
|
||||
"""
|
||||
Returns True if the image has at least one tag which matches the given regex,
|
||||
False otherwise
|
||||
"""
|
||||
for tag in self.tags:
|
||||
if re.match(pattern, tag) is not None:
|
||||
return True
|
||||
return False
|
||||
|
||||
def __repr__(self):
|
||||
return f"Package {self.name}"
|
||||
|
||||
|
||||
class GithubContainerRegistryApi(_GithubApiBase):
|
||||
"""
|
||||
Class wrapper to deal with the Github packages API. This class only deals with
|
||||
container type packages, the only type published by paperless-ngx.
|
||||
"""
|
||||
|
||||
def __init__(self, token: str, owner_or_org: str) -> None:
|
||||
super().__init__(token)
|
||||
self._owner_or_org = owner_or_org
|
||||
if self._owner_or_org == "paperless-ngx":
|
||||
# https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-an-organization
|
||||
self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
|
||||
# https://docs.github.com/en/rest/packages#delete-package-version-for-an-organization
|
||||
self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
|
||||
else:
|
||||
# https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-the-authenticated-user
|
||||
self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
|
||||
# https://docs.github.com/en/rest/packages#delete-a-package-version-for-the-authenticated-user
|
||||
self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
|
||||
self._PACKAGE_VERSION_RESTORE_ENDPOINT = (
|
||||
f"{self._PACKAGE_VERSION_DELETE_ENDPOINT}/restore"
|
||||
)
|
||||
|
||||
def get_active_package_versions(
|
||||
self,
|
||||
package_name: str,
|
||||
) -> List[ContainerPackage]:
|
||||
"""
|
||||
Returns all the versions of a given package (container images) from
|
||||
the API
|
||||
"""
|
||||
|
||||
package_type: str = "container"
|
||||
# Need to quote this for slashes in the name
|
||||
package_name = urllib.parse.quote(package_name, safe="")
|
||||
|
||||
endpoint = self._PACKAGES_VERSIONS_ENDPOINT.format(
|
||||
ORG=self._owner_or_org,
|
||||
PACKAGE_TYPE=package_type,
|
||||
PACKAGE_NAME=package_name,
|
||||
)
|
||||
|
||||
pkgs = []
|
||||
|
||||
for data in self._read_all_pages(endpoint):
|
||||
pkgs.append(ContainerPackage(data))
|
||||
|
||||
return pkgs
|
||||
|
||||
def get_deleted_package_versions(
|
||||
self,
|
||||
package_name: str,
|
||||
) -> List[ContainerPackage]:
|
||||
package_type: str = "container"
|
||||
# Need to quote this for slashes in the name
|
||||
package_name = urllib.parse.quote(package_name, safe="")
|
||||
|
||||
endpoint = (
|
||||
self._PACKAGES_VERSIONS_ENDPOINT.format(
|
||||
ORG=self._owner_or_org,
|
||||
PACKAGE_TYPE=package_type,
|
||||
PACKAGE_NAME=package_name,
|
||||
)
|
||||
+ "?state=deleted"
|
||||
)
|
||||
|
||||
pkgs = []
|
||||
|
||||
for data in self._read_all_pages(endpoint):
|
||||
pkgs.append(ContainerPackage(data))
|
||||
|
||||
return pkgs
|
||||
|
||||
def delete_package_version(self, package_data: ContainerPackage):
|
||||
"""
|
||||
Deletes the given package version from the GHCR
|
||||
"""
|
||||
resp = self._client.delete(package_data.url)
|
||||
if resp.status_code != 204:
|
||||
logger.warning(
|
||||
f"Request to delete {package_data.url} returned HTTP {resp.status_code}",
|
||||
)
|
||||
|
||||
def restore_package_version(
|
||||
self,
|
||||
package_name: str,
|
||||
package_data: ContainerPackage,
|
||||
):
|
||||
package_type: str = "container"
|
||||
endpoint = self._PACKAGE_VERSION_RESTORE_ENDPOINT.format(
|
||||
ORG=self._owner_or_org,
|
||||
PACKAGE_TYPE=package_type,
|
||||
PACKAGE_NAME=package_name,
|
||||
PACKAGE_VERSION_ID=package_data.id,
|
||||
)
|
||||
|
||||
resp = self._client.post(endpoint)
|
||||
if resp.status_code != 204:
|
||||
logger.warning(
|
||||
f"Request to delete {endpoint} returned HTTP {resp.status_code}",
|
||||
)
|
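A short, hedged sketch of driving these wrappers directly; it assumes a token with the appropriate packages/repo scopes is available in the TOKEN environment variable, and uses the context-manager entry the base class requires:

# Hedged usage sketch of the API wrappers above; TOKEN is assumed to be set.
import os

from github import GithubBranchApi
from github import GithubContainerRegistryApi

token = os.environ["TOKEN"]

with GithubBranchApi(token) as branch_api:
    branches = branch_api.get_branches("paperless-ngx/paperless-ngx")
    print(f"{len(branches)} branches")

with GithubContainerRegistryApi(token, "paperless-ngx") as registry_api:
    for pkg in registry_api.get_active_package_versions("paperless-ngx"):
        if pkg.untagged:
            print(f"untagged digest: {pkg.name}")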
23  .github/stale.yml  vendored  Normal file
@@ -0,0 +1,23 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 30

# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7

# Only issues or pull requests with all of these labels are checked for staleness. Defaults to `[]` (disabled)
onlyLabels: [cant-reproduce]

# Label to use when marking an issue as stale
staleLabel: stale

# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs. Thank you
  for your contributions.

# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false

# See https://github.com/marketplace/stale for more info on the app
# and https://github.com/probot/stale for the configuration docs
556
.github/workflows/ci.yml
vendored
@@ -13,314 +13,257 @@ on:
|
||||
branches-ignore:
|
||||
- 'translations**'
|
||||
|
||||
env:
|
||||
# This is the version of pipenv all the steps will use
|
||||
# If changing this, change Dockerfile
|
||||
DEFAULT_PIP_ENV_VERSION: "2023.12.1"
|
||||
# This is the default version of Python to use in most steps which aren't specific
|
||||
DEFAULT_PYTHON_VERSION: "3.10"
|
||||
|
||||
jobs:
|
||||
pre-commit:
|
||||
# We want to run on external PRs, but not on our own internal PRs as they'll be run
|
||||
# by the push to the branch. Without this if check, checks are duplicated since
|
||||
# internal PRs match both the push and pull_request events.
|
||||
if:
|
||||
github.event_name == 'push' || github.event.pull_request.head.repo.full_name !=
|
||||
github.repository
|
||||
|
||||
name: Linting Checks
|
||||
runs-on: ubuntu-22.04
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
-
|
||||
name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
uses: actions/checkout@v3
|
||||
|
||||
-
|
||||
name: Install python
|
||||
uses: actions/setup-python@v5
|
||||
name: Install tools
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
|
||||
python-version: "3.9"
|
||||
|
||||
-
|
||||
name: Check files
|
||||
uses: pre-commit/action@v3.0.1
|
||||
uses: pre-commit/action@v3.0.0
|
||||
|
||||
documentation:
|
||||
name: "Build & Deploy Documentation"
|
||||
runs-on: ubuntu-22.04
|
||||
name: "Build Documentation"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v4
|
||||
-
|
||||
name: Set up Python
|
||||
id: setup-python
|
||||
uses: actions/setup-python@v5
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pip install --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }}
|
||||
pipx install pipenv==2022.10.12
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: 3.9
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
-
|
||||
name: Install dependencies
|
||||
run: |
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: List installed Python dependencies
|
||||
run: |
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} run pip list
|
||||
pipenv run pip list
|
||||
-
|
||||
name: Make documentation
|
||||
run: |
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} run mkdocs build --config-file ./mkdocs.yml
|
||||
-
|
||||
name: Deploy documentation
|
||||
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
|
||||
run: |
|
||||
echo "docs.paperless-ngx.com" > "${{ github.workspace }}/docs/CNAME"
|
||||
git config --global user.name "${{ github.actor }}"
|
||||
git config --global user.email "${{ github.actor }}@users.noreply.github.com"
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} run mkdocs gh-deploy --force --no-history
|
||||
cd docs/
|
||||
pipenv run make html
|
||||
-
|
||||
name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: documentation
|
||||
path: site/
|
||||
retention-days: 7
|
||||
path: docs/_build/html/
|
||||
|
||||
tests-backend:
|
||||
name: "Backend Tests (Python ${{ matrix.python-version }})"
|
||||
runs-on: ubuntu-22.04
|
||||
name: "Tests (${{ matrix.python-version }})"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
strategy:
|
||||
matrix:
|
||||
python-version: ['3.9', '3.10', '3.11']
|
||||
python-version: ['3.8', '3.9', '3.10']
|
||||
fail-fast: false
|
||||
services:
|
||||
tika:
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
ports:
|
||||
- "9998:9998/tcp"
|
||||
gotenberg:
|
||||
image: docker.io/gotenberg/gotenberg:7.6
|
||||
ports:
|
||||
- "3000:3000/tcp"
|
||||
env:
|
||||
# Enable Tika end to end testing
|
||||
TIKA_LIVE: 1
|
||||
# Enable paperless_mail testing against real server
|
||||
PAPERLESS_MAIL_TEST_HOST: ${{ secrets.TEST_MAIL_HOST }}
|
||||
PAPERLESS_MAIL_TEST_USER: ${{ secrets.TEST_MAIL_USER }}
|
||||
PAPERLESS_MAIL_TEST_PASSWD: ${{ secrets.TEST_MAIL_PASSWD }}
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v4
|
||||
uses: actions/checkout@v3
|
||||
with:
|
||||
fetch-depth: 0
|
||||
-
|
||||
name: Start containers
|
||||
name: Install pipenv
|
||||
run: |
|
||||
docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml pull --quiet
|
||||
docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml up --detach
|
||||
pipx install pipenv==2022.10.12
|
||||
-
|
||||
name: Set up Python
|
||||
id: setup-python
|
||||
uses: actions/setup-python@v5
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "${{ matrix.python-version }}"
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pip install --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }}
|
||||
-
|
||||
name: Install system dependencies
|
||||
run: |
|
||||
sudo apt-get update -qq
|
||||
sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript libzbar0 poppler-utils
|
||||
-
|
||||
name: Configure ImageMagick
|
||||
run: |
|
||||
sudo cp docker/imagemagick-policy.xml /etc/ImageMagick-6/policy.xml
|
||||
-
|
||||
name: Install Python dependencies
|
||||
run: |
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} run python --version
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: List installed Python dependencies
|
||||
run: |
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} run pip list
|
||||
pipenv run pip list
|
||||
-
|
||||
name: Tests
|
||||
env:
|
||||
PAPERLESS_CI_TEST: 1
|
||||
# Enable paperless_mail testing against real server
|
||||
PAPERLESS_MAIL_TEST_HOST: ${{ secrets.TEST_MAIL_HOST }}
|
||||
PAPERLESS_MAIL_TEST_USER: ${{ secrets.TEST_MAIL_USER }}
|
||||
PAPERLESS_MAIL_TEST_PASSWD: ${{ secrets.TEST_MAIL_PASSWD }}
|
||||
run: |
|
||||
cd src/
|
||||
pipenv --python ${{ steps.setup-python.outputs.python-version }} run pytest -ra
|
||||
pipenv run pytest -rfEp
|
||||
-
|
||||
name: Upload coverage
|
||||
if: ${{ matrix.python-version == env.DEFAULT_PYTHON_VERSION }}
|
||||
uses: actions/upload-artifact@v4
|
||||
name: Get changed files
|
||||
id: changed-files-specific
|
||||
uses: tj-actions/changed-files@v34
|
||||
with:
|
||||
name: backend-coverage-report
|
||||
path: src/coverage.xml
|
||||
retention-days: 7
|
||||
if-no-files-found: warn
|
||||
files: |
|
||||
src/**
|
||||
-
|
||||
name: Stop containers
|
||||
if: always()
|
||||
name: List all changed files
|
||||
run: |
|
||||
docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml logs
|
||||
docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml down
|
||||
|
||||
install-frontend-depedendencies:
|
||||
name: "Install Frontend Dependencies"
|
||||
runs-on: ubuntu-22.04
|
||||
needs:
|
||||
- pre-commit
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
for file in ${{ steps.changed-files-specific.outputs.all_changed_files }}; do
|
||||
echo "${file} was changed"
|
||||
done
|
||||
-
|
||||
name: Use Node.js 20
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 20.x
|
||||
cache: 'npm'
|
||||
cache-dependency-path: 'src-ui/package-lock.json'
|
||||
- name: Cache frontend dependencies
|
||||
id: cache-frontend-deps
|
||||
uses: actions/cache@v4
|
||||
with:
|
||||
path: |
|
||||
~/.npm
|
||||
~/.cache
|
||||
key: ${{ runner.os }}-frontenddeps-${{ hashFiles('src-ui/package-lock.json') }}
|
||||
-
|
||||
name: Install dependencies
|
||||
if: steps.cache-frontend-deps.outputs.cache-hit != 'true'
|
||||
run: cd src-ui && npm ci
|
||||
-
|
||||
name: Install Playwright
|
||||
if: steps.cache-frontend-deps.outputs.cache-hit != 'true'
|
||||
run: cd src-ui && npx playwright install --with-deps
|
||||
name: Publish coverage results
|
||||
if: matrix.python-version == '3.9' && steps.changed-files-specific.outputs.any_changed == 'true'
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
# https://github.com/coveralls-clients/coveralls-python/issues/251
|
||||
run: |
|
||||
cd src/
|
||||
pipenv run coveralls --service=github
|
||||
|
||||
tests-frontend:
|
||||
name: "Frontend Tests (Node ${{ matrix.node-version }} - ${{ matrix.shard-index }}/${{ matrix.shard-count }})"
|
||||
runs-on: ubuntu-22.04
|
||||
name: "Tests Frontend"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- install-frontend-depedendencies
|
||||
- pre-commit
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
node-version: [20.x]
|
||||
shard-index: [1, 2, 3, 4]
|
||||
shard-count: [4]
|
||||
node-version: [16.x]
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: actions/checkout@v3
|
||||
-
|
||||
name: Use Node.js 20
|
||||
uses: actions/setup-node@v4
|
||||
name: Use Node.js ${{ matrix.node-version }}
|
||||
uses: actions/setup-node@v3
|
||||
with:
|
||||
node-version: 20.x
|
||||
cache: 'npm'
|
||||
cache-dependency-path: 'src-ui/package-lock.json'
|
||||
- name: Cache frontend dependencies
|
||||
id: cache-frontend-deps
|
||||
uses: actions/cache@v4
|
||||
with:
|
||||
path: |
|
||||
~/.npm
|
||||
~/.cache
|
||||
key: ${{ runner.os }}-frontenddeps-${{ hashFiles('src-ui/package-lock.json') }}
|
||||
- name: Re-link Angular cli
|
||||
run: cd src-ui && npm link @angular/cli
|
||||
-
|
||||
name: Linting checks
|
||||
run: cd src-ui && npm run lint
|
||||
-
|
||||
name: Run Jest unit tests
|
||||
run: cd src-ui && npm run test -- --max-workers=2 --shard=${{ matrix.shard-index }}/${{ matrix.shard-count }}
|
||||
-
|
||||
name: Upload Jest coverage
|
||||
if: always()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: jest-coverage-report-${{ matrix.shard-index }}
|
||||
path: |
|
||||
src-ui/coverage/coverage-final.json
|
||||
src-ui/coverage/lcov.info
|
||||
src-ui/coverage/clover.xml
|
||||
retention-days: 7
|
||||
if-no-files-found: warn
|
||||
-
|
||||
name: Run Playwright e2e tests
|
||||
run: cd src-ui && npx playwright test --shard ${{ matrix.shard-index }}/${{ matrix.shard-count }}
|
||||
-
|
||||
name: Upload Playwright test results
|
||||
if: always()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: playwright-report-${{ matrix.shard-index }}
|
||||
path: src-ui/playwright-report
|
||||
retention-days: 7
|
||||
node-version: ${{ matrix.node-version }}
|
||||
- run: cd src-ui && npm ci
|
||||
- run: cd src-ui && npm run test
|
||||
- run: cd src-ui && npm run e2e:ci
|
||||
|
||||
tests-coverage-upload:
|
||||
name: "Upload Coverage"
|
||||
runs-on: ubuntu-22.04
|
||||
prepare-docker-build:
|
||||
name: Prepare Docker Pipeline Data
|
||||
if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
|
||||
runs-on: ubuntu-20.04
|
||||
# If the push triggered the installer library workflow, wait for it to
|
||||
# complete here. This ensures the required versions for the final
|
||||
# image have been built, while not waiting at all if the versions haven't changed
|
||||
concurrency:
|
||||
group: build-installer-library
|
||||
cancel-in-progress: false
|
||||
needs:
|
||||
- documentation
|
||||
- tests-backend
|
||||
- tests-frontend
|
||||
steps:
|
||||
-
|
||||
uses: actions/checkout@v4
|
||||
name: Set ghcr repository name
|
||||
id: set-ghcr-repository
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo "repository=${ghcr_name}" >> $GITHUB_OUTPUT
|
||||
-
|
||||
name: Download frontend jest coverage
|
||||
uses: actions/download-artifact@v4
|
||||
with:
|
||||
path: src-ui/coverage/
|
||||
pattern: jest-coverage-report-*
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Download frontend playwright coverage
|
||||
uses: actions/download-artifact@v4
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
path: src-ui/coverage/
|
||||
pattern: playwright-report-*
|
||||
merge-multiple: true
|
||||
python-version: "3.9"
|
||||
-
|
||||
name: Upload frontend coverage to Codecov
|
||||
uses: codecov/codecov-action@v4
|
||||
with:
|
||||
# not required for public repos, but intermittently fails otherwise
|
||||
token: ${{ secrets.CODECOV_TOKEN }}
|
||||
flags: frontend
|
||||
directory: src-ui/coverage/
|
||||
# dont include backend coverage files here
|
||||
files: '!coverage.xml'
|
||||
-
|
||||
name: Download backend coverage
|
||||
uses: actions/download-artifact@v4
|
||||
with:
|
||||
name: backend-coverage-report
|
||||
path: src/
|
||||
-
|
||||
name: Upload coverage to Codecov
|
||||
uses: codecov/codecov-action@v4
|
||||
with:
|
||||
# not required for public repos, but intermittently fails otherwise
|
||||
token: ${{ secrets.CODECOV_TOKEN }}
|
||||
# future expansion
|
||||
flags: backend
|
||||
directory: src/
|
||||
        name: Setup qpdf image
        id: qpdf-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)

          echo ${build_json}

          echo "qpdf-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup psycopg2 image
        id: psycopg2-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)

          echo ${build_json}

          echo "psycopg2-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup pikepdf image
        id: pikepdf-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)

          echo ${build_json}

          echo "pikepdf-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup jbig2enc image
        id: jbig2enc-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)

          echo ${build_json}

          echo "jbig2enc-json=${build_json}" >> $GITHUB_OUTPUT

    outputs:

      ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}

      qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}

      pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}

      psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}

      jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}

  # build and push image to docker hub.
  build-docker-image:
    name: Build Docker image for ${{ github.ref_name }}
    runs-on: ubuntu-22.04
    if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || startsWith(github.ref, 'refs/heads/fix-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
    runs-on: ubuntu-20.04
    concurrency:
      group: ${{ github.workflow }}-build-docker-image-${{ github.ref_name }}
      cancel-in-progress: true
    needs:
      - tests-backend
      - tests-frontend
      - prepare-docker-build
    steps:
      -
        name: Check pushing to Docker Hub
        id: push-other-places
        id: docker-hub
        # Only push to Dockerhub from the main repo AND the ref is either:
        #  main
        #  dev
@@ -328,29 +271,21 @@ jobs:
        #  a tag
        # Otherwise forks would require a Docker Hub account and secrets setup
        run: |
          if [[ ${{ github.repository_owner }} == "paperless-ngx" && ( ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
          if [[ ${{ needs.prepare-docker-build.outputs.ghcr-repository }} == "paperless-ngx/paperless-ngx" && ( ${{ github.ref_name }} == "main" || ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
            echo "Enabling DockerHub image push"
            echo "enable=true" >> $GITHUB_OUTPUT
          else
            echo "Not pushing to DockerHub"
            echo "enable=false" >> $GITHUB_OUTPUT
          fi
      -
        name: Set ghcr repository name
        id: set-ghcr-repository
        run: |
          ghcr_name=$(echo "${{ github.repository }}" | awk '{ print tolower($0) }')
          echo "Name is ${ghcr_name}"
          echo "ghcr-repository=${ghcr_name}" >> $GITHUB_OUTPUT
      -
        name: Gather Docker metadata
        id: docker-meta
        uses: docker/metadata-action@v5
        uses: docker/metadata-action@v4
        with:
          images: |
            ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}
            name=paperlessngx/paperless-ngx,enable=${{ steps.push-other-places.outputs.enable }}
            name=quay.io/paperlessngx/paperless-ngx,enable=${{ steps.push-other-places.outputs.enable }}
            ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}
            name=paperlessngx/paperless-ngx,enable=${{ steps.docker-hub.outputs.enable }}
          tags: |
            # Tag branches with branch name
            type=ref,event=branch
@@ -360,59 +295,51 @@ jobs:
            type=semver,pattern={{major}}.{{minor}}
      -
        name: Checkout
        uses: actions/checkout@v4
        # If https://github.com/docker/buildx/issues/1044 is resolved,
        # the append input with a native arm64 arch could be used to
        # significantly speed up building
        uses: actions/checkout@v3
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        uses: docker/setup-buildx-action@v2
      -
        name: Set up QEMU
        uses: docker/setup-qemu-action@v3
        with:
          platforms: arm64
        uses: docker/setup-qemu-action@v2
      -
        name: Login to GitHub Container Registry
        uses: docker/login-action@v3
        name: Login to Github Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Login to Docker Hub
        uses: docker/login-action@v3
        uses: docker/login-action@v2
        # Don't attempt to login if not pushing to Docker Hub
        if: steps.push-other-places.outputs.enable == 'true'
        if: steps.docker-hub.outputs.enable == 'true'
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      -
        name: Login to Quay.io
        uses: docker/login-action@v3
        # Don't attempt to login if not pushing to Quay.io
        if: steps.push-other-places.outputs.enable == 'true'
        with:
          registry: quay.io
          username: ${{ secrets.QUAY_USERNAME }}
          password: ${{ secrets.QUAY_ROBOT_TOKEN }}
      -
        name: Build and push
        uses: docker/build-push-action@v5
        uses: docker/build-push-action@v3
        with:
          context: .
          file: ./Dockerfile
          platforms: linux/amd64,linux/arm64
          platforms: linux/amd64,linux/arm/v7,linux/arm64
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.docker-meta.outputs.tags }}
          labels: ${{ steps.docker-meta.outputs.labels }}
          # Get cache layers from this branch, then dev
          build-args: |
            JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
            QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
            PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
            PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
          # Get cache layers from this branch, then dev, then main
          # This allows new branches to get at least some cache benefits, generally from dev
          cache-from: |
            type=registry,ref=ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
            type=registry,ref=ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}/builder/cache/app:dev
            type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
            type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:dev
            type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:main
          cache-to: |
            type=registry,mode=max,ref=ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
            type=registry,mode=max,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
      -
        name: Inspect image
        run: |
@@ -424,44 +351,35 @@ jobs:
          docker cp frontend-extract:/usr/src/paperless/src/documents/static/frontend src/documents/static/frontend/
      -
        name: Upload frontend artifact
        uses: actions/upload-artifact@v4
        uses: actions/upload-artifact@v3
        with:
          name: frontend-compiled
          path: src/documents/static/frontend/
          retention-days: 7

  build-release:
    name: "Build Release"
    needs:
      - build-docker-image
      - documentation
    runs-on: ubuntu-22.04
    runs-on: ubuntu-20.04
    steps:
      -
        name: Checkout
        uses: actions/checkout@v4
        uses: actions/checkout@v3
      -
        name: Install pipenv
        run: |
          pip3 install --upgrade pip setuptools wheel pipx
          pipx install pipenv
      -
        name: Set up Python
        id: setup-python
        uses: actions/setup-python@v5
        uses: actions/setup-python@v4
        with:
          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
          python-version: 3.9
          cache: "pipenv"
          cache-dependency-path: 'Pipfile.lock'
      -
        name: Install pipenv + tools
        run: |
          pip install --upgrade --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }} setuptools wheel
      -
        name: Install Python dependencies
        run: |
          pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
      -
        name: Patch whitenoise
        run: |
          curl --fail --silent --show-error --location --output 484.patch https://github.com/evansd/whitenoise/pull/484.patch
          patch -d $(pipenv --venv)/lib/python3.10/site-packages --verbose -p2 < 484.patch
          rm 484.patch
          pipenv sync --dev
      -
        name: Install system dependencies
        run: |
@@ -469,87 +387,58 @@ jobs:
          sudo apt-get install -qq --no-install-recommends gettext liblept5
      -
        name: Download frontend artifact
        uses: actions/download-artifact@v4
        uses: actions/download-artifact@v3
        with:
          name: frontend-compiled
          path: src/documents/static/frontend/
      -
        name: Download documentation artifact
        uses: actions/download-artifact@v4
        uses: actions/download-artifact@v3
        with:
          name: documentation
          path: docs/_build/html/
      -
        name: Generate requirements file
        run: |
          pipenv --python ${{ steps.setup-python.outputs.python-version }} requirements > requirements.txt
          pipenv requirements > requirements.txt
      -
        name: Compile messages
        run: |
          cd src/
          pipenv --python ${{ steps.setup-python.outputs.python-version }} run python3 manage.py compilemessages
          pipenv run python3 manage.py compilemessages
      -
        name: Collect static files
        run: |
          cd src/
          pipenv --python ${{ steps.setup-python.outputs.python-version }} run python3 manage.py collectstatic --no-input
          pipenv run python3 manage.py collectstatic --no-input
      -
        name: Move files
        run: |
          echo "Making dist folders"
          for directory in dist \
                           dist/paperless-ngx \
                           dist/paperless-ngx/scripts;
          do
            mkdir --verbose --parents ${directory}
          done

          echo "Copying basic files"
          for file_name in .dockerignore \
                           .env \
                           Dockerfile \
                           Pipfile \
                           Pipfile.lock \
                           requirements.txt \
                           LICENSE \
                           README.md \
                           paperless.conf.example \
                           gunicorn.conf.py
          do
            cp --verbose ${file_name} dist/paperless-ngx/
          done
          mv --verbose dist/paperless-ngx/paperless.conf.example dist/paperless-ngx/paperless.conf

          echo "Copying Docker related files"
          cp --recursive docker/ dist/paperless-ngx/docker

          echo "Copying startup scripts"
          cp --verbose scripts/*.service scripts/*.sh scripts/*.socket dist/paperless-ngx/scripts/

          echo "Copying source files"
          cp --recursive src/ dist/paperless-ngx/src
          echo "Copying documentation"
          cp --recursive docs/_build/html/ dist/paperless-ngx/docs

          mv --verbose static dist/paperless-ngx
          mkdir dist
          mkdir dist/paperless-ngx
          mkdir dist/paperless-ngx/scripts
          cp .dockerignore .env Dockerfile Pipfile Pipfile.lock requirements.txt LICENSE README.md dist/paperless-ngx/
          cp paperless.conf.example dist/paperless-ngx/paperless.conf
          cp gunicorn.conf.py dist/paperless-ngx/gunicorn.conf.py
          cp -r docker/ dist/paperless-ngx/docker
          cp scripts/*.service scripts/*.sh dist/paperless-ngx/scripts/
          cp -r src/ dist/paperless-ngx/src
          cp -r docs/_build/html/ dist/paperless-ngx/docs
          mv static dist/paperless-ngx
      -
        name: Make release package
        run: |
          echo "Creating release archive"
          cd dist
          sudo chown -R 1000:1000 paperless-ngx/
          tar -cJf paperless-ngx.tar.xz paperless-ngx/
      -
        name: Upload release artifact
        uses: actions/upload-artifact@v4
        uses: actions/upload-artifact@v3
        with:
          name: release
          path: dist/paperless-ngx.tar.xz
          retention-days: 7

  publish-release:
    name: "Publish Release"
    runs-on: ubuntu-22.04
    runs-on: ubuntu-20.04
    outputs:
      prerelease: ${{ steps.get_version.outputs.prerelease }}
      changelog: ${{ steps.create-release.outputs.body }}
@@ -560,7 +449,7 @@ jobs:
    steps:
      -
        name: Download release artifact
        uses: actions/download-artifact@v4
        uses: actions/download-artifact@v3
        with:
          name: release
          path: ./
@@ -577,7 +466,7 @@ jobs:
      -
        name: Create Release and Changelog
        id: create-release
        uses: release-drafter/release-drafter@v6
        uses: paperless-ngx/release-drafter@master
        with:
          name: Paperless-ngx ${{ steps.get_version.outputs.version }}
          tag: ${{ steps.get_version.outputs.version }}
@@ -589,37 +478,38 @@ jobs:
      -
        name: Upload release archive
        id: upload-release-asset
        uses: shogo82148/actions-upload-release-asset@v1
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          upload_url: ${{ steps.create-release.outputs.upload_url }}
          asset_path: ./paperless-ngx.tar.xz
          asset_name: paperless-ngx-${{ steps.get_version.outputs.version }}.tar.xz
          asset_content_type: application/x-xz

  append-changelog:
    name: "Append Changelog"
    runs-on: ubuntu-22.04
    runs-on: ubuntu-20.04
    needs:
      - publish-release
    if: needs.publish-release.outputs.prerelease == 'false'
    steps:
      -
        name: Checkout
        uses: actions/checkout@v4
        uses: actions/checkout@v3
        with:
          ref: main
      -
        name: Install pipenv
        run: |
          pip3 install --upgrade pip setuptools wheel pipx
          pipx install pipenv
      -
        name: Set up Python
        uses: actions/setup-python@v5
        uses: actions/setup-python@v4
        with:
          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
          python-version: 3.9
          cache: "pipenv"
          cache-dependency-path: 'Pipfile.lock'
      -
        name: Install pipenv + tools
        run: |
          pip install --upgrade --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }} setuptools wheel
      -
        name: Append Changelog to docs
        id: append-Changelog
@@ -633,19 +523,19 @@ jobs:
          CURRENT_CHANGELOG=`tail --lines +2 changelog.md`
          echo -e "$CURRENT_CHANGELOG" >> changelog-new.md
          mv changelog-new.md changelog.md
          pipenv run pre-commit run --files changelog.md || true
          pipenv run pre-commit run --files changelog.md
          git config --global user.name "github-actions"
          git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
          git commit -am "Changelog ${{ needs.publish-release.outputs.version }} - GHA"
          git push origin ${{ needs.publish-release.outputs.version }}-changelog
      -
        name: Create Pull Request
        uses: actions/github-script@v7
        uses: actions/github-script@v6
        with:
          script: |
            const { repo, owner } = context.repo;
            const result = await github.rest.pulls.create({
              title: 'Documentation: Add ${{ needs.publish-release.outputs.version }} changelog',
              title: '[Documentation] Add ${{ needs.publish-release.outputs.version }} changelog',
              owner,
              repo,
              head: '${{ needs.publish-release.outputs.version }}-changelog',
@@ -656,5 +546,5 @@ jobs:
              owner,
              repo,
              issue_number: result.data.number,
              labels: ['documentation', 'skip-changelog']
              labels: ['documentation']
            });
109
.github/workflows/cleanup-tags.yml
vendored
@@ -1,17 +1,23 @@
# This workflow runs on certain conditions to check for and potentially
# delete container images from the GHCR which no longer have an associated
# code branch.
# Requires a PAT with the correct scope set in the secrets.
#
# This workflow will not trigger runs on forked repos.
# Requires a PAT with the correct scope set in the secrets

name: Cleanup Image Tags

on:
  schedule:
    - cron: '0 0 * * SAT'
  delete:
  pull_request:
    types:
      - closed
  push:
    paths:
      - ".github/workflows/cleanup-tags.yml"
      - ".github/scripts/cleanup-tags.py"
      - ".github/scripts/github.py"
      - ".github/scripts/common.py"

concurrency:
  group: registry-tags-cleanup
@@ -20,51 +26,70 @@ concurrency:
jobs:
  cleanup-images:
    name: Cleanup Image Tags for ${{ matrix.primary-name }}
    if: github.repository_owner == 'paperless-ngx'
    runs-on: ubuntu-22.04
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        primary-name: ["paperless-ngx", "paperless-ngx/builder/cache/app"]
    env:
      # Requires a personal access token with the OAuth scope delete:packages
      TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
    steps:
      -
        name: Clean temporary images
        if: "${{ env.TOKEN != '' }}"
        uses: stumpylog/image-cleaner-action/ephemeral@v0.6.0
        with:
          token: "${{ env.TOKEN }}"
          owner: "${{ github.repository_owner }}"
          is_org: "true"
          package_name: "${{ matrix.primary-name }}"
          scheme: "branch"
          repo_name: "paperless-ngx"
          match_regex: "(feature|fix)"
          do_delete: "true"
        include:
          - primary-name: "paperless-ngx"
            cache-name: "paperless-ngx/builder/cache/app"

  cleanup-untagged-images:
    name: Cleanup Untagged Images Tags for ${{ matrix.primary-name }}
    if: github.repository_owner == 'paperless-ngx'
    runs-on: ubuntu-22.04
    needs:
      - cleanup-images
    strategy:
      fail-fast: false
      matrix:
        primary-name: ["paperless-ngx", "paperless-ngx/builder/cache/app"]
          - primary-name: "paperless-ngx/builder/qpdf"
            cache-name: "paperless-ngx/builder/cache/qpdf"

          - primary-name: "paperless-ngx/builder/pikepdf"
            cache-name: "paperless-ngx/builder/cache/pikepdf"

          - primary-name: "paperless-ngx/builder/jbig2enc"
            cache-name: "paperless-ngx/builder/cache/jbig2enc"

          - primary-name: "paperless-ngx/builder/psycopg2"
            cache-name: "paperless-ngx/builder/cache/psycopg2"
    env:
      # Requires a personal access token with the OAuth scope delete:packages
      TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
    steps:
      -
        name: Clean untagged images
        if: "${{ env.TOKEN != '' }}"
        uses: stumpylog/image-cleaner-action/untagged@v0.6.0
        name: Checkout
        uses: actions/checkout@v3
      -
        name: Login to Github Container Registry
        uses: docker/login-action@v2
        with:
          token: "${{ env.TOKEN }}"
          owner: "${{ github.repository_owner }}"
          is_org: "true"
          package_name: "${{ matrix.primary-name }}"
          do_delete: "true"
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      -
        name: Install httpx
        run: |
          python -m pip install httpx
      #
      # Clean up primary package
      #
      -
        name: Cleanup for package "${{ matrix.primary-name }}"
        if: "${{ env.TOKEN != '' }}"
        run: |
          python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --untagged --is-manifest --delete "${{ matrix.primary-name }}"
      #
      # Clean up registry cache package
      #
      -
        name: Cleanup for package "${{ matrix.cache-name }}"
        if: "${{ env.TOKEN != '' }}"
        run: |
          python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --untagged --delete "${{ matrix.cache-name }}"
      #
      # Verify tags which are left still pull
      #
      -
        name: Check all tags still pull
        run: |
          ghcr_name=$(echo "ghcr.io/${GITHUB_REPOSITORY_OWNER}/${{ matrix.primary-name }}" | awk '{ print tolower($0) }')
          echo "Pulling all tags of ${ghcr_name}"
          docker pull --quiet --all-tags ${ghcr_name}
          docker image list
8
.github/workflows/codeql-analysis.yml
vendored
@@ -23,7 +23,7 @@ on:
jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-22.04
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
@@ -38,11 +38,11 @@ jobs:

    steps:
    - name: Checkout repository
      uses: actions/checkout@v4
      uses: actions/checkout@v3

    # Initializes the CodeQL tools for scanning.
    - name: Initialize CodeQL
      uses: github/codeql-action/init@v3
      uses: github/codeql-action/init@v2
      with:
        languages: ${{ matrix.language }}
        # If you wish to specify custom queries, you can do so here or in a config file.
@@ -51,4 +51,4 @@ jobs:
        # queries: ./path/to/local/query, your-org/your-repo/queries@main

    - name: Perform CodeQL Analysis
      uses: github/codeql-action/analyze@v3
      uses: github/codeql-action/analyze@v2
34
.github/workflows/crowdin.yml
vendored
@@ -1,34 +0,0 @@
name: Crowdin Action

on:
  workflow_dispatch:
  schedule:
    - cron: '2 */12 * * *'
  push:
    paths: [
      'src/locale/**',
      'src-ui/messages.xlf',
      'src-ui/src/locale/**'
    ]
    branches: [ dev ]

jobs:
  synchronize-with-crowdin:
    name: Crowdin Sync
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: crowdin action
        uses: crowdin/github-action@v1
        with:
          upload_translations: false
          download_translations: true
          crowdin_branch_name: 'dev'
          localization_branch_name: l10n_dev
          pull_request_labels: 'skip-changelog, translation'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          CROWDIN_PROJECT_ID: ${{ secrets.CROWDIN_PROJECT_ID }}
          CROWDIN_PERSONAL_TOKEN: ${{ secrets.CROWDIN_PERSONAL_TOKEN }}
170
.github/workflows/installer-library.yml
vendored
Normal file
@@ -0,0 +1,170 @@
# This workflow will run to update the installer library of
# Docker images. These are the images which provide updated wheels
# .deb installation packages or maybe just some compiled library

name: Build Image Library

on:
  push:
    # Must match one of these branches AND one of the paths
    # to be triggered
    branches:
      - "main"
      - "dev"
      - "library-*"
      - "feature-*"
    paths:
      # Trigger the workflow if a Dockerfile changed
      - "docker-builders/**"
      # Trigger if a package was updated
      - ".build-config.json"
      - "Pipfile.lock"
      # Also trigger on workflow changes related to the library
      - ".github/workflows/installer-library.yml"
      - ".github/workflows/reusable-workflow-builder.yml"
      - ".github/scripts/**"

# Set a workflow level concurrency group so primary workflow
# can wait for this to complete if needed
# DO NOT CHANGE without updating main workflow group
concurrency:
  group: build-installer-library
  cancel-in-progress: false

jobs:
  prepare-docker-build:
    name: Prepare Docker Image Version Data
    runs-on: ubuntu-20.04
    steps:
      -
        name: Set ghcr repository name
        id: set-ghcr-repository
        run: |
          ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
          echo "repository=${ghcr_name}" >> $GITHUB_OUTPUT
      -
        name: Checkout
        uses: actions/checkout@v3
      -
        name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.9"
      -
        name: Install jq
        run: |
          sudo apt-get update
          sudo apt-get install jq
      -
        name: Setup qpdf image
        id: qpdf-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)

          echo ${build_json}

          echo "qpdf-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup psycopg2 image
        id: psycopg2-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)

          echo ${build_json}

          echo "psycopg2-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup pikepdf image
        id: pikepdf-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)

          echo ${build_json}

          echo "pikepdf-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup jbig2enc image
        id: jbig2enc-setup
        run: |
          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)

          echo ${build_json}

          echo "jbig2enc-json=${build_json}" >> $GITHUB_OUTPUT
      -
        name: Setup other versions
        id: cache-bust-setup
        run: |
          pillow_version=$(jq ".default.pillow.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
          lxml_version=$(jq ".default.lxml.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')

          echo "Pillow is ${pillow_version}"
          echo "lxml is ${lxml_version}"

          echo "pillow-version=${pillow_version}" >> $GITHUB_OUTPUT
          echo "lxml-version=${lxml_version}" >> $GITHUB_OUTPUT

    outputs:

      ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}

      qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}

      pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}

      psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}

      jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json }}

      pillow-version: ${{ steps.cache-bust-setup.outputs.pillow-version }}

      lxml-version: ${{ steps.cache-bust-setup.outputs.lxml-version }}

  build-qpdf-debs:
    name: qpdf
    needs:
      - prepare-docker-build
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.qpdf
      build-json: ${{ needs.prepare-docker-build.outputs.qpdf-json }}
      build-args: |
        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}

  build-jbig2enc:
    name: jbig2enc
    needs:
      - prepare-docker-build
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.jbig2enc
      build-json: ${{ needs.prepare-docker-build.outputs.jbig2enc-json }}
      build-args: |
        JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}

  build-psycopg2-wheel:
    name: psycopg2
    needs:
      - prepare-docker-build
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.psycopg2
      build-json: ${{ needs.prepare-docker-build.outputs.psycopg2-json }}
      build-args: |
        PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}

  build-pikepdf-wheel:
    name: pikepdf
    needs:
      - prepare-docker-build
      - build-qpdf-debs
    uses: ./.github/workflows/reusable-workflow-builder.yml
    with:
      dockerfile: ./docker-builders/Dockerfile.pikepdf
      build-json: ${{ needs.prepare-docker-build.outputs.pikepdf-json }}
      build-args: |
        REPO=${{ needs.prepare-docker-build.outputs.ghcr-repository }}
        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
        PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
        PILLOW_VERSION=${{ needs.prepare-docker-build.outputs.pillow-version }}
        LXML_VERSION=${{ needs.prepare-docker-build.outputs.lxml-version }}
34
.github/workflows/project-actions.yml
vendored
@@ -1,6 +1,10 @@
name: Project Automations

on:
  issues:
    types:
      - opened
      - reopened
  pull_request_target: #_target allows access to secrets
    types:
      - opened
@@ -12,16 +16,42 @@ on:
permissions:
  contents: read

env:
  todo: Todo
  done: Done
  in_progress: In Progress

jobs:
  issue_opened_or_reopened:
    name: issue_opened_or_reopened
    runs-on: ubuntu-latest
    if: github.event_name == 'issues' && (github.event.action == 'opened' || github.event.action == 'reopened')
    steps:
      - name: Add issue to project and set status to ${{ env.todo }}
        uses: leonsteinhaeuser/project-beta-automations@v2.0.1
        with:
          gh_token: ${{ secrets.GH_TOKEN }}
          organization: paperless-ngx
          project_id: 2
          resource_node_id: ${{ github.event.issue.node_id }}
          status_value: ${{ env.todo }} # Target status
  pr_opened_or_reopened:
    name: pr_opened_or_reopened
    runs-on: ubuntu-22.04
    runs-on: ubuntu-latest
    permissions:
      # write permission is required for autolabeler
      pull-requests: write
    if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request.user.login != 'dependabot'
    steps:
      - name: Add PR to project and set status to "Needs Review"
        uses: leonsteinhaeuser/project-beta-automations@v2.0.1
        with:
          gh_token: ${{ secrets.GH_TOKEN }}
          organization: paperless-ngx
          project_id: 2
          resource_node_id: ${{ github.event.pull_request.node_id }}
          status_value: "Needs Review" # Target status
      - name: Label PR with release-drafter
        uses: release-drafter/release-drafter@v6
        uses: release-drafter/release-drafter@v5
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
281
.github/workflows/repo-maintenance.yml
vendored
@@ -1,281 +0,0 @@
name: 'Repository Maintenance'

on:
  schedule:
    - cron: '0 3 * * *'
  workflow_dispatch:

permissions:
  issues: write
  pull-requests: write
  discussions: write

concurrency:
  group: lock

jobs:
  stale:
    name: 'Stale'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v9
        with:
          days-before-stale: 7
          days-before-close: 14
          any-of-labels: 'cant-reproduce,not a bug'
          stale-issue-label: stale
          stale-pr-label: stale
          stale-issue-message: >
            This issue has been automatically marked as stale because it has not had
            recent activity. It will be closed if no further activity occurs. Thank you
            for your contributions. See our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.
  lock-threads:
    name: 'Lock Old Threads'
    runs-on: ubuntu-latest
    steps:
      - uses: dessant/lock-threads@v5
        with:
          issue-inactive-days: '30'
          pr-inactive-days: '30'
          discussion-inactive-days: '30'
          log-output: true
          issue-comment: >
            This issue has been automatically locked since there
            has not been any recent activity after it was closed.
            Please open a new discussion or issue for related concerns.
            See our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.
          pr-comment: >
            This pull request has been automatically locked since there
            has not been any recent activity after it was closed.
            Please open a new discussion or issue for related concerns.
            See our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.
          discussion-comment: >
            This discussion has been automatically locked since there
            has not been any recent activity after it was closed.
            Please open a new discussion for related concerns.
            See our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.
  close-answered-discussions:
    name: 'Close Answered Discussions'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v7
        with:
          script: |
            function sleep(ms) {
              return new Promise(resolve => setTimeout(resolve, ms));
            }

            const query = `query($owner:String!, $name:String!) {
              repository(owner:$owner, name:$name){
                discussions(first:100, answered:true, states:[OPEN]) {
                  nodes {
                    id,
                    number
                  }
                }
              }
            }`;
            const variables = {
              owner: context.repo.owner,
              name: context.repo.repo,
            }
            const result = await github.graphql(query, variables)

            console.log(`Found ${result.repository.discussions.nodes.length} open answered discussions`)

            for (const discussion of result.repository.discussions.nodes) {
              console.log(`Closing discussion #${discussion.number} (${discussion.id})`)

              const addCommentMutation = `mutation($discussion:ID!, $body:String!) {
                addDiscussionComment(input:{discussionId:$discussion, body:$body}) {
                  clientMutationId
                }
              }`;
              const commentVariables = {
                discussion: discussion.id,
                body: 'This discussion has been automatically closed because it was marked as answered. Please see our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.',
              }
              await github.graphql(addCommentMutation, commentVariables)

              const closeDiscussionMutation = `mutation($discussion:ID!, $reason:DiscussionCloseReason!) {
                closeDiscussion(input:{discussionId:$discussion, reason:$reason}) {
                  clientMutationId
                }
              }`;
              const closeVariables = {
                discussion: discussion.id,
                reason: "RESOLVED",
              }
              await github.graphql(closeDiscussionMutation, closeVariables)

              await sleep(1000)
            }
  close-outdated-discussions:
    name: 'Close Outdated Discussions'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v7
        with:
          script: |
            function sleep(ms) {
              return new Promise(resolve => setTimeout(resolve, ms));
            }

            const CUTOFF_DAYS = 180;
            const cutoff = new Date();
            cutoff.setDate(cutoff.getDate() - CUTOFF_DAYS);

            const query = `query(
              $owner:String!,
              $name:String!,
              $supportCategory:ID!,
              $generalCategory:ID!,
            ) {
              supportDiscussions: repository(owner:$owner, name:$name){
                discussions(
                  categoryId:$supportCategory,
                  last:50,
                  answered:false,
                  states:[OPEN],
                ) {
                  nodes {
                    id,
                    number,
                    updatedAt
                  }
                },
              },
              generalDiscussions: repository(owner:$owner, name:$name){
                discussions(
                  categoryId:$generalCategory,
                  last:50,
                  states:[OPEN],
                ) {
                  nodes {
                    id,
                    number,
                    updatedAt
                  }
                }
              }
            }`;
            const variables = {
              owner: context.repo.owner,
              name: context.repo.repo,
              supportCategory: "DIC_kwDOG1Zs184CBKWK",
              generalCategory: "DIC_kwDOG1Zs184CBKWJ"
            }
            const result = await github.graphql(query, variables);
            const combinedDiscussions = [
              ...result.supportDiscussions.discussions.nodes,
              ...result.generalDiscussions.discussions.nodes,
            ]

            console.log(`Checking ${combinedDiscussions.length} open discussions`);

            for (const discussion of combinedDiscussions) {
              if (new Date(discussion.updatedAt) < cutoff) {
                console.log(`Closing outdated discussion #${discussion.number} (${discussion.id}), last updated at ${discussion.updatedAt}`);
                const addCommentMutation = `mutation($discussion:ID!, $body:String!) {
                  addDiscussionComment(input:{discussionId:$discussion, body:$body}) {
                    clientMutationId
                  }
                }`;
                const commentVariables = {
                  discussion: discussion.id,
                  body: 'This discussion has been automatically closed due to inactivity. Please see our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.',
                }
                await github.graphql(addCommentMutation, commentVariables);

                const closeDiscussionMutation = `mutation($discussion:ID!, $reason:DiscussionCloseReason!) {
                  closeDiscussion(input:{discussionId:$discussion, reason:$reason}) {
                    clientMutationId
                  }
                }`;
                const closeVariables = {
                  discussion: discussion.id,
                  reason: "OUTDATED",
                }
                await github.graphql(closeDiscussionMutation, closeVariables);

                await sleep(1000);
              }
            }
  close-unsupported-feature-requests:
    name: 'Close Unsupported Feature Requests'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v7
        with:
          script: |
            function sleep(ms) {
              return new Promise(resolve => setTimeout(resolve, ms));
            }

            const CUTOFF_1_DAYS = 180;
            const CUTOFF_1_COUNT = 5;
            const CUTOFF_2_DAYS = 365;
            const CUTOFF_2_COUNT = 10;

            const cutoff1Date = new Date();
            cutoff1Date.setDate(cutoff1Date.getDate() - CUTOFF_1_DAYS);
            const cutoff2Date = new Date();
            cutoff2Date.setDate(cutoff2Date.getDate() - CUTOFF_2_DAYS);

            const query = `query(
              $owner:String!,
              $name:String!,
              $featureRequestsCategory:ID!,
            ) {
              repository(owner:$owner, name:$name){
                discussions(
                  categoryId:$featureRequestsCategory,
                  last:100,
                  states:[OPEN],
                ) {
                  nodes {
                    id,
                    number,
                    updatedAt,
                    upvoteCount,
                  }
                },
              }
            }`;
            const variables = {
              owner: context.repo.owner,
              name: context.repo.repo,
              featureRequestsCategory: "DIC_kwDOG1Zs184CBNr4"
            }
            const result = await github.graphql(query, variables);

            for (const discussion of result.repository.discussions.nodes) {
              const discussionDate = new Date(discussion.updatedAt);
              if ((discussionDate < cutoff1Date && discussion.upvoteCount < CUTOFF_1_COUNT) ||
                (discussionDate < cutoff2Date && discussion.upvoteCount < CUTOFF_2_COUNT)) {
                console.log(`Closing discussion #${discussion.number} (${discussion.id}), last updated at ${discussion.updatedAt} with votes ${discussion.upvoteCount}`);
                const addCommentMutation = `mutation($discussion:ID!, $body:String!) {
                  addDiscussionComment(input:{discussionId:$discussion, body:$body}) {
                    clientMutationId
                  }
                }`;
                const commentVariables = {
                  discussion: discussion.id,
                  body: 'This discussion has been automatically closed due to lack of community support. Please see our [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/dev/CONTRIBUTING.md#automatic-repository-maintenance) for more details.',
                }
                await github.graphql(addCommentMutation, commentVariables);

                const closeDiscussionMutation = `mutation($discussion:ID!, $reason:DiscussionCloseReason!) {
                  closeDiscussion(input:{discussionId:$discussion, reason:$reason}) {
                    clientMutationId
                  }
                }`;
                const closeVariables = {
                  discussion: discussion.id,
                  reason: "OUTDATED",
                }
                await github.graphql(closeDiscussionMutation, closeVariables);

                await sleep(1000);
              }
            }
53
.github/workflows/reusable-workflow-builder.yml
vendored
Normal file
@@ -0,0 +1,53 @@
name: Reusable Image Builder

on:
  workflow_call:
    inputs:
      dockerfile:
        required: true
        type: string
      build-json:
        required: true
        type: string
      build-args:
        required: false
        default: ""
        type: string

concurrency:
  group: ${{ github.workflow }}-${{ fromJSON(inputs.build-json).name }}-${{ fromJSON(inputs.build-json).version }}
  cancel-in-progress: false

jobs:
  build-image:
    name: Build ${{ fromJSON(inputs.build-json).name }} @ ${{ fromJSON(inputs.build-json).version }}
    runs-on: ubuntu-latest
    steps:
      -
        name: Checkout
        uses: actions/checkout@v3
      -
        name: Login to Github Container Registry
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      -
        name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      -
        name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      -
        name: Build ${{ fromJSON(inputs.build-json).name }}
        uses: docker/build-push-action@v3
        with:
          context: .
          file: ${{ inputs.dockerfile }}
          tags: ${{ fromJSON(inputs.build-json).image_tag }}
          platforms: linux/amd64,linux/arm64,linux/arm/v7
          build-args: ${{ inputs.build-args }}
          push: true
          cache-from: type=registry,ref=${{ fromJSON(inputs.build-json).cache_tag }}
          cache-to: type=registry,mode=max,ref=${{ fromJSON(inputs.build-json).cache_tag }}
5
.gitignore
vendored
@@ -51,8 +51,8 @@ coverage.xml
# Django stuff:
*.log

# MkDocs documentation
site/
# Sphinx documentation
docs/_build/

# PyBuilder
target/
@@ -73,7 +73,6 @@ virtualenv
.venv/
/docker-compose.env
/docker-compose.yml
.ruff_cache/

# Used for development
scripts/import-for-development
.pre-commit-config.yaml
@@ -5,14 +5,12 @@
repos:
  # General hooks
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0
    rev: v4.3.0
    hooks:
      - id: check-docstring-first
      - id: check-json
        exclude: "tsconfig.*json"
      - id: check-yaml
        args:
          - "--unsafe"
      - id: check-toml
      - id: check-executables-have-shebangs
      - id: end-of-file-fixer
@@ -28,16 +26,8 @@ repos:
          - svg
      - id: check-case-conflict
      - id: detect-private-key
  - repo: https://github.com/codespell-project/codespell
    rev: v2.2.6
    hooks:
      - id: codespell
        exclude: "(^src-ui/src/locale/)|(^src-ui/e2e/)|(^src/paperless_mail/tests/samples/)"
        exclude_types:
          - pofile
          - json
  - repo: https://github.com/pre-commit/mirrors-prettier
    rev: 'v3.1.0'
    rev: "v2.7.1"
    hooks:
      - id: prettier
        types_or:
@@ -46,17 +36,42 @@ repos:
          - markdown
        exclude: "(^Pipfile\\.lock$)"
  # Python hooks
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: 'v0.4.2'
  - repo: https://github.com/asottile/reorder_python_imports
    rev: v3.9.0
    hooks:
      - id: ruff
  - repo: https://github.com/psf/black-pre-commit-mirror
    rev: 24.4.2
      - id: reorder-python-imports
        exclude: "(migrations)"
  - repo: https://github.com/asottile/yesqa
    rev: "v1.4.0"
    hooks:
      - id: yesqa
        exclude: "(migrations)"
  - repo: https://github.com/asottile/add-trailing-comma
    rev: "v2.3.0"
    hooks:
      - id: add-trailing-comma
        exclude: "(migrations)"
  - repo: https://github.com/PyCQA/flake8
    rev: 5.0.4
    hooks:
      - id: flake8
        files: ^src/
        args:
          - "--config=./src/setup.cfg"
  - repo: https://github.com/psf/black
    rev: 22.10.0
    hooks:
      - id: black
  - repo: https://github.com/asottile/pyupgrade
    rev: v3.2.2
    hooks:
      - id: pyupgrade
        exclude: "(migrations)"
        args:
          - "--py38-plus"
  # Dockerfile hooks
  - repo: https://github.com/AleksaC/hadolint-py
    rev: v2.12.0.3
    rev: v2.10.0
    hooks:
      - id: hadolint
  # Shell script hooks
@@ -67,6 +82,6 @@ repos:
        args:
          - "--tab"
  - repo: https://github.com/shellcheck-py/shellcheck-py
    rev: "v0.10.0.1"
    rev: "v0.8.0.4"
    hooks:
      - id: shellcheck
20
.prettierrc
@@ -1,16 +1,4 @@
{
  # https://prettier.io/docs/en/options.html#semicolons
  "semi": false,
  # https://prettier.io/docs/en/options.html#quotes
  "singleQuote": true,
  # https://prettier.io/docs/en/options.html#trailing-commas
  "trailingComma": "es5",
  "overrides": [
    {
      "files": ["index.md", "administration.md"],
      "options": {
        "tabWidth": 4
      }
    }
  ]
}
# https://prettier.io/docs/en/options.html#semicolons
semi: false
# https://prettier.io/docs/en/options.html#quotes
singleQuote: true

.python-version
@@ -1 +0,0 @@
3.9.18
16
.readthedocs.yml
Normal file
@@ -0,0 +1,16 @@
# .readthedocs.yml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/conf.py

# Optionally set the version of Python and requirements required to build your docs
python:
  version: "3.8"
  install:
    - requirements: docs/requirements.txt
47
.ruff.toml
@@ -1,47 +0,0 @@
fix = true
line-length = 88
respect-gitignore = true
src = ["src"]
target-version = "py39"
output-format = "grouped"
show-fixes = true

# https://docs.astral.sh/ruff/settings/
# https://docs.astral.sh/ruff/rules/
[lint]
extend-select = [
  "W", # https://docs.astral.sh/ruff/rules/#pycodestyle-e-w
  "I", # https://docs.astral.sh/ruff/rules/#isort-i
  "UP", # https://docs.astral.sh/ruff/rules/#pyupgrade-up
  "COM", # https://docs.astral.sh/ruff/rules/#flake8-commas-com
  "DJ", # https://docs.astral.sh/ruff/rules/#flake8-django-dj
  "EXE", # https://docs.astral.sh/ruff/rules/#flake8-executable-exe
  "ISC", # https://docs.astral.sh/ruff/rules/#flake8-implicit-str-concat-isc
  "ICN", # https://docs.astral.sh/ruff/rules/#flake8-import-conventions-icn
  "G201", # https://docs.astral.sh/ruff/rules/#flake8-logging-format-g
  "INP", # https://docs.astral.sh/ruff/rules/#flake8-no-pep420-inp
  "PIE", # https://docs.astral.sh/ruff/rules/#flake8-pie-pie
  "Q", # https://docs.astral.sh/ruff/rules/#flake8-quotes-q
  "RSE", # https://docs.astral.sh/ruff/rules/#flake8-raise-rse
  "T20", # https://docs.astral.sh/ruff/rules/#flake8-print-t20
  "SIM", # https://docs.astral.sh/ruff/rules/#flake8-simplify-sim
  "TID", # https://docs.astral.sh/ruff/rules/#flake8-tidy-imports-tid
  "TCH", # https://docs.astral.sh/ruff/rules/#flake8-type-checking-tch
  "PLC", # https://docs.astral.sh/ruff/rules/#pylint-pl
  "PLE", # https://docs.astral.sh/ruff/rules/#pylint-pl
  "RUF", # https://docs.astral.sh/ruff/rules/#ruff-specific-rules-ruf
  "FLY", # https://docs.astral.sh/ruff/rules/#flynt-fly
]
# TODO PTH https://docs.astral.sh/ruff/rules/#flake8-use-pathlib-pth
ignore = ["DJ001", "SIM105", "RUF012"]

[lint.per-file-ignores]
".github/scripts/*.py" = ["E501", "INP001", "SIM117"]
"docker/wait-for-redis.py" = ["INP001", "T201"]
"*/tests/*.py" = ["E501", "SIM117"]
"*/migrations/*.py" = ["E501", "SIM", "T201"]
"src/paperless_tesseract/tests/test_parser.py" = ["RUF001"]
"src/documents/models.py" = ["SIM115"]

[lint.isort]
force-single-line = true
CONTRIBUTING.md
@@ -11,7 +11,7 @@ If you want to implement something big:

## Python

Paperless supports python 3.9 - 3.11. We format Python code with [Black](https://github.com/psf/black).
Paperless supports python 3.8 and 3.9. We format Python code with [Black](https://github.com/psf/black).

## Branches

@@ -27,11 +27,11 @@ Please format and test your code! I know it's a hassle, but it makes sure that y

To test your code, execute `pytest` in the src/ directory. This also generates a html coverage report, which you can use to see if you missed anything important during testing.

Before you can run `pytest`, ensure to [properly set up your local environment](https://docs.paperless-ngx.com/development/#initial-setup-and-first-start).
Before you can run `pytest`, ensure to [properly set up your local environment](https://paperless-ngx.readthedocs.io/en/latest/extending.html#initial-setup-and-first-start).

## More info:

... is available [in the documentation](https://docs.paperless-ngx.com/development).
... is available in the documentation. https://paperless-ngx.readthedocs.io/en/latest/extending.html

# Merging PRs

@@ -45,7 +45,7 @@ Examples of `non-trivial` PRs might include:

- Additional features
- Large changes to many distinct files
- Breaking or deprecation of existing features
- Breaking or depreciation of existing features

Our community review process for `non-trivial` PRs is the following:

@@ -58,13 +58,6 @@ Our community review process for `non-trivial` PRs is the following:

This process might be slow as community members have different schedules and time to dedicate to the Paperless project. However it ensures community code reviews are as brilliantly thorough as they once were with @jonaswinkler.

# AI-Generated Code

This project does not specifically prohibit the use of AI-generated code _during the process_ of creating a PR, however:

1. Any code present in the final PR that was generated using AI sources should be clearly attributed as such and must not violate copyright protections.
2. We will not accept PRs that are entirely or mostly AI-derived.

# Translating Paperless-ngx

Some notes about translation:
@@ -94,7 +87,7 @@ The following files need to be changed:

- src-ui/angular.json (under the _projects/paperless-ui/i18n/locales_ JSON key)
- src/paperless/settings.py (in the _LANGUAGES_ array)
- src-ui/src/app/services/settings.service.ts (inside the _LANGUAGE_OPTIONS_ array)
- src-ui/src/app/services/settings.service.ts (inside the _getLanguageOptions_ method)
- src-ui/src/app/app.module.ts (import locale from _angular/common/locales_ and call _registerLocaleData_)

Please add the language in the correct order, alphabetically by locale.
@@ -137,19 +130,3 @@ All team members are notified when mentioned or assigned to a relevant issue or
We are not overly strict with inviting people to the organization. If you have read the [team permissions](#permissions) and think having additional access would enhance your contributions, please reach out to an [admin](#structure) of the team.

The admins occasionally invite contributors directly if we believe having them on a team will accelerate their work.

# Automatic Repository Maintenance

The Paperless-ngx team appreciates all effort and interest from the community in filing bug reports, creating feature requests, sharing ideas and helping other
community members. That said, in an effort to keep the repository organized and manageable the project uses automatic handling of certain areas:

- Issues that cannot be reproduced will be marked 'stale' after 7 days of inactivity and closed after 14 further days of inactivity.
- Issues, pull requests and discussions that are closed will be locked after 30 days of inactivity.
- Discussions with a marked answer will be automatically closed.
- Discussions in the 'General' or 'Support' categories will be closed after 180 days of inactivity.
- Feature requests that do not meet the following thresholds will be closed: 5 "up-votes" after 180 days of inactivity or 10 "up-votes" after 365 days.

In all cases, threads can be re-opened by project maintainers and, of course, users can always create a new discussion for related concerns.
Finally, remember that all information remains searchable and 'closed' feature requests can still serve as inspiration for new features.

Thank you all for your contributions.
234
Dockerfile
@@ -1,27 +1,43 @@
|
||||
# syntax=docker/dockerfile:1
|
||||
# https://github.com/moby/buildkit/blob/master/frontend/dockerfile/docs/reference.md
|
||||
# syntax=docker/dockerfile:1.4
|
||||
|
||||
# Stage: compile-frontend
|
||||
# Purpose: Compiles the frontend
|
||||
# Notes:
|
||||
# - Does NPM stuff with Typescript and such
|
||||
FROM --platform=$BUILDPLATFORM docker.io/node:20-bookworm-slim AS compile-frontend
|
||||
# Pull the installer images from the library
|
||||
# These are all built previously
|
||||
# They provide either a .deb or .whl
|
||||
|
||||
ARG JBIG2ENC_VERSION
|
||||
ARG QPDF_VERSION
|
||||
ARG PIKEPDF_VERSION
|
||||
ARG PSYCOPG2_VERSION
|
||||
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/jbig2enc:${JBIG2ENC_VERSION} as jbig2enc-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:${QPDF_VERSION} as qpdf-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/pikepdf:${PIKEPDF_VERSION} as pikepdf-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/psycopg2:${PSYCOPG2_VERSION} as psycopg2-builder
|
||||
|
||||
FROM --platform=$BUILDPLATFORM node:16-bullseye-slim AS compile-frontend
|
||||
|
||||
# This stage compiles the frontend
|
||||
# This stage runs once for the native platform, as the outputs are not
|
||||
# dependent on target arch
|
||||
# Inputs: None
|
||||
|
||||
COPY ./src-ui /src/src-ui
|
||||
|
||||
WORKDIR /src/src-ui
|
||||
RUN set -eux \
|
||||
&& npm update npm -g \
|
||||
&& npm ci
|
||||
&& npm ci --omit=optional
|
||||
RUN set -eux \
|
||||
&& ./node_modules/.bin/ng build --configuration production
|
||||
|
||||
# Stage: pipenv-base
|
||||
# Purpose: Generates a requirements.txt file for building
|
||||
# Comments:
|
||||
# - pipenv dependencies are not left in the final image
|
||||
# - pipenv can't touch the final image somehow
|
||||
FROM --platform=$BUILDPLATFORM docker.io/python:3.11-alpine as pipenv-base
|
||||
FROM --platform=$BUILDPLATFORM python:3.9-slim-bullseye as pipenv-base
|
||||
|
||||
# This stage generates the requirements.txt file using pipenv
|
||||
# This stage runs once for the native platform, as the outputs are not
|
||||
# dependent on target arch
|
||||
# This way, pipenv dependencies are not left in the final image
|
||||
# nor can pipenv mess up the final image somehow
|
||||
# Inputs: None
|
||||
|
||||
WORKDIR /usr/src/pipenv
|
||||
|
||||
@@ -29,64 +45,64 @@ COPY Pipfile* ./
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing pipenv" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade pipenv==2023.12.1 \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade pipenv \
|
||||
&& echo "Generating requirement.txt" \
|
||||
&& pipenv requirements > requirements.txt
|
||||
|
||||
# Stage: main-app
|
||||
# Purpose: The final image
|
||||
# Comments:
|
||||
# - Don't leave anything extra in here
|
||||
FROM docker.io/python:3.11-slim-bookworm as main-app
|
||||
FROM python:3.9-slim-bullseye as main-app
|
||||
|
||||
LABEL org.opencontainers.image.authors="paperless-ngx team <hello@paperless-ngx.com>"
|
||||
LABEL org.opencontainers.image.documentation="https://docs.paperless-ngx.com/"
|
||||
LABEL org.opencontainers.image.documentation="https://paperless-ngx.readthedocs.io/en/latest/"
|
||||
LABEL org.opencontainers.image.source="https://github.com/paperless-ngx/paperless-ngx"
|
||||
LABEL org.opencontainers.image.url="https://github.com/paperless-ngx/paperless-ngx"
|
||||
LABEL org.opencontainers.image.licenses="GPL-3.0-only"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
# Buildx provided, must be defined to use though
|
||||
ARG TARGETARCH
|
||||
|
||||
# Can be workflow provided, defaults set for manual building
|
||||
ARG JBIG2ENC_VERSION=0.29
|
||||
ARG QPDF_VERSION=11.9.0
|
||||
ARG GS_VERSION=10.02.1
|
||||
|
||||
# Set Python environment variables
|
||||
ENV PYTHONDONTWRITEBYTECODE=1 \
|
||||
PYTHONUNBUFFERED=1 \
|
||||
# Ignore warning from Whitenoise
|
||||
PYTHONWARNINGS="ignore:::django.http.response:517" \
|
||||
PNGX_CONTAINERIZED=1
|
||||
|
||||
#
|
||||
# Begin installation and configuration
|
||||
# Order the steps below from least often changed to most
|
||||
#
|
||||
|
||||
# copy jbig2enc
|
||||
# Basically will never change again
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/.libs/libjbig2enc* /usr/local/lib/
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/jbig2 /usr/local/bin/
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/*.h /usr/local/include/
|
||||
|
||||
# Packages need for running
|
||||
ARG RUNTIME_PACKAGES="\
|
||||
# General utils
|
||||
curl \
|
||||
# Docker specific
|
||||
gosu \
|
||||
# Timezones support
|
||||
tzdata \
|
||||
file \
|
||||
# fonts for text file thumbnail generation
|
||||
fonts-liberation \
|
||||
gettext \
|
||||
ghostscript \
|
||||
gnupg \
|
||||
gosu \
|
||||
icc-profiles-free \
|
||||
imagemagick \
|
||||
# PostgreSQL
|
||||
media-types \
|
||||
liblept5 \
|
||||
libpq5 \
|
||||
libxml2 \
|
||||
liblcms2-2 \
|
||||
libtiff5 \
|
||||
libxslt1.1 \
|
||||
libfreetype6 \
|
||||
libwebp6 \
|
||||
libopenjp2-7 \
|
||||
libimagequant0 \
|
||||
libraqm0 \
|
||||
libgnutls30 \
|
||||
libjpeg62-turbo \
|
||||
python3 \
|
||||
python3-pip \
|
||||
python3-setuptools \
|
||||
postgresql-client \
|
||||
# MySQL / MariaDB
|
||||
mariadb-client \
|
||||
# For Numpy
|
||||
libatlas3-base \
|
||||
# OCRmyPDF dependencies
|
||||
tesseract-ocr \
|
||||
tesseract-ocr-eng \
|
||||
@@ -94,18 +110,13 @@ ARG RUNTIME_PACKAGES="\
  tesseract-ocr-fra \
  tesseract-ocr-ita \
  tesseract-ocr-spa \
  unpaper \
  # Suggested for OCRmyPDF
  pngquant \
  # Suggested for pikepdf
  jbig2dec \
  # lxml
  libxml2 \
  libxslt1.1 \
  # itself
  qpdf \
  tzdata \
  unpaper \
  # Mime type detection
  file \
  libmagic1 \
  media-types \
  zlib1g \
  # Barcode splitter
  libzbar0 \
@@ -117,39 +128,9 @@ RUN set -eux \
  echo "Installing system packages" \
  && apt-get update \
  && apt-get install --yes --quiet --no-install-recommends ${RUNTIME_PACKAGES} \
  && echo "Installing pre-built updates" \
  && echo "Installing qpdf ${QPDF_VERSION}" \
  && curl --fail --silent --show-error --location \
    --output libqpdf29_${QPDF_VERSION}-1_${TARGETARCH}.deb \
    https://github.com/paperless-ngx/builder/releases/download/qpdf-${QPDF_VERSION}/libqpdf29_${QPDF_VERSION}-1_${TARGETARCH}.deb \
  && curl --fail --silent --show-error --location \
    --output qpdf_${QPDF_VERSION}-1_${TARGETARCH}.deb \
    https://github.com/paperless-ngx/builder/releases/download/qpdf-${QPDF_VERSION}/qpdf_${QPDF_VERSION}-1_${TARGETARCH}.deb \
  && dpkg --install ./libqpdf29_${QPDF_VERSION}-1_${TARGETARCH}.deb \
  && dpkg --install ./qpdf_${QPDF_VERSION}-1_${TARGETARCH}.deb \
  && echo "Installing Ghostscript ${GS_VERSION}" \
  && curl --fail --silent --show-error --location \
    --output libgs10_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
    https://github.com/paperless-ngx/builder/releases/download/ghostscript-${GS_VERSION}/libgs10_${GS_VERSION}.dfsg-1_${TARGETARCH}.deb \
  && curl --fail --silent --show-error --location \
    --output ghostscript_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
    https://github.com/paperless-ngx/builder/releases/download/ghostscript-${GS_VERSION}/ghostscript_${GS_VERSION}.dfsg-1_${TARGETARCH}.deb \
  && curl --fail --silent --show-error --location \
    --output libgs10-common_${GS_VERSION}.dfsg-2_all.deb \
    https://github.com/paperless-ngx/builder/releases/download/ghostscript-${GS_VERSION}/libgs10-common_${GS_VERSION}.dfsg-1_all.deb \
  && dpkg --install ./libgs10-common_${GS_VERSION}.dfsg-2_all.deb \
  && dpkg --install ./libgs10_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
  && dpkg --install ./ghostscript_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
  && echo "Installing jbig2enc" \
  && curl --fail --silent --show-error --location \
    --output jbig2enc_${JBIG2ENC_VERSION}-1_${TARGETARCH}.deb \
    https://github.com/paperless-ngx/builder/releases/download/jbig2enc-${JBIG2ENC_VERSION}/jbig2enc_${JBIG2ENC_VERSION}-1_${TARGETARCH}.deb \
  && dpkg --install ./jbig2enc_${JBIG2ENC_VERSION}-1_${TARGETARCH}.deb \
  && echo "Cleaning up image layer" \
  && rm --force --verbose *.deb \
  && rm --recursive --force --verbose /var/lib/apt/lists/* \
  && rm -rf /var/lib/apt/lists/* \
  && echo "Installing supervisor" \
  && python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor==4.2.5
  && python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor==4.2.4

# Copy gunicorn config
# Changes very infrequently
@@ -158,6 +139,7 @@ WORKDIR /usr/src/paperless/
COPY gunicorn.conf.py .

# setup docker-specific things
# Use mounts to avoid copying installer files into the image
# These change sometimes, but rarely
WORKDIR /usr/src/paperless/src/docker/

@@ -168,7 +150,6 @@ COPY [ \
  "docker/docker-prepare.sh", \
  "docker/paperless_cmd.sh", \
  "docker/wait-for-redis.py", \
  "docker/env-from-file.sh", \
  "docker/management_script.sh", \
  "docker/flower-conditional.sh", \
  "docker/install_management_commands.sh", \
@@ -188,16 +169,35 @@ RUN set -eux \
  && chmod 755 /sbin/docker-prepare.sh \
  && mv wait-for-redis.py /sbin/wait-for-redis.py \
  && chmod 755 /sbin/wait-for-redis.py \
  && mv env-from-file.sh /sbin/env-from-file.sh \
  && chmod 755 /sbin/env-from-file.sh \
  && mv paperless_cmd.sh /usr/local/bin/paperless_cmd.sh \
  && chmod 755 /usr/local/bin/paperless_cmd.sh \
  && mv flower-conditional.sh /usr/local/bin/flower-conditional.sh \
  && chmod 755 /usr/local/bin/flower-conditional.sh \
  && echo "Installing management commands" \
  && echo "Installing managment commands" \
  && chmod +x install_management_commands.sh \
  && ./install_management_commands.sh

# Install the built packages from the installer library images
# Use mounts to avoid copying installer files into the image
# These change sometimes
RUN --mount=type=bind,from=qpdf-builder,target=/qpdf \
  --mount=type=bind,from=psycopg2-builder,target=/psycopg2 \
  --mount=type=bind,from=pikepdf-builder,target=/pikepdf \
  set -eux \
  && echo "Installing qpdf" \
  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/libqpdf29_*.deb \
  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/qpdf_*.deb \
  && echo "Installing pikepdf and dependencies" \
  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/pyparsing*.whl \
  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/packaging*.whl \
  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/lxml*.whl \
  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/Pillow*.whl \
  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/pikepdf*.whl \
  && python3 -m pip list \
  && echo "Installing psycopg2" \
  && python3 -m pip install --no-cache-dir /psycopg2/usr/src/wheels/psycopg2*.whl \
  && python3 -m pip list

WORKDIR /usr/src/paperless/src/

# Python dependencies
@@ -209,61 +209,39 @@ COPY --from=pipenv-base /usr/src/pipenv/requirements.txt ./
ARG BUILD_PACKAGES="\
  build-essential \
  git \
  # https://www.psycopg.org/docs/install.html#prerequisites
  libpq-dev \
  # https://github.com/PyMySQL/mysqlclient#linux
  default-libmysqlclient-dev \
  pkg-config"
  python3-dev"

# hadolint ignore=DL3042
RUN --mount=type=cache,target=/root/.cache/pip/,id=pip-cache \
  set -eux \
RUN set -eux \
  && echo "Installing build system packages" \
  && apt-get update \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && python3 -m pip install --no-cache-dir --upgrade wheel \
  && echo "Installing Python requirements" \
  && python3 -m pip install --default-timeout=1000 --requirement requirements.txt \
  && echo "Patching whitenoise for compression speedup" \
  && curl --fail --silent --show-error --location --output 484.patch https://github.com/evansd/whitenoise/pull/484.patch \
  && patch -d /usr/local/lib/python3.11/site-packages --verbose -p2 < 484.patch \
  && rm 484.patch \
  && echo "Installing NLTK data" \
  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" snowball_data \
  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" stopwords \
  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" punkt \
  && python3 -m pip install --default-timeout=1000 --no-cache-dir --requirement requirements.txt \
  && echo "Cleaning up image" \
  && apt-get --yes purge ${BUILD_PACKAGES} \
  && apt-get --yes autoremove --purge \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && apt-get clean --yes \
  && rm --recursive --force --verbose /var/lib/apt/lists/* \
  && rm --recursive --force --verbose /tmp/* \
  && rm --recursive --force --verbose /var/tmp/* \
  && rm --recursive --force --verbose /var/cache/apt/archives/* \
  && truncate --size 0 /var/log/*log
  && rm -rf /var/lib/apt/lists/* \
  && rm -rf /tmp/* \
  && rm -rf /var/tmp/* \
  && rm -rf /var/cache/apt/archives/* \
  && truncate -s 0 /var/log/*log

# copy backend
COPY --chown=1000:1000 ./src ./
COPY ./src ./

# copy frontend
COPY --from=compile-frontend --chown=1000:1000 /src/src/documents/static/frontend/ ./documents/static/frontend/
COPY --from=compile-frontend /src/src/documents/static/frontend/ ./documents/static/frontend/

# add users, setup scripts
# Mount the compiled frontend to expected location
RUN set -eux \
  && echo "Setting up user/group" \
  && addgroup --gid 1000 paperless \
  && useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
  && echo "Creating volume directories" \
  && mkdir --parents --verbose /usr/src/paperless/data \
  && mkdir --parents --verbose /usr/src/paperless/media \
  && mkdir --parents --verbose /usr/src/paperless/consume \
  && mkdir --parents --verbose /usr/src/paperless/export \
  && echo "Adjusting all permissions" \
  && chown --from root:root --changes --recursive paperless:paperless /usr/src/paperless \
  && echo "Collecting static files" \
  && gosu paperless python3 manage.py collectstatic --clear --no-input --link \
  && gosu paperless python3 manage.py compilemessages
  && addgroup --gid 1000 paperless \
  && useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
  && chown -R paperless:paperless ../ \
  && gosu paperless python3 manage.py collectstatic --clear --no-input \
  && gosu paperless python3 manage.py compilemessages

VOLUME ["/usr/src/paperless/data", \
  "/usr/src/paperless/media", \
@@ -275,5 +253,3 @@ ENTRYPOINT ["/sbin/docker-entrypoint.sh"]
EXPOSE 8000

CMD ["/usr/local/bin/paperless_cmd.sh"]

HEALTHCHECK --interval=30s --timeout=10s --retries=5 CMD [ "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000" ]
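
The pipenv-base stage above exists only to turn the Pipfile/Pipfile.lock pair into a plain requirements.txt that later stages feed to pip. A minimal local sketch of the same two steps, assuming Python 3 and a checkout containing the Pipfile files:

```bash
# Install pipenv, then export the locked dependency set as a plain
# requirements.txt (this mirrors the two commands in the pipenv-base stage).
python3 -m pip install --no-cache-dir --upgrade pipenv
pipenv requirements > requirements.txt
```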

Pipfile (126 changed lines)

@@ -3,98 +3,78 @@ url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"

[[source]]
url = "https://www.piwheels.org/simple"
verify_ssl = true
name = "piwheels"

[packages]
dateparser = "~=1.2"
# WARNING: django does not use semver.
# Only patch versions are guaranteed to not introduce breaking changes.
django = "~=4.2.11"
django-allauth = {extras = ["socialaccount"], version = "*"}
django-auditlog = "*"
django-celery-results = "*"
django-compression-middleware = "*"
dateparser = "~=1.1"
django = "~=4.1"
django-cors-headers = "*"
django-extensions = "*"
django-filter = "~=24.2"
django-guardian = "*"
django-multiselectfield = "*"
djangorestframework = "==3.14.0"
djangorestframework-guardian = "*"
drf-writable-nested = "*"
bleach = "*"
celery = {extras = ["redis"], version = "*"}
channels = "~=4.1"
channels-redis = "*"
concurrent-log-handler = "*"
django-filter = "~=22.1"
djangorestframework = "~=3.14"
filelock = "*"
flower = "*"
gotenberg-client = "*"
gunicorn = "*"
imap-tools = "*"
inotifyrecursive = "~=0.3"
langdetect = "*"
mysqlclient = "*"
nltk = "*"
ocrmypdf = "~=15.4"
pathvalidate = "*"
pdf2image = "*"
psycopg2 = "*"
python-dateutil = "*"
python-dotenv = "*"
pillow = "~=9.3"
pikepdf = "*"
python-gnupg = "*"
python-ipware = "*"
python-dotenv = "*"
python-dateutil = "*"
python-magic = "*"
pyzbar = "*"
psycopg2 = "*"
rapidfuzz = "*"
redis = {extras = ["hiredis"], version = "*"}
scikit-learn = "~=1.4"
setproctitle = "*"
tika-client = "*"
tqdm = "*"
uvicorn = {extras = ["standard"], version = "==0.25.0"}
watchdog = "~=4.0"
whitenoise = "~=6.6"
scikit-learn = "~=1.1"
# Pin this until piwheels is building 1.9 (see https://www.piwheels.org/project/scipy/)
scipy = "==1.8.1"
numpy = "*"
whitenoise = "~=6.2"
watchdog = "~=2.1"
whoosh="~=2.7"
zxing-cpp = {version = "*", platform_machine = "== 'x86_64'"}

# Locked for issues
# See https://github.com/paperless-ngx/paperless-ngx/discussions/6610 & https://bugs.launchpad.net/lxml/+bug/2059910
lxml = "==5.1.1"
inotifyrecursive = "~=0.3"
ocrmypdf = "~=14.0"
tqdm = "*"
tika = "*"
# TODO: This will sadly also install daphne+dependencies,
# which an ASGI server we don't need. Adds about 15MB image size.
channels = "~=3.0"
# Locked version until https://github.com/django/channels_redis/issues/332
# is resolved
channels-redis = "==3.4.1"
uvicorn = {extras = ["standard"], version = "*"}
concurrent-log-handler = "*"
"pdfminer.six" = "*"
"backports.zoneinfo" = {version = "*", markers = "python_version < '3.9'"}
"importlib-resources" = {version = "*", markers = "python_version < '3.9'"}
zipp = {version = "*", markers = "python_version < '3.9'"}
pyzbar = "*"
mysqlclient = "*"
celery = {extras = ["redis"], version = "*"}
django-celery-results = "*"
setproctitle = "*"
nltk = "*"
pdf2image = "*"
flower = "*"

[dev-packages]
# Linting
black = "*"
pre-commit = "*"
ruff = "*"
# Testing
coveralls = "*"
factory-boy = "*"
pycodestyle = "*"
pytest = "*"
pytest-cov = "*"
pytest-django = "*"
pytest-httpx = "*"
pytest-env = "*"
pytest-sugar = "*"
pytest-xdist = "*"
pytest-rerunfailures = "*"
imagehash = "*"
daphne = "*"
# Documentation
mkdocs-material = "*"
mkdocs-glightbox = "*"

[typing-dev]
mypy = "*"
types-Pillow = "*"
django-filter-stubs = "*"
types-python-dateutil = "*"
djangorestframework-stubs = {extras= ["compatible-mypy"], version="*"}
celery-types = "*"
django-stubs = {extras= ["compatible-mypy"], version="*"}
types-dateparser = "*"
types-bleach = "*"
types-redis = "*"
types-tqdm = "*"
types-Markdown = "*"
types-Pygments = "*"
types-colorama = "*"
types-psycopg2 = "*"
types-setuptools = "*"
sphinx = "~=5.3"
sphinx_rtd_theme = "*"
tox = "*"
black = "*"
pre-commit = "*"
sphinx-autobuild = "*"
myst-parser = "*"

Pipfile.lock (generated; 5477 changed lines, diff not shown)

README.md (94 changed lines)

@@ -1,16 +1,12 @@
[](https://github.com/paperless-ngx/paperless-ngx/actions)
[](https://crowdin.com/project/paperless-ngx)
[](https://docs.paperless-ngx.com)
[](https://codecov.io/gh/paperless-ngx/paperless-ngx)
[](https://paperless-ngx.readthedocs.io/en/latest/?badge=latest)
[](https://coveralls.io/github/paperless-ngx/paperless-ngx?branch=master)
[](https://matrix.to/#/%23paperlessngx%3Amatrix.org)
[](https://demo.paperless-ngx.com)

<p align="center">
  <picture>
    <source media="(prefers-color-scheme: dark)" srcset="https://github.com/paperless-ngx/paperless-ngx/blob/main/resources/logo/web/png/White%20logo%20-%20no%20background.png" width="50%">
    <source media="(prefers-color-scheme: light)" srcset="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png" width="50%">
    <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png" width="50%">
  </picture>
  <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png#gh-light-mode-only" width="50%" />
  <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/White%20logo%20-%20no%20background.png#gh-dark-mode-only" width="50%" />
</p>

<!-- omit in toc -->
@@ -19,9 +15,10 @@

Paperless-ngx is a document management system that transforms your physical documents into a searchable online archive so you can keep, well, _less paper_.

Paperless-ngx is the official successor to the original [Paperless](https://github.com/the-paperless-project/paperless) & [Paperless-ng](https://github.com/jonaswinkler/paperless-ng) projects and is designed to distribute the responsibility of advancing and supporting the project among a team of people. [Consider joining us!](#community-support)
Paperless-ngx forked from [paperless-ng](https://github.com/jonaswinkler/paperless-ng) to continue the great work and distribute responsibility of supporting and advancing the project among a team of people. [Consider joining us!](#community-support) Discussion of this transition can be found in issues
[#1599](https://github.com/jonaswinkler/paperless-ng/issues/1599) and [#1632](https://github.com/jonaswinkler/paperless-ng/issues/1632).

Thanks to the generous folks at [DigitalOcean](https://m.do.co/c/8d70b916d462), a demo is available at [demo.paperless-ngx.com](https://demo.paperless-ngx.com) using login `demo` / `demo`. _Note: demo content is reset frequently and confidential information should not be uploaded._
A demo is available at [demo.paperless-ngx.com](https://demo.paperless-ngx.com) using login `demo` / `demo`. _Note: demo content is reset frequently and confidential information should not be uploaded._

- [Features](#features)
- [Getting started](#getting-started)
@@ -30,56 +27,64 @@ Thanks to the generous folks at [DigitalOcean](https://m.do.co/c/8d70b916d462),
- [Translation](#translation)
- [Feature Requests](#feature-requests)
- [Bugs](#bugs)
- [Related Projects](#related-projects)
- [Affiliated Projects](#affiliated-projects)
- [Important Note](#important-note)

<p align="right">This project is supported by:<br/>
  <a href="https://m.do.co/c/8d70b916d462" style="padding-top: 4px; display: block;">
    <picture>
      <source media="(prefers-color-scheme: dark)" srcset="https://opensource.nyc3.cdn.digitaloceanspaces.com/attribution/assets/SVG/DO_Logo_horizontal_white.svg" width="140px">
      <source media="(prefers-color-scheme: light)" srcset="https://opensource.nyc3.cdn.digitaloceanspaces.com/attribution/assets/SVG/DO_Logo_horizontal_black_.svg" width="140px">
      <img src="https://opensource.nyc3.cdn.digitaloceanspaces.com/attribution/assets/SVG/DO_Logo_horizontal_black_.svg" width="140px">
    </picture>
  </a>
</p>

# Features

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docs/assets/screenshots/documents-smallcards-dark.png">
  <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docs/assets/screenshots/documents-smallcards.png">
  <img src="https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docs/assets/screenshots/documents-smallcards.png">
</picture>

A full list of [features](https://docs.paperless-ngx.com/#features) and [screenshots](https://docs.paperless-ngx.com/#screenshots) are available in the [documentation](https://docs.paperless-ngx.com/).
- Organize and index your scanned documents with tags, correspondents, types, and more.
- Performs OCR on your documents, adds selectable text to image only documents and adds tags, correspondents and document types to your documents.
- Supports PDF documents, images, plain text files, and Office documents (Word, Excel, Powerpoint, and LibreOffice equivalents).
  - Office document support is optional and provided by Apache Tika (see [configuration](https://paperless-ngx.readthedocs.io/en/latest/configuration.html#tika-settings))
- Paperless stores your documents plain on disk. Filenames and folders are managed by paperless and their format can be configured freely.
- Single page application front end.
  - Includes a dashboard that shows basic statistics and has document upload.
  - Filtering by tags, correspondents, types, and more.
  - Customizable views can be saved and displayed on the dashboard.
- Full text search helps you find what you need.
  - Auto completion suggests relevant words from your documents.
  - Results are sorted by relevance to your search query.
  - Highlighting shows you which parts of the document matched the query.
  - Searching for similar documents ("More like this")
- Email processing: Paperless adds documents from your email accounts.
  - Configure multiple accounts and filters for each account.
  - When adding documents from mail, paperless can move these mail to a new folder, mark them as read, flag them as important or delete them.
- Machine learning powered document matching.
  - Paperless-ngx learns from your documents and will be able to automatically assign tags, correspondents and types to documents once you've stored a few documents in paperless.
- Optimized for multi core systems: Paperless-ngx consumes multiple documents in parallel.
- The integrated sanity checker makes sure that your document archive is in good health.
- [More screenshots are available in the documentation](https://paperless-ngx.readthedocs.io/en/latest/screenshots.html).

# Getting started

The easiest way to deploy paperless is `docker compose`. The files in the [`/docker/compose` directory](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose) are configured to pull the image from GitHub Packages.
The easiest way to deploy paperless is docker-compose. The files in the [`/docker/compose` directory](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose) are configured to pull the image from Github Packages.

If you'd like to jump right in, you can configure a `docker compose` environment with our install script:
If you'd like to jump right in, you can configure a docker-compose environment with our install script:

```bash
bash -c "$(curl -L https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/install-paperless-ngx.sh)"
```

More details and step-by-step guides for alternative installation methods can be found in [the documentation](https://docs.paperless-ngx.com/setup/#installation).
Alternatively, you can install the dependencies and setup apache and a database server yourself. The [documentation](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation) has a step by step guide on how to do it.

Migrating from Paperless-ng is easy, just drop in the new docker image! See the [documentation on migrating](https://docs.paperless-ngx.com/setup/#migrating-to-paperless-ngx) for more details.
Migrating from Paperless-ng is easy, just drop in the new docker image! See the [documentation on migrating](https://paperless-ngx.readthedocs.io/en/latest/setup.html#migrating-from-paperless-ng) for more details.

<!-- omit in toc -->

### Documentation

The documentation for Paperless-ngx is available at [https://docs.paperless-ngx.com](https://docs.paperless-ngx.com/).
The documentation for Paperless-ngx is available on [ReadTheDocs](https://paperless-ngx.readthedocs.io/).

# Contributing

If you feel like contributing to the project, please do! Bug fixes, enhancements, visual fixes etc. are always welcome. If you want to implement something big: Please start a discussion about that! The [documentation](https://docs.paperless-ngx.com/development/) has some basic information on how to get started.
If you feel like contributing to the project, please do! Bug fixes, enhancements, visual fixes etc. are always welcome. If you want to implement something big: Please start a discussion about that! The [documentation](https://paperless-ngx.readthedocs.io/en/latest/extending.html) has some basic information on how to get started.

## Community Support

People interested in continuing the work on paperless-ngx are encouraged to reach out here on github and in the [Matrix Room](https://matrix.to/#/#paperless:matrix.org). If you would like to contribute to the project on an ongoing basis there are multiple [teams](https://github.com/orgs/paperless-ngx/people) (frontend, ci/cd, etc) that could use your help so please reach out!
People interested in continuing the work on paperless-ngx are encouraged to reach out here on github and in the [Matrix Room](https://matrix.to/#/#paperless:adnidor.de). If you would like to contribute to the project on an ongoing basis there are multiple [teams](https://github.com/orgs/paperless-ngx/people) (frontend, ci/cd, etc) that could use your help so please reach out!

## Translation

@@ -93,11 +98,24 @@ Feature requests can be submitted via [GitHub Discussions](https://github.com/pa

For bugs please [open an issue](https://github.com/paperless-ngx/paperless-ngx/issues) or [start a discussion](https://github.com/paperless-ngx/paperless-ngx/discussions) if you have questions.

# Related Projects
# Affiliated Projects

Please see [the wiki](https://github.com/paperless-ngx/paperless-ngx/wiki/Related-Projects) for a user-maintained list of related projects and software that is compatible with Paperless-ngx.
Paperless has been around a while now, and people are starting to build stuff on top of it. If you're one of those people, we can add your project to this list:

- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ng.
- [Paperless Share](https://github.com/qcasey/paperless_share). Share any files from your Android application with paperless. Very simple, but works with all of the mobile scanning apps out there that allow you to share scanned documents.
- [Scan to Paperless](https://github.com/sbrunner/scan-to-paperless): Scan and prepare (crop, deskew, OCR, ...) your documents for Paperless.
- [Paperless Mobile](https://github.com/astubenbord/paperless-mobile): A modern, feature rich mobile application for Paperless.

These projects also exist, but their status and compatibility with paperless-ngx is unknown.

- [paperless-cli](https://github.com/stgarf/paperless-cli): A golang command line binary to interact with a Paperless instance.

This project also exists, but needs updates to be compatible with paperless-ngx.

- [Paperless Desktop](https://github.com/thomasbrueggemann/paperless-desktop): A desktop UI for your Paperless installation. Runs on Mac, Linux, and Windows.
  Known issues on Mac: (Could not load reminders and documents)

# Important Note

> Document scanners are typically used to scan sensitive documents like your social insurance number, tax records, invoices, etc. **Paperless-ngx should never be run on an untrusted host** because information is stored in clear text without encryption. No guarantees are made regarding security (but we do try!) and you use the app at your own risk.
> **The safest way to run Paperless-ngx is on a local server in your own home with backups in place**.
Document scanners are typically used to scan sensitive documents. Things like your social insurance number, tax records, invoices, etc. Everything is stored in the clear without encryption. This means that Paperless should never be run on an untrusted host. Instead, I recommend that if you do want to use it, run it locally on a server in your own home.

@@ -1,9 +0,0 @@
# Security Policy

## Reporting a Vulnerability

The Paperless-ngx team and community take security bugs seriously. We appreciate your efforts to responsibly disclose your findings, and will make every effort to acknowledge your contributions.

To report a security issue, please use the GitHub Security Advisory ["Report a Vulnerability"](https://github.com/paperless-ngx/paperless-ngx/security/advisories/new) tab.

The team will send a response indicating the next steps in handling your report. After the initial reply to your report, the security team will keep you informed of the progress towards a fix and full announcement, and may ask for additional information or guidance.

build-docker-image.sh (new executable file, 47 lines)

@@ -0,0 +1,47 @@
#!/usr/bin/env bash

# Helper script for building the Docker image locally.
# Parses and provides the necessary versions of other images to Docker
# before passing in the rest of script args.

# First Argument: The Dockerfile to build
# Other Arguments: Additional arguments to docker build

# Example Usage:
# ./build-docker-image.sh Dockerfile -t paperless-ngx:my-awesome-feature

set -eux

if ! command -v jq; then
  echo "jq required"
  exit 1
elif [ ! -f "$1" ]; then
  echo "$1 is not a file, please provide the Dockerfile"
  exit 1
fi

# Parse what we can from Pipfile.lock
pikepdf_version=$(jq ".default.pikepdf.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
psycopg2_version=$(jq ".default.psycopg2.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
pillow_version=$(jq ".default.pillow.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
lxml_version=$(jq ".default.lxml.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
# Read this from the other config file
qpdf_version=$(jq ".qpdf.version" .build-config.json | sed 's/"//g')
jbig2enc_version=$(jq ".jbig2enc.version" .build-config.json | sed 's/"//g')
# Get the branch name (used for caching)
branch_name=$(git rev-parse --abbrev-ref HEAD)

# https://docs.docker.com/develop/develop-images/build_enhancements/
# Required to use cache-from
export DOCKER_BUILDKIT=1

docker build --file "$1" \
  --progress=plain \
  --cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:"${branch_name}" \
  --cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:dev \
  --build-arg JBIG2ENC_VERSION="${jbig2enc_version}" \
  --build-arg QPDF_VERSION="${qpdf_version}" \
  --build-arg PIKEPDF_VERSION="${pikepdf_version}" \
  --build-arg PILLOW_VERSION="${pillow_version}" \
  --build-arg LXML_VERSION="${lxml_version}" \
  --build-arg PSYCOPG2_VERSION="${psycopg2_version}" "${@:2}" .
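
The script leans on jq plus sed to pull pinned versions out of Pipfile.lock, which stores them as strings like "==6.0.0". A runnable sketch of that extraction against a throwaway lock file (the path and version here are made-up examples):

```bash
# Create a tiny stand-in for Pipfile.lock (hypothetical version number).
cat > /tmp/Pipfile.lock <<'EOF'
{ "default": { "pikepdf": { "version": "==6.0.0" } } }
EOF

# Same pipeline the script uses: strip the '==' pin, then the quotes.
jq ".default.pikepdf.version" /tmp/Pipfile.lock | sed 's/=//g' | sed 's/"//g'
# prints: 6.0.0
```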

@@ -1,6 +1,4 @@
project_id_env: CROWDIN_PROJECT_ID
api_token_env: CROWDIN_PERSONAL_TOKEN
preserve_hierarchy: true
commit_message: '[ci skip]'
files:
  - source: /src/locale/en_US/LC_MESSAGES/django.po
    translation: /src/locale/%locale_with_underscore%/LC_MESSAGES/django.po

docker-builders/Dockerfile.jbig2enc (new file, 35 lines)

@@ -0,0 +1,35 @@
# This Dockerfile compiles the jbig2enc library
# Inputs:
# - JBIG2ENC_VERSION - the Git tag to checkout and build

FROM debian:bullseye-slim as main

LABEL org.opencontainers.image.description="An intermediate image with jbig2enc built"

ARG DEBIAN_FRONTEND=noninteractive
ARG JBIG2ENC_VERSION

ARG BUILD_PACKAGES="\
  build-essential \
  automake \
  libtool \
  libleptonica-dev \
  zlib1g-dev \
  git \
  ca-certificates"

WORKDIR /usr/src/jbig2enc

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && echo "Building jbig2enc" \
  && git clone --quiet --branch $JBIG2ENC_VERSION https://github.com/agl/jbig2enc . \
  && ./autogen.sh \
  && ./configure \
  && make \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*

docker-builders/Dockerfile.pikepdf (new file, 92 lines)

@@ -0,0 +1,92 @@
# This Dockerfile builds the pikepdf wheel
# Inputs:
# - REPO - Docker repository to pull qpdf from
# - QPDF_VERSION - The image qpdf version to copy .deb files from
# - PIKEPDF_VERSION - Version of pikepdf to build wheel for

# Default to pulling from the main repo registry when manually building
ARG REPO="paperless-ngx/paperless-ngx"

ARG QPDF_VERSION
FROM ghcr.io/${REPO}/builder/qpdf:${QPDF_VERSION} as qpdf-builder

# This does nothing, except provide a name for a copy below

FROM python:3.9-slim-bullseye as main

LABEL org.opencontainers.image.description="An intermediate image with pikepdf wheel built"

ARG DEBIAN_FRONTEND=noninteractive
ARG PIKEPDF_VERSION
# These are not used, but will still bust the cache if one changes
# Otherwise, the main image will try to build things (and fail)
ARG PILLOW_VERSION
ARG LXML_VERSION

ARG BUILD_PACKAGES="\
  build-essential \
  python3-dev \
  python3-pip \
  # qpdf requirement - https://github.com/qpdf/qpdf#crypto-providers
  libgnutls28-dev \
  # lxml requirements - https://lxml.de/installation.html
  libxml2-dev \
  libxslt1-dev \
  # Pillow requirements - https://pillow.readthedocs.io/en/stable/installation.html#external-libraries
  # JPEG functionality
  libjpeg62-turbo-dev \
  # compressed PNG
  zlib1g-dev \
  # compressed TIFF
  libtiff-dev \
  # type related services
  libfreetype-dev \
  # color management
  liblcms2-dev \
  # WebP format
  libwebp-dev \
  # JPEG 2000
  libopenjp2-7-dev \
  # improved color quantization
  libimagequant-dev \
  # complex text layout support
  libraqm-dev"

WORKDIR /usr/src

COPY --from=qpdf-builder /usr/src/qpdf/*.deb ./

# As this is a base image for a multi-stage final image
# the added size of the install is basically irrelevant

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && echo "Installing qpdf" \
  && dpkg --install libqpdf29_*.deb \
  && dpkg --install libqpdf-dev_*.deb \
  && echo "Installing Python tools" \
  && python3 -m pip install --no-cache-dir --upgrade \
    pip \
    wheel \
    # https://pikepdf.readthedocs.io/en/latest/installation.html#requirements
    pybind11 \
  && echo "Building pikepdf wheel ${PIKEPDF_VERSION}" \
  && mkdir wheels \
  && python3 -m pip wheel \
    # Build the package at the required version
    pikepdf==${PIKEPDF_VERSION} \
    # Output the *.whl into this directory
    --wheel-dir wheels \
    # Do not use a binary package for the package being built
    --no-binary=pikepdf \
    # Do use binary packages for dependencies
    --prefer-binary \
    # Don't cache build files
    --no-cache-dir \
  && ls -ahl wheels \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*
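
Both wheel-builder Dockerfiles (this one and Dockerfile.psycopg2 below) share one pattern: pip wheel with --no-binary for the target package and --prefer-binary for everything else, so only the pinned package is compiled from source. A standalone sketch of that invocation, with an example version pin, to be run on a host that has the build packages listed above:

```bash
# Compile pikepdf itself from source, but pull prebuilt wheels for its
# dependencies; the finished wheel lands in ./wheels.
mkdir -p wheels
python3 -m pip wheel "pikepdf==6.0.0" \
  --wheel-dir wheels \
  --no-binary=pikepdf \
  --prefer-binary \
  --no-cache-dir
ls -ahl wheels
```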

docker-builders/Dockerfile.psycopg2 (new file, 48 lines)

@@ -0,0 +1,48 @@
# This Dockerfile builds the psycopg2 wheel
# Inputs:
# - PSYCOPG2_VERSION - Version to build

FROM python:3.9-slim-bullseye as main

LABEL org.opencontainers.image.description="An intermediate image with psycopg2 wheel built"

ARG PSYCOPG2_VERSION
ARG DEBIAN_FRONTEND=noninteractive

ARG BUILD_PACKAGES="\
  build-essential \
  python3-dev \
  python3-pip \
  # https://www.psycopg.org/docs/install.html#prerequisites
  libpq-dev"

WORKDIR /usr/src

# As this is a base image for a multi-stage final image
# the added size of the install is basically irrelevant

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && echo "Installing Python tools" \
  && python3 -m pip install --no-cache-dir --upgrade pip wheel \
  && echo "Building psycopg2 wheel ${PSYCOPG2_VERSION}" \
  && cd /usr/src \
  && mkdir wheels \
  && python3 -m pip wheel \
    # Build the package at the required version
    psycopg2==${PSYCOPG2_VERSION} \
    # Output the *.whl into this directory
    --wheel-dir wheels \
    # Do not use a binary package for the package being built
    --no-binary=psycopg2 \
    # Do use binary packages for dependencies
    --prefer-binary \
    # Don't cache build files
    --no-cache-dir \
  && ls -ahl wheels/ \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*

docker-builders/Dockerfile.qpdf (new file, 48 lines)

@@ -0,0 +1,48 @@
# This Dockerfile compiles qpdf
# Inputs:
# - QPDF_VERSION - the version of qpdf to build a .deb.
#   Must be present as a deb-src in bookworm

FROM debian:bullseye-slim as main

LABEL org.opencontainers.image.description="An intermediate image with qpdf built"

ARG DEBIAN_FRONTEND=noninteractive
# This must match to pikepdf's minimum at least
ARG QPDF_VERSION

ARG BUILD_PACKAGES="\
  build-essential \
  debhelper \
  debian-keyring \
  devscripts \
  equivs \
  libtool \
  # https://qpdf.readthedocs.io/en/stable/installation.html#system-requirements
  libjpeg62-turbo-dev \
  libgnutls28-dev \
  packaging-dev \
  cmake \
  zlib1g-dev"

WORKDIR /usr/src

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends $BUILD_PACKAGES \
  && echo "Getting qpdf src" \
  && echo "deb-src http://deb.debian.org/debian/ bookworm main" > /etc/apt/sources.list.d/bookworm-src.list \
  && apt-get update \
  && mkdir qpdf \
  && cd qpdf \
  && apt-get source --yes --quiet qpdf=${QPDF_VERSION}-1/bookworm \
  && echo "Building qpdf" \
  && cd qpdf-$QPDF_VERSION \
  && export DEB_BUILD_OPTIONS="terse nocheck nodoc parallel=2" \
  && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean \
  && ls -ahl ../*.deb \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*

@@ -1,27 +0,0 @@
# Docker Compose file for running paperless testing with actual gotenberg
# and Tika containers for a more end to end test of the Tika related functionality
# Can be used locally or by the CI to start the necessary containers with the
# correct networking for the tests

version: "3.7"
services:
  gotenberg:
    image: docker.io/gotenberg/gotenberg:7.10
    hostname: gotenberg
    container_name: gotenberg
    network_mode: host
    restart: unless-stopped
    # The gotenberg chromium route is used to convert .eml files. We do not
    # want to allow external content like tracking pixels or even javascript.
    command:
      - "gotenberg"
      - "--chromium-disable-javascript=true"
      - "--chromium-allow-list=file:///tmp/.*"
      - "--log-level=warn"
      - "--log-format=text"
  tika:
    image: ghcr.io/paperless-ngx/tika:latest
    hostname: tika
    container_name: tika
    network_mode: host
    restart: unless-stopped

@@ -1,4 +1,4 @@
# Docker Compose file for running paperless from the Docker Hub.
# docker-compose file for running paperless from the Docker Hub.
# This file contains everything paperless needs to run.
# Paperless supports amd64, arm and arm64 hardware.
#
@@ -10,7 +10,7 @@
# as this file and mounted to the correct folders inside the container.
# - Paperless listens on port 8000.
#
# In addition to that, this Docker Compose file adds the following optional
# In addition to that, this docker-compose file adds the following optional
# configurations:
#
# - Instead of SQLite (default), MariaDB is used as the database server.
@@ -23,9 +23,9 @@
#
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
#   and '.env' into a folder.
# - Run 'docker compose pull'.
# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker compose up -d'.
# - Run 'docker-compose pull'.
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker-compose up -d'.
#
# For more extensive installation and update instructions, refer to the
# documentation.
@@ -39,7 +39,7 @@ services:
      - redisdata:/data

  db:
    image: docker.io/library/mariadb:11
    image: docker.io/library/mariadb:10
    restart: unless-stopped
    volumes:
      - dbdata:/var/lib/mysql
@@ -49,6 +49,8 @@ services:
      MARIADB_USER: paperless
      MARIADB_PASSWORD: paperless
      MARIADB_ROOT_PASSWORD: paperless
    ports:
      - "3306:3306"

  webserver:
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
@@ -59,7 +61,12 @@ services:
      - gotenberg
      - tika
    ports:
      - "8000:8000"
      - 8000:8000
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 5
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media
@@ -78,14 +85,11 @@ services:
      PAPERLESS_TIKA_ENDPOINT: http://tika:9998

  gotenberg:
    image: docker.io/gotenberg/gotenberg:7.10
    image: docker.io/gotenberg/gotenberg:7.6
    restart: unless-stopped
    # The gotenberg chromium route is used to convert .eml files. We do not
    # want to allow external content like tracking pixels or even javascript.
    command:
      - "gotenberg"
      - "--chromium-disable-javascript=true"
      - "--chromium-allow-list=file:///tmp/.*"
      - "--chromium-disable-routes=true"

  tika:
    image: ghcr.io/paperless-ngx/tika:latest
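
One detail worth noting in these compose hunks: v2.8.3 quotes every port mapping ("8000:8000") where the older files left them bare. YAML 1.1 parsers read colon-separated number pairs whose right-hand part is below 60 as base-60 integers, so an unquoted low port pair can silently turn into a single number; 8000:8000 happens to be safe, and quoting everywhere removes the guesswork. A small demonstration, assuming python3 with PyYAML installed:

```bash
# An unquoted low port pair silently becomes the integer 22*60 + 22 = 1342.
python3 -c 'import yaml; print(yaml.safe_load("ports: [22:22]"))'
# -> {'ports': [1342]}

# Quoting preserves the mapping as a string.
python3 -c 'import yaml; print(yaml.safe_load("ports: [\"22:22\"]"))'
# -> {'ports': ['22:22']}
```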

@@ -1,4 +1,4 @@
# Docker Compose file for running paperless from the Docker Hub.
# docker-compose file for running paperless from the Docker Hub.
# This file contains everything paperless needs to run.
# Paperless supports amd64, arm and arm64 hardware.
#
@@ -10,7 +10,7 @@
# as this file and mounted to the correct folders inside the container.
# - Paperless listens on port 8000.
#
# In addition to that, this Docker Compose file adds the following optional
# In addition to that, this docker-compose file adds the following optional
# configurations:
#
# - Instead of SQLite (default), MariaDB is used as the database server.
@@ -19,9 +19,9 @@
#
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
#   and '.env' into a folder.
# - Run 'docker compose pull'.
# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker compose up -d'.
# - Run 'docker-compose pull'.
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker-compose up -d'.
#
# For more extensive installation and update instructions, refer to the
# documentation.
@@ -35,7 +35,7 @@ services:
      - redisdata:/data

  db:
    image: docker.io/library/mariadb:11
    image: docker.io/library/mariadb:10
    restart: unless-stopped
    volumes:
      - dbdata:/var/lib/mysql
@@ -45,6 +45,8 @@ services:
      MARIADB_USER: paperless
      MARIADB_PASSWORD: paperless
      MARIADB_ROOT_PASSWORD: paperless
    ports:
      - "3306:3306"

  webserver:
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
@@ -53,7 +55,12 @@ services:
      - db
      - broker
    ports:
      - "8000:8000"
      - 8000:8000
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 5
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media
@@ -68,6 +75,7 @@ services:
      PAPERLESS_DBPASS: paperless # only needed if non-default password
      PAPERLESS_DBPORT: 3306


volumes:
  data:
  media:

@@ -1,4 +1,4 @@
# Docker Compose file for running paperless from the Docker Hub.
# docker-compose file for running paperless from the Docker Hub.
# This file contains everything paperless needs to run.
# Paperless supports amd64, arm and arm64 hardware.
#
@@ -10,7 +10,7 @@
# as this file and mounted to the correct folders inside the container.
# - Paperless listens on port 8010.
#
# In addition to that, this Docker Compose file adds the following optional
# In addition to that, this docker-compose file adds the following optional
# configurations:
#
# - Instead of SQLite (default), PostgreSQL is used as the database server.
@@ -18,7 +18,7 @@
# To install and update paperless with this file, do the following:
#
# - Open portainer Stacks list and click 'Add stack'
# - Paste the contents of this file and assign a name, e.g. 'paperless'
# - Paste the contents of this file and assign a name, e.g. 'Paperless'
# - Click 'Deploy the stack' and wait for it to be deployed
# - Open the list of containers, select paperless_webserver_1
# - Click 'Console' and then 'Connect' to open the command line inside the container
@@ -37,7 +37,7 @@ services:
      - redisdata:/data

  db:
    image: docker.io/library/postgres:16
    image: docker.io/library/postgres:13
    restart: unless-stopped
    volumes:
      - pgdata:/var/lib/postgresql/data
@@ -53,7 +53,12 @@ services:
      - db
      - broker
    ports:
      - "8010:8000"
      - 8010:8000
    healthcheck:
      test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 5
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media

@@ -1,4 +1,4 @@
# Docker Compose file for running paperless from the docker container registry.
# docker-compose file for running paperless from the docker container registry.
# This file contains everything paperless needs to run.
# Paperless supports amd64, arm and arm64 hardware.
#
@@ -10,7 +10,7 @@
# as this file and mounted to the correct folders inside the container.
# - Paperless listens on port 8000.
#
# In addition to that, this Docker Compose file adds the following optional
# In addition to that, this docker-compose file adds the following optional
# configurations:
#
# - Instead of SQLite (default), PostgreSQL is used as the database server.
@@ -23,9 +23,9 @@
#
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
#   and '.env' into a folder.
# - Run 'docker compose pull'.
# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker compose up -d'.
# - Run 'docker-compose pull'.
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker-compose up -d'.
#
# For more extensive installation and update instructions, refer to the
# documentation.
@@ -39,7 +39,7 @@ services:
      - redisdata:/data

  db:
    image: docker.io/library/postgres:16
    image: docker.io/library/postgres:13
    restart: unless-stopped
    volumes:
      - pgdata:/var/lib/postgresql/data
@@ -57,7 +57,12 @@ services:
      - gotenberg
      - tika
    ports:
      - "8000:8000"
      - 8000:8000
    healthcheck:
      test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 5
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media
@@ -72,15 +77,11 @@ services:
      PAPERLESS_TIKA_ENDPOINT: http://tika:9998

  gotenberg:
    image: docker.io/gotenberg/gotenberg:7.10
    image: docker.io/gotenberg/gotenberg:7.6
    restart: unless-stopped

    # The gotenberg chromium route is used to convert .eml files. We do not
    # want to allow external content like tracking pixels or even javascript.
    command:
      - "gotenberg"
      - "--chromium-disable-javascript=true"
      - "--chromium-allow-list=file:///tmp/.*"
      - "--chromium-disable-routes=true"

  tika:
    image: ghcr.io/paperless-ngx/tika:latest

@@ -1,4 +1,4 @@
# Docker Compose file for running paperless from the Docker Hub.
# docker-compose file for running paperless from the Docker Hub.
# This file contains everything paperless needs to run.
# Paperless supports amd64, arm and arm64 hardware.
#
@@ -10,7 +10,7 @@
# as this file and mounted to the correct folders inside the container.
# - Paperless listens on port 8000.
#
# In addition to that, this Docker Compose file adds the following optional
# In addition to that, this docker-compose file adds the following optional
# configurations:
#
# - Instead of SQLite (default), PostgreSQL is used as the database server.
@@ -19,9 +19,9 @@
#
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
#   and '.env' into a folder.
# - Run 'docker compose pull'.
# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker compose up -d'.
# - Run 'docker-compose pull'.
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker-compose up -d'.
#
# For more extensive installation and update instructions, refer to the
# documentation.
@@ -35,7 +35,7 @@ services:
      - redisdata:/data

  db:
    image: docker.io/library/postgres:16
    image: docker.io/library/postgres:13
    restart: unless-stopped
    volumes:
      - pgdata:/var/lib/postgresql/data
@@ -51,7 +51,12 @@ services:
      - db
      - broker
    ports:
      - "8000:8000"
      - 8000:8000
    healthcheck:
      test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 5
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media
@@ -62,6 +67,7 @@ services:
      PAPERLESS_REDIS: redis://broker:6379
      PAPERLESS_DBHOST: db


volumes:
  data:
  media:

@@ -1,4 +1,4 @@
# Docker Compose file for running paperless from the docker container registry.
# docker-compose file for running paperless from the docker container registry.
# This file contains everything paperless needs to run.
# Paperless supports amd64, arm and arm64 hardware.
# All compose files of paperless configure paperless in the following way:
@@ -11,7 +11,7 @@
#
# SQLite is used as the database. The SQLite file is stored in the data volume.
#
# In addition to that, this Docker Compose file adds the following optional
# In addition to that, this docker-compose file adds the following optional
# configurations:
#
# - Apache Tika and Gotenberg servers are started with paperless and paperless
@@ -23,9 +23,9 @@
#
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
#   and '.env' into a folder.
# - Run 'docker compose pull'.
# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker compose up -d'.
# - Run 'docker-compose pull'.
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
# - Run 'docker-compose up -d'.
#
# For more extensive installation and update instructions, refer to the
# documentation.
@@ -46,7 +46,12 @@ services:
      - gotenberg
      - tika
    ports:
      - "8000:8000"
      - 8000:8000
    healthcheck:
      test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
      interval: 30s
      timeout: 10s
      retries: 5
    volumes:
      - data:/usr/src/paperless/data
      - media:/usr/src/paperless/media
@@ -60,15 +65,11 @@ services:
      PAPERLESS_TIKA_ENDPOINT: http://tika:9998

  gotenberg:
    image: docker.io/gotenberg/gotenberg:7.10
    image: docker.io/gotenberg/gotenberg:7.6
    restart: unless-stopped

    # The gotenberg chromium route is used to convert .eml files. We do not
    # want to allow external content like tracking pixels or even javascript.
    command:
      - "gotenberg"
      - "--chromium-disable-javascript=true"
      - "--chromium-allow-list=file:///tmp/.*"
      - "--chromium-disable-routes=true"

  tika:
    image: ghcr.io/paperless-ngx/tika:latest
|
||||
|
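The chromium flags diffed above restrict what Gotenberg's embedded browser may load while converting `.eml` files. A quick smoke test of the chromium route is sketched below; it assumes you have published the gotenberg container's port 3000, which the compose file itself does not do:

```bash
# Convert a trivial page to PDF via Gotenberg's chromium route
echo '<html><body>hello</body></html>' > index.html
curl --request POST http://localhost:3000/forms/chromium/convert/html \
    --form files=@index.html \
    --output out.pdf
```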
@@ -1,4 +1,4 @@
-# Docker Compose file for running paperless from the Docker Hub.
+# docker-compose file for running paperless from the Docker Hub.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -16,9 +16,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker compose pull'.
-# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker compose up -d'.
+# - Run 'docker-compose pull'.
+# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker-compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -37,7 +37,12 @@ services:
     depends_on:
       - broker
     ports:
-      - "8000:8000"
+      - 8000:8000
+    healthcheck:
+      test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
+      interval: 30s
+      timeout: 10s
+      retries: 5
     volumes:
       - data:/usr/src/paperless/data
       - media:/usr/src/paperless/media
@@ -47,6 +52,7 @@ services:
     environment:
       PAPERLESS_REDIS: redis://broker:6379
+

 volumes:
   data:
   media:
@@ -2,6 +2,37 @@

 set -e

+# Adapted from:
+# https://github.com/docker-library/postgres/blob/master/docker-entrypoint.sh
+# usage: file_env VAR
+#    ie: file_env 'XYZ_DB_PASSWORD' will allow for "$XYZ_DB_PASSWORD_FILE" to
+#    fill in the value of "$XYZ_DB_PASSWORD" from a file, especially for Docker's
+#    secrets feature
+file_env() {
+    local -r var="$1"
+    local -r fileVar="${var}_FILE"
+
+    # Basic validation
+    if [ "${!var:-}" ] && [ "${!fileVar:-}" ]; then
+        echo >&2 "error: both $var and $fileVar are set (but are exclusive)"
+        exit 1
+    fi
+
+    # Only export var if the _FILE exists
+    if [ "${!fileVar:-}" ]; then
+        # And the file exists
+        if [[ -f ${!fileVar} ]]; then
+            echo "Setting ${var} from file"
+            val="$(< "${!fileVar}")"
+            export "$var"="$val"
+        else
+            echo "File ${!fileVar} doesn't exist"
+            exit 1
+        fi
+    fi
+
+}
+
 # Source: https://github.com/sameersbn/docker-gitlab/
 map_uidgid() {
     local -r usermap_original_uid=$(id -u paperless)
@@ -22,54 +53,46 @@ map_folders() {
     export CONSUME_DIR="${PAPERLESS_CONSUMPTION_DIR:-/usr/src/paperless/consume}"
 }

-custom_container_init() {
-    # Mostly borrowed from the LinuxServer.io base image
-    # https://github.com/linuxserver/docker-baseimage-ubuntu/tree/bionic/root/etc/cont-init.d
-    local -r custom_script_dir="/custom-cont-init.d"
-    # Tamper checking.
-    # Don't run files which are owned by anyone except root
-    # Don't run files which are writeable by others
-    if [ -d "${custom_script_dir}" ]; then
-        if [ -n "$(/usr/bin/find "${custom_script_dir}" -maxdepth 1 ! -user root)" ]; then
-            echo "**** Potential tampering with custom scripts detected ****"
-            echo "**** The folder '${custom_script_dir}' must be owned by root ****"
-            return 0
-        fi
-        if [ -n "$(/usr/bin/find "${custom_script_dir}" -maxdepth 1 -perm -o+w)" ]; then
-            echo "**** The folder '${custom_script_dir}' or some of contents have write permissions for others, which is a security risk. ****"
-            echo "**** Please review the permissions and their contents to make sure they are owned by root, and can only be modified by root. ****"
-            return 0
-        fi
+nltk_data () {
+    # Store the NLTK data outside the Docker container
+    local -r nltk_data_dir="${DATA_DIR}/nltk"
+    local -r truthy_things=("yes y 1 t true")

-        # Make sure custom init directory has files in it
-        if [ -n "$(/bin/ls -A "${custom_script_dir}" 2>/dev/null)" ]; then
-            echo "[custom-init] files found in ${custom_script_dir} executing"
-            # Loop over files in the directory
-            for SCRIPT in "${custom_script_dir}"/*; do
-                NAME="$(basename "${SCRIPT}")"
-                if [ -f "${SCRIPT}" ]; then
-                    echo "[custom-init] ${NAME}: executing..."
-                    /bin/bash "${SCRIPT}"
-                    echo "[custom-init] ${NAME}: exited $?"
-                elif [ ! -f "${SCRIPT}" ]; then
-                    echo "[custom-init] ${NAME}: is not a file"
-                fi
-            done
-        else
-            echo "[custom-init] no custom files found exiting..."
-        fi
+    # If not set, or it looks truthy
+    if [[ -z "${PAPERLESS_ENABLE_NLTK}" ]] || [[ "${truthy_things[*]}" =~ ${PAPERLESS_ENABLE_NLTK,} ]]; then
+
+        # Download or update the snowball stemmer data
+        python3 -W ignore::RuntimeWarning -m nltk.downloader -d "${nltk_data_dir}" snowball_data
+
+        # Download or update the stopwords corpus
+        python3 -W ignore::RuntimeWarning -m nltk.downloader -d "${nltk_data_dir}" stopwords
+
+        # Download or update the punkt tokenizer data
+        python3 -W ignore::RuntimeWarning -m nltk.downloader -d "${nltk_data_dir}" punkt
+
+    else
+        echo "Skipping NLTK data download"
+
+    fi
+
 }

 initialize() {

-    # Setup environment from secrets before anything else
-    # Check for a version of this var with _FILE appended
-    # and convert the contents to the env var value
-    # Source it so export is persistent
-    # shellcheck disable=SC1091
-    source /sbin/env-from-file.sh
+    for env_var in \
+        PAPERLESS_DBUSER \
+        PAPERLESS_DBPASS \
+        PAPERLESS_SECRET_KEY \
+        PAPERLESS_AUTO_LOGIN_USERNAME \
+        PAPERLESS_ADMIN_USER \
+        PAPERLESS_ADMIN_MAIL \
+        PAPERLESS_ADMIN_PASSWORD \
+        PAPERLESS_REDIS; do
+        # Check for a version of this var with _FILE appended
+        # and convert the contents to the env var value
+        file_env ${env_var}
+    done

     # Change the user and group IDs if needed
     map_uidgid
@@ -86,17 +109,19 @@ initialize() {
         "${CONSUME_DIR}"; do
         if [[ ! -d "${dir}" ]]; then
             echo "Creating directory ${dir}"
-            mkdir --parents "${dir}"
+            mkdir "${dir}"
         fi
     done

-    local -r tmp_dir="${PAPERLESS_SCRATCH_DIR:=/tmp/paperless}"
-    echo "Creating directory scratch directory ${tmp_dir}"
-    mkdir --parents "${tmp_dir}"
+    local -r tmp_dir="/tmp/paperless"
+    echo "Creating directory ${tmp_dir}"
+    mkdir -p "${tmp_dir}"
+
+    nltk_data

     set +e
     echo "Adjusting permissions of paperless files. This may take a while."
-    chown -R paperless:paperless "${tmp_dir}"
+    chown -R paperless:paperless ${tmp_dir}
     for dir in \
         "${export_dir}" \
         "${DATA_DIR}" \
@@ -107,10 +132,6 @@ initialize() {
     set -e

     "${gosu_cmd[@]}" /sbin/docker-prepare.sh
-
-    # Leave this last thing
-    custom_container_init
-
 }

 install_languages() {
@@ -126,6 +147,10 @@ install_languages() {

     for lang in "${langs[@]}"; do
         pkg="tesseract-ocr-$lang"
+        # English is installed by default
+        #if [[ "$lang" == "eng" ]]; then
+        #    continue
+        #fi

         if dpkg -s "$pkg" &>/dev/null; then
             echo "Package $pkg already installed!"
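As a minimal sketch of how the `file_env` helper added above behaves (the secret path and value are made up for illustration):

```bash
# Hypothetical secret file
echo 's3cr3t' > /tmp/db_pass

# Only the _FILE variant is set, so file_env exports the plain variable
export PAPERLESS_DBPASS_FILE=/tmp/db_pass
file_env PAPERLESS_DBPASS
echo "${PAPERLESS_DBPASS}"    # prints: s3cr3t

# Setting both PAPERLESS_DBPASS and PAPERLESS_DBPASS_FILE would abort with
# "error: both ... are set (but are exclusive)"
```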
@@ -20,6 +20,7 @@ wait_for_postgres() {
         exit 1
     else
         echo "Attempt $attempt_num failed! Trying again in 5 seconds..."
+
     fi

     attempt_num=$(("$attempt_num" + 1))
@@ -36,8 +37,6 @@ wait_for_mariadb() {
     local attempt_num=1
     local -r max_attempts=5

-    # Disable warning, host and port can't have spaces
-    # shellcheck disable=SC2086
     while ! true > /dev/tcp/$host/$port; do

         if [ $attempt_num -eq $max_attempts ]; then
@@ -68,19 +67,13 @@ migrations() {
         # of the current container starts.
         flock 200
         echo "Apply database migrations..."
-        python3 manage.py migrate --skip-checks --no-input
+        python3 manage.py migrate
     ) 200>"${DATA_DIR}/migration_lock"
 }

-django_checks() {
-    # Explicitly run the Django system checks
-    echo "Running Django checks"
-    python3 manage.py check
-}
-
 search_index() {

-    local -r index_version=9
+    local -r index_version=1
     local -r index_version_file=${DATA_DIR}/.index_version

     if [[ (! -f "${index_version_file}") || $(<"${index_version_file}") != "$index_version" ]]; then
@@ -96,6 +89,46 @@ superuser() {
     fi
 }

+custom_container_init() {
+    # Mostly borrowed from the LinuxServer.io base image
+    # https://github.com/linuxserver/docker-baseimage-ubuntu/tree/bionic/root/etc/cont-init.d
+    local -r custom_script_dir="/custom-cont-init.d"
+    # Tamper checking.
+    # Don't run files which are owned by anyone except root
+    # Don't run files which are writeable by others
+    if [ -d "${custom_script_dir}" ]; then
+        if [ -n "$(/usr/bin/find "${custom_script_dir}" -maxdepth 1 ! -user root)" ]; then
+            echo "**** Potential tampering with custom scripts detected ****"
+            echo "**** The folder '${custom_script_dir}' must be owned by root ****"
+            return 0
+        fi
+        if [ -n "$(/usr/bin/find "${custom_script_dir}" -maxdepth 1 -perm -o+w)" ]; then
+            echo "**** The folder '${custom_script_dir}' or some of contents have write permissions for others, which is a security risk. ****"
+            echo "**** Please review the permissions and their contents to make sure they are owned by root, and can only be modified by root. ****"
+            return 0
+        fi
+
+        # Make sure custom init directory has files in it
+        if [ -n "$(/bin/ls -A "${custom_script_dir}" 2>/dev/null)" ]; then
+            echo "[custom-init] files found in ${custom_script_dir} executing"
+            # Loop over files in the directory
+            for SCRIPT in "${custom_script_dir}"/*; do
+                NAME="$(basename "${SCRIPT}")"
+                if [ -f "${SCRIPT}" ]; then
+                    echo "[custom-init] ${NAME}: executing..."
+                    /bin/bash "${SCRIPT}"
+                    echo "[custom-init] ${NAME}: exited $?"
+                elif [ ! -f "${SCRIPT}" ]; then
+                    echo "[custom-init] ${NAME}: is not a file"
+                fi
+            done
+        else
+            echo "[custom-init] no custom files found exiting..."
+        fi
+
+    fi
+}
+
 do_work() {
     if [[ "${PAPERLESS_DBENGINE}" == "mariadb" ]]; then
         wait_for_mariadb
@@ -107,12 +140,13 @@ do_work() {

     migrations

-    django_checks
-
     search_index

     superuser

+    # Leave this last thing
+    custom_container_init
+
 }

 do_work
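The `custom_container_init` function moved in this diff only executes scripts that are owned by root and not writable by others. A host-side layout that passes those checks might look like this; the host path is an assumption:

```bash
# Hypothetical host folder that will be mounted at /custom-cont-init.d
sudo mkdir -p /opt/paperless/custom-cont-init.d
sudo tee /opt/paperless/custom-cont-init.d/10-example.sh >/dev/null <<'EOF'
#!/usr/bin/env bash
echo "running custom init"
EOF
# Owned by root and not world-writable, or the tamper checks skip everything
sudo chown -R root:root /opt/paperless/custom-cont-init.d
sudo chmod 755 /opt/paperless/custom-cont-init.d/10-example.sh
```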
@@ -1,42 +0,0 @@
#!/usr/bin/env bash

# Scans the environment variables for those with the suffix _FILE
# When located, checks the file exists, and exports the contents
# of the file as the same name, minus the suffix
# This allows the use of Docker secrets or mounted files
# to fill in any of the settings configurable via environment
# variables

set -eu

for line in $(printenv)
do
    # Extract the name of the environment variable
    env_name=${line%%=*}
    # Check if it starts with "PAPERLESS_" and ends in "_FILE"
    if [[ ${env_name} == PAPERLESS_*_FILE ]]; then
        # This should have been named differently..
        if [[ ${env_name} == "PAPERLESS_OCR_SKIP_ARCHIVE_FILE" ]]; then
            continue
        fi
        # Extract the value of the environment
        env_value=${line#*=}

        # Check the file exists
        if [[ -f ${env_value} ]]; then

            # Trim off the _FILE suffix
            non_file_env_name=${env_name%"_FILE"}
            echo "Setting ${non_file_env_name} from file"

            # Reads the value from the file
            val="$(< "${!env_name}")"

            # Sets the normal name to the read file contents
            export "${non_file_env_name}"="${val}"

        else
            echo "File ${env_value} referenced by ${env_name} doesn't exist"
        fi
    fi
done
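The deleted script above is what allowed any `PAPERLESS_*` setting to be supplied via a `*_FILE` companion variable. A typical use, with an assumed host path for the secret:

```bash
docker run -d \
    -v /opt/secrets/dbpass:/run/secrets/dbpass:ro \
    -e PAPERLESS_DBPASS_FILE=/run/secrets/dbpass \
    ghcr.io/paperless-ngx/paperless-ngx:latest
```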
@@ -3,10 +3,5 @@
 echo "Checking if we should start flower..."

 if [[ -n "${PAPERLESS_ENABLE_FLOWER}" ]]; then
-    # Small delay to allow celery to be up first
-    echo "Starting flower in 5s"
-    sleep 5
-    celery --app paperless flower --conf=/usr/src/paperless/src/paperless/flowerconfig.py
-else
-    echo "Not starting flower"
+    celery --app paperless flower
 fi
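Both sides of this diff start Flower only when `PAPERLESS_ENABLE_FLOWER` is non-empty, so enabling it is a one-line change to the environment file; Flower then listens on its default port (5555) unless the flowerconfig overrides it:

```bash
echo 'PAPERLESS_ENABLE_FLOWER=1' >> docker-compose.env
docker compose up -d
```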
@@ -13,7 +13,6 @@ for command in decrypt_documents \
     document_retagger \
     document_thumbnails \
     document_sanity_checker \
-    document_fuzzy_match \
     manage_superuser;
 do
     echo "installing $command..."
@@ -3,9 +3,6 @@
 set -e

 cd /usr/src/paperless/src/
-# This ensures environment is setup
-# shellcheck disable=SC1091
-source /sbin/env-from-file.sh

 if [[ $(id -u) == 0 ]] ;
 then
@@ -1,15 +1,14 @@
 #!/usr/bin/env bash

-SUPERVISORD_WORKING_DIR="${PAPERLESS_SUPERVISORD_WORKING_DIR:-$PWD}"
 rootless_args=()
 if [ "$(id -u)" == "$(id -u paperless)" ]; then
     rootless_args=(
         --user
         paperless
         --logfile
-        "${SUPERVISORD_WORKING_DIR}/supervisord.log"
+        supervisord.log
         --pidfile
-        "${SUPERVISORD_WORKING_DIR}/supervisord.pid"
+        supervisord.pid
     )
 fi

@@ -15,7 +15,6 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
-environment = HOME="/usr/src/paperless",USER="paperless"

 [program:consumer]
 command=python3 manage.py document_consumer
@@ -26,11 +25,10 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
-environment = HOME="/usr/src/paperless",USER="paperless"

 [program:celery]

-command = celery --app paperless worker --loglevel INFO --without-mingle --without-gossip
+command = celery --app paperless worker --loglevel INFO
 user=paperless
 stopasgroup = true
 stopwaitsecs = 60
@@ -39,7 +37,6 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
-environment = HOME="/usr/src/paperless",USER="paperless"

 [program:celery-beat]

@@ -51,7 +48,6 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
-environment = HOME="/usr/src/paperless",USER="paperless"

 [program:celery-flower]
 command = /usr/local/bin/flower-conditional.sh
@@ -62,4 +58,3 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
-environment = HOME="/usr/src/paperless",USER="paperless"
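To observe the programs configured above at runtime, supervisord can be queried from inside the container; the container name is an assumption, and `supervisorctl` must be able to reach the supervisord socket configured for the image:

```bash
docker exec -it paperless-webserver-1 supervisorctl status
docker exec -it paperless-webserver-1 supervisorctl restart celery
```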
@@ -12,12 +12,13 @@ from typing import Final
 from redis import Redis

 if __name__ == "__main__":

     MAX_RETRY_COUNT: Final[int] = 5
     RETRY_SLEEP_SECONDS: Final[int] = 5

     REDIS_URL: Final[str] = os.getenv("PAPERLESS_REDIS", "redis://localhost:6379")

-    print("Waiting for Redis...", flush=True)
+    print(f"Waiting for Redis...", flush=True)

     attempt = 0
     with Redis.from_url(url=REDIS_URL) as client:
@@ -28,7 +29,7 @@ if __name__ == "__main__":
         except Exception as e:
             print(
                 f"Redis ping #{attempt} failed.\n"
-                f"Error: {e!s}.\n"
+                f"Error: {str(e)}.\n"
                 f"Waiting {RETRY_SLEEP_SECONDS}s",
                 flush=True,
             )
@@ -36,8 +37,8 @@ if __name__ == "__main__":
             attempt += 1

     if attempt >= MAX_RETRY_COUNT:
-        print("Failed to connect to redis using environment variable PAPERLESS_REDIS.")
+        print(f"Failed to connect to redis using environment variable PAPERLESS_REDIS.")
         sys.exit(os.EX_UNAVAILABLE)
     else:
-        print("Connected to Redis broker.")
+        print(f"Connected to Redis broker.")
         sys.exit(os.EX_OK)
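A rough shell equivalent of this Python readiness probe, assuming `redis-cli` is installed, would be:

```bash
REDIS_URL="${PAPERLESS_REDIS:-redis://localhost:6379}"
for attempt in 1 2 3 4 5; do
    redis-cli -u "${REDIS_URL}" ping | grep -q PONG && exit 0
    echo "Redis ping #${attempt} failed. Waiting 5s"
    sleep 5
done
exit 69    # same exit code as os.EX_UNAVAILABLE
```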
docs/Makefile (new file, 181 lines)

@@ -0,0 +1,181 @@
|
||||
# Makefile for Sphinx documentation
|
||||
#
|
||||
|
||||
# You can set these variables from the command line.
|
||||
SPHINXOPTS =
|
||||
SPHINXBUILD = sphinx-build
|
||||
PAPER =
|
||||
BUILDDIR = _build
|
||||
|
||||
# User-friendly check for sphinx-build
|
||||
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
|
||||
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
|
||||
endif
|
||||
|
||||
# Internal variables.
|
||||
PAPEROPT_a4 = -D latex_paper_size=a4
|
||||
PAPEROPT_letter = -D latex_paper_size=letter
|
||||
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
|
||||
# the i18n builder cannot share the environment and doctrees with the others
|
||||
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
|
||||
|
||||
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
|
||||
|
||||
help:
|
||||
@echo "Please use \`make <target>' where <target> is one of"
|
||||
@echo " html to make standalone HTML files"
|
||||
@echo " livehtml to preview changes with live reload in your browser"
|
||||
@echo " dirhtml to make HTML files named index.html in directories"
|
||||
@echo " singlehtml to make a single large HTML file"
|
||||
@echo " pickle to make pickle files"
|
||||
@echo " json to make JSON files"
|
||||
@echo " htmlhelp to make HTML files and a HTML help project"
|
||||
@echo " qthelp to make HTML files and a qthelp project"
|
||||
@echo " devhelp to make HTML files and a Devhelp project"
|
||||
@echo " epub to make an epub"
|
||||
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
|
||||
@echo " latexpdf to make LaTeX files and run them through pdflatex"
|
||||
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
|
||||
@echo " text to make text files"
|
||||
@echo " man to make manual pages"
|
||||
@echo " texinfo to make Texinfo files"
|
||||
@echo " info to make Texinfo files and run them through makeinfo"
|
||||
@echo " gettext to make PO message catalogs"
|
||||
@echo " changes to make an overview of all changed/added/deprecated items"
|
||||
@echo " xml to make Docutils-native XML files"
|
||||
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
|
||||
@echo " linkcheck to check all external links for integrity"
|
||||
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
|
||||
|
||||
clean:
|
||||
rm -rf $(BUILDDIR)/*
|
||||
|
||||
html:
|
||||
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
|
||||
|
||||
livehtml:
|
||||
sphinx-autobuild "./" "$(BUILDDIR)" $(O)
|
||||
|
||||
dirhtml:
|
||||
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
|
||||
|
||||
singlehtml:
|
||||
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
|
||||
@echo
|
||||
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
|
||||
|
||||
pickle:
|
||||
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
|
||||
@echo
|
||||
@echo "Build finished; now you can process the pickle files."
|
||||
|
||||
json:
|
||||
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
|
||||
@echo
|
||||
@echo "Build finished; now you can process the JSON files."
|
||||
|
||||
htmlhelp:
|
||||
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
|
||||
@echo
|
||||
@echo "Build finished; now you can run HTML Help Workshop with the" \
|
||||
".hhp project file in $(BUILDDIR)/htmlhelp."
|
||||
|
||||
qthelp:
|
||||
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
|
||||
@echo
|
||||
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
|
||||
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
|
||||
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/RIPEAtlasToolsMagellan.qhcp"
|
||||
@echo "To view the help file:"
|
||||
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/RIPEAtlasToolsMagellan.qhc"
|
||||
|
||||
devhelp:
|
||||
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
|
||||
@echo
|
||||
@echo "Build finished."
|
||||
@echo "To view the help file:"
|
||||
@echo "# mkdir -p $$HOME/.local/share/devhelp/RIPEAtlasToolsMagellan"
|
||||
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/RIPEAtlasToolsMagellan"
|
||||
@echo "# devhelp"
|
||||
|
||||
epub:
|
||||
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
|
||||
@echo
|
||||
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
|
||||
|
||||
latex:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
|
||||
@echo
|
||||
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
|
||||
@echo "Run \`make' in that directory to run these through (pdf)latex" \
|
||||
"(use \`make latexpdf' here to do that automatically)."
|
||||
|
||||
latexpdf:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
|
||||
@echo "Running LaTeX files through pdflatex..."
|
||||
$(MAKE) -C $(BUILDDIR)/latex all-pdf
|
||||
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
|
||||
|
||||
latexpdfja:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
|
||||
@echo "Running LaTeX files through platex and dvipdfmx..."
|
||||
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
|
||||
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
|
||||
|
||||
text:
|
||||
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
|
||||
@echo
|
||||
@echo "Build finished. The text files are in $(BUILDDIR)/text."
|
||||
|
||||
man:
|
||||
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
|
||||
@echo
|
||||
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
|
||||
|
||||
texinfo:
|
||||
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
|
||||
@echo
|
||||
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
|
||||
@echo "Run \`make' in that directory to run these through makeinfo" \
|
||||
"(use \`make info' here to do that automatically)."
|
||||
|
||||
info:
|
||||
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
|
||||
@echo "Running Texinfo files through makeinfo..."
|
||||
make -C $(BUILDDIR)/texinfo info
|
||||
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
|
||||
|
||||
gettext:
|
||||
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
|
||||
@echo
|
||||
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
|
||||
|
||||
changes:
|
||||
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
|
||||
@echo
|
||||
@echo "The overview file is in $(BUILDDIR)/changes."
|
||||
|
||||
linkcheck:
|
||||
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
|
||||
@echo
|
||||
@echo "Link check complete; look for any errors in the above output " \
|
||||
"or in $(BUILDDIR)/linkcheck/output.txt."
|
||||
|
||||
doctest:
|
||||
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
|
||||
@echo "Testing of doctests in the sources finished, look at the " \
|
||||
"results in $(BUILDDIR)/doctest/output.txt."
|
||||
|
||||
xml:
|
||||
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
|
||||
@echo
|
||||
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
|
||||
|
||||
pseudoxml:
|
||||
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
|
||||
@echo
|
||||
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
|
docs/assets/.keep → docs/_static/.keep (renamed, 0 changes)

docs/_static/css/custom.css (new file, 605 lines, vendored)

@@ -0,0 +1,605 @@
|
||||
/* Variables */
|
||||
:root {
|
||||
--color-text-body: #5c5962;
|
||||
--color-text-body-light: #fcfcfc;
|
||||
--color-text-anchor: #7253ed;
|
||||
--color-text-alt: rgba(0, 0, 0, 0.3);
|
||||
--color-text-title: #27262b;
|
||||
--color-text-code-inline: #e74c3c;
|
||||
--color-text-code-nt: #062873;
|
||||
--color-text-selection: #b19eff;
|
||||
--color-bg-body: #fcfcfc;
|
||||
--color-bg-body-alt: #f3f6f6;
|
||||
--color-bg-side-nav: #f5f6fa;
|
||||
--color-bg-side-nav-hover: #ebedf5;
|
||||
--color-bg-code-block: var(--color-bg-side-nav);
|
||||
--color-border: #eeebee;
|
||||
--color-btn-neutral-bg: #f3f6f6;
|
||||
--color-btn-neutral-bg-hover: #e5ebeb;
|
||||
--color-success-title: #1abc9c;
|
||||
--color-success-body: #dbfaf4;
|
||||
--color-warning-title: #f0b37e;
|
||||
--color-warning-body: #ffedcc;
|
||||
--color-danger-title: #f29f97;
|
||||
--color-danger-body: #fdf3f2;
|
||||
--color-info-title: #6ab0de;
|
||||
--color-info-body: #e7f2fa;
|
||||
}
|
||||
|
||||
.dark-mode {
|
||||
--color-text-body: #abb2bf;
|
||||
--color-text-body-light: #9499a2;
|
||||
--color-text-alt: rgba(255, 255, 255, 0.5);
|
||||
--color-text-title: var(--color-text-anchor);
|
||||
--color-text-code-inline: #abb2bf;
|
||||
--color-text-code-nt: #2063f3;
|
||||
--color-text-selection: #030303;
|
||||
--color-bg-body: #1d1d20 !important;
|
||||
--color-bg-body-alt: #131315;
|
||||
--color-bg-side-nav: #18181a;
|
||||
--color-bg-side-nav-hover: #101216;
|
||||
--color-bg-code-block: #101216;
|
||||
--color-border: #47494f;
|
||||
--color-btn-neutral-bg: #242529;
|
||||
--color-btn-neutral-bg-hover: #101216;
|
||||
--color-success-title: #02120f;
|
||||
--color-success-body: #041b17;
|
||||
--color-warning-title: #1b0e03;
|
||||
--color-warning-body: #371d06;
|
||||
--color-danger-title: #120902;
|
||||
--color-danger-body: #1b0503;
|
||||
--color-info-title: #020608;
|
||||
--color-info-body: #06141e;
|
||||
}
|
||||
|
||||
* {
|
||||
transition: background-color 0.3s ease, border-color 0.3s ease;
|
||||
}
|
||||
|
||||
/* Typography */
|
||||
body {
|
||||
font-family: system-ui,-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif;
|
||||
font-size: inherit;
|
||||
line-height: 1.4;
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.rst-content p {
|
||||
word-break: break-word;
|
||||
}
|
||||
|
||||
h1, h2, h3, h4, h5, h6 {
|
||||
font-family: inherit;
|
||||
}
|
||||
|
||||
.rst-content .toctree-wrapper>p.caption, .rst-content h1, .rst-content h2, .rst-content h3, .rst-content h4, .rst-content h5, .rst-content h6 {
|
||||
padding-top: .5em;
|
||||
}
|
||||
|
||||
p, .main-content-wrap, .rst-content .section ul, .rst-content .toctree-wrapper ul, .rst-content section ul, .wy-plain-list-disc, article ul {
|
||||
line-height: 1.6;
|
||||
}
|
||||
|
||||
pre, .code, .rst-content .linenodiv pre, .rst-content div[class^=highlight] pre, .rst-content pre.literal-block {
|
||||
font-family: "SFMono-Regular", Menlo,Consolas, Monospace;
|
||||
font-size: 0.75em;
|
||||
line-height: 1.8;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4 {
|
||||
font-size: 1rem
|
||||
}
|
||||
|
||||
.rst-versions {
|
||||
font-family: inherit;
|
||||
line-height: 1;
|
||||
}
|
||||
|
||||
footer, footer p {
|
||||
font-size: .8rem;
|
||||
}
|
||||
|
||||
footer .rst-footer-buttons {
|
||||
font-size: 1rem;
|
||||
}
|
||||
|
||||
@media (max-width: 400px) {
|
||||
/* break code lines on mobile */
|
||||
pre, code {
|
||||
word-break: break-word;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/* Layout */
|
||||
.wy-side-nav-search, .wy-menu-vertical {
|
||||
width: auto;
|
||||
}
|
||||
|
||||
.wy-nav-side {
|
||||
z-index: 0;
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
background-color: var(--color-bg-side-nav);
|
||||
}
|
||||
|
||||
.wy-side-scroll {
|
||||
width: 100%;
|
||||
overflow-y: auto;
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-side-scroll {
|
||||
width:264px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 50rem) {
|
||||
.wy-nav-side {
|
||||
flex-wrap: nowrap;
|
||||
position: fixed;
|
||||
width: 248px;
|
||||
height: 100%;
|
||||
flex-direction: column;
|
||||
border-right: 1px solid var(--color-border);
|
||||
align-items:flex-end
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-nav-side {
|
||||
width: calc((100% - 1064px) / 2 + 264px);
|
||||
min-width:264px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 50rem) {
|
||||
.wy-nav-content-wrap {
|
||||
position: relative;
|
||||
max-width: 800px;
|
||||
margin-left:248px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-nav-content-wrap {
|
||||
margin-left:calc((100% - 1064px) / 2 + 264px)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/* Colors */
|
||||
body.wy-body-for-nav,
|
||||
.wy-nav-content {
|
||||
background: var(--color-bg-body);
|
||||
}
|
||||
|
||||
.wy-nav-side {
|
||||
border-right: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-side-nav-search, .wy-nav-top {
|
||||
background: var(--color-bg-side-nav);
|
||||
border-bottom: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-nav-content-wrap {
|
||||
background: inherit;
|
||||
}
|
||||
|
||||
.wy-side-nav-search > a, .wy-nav-top a, .wy-nav-top i {
|
||||
color: var(--color-text-title);
|
||||
}
|
||||
|
||||
.wy-side-nav-search > a:hover, .wy-nav-top a:hover {
|
||||
background: transparent;
|
||||
}
|
||||
|
||||
.wy-side-nav-search > div.version {
|
||||
color: var(--color-text-alt)
|
||||
}
|
||||
|
||||
.wy-side-nav-search > div[role="search"] {
|
||||
border-top: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l2.current>a, .wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,
|
||||
.wy-menu-vertical li.toctree-l3.current>a, .wy-menu-vertical li.toctree-l3.current li.toctree-l4>a {
|
||||
background: var(--color-bg-side-nav);
|
||||
}
|
||||
|
||||
.rst-content .highlighted {
|
||||
background: #eedd85;
|
||||
box-shadow: 0 0 0 2px #eedd85;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.wy-side-nav-search input[type=text],
|
||||
html.writer-html5 .rst-content table.docutils th {
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,
|
||||
.wy-table-backed,
|
||||
.wy-table-odd td,
|
||||
.wy-table-striped tr:nth-child(2n-1) td {
|
||||
background-color: var(--color-bg-body-alt);
|
||||
}
|
||||
|
||||
.rst-content table.docutils,
|
||||
.wy-table-bordered-all,
|
||||
html.writer-html5 .rst-content table.docutils th,
|
||||
.rst-content table.docutils td,
|
||||
.wy-table-bordered-all td,
|
||||
hr {
|
||||
border-color: var(--color-border) !important;
|
||||
}
|
||||
|
||||
::selection {
|
||||
background: var(--color-text-selection);
|
||||
}
|
||||
|
||||
/* Ridiculous rules are taken from sphinx_rtd */
|
||||
.rst-content .admonition-title,
|
||||
.wy-alert-title {
|
||||
color: var(--color-text-body-light);
|
||||
}
|
||||
|
||||
.rst-content .hint,
|
||||
.rst-content .important,
|
||||
.rst-content .tip,
|
||||
.rst-content .wy-alert-success,
|
||||
.wy-alert.wy-alert-success {
|
||||
background: var(--color-success-body);
|
||||
}
|
||||
|
||||
.rst-content .hint .admonition-title,
|
||||
.rst-content .hint .wy-alert-title,
|
||||
.rst-content .important .admonition-title,
|
||||
.rst-content .important .wy-alert-title,
|
||||
.rst-content .tip .admonition-title,
|
||||
.rst-content .tip .wy-alert-title,
|
||||
.rst-content .wy-alert-success .admonition-title,
|
||||
.rst-content .wy-alert-success .wy-alert-title,
|
||||
.wy-alert.wy-alert-success .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-success .wy-alert-title {
|
||||
background-color: var(--color-success-title);
|
||||
}
|
||||
|
||||
.rst-content .admonition-todo,
|
||||
.rst-content .attention,
|
||||
.rst-content .caution,
|
||||
.rst-content .warning,
|
||||
.rst-content .wy-alert-warning,
|
||||
.wy-alert.wy-alert-warning {
|
||||
background: var(--color-warning-body);
|
||||
}
|
||||
|
||||
.rst-content .admonition-todo .admonition-title,
|
||||
.rst-content .admonition-todo .wy-alert-title,
|
||||
.rst-content .attention .admonition-title,
|
||||
.rst-content .attention .wy-alert-title,
|
||||
.rst-content .caution .admonition-title,
|
||||
.rst-content .caution .wy-alert-title,
|
||||
.rst-content .warning .admonition-title,
|
||||
.rst-content .warning .wy-alert-title,
|
||||
.rst-content .wy-alert-warning .admonition-title,
|
||||
.rst-content .wy-alert-warning .wy-alert-title,
|
||||
.rst-content .wy-alert.wy-alert-warning .admonition-title,
|
||||
.wy-alert.wy-alert-warning .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-warning .wy-alert-title {
|
||||
background: var(--color-warning-title);
|
||||
}
|
||||
|
||||
.rst-content .danger,
|
||||
.rst-content .error,
|
||||
.rst-content .wy-alert-danger,
|
||||
.wy-alert.wy-alert-danger {
|
||||
background: var(--color-danger-body);
|
||||
}
|
||||
|
||||
.rst-content .danger .admonition-title,
|
||||
.rst-content .danger .wy-alert-title,
|
||||
.rst-content .error .admonition-title,
|
||||
.rst-content .error .wy-alert-title,
|
||||
.rst-content .wy-alert-danger .admonition-title,
|
||||
.rst-content .wy-alert-danger .wy-alert-title,
|
||||
.wy-alert.wy-alert-danger .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-danger .wy-alert-title {
|
||||
background: var(--color-danger-title);
|
||||
}
|
||||
|
||||
.rst-content .note,
|
||||
.rst-content .seealso,
|
||||
.rst-content .wy-alert-info,
|
||||
.wy-alert.wy-alert-info {
|
||||
background: var(--color-info-body);
|
||||
}
|
||||
|
||||
.rst-content .note .admonition-title,
|
||||
.rst-content .note .wy-alert-title,
|
||||
.rst-content .seealso .admonition-title,
|
||||
.rst-content .seealso .wy-alert-title,
|
||||
.rst-content .wy-alert-info .admonition-title,
|
||||
.rst-content .wy-alert-info .wy-alert-title,
|
||||
.wy-alert.wy-alert-info .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-info .wy-alert-title {
|
||||
background: var(--color-info-title);
|
||||
}
|
||||
|
||||
|
||||
|
||||
/* Links */
|
||||
a, a:visited,
|
||||
.wy-menu-vertical a,
|
||||
a.icon.icon-home,
|
||||
.wy-menu-vertical li.toctree-l1.current > a.current {
|
||||
color: var(--color-text-anchor);
|
||||
text-decoration: none;
|
||||
}
|
||||
|
||||
a:hover, .wy-breadcrumbs-aside a {
|
||||
color: var(--color-text-anchor); /* reset */
|
||||
}
|
||||
|
||||
.rst-versions a, .rst-versions .rst-current-version {
|
||||
color: var(--color-text-anchor);
|
||||
}
|
||||
|
||||
.wy-nav-content a.reference, .wy-nav-content a:not([class]) {
|
||||
background-image: linear-gradient(var(--color-border) 0%, var(--color-border) 100%);
|
||||
background-repeat: repeat-x;
|
||||
background-position: 0 100%;
|
||||
background-size: 1px 1px;
|
||||
}
|
||||
|
||||
.wy-nav-content a.reference:hover, .wy-nav-content a:not([class]):hover {
|
||||
background-image: linear-gradient(rgba(114,83,237,0.45) 0%, rgba(114,83,237,0.45) 100%);
|
||||
background-size: 1px 1px;
|
||||
}
|
||||
|
||||
.wy-menu-vertical a:hover,
|
||||
.wy-menu-vertical li.current a:hover,
|
||||
.wy-menu-vertical a:active {
|
||||
background: var(--color-bg-side-nav-hover) !important;
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l1.current>a,
|
||||
.wy-menu-vertical li.current>a,
|
||||
.wy-menu-vertical li.on a {
|
||||
background-color: var(--color-bg-side-nav-hover);
|
||||
border: none;
|
||||
font-weight: normal;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current {
|
||||
background-color: inherit;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current a {
|
||||
border-right: none;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l2 a,
|
||||
.wy-menu-vertical li.toctree-l3 a,
|
||||
.wy-menu-vertical li.toctree-l4 a,
|
||||
.wy-menu-vertical li.toctree-l5 a,
|
||||
.wy-menu-vertical li.toctree-l6 a,
|
||||
.wy-menu-vertical li.toctree-l7 a,
|
||||
.wy-menu-vertical li.toctree-l8 a,
|
||||
.wy-menu-vertical li.toctree-l9 a,
|
||||
.wy-menu-vertical li.toctree-l10 a {
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
a.image-reference, a.image-reference:hover {
|
||||
background: none !important;
|
||||
}
|
||||
|
||||
a.image-reference img {
|
||||
cursor: zoom-in;
|
||||
}
|
||||
|
||||
|
||||
/* Code blocks */
|
||||
.rst-content code, .rst-content tt, code {
|
||||
padding: 0.25em;
|
||||
font-weight: 400;
|
||||
background-color: var(--color-bg-code-block);
|
||||
border: 1px solid var(--color-border);
|
||||
border-radius: 4px;
|
||||
}
|
||||
|
||||
.rst-content div[class^=highlight], .rst-content pre.literal-block {
|
||||
padding: 0.7rem;
|
||||
margin-top: 0;
|
||||
margin-bottom: 0.75rem;
|
||||
overflow-x: auto;
|
||||
background-color: var(--color-bg-side-nav);
|
||||
border-color: var(--color-border);
|
||||
border-radius: 4px;
|
||||
box-shadow: none;
|
||||
}
|
||||
|
||||
.rst-content .admonition-title,
|
||||
.rst-content div.admonition,
|
||||
.wy-alert-title {
|
||||
padding: 10px 12px;
|
||||
border-top-left-radius: 4px;
|
||||
border-top-right-radius: 4px;
|
||||
}
|
||||
|
||||
.highlight .go {
|
||||
color: inherit;
|
||||
}
|
||||
|
||||
.highlight .nt {
|
||||
color: var(--color-text-code-nt);
|
||||
}
|
||||
|
||||
.rst-content code.literal,
|
||||
.rst-content tt.literal,
|
||||
html.writer-html5 .rst-content dl.footnote code {
|
||||
border-color: var(--color-border);
|
||||
background-color: var(--color-border);
|
||||
color: var(--color-text-code-inline)
|
||||
}
|
||||
|
||||
|
||||
/* Search */
|
||||
.wy-side-nav-search input[type=text] {
|
||||
border: none;
|
||||
border-radius: 0;
|
||||
background-color: transparent;
|
||||
font-family: inherit;
|
||||
font-size: .85rem;
|
||||
box-shadow: none;
|
||||
padding: .7rem 1rem .7rem 2.8rem;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
#rtd-search-form {
|
||||
position: relative;
|
||||
}
|
||||
|
||||
#rtd-search-form:before {
|
||||
font: normal normal normal 14px/1 FontAwesome;
|
||||
font-size: inherit;
|
||||
text-rendering: auto;
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
content: "\f002";
|
||||
color: var(--color-text-alt);
|
||||
position: absolute;
|
||||
left: 1.5rem;
|
||||
top: .7rem;
|
||||
}
|
||||
|
||||
/* Side nav */
|
||||
.wy-side-nav-search {
|
||||
padding: 1rem 0 0 0;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li a button.toctree-expand {
|
||||
float: right;
|
||||
margin-right: -1.5em;
|
||||
padding: 0 .5em;
|
||||
}
|
||||
|
||||
.wy-menu-vertical a,
|
||||
.wy-menu-vertical li.current>a,
|
||||
.wy-menu-vertical li.current li>a {
|
||||
padding-right: 1.5em !important;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current li>a.current {
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
/* Misc spacing */
|
||||
.rst-content .admonition-title, .wy-alert-title {
|
||||
padding: 10px 12px;
|
||||
}
|
||||
|
||||
/* Buttons */
|
||||
.btn {
|
||||
display: inline-block;
|
||||
box-sizing: border-box;
|
||||
padding: 0.3em 1em;
|
||||
margin: 0;
|
||||
font-family: inherit;
|
||||
font-size: inherit;
|
||||
font-weight: 500;
|
||||
line-height: 1.5;
|
||||
color: var(--color-text-anchor);
|
||||
text-decoration: none;
|
||||
vertical-align: baseline;
|
||||
background-color: #f7f7f7;
|
||||
border-width: 0;
|
||||
border-radius: 4px;
|
||||
box-shadow: 0 1px 2px rgba(0,0,0,0.12),0 3px 10px rgba(0,0,0,0.08);
|
||||
appearance: none;
|
||||
}
|
||||
|
||||
.btn:active {
|
||||
padding: 0.3em 1em;
|
||||
}
|
||||
|
||||
.rst-content .btn:focus {
|
||||
outline: 1px solid #ccc;
|
||||
}
|
||||
|
||||
.rst-content .btn-neutral, .rst-content .btn span.fa {
|
||||
color: var(--color-text-body) !important;
|
||||
}
|
||||
|
||||
.btn-neutral {
|
||||
background-color: var(--color-btn-neutral-bg) !important;
|
||||
color: var(--color-btn-neutral-text) !important;
|
||||
border: 1px solid var(--color-btn-neutral-bg);
|
||||
}
|
||||
|
||||
.btn:hover, .btn-neutral:hover {
|
||||
background-color: var(--color-btn-neutral-bg-hover) !important;
|
||||
}
|
||||
|
||||
|
||||
/* Icon overrides */
|
||||
.wy-side-nav-search a.icon-home:before {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before {
|
||||
content: "\f106"; /* fa-angle-up */
|
||||
}
|
||||
|
||||
.fa-plus-square-o:before, .wy-menu-vertical li button.toctree-expand:before {
|
||||
content: "\f107"; /* fa-angle-down */
|
||||
}
|
||||
|
||||
|
||||
/* Misc */
|
||||
.wy-nav-top {
|
||||
line-height: 36px;
|
||||
}
|
||||
|
||||
.wy-nav-top > i {
|
||||
font-size: 24px;
|
||||
padding: 8px 0 0 2px;
|
||||
color: var(--color-text-anchor);
|
||||
}
|
||||
|
||||
.rst-content table.docutils td,
|
||||
.rst-content table.docutils th,
|
||||
.rst-content table.field-list td,
|
||||
.rst-content table.field-list th,
|
||||
.wy-table td,
|
||||
.wy-table th {
|
||||
padding: 8px 14px;
|
||||
}
|
||||
|
||||
.dark-mode-toggle {
|
||||
position: absolute;
|
||||
top: 14px;
|
||||
right: 12px;
|
||||
height: 20px;
|
||||
width: 24px;
|
||||
z-index: 10;
|
||||
border: none;
|
||||
background-color: transparent;
|
||||
color: inherit;
|
||||
opacity: 0.7;
|
||||
}
|
||||
|
||||
.wy-nav-content-wrap {
|
||||
z-index: 20;
|
||||
}
|
||||
|
||||
.rst-content .toctree-wrapper {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.redirect-notice {
|
||||
font-size: 2.5rem;
|
||||
}
|
docs/_static/js/darkmode.js (new file, 47 lines, vendored)

@@ -0,0 +1,47 @@
let toggleButton
let icon

function load() {
  'use strict'

  toggleButton = document.createElement('button')
  toggleButton.setAttribute('title', 'Toggle dark mode')
  toggleButton.classList.add('dark-mode-toggle')
  icon = document.createElement('i')
  icon.classList.add('fa', darkModeState ? 'fa-sun-o' : 'fa-moon-o')
  toggleButton.appendChild(icon)
  document.body.prepend(toggleButton)

  // Listen for changes in the OS settings
  // addListener is used because older versions of Safari don't support addEventListener
  // prefersDarkQuery set in <head>
  if (prefersDarkQuery) {
    prefersDarkQuery.addListener(function (evt) {
      toggleDarkMode(evt.matches)
    })
  }

  // Initial setting depending on the prefers-color-mode or localstorage
  // darkModeState should be set in the document <head> to prevent flash
  if (darkModeState == undefined) darkModeState = false
  toggleDarkMode(darkModeState)

  // Toggles the "dark-mode" class on click and sets localStorage state
  toggleButton.addEventListener('click', () => {
    darkModeState = !darkModeState

    toggleDarkMode(darkModeState)
    localStorage.setItem('dark-mode', darkModeState)
  })
}

function toggleDarkMode(state) {
  document.documentElement.classList.toggle('dark-mode', state)
  document.documentElement.classList.toggle('light-mode', !state)
  icon.classList.remove('fa-sun-o')
  icon.classList.remove('fa-moon-o')
  icon.classList.add(state ? 'fa-sun-o' : 'fa-moon-o')
  darkModeState = state
}

document.addEventListener('DOMContentLoaded', load)
Moved image: 67 KiB before → 67 KiB after

New binary files under docs/_static/screenshots/ (all vendored):

- bulk-edit.png (661 KiB)
- correspondents.png (457 KiB)
- dashboard.png (436 KiB)
- documents-filter.png (462 KiB)
- documents-largecards.png (608 KiB)
- documents-smallcards-dark.png (698 KiB)
- documents-smallcards.png (706 KiB)
- documents-table.png (480 KiB)
- documents-wchrome-dark.png (680 KiB)
- documents-wchrome.png (686 KiB)
- editing.png (848 KiB)
- logs.png (703 KiB)
- mail-rules-edited.png (96 KiB)
- mobile.png (388 KiB)
- new-tag.png (26 KiB)
- search-preview.png (54 KiB)
- search-results.png (517 KiB)
docs/_templates/layout.html (new file, 38 lines, vendored)

@@ -0,0 +1,38 @@
{% extends "!layout.html" %}
{% block extrahead %}
<script>
  // MediaQueryList object
  const prefersDarkQuery = window.matchMedia("(prefers-color-scheme: dark)");
  const lsDark = localStorage.getItem("dark-mode");
  let darkModeState = lsDark !== null ? lsDark == "true" : prefersDarkQuery.matches;

  document.documentElement.classList.toggle("dark-mode", darkModeState);
  document.documentElement.classList.toggle("light-mode", !darkModeState);

  const RTD_TO_MKD = {
    "index.html": "",
    "setup.html": "setup",
    "usage_overview.html": "usage",
    "advanced_usage.html": "advanced_usage",
    "administration.html": "administration",
    "configuration.html": "configuration",
    "api.html": "api",
    "faq.html": "faq",
    "troubleshooting.html": "troubleshooting",
    "extending.html": "development",
    "scanners.html": "",
    "screenshots.html": "",
    "changelog.html": "changelog",
  }

  const path = (RTD_TO_MKD[window.location.pathname.substring(window.location.pathname.lastIndexOf("/") + 1)] ?? "") + "/";
  const hash = window.location.hash;
  const redirectURL = new URL(path + hash, "https://docs.paperless-ngx.com/");
  console.log(`Redirecting to ${redirectURL} in 3 seconds...`);

  setTimeout(() => {
    window.location.replace(redirectURL);
  }, 3000);
</script>
{{ super() }}
{% endblock %}
@@ -1,622 +0,0 @@
# Administration

## Making backups {#backup}

Multiple options exist for making backups of your paperless instance,
depending on how you installed paperless.

Before making a backup, it's probably best to make sure that paperless is not actively
consuming documents at that time.

Options available to any installation of paperless:

- Use the [document exporter](#exporter). The document exporter exports all your documents,
  thumbnails, metadata, and database contents to a specific folder. You may import your
  documents and settings into a fresh instance of paperless again or store your
  documents in another DMS with this export.

  The document exporter is also able to update an already existing
  export. Therefore, incremental backups with `rsync` are entirely
  possible; a sketch follows below.

  !!! caution

      You cannot import the export generated with one version of paperless in
      a different version of paperless. The export contains an exact image of
      the database, and migrations may change the database layout.
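  For illustration, one way to script such an incremental backup (the host
  paths are placeholders):

  ```shell-session
  $ docker compose exec -T webserver document_exporter ../export
  $ rsync -a --delete /path/to/paperless/export/ /mnt/backup/paperless-export/
  ```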
Options available to docker installations:

- Backup the docker volumes. These usually reside within
  `/var/lib/docker/volumes` on the host and you need to be root in
  order to access them (one way to do this is shown below).

  Paperless uses 4 volumes:

  - `paperless_media`: This is where your documents are stored.
  - `paperless_data`: This is where auxiliary data is stored. This
    folder also contains the SQLite database, if you use it.
  - `paperless_pgdata`: Exists only if you use PostgreSQL and
    contains the database.
  - `paperless_dbdata`: Exists only if you use MariaDB and contains
    the database.
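  As an illustration, a named volume can be archived with a throwaway
  container; the volume name matches the compose defaults, the backup path
  is a placeholder:

  ```shell-session
  $ docker run --rm -v paperless_media:/data:ro -v /mnt/backup:/backup \
      docker.io/library/busybox tar czf /backup/paperless_media.tar.gz -C /data .
  ```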
Options available to bare-metal and non-docker installations:

- Backup the entire paperless folder. This ensures that if your
  paperless instance crashes at some point or your disk fails, you can
  simply copy the folder back into place and it works.

  When using PostgreSQL or MariaDB, you'll also have to backup the
  database (see the dump example below).
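  For example, a plain SQL dump of a PostgreSQL database; this assumes the
  role and database are both named `paperless`, which may differ in your
  setup:

  ```shell-session
  $ pg_dump -U paperless -d paperless > paperless.sql
  ```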
### Restoring {#migrating-restoring}
|
||||
|
||||
If you've backed-up Paperless-ngx using the [document exporter](#exporter),
|
||||
restoring can simply be done with the [document importer](#importer).
|
||||
|
||||
Of course, other backup strategies require restoring any volumes, folders and database
|
||||
copies you created in the steps above.
|
||||
|
||||
## Updating Paperless {#updating}
|
||||
|
||||
### Docker Route {#docker-updating}
|
||||
|
||||
If a new release of paperless-ngx is available, upgrading depends on how
|
||||
you installed paperless-ngx in the first place. The releases are
|
||||
available at the [release
|
||||
page](https://github.com/paperless-ngx/paperless-ngx/releases).
|
||||
|
||||
First of all, make sure no active processes (like consumption) are running, then [make a backup](#backup).
|
||||
|
||||
After that, ensure that paperless is stopped:
|
||||
|
||||
```shell-session
|
||||
$ cd /path/to/paperless
|
||||
$ docker compose down
|
||||
```
|
||||
|
||||
1. If you pull the image from the docker hub, all you need to do is:
|
||||
|
||||
```shell-session
|
||||
$ docker compose pull
|
||||
$ docker compose up
|
||||
```
|
||||
|
||||
The Docker Compose files refer to the `latest` version, which is
|
||||
always the latest stable release.
|
||||
|
||||
1. If you built the image yourself, do the following:
|
||||
|
||||
```shell-session
|
||||
$ git pull
|
||||
$ docker compose build
|
||||
$ docker compose up
|
||||
```
|
||||
|
||||
Running `docker compose up` will also apply any new database migrations.
|
||||
If you see everything working, press CTRL+C once to gracefully stop
|
||||
paperless. Then you can start paperless-ngx with `-d` to have it run in
|
||||
the background.
|
||||
|
||||
!!! note
|
||||
|
||||
In version 0.9.14, the update process was changed. In 0.9.13 and
|
||||
earlier, the Docker Compose files specified exact versions and pull
|
||||
won't automatically update to newer versions. In order to enable
|
||||
updates as described above, either get the new `docker-compose.yml`
|
||||
file from
|
||||
[here](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose)
|
||||
or edit the `docker-compose.yml` file, find the line that says
|
||||
|
||||
```
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:0.9.x
|
||||
```
|
||||
|
||||
and replace the version with `latest`:
|
||||
|
||||
```
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
```
|
||||
|
||||
!!! note
|
||||
|
||||
In version 1.7.1 and onwards, the Docker image can now be pinned to a
|
||||
release series. This is often combined with automatic updaters such as
|
||||
Watchtower to allow safer unattended upgrading to new bugfix releases
|
||||
only. It is still recommended to always review release notes before
|
||||
upgrading. To pin your install to a release series, edit the
|
||||
`docker-compose.yml` find the line that says
|
||||
|
||||
```
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
```
|
||||
|
||||
and replace the version with the series you want to track, for
|
||||
example:
|
||||
|
||||
```
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:1.7
|
||||
```
|
||||
|
||||
### Bare Metal Route {#bare-metal-updating}
|
||||
|
||||
After grabbing the new release and unpacking the contents, do the
|
||||
following:
|
||||
|
||||
1. Update dependencies. New paperless version may require additional
|
||||
dependencies. The dependencies required are listed in the section
|
||||
about
|
||||
[bare metal installations](setup.md#bare_metal).
|
||||
|
||||
2. Update python requirements. Keep in mind to activate your virtual
|
||||
environment before that, if you use one.
|
||||
|
||||
```shell-session
|
||||
$ pip install -r requirements.txt
|
||||
```
|
||||
|
||||
!!! note
|
||||
|
||||
At times, some dependencies will be removed from requirements.txt.
|
||||
Comparing the versions and removing no longer needed dependencies
|
||||
will keep your system or virtual environment clean and prevent
|
||||
possible conflicts.
|
||||
|
||||
3. Migrate the database.
|
||||
|
||||
```shell-session
|
||||
$ cd src
|
||||
$ python3 manage.py migrate # (1)
|
||||
```
|
||||
|
||||
1. Including `sudo -Hu <paperless_user>` may be required
|
||||
|
||||
This might not actually do anything. Not every new paperless version
|
||||
comes with new database migrations.
|
||||
|
||||
### Database Upgrades
|
||||
|
||||
In general, paperless does not require a specific version of PostgreSQL or MariaDB and it is
|
||||
safe to update them to newer versions. However, you should always take a backup and follow
|
||||
the instructions from your database's documentation for how to upgrade between major versions.
|
||||
|
||||
For PostgreSQL, refer to [Upgrading a PostgreSQL Cluster](https://www.postgresql.org/docs/current/upgrading.html).
|
||||
|
||||
For MariaDB, refer to [Upgrading MariaDB](https://mariadb.com/kb/en/upgrading/)
|
||||
|
||||
## Downgrading Paperless {#downgrade-paperless}
|
||||
|
||||
Downgrades are possible. However, some updates also contain database
|
||||
migrations (these change the layout of the database and may move data).
|
||||
In order to move back from a version that applied database migrations,
|
||||
you'll have to revert the database migration _before_ downgrading, and
|
||||
then downgrade paperless.
|
||||
|
||||
This table lists the compatible versions for each database migration
|
||||
number.
|
||||
|
||||
| Migration number | Version range |
|
||||
| ---------------- | --------------- |
|
||||
| 1011 | 1.0.0 |
|
||||
| 1012 | 1.1.0 - 1.2.1 |
|
||||
| 1014 | 1.3.0 - 1.3.1 |
|
||||
| 1016 | 1.3.2 - current |
|
||||
|
||||
Execute the following management command to migrate your database:
|
||||
|
||||
```shell-session
|
||||
$ python3 manage.py migrate documents <migration number>
|
||||
```
|
||||
|
||||
!!! note
|
||||
|
||||
Some migrations cannot be undone. The command will issue errors if that
|
||||
happens.
|
||||
|
||||

## Management utilities {#management-commands}

Paperless comes with some management commands that perform various
maintenance tasks on your paperless instance. You can invoke these
commands in the following way:

With Docker Compose, while paperless is running:

```shell-session
$ cd /path/to/paperless
$ docker compose exec webserver <command> <arguments>
```

With Docker, while paperless is running:

```shell-session
$ docker exec -it <container-name> <command> <arguments>
```

Bare metal:

```shell-session
$ cd /path/to/paperless/src
$ python3 manage.py <command> <arguments> # (1)
```

1. Including `sudo -Hu <paperless_user>` may be required

All commands have built-in help, which can be accessed by executing them
with the argument `--help`.

### Document exporter {#exporter}

The document exporter exports all your data (including your settings
and database contents) from paperless into a folder for backup or
migration to another DMS.

If you use the document exporter within a cronjob to back up your data,
you can use the `-T` flag after `exec` to suppress "The input device
is not a TTY" errors. For example:
`docker compose exec -T webserver document_exporter ../export`

```
document_exporter target [-c] [-d] [-f] [-na] [-nt] [-p] [-sm] [-z]

optional arguments:
-c, --compare-checksums
-d, --delete
-f, --use-filename-format
-na, --no-archive
-nt, --no-thumbnail
-p, --use-folder-prefix
-sm, --split-manifest
-z, --zip
-zn, --zip-name
```

`target` is a folder to which the data gets written. This includes
documents, thumbnails and a `manifest.json` file. The manifest contains
all metadata from the database (correspondents, tags, etc).

When you use the provided docker compose script, specify `../export` as
the target. This path inside the container is automatically mounted on
your host on the folder `export`.

If the target directory already exists and contains files, paperless
will assume that the contents of the export directory are a previous
export and will attempt to update the previous export. Paperless will
only export changed and added files. Paperless determines whether a file
has changed by inspecting the file attributes "date/time modified" and
"size". If that does not work out for you, specify `-c` or
`--compare-checksums` and paperless will attempt to compare file
checksums instead. This is slower.

Paperless will not remove any existing files in the export directory. If
you want paperless to also remove files that do not belong to the
current export, such as files from deleted documents, specify `-d` or `--delete`.
Be careful when pointing paperless to a directory that already contains
other files.

The filenames generated by this command follow the format
`[date created] [correspondent] [title].[extension]`. If you want
paperless to use [`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) for exported filenames
instead, specify `-f` or `--use-filename-format`.

If `-na` or `--no-archive` is provided, no archive files will be exported,
only the original files.

If `-nt` or `--no-thumbnail` is provided, thumbnail files will not be exported.

!!! note

    When using the `-na`/`--no-archive` or `-nt`/`--no-thumbnail` options
    the exporter will not output these files for backup. After importing,
    the [sanity checker](#sanity-checker) will warn about missing thumbnails and archive files
    until they are regenerated with `document_thumbnails` or [`document_archiver`](#archiver).
    It can make sense to omit these files from backup as their content and checksum
    can change (new archiver algorithm) and may then cause additional used space in
    a deduplicated backup.

If `-p` or `--use-folder-prefix` is provided, files will be exported
in dedicated folders according to their nature: `archive`, `originals`,
`thumbnails` or `json`.

If `-sm` or `--split-manifest` is provided, information about documents
will be placed in individual JSON files, instead of a single JSON file. The main
manifest.json will still contain application-wide information (e.g. tags,
correspondents, document types, etc.)

If `-z` or `--zip` is provided, the export will be a zip file
in the target directory, named according to the current local date or the
value set in `-zn` or `--zip-name`.
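
For example, a nightly backup job that produces a zip archive with a
fixed name could combine these flags (the name `paperless-backup` is
illustrative, a sketch only):

```shell-session
$ docker compose exec -T webserver document_exporter ../export -z -zn paperless-backup
```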

!!! warning

    If exporting with the file name format, there may be errors due to
    your operating system's maximum path lengths. Try adjusting the export
    target or consider not using the filename format.

### Document importer {#importer}

The document importer takes the export produced by the [Document
exporter](#exporter) and imports it into paperless.

The importer works just like the exporter. You point it at a directory,
and the script does the rest of the work:

```
document_importer source
```

When you use the provided docker compose script, put the export inside
the `export` folder in your paperless source directory. Specify
`../export` as the `source`.
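
With the provided Docker Compose setup, an invocation following the
pattern from [Management utilities](#management-commands) might
therefore look like this sketch:

```shell-session
$ docker compose exec webserver document_importer ../export
```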

Note that .zip files (as can be generated from the exporter) are not supported.

!!! note

    Importing from a previous version of Paperless may work, but for best
    results it is suggested to match the versions.

!!! warning

    The importer should be run against a completely empty installation (database and directories) of Paperless-ngx.

### Document retagger {#retagger}

Say you've imported a few hundred documents and now want to introduce a
tag or set up a new correspondent, and apply its matching to all of the
currently-imported docs. This problem is common enough that there are
tools for it.

```
document_retagger [-h] [-c] [-T] [-t] [-i] [--id-range] [--use-first] [-f]

optional arguments:
-c, --correspondent
-T, --tags
-t, --document_type
-s, --storage_path
-i, --inbox-only
--id-range
--use-first
-f, --overwrite
```

Run this after changing or adding matching rules. It'll loop over all
of the documents in your database and attempt to match documents
according to the new rules.

Specify any combination of `-c`, `-T`, `-t` and `-s` to have the
retagger perform matching of the specified metadata type. If you don't
specify any of these options, the document retagger won't do anything.

Specify `-i` to have the document retagger work on documents tagged with
inbox tags only. This is useful when you don't want to mess with your
already processed documents.

Specify `--id-range 1 100` to have the document retagger work only on a
specific range of document IDs. This can be useful if you have a lot of
documents and want to test the matching rules only on a subset of
documents.
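
As a sketch, re-running tag and document type matching against only the
first hundred documents could look like this (the ID range is
illustrative):

```shell-session
$ docker compose exec webserver document_retagger -T -t --id-range 1 100
```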

When multiple document types or correspondents match a single document,
the retagger won't assign these to the document. Specify `--use-first`
to override this behavior and just use the first correspondent or type
it finds. This option does not apply to tags, since any amount of tags
can be applied to a document.

Finally, `-f` specifies that you wish to overwrite already assigned
correspondents, types and/or tags. The default behavior is to not assign
correspondents and types to documents that have this data already
assigned. `-f` works differently for tags: By default, only additional
tags get added to documents, no tags will be removed. With `-f`, tags
that don't match a document anymore get removed as well.

### Managing the Automatic matching algorithm

The _Auto_ matching algorithm requires a trained neural network to work.
This network needs to be updated whenever something in your data
changes. The docker image takes care of that automatically with the task
scheduler. You can manually renew the classifier by invoking the
following management command:

```
document_create_classifier
```

This command takes no arguments.

### Document thumbnails {#thumbnails}

Use this command to re-create document thumbnails. Optionally include the `--document {id}` option to generate thumbnails for a specific document only.

You may also specify `--processes` to control the number of processes used to generate new thumbnails. The default is to utilize
a quarter of the available processors.

```
document_thumbnails
```
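
For instance, a minimal sketch that regenerates the thumbnail of a
single document (the ID 123 is illustrative) using two worker processes:

```shell-session
$ docker compose exec webserver document_thumbnails --document 123 --processes 2
```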

### Managing the document search index {#index}

The document search index is responsible for delivering search results
for the website. The document index is automatically updated whenever
documents get added to, changed, or removed from paperless. However, if
the search yields non-existing documents or won't find anything, you
may need to recreate the index manually.

```
document_index {reindex,optimize}
```

Specify `reindex` to have the index created from scratch. This may take
some time.

Specify `optimize` to optimize the index. This updates certain aspects
of the index and usually makes queries faster and also ensures that the
autocompletion works properly. This command is regularly invoked by the
task scheduler.
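
For example, rebuilding the index from scratch in a Docker Compose
install (a sketch following the invocation pattern shown under
[Management utilities](#management-commands)):

```shell-session
$ docker compose exec webserver document_index reindex
```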

### Managing filenames {#renamer}

If you use paperless' feature to
[assign custom filenames to your documents](advanced_usage.md#file-name-handling), you can use this command to move all your files after
changing the naming scheme.

!!! warning

    Since this command moves your documents, it is advised to do a backup
    beforehand. The renaming logic is robust and will never overwrite or
    delete a file, but you can't ever be careful enough.

```
document_renamer
```

The command takes no arguments and processes all your documents at once.

Learn how to use
[Management Utilities](#management-commands).

### Sanity checker {#sanity-checker}

Paperless has a built-in sanity checker that inspects your document
collection for issues.

The issues detected by the sanity checker are as follows:

- Missing original files.
- Missing archive files.
- Inaccessible original files due to improper permissions.
- Inaccessible archive files due to improper permissions.
- Corrupted original documents by comparing their checksum against
  what is stored in the database.
- Corrupted archive documents by comparing their checksum against what
  is stored in the database.
- Missing thumbnails.
- Inaccessible thumbnails due to improper permissions.
- Documents without any content (warning).
- Orphaned files in the media directory (warning). These are files
  that are not referenced by any document in paperless.

```
document_sanity_checker
```

The command takes no arguments. Depending on the size of your document
archive, this may take some time.

### Fetching e-mail

Paperless automatically fetches your e-mail every 10 minutes by default.
If you want to invoke the email consumer manually, call the following
management command:

```
mail_fetcher
```

The command takes no arguments and processes all your mail accounts and
rules.

!!! tip

    To use OAuth access tokens for mail fetching,
    select the box to indicate the password is actually
    a token when creating or editing a mail account. The
    details for creating a token depend on your email
    provider.

### Creating archived documents {#archiver}

Paperless stores archived PDF/A documents alongside your original
documents. These archived documents will also contain selectable text
for image-only originals. These documents are derived from the
originals, which are always stored unmodified. If coming from an earlier
version of paperless, your documents won't have archived versions.

This command creates PDF/A documents for your documents.

```
document_archiver --overwrite --document <id>
```

This command will only attempt to create archived documents when no
archived document exists yet, unless `--overwrite` is specified. If
`--document <id>` is specified, the archiver will only process that
document.

!!! note

    This command essentially performs OCR on all your documents again,
    according to your settings. If you run this with
    `PAPERLESS_OCR_MODE=redo`, it will potentially run for a very long time.
    You can cancel the command at any time, since this command will skip
    already archived versions the next time it is run.

!!! note

    Some documents will cause errors and cannot be converted into PDF/A
    documents, such as encrypted PDF documents. The archiver will skip over
    these documents each time it sees them.

### Managing encryption {#encryption}

Documents can be stored in Paperless using GnuPG encryption.

!!! warning

    Encryption is deprecated since [paperless-ng 0.9](changelog.md#paperless-ng-090) and doesn't really
    provide any additional security, since you have to store the passphrase
    in a configuration file on the same system as the encrypted documents
    for paperless to work. Furthermore, the entire text content of the
    documents is stored plain in the database, even if your documents are
    encrypted. Filenames are not encrypted either.

    Also, the web server provides transparent access to your encrypted
    documents.

    Consider running paperless on an encrypted filesystem instead, which
    will then at least provide security against physical hardware theft.

#### Enabling encryption

Enabling encryption is no longer supported.

#### Disabling encryption

Basic usage to disable encryption of your document store:

(Note: If [`PAPERLESS_PASSPHRASE`](configuration.md#PAPERLESS_PASSPHRASE) isn't set already, you need to specify
it here)

```
decrypt_documents [--passphrase SECR3TP4SSPHRA$E]
```

### Detecting duplicates {#fuzzy_duplicate}

Paperless already catches and prevents the upload of exactly matching documents;
however, a new scan of an existing document may not produce an exact, bit-for-bit
duplicate. The content, though, should be exact or close, allowing detection.

This tool does a fuzzy match over document content, looking for
those which look close according to a given ratio.

At this time, other metadata (such as correspondent or type) is not
taken into account by the detection.

```
document_fuzzy_match [--ratio] [--processes N]
```

| Option      | Required | Default             | Description                                                                                                                    |
| ----------- | -------- | ------------------- | ------------------------------------------------------------------------------------------------------------------------------ |
| --ratio     | No       | 85.0                | A number between 0 and 100, setting how similar a document must be for it to be reported. Higher numbers mean more similarity. |
| --processes | No       | 1/4 of system cores | Number of processes to use for matching. Setting 1 disables multiple processes.                                                  |
| --delete    | No       | False               | If provided, one document of a matched pair above the ratio will be deleted.                                                    |
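
For example, a stricter run using four worker processes might look like
the following sketch (the values are illustrative):

```shell-session
$ docker compose exec webserver document_fuzzy_match --ratio 90 --processes 4
```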

!!! warning

    If providing the `--delete` option, it is highly recommended to have a backup.
    While every effort has been taken to ensure proper operation, there is always the
    chance of deletion of a file you want to keep.
11
docs/administration.rst
Normal file
@@ -0,0 +1,11 @@

.. _administration:

**************
Administration
**************

.. cssclass:: redirect-notice

The Paperless-ngx documentation has permanently moved.

You will be redirected shortly...
@@ -1,690 +0,0 @@

# Advanced Topics

Paperless offers a couple of features that automate certain tasks and make
your life easier.

## Matching tags, correspondents, document types, and storage paths {#matching}

Paperless will compare the matching algorithms defined by every tag,
correspondent, document type, and storage path in your database to see
if they apply to the text in a document. In other words, if you define a
tag called `Home Utility` that has a `match` property of `bc hydro` and
a `matching_algorithm` of `Exact`, Paperless will automatically tag
your newly-consumed document with your `Home Utility` tag so long as the
text `bc hydro` appears in the body of the document somewhere.

The matching logic is quite powerful. It supports searching the text of
your document with different algorithms, and as such, some
experimentation may be necessary to get things right.

In order to have a tag, correspondent, document type, or storage path
assigned automatically to newly consumed documents, assign a match and
matching algorithm using the web interface. These settings define when
to assign tags, correspondents, document types, and storage paths to
documents.

The following algorithms are available:

- **None:** No matching will be performed.
- **Any:** Looks for any occurrence of any word provided in match in
  the PDF. If you define the match as `Bank1 Bank2`, it will match
  documents containing either of these terms.
- **All:** Requires that every word provided appears in the PDF,
  albeit not in the order provided.
- **Exact:** Matches only if the match appears exactly as provided
  (i.e. preserve ordering) in the PDF.
- **Regular expression:** Parses the match as a regular expression and
  tries to find a match within the document.
- **Fuzzy match:** Uses a partial matching based on locating the tag text
  inside the document, using a [partial ratio](https://maxbachmann.github.io/RapidFuzz/Usage/fuzz.html#partial-ratio)
- **Auto:** Tries to automatically match new documents. This does not
  require you to set a match. See the [notes below](#automatic-matching).

When using the _any_ or _all_ matching algorithms, you can search for
terms that consist of multiple words by enclosing them in double quotes.
For example, defining a match text of `"Bank of America" BofA` using the
_any_ algorithm, will match documents that contain either "Bank of
America" or "BofA", but will not match documents containing "Bank of
South America".

Then just save your tag, correspondent, document type, or storage path
and run another document through the consumer. Once complete, you should
see the newly-created document, automatically tagged with the
appropriate data.

### Automatic matching {#automatic-matching}

Paperless-ngx comes with a new matching algorithm called _Auto_. This
matching algorithm tries to assign tags, correspondents, document types,
and storage paths to your documents based on how you have already
assigned these on existing documents. It uses a neural network under the
hood.

If, for example, all your bank statements of your account 123 at the
Bank of America are tagged with the tag "bofa123" and the matching
algorithm of this tag is set to _Auto_, this neural network will examine
your documents and automatically learn when to assign this tag.

Paperless tries to hide much of the involved complexity with this
approach. However, there are a couple of caveats you need to keep in mind
when using this feature:

- Changes to your documents are not immediately reflected by the
  matching algorithm. The neural network needs to be _trained_ on your
  documents after changes. Paperless periodically (default: once each
  hour) checks for changes and does this automatically for you.
- The Auto matching algorithm only takes documents into account which
  are NOT placed in your inbox (i.e. have no inbox tags assigned to
  them). This ensures that the neural network only learns from
  documents which you have correctly tagged before.
- The matching algorithm can only work if there is a correlation
  between the tag, correspondent, document type, or storage path and
  the document itself. Your bank statements usually contain your bank
  account number and the name of the bank, so this works reasonably
  well. However, tags such as "TODO" cannot be automatically
  assigned.
- The matching algorithm needs a reasonable number of documents to
  identify when to assign tags, correspondents, storage paths, and
  types. If one out of a thousand documents has the correspondent
  "Very obscure web shop I bought something five years ago", it will
  probably not assign this correspondent automatically if you buy
  something from them again. The more documents, the better.
- Paperless also needs a reasonable amount of negative examples to
  decide when not to assign a certain tag, correspondent, document
  type, or storage path. This will usually be the case as you start
  filling up paperless with documents. Example: If all your documents
  are either from "Webshop" or "Bank", paperless will assign one
  of these correspondents to ANY new document, if both are set to
  automatic matching.

## Hooking into the consumption process {#consume-hooks}

Sometimes you may want to do something arbitrary whenever a document is
consumed. Rather than try to predict what you may want to do, Paperless
lets you execute scripts of your own choosing just before or after a
document is consumed using a couple of simple hooks.

Just write a script, put it somewhere that Paperless can read & execute,
and then put the path to that script in `paperless.conf` or
`docker-compose.env` with the variable name of either
[`PAPERLESS_PRE_CONSUME_SCRIPT`](configuration.md#PAPERLESS_PRE_CONSUME_SCRIPT) or [`PAPERLESS_POST_CONSUME_SCRIPT`](configuration.md#PAPERLESS_POST_CONSUME_SCRIPT).

!!! info

    These scripts are executed in a **blocking** process, which means that
    if a script takes a long time to run, it can significantly slow down
    your document consumption flow. If you want things to run
    asynchronously, you'll have to fork the process in your script and
    exit.

### Pre-consumption script {#pre-consume-script}

Executed after the consumer sees a new document in the consumption
folder, but before any processing of the document is performed. This
script has access to the following relevant environment variables:

| Environment Variable    | Description                                                   |
| ----------------------- | ------------------------------------------------------------- |
| `DOCUMENT_SOURCE_PATH`  | Original path of the consumed document                        |
| `DOCUMENT_WORKING_PATH` | Path to a copy of the original that consumption will work on  |
| `TASK_ID`               | UUID of the task used to process the new document (if any)    |

!!! note

    Pre-consume scripts which modify the document should only change
    the `DOCUMENT_WORKING_PATH` file or a second consume task may
    be triggered, leading to failures as two tasks work on the
    same document path.

!!! warning

    If your script modifies `DOCUMENT_WORKING_PATH` in a non-deterministic
    way, this may allow duplicate documents to be stored.

A simple but common example for this would be creating a simple script
like this:

`/usr/local/bin/ocr-pdf`

```bash
#!/usr/bin/env bash
pdf2pdfocr.py -i ${DOCUMENT_WORKING_PATH}
```

`/etc/paperless.conf`

```bash
...
PAPERLESS_PRE_CONSUME_SCRIPT="/usr/local/bin/ocr-pdf"
...
```

This will pass the path to the document about to be consumed to
`/usr/local/bin/ocr-pdf`, which will in turn call
[pdf2pdfocr.py](https://github.com/LeoFCardoso/pdf2pdfocr) on your
document, which will then overwrite the file with an OCR'd version of
the file and exit, at which point the consumption process will begin
with the newly modified file.

The script's stdout and stderr will be logged line by line to the
webserver log, along with the exit code of the script.

### Post-consumption script {#post-consume-script}

Executed after the consumer has successfully processed a document and
has moved it into paperless. It receives the following environment
variables:

| Environment Variable         | Description                                    |
| ---------------------------- | ---------------------------------------------- |
| `DOCUMENT_ID`                | Database primary key of the document           |
| `DOCUMENT_FILE_NAME`         | Formatted filename, not including paths        |
| `DOCUMENT_CREATED`           | Date & time when document created              |
| `DOCUMENT_MODIFIED`          | Date & time when document was last modified    |
| `DOCUMENT_ADDED`             | Date & time when document was added            |
| `DOCUMENT_SOURCE_PATH`       | Path to the original document file             |
| `DOCUMENT_ARCHIVE_PATH`      | Path to the generated archive file (if any)    |
| `DOCUMENT_THUMBNAIL_PATH`    | Path to the generated thumbnail                |
| `DOCUMENT_DOWNLOAD_URL`      | URL for document download                      |
| `DOCUMENT_THUMBNAIL_URL`     | URL for the document thumbnail                 |
| `DOCUMENT_CORRESPONDENT`     | Assigned correspondent (if any)                |
| `DOCUMENT_TAGS`              | Comma separated list of tags applied (if any)  |
| `DOCUMENT_ORIGINAL_FILENAME` | Filename of original document                  |
| `TASK_ID`                    | Task UUID used to import the document (if any) |

The script can be in any language. A simple shell script example:

```bash title="post-consumption-example"
--8<-- "./scripts/post-consumption-example.sh"
```

!!! note

    The post consumption script cannot cancel the consumption process.

!!! warning

    The post consumption script should not modify the document files
    directly.

The script's stdout and stderr will be logged line by line to the
webserver log, along with the exit code of the script.

### Docker {#docker-consume-hooks}

To hook into the consumption process when using Docker, you
will need to pass the scripts into the container via a host mount
in your `docker-compose.yml`.

Assuming you have
`/home/paperless-ngx/scripts/post-consumption-example.sh` as a
script which you'd like to run, you can pass it into the consumer
container via a host mount:

```yaml
...
webserver:
  ...
  volumes:
    ...
    - /home/paperless-ngx/scripts:/path/in/container/scripts/ # (1)!
  environment: # (3)!
    ...
    PAPERLESS_POST_CONSUME_SCRIPT: /path/in/container/scripts/post-consumption-example.sh # (2)!
  ...
```

1. The external scripts directory is mounted to a location inside the container.
2. The internal location of the script is used to set the script to run
3. This can also be set in `docker-compose.env`

Troubleshooting:

- Monitor the Docker Compose log
  `cd ~/paperless-ngx; docker compose logs -f`
- Check your script's permissions, e.g. in case of a permission error
  `sudo chmod 755 post-consumption-example.sh`
- Pipe your script's output to a log file, e.g.
  `echo "${DOCUMENT_ID}" | tee --append /usr/src/paperless/scripts/post-consumption-example.log`

## File name handling {#file-name-handling}

By default, paperless stores your documents in the media directory and
renames them using the identifier which it has assigned to each
document. You will end up getting files like `0000123.pdf` in your media
directory. This isn't necessarily a bad thing, because you normally
don't have to access these files manually. However, if you wish to name
your files differently, you can do that by adjusting the
[`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) configuration option
or using [storage paths (see below)](#storage-paths). Paperless adds the
correct file extension e.g. `.pdf`, `.jpg` automatically.

This variable allows you to configure the filename (folders are allowed)
using placeholders. For example, configuring this to

```bash
PAPERLESS_FILENAME_FORMAT={created_year}/{correspondent}/{title}
```

will create a directory structure as follows:

```
2019/
  My bank/
    Statement January.pdf
    Statement February.pdf
2020/
  My bank/
    Statement January.pdf
    Letter.pdf
    Letter_01.pdf
  Shoe store/
    My new shoes.pdf
```

!!! warning

    Do not manually move your files in the media folder. Paperless remembers
    the last filename a document was stored as. If you do rename a file,
    paperless will report your files as missing and won't be able to find
    them.

!!! tip

    Paperless checks the filename of a document whenever it is saved. Changing (or deleting)
    a [storage path](#storage-paths) will automatically be reflected in the file system. However,
    when changing `PAPERLESS_FILENAME_FORMAT` you will need to manually run the
    [`document renamer`](administration.md#renamer) to move any existing documents.

#### Placeholders

Paperless provides the following placeholders within filenames:

- `{asn}`: The archive serial number of the document, or "none".
- `{correspondent}`: The name of the correspondent, or "none".
- `{document_type}`: The name of the document type, or "none".
- `{tag_list}`: A comma separated list of all tags assigned to the
  document.
- `{title}`: The title of the document.
- `{created}`: The full date (ISO format) the document was created.
- `{created_year}`: Year created only, formatted as the year with
  century.
- `{created_year_short}`: Year created only, formatted as the year
  without century, zero padded.
- `{created_month}`: Month created only (number 01-12).
- `{created_month_name}`: Month created name, as per locale
- `{created_month_name_short}`: Month created abbreviated name, as per
  locale
- `{created_day}`: Day created only (number 01-31).
- `{added}`: The full date (ISO format) the document was added to
  paperless.
- `{added_year}`: Year added only.
- `{added_year_short}`: Year added only, formatted as the year without
  century, zero padded.
- `{added_month}`: Month added only (number 01-12).
- `{added_month_name}`: Month added name, as per locale
- `{added_month_name_short}`: Month added abbreviated name, as per
  locale
- `{added_day}`: Day added only (number 01-31).
- `{owner_username}`: Username of document owner, if any, or "none"
- `{original_name}`: Document original filename, minus the extension, if any, or "none"
- `{doc_pk}`: The paperless identifier (primary key) for the document.

!!! warning

    When using file name placeholders, in particular when using `{tag_list}`,
    you may run into the limits of your operating system's maximum path lengths.
    In that case, files will retain the previous path instead and the issue will be logged.

Paperless will try to conserve the information from your database as
much as possible. However, some characters that you can use in document
titles and correspondent names (such as `: \ /` and a couple more) are
not allowed in filenames and will be replaced with dashes.

If paperless detects that two documents share the same filename,
paperless will automatically append `_01`, `_02`, etc to the filename.
This happens if all the placeholders in a filename evaluate to the same
value.

If there are any errors in the placeholders included in `PAPERLESS_FILENAME_FORMAT`,
paperless will fall back to using the default naming scheme instead.

!!! caution

    As of now, you could potentially tell paperless to store your files anywhere
    outside the media directory by setting

    ```
    PAPERLESS_FILENAME_FORMAT=../../my/custom/location/{title}
    ```

    However, keep in mind that inside docker, if files get stored outside of
    the predefined volumes, they will be lost after a restart.

##### Empty placeholders

You can affect how empty placeholders are treated by changing the
[`PAPERLESS_FILENAME_FORMAT_REMOVE_NONE`](configuration.md#PAPERLESS_FILENAME_FORMAT_REMOVE_NONE) setting.

Enabling this results in all empty placeholders resolving to "" instead of "none" as stated above. Spaces
before empty placeholders are removed as well, and empty directories are omitted.

### Storage paths

When a single storage layout is not sufficient for your use case, storage paths allow for a more
complex structure that sets precisely where each document is stored in the file system.

- Each storage path is a [`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) and
  follows the rules described above
- Each document is assigned a storage path using the matching algorithms described above, but can be
  overwritten at any time

For example, you could define the following two storage paths:

1. Normal communications are put into a folder structure sorted by
   `year/correspondent`
2. Communications with insurance companies are stored in a flat
   structure with longer file names, but containing the full date of
   the correspondence.

```
By Year = {created_year}/{correspondent}/{title}
Insurances = Insurances/{correspondent}/{created_year}-{created_month}-{created_day} {title}
```

If you then map these storage paths to the documents, you might get the
following result. For simplicity, `By Year` defines the same
structure as in the previous example above.

```text
2019/ # By Year
  My bank/
    Statement January.pdf
    Statement February.pdf

Insurances/ # Insurances
  Healthcare 123/
    2022-01-01 Statement January.pdf
    2022-02-02 Letter.pdf
    2022-02-03 Letter.pdf
  Dental 456/
    2021-12-01 New Conditions.pdf
```

!!! tip

    Defining a storage path is optional. If no storage path is defined for a
    document, the global [`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) is applied.

## Celery Monitoring {#celery-monitoring}

The monitoring tool
[Flower](https://flower.readthedocs.io/en/latest/index.html) can be used
to view more detailed information about the health of the celery workers
used for asynchronous tasks. This includes details on currently running,
queued and completed tasks, timing and more. Flower can also be used
with Prometheus, as it exports metrics. For details on its capabilities,
refer to the [Flower](https://flower.readthedocs.io/en/latest/index.html)
documentation.

Flower can be enabled with the setting [PAPERLESS_ENABLE_FLOWER](configuration.md#PAPERLESS_ENABLE_FLOWER).
To configure Flower further, create a `flowerconfig.py` and
place it into the `src/paperless` directory. For a Docker
installation, you can use volumes to accomplish this:

```yaml
services:
  # ...
  webserver:
    environment:
      - PAPERLESS_ENABLE_FLOWER
    ports:
      - 5555:5555 # (2)!
    # ...
    volumes:
      - /path/to/my/flowerconfig.py:/usr/src/paperless/src/paperless/flowerconfig.py:ro # (1)!
```

1. Note the `:ro` tag means the file will be mounted as read only.
2. By default, Flower runs on port 5555, but this can be configured.

## Custom Container Initialization

The Docker image includes the ability to run custom user scripts during
startup. This could be utilized for installing additional tools or
Python packages, for example. Scripts are expected to be shell scripts.

To utilize this, mount a folder containing your scripts to the custom
initialization directory, `/custom-cont-init.d`, and place
scripts you wish to run inside. For security, the folder must be owned
by `root` and should have permissions of `a=rx`. Additionally, scripts
must only be writable by `root`.

Your scripts will be run directly before the webserver completes
startup. Scripts will be run by the `root` user.
If you would like to switch users, the utility `gosu` is available and
preferred over `sudo`.

This is advanced functionality with which you could break functionality
or lose data. If you experience issues, please disable any custom scripts
and try again before reporting an issue.

For example, using Docker Compose:

```yaml
services:
  # ...
  webserver:
    # ...
    volumes:
      - /path/to/my/scripts:/custom-cont-init.d:ro # (1)!
```

1. Note the `:ro` tag means the folder will be mounted as read only. This is for extra security against changes
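
As an illustration only, a script dropped into that folder might install
an extra system package at startup. This is a sketch that assumes the
Debian-based image and uses a placeholder package name:

```bash
#!/usr/bin/env bash
# Hypothetical example: install an additional OCR language pack.
# Runs as root during container startup, before the webserver is up.
apt-get update
apt-get install --yes --no-install-recommends tesseract-ocr-deu
```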

## MySQL Caveats {#mysql-caveats}

### Case Sensitivity

The database interface does not provide a method to configure a MySQL
database to be case sensitive. This would prevent a user from creating
both a tag `Name` and a tag `NAME`, as they are considered the same.

Per the Django documentation, enabling this requires manual intervention.
To enable case sensitive tables, you can execute the following command
against each table:

`ALTER TABLE <table_name> CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`

You can also set the default for new tables (this does NOT affect
existing tables) with:

`ALTER DATABASE <db_name> CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`

!!! warning

    Using MariaDB version 10.4+ is recommended. Using the `utf8mb3` character set on
    an older system may fix issues that can arise while setting up Paperless-ngx but
    `utf8mb3` can cause issues with consumption (where `utf8mb4` does not).

### Missing timezones

MySQL as well as MariaDB do not have any timezone information by default (though some
docker images such as the official MariaDB image take care of this for you) which will
cause unexpected behavior with date-based queries.

To fix this, execute one of the following commands:

MySQL: `mysql_tzinfo_to_sql /usr/share/zoneinfo | mysql -u root mysql -p`

MariaDB: `mariadb-tzinfo-to-sql /usr/share/zoneinfo | mariadb -u root mysql -p`

## Barcodes {#barcodes}

Paperless is able to utilize barcodes for automatically performing some tasks.

At this time, the library utilized for detection of barcodes supports the following types:

- EAN-13/UPC-A
- UPC-E
- EAN-8
- Code 128
- Code 93
- Code 39
- Codabar
- Interleaved 2 of 5
- QR Code
- SQ Code

You may check for updates on the [zbar library homepage](https://github.com/mchehab/zbar).
For usage in Paperless, the type of barcode does not matter, only its contents.

For how to enable barcode usage, see [the configuration](configuration.md#barcodes).
The two settings may be enabled independently, but do have interactions as explained
below.

### Document Splitting {#document-splitting}

When enabled, Paperless will look for a barcode with the configured value and create a new document
starting from the next page. The page with the barcode on it will _not_ be retained. It
is expected to be a page existing only for triggering the split.

### Archive Serial Number Assignment

When enabled, the value of the barcode (as an integer) will be used to set the document's
archive serial number, allowing quick reference back to the original, paper document.

If document splitting via barcode is also enabled, documents will be split when an ASN
barcode is located. However, differing from the splitting, the page with the
barcode _will_ be retained. This allows application of a barcode to any page, including
one which holds data to keep in the document.

### Tag Assignment

When enabled, Paperless will parse barcodes and attempt to interpret and assign tags.

See the relevant settings [`PAPERLESS_CONSUMER_ENABLE_TAG_BARCODE`](configuration.md#PAPERLESS_CONSUMER_ENABLE_TAG_BARCODE)
and [`PAPERLESS_CONSUMER_TAG_BARCODE_MAPPING`](configuration.md#PAPERLESS_CONSUMER_TAG_BARCODE_MAPPING)
for more information.

## Automatic collation of double-sided documents {#collate}

!!! note

    If your scanner supports double-sided scanning natively, you do not need this feature.

This feature is turned off by default; see [configuration](configuration.md#collate) on how to turn it on.

### Summary

If you have a scanner with an automatic document feeder (ADF) that only scans a single side,
this feature makes scanning double-sided documents much more convenient by automatically
collating two separate scans into one document, reordering the pages as necessary.

### Usage example

Suppose you have a double-sided document with 6 pages (3 sheets of paper). First,
put the stack into your ADF as normal, ensuring that page 1 is scanned first. Your ADF
will now scan pages 1, 3, and 5. Then you (or your scanner, if it supports it) upload
the scan into the correct sub-directory of the consume folder (`double-sided` by default;
keep in mind that Paperless will _not_ automatically create the directory for you.)
Paperless will then process the scan and move it into an internal staging area.

The next step is to turn your stack upside down (without reordering the sheets of paper),
and scan it once again. Your ADF will now scan pages 6, 4, and 2, in that order. Once this
scan is copied into the sub-directory, Paperless will collate the previous scan with the
new one, reversing the order of the pages on the second, "even numbered" scan. The
resulting document will have the pages 1-6 in the correct order, and this new file will
then be processed as normal.

!!! tip

    When scanning the even numbered pages, you can omit the last empty pages, if there are
    any. For example, if page 6 is empty, you only need to scan pages 2 and 4. _Do not_ omit
    empty pages in the middle of the document.

### Things that could go wrong

Paperless will notice when the first, "odd numbered" scan has fewer pages than the second
scan (this can happen when e.g. the ADF skipped a few pages in the first pass). In that
case, Paperless will remove the staging copy as well as the scan, and give you an error
message asking you to restart the process from scratch, by scanning the odd pages again,
followed by the even pages.

It's important that the scan files get consumed in the correct order, and one at a time.
You therefore need to make sure that Paperless is running while you upload the files into
the directory; and if you're using [polling](configuration.md#polling), make sure that
`CONSUMER_POLLING` is set to a value lower than the time it takes for the second scan to
appear, like 5-10 or even lower.

Another thing that might happen is that you start a double sided scan, but then forget
to upload the second file. To avoid collating the wrong documents if you then come back
a day later to scan a new double-sided document, Paperless will only keep an "odd numbered
pages" file for up to 30 minutes. If more time passes, it will consider the next incoming
scan a completely new "odd numbered pages" one. The old staging file will get discarded.

### Interaction with "subdirs as tags"

The collation feature can be used together with the [subdirs as tags](configuration.md#consume_config)
feature (but this is not a requirement). Just create a correctly named double-sided subdir
in the hierarchy and upload your scans there. For example, both `double-sided/foo/bar` as
well as `foo/bar/double-sided` will cause the collated document to be treated as if it
were uploaded into `foo/bar` and receive both `foo` and `bar` tags, but not `double-sided`.

### Interaction with document splitting

You can use the [document splitting](#document-splitting) feature, but if you use a normal
single-sided split marker page, the split document(s) will have an empty page at the front (or
whatever else was on the backside of the split marker page.) You can work around that by having
a split marker page that has the split barcode on _both_ sides. This way, the extra page will
get automatically removed.

## SSO and third party authentication with Paperless-ngx

Paperless-ngx has a built-in authentication system from Django, but you can easily integrate an
external authentication solution using one of the following methods:

### Remote User authentication

This is a simple option that uses remote user authentication made available by certain SSO
applications. See the relevant configuration options for more information:
[PAPERLESS_ENABLE_HTTP_REMOTE_USER](configuration.md#PAPERLESS_ENABLE_HTTP_REMOTE_USER),
[PAPERLESS_HTTP_REMOTE_USER_HEADER_NAME](configuration.md#PAPERLESS_HTTP_REMOTE_USER_HEADER_NAME)
and [PAPERLESS_LOGOUT_REDIRECT_URL](configuration.md#PAPERLESS_LOGOUT_REDIRECT_URL)

### OpenID Connect and social authentication

Version 2.5.0 of Paperless-ngx added support for integrating other authentication systems via
the [django-allauth](https://github.com/pennersr/django-allauth) package. Once set up, users
can either log in or (optionally) sign up using any third party systems you integrate. See the
relevant [configuration settings](configuration.md#PAPERLESS_SOCIALACCOUNT_PROVIDERS) and
[django-allauth docs](https://docs.allauth.org/en/latest/socialaccount/configuration.html)
for more information.

To associate an existing Paperless-ngx account with a social account, first log in with your
regular credentials and then choose "My Profile" from the user dropdown in the app and you
will see options to connect social account(s). If enabled, signup options will be available
on the login page.

As an example, to set up login via GitHub, the following environment variables would need to be
set:

```conf
PAPERLESS_APPS="allauth.socialaccount.providers.github"
PAPERLESS_SOCIALACCOUNT_PROVIDERS='{"github": {"APPS": [{"provider_id": "github","name": "Github","client_id": "<CLIENT_ID>","secret": "<CLIENT_SECRET>"}]}}'
```

Or, to use OpenID Connect ("OIDC"), via Keycloak in this example:

```conf
PAPERLESS_APPS="allauth.socialaccount.providers.openid_connect"
PAPERLESS_SOCIALACCOUNT_PROVIDERS='
{"openid_connect": {"APPS": [{"provider_id": "keycloak","name": "Keycloak","client_id": "paperless","secret": "<CLIENT_SECRET>","settings": { "server_url": "https://<KEYCLOAK_SERVER>/realms/<REALM>/.well-known/openid-configuration"}}]}}'
```

More details about configuration options for various providers can be found in the [allauth documentation](https://docs.allauth.org/en/latest/socialaccount/providers/index.html#provider-specifics).

### Disabling Regular Login

Once external auth is set up, 'regular' login can be disabled with the [PAPERLESS_DISABLE_REGULAR_LOGIN](configuration.md#PAPERLESS_DISABLE_REGULAR_LOGIN) setting.
11
docs/advanced_usage.rst
Normal file
@@ -0,0 +1,11 @@

.. _advanced_usage:

***************
Advanced topics
***************

.. cssclass:: redirect-notice

The Paperless-ngx documentation has permanently moved.

You will be redirected shortly...
509
docs/api.md
@@ -1,509 +0,0 @@

# The REST API

Paperless makes use of the [Django REST
Framework](https://django-rest-framework.org/) standard API interface. It
provides a browsable API for most of its endpoints, which you can
inspect at `http://<paperless-host>:<port>/api/`. This also documents
most of the available filters and ordering fields.

The API provides the following main endpoints:

- `/api/correspondents/`: Full CRUD support.
- `/api/custom_fields/`: Full CRUD support.
- `/api/documents/`: Full CRUD support, except POSTing new documents.
  See [below](#posting-documents-file-uploads).
- `/api/document_types/`: Full CRUD support.
- `/api/groups/`: Full CRUD support.
- `/api/logs/`: Read-Only.
- `/api/mail_accounts/`: Full CRUD support.
- `/api/mail_rules/`: Full CRUD support.
- `/api/profile/`: GET, PATCH
- `/api/share_links/`: Full CRUD support.
- `/api/storage_paths/`: Full CRUD support.
- `/api/tags/`: Full CRUD support.
- `/api/tasks/`: Read-only.
- `/api/users/`: Full CRUD support.
- `/api/workflows/`: Full CRUD support.
- `/api/search/`: GET, see [below](#global-search).

All of these endpoints except for the logging endpoint allow you to
fetch (and edit and delete where appropriate) individual objects by
appending their primary key to the path, e.g. `/api/documents/454/`.

The objects served by the document endpoint contain the following
fields:

- `id`: ID of the document. Read-only.
- `title`: Title of the document.
- `content`: Plain text content of the document.
- `tags`: List of IDs of tags assigned to this document, or empty
  list.
- `document_type`: Document type of this document, or null.
- `correspondent`: Correspondent of this document or null.
- `created`: The date time at which this document was created.
- `created_date`: The date (YYYY-MM-DD) at which this document was
  created. Optional. If also passed with created, this is ignored.
- `modified`: The date at which this document was last edited in
  paperless. Read-only.
- `added`: The date at which this document was added to paperless.
  Read-only.
- `archive_serial_number`: The identifier of this document in a
  physical document archive.
- `original_file_name`: Verbose filename of the original document.
  Read-only.
- `archived_file_name`: Verbose filename of the archived document.
  Read-only. Null if no archived document is available.
- `notes`: Array of notes associated with the document.
- `set_permissions`: Allows setting document permissions. Optional,
  write-only. See [below](#permissions).
- `custom_fields`: Array of custom fields & values, specified as
  `{ field: CUSTOM_FIELD_ID, value: VALUE }`

!!! note

    Note that all endpoint URLs must end with a `/` (slash).
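
Since the document endpoint supports full CRUD on individual objects,
writable fields can be updated with a PATCH request. A minimal sketch,
assuming token authentication (see [Authorization](#authorization)) and
the illustrative document ID 454:

```shell-session
$ curl -X PATCH \
    -H "Authorization: Token <token>" \
    -H "Content-Type: application/json" \
    -d '{"title": "New title"}' \
    http://<paperless-host>:<port>/api/documents/454/
```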

## Downloading documents

In addition to that, the document endpoint offers these additional
actions on individual documents:

- `/api/documents/<pk>/download/`: Download the document.
- `/api/documents/<pk>/preview/`: Display the document inline, without
  downloading it.
- `/api/documents/<pk>/thumb/`: Download the PNG thumbnail of a
  document.

Paperless generates archived PDF/A documents from consumed files and
stores both the original files as well as the archived files. By
default, the endpoints for previews and downloads serve the archived
file, if it is available. Otherwise, the original file is served. Some
documents cannot be archived.

The endpoints correctly serve the response header fields
`Content-Disposition` and `Content-Type` to indicate the filename for
download and the type of content of the document.

In order to download or preview the original document when an archived
document is available, supply the query parameter `original=true`.
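
For example, a sketch of fetching the original file of the illustrative
document 454 with token authentication:

```shell-session
$ curl -H "Authorization: Token <token>" \
    -o original.pdf \
    "http://<paperless-host>:<port>/api/documents/454/download/?original=true"
```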

!!! tip

    Paperless used to provide this functionality at `/fetch/<pk>/preview`,
    `/fetch/<pk>/thumb` and `/fetch/<pk>/doc`. Redirects to the new URLs are
    in place. However, if you use these old URLs to access documents, you
    should update your app or script to use the new URLs.

## Getting document metadata

The API also has an endpoint to retrieve read-only metadata about
specific documents. This information is not served along with the
document objects, since it requires reading files and would therefore
slow down document lists considerably.

Access the metadata of a document with an ID `id` at
`/api/documents/<id>/metadata/`.

The endpoint reports the following data:

- `original_checksum`: MD5 checksum of the original document.
- `original_size`: Size of the original document, in bytes.
- `original_mime_type`: Mime type of the original document.
- `media_filename`: Current filename of the document, under which it
  is stored inside the media directory.
- `has_archive_version`: True if this document is archived, false
  otherwise.
- `original_metadata`: A list of metadata associated with the original
  document. See below.
- `archive_checksum`: MD5 checksum of the archived document, or null.
- `archive_size`: Size of the archived document in bytes, or null.
- `archive_metadata`: Metadata associated with the archived document,
  or null. See below.

File metadata is reported as a list of objects in the following form:

```json
[
  {
    "namespace": "http://ns.adobe.com/pdf/1.3/",
    "prefix": "pdf",
    "key": "Producer",
    "value": "SparklePDF, Fancy edition"
  }
]
```

`namespace` and `prefix` can be null. The actual metadata reported
depends on the file type and the metadata available in that specific
document. Paperless only reports PDF metadata at this point.
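
The reported checksums make it possible to verify a download end-to-end. A
small sketch, reusing the placeholder token and document ID from above:

```python
import hashlib

import requests

BASE_URL = "http://localhost:8000"
HEADERS = {"Authorization": "Token YOUR_API_TOKEN"}

meta = requests.get(f"{BASE_URL}/api/documents/123/metadata/", headers=HEADERS).json()
original = requests.get(
    f"{BASE_URL}/api/documents/123/download/",
    params={"original": "true"},  # compare against the original file
    headers=HEADERS,
).content

# The MD5 of the downloaded bytes should match the reported checksum.
assert hashlib.md5(original).hexdigest() == meta["original_checksum"]
```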

## Documents additional endpoints

- `/api/documents/<id>/notes/`: Retrieve notes for a document.
- `/api/documents/<id>/share_links/`: Retrieve share links for a document.
- `/api/documents/<id>/history/`: Retrieve history of changes for a document.

## Authorization

The REST API provides four different forms of authentication.

1. Basic authentication

   Authorize by providing an HTTP header in the form

   ```
   Authorization: Basic <credentials>
   ```

   where `credentials` is a base64-encoded string of
   `<username>:<password>`.

2. Session authentication

   When you're logged into paperless in your browser, you're
   automatically logged into the API as well and don't need to provide
   any authorization headers.

3. Token authentication

   You can create (or re-create) an API token by opening the "My Profile"
   link in the user dropdown found in the web UI and clicking the circular
   arrow button.

   Paperless also offers an endpoint to acquire authentication tokens.

   POST a username and password as a form or JSON string to
   `/api/token/` and paperless will respond with a token, if the login
   data is correct. This token can be used to authenticate other
   requests with the following HTTP header:

   ```
   Authorization: Token <token>
   ```

   Tokens can also be managed in the Django admin.

4. Remote User authentication

   If enabled (see
   [configuration](configuration.md#PAPERLESS_ENABLE_HTTP_REMOTE_USER_API)),
   you can authenticate against the API using Remote User auth.
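
As a sketch, acquiring a token via `/api/token/` and using it for a follow-up
request; credentials are placeholders, and the response is assumed to carry
the token under a `token` key:

```python
import requests

BASE_URL = "http://localhost:8000"

# Exchange username and password for a token.
resp = requests.post(
    f"{BASE_URL}/api/token/",
    json={"username": "admin", "password": "secret"},
)
resp.raise_for_status()
token = resp.json()["token"]  # assumed response key

# Authenticate any further request with the Token header.
docs = requests.get(
    f"{BASE_URL}/api/documents/",
    headers={"Authorization": f"Token {token}"},
)
print(docs.json()["count"])
```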

## Global search

A global search endpoint is available at `/api/search/` and requires a search term
of > 2 characters, e.g. `?query=foo`. This endpoint returns a maximum of 3 results
across nearly all objects, e.g. documents, tags, saved views, mail rules, etc.
Results are only included if the requesting user has the appropriate permissions.

Results are returned in the following format:

```json
{
  total: number
  documents: []
  saved_views: []
  correspondents: []
  document_types: []
  storage_paths: []
  tags: []
  users: []
  groups: []
  mail_accounts: []
  mail_rules: []
  custom_fields: []
  workflows: []
}
```

Global search first searches objects by name (or title for documents) matching the query.
If the optional `db_only` parameter is set, only document titles will be searched. Otherwise,
if the number of documents returned by a simple title string search is < 3, results from the
search index will also be included.
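
A minimal global search call might look like this (token placeholder as
before):

```python
import requests

resp = requests.get(
    "http://localhost:8000/api/search/",
    params={"query": "invoice"},  # must be longer than 2 characters
    headers={"Authorization": "Token YOUR_API_TOKEN"},
)
results = resp.json()
print(results["total"], [d["title"] for d in results["documents"]])
```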

## Searching for documents

Full text searching is available on the `/api/documents/` endpoint. Two
specific query parameters cause the API to return full text search
results:

- `/api/documents/?query=your%20search%20query`: Search for a document
  using a full text query. For details on the syntax, see [Basic Usage - Searching](usage.md#basic-usage_searching).
- `/api/documents/?more_like_id=1234`: Search for documents similar to
  the document with id 1234.

Pagination works exactly the same as it does for normal requests on this
endpoint.

Certain limitations apply to full text queries:

- Results are always sorted by search score. The results matching the
  query best will show up first.
- Only a small subset of filtering parameters are supported.

Furthermore, each returned document has an additional `__search_hit__`
attribute with various information about the search results:

```
{
    "count": 31,
    "next": "http://localhost:8000/api/documents/?page=2&query=test",
    "previous": null,
    "results": [

        ...

        {
            "id": 123,
            "title": "title",
            "content": "content",

            ...

            "__search_hit__": {
                "score": 0.343,
                "highlights": "text <span class="match">Test</span> text",
                "rank": 23
            }
        },

        ...

    ]
}
```

- `score` is an indication of how well this document matches the query
  relative to the other search results.
- `highlights` is an excerpt from the document content and highlights
  the search terms with `<span>` tags as shown above.
- `rank` is the index of the search results. The first result will
  have rank 0.
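
Putting this together, a full text query and a look at its hits might be
sketched as follows:

```python
import requests

resp = requests.get(
    "http://localhost:8000/api/documents/",
    params={"query": "test"},
    headers={"Authorization": "Token YOUR_API_TOKEN"},
)
for doc in resp.json()["results"]:
    hit = doc["__search_hit__"]
    print(f"#{hit['rank']} (score {hit['score']:.3f}): {doc['title']}")
```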

### `/api/search/autocomplete/`

Get auto completions for a partial search term.

Query parameters:

- `term`: The incomplete term.
- `limit`: Number of results. Defaults to 10.

Results returned by the endpoint are ordered by importance of the term
in the document index. The first result is the term that has the highest
[Tf/Idf](https://en.wikipedia.org/wiki/Tf%E2%80%93idf) score in the index.

```json
["term1", "term3", "term6", "term4"]
```

## POSTing documents {#file-uploads}

The API provides a special endpoint for file uploads:

`/api/documents/post_document/`

POST a multipart form to this endpoint, where the form field `document`
contains the document that you want to upload to paperless. The filename
is sanitized and then used to store the document in a temporary
directory, and the consumer will be instructed to consume the document
from there.

The endpoint supports the following optional form fields:

- `title`: Specify a title that the consumer should use for the
  document.
- `created`: Specify a DateTime where the document was created (e.g.
  "2016-04-19" or "2016-04-19 06:15:00+02:00").
- `correspondent`: Specify the ID of a correspondent that the consumer
  should use for the document.
- `document_type`: Similar to correspondent.
- `storage_path`: Similar to correspondent.
- `tags`: Similar to correspondent. Specify this multiple times to
  have multiple tags added to the document.
- `archive_serial_number`: An optional archive serial number to set.
- `custom_fields`: An array of custom field ids to assign (with an empty
  value) to the document.

The endpoint will immediately return HTTP 200 if the document consumption
process was started successfully, with the UUID of the consumption task
as the data. No additional status information about the consumption process
itself is available immediately, since that happens in a different process.
However, querying the tasks endpoint with the returned UUID, e.g.
`/api/tasks/?task_id={uuid}`, will provide information on the state of the
consumption, including the ID of a created document if consumption succeeded.
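
A sketch of an upload followed by polling the tasks endpoint; the file path
and token are placeholders, and the task fields `status` and
`related_document` are assumptions about the shape of the tasks response:

```python
import time

import requests

BASE_URL = "http://localhost:8000"
HEADERS = {"Authorization": "Token YOUR_API_TOKEN"}

# The file itself goes into the multipart form field "document".
with open("invoice.pdf", "rb") as f:
    resp = requests.post(
        f"{BASE_URL}/api/documents/post_document/",
        headers=HEADERS,
        files={"document": f},
        data={"title": "Invoice", "tags": [1, 2]},  # a list sends "tags" twice
    )
resp.raise_for_status()
task_id = resp.json()  # the consumption task UUID

# Poll until the consumer has finished with the document.
while True:
    tasks = requests.get(
        f"{BASE_URL}/api/tasks/", headers=HEADERS, params={"task_id": task_id}
    ).json()
    if tasks and tasks[0]["status"] in ("SUCCESS", "FAILURE"):
        print(tasks[0]["status"], tasks[0].get("related_document"))
        break
    time.sleep(2)
```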

## Permissions

All objects (documents, tags, etc.) allow setting object-level permissions
with optional `owner` and / or `set_permissions` parameters, which are of
the form:

```
"owner": ...,
"set_permissions": {
    "view": {
        "users": [...],
        "groups": [...],
    },
    "change": {
        "users": [...],
        "groups": [...],
    },
}
```

!!! note

    Arrays should contain user or group ID numbers.

If these parameters are supplied, the object's permissions will be overwritten,
assuming the authenticated user has permission to do so (the user must be
the object owner or a superuser).
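
For example, handing a document to a new owner and granting a group view
access might be sketched as (all IDs are placeholders):

```python
import requests

resp = requests.patch(
    "http://localhost:8000/api/documents/123/",
    headers={"Authorization": "Token YOUR_API_TOKEN"},
    json={
        "owner": 1,  # user ID of the new owner
        "set_permissions": {
            "view": {"users": [], "groups": [2]},  # group 2 may view
            "change": {"users": [], "groups": []},
        },
    },
)
resp.raise_for_status()
```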

### Retrieving full permissions

By default, the API will return a truncated version of object-level
permissions, returning `user_can_change` indicating whether the current user
can edit the object (either because they are the object owner or have permissions
granted). You can pass the parameter `full_perms=true` to API calls to view the
full permissions of objects in a format that mirrors the `set_permissions`
parameter above.

## Bulk Editing

The API supports various bulk-editing operations, which are executed asynchronously.

### Documents

For bulk operations on documents, use the endpoint `/api/documents/bulk_edit/`, which accepts
a JSON payload of the format:

```json
{
  "documents": [LIST_OF_DOCUMENT_IDS],
  "method": METHOD, // see below
  "parameters": args // see below
}
```

The following methods are supported:

- `set_correspondent`
  - Requires `parameters`: `{ "correspondent": CORRESPONDENT_ID }`
- `set_document_type`
  - Requires `parameters`: `{ "document_type": DOCUMENT_TYPE_ID }`
- `set_storage_path`
  - Requires `parameters`: `{ "storage_path": STORAGE_PATH_ID }`
- `add_tag`
  - Requires `parameters`: `{ "tag": TAG_ID }`
- `remove_tag`
  - Requires `parameters`: `{ "tag": TAG_ID }`
- `modify_tags`
  - Requires `parameters`: `{ "add_tags": [LIST_OF_TAG_IDS] }` and / or `{ "remove_tags": [LIST_OF_TAG_IDS] }`
- `delete`
  - No `parameters` required
- `redo_ocr`
  - No `parameters` required
- `set_permissions`
  - Requires `parameters`:
    - `"set_permissions": PERMISSIONS_OBJ` (see format [above](#permissions)) and / or
    - `"owner": OWNER_ID or null`
    - `"merge": true or false` (defaults to false)
  - The `merge` flag determines if the supplied permissions will overwrite all existing permissions (including
    removing them) or be merged with existing permissions.
- `merge`
  - No additional `parameters` required.
  - The ordering of the merged document is determined by the list of IDs.
  - Optional `parameters`:
    - `"metadata_document_id": DOC_ID` apply metadata (tags, correspondent, etc.) from this document to the merged document.
- `split`
  - Requires `parameters`:
    - `"pages": [..]` The list should be a list of pages and/or ranges, separated by commas, e.g. `"[1,2-3,4,5-7]"`
  - The split operation only accepts a single document.
- `rotate`
  - Requires `parameters`:
    - `"degrees": DEGREES`. Must be an integer, i.e. 90, 180 or 270
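
A sketch of a bulk tag change across several documents (IDs are
placeholders):

```python
import requests

resp = requests.post(
    "http://localhost:8000/api/documents/bulk_edit/",
    headers={"Authorization": "Token YOUR_API_TOKEN"},
    json={
        "documents": [10, 11, 12],
        "method": "modify_tags",
        "parameters": {"add_tags": [1], "remove_tags": [2]},
    },
)
resp.raise_for_status()
```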

### Objects

Bulk editing for objects (tags, document types, etc.) currently supports set permissions or delete
operations, using the endpoint `/api/bulk_edit_objects/`, which requires a JSON payload of the format:

```json
{
  "objects": [LIST_OF_OBJECT_IDS],
  "object_type": "tags", "correspondents", "document_types" or "storage_paths",
  "operation": "set_permissions" or "delete",
  "owner": OWNER_ID, // optional
  "permissions": { "view": { "users": [] ... }, "change": { ... } }, // (see 'set_permissions' format above)
  "merge": true / false // defaults to false, see above
}
```

## API Versioning

The REST API is versioned since Paperless-ngx 1.3.0.

- Versioning ensures that changes to the API don't break older
  clients.
- Clients specify the specific version of the API they wish to use
  with every request and Paperless will handle the request using the
  specified API version.
- Even if the underlying data model changes, older API versions will
  always serve compatible data.
- If no version is specified, Paperless will serve version 1 to ensure
  compatibility with older clients that do not request a specific API
  version.

API versions are specified by submitting an additional HTTP `Accept`
header with every request:

```
Accept: application/json; version=6
```

If an invalid version is specified, Paperless 1.3.0 will respond with
"406 Not Acceptable" and an error message in the body. Earlier
versions of Paperless will serve API version 1 regardless of whether a
version is specified via the `Accept` header.

If a client wishes to verify whether it is compatible with any given
server, the following procedure should be performed:

1. Perform an _authenticated_ request against any API endpoint. If the
   server is on version 1.3.0 or newer, the server will add two custom
   headers to the response:

   ```
   X-Api-Version: 2
   X-Version: 1.3.0
   ```

2. Determine whether the client is compatible with this server based on
   the presence/absence of these headers and their values if present.
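
A sketch of that handshake, pinning an API version and reading the
compatibility headers back (token is a placeholder):

```python
import requests

resp = requests.get(
    "http://localhost:8000/api/documents/",
    headers={
        "Authorization": "Token YOUR_API_TOKEN",
        # Pin the API version this client was written against.
        "Accept": "application/json; version=2",
    },
)
print(resp.headers.get("X-Api-Version"))  # e.g. "2"
print(resp.headers.get("X-Version"))      # e.g. "1.3.0"
```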

### API Changelog

#### Version 1

Initial API version.

#### Version 2

- Added field `Tag.color`. This read/write string field contains a hex
  color such as `#a6cee3`.
- Added read-only field `Tag.text_color`. This field contains the text
  color to use for a specific tag, which is either black or white
  depending on the brightness of `Tag.color`.
- Removed field `Tag.colour`.

#### Version 3

- Permissions endpoints have been added.
- The format of the `/api/ui_settings/` has changed.

#### Version 4

- Consumption templates were refactored to workflows and API endpoints
  changed as such.

12
docs/api.rst
Normal file
@@ -0,0 +1,12 @@
.. _api:

************
The REST API
************


.. cssclass:: redirect-notice

The Paperless-ngx documentation has permanently moved.

You will be redirected shortly...
@@ -1,100 +0,0 @@
:root > * {
  --md-primary-fg-color: #17541f;
  --md-primary-fg-color--dark: #17541f;
  --md-primary-fg-color--light: #17541f;
  --md-accent-fg-color: #2b8a38;
  --md-typeset-a-color: #21652a;
}

[data-md-color-scheme="slate"] {
  --md-hue: 222;
}

@media (min-width: 400px) {
  .grid-left {
    width: 33%;
    float: left;
  }
  .grid-right {
    width: 62%;
    margin-left: 4%;
    float: left;
  }

  .grid-flipped-left {
    width: 66%;
    float: left;
  }

  .grid-flipped-right {
    width: 29%;
    margin-left: 4%;
    float: left;
  }

  .grid-half-left {
    width: 48%;
    float: left;
  }

  .grid-half-right {
    width: 48%;
    margin-left: 4%;
    float: left;
  }
}

.grid-left > p {
  margin-bottom: 2rem;
}

.grid-right p {
  margin: 0;
}

.clear {
  clear: both;
  margin-bottom: 20px;
  display: block;
}

.index-callout {
  margin-right: .5rem;
}

/* make code in headers not bold */
h4 code {
  font-weight: normal;
}

/* Hide config vars from the sidebar/toc, and move the border on mobile since they're hidden */
.md-nav.md-nav--secondary .md-nav__item .md-nav__link[href*="PAPERLESS_"],
.md-nav.md-nav--secondary .md-nav__item .md-nav__link[href*="USERMAP_"] {
  display: none;
}

@media screen and (max-width: 76.1875em) {
  .md-nav--primary .md-nav__item {
    border-top: none;
  }

  .md-nav--primary .md-nav__link {
    border-top: .05rem solid var(--md-default-fg-color--lightest);
  }
}

/* Show search shortcut key */
[data-md-toggle="search"]:not(:checked) ~ .md-header .md-search__form::after {
  position: absolute;
  top: .3rem;
  right: .3rem;
  display: block;
  padding: .1rem .4rem;
  color: var(--md-default-fg-color--lighter);
  font-weight: bold;
  font-size: .8rem;
  border: .05rem solid var(--md-default-fg-color--lighter);
  border-radius: .1rem;
  content: "/";
}
Image removed (768 B)
@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 1000 1000" style="enable-background:new 0 0 1000 1000;" xml:space="preserve">
<style type="text/css">
.st0{fill:#FFFFFF;}
</style>
<path class="st0" d="M299,891.7c-4.2-19.8-12.5-59.6-13.6-59.6c-176.7-105.7-155.8-288.7-97.3-393.4
c12.5,131.8,245.8,222.8,109.8,383.9c-1.1,2,6.2,27.2,12.5,50.2c27.2-46,68-101.4,65.8-106.7C208.9,358.2,731.9,326.9,840.6,73.7
c49.1,244.8-25.1,623.5-445.5,719.7c-2,1.1-76.3,131.8-79.5,132.9c0-2-31.4-1.1-27.2-11.5C290.7,908.4,294.8,900.1,299,891.7
L299,891.7z M293.8,793.4c53.3-61.8-9.4-167.4-47.1-201.9C310.5,701.3,306.3,765.1,293.8,793.4L293.8,793.4z"/>
</svg>
Image removed (869 B)
@@ -1,68 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 2962.2 860.2" style="enable-background:new 0 0 2962.2 860.2;" xml:space="preserve">
<style type="text/css">
.st0{fill:#17541F;stroke:#000000;stroke-miterlimit:10;}
</style>
<path d="M1055.6,639.7v-20.6c-18,20-43.1,30.1-75.4,30.1c-22.4,0-42.8-5.8-61-17.5c-18.3-11.7-32.5-27.8-42.9-48.3
c-10.3-20.5-15.5-43.3-15.5-68.4c0-25.1,5.2-48,15.5-68.5s24.6-36.6,42.9-48.3s38.6-17.5,61-17.5c32.3,0,57.5,10,75.4,30.1v-20.6
h85.3v249.6L1055.6,639.7L1055.6,639.7z M1059.1,514.9c0-17.4-5.2-31.9-15.5-43.8c-10.3-11.8-23.9-17.7-40.6-17.7
c-16.8,0-30.2,5.9-40.4,17.7c-10.2,11.8-15.3,26.4-15.3,43.8c0,17.4,5.1,31.9,15.3,43.8c10.2,11.8,23.6,17.7,40.4,17.7
c16.8,0,30.3-5.9,40.6-17.7C1054,546.9,1059.1,532.3,1059.1,514.9z"/>
<path d="M1417.8,398.2c18.3,11.7,32.5,27.8,42.9,48.3c10.3,20.5,15.5,43.3,15.5,68.5c0,25.1-5.2,48-15.5,68.4
c-10.3,20.5-24.6,36.6-42.9,48.3s-38.6,17.5-61,17.5c-32.3,0-57.5-10-75.4-30.1v165.6h-85.3V390.2h85.3v20.6
c18-20,43.1-30.1,75.4-30.1C1379.2,380.7,1399.5,386.6,1417.8,398.2z M1389.5,514.9c0-17.4-5.1-31.9-15.3-43.8
c-10.2-11.8-23.6-17.7-40.4-17.7s-30.2,5.9-40.4,17.7c-10.2,11.8-15.3,26.4-15.3,43.8c0,17.4,5.1,31.9,15.3,43.8
c10.2,11.8,23.6,17.7,40.4,17.7s30.2-5.9,40.4-17.7S1389.5,532.3,1389.5,514.9z"/>
<path d="M1713.6,555.3l53,49.4c-28.1,29.6-66.7,44.4-115.8,44.4c-28.1,0-53-5.8-74.5-17.5s-38.2-27.7-49.8-48
c-11.7-20.3-17.7-43.2-18-68.7c0-24.8,5.9-47.5,17.7-68c11.8-20.5,28.1-36.7,48.7-48.5s43.5-17.7,68.7-17.7
c24.8,0,47.6,6.1,68.2,18.2s37,29.5,49.1,52.3c12.1,22.7,18.2,49.1,18.2,79l-0.4,11.7h-181.8c3.6,11.4,10.5,20.7,20.9,28.1
c10.3,7.3,21.3,11,33,11c14.4,0,26.3-2.2,35.7-6.5C1695.8,570.1,1704.9,563.7,1713.6,555.3z M1596.9,486.2h92.9
c-2.1-12.3-7.5-22.1-16.2-29.4s-18.7-11-30.1-11s-21.5,3.7-30.3,11S1599,473.9,1596.9,486.2z"/>
<path d="M1908.8,418.4c7.8-10.8,17.2-19,28.3-24.7s22-8.5,32.8-8.5c11.4,0,20,1.6,26,4.9l-10.8,72.7c-8.4-2.1-15.7-3.1-22-3.1
c-17.1,0-30.4,4.3-39.9,12.8c-9.6,8.5-14.4,24.2-14.4,46.9v120.3h-85.3V390.2h85.3V418.4L1908.8,418.4z"/>
<path d="M2113,258.2v381.5h-85.3V258.2H2113z"/>
<path d="M2360.8,555.3l53,49.4c-28.1,29.6-66.7,44.4-115.8,44.4c-28.1,0-53-5.8-74.5-17.5s-38.2-27.7-49.8-48
c-11.7-20.3-17.7-43.2-18-68.7c0-24.8,5.9-47.5,17.7-68s28.1-36.7,48.7-48.5c20.6-11.8,43.5-17.7,68.7-17.7
c24.8,0,47.6,6.1,68.2,18.2c20.6,12.1,37,29.5,49.1,52.3c12.1,22.7,18.2,49.1,18.2,79l-0.4,11.7h-181.8
c3.6,11.4,10.5,20.7,20.9,28.1c10.3,7.3,21.3,11,33,11c14.4,0,26.3-2.2,35.7-6.5C2343.1,570.1,2352.1,563.7,2360.8,555.3z
M2244.1,486.2h92.9c-2.1-12.3-7.5-22.1-16.2-29.4s-18.7-11-30.1-11s-21.5,3.7-30.3,11C2251.7,464.1,2246.2,473.9,2244.1,486.2z"/>
<path d="M2565.9,446.3c-9.9,0-17.1,1.1-21.5,3.4c-4.5,2.2-6.7,5.9-6.7,11s3.4,8.8,10.3,11.2c6.9,2.4,18,4.9,33.2,7.6
c20,3,37,6.7,50.9,11.2s26,12.1,36.1,22.9c10.2,10.8,15.3,25.9,15.3,45.3c0,29.9-10.9,52.4-32.8,67.6
c-21.8,15.1-50.3,22.7-85.3,22.7c-25.7,0-49.5-3.7-71.4-11c-21.8-7.3-37.4-14.7-46.7-22.2l33.7-60.6c10.2,9,23.4,15.8,39.7,20.4
c16.3,4.6,31.3,7,45.1,7c19.7,0,29.6-5.2,29.6-15.7c0-5.4-3.3-9.4-9.9-11.9c-6.6-2.5-17.2-5.2-31.9-7.9c-18.9-3.3-34.9-7.2-48-11.7
c-13.2-4.5-24.6-12.2-34.3-23.1c-9.7-10.9-14.6-26-14.6-45.1c0-27.2,9.7-48.5,29-63.7c19.3-15.3,46-22.9,80.1-22.9
c23.3,0,44.4,3.6,63.3,10.8c18.9,7.2,34,14.5,45.3,22l-32.8,58.8c-10.8-7.5-23.2-13.7-37.3-18.6
C2590.5,448.7,2577.6,446.3,2565.9,446.3z"/>
<path d="M2817.3,446.3c-9.9,0-17.1,1.1-21.5,3.4c-4.5,2.2-6.7,5.9-6.7,11s3.4,8.8,10.3,11.2c6.9,2.4,18,4.9,33.2,7.6
c20,3,37,6.7,50.9,11.2s26,12.1,36.1,22.9c10.2,10.8,15.3,25.9,15.3,45.3c0,29.9-10.9,52.4-32.8,67.6
c-21.8,15.1-50.3,22.7-85.3,22.7c-25.7,0-49.5-3.7-71.4-11c-21.8-7.3-37.4-14.7-46.7-22.2l33.7-60.6c10.2,9,23.4,15.8,39.7,20.4
c16.3,4.6,31.3,7,45.1,7c19.8,0,29.6-5.2,29.6-15.7c0-5.4-3.3-9.4-9.9-11.9c-6.6-2.5-17.2-5.2-31.9-7.9c-18.9-3.3-34.9-7.2-48-11.7
c-13.2-4.5-24.6-12.2-34.3-23.1c-9.7-10.9-14.6-26-14.6-45.1c0-27.2,9.7-48.5,29-63.7c19.3-15.3,46-22.9,80.1-22.9
c23.3,0,44.4,3.6,63.3,10.8c18.9,7.2,34,14.5,45.3,22l-32.8,58.8c-10.8-7.5-23.2-13.7-37.3-18.6
C2841.8,448.7,2828.9,446.3,2817.3,446.3z"/>
<g>
<path d="M2508,724h60.2v17.3H2508V724z"/>
<path d="M2629.2,694.4c4.9-2,10.2-3.1,16-3.1c10.9,0,19.5,3.4,25.9,10.2s9.6,16.7,9.6,29.6v57.3h-19.6v-52.6
c0-9.3-1.7-16.2-5.1-20.7c-3.4-4.5-9.1-6.7-17-6.7c-6.5,0-11.8,2.4-16.1,7.1c-4.3,4.8-6.4,11.5-6.4,20.2v52.6h-19.6v-94.6h19.6v9.5
C2620.2,699.4,2624.4,696.4,2629.2,694.4z"/>
<path d="M2790.3,833.2c-8.6,6.8-19.4,10.2-32.3,10.2c-7.9,0-15.2-1.4-21.9-4.1s-12.1-6.8-16.3-12.2s-6.6-11.9-7.1-19.6h19.6
c0.7,6.1,3.5,10.8,8.4,13.9c4.9,3.2,10.7,4.8,17.4,4.8c7,0,13.1-2,18.2-6c5.1-4,7.7-10.3,7.7-18.9v-24.7c-3.6,3.4-8,6.2-13.3,8.2
c-5.2,2.1-10.7,3.1-16.3,3.1c-8.7,0-16.6-2.1-23.7-6.4c-7.1-4.3-12.6-10-16.7-17.3c-4-7.3-6-15.5-6-24.6s2-17.3,6-24.7
s9.6-13.2,16.7-17.4c7.1-4.3,15-6.4,23.7-6.4c5.7,0,11.1,1,16.3,3.1s9.6,4.8,13.3,8.2v-8.8h19.4v107.8
C2803.2,815.9,2798.9,826.4,2790.3,833.2z M2782.2,755.7c2.6-4.7,3.8-10,3.8-15.9s-1.3-11.2-3.8-16c-2.6-4.8-6.1-8.5-10.5-11.1
c-4.5-2.7-9.5-4-15.1-4c-5.8,0-10.9,1.4-15.4,4.3c-4.5,2.8-7.9,6.6-10.3,11.4c-2.4,4.8-3.6,9.9-3.6,15.5c0,5.4,1.2,10.5,3.6,15.3
c2.4,4.8,5.8,8.6,10.3,11.5s9.6,4.3,15.4,4.3c5.6,0,10.6-1.4,15.1-4.1C2776.1,764.1,2779.6,760.4,2782.2,755.7z"/>
<path d="M2843.5,788.4h-21.6l37.9-48l-36.4-46.6h22.6l25.7,33.3l25.8-33.3h21.6l-36.2,45.9l37.9,48.6h-22.6l-27.4-35L2843.5,788.4z
"/>
</g>
<path d="M835.8,319.2c-11.5-18.9-27.4-33.7-47.6-44.7c-20.2-10.9-43-16.4-68.5-16.4h-90.6c-8.6,39.6-21.3,77.2-38,112.4
c-10,21-21.3,41-33.9,59.9v209.2H647v-135h72.7c25.4,0,48.3-5.5,68.5-16.4s36.1-25.8,47.6-44.7c11.5-18.9,17.3-39.5,17.3-61.9
C853.1,358.9,847.4,338.1,835.8,319.2z M747,416.6c-9.4,9-21.8,13.5-37,13.5l-62.8,0.4v-93.4l62.8-0.4c15.3,0,27.6,4.5,37,13.5
s14.1,20,14.1,33.2C761.1,396.6,756.4,407.7,747,416.6z"/>
<path class="st0" d="M164.7,698.7c-3.5-16.5-10.4-49.6-11.3-49.6c-147.1-88-129.7-240.3-81-327.4C82.8,431.4,277,507.1,163.8,641.2
c-0.9,1.7,5.2,22.6,10.4,41.8c22.6-38.3,56.6-84.4,54.8-88.8C89.7,254.7,525,228.6,615.5,17.9c40.9,203.7-20.9,518.9-370.8,599
c-1.7,0.9-63.5,109.7-66.2,110.6c0-1.7-26.1-0.9-22.6-9.6C157.8,712.6,161.2,705.7,164.7,698.7L164.7,698.7z M160.4,616.9
c44.4-51.4-7.8-139.3-39.2-168C174.3,540.2,170.8,593.3,160.4,616.9L160.4,616.9z"/>
</svg>
Image removed (6.3 KiB)
@@ -1,69 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.0.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 2962.2 860.2" style="enable-background:new 0 0 2962.2 860.2;" xml:space="preserve">
<style type="text/css">
.st0{fill:#FFFFFF;stroke:#000000;stroke-miterlimit:10;}
.st1{fill:#17541F;stroke:#000000;stroke-miterlimit:10;}
</style>
<path class="st0" d="M1055.6,639.7v-20.6c-18,20-43.1,30.1-75.4,30.1c-22.4,0-42.8-5.8-61-17.5c-18.3-11.7-32.5-27.8-42.9-48.3
c-10.3-20.5-15.5-43.3-15.5-68.4c0-25.1,5.2-48,15.5-68.5s24.6-36.6,42.9-48.3s38.6-17.5,61-17.5c32.3,0,57.5,10,75.4,30.1v-20.6
h85.3v249.6L1055.6,639.7L1055.6,639.7z M1059.1,514.9c0-17.4-5.2-31.9-15.5-43.8c-10.3-11.8-23.9-17.7-40.6-17.7
c-16.8,0-30.2,5.9-40.4,17.7c-10.2,11.8-15.3,26.4-15.3,43.8c0,17.4,5.1,31.9,15.3,43.8c10.2,11.8,23.6,17.7,40.4,17.7
c16.8,0,30.3-5.9,40.6-17.7C1054,546.9,1059.1,532.3,1059.1,514.9z"/>
<path class="st0" d="M1417.8,398.2c18.3,11.7,32.5,27.8,42.9,48.3c10.3,20.5,15.5,43.3,15.5,68.5c0,25.1-5.2,48-15.5,68.4
c-10.3,20.5-24.6,36.6-42.9,48.3s-38.6,17.5-61,17.5c-32.3,0-57.5-10-75.4-30.1v165.6h-85.3V390.2h85.3v20.6
c18-20,43.1-30.1,75.4-30.1C1379.2,380.7,1399.5,386.6,1417.8,398.2z M1389.5,514.9c0-17.4-5.1-31.9-15.3-43.8
c-10.2-11.8-23.6-17.7-40.4-17.7s-30.2,5.9-40.4,17.7c-10.2,11.8-15.3,26.4-15.3,43.8c0,17.4,5.1,31.9,15.3,43.8
c10.2,11.8,23.6,17.7,40.4,17.7s30.2-5.9,40.4-17.7S1389.5,532.3,1389.5,514.9z"/>
<path class="st0" d="M1713.6,555.3l53,49.4c-28.1,29.6-66.7,44.4-115.8,44.4c-28.1,0-53-5.8-74.5-17.5s-38.2-27.7-49.8-48
c-11.7-20.3-17.7-43.2-18-68.7c0-24.8,5.9-47.5,17.7-68c11.8-20.5,28.1-36.7,48.7-48.5s43.5-17.7,68.7-17.7
c24.8,0,47.6,6.1,68.2,18.2s37,29.5,49.1,52.3c12.1,22.7,18.2,49.1,18.2,79l-0.4,11.7h-181.8c3.6,11.4,10.5,20.7,20.9,28.1
c10.3,7.3,21.3,11,33,11c14.4,0,26.3-2.2,35.7-6.5C1695.8,570.1,1704.9,563.7,1713.6,555.3z M1596.9,486.2h92.9
c-2.1-12.3-7.5-22.1-16.2-29.4s-18.7-11-30.1-11s-21.5,3.7-30.3,11S1599,473.9,1596.9,486.2z"/>
<path class="st0" d="M1908.8,418.4c7.8-10.8,17.2-19,28.3-24.7s22-8.5,32.8-8.5c11.4,0,20,1.6,26,4.9l-10.8,72.7
c-8.4-2.1-15.7-3.1-22-3.1c-17.1,0-30.4,4.3-39.9,12.8c-9.6,8.5-14.4,24.2-14.4,46.9v120.3h-85.3V390.2h85.3V418.4L1908.8,418.4z"/>
<path class="st0" d="M2113,258.2v381.5h-85.3V258.2H2113z"/>
<path class="st0" d="M2360.8,555.3l53,49.4c-28.1,29.6-66.7,44.4-115.8,44.4c-28.1,0-53-5.8-74.5-17.5s-38.2-27.7-49.8-48
c-11.7-20.3-17.7-43.2-18-68.7c0-24.8,5.9-47.5,17.7-68s28.1-36.7,48.7-48.5c20.6-11.8,43.5-17.7,68.7-17.7
c24.8,0,47.6,6.1,68.2,18.2c20.6,12.1,37,29.5,49.1,52.3c12.1,22.7,18.2,49.1,18.2,79l-0.4,11.7h-181.8
c3.6,11.4,10.5,20.7,20.9,28.1c10.3,7.3,21.3,11,33,11c14.4,0,26.3-2.2,35.7-6.5C2343.1,570.1,2352.1,563.7,2360.8,555.3z
M2244.1,486.2h92.9c-2.1-12.3-7.5-22.1-16.2-29.4s-18.7-11-30.1-11s-21.5,3.7-30.3,11C2251.7,464.1,2246.2,473.9,2244.1,486.2z"/>
<path class="st0" d="M2565.9,446.3c-9.9,0-17.1,1.1-21.5,3.4c-4.5,2.2-6.7,5.9-6.7,11s3.4,8.8,10.3,11.2c6.9,2.4,18,4.9,33.2,7.6
c20,3,37,6.7,50.9,11.2s26,12.1,36.1,22.9c10.2,10.8,15.3,25.9,15.3,45.3c0,29.9-10.9,52.4-32.8,67.6
c-21.8,15.1-50.3,22.7-85.3,22.7c-25.7,0-49.5-3.7-71.4-11c-21.8-7.3-37.4-14.7-46.7-22.2l33.7-60.6c10.2,9,23.4,15.8,39.7,20.4
c16.3,4.6,31.3,7,45.1,7c19.7,0,29.6-5.2,29.6-15.7c0-5.4-3.3-9.4-9.9-11.9c-6.6-2.5-17.2-5.2-31.9-7.9c-18.9-3.3-34.9-7.2-48-11.7
c-13.2-4.5-24.6-12.2-34.3-23.1c-9.7-10.9-14.6-26-14.6-45.1c0-27.2,9.7-48.5,29-63.7c19.3-15.3,46-22.9,80.1-22.9
c23.3,0,44.4,3.6,63.3,10.8c18.9,7.2,34,14.5,45.3,22l-32.8,58.8c-10.8-7.5-23.2-13.7-37.3-18.6
C2590.5,448.7,2577.6,446.3,2565.9,446.3z"/>
<path class="st0" d="M2817.3,446.3c-9.9,0-17.1,1.1-21.5,3.4c-4.5,2.2-6.7,5.9-6.7,11s3.4,8.8,10.3,11.2c6.9,2.4,18,4.9,33.2,7.6
c20,3,37,6.7,50.9,11.2s26,12.1,36.1,22.9c10.2,10.8,15.3,25.9,15.3,45.3c0,29.9-10.9,52.4-32.8,67.6
c-21.8,15.1-50.3,22.7-85.3,22.7c-25.7,0-49.5-3.7-71.4-11c-21.8-7.3-37.4-14.7-46.7-22.2l33.7-60.6c10.2,9,23.4,15.8,39.7,20.4
c16.3,4.6,31.3,7,45.1,7c19.8,0,29.6-5.2,29.6-15.7c0-5.4-3.3-9.4-9.9-11.9c-6.6-2.5-17.2-5.2-31.9-7.9c-18.9-3.3-34.9-7.2-48-11.7
c-13.2-4.5-24.6-12.2-34.3-23.1c-9.7-10.9-14.6-26-14.6-45.1c0-27.2,9.7-48.5,29-63.7c19.3-15.3,46-22.9,80.1-22.9
c23.3,0,44.4,3.6,63.3,10.8c18.9,7.2,34,14.5,45.3,22l-32.8,58.8c-10.8-7.5-23.2-13.7-37.3-18.6
C2841.8,448.7,2828.9,446.3,2817.3,446.3z"/>
<g>
<path class="st0" d="M2508,724h60.2v17.3H2508V724z"/>
<path class="st0" d="M2629.2,694.4c4.9-2,10.2-3.1,16-3.1c10.9,0,19.5,3.4,25.9,10.2s9.6,16.7,9.6,29.6v57.3h-19.6v-52.6
c0-9.3-1.7-16.2-5.1-20.7c-3.4-4.5-9.1-6.7-17-6.7c-6.5,0-11.8,2.4-16.1,7.1c-4.3,4.8-6.4,11.5-6.4,20.2v52.6h-19.6v-94.6h19.6v9.5
C2620.2,699.4,2624.4,696.4,2629.2,694.4z"/>
<path class="st0" d="M2790.3,833.2c-8.6,6.8-19.4,10.2-32.3,10.2c-7.9,0-15.2-1.4-21.9-4.1s-12.1-6.8-16.3-12.2s-6.6-11.9-7.1-19.6
h19.6c0.7,6.1,3.5,10.8,8.4,13.9c4.9,3.2,10.7,4.8,17.4,4.8c7,0,13.1-2,18.2-6c5.1-4,7.7-10.3,7.7-18.9v-24.7
c-3.6,3.4-8,6.2-13.3,8.2c-5.2,2.1-10.7,3.1-16.3,3.1c-8.7,0-16.6-2.1-23.7-6.4c-7.1-4.3-12.6-10-16.7-17.3c-4-7.3-6-15.5-6-24.6
s2-17.3,6-24.7s9.6-13.2,16.7-17.4c7.1-4.3,15-6.4,23.7-6.4c5.7,0,11.1,1,16.3,3.1s9.6,4.8,13.3,8.2v-8.8h19.4v107.8
C2803.2,815.9,2798.9,826.4,2790.3,833.2z M2782.2,755.7c2.6-4.7,3.8-10,3.8-15.9s-1.3-11.2-3.8-16c-2.6-4.8-6.1-8.5-10.5-11.1
c-4.5-2.7-9.5-4-15.1-4c-5.8,0-10.9,1.4-15.4,4.3c-4.5,2.8-7.9,6.6-10.3,11.4c-2.4,4.8-3.6,9.9-3.6,15.5c0,5.4,1.2,10.5,3.6,15.3
c2.4,4.8,5.8,8.6,10.3,11.5s9.6,4.3,15.4,4.3c5.6,0,10.6-1.4,15.1-4.1C2776.1,764.1,2779.6,760.4,2782.2,755.7z"/>
<path class="st0" d="M2843.5,788.4h-21.6l37.9-48l-36.4-46.6h22.6l25.7,33.3l25.8-33.3h21.6l-36.2,45.9l37.9,48.6h-22.6l-27.4-35
L2843.5,788.4z"/>
</g>
<path class="st0" d="M835.8,319.2c-11.5-18.9-27.4-33.7-47.6-44.7c-20.2-10.9-43-16.4-68.5-16.4h-90.6c-8.6,39.6-21.3,77.2-38,112.4
c-10,21-21.3,41-33.9,59.9v209.2H647v-135h72.7c25.4,0,48.3-5.5,68.5-16.4s36.1-25.8,47.6-44.7c11.5-18.9,17.3-39.5,17.3-61.9
C853.1,358.9,847.4,338.1,835.8,319.2z M747,416.6c-9.4,9-21.8,13.5-37,13.5l-62.8,0.4v-93.4l62.8-0.4c15.3,0,27.6,4.5,37,13.5
s14.1,20,14.1,33.2C761.1,396.6,756.4,407.7,747,416.6z"/>
<path class="st1" d="M164.7,698.7c-3.5-16.5-10.4-49.6-11.3-49.6c-147.1-88-129.7-240.3-81-327.4C82.8,431.4,277,507.1,163.8,641.2
c-0.9,1.7,5.2,22.6,10.4,41.8c22.6-38.3,56.6-84.4,54.8-88.8C89.7,254.7,525,228.6,615.5,17.9c40.9,203.7-20.9,518.9-370.8,599
c-1.7,0.9-63.5,109.7-66.2,110.6c0-1.7-26.1-0.9-22.6-9.6C157.8,712.6,161.2,705.7,164.7,698.7L164.7,698.7z M160.4,616.9
c44.4-51.4-7.8-139.3-39.2-168C174.3,540.2,170.8,593.3,160.4,616.9L160.4,616.9z"/>
</svg>
Image removed (6.5 KiB)
Image removed (1.8 MiB)
Image removed (501 KiB)
Image removed (21 KiB)
Image removed (2.2 MiB)
Image removed (644 KiB)
Image removed (667 KiB)
Image removed (1003 KiB)