Compare commits
2 Commits
v1.9.0-bet
...
beta-1.6.1
Author | SHA1 | Date | |
---|---|---|---|
![]() |
7d8721c53c | ||
![]() |
aa67c851e3 |
@@ -1,9 +0,0 @@
|
||||
{
|
||||
"qpdf": {
|
||||
"version": "10.6.3"
|
||||
},
|
||||
"jbig2enc": {
|
||||
"version": "0.29",
|
||||
"git_tag": "0.29"
|
||||
}
|
||||
}
|
@@ -27,14 +27,11 @@ indent_style = space
|
||||
[*.md]
|
||||
indent_style = space
|
||||
|
||||
[Pipfile.lock]
|
||||
indent_style = space
|
||||
|
||||
# Tests don't get a line width restriction. It's still a good idea to follow
|
||||
# the 79 character rule, but in the interests of clarity, tests often need to
|
||||
# violate it.
|
||||
[**/test_*.py]
|
||||
max_line_length = off
|
||||
|
||||
[Dockerfile*]
|
||||
[Dockerfile]
|
||||
indent_style = space
|
||||
|
88
.github/ISSUE_TEMPLATE/bug-report.yml
vendored
@@ -1,88 +0,0 @@
|
||||
name: Bug report
|
||||
description: Something is not working
|
||||
title: "[BUG] Concise description of the issue"
|
||||
labels: ["bug", "unconfirmed"]
|
||||
body:
|
||||
- type: markdown
|
||||
attributes:
|
||||
value: |
|
||||
Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperless:adnidor.de).
|
||||
|
||||
Before opening an issue, please double check:
|
||||
|
||||
- [The troubleshooting documentation](https://paperless-ngx.readthedocs.io/en/latest/troubleshooting.html).
|
||||
- [The installation instructions](https://paperless-ngx.readthedocs.io/en/latest/setup.html#installation).
|
||||
- [Existing issues and discussions](https://github.com/paperless-ngx/paperless-ngx/search?q=&type=issues).
|
||||
|
||||
If you encounter issues while installing or configuring Paperless-ngx, please post in the ["Support" section of the discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=support).
|
||||
- type: textarea
|
||||
id: description
|
||||
attributes:
|
||||
label: Description
|
||||
description: A clear and concise description of what the bug is. If applicable, add screenshots to help explain your problem.
|
||||
placeholder: |
|
||||
Currently Paperless does not work when...
|
||||
|
||||
[Screenshot if applicable]
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: reproduction
|
||||
attributes:
|
||||
label: Steps to reproduce
|
||||
description: Steps to reproduce the behavior.
|
||||
placeholder: |
|
||||
1. Go to '...'
|
||||
2. Click on '....'
|
||||
3. See error
|
||||
validations:
|
||||
required: true
|
||||
- type: textarea
|
||||
id: logs
|
||||
attributes:
|
||||
label: Webserver logs
|
||||
description: If available, post any logs from the web server related to your issue.
|
||||
render: bash
|
||||
- type: input
|
||||
id: version
|
||||
attributes:
|
||||
label: Paperless-ngx version
|
||||
placeholder: e.g. 1.6.0
|
||||
validations:
|
||||
required: true
|
||||
- type: input
|
||||
id: host-os
|
||||
attributes:
|
||||
label: Host OS
|
||||
description: Host OS of the machine running paperless-ngx. Please add the architecture (uname -m) if applicable.
|
||||
placeholder: e.g. Archlinux / Ubuntu 20.04 / Raspberry Pi `arm64`
|
||||
validations:
|
||||
required: true
|
||||
- type: dropdown
|
||||
id: install-method
|
||||
attributes:
|
||||
label: Installation method
|
||||
options:
|
||||
- Docker - official image
|
||||
- Docker - linuxserver.io image
|
||||
- Bare metal
|
||||
- Other (please describe above)
|
||||
description: Note there are significant differences from the official image and linuxserver.io, please check if your issue is specific to the third-party image.
|
||||
validations:
|
||||
required: true
|
||||
- type: input
|
||||
id: browser
|
||||
attributes:
|
||||
label: Browser
|
||||
description: Which browser you are using, if relevant.
|
||||
placeholder: e.g. Chrome, Safari
|
||||
- type: input
|
||||
id: config-changes
|
||||
attributes:
|
||||
label: Configuration changes
|
||||
description: Any configuration changes you made in `docker-compose.yml`, `docker-compose.env` or `paperless.conf`.
|
||||
- type: input
|
||||
id: other
|
||||
attributes:
|
||||
label: Other
|
||||
description: Any other relevant details.
|
50
.github/ISSUE_TEMPLATE/bug_report.md
vendored
Normal file
@@ -0,0 +1,50 @@
|
||||
---
|
||||
name: Bug report
|
||||
about: Something is not working
|
||||
title: '[BUG] Concise description of the issue'
|
||||
labels: ''
|
||||
assignees: ''
|
||||
---
|
||||
|
||||
<!---
|
||||
=> Before opening an issue, please check the documentation and see if it helps you resolve your issue: https://paperless-ngx.readthedocs.io/en/latest/troubleshooting.html
|
||||
=> Please also make sure that you followed the installation instructions.
|
||||
=> Please search the issues and look for similar issues before opening a bug report.
|
||||
|
||||
=> If you would like to submit a feature request please submit one under https://github.com/paperless-ngx/paperless-ngx/discussions/categories/feature-requests
|
||||
|
||||
=> If you encounter issues while installing of configuring Paperless-ngx, please post that in the "Support" section of the discussions. Remember that Paperless successfully runs on a variety of different systems. If paperless does not start, it's probably an issue with your system, and not an issue of paperless.
|
||||
|
||||
=> Don't remove the [BUG] prefix from the title.
|
||||
-->
|
||||
|
||||
**Describe the bug**
|
||||
A clear and concise description of what the bug is.
|
||||
|
||||
**To Reproduce**
|
||||
Steps to reproduce the behavior:
|
||||
|
||||
1. Go to '...'
|
||||
2. Click on '....'
|
||||
3. Scroll down to '....'
|
||||
4. See error
|
||||
|
||||
**Expected behavior**
|
||||
A clear and concise description of what you expected to happen.
|
||||
|
||||
**Screenshots**
|
||||
If applicable, add screenshots to help explain your problem.
|
||||
|
||||
**Webserver logs**
|
||||
|
||||
```
|
||||
If available, post any logs from the web server related to your issue.
|
||||
```
|
||||
|
||||
**Relevant information**
|
||||
|
||||
- Host OS of the machine running paperless: [e.g. Archlinux / Ubuntu 20.04]
|
||||
- Browser [e.g. chrome, safari]
|
||||
- Version [e.g. 1.0.0]
|
||||
- Installation method: [docker / bare metal]
|
||||
- Any configuration changes you made in `docker-compose.yml`, `docker-compose.env` or `paperless.conf`.
|
11
.github/ISSUE_TEMPLATE/config.yml
vendored
@@ -1,11 +0,0 @@
|
||||
blank_issues_enabled: false
|
||||
contact_links:
|
||||
- name: 🤔 Questions and Help
|
||||
url: https://github.com/paperless-ngx/paperless-ngx/discussions
|
||||
about: This issue tracker is not for support questions. Please refer to our Discussions.
|
||||
- name: 💬 Chat
|
||||
url: https://matrix.to/#/#paperless:adnidor.de
|
||||
about: Want to discuss Paperless-ngx with others? Check out our chat.
|
||||
- name: 🚀 Feature Request
|
||||
url: https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=feature-requests
|
||||
about: Remember to search for existing feature requests and "up-vote" any you like
|
19
.github/ISSUE_TEMPLATE/other.md
vendored
Normal file
@@ -0,0 +1,19 @@
|
||||
---
|
||||
name: Other
|
||||
about: Anything that is not a feature request or bug.
|
||||
title: '[Other] Title of your issue'
|
||||
labels: ''
|
||||
assignees: ''
|
||||
---
|
||||
|
||||
<!--
|
||||
|
||||
=> Discussions, Feedback and other suggestions belong in the "Discussion" section and not on the issue tracker.
|
||||
|
||||
=> If you would like to submit a feature request please submit one under https://github.com/paperless-ngx/paperless-ngx/discussions/categories/feature-requests
|
||||
|
||||
=> If you encounter issues while installing of configuring Paperless-ngx, please post that in the "Support" section of the discussions. Remember that Paperless successfully runs on a variety of different systems. If paperless does not start, it's probably is an issue with your system, and not an issue of paperless.
|
||||
|
||||
=> Don't remove the [Other] prefix from the title.
|
||||
|
||||
-->
|
12
.github/PULL_REQUEST_TEMPLATE.md
vendored
@@ -1,15 +1,23 @@
|
||||
<!--
|
||||
<!--
|
||||
Note: All PRs with code changes should be targeted to the `dev` branch, pure documentation changes can target `main`
|
||||
-->
|
||||
|
||||
## Proposed change
|
||||
|
||||
<!--
|
||||
Please include a summary of the change and which issue is fixed (if any) and any relevant motivation / context. List any dependencies that are required for this change. If appropriate, please include an explanation of how your proposed change can be tested. Screenshots and / or videos can also be helpful if appropriate.
|
||||
Please include a summary of the change and which issue is fixed (if any) and any relevant motivation / context. List any dependencies that are required for this change. If appropriate, please include an explanation of how your poposed change can be tested. Screenshots and / or videos can also be helpful if appropriate.
|
||||
-->
|
||||
|
||||
Fixes # (issue)
|
||||
|
||||
<!--
|
||||
Please also tag the relevant team to help with review. You can tag any of the following:
|
||||
@paperless-ngx/backend (Python / django, database, etc.)
|
||||
@paperless-ngx/frontend (JavaScript/Typescript, HTML, CSS, etc.)
|
||||
@paperless-ngx/ci-cd (GitHub Actions, deployment)
|
||||
@paperless-ngx/test (General testing for larger PRs)
|
||||
-->
|
||||
|
||||
## Type of change
|
||||
|
||||
<!--
|
||||
|
11
.github/dependabot.yml
vendored
@@ -6,14 +6,11 @@ updates:
|
||||
# Enable version updates for npm
|
||||
- package-ecosystem: "npm"
|
||||
target-branch: "dev"
|
||||
# Look for `package.json` and `lock` files in the `/src-ui` directory
|
||||
# Look for `package.json` and `lock` files in the `root` directory
|
||||
directory: "/src-ui"
|
||||
# Check the npm registry for updates every month
|
||||
schedule:
|
||||
interval: "monthly"
|
||||
labels:
|
||||
- "frontend"
|
||||
- "dependencies"
|
||||
# Add reviewers
|
||||
reviewers:
|
||||
- "paperless-ngx/frontend"
|
||||
@@ -29,13 +26,9 @@ updates:
|
||||
labels:
|
||||
- "backend"
|
||||
- "dependencies"
|
||||
# Add reviewers
|
||||
reviewers:
|
||||
- "paperless-ngx/backend"
|
||||
|
||||
# Enable updates for Github Actions
|
||||
- package-ecosystem: "github-actions"
|
||||
target-branch: "dev"
|
||||
directory: "/"
|
||||
schedule:
|
||||
# Check for updates to GitHub Actions every month
|
||||
@@ -45,4 +38,4 @@ updates:
|
||||
- "dependencies"
|
||||
# Add reviewers
|
||||
reviewers:
|
||||
- "paperless-ngx/ci-cd"
|
||||
- "paperless-ngx/backend"
|
||||
|
55
.github/release-drafter.yml
vendored
@@ -1,55 +0,0 @@
|
||||
autolabeler:
|
||||
- label: "bug"
|
||||
branch:
|
||||
- '/^fix/'
|
||||
title:
|
||||
- "/^fix/i"
|
||||
- label: "enhancement"
|
||||
branch:
|
||||
- '/^feature/'
|
||||
title:
|
||||
- "/^feature/i"
|
||||
categories:
|
||||
- title: 'Breaking Changes'
|
||||
labels:
|
||||
- 'breaking-change'
|
||||
- title: 'Features'
|
||||
labels:
|
||||
- 'enhancement'
|
||||
- title: 'Bug Fixes'
|
||||
labels:
|
||||
- 'bug'
|
||||
- title: 'Documentation'
|
||||
label: 'documentation'
|
||||
- title: 'Maintenance'
|
||||
labels:
|
||||
- 'chore'
|
||||
- 'deployment'
|
||||
- 'translation'
|
||||
- 'ci-cd'
|
||||
- title: 'Dependencies'
|
||||
collapse-after: 3
|
||||
label: 'dependencies'
|
||||
- title: 'All App Changes'
|
||||
labels:
|
||||
- 'frontend'
|
||||
- 'backend'
|
||||
collapse-after: 0
|
||||
include-labels:
|
||||
- 'enhancement'
|
||||
- 'bug'
|
||||
- 'chore'
|
||||
- 'deployment'
|
||||
- 'translation'
|
||||
- 'dependencies'
|
||||
- 'documentation'
|
||||
- 'frontend'
|
||||
- 'backend'
|
||||
- 'ci-cd'
|
||||
category-template: '### $TITLE'
|
||||
change-template: '- $TITLE @$AUTHOR ([#$NUMBER]($URL))'
|
||||
change-title-escapes: '\<*_&#@'
|
||||
template: |
|
||||
## paperless-ngx $RESOLVED_VERSION
|
||||
|
||||
$CHANGES
|
290
.github/scripts/cleanup-tags.py
vendored
@@ -1,290 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
import json
|
||||
import logging
|
||||
import os
|
||||
import shutil
|
||||
import subprocess
|
||||
from argparse import ArgumentParser
|
||||
from typing import Dict
|
||||
from typing import Final
|
||||
from typing import List
|
||||
|
||||
from common import get_log_level
|
||||
from github import ContainerPackage
|
||||
from github import GithubBranchApi
|
||||
from github import GithubContainerRegistryApi
|
||||
|
||||
logger = logging.getLogger("cleanup-tags")
|
||||
|
||||
|
||||
class DockerManifest2:
|
||||
"""
|
||||
Data class wrapping the Docker Image Manifest Version 2.
|
||||
|
||||
See https://docs.docker.com/registry/spec/manifest-v2-2/
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
self._data = data
|
||||
# This is the sha256: digest string. Corresponds to Github API name
|
||||
# if the package is an untagged package
|
||||
self.digest = self._data["digest"]
|
||||
platform_data_os = self._data["platform"]["os"]
|
||||
platform_arch = self._data["platform"]["architecture"]
|
||||
platform_variant = self._data["platform"].get(
|
||||
"variant",
|
||||
"",
|
||||
)
|
||||
self.platform = f"{platform_data_os}/{platform_arch}{platform_variant}"
|
||||
|
||||
|
||||
def _main():
|
||||
parser = ArgumentParser(
|
||||
description="Using the GitHub API locate and optionally delete container"
|
||||
" tags which no longer have an associated feature branch",
|
||||
)
|
||||
|
||||
# Requires an affirmative command to actually do a delete
|
||||
parser.add_argument(
|
||||
"--delete",
|
||||
action="store_true",
|
||||
default=False,
|
||||
help="If provided, actually delete the container tags",
|
||||
)
|
||||
|
||||
# When a tagged image is updated, the previous version remains, but it no longer tagged
|
||||
# Add this option to remove them as well
|
||||
parser.add_argument(
|
||||
"--untagged",
|
||||
action="store_true",
|
||||
default=False,
|
||||
help="If provided, delete untagged containers as well",
|
||||
)
|
||||
|
||||
# If given, the package is assumed to be a multi-arch manifest. Cache packages are
|
||||
# not multi-arch, all other types are
|
||||
parser.add_argument(
|
||||
"--is-manifest",
|
||||
action="store_true",
|
||||
default=False,
|
||||
help="If provided, the package is assumed to be a multi-arch manifest following schema v2",
|
||||
)
|
||||
|
||||
# Allows configuration of log level for debugging
|
||||
parser.add_argument(
|
||||
"--loglevel",
|
||||
default="info",
|
||||
help="Configures the logging level",
|
||||
)
|
||||
|
||||
# Get the name of the package being processed this round
|
||||
parser.add_argument(
|
||||
"package",
|
||||
help="The package to process",
|
||||
)
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
logging.basicConfig(
|
||||
level=get_log_level(args),
|
||||
datefmt="%Y-%m-%d %H:%M:%S",
|
||||
format="%(asctime)s %(levelname)-8s %(message)s",
|
||||
)
|
||||
|
||||
# Must be provided in the environment
|
||||
repo_owner: Final[str] = os.environ["GITHUB_REPOSITORY_OWNER"]
|
||||
repo: Final[str] = os.environ["GITHUB_REPOSITORY"]
|
||||
gh_token: Final[str] = os.environ["TOKEN"]
|
||||
|
||||
# Find all branches named feature-*
|
||||
# Note: Only relevant to the main application, but simpler to
|
||||
# leave in for all packages
|
||||
with GithubBranchApi(gh_token) as branch_api:
|
||||
feature_branches = {}
|
||||
for branch in branch_api.get_branches(
|
||||
repo=repo,
|
||||
):
|
||||
if branch.name.startswith("feature-"):
|
||||
logger.debug(f"Found feature branch {branch.name}")
|
||||
feature_branches[branch.name] = branch
|
||||
|
||||
logger.info(f"Located {len(feature_branches)} feature branches")
|
||||
|
||||
with GithubContainerRegistryApi(gh_token, repo_owner) as container_api:
|
||||
# Get the information about all versions of the given package
|
||||
all_package_versions: List[
|
||||
ContainerPackage
|
||||
] = container_api.get_package_versions(args.package)
|
||||
|
||||
all_pkgs_tags_to_version: Dict[str, ContainerPackage] = {}
|
||||
for pkg in all_package_versions:
|
||||
for tag in pkg.tags:
|
||||
all_pkgs_tags_to_version[tag] = pkg
|
||||
logger.info(
|
||||
f"Located {len(all_package_versions)} versions of package {args.package}",
|
||||
)
|
||||
|
||||
# Filter to packages which are tagged with feature-*
|
||||
packages_tagged_feature: List[ContainerPackage] = []
|
||||
for package in all_package_versions:
|
||||
if package.tag_matches("feature-"):
|
||||
packages_tagged_feature.append(package)
|
||||
|
||||
feature_pkgs_tags_to_versions: Dict[str, ContainerPackage] = {}
|
||||
for pkg in packages_tagged_feature:
|
||||
for tag in pkg.tags:
|
||||
feature_pkgs_tags_to_versions[tag] = pkg
|
||||
|
||||
logger.info(
|
||||
f'Located {len(feature_pkgs_tags_to_versions)} versions of package {args.package} tagged "feature-"',
|
||||
)
|
||||
|
||||
# All the feature tags minus all the feature branches leaves us feature tags
|
||||
# with no corresponding branch
|
||||
tags_to_delete = list(
|
||||
set(feature_pkgs_tags_to_versions.keys()) - set(feature_branches.keys()),
|
||||
)
|
||||
|
||||
# All the tags minus the set of going to be deleted tags leaves us the
|
||||
# tags which will be kept around
|
||||
tags_to_keep = list(
|
||||
set(all_pkgs_tags_to_version.keys()) - set(tags_to_delete),
|
||||
)
|
||||
logger.info(
|
||||
f"Located {len(tags_to_delete)} versions of package {args.package} to delete",
|
||||
)
|
||||
|
||||
# Delete certain package versions for which no branch existed
|
||||
for tag_to_delete in tags_to_delete:
|
||||
package_version_info = feature_pkgs_tags_to_versions[tag_to_delete]
|
||||
|
||||
if args.delete:
|
||||
logger.info(
|
||||
f"Deleting {tag_to_delete} (id {package_version_info.id})",
|
||||
)
|
||||
container_api.delete_package_version(
|
||||
package_version_info,
|
||||
)
|
||||
|
||||
else:
|
||||
logger.info(
|
||||
f"Would delete {tag_to_delete} (id {package_version_info.id})",
|
||||
)
|
||||
|
||||
# Deal with untagged package versions
|
||||
if args.untagged:
|
||||
|
||||
logger.info("Handling untagged image packages")
|
||||
|
||||
if not args.is_manifest:
|
||||
# If the package is not a multi-arch manifest, images without tags are safe to delete.
|
||||
# They are not referred to by anything. This will leave all with at least 1 tag
|
||||
|
||||
for package in all_package_versions:
|
||||
if package.untagged:
|
||||
if args.delete:
|
||||
logger.info(
|
||||
f"Deleting id {package.id} named {package.name}",
|
||||
)
|
||||
container_api.delete_package_version(
|
||||
package,
|
||||
)
|
||||
else:
|
||||
logger.info(
|
||||
f"Would delete {package.name} (id {package.id})",
|
||||
)
|
||||
else:
|
||||
logger.info(
|
||||
f"Not deleting tag {package.tags[0]} of package {args.package}",
|
||||
)
|
||||
else:
|
||||
|
||||
"""
|
||||
Ok, bear with me, these are annoying.
|
||||
|
||||
Our images are multi-arch, so the manifest is more like a pointer to a sha256 digest.
|
||||
These images are untagged, but pointed to, and so should not be removed (or every pull fails).
|
||||
|
||||
So for each image getting kept, parse the manifest to find the digest(s) it points to. Then
|
||||
remove those from the list of untagged images. The final result is the untagged, not pointed to
|
||||
version which should be safe to remove.
|
||||
|
||||
Example:
|
||||
Tag: ghcr.io/paperless-ngx/paperless-ngx:1.7.1 refers to
|
||||
amd64: sha256:b9ed4f8753bbf5146547671052d7e91f68cdfc9ef049d06690b2bc866fec2690
|
||||
armv7: sha256:81605222df4ba4605a2ba4893276e5d08c511231ead1d5da061410e1bbec05c3
|
||||
arm64: sha256:374cd68db40734b844705bfc38faae84cc4182371de4bebd533a9a365d5e8f3b
|
||||
each of which appears as untagged image, but isn't really.
|
||||
|
||||
So from the list of untagged packages, remove those digests. Once all tags which
|
||||
are being kept are checked, the remaining untagged packages are actually untagged
|
||||
with no referrals in a manifest to them.
|
||||
|
||||
"""
|
||||
|
||||
# Simplify the untagged data, mapping name (which is a digest) to the version
|
||||
untagged_versions = {}
|
||||
for x in all_package_versions:
|
||||
if x.untagged:
|
||||
untagged_versions[x.name] = x
|
||||
|
||||
skips = 0
|
||||
# Extra security to not delete on an unexpected error
|
||||
actually_delete = True
|
||||
|
||||
# Parse manifests to locate digests pointed to
|
||||
for tag in sorted(tags_to_keep):
|
||||
full_name = f"ghcr.io/{repo_owner}/{args.package}:{tag}"
|
||||
logger.info(f"Checking manifest for {full_name}")
|
||||
try:
|
||||
proc = subprocess.run(
|
||||
[
|
||||
shutil.which("docker"),
|
||||
"manifest",
|
||||
"inspect",
|
||||
full_name,
|
||||
],
|
||||
capture_output=True,
|
||||
)
|
||||
|
||||
manifest_list = json.loads(proc.stdout)
|
||||
for manifest_data in manifest_list["manifests"]:
|
||||
manifest = DockerManifest2(manifest_data)
|
||||
|
||||
if manifest.digest in untagged_versions:
|
||||
logger.debug(
|
||||
f"Skipping deletion of {manifest.digest}, referred to by {full_name} for {manifest.platform}",
|
||||
)
|
||||
del untagged_versions[manifest.digest]
|
||||
skips += 1
|
||||
|
||||
except Exception as err:
|
||||
actually_delete = False
|
||||
logger.exception(err)
|
||||
|
||||
logger.info(
|
||||
f"Skipping deletion of {skips} packages referred to by a manifest",
|
||||
)
|
||||
|
||||
# Step 3.3 - Delete the untagged and not pointed at packages
|
||||
logger.info(f"Deleting untagged packages of {args.package}")
|
||||
for to_delete_name in untagged_versions:
|
||||
to_delete_version = untagged_versions[to_delete_name]
|
||||
|
||||
if args.delete and actually_delete:
|
||||
logger.info(
|
||||
f"Deleting id {to_delete_version.id} named {to_delete_version.name}",
|
||||
)
|
||||
container_api.delete_package_version(
|
||||
to_delete_version,
|
||||
)
|
||||
else:
|
||||
logger.info(
|
||||
f"Would delete {to_delete_name} (id {to_delete_version.id})",
|
||||
)
|
||||
else:
|
||||
logger.info("Leaving untagged images untouched")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
_main()
|
43
.github/scripts/common.py
vendored
@@ -1,43 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
import logging
|
||||
|
||||
|
||||
def get_image_tag(
|
||||
repo_name: str,
|
||||
pkg_name: str,
|
||||
pkg_version: str,
|
||||
) -> str:
|
||||
"""
|
||||
Returns a string representing the normal image for a given package
|
||||
"""
|
||||
return f"ghcr.io/{repo_name.lower()}/builder/{pkg_name}:{pkg_version}"
|
||||
|
||||
|
||||
def get_cache_image_tag(
|
||||
repo_name: str,
|
||||
pkg_name: str,
|
||||
pkg_version: str,
|
||||
branch_name: str,
|
||||
) -> str:
|
||||
"""
|
||||
Returns a string representing the expected image cache tag for a given package
|
||||
|
||||
Registry type caching is utilized for the builder images, to allow fast
|
||||
rebuilds, generally almost instant for the same version
|
||||
"""
|
||||
return f"ghcr.io/{repo_name.lower()}/builder/cache/{pkg_name}:{pkg_version}"
|
||||
|
||||
|
||||
def get_log_level(args) -> int:
|
||||
levels = {
|
||||
"critical": logging.CRITICAL,
|
||||
"error": logging.ERROR,
|
||||
"warn": logging.WARNING,
|
||||
"warning": logging.WARNING,
|
||||
"info": logging.INFO,
|
||||
"debug": logging.DEBUG,
|
||||
}
|
||||
level = levels.get(args.loglevel.lower())
|
||||
if level is None:
|
||||
level = logging.INFO
|
||||
return level
|
92
.github/scripts/get-build-json.py
vendored
@@ -1,92 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
This is a helper script for the mutli-stage Docker image builder.
|
||||
It provides a single point of configuration for package version control.
|
||||
The output JSON object is used by the CI workflow to determine what versions
|
||||
to build and pull into the final Docker image.
|
||||
|
||||
Python package information is obtained from the Pipfile.lock. As this is
|
||||
kept updated by dependabot, it usually will need no further configuration.
|
||||
The sole exception currently is pikepdf, which has a dependency on qpdf,
|
||||
and is configured here to use the latest version of qpdf built by the workflow.
|
||||
|
||||
Other package version information is configured directly below, generally by
|
||||
setting the version and Git information, if any.
|
||||
|
||||
"""
|
||||
import argparse
|
||||
import json
|
||||
import os
|
||||
from pathlib import Path
|
||||
from typing import Final
|
||||
|
||||
from common import get_cache_image_tag
|
||||
from common import get_image_tag
|
||||
|
||||
|
||||
def _main():
|
||||
parser = argparse.ArgumentParser(
|
||||
description="Generate a JSON object of information required to build the given package, based on the Pipfile.lock",
|
||||
)
|
||||
parser.add_argument(
|
||||
"package",
|
||||
help="The name of the package to generate JSON for",
|
||||
)
|
||||
|
||||
PIPFILE_LOCK_PATH: Final[Path] = Path("Pipfile.lock")
|
||||
BUILD_CONFIG_PATH: Final[Path] = Path(".build-config.json")
|
||||
|
||||
# Read the main config file
|
||||
build_json: Final = json.loads(BUILD_CONFIG_PATH.read_text())
|
||||
|
||||
# Read Pipfile.lock file
|
||||
pipfile_data: Final = json.loads(PIPFILE_LOCK_PATH.read_text())
|
||||
|
||||
args: Final = parser.parse_args()
|
||||
|
||||
# Read from environment variables set by GitHub Actions
|
||||
repo_name: Final[str] = os.environ["GITHUB_REPOSITORY"]
|
||||
branch_name: Final[str] = os.environ["GITHUB_REF_NAME"]
|
||||
|
||||
# Default output values
|
||||
version = None
|
||||
extra_config = {}
|
||||
|
||||
if args.package in pipfile_data["default"]:
|
||||
# Read the version from Pipfile.lock
|
||||
pkg_data = pipfile_data["default"][args.package]
|
||||
pkg_version = pkg_data["version"].split("==")[-1]
|
||||
version = pkg_version
|
||||
|
||||
# Any extra/special values needed
|
||||
if args.package == "pikepdf":
|
||||
extra_config["qpdf_version"] = build_json["qpdf"]["version"]
|
||||
|
||||
elif args.package in build_json:
|
||||
version = build_json[args.package]["version"]
|
||||
|
||||
else:
|
||||
raise NotImplementedError(args.package)
|
||||
|
||||
# The JSON object we'll output
|
||||
output = {
|
||||
"name": args.package,
|
||||
"version": version,
|
||||
"image_tag": get_image_tag(repo_name, args.package, version),
|
||||
"cache_tag": get_cache_image_tag(
|
||||
repo_name,
|
||||
args.package,
|
||||
version,
|
||||
branch_name,
|
||||
),
|
||||
}
|
||||
|
||||
# Add anything special a package may need
|
||||
output.update(extra_config)
|
||||
|
||||
# Output the JSON info to stdout
|
||||
print(json.dumps(output))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
_main()
|
227
.github/scripts/github.py
vendored
@@ -1,227 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
This module contains some useful classes for interacting with the Github API.
|
||||
The full documentation for the API can be found here: https://docs.github.com/en/rest
|
||||
|
||||
Mostly, this focusses on two areas, repo branches and repo packages, as the use case
|
||||
is cleaning up container images which are no longer referred to.
|
||||
|
||||
"""
|
||||
import functools
|
||||
import logging
|
||||
import re
|
||||
import urllib.parse
|
||||
from typing import Dict
|
||||
from typing import List
|
||||
from typing import Optional
|
||||
|
||||
import requests
|
||||
|
||||
logger = logging.getLogger("github-api")
|
||||
|
||||
|
||||
class _GithubApiBase:
|
||||
"""
|
||||
A base class for interacting with the Github API. It
|
||||
will handle the session and setting authorization headers.
|
||||
"""
|
||||
|
||||
def __init__(self, token: str) -> None:
|
||||
self._token = token
|
||||
self._session: Optional[requests.Session] = None
|
||||
|
||||
def __enter__(self) -> "_GithubApiBase":
|
||||
"""
|
||||
Sets up the required headers for auth and response
|
||||
type from the API
|
||||
"""
|
||||
self._session = requests.Session()
|
||||
self._session.headers.update(
|
||||
{
|
||||
"Accept": "application/vnd.github.v3+json",
|
||||
"Authorization": f"token {self._token}",
|
||||
},
|
||||
)
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
"""
|
||||
Ensures the authorization token is cleaned up no matter
|
||||
the reason for the exit
|
||||
"""
|
||||
if "Accept" in self._session.headers:
|
||||
del self._session.headers["Accept"]
|
||||
if "Authorization" in self._session.headers:
|
||||
del self._session.headers["Authorization"]
|
||||
|
||||
# Close the session as well
|
||||
self._session.close()
|
||||
self._session = None
|
||||
|
||||
def _read_all_pages(self, endpoint):
|
||||
"""
|
||||
Helper function to read all pages of an endpoint, utilizing the
|
||||
next.url until exhausted. Assumes the endpoint returns a list
|
||||
"""
|
||||
internal_data = []
|
||||
|
||||
while True:
|
||||
resp = self._session.get(endpoint)
|
||||
if resp.status_code == 200:
|
||||
internal_data += resp.json()
|
||||
if "next" in resp.links:
|
||||
endpoint = resp.links["next"]["url"]
|
||||
else:
|
||||
logger.debug("Exiting pagination loop")
|
||||
break
|
||||
else:
|
||||
logger.warning(f"Request to {endpoint} return HTTP {resp.status_code}")
|
||||
break
|
||||
|
||||
return internal_data
|
||||
|
||||
|
||||
class _EndpointResponse:
|
||||
"""
|
||||
For all endpoint JSON responses, store the full
|
||||
response data, for ease of extending later, if need be.
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
self._data = data
|
||||
|
||||
|
||||
class GithubBranch(_EndpointResponse):
|
||||
"""
|
||||
Simple wrapper for a repository branch, only extracts name information
|
||||
for now.
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict) -> None:
|
||||
super().__init__(data)
|
||||
self.name = self._data["name"]
|
||||
|
||||
|
||||
class GithubBranchApi(_GithubApiBase):
|
||||
"""
|
||||
Wrapper around branch API.
|
||||
|
||||
See https://docs.github.com/en/rest/branches/branches
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, token: str) -> None:
|
||||
super().__init__(token)
|
||||
|
||||
self._ENDPOINT = "https://api.github.com/repos/{REPO}/branches"
|
||||
|
||||
def get_branches(self, repo: str) -> List[GithubBranch]:
|
||||
"""
|
||||
Returns all current branches of the given repository owned by the given
|
||||
owner or organization.
|
||||
"""
|
||||
endpoint = self._ENDPOINT.format(REPO=repo)
|
||||
internal_data = self._read_all_pages(endpoint)
|
||||
return [GithubBranch(branch) for branch in internal_data]
|
||||
|
||||
|
||||
class ContainerPackage(_EndpointResponse):
|
||||
"""
|
||||
Data class wrapping the JSON response from the package related
|
||||
endpoints
|
||||
"""
|
||||
|
||||
def __init__(self, data: Dict):
|
||||
super().__init__(data)
|
||||
# This is a numerical ID, required for interactions with this
|
||||
# specific package, including deletion of it or restoration
|
||||
self.id: int = self._data["id"]
|
||||
|
||||
# A string name. This might be an actual name or it could be a
|
||||
# digest string like "sha256:"
|
||||
self.name: str = self._data["name"]
|
||||
|
||||
# URL to the package, including its ID, can be used for deletion
|
||||
# or restoration without needing to build up a URL ourselves
|
||||
self.url: str = self._data["url"]
|
||||
|
||||
# The list of tags applied to this image. Maybe an empty list
|
||||
self.tags: List[str] = self._data["metadata"]["container"]["tags"]
|
||||
|
||||
@functools.cached_property
|
||||
def untagged(self) -> bool:
|
||||
"""
|
||||
Returns True if the image has no tags applied to it, False otherwise
|
||||
"""
|
||||
return len(self.tags) == 0
|
||||
|
||||
@functools.cache
|
||||
def tag_matches(self, pattern: str) -> bool:
|
||||
"""
|
||||
Returns True if the image has at least one tag which matches the given regex,
|
||||
False otherwise
|
||||
"""
|
||||
for tag in self.tags:
|
||||
if re.match(pattern, tag) is not None:
|
||||
return True
|
||||
return False
|
||||
|
||||
def __repr__(self):
|
||||
return f"Package {self.name}"
|
||||
|
||||
|
||||
class GithubContainerRegistryApi(_GithubApiBase):
|
||||
"""
|
||||
Class wrapper to deal with the Github packages API. This class only deals with
|
||||
container type packages, the only type published by paperless-ngx.
|
||||
"""
|
||||
|
||||
def __init__(self, token: str, owner_or_org: str) -> None:
|
||||
super().__init__(token)
|
||||
self._owner_or_org = owner_or_org
|
||||
if self._owner_or_org == "paperless-ngx":
|
||||
# https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-an-organization
|
||||
self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
|
||||
# https://docs.github.com/en/rest/packages#delete-package-version-for-an-organization
|
||||
self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
|
||||
else:
|
||||
# https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-the-authenticated-user
|
||||
self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
|
||||
# https://docs.github.com/en/rest/packages#delete-a-package-version-for-the-authenticated-user
|
||||
self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
|
||||
|
||||
def get_package_versions(
|
||||
self,
|
||||
package_name: str,
|
||||
) -> List[ContainerPackage]:
|
||||
"""
|
||||
Returns all the versions of a given package (container images) from
|
||||
the API
|
||||
"""
|
||||
|
||||
package_type: str = "container"
|
||||
# Need to quote this for slashes in the name
|
||||
package_name = urllib.parse.quote(package_name, safe="")
|
||||
|
||||
endpoint = self._PACKAGES_VERSIONS_ENDPOINT.format(
|
||||
ORG=self._owner_or_org,
|
||||
PACKAGE_TYPE=package_type,
|
||||
PACKAGE_NAME=package_name,
|
||||
)
|
||||
|
||||
pkgs = []
|
||||
|
||||
for data in self._read_all_pages(endpoint):
|
||||
pkgs.append(ContainerPackage(data))
|
||||
|
||||
return pkgs
|
||||
|
||||
def delete_package_version(self, package_data: ContainerPackage):
|
||||
"""
|
||||
Deletes the given package version from the GHCR
|
||||
"""
|
||||
resp = self._session.delete(package_data.url)
|
||||
if resp.status_code != 204:
|
||||
logger.warning(
|
||||
f"Request to delete {package_data.url} returned HTTP {resp.status_code}",
|
||||
)
|
15
.github/stale.yml
vendored
@@ -1,23 +1,18 @@
|
||||
# Number of days of inactivity before an issue becomes stale
|
||||
daysUntilStale: 30
|
||||
|
||||
# Number of days of inactivity before a stale issue is closed
|
||||
daysUntilClose: 7
|
||||
|
||||
# Only issues or pull requests with all of these labels are check if stale. Defaults to `[]` (disabled)
|
||||
onlyLabels: [cant-reproduce]
|
||||
|
||||
# Issues with these labels will never be considered stale
|
||||
exemptLabels:
|
||||
- pinned
|
||||
- security
|
||||
- fixpending
|
||||
# Label to use when marking an issue as stale
|
||||
staleLabel: stale
|
||||
|
||||
# Comment to post when marking an issue as stale. Set to `false` to disable
|
||||
markComment: >
|
||||
This issue has been automatically marked as stale because it has not had
|
||||
recent activity. It will be closed if no further activity occurs. Thank you
|
||||
for your contributions.
|
||||
|
||||
# Comment to post when closing a stale issue. Set to `false` to disable
|
||||
closeComment: false
|
||||
|
||||
# See https://github.com/marketplace/stale for more info on the app
|
||||
# and https://github.com/probot/stale for the configuration docs
|
||||
|
416
.github/workflows/ci.yml
vendored
@@ -3,10 +3,8 @@ name: ci
|
||||
on:
|
||||
push:
|
||||
tags:
|
||||
# https://semver.org/#spec-item-2
|
||||
- 'v[0-9]+.[0-9]+.[0-9]+'
|
||||
# https://semver.org/#spec-item-9
|
||||
- 'v[0-9]+.[0-9]+.[0-9]+-beta.rc[0-9]+'
|
||||
- ngx-*
|
||||
- beta-*
|
||||
branches-ignore:
|
||||
- 'translations**'
|
||||
pull_request:
|
||||
@@ -14,41 +12,19 @@ on:
|
||||
- 'translations**'
|
||||
|
||||
jobs:
|
||||
pre-commit:
|
||||
name: Linting Checks
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
-
|
||||
name: Checkout repository
|
||||
uses: actions/checkout@v3
|
||||
|
||||
-
|
||||
name: Install tools
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "3.9"
|
||||
|
||||
-
|
||||
name: Check files
|
||||
uses: pre-commit/action@v3.0.0
|
||||
|
||||
documentation:
|
||||
name: "Build Documentation"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pipx install pipenv==2022.8.5
|
||||
pipenv --version
|
||||
run: pipx install pipenv
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
uses: actions/setup-python@v3
|
||||
with:
|
||||
python-version: 3.9
|
||||
cache: "pipenv"
|
||||
@@ -57,10 +33,6 @@ jobs:
|
||||
name: Install dependencies
|
||||
run: |
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: List installed Python dependencies
|
||||
run: |
|
||||
pipenv run pip list
|
||||
-
|
||||
name: Make documentation
|
||||
run: |
|
||||
@@ -73,14 +45,72 @@ jobs:
|
||||
name: documentation
|
||||
path: docs/_build/html/
|
||||
|
||||
tests-backend:
|
||||
name: "Tests (${{ matrix.python-version }})"
|
||||
code-checks-backend:
|
||||
name: "Backend Code Checks"
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Install checkers
|
||||
run: |
|
||||
pipx install reorder-python-imports
|
||||
pipx install yesqa
|
||||
pipx install add-trailing-comma
|
||||
pipx install flake8
|
||||
-
|
||||
name: Run reorder-python-imports
|
||||
run: |
|
||||
find src/ -type f -name '*.py' ! -path "*/migrations/*" | xargs reorder-python-imports
|
||||
-
|
||||
name: Run yesqa
|
||||
run: |
|
||||
find src/ -type f -name '*.py' ! -path "*/migrations/*" | xargs yesqa
|
||||
-
|
||||
name: Run add-trailing-comma
|
||||
run: |
|
||||
find src/ -type f -name '*.py' ! -path "*/migrations/*" | xargs add-trailing-comma
|
||||
# black is placed after add-trailing-comma because it may format differently
|
||||
# if a trailing comma is added
|
||||
-
|
||||
name: Run black
|
||||
uses: psf/black@stable
|
||||
with:
|
||||
options: "--check --diff"
|
||||
version: "22.3.0"
|
||||
-
|
||||
name: Run flake8 checks
|
||||
run: |
|
||||
cd src/
|
||||
flake8 --max-line-length=88 --ignore=E203,W503
|
||||
|
||||
code-checks-frontend:
|
||||
name: "Frontend Code Checks"
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
- uses: actions/setup-node@v3
|
||||
with:
|
||||
node-version: '16'
|
||||
-
|
||||
name: Install prettier
|
||||
run: |
|
||||
npm install prettier
|
||||
-
|
||||
name: Run prettier
|
||||
run:
|
||||
npx prettier --check --ignore-path Pipfile.lock **/*.js **/*.ts *.md **/*.md
|
||||
|
||||
tests-backend:
|
||||
needs: [code-checks-backend]
|
||||
name: "Backend Tests (${{ matrix.python-version }})"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
strategy:
|
||||
matrix:
|
||||
python-version: ['3.8', '3.9', '3.10']
|
||||
python-version: ['3.8', '3.9']
|
||||
fail-fast: false
|
||||
steps:
|
||||
-
|
||||
@@ -90,12 +120,10 @@ jobs:
|
||||
fetch-depth: 2
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pipx install pipenv==2022.8.5
|
||||
pipenv --version
|
||||
run: pipx install pipenv
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
uses: actions/setup-python@v3
|
||||
with:
|
||||
python-version: "${{ matrix.python-version }}"
|
||||
cache: "pipenv"
|
||||
@@ -104,15 +132,11 @@ jobs:
|
||||
name: Install system dependencies
|
||||
run: |
|
||||
sudo apt-get update -qq
|
||||
sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript libzbar0 poppler-utils
|
||||
sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript optipng
|
||||
-
|
||||
name: Install Python dependencies
|
||||
run: |
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: List installed Python dependencies
|
||||
run: |
|
||||
pipenv run pip list
|
||||
-
|
||||
name: Tests
|
||||
run: |
|
||||
@@ -121,7 +145,7 @@ jobs:
|
||||
-
|
||||
name: Get changed files
|
||||
id: changed-files-specific
|
||||
uses: tj-actions/changed-files@v29.0.2
|
||||
uses: tj-actions/changed-files@v18.1
|
||||
with:
|
||||
files: |
|
||||
src/**
|
||||
@@ -142,17 +166,15 @@ jobs:
|
||||
pipenv run coveralls --service=github
|
||||
|
||||
tests-frontend:
|
||||
name: "Tests Frontend"
|
||||
needs: [code-checks-frontend]
|
||||
name: "Frontend Tests"
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- pre-commit
|
||||
strategy:
|
||||
matrix:
|
||||
node-version: [16.x]
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
-
|
||||
name: Use Node.js ${{ matrix.node-version }}
|
||||
- name: Use Node.js ${{ matrix.node-version }}
|
||||
uses: actions/setup-node@v3
|
||||
with:
|
||||
node-version: ${{ matrix.node-version }}
|
||||
@@ -160,152 +182,42 @@ jobs:
|
||||
- run: cd src-ui && npm run test
|
||||
- run: cd src-ui && npm run e2e:ci
|
||||
|
||||
prepare-docker-build:
|
||||
name: Prepare Docker Pipeline Data
|
||||
if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
|
||||
runs-on: ubuntu-20.04
|
||||
# If the push triggered the installer library workflow, wait for it to
|
||||
# complete here. This ensures the required versions for the final
|
||||
# image have been built, while not waiting at all if the versions haven't changed
|
||||
concurrency:
|
||||
group: build-installer-library
|
||||
cancel-in-progress: false
|
||||
needs:
|
||||
- documentation
|
||||
- tests-backend
|
||||
- tests-frontend
|
||||
steps:
|
||||
-
|
||||
name: Set ghcr repository name
|
||||
id: set-ghcr-repository
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo ::set-output name=repository::${ghcr_name}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "3.9"
|
||||
-
|
||||
name: Setup qpdf image
|
||||
id: qpdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=qpdf-json::${build_json}
|
||||
-
|
||||
name: Setup psycopg2 image
|
||||
id: psycopg2-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=psycopg2-json::${build_json}
|
||||
-
|
||||
name: Setup pikepdf image
|
||||
id: pikepdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=pikepdf-json::${build_json}
|
||||
-
|
||||
name: Setup jbig2enc image
|
||||
id: jbig2enc-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=jbig2enc-json::${build_json}
|
||||
|
||||
outputs:
|
||||
|
||||
ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
|
||||
|
||||
qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
|
||||
|
||||
pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
|
||||
|
||||
psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
|
||||
|
||||
jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}
|
||||
|
||||
# build and push image to docker hub.
|
||||
build-docker-image:
|
||||
if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || startsWith(github.ref, 'refs/tags/ngx-') || startsWith(github.ref, 'refs/tags/beta-'))
|
||||
runs-on: ubuntu-20.04
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-build-docker-image-${{ github.ref_name }}
|
||||
cancel-in-progress: true
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
needs: [tests-backend, tests-frontend]
|
||||
steps:
|
||||
-
|
||||
name: Check pushing to Docker Hub
|
||||
id: docker-hub
|
||||
# Only push to Dockerhub from the main repo AND the ref is either:
|
||||
# main
|
||||
# dev
|
||||
# beta
|
||||
# a tag
|
||||
# Otherwise forks would require a Docker Hub account and secrets setup
|
||||
run: |
|
||||
if [[ ${{ needs.prepare-docker-build.outputs.ghcr-repository }} == "paperless-ngx/paperless-ngx" && ( ${{ github.ref_name }} == "main" || ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
|
||||
echo "Enabling DockerHub image push"
|
||||
echo ::set-output name=enable::"true"
|
||||
else
|
||||
echo "Not pushing to DockerHub"
|
||||
echo ::set-output name=enable::"false"
|
||||
fi
|
||||
-
|
||||
name: Gather Docker metadata
|
||||
id: docker-meta
|
||||
uses: docker/metadata-action@v4
|
||||
uses: docker/metadata-action@v3
|
||||
with:
|
||||
images: |
|
||||
ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}
|
||||
name=paperlessngx/paperless-ngx,enable=${{ steps.docker-hub.outputs.enable }}
|
||||
images: ghcr.io/${{ github.repository }}
|
||||
tags: |
|
||||
# Tag branches with branch name
|
||||
type=match,pattern=ngx-(\d.\d.\d),group=1
|
||||
type=semver,pattern=ngx-{{version}}
|
||||
type=semver,pattern=ngx-{{major}}.{{minor}}
|
||||
type=ref,event=branch
|
||||
# Process semver tags
|
||||
# For a tag x.y.z or vX.Y.Z, output an x.y.z and x.y image tag
|
||||
type=semver,pattern={{version}}
|
||||
type=semver,pattern={{major}}.{{minor}}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v2
|
||||
uses: docker/setup-buildx-action@v1
|
||||
-
|
||||
name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v2
|
||||
uses: docker/setup-qemu-action@v1
|
||||
-
|
||||
name: Login to Github Container Registry
|
||||
uses: docker/login-action@v2
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
-
|
||||
name: Login to Docker Hub
|
||||
uses: docker/login-action@v2
|
||||
# Don't attempt to login is not pushing to Docker Hub
|
||||
if: steps.docker-hub.outputs.enable == 'true'
|
||||
with:
|
||||
username: ${{ secrets.DOCKERHUB_USERNAME }}
|
||||
password: ${{ secrets.DOCKERHUB_TOKEN }}
|
||||
-
|
||||
name: Build and push
|
||||
uses: docker/build-push-action@v3
|
||||
uses: docker/build-push-action@v2
|
||||
with:
|
||||
context: .
|
||||
file: ./Dockerfile
|
||||
@@ -313,19 +225,8 @@ jobs:
|
||||
push: ${{ github.event_name != 'pull_request' }}
|
||||
tags: ${{ steps.docker-meta.outputs.tags }}
|
||||
labels: ${{ steps.docker-meta.outputs.labels }}
|
||||
build-args: |
|
||||
JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
|
||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
||||
PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
|
||||
PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
|
||||
# Get cache layers from this branch, then dev, then main
|
||||
# This allows new branches to get at least some cache benefits, generally from dev
|
||||
cache-from: |
|
||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:dev
|
||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:main
|
||||
cache-to: |
|
||||
type=registry,mode=max,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
||||
cache-from: type=gha
|
||||
cache-to: type=gha,mode=max
|
||||
-
|
||||
name: Inspect image
|
||||
run: |
|
||||
@@ -343,34 +244,24 @@ jobs:
|
||||
path: src/documents/static/frontend/
|
||||
|
||||
build-release:
|
||||
needs:
|
||||
- build-docker-image
|
||||
needs: [build-docker-image, documentation]
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Install pipenv
|
||||
run: |
|
||||
pip3 install --upgrade pip setuptools wheel pipx
|
||||
pipx install pipenv
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
uses: actions/setup-python@v3
|
||||
with:
|
||||
python-version: 3.9
|
||||
cache: "pipenv"
|
||||
cache-dependency-path: 'Pipfile.lock'
|
||||
-
|
||||
name: Install Python dependencies
|
||||
run: |
|
||||
pipenv sync --dev
|
||||
-
|
||||
name: Install system dependencies
|
||||
name: Install dependencies
|
||||
run: |
|
||||
sudo apt-get update -qq
|
||||
sudo apt-get install -qq --no-install-recommends gettext liblept5
|
||||
pip3 install --upgrade pip setuptools wheel
|
||||
pip3 install -r requirements.txt
|
||||
-
|
||||
name: Download frontend artifact
|
||||
uses: actions/download-artifact@v3
|
||||
@@ -383,38 +274,34 @@ jobs:
|
||||
with:
|
||||
name: documentation
|
||||
path: docs/_build/html/
|
||||
-
|
||||
name: Generate requirements file
|
||||
run: |
|
||||
pipenv requirements > requirements.txt
|
||||
-
|
||||
name: Compile messages
|
||||
run: |
|
||||
cd src/
|
||||
pipenv run python3 manage.py compilemessages
|
||||
-
|
||||
name: Collect static files
|
||||
run: |
|
||||
cd src/
|
||||
pipenv run python3 manage.py collectstatic --no-input
|
||||
-
|
||||
name: Move files
|
||||
run: |
|
||||
mkdir dist
|
||||
mkdir dist/paperless-ngx
|
||||
mkdir dist/paperless-ngx/scripts
|
||||
cp .dockerignore .env Dockerfile Pipfile Pipfile.lock requirements.txt LICENSE README.md dist/paperless-ngx/
|
||||
cp .dockerignore .env Dockerfile Pipfile Pipfile.lock LICENSE README.md requirements.txt dist/paperless-ngx/
|
||||
cp paperless.conf.example dist/paperless-ngx/paperless.conf
|
||||
cp gunicorn.conf.py dist/paperless-ngx/gunicorn.conf.py
|
||||
cp -r docker/ dist/paperless-ngx/docker
|
||||
cp docker/ dist/paperless-ngx/docker -r
|
||||
cp scripts/*.service scripts/*.sh dist/paperless-ngx/scripts/
|
||||
cp -r src/ dist/paperless-ngx/src
|
||||
cp -r docs/_build/html/ dist/paperless-ngx/docs
|
||||
mv static dist/paperless-ngx
|
||||
cp src/ dist/paperless-ngx/src -r
|
||||
cp docs/_build/html/ dist/paperless-ngx/docs -r
|
||||
-
|
||||
name: Compile messages
|
||||
run: |
|
||||
cd dist/paperless-ngx/src
|
||||
python3 manage.py compilemessages
|
||||
-
|
||||
name: Collect static files
|
||||
run: |
|
||||
cd dist/paperless-ngx/src
|
||||
python3 manage.py collectstatic --no-input
|
||||
-
|
||||
name: Make release package
|
||||
run: |
|
||||
cd dist
|
||||
find . -name __pycache__ | xargs rm -r
|
||||
tar -cJf paperless-ngx.tar.xz paperless-ngx/
|
||||
-
|
||||
name: Upload release artifact
|
||||
@@ -425,13 +312,8 @@ jobs:
|
||||
|
||||
publish-release:
|
||||
runs-on: ubuntu-20.04
|
||||
outputs:
|
||||
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
||||
changelog: ${{ steps.create-release.outputs.body }}
|
||||
version: ${{ steps.get_version.outputs.version }}
|
||||
needs:
|
||||
- build-release
|
||||
if: github.ref_type == 'tag' && (startsWith(github.ref_name, 'v') || contains(github.ref_name, '-beta.rc'))
|
||||
needs: build-release
|
||||
if: contains(github.ref, 'refs/tags/ngx-') || contains(github.ref, 'refs/tags/beta-')
|
||||
steps:
|
||||
-
|
||||
name: Download release artifact
|
||||
@@ -443,24 +325,27 @@ jobs:
|
||||
name: Get version
|
||||
id: get_version
|
||||
run: |
|
||||
echo ::set-output name=version::${{ github.ref_name }}
|
||||
if [[ ${{ contains(github.ref_name, '-beta.rc') }} == 'true' ]]; then
|
||||
echo ::set-output name=prerelease::true
|
||||
else
|
||||
if [[ $GITHUB_REF == refs/tags/ngx-* ]]; then
|
||||
echo ::set-output name=version::${GITHUB_REF#refs/tags/ngx-}
|
||||
echo ::set-output name=prerelease::false
|
||||
echo ::set-output name=body::"For a complete list of changes, see the changelog at https://paperless-ngx.readthedocs.io/en/latest/changelog.html"
|
||||
elif [[ $GITHUB_REF == refs/tags/beta-* ]]; then
|
||||
echo ::set-output name=version::${GITHUB_REF#refs/tags/beta-}
|
||||
echo ::set-output name=prerelease::true
|
||||
echo ::set-output name=body::"For a complete list of changes, see the changelog at https://github.com/paperless-ngx/paperless-ngx/blob/beta/docs/changelog.rst"
|
||||
fi
|
||||
-
|
||||
name: Create Release and Changelog
|
||||
id: create-release
|
||||
uses: paperless-ngx/release-drafter@master
|
||||
with:
|
||||
name: Paperless-ngx ${{ steps.get_version.outputs.version }}
|
||||
tag: ${{ steps.get_version.outputs.version }}
|
||||
version: ${{ steps.get_version.outputs.version }}
|
||||
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
||||
publish: true # ensures release is not marked as draft
|
||||
name: Create release
|
||||
id: create_release
|
||||
uses: actions/create-release@v1
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
with:
|
||||
tag_name: ngx-${{ steps.get_version.outputs.version }}
|
||||
release_name: Paperless-ngx ${{ steps.get_version.outputs.version }}
|
||||
draft: false
|
||||
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
||||
body: ${{ steps.get_version.outputs.body }}
|
||||
-
|
||||
name: Upload release archive
|
||||
id: upload-release-asset
|
||||
@@ -468,56 +353,7 @@ jobs:
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
with:
|
||||
upload_url: ${{ steps.create-release.outputs.upload_url }}
|
||||
upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
|
||||
asset_path: ./paperless-ngx.tar.xz
|
||||
asset_name: paperless-ngx-${{ steps.get_version.outputs.version }}.tar.xz
|
||||
asset_content_type: application/x-xz
|
||||
|
||||
append-changelog:
|
||||
runs-on: ubuntu-20.04
|
||||
needs:
|
||||
- publish-release
|
||||
if: needs.publish-release.outputs.prerelease == 'false'
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
with:
|
||||
ref: main
|
||||
-
|
||||
name: Append Changelog to docs
|
||||
id: append-Changelog
|
||||
working-directory: docs
|
||||
run: |
|
||||
git branch ${{ needs.publish-release.outputs.version }}-changelog
|
||||
git checkout ${{ needs.publish-release.outputs.version }}-changelog
|
||||
echo -e "# Changelog\n\n${{ needs.publish-release.outputs.changelog }}\n" > changelog-new.md
|
||||
echo "Manually linking usernames"
|
||||
sed -i -r 's|@(.+?) \(\[#|[@\1](https://github.com/\1) ([#|ig' changelog-new.md
|
||||
CURRENT_CHANGELOG=`tail --lines +2 changelog.md`
|
||||
echo -e "$CURRENT_CHANGELOG" >> changelog-new.md
|
||||
mv changelog-new.md changelog.md
|
||||
git config --global user.name "github-actions"
|
||||
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
|
||||
git commit -am "Changelog ${{ steps.get_version.outputs.version }} - GHA"
|
||||
git push origin ${{ needs.publish-release.outputs.version }}-changelog
|
||||
-
|
||||
name: Create Pull Request
|
||||
uses: actions/github-script@v6
|
||||
with:
|
||||
script: |
|
||||
const { repo, owner } = context.repo;
|
||||
const result = await github.rest.pulls.create({
|
||||
title: '[Documentation] Add ${{ needs.publish-release.outputs.version }} changelog',
|
||||
owner,
|
||||
repo,
|
||||
head: '${{ needs.publish-release.outputs.version }}-changelog',
|
||||
base: 'main',
|
||||
body: 'This PR is auto-generated by CI.'
|
||||
});
|
||||
github.rest.issues.addLabels({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: result.data.number,
|
||||
labels: ['documentation']
|
||||
});
|
||||
|
108
.github/workflows/cleanup-tags.yml
vendored
@@ -1,108 +0,0 @@
|
||||
# This workflow runs on certain conditions to check for and potentially
|
||||
# delete container images from the GHCR which no longer have an associated
|
||||
# code branch.
|
||||
# Requires a PAT with the correct scope set in the secrets
|
||||
|
||||
name: Cleanup Image Tags
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: '0 0 * * SAT'
|
||||
delete:
|
||||
pull_request:
|
||||
types:
|
||||
- closed
|
||||
push:
|
||||
paths:
|
||||
- ".github/workflows/cleanup-tags.yml"
|
||||
- ".github/scripts/cleanup-tags.py"
|
||||
- ".github/scripts/github.py"
|
||||
- ".github/scripts/common.py"
|
||||
|
||||
jobs:
|
||||
cleanup:
|
||||
name: Cleanup Image Tags
|
||||
runs-on: ubuntu-20.04
|
||||
env:
|
||||
# Requires a personal access token with the OAuth scope delete:packages
|
||||
TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Login to Github Container Registry
|
||||
uses: docker/login-action@v1
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v3
|
||||
with:
|
||||
python-version: "3.10"
|
||||
-
|
||||
name: Install requests
|
||||
run: |
|
||||
python -m pip install requests
|
||||
# Clean up primary packages
|
||||
-
|
||||
name: Cleanup for package "paperless-ngx"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx"
|
||||
-
|
||||
name: Cleanup for package "qpdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/qpdf"
|
||||
-
|
||||
name: Cleanup for package "pikepdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/pikepdf"
|
||||
-
|
||||
name: Cleanup for package "jbig2enc"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/jbig2enc"
|
||||
-
|
||||
name: Cleanup for package "psycopg2"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --is-manifest --delete "paperless-ngx/builder/psycopg2"
|
||||
#
|
||||
# Clean up registry cache packages
|
||||
#
|
||||
-
|
||||
name: Cleanup for package "builder/cache/app"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/app"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/qpdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/qpdf"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/psycopg2"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/psycopg2"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/jbig2enc"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/jbig2enc"
|
||||
-
|
||||
name: Cleanup for package "builder/cache/pikepdf"
|
||||
if: "${{ env.TOKEN != '' }}"
|
||||
run: |
|
||||
python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --loglevel info --untagged --delete "paperless-ngx/builder/cache/pikepdf"
|
||||
-
|
||||
name: Check all tags still pull
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo "Pulling all tags of ghcr.io/${ghcr_name}"
|
||||
docker pull --quiet --all-tags ghcr.io/${ghcr_name}
|
4
.github/workflows/codeql-analysis.yml
vendored
@@ -42,7 +42,7 @@ jobs:

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
uses: github/codeql-action/init@v1
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
@@ -51,4 +51,4 @@ jobs:
# queries: ./path/to/local/query, your-org/your-repo/queries@main

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
uses: github/codeql-action/analyze@v1
147
.github/workflows/installer-library.yml
vendored
@@ -1,147 +0,0 @@
|
||||
# This workflow will run to update the installer library of
|
||||
# Docker images. These are the images which provide updated wheels
|
||||
# .deb installation packages, or maybe just some compiled library
|
||||
|
||||
name: Build Image Library
|
||||
|
||||
on:
|
||||
push:
|
||||
# Must match one of these branches AND one of the paths
|
||||
# to be triggered
|
||||
branches:
|
||||
- "main"
|
||||
- "dev"
|
||||
- "library-*"
|
||||
- "feature-*"
|
||||
paths:
|
||||
# Trigger the workflow if a Dockerfile changed
|
||||
- "docker-builders/**"
|
||||
# Trigger if a package was updated
|
||||
- ".build-config.json"
|
||||
- "Pipfile.lock"
|
||||
# Also trigger on workflow changes related to the library
|
||||
- ".github/workflows/installer-library.yml"
|
||||
- ".github/workflows/reusable-workflow-builder.yml"
|
||||
- ".github/scripts/**"
|
||||
|
||||
# Set a workflow level concurrency group so primary workflow
|
||||
# can wait for this to complete if needed
|
||||
# DO NOT CHANGE without updating main workflow group
|
||||
concurrency:
|
||||
group: build-installer-library
|
||||
cancel-in-progress: false
|
||||
|
||||
jobs:
|
||||
prepare-docker-build:
|
||||
name: Prepare Docker Image Version Data
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
-
|
||||
name: Set ghcr repository name
|
||||
id: set-ghcr-repository
|
||||
run: |
|
||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
||||
echo ::set-output name=repository::${ghcr_name}
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: "3.9"
|
||||
-
|
||||
name: Setup qpdf image
|
||||
id: qpdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=qpdf-json::${build_json}
|
||||
-
|
||||
name: Setup psycopg2 image
|
||||
id: psycopg2-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=psycopg2-json::${build_json}
|
||||
-
|
||||
name: Setup pikepdf image
|
||||
id: pikepdf-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=pikepdf-json::${build_json}
|
||||
-
|
||||
name: Setup jbig2enc image
|
||||
id: jbig2enc-setup
|
||||
run: |
|
||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)
|
||||
|
||||
echo ${build_json}
|
||||
|
||||
echo ::set-output name=jbig2enc-json::${build_json}
|
||||
|
||||
outputs:
|
||||
|
||||
ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
|
||||
|
||||
qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
|
||||
|
||||
pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
|
||||
|
||||
psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
|
||||
|
||||
jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}
|
||||
|
||||
build-qpdf-debs:
|
||||
name: qpdf
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.qpdf
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.qpdf-json }}
|
||||
build-args: |
|
||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
||||
|
||||
build-jbig2enc:
|
||||
name: jbig2enc
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.jbig2enc
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.jbig2enc-json }}
|
||||
build-args: |
|
||||
JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
|
||||
|
||||
build-psycopg2-wheel:
|
||||
name: psycopg2
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.psycopg2
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.psycopg2-json }}
|
||||
build-args: |
|
||||
PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
|
||||
|
||||
build-pikepdf-wheel:
|
||||
name: pikepdf
|
||||
needs:
|
||||
- prepare-docker-build
|
||||
- build-qpdf-debs
|
||||
uses: ./.github/workflows/reusable-workflow-builder.yml
|
||||
with:
|
||||
dockerfile: ./docker-builders/Dockerfile.pikepdf
|
||||
build-json: ${{ needs.prepare-docker-build.outputs.pikepdf-json }}
|
||||
build-args: |
|
||||
REPO=${{ needs.prepare-docker-build.outputs.ghcr-repository }}
|
||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
||||
PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
|
22
.github/workflows/project-actions.yml
vendored
@@ -13,9 +13,6 @@ on:
|
||||
- main
|
||||
- dev
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
|
||||
env:
|
||||
todo: Todo
|
||||
done: Done
|
||||
@@ -27,8 +24,8 @@ jobs:
|
||||
runs-on: ubuntu-latest
|
||||
if: github.event_name == 'issues' && (github.event.action == 'opened' || github.event.action == 'reopened')
|
||||
steps:
|
||||
- name: Add issue to project and set status to ${{ env.todo }}
|
||||
uses: leonsteinhaeuser/project-beta-automations@v1.3.0
|
||||
- name: Set issue status to ${{ env.todo }}
|
||||
uses: leonsteinhaeuser/project-beta-automations@v1.2.1
|
||||
with:
|
||||
gh_token: ${{ secrets.GH_TOKEN }}
|
||||
organization: paperless-ngx
|
||||
@@ -38,20 +35,13 @@ jobs:
|
||||
pr_opened_or_reopened:
|
||||
name: pr_opened_or_reopened
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
# write permission is required for autolabeler
|
||||
pull-requests: write
|
||||
if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request.user.login != 'dependabot'
|
||||
if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened')
|
||||
steps:
|
||||
- name: Add PR to project and set status to "Needs Review"
|
||||
uses: leonsteinhaeuser/project-beta-automations@v1.3.0
|
||||
- name: Set PR status to ${{ env.in_progress }}
|
||||
uses: leonsteinhaeuser/project-beta-automations@v1.2.1
|
||||
with:
|
||||
gh_token: ${{ secrets.GH_TOKEN }}
|
||||
organization: paperless-ngx
|
||||
project_id: 2
|
||||
resource_node_id: ${{ github.event.pull_request.node_id }}
|
||||
status_value: "Needs Review" # Target status
|
||||
- name: Label PR with release-drafter
|
||||
uses: release-drafter/release-drafter@v5
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
status_value: ${{ env.in_progress }} # Target status
|
||||
|
53
.github/workflows/reusable-workflow-builder.yml
vendored
@@ -1,53 +0,0 @@
|
||||
name: Reusable Image Builder
|
||||
|
||||
on:
|
||||
workflow_call:
|
||||
inputs:
|
||||
dockerfile:
|
||||
required: true
|
||||
type: string
|
||||
build-json:
|
||||
required: true
|
||||
type: string
|
||||
build-args:
|
||||
required: false
|
||||
default: ""
|
||||
type: string
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ fromJSON(inputs.build-json).name }}-${{ fromJSON(inputs.build-json).version }}
|
||||
cancel-in-progress: false
|
||||
|
||||
jobs:
|
||||
build-image:
|
||||
name: Build ${{ fromJSON(inputs.build-json).name }} @ ${{ fromJSON(inputs.build-json).version }}
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
-
|
||||
name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
-
|
||||
name: Login to Github Container Registry
|
||||
uses: docker/login-action@v2
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
-
|
||||
name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v2
|
||||
-
|
||||
name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v2
|
||||
-
|
||||
name: Build ${{ fromJSON(inputs.build-json).name }}
|
||||
uses: docker/build-push-action@v3
|
||||
with:
|
||||
context: .
|
||||
file: ${{ inputs.dockerfile }}
|
||||
tags: ${{ fromJSON(inputs.build-json).image_tag }}
|
||||
platforms: linux/amd64,linux/arm64,linux/arm/v7
|
||||
build-args: ${{ inputs.build-args }}
|
||||
push: true
|
||||
cache-from: type=registry,ref=${{ fromJSON(inputs.build-json).cache_tag }}
|
||||
cache-to: type=registry,mode=max,ref=${{ fromJSON(inputs.build-json).cache_tag }}
|
8
.gitignore
vendored
@@ -63,14 +63,11 @@ target/
|
||||
|
||||
# VS Code
|
||||
.vscode
|
||||
/src-ui/.vscode
|
||||
/docs/.vscode
|
||||
|
||||
# Other stuff that doesn't belong
|
||||
.virtualenv
|
||||
virtualenv
|
||||
/venv
|
||||
.venv/
|
||||
/docker-compose.env
|
||||
/docker-compose.yml
|
||||
|
||||
@@ -87,9 +84,8 @@ scripts/nuke
|
||||
/paperless.conf
|
||||
/consume/
|
||||
/export/
|
||||
/src-ui/.vscode
|
||||
|
||||
# this is where the compiled frontend is moved to.
|
||||
/src/documents/static/frontend/
|
||||
|
||||
# mac os
|
||||
.DS_Store
|
||||
/docs/.vscode/settings.json
|
||||
|
@@ -1,8 +0,0 @@
failure-threshold: warning
ignored:
# https://github.com/hadolint/hadolint/wiki/DL3008
- DL3008
# https://github.com/hadolint/hadolint/wiki/DL3013
- DL3013
# https://github.com/hadolint/hadolint/wiki/DL3003
- DL3003
@@ -5,7 +5,7 @@
|
||||
repos:
|
||||
# General hooks
|
||||
- repo: https://github.com/pre-commit/pre-commit-hooks
|
||||
rev: v4.3.0
|
||||
rev: v4.1.0
|
||||
hooks:
|
||||
- id: check-docstring-first
|
||||
- id: check-json
|
||||
@@ -27,7 +27,7 @@ repos:
|
||||
- id: check-case-conflict
|
||||
- id: detect-private-key
|
||||
- repo: https://github.com/pre-commit/mirrors-prettier
|
||||
rev: "v2.7.1"
|
||||
rev: "v2.6.1"
|
||||
hooks:
|
||||
- id: prettier
|
||||
types_or:
|
||||
@@ -37,17 +37,17 @@ repos:
|
||||
exclude: "(^Pipfile\\.lock$)"
|
||||
# Python hooks
|
||||
- repo: https://github.com/asottile/reorder_python_imports
|
||||
rev: v3.8.2
|
||||
rev: v3.0.1
|
||||
hooks:
|
||||
- id: reorder-python-imports
|
||||
exclude: "(migrations)"
|
||||
- repo: https://github.com/asottile/yesqa
|
||||
rev: "v1.4.0"
|
||||
rev: "v1.3.0"
|
||||
hooks:
|
||||
- id: yesqa
|
||||
exclude: "(migrations)"
|
||||
- repo: https://github.com/asottile/add-trailing-comma
|
||||
rev: "v2.2.3"
|
||||
rev: "v2.2.1"
|
||||
hooks:
|
||||
- id: add-trailing-comma
|
||||
exclude: "(migrations)"
|
||||
@@ -59,21 +59,14 @@ repos:
|
||||
args:
|
||||
- "--config=./src/setup.cfg"
|
||||
- repo: https://github.com/psf/black
|
||||
rev: 22.6.0
|
||||
rev: 22.3.0
|
||||
hooks:
|
||||
- id: black
|
||||
- repo: https://github.com/asottile/pyupgrade
|
||||
rev: v2.37.3
|
||||
hooks:
|
||||
- id: pyupgrade
|
||||
exclude: "(migrations)"
|
||||
args:
|
||||
- "--py38-plus"
|
||||
# Dockerfile hooks
|
||||
- repo: https://github.com/AleksaC/hadolint-py
|
||||
rev: v2.10.0
|
||||
- repo: https://github.com/pryorda/dockerfilelint-precommit-hooks
|
||||
rev: "v0.1.0"
|
||||
hooks:
|
||||
- id: hadolint
|
||||
- id: dockerfilelint
|
||||
# Shell script hooks
|
||||
- repo: https://github.com/lovesegfault/beautysh
|
||||
rev: v6.2.1
|
||||
|
@@ -1,9 +0,0 @@
/.github/workflows/ @paperless-ngx/ci-cd
/docker/ @paperless-ngx/ci-cd
/scripts/ @paperless-ngx/ci-cd

/src-ui/ @paperless-ngx/frontend

/src/ @paperless-ngx/backend
Pipfile* @paperless-ngx/backend
*.py @paperless-ngx/backend
247
Dockerfile
@@ -1,36 +1,12 @@
|
||||
# syntax=docker/dockerfile:1.4
|
||||
FROM node:16 AS compile-frontend
|
||||
|
||||
# Pull the installer images from the library
|
||||
# These are all built previously
|
||||
# They provide either a .deb or .whl
|
||||
|
||||
ARG JBIG2ENC_VERSION
|
||||
ARG QPDF_VERSION
|
||||
ARG PIKEPDF_VERSION
|
||||
ARG PSYCOPG2_VERSION
|
||||
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/jbig2enc:${JBIG2ENC_VERSION} as jbig2enc-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:${QPDF_VERSION} as qpdf-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/pikepdf:${PIKEPDF_VERSION} as pikepdf-builder
|
||||
FROM ghcr.io/paperless-ngx/paperless-ngx/builder/psycopg2:${PSYCOPG2_VERSION} as psycopg2-builder
|
||||
|
||||
FROM --platform=$BUILDPLATFORM node:16-bullseye-slim AS compile-frontend
|
||||
|
||||
# This stage compiles the frontend
|
||||
# This stage runs once for the native platform, as the outputs are not
|
||||
# dependent on target arch
|
||||
# Inputs: None
|
||||
|
||||
COPY ./src-ui /src/src-ui
|
||||
COPY . /src
|
||||
|
||||
WORKDIR /src/src-ui
|
||||
RUN set -eux \
|
||||
&& npm update npm -g \
|
||||
&& npm ci --omit=optional
|
||||
RUN set -eux \
|
||||
&& ./node_modules/.bin/ng build --configuration production
|
||||
RUN npm update npm -g && npm ci --no-optional
|
||||
RUN ./node_modules/.bin/ng build --configuration production
|
||||
|
||||
FROM python:3.9-slim-bullseye as main-app
|
||||
FROM ghcr.io/paperless-ngx/builder/ngx-base:1.0 as main-app
|
||||
|
||||
LABEL org.opencontainers.image.authors="paperless-ngx team <hello@paperless-ngx.com>"
|
||||
LABEL org.opencontainers.image.documentation="https://paperless-ngx.readthedocs.io/en/latest/"
|
||||
@@ -38,196 +14,45 @@ LABEL org.opencontainers.image.source="https://github.com/paperless-ngx/paperles
|
||||
LABEL org.opencontainers.image.url="https://github.com/paperless-ngx/paperless-ngx"
|
||||
LABEL org.opencontainers.image.licenses="GPL-3.0-only"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
#
|
||||
# Begin installation and configuration
|
||||
# Order the steps below from least often changed to most
|
||||
#
|
||||
|
||||
# copy jbig2enc
|
||||
# Basically will never change again
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/.libs/libjbig2enc* /usr/local/lib/
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/jbig2 /usr/local/bin/
|
||||
COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/*.h /usr/local/include/
|
||||
|
||||
# Packages needed for running
|
||||
ARG RUNTIME_PACKAGES="\
|
||||
curl \
|
||||
file \
|
||||
# fonts for text file thumbnail generation
|
||||
fonts-liberation \
|
||||
gettext \
|
||||
ghostscript \
|
||||
gnupg \
|
||||
gosu \
|
||||
icc-profiles-free \
|
||||
imagemagick \
|
||||
media-types \
|
||||
liblept5 \
|
||||
libpq5 \
|
||||
libxml2 \
|
||||
liblcms2-2 \
|
||||
libtiff5 \
|
||||
libxslt1.1 \
|
||||
libfreetype6 \
|
||||
libwebp6 \
|
||||
libopenjp2-7 \
|
||||
libimagequant0 \
|
||||
libraqm0 \
|
||||
libgnutls30 \
|
||||
libjpeg62-turbo \
|
||||
python3 \
|
||||
python3-pip \
|
||||
python3-setuptools \
|
||||
postgresql-client \
|
||||
mariadb-client \
|
||||
# For Numpy
|
||||
libatlas3-base \
|
||||
# OCRmyPDF dependencies
|
||||
tesseract-ocr \
|
||||
tesseract-ocr-eng \
|
||||
tesseract-ocr-deu \
|
||||
tesseract-ocr-fra \
|
||||
tesseract-ocr-ita \
|
||||
tesseract-ocr-spa \
|
||||
# Suggested for OCRmyPDF
|
||||
pngquant \
|
||||
# Suggested for pikepdf
|
||||
jbig2dec \
|
||||
tzdata \
|
||||
unpaper \
|
||||
# Mime type detection
|
||||
zlib1g \
|
||||
# Barcode splitter
|
||||
libzbar0 \
|
||||
poppler-utils"
|
||||
|
||||
# Install basic runtime packages.
|
||||
# These change very infrequently
|
||||
RUN set -eux \
|
||||
echo "Installing system packages" \
|
||||
&& apt-get update \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${RUNTIME_PACKAGES} \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& echo "Installing supervisor" \
|
||||
&& python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor==4.2.4
|
||||
|
||||
# Copy gunicorn config
|
||||
# Changes very infrequently
|
||||
WORKDIR /usr/src/paperless/
|
||||
|
||||
COPY gunicorn.conf.py .
|
||||
|
||||
# setup docker-specific things
|
||||
# Use mounts to avoid copying installer files into the image
|
||||
# These change sometimes, but rarely
|
||||
WORKDIR /usr/src/paperless/src/docker/
|
||||
|
||||
COPY [ \
|
||||
"docker/imagemagick-policy.xml", \
|
||||
"docker/supervisord.conf", \
|
||||
"docker/docker-entrypoint.sh", \
|
||||
"docker/docker-prepare.sh", \
|
||||
"docker/paperless_cmd.sh", \
|
||||
"docker/wait-for-redis.py", \
|
||||
"docker/management_script.sh", \
|
||||
"docker/install_management_commands.sh", \
|
||||
"/usr/src/paperless/src/docker/" \
|
||||
]
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Configuring ImageMagick" \
|
||||
&& mv imagemagick-policy.xml /etc/ImageMagick-6/policy.xml \
|
||||
&& echo "Configuring supervisord" \
|
||||
&& mkdir /var/log/supervisord /var/run/supervisord \
|
||||
&& mv supervisord.conf /etc/supervisord.conf \
|
||||
&& echo "Setting up Docker scripts" \
|
||||
&& mv docker-entrypoint.sh /sbin/docker-entrypoint.sh \
|
||||
&& chmod 755 /sbin/docker-entrypoint.sh \
|
||||
&& mv docker-prepare.sh /sbin/docker-prepare.sh \
|
||||
&& chmod 755 /sbin/docker-prepare.sh \
|
||||
&& mv wait-for-redis.py /sbin/wait-for-redis.py \
|
||||
&& chmod 755 /sbin/wait-for-redis.py \
|
||||
&& mv paperless_cmd.sh /usr/local/bin/paperless_cmd.sh \
|
||||
&& chmod 755 /usr/local/bin/paperless_cmd.sh \
|
||||
&& echo "Installing managment commands" \
|
||||
&& chmod +x install_management_commands.sh \
|
||||
&& ./install_management_commands.sh
|
||||
|
||||
# Install the built packages from the installer library images
|
||||
# Use mounts to avoid copying installer files into the image
|
||||
# These change sometimes
|
||||
RUN --mount=type=bind,from=qpdf-builder,target=/qpdf \
|
||||
--mount=type=bind,from=psycopg2-builder,target=/psycopg2 \
|
||||
--mount=type=bind,from=pikepdf-builder,target=/pikepdf \
|
||||
set -eux \
|
||||
&& echo "Installing qpdf" \
|
||||
&& apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/libqpdf28_*.deb \
|
||||
&& apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/qpdf_*.deb \
|
||||
&& echo "Installing pikepdf and dependencies" \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/pyparsing*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/packaging*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/lxml*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/Pillow*.whl \
|
||||
&& python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/pikepdf*.whl \
|
||||
&& python3 -m pip list \
|
||||
&& echo "Installing psycopg2" \
|
||||
&& python3 -m pip install --no-cache-dir /psycopg2/usr/src/wheels/psycopg2*.whl \
|
||||
&& python3 -m pip list
|
||||
|
||||
WORKDIR /usr/src/paperless/src/
|
||||
|
||||
COPY requirements.txt ../
|
||||
|
||||
# Python dependencies
|
||||
# Change pretty frequently
|
||||
COPY Pipfile* ./
|
||||
RUN apt-get update \
|
||||
# python-Levenshtein still needs to be compiled here
|
||||
&& apt-get -y --no-install-recommends install \
|
||||
build-essential \
|
||||
&& python3 -m pip install --upgrade --no-cache-dir pip wheel \
|
||||
&& python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor \
|
||||
&& python3 -m pip install --default-timeout=1000 --no-cache-dir -r ../requirements.txt \
|
||||
&& apt-get -y purge build-essential \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
|
||||
# Packages needed only for building a few quick Python
|
||||
# dependencies
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
git \
|
||||
default-libmysqlclient-dev \
|
||||
python3-dev"
|
||||
# setup docker-specific things
|
||||
COPY docker/ ./docker/
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build system packages" \
|
||||
&& apt-get update \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade wheel \
|
||||
&& echo "Installing pipenv" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade pipenv \
|
||||
&& echo "Installing Python requirements" \
|
||||
# pipenv tries to be too fancy and prints so much junk
|
||||
&& pipenv requirements > requirements.txt \
|
||||
&& python3 -m pip install --default-timeout=1000 --no-cache-dir --requirement requirements.txt \
|
||||
&& rm requirements.txt \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& apt-get clean --yes \
|
||||
# Remove pipenv and its unique packages
|
||||
&& python3 -m pip uninstall --yes \
|
||||
pipenv \
|
||||
distlib \
|
||||
platformdirs \
|
||||
virtualenv \
|
||||
virtualenv-clone \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& rm -rf /tmp/* \
|
||||
&& rm -rf /var/tmp/* \
|
||||
&& rm -rf /var/cache/apt/archives/* \
|
||||
&& truncate -s 0 /var/log/*log
|
||||
RUN cd docker \
|
||||
&& cp imagemagick-policy.xml /etc/ImageMagick-6/policy.xml \
|
||||
&& mkdir /var/log/supervisord /var/run/supervisord \
|
||||
&& cp supervisord.conf /etc/supervisord.conf \
|
||||
&& cp docker-entrypoint.sh /sbin/docker-entrypoint.sh \
|
||||
&& chmod 755 /sbin/docker-entrypoint.sh \
|
||||
&& cp docker-prepare.sh /sbin/docker-prepare.sh \
|
||||
&& chmod 755 /sbin/docker-prepare.sh \
|
||||
&& chmod +x install_management_commands.sh \
|
||||
&& ./install_management_commands.sh \
|
||||
&& cd .. \
|
||||
&& rm -rf docker/
|
||||
|
||||
# copy backend
|
||||
COPY ./src ./
|
||||
COPY gunicorn.conf.py ../
|
||||
|
||||
# copy frontend
|
||||
COPY --from=compile-frontend /src/src/documents/static/frontend/ ./documents/static/frontend/
|
||||
# copy app
|
||||
COPY --from=compile-frontend /src/src/ ./
|
||||
|
||||
# add users, setup scripts
|
||||
RUN set -eux \
|
||||
&& addgroup --gid 1000 paperless \
|
||||
RUN addgroup --gid 1000 paperless \
|
||||
&& useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
|
||||
&& chown -R paperless:paperless ../ \
|
||||
&& gosu paperless python3 manage.py collectstatic --clear --no-input \
|
||||
@@ -242,4 +67,4 @@ ENTRYPOINT ["/sbin/docker-entrypoint.sh"]
|
||||
|
||||
EXPOSE 8000
|
||||
|
||||
CMD ["/usr/local/bin/paperless_cmd.sh"]
|
||||
CMD ["/usr/local/bin/supervisord", "-c", "/etc/supervisord.conf"]
|
||||
|
31
Pipfile
@@ -13,8 +13,8 @@ dateparser = "~=1.1"
|
||||
django = "~=4.0"
|
||||
django-cors-headers = "*"
|
||||
django-extensions = "*"
|
||||
django-filter = "~=22.1"
|
||||
django-q = {editable = true, ref = "paperless-main", git = "https://github.com/paperless-ngx/django-q.git"}
|
||||
django-filter = "~=21.1"
|
||||
django-q = "~=1.3"
|
||||
djangorestframework = "~=3.13"
|
||||
filelock = "*"
|
||||
fuzzywuzzy = {extras = ["speedup"], version = "*"}
|
||||
@@ -22,22 +22,23 @@ gunicorn = "*"
|
||||
imap-tools = "*"
|
||||
langdetect = "*"
|
||||
pathvalidate = "*"
|
||||
pillow = "~=9.2"
|
||||
pikepdf = "~=5.6"
|
||||
pillow = "~=9.0"
|
||||
# Any version update to pikepdf requires a base image update
|
||||
pikepdf = "~=5.1"
|
||||
python-gnupg = "*"
|
||||
python-dotenv = "*"
|
||||
python-dateutil = "*"
|
||||
python-magic = "*"
|
||||
# Any version update to psycopg2 requires a base image update
|
||||
psycopg2 = "*"
|
||||
redis = "*"
|
||||
scikit-learn = "~=1.1"
|
||||
# Pin this until piwheels is building 1.9 (see https://www.piwheels.org/project/scipy/)
|
||||
scipy = "==1.8.1"
|
||||
whitenoise = "~=6.2"
|
||||
watchdog = "~=2.1"
|
||||
whoosh="~=2.7"
|
||||
# Pinned because aarch64 wheels and updates cause warnings when loading the classifier model.
|
||||
scikit-learn="==1.0.2"
|
||||
whitenoise = "~=6.0.0"
|
||||
watchdog = "~=2.1.0"
|
||||
whoosh="~=2.7.4"
|
||||
inotifyrecursive = "~=0.3"
|
||||
ocrmypdf = "~=13.7"
|
||||
ocrmypdf = "~=13.4"
|
||||
tqdm = "*"
|
||||
tika = "*"
|
||||
# TODO: This will sadly also install daphne+dependencies,
|
||||
@@ -50,10 +51,6 @@ concurrent-log-handler = "*"
|
||||
"backports.zoneinfo" = {version = "*", markers = "python_version < '3.9'"}
|
||||
"importlib-resources" = {version = "*", markers = "python_version < '3.9'"}
|
||||
zipp = {version = "*", markers = "python_version < '3.9'"}
|
||||
pyzbar = "*"
|
||||
pdf2image = "*"
|
||||
mysqlclient = "*"
|
||||
setproctitle = "*"
|
||||
|
||||
[dev-packages]
|
||||
coveralls = "*"
|
||||
@@ -65,10 +62,8 @@ pytest-django = "*"
|
||||
pytest-env = "*"
|
||||
pytest-sugar = "*"
|
||||
pytest-xdist = "*"
|
||||
sphinx = "~=5.1"
|
||||
sphinx = "~=4.4.0"
|
||||
sphinx_rtd_theme = "*"
|
||||
tox = "*"
|
||||
black = "*"
|
||||
pre-commit = "*"
|
||||
sphinx-autobuild = "*"
|
||||
myst-parser = "*"
|
||||
|
1827
Pipfile.lock
generated
@@ -102,7 +102,7 @@ For bugs please [open an issue](https://github.com/paperless-ngx/paperless-ngx/i
|
||||
|
||||
Paperless has been around a while now, and people are starting to build stuff on top of it. If you're one of those people, we can add your project to this list:
|
||||
|
||||
- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ng.
|
||||
- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ngx.
|
||||
- [Paperless Share](https://github.com/qcasey/paperless_share). Share any files from your Android application with paperless. Very simple, but works with all of the mobile scanning apps out there that allow you to share scanned documents.
|
||||
- [Scan to Paperless](https://github.com/sbrunner/scan-to-paperless): Scan and prepare (crop, deskew, OCR, ...) your documents for Paperless.
|
||||
|
||||
|
@@ -1,43 +0,0 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
# Helper script for building the Docker image locally.
|
||||
# Parses and provides the necessary versions of other images to Docker
|
||||
# before passing in the rest of script args.
|
||||
|
||||
# First Argument: The Dockerfile to build
|
||||
# Other Arguments: Additional arguments to docker build
|
||||
|
||||
# Example Usage:
|
||||
# ./build-docker-image.sh Dockerfile -t paperless-ngx:my-awesome-feature
|
||||
|
||||
set -eux
|
||||
|
||||
if ! command -v jq; then
|
||||
echo "jq required"
|
||||
exit 1
|
||||
elif [ ! -f "$1" ]; then
|
||||
echo "$1 is not a file, please provide the Dockerfile"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Parse what we can from Pipfile.lock
|
||||
pikepdf_version=$(jq ".default.pikepdf.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
|
||||
psycopg2_version=$(jq ".default.psycopg2.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
|
||||
# Read this from the other config file
|
||||
qpdf_version=$(jq ".qpdf.version" .build-config.json | sed 's/"//g')
|
||||
jbig2enc_version=$(jq ".jbig2enc.version" .build-config.json | sed 's/"//g')
|
||||
# Get the branch name (used for caching)
|
||||
branch_name=$(git rev-parse --abbrev-ref HEAD)
|
||||
|
||||
# https://docs.docker.com/develop/develop-images/build_enhancements/
|
||||
# Required to use cache-from
|
||||
export DOCKER_BUILDKIT=1
|
||||
|
||||
docker build --file "$1" \
|
||||
--progress=plain \
|
||||
--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:"${branch_name}" \
|
||||
--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:dev \
|
||||
--build-arg JBIG2ENC_VERSION="${jbig2enc_version}" \
|
||||
--build-arg QPDF_VERSION="${qpdf_version}" \
|
||||
--build-arg PIKEPDF_VERSION="${pikepdf_version}" \
|
||||
--build-arg PSYCOPG2_VERSION="${psycopg2_version}" "${@:2}" .
|
@@ -1,14 +0,0 @@
|
||||
# This Dockerfile compiles the frontend
|
||||
# Inputs: None
|
||||
|
||||
FROM node:16-bullseye-slim AS compile-frontend
|
||||
|
||||
COPY ./src /src/src
|
||||
COPY ./src-ui /src/src-ui
|
||||
|
||||
WORKDIR /src/src-ui
|
||||
RUN set -eux \
|
||||
&& npm update npm -g \
|
||||
&& npm ci --omit=optional
|
||||
RUN set -eux \
|
||||
&& ./node_modules/.bin/ng build --configuration production
|
@@ -1,35 +0,0 @@
|
||||
# This Dockerfile compiles the jbig2enc library
|
||||
# Inputs:
|
||||
# - JBIG2ENC_VERSION - the Git tag to checkout and build
|
||||
|
||||
FROM debian:bullseye-slim as main
|
||||
|
||||
LABEL org.opencontainers.image.description="A intermediate image with jbig2enc built"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
ARG JBIG2ENC_VERSION
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
automake \
|
||||
libtool \
|
||||
libleptonica-dev \
|
||||
zlib1g-dev \
|
||||
git \
|
||||
ca-certificates"
|
||||
|
||||
WORKDIR /usr/src/jbig2enc
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& echo "Building jbig2enc" \
|
||||
&& git clone --quiet --branch $JBIG2ENC_VERSION https://github.com/agl/jbig2enc . \
|
||||
&& ./autogen.sh \
|
||||
&& ./configure \
|
||||
&& make \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
@@ -1,88 +0,0 @@
|
||||
# This Dockerfile builds the pikepdf wheel
|
||||
# Inputs:
|
||||
# - REPO - Docker repository to pull qpdf from
|
||||
# - QPDF_VERSION - The image qpdf version to copy .deb files from
|
||||
# - PIKEPDF_VERSION - Version of pikepdf to build wheel for
|
||||
|
||||
# Default to pulling from the main repo registry when manually building
|
||||
ARG REPO="paperless-ngx/paperless-ngx"
|
||||
|
||||
ARG QPDF_VERSION
|
||||
FROM ghcr.io/${REPO}/builder/qpdf:${QPDF_VERSION} as qpdf-builder
|
||||
|
||||
# This does nothing, except provide a name for a copy below
|
||||
|
||||
FROM python:3.9-slim-bullseye as main
|
||||
|
||||
LABEL org.opencontainers.image.description="A intermediate image with pikepdf wheel built"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
ARG PIKEPDF_VERSION
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
python3-dev \
|
||||
python3-pip \
|
||||
# qpdf requirement - https://github.com/qpdf/qpdf#crypto-providers
|
||||
libgnutls28-dev \
|
||||
# lxml requirements - https://lxml.de/installation.html
|
||||
libxml2-dev \
|
||||
libxslt1-dev \
|
||||
# Pillow requirements - https://pillow.readthedocs.io/en/stable/installation.html#external-libraries
|
||||
# JPEG functionality
|
||||
libjpeg62-turbo-dev \
|
||||
# compressed PNG
|
||||
zlib1g-dev \
|
||||
# compressed TIFF
|
||||
libtiff-dev \
|
||||
# type related services
|
||||
libfreetype-dev \
|
||||
# color management
|
||||
liblcms2-dev \
|
||||
# WebP format
|
||||
libwebp-dev \
|
||||
# JPEG 2000
|
||||
libopenjp2-7-dev \
|
||||
# improved color quantization
|
||||
libimagequant-dev \
|
||||
# complex text layout support
|
||||
libraqm-dev"
|
||||
|
||||
WORKDIR /usr/src
|
||||
|
||||
COPY --from=qpdf-builder /usr/src/qpdf/*.deb ./
|
||||
|
||||
# As this is a base image for a multi-stage final image
|
||||
# the added size of the install is basically irrelevant
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& echo "Installing qpdf" \
|
||||
&& dpkg --install libqpdf28_*.deb \
|
||||
&& dpkg --install libqpdf-dev_*.deb \
|
||||
&& echo "Installing Python tools" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade \
|
||||
pip \
|
||||
wheel \
|
||||
# https://pikepdf.readthedocs.io/en/latest/installation.html#requirements
|
||||
pybind11 \
|
||||
&& echo "Building pikepdf wheel ${PIKEPDF_VERSION}" \
|
||||
&& mkdir wheels \
|
||||
&& python3 -m pip wheel \
|
||||
# Build the package at the required version
|
||||
pikepdf==${PIKEPDF_VERSION} \
|
||||
# Output the *.whl into this directory
|
||||
--wheel-dir wheels \
|
||||
# Do not use a binary package for the package being built
|
||||
--no-binary=pikepdf \
|
||||
# Do use binary packages for dependencies
|
||||
--prefer-binary \
|
||||
# Don't cache build files
|
||||
--no-cache-dir \
|
||||
&& ls -ahl wheels \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
@@ -1,48 +0,0 @@
|
||||
# This Dockerfile builds the psycopg2 wheel
|
||||
# Inputs:
|
||||
# - PSYCOPG2_VERSION - Version to build
|
||||
|
||||
FROM python:3.9-slim-bullseye as main
|
||||
|
||||
LABEL org.opencontainers.image.description="A intermediate image with psycopg2 wheel built"
|
||||
|
||||
ARG PSYCOPG2_VERSION
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
python3-dev \
|
||||
python3-pip \
|
||||
# https://www.psycopg.org/docs/install.html#prerequisites
|
||||
libpq-dev"
|
||||
|
||||
WORKDIR /usr/src
|
||||
|
||||
# As this is a base image for a multi-stage final image
|
||||
# the added size of the install is basically irrelevant
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
|
||||
&& echo "Installing Python tools" \
|
||||
&& python3 -m pip install --no-cache-dir --upgrade pip wheel \
|
||||
&& echo "Building psycopg2 wheel ${PSYCOPG2_VERSION}" \
|
||||
&& cd /usr/src \
|
||||
&& mkdir wheels \
|
||||
&& python3 -m pip wheel \
|
||||
# Build the package at the required version
|
||||
psycopg2==${PSYCOPG2_VERSION} \
|
||||
# Output the *.whl into this directory
|
||||
--wheel-dir wheels \
|
||||
# Do not use a binary package for the package being built
|
||||
--no-binary=psycopg2 \
|
||||
# Do use binary packages for dependencies
|
||||
--prefer-binary \
|
||||
# Don't cache build files
|
||||
--no-cache-dir \
|
||||
&& ls -ahl wheels/ \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
@@ -1,52 +0,0 @@
|
||||
# This Dockerfile builds the qpdf .deb packages
|
||||
# Inputs:
|
||||
# - QPDF_VERSION - the version of qpdf to build a .deb.
|
||||
# Must be present as a deb-src
|
||||
|
||||
FROM debian:bullseye-slim as main
|
||||
|
||||
LABEL org.opencontainers.image.description="A intermediate image with qpdf built"
|
||||
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
# This must match to pikepdf's minimum at least
|
||||
ARG QPDF_VERSION
|
||||
|
||||
ARG BUILD_PACKAGES="\
|
||||
build-essential \
|
||||
debhelper \
|
||||
debian-keyring \
|
||||
devscripts \
|
||||
equivs \
|
||||
libtool \
|
||||
# https://qpdf.readthedocs.io/en/stable/installation.html#system-requirements
|
||||
libjpeg62-turbo-dev \
|
||||
libgnutls28-dev \
|
||||
packaging-dev \
|
||||
zlib1g-dev"
|
||||
|
||||
WORKDIR /usr/src
|
||||
|
||||
# As this is a base image for a multi-stage final image
|
||||
# the added size of the install is basically irrelevant
|
||||
|
||||
RUN set -eux \
|
||||
&& echo "Installing build tools" \
|
||||
&& apt-get update --quiet \
|
||||
&& apt-get install --yes --quiet --no-install-recommends $BUILD_PACKAGES \
|
||||
&& echo "Building qpdf" \
|
||||
&& echo "deb-src http://deb.debian.org/debian/ bookworm main" > /etc/apt/sources.list.d/bookworm-src.list \
|
||||
&& apt-get update \
|
||||
&& mkdir qpdf \
|
||||
&& cd qpdf \
|
||||
&& apt-get source --yes --quiet qpdf=${QPDF_VERSION}-1/bookworm \
|
||||
&& cd qpdf-$QPDF_VERSION \
|
||||
# We don't need to build the tests (also don't run them)
|
||||
&& rm -rf libtests \
|
||||
&& DEBEMAIL=hello@paperless-ngx.com debchange --bpo \
|
||||
&& export DEB_BUILD_OPTIONS="terse nocheck nodoc parallel=2" \
|
||||
&& dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean \
|
||||
&& ls -ahl ../*.deb \
|
||||
&& echo "Cleaning up image" \
|
||||
&& apt-get -y purge ${BUILD_PACKAGES} \
|
||||
&& apt-get -y autoremove --purge \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
@@ -22,10 +22,6 @@
|
||||
# Docker setup does not use the configuration file.
|
||||
# A few commonly adjusted settings are provided below.
|
||||
|
||||
# This is required if you will be exposing Paperless-ngx on a public domain
|
||||
# (if doing so please consider security measures such as reverse proxy)
|
||||
#PAPERLESS_URL=https://paperless.example.com
|
||||
|
||||
# Adjust this key if you plan to make paperless available publicly. It should
|
||||
# be a very long sequence of random characters. You don't need to remember it.
|
||||
#PAPERLESS_SECRET_KEY=change-me
|
||||
@@ -36,7 +32,3 @@
|
||||
# The default language to use for OCR. Set this to the language most of your
|
||||
# documents are written in.
|
||||
#PAPERLESS_OCR_LANGUAGE=eng
|
||||
|
||||
# Set if accessing paperless via a domain subpath e.g. https://domain.com/PATHPREFIX and using a reverse-proxy like traefik or nginx
|
||||
#PAPERLESS_FORCE_SCRIPT_NAME=/PATHPREFIX
|
||||
#PAPERLESS_STATIC_URL=/PATHPREFIX/static/ # trailing slash required
|
||||
|
@@ -1,83 +0,0 @@
|
||||
# docker-compose file for running paperless from the Docker Hub.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
#
|
||||
# All compose files of paperless configure paperless in the following way:
|
||||
#
|
||||
# - Paperless is (re)started on system boot, if it was running before shutdown.
|
||||
# - Docker volumes for storing data are managed by Docker.
|
||||
# - Folders for importing and exporting files are created in the same directory
|
||||
# as this file and mounted to the correct folders inside the container.
|
||||
# - Paperless listens on port 8000.
|
||||
#
|
||||
# In addition to that, this docker-compose file adds the following optional
|
||||
# configurations:
|
||||
#
|
||||
# - Instead of SQLite (default), MariaDB is used as the database server.
|
||||
#
|
||||
# To install and update paperless with this file, do the following:
|
||||
#
|
||||
# - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
|
||||
# and '.env' into a folder.
|
||||
# - Run 'docker-compose pull'.
|
||||
# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
|
||||
# - Run 'docker-compose up -d'.
|
||||
#
|
||||
# For more extensive installation and update instructions, refer to the
|
||||
# documentation.
|
||||
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/mariadb:10
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- dbdata:/var/lib/mysql
|
||||
environment:
|
||||
MARIADB_HOST: paperless
|
||||
MARIADB_DATABASE: paperless
|
||||
MARIADB_USER: paperless
|
||||
MARIADB_PASSWORD: paperless
|
||||
MARIADB_ROOT_PASSWORD: paperless
|
||||
ports:
|
||||
- "3306:3306"
|
||||
|
||||
webserver:
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
- broker
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
volumes:
|
||||
- data:/usr/src/paperless/data
|
||||
- media:/usr/src/paperless/media
|
||||
- ./export:/usr/src/paperless/export
|
||||
- ./consume:/usr/src/paperless/consume
|
||||
env_file: docker-compose.env
|
||||
environment:
|
||||
PAPERLESS_REDIS: redis://broker:6379
|
||||
PAPERLESS_DBENGINE: mariadb
|
||||
PAPERLESS_DBHOST: db
|
||||
PAPERLESS_DBUSER: paperless
|
||||
PAPERLESS_DBPASSWORD: paperless
|
||||
PAPERLESS_DBPORT: 3306
|
||||
|
||||
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
dbdata:
|
||||
redisdata:
|
@@ -31,13 +31,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
image: redis:6.0
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/postgres:13
|
||||
image: postgres:13
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
@@ -55,7 +55,7 @@ services:
|
||||
ports:
|
||||
- 8010:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
|
@@ -1,6 +1,7 @@
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
# Paperless supports amd64, arm and arm64 hardware. The apache/tika image
|
||||
# does not support arm or arm64, however.
|
||||
#
|
||||
# All compose files of paperless configure paperless in the following way:
|
||||
#
|
||||
@@ -33,13 +34,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
image: redis:6.0
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/postgres:13
|
||||
image: postgres:13
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
@@ -59,7 +60,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -77,14 +78,14 @@ services:
|
||||
PAPERLESS_TIKA_ENDPOINT: http://tika:9998
|
||||
|
||||
gotenberg:
|
||||
image: docker.io/gotenberg/gotenberg:7.4
|
||||
image: gotenberg/gotenberg:7
|
||||
restart: unless-stopped
|
||||
command:
|
||||
- "gotenberg"
|
||||
- "--chromium-disable-routes=true"
|
||||
|
||||
tika:
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
image: apache/tika
|
||||
restart: unless-stopped
|
||||
|
||||
volumes:
|
||||
|
@@ -29,13 +29,13 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
image: redis:6.0
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/postgres:13
|
||||
image: postgres:13
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
@@ -53,7 +53,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
|
@@ -1,4 +1,4 @@
|
||||
# docker-compose file for running paperless from the Docker Hub.
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
#
|
||||
@@ -10,10 +10,14 @@
|
||||
# as this file and mounted to the correct folders inside the container.
|
||||
# - Paperless listens on port 8000.
|
||||
#
|
||||
# SQLite is used as the database. The SQLite file is stored in the data volume.
|
||||
#
|
||||
# iwishiwasaneagle/apache-tika-arm docker image is used to enable arm64 arch
|
||||
# which apache/tika does not currently support.
|
||||
#
|
||||
# In addition to that, this docker-compose file adds the following optional
|
||||
# configurations:
|
||||
#
|
||||
# - Instead of SQLite (default), MariaDB is used as the database server.
|
||||
# - Apache Tika and Gotenberg servers are started with paperless and paperless
|
||||
# is configured to use these services. These provide support for consuming
|
||||
# Office documents (Word, Excel, Power Point and their LibreOffice counter-
|
||||
@@ -33,30 +37,15 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
image: redis:6.0
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
|
||||
db:
|
||||
image: docker.io/library/mariadb:10
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- dbdata:/var/lib/mysql
|
||||
environment:
|
||||
MARIADB_HOST: paperless
|
||||
MARIADB_DATABASE: paperless
|
||||
MARIADB_USER: paperless
|
||||
MARIADB_PASSWORD: paperless
|
||||
MARIADB_ROOT_PASSWORD: paperless
|
||||
ports:
|
||||
- "3306:3306"
|
||||
|
||||
webserver:
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
restart: unless-stopped
|
||||
depends_on:
|
||||
- db
|
||||
- broker
|
||||
- gotenberg
|
||||
- tika
|
||||
@@ -75,27 +64,21 @@ services:
|
||||
env_file: docker-compose.env
|
||||
environment:
|
||||
PAPERLESS_REDIS: redis://broker:6379
|
||||
PAPERLESS_DBENGINE: mariadb
|
||||
PAPERLESS_DBHOST: db
|
||||
PAPERLESS_DBUSER: paperless
|
||||
PAPERLESS_DBPASSWORD: paperless
|
||||
PAPERLESS_DBPORT: 3306
|
||||
PAPERLESS_TIKA_ENABLED: 1
|
||||
PAPERLESS_TIKA_GOTENBERG_ENDPOINT: http://gotenberg:3000
|
||||
PAPERLESS_TIKA_ENDPOINT: http://tika:9998
|
||||
|
||||
gotenberg:
|
||||
image: docker.io/gotenberg/gotenberg:7.4
|
||||
image: thecodingmachine/gotenberg
|
||||
restart: unless-stopped
|
||||
environment:
|
||||
CHROMIUM_DISABLE_ROUTES: 1
|
||||
DISABLE_GOOGLE_CHROME: 1
|
||||
|
||||
tika:
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
image: iwishiwasaneagle/apache-tika-arm@sha256:a78c25ffe57ecb1a194b2859d42a61af46e9e845191512b8f1a4bf90578ffdfd
|
||||
restart: unless-stopped
|
||||
|
||||
volumes:
|
||||
data:
|
||||
media:
|
||||
dbdata:
|
||||
redisdata:
|
@@ -1,6 +1,8 @@
|
||||
# docker-compose file for running paperless from the docker container registry.
|
||||
# This file contains everything paperless needs to run.
|
||||
# Paperless supports amd64, arm and arm64 hardware.
|
||||
# Paperless supports amd64, arm and arm64 hardware. The apache/tika image
|
||||
# does not support arm or arm64, however.
|
||||
#
|
||||
# All compose files of paperless configure paperless in the following way:
|
||||
#
|
||||
# - Paperless is (re)started on system boot, if it was running before shutdown.
|
||||
@@ -33,7 +35,7 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
image: redis:6.0
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
@@ -48,7 +50,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
@@ -65,14 +67,14 @@ services:
|
||||
PAPERLESS_TIKA_ENDPOINT: http://tika:9998
|
||||
|
||||
gotenberg:
|
||||
image: docker.io/gotenberg/gotenberg:7.4
|
||||
image: gotenberg/gotenberg:7
|
||||
restart: unless-stopped
|
||||
command:
|
||||
- "gotenberg"
|
||||
- "--chromium-disable-routes=true"
|
||||
|
||||
tika:
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
image: apache/tika
|
||||
restart: unless-stopped
|
||||
|
||||
volumes:
|
||||
|
@@ -26,7 +26,7 @@
|
||||
version: "3.4"
|
||||
services:
|
||||
broker:
|
||||
image: docker.io/library/redis:7
|
||||
image: redis:6.0
|
||||
restart: unless-stopped
|
||||
volumes:
|
||||
- redisdata:/data
|
||||
@@ -39,7 +39,7 @@ services:
|
||||
ports:
|
||||
- 8000:8000
|
||||
healthcheck:
|
||||
test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
|
||||
test: ["CMD", "curl", "-f", "http://localhost:8000"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
|
@@ -2,37 +2,6 @@
|
||||
|
||||
set -e
|
||||
|
||||
# Adapted from:
|
||||
# https://github.com/docker-library/postgres/blob/master/docker-entrypoint.sh
|
||||
# usage: file_env VAR
|
||||
# ie: file_env 'XYZ_DB_PASSWORD' will allow for "$XYZ_DB_PASSWORD_FILE" to
|
||||
# fill in the value of "$XYZ_DB_PASSWORD" from a file, especially for Docker's
|
||||
# secrets feature
|
||||
file_env() {
|
||||
local var="$1"
|
||||
local fileVar="${var}_FILE"
|
||||
|
||||
# Basic validation
|
||||
if [ "${!var:-}" ] && [ "${!fileVar:-}" ]; then
|
||||
echo >&2 "error: both $var and $fileVar are set (but are exclusive)"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Only export var if the _FILE exists
|
||||
if [ "${!fileVar:-}" ]; then
|
||||
# And the file exists
|
||||
if [[ -f ${!fileVar} ]]; then
|
||||
echo "Setting ${var} from file"
|
||||
val="$(< "${!fileVar}")"
|
||||
export "$var"="$val"
|
||||
else
|
||||
echo "File ${!fileVar} doesn't exist"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
}
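The `file_env` helper above implements the usual Docker-secrets convention: for a variable `XYZ`, a companion `XYZ_FILE` may point at a file whose contents become the value of `XYZ`. The same convention, rendered in Python purely for illustration (not code from the repository):

```python
import os

def file_env(var: str) -> None:
    """Load VAR from the file named by VAR_FILE, Docker-secrets style."""
    file_var = f"{var}_FILE"
    if os.environ.get(var) and os.environ.get(file_var):
        raise SystemExit(f"error: both {var} and {file_var} are set (but are exclusive)")
    path = os.environ.get(file_var)
    if path:
        with open(path) as f:  # fail loudly if the file does not exist
            os.environ[var] = f.read().rstrip("\n")

for name in ("PAPERLESS_DBUSER", "PAPERLESS_DBPASS", "PAPERLESS_SECRET_KEY"):
    file_env(name)
```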
|
||||
|
||||
# Source: https://github.com/sameersbn/docker-gitlab/
|
||||
map_uidgid() {
|
||||
USERMAP_ORIG_UID=$(id -u paperless)
|
||||
@@ -46,57 +15,26 @@ map_uidgid() {
|
||||
fi
|
||||
}
|
||||
|
||||
map_folders() {
|
||||
# Export these so they can be used in docker-prepare.sh
|
||||
export DATA_DIR="${PAPERLESS_DATA_DIR:-/usr/src/paperless/data}"
|
||||
export MEDIA_ROOT_DIR="${PAPERLESS_MEDIA_ROOT:-/usr/src/paperless/media}"
|
||||
}
|
||||
|
||||
initialize() {
|
||||
|
||||
# Setup environment from secrets before anything else
|
||||
for env_var in \
|
||||
PAPERLESS_DBUSER \
|
||||
PAPERLESS_DBPASS \
|
||||
PAPERLESS_SECRET_KEY \
|
||||
PAPERLESS_AUTO_LOGIN_USERNAME \
|
||||
PAPERLESS_ADMIN_USER \
|
||||
PAPERLESS_ADMIN_MAIL \
|
||||
PAPERLESS_ADMIN_PASSWORD \
|
||||
PAPERLESS_REDIS; do
|
||||
# Check for a version of this var with _FILE appended
|
||||
# and convert the contents to the env var value
|
||||
file_env ${env_var}
|
||||
done
|
||||
|
||||
# Change the user and group IDs if needed
|
||||
map_uidgid
|
||||
|
||||
# Check for overrides of certain folders
|
||||
map_folders
|
||||
|
||||
local export_dir="/usr/src/paperless/export"
|
||||
|
||||
for dir in "${export_dir}" "${DATA_DIR}" "${DATA_DIR}/index" "${MEDIA_ROOT_DIR}" "${MEDIA_ROOT_DIR}/documents" "${MEDIA_ROOT_DIR}/documents/originals" "${MEDIA_ROOT_DIR}/documents/thumbnails"; do
|
||||
if [[ ! -d "${dir}" ]]; then
|
||||
echo "Creating directory ${dir}"
|
||||
mkdir "${dir}"
|
||||
for dir in export data data/index media media/documents media/documents/originals media/documents/thumbnails; do
|
||||
if [[ ! -d "../$dir" ]]; then
|
||||
echo "Creating directory ../$dir"
|
||||
mkdir ../$dir
|
||||
fi
|
||||
done
|
||||
|
||||
local tmp_dir="/tmp/paperless"
|
||||
echo "Creating directory ${tmp_dir}"
|
||||
mkdir -p "${tmp_dir}"
|
||||
echo "Creating directory /tmp/paperless"
|
||||
mkdir -p /tmp/paperless
|
||||
|
||||
set +e
|
||||
echo "Adjusting permissions of paperless files. This may take a while."
|
||||
chown -R paperless:paperless ${tmp_dir}
|
||||
for dir in "${export_dir}" "${DATA_DIR}" "${MEDIA_ROOT_DIR}"; do
|
||||
find "${dir}" -not \( -user paperless -and -group paperless \) -exec chown paperless:paperless {} +
|
||||
done
|
||||
chown -R paperless:paperless /tmp/paperless
|
||||
find .. -not \( -user paperless -and -group paperless \) -exec chown paperless:paperless {} +
|
||||
set -e
|
||||
|
||||
"${gosu_cmd[@]}" /sbin/docker-prepare.sh
|
||||
gosu paperless /sbin/docker-prepare.sh
|
||||
}
|
||||
|
||||
install_languages() {
|
||||
@@ -138,11 +76,6 @@ install_languages() {
|
||||
|
||||
echo "Paperless-ngx docker container starting..."
|
||||
|
||||
gosu_cmd=(gosu paperless)
|
||||
if [ "$(id -u)" == "$(id -u paperless)" ]; then
|
||||
gosu_cmd=()
|
||||
fi
|
||||
|
||||
# Install additional languages if specified
|
||||
if [[ -n "$PAPERLESS_OCR_LANGUAGES" ]]; then
|
||||
install_languages "$PAPERLESS_OCR_LANGUAGES"
|
||||
@@ -152,7 +85,7 @@ initialize
|
||||
|
||||
if [[ "$1" != "/"* ]]; then
|
||||
echo Executing management command "$@"
|
||||
exec "${gosu_cmd[@]}" python3 manage.py "$@"
|
||||
exec gosu paperless python3 manage.py "$@"
|
||||
else
|
||||
echo Executing "$@"
|
||||
exec "$@"
|
||||
|
@@ -1,43 +1,16 @@
#!/usr/bin/env bash

set -e

wait_for_postgres() {
	local attempt_num=1
	local max_attempts=5

	echo "Waiting for PostgreSQL to start..."

	local host="${PAPERLESS_DBHOST:-localhost}"
	local port="${PAPERLESS_DBPORT:-5432}"

	# Disable warning, host and port can't have spaces
	# shellcheck disable=SC2086
	while [ ! "$(pg_isready -h ${host} -p ${port})" ]; do

		if [ $attempt_num -eq $max_attempts ]; then
			echo "Unable to connect to database."
			exit 1
		else
			echo "Attempt $attempt_num failed! Trying again in 5 seconds..."

		fi

		attempt_num=$(("$attempt_num" + 1))
		sleep 5
	done
}

wait_for_mariadb() {
	echo "Waiting for MariaDB to start..."

	host="${PAPERLESS_DBHOST:=localhost}"
	port="${PAPERLESS_DBPORT:=3306}"

	attempt_num=1
	max_attempts=5

	while ! true > /dev/tcp/$host/$port; do
	echo "Waiting for PostgreSQL to start..."

	host="${PAPERLESS_DBHOST:=localhost}"
	port="${PAPERLESS_DBPORT:=5342}"


	while [ ! "$(pg_isready -h $host -p $port)" ]; do

		if [ $attempt_num -eq $max_attempts ]; then
			echo "Unable to connect to database."
@@ -52,14 +25,6 @@ wait_for_mariadb() {
	done
}

wait_for_redis() {
	# We use a Python script to send the Redis ping
	# instead of installing redis-tools just for 1 thing
	if ! python3 /sbin/wait-for-redis.py; then
		exit 1
	fi
}

migrations() {
	(
		# flock is in place to prevent multiple containers from doing migrations
@@ -68,18 +33,17 @@ migrations() {
		flock 200
		echo "Apply database migrations..."
		python3 manage.py migrate
	) 200>"${DATA_DIR}/migration_lock"
	) 200>/usr/src/paperless/data/migration_lock
}

search_index() {
	index_version=1
	index_version_file=/usr/src/paperless/data/.index_version

	local index_version=1
	local index_version_file=${DATA_DIR}/.index_version

	if [[ (! -f "${index_version_file}") || $(<"${index_version_file}") != "$index_version" ]]; then
	if [[ (! -f "$index_version_file") || $(<$index_version_file) != "$index_version" ]]; then
		echo "Search index out of date. Updating..."
		python3 manage.py document_index reindex --no-progress-bar
		echo ${index_version} | tee "${index_version_file}" >/dev/null
		python3 manage.py document_index reindex
		echo $index_version | tee $index_version_file >/dev/null
	fi
}

@@ -90,14 +54,10 @@ superuser() {
}

do_work() {
	if [[ "${PAPERLESS_DBENGINE}" == "mariadb" ]]; then
		wait_for_mariadb
	elif [[ -n "${PAPERLESS_DBHOST}" ]]; then
	if [[ -n "${PAPERLESS_DBHOST}" ]]; then
		wait_for_postgres
	fi

	wait_for_redis

	migrations

	search_index
@@ -1,19 +1,6 @@
#!/usr/bin/env bash

set -eu

for command in decrypt_documents \
	document_archiver \
	document_exporter \
	document_importer \
	mail_fetcher \
	document_create_classifier \
	document_index \
	document_renamer \
	document_retagger \
	document_thumbnails \
	document_sanity_checker \
	manage_superuser;
for command in document_archiver document_exporter document_importer mail_fetcher document_create_classifier document_index document_renamer document_retagger document_thumbnails document_sanity_checker manage_superuser;
do
	echo "installing $command..."
	sed "s/management_command/$command/g" management_script.sh > /usr/local/bin/$command
@@ -1,15 +0,0 @@
#!/usr/bin/env bash

rootless_args=()
if [ "$(id -u)" == "$(id -u paperless)" ]; then
	rootless_args=(
		--user
		paperless
		--logfile
		supervisord.log
		--pidfile
		supervisord.pid
	)
fi

/usr/local/bin/supervisord -c /etc/supervisord.conf "${rootless_args[@]}"
@@ -19,7 +19,6 @@ stderr_logfile_maxbytes=0
[program:consumer]
command=python3 manage.py document_consumer
user=paperless
stopsignal=INT

stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
@@ -29,7 +28,6 @@ stderr_logfile_maxbytes=0
[program:scheduler]
command=python3 manage.py qcluster
user=paperless
stopasgroup = true

stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
@@ -1,44 +0,0 @@
#!/usr/bin/env python3
"""
Simple script which attempts to ping the Redis broker as set in the environment for
a certain number of times, waiting a little bit in between

"""
import os
import sys
import time
from typing import Final

from redis import Redis

if __name__ == "__main__":

    MAX_RETRY_COUNT: Final[int] = 5
    RETRY_SLEEP_SECONDS: Final[int] = 5

    REDIS_URL: Final[str] = os.getenv("PAPERLESS_REDIS", "redis://localhost:6379")

    print(f"Waiting for Redis...", flush=True)

    attempt = 0
    with Redis.from_url(url=REDIS_URL) as client:
        while attempt < MAX_RETRY_COUNT:
            try:
                client.ping()
                break
            except Exception as e:
                print(
                    f"Redis ping #{attempt} failed.\n"
                    f"Error: {str(e)}.\n"
                    f"Waiting {RETRY_SLEEP_SECONDS}s",
                    flush=True,
                )
                time.sleep(RETRY_SLEEP_SECONDS)
                attempt += 1

    if attempt >= MAX_RETRY_COUNT:
        print(f"Failed to connect to redis using environment variable PAPERLESS_REDIS.")
        sys.exit(os.EX_UNAVAILABLE)
    else:
        print(f"Connected to Redis broker.")
        sys.exit(os.EX_OK)
17
docs/Dockerfile
Normal file
@@ -0,0 +1,17 @@
FROM python:3.5.1

# Install Sphinx and Pygments
RUN pip install Sphinx Pygments

# Setup directories, copy data
RUN mkdir /build
COPY . /build
WORKDIR /build/docs

# Build documentation
RUN make html

# Start webserver
WORKDIR /build/docs/_build/html
EXPOSE 8000/tcp
CMD ["python3", "-m", "http.server"]
@@ -24,7 +24,6 @@ I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo " html to make standalone HTML files"
	@echo " livehtml to preview changes with live reload in your browser"
	@echo " dirhtml to make HTML files named index.html in directories"
	@echo " singlehtml to make a single large HTML file"
	@echo " pickle to make pickle files"
@@ -55,9 +54,6 @@ html:
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

livehtml:
	sphinx-autobuild "./" "$(BUILDDIR)" $(O)

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
597
docs/_static/css/custom.css
vendored
@@ -1,597 +0,0 @@
|
||||
/* Variables */
|
||||
:root {
|
||||
--color-text-body: #5c5962;
|
||||
--color-text-body-light: #fcfcfc;
|
||||
--color-text-anchor: #7253ed;
|
||||
--color-text-alt: rgba(0, 0, 0, 0.3);
|
||||
--color-text-title: #27262b;
|
||||
--color-text-code-inline: #e74c3c;
|
||||
--color-text-code-nt: #062873;
|
||||
--color-text-selection: #b19eff;
|
||||
--color-bg-body: #fcfcfc;
|
||||
--color-bg-body-alt: #f3f6f6;
|
||||
--color-bg-side-nav: #f5f6fa;
|
||||
--color-bg-side-nav-hover: #ebedf5;
|
||||
--color-bg-code-block: var(--color-bg-side-nav);
|
||||
--color-border: #eeebee;
|
||||
--color-btn-neutral-bg: #f3f6f6;
|
||||
--color-btn-neutral-bg-hover: #e5ebeb;
|
||||
--color-success-title: #1abc9c;
|
||||
--color-success-body: #dbfaf4;
|
||||
--color-warning-title: #f0b37e;
|
||||
--color-warning-body: #ffedcc;
|
||||
--color-danger-title: #f29f97;
|
||||
--color-danger-body: #fdf3f2;
|
||||
--color-info-title: #6ab0de;
|
||||
--color-info-body: #e7f2fa;
|
||||
}
|
||||
|
||||
.dark-mode {
|
||||
--color-text-body: #abb2bf;
|
||||
--color-text-body-light: #9499a2;
|
||||
--color-text-alt: rgba(0255, 255, 255, 0.5);
|
||||
--color-text-title: var(--color-text-anchor);
|
||||
--color-text-code-inline: #abb2bf;
|
||||
--color-text-code-nt: #2063f3;
|
||||
--color-text-selection: #030303;
|
||||
--color-bg-body: #1d1d20 !important;
|
||||
--color-bg-body-alt: #131315;
|
||||
--color-bg-side-nav: #18181a;
|
||||
--color-bg-side-nav-hover: #101216;
|
||||
--color-bg-code-block: #101216;
|
||||
--color-border: #47494f;
|
||||
--color-btn-neutral-bg: #242529;
|
||||
--color-btn-neutral-bg-hover: #101216;
|
||||
--color-success-title: #02120f;
|
||||
--color-success-body: #041b17;
|
||||
--color-warning-title: #1b0e03;
|
||||
--color-warning-body: #371d06;
|
||||
--color-danger-title: #120902;
|
||||
--color-danger-body: #1b0503;
|
||||
--color-info-title: #020608;
|
||||
--color-info-body: #06141e;
|
||||
}
|
||||
|
||||
* {
|
||||
transition: background-color 0.3s ease, border-color 0.3s ease;
|
||||
}
|
||||
|
||||
/* Typography */
|
||||
body {
|
||||
font-family: system-ui,-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif;
|
||||
font-size: inherit;
|
||||
line-height: 1.4;
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.rst-content p {
|
||||
word-break: break-word;
|
||||
}
|
||||
|
||||
h1, h2, h3, h4, h5, h6 {
|
||||
font-family: inherit;
|
||||
}
|
||||
|
||||
.rst-content .toctree-wrapper>p.caption, .rst-content h1, .rst-content h2, .rst-content h3, .rst-content h4, .rst-content h5, .rst-content h6 {
|
||||
padding-top: .5em;
|
||||
}
|
||||
|
||||
p, .main-content-wrap, .rst-content .section ul, .rst-content .toctree-wrapper ul, .rst-content section ul, .wy-plain-list-disc, article ul {
|
||||
line-height: 1.6;
|
||||
}
|
||||
|
||||
pre, .code, .rst-content .linenodiv pre, .rst-content div[class^=highlight] pre, .rst-content pre.literal-block {
|
||||
font-family: "SFMono-Regular", Menlo,Consolas, Monospace;
|
||||
font-size: 0.75em;
|
||||
line-height: 1.8;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4 {
|
||||
font-size: 1rem
|
||||
}
|
||||
|
||||
.rst-versions {
|
||||
font-family: inherit;
|
||||
line-height: 1;
|
||||
}
|
||||
|
||||
footer, footer p {
|
||||
font-size: .8rem;
|
||||
}
|
||||
|
||||
footer .rst-footer-buttons {
|
||||
font-size: 1rem;
|
||||
}
|
||||
|
||||
@media (max-width: 400px) {
|
||||
/* break code lines on mobile */
|
||||
pre, code {
|
||||
word-break: break-word;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/* Layout */
|
||||
.wy-side-nav-search, .wy-menu-vertical {
|
||||
width: auto;
|
||||
}
|
||||
|
||||
.wy-nav-side {
|
||||
z-index: 0;
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
background-color: var(--color-bg-side-nav);
|
||||
}
|
||||
|
||||
.wy-side-scroll {
|
||||
width: 100%;
|
||||
overflow-y: auto;
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-side-scroll {
|
||||
width:264px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 50rem) {
|
||||
.wy-nav-side {
|
||||
flex-wrap: nowrap;
|
||||
position: fixed;
|
||||
width: 248px;
|
||||
height: 100%;
|
||||
flex-direction: column;
|
||||
border-right: 1px solid var(--color-border);
|
||||
align-items:flex-end
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-nav-side {
|
||||
width: calc((100% - 1064px) / 2 + 264px);
|
||||
min-width:264px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 50rem) {
|
||||
.wy-nav-content-wrap {
|
||||
position: relative;
|
||||
max-width: 800px;
|
||||
margin-left:248px
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 66.5rem) {
|
||||
.wy-nav-content-wrap {
|
||||
margin-left:calc((100% - 1064px) / 2 + 264px)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/* Colors */
|
||||
body.wy-body-for-nav,
|
||||
.wy-nav-content {
|
||||
background: var(--color-bg-body);
|
||||
}
|
||||
|
||||
.wy-nav-side {
|
||||
border-right: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-side-nav-search, .wy-nav-top {
|
||||
background: var(--color-bg-side-nav);
|
||||
border-bottom: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-nav-content-wrap {
|
||||
background: inherit;
|
||||
}
|
||||
|
||||
.wy-side-nav-search > a, .wy-nav-top a, .wy-nav-top i {
|
||||
color: var(--color-text-title);
|
||||
}
|
||||
|
||||
.wy-side-nav-search > a:hover, .wy-nav-top a:hover {
|
||||
background: transparent;
|
||||
}
|
||||
|
||||
.wy-side-nav-search > div.version {
|
||||
color: var(--color-text-alt)
|
||||
}
|
||||
|
||||
.wy-side-nav-search > div[role="search"] {
|
||||
border-top: 1px solid var(--color-border);
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l2.current>a, .wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,
|
||||
.wy-menu-vertical li.toctree-l3.current>a, .wy-menu-vertical li.toctree-l3.current li.toctree-l4>a {
|
||||
background: var(--color-bg-side-nav);
|
||||
}
|
||||
|
||||
.rst-content .highlighted {
|
||||
background: #eedd85;
|
||||
box-shadow: 0 0 0 2px #eedd85;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.wy-side-nav-search input[type=text],
|
||||
html.writer-html5 .rst-content table.docutils th {
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,
|
||||
.wy-table-backed,
|
||||
.wy-table-odd td,
|
||||
.wy-table-striped tr:nth-child(2n-1) td {
|
||||
background-color: var(--color-bg-body-alt);
|
||||
}
|
||||
|
||||
.rst-content table.docutils,
|
||||
.wy-table-bordered-all,
|
||||
html.writer-html5 .rst-content table.docutils th,
|
||||
.rst-content table.docutils td,
|
||||
.wy-table-bordered-all td,
|
||||
hr {
|
||||
border-color: var(--color-border) !important;
|
||||
}
|
||||
|
||||
::selection {
|
||||
background: var(--color-text-selection);
|
||||
}
|
||||
|
||||
/* Ridiculous rules are taken from sphinx_rtd */
|
||||
.rst-content .admonition-title,
|
||||
.wy-alert-title {
|
||||
color: var(--color-text-body-light);
|
||||
}
|
||||
|
||||
.rst-content .hint,
|
||||
.rst-content .important,
|
||||
.rst-content .tip,
|
||||
.rst-content .wy-alert-success,
|
||||
.wy-alert.wy-alert-success {
|
||||
background: var(--color-success-body);
|
||||
}
|
||||
|
||||
.rst-content .hint .admonition-title,
|
||||
.rst-content .hint .wy-alert-title,
|
||||
.rst-content .important .admonition-title,
|
||||
.rst-content .important .wy-alert-title,
|
||||
.rst-content .tip .admonition-title,
|
||||
.rst-content .tip .wy-alert-title,
|
||||
.rst-content .wy-alert-success .admonition-title,
|
||||
.rst-content .wy-alert-success .wy-alert-title,
|
||||
.wy-alert.wy-alert-success .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-success .wy-alert-title {
|
||||
background-color: var(--color-success-title);
|
||||
}
|
||||
|
||||
.rst-content .admonition-todo,
|
||||
.rst-content .attention,
|
||||
.rst-content .caution,
|
||||
.rst-content .warning,
|
||||
.rst-content .wy-alert-warning,
|
||||
.wy-alert.wy-alert-warning {
|
||||
background: var(--color-warning-body);
|
||||
}
|
||||
|
||||
.rst-content .admonition-todo .admonition-title,
|
||||
.rst-content .admonition-todo .wy-alert-title,
|
||||
.rst-content .attention .admonition-title,
|
||||
.rst-content .attention .wy-alert-title,
|
||||
.rst-content .caution .admonition-title,
|
||||
.rst-content .caution .wy-alert-title,
|
||||
.rst-content .warning .admonition-title,
|
||||
.rst-content .warning .wy-alert-title,
|
||||
.rst-content .wy-alert-warning .admonition-title,
|
||||
.rst-content .wy-alert-warning .wy-alert-title,
|
||||
.rst-content .wy-alert.wy-alert-warning .admonition-title,
|
||||
.wy-alert.wy-alert-warning .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-warning .wy-alert-title {
|
||||
background: var(--color-warning-title);
|
||||
}
|
||||
|
||||
.rst-content .danger,
|
||||
.rst-content .error,
|
||||
.rst-content .wy-alert-danger,
|
||||
.wy-alert.wy-alert-danger {
|
||||
background: var(--color-danger-body);
|
||||
}
|
||||
|
||||
.rst-content .danger .admonition-title,
|
||||
.rst-content .danger .wy-alert-title,
|
||||
.rst-content .error .admonition-title,
|
||||
.rst-content .error .wy-alert-title,
|
||||
.rst-content .wy-alert-danger .admonition-title,
|
||||
.rst-content .wy-alert-danger .wy-alert-title,
|
||||
.wy-alert.wy-alert-danger .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-danger .wy-alert-title {
|
||||
background: var(--color-danger-title);
|
||||
}
|
||||
|
||||
.rst-content .note,
|
||||
.rst-content .seealso,
|
||||
.rst-content .wy-alert-info,
|
||||
.wy-alert.wy-alert-info {
|
||||
background: var(--color-info-body);
|
||||
}
|
||||
|
||||
.rst-content .note .admonition-title,
|
||||
.rst-content .note .wy-alert-title,
|
||||
.rst-content .seealso .admonition-title,
|
||||
.rst-content .seealso .wy-alert-title,
|
||||
.rst-content .wy-alert-info .admonition-title,
|
||||
.rst-content .wy-alert-info .wy-alert-title,
|
||||
.wy-alert.wy-alert-info .rst-content .admonition-title,
|
||||
.wy-alert.wy-alert-info .wy-alert-title {
|
||||
background: var(--color-info-title);
|
||||
}
|
||||
|
||||
|
||||
|
||||
/* Links */
|
||||
a, a:visited,
|
||||
.wy-menu-vertical a,
|
||||
a.icon.icon-home,
|
||||
.wy-menu-vertical li.toctree-l1.current > a.current {
|
||||
color: var(--color-text-anchor);
|
||||
text-decoration: none;
|
||||
}
|
||||
|
||||
a:hover, .wy-breadcrumbs-aside a {
|
||||
color: var(--color-text-anchor); /* reset */
|
||||
}
|
||||
|
||||
.rst-versions a, .rst-versions .rst-current-version {
|
||||
color: #var(--color-text-anchor);
|
||||
}
|
||||
|
||||
.wy-nav-content a.reference, .wy-nav-content a:not([class]) {
|
||||
background-image: linear-gradient(var(--color-border) 0%, var(--color-border) 100%);
|
||||
background-repeat: repeat-x;
|
||||
background-position: 0 100%;
|
||||
background-size: 1px 1px;
|
||||
}
|
||||
|
||||
.wy-nav-content a.reference:hover, .wy-nav-content a:not([class]):hover {
|
||||
background-image: linear-gradient(rgba(114,83,237,0.45) 0%, rgba(114,83,237,0.45) 100%);
|
||||
background-size: 1px 1px;
|
||||
}
|
||||
|
||||
.wy-menu-vertical a:hover,
|
||||
.wy-menu-vertical li.current a:hover,
|
||||
.wy-menu-vertical a:active {
|
||||
background: var(--color-bg-side-nav-hover) !important;
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l1.current>a,
|
||||
.wy-menu-vertical li.current>a,
|
||||
.wy-menu-vertical li.on a {
|
||||
background-color: var(--color-bg-side-nav-hover);
|
||||
border: none;
|
||||
font-weight: normal;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current {
|
||||
background-color: inherit;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current a {
|
||||
border-right: none;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.toctree-l2 a,
|
||||
.wy-menu-vertical li.toctree-l3 a,
|
||||
.wy-menu-vertical li.toctree-l4 a,
|
||||
.wy-menu-vertical li.toctree-l5 a,
|
||||
.wy-menu-vertical li.toctree-l6 a,
|
||||
.wy-menu-vertical li.toctree-l7 a,
|
||||
.wy-menu-vertical li.toctree-l8 a,
|
||||
.wy-menu-vertical li.toctree-l9 a,
|
||||
.wy-menu-vertical li.toctree-l10 a {
|
||||
color: var(--color-text-body);
|
||||
}
|
||||
|
||||
a.image-reference, a.image-reference:hover {
|
||||
background: none !important;
|
||||
}
|
||||
|
||||
a.image-reference img {
|
||||
cursor: zoom-in;
|
||||
}
|
||||
|
||||
|
||||
/* Code blocks */
|
||||
.rst-content code, .rst-content tt, code {
|
||||
padding: 0.25em;
|
||||
font-weight: 400;
|
||||
background-color: var(--color-bg-code-block);
|
||||
border: 1px solid var(--color-border);
|
||||
border-radius: 4px;
|
||||
}
|
||||
|
||||
.rst-content div[class^=highlight], .rst-content pre.literal-block {
|
||||
padding: 0.7rem;
|
||||
margin-top: 0;
|
||||
margin-bottom: 0.75rem;
|
||||
overflow-x: auto;
|
||||
background-color: var(--color-bg-side-nav);
|
||||
border-color: var(--color-border);
|
||||
border-radius: 4px;
|
||||
box-shadow: none;
|
||||
}
|
||||
|
||||
.rst-content .admonition-title,
|
||||
.rst-content div.admonition,
|
||||
.wy-alert-title {
|
||||
padding: 10px 12px;
|
||||
border-top-left-radius: 4px;
|
||||
border-top-right-radius: 4px;
|
||||
}
|
||||
|
||||
.highlight .go {
|
||||
color: inherit;
|
||||
}
|
||||
|
||||
.highlight .nt {
|
||||
color: var(--color-text-code-nt);
|
||||
}
|
||||
|
||||
.rst-content code.literal,
|
||||
.rst-content tt.literal,
|
||||
html.writer-html5 .rst-content dl.footnote code {
|
||||
border-color: var(--color-border);
|
||||
background-color: var(--color-border);
|
||||
color: var(--color-text-code-inline)
|
||||
}
|
||||
|
||||
|
||||
/* Search */
|
||||
.wy-side-nav-search input[type=text] {
|
||||
border: none;
|
||||
border-radius: 0;
|
||||
background-color: transparent;
|
||||
font-family: inherit;
|
||||
font-size: .85rem;
|
||||
box-shadow: none;
|
||||
padding: .7rem 1rem .7rem 2.8rem;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
#rtd-search-form {
|
||||
position: relative;
|
||||
}
|
||||
|
||||
#rtd-search-form:before {
|
||||
font: normal normal normal 14px/1 FontAwesome;
|
||||
font-size: inherit;
|
||||
text-rendering: auto;
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
content: "\f002";
|
||||
color: var(--color-text-alt);
|
||||
position: absolute;
|
||||
left: 1.5rem;
|
||||
top: .7rem;
|
||||
}
|
||||
|
||||
/* Side nav */
|
||||
.wy-side-nav-search {
|
||||
padding: 1rem 0 0 0;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li a button.toctree-expand {
|
||||
float: right;
|
||||
margin-right: -1.5em;
|
||||
padding: 0 .5em;
|
||||
}
|
||||
|
||||
.wy-menu-vertical a,
|
||||
.wy-menu-vertical li.current>a,
|
||||
.wy-menu-vertical li.current li>a {
|
||||
padding-right: 1.5em !important;
|
||||
}
|
||||
|
||||
.wy-menu-vertical li.current li>a.current {
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
/* Misc spacing */
|
||||
.rst-content .admonition-title, .wy-alert-title {
|
||||
padding: 10px 12px;
|
||||
}
|
||||
|
||||
/* Buttons */
|
||||
.btn {
|
||||
display: inline-block;
|
||||
box-sizing: border-box;
|
||||
padding: 0.3em 1em;
|
||||
margin: 0;
|
||||
font-family: inherit;
|
||||
font-size: inherit;
|
||||
font-weight: 500;
|
||||
line-height: 1.5;
|
||||
color: #var(--color-text-anchor);
|
||||
text-decoration: none;
|
||||
vertical-align: baseline;
|
||||
background-color: #f7f7f7;
|
||||
border-width: 0;
|
||||
border-radius: 4px;
|
||||
box-shadow: 0 1px 2px rgba(0,0,0,0.12),0 3px 10px rgba(0,0,0,0.08);
|
||||
appearance: none;
|
||||
}
|
||||
|
||||
.btn:active {
|
||||
padding: 0.3em 1em;
|
||||
}
|
||||
|
||||
.rst-content .btn:focus {
|
||||
outline: 1px solid #ccc;
|
||||
}
|
||||
|
||||
.rst-content .btn-neutral, .rst-content .btn span.fa {
|
||||
color: var(--color-text-body) !important;
|
||||
}
|
||||
|
||||
.btn-neutral {
|
||||
background-color: var(--color-btn-neutral-bg) !important;
|
||||
color: var(--color-btn-neutral-text) !important;
|
||||
border: 1px solid var(--color-btn-neutral-bg);
|
||||
}
|
||||
|
||||
.btn:hover, .btn-neutral:hover {
|
||||
background-color: var(--color-btn-neutral-bg-hover) !important;
|
||||
}
|
||||
|
||||
|
||||
/* Icon overrides */
|
||||
.wy-side-nav-search a.icon-home:before {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before {
|
||||
content: "\f106"; /* fa-angle-up */
|
||||
}
|
||||
|
||||
.fa-plus-square-o:before, .wy-menu-vertical li button.toctree-expand:before {
|
||||
content: "\f107"; /* fa-angle-down */
|
||||
}
|
||||
|
||||
|
||||
/* Misc */
|
||||
.wy-nav-top {
|
||||
line-height: 36px;
|
||||
}
|
||||
|
||||
.wy-nav-top > i {
|
||||
font-size: 24px;
|
||||
padding: 8px 0 0 2px;
|
||||
color:#var(--color-text-anchor);
|
||||
}
|
||||
|
||||
.rst-content table.docutils td,
|
||||
.rst-content table.docutils th,
|
||||
.rst-content table.field-list td,
|
||||
.rst-content table.field-list th,
|
||||
.wy-table td,
|
||||
.wy-table th {
|
||||
padding: 8px 14px;
|
||||
}
|
||||
|
||||
.dark-mode-toggle {
|
||||
position: absolute;
|
||||
top: 14px;
|
||||
right: 12px;
|
||||
height: 20px;
|
||||
width: 24px;
|
||||
z-index: 10;
|
||||
border: none;
|
||||
background-color: transparent;
|
||||
color: inherit;
|
||||
opacity: 0.7;
|
||||
}
|
||||
|
||||
.wy-nav-content-wrap {
|
||||
z-index: 20;
|
||||
}
|
14
docs/_static/custom.css
vendored
Normal file
@@ -0,0 +1,14 @@
/* override table width restrictions */
@media screen and (min-width: 767px) {

	.wy-table-responsive table td {
		/* !important prevents the common CSS stylesheets from
		   overriding this as on RTD they are loaded after this stylesheet */
		white-space: normal !important;
	}

	.wy-table-responsive {
		overflow: visible !important;
	}

}
47
docs/_static/js/darkmode.js
vendored
@@ -1,47 +0,0 @@
|
||||
let toggleButton
|
||||
let icon
|
||||
|
||||
function load() {
|
||||
'use strict'
|
||||
|
||||
toggleButton = document.createElement('button')
|
||||
toggleButton.setAttribute('title', 'Toggle dark mode')
|
||||
toggleButton.classList.add('dark-mode-toggle')
|
||||
icon = document.createElement('i')
|
||||
icon.classList.add('fa', darkModeState ? 'fa-sun-o' : 'fa-moon-o')
|
||||
toggleButton.appendChild(icon)
|
||||
document.body.prepend(toggleButton)
|
||||
|
||||
// Listen for changes in the OS settings
|
||||
// addListener is used because older versions of Safari don't support addEventListener
|
||||
// prefersDarkQuery set in <head>
|
||||
if (prefersDarkQuery) {
|
||||
prefersDarkQuery.addListener(function (evt) {
|
||||
toggleDarkMode(evt.matches)
|
||||
})
|
||||
}
|
||||
|
||||
// Initial setting depending on the prefers-color-mode or localstorage
|
||||
// darkModeState should be set in the document <head> to prevent flash
|
||||
if (darkModeState == undefined) darkModeState = false
|
||||
toggleDarkMode(darkModeState)
|
||||
|
||||
// Toggles the "dark-mode" class on click and sets localStorage state
|
||||
toggleButton.addEventListener('click', () => {
|
||||
darkModeState = !darkModeState
|
||||
|
||||
toggleDarkMode(darkModeState)
|
||||
localStorage.setItem('dark-mode', darkModeState)
|
||||
})
|
||||
}
|
||||
|
||||
function toggleDarkMode(state) {
|
||||
document.documentElement.classList.toggle('dark-mode', state)
|
||||
document.documentElement.classList.toggle('light-mode', !state)
|
||||
icon.classList.remove('fa-sun-o')
|
||||
icon.classList.remove('fa-moon-o')
|
||||
icon.classList.add(state ? 'fa-sun-o' : 'fa-moon-o')
|
||||
darkModeState = state
|
||||
}
|
||||
|
||||
document.addEventListener('DOMContentLoaded', load)
|
BIN
docs/_static/screenshot.png
vendored
Normal file
After Width: | Height: | Size: 445 KiB |
BIN
docs/_static/screenshots/bulk-edit.png
vendored
Before Width: | Height: | Size: 661 KiB |
BIN
docs/_static/screenshots/correspondents.png
vendored
Before Width: | Height: | Size: 457 KiB After Width: | Height: | Size: 106 KiB |
BIN
docs/_static/screenshots/dashboard.png
vendored
Before Width: | Height: | Size: 436 KiB After Width: | Height: | Size: 167 KiB |
BIN
docs/_static/screenshots/documents-filter.png
vendored
Before Width: | Height: | Size: 462 KiB After Width: | Height: | Size: 28 KiB |
BIN
docs/_static/screenshots/documents-largecards.png
vendored
Before Width: | Height: | Size: 608 KiB After Width: | Height: | Size: 306 KiB |
Before Width: | Height: | Size: 698 KiB |
BIN
docs/_static/screenshots/documents-smallcards.png
vendored
Before Width: | Height: | Size: 706 KiB After Width: | Height: | Size: 410 KiB |
BIN
docs/_static/screenshots/documents-table.png
vendored
Before Width: | Height: | Size: 480 KiB After Width: | Height: | Size: 137 KiB |
BIN
docs/_static/screenshots/editing.png
vendored
Before Width: | Height: | Size: 848 KiB After Width: | Height: | Size: 293 KiB |
BIN
docs/_static/screenshots/logs.png
vendored
Before Width: | Height: | Size: 703 KiB After Width: | Height: | Size: 260 KiB |
BIN
docs/_static/screenshots/mobile.png
vendored
Before Width: | Height: | Size: 388 KiB After Width: | Height: | Size: 158 KiB |
BIN
docs/_static/screenshots/new-tag.png
vendored
Before Width: | Height: | Size: 26 KiB After Width: | Height: | Size: 32 KiB |
BIN
docs/_static/screenshots/search-preview.png
vendored
Before Width: | Height: | Size: 54 KiB After Width: | Height: | Size: 61 KiB |
BIN
docs/_static/screenshots/search-results.png
vendored
Before Width: | Height: | Size: 517 KiB After Width: | Height: | Size: 261 KiB |
13
docs/_templates/layout.html
vendored
@@ -1,13 +0,0 @@
{% extends "!layout.html" %}
{% block extrahead %}
<script>
	// MediaQueryList object
	const prefersDarkQuery = window.matchMedia("(prefers-color-scheme: dark)");
	const lsDark = localStorage.getItem("dark-mode");
	let darkModeState = lsDark !== null ? lsDark == "true" : prefersDarkQuery.matches;

	document.documentElement.classList.toggle("dark-mode", darkModeState);
	document.documentElement.classList.toggle("light-mode", !darkModeState);
</script>
{{ super() }}
{% endblock %}
@@ -35,15 +35,13 @@ Options available to docker installations:
``/var/lib/docker/volumes`` on the host and you need to be root in order
to access them.

Paperless uses 4 volumes:
Paperless uses 3 volumes:

* ``paperless_media``: This is where your documents are stored.
* ``paperless_data``: This is where auxillary data is stored. This
  folder also contains the SQLite database, if you use it.
* ``paperless_pgdata``: Exists only if you use PostgreSQL and contains
  the database.
* ``paperless_dbdata``: Exists only if you use MariaDB and contains
  the database.
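
For a quick offline copy of these volumes, a generic Docker volume backup works; the sketch below assumes the default volume names (your compose project name may be prefixed, e.g. ``paperless_media``):

.. code:: bash

    # Archive the media volume into the current directory (adjust names as needed)
    docker run --rm \
        -v paperless_media:/data:ro \
        -v "$(pwd)":/backup \
        alpine tar czf /backup/paperless_media.tar.gz -C /data .

The same pattern applies to ``paperless_data``; for the database volumes a proper dump (see below) is preferable.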

Options available to bare-metal and non-docker installations:

@@ -51,7 +49,7 @@ Options available to bare-metal and non-docker installations:
crashes at some point or your disk fails, you can simply copy the folder back
into place and it works.

When using PostgreSQL or MariaDB, you'll also have to backup the database.
When using PostgreSQL, you'll also have to backup the database.
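
A database dump can be taken from the running compose stack; this is only a sketch and assumes the database service is called ``db`` and the default ``paperless`` user and database names:

.. code:: bash

    # PostgreSQL dump to a file on the host
    docker-compose exec -T db pg_dump -U paperless paperless > paperless.sql

For MariaDB, ``mysqldump`` can be used in the same way.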

.. _migrating-restoring:

@@ -119,23 +117,6 @@ Then you can start paperless-ngx with ``-d`` to have it run in the background.

    image: ghcr.io/paperless-ngx/paperless-ngx:latest

.. note::
   In version 1.7.1 and onwards, the Docker image can now be pinned to a release series.
   This is often combined with automatic updaters such as Watchtower to allow safer
   unattended upgrading to new bugfix releases only. It is still recommended to always
   review release notes before upgrading. To pin your install to a release series, edit
   the ``docker-compose.yml`` find the line that says

   .. code::

      image: ghcr.io/paperless-ngx/paperless-ngx:latest

   and replace the version with the series you want to track, for example:

   .. code::

      image: ghcr.io/paperless-ngx/paperless-ngx:1.7

Bare Metal Route
================

@@ -289,10 +270,6 @@ When you use the provided docker compose script, put the export inside the
``export`` folder in your paperless source directory. Specify ``../export``
as the ``source``.

.. note::

    Importing from a previous version of Paperless may work, but for best results
    it is suggested to match the versions.
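
For reference, a typical export invocation from the compose directory looks like this (a sketch, assuming the ``webserver`` service name used in the provided compose files):

.. code:: bash

    # Write a full export into ./export on the host
    docker-compose exec webserver document_exporter ../export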

.. _utilities-retagger:

@@ -312,7 +289,6 @@ there are tools for it.
   -c, --correspondent
   -T, --tags
   -t, --document_type
   -s, --storage_path
   -i, --inbox-only
   --use-first
   -f, --overwrite
@@ -321,7 +297,7 @@ Run this after changing or adding matching rules. It'll loop over all
of the documents in your database and attempt to match documents
according to the new rules.

Specify any combination of ``-c``, ``-T``, ``-t`` and ``-s`` to have the
Specify any combination of ``-c``, ``-T`` and ``-t`` to have the
retagger perform matching of the specified metadata type. If you don't
specify any of these options, the document retagger won't do anything.
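
For example, to re-run tag and document type matching on inbox documents only (a sketch built from the options listed above):

.. code:: bash

    docker-compose exec webserver document_retagger -tT --inbox-only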

@@ -393,8 +369,8 @@ the naming scheme.

.. warning::

    Since this command moves your documents, it is advised to do
    a backup beforehand. The renaming logic is robust and will never overwrite
    Since this command moves you documents around alot, it is advised to to
    a backup before. The renaming logic is robust and will never overwrite
    or delete a file, but you can't ever be careful enough.

.. code::
@@ -7,12 +7,12 @@ easier.

.. _advanced-matching:

Matching tags, correspondents, document types, and storage paths
################################################################
Matching tags, correspondents and document types
################################################

Paperless will compare the matching algorithms defined by every tag, correspondent,
document type, and storage path in your database to see if they apply to the text
in a document. In other words, if you define a tag called ``Home Utility``
Paperless will compare the matching algorithms defined by every tag and
correspondent already set in your database to see if they apply to the text in
a document. In other words, if you defined a tag called ``Home Utility``
that had a ``match`` property of ``bc hydro`` and a ``matching_algorithm`` of
``literal``, Paperless will automatically tag your newly-consumed document with
your ``Home Utility`` tag so long as the text ``bc hydro`` appears in the body
@@ -22,10 +22,10 @@ The matching logic is quite powerful. It supports searching the text of your
document with different algorithms, and as such, some experimentation may be
necessary to get things right.

In order to have a tag, correspondent, document type, or storage path assigned
automatically to newly consumed documents, assign a match and matching algorithm
using the web interface. These settings define when to assign tags, correspondents,
document types, and storage paths to documents.
In order to have a tag, correspondent, or type assigned automatically to newly
consumed documents, assign a match and matching algorithm using the web
interface. These settings define when to assign correspondents, tags, and types
to documents.

The following algorithms are available:

@@ -37,7 +37,7 @@ The following algorithms are available:
* **Literal:** Matches only if the match appears exactly as provided (i.e. preserve ordering) in the PDF.
* **Regular expression:** Parses the match as a regular expression and tries to
  find a match within the document.
* **Fuzzy match:** I don't know. Look at the source.
* **Fuzzy match:** I dont know. Look at the source.
* **Auto:** Tries to automatically match new documents. This does not require you
  to set a match. See the notes below.

@@ -47,9 +47,9 @@ defining a match text of ``"Bank of America" BofA`` using the *any* algorithm,
will match documents that contain either "Bank of America" or "BofA", but will
not match documents containing "Bank of South America".

Then just save your tag, correspondent, document type, or storage path and run
another document through the consumer. Once complete, you should see the
newly-created document, automatically tagged with the appropriate data.
Then just save your tag/correspondent and run another document through the
consumer. Once complete, you should see the newly-created document,
automatically tagged with the appropriate data.


.. _advanced-automatic_matching:

@@ -58,9 +58,9 @@ Automatic matching
==================

Paperless-ngx comes with a new matching algorithm called *Auto*. This matching
algorithm tries to assign tags, correspondents, document types, and storage paths
to your documents based on how you have already assigned these on existing documents.
It uses a neural network under the hood.
algorithm tries to assign tags, correspondents, and document types to your
documents based on how you have already assigned these on existing documents. It
uses a neural network under the hood.

If, for example, all your bank statements of your account 123 at the Bank of
America are tagged with the tag "bofa_123" and the matching algorithm of this
@@ -80,21 +80,20 @@ feature:
  that the neural network only learns from documents which you have correctly
  tagged before.
* The matching algorithm can only work if there is a correlation between the
  tag, correspondent, document type, or storage path and the document itself.
  Your bank statements usually contain your bank account number and the name
  of the bank, so this works reasonably well, However, tags such as "TODO"
  cannot be automatically assigned.
  tag, correspondent, or document type and the document itself. Your bank
  statements usually contain your bank account number and the name of the bank,
  so this works reasonably well, However, tags such as "TODO" cannot be
  automatically assigned.
* The matching algorithm needs a reasonable number of documents to identify when
  to assign tags, correspondents, storage paths, and types. If one out of a
  thousand documents has the correspondent "Very obscure web shop I bought
  something five years ago", it will probably not assign this correspondent
  automatically if you buy something from them again. The more documents, the better.
  to assign tags, correspondents, and types. If one out of a thousand documents
  has the correspondent "Very obscure web shop I bought something five years
  ago", it will probably not assign this correspondent automatically if you buy
  something from them again. The more documents, the better.
* Paperless also needs a reasonable amount of negative examples to decide when
  not to assign a certain tag, correspondent, document type, or storage path. This will
  usually be the case as you start filling up paperless with documents.
  Example: If all your documents are either from "Webshop" and "Bank", paperless
  will assign one of these correspondents to ANY new document, if both are set
  to automatic matching.
  not to assign a certain tag, correspondent or type. This will usually be the
  case as you start filling up paperless with documents. Example: If all your
  documents are either from "Webshop" and "Bank", paperless will assign one of
  these correspondents to ANY new document, if both are set to automatic matching.

Hooking into the consumption process
####################################
@@ -121,10 +120,10 @@ Pre-consumption script
======================

Executed after the consumer sees a new document in the consumption folder, but
before any processing of the document is performed. This script can access the
following relevant environment variables set:
before any processing of the document is performed. This script receives exactly
one argument:

* ``DOCUMENT_SOURCE_PATH``
* Document file name

A simple but common example for this would be creating a simple script like
this:
@@ -134,7 +133,7 @@ this:
.. code:: bash

    #!/usr/bin/env bash
    pdf2pdfocr.py -i ${DOCUMENT_SOURCE_PATH}
    pdf2pdfocr.py -i ${1}

``/etc/paperless.conf``

@@ -157,21 +156,16 @@ Post-consumption script
=======================

Executed after the consumer has successfully processed a document and has moved it
into paperless. It receives the following environment variables:
into paperless. It receives the following arguments:

* ``DOCUMENT_ID``
* ``DOCUMENT_FILE_NAME``
* ``DOCUMENT_CREATED``
* ``DOCUMENT_MODIFIED``
* ``DOCUMENT_ADDED``
* ``DOCUMENT_SOURCE_PATH``
* ``DOCUMENT_ARCHIVE_PATH``
* ``DOCUMENT_THUMBNAIL_PATH``
* ``DOCUMENT_DOWNLOAD_URL``
* ``DOCUMENT_THUMBNAIL_URL``
* ``DOCUMENT_CORRESPONDENT``
* ``DOCUMENT_TAGS``
* ``DOCUMENT_ORIGINAL_FILENAME``
* Document id
* Generated file name
* Source path
* Thumbnail path
* Download URL
* Thumbnail URL
* Correspondent
* Tags

The script can be in any language, but for a simple shell script
example, you can take a look at `post-consumption-example.sh`_ in this project.
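
A minimal post-consumption script using the environment variables listed above (the older release on the other side of this diff passes positional arguments instead) could look roughly like this:

.. code:: bash

    #!/usr/bin/env bash
    # Log every consumed document; the DOCUMENT_* variables are provided by the consumer
    echo "$(date -Is) consumed ${DOCUMENT_FILE_NAME} (id ${DOCUMENT_ID}) from ${DOCUMENT_SOURCE_PATH}" \
        >> /usr/src/paperless/scripts/consumption.log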
@@ -206,7 +200,7 @@ Troubleshooting:
- Check your script's permission e.g. in case of permission error ``sudo chmod 755 post-consumption-example.sh``
- Pipe your scripts's output to a log file e.g. ``echo "${DOCUMENT_ID}" | tee --append /usr/src/paperless/scripts/post-consumption-example.log``

.. _post-consumption-example.sh: https://github.com/paperless-ngx/paperless-ngx/blob/main/scripts/post-consumption-example.sh
.. _post-consumption-example.sh: https://github.com/jonaswinkler/paperless-ngx/blob/master/scripts/post-consumption-example.sh

.. _advanced-file_name_handling:

@@ -249,7 +243,7 @@ will create a directory structure as follows:
last filename a document was stored as. If you do rename a file, paperless will
report your files as missing and won't be able to find them.

Paperless provides the following placeholders within filenames:
Paperless provides the following placeholders withing filenames:

* ``{asn}``: The archive serial number of the document, or "none".
* ``{correspondent}``: The name of the correspondent, or "none".
@@ -274,17 +268,6 @@ If paperless detects that two documents share the same filename, paperless will
append ``_01``, ``_02``, etc to the filename. This happens if all the placeholders in a filename
evaluate to the same value.

.. hint::
    You can affect how empty placeholders are treated by changing the following setting to
    `true`.

    .. code::

        PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=True

    Doing this results in all empty placeholders resolving to "" instead of "none" as stated above.
    Spaces before empty placeholders are removed as well, empty directories are omitted.
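
    As an illustration, with a format of ``{created_year}/{correspondent}/{title}`` and a document
    that has no correspondent, the two behaviours differ roughly like this:

    .. code::

        2022/none/Statement.pdf   # default
        2022/Statement.pdf        # with PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=True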

.. hint::

    Paperless checks the filename of a document whenever it is saved. Therefore,
@@ -307,59 +290,3 @@ evaluate to the same value.

    However, keep in mind that inside docker, if files get stored outside of the
    predefined volumes, they will be lost after a restart of paperless.


Storage paths
#############

One of the best things in Paperless is that you can not only access the documents via the
web interface, but also via the file system.

When as single storage layout is not sufficient for your use case, storage paths come to
the rescue. Storage paths allow you to configure more precisely where each document is stored
in the file system.

- Each storage path is a `PAPERLESS_FILENAME_FORMAT` and follows the rules described above
- Each document is assigned a storage path using the matching algorithms described above, but
  can be overwritten at any time

For example, you could define the following two storage paths:

1. Normal communications are put into a folder structure sorted by `year/correspondent`
2. Communications with insurance companies are stored in a flat structure with longer file names,
   but containing the full date of the correspondence.

.. code::

    By Year = {created_year}/{correspondent}/{title}
    Insurances = Insurances/{correspondent}/{created_year}-{created_month}-{created_day} {title}


If you then map these storage paths to the documents, you might get the following result.
For simplicity, `By Year` defines the same structure as in the previous example above.

.. code:: text

    2019/ # By Year
        My bank/
            Statement January.pdf
            Statement February.pdf

    Insurances/ # Insurances
        Healthcare 123/
            2022-01-01 Statement January.pdf
            2022-02-02 Letter.pdf
            2022-02-03 Letter.pdf
        Dental 456/
            2021-12-01 New Conditions.pdf


.. hint::

    Defining a storage path is optional. If no storage path is defined for a document, the global
    `PAPERLESS_FILENAME_FORMAT` is applied.

.. caution::

    If you adjust the format of an existing storage path, old documents don't get relocated automatically.
    You need to run the :ref:`document renamer <utilities-renamer>` to adjust their pathes.
@@ -31,8 +31,7 @@ The objects served by the document endpoint contain the following fields:
* ``tags``: List of IDs of tags assigned to this document, or empty list.
* ``document_type``: Document type of this document, or null.
* ``correspondent``: Correspondent of this document or null.
* ``created``: The date time at which this document was created.
* ``created_date``: The date (YYYY-MM-DD) at which this document was created. Optional. If also passed with created, this is ignored.
* ``created``: The date at which this document was created.
* ``modified``: The date at which this document was last edited in paperless. Read-only.
* ``added``: The date at which this document was added to paperless. Read-only.
* ``archive_serial_number``: The identifier of this document in a physical document archive.
@@ -241,13 +240,11 @@ be instructed to consume the document from there.
The endpoint supports the following optional form fields:

* ``title``: Specify a title that the consumer should use for the document.
* ``created``: Specify a DateTime where the document was created (e.g. "2016-04-19" or "2016-04-19 06:15:00+02:00").
* ``correspondent``: Specify the ID of a correspondent that the consumer should use for the document.
* ``document_type``: Similar to correspondent.
* ``tags``: Similar to correspondent. Specify this multiple times to have multiple tags added
  to the document.


The endpoint will immediately return "OK" if the document consumption process
was started successfully. No additional status information about the consumption
process itself is available, since that happens in a different process.
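
As a sketch, a document can be uploaded to this endpoint with ``curl``; the example assumes the ``/api/documents/post_document/`` path and token authentication, so adjust it for your setup:

.. code:: bash

    curl -H "Authorization: Token <api-token>" \
        -F "document=@invoice.pdf" \
        -F "title=Invoice" \
        -F "tags=1" -F "tags=2" \
        http://localhost:8000/api/documents/post_document/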
2064
docs/changelog.md
1691
docs/changelog.rst
Normal file
24
docs/conf.py
@@ -2,8 +2,6 @@ import sphinx_rtd_theme
|
||||
|
||||
|
||||
__version__ = None
|
||||
__full_version_str__ = None
|
||||
__major_minor_version_str__ = None
|
||||
exec(open("../src/paperless/version.py").read())
|
||||
|
||||
|
||||
@@ -14,17 +12,13 @@ extensions = [
|
||||
"sphinx.ext.imgmath",
|
||||
"sphinx.ext.viewcode",
|
||||
"sphinx_rtd_theme",
|
||||
"myst_parser",
|
||||
]
|
||||
|
||||
# Add any paths that contain templates here, relative to this directory.
|
||||
templates_path = ["_templates"]
|
||||
# templates_path = ['_templates']
|
||||
|
||||
# The suffix of source filenames.
|
||||
source_suffix = {
|
||||
".rst": "restructuredtext",
|
||||
".md": "markdown",
|
||||
}
|
||||
source_suffix = ".rst"
|
||||
|
||||
# The encoding of source files.
|
||||
# source_encoding = 'utf-8-sig'
|
||||
@@ -47,9 +41,9 @@ copyright = "2015-2022, Daniel Quinn, Jonas Winkler, and the paperless-ngx team"
|
||||
#
|
||||
|
||||
# The short X.Y version.
|
||||
version = __major_minor_version_str__
|
||||
version = ".".join([str(_) for _ in __version__[:2]])
|
||||
# The full version, including alpha/beta/rc tags.
|
||||
release = __full_version_str__
|
||||
release = ".".join([str(_) for _ in __version__[:3]])
|
||||
|
||||
# The language for content autogenerated by Sphinx. Refer to documentation
|
||||
# for a list of supported languages.
|
||||
@@ -125,16 +119,6 @@ html_theme_path = []
|
||||
# so a file named "default.css" will overwrite the builtin "default.css".
|
||||
html_static_path = ["_static"]
|
||||
|
||||
# These paths are either relative to html_static_path
|
||||
# or fully qualified paths (eg. https://...)
|
||||
html_css_files = [
|
||||
"css/custom.css",
|
||||
]
|
||||
|
||||
html_js_files = [
|
||||
"js/darkmode.js",
|
||||
]
|
||||
|
||||
# Add any extra paths that contain custom files (such as robots.txt or
|
||||
# .htaccess) here, relative to this directory. These files are copied
|
||||
# directly to the root of the documentation.
|
||||
|
@@ -27,23 +27,11 @@ PAPERLESS_REDIS=<url>
    This is required for processing scheduled tasks such as email fetching, index
    optimization and for training the automatic document matcher.

    * If your Redis server needs login credentials PAPERLESS_REDIS = ``redis://<username>:<password>@<host>:<port>``

    * With the requirepass option PAPERLESS_REDIS = ``redis://:<password>@<host>:<port>``

    `More information on securing your Redis Instance <https://redis.io/docs/getting-started/#securing-redis>`_.

    Defaults to redis://localhost:6379.
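
    For a secured Redis instance, the value can simply be set in the webserver environment, e.g. (a sketch; host name and password are placeholders):

    .. code:: bash

        export PAPERLESS_REDIS="redis://:changeme@redis:6379"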

PAPERLESS_DBENGINE=<engine_name>
    Optional, gives the ability to choose Postgres or MariaDB for database engine.
    Available options are `postgresql` and `mariadb`.
    Default is `postgresql`.

PAPERLESS_DBHOST=<hostname>
    By default, sqlite is used as the database backend. This can be changed here.

    Set PAPERLESS_DBHOST and another database will be used instead of sqlite.
    Set PAPERLESS_DBHOST and PostgreSQL will be used instead of mysql.

PAPERLESS_DBPORT=<port>
    Adjust port if necessary.
@@ -51,17 +39,17 @@ PAPERLESS_DBPORT=<port>
    Default is 5432.

PAPERLESS_DBNAME=<name>
    Database name in PostgreSQL or MariaDB.
    Database name in PostgreSQL.

    Defaults to "paperless".

PAPERLESS_DBUSER=<name>
    Database user in PostgreSQL or MariaDB.
    Database user in PostgreSQL.

    Defaults to "paperless".

PAPERLESS_DBPASS=<password>
    Database password for PostgreSQL or MariaDB.
    Database password for PostgreSQL.

    Defaults to "paperless".

@@ -72,13 +60,6 @@ PAPERLESS_DBSSLMODE=<mode>

    Default is ``prefer``.

PAPERLESS_DB_TIMEOUT=<float>
    Amount of time for a database connection to wait for the database to unlock.
    Mostly applicable for an sqlite based installation, consider changing to postgresql
    if you need to increase this.

    Defaults to unset, keeping the Django defaults.

Paths and folders
#################

@@ -130,14 +111,6 @@ PAPERLESS_FILENAME_FORMAT=<format>

    Default is none, which disables this feature.

PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=<bool>
    Tells paperless to replace placeholders in `PAPERLESS_FILENAME_FORMAT` that would resolve
    to 'none' to be omitted from the resulting filename. This also holds true for directory
    names.
    See :ref:`advanced-file_name_handling` for details.

    Defaults to `false` which disables this feature.

PAPERLESS_LOGGING_DIR=<path>
    This is where paperless will store log files.

@@ -157,8 +130,6 @@ PAPERLESS_LOGROTATE_MAX_BACKUPS=<num>

    Defaults to 20.

.. _hosting-and-security:

Hosting & Security
##################

@@ -171,24 +142,7 @@ PAPERLESS_SECRET_KEY=<key>

    Default is listed in the file ``src/paperless/settings.py``.

PAPERLESS_URL=<url>
    This setting can be used to set the three options below (ALLOWED_HOSTS,
    CORS_ALLOWED_HOSTS and CSRF_TRUSTED_ORIGINS). If the other options are
    set the values will be combined with this one. Do not include a trailing
    slash. E.g. https://paperless.domain.com

    Defaults to empty string, leaving the other settings unaffected.

PAPERLESS_CSRF_TRUSTED_ORIGINS=<comma-separated-list>
    A list of trusted origins for unsafe requests (e.g. POST). As of Django 4.0
    this is required to access the Django admin via the web.
    See https://docs.djangoproject.com/en/4.0/ref/settings/#csrf-trusted-origins

    Can also be set using PAPERLESS_URL (see above).

    Defaults to empty string, which does not add any origins to the trusted list.

PAPERLESS_ALLOWED_HOSTS=<comma-separated-list>
PAPERLESS_ALLOWED_HOSTS<comma-separated-list>
    If you're planning on putting Paperless on the open internet, then you
    really should set this value to the domain name you're using. Failing to do
    so leaves you open to HTTP host header attacks:
@@ -197,19 +151,12 @@ PAPERLESS_ALLOWED_HOSTS=<comma-separated-list>
    Just remember that this is a comma-separated list, so "example.com" is fine,
    as is "example.com,www.example.com", but NOT " example.com" or "example.com,"

    Can also be set using PAPERLESS_URL (see above).

    If manually set, please remember to include "localhost". Otherwise docker
    healthcheck will fail.

    Defaults to "*", which is all hosts.

PAPERLESS_CORS_ALLOWED_HOSTS=<comma-separated-list>
PAPERLESS_CORS_ALLOWED_HOSTS<comma-separated-list>
    You need to add your servers to the list of allowed hosts that can do CORS
    calls. Set this to your public domain name.

    Can also be set using PAPERLESS_URL (see above).

    Defaults to "http://localhost:8000".
|
||||
|
||||
PAPERLESS_FORCE_SCRIPT_NAME=<path>
|
||||
@@ -221,16 +168,9 @@ PAPERLESS_FORCE_SCRIPT_NAME=<path>
|
||||
PAPERLESS_STATIC_URL=<path>
|
||||
Override the STATIC_URL here. Unless you're hosting Paperless off a
|
||||
subdomain like /paperless/, you probably don't need to change this.
|
||||
If you do change it, be sure to include the trailing slash.
|
||||
|
||||
Defaults to "/static/".
|
||||
|
||||
.. note::
|
||||
|
||||
When hosting paperless behind a reverse proxy like Traefik or Nginx at a subpath e.g.
|
||||
example.com/paperlessngx you will also need to set ``PAPERLESS_FORCE_SCRIPT_NAME``
|
||||
(see above).
|
||||
|
||||
PAPERLESS_AUTO_LOGIN_USERNAME=<username>
|
||||
Specify a username here so that paperless will automatically perform login
|
||||
with the selected user.
|
||||
@@ -245,7 +185,7 @@ PAPERLESS_AUTO_LOGIN_USERNAME=<username>
|
||||
PAPERLESS_ADMIN_USER=<username>
|
||||
If this environment variable is specified, Paperless automatically creates
|
||||
a superuser with the provided username at start. This is useful in cases
|
||||
where you can not run the `createsuperuser` command separately, such as Kubernetes
|
||||
where you can not run the `createsuperuser` command seperately, such as Kubernetes
|
||||
or AWS ECS.
|
||||
|
||||
Requires `PAPERLESS_ADMIN_PASSWORD` to be set.
|
||||
@@ -450,23 +390,14 @@ PAPERLESS_OCR_IMAGE_DPI=<num>
|
||||
the produced PDF documents are A4 sized.
|
||||
|
||||
PAPERLESS_OCR_MAX_IMAGE_PIXELS=<num>
|
||||
Paperless will raise a warning when OCRing images which are over this limit and
|
||||
will not OCR images which are more than twice this limit. Note this does not
|
||||
prevent the document from being consumed, but could result in missing text content.
|
||||
|
||||
If unset, will default to the value determined by
|
||||
`Pillow <https://pillow.readthedocs.io/en/stable/reference/Image.html#PIL.Image.MAX_IMAGE_PIXELS>`_.
|
||||
|
||||
.. note::
|
||||
|
||||
Increasing this limit could cause Paperless to consume additional resources
|
||||
when consuming a file. Be sure you have sufficient system resources.
|
||||
|
||||
.. caution::
|
||||
|
||||
The limit is intended to prevent malicious files from consuming system resources
|
||||
and causing crashes and other errors. Only increase this value if you are certain
|
||||
your documents are not malicious and you need the text which was not OCRed
|
||||
Paperless will not OCR images that have more pixels than this limit.
|
||||
This is intended to prevent decompression bombs from overloading paperless.
|
||||
Increasing this limit is desired if you face a DecompressionBombError despite
|
||||
the file in question not being malicious; this could e.g. be caused by incorrectly
|
||||
recognized metadata.
|
||||
If you have enough resources or if you are certain that your uploaded files
|
||||
are not malicious you can increase this value to your needs.
|
||||
The default value is 256000000; an image with more pixels than that will not be parsed.
|
||||
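For a rough sense of scale: an A4 page scanned at 600 DPI is about 4960 x 7016 pixels, i.e. roughly 35 million pixels, comfortably below the default limit. If you do need to raise it, a hypothetical override could look like this (the number is only an example):

.. code::

    PAPERLESS_OCR_MAX_IMAGE_PIXELS=500000000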
|
||||
PAPERLESS_OCR_USER_ARGS=<json>
|
||||
OCRmyPDF offers many more options. Use this parameter to specify any
|
||||
@@ -517,7 +448,7 @@ PAPERLESS_TIKA_GOTENBERG_ENDPOINT=<url>
|
||||
Defaults to "http://localhost:3000".
|
||||
|
||||
If you run paperless on docker, you can add those services to the docker-compose
|
||||
file (see the provided ``docker-compose.sqlite-tika.yml`` file for reference). The changes
|
||||
file (see the provided ``docker-compose.tika.yml`` file for reference). The changes
|
||||
required are as follows:
|
||||
|
||||
.. code:: yaml
|
||||
@@ -538,14 +469,14 @@ requires are as follows:
|
||||
# ...
|
||||
|
||||
gotenberg:
|
||||
image: gotenberg/gotenberg:7.4
|
||||
image: gotenberg/gotenberg:7
|
||||
restart: unless-stopped
|
||||
command:
|
||||
- "gotenberg"
|
||||
- "--chromium-disable-routes=true"
|
||||
|
||||
tika:
|
||||
image: ghcr.io/paperless-ngx/tika:latest
|
||||
image: apache/tika
|
||||
restart: unless-stopped
|
||||
|
||||
Add the configuration variables to the environment of the webserver (alternatively
|
||||
@@ -562,8 +493,6 @@ PAPERLESS_TASK_WORKERS=<num>
|
||||
maintain the automatic matching algorithm, check emails, consume documents,
|
||||
etc. This variable specifies how many things it will do in parallel.
|
||||
|
||||
Defaults to 1
|
||||
|
||||
|
||||
PAPERLESS_THREADS_PER_WORKER=<num>
|
||||
Furthermore, paperless uses multiple threads when consuming documents to
|
||||
@@ -611,10 +540,6 @@ PAPERLESS_WORKER_TIMEOUT=<num>
|
||||
large documents within the default 1800 seconds. So extending this timeout
|
||||
may prove to be useful on weak hardware setups.
|
||||
|
||||
PAPERLESS_WORKER_RETRY=<num>
|
||||
If PAPERLESS_WORKER_TIMEOUT has been configured, the retry time for a task can
|
||||
also be configured. By default, this value will be set to 10s more than the
|
||||
worker timeout. This value should never be set less than the worker timeout.
|
||||
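As an illustrative sketch only (the numbers are arbitrary), a slow machine might pair the two settings like this, keeping the retry slightly above the timeout:

.. code::

    PAPERLESS_WORKER_TIMEOUT=3600
    PAPERLESS_WORKER_RETRY=3630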
|
||||
PAPERLESS_TIME_ZONE=<timezone>
|
||||
Set the time zone here.
|
||||
@@ -635,28 +560,6 @@ PAPERLESS_CONSUMER_POLLING=<num>
|
||||
|
||||
Defaults to 0, which disables polling and uses filesystem notifications.
|
||||
|
||||
PAPERLESS_CONSUMER_POLLING_RETRY_COUNT=<num>
|
||||
If consumer polling is enabled, sets the number of times paperless will check for a
|
||||
file to remain unmodified.
|
||||
|
||||
Defaults to 5.
|
||||
|
||||
PAPERLESS_CONSUMER_POLLING_DELAY=<num>
|
||||
If consumer polling is enabled, sets the delay in seconds between each check (above) paperless
|
||||
will do while waiting for a file to remain unmodified.
|
||||
|
||||
Defaults to 5.
|
||||
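A hedged example for a consume folder on a network share where filesystem notifications are unreliable (all numbers are placeholders to adjust):

.. code::

    PAPERLESS_CONSUMER_POLLING=30
    PAPERLESS_CONSUMER_POLLING_RETRY_COUNT=5
    PAPERLESS_CONSUMER_POLLING_DELAY=5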
|
||||
.. _configuration-inotify:
|
||||
|
||||
PAPERLESS_CONSUMER_INOTIFY_DELAY=<num>
|
||||
Sets the time in seconds the consumer will wait for additional events
|
||||
from inotify before the consumer will consider a file ready and begin consumption.
|
||||
Certain scanners or network setups may generate multiple events for a single file,
|
||||
leading to multiple consumers working on the same file. Configure this to
|
||||
prevent that.
|
||||
|
||||
Defaults to 0.5 seconds.
|
||||
|
||||
PAPERLESS_CONSUMER_DELETE_DUPLICATES=<bool>
|
||||
When the consumer detects a duplicate document, it will not touch the
|
||||
@@ -685,37 +588,6 @@ PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS=<bool>
|
||||
|
||||
Defaults to false.
|
||||
|
||||
PAPERLESS_CONSUMER_ENABLE_BARCODES=<bool>
|
||||
Enables the scanning and page separation based on detected barcodes.
|
||||
This allows for scanning and adding multiple documents per uploaded
|
||||
file, which are separated by one or multiple barcode pages.
|
||||
|
||||
For ease of use, it is suggested to use a standardized separation page,
|
||||
e.g. `here <https://www.alliancegroup.co.uk/patch-codes.htm>`_.
|
||||
|
||||
If no barcodes are detected in the uploaded file, no page separation
|
||||
will happen.
|
||||
|
||||
The original document will be removed and the separated pages will be
|
||||
saved as pdf.
|
||||
|
||||
Defaults to false.
|
||||
|
||||
PAPERLESS_CONSUMER_BARCODE_TIFF_SUPPORT=<bool>
|
||||
Whether TIFF image files should be scanned for barcodes.
|
||||
This will automatically convert any TIFF image(s) to pdfs for later
|
||||
processing.
|
||||
This only has an effect, if PAPERLESS_CONSUMER_ENABLE_BARCODES has been
|
||||
enabled.
|
||||
|
||||
Defaults to false.
|
||||
|
||||
PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT
|
||||
Defines the string to be detected as a separator barcode.
|
||||
If paperless is used with the PATCH-T separator pages, users
|
||||
shouldn't change this.
|
||||
|
||||
Defaults to "PATCHT"
|
||||
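A sketch of enabling barcode separation, including TIFF support, with the default PATCH-T separator string:

.. code::

    PAPERLESS_CONSUMER_ENABLE_BARCODES=true
    PAPERLESS_CONSUMER_BARCODE_TIFF_SUPPORT=true
    PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT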
|
||||
PAPERLESS_CONVERT_MEMORY_LIMIT=<num>
|
||||
On smaller systems, or even in the case of Very Large Documents, the consumer
|
||||
@@ -740,6 +612,13 @@ PAPERLESS_CONVERT_TMPDIR=<path>
|
||||
|
||||
Default is none, which disables the temporary directory.
|
||||
|
||||
PAPERLESS_OPTIMIZE_THUMBNAILS=<bool>
|
||||
Use optipng to optimize thumbnails. This usually reduces the size of
|
||||
thumbnails by about 20%, but uses considerable compute time during
|
||||
consumption.
|
||||
|
||||
Defaults to true.
|
||||
|
||||
PAPERLESS_POST_CONSUME_SCRIPT=<filename>
|
||||
After a document is consumed, Paperless can trigger an arbitrary script if
|
||||
you like. This script will be passed a number of arguments for you to work
|
||||
@@ -755,24 +634,8 @@ PAPERLESS_FILENAME_DATE_ORDER=<format>
|
||||
The filename will be checked first, and if nothing is found, the document
|
||||
text will be checked as normal.
|
||||
|
||||
A date in a filename must have some separators (`.`, `-`, `/`, etc)
|
||||
for it to be parsed.
|
||||
|
||||
Defaults to none, which disables this feature.
|
||||
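As a sketch, with the setting below a file named ``2022-03-15 Invoice.pdf`` (a made-up example) would get 15 March 2022 as its created date, since the date uses ``-`` separators and year-month-day order:

.. code::

    PAPERLESS_FILENAME_DATE_ORDER=YMD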
|
||||
PAPERLESS_NUMBER_OF_SUGGESTED_DATES=<num>
|
||||
Paperless searches an entire document for dates. The first date found will
|
||||
be used as the initial value for the created date. When this variable is
|
||||
greater than 0 (or left at its default value), paperless will also suggest
|
||||
other dates found in the document, up to a maximum of this setting. Note that
|
||||
duplicates will be removed, which can result in fewer dates displayed in the
|
||||
frontend than this setting value.
|
||||
|
||||
The task to find all dates can be time-consuming and increases with a higher
|
||||
(maximum) number of suggested dates and slower hardware.
|
||||
|
||||
Defaults to 3. Set to 0 to disable this feature.
|
||||
|
||||
PAPERLESS_THUMBNAIL_FONT_NAME=<filename>
|
||||
Paperless creates thumbnails for plain text files by rendering the content
|
||||
of the file on an image and uses a predefined font for that. This
|
||||
@@ -788,7 +651,10 @@ PAPERLESS_IGNORE_DATES=<string>
|
||||
this process. This is useful for special dates (like date of birth) that appear
|
||||
in documents regularly but are very unlikely to be the document's creation date.
|
||||
|
||||
The date is parsed using the order specified in PAPERLESS_DATE_ORDER
|
||||
You may specify dates in a multitude of formats supported by dateparser (see
|
||||
https://dateparser.readthedocs.io/en/latest/#popular-formats) but as the dates
|
||||
need to be comma separated, the options are limited.
|
||||
Example: "2020-12-02,22.04.1999"
|
||||
|
||||
Defaults to an empty string to not ignore any dates.
|
||||
|
||||
@@ -805,7 +671,7 @@ PAPERLESS_CONSUMER_IGNORE_PATTERNS=<json>
|
||||
|
||||
This can be adjusted by configuring a custom json array with patterns to exclude.
|
||||
|
||||
Defaults to ``[".DS_STORE/*", "._*", ".stfolder/*", ".stversions/*", ".localized/*", "desktop.ini"]``.
|
||||
Defautls to ``[".DS_STORE/*", "._*", ".stfolder/*"]``.
|
||||
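As a sketch, the default array could be extended with an additional pattern, e.g. to also skip Windows ``Thumbs.db`` files (an assumed example, not a default):

.. code::

    PAPERLESS_CONSUMER_IGNORE_PATTERNS=[".DS_STORE/*", "._*", ".stfolder/*", "Thumbs.db"]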
|
||||
Binaries
|
||||
########
|
||||
@@ -818,10 +684,13 @@ the program doesn't automatically execute it (ie. the program isn't in your
|
||||
$PATH), then you'll need to specify the literal path for that program.
|
||||
|
||||
PAPERLESS_CONVERT_BINARY=<path>
|
||||
Defaults to "convert".
|
||||
Defaults to "/usr/bin/convert".
|
||||
|
||||
PAPERLESS_GS_BINARY=<path>
|
||||
Defaults to "gs".
|
||||
Defaults to "/usr/bin/gs".
|
||||
|
||||
PAPERLESS_OPTIPNG_BINARY=<path>
|
||||
Defaults to "/usr/bin/optipng".
|
||||
|
||||
|
||||
.. _configuration-docker:
|
||||
@@ -838,14 +707,9 @@ PAPERLESS_WEBSERVER_WORKERS=<num>
|
||||
also loads the entire application into memory separately, so increasing this value
|
||||
will increase RAM usage.
|
||||
|
||||
Defaults to 1.
|
||||
Consider configuring this to 1 on low power devices with limited amount of RAM.
|
||||
|
||||
PAPERLESS_BIND_ADDR=<ip address>
|
||||
The IP address the webserver will listen on inside the container. There are
|
||||
special setups where you may need to configure this value to restrict the
|
||||
IP address or interface the webserver listens on.
|
||||
|
||||
Defaults to [::], meaning all interfaces, including IPv6.
|
||||
Defaults to 2.
|
||||
|
||||
PAPERLESS_PORT=<port>
|
||||
The port number the webserver will listen on inside the container. There are
|
||||
@@ -900,26 +764,3 @@ PAPERLESS_OCR_LANGUAGES=<list>
|
||||
PAPERLESS_OCR_LANGUAGE=tur
|
||||
|
||||
Defaults to none, which does not install any additional languages.
|
||||
|
||||
|
||||
.. _configuration-update-checking:
|
||||
|
||||
Update Checking
|
||||
###############
|
||||
|
||||
PAPERLESS_ENABLE_UPDATE_CHECK=<bool>
|
||||
Enable (or disable) the automatic check for available updates. This feature is disabled
|
||||
by default but if it is not explicitly set Paperless-ngx will show a message about this.
|
||||
|
||||
If enabled, the feature works by pinging the GitHub API for the latest release, e.g.
|
||||
https://api.github.com/repos/paperless-ngx/paperless-ngx/releases/latest
|
||||
to determine whether a new version is available.
|
||||
|
||||
Actual updating of the app must still be performed manually.
|
||||
|
||||
Note that for users of third-party containers, e.g. linuxserver.io, this notification
|
||||
may be 'ahead' of a new release from the third-party maintainers.
|
||||
|
||||
In either case, no tracking data is collected by the app in any way.
|
||||
|
||||
Defaults to none, which disables the feature.
|
||||
|
@@ -79,7 +79,7 @@ To do the setup you need to perform the steps from the following chapters in a c
|
||||
6. You can now either ...
|
||||
|
||||
* install redis or
|
||||
* use the included scripts/start-services.sh to use docker to fire up a redis instance (and some other services such as tika, gotenberg and a database server) or
|
||||
* use the included scripts/start-services.sh to use docker to fire up a redis instance (and some other services such as tika, gotenberg and a postgresql server) or
|
||||
* spin up a bare redis container
|
||||
|
||||
.. code:: shell-session
|
||||
@@ -334,17 +334,11 @@ directory.
|
||||
Building the Docker image
|
||||
=========================
|
||||
|
||||
The docker image is primarily built by the GitHub actions workflow, but it can be
|
||||
faster when developing to build and tag an image locally.
|
||||
|
||||
To provide the build arguments automatically, build the image using the helper
|
||||
script ``build-docker-image.sh``.
|
||||
|
||||
Building the docker image from source:
|
||||
|
||||
.. code:: shell-session
|
||||
|
||||
./build-docker-image.sh Dockerfile -t <your-tag>
|
||||
docker build . -t <your-tag>
|
||||
|
||||
Extending Paperless
|
||||
===================
|
||||
|
@@ -7,7 +7,7 @@ Frequently asked questions
|
||||
|
||||
**A:** While Paperless-ngx is already considered largely "feature-complete" it is a community-driven
|
||||
project and development will be guided in this way. New features can be submitted via
|
||||
GitHub discussions and "up-voted" by the community but this is not a guarantee the feature
|
||||
GitHub discussions and "up-voted" by the community but this is not a garauntee the feature
|
||||
will be implemented. This project will always be open to collaboration in the form of PRs,
|
||||
ideas etc.
|
||||
|
||||
|
@@ -44,7 +44,7 @@ resources in the documentation:
|
||||
learn about how paperless automates all tagging using machine learning.
|
||||
* Paperless now comes with a :ref:`proper email consumer <usage-email>`
|
||||
that's fully tested and production ready.
|
||||
* Paperless creates searchable PDF/A documents from whatever you put into
|
||||
* Paperless creates searchable PDF/A documents from whatever you you put into
|
||||
the consumption directory. This means that you can select text in
|
||||
image-only documents coming from your scanner.
|
||||
* See :ref:`this note <utilities-encyption>` about GnuPG encryption in
|
||||
@@ -52,7 +52,7 @@ resources in the documentation:
|
||||
* Paperless is now integrated with a
|
||||
:ref:`task processing queue <setup-task_processor>` that tells you
|
||||
at a glance when and why something is not working.
|
||||
* The :doc:`changelog </changelog>` contains a detailed list of all changes
|
||||
* The :ref:`changelog <paperless_changelog>` contains a detailed list of all changes
|
||||
in paperless-ngx.
|
||||
|
||||
Contents
|
||||
|
@@ -1 +0,0 @@
|
||||
myst-parser==0.17.2
|
||||
|
@@ -1,8 +1,134 @@
|
||||
|
||||
.. _scanners:
|
||||
|
||||
*******************
|
||||
Scanners & Software
|
||||
*******************
|
||||
***********************
|
||||
Scanner recommendations
|
||||
***********************
|
||||
|
||||
As Paperless operates by watching a folder for new files, it doesn't care what
|
||||
scanner you use, but sometimes finding a scanner that will write to an FTP,
|
||||
NFS, or SMB server can be difficult. This page is here to help you find one
|
||||
that works right for you based on recommendations from other Paperless users.
|
||||
|
||||
Physical scanners
|
||||
=================
|
||||
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brand | Model | Supports | Recommended By |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| | | FTP | NFS | SMB | SMTP | API [1]_ | |
|
||||
+=========+================+=====+=====+=====+======+==========+================+
|
||||
| Brother | `ADS-1700W`_ | yes | | yes | yes | |`holzhannes`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `ADS-1600W`_ | yes | | yes | yes | |`holzhannes`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `ADS-1500W`_ | yes | | yes | yes | |`danielquinn`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `ADS-1100W`_ | yes | | | | |`ytzelf`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `ADS-2800W`_ | yes | yes | | yes | yes |`philpagel`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `MFC-J6930DW`_ | yes | | | | |`ayounggun`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `MFC-L5850DW`_ | yes | | | yes | |`holzhannes`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `MFC-L2750DW`_ | yes | | yes | yes | |`muued`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `MFC-J5910DW`_ | yes | | | | |`bmsleight`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `MFC-8950DW`_ | yes | | | yes | yes |`philpagel`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Brother | `MFC-9142CDN`_ | yes | | yes | | |`REOLDEV`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Fujitsu | `ix500`_ | yes | | yes | | |`eonist`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Epson | `ES-580W`_ | yes | | yes | yes | |`fignew`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Epson | `WF-7710DWF`_ | yes | | yes | | |`Skylinar`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Fujitsu | `S1300i`_ | yes | | yes | | |`jonaswinkler`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
| Doxie | `Q2`_ | | | | | yes |`Unkn0wnCat`_ |
|
||||
+---------+----------------+-----+-----+-----+------+----------+----------------+
|
||||
|
||||
.. _MFC-L5850DW: https://www.brother-usa.com/products/mfcl5850dw
|
||||
.. _MFC-L2750DW: https://www.brother.de/drucker/laserdrucker/mfc-l2750dw
|
||||
.. _ADS-1700W: https://www.brother-usa.com/products/ads1700w
|
||||
.. _ADS-1600W: https://www.brother-usa.com/products/ads1600w
|
||||
.. _ADS-1500W: https://www.brother.ca/en/p/ads1500w
|
||||
.. _ADS-1100W: https://support.brother.com/g/b/downloadtop.aspx?c=fr&lang=fr&prod=ads1100w_eu_as_cn
|
||||
.. _ADS-2800W: https://www.brother-usa.com/products/ads2800w
|
||||
.. _MFC-J6930DW: https://www.brother.ca/en/p/MFCJ6930DW
|
||||
.. _MFC-J5910DW: https://www.brother.co.uk/printers/inkjet-printers/mfcj5910dw
|
||||
.. _MFC-8950DW: https://www.brother-usa.com/products/mfc8950dw
|
||||
.. _MFC-9142CDN: https://www.brother.co.uk/printers/laser-printers/mfc9140cdn
|
||||
.. _ES-580W: https://epson.com/Support/Scanners/ES-Series/Epson-WorkForce-ES-580W/s/SPT_B11B258201
|
||||
.. _WF-7710DWF: https://www.epson.de/en/products/printers/inkjet-printers/for-home/workforce-wf-7710dwf
|
||||
.. _ix500: http://www.fujitsu.com/us/products/computing/peripheral/scanners/scansnap/ix500/
|
||||
.. _S1300i: https://www.fujitsu.com/global/products/computing/peripheral/scanners/soho/s1300i/
|
||||
.. _Q2: https://www.getdoxie.com/product/doxie-q/
|
||||
|
||||
.. _ayounggun: https://github.com/ayounggun
|
||||
.. _bmsleight: https://github.com/bmsleight
|
||||
.. _danielquinn: https://github.com/danielquinn
|
||||
.. _eonist: https://github.com/eonist
|
||||
.. _fignew: https://github.com/fignew
|
||||
.. _holzhannes: https://github.com/holzhannes
|
||||
.. _jonaswinkler: https://github.com/jonaswinkler
|
||||
.. _REOLDEV: https://github.com/REOLDEV
|
||||
.. _Skylinar: https://github.com/Skylinar
|
||||
.. _ytzelf: https://github.com/ytzelf
|
||||
.. _Unkn0wnCat: https://github.com/Unkn0wnCat
|
||||
.. _muued: https://github.com/muued
|
||||
.. _philpagel: https://github.com/philpagel
|
||||
|
||||
.. [1] Scanners with API integration allow pushing scanned documents directly to the :ref:`Paperless API <api-file_uploads>`, sometimes referred to as Webhook or Document POST.
|
||||
|
||||
Mobile phone software
|
||||
=====================
|
||||
|
||||
You can use your phone to "scan" documents. The regular camera app will work, but its images may have too little contrast for OCR to work well. Apps specifically for scanning are recommended.
|
||||
|
||||
+-----------------------------+----------------+-----+-----+-----+-------+--------+------------------+
|
||||
| Name | OS | Supports | Recommended By |
|
||||
+-----------------------------+----------------+-----+-----+-----+-------+--------+------------------+
|
||||
| | | FTP | NFS | SMB | Email | WebDAV | |
|
||||
+=============================+================+=====+=====+=====+=======+========+==================+
|
||||
| `Office Lens`_ | Android | ? | ? | ? | ? | ? | `jonaswinkler`_ |
|
||||
+-----------------------------+----------------+-----+-----+-----+-------+--------+------------------+
|
||||
| `Genius Scan`_ | Android | yes | no | yes | yes | yes | `hannahswain`_ |
|
||||
+-----------------------------+----------------+-----+-----+-----+-------+--------+------------------+
|
||||
| `OpenScan`_ | Android | no | no | no | no | no | `benjaminfrank`_ |
|
||||
+-----------------------------+----------------+-----+-----+-----+-------+--------+------------------+
|
||||
| `OCR Scanner - QuickScan`_ | iOS | no | no | no | no | yes | `holzhannes`_ |
|
||||
+-----------------------------+----------------+-----+-----+-----+-------+--------+------------------+
|
||||
|
||||
On Android, you can use these applications in combination with one of the :ref:`Paperless-ngx compatible apps <usage-mobile_upload>` to "Share" the documents produced by these scanner apps with paperless. On iOS, you can share the scanned documents via iOS-Sharing to other mail, WebDav or FTP apps.
|
||||
|
||||
.. _Office Lens: https://play.google.com/store/apps/details?id=com.microsoft.office.officelens
|
||||
.. _Genius Scan: https://play.google.com/store/apps/details?id=com.thegrizzlylabs.geniusscan.free
|
||||
.. _OCR Scanner - QuickScan: https://apps.apple.com/us/app/quickscan-scanner-text-ocr/id1513790291
|
||||
.. _OpenScan: https://github.com/Ethereal-Developers-Inc/OpenScan
|
||||
|
||||
.. _hannahswain: https://github.com/hannahswain
|
||||
.. _benjaminfrank: https://github.com/benjaminfrank
|
||||
|
||||
API Scanning Setup
|
||||
==================
|
||||
|
||||
This section contains information on how to set up scanners to post directly to the :ref:`Paperless API <api-file_uploads>`.
|
||||
|
||||
Doxie Q2
|
||||
--------
|
||||
|
||||
This part assumes your Doxie is connected to WiFi and you know its IP.
|
||||
|
||||
1. Open your Doxie web UI by navigating to its IP address
|
||||
2. Navigate to Options -> Webhook
|
||||
3. Set the *URL* to ``https://[your-paperless-ngx-instance]/api/documents/post_document/``
|
||||
4. Set the *File Parameter Name* to ``document``
|
||||
5. Add the username and password to the respective fields (Consider creating a user just for your Doxie)
|
||||
6. Click *Submit* at the bottom of the page
|
||||
|
||||
Congrats, you can now scan directly from your Doxie to your Paperless-ngx instance!
|
||||
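The same endpoint can also be exercised from a shell, which is handy for verifying credentials before configuring a scanner; a rough sketch using ``curl`` and HTTP basic auth (host, user, password and file path are placeholders):

.. code:: shell-session

    curl -u user:password \
         -F document=@/path/to/scan.pdf \
         https://paperless.example.com/api/documents/post_document/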
|
||||
Paperless-ngx is compatible with many different scanners and scanning tools. A user-maintained list of scanners and other software is available on `the wiki <https://github.com/paperless-ngx/paperless-ngx/wiki/Scanner-&-Software-Recommendations>`_.
|
||||
|
@@ -4,60 +4,41 @@
|
||||
Screenshots
|
||||
***********
|
||||
|
||||
This is what Paperless-ngx looks like.
|
||||
This is what paperless-ngx looks like. You shouldn't use paperless to index
|
||||
research papers though, it's a horrible tool for that job.
|
||||
|
||||
The dashboard shows customizable views on your document and allows document uploads:
|
||||
|
||||
.. image:: _static/screenshots/dashboard.png
|
||||
:target: _static/screenshots/dashboard.png
|
||||
|
||||
The document list provides three different styles to scroll through your documents:
|
||||
|
||||
.. image:: _static/screenshots/documents-table.png
|
||||
:target: _static/screenshots/documents-table.png
|
||||
.. image:: _static/screenshots/documents-smallcards.png
|
||||
:target: _static/screenshots/documents-smallcards.png
|
||||
.. image:: _static/screenshots/documents-largecards.png
|
||||
:target: _static/screenshots/documents-largecards.png
|
||||
|
||||
Paperless-ngx also supports "dark mode":
|
||||
|
||||
.. image:: _static/screenshots/documents-smallcards-dark.png
|
||||
:target: _static/screenshots/documents-smallcards-dark.png
|
||||
|
||||
Extensive filtering mechanisms:
|
||||
|
||||
.. image:: _static/screenshots/documents-filter.png
|
||||
:target: _static/screenshots/documents-filter.png
|
||||
|
||||
Bulk editing of document tags, correspondents, etc.:
|
||||
|
||||
.. image:: _static/screenshots/bulk-edit.png
|
||||
:target: _static/screenshots/bulk-edit.png
|
||||
|
||||
Side-by-side editing of documents:
|
||||
Side-by-side editing of documents. Optimized for 1080p.
|
||||
|
||||
.. image:: _static/screenshots/editing.png
|
||||
:target: _static/screenshots/editing.png
|
||||
|
||||
Tag editing. This looks about the same for correspondents and document types.
|
||||
|
||||
.. image:: _static/screenshots/new-tag.png
|
||||
:target: _static/screenshots/new-tag.png
|
||||
|
||||
Searching provides auto complete and highlights the results.
|
||||
|
||||
.. image:: _static/screenshots/search-preview.png
|
||||
:target: _static/screenshots/search-preview.png
|
||||
.. image:: _static/screenshots/search-results.png
|
||||
:target: _static/screenshots/search-results.png
|
||||
|
||||
Fancy mail filters!
|
||||
|
||||
.. image:: _static/screenshots/mail-rules-edited.png
|
||||
:target: _static/screenshots/mail-rules-edited.png
|
||||
|
||||
Mobile devices are supported.
|
||||
Mobile support in the future? This kinda works, however some layouts are still
|
||||
too wide.
|
||||
|
||||
.. image:: _static/screenshots/mobile.png
|
||||
:target: _static/screenshots/mobile.png
|
||||
|
@@ -73,7 +73,7 @@ Paperless consists of the following components:
|
||||
for getting the tasks from the webserver and the consumer to the task scheduler. These run in a different
|
||||
process (maybe even on different machines!), and therefore, this is necessary.
|
||||
|
||||
* Optional: A database server. Paperless supports PostgreSQL, MariaDB and SQLite for storing its data.
|
||||
* Optional: A database server. Paperless supports both PostgreSQL and SQLite for storing its data.
|
||||
|
||||
|
||||
Installation
|
||||
@@ -184,25 +184,6 @@ Install Paperless from Docker Hub
|
||||
port 8000. Modifying the part before the colon will map requests on another
|
||||
port to the webserver running on the default port.
|
||||
|
||||
**Rootless**
|
||||
|
||||
If you want to run Paperless as a rootless container, you will need to do the
|
||||
following in your ``docker-compose.yml``:
|
||||
|
||||
- set the ``user`` running the container to map to the ``paperless`` user in the
|
||||
container.
|
||||
This value (``user_id`` below) should be the same id that ``USERMAP_UID`` and
|
||||
``USERMAP_GID`` are set to in the next step.
|
||||
See ``USERMAP_UID`` and ``USERMAP_GID`` :ref:`here <configuration-docker>`.
|
||||
|
||||
Your entry for Paperless should contain something like:
|
||||
|
||||
.. code::
|
||||
|
||||
webserver:
|
||||
image: ghcr.io/paperless-ngx/paperless-ngx:latest
|
||||
user: <user_id>
|
||||
|
||||
5. Modify ``docker-compose.env``, following the comments in the file. The
|
||||
most important change is to set ``USERMAP_UID`` and ``USERMAP_GID``
|
||||
to the uid and gid of your user on the host system. Use ``id -u`` and
|
||||
@@ -219,20 +200,6 @@ Install Paperless from Docker Hub
|
||||
You can copy any setting from the file ``paperless.conf.example`` and paste it here.
|
||||
Have a look at :ref:`configuration` to see what's available.
|
||||
|
||||
.. note::
|
||||
|
||||
You can utilize Docker secrets for some configuration settings by
|
||||
appending `_FILE` to some configuration values. This is supported currently
|
||||
only by:
|
||||
|
||||
* PAPERLESS_DBUSER
|
||||
* PAPERLESS_DBPASS
|
||||
* PAPERLESS_SECRET_KEY
|
||||
* PAPERLESS_AUTO_LOGIN_USERNAME
|
||||
* PAPERLESS_ADMIN_USER
|
||||
* PAPERLESS_ADMIN_MAIL
|
||||
* PAPERLESS_ADMIN_PASSWORD
|
||||
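As a sketch, a secret that Docker mounts at ``/run/secrets/paperless_dbpass`` (an assumed path) could then be referenced from ``docker-compose.env`` like this, instead of putting the password in the file directly:

.. code::

    PAPERLESS_DBPASS_FILE=/run/secrets/paperless_dbpass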
|
||||
.. caution::
|
||||
|
||||
Some file systems such as NFS network shares don't support file system
|
||||
@@ -317,22 +284,19 @@ writing. Windows is not and will never be supported.
|
||||
* ``python3-pip``
|
||||
* ``python3-dev``
|
||||
|
||||
* ``default-libmysqlclient-dev`` for MariaDB
|
||||
* ``fonts-liberation`` for generating thumbnails for plain text files
|
||||
* ``imagemagick`` >= 6 for PDF conversion
|
||||
* ``optipng`` for optimizing thumbnails
|
||||
* ``gnupg`` for handling encrypted documents
|
||||
* ``libpq-dev`` for PostgreSQL
|
||||
* ``libmagic-dev`` for mime type detection
|
||||
* ``mariadb-client`` for MariaDB compile time
|
||||
* ``mime-support`` for mime type detection
|
||||
* ``libzbar0`` for barcode detection
|
||||
* ``poppler-utils`` for barcode detection
|
||||
|
||||
Use this list for your preferred package management:
|
||||
|
||||
.. code::
|
||||
|
||||
python3 python3-pip python3-dev imagemagick fonts-liberation gnupg libpq-dev libmagic-dev mime-support libzbar0 poppler-utils
|
||||
python3 python3-pip python3-dev imagemagick fonts-liberation optipng gnupg libpq-dev libmagic-dev mime-support
|
||||
|
||||
These dependencies are required for OCRmyPDF, which is used for text recognition.
|
||||
|
||||
@@ -342,7 +306,7 @@ writing. Windows is not and will never be supported.
|
||||
* ``qpdf``
|
||||
* ``liblept5``
|
||||
* ``libxml2``
|
||||
* ``pngquant`` (suggested for certain PDF image optimizations)
|
||||
* ``pngquant``
|
||||
* ``zlib1g``
|
||||
* ``tesseract-ocr`` >= 4.0.0 for OCR
|
||||
* ``tesseract-ocr`` language packs (``tesseract-ocr-eng``, ``tesseract-ocr-deu``, etc)
|
||||
@@ -364,13 +328,7 @@ writing. Windows is not and will never be supported.
|
||||
2. Install ``redis`` >= 5.0 and configure it to start automatically.
|
||||
|
||||
3. Optional. Install ``postgresql`` and configure a database, user and password for paperless. If you do not wish
|
||||
to use PostgreSQL, MariaDB and SQLite are available as well.
|
||||
|
||||
.. note::
|
||||
|
||||
On bare-metal installations using SQLite, ensure the
|
||||
`JSON1 extension <https://code.djangoproject.com/wiki/JSON1Extension>`_ is enabled. This is
|
||||
usually the case, but not always.
|
||||
to use PostgreSQL, SQLite is available as well.
|
||||
|
||||
4. Get the release archive from `<https://github.com/paperless-ngx/paperless-ngx/releases>`_.
|
||||
If you clone the git repo as it is, you also have to compile the front end by yourself.
|
||||
@@ -380,7 +338,6 @@ writing. Windows is not and will never be supported.
|
||||
settings to your needs. Required settings for getting paperless running are:
|
||||
|
||||
* ``PAPERLESS_REDIS`` should point to your redis server, such as redis://localhost:6379.
|
||||
* ``PAPERLESS_DBENGINE`` optional, and should be one of `postgres, mariadb, or sqlite`
|
||||
* ``PAPERLESS_DBHOST`` should be the hostname on which your PostgreSQL server is running. Do not configure this
|
||||
to use SQLite instead. Also configure port, database name, user and password as necessary.
|
||||
* ``PAPERLESS_CONSUMPTION_DIR`` should point to a folder which paperless should watch for documents. You might
|
||||
@@ -388,8 +345,6 @@ writing. Windows is not and will never be supported.
|
||||
paperless stores its data. If you like, you can point both to the same directory.
|
||||
* ``PAPERLESS_SECRET_KEY`` should be a random sequence of characters. It's used for authentication. Failure
|
||||
to set it allows third parties to forge authentication credentials.
|
||||
* ``PAPERLESS_URL`` if you are behind a reverse proxy. This should point to your domain. Please see
|
||||
:ref:`configuration` for more information.
|
||||
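Putting the required settings together, a minimal ``paperless.conf`` sketch for a PostgreSQL setup might look roughly like this (all paths, host names and the key are placeholders):

.. code::

    PAPERLESS_REDIS=redis://localhost:6379
    PAPERLESS_DBHOST=localhost
    PAPERLESS_CONSUMPTION_DIR=/opt/paperless/consume
    PAPERLESS_DATA_DIR=/opt/paperless/data
    PAPERLESS_SECRET_KEY=change-me-to-a-long-random-string
    PAPERLESS_URL=https://paperless.example.com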
|
||||
Many more adjustments can be made to paperless, especially the OCR part. The following options are recommended
|
||||
for everyone:
|
||||
@@ -554,7 +509,7 @@ how you installed paperless.
|
||||
This setup describes how to update an existing paperless Docker installation.
|
||||
The important things to keep in mind are as follows:
|
||||
|
||||
* Read the :doc:`changelog </changelog>` and take note of breaking changes.
|
||||
* Read the :ref:`changelog <paperless_changelog>` and take note of breaking changes.
|
||||
* You should decide if you want to stick with SQLite or want to migrate your database
|
||||
to PostgreSQL. See :ref:`setup-sqlite_to_psql` for details on how to move your data from
|
||||
SQLite to PostgreSQL. Both work fine with paperless. However, if you already have a
|
||||
@@ -765,8 +720,12 @@ configuring some options in paperless can help improve performance immensely:
|
||||
* If you want to perform OCR on the device, consider using ``PAPERLESS_OCR_CLEAN=none``.
|
||||
This will speed up OCR times and use less memory at the expense of slightly worse
|
||||
OCR results.
|
||||
* Set ``PAPERLESS_OPTIMIZE_THUMBNAILS`` to 'false' if you want faster consumption
|
||||
times. Thumbnails will be about 20% larger.
|
||||
* If using docker, consider setting ``PAPERLESS_WEBSERVER_WORKERS`` to
|
||||
1. This will save some memory.
|
||||
* Use the arm compatible docker-compose if you're wanting to use Tika on something like
|
||||
a raspberry pi. The official apache/tika image does not support the arm architecture.
|
||||
|
||||
For details, refer to :ref:`configuration`.
|
||||
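A combined sketch of these suggestions as environment variables (adjust to your own trade-off between speed and quality):

.. code::

    PAPERLESS_OCR_CLEAN=none
    PAPERLESS_OPTIMIZE_THUMBNAILS=false
    PAPERLESS_WEBSERVER_WORKERS=1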
|
||||
@@ -825,6 +784,4 @@ the following configuration is required for paperless to operate:
|
||||
}
|
||||
}
|
||||
|
||||
The ``PAPERLESS_URL`` configuration variable is also required when using a reverse proxy. Please refer to the :ref:`hosting-and-security` docs.
|
||||
|
||||
Also read `this <https://channels.readthedocs.io/en/stable/deploying.html#nginx-supervisor-ubuntu>`__, towards the end of the section.
|
||||
|
@@ -125,7 +125,7 @@ If using docker-compose, this is achieved by the following configuration change
|
||||
.. code:: yaml
|
||||
|
||||
gotenberg:
|
||||
image: gotenberg/gotenberg:7.4
|
||||
image: gotenberg/gotenberg:7
|
||||
restart: unless-stopped
|
||||
command:
|
||||
- "gotenberg"
|
||||
@@ -235,85 +235,3 @@ You might find messages like these in your log files:
|
||||
This indicates that paperless failed to read PDF metadata from one of your documents. This happens when you
|
||||
open the affected documents in paperless for editing. Paperless will continue to work, and will simply not
|
||||
show the invalid metadata.
|
||||
|
||||
Consumer fails with a FileNotFoundError
|
||||
#######################################
|
||||
|
||||
You might find messages like these in your log files:
|
||||
|
||||
.. code::
|
||||
|
||||
[ERROR] [paperless.consumer] Error while consuming document SCN_0001.pdf: FileNotFoundError: [Errno 2] No such file or directory: '/tmp/ocrmypdf.io.yhk3zbv0/origin.pdf'
|
||||
Traceback (most recent call last):
|
||||
File "/app/paperless/src/paperless_tesseract/parsers.py", line 261, in parse
|
||||
ocrmypdf.ocr(**args)
|
||||
File "/usr/local/lib/python3.8/dist-packages/ocrmypdf/api.py", line 337, in ocr
|
||||
return run_pipeline(options=options, plugin_manager=plugin_manager, api=True)
|
||||
File "/usr/local/lib/python3.8/dist-packages/ocrmypdf/_sync.py", line 385, in run_pipeline
|
||||
exec_concurrent(context, executor)
|
||||
File "/usr/local/lib/python3.8/dist-packages/ocrmypdf/_sync.py", line 302, in exec_concurrent
|
||||
pdf = post_process(pdf, context, executor)
|
||||
File "/usr/local/lib/python3.8/dist-packages/ocrmypdf/_sync.py", line 235, in post_process
|
||||
pdf_out = metadata_fixup(pdf_out, context)
|
||||
File "/usr/local/lib/python3.8/dist-packages/ocrmypdf/_pipeline.py", line 798, in metadata_fixup
|
||||
with pikepdf.open(context.origin) as original, pikepdf.open(working_file) as pdf:
|
||||
File "/usr/local/lib/python3.8/dist-packages/pikepdf/_methods.py", line 923, in open
|
||||
pdf = Pdf._open(
|
||||
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/ocrmypdf.io.yhk3zbv0/origin.pdf'
|
||||
|
||||
This probably indicates paperless tried to consume the same file twice. This can happen for a number of reasons,
|
||||
depending on how documents are placed into the consume folder. If paperless is using inotify (the default) to
|
||||
check for documents, try adjusting the :ref:`inotify configuration <configuration-inotify>`. If polling is enabled,
|
||||
try adjusting the :ref:`polling configuration <configuration-polling>`.
|
||||
|
||||
Consumer fails waiting for file to remain unmodified.
|
||||
#####################################################
|
||||
|
||||
You might find messages like these in your log files:
|
||||
|
||||
.. code::
|
||||
|
||||
[ERROR] [paperless.management.consumer] Timeout while waiting on file /usr/src/paperless/src/../consume/SCN_0001.pdf to remain unmodified.
|
||||
|
||||
This indicates paperless timed out while waiting for the file to be completely written to the consume folder.
|
||||
Adjusting :ref:`polling configuration <configuration-polling>` values should resolve the issue.
|
||||
|
||||
.. note::
|
||||
|
||||
The user will need to manually move the file out of the consume folder and
|
||||
back in, for the initial failing file to be consumed.
|
||||
|
||||
Consumer fails reporting "OS reports file as busy still".
|
||||
#########################################################
|
||||
|
||||
You might find messages like these in your log files:
|
||||
|
||||
.. code::
|
||||
|
||||
[WARNING] [paperless.management.consumer] Not consuming file /usr/src/paperless/src/../consume/SCN_0001.pdf: OS reports file as busy still
|
||||
|
||||
This indicates paperless was unable to open the file, as the OS reported the file as still being in use. To prevent a
|
||||
crash, paperless did not try to consume the file. If paperless is using inotify (the default) to
|
||||
check for documents, try adjusting the :ref:`inotify configuration <configuration-inotify>`. If polling is enabled,
|
||||
try adjusting the :ref:`polling configuration <configuration-polling>`.
|
||||
|
||||
.. note::
|
||||
|
||||
The user will need to manually move the file out of the consume folder and
|
||||
back in, for the initial failing file to be consumed.
|
||||
|
||||
Log reports "Creating PaperlessTask failed".
|
||||
#########################################################
|
||||
|
||||
You might find messages like these in your log files:
|
||||
|
||||
.. code::
|
||||
|
||||
[ERROR] [paperless.management.consumer] Creating PaperlessTask failed: db locked
|
||||
|
||||
You are likely using an SQLite-based installation with an increased number of workers, and are running into SQLite's concurrency limitations.
|
||||
Uploading or consuming multiple files at once results in many workers attempting to access the database simultaneously.
|
||||
|
||||
Consider changing to the PostgreSQL database if you will often be processing many documents at once. Otherwise,
|
||||
try tweaking the ``PAPERLESS_DB_TIMEOUT`` setting to allow more time for the database to unlock. This may have
|
||||
minor performance implications.
|
||||
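For example, a hedged starting point might be to allow the database up to 30 seconds to unlock:

.. code::

    PAPERLESS_DB_TIMEOUT=30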
|
@@ -62,7 +62,7 @@ your documents:
|
||||
|
||||
1. OCR the document, if it has no text. Digital documents usually have text,
|
||||
and this step will be skipped for those documents.
|
||||
2. Paperless will create an archivable PDF/A document from your document.
|
||||
2. Paperless will create an archiveable PDF/A document from your document.
|
||||
If this document is coming from your scanner, it will have embedded selectable text.
|
||||
3. Paperless performs automatic matching of tags, correspondents and types on the
|
||||
document before storing it in the database.
|
||||
@@ -102,14 +102,12 @@ files from the scanner. Typically, you're looking at an FTP server like
|
||||
|
||||
.. TODO: hyperref to configuration of the location of this magic folder.
|
||||
|
||||
Web UI Upload
|
||||
=============
|
||||
Dashboard upload
|
||||
================
|
||||
|
||||
The dashboard has a file drop field to upload documents to paperless. Simply drag a file
|
||||
onto this field or select a file with the file dialog. Multiple files are supported.
|
||||
|
||||
You can also upload documents on any other page of the web UI by dragging-and-dropping
|
||||
files into your browser window.
|
||||
|
||||
.. _usage-mobile_upload:
|
||||
|
||||
@@ -161,9 +159,6 @@ These are as follows:
|
||||
will not consume flagged mails.
|
||||
* **Move to folder:** Moves consumed mails out of the way so that paperless won't
|
||||
consume them again.
|
||||
* **Add custom Tag:** Adds a custom tag to mails with consumed documents (the IMAP
|
||||
standard calls these "keywords"). Paperless will not consume mails already tagged.
|
||||
Not all mail servers support this feature!
|
||||
|
||||
.. caution::
|
||||
|
||||
@@ -183,15 +178,6 @@ These are as follows:
|
||||
automatically or manually and tell paperless to move them to yet another folder
|
||||
after consumption. It's up to you.
|
||||
|
||||
.. note::
|
||||
|
||||
When defining a mail rule with a folder, you may need to try different characters to
|
||||
define how the sub-folders are separated. Common values include ".", "/" or "|", but
|
||||
this varies by the mail server. Check the documentation for your mail server. In the
|
||||
event of an error fetching mail from a certain folder, check the Paperless logs. When
|
||||
a folder is not located, Paperless will attempt to list all folders found in the account
|
||||
to the Paperless logs.
|
||||
|
||||
.. note::
|
||||
|
||||
Paperless will process the rules in the order defined in the admin page.
|
||||
|
@@ -1,17 +1,9 @@
|
||||
import os
|
||||
|
||||
# See https://docs.gunicorn.org/en/stable/settings.html for
|
||||
# explanations of settings
|
||||
|
||||
bind = f'{os.getenv("PAPERLESS_BIND_ADDR", "[::]")}:{os.getenv("PAPERLESS_PORT", 8000)}'
|
||||
|
||||
workers = int(os.getenv("PAPERLESS_WEBSERVER_WORKERS", 1))
|
||||
bind = f'0.0.0.0:{os.getenv("PAPERLESS_PORT", 8000)}'
|
||||
workers = int(os.getenv("PAPERLESS_WEBSERVER_WORKERS", 2))
|
||||
worker_class = "paperless.workers.ConfigurableWorker"
|
||||
timeout = 120
|
||||
preload_app = True
|
||||
|
||||
# https://docs.gunicorn.org/en/stable/faq.html#blocking-os-fchmod
|
||||
worker_tmp_dir = "/dev/shm"
|
||||
|
||||
|
||||
def pre_fork(server, worker):
|
||||
@@ -32,7 +24,7 @@ def worker_int(worker):
|
||||
## get traceback info
|
||||
import threading, sys, traceback
|
||||
|
||||
id2name = {th.ident: th.name for th in threading.enumerate()}
|
||||
id2name = dict([(th.ident, th.name) for th in threading.enumerate()])
|
||||
code = []
|
||||
for threadId, stack in sys._current_frames().items():
|
||||
code.append("\n# Thread: %s(%d)" % (id2name.get(threadId, ""), threadId))
|
||||
|
@@ -1,4 +1,4 @@
|
||||
#!/usr/bin/env bash
|
||||
#!/bin/bash
|
||||
|
||||
ask() {
|
||||
while true ; do
|
||||
@@ -47,29 +47,24 @@ if [[ $(id -u) == "0" ]] ; then
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if ! command -v wget &> /dev/null ; then
|
||||
if [[ -z $(which wget) ]] ; then
|
||||
echo "wget executable not found. Is wget installed?"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if ! command -v docker &> /dev/null ; then
|
||||
if [[ -z $(which docker) ]] ; then
|
||||
echo "docker executable not found. Is docker installed?"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
DOCKER_COMPOSE_CMD="docker-compose"
|
||||
if ! command -v ${DOCKER_COMPOSE_CMD} ; then
|
||||
if docker compose version &> /dev/null ; then
|
||||
DOCKER_COMPOSE_CMD="docker compose"
|
||||
else
|
||||
echo "docker-compose executable not found. Is docker-compose installed?"
|
||||
exit 1
|
||||
fi
|
||||
if [[ -z $(which docker-compose) ]] ; then
|
||||
echo "docker-compose executable not found. Is docker-compose installed?"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check if user has permissions to run Docker by trying to get the status of Docker (docker stats).
|
||||
# If this fails, the user probably does not have permissions for Docker.
|
||||
if ! docker stats --no-stream &> /dev/null ; then
|
||||
if [ ! "$(docker stats --no-stream 2>/dev/null 1>&2)" ] ; then
|
||||
echo ""
|
||||
echo "WARN: It look like the current user does not have Docker permissions."
|
||||
echo "WARN: Use 'sudo usermod -aG docker $USER' to assign Docker permissions to the user."
|
||||
@@ -92,14 +87,6 @@ echo ""
|
||||
echo "1. Application configuration"
|
||||
echo "============================"
|
||||
|
||||
echo ""
|
||||
echo "The URL paperless will be available at. This is required if the"
|
||||
echo "installation will be accessible via the web, otherwise can be left blank."
|
||||
echo ""
|
||||
|
||||
ask "URL" ""
|
||||
URL=$ask_result
|
||||
|
||||
echo ""
|
||||
echo "The port on which the paperless webserver will listen for incoming"
|
||||
echo "connections."
|
||||
@@ -118,12 +105,12 @@ ask "Current time zone" "$default_time_zone"
|
||||
TIME_ZONE=$ask_result
|
||||
|
||||
echo ""
|
||||
echo "Database backend: PostgreSQL, MariaDB, and SQLite are available. Use PostgreSQL"
|
||||
echo "Database backend: PostgreSQL and SQLite are available. Use PostgreSQL"
|
||||
echo "if unsure. If you're running on a low-power device such as Raspberry"
|
||||
echo "Pi, use SQLite to save resources."
|
||||
echo ""
|
||||
|
||||
ask "Database backend" "postgres" "postgres sqlite mariadb"
|
||||
ask "Database backend" "postgres" "postgres sqlite"
|
||||
DATABASE_BACKEND=$ask_result
|
||||
|
||||
echo ""
|
||||
@@ -214,9 +201,9 @@ echo ""
|
||||
ask_docker_folder "Data folder" ""
|
||||
DATA_FOLDER=$ask_result
|
||||
|
||||
if [[ "$DATABASE_BACKEND" == "postgres" || "$DATABASE_BACKEND" == "mariadb" ]] ; then
|
||||
if [[ "$DATABASE_BACKEND" == "postgres" ]] ; then
|
||||
echo ""
|
||||
echo "The database folder, where your database stores its data."
|
||||
echo "The database folder, where postgres stores its data."
|
||||
echo "Leave empty to have this managed by docker."
|
||||
echo ""
|
||||
echo "CAUTION: If specified, you must specify an absolute path starting with /"
|
||||
@@ -224,7 +211,7 @@ if [[ "$DATABASE_BACKEND" == "postgres" || "$DATABASE_BACKEND" == "mariadb" ]] ;
|
||||
echo ""
|
||||
|
||||
ask_docker_folder "Database folder" ""
|
||||
DATABASE_FOLDER=$ask_result
|
||||
POSTGRES_FOLDER=$ask_result
|
||||
fi
|
||||
|
||||
echo ""
|
||||
@@ -278,16 +265,14 @@ if [[ -z $DATA_FOLDER ]] ; then
|
||||
else
|
||||
echo "Data folder: $DATA_FOLDER"
|
||||
fi
|
||||
if [[ "$DATABASE_BACKEND" == "postgres" || "$DATABASE_BACKEND" == "mariadb" ]] ; then
|
||||
if [[ -z $DATABASE_FOLDER ]] ; then
|
||||
echo "Database folder: Managed by docker"
|
||||
if [[ "$DATABASE_BACKEND" == "postgres" ]] ; then
|
||||
if [[ -z $POSTGRES_FOLDER ]] ; then
|
||||
echo "Database (postgres) folder: Managed by docker"
|
||||
else
|
||||
echo "Database folder: $DATABASE_FOLDER"
|
||||
echo "Database (postgres) folder: $POSTGRES_FOLDER"
|
||||
fi
|
||||
fi
|
||||
|
||||
echo ""
|
||||
echo "URL: $URL"
|
||||
echo "Port: $PORT"
|
||||
echo "Database: $DATABASE_BACKEND"
|
||||
echo "Tika enabled: $TIKA_ENABLED"
|
||||
@@ -320,15 +305,9 @@ wget "https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docker/
|
||||
|
||||
SECRET_KEY=$(tr -dc 'a-zA-Z0-9' < /dev/urandom | fold -w 64 | head -n 1)
|
||||
|
||||
DEFAULT_LANGUAGES=("deu eng fra ita spa")
|
||||
|
||||
_split_langs="${OCR_LANGUAGE//+/ }"
|
||||
read -r -a OCR_LANGUAGES_ARRAY <<< "${_split_langs}"
|
||||
DEFAULT_LANGUAGES="deu eng fra ita spa"
|
||||
|
||||
{
|
||||
if [[ ! $URL == "" ]] ; then
|
||||
echo "PAPERLESS_URL=$URL"
|
||||
fi
|
||||
if [[ ! $USERMAP_UID == "1000" ]] ; then
|
||||
echo "USERMAP_UID=$USERMAP_UID"
|
||||
fi
|
||||
@@ -338,8 +317,8 @@ read -r -a OCR_LANGUAGES_ARRAY <<< "${_split_langs}"
|
||||
echo "PAPERLESS_TIME_ZONE=$TIME_ZONE"
|
||||
echo "PAPERLESS_OCR_LANGUAGE=$OCR_LANGUAGE"
|
||||
echo "PAPERLESS_SECRET_KEY=$SECRET_KEY"
|
||||
if [[ ! ${DEFAULT_LANGUAGES[*]} =~ ${OCR_LANGUAGES_ARRAY[*]} ]] ; then
|
||||
echo "PAPERLESS_OCR_LANGUAGES=${OCR_LANGUAGES_ARRAY[*]}"
|
||||
if [[ ! " ${DEFAULT_LANGUAGES[*]} " =~ ${OCR_LANGUAGE} ]] ; then
|
||||
echo "PAPERLESS_OCR_LANGUAGES=$OCR_LANGUAGE"
|
||||
fi
|
||||
} > docker-compose.env
|
||||
|
||||
@@ -357,16 +336,9 @@ if [[ -n $DATA_FOLDER ]] ; then
|
||||
sed -i "/^\s*data:/d" docker-compose.yml
|
||||
fi
|
||||
|
||||
# If the database folder was provided (not blank), replace the pgdata/dbdata volume with a bind mount
|
||||
# of the provided folder
|
||||
if [[ -n $DATABASE_FOLDER ]] ; then
|
||||
if [[ "$DATABASE_BACKEND" == "postgres" ]] ; then
|
||||
sed -i "s#- pgdata:/var/lib/postgresql/data#- $DATABASE_FOLDER:/var/lib/postgresql/data#g" docker-compose.yml
|
||||
sed -i "/^\s*pgdata:/d" docker-compose.yml
|
||||
elif [[ "$DATABASE_BACKEND" == "mariadb" ]]; then
|
||||
sed -i "s#- dbdata:/var/lib/mysql#- $DATABASE_FOLDER:/var/lib/mysql#g" docker-compose.yml
|
||||
sed -i "/^\s*dbdata:/d" docker-compose.yml
|
||||
fi
|
||||
if [[ -n $POSTGRES_FOLDER ]] ; then
|
||||
sed -i "s#- pgdata:/var/lib/postgresql/data#- $POSTGRES_FOLDER:/var/lib/postgresql/data#g" docker-compose.yml
|
||||
sed -i "/^\s*pgdata:/d" docker-compose.yml
|
||||
fi
|
||||
|
||||
# remove trailing blank lines from end of file
|
||||
@@ -379,8 +351,8 @@ if [ "$l1" -eq "$l2" ] ; then
|
||||
fi
|
||||
|
||||
|
||||
${DOCKER_COMPOSE_CMD} pull
|
||||
docker-compose pull
|
||||
|
||||
${DOCKER_COMPOSE_CMD} run --rm -e DJANGO_SUPERUSER_PASSWORD="$PASSWORD" webserver createsuperuser --noinput --username "$USERNAME" --email "$EMAIL"
|
||||
docker-compose run --rm -e DJANGO_SUPERUSER_PASSWORD="$PASSWORD" webserver createsuperuser --noinput --username "$USERNAME" --email "$EMAIL"
|
||||
|
||||
${DOCKER_COMPOSE_CMD} up --detach
|
||||
docker-compose up -d
|
||||
|
@@ -23,15 +23,12 @@
|
||||
#PAPERLESS_MEDIA_ROOT=../media
|
||||
#PAPERLESS_STATICDIR=../static
|
||||
#PAPERLESS_FILENAME_FORMAT=
|
||||
#PAPERLESS_FILENAME_FORMAT_REMOVE_NONE=
|
||||
|
||||
# Security and hosting
|
||||
|
||||
#PAPERLESS_SECRET_KEY=change-me
|
||||
#PAPERLESS_URL=https://example.com
|
||||
#PAPERLESS_CSRF_TRUSTED_ORIGINS=https://example.com # can be set using PAPERLESS_URL
|
||||
#PAPERLESS_ALLOWED_HOSTS=example.com,www.example.com # can be set using PAPERLESS_URL
|
||||
#PAPERLESS_CORS_ALLOWED_HOSTS=https://localhost:8080,https://example.com # can be set using PAPERLESS_URL
|
||||
#PAPERLESS_ALLOWED_HOSTS=example.com,www.example.com
|
||||
#PAPERLESS_CORS_ALLOWED_HOSTS=http://example.com,http://localhost:8000
|
||||
#PAPERLESS_FORCE_SCRIPT_NAME=
|
||||
#PAPERLESS_STATIC_URL=/static/
|
||||
#PAPERLESS_AUTO_LOGIN_USERNAME=
|
||||
@@ -61,18 +58,15 @@
|
||||
#PAPERLESS_CONSUMER_POLLING=10
|
||||
#PAPERLESS_CONSUMER_DELETE_DUPLICATES=false
|
||||
#PAPERLESS_CONSUMER_RECURSIVE=false
|
||||
#PAPERLESS_CONSUMER_IGNORE_PATTERNS=[".DS_STORE/*", "._*", ".stfolder/*", ".stversions/*", ".localized/*", "desktop.ini"]
|
||||
#PAPERLESS_CONSUMER_IGNORE_PATTERNS=[".DS_STORE/*", "._*", ".stfolder/*"]
|
||||
#PAPERLESS_CONSUMER_SUBDIRS_AS_TAGS=false
|
||||
#PAPERLESS_CONSUMER_ENABLE_BARCODES=false
|
||||
#PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT
|
||||
#PAPERLESS_OPTIMIZE_THUMBNAILS=true
|
||||
#PAPERLESS_PRE_CONSUME_SCRIPT=/path/to/an/arbitrary/script.sh
|
||||
#PAPERLESS_POST_CONSUME_SCRIPT=/path/to/an/arbitrary/script.sh
|
||||
#PAPERLESS_FILENAME_DATE_ORDER=YMD
|
||||
#PAPERLESS_FILENAME_PARSE_TRANSFORMS=[]
|
||||
#PAPERLESS_NUMBER_OF_SUGGESTED_DATES=5
|
||||
#PAPERLESS_THUMBNAIL_FONT_NAME=
|
||||
#PAPERLESS_IGNORE_DATES=
|
||||
#PAPERLESS_ENABLE_UPDATE_CHECK=
|
||||
|
||||
# Tika settings
|
||||
|
||||
@@ -84,3 +78,4 @@
|
||||
|
||||
#PAPERLESS_CONVERT_BINARY=/usr/bin/convert
|
||||
#PAPERLESS_GS_BINARY=/usr/bin/gs
|
||||
#PAPERLESS_OPTIPNG_BINARY=/usr/bin/optipng
|
||||
|
111
requirements.txt
Normal file
@@ -0,0 +1,111 @@
|
||||
#
|
||||
# These requirements were autogenerated by pipenv
|
||||
# To regenerate from the project's Pipfile, run:
|
||||
#
|
||||
# pipenv lock --requirements
|
||||
#
|
||||
|
||||
-i https://pypi.python.org/simple
|
||||
--extra-index-url https://www.piwheels.org/simple
|
||||
aioredis==1.3.1
|
||||
anyio==3.5.0; python_full_version >= '3.6.2'
|
||||
arrow==1.2.2; python_version >= '3.6'
|
||||
asgiref==3.5.0; python_version >= '3.7'
|
||||
async-timeout==4.0.2; python_version >= '3.6'
|
||||
attrs==21.4.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
|
||||
autobahn==22.2.2; python_version >= '3.7'
|
||||
automat==20.2.0
|
||||
backports.zoneinfo==0.2.1; python_version < '3.9'
|
||||
blessed==1.19.1; python_version >= '2.7'
|
||||
certifi==2021.10.8
|
||||
cffi==1.15.0
|
||||
channels-redis==3.4.0
channels==3.0.4
chardet==4.0.0; python_version >= '3.1'
charset-normalizer==2.0.12; python_version >= '3'
click==8.0.4; python_version >= '3.6'
coloredlogs==15.0.1; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
concurrent-log-handler==0.9.20
constantly==15.1.0
cryptography==36.0.2; python_version >= '3.6'
daphne==3.0.2; python_version >= '3.6'
dateparser==1.1.1
django-cors-headers==3.11.0
django-extensions==3.1.5
django-filter==21.1
django-picklefield==3.0.1; python_version >= '3'
django-q==1.3.9
django==4.0.3
djangorestframework==3.13.1
filelock==3.6.0
fuzzywuzzy[speedup]==0.18.0
gunicorn==20.1.0
h11==0.13.0; python_version >= '3.6'
hiredis==2.0.0; python_version >= '3.6'
httptools==0.4.0
humanfriendly==10.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
hyperlink==21.0.0
idna==3.3; python_version >= '3.5'
imap-tools==0.53.0
img2pdf==0.4.3
importlib-resources==5.6.0; python_version < '3.9'
incremental==21.3.0
inotify-simple==1.3.5; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
inotifyrecursive==0.3.5
joblib==1.1.0; python_version >= '3.6'
langdetect==1.0.9
lxml==4.8.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
msgpack==1.0.3
numpy==1.22.3; python_version >= '3.8'
ocrmypdf==13.4.1
packaging==21.3; python_version >= '3.6'
pathvalidate==2.5.0
pdfminer.six==20211012
pikepdf==5.1.0
pillow==9.0.1
pluggy==1.0.0; python_version >= '3.6'
portalocker==2.4.0; python_version >= '3'
psycopg2==2.9.3
pyasn1-modules==0.2.8
pyasn1==0.4.8
pycparser==2.21
pyopenssl==22.0.0
pyparsing==3.0.7; python_version >= '3.6'
python-dateutil==2.8.2
python-dotenv==0.20.0
python-gnupg==0.4.8
python-levenshtein==0.12.2
python-magic==0.4.25
pytz-deprecation-shim==0.1.0.post0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
pytz==2022.1
pyyaml==6.0
redis==3.5.3
regex==2022.3.2; python_version >= '3.6'
reportlab==3.6.9; python_version >= '3.7' and python_version < '4'
requests==2.27.1; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
scikit-learn==1.0.2
scipy==1.8.0; python_version < '3.11' and python_version >= '3.8'
service-identity==21.1.0
setuptools==61.1.1; python_version >= '3.7'
six==1.16.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
sniffio==1.2.0; python_version >= '3.5'
sqlparse==0.4.2; python_version >= '3.5'
threadpoolctl==3.1.0; python_version >= '3.6'
tika==1.24
tqdm==4.63.1
twisted[tls]==22.2.0; python_full_version >= '3.6.7'
txaio==22.2.1; python_version >= '3.6'
typing-extensions==4.1.1; python_version >= '3.6'
tzdata==2022.1; python_version >= '3.6'
tzlocal==4.1; python_version >= '3.6'
urllib3==1.26.9; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'
uvicorn[standard]==0.17.6
uvloop==0.16.0
watchdog==2.1.7
watchgod==0.8.1
wcwidth==0.2.5
websockets==10.2
whitenoise==6.0.0
whoosh==2.7.4
zipp==3.7.0; python_version < '3.9'
zope.interface==5.4.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
@@ -1,16 +1,21 @@
#!/usr/bin/env bash

DOCUMENT_ID=${1}
DOCUMENT_FILE_NAME=${2}
DOCUMENT_SOURCE_PATH=${3}
DOCUMENT_THUMBNAIL_PATH=${4}
DOCUMENT_DOWNLOAD_URL=${5}
DOCUMENT_THUMBNAIL_URL=${6}
DOCUMENT_CORRESPONDENT=${7}
DOCUMENT_TAGS=${8}

echo "

A document with an id of ${DOCUMENT_ID} was just consumed. I know the
following additional information about it:

* Generated File Name: ${DOCUMENT_FILE_NAME}
* Archive Path: ${DOCUMENT_ARCHIVE_PATH}
* Source Path: ${DOCUMENT_SOURCE_PATH}
* Created: ${DOCUMENT_CREATED}
* Added: ${DOCUMENT_ADDED}
* Modified: ${DOCUMENT_MODIFIED}
* Thumbnail Path: ${DOCUMENT_THUMBNAIL_PATH}
* Download URL: ${DOCUMENT_DOWNLOAD_URL}
* Thumbnail URL: ${DOCUMENT_THUMBNAIL_URL}
@@ -2,5 +2,5 @@

docker run -p 5432:5432 -e POSTGRES_PASSWORD=password -v paperless_pgdata:/var/lib/postgresql/data -d postgres:13
docker run -d -p 6379:6379 redis:latest
docker run -p 3000:3000 -d gotenberg/gotenberg:7.4
docker run -p 9998:9998 -d ghcr.io/paperless-ngx/tika:latest
docker run -p 3000:3000 -d gotenberg/gotenberg:7
docker run -p 9998:9998 -d apache/tika
@@ -1,13 +0,0 @@
import { defineConfig } from 'cypress'

export default defineConfig({
  videosFolder: 'cypress/videos',
  screenshotsFolder: 'cypress/screenshots',
  fixturesFolder: 'cypress/fixtures',
  e2e: {
    setupNodeEvents(on, config) {
      return require('./cypress/plugins/index.ts')(on, config)
    },
    baseUrl: 'http://localhost:4200',
  },
})
9
src-ui/cypress.json
Normal file
@@ -0,0 +1,9 @@
{
  "integrationFolder": "cypress/integration",
  "supportFile": "cypress/support/index.ts",
  "videosFolder": "cypress/videos",
  "screenshotsFolder": "cypress/screenshots",
  "pluginsFile": "cypress/plugins/index.ts",
  "fixturesFolder": "cypress/fixtures",
  "baseUrl": "http://localhost:4200"
}
@@ -1,94 +0,0 @@
describe('document-detail', () => {
  beforeEach(() => {
    // also uses global fixtures from cypress/support/e2e.ts

    this.modifiedDocuments = []

    cy.fixture('documents/documents.json').then((documentsJson) => {
      cy.intercept('GET', 'http://localhost:8000/api/documents/1/', (req) => {
        let response = { ...documentsJson }
        response = response.results.find((d) => d.id == 1)
        req.reply(response)
      })
    })

    cy.intercept('PUT', 'http://localhost:8000/api/documents/1/', (req) => {
      this.modifiedDocuments.push(req.body) // store this for later
      req.reply({ result: 'OK' })
    }).as('saveDoc')

    cy.fixture('documents/1/comments.json').then((commentsJson) => {
      cy.intercept(
        'GET',
        'http://localhost:8000/api/documents/1/comments/',
        (req) => {
          req.reply(commentsJson.filter((c) => c.id != 10)) // 3
        }
      )

      cy.intercept(
        'DELETE',
        'http://localhost:8000/api/documents/1/comments/?id=9',
        (req) => {
          req.reply(commentsJson.filter((c) => c.id != 9 && c.id != 10)) // 2
        }
      )

      cy.intercept(
        'POST',
        'http://localhost:8000/api/documents/1/comments/',
        (req) => {
          req.reply(commentsJson) // 4
        }
      )
    })

    cy.viewport(1024, 1024)
    cy.visit('/documents/1/')
  })

  it('should activate / deactivate save button when changes are saved', () => {
    cy.contains('button', 'Save').should('be.disabled')
    cy.get('app-input-text[formcontrolname="title"]')
      .type(' additional')
      .wait(1500) // this delay is for frontend debounce
    cy.contains('button', 'Save').should('not.be.disabled')
  })

  it('should warn on unsaved changes', () => {
    cy.get('app-input-text[formcontrolname="title"]')
      .type(' additional')
      .wait(1500) // this delay is for frontend debounce
    cy.get('button[title="Close"]').click()
    cy.contains('You have unsaved changes')
    cy.contains('button', 'Cancel').click().wait(150)
    cy.contains('button', 'Save').click().wait('@saveDoc').wait(2000) // navigates away after saving
    cy.contains('You have unsaved changes').should('not.exist')
  })

  it('should show a list of comments', () => {
    cy.wait(1000).get('a').contains('Comments').click().wait(1000)
    cy.get('app-document-comments').find('.card').its('length').should('eq', 3)
  })

  it('should support comment deletion', () => {
    cy.wait(1000).get('a').contains('Comments').click().wait(1000)
    cy.get('app-document-comments')
      .find('.card')
      .first()
      .find('button')
      .click({ force: true })
      .wait(500)
    cy.get('app-document-comments').find('.card').its('length').should('eq', 2)
  })

  it('should support comment insertion', () => {
    cy.wait(1000).get('a').contains('Comments').click().wait(1000)
    cy.get('app-document-comments')
      .find('form textarea')
      .type('Testing new comment')
      .wait(500)
    cy.get('app-document-comments').find('form button').click().wait(1500)
    cy.get('app-document-comments').find('.card').its('length').should('eq', 4)
  })
})
@@ -1,331 +0,0 @@
import { PaperlessDocument } from 'src/app/data/paperless-document'

describe('documents query params', () => {
  beforeEach(() => {
    // also uses global fixtures from cypress/support/e2e.ts

    cy.fixture('documents/documents.json').then((documentsJson) => {
      // mock api filtering
      cy.intercept('GET', 'http://localhost:8000/api/documents/*', (req) => {
        let response = { ...documentsJson }

        if (req.query.hasOwnProperty('ordering')) {
          const sort_field = req.query['ordering'].toString().replace('-', '')
          const reverse = req.query['ordering'].toString().indexOf('-') !== -1
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).sort((docA, docB) => {
            let result = 0
            switch (sort_field) {
              case 'created':
              case 'added':
                result =
                  new Date(docA[sort_field]) < new Date(docB[sort_field])
                    ? -1
                    : 1
                break
              case 'archive_serial_number':
                result = docA[sort_field] < docB[sort_field] ? -1 : 1
                break
            }
            if (reverse) result = -result
            return result
          })
        }

        if (req.query.hasOwnProperty('tags__id__in')) {
          const tag_ids: Array<number> = req.query['tags__id__in']
            .toString()
            .split(',')
            .map((v) => +v)
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter(
            (d) =>
              d.tags.length > 0 &&
              d.tags.filter((t) => tag_ids.includes(t)).length > 0
          )
          response.count = response.results.length
        } else if (req.query.hasOwnProperty('tags__id__none')) {
          const tag_ids: Array<number> = req.query['tags__id__none']
            .toString()
            .split(',')
            .map((v) => +v)
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.tags.filter((t) => tag_ids.includes(t)).length == 0)
          response.count = response.results.length
        } else if (
          req.query.hasOwnProperty('is_tagged') &&
          req.query['is_tagged'] == '0'
        ) {
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.tags.length == 0)
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('document_type__id')) {
          const doctype_id = +req.query['document_type__id']
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.document_type == doctype_id)
          response.count = response.results.length
        } else if (
          req.query.hasOwnProperty('document_type__isnull') &&
          req.query['document_type__isnull'] == '1'
        ) {
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.document_type == undefined)
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('correspondent__id')) {
          const correspondent_id = +req.query['correspondent__id']
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.correspondent == correspondent_id)
          response.count = response.results.length
        } else if (
          req.query.hasOwnProperty('correspondent__isnull') &&
          req.query['correspondent__isnull'] == '1'
        ) {
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.correspondent == undefined)
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('storage_path__id')) {
          const storage_path_id = +req.query['storage_path__id']
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.storage_path == storage_path_id)
          response.count = response.results.length
        } else if (
          req.query.hasOwnProperty('storage_path__isnull') &&
          req.query['storage_path__isnull'] == '1'
        ) {
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.storage_path == undefined)
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('created__date__gt')) {
          const date = new Date(req.query['created__date__gt'])
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => new Date(d.created) > date)
          response.count = response.results.length
        } else if (req.query.hasOwnProperty('created__date__lt')) {
          const date = new Date(req.query['created__date__lt'])
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => new Date(d.created) < date)
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('added__date__gt')) {
          const date = new Date(req.query['added__date__gt'])
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => new Date(d.added) > date)
          response.count = response.results.length
        } else if (req.query.hasOwnProperty('added__date__lt')) {
          const date = new Date(req.query['added__date__lt'])
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => new Date(d.added) < date)
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('title_content')) {
          const title_content_regexp = new RegExp(
            req.query['title_content'].toString(),
            'i'
          )
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter(
            (d) =>
              title_content_regexp.test(d.title) ||
              title_content_regexp.test(d.content)
          )
          response.count = response.results.length
        }

        if (req.query.hasOwnProperty('archive_serial_number')) {
          const asn = +req.query['archive_serial_number']
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) => d.archive_serial_number == asn)
          response.count = response.results.length
        } else if (req.query.hasOwnProperty('archive_serial_number__isnull')) {
          const isnull = req.query['storage_path__isnull'] == '1'
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter((d) =>
            isnull
              ? d.archive_serial_number == undefined
              : d.archive_serial_number != undefined
          )
          response.count = response.results.length
        } else if (req.query.hasOwnProperty('archive_serial_number__gt')) {
          const asn = +req.query['archive_serial_number__gt']
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter(
            (d) => d.archive_serial_number > 0 && d.archive_serial_number > asn
          )
          response.count = response.results.length
        } else if (req.query.hasOwnProperty('archive_serial_number__lt')) {
          const asn = +req.query['archive_serial_number__lt']
          response.results = (
            documentsJson.results as Array<PaperlessDocument>
          ).filter(
            (d) => d.archive_serial_number > 0 && d.archive_serial_number < asn
          )
          response.count = response.results.length
        }

        req.reply(response)
      })
    })
  })

  it('should show a list of documents sorted by created', () => {
    cy.visit('/documents?sort=created')
    cy.get('app-document-card-small').first().contains('No latin title')
  })

  it('should show a list of documents reverse sorted by created', () => {
    cy.visit('/documents?sort=created&reverse=true')
    cy.get('app-document-card-small').first().contains('sit amet')
  })

  it('should show a list of documents sorted by added', () => {
    cy.visit('/documents?sort=added')
    cy.get('app-document-card-small').first().contains('No latin title')
  })

  it('should show a list of documents reverse sorted by added', () => {
    cy.visit('/documents?sort=added&reverse=true')
    cy.get('app-document-card-small').first().contains('sit amet')
  })

  it('should show a list of documents filtered by any tags', () => {
    cy.visit('/documents?sort=created&reverse=true&tags__id__in=2,4,5')
    cy.contains('3 documents')
  })

  it('should show a list of documents filtered by excluded tags', () => {
    cy.visit('/documents?sort=created&reverse=true&tags__id__none=2,4')
    cy.contains('One document')
  })

  it('should show a list of documents filtered by no tags', () => {
    cy.visit('/documents?sort=created&reverse=true&is_tagged=0')
    cy.contains('One document')
  })

  it('should show a list of documents filtered by document type', () => {
    cy.visit('/documents?sort=created&reverse=true&document_type__id=1')
    cy.contains('3 documents')
  })

  it('should show a list of documents filtered by no document type', () => {
    cy.visit('/documents?sort=created&reverse=true&document_type__isnull=1')
    cy.contains('One document')
  })

  it('should show a list of documents filtered by correspondent', () => {
    cy.visit('/documents?sort=created&reverse=true&correspondent__id=9')
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by no correspondent', () => {
    cy.visit('/documents?sort=created&reverse=true&correspondent__isnull=1')
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by storage path', () => {
    cy.visit('/documents?sort=created&reverse=true&storage_path__id=2')
    cy.contains('One document')
  })

  it('should show a list of documents filtered by no storage path', () => {
    cy.visit('/documents?sort=created&reverse=true&storage_path__isnull=1')
    cy.contains('3 documents')
  })

  it('should show a list of documents filtered by title or content', () => {
    cy.visit('/documents?sort=created&reverse=true&title_content=lorem')
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by asn', () => {
    cy.visit('/documents?sort=created&reverse=true&archive_serial_number=12345')
    cy.contains('One document')
  })

  it('should show a list of documents filtered by empty asn', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&archive_serial_number__isnull=1'
    )
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by non-empty asn', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&archive_serial_number__isnull=0'
    )
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by asn greater than', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&archive_serial_number__gt=12346'
    )
    cy.contains('One document')
  })

  it('should show a list of documents filtered by asn less than', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&archive_serial_number__lt=12346'
    )
    cy.contains('One document')
  })

  it('should show a list of documents filtered by created date greater than', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&created__date__gt=2022-03-23'
    )
    cy.contains('3 documents')
  })

  it('should show a list of documents filtered by created date less than', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&created__date__lt=2022-03-23'
    )
    cy.contains('One document')
  })

  it('should show a list of documents filtered by added date greater than', () => {
    cy.visit('/documents?sort=created&reverse=true&added__date__gt=2022-03-24')
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by added date less than', () => {
    cy.visit('/documents?sort=created&reverse=true&added__date__lt=2022-03-24')
    cy.contains('2 documents')
  })

  it('should show a list of documents filtered by multiple filters', () => {
    cy.visit(
      '/documents?sort=created&reverse=true&document_type__id=1&correspondent__id=9&tags__id__in=4,5'
    )
    cy.contains('2 documents')
  })
})
@@ -1,60 +0,0 @@
describe('tasks', () => {
  beforeEach(() => {
    this.dismissedTasks = new Set<number>()

    cy.fixture('tasks/tasks.json').then((tasksViewsJson) => {
      // acknowledge tasks POST
      cy.intercept(
        'POST',
        'http://localhost:8000/api/acknowledge_tasks/',
        (req) => {
          req.body['tasks'].forEach((t) => this.dismissedTasks.add(t)) // store this for later
          req.reply({ result: 'OK' })
        }
      )

      cy.intercept('GET', 'http://localhost:8000/api/tasks/', (req) => {
        let response = [...tasksViewsJson]
        if (this.dismissedTasks.size) {
          response = response.filter((t) => {
            return !this.dismissedTasks.has(t.id)
          })
        }

        req.reply(response)
      }).as('tasks')
    })

    cy.visit('/tasks')
    cy.wait('@tasks')
  })

  it('should show a list of dismissable tasks in tabs', () => {
    cy.get('tbody').find('tr:visible').its('length').should('eq', 10) // double because collapsible result tr
    cy.wait(500) // stabilizes the test, for some reason...
    cy.get('tbody')
      .find('button:visible')
      .contains('Dismiss')
      .first()
      .click()
      .wait('@tasks')
      .wait(2000)
      .then(() => {
        cy.get('tbody').find('tr:visible').its('length').should('eq', 8) // double because collapsible result tr
      })
  })

  it('should allow toggling all tasks in list and warn on dismiss', () => {
    cy.get('thead').find('input[type="checkbox"]').first().click()
    cy.get('body').find('button').contains('Dismiss selected').first().click()
    cy.contains('Confirm')
    cy.get('.modal')
      .contains('button', 'Dismiss')
      .click()
      .wait('@tasks')
      .wait(2000)
      .then(() => {
        cy.get('tbody').find('tr:visible').should('not.exist')
      })
  })
})
@@ -1,46 +0,0 @@
[
  {
    "id": 10,
    "comment": "Testing new comment",
    "created": "2022-08-08T04:24:55.176008Z",
    "user": {
      "id": 1,
      "username": "user2",
      "firstname": "",
      "lastname": ""
    }
  },
  {
    "id": 9,
    "comment": "Testing one more time",
    "created": "2022-02-18T04:24:55.176008Z",
    "user": {
      "id": 2,
      "username": "user1",
      "firstname": "",
      "lastname": ""
    }
  },
  {
    "id": 8,
    "comment": "Another comment",
    "created": "2021-11-08T04:24:47.925042Z",
    "user": {
      "id": 2,
      "username": "user33",
      "firstname": "",
      "lastname": ""
    }
  },
  {
    "id": 7,
    "comment": "Cupcake ipsum dolor sit amet cheesecake candy cookie tiramisu. Donut chocolate chupa chups macaroon brownie halvah pie cheesecake gummies. Sweet chocolate bar candy donut gummi bears bear claw liquorice bonbon shortbread.\n\nDonut chocolate bar candy wafer wafer tiramisu. Gummies chocolate cake muffin toffee carrot cake macaroon. Toffee toffee jelly beans danish lollipop cake.",
    "created": "2021-02-08T02:37:49.724132Z",
    "user": {
      "id": 3,
      "username": "admin",
      "firstname": "",
      "lastname": ""
    }
  }
]
@@ -1 +0,0 @@
{"version":"v1.7.1","update_available":false,"feature_is_set":true}
@@ -1,17 +0,0 @@
{
  "count": 1,
  "next": null,
  "previous": null,
  "results": [
    {
      "id": 2,
      "slug": "year-title",
      "name": "Year - Title",
      "path": "{created_year}/{title}",
      "match": "",
      "matching_algorithm": 6,
      "is_insensitive": true,
      "document_count": 1
    }
  ]
}