Merge remote-tracking branch 'paperless-ngx/dev' into dev

Trenton Holmes 2023-08-03 10:00:14 -07:00
commit cbcd9ed67d
93 changed files with 4444 additions and 5187 deletions

View File

@ -17,6 +17,21 @@ updates:
# Add reviewers
reviewers:
- "paperless-ngx/frontend"
groups:
frontend-angular-dependencies:
patterns:
- "@angular*"
- "@ng-*"
- "ngx-*"
- "ng2-pdf-viewer"
frontend-jest-dependencies:
patterns:
- "@types/jest"
- "jest"
frontend-eslint-dependencies:
patterns:
- "@typescript-eslint*"
- "eslint"
# Enable version updates for Python
- package-ecosystem: "pip"

View File

@ -16,7 +16,7 @@ on:
env:
# This is the version of pipenv all the steps will use
# If changing this, change Dockerfile
DEFAULT_PIP_ENV_VERSION: "2023.6.12"
DEFAULT_PIP_ENV_VERSION: "2023.7.23"
# This is the default version of Python to use in most steps
# If changing this, change Dockerfile
DEFAULT_PYTHON_VERSION: "3.9"

View File

@ -29,7 +29,7 @@ jobs:
-
name: Clean temporary images
if: "${{ env.TOKEN != '' }}"
uses: stumpylog/image-cleaner-action/ephemeral@v0.1.0
uses: stumpylog/image-cleaner-action/ephemeral@v0.2.0
with:
token: "${{ env.TOKEN }}"
owner: "${{ github.repository_owner }}"
@ -68,7 +68,7 @@ jobs:
-
name: Clean untagged images
if: "${{ env.TOKEN != '' }}"
uses: stumpylog/image-cleaner-action/untagged@v0.1.0
uses: stumpylog/image-cleaner-action/untagged@v0.2.0
with:
token: "${{ env.TOKEN }}"
owner: "${{ github.repository_owner }}"

View File

@ -27,7 +27,7 @@ repos:
- id: check-case-conflict
- id: detect-private-key
- repo: https://github.com/pre-commit/mirrors-prettier
rev: 'v2.7.1'
rev: 'v3.0.0'
hooks:
- id: prettier
types_or:
@ -37,11 +37,11 @@ repos:
exclude: "(^Pipfile\\.lock$)"
# Python hooks
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: 'v0.0.272'
rev: 'v0.0.280'
hooks:
- id: ruff
- repo: https://github.com/psf/black
rev: 23.3.0
rev: 23.7.0
hooks:
- id: black
# Dockerfile hooks

View File

@ -2,3 +2,5 @@
semi: false
# https://prettier.io/docs/en/options.html#quotes
singleQuote: true
# https://prettier.io/docs/en/options.html#trailing-commas
trailingComma: "es5"

View File

@ -2,7 +2,7 @@
# https://beta.ruff.rs/docs/rules/
extend-select = ["I", "W", "UP", "COM", "DJ", "EXE", "ISC", "ICN", "G201", "INP", "PIE", "RSE", "SIM", "TID", "PLC", "PLE", "RUF"]
# TODO PTH
ignore = ["DJ001", "SIM105"]
ignore = ["DJ001", "SIM105", "RUF012"]
fix = true
line-length = 88
respect-gitignore = true

View File

@ -45,7 +45,7 @@ Examples of `non-trivial` PRs might include:
- Additional features
- Large changes to many distinct files
- Breaking or depreciation of existing features
- Breaking or deprecation of existing features
Our community review process for `non-trivial` PRs is the following:

View File

@ -29,7 +29,7 @@ COPY Pipfile* ./
RUN set -eux \
&& echo "Installing pipenv" \
&& python3 -m pip install --no-cache-dir --upgrade pipenv==2023.6.12 \
&& python3 -m pip install --no-cache-dir --upgrade pipenv==2023.7.23 \
&& echo "Generating requirement.txt" \
&& pipenv requirements > requirements.txt
@ -214,7 +214,8 @@ COPY --from=pipenv-base /usr/src/pipenv/requirements.txt ./
ARG BUILD_PACKAGES="\
build-essential \
git \
default-libmysqlclient-dev"
default-libmysqlclient-dev \
pkg-config"
# hadolint ignore=DL3042
RUN --mount=type=cache,target=/root/.cache/pip/,id=pip-cache \

View File

@ -26,8 +26,6 @@ gunicorn = "*"
imap-tools = "*"
langdetect = "*"
pathvalidate = "*"
pillow = "*"
pikepdf = "*"
python-gnupg = "*"
python-dotenv = "*"
python-dateutil = "*"
@ -36,7 +34,7 @@ python-ipware = "*"
psycopg2 = "*"
rapidfuzz = "*"
redis = {extras = ["hiredis"], version = "*"}
scikit-learn = "~=1.2"
scikit-learn = "~=1.3"
whitenoise = "~=6.3"
watchdog = "~=2.2"
whoosh="~=2.7"
@ -64,9 +62,10 @@ zxing-cpp = {version = "*", platform_machine = "== 'x86_64'"}
scipy = "==1.8.1"
# v4 brings in extra dependencies for features not used here
reportlab = "==3.6.12"
# Pin this until piwheels is building a newer version (see https://www.piwheels.org/project/cryptography/)
# Pin these until piwheels is building a newer version (see https://www.piwheels.org/project/{package}/)
cryptography = "==40.0.1"
httpx = "*"
pikepdf = "==7.2.0"
pillow = "==9.5.0"
[dev-packages]
# Linting

Pipfile.lock generated (1901 changed lines)

File diff suppressed because it is too large

View File

@ -68,23 +68,23 @@ $ docker-compose down
After that, [make a backup](#backup).
1. If you pull the image from the docker hub, all you need to do is:
1. If you pull the image from the docker hub, all you need to do is:
```shell-session
$ docker-compose pull
$ docker-compose up
```
```shell-session
$ docker-compose pull
$ docker-compose up
```
The docker-compose files refer to the `latest` version, which is
always the latest stable release.
The docker-compose files refer to the `latest` version, which is
always the latest stable release.
2. If you built the image yourself, do the following:
1. If you built the image yourself, do the following:
```shell-session
$ git pull
$ docker-compose build
$ docker-compose up
```
```shell-session
$ git pull
$ docker-compose build
$ docker-compose up
```
Running `docker-compose up` will also apply any new database migrations.
If you see everything working, press CTRL+C once to gracefully stop
@ -470,7 +470,7 @@ The issues detected by the sanity checker are as follows:
- Inaccessible thumbnails due to improper permissions.
- Documents without any content (warning).
- Orphaned files in the media directory (warning). These are files
that are not referenced by any document im paperless.
that are not referenced by any document in paperless.
```
document_sanity_checker

View File

@ -1,6 +1,6 @@
# Advanced Topics
Paperless offers a couple features that automate certain tasks and make
Paperless offers a couple of features that automate certain tasks and make
your life easier.
## Matching tags, correspondents, document types, and storage paths {#matching}
@ -35,9 +35,9 @@ The following algorithms are available:
(i.e. preserve ordering) in the PDF.
- **Regular expression:** Parses the match as a regular expression and
tries to find a match within the document.
- **Fuzzy match:** I don't know. Look at the source.
- **Fuzzy match:** I don't know. Look at [the source](https://github.com/paperless-ngx/paperless-ngx/blob/main/src/documents/matching.py).
- **Auto:** Tries to automatically match new documents. This does not
require you to set a match. See the notes below.
require you to set a match. See the [notes below](#automatic-matching).
When using the _any_ or _all_ matching algorithms, you can search for
terms that consist of multiple words by enclosing them in double quotes.
@ -92,7 +92,7 @@ when using this feature:
decide when not to assign a certain tag, correspondent, document
type, or storage path. This will usually be the case as you start
filling up paperless with documents. Example: If all your documents
are either from "Webshop" and "Bank", paperless will assign one
are either from "Webshop" or "Bank", paperless will assign one
of these correspondents to ANY new document, if both are set to
automatic matching.
@ -101,7 +101,7 @@ when using this feature:
Sometimes you may want to do something arbitrary whenever a document is
consumed. Rather than try to predict what you may want to do, Paperless
lets you execute scripts of your own choosing just before or after a
document is consumed using a couple simple hooks.
document is consumed using a couple of simple hooks.
Just write a script, put it somewhere that Paperless can read & execute,
and then put the path to that script in `paperless.conf` or
@ -197,7 +197,7 @@ The script can be in any language, A simple shell script example:
!!! warning
The post consumption script should not modify the document files
directly
directly.
The script's stdout and stderr will be logged line by line to the
webserver log, along with the exit code of the script.
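As a minimal, hypothetical sketch (the path and contents are placeholders, not the project's canonical example), a post-consumption hook can simply write to stdout, which is logged as described above:

```bash
#!/usr/bin/env bash
# Hypothetical post-consumption hook; everything printed here is logged
# line by line by the webserver, together with the script's exit code.
echo "post-consume hook ran at $(date)"
exit 0
```

Point `PAPERLESS_POST_CONSUME_SCRIPT` (or the matching `paperless.conf` entry) at the script's path and make sure it is executable.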
@ -311,6 +311,7 @@ Paperless provides the following placeholders within filenames:
- `{added_day}`: Day added only (number 01-31).
- `{owner_username}`: Username of document owner, if any, or "none"
- `{original_name}`: Document original filename, minus the extension, if any, or "none"
- `{doc_pk}`: The paperless identifier (primary key) for the document.
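As an illustrative sketch only (the format string is user-defined, and `PAPERLESS_FILENAME_FORMAT` is assumed here to be the setting that consumes these placeholders), the new placeholder can be combined with the others listed above:

```bash
# Hypothetical filename format using placeholders from the list above.
PAPERLESS_FILENAME_FORMAT={owner_username}/{original_name}_{doc_pk}
```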
Paperless will try to conserve the information from your database as
much as possible. However, some characters that you can use in document
@ -528,7 +529,7 @@ For how to enable barcode usage, see [the configuration](/configuration#barcodes
The two settings may be enabled independently, but do have interactions as explained
below.
### Document Splitting
### Document Splitting {#document-splitting}
When enabled, Paperless will look for a barcode with the configured value and create a new document
starting from the next page. The page with the barcode on it will _not_ be retained. It
@ -543,3 +544,69 @@ If document splitting via barcode is also enabled, documents will be split when
barcode is located. However, differing from the splitting, the page with the
barcode _will_ be retained. This allows application of a barcode to any page, including
one which holds data to keep in the document.
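A hedged configuration sketch tying the two behaviours together (the two enable flags are assumptions based on the configuration reference linked above; `PAPERLESS_CONSUMER_BARCODE_STRING` and its default appear in the sample environment file):

```bash
# Assumed variable names; verify against /configuration#barcodes for your version.
PAPERLESS_CONSUMER_ENABLE_BARCODES=true      # split documents on the separator barcode
PAPERLESS_CONSUMER_ENABLE_ASN_BARCODE=true   # read ASN barcodes; the page is retained
PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT     # barcode value that triggers a split (default)
```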
## Automatic collation of double-sided documents {#collate}
!!! note
If your scanner supports double-sided scanning natively, you do not need this feature.
This feature is turned off by default; see [configuration](/configuration#collate) for how to turn it on.
### Summary
If you have a scanner with an automatic document feeder (ADF) that only scans a single side,
this feature makes scanning double-sided documents much more convenient by automatically
collating two separate scans into one document, reordering the pages as necessary.
### Usage example
Suppose you have a double-sided document with 6 pages (3 sheets of paper). First,
put the stack into your ADF as normal, ensuring that page 1 is scanned first. Your ADF
will now scan pages 1, 3, and 5. Then you (or your scanner, if it supports it) upload
the scan into the correct sub-directory of the consume folder (`double-sided` by default;
keep in mind that Paperless will _not_ automatically create the directory for you.)
Paperless will then process the scan and move it into an internal staging area.
The next step is to turn your stack upside down (without reordering the sheets of paper),
and scan it once again; your ADF will now scan pages 6, 4, and 2, in that order. Once this
scan is copied into the sub-directory, Paperless will collate the previous scan with the
new one, reversing the order of the pages on the second, "even numbered" scan. The
resulting document will have the pages 1-6 in the correct order, and this new file will
then be processed as normal.
!!! tip
When scanning the even numbered pages, you can omit the last empty pages, if there are
any. For example, if page 6 is empty, you only need to scan pages 2 and 4. _Do not_ omit
empty pages in the middle of the document.
### Things that could go wrong
Paperless will notice when the first, "odd numbered" scan has fewer pages than the second
scan (this can happen when e.g. the ADF skipped a few pages in the first pass). In that
case, Paperless will remove the staging copy as well as the scan, and give you an error
message asking you to restart the process from scratch, by scanning the odd pages again,
followed by the even pages.
Another thing that might happen is that you start a double-sided scan, but then forget
to upload the second file. To avoid collating the wrong documents if you then come back
a day later to scan a new double-sided document, Paperless will only keep an "odd numbered
pages" file for up to 30 minutes. If more time passes, it will consider the next incoming
scan a completely new "odd numbered pages" one. The old staging file will get discarded.
### Interaction with "subdirs as tags"
The collation feature can be used together with the "subdirs as tags" feature (but this is not
a requirement). Just create a correctly named double-sided subdir in the hierarchy and upload
your scans there. For example, both `double-sided/foo/bar` as well as `foo/bar/double-sided` will
cause the collated document to be treated as if it were uploaded into `foo/bar` and receive both
`foo` and `bar` tags, but not `double-sided`.
### Interaction with document splitting
You can use the [document splitting](#document-splitting) feature, but if you use a normal
single-sided split marker page, the split document(s) will have an empty page at the front (or
whatever else was on the backside of the split marker page.) You can work around that by having
a split marker page that has the split barcode on _both_ sides. This way, the extra page will
get automatically removed.

View File

@ -524,7 +524,7 @@ parsing documents.
`PAPERLESS_OCR_MODE=<mode>`
: Tell paperless when and how to perform ocr on your documents. Four
: Tell paperless when and how to perform ocr on your documents. Three
modes are available:
- `skip`: Paperless skips all pages and will perform ocr only on
@ -1116,6 +1116,43 @@ combination with PAPERLESS_CONSUMER_BARCODE_UPSCALE bigger than 1.0.
Defaults to "300"
## Collate Double-Sided Documents {#collate}
`PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED=<bool>`
: Enables automatic collation of two single-sided scans into a double-sided
document.
This is useful if you have an automatic document feeder that only supports
single-sided scans, but you need to scan a double-sided document. If your
ADF supports double-sided scans natively, you do not need this feature.
`PAPERLESS_CONSUMER_RECURSIVE` must be enabled for this to work.
For more information, read the [corresponding section in the advanced
documentation](/advanced_usage#collate).
Defaults to false.
`PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME=<str>`
: The name of the subdirectory that the collate feature expects documents to
arrive in.
This only has an effect if `PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED`
has been enabled. Note that Paperless will not automatically create the
directory.
Defaults to "double-sided".
`PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT=<bool>`
: Whether TIFF image files should be supported when collating documents.
This will automatically convert any TIFF image(s) to pdfs for later
processing. This only has an effect if
`PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED` has been enabled.
Defaults to false.
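Putting the collation settings together, a minimal sketch of the relevant environment entries (values other than the enable flags are the documented defaults):

```bash
PAPERLESS_CONSUMER_RECURSIVE=true                               # required for collation
PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED=true
PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME=double-sided
PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT=false      # enable if you scan to TIFF
```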
## Binaries
There are a few external software packages that Paperless expects to
@ -1123,7 +1160,7 @@ find on your system when it starts up. Unless you've done something
creative with their installation, you probably won't need to edit any
of these. However, if you've installed these programs somewhere where
simply typing the name of the program doesn't automatically execute it
(ie. the program isn't in your \$PATH), then you'll need to specify
(ie. the program isn't in your $PATH), then you'll need to specify
the literal path for that program.
`PAPERLESS_CONVERT_BINARY=<path>`
@ -1207,7 +1244,7 @@ actual group ID on the host system, which you can get by executing
with English, German, Italian, Spanish and French. If your language
is not in this list, install additional languages with this
configuration option. You will need to [find the right LangCodes](https://tesseract-ocr.github.io/tessdoc/Data-Files-in-different-versions.html)
but note that (tesseract-ocr-\* package names)[https://packages.debian.org/bullseye/graphics/]
but note that [tesseract-ocr-\* package names](https://packages.debian.org/bullseye/graphics/)
do not always correspond with the language codes e.g. "chi_tra" should be
specified as "chi-tra".
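As a hedged example (the variable name is assumed to be the Docker-only `PAPERLESS_OCR_LANGUAGES` option this section describes; double-check it against your version), installing Italian plus traditional Chinese data would use the package-style codes:

```bash
# Note the dash: the package code is "chi-tra" even though Tesseract's code is "chi_tra".
PAPERLESS_OCR_LANGUAGES=ita chi-tra
```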

View File

@ -58,7 +58,7 @@ first-time setup.
!!! note
Every command is executed directly from the root folder of the project unless specified otherwise.
Every command is executed directly from the root folder of the project unless specified otherwise.
1. Install prerequisites + pipenv as mentioned in
[Bare metal route](/setup#bare_metal).
@ -177,68 +177,69 @@ The front end is built using AngularJS. In order to get started, you need Node.j
The following commands are all performed in the `src-ui`-directory. You will need a running back end (including an active session) to connect to the back end API. To spin it up refer to the commands under the section [above](#back-end-development).
1. Install the Angular CLI. You might need sudo privileges
to perform this command:
1. Install the Angular CLI. You might need sudo privileges to perform this command:
```bash
$ npm install -g @angular/cli
```
```bash
$ npm install -g @angular/cli
```
2. Make sure that it's on your path.
2. Make sure that it's on your path.
3. Install all necessary modules:
3. Install all necessary modules:
```bash
$ npm install
```
```bash
$ npm install
```
4. You can launch a development server by running:
4. You can launch a development server by running:
```bash
$ ng serve
```
```bash
$ ng serve
```
This will automatically update whenever you save. However, in-place
compilation might fail on syntax errors, in which case you need to
restart it.
This will automatically update whenever you save. However, in-place
compilation might fail on syntax errors, in which case you need to
restart it.
By default, the development server is available on `http://localhost:4200/` and is configured to access the API at
`http://localhost:8000/api/`, which is the default of the backend. If you enabled `DEBUG` on the back end, several security overrides for allowed hosts, CORS and X-Frame-Options are in place so that the front end behaves exactly as in production.
By default, the development server is available on `http://localhost:4200/` and is configured to access the API at
`http://localhost:8000/api/`, which is the default of the backend. If you enabled `DEBUG` on the back end, several security overrides for allowed hosts, CORS and X-Frame-Options are in place so that the front end behaves exactly as in production.
### Testing and code style
- The front end code (.ts, .html, .scss) use `prettier` for code
formatting via the Git `pre-commit` hooks which run automatically on
commit. See [above](#code-formatting-with-pre-commit-hooks) for installation instructions. You can also run this via the CLI with a
command such as
The front end code (.ts, .html, .scss) use `prettier` for code
formatting via the Git `pre-commit` hooks which run automatically on
commit. See [above](#code-formatting-with-pre-commit-hooks) for installation instructions. You can also run this via the CLI with a
command such as
```bash
$ git ls-files -- '*.ts' | xargs pre-commit run prettier --files
```
```bash
$ git ls-files -- '*.ts' | xargs pre-commit run prettier --files
```
- Front end testing uses Jest and Playwright. Unit tests and e2e tests,
respectively, can be run non-interactively with:
Front end testing uses Jest and Playwright. Unit tests and e2e tests,
respectively, can be run non-interactively with:
```bash
$ ng test
$ npx playwright test
```
```bash
$ ng test
$ npx playwright test
```
- Playwright also includes a UI which can be run with:
Playwright also includes a UI which can be run with:
```bash
$ npx playwright test --ui
```
```bash
$ npx playwright test --ui
```
- In order to build the front end and serve it as part of Django, execute:
### Building the frontend
```bash
$ ng build --configuration production
```
In order to build the front end and serve it as part of Django, execute:
This will build the front end and put it in a location from which the
Django server will serve it as static content. This way, you can verify
that authentication is working.
```bash
$ ng build --configuration production
```
This will build the front end and put it in a location from which the
Django server will serve it as static content. This way, you can verify
that authentication is working.
## Localization

View File

@ -3,10 +3,11 @@
## _What's the general plan for Paperless-ngx?_
**A:** While Paperless-ngx is already considered largely
"feature-complete" it is a community-driven project and development
will be guided in this way. New features can be submitted via GitHub
discussions and "up-voted" by the community but this is not a
guarantee the feature will be implemented. This project will always be
"feature-complete", it is a community-driven project and development
will be guided in this way. New features can be submitted via
[GitHub discussions](https://github.com/paperless-ngx/paperless-ngx/discussions)
and "up-voted" by the community, but this is not a
guarantee that the feature will be implemented. This project will always be
open to collaboration in the form of PRs, ideas etc.
## _I'm using docker. Where are my documents?_
@ -58,7 +59,7 @@ elsewhere. Here are a couple notes about that.
WebP images are processed with OCR and converted into PDF documents.
- Plain text documents are supported as well and are added verbatim to
paperless.
- With the optional Tika integration enabled (see [Tika configuration](/configuration#tika),
- With the optional Tika integration enabled (see [Tika configuration](https://docs.paperless-ngx.com/configuration#tika)),
Paperless also supports various Office documents (.docx, .doc, odt,
.ppt, .pptx, .odp, .xls, .xlsx, .ods).
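A hedged sketch of what enabling the integration might look like (variable names and service endpoints are assumptions based on the linked Tika configuration section; adjust to your deployment):

```bash
PAPERLESS_TIKA_ENABLED=1
PAPERLESS_TIKA_ENDPOINT=http://tika:9998                  # assumed Tika service URL
PAPERLESS_TIKA_GOTENBERG_ENDPOINT=http://gotenberg:3000   # assumed Gotenberg service URL
```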
@ -82,7 +83,7 @@ has to do much less work to serve the data.
## _How do I install paperless-ngx on Raspberry Pi?_
**A:** Docker images are available for armv7 and arm64 hardware, so just
follow the docker-compose instructions. Apart from more required disk
follow the [docker-compose instructions](https://docs.paperless-ngx.com/setup/#installation). Apart from more required disk
space compared to a bare metal installation, docker comes with close to
zero overhead, even on Raspberry Pi.

View File

@ -72,7 +72,7 @@ fi
if ! docker stats --no-stream &> /dev/null ; then
echo ""
echo "WARN: It look like the current user does not have Docker permissions."
echo "WARN: Use 'sudo usermod -aG docker $USER' to assign Docker permissions to the user."
echo "WARN: Use 'sudo usermod -aG docker $USER' to assign Docker permissions to the user (may require restarting shell)."
echo ""
sleep 3
fi
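For reference, a hedged sketch of applying the suggested fix (how the new group membership takes effect depends on your shell and session):

```bash
sudo usermod -aG docker "$USER"   # add the current user to the docker group
newgrp docker                     # or log out and back in to pick up the new group
docker stats --no-stream          # should now succeed without sudo
```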

View File

@ -68,6 +68,9 @@
#PAPERLESS_CONSUMER_BARCODE_STRING=PATCHT
#PAPERLESS_CONSUMER_BARCODE_UPSCALE=0.0
#PAPERLESS_CONSUMER_BARCODE_DPI=300
#PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED=false
#PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME=double-sided
#PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT=false
#PAPERLESS_PRE_CONSUME_SCRIPT=/path/to/an/arbitrary/script.sh
#PAPERLESS_POST_CONSUME_SCRIPT=/path/to/an/arbitrary/script.sh
#PAPERLESS_FILENAME_DATE_ORDER=YMD

View File

@ -94,51 +94,6 @@ test('should show a list of notes', async ({ page }) => {
).toHaveCount(4)
})
test('should support note deletion', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
await page.goto('/documents/175/notes')
await expect(page.locator('app-document-notes')).toBeVisible()
const deletePromise = page.waitForRequest(
(request) =>
request.method() === 'DELETE' &&
request.url().includes('/api/documents/175/notes/')
)
await page
.getByRole('button', { name: /delete note/i, includeHidden: true })
.first()
.click()
await deletePromise
})
test('should support note insertion', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
await page.goto('/documents/175/notes')
await expect(page.locator('app-document-notes')).toBeVisible()
await expect(
await page.getByRole('button', {
name: /delete note/i,
includeHidden: true,
})
).toHaveCount(4)
await page.getByPlaceholder('Enter note').fill('This is a new note')
const addPromise = page.waitForRequest((request) => {
if (!request.url().includes('/notes/')) {
// ignore other requests
return true
} else {
const data = request.postDataJSON()
const isValid = data['note'] === 'This is a new note'
return (
isValid &&
request.method() === 'POST' &&
request.url().includes('/notes/')
)
}
})
await page.getByRole('button', { name: 'Add note' }).click()
await addPromise
})
test('should support quick filters', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR2, { notFound: 'fallback' })
await page.goto('/documents/175/details')

View File

@ -1,58 +0,0 @@
import { test, expect } from '@playwright/test'
const REQUESTS_HAR1 = 'e2e/manage/requests/api-manage1.har'
const REQUESTS_HAR2 = 'e2e/manage/requests/api-manage2.har'
test('should show a list of tags with bottom pagination as well', async ({
page,
}) => {
await page.routeFromHAR(REQUESTS_HAR1, { notFound: 'fallback' })
await page.goto('/tags')
await expect(page.getByRole('main')).toHaveText(/26 total tags/i)
await expect(await page.locator('ngb-pagination')).toHaveCount(2)
})
test('should show a list of correspondents without bottom pagination', async ({
page,
}) => {
await page.routeFromHAR(REQUESTS_HAR2, { notFound: 'fallback' })
await page.goto('/correspondents')
await expect(page.getByRole('main')).toHaveText(/4 total correspondents/i)
await expect(await page.locator('ngb-pagination')).toHaveCount(1)
})
test('should support quick filter Documents button', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR1, { notFound: 'fallback' })
await page.goto('/tags')
await page
.getByRole('row', { name: 'Inbox' })
.getByRole('button', { name: 'Documents' })
.click()
await expect(page).toHaveURL(/tags__id__all=9/)
})
test('should support item editing', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR1, { notFound: 'fallback' })
await page.goto('/tags')
await page
.getByRole('row', { name: 'Inbox' })
.getByRole('button', { name: 'Edit' })
.click()
await expect(page.getByRole('dialog')).toBeVisible()
await expect(page.getByLabel('Name')).toHaveValue('Inbox')
await page.getByTitle('Color').getByRole('button').click()
const color = await page.getByLabel('Color').inputValue()
const updatePromise = page.waitForRequest((request) => {
const data = request.postDataJSON()
const isValid = data['color'] === color
return (
isValid &&
request.method() === 'PUT' &&
request.url().includes('/api/tags/9/')
)
})
await page.getByRole('button', { name: 'Save' }).click()
await updatePromise
})

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@ -1,8 +1,6 @@
import { test, expect } from '@playwright/test'
const REQUESTS_HAR = 'e2e/settings/requests/api-settings.har'
const REQUESTS_HAR2 = 'e2e/settings/requests/api-settings2.har'
const REQUESTS_HAR3 = 'e2e/settings/requests/api-settings3.har'
test('should post settings on save', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
@ -101,65 +99,3 @@ test('should support tab direct navigation', async ({ page }) => {
page.getByRole('tab', { name: 'Users & Groups' })
).toHaveAttribute('aria-selected', 'true')
})
test('should show a list of mail accounts & support creation', async ({
page,
}) => {
await page.routeFromHAR(REQUESTS_HAR2, { notFound: 'fallback' })
await page.goto('/settings/mail')
await expect(
page.getByRole('listitem').filter({ hasText: 'imap.gmail.com' })
).toHaveCount(1)
await expect(
page.getByRole('listitem').filter({ hasText: 'imap.domain.com' })
).toHaveCount(1)
await page.getByRole('button', { name: /Add Account/ }).click()
await expect(page.getByRole('dialog')).toHaveCount(1)
await page.getByLabel('Name', { exact: true }).fill('Test Account')
await page.getByLabel('IMAP Server', { exact: true }).fill('imap.server.com')
await page.getByLabel('IMAP Port', { exact: true }).fill('993')
await page.getByLabel('Username', { exact: true }).fill('username')
await page.getByLabel('Password', { exact: true }).fill('password')
const createPromise = page.waitForRequest((request) => {
const data = request.postDataJSON()
const isValid = data['imap_server'] === 'imap.server.com'
return (
isValid &&
request.method() === 'POST' &&
request.url().includes('/api/mail_accounts/')
)
})
await page.getByRole('button', { name: 'Save' }).click()
await createPromise
})
test('should show a list of mail rules & support creation', async ({
page,
}) => {
await page.routeFromHAR(REQUESTS_HAR3, { notFound: 'fallback' })
await page.goto('/settings/mail')
await expect(
page.getByRole('listitem').filter({ hasText: 'domain' })
).toHaveCount(2)
await expect(
page.getByRole('listitem').filter({ hasText: 'gmail' })
).toHaveCount(2)
await page.getByRole('button', { name: /Add Rule/ }).click()
await expect(page.getByRole('dialog')).toHaveCount(1)
await page.getByLabel('Name', { exact: true }).fill('Test Rule')
await page.getByTitle('Account').locator('span').first().click()
await page.getByRole('option', { name: 'gmail' }).click()
await page.getByLabel('Maximum age (days)').fill('0')
const createPromise = page.waitForRequest((request) => {
const data = request.postDataJSON()
const isValid = data['name'] === 'Test Rule'
return (
isValid &&
request.method() === 'POST' &&
request.url().includes('/api/mail_rules/')
)
})
await page.getByRole('button', { name: 'Save' }).scrollIntoViewIfNeeded()
await page.getByRole('button', { name: 'Save' }).click()
await createPromise
})

File diff suppressed because one or more lines are too long

View File

@ -1,71 +0,0 @@
import { test, expect } from '@playwright/test'
const REQUESTS_HAR = 'e2e/tasks/requests/api-tasks.har'
test('should show a list of dismissable tasks in tabs', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
await page.goto('/tasks')
await expect(page.getByRole('tab', { name: /Failed/ })).toHaveText(/1/)
await expect(
page.getByRole('cell').filter({ hasText: 'Dismiss' })
).toHaveCount(1)
await expect(page.getByRole('tab', { name: /Complete/ })).toHaveText(/8/)
await page.getByRole('tab', { name: /Complete/ }).click()
await expect(
page.getByRole('cell').filter({ hasText: 'Dismiss' })
).toHaveCount(8)
await page.getByRole('tab', { name: /Started/ }).click()
await expect(
page.getByRole('cell').filter({ hasText: 'Dismiss' })
).toHaveCount(0)
await page.getByRole('tab', { name: /Queued/ }).click()
await expect(
page.getByRole('cell').filter({ hasText: 'Dismiss' })
).toHaveCount(0)
})
test('should support dismissing tasks', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
await page.goto('/tasks')
await page.getByRole('tab', { name: /Failed/ }).click()
const dismissPromise = page.waitForRequest((request) => {
const data = request.postDataJSON()
const isValid = Array.isArray(data['tasks']) && data['tasks'].includes(255)
return (
isValid &&
request.method() === 'POST' &&
request.url().includes('/api/acknowledge_tasks/')
)
})
await page
.getByRole('button', { name: 'Dismiss', exact: true })
.first()
.click()
await dismissPromise
})
test('should support dismiss all tasks', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
await page.goto('/tasks')
await expect(page.getByRole('button', { name: 'Dismiss all' })).toBeEnabled()
await page.getByRole('button', { name: 'Dismiss all' }).click()
const dismissPromise = page.waitForRequest((request) => {
const data = request.postDataJSON()
const isValid = Array.isArray(data['tasks'])
return (
isValid &&
request.method() === 'POST' &&
request.url().includes('/api/acknowledge_tasks/')
)
})
await page.getByRole('button', { name: /Dismiss/ }).click()
await dismissPromise
})
test('should warn on dismiss all tasks', async ({ page }) => {
await page.routeFromHAR(REQUESTS_HAR, { notFound: 'fallback' })
await page.goto('/tasks')
await expect(page.getByRole('button', { name: 'Dismiss all' })).toBeEnabled()
await page.getByRole('button', { name: 'Dismiss all' }).click()
await expect(page.getByRole('dialog')).toHaveCount(1)
})

View File

@ -333,84 +333,84 @@
<source>The dashboard can be used to show saved views, such as an &apos;Inbox&apos;. Those settings are found under Settings &gt; Saved Views once you have created some.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">145</context>
<context context-type="linenumber">146</context>
</context-group>
</trans-unit>
<trans-unit id="9075755296812854717" datatype="html">
<source>Drag-and-drop documents here to start uploading or place them in the consume folder. You can also drag-and-drop documents anywhere on all other pages of the web app. Once you do, Paperless-ngx will start training its machine learning algorithms.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">155</context>
<context context-type="linenumber">153</context>
</context-group>
</trans-unit>
<trans-unit id="7495498057594070122" datatype="html">
<source>The documents list shows all of your documents and allows for filtering as well as bulk-editing. There are three different view styles: list, small cards and large cards. A list of documents currently opened for editing is shown in the sidebar.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">165</context>
<context context-type="linenumber">158</context>
</context-group>
</trans-unit>
<trans-unit id="1334220418719920556" datatype="html">
<source>The filtering tools allow you to quickly find documents using various searches, dates, tags, etc.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">178</context>
<context context-type="linenumber">165</context>
</context-group>
</trans-unit>
<trans-unit id="5427326625898532358" datatype="html">
<source>Any combination of filters can be saved as a &apos;view&apos; which can then be displayed on the dashboard and / or sidebar.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">189</context>
<context context-type="linenumber">171</context>
</context-group>
</trans-unit>
<trans-unit id="2804886236408698479" datatype="html">
<source>Tags, correspondents, document types and storage paths can all be managed using these pages. They can also be created from the document edit view.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">199</context>
<context context-type="linenumber">176</context>
</context-group>
</trans-unit>
<trans-unit id="4680387114119209483" datatype="html">
<source>File Tasks shows you documents that have been consumed, are waiting to be, or may have failed during the process.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">209</context>
<context context-type="linenumber">184</context>
</context-group>
</trans-unit>
<trans-unit id="8116994662047019809" datatype="html">
<source>Check out the settings for various tweaks to the web app, toggle settings for saved views or setup e-mail checking.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">219</context>
<context context-type="linenumber">192</context>
</context-group>
</trans-unit>
<trans-unit id="7172877665285340082" datatype="html">
<source>Thank you! 🙏</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">229</context>
<context context-type="linenumber">200</context>
</context-group>
</trans-unit>
<trans-unit id="7354947513482088740" datatype="html">
<source>There are &lt;em&gt;tons&lt;/em&gt; more features and info we didn&apos;t cover here, but this should get you started. Check out the documentation or visit the project on GitHub to learn more or to report issues.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">231</context>
<context context-type="linenumber">202</context>
</context-group>
</trans-unit>
<trans-unit id="4270528545616947218" datatype="html">
<source>Lastly, on behalf of every contributor to this community-supported project, thank you for using Paperless-ngx!</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">233</context>
<context context-type="linenumber">204</context>
</context-group>
</trans-unit>
<trans-unit id="5749300816154614125" datatype="html">
<source>Initiating upload...</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/app.component.ts</context>
<context context-type="linenumber">289</context>
<context context-type="linenumber">273</context>
</context-group>
</trans-unit>
<trans-unit id="2173456130768795374" datatype="html">
@ -723,7 +723,7 @@
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">594</context>
<context context-type="linenumber">600</context>
</context-group>
</trans-unit>
<trans-unit id="2526035785704676448" datatype="html">
@ -2013,6 +2013,10 @@
<context context-type="sourcefile">src/app/components/common/input/permissions/permissions-form/permissions-form.component.html</context>
<context context-type="linenumber">46</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/dashboard/widgets/saved-view-widget/saved-view-widget.component.html</context>
<context context-type="linenumber">17</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/document-card-large/document-card-large.component.html</context>
<context context-type="linenumber">49</context>
@ -2296,13 +2300,43 @@
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">202</context>
<context context-type="linenumber">201</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/rest/document.service.ts</context>
<context context-type="linenumber">20</context>
</context-group>
</trans-unit>
<trans-unit id="8911158217491828773" datatype="html">
<source>View Preview</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/dashboard/widgets/saved-view-widget/saved-view-widget.component.html</context>
<context context-type="linenumber">19</context>
</context-group>
</trans-unit>
<trans-unit id="3099741642167775297" datatype="html">
<source>Download</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/dashboard/widgets/saved-view-widget/saved-view-widget.component.html</context>
<context context-type="linenumber">29</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-detail/document-detail.component.html</context>
<context context-type="linenumber">19</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/bulk-editor/bulk-editor.component.html</context>
<context context-type="linenumber">102</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/document-card-large/document-card-large.component.html</context>
<context context-type="linenumber">64</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/document-card-small/document-card-small.component.html</context>
<context context-type="linenumber">99</context>
</context-group>
</trans-unit>
<trans-unit id="1069523139277190436" datatype="html">
<source>Statistics</source>
<context-group purpose="location">
@ -2482,25 +2516,6 @@
<context context-type="linenumber">5,6</context>
</context-group>
</trans-unit>
<trans-unit id="3099741642167775297" datatype="html">
<source>Download</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-detail/document-detail.component.html</context>
<context context-type="linenumber">19</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/bulk-editor/bulk-editor.component.html</context>
<context context-type="linenumber">102</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/document-card-large/document-card-large.component.html</context>
<context context-type="linenumber">64</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/document-card-small/document-card-small.component.html</context>
<context context-type="linenumber">99</context>
</context-group>
</trans-unit>
<trans-unit id="8659635229098859487" datatype="html">
<source>Download original</source>
<context-group purpose="location">
@ -2898,19 +2913,19 @@
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">694</context>
<context context-type="linenumber">711</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">754</context>
<context context-type="linenumber">771</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">821</context>
<context context-type="linenumber">838</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">884</context>
<context context-type="linenumber">901</context>
</context-group>
</trans-unit>
<trans-unit id="1181910457994920507" datatype="html">
@ -2925,19 +2940,19 @@
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">696</context>
<context context-type="linenumber">713</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">756</context>
<context context-type="linenumber">773</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">823</context>
<context context-type="linenumber">840</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">886</context>
<context context-type="linenumber">903</context>
</context-group>
</trans-unit>
<trans-unit id="5729001209753056399" datatype="html">
@ -3499,7 +3514,7 @@
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">207</context>
<context context-type="linenumber">206</context>
</context-group>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/rest/document.service.ts</context>
@ -3625,7 +3640,9 @@
</context-group>
</trans-unit>
<trans-unit id="5195932016807797291" datatype="html">
<source>Correspondent: <x id="PH" equiv-text="this.correspondents.find((c) =&gt; c.id == +rule.value)?.name"/></source>
<source>Correspondent: <x id="PH" equiv-text="this.correspondents.find(
(c) =&gt; c.id == +rule.value
)?.name"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">118,120</context>
@ -3639,7 +3656,9 @@
</context-group>
</trans-unit>
<trans-unit id="317796810569008208" datatype="html">
<source>Document type: <x id="PH" equiv-text="this.documentTypes.find((dt) =&gt; dt.id == +rule.value)?.name"/></source>
<source>Document type: <x id="PH" equiv-text="this.documentTypes.find(
(dt) =&gt; dt.id == +rule.value
)?.name"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">128,130</context>
@ -3653,7 +3672,9 @@
</context-group>
</trans-unit>
<trans-unit id="232202047340644471" datatype="html">
<source>Storage path: <x id="PH" equiv-text="this.storagePaths.find((sp) =&gt; sp.id == +rule.value)?.name"/></source>
<source>Storage path: <x id="PH" equiv-text="this.storagePaths.find(
(sp) =&gt; sp.id == +rule.value
)?.name"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">138,140</context>
@ -3667,108 +3688,109 @@
</context-group>
</trans-unit>
<trans-unit id="8180755793012580465" datatype="html">
<source>Tag: <x id="PH" equiv-text="this.tags.find((t) =&gt; t.id == +rule.value)?.name"/></source>
<source>Tag: <x id="PH" equiv-text="this.tags.find((t) =&gt; t.id == +rule.value)
?.name"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">146,148</context>
<context context-type="linenumber">146,147</context>
</context-group>
</trans-unit>
<trans-unit id="6494566478302448576" datatype="html">
<source>Without any tag</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">152</context>
<context context-type="linenumber">151</context>
</context-group>
</trans-unit>
<trans-unit id="6523384805359286307" datatype="html">
<source>Title: <x id="PH" equiv-text="rule.value"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">156</context>
<context context-type="linenumber">155</context>
</context-group>
</trans-unit>
<trans-unit id="1872523635812236432" datatype="html">
<source>ASN: <x id="PH" equiv-text="rule.value"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">159</context>
<context context-type="linenumber">158</context>
</context-group>
</trans-unit>
<trans-unit id="102674688969746976" datatype="html">
<source>Owner: <x id="PH" equiv-text="rule.value"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">162</context>
<context context-type="linenumber">161</context>
</context-group>
</trans-unit>
<trans-unit id="3550877650686009106" datatype="html">
<source>Owner not in: <x id="PH" equiv-text="rule.value"/></source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">165</context>
<context context-type="linenumber">164</context>
</context-group>
</trans-unit>
<trans-unit id="1082034558646673343" datatype="html">
<source>Without an owner</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">168</context>
<context context-type="linenumber">167</context>
</context-group>
</trans-unit>
<trans-unit id="3100631071441658964" datatype="html">
<source>Title &amp; content</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">205</context>
<context context-type="linenumber">204</context>
</context-group>
</trans-unit>
<trans-unit id="1010505078885609376" datatype="html">
<source>Advanced search</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">210</context>
<context context-type="linenumber">209</context>
</context-group>
</trans-unit>
<trans-unit id="2649431021108393503" datatype="html">
<source>More like</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">216</context>
<context context-type="linenumber">215</context>
</context-group>
</trans-unit>
<trans-unit id="3697582909018473071" datatype="html">
<source>equals</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">235</context>
<context context-type="linenumber">234</context>
</context-group>
</trans-unit>
<trans-unit id="5325481293405718739" datatype="html">
<source>is empty</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">239</context>
<context context-type="linenumber">238</context>
</context-group>
</trans-unit>
<trans-unit id="6166785695326182482" datatype="html">
<source>is not empty</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">243</context>
<context context-type="linenumber">242</context>
</context-group>
</trans-unit>
<trans-unit id="4686622206659266699" datatype="html">
<source>greater than</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">247</context>
<context context-type="linenumber">246</context>
</context-group>
</trans-unit>
<trans-unit id="8014012170270529279" datatype="html">
<source>less than</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/document-list/filter-editor/filter-editor.component.ts</context>
<context context-type="linenumber">251</context>
<context context-type="linenumber">250</context>
</context-group>
</trans-unit>
<trans-unit id="7210076240260527720" datatype="html">
@ -4471,231 +4493,231 @@
<source>Saved view &quot;<x id="PH" equiv-text="savedView.name"/>&quot; deleted.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">476</context>
<context context-type="linenumber">482</context>
</context-group>
</trans-unit>
<trans-unit id="3891152409365583719" datatype="html">
<source>Settings saved</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">578</context>
<context context-type="linenumber">584</context>
</context-group>
</trans-unit>
<trans-unit id="7217000812750597833" datatype="html">
<source>Settings were saved successfully.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">579</context>
<context context-type="linenumber">585</context>
</context-group>
</trans-unit>
<trans-unit id="525012668859298131" datatype="html">
<source>Settings were saved successfully. Reload is required to apply some changes.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">583</context>
<context context-type="linenumber">589</context>
</context-group>
</trans-unit>
<trans-unit id="8491974984518503778" datatype="html">
<source>Reload now</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">584</context>
<context context-type="linenumber">590</context>
</context-group>
</trans-unit>
<trans-unit id="6839066544204061364" datatype="html">
<source>Use system language</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">603</context>
<context context-type="linenumber">609</context>
</context-group>
</trans-unit>
<trans-unit id="7729897675462249787" datatype="html">
<source>Use date format of display language</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">610</context>
<context context-type="linenumber">616</context>
</context-group>
</trans-unit>
<trans-unit id="5260584511980773458" datatype="html">
<source>Error while storing settings on server.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">630</context>
<context context-type="linenumber">636</context>
</context-group>
</trans-unit>
<trans-unit id="4510369340305901516" datatype="html">
<source>Password has been changed, you will be logged out momentarily.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">662</context>
<context context-type="linenumber">679</context>
</context-group>
</trans-unit>
<trans-unit id="2753185112875184719" datatype="html">
<source>Saved user &quot;<x id="PH" equiv-text="newUser.username"/>&quot;.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">669</context>
<context context-type="linenumber">686</context>
</context-group>
</trans-unit>
<trans-unit id="3471101514724661554" datatype="html">
<source>Error saving user.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">681</context>
<context context-type="linenumber">698</context>
</context-group>
</trans-unit>
<trans-unit id="5565868288871970148" datatype="html">
<source>Confirm delete user account</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">692</context>
<context context-type="linenumber">709</context>
</context-group>
</trans-unit>
<trans-unit id="8133663925694885325" datatype="html">
<source>This operation will permanently delete this user account.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">693</context>
<context context-type="linenumber">710</context>
</context-group>
</trans-unit>
<trans-unit id="857903183180440990" datatype="html">
<source>Deleted user</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">702</context>
<context context-type="linenumber">719</context>
</context-group>
</trans-unit>
<trans-unit id="1942566571910298572" datatype="html">
<source>Error deleting user.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">710</context>
<context context-type="linenumber">727</context>
</context-group>
</trans-unit>
<trans-unit id="5766640174051730159" datatype="html">
<source>Saved group &quot;<x id="PH" equiv-text="newGroup.name"/>&quot;.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">731</context>
<context context-type="linenumber">748</context>
</context-group>
</trans-unit>
<trans-unit id="8382042988405122578" datatype="html">
<source>Error saving group.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">741</context>
<context context-type="linenumber">758</context>
</context-group>
</trans-unit>
<trans-unit id="6538873300613683004" datatype="html">
<source>Confirm delete user group</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">752</context>
<context context-type="linenumber">769</context>
</context-group>
</trans-unit>
<trans-unit id="7710984639498518244" datatype="html">
<source>This operation will permanently delete this user group.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">753</context>
<context context-type="linenumber">770</context>
</context-group>
</trans-unit>
<trans-unit id="6834066329827670963" datatype="html">
<source>Deleted group</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">762</context>
<context context-type="linenumber">779</context>
</context-group>
</trans-unit>
<trans-unit id="8850738980935204840" datatype="html">
<source>Error deleting group.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">770</context>
<context context-type="linenumber">787</context>
</context-group>
</trans-unit>
<trans-unit id="6327501535846658797" datatype="html">
<source>Saved account &quot;<x id="PH" equiv-text="newMailAccount.name"/>&quot;.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">796</context>
<context context-type="linenumber">813</context>
</context-group>
</trans-unit>
<trans-unit id="8067594003836508139" datatype="html">
<source>Error saving account.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">808</context>
<context context-type="linenumber">825</context>
</context-group>
</trans-unit>
<trans-unit id="5641934153807844674" datatype="html">
<source>Confirm delete mail account</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">819</context>
<context context-type="linenumber">836</context>
</context-group>
</trans-unit>
<trans-unit id="7176985344323395435" datatype="html">
<source>This operation will permanently delete this mail account.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">820</context>
<context context-type="linenumber">837</context>
</context-group>
</trans-unit>
<trans-unit id="4233826387148482123" datatype="html">
<source>Deleted mail account</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">829</context>
<context context-type="linenumber">846</context>
</context-group>
</trans-unit>
<trans-unit id="6202503362522392111" datatype="html">
<source>Error deleting mail account.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">838</context>
<context context-type="linenumber">855</context>
</context-group>
</trans-unit>
<trans-unit id="123368655395433699" datatype="html">
<source>Saved rule &quot;<x id="PH" equiv-text="newMailRule.name"/>&quot;.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">859</context>
<context context-type="linenumber">876</context>
</context-group>
</trans-unit>
<trans-unit id="8951124554918814321" datatype="html">
<source>Error saving rule.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">871</context>
<context context-type="linenumber">888</context>
</context-group>
</trans-unit>
<trans-unit id="3896080636020672118" datatype="html">
<source>Confirm delete mail rule</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">882</context>
<context context-type="linenumber">899</context>
</context-group>
</trans-unit>
<trans-unit id="2250372580580310337" datatype="html">
<source>This operation will permanently delete this mail rule.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">883</context>
<context context-type="linenumber">900</context>
</context-group>
</trans-unit>
<trans-unit id="9077981247971516916" datatype="html">
<source>Deleted mail rule</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">892</context>
<context context-type="linenumber">909</context>
</context-group>
</trans-unit>
<trans-unit id="2033194641751367552" datatype="html">
<source>Error deleting mail rule.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/components/manage/settings/settings.component.ts</context>
<context context-type="linenumber">901</context>
<context context-type="linenumber">918</context>
</context-group>
</trans-unit>
<trans-unit id="5101757640976222639" datatype="html">
@ -5083,28 +5105,28 @@
<source>Document already exists.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">15</context>
<context context-type="linenumber">16</context>
</context-group>
</trans-unit>
<trans-unit id="6108404046106249255" datatype="html">
<source>Document with ASN already exists.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">16</context>
<context context-type="linenumber">17</context>
</context-group>
</trans-unit>
<trans-unit id="148389968432135849" datatype="html">
<source>File not found.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">17</context>
<context context-type="linenumber">18</context>
</context-group>
</trans-unit>
<trans-unit id="1520671543092565667" datatype="html">
<source>Pre-consume script does not exist.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">18</context>
<context context-type="linenumber">19</context>
</context-group>
<note priority="1" from="description">Pre-Consume is a term that appears like that in the documentation as well and does not need a specific translation</note>
</trans-unit>
@ -5112,7 +5134,7 @@
<source>Error while executing pre-consume script.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">19</context>
<context context-type="linenumber">20</context>
</context-group>
<note priority="1" from="description">Pre-Consume is a term that appears like that in the documentation as well and does not need a specific translation</note>
</trans-unit>
@ -5120,7 +5142,7 @@
<source>Post-consume script does not exist.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">20</context>
<context context-type="linenumber">21</context>
</context-group>
<note priority="1" from="description">Post-Consume is a term that appears like that in the documentation as well and does not need a specific translation</note>
</trans-unit>
@ -5128,7 +5150,7 @@
<source>Error while executing post-consume script.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">21</context>
<context context-type="linenumber">22</context>
</context-group>
<note priority="1" from="description">Post-Consume is a term that appears like that in the documentation as well and does not need a specific translation</note>
</trans-unit>
@ -5136,49 +5158,49 @@
<source>Received new file.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">22</context>
<context context-type="linenumber">23</context>
</context-group>
</trans-unit>
<trans-unit id="7337565919209746135" datatype="html">
<source>File type not supported.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">23</context>
<context context-type="linenumber">24</context>
</context-group>
</trans-unit>
<trans-unit id="5002399167376099234" datatype="html">
<source>Processing document...</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">24</context>
<context context-type="linenumber">25</context>
</context-group>
</trans-unit>
<trans-unit id="1085975194762600381" datatype="html">
<source>Generating thumbnail...</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">25</context>
<context context-type="linenumber">26</context>
</context-group>
</trans-unit>
<trans-unit id="3280851677698431426" datatype="html">
<source>Retrieving date from document...</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">26</context>
<context context-type="linenumber">27</context>
</context-group>
</trans-unit>
<trans-unit id="7162102384876037296" datatype="html">
<source>Saving document...</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">27</context>
<context context-type="linenumber">28</context>
</context-group>
</trans-unit>
<trans-unit id="4550450765009165976" datatype="html">
<source>Finished.</source>
<context-group purpose="location">
<context context-type="sourcefile">src/app/services/consumer-status.service.ts</context>
<context context-type="linenumber">28</context>
<context context-type="linenumber">29</context>
</context-group>
</trans-unit>
<trans-unit id="5523607037798226031" datatype="html">

src-ui/package-lock.json (generated, 2110 lines changed)

File diff suppressed because it is too large

View File

@ -10,50 +10,50 @@
},
"private": true,
"dependencies": {
"@angular/common": "~16.1.3",
"@angular/compiler": "~16.1.3",
"@angular/core": "~16.1.3",
"@angular/forms": "~16.1.3",
"@angular/localize": "~16.1.3",
"@angular/platform-browser": "~16.1.3",
"@angular/platform-browser-dynamic": "~16.1.3",
"@angular/router": "~16.1.3",
"@ng-bootstrap/ng-bootstrap": "^15.0.1",
"@ng-select/ng-select": "^11.0.0",
"@angular/common": "~16.1.7",
"@angular/compiler": "~16.1.7",
"@angular/core": "~16.1.7",
"@angular/forms": "~16.1.7",
"@angular/localize": "~16.1.7",
"@angular/platform-browser": "~16.1.7",
"@angular/platform-browser-dynamic": "~16.1.7",
"@angular/router": "~16.1.7",
"@ng-bootstrap/ng-bootstrap": "^15.1.0",
"@ng-select/ng-select": "^11.1.1",
"@ngneat/dirty-check-forms": "^3.0.3",
"@popperjs/core": "^2.11.8",
"bootstrap": "^5.3.0",
"bootstrap": "^5.3.1",
"file-saver": "^2.0.5",
"mime-names": "^1.0.0",
"ng2-pdf-viewer": "^9.1.5",
"ngx-color": "^9.0.0",
"ngx-cookie-service": "^16.0.0",
"ngx-file-drop": "^16.0.0",
"ngx-ui-tour-ng-bootstrap": "^13.0.2",
"ngx-ui-tour-ng-bootstrap": "^13.0.3",
"rxjs": "^7.8.1",
"tslib": "^2.6.0",
"tslib": "^2.6.1",
"uuid": "^9.0.0",
"zone.js": "^0.13.0"
},
"devDependencies": {
"@angular-builders/jest": "16.0.0",
"@angular-devkit/build-angular": "~16.1.3",
"@angular-eslint/builder": "16.0.3",
"@angular-eslint/eslint-plugin": "16.0.3",
"@angular-eslint/eslint-plugin-template": "16.0.3",
"@angular-eslint/schematics": "16.0.3",
"@angular-eslint/template-parser": "16.0.3",
"@angular/cli": "~16.1.3",
"@angular-devkit/build-angular": "~16.1.6",
"@angular-eslint/builder": "16.1.0",
"@angular-eslint/eslint-plugin": "16.1.0",
"@angular-eslint/eslint-plugin-template": "16.1.0",
"@angular-eslint/schematics": "16.1.0",
"@angular-eslint/template-parser": "16.1.0",
"@angular/cli": "~16.1.6",
"@angular/compiler-cli": "~16.1.3",
"@playwright/test": "^1.35.1",
"@types/jest": "^29.5.0",
"@types/node": "^20.2.5",
"@typescript-eslint/eslint-plugin": "^5.59.2",
"@typescript-eslint/parser": "^5.59.2",
"@playwright/test": "^1.36.2",
"@types/jest": "^29.5.3",
"@types/node": "^20.4.5",
"@typescript-eslint/eslint-plugin": "^6.2.1",
"@typescript-eslint/parser": "^6.2.1",
"concurrently": "^8.1.0",
"eslint": "^8.39.0",
"jest": "29.5.0",
"jest-environment-jsdom": "^29.5.0",
"eslint": "^8.46.0",
"jest": "29.6.2",
"jest-environment-jsdom": "^29.6.2",
"jest-preset-angular": "^13.1.1",
"jest-websocket-mock": "^2.4.0",
"ts-node": "~10.9.1",

View File

@ -139,104 +139,88 @@ export class AppComponent implements OnInit, OnDestroy {
const nextBtnTitle = $localize`Next`
const endBtnTitle = $localize`End`
this.tourService.initialize([
this.tourService.initialize(
[
{
anchorId: 'tour.dashboard',
content: $localize`The dashboard can be used to show saved views, such as an 'Inbox'. Those settings are found under Settings > Saved Views once you have created some.`,
route: '/dashboard',
delayAfterNavigation: 500,
isOptional: false,
},
{
anchorId: 'tour.upload-widget',
content: $localize`Drag-and-drop documents here to start uploading or place them in the consume folder. You can also drag-and-drop documents anywhere on all other pages of the web app. Once you do, Paperless-ngx will start training its machine learning algorithms.`,
route: '/dashboard',
},
{
anchorId: 'tour.documents',
content: $localize`The documents list shows all of your documents and allows for filtering as well as bulk-editing. There are three different view styles: list, small cards and large cards. A list of documents currently opened for editing is shown in the sidebar.`,
route: '/documents?sort=created&reverse=1&page=1',
delayAfterNavigation: 500,
placement: 'bottom',
},
{
anchorId: 'tour.documents-filter-editor',
content: $localize`The filtering tools allow you to quickly find documents using various searches, dates, tags, etc.`,
route: '/documents?sort=created&reverse=1&page=1',
placement: 'bottom',
},
{
anchorId: 'tour.documents-views',
content: $localize`Any combination of filters can be saved as a 'view' which can then be displayed on the dashboard and / or sidebar.`,
route: '/documents?sort=created&reverse=1&page=1',
},
{
anchorId: 'tour.tags',
content: $localize`Tags, correspondents, document types and storage paths can all be managed using these pages. They can also be created from the document edit view.`,
route: '/tags',
backdropConfig: {
offset: 0,
},
},
{
anchorId: 'tour.file-tasks',
content: $localize`File Tasks shows you documents that have been consumed, are waiting to be, or may have failed during the process.`,
route: '/tasks',
backdropConfig: {
offset: 0,
},
},
{
anchorId: 'tour.settings',
content: $localize`Check out the settings for various tweaks to the web app, toggle settings for saved views or set up e-mail checking.`,
route: '/settings',
backdropConfig: {
offset: 0,
},
},
{
anchorId: 'tour.outro',
title: $localize`Thank you! 🙏`,
content:
$localize`There are <em>tons</em> more features and info we didn't cover here, but this should get you started. Check out the documentation or visit the project on GitHub to learn more or to report issues.` +
'<br/><br/>' +
$localize`Lastly, on behalf of every contributor to this community-supported project, thank you for using Paperless-ngx!`,
route: '/dashboard',
isOptional: false,
backdropConfig: {
offset: 0,
},
},
],
{
anchorId: 'tour.dashboard',
content: $localize`The dashboard can be used to show saved views, such as an 'Inbox'. Those settings are found under Settings > Saved Views once you have created some.`,
route: '/dashboard',
enableBackdrop: true,
delayAfterNavigation: 500,
backdropConfig: {
offset: 10,
},
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.upload-widget',
content: $localize`Drag-and-drop documents here to start uploading or place them in the consume folder. You can also drag-and-drop documents anywhere on all other pages of the web app. Once you do, Paperless-ngx will start training its machine learning algorithms.`,
route: '/dashboard',
enableBackdrop: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.documents',
content: $localize`The documents list shows all of your documents and allows for filtering as well as bulk-editing. There are three different view styles: list, small cards and large cards. A list of documents currently opened for editing is shown in the sidebar.`,
route: '/documents?sort=created&reverse=1&page=1',
delayAfterNavigation: 500,
placement: 'bottom',
enableBackdrop: true,
disableScrollToAnchor: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.documents-filter-editor',
content: $localize`The filtering tools allow you to quickly find documents using various searches, dates, tags, etc.`,
route: '/documents?sort=created&reverse=1&page=1',
placement: 'bottom',
enableBackdrop: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.documents-views',
content: $localize`Any combination of filters can be saved as a 'view' which can then be displayed on the dashboard and / or sidebar.`,
route: '/documents?sort=created&reverse=1&page=1',
enableBackdrop: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.tags',
content: $localize`Tags, correspondents, document types and storage paths can all be managed using these pages. They can also be created from the document edit view.`,
route: '/tags',
enableBackdrop: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.file-tasks',
content: $localize`File Tasks shows you documents that have been consumed, are waiting to be, or may have failed during the process.`,
route: '/tasks',
enableBackdrop: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.settings',
content: $localize`Check out the settings for various tweaks to the web app, toggle settings for saved views or set up e-mail checking.`,
route: '/settings',
enableBackdrop: true,
isOptional: true,
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
{
anchorId: 'tour.outro',
title: $localize`Thank you! 🙏`,
content:
$localize`There are <em>tons</em> more features and info we didn't cover here, but this should get you started. Check out the documentation or visit the project on GitHub to learn more or to report issues.` +
'<br/><br/>' +
$localize`Lastly, on behalf of every contributor to this community-supported project, thank you for using Paperless-ngx!`,
route: '/dashboard',
prevBtnTitle,
nextBtnTitle,
endBtnTitle,
},
])
useLegacyTitle: true,
}
)
this.tourService.start$.subscribe(() => {
this.renderer.addClass(document.body, 'tour-active')
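
A note on the refactor above: the options that were previously repeated on every tour step (backdrop settings, the prev/next/end button titles, the optional flag) now go into the second argument of TourService.initialize, and each step only declares what differs from those defaults. A minimal sketch of the pattern, assuming the TourService import from ngx-ui-tour-ng-bootstrap; the anchor ids and contents are placeholders, and only useLegacyTitle is visible in the hunk above, so the other defaults shown here are an assumption about what moved into that object:

import { TourService } from 'ngx-ui-tour-ng-bootstrap'

export class TourSetupSketch {
  constructor(private tourService: TourService) {}

  initTour() {
    this.tourService.initialize(
      [
        // per-step options: only what differs from the defaults below
        { anchorId: 'example.intro', content: 'First stop', route: '/dashboard', isOptional: false },
        { anchorId: 'example.list', content: 'Second stop', route: '/documents', delayAfterNavigation: 500 },
      ],
      {
        // defaults applied to every step unless a step overrides them
        enableBackdrop: true,
        backdropConfig: { offset: 10 },
        isOptional: true,
        prevBtnTitle: 'Prev',
        nextBtnTitle: 'Next',
        endBtnTitle: 'End',
        useLegacyTitle: true,
      }
    )
  }
}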

View File

@ -22,7 +22,7 @@ export enum EditDialogMode {
@Directive()
export abstract class EditDialogComponent<
T extends ObjectWithPermissions | ObjectWithId
T extends ObjectWithPermissions | ObjectWithId,
> implements OnInit
{
constructor(

View File

@ -26,7 +26,10 @@ import { EditDialogMode } from '../../edit-dialog/edit-dialog.component'
styleUrls: ['./tags.component.scss'],
})
export class TagsComponent implements OnInit, ControlValueAccessor {
constructor(private tagService: TagService, private modalService: NgbModal) {
constructor(
private tagService: TagService,
private modalService: NgbModal
) {
this.createTagRef = this.createTag.bind(this)
}

View File

@ -6,14 +6,33 @@
<table content class="table table-sm table-hover table-borderless mb-0">
<thead>
<tr>
<th i18n>Created</th>
<th scope="col" i18n>Created</th>
<th scope="col" i18n>Title</th>
</tr>
</thead>
<tbody *appIfPermissions="{ action: PermissionAction.View, type: PermissionType.Document }">
<tr *ngFor="let doc of documents">
<tr *ngFor="let doc of documents" (mouseleave)="mouseLeaveCard()">
<td><a routerLink="/documents/{{doc.id}}" class="d-block text-dark text-decoration-none">{{doc.created_date | customDate}}</a></td>
<td><a routerLink="/documents/{{doc.id}}" class="d-block text-dark text-decoration-none">{{doc.title | documentTitle}}<app-tag [tag]="t" *ngFor="let t of doc.tags$ | async" class="ms-1" (click)="clickTag(t, $event)"></app-tag></a></td>
<td class="position-relative">
<a routerLink="/documents/{{doc.id}}" title="Edit" i18n-title class="d-block text-dark text-decoration-none">{{doc.title | documentTitle}}<app-tag [tag]="t" *ngFor="let t of doc.tags$ | async" class="ms-1" (click)="clickTag(t, $event)"></app-tag></a>
<div class="btn-group position-absolute top-50 end-0 translate-middle-y">
<a [href]="getPreviewUrl(doc)" title="View Preview" i18n-title target="_blank" class="btn btn-sm px-4 py-0 btn-dark border-dark-subtle"
[ngbPopover]="previewContent" [popoverTitle]="doc.title | documentTitle"
autoClose="true" popoverClass="shadow popover-preview" container="body" (mouseenter)="mouseEnterPreview(doc)" (mouseleave)="mouseLeavePreview()" #popover="ngbPopover">
<svg class="buttonicon-xs" fill="currentColor">
<use xlink:href="assets/bootstrap-icons.svg#eye"/>
</svg>
</a>
<ng-template #previewContent>
<object [data]="getPreviewUrl(doc) | safeUrl" class="preview" width="100%"></object>
</ng-template>
<a [href]="getDownloadUrl(doc)" class="btn btn-sm px-4 py-0 btn-dark border-dark-subtle" title="Download" i18n-title (click)="$event.stopPropagation()">
<svg class="buttonicon-xs" fill="currentColor">
<use xlink:href="assets/bootstrap-icons.svg#download"/>
</svg>
</a>
</div>
</td>
</tr>
</tbody>
</table>
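
The preview popover in this template binds an application URL into <object [data]="getPreviewUrl(doc) | safeUrl">, and the SafeUrlPipe is added to the spec imports further down. Angular will not bind an arbitrary string as a resource URL, so a pipe along these lines is required; this is a hedged sketch of what such a pipe typically looks like, not necessarily the project's exact implementation:

import { Pipe, PipeTransform } from '@angular/core'
import { DomSanitizer, SafeResourceUrl } from '@angular/platform-browser'

@Pipe({ name: 'safeUrl' })
export class SafeUrlPipe implements PipeTransform {
  constructor(private sanitizer: DomSanitizer) {}

  transform(url: string): SafeResourceUrl {
    // Only acceptable because the URL comes from the app's own API
    // (the document preview endpoint), never from user-controlled input.
    return this.sanitizer.bypassSecurityTrustResourceUrl(url)
  }
}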

View File

@ -10,3 +10,15 @@ th:first-child {
tbody app-tag {
cursor: pointer;
}
tr .btn-group {
margin-right: 2px;
box-shadow: -6px 0px 4px -1px rgba(var(--bs-body-bg-rgb), .5);
opacity: 0;
pointer-events: none;
}
tr:hover .btn-group {
opacity: 1;
pointer-events: all;
}

View File

@ -1,6 +1,11 @@
import { DatePipe } from '@angular/common'
import { HttpClientTestingModule } from '@angular/common/http/testing'
import { ComponentFixture, TestBed } from '@angular/core/testing'
import {
ComponentFixture,
TestBed,
fakeAsync,
tick,
} from '@angular/core/testing'
import { Router } from '@angular/router'
import { RouterTestingModule } from '@angular/router/testing'
import { NgbModule } from '@ng-bootstrap/ng-bootstrap'
@ -21,6 +26,8 @@ import { PermissionsService } from 'src/app/services/permissions.service'
import { DocumentService } from 'src/app/services/rest/document.service'
import { WidgetFrameComponent } from '../widget-frame/widget-frame.component'
import { SavedViewWidgetComponent } from './saved-view-widget.component'
import { By } from '@angular/platform-browser'
import { SafeUrlPipe } from 'src/app/pipes/safeurl.pipe'
const savedView: PaperlessSavedView = {
id: 1,
@ -64,6 +71,7 @@ describe('SavedViewWidgetComponent', () => {
IfPermissionsDirective,
CustomDatePipe,
DocumentTitlePipe,
SafeUrlPipe,
],
providers: [
PermissionsGuard,
@ -107,8 +115,39 @@ describe('SavedViewWidgetComponent', () => {
fixture.detectChanges()
expect(fixture.debugElement.nativeElement.textContent).toContain('doc2')
expect(fixture.debugElement.nativeElement.textContent).toContain('doc3')
// preview + download buttons
expect(
fixture.debugElement.queryAll(By.css('td a.btn'))[0].attributes['href']
).toEqual(component.getPreviewUrl(documentResults[0]))
expect(
fixture.debugElement.queryAll(By.css('td a.btn'))[1].attributes['href']
).toEqual(component.getDownloadUrl(documentResults[0]))
})
it('should show preview on mouseover after delay to preload content', fakeAsync(() => {
jest.spyOn(documentService, 'listFiltered').mockReturnValue(
of({
all: [2, 3],
count: 2,
results: documentResults,
})
)
component.ngOnInit()
fixture.detectChanges()
component.mouseEnterPreview(documentResults[0])
expect(component.popover.isOpen()).toBeTruthy()
expect(component.popoverHidden).toBeTruthy()
tick(600)
expect(component.popoverHidden).toBeFalsy()
component.mouseLeaveCard()
component.mouseEnterPreview(documentResults[1])
tick(100)
component.mouseLeavePreview()
tick(600)
expect(component.popover.isOpen()).toBeFalsy()
}))
it('should call api endpoint and load results', () => {
const listAllSpy = jest.spyOn(documentService, 'listFiltered')
listAllSpy.mockReturnValue(

View File

@ -1,4 +1,12 @@
import { Component, Input, OnDestroy, OnInit } from '@angular/core'
import {
Component,
Input,
OnDestroy,
OnInit,
QueryList,
ViewChild,
ViewChildren,
} from '@angular/core'
import { Router } from '@angular/router'
import { Subscription } from 'rxjs'
import { PaperlessDocument } from 'src/app/data/paperless-document'
@ -10,11 +18,15 @@ import { FILTER_HAS_TAGS_ALL } from 'src/app/data/filter-rule-type'
import { OpenDocumentsService } from 'src/app/services/open-documents.service'
import { DocumentListViewService } from 'src/app/services/document-list-view.service'
import { ComponentWithPermissions } from 'src/app/components/with-permissions/with-permissions.component'
import { NgbPopover } from '@ng-bootstrap/ng-bootstrap'
@Component({
selector: 'app-saved-view-widget',
templateUrl: './saved-view-widget.component.html',
styleUrls: ['./saved-view-widget.component.scss'],
styleUrls: [
'./saved-view-widget.component.scss',
'../../../document-list/popover-preview/popover-preview.scss',
],
})
export class SavedViewWidgetComponent
extends ComponentWithPermissions
@ -39,6 +51,12 @@ export class SavedViewWidgetComponent
subscription: Subscription
@ViewChildren('popover') popovers: QueryList<NgbPopover>
popover: NgbPopover
mouseOnPreview = false
popoverHidden = true
ngOnInit(): void {
this.reload()
this.subscription = this.consumerStatusService
@ -87,4 +105,38 @@ export class SavedViewWidgetComponent
{ rule_type: FILTER_HAS_TAGS_ALL, value: tag.id.toString() },
])
}
getPreviewUrl(document: PaperlessDocument): string {
return this.documentService.getPreviewUrl(document.id)
}
getDownloadUrl(document: PaperlessDocument): string {
return this.documentService.getDownloadUrl(document.id)
}
mouseEnterPreview(doc: PaperlessDocument) {
this.popover = this.popovers.get(this.documents.indexOf(doc))
this.mouseOnPreview = true
if (!this.popover.isOpen()) {
// we're going to open but hide to pre-load content during hover delay
this.popover.open()
this.popoverHidden = true
setTimeout(() => {
if (this.mouseOnPreview) {
// show popover
this.popoverHidden = false
} else {
this.popover.close()
}
}, 600)
}
}
mouseLeavePreview() {
this.mouseOnPreview = false
}
mouseLeaveCard() {
this.popover?.close()
}
}
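
mouseEnterPreview above opens the popover immediately but keeps it hidden (via popoverHidden and the popover-hidden styles) for roughly 600 ms so the embedded preview can load during the hover delay, then either reveals it or closes it if the pointer has already left. A condensed, framework-free sketch of that pattern; PopoverLike and the class name are illustrative, while the open/close/isOpen surface mirrors what NgbPopover provides:

interface PopoverLike {
  isOpen(): boolean
  open(): void
  close(): void
}

class HoverPreloadPreview {
  hidden = true // bind this to a class that sets opacity: 0 on the popover
  private hovered = false

  constructor(
    private popover: PopoverLike,
    private revealDelayMs = 600
  ) {}

  enter() {
    this.hovered = true
    if (!this.popover.isOpen()) {
      // open now, but hidden, so the preview content starts loading during the delay
      this.popover.open()
      this.hidden = true
      setTimeout(() => {
        if (this.hovered) {
          this.hidden = false // pointer is still here: reveal the preloaded preview
        } else {
          this.popover.close() // pointer left before the delay elapsed
        }
      }, this.revealDelayMs)
    }
  }

  leave() {
    this.hovered = false
  }
}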

View File

@ -84,7 +84,7 @@ describe('UploadFileWidgetComponent', () => {
it('should change color by status phase', () => {
const processingStatus = new FileStatus()
processingStatus.phase = FileStatusPhase.PROCESSING
processingStatus.phase = FileStatusPhase.WORKING
expect(component.getStatusColor(processingStatus)).toEqual('primary')
const failedStatus = new FileStatus()
failedStatus.phase = FileStatusPhase.FAILED
@ -134,7 +134,7 @@ function mockConsumerStatuses(consumerStatusService) {
switch (phase) {
case FileStatusPhase.FAILED:
return [new FileStatus()]
case FileStatusPhase.PROCESSING:
case FileStatusPhase.WORKING:
return [new FileStatus(), new FileStatus()]
case FileStatusPhase.STARTED:
return [new FileStatus(), new FileStatus(), new FileStatus()]

View File

@ -90,8 +90,9 @@ export class UploadFileWidgetComponent extends ComponentWithPermissions {
getStatusColor(status: FileStatus) {
switch (status.phase) {
case FileStatusPhase.PROCESSING:
case FileStatusPhase.UPLOADING:
case FileStatusPhase.STARTED:
case FileStatusPhase.WORKING:
return 'primary'
case FileStatusPhase.FAILED:
return 'danger'

View File

@ -174,7 +174,7 @@
<li [ngbNavItem]="DocumentDetailNavIDs.Notes" *ngIf="notesEnabled">
<a ngbNavLink i18n>Notes <span *ngIf="document?.notes.length" class="badge text-bg-secondary ms-1">{{document.notes.length}}</span></a>
<ng-template ngbNavContent>
<app-document-notes [documentId]="documentId" [notes]="document?.notes" (updated)="notesUpdated($event)"></app-document-notes>
<app-document-notes [documentId]="documentId" [notes]="document?.notes" [addDisabled]="!userCanEdit" (updated)="notesUpdated($event)"></app-document-notes>
</ng-template>
</li>

View File

@ -50,7 +50,7 @@
</a>
<a class="btn btn-sm btn-outline-secondary" target="_blank" [href]="previewUrl"
[ngbPopover]="previewContent" [popoverTitle]="document.title | documentTitle"
autoClose="true" popoverClass="shadow" (mouseenter)="mouseEnterPreview()" (mouseleave)="mouseLeavePreview()" #popover="ngbPopover">
autoClose="true" popoverClass="shadow popover-preview" (mouseenter)="mouseEnterPreview()" (mouseleave)="mouseLeavePreview()" #popover="ngbPopover">
<svg class="sidebaricon" fill="currentColor">
<use xlink:href="assets/bootstrap-icons.svg#eye"/>
</svg>&nbsp;<span class="d-none d-md-inline" i18n>View</span>
@ -94,7 +94,7 @@
<small>#{{document.archive_serial_number}}</small>
</div>
<ng-template #dateTooltip>
<div class="d-flex flex-column">
<div class="d-flex flex-column text-light">
<span i18n>Created: {{ document.created | customDate }}</span>
<span i18n>Added: {{ document.added | customDate }}</span>
<span i18n>Modified: {{ document.modified | customDate }}</span>

View File

@ -87,7 +87,7 @@
</a>
<a [href]="previewUrl" target="_blank" class="btn btn-sm btn-outline-secondary"
[ngbPopover]="previewContent" [popoverTitle]="document.title | documentTitle"
autoClose="true" popoverClass="shadow" (mouseenter)="mouseEnterPreview()" (mouseleave)="mouseLeavePreview()" #popover="ngbPopover">
autoClose="true" popoverClass="shadow popover-preview" (mouseenter)="mouseEnterPreview()" (mouseleave)="mouseLeavePreview()" #popover="ngbPopover">
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="currentColor" class="bi bi-eye" viewBox="0 0 16 16">
<path d="M16 8s-3-5.5-8-5.5S0 8 0 8s3 5.5 8 5.5S16 8 16 8zM1.173 8a13.133 13.133 0 0 1 1.66-2.043C4.12 4.668 5.88 3.5 8 3.5c2.12 0 3.879 1.168 5.168 2.457A13.133 13.133 0 0 1 14.828 8c-.058.087-.122.183-.195.288-.335.48-.83 1.12-1.465 1.755C11.879 11.332 10.119 12.5 8 12.5c-2.12 0-3.879-1.168-5.168-2.457A13.134 13.134 0 0 1 1.172 8z"/>
<path d="M8 5.5a2.5 2.5 0 1 0 0 5 2.5 2.5 0 0 0 0-5zM4.5 8a3.5 3.5 0 1 1 7 0 3.5 3.5 0 0 1-7 0z"/>

View File

@ -115,9 +115,9 @@ export class FilterEditorComponent implements OnInit, OnDestroy {
case FILTER_CORRESPONDENT:
case FILTER_HAS_CORRESPONDENT_ANY:
if (rule.value) {
return $localize`Correspondent: ${
this.correspondents.find((c) => c.id == +rule.value)?.name
}`
return $localize`Correspondent: ${this.correspondents.find(
(c) => c.id == +rule.value
)?.name}`
} else {
return $localize`Without correspondent`
}
@ -125,9 +125,9 @@ export class FilterEditorComponent implements OnInit, OnDestroy {
case FILTER_DOCUMENT_TYPE:
case FILTER_HAS_DOCUMENT_TYPE_ANY:
if (rule.value) {
return $localize`Document type: ${
this.documentTypes.find((dt) => dt.id == +rule.value)?.name
}`
return $localize`Document type: ${this.documentTypes.find(
(dt) => dt.id == +rule.value
)?.name}`
} else {
return $localize`Without document type`
}
@ -135,17 +135,16 @@ export class FilterEditorComponent implements OnInit, OnDestroy {
case FILTER_STORAGE_PATH:
case FILTER_HAS_STORAGE_PATH_ANY:
if (rule.value) {
return $localize`Storage path: ${
this.storagePaths.find((sp) => sp.id == +rule.value)?.name
}`
return $localize`Storage path: ${this.storagePaths.find(
(sp) => sp.id == +rule.value
)?.name}`
} else {
return $localize`Without storage path`
}
case FILTER_HAS_TAGS_ALL:
return $localize`Tag: ${
this.tags.find((t) => t.id == +rule.value)?.name
}`
return $localize`Tag: ${this.tags.find((t) => t.id == +rule.value)
?.name}`
case FILTER_HAS_ANY_TAG:
if (rule.value == 'false') {

View File

@ -1,4 +1,4 @@
::ng-deep app-document-list .popover {
::ng-deep .popover.popover-preview {
max-width: 40rem;
.preview {
@ -16,7 +16,7 @@
}
}
::ng-deep .popover-hidden .popover {
::ng-deep .popover-hidden .popover {
opacity: 0;
pointer-events: none;
}

View File

@ -8,7 +8,7 @@
</div>
<div class="form-group mt-2 d-flex justify-content-end align-items-center">
<div *ngIf="networkActive" class="spinner-border spinner-border-sm fw-normal me-auto" role="status"></div>
<button type="button" class="btn btn-primary btn-sm" [disabled]="networkActive" (click)="addNote()" i18n>Add note</button>
<button type="button" class="btn btn-primary btn-sm" [disabled]="networkActive || addDisabled" (click)="addNote()" i18n>Add note</button>
</div>
</form>
<hr>

View File

@ -1,6 +1,5 @@
.card-body {
max-height: 12rem;
overflow: scroll;
white-space: pre-wrap;
}

View File

@ -26,6 +26,9 @@ export class DocumentNotesComponent extends ComponentWithPermissions {
@Input()
notes: PaperlessDocumentNote[] = []
@Input()
addDisabled: boolean = false
@Output()
updated: EventEmitter<PaperlessDocumentNote[]> = new EventEmitter()
users: PaperlessUser[]
@ -61,7 +64,9 @@ export class DocumentNotesComponent extends ComponentWithPermissions {
error: (e) => {
this.networkActive = false
this.toastService.showError(
$localize`Error saving note: ${e.toString()}`
$localize`Error saving note`,
10000,
JSON.stringify(e)
)
},
})

View File

@ -266,8 +266,8 @@
<div class="col d-flex align-items-center">{{account.imap_server}}</div>
<div class="col">
<div class="btn-group">
<button *appIfPermissions="{ action: PermissionAction.Change, type: PermissionType.MailAccount }" class="btn btn-sm btn-primary" type="button" (click)="editMailAccount(account)" i18n>Edit</button>
<button *appIfPermissions="{ action: PermissionAction.Change, type: PermissionType.MailAccount }" class="btn btn-sm btn-outline-danger" type="button" (click)="deleteMailAccount(account)" i18n>Delete</button>
<button *appIfPermissions="{ action: PermissionAction.Change, type: PermissionType.MailAccount }" [disabled]="!userCanEdit(account)" class="btn btn-sm btn-primary" type="button" (click)="editMailAccount(account)" i18n>Edit</button>
<button *appIfPermissions="{ action: PermissionAction.Delete, type: PermissionType.MailAccount }" [disabled]="!userIsOwner(account)" class="btn btn-sm btn-outline-danger" type="button" (click)="deleteMailAccount(account)" i18n>Delete</button>
</div>
</div>
</div>
@ -303,8 +303,8 @@
<div class="col d-flex align-items-center">{{(mailAccountService.getCached(rule.account) | async)?.name}}</div>
<div class="col">
<div class="btn-group">
<button *appIfPermissions="{ action: PermissionAction.Change, type: PermissionType.MailRule }" class="btn btn-sm btn-primary" type="button" (click)="editMailRule(rule)" i18n>Edit</button>
<button *appIfPermissions="{ action: PermissionAction.Change, type: PermissionType.MailRule }" class="btn btn-sm btn-outline-danger" type="button" (click)="deleteMailRule(rule)" i18n>Delete</button>
<button *appIfPermissions="{ action: PermissionAction.Change, type: PermissionType.MailRule }" [disabled]="!userCanEdit(rule)" class="btn btn-sm btn-primary" type="button" (click)="editMailRule(rule)" i18n>Edit</button>
<button *appIfPermissions="{ action: PermissionAction.Delete, type: PermissionType.MailRule }" [disabled]="!userIsOwner(rule)" class="btn btn-sm btn-outline-danger" type="button" (click)="deleteMailRule(rule)" i18n>Delete</button>
</div>
</div>
</div>

View File

@ -48,8 +48,8 @@ const savedViews = [
{ id: 2, name: 'view2' },
]
const users = [
{ id: 1, username: 'user1' },
{ id: 2, username: 'user2' },
{ id: 1, username: 'user1', is_superuser: false },
{ id: 2, username: 'user2', is_superuser: false },
]
const groups = [
{ id: 1, name: 'group1' },
@ -60,8 +60,8 @@ const mailAccounts = [
{ id: 2, name: 'account2' },
]
const mailRules = [
{ id: 1, name: 'rule1' },
{ id: 2, name: 'rule2' },
{ id: 1, name: 'rule1', owner: 1 },
{ id: 2, name: 'rule2', owner: 2 },
]
describe('SettingsComponent', () => {
@ -75,6 +75,7 @@ describe('SettingsComponent', () => {
let viewportScroller: ViewportScroller
let toastService: ToastService
let userService: UserService
let permissionsService: PermissionsService
let groupService: GroupService
let mailAccountService: MailAccountService
let mailRuleService: MailRuleService
@ -90,17 +91,7 @@ describe('SettingsComponent', () => {
CheckComponent,
ColorComponent,
],
providers: [
{
provide: PermissionsService,
useValue: {
currentUserCan: () => true,
},
},
CustomDatePipe,
DatePipe,
PermissionsGuard,
],
providers: [CustomDatePipe, DatePipe, PermissionsGuard],
imports: [
NgbModule,
HttpClientTestingModule,
@ -117,6 +108,14 @@ describe('SettingsComponent', () => {
toastService = TestBed.inject(ToastService)
settingsService = TestBed.inject(SettingsService)
userService = TestBed.inject(UserService)
permissionsService = TestBed.inject(PermissionsService)
jest.spyOn(permissionsService, 'currentUserCan').mockReturnValue(true)
jest
.spyOn(permissionsService, 'currentUserHasObjectPermissions')
.mockReturnValue(true)
jest
.spyOn(permissionsService, 'currentUserOwnsObject')
.mockReturnValue(true)
jest.spyOn(userService, 'listAll').mockReturnValue(
of({
all: users.map((u) => u.id),

View File

@ -45,6 +45,11 @@ import { MailRuleService } from 'src/app/services/rest/mail-rule.service'
import { MailAccountEditDialogComponent } from '../../common/edit-dialog/mail-account-edit-dialog/mail-account-edit-dialog.component'
import { MailRuleEditDialogComponent } from '../../common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component'
import { EditDialogMode } from '../../common/edit-dialog/edit-dialog.component'
import { ObjectWithPermissions } from 'src/app/data/object-with-permissions'
import {
PermissionAction,
PermissionsService,
} from 'src/app/services/permissions.service'
enum SettingsNavIDs {
General = 1,
@ -140,7 +145,8 @@ export class SettingsComponent
private usersService: UserService,
private groupsService: GroupService,
private router: Router,
private modalService: NgbModal
private modalService: NgbModal,
private permissionsService: PermissionsService
) {
super()
this.settings.settingsSaved.subscribe(() => {
@ -642,6 +648,17 @@ export class SettingsComponent
this.settingsForm.get('themeColor').patchValue('')
}
userCanEdit(obj: ObjectWithPermissions): boolean {
return this.permissionsService.currentUserHasObjectPermissions(
PermissionAction.Change,
obj
)
}
userIsOwner(obj: ObjectWithPermissions): boolean {
return this.permissionsService.currentUserOwnsObject(obj)
}
editUser(user: PaperlessUser) {
var modal = this.modalService.open(UserEditDialogComponent, {
backdrop: 'static',

View File

@ -1,4 +1,4 @@
import { ObjectWithId } from './object-with-id'
import { ObjectWithPermissions } from './object-with-permissions'
export enum IMAPSecurity {
None = 1,
@ -6,7 +6,7 @@ export enum IMAPSecurity {
STARTTLS = 3,
}
export interface PaperlessMailAccount extends ObjectWithId {
export interface PaperlessMailAccount extends ObjectWithPermissions {
name: string
imap_server: string

View File

@ -1,4 +1,4 @@
import { ObjectWithId } from './object-with-id'
import { ObjectWithPermissions } from './object-with-permissions'
export enum MailFilterAttachmentType {
Attachments = 1,
@ -31,7 +31,7 @@ export enum MailMetadataCorrespondentOption {
FromCustom = 4,
}
export interface PaperlessMailRule extends ObjectWithId {
export interface PaperlessMailRule extends ObjectWithPermissions {
name: string
account: number // PaperlessMailAccount.id

View File

@ -11,7 +11,10 @@ import { Meta } from '@angular/platform-browser'
@Injectable()
export class CsrfInterceptor implements HttpInterceptor {
constructor(private cookieService: CookieService, private meta: Meta) {}
constructor(
private cookieService: CookieService,
private meta: Meta
) {}
intercept(
request: HttpRequest<unknown>,

View File

@ -60,10 +60,10 @@ describe('ConsumerStatusService', () => {
current_progress: 50,
max_progress: 100,
document_id: 12,
status: 'STARTING',
status: 'WORKING',
})
expect(status.getProgress()).toBeCloseTo(0.6) // 0.8 * 50/100
expect(status.getProgress()).toBeCloseTo(0.6) // (0.8 * 50/100) + .2
expect(consumerStatusService.getConsumerStatusNotCompleted()).toEqual([
status,
])
@ -194,6 +194,7 @@ describe('ConsumerStatusService', () => {
expect(consumerStatusService.getConsumerStatusCompleted()).toHaveLength(1)
consumerStatusService.dismissCompleted()
expect(consumerStatusService.getConsumerStatusCompleted()).toHaveLength(0)
consumerStatusService.disconnect()
})
it('should support dismiss', () => {
@ -238,17 +239,40 @@ describe('ConsumerStatusService', () => {
})
it('should notify of document created on status message without upload', () => {
let detected = false
consumerStatusService.onDocumentDetected().subscribe((filestatus) => {
expect(filestatus.phase).toEqual(FileStatusPhase.STARTED)
detected = true
})
consumerStatusService.connect()
server.send({
task_id: '1234',
filename: 'file.pdf',
current_progress: 0,
max_progress: 100,
message: 'new_file',
status: 'STARTED',
})
consumerStatusService.disconnect()
expect(detected).toBeTruthy()
})
it('should notify of document in progress without upload', () => {
consumerStatusService.connect()
server.send({
task_id: '1234',
filename: 'file.pdf',
current_progress: 50,
max_progress: 100,
document_id: 12,
status: 'STARTING',
document_id: 12,
status: 'WORKING',
})
consumerStatusService.disconnect()
expect(consumerStatusService.getConsumerStatusNotCompleted()).toHaveLength(
1
)
})
})

View File

@ -3,10 +3,11 @@ import { Subject } from 'rxjs'
import { environment } from 'src/environments/environment'
import { WebsocketConsumerStatusMessage } from '../data/websocket-consumer-status-message'
// see ConsumerFilePhase in src/documents/consumer.py
export enum FileStatusPhase {
STARTED = 0,
UPLOADING = 1,
PROCESSING = 2,
WORKING = 2,
SUCCESS = 3,
FAILED = 4,
}
@ -49,7 +50,7 @@ export class FileStatus {
return 0.0
case FileStatusPhase.UPLOADING:
return (this.currentPhaseProgress / this.currentPhaseMaxProgress) * 0.2
case FileStatusPhase.PROCESSING:
case FileStatusPhase.WORKING:
return (
(this.currentPhaseProgress / this.currentPhaseMaxProgress) * 0.8 + 0.2
)
@ -150,7 +151,7 @@ export class ConsumerStatusService {
let created = statusMessageGet.created
status.updateProgress(
FileStatusPhase.PROCESSING,
FileStatusPhase.WORKING,
statusMessage.current_progress,
statusMessage.max_progress
)
@ -164,16 +165,25 @@ export class ConsumerStatusService {
}
status.documentId = statusMessage.document_id
if (created && statusMessage.status == 'STARTING') {
this.documentDetectedSubject.next(status)
if (statusMessage.status in FileStatusPhase) {
status.phase = FileStatusPhase[statusMessage.status]
}
if (statusMessage.status == 'SUCCESS') {
status.phase = FileStatusPhase.SUCCESS
this.documentConsumptionFinishedSubject.next(status)
}
if (statusMessage.status == 'FAILED') {
status.phase = FileStatusPhase.FAILED
this.documentConsumptionFailedSubject.next(status)
switch (status.phase) {
case FileStatusPhase.STARTED:
if (created) this.documentDetectedSubject.next(status)
break
case FileStatusPhase.SUCCESS:
this.documentConsumptionFinishedSubject.next(status)
break
case FileStatusPhase.FAILED:
this.documentConsumptionFailedSubject.next(status)
break
default:
break
}
}
}
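
The phase handling above also explains the updated expectation in the spec earlier in this file: upload progress is scaled into the first 20% of the bar and the server-side WORKING phase into the remaining 80%, so a WORKING task at 50/100 reports (0.8 * 50 / 100) + 0.2 = 0.6. A stripped-down sketch of that weighting; the enum values mirror the diff, while treating SUCCESS and FAILED as a full bar is an assumption made here for completeness:

enum FileStatusPhase {
  STARTED = 0,
  UPLOADING = 1,
  WORKING = 2,
  SUCCESS = 3,
  FAILED = 4,
}

// Combined progress in [0, 1]: uploading maps to 0.0-0.2, working maps to 0.2-1.0.
function overallProgress(
  phase: FileStatusPhase,
  current: number,
  max: number
): number {
  const ratio = max > 0 ? current / max : 0
  switch (phase) {
    case FileStatusPhase.STARTED:
      return 0.0
    case FileStatusPhase.UPLOADING:
      return ratio * 0.2
    case FileStatusPhase.WORKING:
      return ratio * 0.8 + 0.2
    default:
      // SUCCESS / FAILED: assumed to report a full bar
      return 1.0
  }
}

// e.g. overallProgress(FileStatusPhase.WORKING, 50, 100) === 0.6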

View File

@ -276,9 +276,9 @@ export class DocumentListViewService {
errorMessage = Object.keys(error.error)
.map((fieldName) => {
const fieldError: Array<string> = error.error[fieldName]
return `${
DOCUMENT_SORT_FIELDS.find((f) => f.field == fieldName)?.name
}: ${fieldError[0]}`
return `${DOCUMENT_SORT_FIELDS.find(
(f) => f.field == fieldName
)?.name}: ${fieldError[0]}`
})
.join(', ')
} else {

View File

@ -2,7 +2,7 @@ import { ObjectWithId } from 'src/app/data/object-with-id'
import { AbstractPaperlessService } from './abstract-paperless-service'
export abstract class AbstractNameFilterService<
T extends ObjectWithId
T extends ObjectWithId,
> extends AbstractPaperlessService<T> {
listFiltered(
page?: number,

View File

@ -8,7 +8,10 @@ import { environment } from 'src/environments/environment'
export abstract class AbstractPaperlessService<T extends ObjectWithId> {
protected baseUrl: string = environment.apiBaseUrl
constructor(protected http: HttpClient, private resourceName: string) {}
constructor(
protected http: HttpClient,
private resourceName: string
) {}
protected getResourceUrl(id: number = null, action: string = null): string {
let url = `${this.baseUrl}${this.resourceName}/`

View File

@ -9,7 +9,10 @@ import { AbstractNameFilterService } from './abstract-name-filter-service'
providedIn: 'root',
})
export class GroupService extends AbstractNameFilterService<PaperlessGroup> {
constructor(http: HttpClient, private permissionService: PermissionsService) {
constructor(
http: HttpClient,
private permissionService: PermissionsService
) {
super(http, 'groups')
}

View File

@ -9,7 +9,10 @@ import { AbstractNameFilterService } from './abstract-name-filter-service'
providedIn: 'root',
})
export class UserService extends AbstractNameFilterService<PaperlessUser> {
constructor(http: HttpClient, private permissionService: PermissionsService) {
constructor(
http: HttpClient,
private permissionService: PermissionsService
) {
super(http, 'users')
}

View File

@ -433,6 +433,11 @@ ul.pagination {
height: 1em;
}
.buttonicon-xs {
width: 0.8em;
height: 0.8em;
}
.sidebaricon {
width: 16px;
height: 16px;
@ -560,6 +565,17 @@ body.tour-active .sidebar {
z-index: inherit !important;
}
.tour-step {
.popover-header {
--bs-popover-header-padding-y: .75rem;
}
.popover-body {
// reset ngx-ui-tour overrides
padding: var(--bs-popover-body-padding-y) var(--bs-popover-body-padding-x) !important;
}
}
.nav-item.touranchor--is-active a {
font-weight: bold !important;
color: var(--bs-primary);

View File

@ -71,6 +71,7 @@ $form-check-radio-checked-bg-image-dark: url("data:image/svg+xml,<svg xmlns='htt
--pngx-focus-alpha: 0.6;
--pngx-primary-faded: var(--pngx-primary-darken-15);
--pngx-primary-text-contrast: var(--bs-body-color);
--bs-dark-border-subtle: var(--pngx-bg-darker);
.text-dark, .text-light {
color: var(--bs-body-color) !important;

View File

@ -1,15 +1,12 @@
import logging
import shutil
import tempfile
from dataclasses import dataclass
from pathlib import Path
from subprocess import run
from typing import Dict
from typing import Final
from typing import List
from typing import Optional
import img2pdf
from django.conf import settings
from pdf2image import convert_from_path
from pdf2image.exceptions import PDFPageCountError
@ -17,7 +14,10 @@ from pikepdf import Page
from pikepdf import Pdf
from PIL import Image
from documents.converters import convert_from_tiff_to_pdf
from documents.data_models import DocumentSource
from documents.utils import copy_basic_file_stats
from documents.utils import copy_file_with_basic_stats
logger = logging.getLogger("paperless.barcodes")
@ -54,7 +54,7 @@ class BarcodeReader:
self.mime: Final[str] = mime_type
self.pdf_file: Path = self.file
self.barcodes: List[Barcode] = []
self.temp_dir: Optional[Path] = None
self.temp_dir: Optional[tempfile.TemporaryDirectory] = None
if settings.CONSUMER_BARCODE_TIFF_SUPPORT:
self.SUPPORTED_FILE_MIMES = {"application/pdf", "image/tiff"}
@ -154,34 +154,7 @@ class BarcodeReader:
if self.mime != "image/tiff":
return
with Image.open(self.file) as im:
has_alpha_layer = im.mode in ("RGBA", "LA")
if has_alpha_layer:
# Note the save into the temp folder, so as not to trigger a new
# consume
scratch_image = Path(self.temp_dir.name) / Path(self.file.name)
run(
[
settings.CONVERT_BINARY,
"-alpha",
"off",
self.file,
scratch_image,
],
)
else:
# Not modifying the original, safe to use in place
scratch_image = self.file
self.pdf_file = Path(self.temp_dir.name) / Path(self.file.name).with_suffix(
".pdf",
)
with scratch_image.open("rb") as img_file, self.pdf_file.open("wb") as pdf_file:
pdf_file.write(img2pdf.convert(img_file))
# Copy what file stat is possible
shutil.copystat(self.file, self.pdf_file)
self.pdf_file = convert_from_tiff_to_pdf(self.file, Path(self.temp_dir.name))
def detect(self) -> None:
"""
@ -306,7 +279,7 @@ class BarcodeReader:
with open(savepath, "wb") as out:
dst.save(out)
shutil.copystat(self.file, savepath)
copy_basic_file_stats(self.file, savepath)
document_paths.append(savepath)
@ -363,5 +336,5 @@ class BarcodeReader:
else:
dest = save_to_dir
logger.info(f"Saving {document_path} to {dest}")
shutil.copy2(document_path, dest)
copy_file_with_basic_stats(document_path, dest)
return True

View File

@ -5,6 +5,7 @@ import re
import warnings
from datetime import datetime
from hashlib import sha256
from pathlib import Path
from typing import Iterator
from typing import List
from typing import Optional
@ -81,7 +82,7 @@ class DocumentClassifier:
self._stemmer = None
self._stop_words = None
def load(self):
def load(self) -> None:
# Catch warnings for processing
with warnings.catch_warnings(record=True) as w:
with open(settings.MODEL_FILE, "rb") as f:
@ -120,19 +121,20 @@ class DocumentClassifier:
raise IncompatibleClassifierVersionError
def save(self):
target_file = settings.MODEL_FILE
target_file_temp = settings.MODEL_FILE.with_suffix(".pickle.part")
target_file: Path = settings.MODEL_FILE
target_file_temp = target_file.with_suffix(".pickle.part")
with open(target_file_temp, "wb") as f:
pickle.dump(self.FORMAT_VERSION, f)
pickle.dump(self.last_doc_change_time, f)
pickle.dump(self.last_auto_type_hash, f)
pickle.dump(self.data_vectorizer, f)
pickle.dump(self.tags_binarizer, f)
pickle.dump(self.tags_classifier, f)
pickle.dump(self.correspondent_classifier, f)
pickle.dump(self.document_type_classifier, f)
pickle.dump(self.storage_path_classifier, f)
@ -247,7 +249,7 @@ class DocumentClassifier:
data_vectorized = self.data_vectorizer.fit_transform(content_generator())
# See the notes here:
# https://scikit-learn.org/stable/modules/generated/sklearn.feature_extraction.text.CountVectorizer.html # noqa: 501
# https://scikit-learn.org/stable/modules/generated/sklearn.feature_extraction.text.CountVectorizer.html # noqa: E501
# This attribute isn't needed to function and can be large
self.data_vectorizer.stop_words_ = None
@ -380,7 +382,7 @@ class DocumentClassifier:
return content
def predict_correspondent(self, content: str):
def predict_correspondent(self, content: str) -> Optional[int]:
if self.correspondent_classifier:
X = self.data_vectorizer.transform([self.preprocess_content(content)])
correspondent_id = self.correspondent_classifier.predict(X)
@ -391,7 +393,7 @@ class DocumentClassifier:
else:
return None
def predict_document_type(self, content: str):
def predict_document_type(self, content: str) -> Optional[int]:
if self.document_type_classifier:
X = self.data_vectorizer.transform([self.preprocess_content(content)])
document_type_id = self.document_type_classifier.predict(X)
@ -402,7 +404,7 @@ class DocumentClassifier:
else:
return None
def predict_tags(self, content: str):
def predict_tags(self, content: str) -> List[int]:
from sklearn.utils.multiclass import type_of_target
if self.tags_classifier:
@ -423,7 +425,7 @@ class DocumentClassifier:
else:
return []
def predict_storage_path(self, content: str):
def predict_storage_path(self, content: str) -> Optional[int]:
if self.storage_path_classifier:
X = self.data_vectorizer.transform([self.preprocess_content(content)])
storage_path_id = self.storage_path_classifier.predict(X)

View File

@ -1,9 +1,9 @@
import datetime
import hashlib
import os
import shutil
import tempfile
import uuid
from enum import Enum
from pathlib import Path
from subprocess import CompletedProcess
from subprocess import run
@ -21,6 +21,9 @@ from django.utils import timezone
from filelock import FileLock
from rest_framework.reverse import reverse
from documents.utils import copy_basic_file_stats
from documents.utils import copy_file_with_basic_stats
from .classifier import load_classifier
from .file_handling import create_source_path_directory
from .file_handling import generate_unique_filename
@ -42,21 +45,30 @@ class ConsumerError(Exception):
pass
MESSAGE_DOCUMENT_ALREADY_EXISTS = "document_already_exists"
MESSAGE_ASN_ALREADY_EXISTS = "asn_already_exists"
MESSAGE_ASN_RANGE = "asn_value_out_of_range"
MESSAGE_FILE_NOT_FOUND = "file_not_found"
MESSAGE_PRE_CONSUME_SCRIPT_NOT_FOUND = "pre_consume_script_not_found"
MESSAGE_PRE_CONSUME_SCRIPT_ERROR = "pre_consume_script_error"
MESSAGE_POST_CONSUME_SCRIPT_NOT_FOUND = "post_consume_script_not_found"
MESSAGE_POST_CONSUME_SCRIPT_ERROR = "post_consume_script_error"
MESSAGE_NEW_FILE = "new_file"
MESSAGE_UNSUPPORTED_TYPE = "unsupported_type"
MESSAGE_PARSING_DOCUMENT = "parsing_document"
MESSAGE_GENERATING_THUMBNAIL = "generating_thumbnail"
MESSAGE_PARSE_DATE = "parse_date"
MESSAGE_SAVE_DOCUMENT = "save_document"
MESSAGE_FINISHED = "finished"
class ConsumerStatusShortMessage(str, Enum):
DOCUMENT_ALREADY_EXISTS = "document_already_exists"
ASN_ALREADY_EXISTS = "asn_already_exists"
ASN_RANGE = "asn_value_out_of_range"
FILE_NOT_FOUND = "file_not_found"
PRE_CONSUME_SCRIPT_NOT_FOUND = "pre_consume_script_not_found"
PRE_CONSUME_SCRIPT_ERROR = "pre_consume_script_error"
POST_CONSUME_SCRIPT_NOT_FOUND = "post_consume_script_not_found"
POST_CONSUME_SCRIPT_ERROR = "post_consume_script_error"
NEW_FILE = "new_file"
UNSUPPORTED_TYPE = "unsupported_type"
PARSING_DOCUMENT = "parsing_document"
GENERATING_THUMBNAIL = "generating_thumbnail"
PARSE_DATE = "parse_date"
SAVE_DOCUMENT = "save_document"
FINISHED = "finished"
FAILED = "failed"
class ConsumerFilePhase(str, Enum):
STARTED = "STARTED"
WORKING = "WORKING"
SUCCESS = "SUCCESS"
FAILED = "FAILED"
class Consumer(LoggingMixin):
@ -64,10 +76,10 @@ class Consumer(LoggingMixin):
def _send_progress(
self,
current_progress,
max_progress,
status,
message=None,
current_progress: int,
max_progress: int,
status: ConsumerFilePhase,
message: Optional[ConsumerStatusShortMessage] = None,
document_id=None,
): # pragma: no cover
payload = {
@ -86,12 +98,12 @@ class Consumer(LoggingMixin):
def _fail(
self,
message,
log_message=None,
message: ConsumerStatusShortMessage,
log_message: Optional[str] = None,
exc_info=None,
exception: Optional[Exception] = None,
):
self._send_progress(100, 100, "FAILED", message)
self._send_progress(100, 100, ConsumerFilePhase.FAILED, message)
self.log.error(log_message or message, exc_info=exc_info)
raise ConsumerError(f"{self.filename}: {log_message or message}") from exception
@ -111,13 +123,19 @@ class Consumer(LoggingMixin):
self.channel_layer = get_channel_layer()
def pre_check_file_exists(self):
"""
Confirm the input file still exists where it should
"""
if not os.path.isfile(self.path):
self._fail(
MESSAGE_FILE_NOT_FOUND,
ConsumerStatusShortMessage.FILE_NOT_FOUND,
f"Cannot consume {self.path}: File not found.",
)
def pre_check_duplicate(self):
"""
Using the MD5 of the file, check this exact file doesn't already exist
"""
with open(self.path, "rb") as f:
checksum = hashlib.md5(f.read()).hexdigest()
existing_doc = Document.objects.filter(
@ -127,12 +145,15 @@ class Consumer(LoggingMixin):
if settings.CONSUMER_DELETE_DUPLICATES:
os.unlink(self.path)
self._fail(
MESSAGE_DOCUMENT_ALREADY_EXISTS,
ConsumerStatusShortMessage.DOCUMENT_ALREADY_EXISTS,
f"Not consuming {self.filename}: It is a duplicate of"
f" {existing_doc.get().title} (#{existing_doc.get().pk})",
)
def pre_check_directories(self):
"""
Ensure all required directories exist before attempting to use them
"""
os.makedirs(settings.SCRATCH_DIR, exist_ok=True)
os.makedirs(settings.THUMBNAIL_DIR, exist_ok=True)
os.makedirs(settings.ORIGINALS_DIR, exist_ok=True)
@ -152,7 +173,7 @@ class Consumer(LoggingMixin):
or self.override_asn > Document.ARCHIVE_SERIAL_NUMBER_MAX
):
self._fail(
MESSAGE_ASN_RANGE,
ConsumerStatusShortMessage.ASN_RANGE,
f"Not consuming {self.filename}: "
f"Given ASN {self.override_asn} is out of range "
f"[{Document.ARCHIVE_SERIAL_NUMBER_MIN:,}, "
@ -160,17 +181,21 @@ class Consumer(LoggingMixin):
)
if Document.objects.filter(archive_serial_number=self.override_asn).exists():
self._fail(
MESSAGE_ASN_ALREADY_EXISTS,
ConsumerStatusShortMessage.ASN_ALREADY_EXISTS,
f"Not consuming {self.filename}: Given ASN already exists!",
)
def run_pre_consume_script(self):
"""
If one is configured and exists, run the pre-consume script and
handle its output and/or errors
"""
if not settings.PRE_CONSUME_SCRIPT:
return
if not os.path.isfile(settings.PRE_CONSUME_SCRIPT):
self._fail(
MESSAGE_PRE_CONSUME_SCRIPT_NOT_FOUND,
ConsumerStatusShortMessage.PRE_CONSUME_SCRIPT_NOT_FOUND,
f"Configured pre-consume script "
f"{settings.PRE_CONSUME_SCRIPT} does not exist.",
)
@ -201,19 +226,23 @@ class Consumer(LoggingMixin):
except Exception as e:
self._fail(
MESSAGE_PRE_CONSUME_SCRIPT_ERROR,
ConsumerStatusShortMessage.PRE_CONSUME_SCRIPT_ERROR,
f"Error while executing pre-consume script: {e}",
exc_info=True,
exception=e,
)
def run_post_consume_script(self, document: Document):
"""
If one is configured and exists, run the post-consume script and
handle its output and/or errors
"""
if not settings.POST_CONSUME_SCRIPT:
return
if not os.path.isfile(settings.POST_CONSUME_SCRIPT):
self._fail(
MESSAGE_POST_CONSUME_SCRIPT_NOT_FOUND,
ConsumerStatusShortMessage.POST_CONSUME_SCRIPT_NOT_FOUND,
f"Configured post-consume script "
f"{settings.POST_CONSUME_SCRIPT} does not exist.",
)
@ -274,7 +303,7 @@ class Consumer(LoggingMixin):
except Exception as e:
self._fail(
MESSAGE_POST_CONSUME_SCRIPT_ERROR,
ConsumerStatusShortMessage.POST_CONSUME_SCRIPT_ERROR,
f"Error while executing post-consume script: {e}",
exc_info=True,
exception=e,
@ -308,7 +337,12 @@ class Consumer(LoggingMixin):
self.override_asn = override_asn
self.override_owner_id = override_owner_id
self._send_progress(0, 100, "STARTING", MESSAGE_NEW_FILE)
self._send_progress(
0,
100,
ConsumerFilePhase.STARTED,
ConsumerStatusShortMessage.NEW_FILE,
)
# Make sure that preconditions for consuming the file are met.
@ -326,7 +360,7 @@ class Consumer(LoggingMixin):
dir=settings.SCRATCH_DIR,
)
self.path = Path(tempdir.name) / Path(self.filename)
shutil.copy2(self.original_path, self.path)
copy_file_with_basic_stats(self.original_path, self.path)
# Determine the parser class.
@ -340,7 +374,10 @@ class Consumer(LoggingMixin):
)
if not parser_class:
tempdir.cleanup()
self._fail(MESSAGE_UNSUPPORTED_TYPE, f"Unsupported mime type {mime_type}")
self._fail(
ConsumerStatusShortMessage.UNSUPPORTED_TYPE,
f"Unsupported mime type {mime_type}",
)
# Notify all listeners that we're going to do some work.
@ -355,7 +392,7 @@ class Consumer(LoggingMixin):
def progress_callback(current_progress, max_progress): # pragma: no cover
# recalculate progress to fall between 20 and 70
p = int((current_progress / max_progress) * 50 + 20)
self._send_progress(p, 100, "WORKING")
self._send_progress(p, 100, ConsumerFilePhase.WORKING)
# This doesn't parse the document yet, but gives us a parser.
@ -377,12 +414,22 @@ class Consumer(LoggingMixin):
archive_path = None
try:
self._send_progress(20, 100, "WORKING", MESSAGE_PARSING_DOCUMENT)
self._send_progress(
20,
100,
ConsumerFilePhase.WORKING,
ConsumerStatusShortMessage.PARSING_DOCUMENT,
)
self.log.debug(f"Parsing {self.filename}...")
document_parser.parse(self.path, mime_type, self.filename)
self.log.debug(f"Generating thumbnail for {self.filename}...")
self._send_progress(70, 100, "WORKING", MESSAGE_GENERATING_THUMBNAIL)
self._send_progress(
70,
100,
ConsumerFilePhase.WORKING,
ConsumerStatusShortMessage.GENERATING_THUMBNAIL,
)
thumbnail = document_parser.get_thumbnail(
self.path,
mime_type,
@ -392,7 +439,12 @@ class Consumer(LoggingMixin):
text = document_parser.get_text()
date = document_parser.get_date()
if date is None:
self._send_progress(90, 100, "WORKING", MESSAGE_PARSE_DATE)
self._send_progress(
90,
100,
ConsumerFilePhase.WORKING,
ConsumerStatusShortMessage.PARSE_DATE,
)
date = parse_date(self.filename, text)
archive_path = document_parser.get_archive_path()
@ -414,7 +466,12 @@ class Consumer(LoggingMixin):
classifier = load_classifier()
self._send_progress(95, 100, "WORKING", MESSAGE_SAVE_DOCUMENT)
self._send_progress(
95,
100,
ConsumerFilePhase.WORKING,
ConsumerStatusShortMessage.SAVE_DOCUMENT,
)
# now that everything is done, we can start to store the document
# in the system. This will be a transaction and reasonably fast.
try:
@ -499,7 +556,13 @@ class Consumer(LoggingMixin):
self.log.info(f"Document {document} consumption finished")
self._send_progress(100, 100, "SUCCESS", MESSAGE_FINISHED, document.id)
self._send_progress(
100,
100,
ConsumerFilePhase.SUCCESS,
ConsumerStatusShortMessage.FINISHED,
document.id,
)
# Return the most up to date fields
document.refresh_from_db()
@ -585,7 +648,7 @@ class Consumer(LoggingMixin):
# Attempt to copy file's original stats, but it's ok if we can't
try:
shutil.copystat(source, target)
copy_basic_file_stats(source, target)
except Exception: # pragma: no cover
pass
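Because ConsumerStatusShortMessage and ConsumerFilePhase mix in str, their members compare equal to, and serialize as, the plain strings they replace, so the websocket payload format stays unchanged. A standalone illustration (not paperless code, reduced to two members):
import json
from enum import Enum

class ConsumerFilePhase(str, Enum):
    SUCCESS = "SUCCESS"
    FAILED = "FAILED"

assert ConsumerFilePhase.SUCCESS == "SUCCESS"             # old string comparisons keep working
print(json.dumps({"status": ConsumerFilePhase.SUCCESS}))  # {"status": "SUCCESS"}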

View File

@ -0,0 +1,46 @@
from pathlib import Path
from subprocess import run
import img2pdf
from django.conf import settings
from PIL import Image
from documents.utils import copy_basic_file_stats
def convert_from_tiff_to_pdf(tiff_path: Path, target_directory: Path) -> Path:
"""
Converts a TIFF file into a PDF file.
The PDF will be created in the given target_directory and share the name of
the original TIFF file, as well as its stats (mtime etc.).
Returns the path of the PDF created.
"""
with Image.open(tiff_path) as im:
has_alpha_layer = im.mode in ("RGBA", "LA")
if has_alpha_layer:
# Note the save into the temp folder, so as not to trigger a new
# consume
scratch_image = target_directory / tiff_path.name
run(
[
settings.CONVERT_BINARY,
"-alpha",
"off",
tiff_path,
scratch_image,
],
)
else:
# Not modifying the original, safe to use in place
scratch_image = tiff_path
pdf_path = (target_directory / tiff_path.name).with_suffix(".pdf")
with scratch_image.open("rb") as img_file, pdf_path.open("wb") as pdf_file:
pdf_file.write(img2pdf.convert(img_file))
# Copy whatever file stats are possible
copy_basic_file_stats(tiff_path, pdf_path)
return pdf_path
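A short usage sketch for the new converter (the input and scratch paths are hypothetical; a configured Django settings module is assumed so settings.CONVERT_BINARY resolves):
from pathlib import Path
from documents.converters import convert_from_tiff_to_pdf

# Hypothetical input and scratch locations
pdf_path = convert_from_tiff_to_pdf(
    Path("/usr/src/paperless/consume/scan.tiff"),
    Path("/tmp/paperless-scratch"),
)
print(pdf_path)  # /tmp/paperless-scratch/scan.pdf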

View File

@ -0,0 +1,131 @@
import datetime as dt
import logging
import os
import shutil
from pathlib import Path
from django.conf import settings
from pikepdf import Pdf
from documents.consumer import ConsumerError
from documents.converters import convert_from_tiff_to_pdf
from documents.data_models import ConsumableDocument
logger = logging.getLogger("paperless.double_sided")
# Hardcoded for now, could be made a configurable setting if needed
TIMEOUT_MINUTES = 30
# Used by test cases
STAGING_FILE_NAME = "double-sided-staging.pdf"
def collate(input_doc: ConsumableDocument) -> str:
"""
Tries to collate pages from 2 single sided scans of a double sided
document.
When called with a file, it checks whether a staging file already
exists; if not, the current file is turned into that staging file,
containing the odd numbered pages.
If a staging file exists and is not too old, the current file is
considered to be the second part (the even numbered pages) and the
pages of both are collated; the pages of the second file are added
in reverse order, since the ADF will have scanned them from bottom
to top.
Returns a status message on success, or raises a ConsumerError
in case of failure.
"""
# Make sure scratch dir exists, Consumer might not have run yet
settings.SCRATCH_DIR.mkdir(exist_ok=True)
if input_doc.mime_type == "application/pdf":
pdf_file = input_doc.original_file
elif (
input_doc.mime_type == "image/tiff"
and settings.CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT
):
pdf_file = convert_from_tiff_to_pdf(
input_doc.original_file,
settings.SCRATCH_DIR,
)
input_doc.original_file.unlink()
else:
raise ConsumerError("Unsupported file type for collation of double-sided scans")
staging = settings.SCRATCH_DIR / STAGING_FILE_NAME
valid_staging_exists = False
if staging.exists():
stats = os.stat(str(staging))
# if the file is older than the timeout, we don't consider
# it valid
if dt.datetime.now().timestamp() - stats.st_mtime > TIMEOUT_MINUTES * 60:
logger.warning("Outdated double sided staging file exists, deleting it")
os.unlink(str(staging))
else:
valid_staging_exists = True
if valid_staging_exists:
try:
# Collate pages from second PDF in reverse order
with Pdf.open(staging) as pdf1, Pdf.open(pdf_file) as pdf2:
pdf2.pages.reverse()
try:
for i, page in enumerate(pdf2.pages):
pdf1.pages.insert(2 * i + 1, page)
except IndexError:
raise ConsumerError(
"This second file (even numbered pages) contains more "
"pages than the first/odd numbered one. This means the "
"two uploaded files don't belong to the same double-"
"sided scan. Please retry, starting with the odd "
"numbered pages again.",
)
# Merged file has the same path, but without the
# double-sided subdir. Therefore, it is also in the
# consumption dir and will be picked up for processing
old_file = input_doc.original_file
new_file = Path(
*(
part
for part in old_file.with_name(
f"{old_file.stem}-collated.pdf",
).parts
if part != settings.CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME
),
)
# If the user didn't create the subdirs yet, do it for them
new_file.parent.mkdir(parents=True, exist_ok=True)
pdf1.save(new_file)
logger.info("Collated documents into new file %s", new_file)
return (
"Success. Even numbered pages of double sided scan collated "
"with odd pages"
)
finally:
# Delete staging and recently uploaded file no matter what.
# If any error occurs, the user needs to be able to restart
# the process from scratch; after all, the staging file
# with the odd numbered pages might be the culprit
pdf_file.unlink()
staging.unlink()
else:
# In Python 3.9 move supports Path objects directly,
# but for now we have to be compatible with 3.8
shutil.move(str(pdf_file), str(staging))
# update access and modification times so we know if the file
# is outdated when another file gets uploaded
os.utime(str(staging), (dt.datetime.now().timestamp(),) * 2)
logger.info(
"Got scan with odd numbered pages of double-sided scan, moved it to %s",
staging,
)
return (
"Received odd numbered pages of double sided scan, waiting up to "
f"{TIMEOUT_MINUTES} minutes for even numbered pages"
)
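The page interleaving above can be hard to picture; a standalone sketch with plain lists (not pikepdf) mirrors the same index math, where the second scan arrives bottom-to-top, is reversed, and is then inserted at the odd positions:
# First scan: odd pages in order; second scan: even pages from bottom to top.
odd_pages = ["p1", "p3", "p5"]
even_pages_scanned = ["p6", "p4", "p2"]

pages = list(odd_pages)
even_pages = list(reversed(even_pages_scanned))  # p2, p4, p6
for i, page in enumerate(even_pages):
    pages.insert(2 * i + 1, page)

assert pages == ["p1", "p2", "p3", "p4", "p5", "p6"]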

View File

@ -218,6 +218,7 @@ def generate_filename(
tag_list=tag_list,
owner_username=owner_username_str,
original_name=original_name,
doc_pk=f"{doc.pk:07}",
).strip()
if settings.FILENAME_FORMAT_REMOVE_NONE:

View File

@ -11,13 +11,17 @@ from typing import Set
import tqdm
from django.conf import settings
from django.contrib.auth.models import Group
from django.contrib.auth.models import Permission
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.core import serializers
from django.core.management.base import BaseCommand
from django.core.management.base import CommandError
from django.db import transaction
from django.utils import timezone
from filelock import FileLock
from guardian.models import GroupObjectPermission
from guardian.models import UserObjectPermission
from documents.file_handling import delete_empty_directories
from documents.file_handling import generate_filename
@ -33,6 +37,7 @@ from documents.models import UiSettings
from documents.settings import EXPORTER_ARCHIVE_NAME
from documents.settings import EXPORTER_FILE_NAME
from documents.settings import EXPORTER_THUMBNAIL_NAME
from documents.utils import copy_file_with_basic_stats
from paperless import version
from paperless.db import GnuPG
from paperless_mail.models import MailAccount
@ -261,6 +266,22 @@ class Command(BaseCommand):
serializers.serialize("json", UiSettings.objects.all()),
)
manifest += json.loads(
serializers.serialize("json", ContentType.objects.all()),
)
manifest += json.loads(
serializers.serialize("json", Permission.objects.all()),
)
manifest += json.loads(
serializers.serialize("json", UserObjectPermission.objects.all()),
)
manifest += json.loads(
serializers.serialize("json", GroupObjectPermission.objects.all()),
)
# 3. Export files from each document
for index, document_dict in tqdm.tqdm(
enumerate(document_manifest),
@ -417,4 +438,4 @@ class Command(BaseCommand):
if perform_copy:
target.parent.mkdir(parents=True, exist_ok=True)
shutil.copy2(source, target)
copy_file_with_basic_stats(source, target)

View File

@ -1,17 +1,20 @@
import json
import logging
import os
import shutil
from contextlib import contextmanager
from pathlib import Path
import tqdm
from django.conf import settings
from django.contrib.auth.models import Permission
from django.contrib.contenttypes.models import ContentType
from django.core.exceptions import FieldDoesNotExist
from django.core.management import call_command
from django.core.management.base import BaseCommand
from django.core.management.base import CommandError
from django.core.serializers.base import DeserializationError
from django.db import IntegrityError
from django.db import transaction
from django.db.models.signals import m2m_changed
from django.db.models.signals import post_save
from filelock import FileLock
@ -23,6 +26,7 @@ from documents.settings import EXPORTER_ARCHIVE_NAME
from documents.settings import EXPORTER_FILE_NAME
from documents.settings import EXPORTER_THUMBNAIL_NAME
from documents.signals.handlers import update_filename_and_move_files
from documents.utils import copy_file_with_basic_stats
from paperless import version
@ -116,9 +120,13 @@ class Command(BaseCommand):
):
# Fill up the database with whatever is in the manifest
try:
for manifest_path in manifest_paths:
call_command("loaddata", manifest_path)
except (FieldDoesNotExist, DeserializationError) as e:
with transaction.atomic():
for manifest_path in manifest_paths:
# delete these since their pk can change; they are re-created by the import
ContentType.objects.all().delete()
Permission.objects.all().delete()
call_command("loaddata", manifest_path)
except (FieldDoesNotExist, DeserializationError, IntegrityError) as e:
self.stdout.write(self.style.ERROR("Database import failed"))
if (
self.version is not None
@ -238,7 +246,7 @@ class Command(BaseCommand):
create_source_path_directory(document.source_path)
shutil.copy2(document_path, document.source_path)
copy_file_with_basic_stats(document_path, document.source_path)
if thumbnail_path:
if thumbnail_path.suffix in {".png", ".PNG"}:
@ -253,13 +261,16 @@ class Command(BaseCommand):
output_file=str(document.thumbnail_path),
)
else:
shutil.copy2(thumbnail_path, document.thumbnail_path)
copy_file_with_basic_stats(
thumbnail_path,
document.thumbnail_path,
)
if archive_path:
create_source_path_directory(document.archive_path)
# TODO: this assumes that the export is valid and
# archive_filename is present on all documents with
# archived files
shutil.copy2(archive_path, document.archive_path)
copy_file_with_basic_stats(archive_path, document.archive_path)
document.save()

View File

@ -1,7 +1,9 @@
import logging
import re
from documents.classifier import DocumentClassifier
from documents.models import Correspondent
from documents.models import Document
from documents.models import DocumentType
from documents.models import MatchingModel
from documents.models import StoragePath
@ -11,7 +13,7 @@ from documents.permissions import get_objects_for_user_owner_aware
logger = logging.getLogger("paperless.matching")
def log_reason(matching_model, document, reason):
def log_reason(matching_model: MatchingModel, document: Document, reason: str):
class_name = type(matching_model).__name__
logger.debug(
f"{class_name} {matching_model.name} matched on document "
@ -19,7 +21,7 @@ def log_reason(matching_model, document, reason):
)
def match_correspondents(document, classifier, user=None):
def match_correspondents(document: Document, classifier: DocumentClassifier, user=None):
pred_id = classifier.predict_correspondent(document.content) if classifier else None
if user is None and document.owner is not None:
@ -35,11 +37,15 @@ def match_correspondents(document, classifier, user=None):
correspondents = Correspondent.objects.all()
return list(
filter(lambda o: matches(o, document) or o.pk == pred_id, correspondents),
filter(
lambda o: matches(o, document)
or (o.pk == pred_id and o.matching_algorithm == MatchingModel.MATCH_AUTO),
correspondents,
),
)
def match_document_types(document, classifier, user=None):
def match_document_types(document: Document, classifier: DocumentClassifier, user=None):
pred_id = classifier.predict_document_type(document.content) if classifier else None
if user is None and document.owner is not None:
@ -55,11 +61,15 @@ def match_document_types(document, classifier, user=None):
document_types = DocumentType.objects.all()
return list(
filter(lambda o: matches(o, document) or o.pk == pred_id, document_types),
filter(
lambda o: matches(o, document)
or (o.pk == pred_id and o.matching_algorithm == MatchingModel.MATCH_AUTO),
document_types,
),
)
def match_tags(document, classifier, user=None):
def match_tags(document: Document, classifier: DocumentClassifier, user=None):
predicted_tag_ids = classifier.predict_tags(document.content) if classifier else []
if user is None and document.owner is not None:
@ -71,11 +81,18 @@ def match_tags(document, classifier, user=None):
tags = Tag.objects.all()
return list(
filter(lambda o: matches(o, document) or o.pk in predicted_tag_ids, tags),
filter(
lambda o: matches(o, document)
or (
o.matching_algorithm == MatchingModel.MATCH_AUTO
and o.pk in predicted_tag_ids
),
tags,
),
)
def match_storage_paths(document, classifier, user=None):
def match_storage_paths(document: Document, classifier: DocumentClassifier, user=None):
pred_id = classifier.predict_storage_path(document.content) if classifier else None
if user is None and document.owner is not None:
@ -92,13 +109,14 @@ def match_storage_paths(document, classifier, user=None):
return list(
filter(
lambda o: matches(o, document) or o.pk == pred_id,
lambda o: matches(o, document)
or (o.pk == pred_id and o.matching_algorithm == MatchingModel.MATCH_AUTO),
storage_paths,
),
)
def matches(matching_model, document):
def matches(matching_model: MatchingModel, document: Document):
search_kwargs = {}
document_content = document.content

View File

@ -0,0 +1,162 @@
# Generated by Django 4.1.9 on 2023-06-29 19:29
import logging
import multiprocessing.pool
import shutil
import tempfile
import time
from pathlib import Path
import gnupg
from django.conf import settings
from django.db import migrations
from documents.parsers import run_convert
logger = logging.getLogger("paperless.migrations")
def _do_convert(work_package):
(
existing_encrypted_thumbnail,
converted_encrypted_thumbnail,
passphrase,
) = work_package
try:
gpg = gnupg.GPG(gnupghome=settings.GNUPG_HOME)
logger.info(f"Decrypting thumbnail: {existing_encrypted_thumbnail}")
# Decrypt png
decrypted_thumbnail = existing_encrypted_thumbnail.with_suffix("").resolve()
with open(existing_encrypted_thumbnail, "rb") as existing_encrypted_file:
raw_thumb = gpg.decrypt_file(
existing_encrypted_file,
passphrase=passphrase,
always_trust=True,
).data
with open(decrypted_thumbnail, "wb") as decrypted_file:
decrypted_file.write(raw_thumb)
converted_decrypted_thumbnail = Path(
str(converted_encrypted_thumbnail).replace("webp.gpg", "webp"),
).resolve()
logger.info(f"Converting decrypted thumbnail: {decrypted_thumbnail}")
# Convert to webp
run_convert(
density=300,
scale="500x5000>",
alpha="remove",
strip=True,
trim=False,
auto_orient=True,
input_file=f"{decrypted_thumbnail}[0]",
output_file=str(converted_decrypted_thumbnail),
)
logger.info(
f"Encrypting converted thumbnail: {converted_decrypted_thumbnail}",
)
# Encrypt webp
with open(converted_decrypted_thumbnail, "rb") as converted_decrypted_file:
encrypted = gpg.encrypt_file(
fileobj_or_path=converted_decrypted_file,
recipients=None,
passphrase=passphrase,
symmetric=True,
always_trust=True,
).data
with open(converted_encrypted_thumbnail, "wb") as converted_encrypted_file:
converted_encrypted_file.write(encrypted)
# Copy newly created thumbnail to thumbnail directory
shutil.copy(converted_encrypted_thumbnail, existing_encrypted_thumbnail.parent)
# Remove the existing encrypted PNG version
existing_encrypted_thumbnail.unlink()
# Remove the decrypted PNG version
decrypted_thumbnail.unlink()
# Remove the decrypted WebP version
converted_decrypted_thumbnail.unlink()
logger.info(
"Conversion to WebP completed, "
f"replaced {existing_encrypted_thumbnail.name} with {converted_encrypted_thumbnail.name}",
)
except Exception as e:
logger.error(f"Error converting thumbnail (existing file unchanged): {e}")
def _convert_encrypted_thumbnails_to_webp(apps, schema_editor):
start = time.time()
with tempfile.TemporaryDirectory() as tempdir:
work_packages = []
if len(list(Path(settings.THUMBNAIL_DIR).glob("*.png.gpg"))) > 0:
passphrase = settings.PASSPHRASE
if not passphrase:
raise Exception(
"Passphrase not defined, encrypted thumbnails cannot be migrated"
"without this",
)
for file in Path(settings.THUMBNAIL_DIR).glob("*.png.gpg"):
existing_thumbnail = file.resolve()
# Change the existing filename suffix from png to webp
converted_thumbnail_name = Path(
str(existing_thumbnail).replace(".png.gpg", ".webp.gpg"),
).name
# Create the expected output filename in the tempdir
converted_thumbnail = (
Path(tempdir) / Path(converted_thumbnail_name)
).resolve()
# Package up the necessary info
work_packages.append(
(existing_thumbnail, converted_thumbnail, passphrase),
)
if len(work_packages):
logger.info(
"\n\n"
" This is a one-time only migration to convert thumbnails for all of your\n"
" *encrypted* documents into WebP format. If you have a lot of encrypted documents, \n"
" this may take a while, so a coffee break may be in order."
"\n",
)
with multiprocessing.pool.Pool(
processes=min(multiprocessing.cpu_count(), 4),
maxtasksperchild=4,
) as pool:
pool.map(_do_convert, work_packages)
end = time.time()
duration = end - start
logger.info(f"Conversion completed in {duration:.3f}s")
class Migration(migrations.Migration):
dependencies = [
("documents", "1036_alter_savedviewfilterrule_rule_type"),
]
operations = [
migrations.RunPython(
code=_convert_encrypted_thumbnails_to_webp,
reverse_code=migrations.RunPython.noop,
),
]

View File

@ -18,6 +18,7 @@ from django.utils import timezone
from documents.loggers import LoggingMixin
from documents.signals import document_consumer_declaration
from documents.utils import copy_file_with_basic_stats
# This regular expression will try to find dates in the document at
# hand and will match the following formats:
@ -31,16 +32,18 @@ from documents.signals import document_consumer_declaration
# - MONTH ZZZZ, with ZZZZ being 4 digits
# - MONTH XX, ZZZZ with XX being 1 or 2 and ZZZZ being 4 digits
# - XX MON ZZZZ with XX being 1 or 2 and ZZZZ being 4 digits. MONTH is 3 letters
# - XXPP MONTH ZZZZ with XX being 1 or 2 and PP being 2 letters and ZZZZ being 4 digits
# TODO: isn't there a date parsing library for this?
DATE_REGEX = re.compile(
r"(\b|(?!=([_-])))([0-9]{1,2})[\.\/-]([0-9]{1,2})[\.\/-]([0-9]{4}|[0-9]{2})(\b|(?=([_-])))|" # noqa: E501
r"(\b|(?!=([_-])))([0-9]{4}|[0-9]{2})[\.\/-]([0-9]{1,2})[\.\/-]([0-9]{1,2})(\b|(?=([_-])))|" # noqa: E501
r"(\b|(?!=([_-])))([0-9]{1,2}[\. ]+[^ ]{3,9} ([0-9]{4}|[0-9]{2}))(\b|(?=([_-])))|" # noqa: E501
r"(\b|(?!=([_-])))([0-9]{1,2}[\. ]+[a-zA-Z]{3,9} ([0-9]{4}|[0-9]{2}))(\b|(?=([_-])))|" # noqa: E501
r"(\b|(?!=([_-])))([^\W\d_]{3,9} [0-9]{1,2}, ([0-9]{4}))(\b|(?=([_-])))|"
r"(\b|(?!=([_-])))([^\W\d_]{3,9} [0-9]{4})(\b|(?=([_-])))|"
r"(\b|(?!=([_-])))(\b[0-9]{1,2}[ \.\/-][A-Z]{3}[ \.\/-][0-9]{4})(\b|(?=([_-])))", # noqa: E501
r"(\b|(?!=([_-])))([0-9]{1,2}[^ ]{2}[\. ]+[^ ]{3,9}[ \.\/-][0-9]{4})(\b|(?=([_-])))|" # noqa: E501
r"(\b|(?!=([_-])))(\b[0-9]{1,2}[ \.\/-][a-zA-Z]{3}[ \.\/-][0-9]{4})(\b|(?=([_-])))", # noqa: E501
)
@ -206,7 +209,7 @@ def make_thumbnail_from_pdf_gs_fallback(in_path, temp_dir, logging_group=None) -
# so we need to copy it before it gets moved.
# https://github.com/paperless-ngx/paperless-ngx/issues/3631
default_thumbnail_path = os.path.join(temp_dir, "document.png")
shutil.copy2(get_default_thumbnail(), default_thumbnail_path)
copy_file_with_basic_stats(get_default_thumbnail(), default_thumbnail_path)
return default_thumbnail_path

View File

@ -1,6 +1,7 @@
import logging
import os
import shutil
from typing import Optional
from celery import states
from celery.signals import before_task_publish
@ -21,6 +22,7 @@ from django.utils import timezone
from filelock import FileLock
from documents import matching
from documents.classifier import DocumentClassifier
from documents.file_handling import create_source_path_directory
from documents.file_handling import delete_empty_directories
from documents.file_handling import generate_unique_filename
@ -33,7 +35,7 @@ from documents.permissions import get_objects_for_user_owner_aware
logger = logging.getLogger("paperless.handlers")
def add_inbox_tags(sender, document=None, logging_group=None, **kwargs):
def add_inbox_tags(sender, document: Document, logging_group=None, **kwargs):
if document.owner is not None:
tags = get_objects_for_user_owner_aware(
document.owner,
@ -48,9 +50,9 @@ def add_inbox_tags(sender, document=None, logging_group=None, **kwargs):
def set_correspondent(
sender,
document=None,
document: Document,
logging_group=None,
classifier=None,
classifier: Optional[DocumentClassifier] = None,
replace=False,
use_first=True,
suggest=False,
@ -111,9 +113,9 @@ def set_correspondent(
def set_document_type(
sender,
document=None,
document: Document,
logging_group=None,
classifier=None,
classifier: Optional[DocumentClassifier] = None,
replace=False,
use_first=True,
suggest=False,
@ -175,9 +177,9 @@ def set_document_type(
def set_tags(
sender,
document=None,
document: Document,
logging_group=None,
classifier=None,
classifier: Optional[DocumentClassifier] = None,
replace=False,
suggest=False,
base_url=None,
@ -239,9 +241,9 @@ def set_tags(
def set_storage_path(
sender,
document=None,
document: Document,
logging_group=None,
classifier=None,
classifier: Optional[DocumentClassifier] = None,
replace=False,
use_first=True,
suggest=False,
@ -491,7 +493,7 @@ def update_filename_and_move_files(sender, instance: Document, **kwargs):
)
def set_log_entry(sender, document=None, logging_group=None, **kwargs):
def set_log_entry(sender, document: Document, logging_group=None, **kwargs):
ct = ContentType.objects.get(model="document")
user = User.objects.get(username="consumer")

View File

@ -25,6 +25,7 @@ from documents.consumer import Consumer
from documents.consumer import ConsumerError
from documents.data_models import ConsumableDocument
from documents.data_models import DocumentMetadataOverrides
from documents.double_sided import collate
from documents.file_handling import create_source_path_directory
from documents.file_handling import generate_unique_filename
from documents.models import Correspondent
@ -64,6 +65,12 @@ def train_classifier():
and not Correspondent.objects.filter(matching_algorithm=Tag.MATCH_AUTO).exists()
and not StoragePath.objects.filter(matching_algorithm=Tag.MATCH_AUTO).exists()
):
logger.info("No automatic matching items, not training")
# Special case, items were once auto and trained, so remove the model
# and prevent its use again
if settings.MODEL_FILE.exists():
logger.info(f"Removing {settings.MODEL_FILE} so it won't be used")
settings.MODEL_FILE.unlink()
return
classifier = load_classifier()
@ -89,10 +96,40 @@ def consume_file(
input_doc: ConsumableDocument,
overrides: Optional[DocumentMetadataOverrides] = None,
):
def send_progress(status="SUCCESS", message="finished"):
payload = {
"filename": overrides.filename or input_doc.original_file.name,
"task_id": None,
"current_progress": 100,
"max_progress": 100,
"status": status,
"message": message,
}
try:
async_to_sync(get_channel_layer().group_send)(
"status_updates",
{"type": "status_update", "data": payload},
)
except ConnectionError as e:
logger.warning(f"ConnectionError on status send: {e!s}")
# Default no overrides
if overrides is None:
overrides = DocumentMetadataOverrides()
# Handle collation of double-sided documents scanned in two parts
if settings.CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED and (
settings.CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME
in input_doc.original_file.parts
):
try:
msg = collate(input_doc)
send_progress(message=msg)
return msg
except ConsumerError as e:
send_progress(status="FAILURE", message=e.args[0])
raise e
# read all barcodes in the current document
if settings.CONSUMER_ENABLE_BARCODES or settings.CONSUMER_ENABLE_ASN_BARCODE:
with BarcodeReader(input_doc.original_file, input_doc.mime_type) as reader:
@ -102,32 +139,18 @@ def consume_file(
):
# notify the sender, otherwise the progress bar
# in the UI stays stuck
payload = {
"filename": overrides.filename or input_doc.original_file.name,
"task_id": None,
"current_progress": 100,
"max_progress": 100,
"status": "SUCCESS",
"message": "finished",
}
try:
async_to_sync(get_channel_layer().group_send)(
"status_updates",
{"type": "status_update", "data": payload},
)
except ConnectionError as e:
logger.warning(f"ConnectionError on status send: {e!s}")
send_progress()
# consuming stops here, since the original document with
# the barcodes has been split and will be consumed separately
input_doc.original_file.unlink()
return "File successfully split"
# try reading the ASN from barcode
if settings.CONSUMER_ENABLE_ASN_BARCODE:
if settings.CONSUMER_ENABLE_ASN_BARCODE and reader.asn is not None:
# Note this will take precedence over an API provided ASN
# But it's from a physical barcode, so that's good
overrides.asn = reader.asn
if overrides.asn:
logger.info(f"Found ASN in barcode: {overrides.asn}")
logger.info(f"Found ASN in barcode: {overrides.asn}")
# continue with consumption if no barcode was found
document = Consumer().try_consume_file(

Binary file not shown.

Binary file not shown.

View File

@ -2369,6 +2369,62 @@ class TestDocumentApi(DirectoriesMixin, DocumentConsumeDelayMixin, APITestCase):
self.assertEqual(resp_data["note"], "this is a posted note")
def test_notes_permissions_aware(self):
"""
GIVEN:
- Existing document owned by user2 but with granted view perms for user1
WHEN:
- An API request is made by user1 to add or delete a note
THEN:
- Notes are neither created nor deleted
"""
user1 = User.objects.create_user(username="test1")
user1.user_permissions.add(*Permission.objects.all())
user1.save()
user2 = User.objects.create_user(username="test2")
user2.save()
doc = Document.objects.create(
title="test",
mime_type="application/pdf",
content="this is a document which will have notes added",
)
doc.owner = user2
doc.save()
self.client.force_authenticate(user1)
resp = self.client.get(
f"/api/documents/{doc.pk}/notes/",
format="json",
)
self.assertEqual(resp.content, b"Insufficient permissions to view")
self.assertEqual(resp.status_code, status.HTTP_403_FORBIDDEN)
assign_perm("view_document", user1, doc)
resp = self.client.post(
f"/api/documents/{doc.pk}/notes/",
data={"note": "this is a posted note"},
)
self.assertEqual(resp.content, b"Insufficient permissions to create")
self.assertEqual(resp.status_code, status.HTTP_403_FORBIDDEN)
note = Note.objects.create(
note="This is a note.",
document=doc,
user=user2,
)
response = self.client.delete(
f"/api/documents/{doc.pk}/notes/?id={note.pk}",
format="json",
)
self.assertEqual(response.content, b"Insufficient permissions to delete")
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_delete_note(self):
"""
GIVEN:

View File

@ -21,6 +21,7 @@ from django.utils import timezone
from documents.consumer import Consumer
from documents.consumer import ConsumerError
from documents.consumer import ConsumerFilePhase
from documents.models import Correspondent
from documents.models import Document
from documents.models import DocumentType
@ -228,8 +229,8 @@ def fake_magic_from_file(file, mime=False):
class TestConsumer(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
def _assert_first_last_send_progress(
self,
first_status="STARTING",
last_status="SUCCESS",
first_status=ConsumerFilePhase.STARTED,
last_status=ConsumerFilePhase.SUCCESS,
first_progress=0,
first_progress_max=100,
last_progress=100,
@ -561,10 +562,16 @@ class TestConsumer(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
@mock.patch("documents.consumer.load_classifier")
def testClassifyDocument(self, m):
correspondent = Correspondent.objects.create(name="test")
dtype = DocumentType.objects.create(name="test")
t1 = Tag.objects.create(name="t1")
t2 = Tag.objects.create(name="t2")
correspondent = Correspondent.objects.create(
name="test",
matching_algorithm=Correspondent.MATCH_AUTO,
)
dtype = DocumentType.objects.create(
name="test",
matching_algorithm=DocumentType.MATCH_AUTO,
)
t1 = Tag.objects.create(name="t1", matching_algorithm=Tag.MATCH_AUTO)
t2 = Tag.objects.create(name="t2", matching_algorithm=Tag.MATCH_AUTO)
m.return_value = MagicMock()
m.return_value.predict_correspondent.return_value = correspondent.pk

View File

@ -152,6 +152,55 @@ class TestDate(TestCase):
text = "Customer Number Currency 22 MAR,2022 Credit Card 1934829304"
self.assertIsNone(parse_date("", text), None)
def test_date_format_19(self):
text = "Customer Number Currency 21st MAR 2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 21, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_date_format_20(self):
text = "Customer Number Currency 22nd March 2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 22, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_date_format_21(self):
text = "Customer Number Currency 2nd MAR 2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 2, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_date_format_22(self):
text = "Customer Number Currency 23rd MAR 2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 23, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_date_format_23(self):
text = "Customer Number Currency 24th MAR 2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 24, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_date_format_24(self):
text = "Customer Number Currency 21-MAR-2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 21, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_date_format_25(self):
text = "Customer Number Currency 25TH MAR 2022 Credit Card 1934829304"
self.assertEqual(
parse_date("", text),
datetime.datetime(2022, 3, 25, 0, 0, tzinfo=tz.gettz(settings.TIME_ZONE)),
)
def test_crazy_date_past(self, *args):
self.assertIsNone(parse_date("", "01-07-0590 00:00:00"))

View File

@ -0,0 +1,253 @@
import datetime as dt
import os
import shutil
from pathlib import Path
from typing import Union
from unittest import mock
from django.test import TestCase
from django.test import override_settings
from pdfminer.high_level import extract_text
from pikepdf import Pdf
from documents import tasks
from documents.consumer import ConsumerError
from documents.data_models import ConsumableDocument
from documents.data_models import DocumentSource
from documents.double_sided import STAGING_FILE_NAME
from documents.double_sided import TIMEOUT_MINUTES
from documents.tests.utils import DirectoriesMixin
from documents.tests.utils import FileSystemAssertsMixin
@override_settings(
CONSUMER_RECURSIVE=True,
CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED=True,
)
class TestDoubleSided(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
SAMPLE_DIR = Path(__file__).parent / "samples"
def setUp(self):
super().setUp()
self.dirs.double_sided_dir = self.dirs.consumption_dir / "double-sided"
self.dirs.double_sided_dir.mkdir()
self.staging_file = self.dirs.scratch_dir / STAGING_FILE_NAME
def consume_file(self, srcname, dstname: Union[str, Path] = "foo.pdf"):
"""
Starts the consume process and also ensures the
destination file does not exist afterwards
"""
src = self.SAMPLE_DIR / srcname
dst = self.dirs.double_sided_dir / dstname
dst.parent.mkdir(parents=True, exist_ok=True)
shutil.copy(src, dst)
with mock.patch("documents.tasks.async_to_sync"), mock.patch(
"documents.consumer.async_to_sync",
):
msg = tasks.consume_file(
ConsumableDocument(
source=DocumentSource.ConsumeFolder,
original_file=dst,
),
None,
)
self.assertIsNotFile(dst)
return msg
def create_staging_file(self, src="double-sided-odd.pdf", datetime=None):
shutil.copy(self.SAMPLE_DIR / src, self.staging_file)
if datetime is None:
datetime = dt.datetime.now()
os.utime(str(self.staging_file), (datetime.timestamp(),) * 2)
def test_odd_numbered_moved_to_staging(self):
"""
GIVEN:
- No staging file exists
WHEN:
- A file is copied into the double-sided consume directory
THEN:
- The file becomes the new staging file
- The file in the consume directory gets removed
- The staging file has the st_mtime set to now
- The user gets informed
"""
msg = self.consume_file("double-sided-odd.pdf")
self.assertIsFile(self.staging_file)
self.assertAlmostEqual(
dt.datetime.fromtimestamp(self.staging_file.stat().st_mtime),
dt.datetime.now(),
delta=dt.timedelta(seconds=5),
)
self.assertIn("Received odd numbered pages", msg)
def test_collation(self):
"""
GIVEN:
- A staging file not older than TIMEOUT_MINUTES with odd pages exists
WHEN:
- A file is copied into the double-sided consume directory
THEN:
- A new file containing the collated staging and uploaded file is
created and put into the consume directory
- The new file is named "foo-collated.pdf", where foo is the name of
the second file
- Both staging and uploaded file get deleted
- The new file contains the pages in the correct order
"""
self.create_staging_file()
self.consume_file("double-sided-even.pdf", "some-random-name.pdf")
target = self.dirs.consumption_dir / "some-random-name-collated.pdf"
self.assertIsFile(target)
self.assertIsNotFile(self.staging_file)
self.assertRegex(
extract_text(str(target)),
r"(?s)"
r"This is page 1.*This is page 2.*This is page 3.*"
r"This is page 4.*This is page 5",
)
def test_staging_file_expiration(self):
"""
GIVEN:
- A staging file older than TIMEOUT_MINUTES exists
WHEN:
- A file is copied into the double-sided consume directory
THEN:
- It becomes the new staging file
"""
self.create_staging_file(
datetime=dt.datetime.now()
- dt.timedelta(minutes=TIMEOUT_MINUTES, seconds=1),
)
msg = self.consume_file("double-sided-odd.pdf")
self.assertIsFile(self.staging_file)
self.assertIn("Received odd numbered pages", msg)
def test_less_odd_pages_then_even_fails(self):
"""
GIVEN:
- A valid staging file
WHEN:
- A file is copied into the double-sided consume directory
that has more pages than the staging file
THEN:
- Both files get removed
- A ConsumerError exception is thrown
"""
self.create_staging_file("simple.pdf")
self.assertRaises(
ConsumerError,
self.consume_file,
"double-sided-even.pdf",
)
self.assertIsNotFile(self.staging_file)
@override_settings(CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT=True)
def test_tiff_upload_enabled(self):
"""
GIVEN:
- CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT is true
- No staging file exists
WHEN:
- A TIFF file gets uploaded into the double-sided
consume dir
THEN:
- The file is converted into a PDF and moved to
the staging file
"""
self.consume_file("simple.tiff", "simple.tiff")
self.assertIsFile(self.staging_file)
# Ensure the file is a valid PDF by trying to read it
Pdf.open(self.staging_file)
@override_settings(CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT=False)
def test_tiff_upload_disabled(self):
"""
GIVEN:
- CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT is false
- No staging file exists
WHEN:
- A TIFF file gets uploaded into the double-sided
consume dir
THEN:
- A ConsumerError is raised
"""
self.assertRaises(
ConsumerError,
self.consume_file,
"simple.tiff",
"simple.tiff",
)
@override_settings(CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME="quux")
def test_different_upload_dir_name(self):
"""
GIVEN:
- No staging file exists
- CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME is set to quux
WHEN:
- A file is uploaded into the quux dir
THEN:
- A staging file is created
"""
self.consume_file("double-sided-odd.pdf", Path("..") / "quux" / "foo.pdf")
self.assertIsFile(self.staging_file)
def test_only_double_sided_dir_is_handled(self):
"""
GIVEN:
- No staging file exists
WHEN:
- A file is uploaded into the normal consumption dir
THEN:
- The file is processed as normal
"""
msg = self.consume_file("simple.pdf", Path("..") / "simple.pdf")
self.assertIsNotFile(self.staging_file)
self.assertRegex(msg, "Success. New document .* created")
def test_subdirectory_upload(self):
"""
GIVEN:
- A staging file exists
WHEN:
- A file gets uploaded into foo/bar/double-sided
or double-sided/foo/bar
THEN:
- The collated file gets put into foo/bar
"""
for path in [
Path("foo") / "bar" / "double-sided",
Path("double-sided") / "foo" / "bar",
]:
with self.subTest(path=path):
# Ensure we get fresh directories for each run
self.tearDown()
self.setUp()
self.create_staging_file()
self.consume_file("double-sided-odd.pdf", path / "foo.pdf")
self.assertIsFile(
self.dirs.consumption_dir / "foo" / "bar" / "foo-collated.pdf",
)
@override_settings(CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED=False)
def test_disabled_double_sided_dir_upload(self):
"""
GIVEN:
- CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED is false
WHEN:
- A file is uploaded into the double-sided directory
THEN:
- The file is processed like a normal upload
"""
msg = self.consume_file("simple.pdf")
self.assertIsNotFile(self.staging_file)
self.assertRegex(msg, "Success. New document .* created")

View File

@ -446,6 +446,19 @@ class TestFileHandling(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
self.assertIsNotDir(os.path.join(settings.ORIGINALS_DIR, "none"))
self.assertIsDir(settings.ORIGINALS_DIR)
@override_settings(FILENAME_FORMAT="{doc_pk}")
def test_format_doc_pk(self):
document = Document()
document.pk = 1
document.mime_type = "application/pdf"
document.storage_type = Document.STORAGE_TYPE_UNENCRYPTED
self.assertEqual(generate_filename(document), "0000001.pdf")
document.pk = 13579
self.assertEqual(generate_filename(document), "0013579.pdf")
@override_settings(FILENAME_FORMAT=None)
def test_format_none(self):
document = Document()

View File

@ -7,11 +7,18 @@ from pathlib import Path
from unittest import mock
from zipfile import ZipFile
from django.contrib.auth.models import Group
from django.contrib.auth.models import Permission
from django.contrib.contenttypes.models import ContentType
from django.core.management import call_command
from django.core.management.base import CommandError
from django.db import IntegrityError
from django.test import TestCase
from django.test import override_settings
from django.utils import timezone
from guardian.models import GroupObjectPermission
from guardian.models import UserObjectPermission
from guardian.shortcuts import assign_perm
from documents.management.commands import document_exporter
from documents.models import Correspondent
@ -34,6 +41,8 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
self.addCleanup(shutil.rmtree, self.target)
self.user = User.objects.create(username="temp_admin")
self.user2 = User.objects.create(username="user2")
self.group1 = Group.objects.create(name="group1")
self.d1 = Document.objects.create(
content="Content",
@ -73,6 +82,9 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
user=self.user,
)
assign_perm("view_document", self.user2, self.d2)
assign_perm("view_document", self.group1, self.d3)
self.t1 = Tag.objects.create(name="t")
self.dt1 = DocumentType.objects.create(name="dt")
self.c1 = Correspondent.objects.create(name="c")
@ -141,12 +153,12 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
manifest = self._do_export(use_filename_format=use_filename_format)
self.assertEqual(len(manifest), 10)
self.assertEqual(len(manifest), 149)
# don't include consumer or AnonymousUser users
self.assertEqual(
len(list(filter(lambda e: e["model"] == "auth.user", manifest))),
1,
2,
)
self.assertEqual(
@ -218,6 +230,9 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
Correspondent.objects.all().delete()
DocumentType.objects.all().delete()
Tag.objects.all().delete()
Permission.objects.all().delete()
UserObjectPermission.objects.all().delete()
GroupObjectPermission.objects.all().delete()
self.assertEqual(Document.objects.count(), 0)
call_command("document_importer", "--no-progress-bar", self.target)
@ -230,6 +245,9 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
self.assertEqual(Document.objects.get(id=self.d2.id).title, "wow2")
self.assertEqual(Document.objects.get(id=self.d3.id).title, "wow2")
self.assertEqual(Document.objects.get(id=self.d4.id).title, "wow_dec")
self.assertEqual(GroupObjectPermission.objects.count(), 1)
self.assertEqual(UserObjectPermission.objects.count(), 1)
self.assertEqual(Permission.objects.count(), 108)
messages = check_sanity()
# everything is alright after the test
self.assertEqual(len(messages), 0)
@ -259,7 +277,7 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
st_mtime_1 = os.stat(os.path.join(self.target, "manifest.json")).st_mtime
with mock.patch(
"documents.management.commands.document_exporter.shutil.copy2",
"documents.management.commands.document_exporter.copy_file_with_basic_stats",
) as m:
self._do_export()
m.assert_not_called()
@ -270,7 +288,7 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
Path(self.d1.source_path).touch()
with mock.patch(
"documents.management.commands.document_exporter.shutil.copy2",
"documents.management.commands.document_exporter.copy_file_with_basic_stats",
) as m:
self._do_export()
self.assertEqual(m.call_count, 1)
@ -293,7 +311,7 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
self.assertIsFile(os.path.join(self.target, "manifest.json"))
with mock.patch(
"documents.management.commands.document_exporter.shutil.copy2",
"documents.management.commands.document_exporter.copy_file_with_basic_stats",
) as m:
self._do_export()
m.assert_not_called()
@ -304,7 +322,7 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
self.d2.save()
with mock.patch(
"documents.management.commands.document_exporter.shutil.copy2",
"documents.management.commands.document_exporter.copy_file_with_basic_stats",
) as m:
self._do_export(compare_checksums=True)
self.assertEqual(m.call_count, 1)
@ -641,3 +659,47 @@ class TestExportImport(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
self.assertEqual(Document.objects.count(), 0)
call_command("document_importer", "--no-progress-bar", self.target)
self.assertEqual(Document.objects.count(), 4)
def test_import_db_transaction_failed(self):
"""
GIVEN:
- Import from manifest started
WHEN:
- Import of database fails
THEN:
- ContentType & Permission objects are not deleted, db transaction rolled back
"""
shutil.rmtree(os.path.join(self.dirs.media_dir, "documents"))
shutil.copytree(
os.path.join(os.path.dirname(__file__), "samples", "documents"),
os.path.join(self.dirs.media_dir, "documents"),
)
self.assertEqual(ContentType.objects.count(), 27)
self.assertEqual(Permission.objects.count(), 108)
manifest = self._do_export()
with paperless_environment():
self.assertEqual(
len(list(filter(lambda e: e["model"] == "auth.permission", manifest))),
108,
)
# add 1 more to db to show objects are not re-created by import
Permission.objects.create(
name="test",
codename="test_perm",
content_type_id=1,
)
self.assertEqual(Permission.objects.count(), 109)
# will cause an import error
self.user.delete()
self.user = User.objects.create(username="temp_admin")
with self.assertRaises(IntegrityError):
call_command("document_importer", "--no-progress-bar", self.target)
self.assertEqual(ContentType.objects.count(), 27)
self.assertEqual(Permission.objects.count(), 109)

View File

@ -2,6 +2,7 @@ import hashlib
import os
import shutil
from pathlib import Path
from typing import Optional
from unittest import mock
from django.conf import settings
@ -60,8 +61,8 @@ def make_test_document(
mime_type: str,
original: str,
original_filename: str,
archive: str = None,
archive_filename: str = None,
archive: Optional[str] = None,
archive_filename: Optional[str] = None,
):
doc = document_class()
doc.filename = original_filename

View File

@ -0,0 +1,276 @@
import shutil
import tempfile
from pathlib import Path
from typing import Callable
from typing import Iterable
from typing import Union
from unittest import mock
from django.test import override_settings
from documents.tests.utils import TestMigrations
@override_settings(PASSPHRASE="test")
@mock.patch(
"documents.migrations.1037_webp_encrypted_thumbnail_conversion.multiprocessing.pool.Pool.map",
)
@mock.patch("documents.migrations.1037_webp_encrypted_thumbnail_conversion.run_convert")
class TestMigrateToEncrytpedWebPThumbnails(TestMigrations):
migrate_from = "1036_alter_savedviewfilterrule_rule_type"
migrate_to = "1037_webp_encrypted_thumbnail_conversion"
auto_migrate = False
def pretend_convert_output(self, *args, **kwargs):
"""
Pretends to do the conversion, by copying the input file
to the output file
"""
shutil.copy2(
Path(kwargs["input_file"].rstrip("[0]")),
Path(kwargs["output_file"]),
)
def pretend_map(self, func: Callable, iterable: Iterable):
"""
Pretends to be the map of a multiprocessing.Pool, but secretly does
everything in series
"""
for item in iterable:
func(item)
def create_dummy_thumbnails(
self,
thumb_dir: Path,
ext: str,
count: int,
start_count: int = 0,
):
"""
Helper to create a certain count of files of given extension in a given directory
"""
for idx in range(count):
(Path(thumb_dir) / Path(f"{start_count + idx:07}.{ext}")).touch()
# Triple check expected files exist
self.assert_file_count_by_extension(ext, thumb_dir, count)
def create_webp_thumbnail_files(
self,
thumb_dir: Path,
count: int,
start_count: int = 0,
):
"""
Creates a dummy WebP thumbnail file in the given directory, based on
the database Document
"""
self.create_dummy_thumbnails(thumb_dir, "webp", count, start_count)
def create_encrypted_webp_thumbnail_files(
self,
thumb_dir: Path,
count: int,
start_count: int = 0,
):
"""
Creates a dummy encrypted WebP thumbnail file in the given directory, based on
the database Document
"""
self.create_dummy_thumbnails(thumb_dir, "webp.gpg", count, start_count)
def create_png_thumbnail_files(
self,
thumb_dir: Path,
count: int,
start_count: int = 0,
):
"""
Creates a dummy PNG thumbnail file in the given directory, based on
the database Document
"""
self.create_dummy_thumbnails(thumb_dir, "png", count, start_count)
def create_encrypted_png_thumbnail_files(
self,
thumb_dir: Path,
count: int,
start_count: int = 0,
):
"""
Creates a dummy encrypted PNG thumbnail file in the given directory, based on
the database Document
"""
self.create_dummy_thumbnails(thumb_dir, "png.gpg", count, start_count)
def assert_file_count_by_extension(
self,
ext: str,
dir: Union[str, Path],
expected_count: int,
):
"""
Helper to assert a certain count of given extension files in given directory
"""
if not isinstance(dir, Path):
dir = Path(dir)
matching_files = list(dir.glob(f"*.{ext}"))
self.assertEqual(len(matching_files), expected_count)
def assert_encrypted_png_file_count(self, dir: Path, expected_count: int):
"""
Helper to assert a certain count of encrypted PNG extension files in given directory
"""
self.assert_file_count_by_extension("png.gpg", dir, expected_count)
def assert_encrypted_webp_file_count(self, dir: Path, expected_count: int):
"""
Helper to assert a certain count of encrypted WebP extension files in given directory
"""
self.assert_file_count_by_extension("webp.gpg", dir, expected_count)
def assert_webp_file_count(self, dir: Path, expected_count: int):
"""
Helper to assert a certain count of WebP extension files in given directory
"""
self.assert_file_count_by_extension("webp", dir, expected_count)
def assert_png_file_count(self, dir: Path, expected_count: int):
"""
Helper to assert a certain count of PNG extension files in given directory
"""
self.assert_file_count_by_extension("png", dir, expected_count)
def setUp(self):
self.thumbnail_dir = Path(tempfile.mkdtemp()).resolve()
return super().setUp()
def tearDown(self) -> None:
shutil.rmtree(self.thumbnail_dir)
return super().tearDown()
def test_do_nothing_if_converted(
self,
run_convert_mock: mock.MagicMock,
map_mock: mock.MagicMock,
):
"""
GIVEN:
- Encrypted document exists with existing encrypted WebP thumbnail path
WHEN:
- Migration is attempted
THEN:
- Nothing is converted
"""
map_mock.side_effect = self.pretend_map
with override_settings(
THUMBNAIL_DIR=self.thumbnail_dir,
):
self.create_encrypted_webp_thumbnail_files(self.thumbnail_dir, 3)
self.performMigration()
run_convert_mock.assert_not_called()
self.assert_encrypted_webp_file_count(self.thumbnail_dir, 3)
def test_convert_thumbnails(
self,
run_convert_mock: mock.MagicMock,
map_mock: mock.MagicMock,
):
"""
GIVEN:
- Encrypted documents exist with PNG thumbnail
WHEN:
- Migration is attempted
THEN:
- Thumbnails are converted to webp & re-encrypted
"""
map_mock.side_effect = self.pretend_map
run_convert_mock.side_effect = self.pretend_convert_output
with override_settings(
THUMBNAIL_DIR=self.thumbnail_dir,
):
self.create_encrypted_png_thumbnail_files(self.thumbnail_dir, 3)
self.performMigration()
run_convert_mock.assert_called()
self.assertEqual(run_convert_mock.call_count, 3)
self.assert_encrypted_webp_file_count(self.thumbnail_dir, 3)
def test_convert_errors_out(
self,
run_convert_mock: mock.MagicMock,
map_mock: mock.MagicMock,
):
"""
GIVEN:
- Encrypted document exists with PNG thumbnail
WHEN:
- Migration is attempted, but the conversion raises an exception
THEN:
- No thumbnails are converted; the existing encrypted PNG files remain
"""
map_mock.side_effect = self.pretend_map
run_convert_mock.side_effect = OSError
with override_settings(
THUMBNAIL_DIR=self.thumbnail_dir,
):
self.create_encrypted_png_thumbnail_files(self.thumbnail_dir, 3)
self.performMigration()
run_convert_mock.assert_called()
self.assertEqual(run_convert_mock.call_count, 3)
self.assert_encrypted_png_file_count(self.thumbnail_dir, 3)
def test_convert_mixed(
self,
run_convert_mock: mock.MagicMock,
map_mock: mock.MagicMock,
):
"""
GIVEN:
- Documents exist with PNG, encrypted PNG and WebP thumbnails
WHEN:
- Migration is attempted
THEN:
- Only encrypted PNG thumbnails are converted
"""
map_mock.side_effect = self.pretend_map
run_convert_mock.side_effect = self.pretend_convert_output
with override_settings(
THUMBNAIL_DIR=self.thumbnail_dir,
):
self.create_png_thumbnail_files(self.thumbnail_dir, 3)
self.create_encrypted_png_thumbnail_files(
self.thumbnail_dir,
3,
start_count=3,
)
self.create_webp_thumbnail_files(self.thumbnail_dir, 2, start_count=6)
self.create_encrypted_webp_thumbnail_files(
self.thumbnail_dir,
3,
start_count=8,
)
self.performMigration()
run_convert_mock.assert_called()
self.assertEqual(run_convert_mock.call_count, 3)
self.assert_png_file_count(self.thumbnail_dir, 3)
self.assert_encrypted_webp_file_count(self.thumbnail_dir, 6)
self.assert_webp_file_count(self.thumbnail_dir, 2)
self.assert_encrypted_png_file_count(self.thumbnail_dir, 0)

src/documents/utils.py Normal file (43 lines)
View File

@ -0,0 +1,43 @@
import shutil
from os import utime
from pathlib import Path
from typing import Tuple
from typing import Union
def _coerce_to_path(
source: Union[Path, str],
dest: Union[Path, str],
) -> Tuple[Path, Path]:
return Path(source).resolve(), Path(dest).resolve()
def copy_basic_file_stats(source: Union[Path, str], dest: Union[Path, str]) -> None:
"""
Copies only the m_time and a_time attributes from source to destination.
Both are expected to exist.
The extended attribute copy does weird things with SELinux and files
copied from temporary directories, and copystat doesn't allow disabling
these copies.
"""
source, dest = _coerce_to_path(source, dest)
src_stat = source.stat()
utime(dest, ns=(src_stat.st_atime_ns, src_stat.st_mtime_ns))
def copy_file_with_basic_stats(
source: Union[Path, str],
dest: Union[Path, str],
) -> None:
"""
A sort of simpler copy2 that doesn't copy extended file attributes,
only the access time and modified times from source to dest.
The extended attribute copy does weird things with SELinux and files
copied from temporary directories.
"""
source, dest = _coerce_to_path(source, dest)
shutil.copy(source, dest)
copy_basic_file_stats(source, dest)
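As a usage sketch (the file names and contents below are purely illustrative, not taken from the codebase), the new helper can be exercised like this:

# Sketch only: assumes documents.utils is importable as added above
from pathlib import Path
from tempfile import TemporaryDirectory

from documents.utils import copy_file_with_basic_stats

with TemporaryDirectory() as tmp:
    src = Path(tmp) / "original.pdf"   # hypothetical source file
    src.write_bytes(b"%PDF-1.7 example content")
    dest = Path(tmp) / "copy.pdf"      # hypothetical destination

    # Copies the bytes plus atime/mtime, but skips extended attributes,
    # so SELinux labels from the temporary directory are not carried over.
    copy_file_with_basic_stats(src, dest)
    assert dest.stat().st_mtime_ns == src.stat().st_mtime_ns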

View File

@ -502,19 +502,18 @@ class DocumentViewSet(
@action(methods=["get", "post", "delete"], detail=True)
def notes(self, request, pk=None):
currentUser = request.user
try:
doc = Document.objects.get(pk=pk)
if request.user is not None and not has_perms_owner_aware(
request.user,
if currentUser is not None and not has_perms_owner_aware(
currentUser,
"view_document",
doc,
):
return HttpResponseForbidden("Insufficient permissions")
return HttpResponseForbidden("Insufficient permissions to view")
except Document.DoesNotExist:
raise Http404
currentUser = request.user
if request.method == "GET":
try:
return Response(self.getNotes(doc))
@ -525,6 +524,13 @@ class DocumentViewSet(
)
elif request.method == "POST":
try:
if currentUser is not None and not has_perms_owner_aware(
currentUser,
"change_document",
doc,
):
return HttpResponseForbidden("Insufficient permissions to create")
c = Note.objects.create(
document=doc,
note=request.data["note"],
@ -545,6 +551,13 @@ class DocumentViewSet(
},
)
elif request.method == "DELETE":
if currentUser is not None and not has_perms_owner_aware(
currentUser,
"change_document",
doc,
):
return HttpResponseForbidden("Insufficient permissions to delete")
note = Note.objects.get(id=int(request.GET.get("id")))
note.delete()

View File

@ -791,6 +791,18 @@ CONSUMER_BARCODE_DPI: Final[str] = int(
os.getenv("PAPERLESS_CONSUMER_BARCODE_DPI", 300),
)
CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED: Final[bool] = __get_boolean(
"PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED",
)
CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME: Final[str] = os.getenv(
"PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_SUBDIR_NAME",
"double-sided",
)
CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT: Final[bool] = __get_boolean(
"PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT",
)
OCR_PAGES = int(os.getenv("PAPERLESS_OCR_PAGES", 0))
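The three new collation settings are read through __get_boolean and os.getenv as shown above. For reference, a boolean helper of this kind is commonly implemented along the following lines; this is a hedged sketch of the idea, not necessarily the exact helper used in settings.py, and the environment values are hypothetical:

import os

def get_boolean(key: str, default: str = "NO") -> bool:
    # Accept the usual truthy spellings; anything else counts as False.
    return os.getenv(key, default).lower() in ("yes", "y", "1", "t", "true")

# Hypothetical environment values for the new double-sided collation flags:
os.environ["PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED"] = "yes"
os.environ["PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT"] = "false"
assert get_boolean("PAPERLESS_CONSUMER_ENABLE_COLLATE_DOUBLE_SIDED") is True
assert get_boolean("PAPERLESS_CONSUMER_COLLATE_DOUBLE_SIDED_TIFF_SUPPORT") is False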

View File

@ -1,6 +1,7 @@
from django import forms
from django.contrib import admin
from django.utils.translation import gettext_lazy as _
from guardian.admin import GuardedModelAdmin
from paperless_mail.models import MailAccount
from paperless_mail.models import MailRule
@ -31,7 +32,7 @@ class MailAccountAdminForm(forms.ModelForm):
]
class MailAccountAdmin(admin.ModelAdmin):
class MailAccountAdmin(GuardedModelAdmin):
list_display = ("name", "imap_server", "username")
fieldsets = [
@ -45,7 +46,7 @@ class MailAccountAdmin(admin.ModelAdmin):
form = MailAccountAdminForm
class MailRuleAdmin(admin.ModelAdmin):
class MailRuleAdmin(GuardedModelAdmin):
radio_fields = {
"attachment_type": admin.VERTICAL,
"action": admin.VERTICAL,

View File

@ -2,6 +2,7 @@ import datetime
import itertools
import logging
import os
import ssl
import tempfile
import traceback
from datetime import date
@ -394,13 +395,12 @@ def get_mailbox(server, port, security) -> MailBox:
"""
Returns the correct MailBox instance for the given configuration.
"""
if security == MailAccount.ImapSecurity.NONE:
mailbox = MailBoxUnencrypted(server, port)
elif security == MailAccount.ImapSecurity.STARTTLS:
mailbox = MailBoxTls(server, port)
mailbox = MailBoxTls(server, port, ssl_context=ssl.create_default_context())
elif security == MailAccount.ImapSecurity.SSL:
mailbox = MailBox(server, port)
mailbox = MailBox(server, port, ssl_context=ssl.create_default_context())
else:
raise NotImplementedError("Unknown IMAP security") # pragma: nocover
return mailbox
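A usage sketch of the STARTTLS branch above; the server, port, and credentials are placeholders, and this simply shows MailBoxTls being constructed with an explicit default SSL context the same way get_mailbox now does:

import ssl

from imap_tools import MailBoxTls

# Placeholder connection details for illustration only.
ctx = ssl.create_default_context()
with MailBoxTls("imap.example.com", 143, ssl_context=ctx).login(
    "user@example.com",
    "password",
) as mailbox:
    for msg in mailbox.fetch(limit=1):
        print(msg.subject)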

View File

@ -25,7 +25,6 @@ class MailAccountSerializer(OwnedObjectSerializer):
class Meta:
model = MailAccount
depth = 1
fields = [
"id",
"name",
@ -36,6 +35,10 @@ class MailAccountSerializer(OwnedObjectSerializer):
"password",
"character_set",
"is_token",
"owner",
"user_can_change",
"permissions",
"set_permissions",
]
def update(self, instance, validated_data):
@ -67,7 +70,6 @@ class MailRuleSerializer(OwnedObjectSerializer):
class Meta:
model = MailRule
depth = 1
fields = [
"id",
"name",
@ -89,6 +91,10 @@ class MailRuleSerializer(OwnedObjectSerializer):
"order",
"attachment_type",
"consumption_scope",
"owner",
"user_can_change",
"permissions",
"set_permissions",
]
def update(self, instance, validated_data):

View File

@ -1,7 +1,9 @@
import json
from unittest import mock
from django.contrib.auth.models import Permission
from django.contrib.auth.models import User
from guardian.shortcuts import assign_perm
from rest_framework import status
from rest_framework.test import APITestCase
@ -27,7 +29,9 @@ class TestAPIMailAccounts(DirectoriesMixin, APITestCase):
super().setUp()
self.user = User.objects.create_superuser(username="temp_admin")
self.user = User.objects.create_user(username="temp_admin")
self.user.user_permissions.add(*Permission.objects.all())
self.user.save()
self.client.force_authenticate(user=self.user)
def test_get_mail_accounts(self):
@ -266,6 +270,73 @@ class TestAPIMailAccounts(DirectoriesMixin, APITestCase):
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["success"], True)
def test_get_mail_accounts_owner_aware(self):
"""
GIVEN:
- Configured accounts with different users
WHEN:
- API call is made to get mail accounts
THEN:
- Only accounts that are unowned, owned by the user, or explicitly granted to the user are returned
"""
user2 = User.objects.create_user(username="temp_admin2")
account1 = MailAccount.objects.create(
name="Email1",
username="username1",
password="password1",
imap_server="server.example.com",
imap_port=443,
imap_security=MailAccount.ImapSecurity.SSL,
character_set="UTF-8",
)
account2 = MailAccount.objects.create(
name="Email2",
username="username2",
password="password2",
imap_server="server.example.com",
imap_port=443,
imap_security=MailAccount.ImapSecurity.SSL,
character_set="UTF-8",
)
account2.owner = self.user
account2.save()
account3 = MailAccount.objects.create(
name="Email3",
username="username3",
password="password3",
imap_server="server.example.com",
imap_port=443,
imap_security=MailAccount.ImapSecurity.SSL,
character_set="UTF-8",
)
account3.owner = user2
account3.save()
account4 = MailAccount.objects.create(
name="Email4",
username="username4",
password="password4",
imap_server="server.example.com",
imap_port=443,
imap_security=MailAccount.ImapSecurity.SSL,
character_set="UTF-8",
)
account4.owner = user2
account4.save()
assign_perm("view_mailaccount", self.user, account4)
response = self.client.get(self.ENDPOINT)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["count"], 3)
self.assertEqual(response.data["results"][0]["name"], account1.name)
self.assertEqual(response.data["results"][1]["name"], account2.name)
self.assertEqual(response.data["results"][2]["name"], account4.name)
class TestAPIMailRules(DirectoriesMixin, APITestCase):
ENDPOINT = "/api/mail_rules/"
@ -273,7 +344,9 @@ class TestAPIMailRules(DirectoriesMixin, APITestCase):
def setUp(self):
super().setUp()
self.user = User.objects.create_superuser(username="temp_admin")
self.user = User.objects.create_user(username="temp_admin")
self.user.user_permissions.add(*Permission.objects.all())
self.user.save()
self.client.force_authenticate(user=self.user)
def test_get_mail_rules(self):
@ -533,3 +606,72 @@ class TestAPIMailRules(DirectoriesMixin, APITestCase):
returned_rule1 = MailRule.objects.get(pk=rule1.pk)
self.assertEqual(returned_rule1.name, "Updated Name 1")
self.assertEqual(returned_rule1.action, MailRule.MailAction.DELETE)
def test_get_mail_rules_owner_aware(self):
"""
GIVEN:
- Configured rules with different users
WHEN:
- API call is made to get mail rules
THEN:
- Only mail rules that are unowned, owned by the user, or explicitly granted to the user are returned
"""
user2 = User.objects.create_user(username="temp_admin2")
account1 = MailAccount.objects.create(
name="Email1",
username="username1",
password="password1",
imap_server="server.example.com",
imap_port=443,
imap_security=MailAccount.ImapSecurity.SSL,
character_set="UTF-8",
)
rule1 = MailRule.objects.create(
name="Rule1",
account=account1,
folder="INBOX",
filter_from="from@example1.com",
order=0,
)
rule2 = MailRule.objects.create(
name="Rule2",
account=account1,
folder="INBOX",
filter_from="from@example2.com",
order=1,
)
rule2.owner = self.user
rule2.save()
rule3 = MailRule.objects.create(
name="Rule3",
account=account1,
folder="INBOX",
filter_from="from@example3.com",
order=2,
)
rule3.owner = user2
rule3.save()
rule4 = MailRule.objects.create(
name="Rule4",
account=account1,
folder="INBOX",
filter_from="from@example4.com",
order=3,
)
rule4.owner = user2
rule4.save()
assign_perm("view_mailrule", self.user, rule4)
response = self.client.get(self.ENDPOINT)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data["count"], 3)
self.assertEqual(response.data["results"][0]["name"], rule1.name)
self.assertEqual(response.data["results"][1]["name"], rule2.name)
self.assertEqual(response.data["results"][2]["name"], rule4.name)

View File

@ -7,6 +7,8 @@ from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet
from documents.filters import ObjectOwnedOrGrantedPermissionsFilter
from documents.permissions import PaperlessObjectPermissions
from documents.views import PassUserMixin
from paperless.views import StandardPagination
from paperless_mail.mail import MailError
@ -24,7 +26,8 @@ class MailAccountViewSet(ModelViewSet, PassUserMixin):
queryset = MailAccount.objects.all().order_by("pk")
serializer_class = MailAccountSerializer
pagination_class = StandardPagination
permission_classes = (IsAuthenticated,)
permission_classes = (IsAuthenticated, PaperlessObjectPermissions)
filter_backends = (ObjectOwnedOrGrantedPermissionsFilter,)
class MailRuleViewSet(ModelViewSet, PassUserMixin):
@ -33,7 +36,8 @@ class MailRuleViewSet(ModelViewSet, PassUserMixin):
queryset = MailRule.objects.all().order_by("order")
serializer_class = MailRuleSerializer
pagination_class = StandardPagination
permission_classes = (IsAuthenticated,)
permission_classes = (IsAuthenticated, PaperlessObjectPermissions)
filter_backends = (ObjectOwnedOrGrantedPermissionsFilter,)
class MailAccountTestView(GenericAPIView):

View File

@ -861,8 +861,9 @@ class TestParserFileTypes(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
parser = RasterisedDocumentParser(None)
parser.parse(os.path.join(self.SAMPLE_FILES, "document.webp"), "image/webp")
self.assertIsFile(parser.archive_path)
# OCR consistent mangles this space, oh well
self.assertIn(
"this is awebp document, created 11/14/2022.",
# Older tesseract versions consistently mangle the space between "a" and "webp",
# tesseract 5.3.0 seems to do a better job, so we're accepting both
self.assertRegex(
parser.get_text().lower(),
r"this is a ?webp document, created 11/14/2022.",
)