Mirror of https://github.com/paperless-ngx/paperless-ngx.git (synced 2025-12-24 02:05:48 -06:00)

Compare commits: v2.19.5 ... ef834ae808 (62 commits)
Commit SHAs in this range:
ef834ae808, 005ef4fce6, 44f0191bfb, e9f846ca24, 2049497b76, 2a9d1fce0d, 808c074f48, 7927e5c436,
0537e87cb5, b4da5c3cd1, 251b0fb3d6, 32bdf11f7f, 0627ca69f5, f5525bbdff, a21a2a41a8, cc73ed8b86,
0c706b2316, 85b7b6874d, 56b26185fa, 6537fade7b, 9f8090816f, 1de7c52478, 9aaaa6f069, c3a20b7797,
476556379b, e5cafff043, 8e0d574e99, 8a5820328e, 809d62a2f4, 0d87f94b9b, 315b90f8e5, 47b2d2964b,
e05639ae4e, f400a8cb2f, 26abcf5612, afde52430d, 716f2da652, c54073b7c2, 247e6f39dc, 1e6dfc4481,
7cc0750066, bd6585d3b4, 717e828a1d, 07381d48e6, dd0ffaf312, 264504affc, 4feedf2add, 2f76cf9831,
1002d37f6b, d260a94740, 88c69b83ea, 2557ee2014, 3c75deed80, d05343c927, e7972b7eaf, 75a091cc0d,
dca74803fd, 3cf3d868d0, bf4fc6604a, e8c1eb86fa, c3dad3cf69, 811bd66088
@@ -1,5 +1,19 @@
# Changelog

## paperless-ngx 2.19.5

### Bug Fixes

- Fix: ensure custom field query propagation, change detection [@shamoon](https://github.com/shamoon) ([#11291](https://github.com/paperless-ngx/paperless-ngx/pull/11291))

### Dependencies

- docker(deps): Bump astral-sh/uv from 0.9.4-python3.12-bookworm-slim to 0.9.7-python3.12-bookworm-slim @[dependabot[bot]](https://github.com/apps/dependabot) ([#11283](https://github.com/paperless-ngx/paperless-ngx/pull/11283))

### All App Changes

- Fix: ensure custom field query propagation, change detection [@shamoon](https://github.com/shamoon) ([#11291](https://github.com/paperless-ngx/paperless-ngx/pull/11291))

## paperless-ngx 2.19.4

### Bug Fixes
@@ -1794,3 +1794,23 @@ password. All of these options come from their similarly-named [Django settings]

#### [`PAPERLESS_EMAIL_USE_SSL=<bool>`](#PAPERLESS_EMAIL_USE_SSL) {#PAPERLESS_EMAIL_USE_SSL}

: Defaults to false.

## Remote OCR

#### [`PAPERLESS_REMOTE_OCR_ENGINE=<str>`](#PAPERLESS_REMOTE_OCR_ENGINE) {#PAPERLESS_REMOTE_OCR_ENGINE}

: The remote OCR engine to use. Currently only Azure AI is supported as "azureai".

    Defaults to None, which disables remote OCR.

#### [`PAPERLESS_REMOTE_OCR_API_KEY=<str>`](#PAPERLESS_REMOTE_OCR_API_KEY) {#PAPERLESS_REMOTE_OCR_API_KEY}

: The API key to use for the remote OCR engine.

    Defaults to None.

#### [`PAPERLESS_REMOTE_OCR_ENDPOINT=<str>`](#PAPERLESS_REMOTE_OCR_ENDPOINT) {#PAPERLESS_REMOTE_OCR_ENDPOINT}

: The endpoint to use for the remote OCR engine. This is required for Azure AI.

    Defaults to None.
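As a rough illustration (not part of the change set itself): the three options above only take effect together. For the "azureai" engine, both the endpoint and the API key must be present, which is the same rule the `check_remote_parser_configured()` system check added later in this diff enforces. A minimal, hypothetical Python sketch of that rule:

```python
# Sketch only: the remote OCR settings are only usable together.
# For "azureai", both the endpoint and the API key must be set.
def remote_ocr_configured(engine: str | None, api_key: str | None, endpoint: str | None) -> bool:
    if engine != "azureai":  # currently the only supported engine
        return False
    return bool(api_key) and bool(endpoint)


assert remote_ocr_configured("azureai", "somekey", "https://example.cognitiveservices.azure.com")
assert not remote_ocr_configured("azureai", "somekey", None)  # endpoint missing -> not configured
assert not remote_ocr_configured(None, None, None)            # remote OCR disabled
```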
@@ -25,9 +25,10 @@ physical documents into a searchable online archive so you can keep, well, _less

## Features

- **Organize and index** your scanned documents with tags, correspondents, types, and more.
- _Your_ data is stored locally on _your_ server and is never transmitted or shared in any way.
- _Your_ data is stored locally on _your_ server and is never transmitted or shared in any way, unless you explicitly choose to do so.
- Performs **OCR** on your documents, adding searchable and selectable text, even to documents scanned with only images.
- Utilizes the open-source Tesseract engine to recognize more than 100 languages.
- _New!_ Supports remote OCR with Azure AI (opt-in).
- Documents are saved as PDF/A format which is designed for long term storage, alongside the unaltered originals.
- Uses machine-learning to automatically add tags, correspondents and document types to your documents.
- Supports PDF documents, images, plain text files, Office documents (Word, Excel, PowerPoint, and LibreOffice equivalents)[^1] and more.
@@ -891,6 +891,21 @@ how regularly you intend to scan documents and use paperless.
performed the task associated with the document, move it to the
inbox.

## Remote OCR

!!! important

    This feature is disabled by default and will always remain strictly "opt-in".

Paperless-ngx supports performing OCR on documents using remote services. At the moment, this is limited to
[Microsoft's Azure "Document Intelligence" service](https://azure.microsoft.com/en-us/products/ai-services/ai-document-intelligence).
This is of course a paid service (with a free tier) which requires an Azure account and subscription. Azure AI is not affiliated with
Paperless-ngx in any way. When enabled, Paperless-ngx will automatically send appropriate documents to Azure for OCR processing, bypassing
the local OCR engine. See the [configuration](configuration.md#PAPERLESS_REMOTE_OCR_ENGINE) options for more details.

Additionally, when using a commercial service with this feature, consider both potential costs as well as any associated file size
or page limitations (e.g. with a free tier).

## Architecture

Paperless-ngx consists of the following components:
@@ -16,6 +16,7 @@ classifiers = [
# This will allow testing to not install a webserver, mysql, etc

dependencies = [
    "azure-ai-documentintelligence>=1.0.2",
    "babel>=2.17",
    "bleach~=6.2.0",
    "celery[redis]~=5.5.1",
@@ -252,6 +253,7 @@ testpaths = [
    "src/paperless_tesseract/tests/",
    "src/paperless_tika/tests",
    "src/paperless_text/tests/",
    "src/paperless_remote/tests/",
]
addopts = [
    "--pythonwarnings=all",
@@ -2557,7 +2557,7 @@
      </context-group>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">195</context>
        <context context-type="linenumber">196</context>
      </context-group>
    </trans-unit>
    <trans-unit id="2753185112875184719" datatype="html">
@@ -6044,71 +6044,71 @@
      <source>Profile updated successfully</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">192</context>
        <context context-type="linenumber">193</context>
      </context-group>
    </trans-unit>
    <trans-unit id="3417726855410304962" datatype="html">
      <source>Error saving profile</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">206</context>
        <context context-type="linenumber">207</context>
      </context-group>
    </trans-unit>
    <trans-unit id="154249228726292516" datatype="html">
      <source>Error generating auth token</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">223</context>
        <context context-type="linenumber">225</context>
      </context-group>
    </trans-unit>
    <trans-unit id="4153637646944982460" datatype="html">
      <source>Error disconnecting social account</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">248</context>
        <context context-type="linenumber">250</context>
      </context-group>
    </trans-unit>
    <trans-unit id="5939111172212776886" datatype="html">
      <source>Error fetching TOTP settings</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">267</context>
        <context context-type="linenumber">269</context>
      </context-group>
    </trans-unit>
    <trans-unit id="1030314492414713260" datatype="html">
      <source>TOTP activated successfully</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">288</context>
        <context context-type="linenumber">290</context>
      </context-group>
    </trans-unit>
    <trans-unit id="3755006064892435830" datatype="html">
      <source>Error activating TOTP</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">290</context>
        <context context-type="linenumber">292</context>
      </context-group>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">296</context>
        <context context-type="linenumber">298</context>
      </context-group>
    </trans-unit>
    <trans-unit id="5919827473541889422" datatype="html">
      <source>TOTP deactivated successfully</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">312</context>
        <context context-type="linenumber">314</context>
      </context-group>
    </trans-unit>
    <trans-unit id="6214722303383624015" datatype="html">
      <source>Error deactivating TOTP</source>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">314</context>
        <context context-type="linenumber">316</context>
      </context-group>
      <context-group purpose="location">
        <context context-type="sourcefile">src/app/components/common/profile-edit-dialog/profile-edit-dialog.component.ts</context>
        <context context-type="linenumber">319</context>
        <context context-type="linenumber">321</context>
      </context-group>
    </trans-unit>
    <trans-unit id="6617773613987957957" datatype="html">
@@ -183,6 +183,7 @@ export class ProfileEditDialogComponent
      this.newPassword && this.currentPassword !== this.newPassword
    const profile = Object.assign({}, this.form.value)
    delete profile.totp_code
    this.error = null
    this.networkActive = true
    this.profileService
      .update(profile)
@@ -204,6 +205,7 @@ export class ProfileEditDialogComponent
        },
        error: (error) => {
          this.toastService.showError($localize`Error saving profile`, error)
          this.error = error?.error
          this.networkActive = false
        },
      })
@@ -99,6 +99,29 @@ def generate_unique_filename(doc, *, archive_filename=False) -> Path:
    return new_filename


def format_filename(document: Document, template_str: str) -> str | None:
    rendered_filename = validate_filepath_template_and_render(
        template_str,
        document,
    )
    if rendered_filename is None:
        return None

    # Apply this setting. It could become a filter in the future (or users could use |default)
    if settings.FILENAME_FORMAT_REMOVE_NONE:
        rendered_filename = rendered_filename.replace("/-none-/", "/")
        rendered_filename = rendered_filename.replace(" -none-", "")
        rendered_filename = rendered_filename.replace("-none-", "")
        rendered_filename = rendered_filename.strip(os.sep)

    rendered_filename = rendered_filename.replace(
        "-none-",
        "none",
    )  # backward compatibility

    return rendered_filename


def generate_filename(
    doc: Document,
    *,
@@ -108,28 +131,6 @@ def generate_filename(
) -> Path:
    base_path: Path | None = None

    def format_filename(document: Document, template_str: str) -> str | None:
        rendered_filename = validate_filepath_template_and_render(
            template_str,
            document,
        )
        if rendered_filename is None:
            return None

        # Apply this setting. It could become a filter in the future (or users could use |default)
        if settings.FILENAME_FORMAT_REMOVE_NONE:
            rendered_filename = rendered_filename.replace("/-none-/", "/")
            rendered_filename = rendered_filename.replace(" -none-", "")
            rendered_filename = rendered_filename.replace("-none-", "")
            rendered_filename = rendered_filename.strip(os.sep)

        rendered_filename = rendered_filename.replace(
            "-none-",
            "none",
        )  # backward compatibility

        return rendered_filename

    # Determine the source of the format string
    if doc.storage_path is not None:
        filename_format = doc.storage_path.path
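For reference, a minimal sketch (not part of the diff) of what the placeholder cleanup in the now module-level `format_filename()` does to a rendered template, with and without `FILENAME_FORMAT_REMOVE_NONE`:

```python
import os

# Sketch only: the cleanup applied by format_filename() above.
def cleanup(rendered: str, *, remove_none: bool) -> str:
    if remove_none:
        rendered = rendered.replace("/-none-/", "/")  # drop empty path segments
        rendered = rendered.replace(" -none-", "")    # drop when joined by a space
        rendered = rendered.replace("-none-", "")     # drop anywhere else
        rendered = rendered.strip(os.sep)
    # backward compatibility: any remaining placeholder renders as "none"
    return rendered.replace("-none-", "none")


print(cleanup("folder/-none-/Title", remove_none=False))  # -> folder/none/Title
print(cleanup("folder/-none-/Title", remove_none=True))   # -> folder/Title
```

This matches the storage-path test added later in the diff, where the preview shows "folder/none/Something" by default and "folder/Something" once the setting is enabled.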
@@ -52,6 +52,33 @@ class FilePathTemplate(Template):
        return clean_filepath(original_render)


class PlaceholderString(str):
    """
    String subclass used as a sentinel for empty metadata values inside templates.

    - Renders as "-none-" to preserve existing filename cleaning logic.
    - Compares equal to either "-none-" or "none" so templates can check for either.
    - Evaluates to False so {% if correspondent %} behaves intuitively.
    """

    def __new__(cls, value: str = "-none-"):
        return super().__new__(cls, value)

    def __bool__(self) -> bool:
        return False

    def __eq__(self, other) -> bool:
        if isinstance(other, str) and other == "none":
            other = "-none-"
        return super().__eq__(other)

    def __ne__(self, other) -> bool:
        return not self.__eq__(other)


NO_VALUE_PLACEHOLDER = PlaceholderString("-none-")


_template_environment.undefined = _LogStrictUndefined

_template_environment.filters["get_cf_value"] = get_cf_value
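To make the sentinel's behaviour concrete, a short sketch (assuming the `PlaceholderString` class above is in scope):

```python
# Sketch only: how the sentinel behaves in templates and cleanup code.
p = PlaceholderString()

assert str(p) == "-none-"  # renders as the historical placeholder
assert p == "-none-"       # equal to the dashed form...
assert p == "none"         # ...and to the bare form used in templates
assert not p               # falsy, so {% if correspondent %} skips empty values
assert p != "Acme"         # __ne__ delegates to __eq__
```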
@@ -128,7 +155,7 @@ def get_added_date_context(document: Document) -> dict[str, str]:
def get_basic_metadata_context(
    document: Document,
    *,
    no_value_default: str,
    no_value_default: str = NO_VALUE_PLACEHOLDER,
) -> dict[str, str]:
    """
    Given a Document, constructs some basic information about it. If certain values are not set,
@@ -266,7 +293,7 @@ def validate_filepath_template_and_render(
    # Build the context dictionary
    context = (
        {"document": document}
        | get_basic_metadata_context(document, no_value_default="-none-")
        | get_basic_metadata_context(document, no_value_default=NO_VALUE_PLACEHOLDER)
        | get_creation_date_context(document)
        | get_added_date_context(document)
        | get_tags_context(tags_list)
@@ -4,6 +4,7 @@ from unittest import mock
from django.contrib.auth.models import Permission
from django.contrib.auth.models import User
from django.test import override_settings
from rest_framework import status
from rest_framework.test import APITestCase
@@ -334,6 +335,45 @@ class TestApiStoragePaths(DirectoriesMixin, APITestCase):
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.data, "path/Something")

    def test_test_storage_path_respects_none_placeholder_setting(self):
        """
        GIVEN:
            - A storage path template referencing an empty field
        WHEN:
            - Testing the template before and after enabling remove-none
        THEN:
            - The preview shows "none" by default and drops the placeholder when configured
        """
        document = Document.objects.create(
            mime_type="application/pdf",
            storage_path=self.sp1,
            title="Something",
            checksum="123",
        )
        payload = json.dumps(
            {
                "document": document.id,
                "path": "folder/{{ correspondent }}/{{ title }}",
            },
        )

        response = self.client.post(
            f"{self.ENDPOINT}test/",
            payload,
            content_type="application/json",
        )
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.assertEqual(response.data, "folder/none/Something")

        with override_settings(FILENAME_FORMAT_REMOVE_NONE=True):
            response = self.client.post(
                f"{self.ENDPOINT}test/",
                payload,
                content_type="application/json",
            )
            self.assertEqual(response.status_code, status.HTTP_200_OK)
            self.assertEqual(response.data, "folder/Something")


class TestBulkEditObjects(APITestCase):
    # See test_api_permissions.py for bulk tests on permissions
@@ -648,7 +648,7 @@ class TestApiUser(DirectoriesMixin, APITestCase):
        user1 = {
            "username": "testuser",
            "password": "test",
            "password": "areallysupersecretpassword235",
            "first_name": "Test",
            "last_name": "User",
        }
@@ -730,7 +730,7 @@ class TestApiUser(DirectoriesMixin, APITestCase):
            f"{self.ENDPOINT}{user1.pk}/",
            data={
                "first_name": "Updated Name 2",
                "password": "123xyz",
                "password": "newreallystrongpassword456",
            },
        )
@@ -192,6 +192,65 @@ class TestApiProfile(DirectoriesMixin, APITestCase):
        self.assertEqual(user.first_name, user_data["first_name"])
        self.assertEqual(user.last_name, user_data["last_name"])

    def test_update_profile_invalid_password_returns_field_error(self):
        """
        GIVEN:
            - Configured user
        WHEN:
            - API call is made to update profile with weak password
        THEN:
            - Profile update fails with password field error
        """

        user_data = {
            "email": "new@email.com",
            "password": "short",  # shorter than default validator threshold
            "first_name": "new first name",
            "last_name": "new last name",
        }

        response = self.client.patch(self.ENDPOINT, user_data)

        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertIn("password", response.data)
        self.assertIsInstance(response.data["password"], list)
        self.assertTrue(
            any(
                "too short" in message.lower() for message in response.data["password"]
            ),
        )

    def test_update_profile_placeholder_password_skips_validation(self):
        """
        GIVEN:
            - Configured user with existing password
        WHEN:
            - API call is made with the obfuscated placeholder password value
        THEN:
            - Profile is updated without changing the password or running validators
        """

        original_password = "orig-pass-12345"
        self.user.set_password(original_password)
        self.user.save()

        user_data = {
            "email": "new@email.com",
            "password": "*" * 12,  # matches obfuscated value from serializer
            "first_name": "new first name",
            "last_name": "new last name",
        }

        response = self.client.patch(self.ENDPOINT, user_data)

        self.assertEqual(response.status_code, status.HTTP_200_OK)

        user = User.objects.get(username=self.user.username)
        self.assertTrue(user.check_password(original_password))
        self.assertEqual(user.email, user_data["email"])
        self.assertEqual(user.first_name, user_data["first_name"])
        self.assertEqual(user.last_name, user_data["last_name"])

    def test_update_auth_token(self):
        """
        GIVEN:
@@ -1078,6 +1078,47 @@ class TestFilenameGeneration(DirectoriesMixin, TestCase):
            Path("SomeImportantNone/2020-07-25.pdf"),
        )

    @override_settings(
        FILENAME_FORMAT=(
            "{% if correspondent == 'none' %}none/{% endif %}"
            "{% if correspondent == '-none-' %}dash/{% endif %}"
            "{% if not correspondent %}false/{% endif %}"
            "{% if correspondent != 'none' %}notnoneyes/{% else %}notnoneno/{% endif %}"
            "{{ correspondent or 'missing' }}/{{ title }}"
        ),
    )
    def test_placeholder_matches_none_variants_and_false(self):
        """
        GIVEN:
            - Templates that compare against 'none', '-none-' and rely on truthiness
        WHEN:
            - A document has or lacks a correspondent
        THEN:
            - Empty placeholders behave like both strings and evaluate False
        """
        doc_without_correspondent = Document.objects.create(
            title="does not matter",
            mime_type="application/pdf",
            checksum="abc",
        )
        doc_with_correspondent = Document.objects.create(
            title="does not matter",
            mime_type="application/pdf",
            checksum="def",
            correspondent=Correspondent.objects.create(name="Acme"),
        )

        self.assertEqual(
            generate_filename(doc_without_correspondent),
            Path(
                "none/dash/false/notnoneno/missing/does not matter.pdf",
            ),
        )
        self.assertEqual(
            generate_filename(doc_with_correspondent),
            Path("notnoneyes/Acme/does not matter.pdf"),
        )

    @override_settings(
        FILENAME_FORMAT="{created_year_short}/{created_month_name_short}/{created_month_name}/{title}",
    )
@@ -23,6 +23,7 @@ from django.conf import settings
from django.contrib.auth.models import Group
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.core.cache import cache
from django.db import connections
from django.db.migrations.loader import MigrationLoader
from django.db.migrations.recorder import MigrationRecorder
@@ -51,7 +52,6 @@ from django.utils.timezone import make_aware
from django.utils.translation import get_language
from django.views import View
from django.views.decorators.cache import cache_control
from django.views.decorators.cache import cache_page
from django.views.decorators.http import condition
from django.views.decorators.http import last_modified
from django.views.generic import TemplateView
@@ -108,6 +108,7 @@ from documents.conditionals import thumbnail_last_modified
from documents.data_models import ConsumableDocument
from documents.data_models import DocumentMetadataOverrides
from documents.data_models import DocumentSource
from documents.file_handling import format_filename
from documents.filters import CorrespondentFilterSet
from documents.filters import CustomFieldFilterSet
from documents.filters import DocumentFilterSet
@@ -183,7 +184,6 @@ from documents.tasks import index_optimize
from documents.tasks import sanity_check
from documents.tasks import train_classifier
from documents.tasks import update_document_parent_tags
from documents.templating.filepath import validate_filepath_template_and_render
from documents.utils import get_boolean
from paperless import version
from paperless.celery import app as celery_app
@@ -2336,7 +2336,7 @@ class StoragePathViewSet(ModelViewSet, PermissionsAwareDocumentCountMixin):
        document = serializer.validated_data.get("document")
        path = serializer.validated_data.get("path")

        result = validate_filepath_template_and_render(path, document)
        result = format_filename(document, path)
        return Response(result)
@@ -2426,7 +2426,6 @@ class UiSettingsView(GenericAPIView):
    )


@method_decorator(cache_page(60 * 15), name="dispatch")
@extend_schema_view(
    get=extend_schema(
        description="Get the current version of the Paperless-NGX server",
@@ -2436,31 +2435,34 @@ class UiSettingsView(GenericAPIView):
    ),
)
class RemoteVersionView(GenericAPIView):
    cache_key = "remote_version_view_latest_release"

    def get(self, request, format=None):
        remote_version = "0.0.0"
        is_greater_than_current = False
        current_version = packaging_version.parse(version.__full_version_str__)
        try:
            resp = httpx.get(
                "https://api.github.com/repos/paperless-ngx/paperless-ngx/releases/latest",
                headers={"Accept": "application/json"},
            )
            resp.raise_for_status()
        remote_version = cache.get(self.cache_key)
        if remote_version is None:
            try:
                resp = httpx.get(
                    "https://api.github.com/repos/paperless-ngx/paperless-ngx/releases/latest",
                    headers={"Accept": "application/json"},
                )
                resp.raise_for_status()
                data = resp.json()
                remote_version = data["tag_name"]
                # Some early tags used ngx-x.y.z
                remote_version = remote_version.removeprefix("ngx-")
            except ValueError as e:
                logger.debug(f"An error occurred parsing remote version json: {e}")
            except httpx.HTTPError as e:
                logger.debug(f"An error occurred checking for available updates: {e}")
            except httpx.HTTPError as e:
                logger.debug(f"An error occurred checking for available updates: {e}")

            if remote_version:
                cache.set(self.cache_key, remote_version, 60 * 15)
            else:
                remote_version = "0.0.0"

        is_greater_than_current = (
            packaging_version.parse(
                remote_version,
            )
            > current_version
            packaging_version.parse(remote_version) > current_version
        )

        return Response(
@@ -9,6 +9,7 @@ from allauth.socialaccount.models import SocialApp
from django.contrib.auth.models import Group
from django.contrib.auth.models import Permission
from django.contrib.auth.models import User
from django.contrib.auth.password_validation import validate_password
from rest_framework import serializers
from rest_framework.authtoken.serializers import AuthTokenSerializer
@@ -19,6 +20,23 @@ from paperless_mail.serialisers import ObfuscatedPasswordField
logger = logging.getLogger("paperless.settings")


class PasswordValidationMixin:
    def _has_real_password(self, value: str | None) -> bool:
        return bool(value) and value.replace("*", "") != ""

    def validate_password(self, value: str) -> str:
        if not self._has_real_password(value):
            return value

        request = self.context.get("request") if hasattr(self, "context") else None
        user = self.instance or (
            request.user if request and hasattr(request, "user") else None
        )
        validate_password(value, user)  # raise ValidationError if invalid

        return value


class PaperlessAuthTokenSerializer(AuthTokenSerializer):
    code = serializers.CharField(
        label="MFA Code",
@@ -49,7 +67,7 @@ class PaperlessAuthTokenSerializer(AuthTokenSerializer):
        return attrs


class UserSerializer(serializers.ModelSerializer):
class UserSerializer(PasswordValidationMixin, serializers.ModelSerializer):
    password = ObfuscatedPasswordField(required=False)
    user_permissions = serializers.SlugRelatedField(
        many=True,
@@ -87,11 +105,11 @@ class UserSerializer(serializers.ModelSerializer):
        return obj.get_group_permissions()

    def update(self, instance, validated_data):
        if "password" in validated_data:
            if len(validated_data.get("password").replace("*", "")) > 0:
                instance.set_password(validated_data.get("password"))
                instance.save()
            validated_data.pop("password")
        password = validated_data.pop("password", None)
        if self._has_real_password(password):
            instance.set_password(password)
            instance.save()

        super().update(instance, validated_data)
        return instance
@@ -102,12 +120,7 @@ class UserSerializer(serializers.ModelSerializer):
        user_permissions = None
        if "user_permissions" in validated_data:
            user_permissions = validated_data.pop("user_permissions")
        password = None
        if (
            "password" in validated_data
            and len(validated_data.get("password").replace("*", "")) > 0
        ):
            password = validated_data.pop("password")
        password = validated_data.pop("password", None)
        user = User.objects.create(**validated_data)
        # set groups
        if groups:
@@ -116,7 +129,7 @@ class UserSerializer(serializers.ModelSerializer):
        if user_permissions:
            user.user_permissions.set(user_permissions)
        # set password
        if password:
        if self._has_real_password(password):
            user.set_password(password)
            user.save()
        return user
@@ -156,7 +169,7 @@ class SocialAccountSerializer(serializers.ModelSerializer):
        return "Unknown App"


class ProfileSerializer(serializers.ModelSerializer):
class ProfileSerializer(PasswordValidationMixin, serializers.ModelSerializer):
    email = serializers.EmailField(allow_blank=True, required=False)
    password = ObfuscatedPasswordField(required=False, allow_null=False)
    auth_token = serializers.SlugRelatedField(read_only=True, slug_field="key")
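A small sketch (not part of the change) of the convention the new `PasswordValidationMixin` relies on: the API echoes stored passwords back as a run of asterisks, so a value that is empty or all `*` means "leave the stored password alone", while anything else is run through Django's validators and hashed.

```python
# Sketch only: distinguishing a real password from the obfuscated placeholder.
def has_real_password(value: str | None) -> bool:
    return bool(value) and value.replace("*", "") != ""


assert has_real_password("newreallystrongpassword456")  # real value -> validate and set_password()
assert not has_real_password("************")            # obfuscated placeholder -> keep current password
assert not has_real_password("")                        # blank -> keep current password
assert not has_real_password(None)
```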
@@ -322,6 +322,7 @@ INSTALLED_APPS = [
    "paperless_tesseract.apps.PaperlessTesseractConfig",
    "paperless_text.apps.PaperlessTextConfig",
    "paperless_mail.apps.PaperlessMailConfig",
    "paperless_remote.apps.PaperlessRemoteParserConfig",
    "django.contrib.admin",
    "rest_framework",
    "rest_framework.authtoken",
@@ -1401,3 +1402,10 @@ WEBHOOKS_ALLOW_INTERNAL_REQUESTS = __get_boolean(
    "PAPERLESS_WEBHOOKS_ALLOW_INTERNAL_REQUESTS",
    "true",
)

###############################################################################
# Remote Parser                                                               #
###############################################################################
REMOTE_OCR_ENGINE = os.getenv("PAPERLESS_REMOTE_OCR_ENGINE")
REMOTE_OCR_API_KEY = os.getenv("PAPERLESS_REMOTE_OCR_API_KEY")
REMOTE_OCR_ENDPOINT = os.getenv("PAPERLESS_REMOTE_OCR_ENDPOINT")
@@ -197,10 +197,10 @@ class ProfileView(GenericAPIView):
        serializer.is_valid(raise_exception=True)
        user = self.request.user if hasattr(self.request, "user") else None

        if len(serializer.validated_data.get("password").replace("*", "")) > 0:
            user.set_password(serializer.validated_data.get("password"))
        password = serializer.validated_data.pop("password", None)
        if password and password.replace("*", ""):
            user.set_password(password)
            user.save()
        serializer.validated_data.pop("password")

        for key, value in serializer.validated_data.items():
            setattr(user, key, value)
@@ -103,6 +103,9 @@ class PaperlessMailOAuth2Manager:
                refresh_token=account.refresh_token,
            ),
        )
        if "refresh_token" in result:
            # Outlook returns a new refresh token on refresh, Gmail does not
            account.refresh_token = result["refresh_token"]
        account.password = result["access_token"]
        account.expiration = timezone.now() + timedelta(
            seconds=result["expires_in"],
src/paperless_remote/__init__.py (new file, 4 lines)
@@ -0,0 +1,4 @@
# this is here so that django finds the checks.
from paperless_remote.checks import check_remote_parser_configured

__all__ = ["check_remote_parser_configured"]
src/paperless_remote/apps.py (new file, 14 lines)
@@ -0,0 +1,14 @@
from django.apps import AppConfig

from paperless_remote.signals import remote_consumer_declaration


class PaperlessRemoteParserConfig(AppConfig):
    name = "paperless_remote"

    def ready(self):
        from documents.signals import document_consumer_declaration

        document_consumer_declaration.connect(remote_consumer_declaration)

        AppConfig.ready(self)
src/paperless_remote/checks.py (new file, 17 lines)
@@ -0,0 +1,17 @@
from django.conf import settings
from django.core.checks import Error
from django.core.checks import register


@register()
def check_remote_parser_configured(app_configs, **kwargs):
    if settings.REMOTE_OCR_ENGINE == "azureai" and not (
        settings.REMOTE_OCR_ENDPOINT and settings.REMOTE_OCR_API_KEY
    ):
        return [
            Error(
                "Azure AI remote parser requires endpoint and API key to be configured.",
            ),
        ]

    return []
src/paperless_remote/parsers.py (new file, 113 lines)
@@ -0,0 +1,113 @@
from pathlib import Path

from django.conf import settings

from paperless_tesseract.parsers import RasterisedDocumentParser


class RemoteEngineConfig:
    def __init__(
        self,
        engine: str,
        api_key: str | None = None,
        endpoint: str | None = None,
    ):
        self.engine = engine
        self.api_key = api_key
        self.endpoint = endpoint

    def engine_is_valid(self):
        valid = self.engine in ["azureai"] and self.api_key is not None
        if self.engine == "azureai":
            valid = valid and self.endpoint is not None
        return valid


class RemoteDocumentParser(RasterisedDocumentParser):
    """
    This parser uses a remote OCR engine to parse documents. Currently, it supports Azure AI Vision
    as this is the only service that provides a remote OCR API with text-embedded PDF output.
    """

    logging_name = "paperless.parsing.remote"

    def get_settings(self) -> RemoteEngineConfig:
        """
        Returns the configuration for the remote OCR engine, loaded from Django settings.
        """
        return RemoteEngineConfig(
            engine=settings.REMOTE_OCR_ENGINE,
            api_key=settings.REMOTE_OCR_API_KEY,
            endpoint=settings.REMOTE_OCR_ENDPOINT,
        )

    def supported_mime_types(self):
        if self.settings.engine_is_valid():
            return {
                "application/pdf": ".pdf",
                "image/png": ".png",
                "image/jpeg": ".jpg",
                "image/tiff": ".tiff",
                "image/bmp": ".bmp",
                "image/gif": ".gif",
                "image/webp": ".webp",
            }
        else:
            return {}

    def azure_ai_vision_parse(
        self,
        file: Path,
    ) -> str | None:
        """
        Uses Azure AI Vision to parse the document and return the text content.
        It requests a searchable PDF output with embedded text.
        The PDF is saved to the archive_path attribute.
        Returns the text content extracted from the document.
        If the parsing fails, it returns None.
        """
        from azure.ai.documentintelligence import DocumentIntelligenceClient
        from azure.ai.documentintelligence.models import AnalyzeDocumentRequest
        from azure.ai.documentintelligence.models import AnalyzeOutputOption
        from azure.ai.documentintelligence.models import DocumentContentFormat
        from azure.core.credentials import AzureKeyCredential

        client = DocumentIntelligenceClient(
            endpoint=self.settings.endpoint,
            credential=AzureKeyCredential(self.settings.api_key),
        )

        with file.open("rb") as f:
            analyze_request = AnalyzeDocumentRequest(bytes_source=f.read())
            poller = client.begin_analyze_document(
                model_id="prebuilt-read",
                body=analyze_request,
                output_content_format=DocumentContentFormat.TEXT,
                output=[AnalyzeOutputOption.PDF],  # request searchable PDF output
                content_type="application/json",
            )

        poller.wait()
        result_id = poller.details["operation_id"]
        result = poller.result()

        # Download the PDF with embedded text
        self.archive_path = self.tempdir / "archive.pdf"
        with self.archive_path.open("wb") as f:
            for chunk in client.get_analyze_result_pdf(
                model_id="prebuilt-read",
                result_id=result_id,
            ):
                f.write(chunk)

        client.close()
        return result.content

    def parse(self, document_path: Path, mime_type, file_name=None):
        if not self.settings.engine_is_valid():
            self.log.warning(
                "No valid remote parser engine is configured, content will be empty.",
            )
            self.text = ""
        elif self.settings.engine == "azureai":
            self.text = self.azure_ai_vision_parse(document_path)
src/paperless_remote/signals.py (new file, 18 lines)
@@ -0,0 +1,18 @@
def get_parser(*args, **kwargs):
    from paperless_remote.parsers import RemoteDocumentParser

    return RemoteDocumentParser(*args, **kwargs)


def get_supported_mime_types():
    from paperless_remote.parsers import RemoteDocumentParser

    return RemoteDocumentParser(None).supported_mime_types()


def remote_consumer_declaration(sender, **kwargs):
    return {
        "parser": get_parser,
        "weight": 5,
        "mime_types": get_supported_mime_types(),
    }
src/paperless_remote/tests/__init__.py (new file, empty)
src/paperless_remote/tests/samples/simple-digital.pdf (new binary file, not shown)
src/paperless_remote/tests/test_checks.py (new file, 24 lines)
@@ -0,0 +1,24 @@
from unittest import TestCase

from django.test import override_settings

from paperless_remote import check_remote_parser_configured


class TestChecks(TestCase):
    @override_settings(REMOTE_OCR_ENGINE=None)
    def test_no_engine(self):
        msgs = check_remote_parser_configured(None)
        self.assertEqual(len(msgs), 0)

    @override_settings(REMOTE_OCR_ENGINE="azureai")
    @override_settings(REMOTE_OCR_API_KEY="somekey")
    @override_settings(REMOTE_OCR_ENDPOINT=None)
    def test_azure_no_endpoint(self):
        msgs = check_remote_parser_configured(None)
        self.assertEqual(len(msgs), 1)
        self.assertTrue(
            msgs[0].msg.startswith(
                "Azure AI remote parser requires endpoint and API key to be configured.",
            ),
        )
src/paperless_remote/tests/test_parser.py (new file, 101 lines)
@@ -0,0 +1,101 @@
import uuid
from pathlib import Path
from unittest import mock

from django.test import TestCase
from django.test import override_settings

from documents.tests.utils import DirectoriesMixin
from documents.tests.utils import FileSystemAssertsMixin
from paperless_remote.parsers import RemoteDocumentParser
from paperless_remote.signals import get_parser


class TestParser(DirectoriesMixin, FileSystemAssertsMixin, TestCase):
    SAMPLE_FILES = Path(__file__).resolve().parent / "samples"

    def assertContainsStrings(self, content: str, strings: list[str]):
        # Asserts that all strings appear in content, in the given order.
        indices = []
        for s in strings:
            if s in content:
                indices.append(content.index(s))
            else:
                self.fail(f"'{s}' is not in '{content}'")
        self.assertListEqual(indices, sorted(indices))

    @mock.patch("paperless_tesseract.parsers.run_subprocess")
    @mock.patch("azure.ai.documentintelligence.DocumentIntelligenceClient")
    def test_get_text_with_azure(self, mock_client_cls, mock_subprocess):
        # Arrange mock Azure client
        mock_client = mock.Mock()
        mock_client_cls.return_value = mock_client

        # Simulate poller result and its `.details`
        mock_poller = mock.Mock()
        mock_poller.wait.return_value = None
        mock_poller.details = {"operation_id": "fake-op-id"}
        mock_client.begin_analyze_document.return_value = mock_poller
        mock_poller.result.return_value.content = "This is a test document."

        # Return dummy PDF bytes
        mock_client.get_analyze_result_pdf.return_value = [
            b"%PDF-",
            b"1.7 ",
            b"FAKEPDF",
        ]

        # Simulate pdftotext by writing dummy text to sidecar file
        def fake_run(cmd, *args, **kwargs):
            with Path(cmd[-1]).open("w", encoding="utf-8") as f:
                f.write("This is a test document.")

        mock_subprocess.side_effect = fake_run

        with override_settings(
            REMOTE_OCR_ENGINE="azureai",
            REMOTE_OCR_API_KEY="somekey",
            REMOTE_OCR_ENDPOINT="https://endpoint.cognitiveservices.azure.com",
        ):
            parser = get_parser(uuid.uuid4())
            parser.parse(
                self.SAMPLE_FILES / "simple-digital.pdf",
                "application/pdf",
            )

            self.assertContainsStrings(
                parser.text.strip(),
                ["This is a test document."],
            )

    @override_settings(
        REMOTE_OCR_ENGINE="azureai",
        REMOTE_OCR_API_KEY="key",
        REMOTE_OCR_ENDPOINT="https://endpoint.cognitiveservices.azure.com",
    )
    def test_supported_mime_types_valid_config(self):
        parser = RemoteDocumentParser(uuid.uuid4())
        expected_types = {
            "application/pdf": ".pdf",
            "image/png": ".png",
            "image/jpeg": ".jpg",
            "image/tiff": ".tiff",
            "image/bmp": ".bmp",
            "image/gif": ".gif",
            "image/webp": ".webp",
        }
        self.assertEqual(parser.supported_mime_types(), expected_types)

    def test_supported_mime_types_invalid_config(self):
        parser = get_parser(uuid.uuid4())
        self.assertEqual(parser.supported_mime_types(), {})

    @override_settings(
        REMOTE_OCR_ENGINE=None,
        REMOTE_OCR_API_KEY=None,
        REMOTE_OCR_ENDPOINT=None,
    )
    def test_parse_with_invalid_config(self):
        parser = get_parser(uuid.uuid4())
        parser.parse(self.SAMPLE_FILES / "simple-digital.pdf", "application/pdf")
        self.assertEqual(parser.text, "")
uv.lock (generated, 39 lines changed)
@@ -95,6 +95,34 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/02/ff/1175b0b7371e46244032d43a56862d0af455823b5280a50c63d99cc50f18/automat-25.4.16-py3-none-any.whl", hash = "sha256:04e9bce696a8d5671ee698005af6e5a9fa15354140a87f4870744604dcdd3ba1", size = 42842, upload-time = "2025-04-16T20:12:14.447Z" },
]

[[package]]
name = "azure-ai-documentintelligence"
version = "1.0.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "azure-core", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "isodate", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "typing-extensions", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/44/7b/8115cd713e2caa5e44def85f2b7ebd02a74ae74d7113ba20bdd41fd6dd80/azure_ai_documentintelligence-1.0.2.tar.gz", hash = "sha256:4d75a2513f2839365ebabc0e0e1772f5601b3a8c9a71e75da12440da13b63484", size = 170940 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d9/75/c9ec040f23082f54ffb1977ff8f364c2d21c79a640a13d1c1809e7fd6b1a/azure_ai_documentintelligence-1.0.2-py3-none-any.whl", hash = "sha256:e1fb446abbdeccc9759d897898a0fe13141ed29f9ad11fc705f951925822ed59", size = 106005 },
]

[[package]]
name = "azure-core"
version = "1.33.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "requests", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "six", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "typing-extensions", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/75/aa/7c9db8edd626f1a7d99d09ef7926f6f4fb34d5f9fa00dc394afdfe8e2a80/azure_core-1.33.0.tar.gz", hash = "sha256:f367aa07b5e3005fec2c1e184b882b0b039910733907d001c20fb08ebb8c0eb9", size = 295633 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/07/b7/76b7e144aa53bd206bf1ce34fa75350472c3f69bf30e5c8c18bc9881035d/azure_core-1.33.0-py3-none-any.whl", hash = "sha256:9b5b6d0223a1d38c37500e6971118c1e0f13f54951e6893968b38910bc9cda8f", size = 207071 },
]

[[package]]
name = "babel"
version = "2.17.0"
@@ -1451,6 +1479,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/c7/fc/4e5a141c3f7c7bed550ac1f69e599e92b6be449dd4677ec09f325cad0955/inotifyrecursive-0.3.5-py3-none-any.whl", hash = "sha256:7e5f4a2e1dc2bef0efa3b5f6b339c41fb4599055a2b54909d020e9e932cc8d2f", size = 8009, upload-time = "2020-11-20T12:38:46.981Z" },
]

[[package]]
name = "isodate"
version = "0.7.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/54/4d/e940025e2ce31a8ce1202635910747e5a87cc3a6a6bb2d00973375014749/isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6", size = 29705 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320 },
]

[[package]]
name = "jinja2"
version = "3.1.6"
@@ -2118,6 +2155,7 @@ name = "paperless-ngx"
version = "2.19.5"
source = { virtual = "." }
dependencies = [
    { name = "azure-ai-documentintelligence", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "babel", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "bleach", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
    { name = "celery", extra = ["redis"], marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
@@ -2254,6 +2292,7 @@ typing = [

[package.metadata]
requires-dist = [
    { name = "azure-ai-documentintelligence", specifier = ">=1.0.2" },
    { name = "babel", specifier = ">=2.17" },
    { name = "bleach", specifier = "~=6.2.0" },
    { name = "celery", extras = ["redis"], specifier = "~=5.5.1" },