Compare commits


4 Commits

Author SHA1 Message Date
GitHub Actions
8686f264cf Auto translate strings 2025-10-13 22:26:49 +00:00
shamoon
f6c004183e Feature: Advanced Workflow Trigger Filters (#11029) 2025-10-13 22:23:56 +00:00
dependabot[bot]
d394053ddc docker(deps): Bump astral-sh/uv from 0.8.22-python3.12-bookworm-slim to 0.9.2-python3.12-bookworm-slim (#11052)
Bumps [astral-sh/uv](https://github.com/astral-sh/uv) from 0.8.22-python3.12-bookworm-slim to 0.9.2-python3.12-bookworm-slim.
- [Release notes](https://github.com/astral-sh/uv/releases)
- [Changelog](https://github.com/astral-sh/uv/blob/main/CHANGELOG.md)
- [Commits](https://github.com/astral-sh/uv/compare/0.8.22...0.9.2)

---
updated-dependencies:
- dependency-name: astral-sh/uv
  dependency-version: 0.9.2-python3.12-bookworm-slim
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 14:28:59 -07:00
GitHub Actions
a36c28418c Auto translate strings 2025-10-13 20:19:53 +00:00
21 changed files with 2707 additions and 490 deletions

View File

@@ -32,7 +32,7 @@ RUN set -eux \
 # Purpose: Installs s6-overlay and rootfs
 # Comments:
 #  - Don't leave anything extra in here either
-FROM ghcr.io/astral-sh/uv:0.8.22-python3.12-bookworm-slim AS s6-overlay-base
+FROM ghcr.io/astral-sh/uv:0.9.2-python3.12-bookworm-slim AS s6-overlay-base
 WORKDIR /usr/src/s6

View File

@@ -462,15 +462,24 @@ flowchart TD
 Workflows allow you to filter by:
 
 - Source, e.g. documents uploaded via consume folder, API (& the web UI) and mail fetch
-- File name, including wildcards e.g. \*.pdf will apply to all pdfs
+- File name, including wildcards e.g. \*.pdf will apply to all pdfs.
 - File path, including wildcards. Note that enabling `PAPERLESS_CONSUMER_RECURSIVE` would allow, for
   example, automatically assigning documents to different owners based on the upload directory.
 - Mail rule. Choosing this option will force 'mail fetch' to be the workflow source.
 - Content matching (`Added`, `Updated` and `Scheduled` triggers only). Filter document content using the matching settings.
-- Tags (`Added`, `Updated` and `Scheduled` triggers only). Filter for documents with any of the specified tags
-- Document type (`Added`, `Updated` and `Scheduled` triggers only). Filter documents with this doc type
-- Correspondent (`Added`, `Updated` and `Scheduled` triggers only). Filter documents with this correspondent
-- Storage path (`Added`, `Updated` and `Scheduled` triggers only). Filter documents with this storage path
+
+There are also 'advanced' filters available for `Added`, `Updated` and `Scheduled` triggers:
+
+- Any Tags: Filter for documents with any of the specified tags.
+- All Tags: Filter for documents with all of the specified tags.
+- No Tags: Filter for documents with none of the specified tags.
+- Document type: Filter documents with this document type.
+- Not Document types: Filter documents without any of these document types.
+- Correspondent: Filter documents with this correspondent.
+- Not Correspondents: Filter documents without any of these correspondents.
+- Storage path: Filter documents with this storage path.
+- Not Storage paths: Filter documents without any of these storage paths.
+- Custom field query: Filter documents with a custom field query (the same as used for the document list filters).
 
 ### Workflow Actions
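For reference, the custom field query filter listed above stores its value as a serialized JSON expression, the same format the document list filters and the tests in this change use. A minimal sketch in TypeScript (the field id and match value are hypothetical, chosen to mirror the expression serialized in the spec file):

```typescript
// Hypothetical sketch: build the serialized value used by the
// 'Matches custom field query' trigger filter. The expression shape is
// [logical operator, [[fieldId, operator, value], ...]].
const filterCustomFieldQuery = JSON.stringify([
  'AND',
  [[1, 'exact', 'test']], // custom field 1 must exactly equal 'test'
])
// -> '["AND",[[1,"exact","test"]]]'
```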

View File

@@ -42,7 +42,7 @@ dependencies = [
"drf-spectacular~=0.28", "drf-spectacular~=0.28",
"drf-spectacular-sidecar~=2025.9.1", "drf-spectacular-sidecar~=2025.9.1",
"drf-writable-nested~=0.7.1", "drf-writable-nested~=0.7.1",
"filelock~=3.20.0", "filelock~=3.19.1",
"flower~=2.0.1", "flower~=2.0.1",
"gotenberg-client~=0.11.0", "gotenberg-client~=0.11.0",
"httpx-oauth~=0.16", "httpx-oauth~=0.16",
@@ -115,8 +115,8 @@ testing = [
lint = [ lint = [
"pre-commit~=4.3.0", "pre-commit~=4.3.0",
"pre-commit-uv~=4.2.0", "pre-commit-uv~=4.1.3",
"ruff~=0.14.0", "ruff~=0.13.0",
] ]
typing = [ typing = [

File diff suppressed because it is too large.

View File

@@ -1,3 +1,4 @@
+@if (useDropdown) {
 <div class="btn-group w-100" role="group" ngbDropdown #dropdown="ngbDropdown" (openChange)="onOpenChange($event)" [popperOptions]="popperOptions">
 <button class="btn btn-sm btn-outline-primary" id="dropdown_toggle" ngbDropdownToggle [disabled]="disabled">
 <i-bs name="{{icon}}"></i-bs>
@@ -7,8 +8,16 @@
 }
 </button>
 <div class="px-3 shadow" ngbDropdownMenu attr.aria-labelledby="dropdown_{{name}}">
+<ng-container *ngTemplateOutlet="list; context: { queries: selectionModel.queries }"></ng-container>
+</div>
+</div>
+} @else {
+<ng-container *ngTemplateOutlet="list; context: { queries: selectionModel.queries }"></ng-container>
+}
+<ng-template #list let-queries="queries">
 <div class="list-group list-group-flush">
-@for (element of selectionModel.queries; track element.id; let i = $index) {
+@for (element of queries; track element.id; let i = $index) {
 <div class="list-group-item px-0 d-flex flex-nowrap">
 @switch (element.type) {
 @case (CustomFieldQueryComponentType.Atom) {
@@ -21,8 +30,7 @@
 </div>
 }
 </div>
-</div>
-</div>
+</ng-template>
 <ng-template #comparisonValueTemplate let-atom="atom">
 @if (getCustomFieldByID(atom.field)?.data_type === CustomFieldDataType.Date) {

View File

@@ -120,6 +120,12 @@ export class CustomFieldQueriesModel {
 })
 }
+addInitialAtom() {
+this.addAtom(
+new CustomFieldQueryAtom([null, CustomFieldQueryOperator.Exists, 'true'])
+)
+}
 private findElement(
 queryElement: CustomFieldQueryElement,
 elements: any[]
@@ -206,6 +212,9 @@ export class CustomFieldsQueryDropdownComponent extends LoadingComponentWithPerm
 @Input()
 applyOnClose = false
+@Input()
+useDropdown: boolean = true
 get name(): string {
 return this.title ? this.title.replace(/\s/g, '_').toLowerCase() : null
 }
@@ -258,13 +267,7 @@
 public onOpenChange(open: boolean) {
 if (open) {
 if (this.selectionModel.queries.length === 0) {
-this.selectionModel.addAtom(
-new CustomFieldQueryAtom([
-null,
-CustomFieldQueryOperator.Exists,
-'true',
-])
-)
+this.selectionModel.addInitialAtom()
 }
 if (
 this.selectionModel.queries.length === 1 &&

View File

@@ -156,31 +156,97 @@
<p class="small" i18n>Trigger for documents that match <em>all</em> filters specified below.</p> <p class="small" i18n>Trigger for documents that match <em>all</em> filters specified below.</p>
<div class="row"> <div class="row">
<div class="col"> <div class="col">
<pngx-input-text i18n-title title="Filter filename" formControlName="filter_filename" i18n-hint hint="Apply to documents that match this filename. Wildcards such as *.pdf or *invoice* are allowed. Case insensitive." [error]="error?.filter_filename"></pngx-input-text> <pngx-input-text i18n-title title="Filter filename" formControlName="filter_filename" horizontal="true" i18n-hint hint="Apply to documents that match this filename. Wildcards such as *.pdf or *invoice* are allowed. Case insensitive." [error]="error?.filter_filename"></pngx-input-text>
@if (formGroup.get('type').value === WorkflowTriggerType.Consumption) { @if (formGroup.get('type').value === WorkflowTriggerType.Consumption) {
<pngx-input-select i18n-title title="Filter sources" [items]="sourceOptions" [multiple]="true" formControlName="sources" [error]="error?.sources"></pngx-input-select> <pngx-input-select i18n-title title="Filter sources" [items]="sourceOptions" horizontal="true" [multiple]="true" formControlName="sources" [error]="error?.sources"></pngx-input-select>
<pngx-input-text i18n-title title="Filter path" formControlName="filter_path" i18n-hint hint="Apply to documents that match this path. Wildcards specified as * are allowed. Case-normalized.</a>" [error]="error?.filter_path"></pngx-input-text> <pngx-input-text i18n-title title="Filter path" formControlName="filter_path" horizontal="true" i18n-hint hint="Apply to documents that match this path. Wildcards specified as * are allowed. Case-normalized.</a>" [error]="error?.filter_path"></pngx-input-text>
<pngx-input-select i18n-title title="Filter mail rule" [items]="mailRules" [allowNull]="true" formControlName="filter_mailrule" i18n-hint hint="Apply to documents consumed via this mail rule." [error]="error?.filter_mailrule"></pngx-input-select> <pngx-input-select i18n-title title="Filter mail rule" [items]="mailRules" horizontal="true" [allowNull]="true" formControlName="filter_mailrule" i18n-hint hint="Apply to documents consumed via this mail rule." [error]="error?.filter_mailrule"></pngx-input-select>
} }
@if (formGroup.get('type').value === WorkflowTriggerType.DocumentAdded || formGroup.get('type').value === WorkflowTriggerType.DocumentUpdated || formGroup.get('type').value === WorkflowTriggerType.Scheduled) { @if (formGroup.get('type').value === WorkflowTriggerType.DocumentAdded || formGroup.get('type').value === WorkflowTriggerType.DocumentUpdated || formGroup.get('type').value === WorkflowTriggerType.Scheduled) {
<pngx-input-select i18n-title title="Content matching algorithm" [items]="getMatchingAlgorithms()" formControlName="matching_algorithm"></pngx-input-select> <pngx-input-select i18n-title title="Content matching algorithm" horizontal="true" [items]="getMatchingAlgorithms()" formControlName="matching_algorithm"></pngx-input-select>
@if (patternRequired) { @if (matchingPatternRequired(formGroup)) {
<pngx-input-text i18n-title title="Content matching pattern" formControlName="match" [error]="error?.match"></pngx-input-text> <pngx-input-text i18n-title title="Content matching pattern" horizontal="true" formControlName="match" [error]="error?.match"></pngx-input-text>
} }
@if (patternRequired) { @if (matchingPatternRequired(formGroup)) {
<pngx-input-check i18n-title title="Case insensitive" formControlName="is_insensitive"></pngx-input-check> <pngx-input-check i18n-title title="Case insensitive" horizontal="true" formControlName="is_insensitive"></pngx-input-check>
} }
} }
</div> </div>
</div>
@if (formGroup.get('type').value === WorkflowTriggerType.DocumentAdded || formGroup.get('type').value === WorkflowTriggerType.DocumentUpdated || formGroup.get('type').value === WorkflowTriggerType.Scheduled) { @if (formGroup.get('type').value === WorkflowTriggerType.DocumentAdded || formGroup.get('type').value === WorkflowTriggerType.DocumentUpdated || formGroup.get('type').value === WorkflowTriggerType.Scheduled) {
<div class="col-md-6"> <div class="row mt-3">
<pngx-input-tags [allowCreate]="false" i18n-title title="Has any of tags" formControlName="filter_has_tags"></pngx-input-tags> <div class="col">
<pngx-input-select i18n-title title="Has correspondent" [items]="correspondents" [allowNull]="true" formControlName="filter_has_correspondent"></pngx-input-select> <div class="trigger-filters mb-3">
<pngx-input-select i18n-title title="Has document type" [items]="documentTypes" [allowNull]="true" formControlName="filter_has_document_type"></pngx-input-select> <div class="d-flex align-items-center">
<pngx-input-select i18n-title title="Has storage path" [items]="storagePaths" [allowNull]="true" formControlName="filter_has_storage_path"></pngx-input-select> <label class="form-label mb-0" i18n>Advanced Filters</label>
<button
type="button"
class="btn btn-sm btn-outline-primary ms-auto"
(click)="addFilter(formGroup)"
[disabled]="!canAddFilter(formGroup)"
>
<i-bs name="plus-circle"></i-bs>&nbsp;<span i18n>Add filter</span>
</button>
</div>
<ul class="mt-2 list-group filters" formArrayName="filters">
@if (getFiltersFormArray(formGroup).length === 0) {
<p class="text-muted small" i18n>No advanced workflow filters defined.</p>
}
@for (filter of getFiltersFormArray(formGroup).controls; track filter; let filterIndex = $index) {
<li [formGroupName]="filterIndex" class="list-group-item">
<div class="d-flex align-items-center gap-2">
<div class="w-25">
<pngx-input-select
i18n-title
[items]="getFilterTypeOptions(formGroup, filterIndex)"
formControlName="type"
[allowNull]="false"
></pngx-input-select>
</div>
<div class="flex-grow-1">
@if (isTagsFilter(filter.get('type').value)) {
<pngx-input-tags
[allowCreate]="false"
[title]="null"
formControlName="values"
></pngx-input-tags>
} @else if (
isCustomFieldQueryFilter(filter.get('type').value)
) {
<pngx-custom-fields-query-dropdown
[selectionModel]="getCustomFieldQueryModel(filter)"
(selectionModelChange)="onCustomFieldQuerySelectionChange(filter, $event)"
[useDropdown]="false"
></pngx-custom-fields-query-dropdown>
@if (!isCustomFieldQueryValid(filter)) {
<div class="text-danger small" i18n>
Complete the custom field query configuration.
</div> </div>
} }
} @else {
<pngx-input-select
[items]="getFilterSelectItems(filter.get('type').value)"
[allowNull]="true"
[multiple]="isSelectMultiple(filter.get('type').value)"
formControlName="values"
></pngx-input-select>
}
</div> </div>
<button
type="button"
class="btn btn-link text-danger p-0"
(click)="removeFilter(formGroup, filterIndex)"
>
<i-bs name="trash"></i-bs><span class="ms-1" i18n>Delete</span>
</button>
</div>
</li>
}
</ul>
</div>
</div>
</div>
}
</div> </div>
</ng-template> </ng-template>

View File

@@ -7,3 +7,7 @@
 .accordion-button {
 font-size: 1rem;
 }
+
+:host ::ng-deep .filters .paperless-input-select.mb-3 {
+margin-bottom: 0 !important;
+}

View File

@@ -11,8 +11,14 @@ import {
 import { NgbActiveModal, NgbModule } from '@ng-bootstrap/ng-bootstrap'
 import { NgSelectModule } from '@ng-select/ng-select'
 import { of } from 'rxjs'
+import { CustomFieldQueriesModel } from 'src/app/components/common/custom-fields-query-dropdown/custom-fields-query-dropdown.component'
 import { CustomFieldDataType } from 'src/app/data/custom-field'
-import { MATCHING_ALGORITHMS, MATCH_AUTO } from 'src/app/data/matching-model'
+import { CustomFieldQueryLogicalOperator } from 'src/app/data/custom-field-query'
+import {
+MATCHING_ALGORITHMS,
+MATCH_AUTO,
+MATCH_NONE,
+} from 'src/app/data/matching-model'
 import { Workflow } from 'src/app/data/workflow'
 import {
 WorkflowAction,
@@ -31,6 +37,7 @@ import { DocumentTypeService } from 'src/app/services/rest/document-type.service
 import { MailRuleService } from 'src/app/services/rest/mail-rule.service'
 import { StoragePathService } from 'src/app/services/rest/storage-path.service'
 import { SettingsService } from 'src/app/services/settings.service'
+import { CustomFieldQueryExpression } from 'src/app/utils/custom-field-query-element'
 import { ConfirmButtonComponent } from '../../confirm-button/confirm-button.component'
 import { NumberComponent } from '../../input/number/number.component'
 import { PermissionsGroupComponent } from '../../input/permissions/permissions-group/permissions-group.component'
@@ -43,6 +50,7 @@ import { EditDialogMode } from '../edit-dialog.component'
 import {
 DOCUMENT_SOURCE_OPTIONS,
 SCHEDULE_DATE_FIELD_OPTIONS,
+TriggerFilterType,
 WORKFLOW_ACTION_OPTIONS,
 WORKFLOW_TYPE_OPTIONS,
 WorkflowEditDialogComponent,
@@ -375,6 +383,562 @@ describe('WorkflowEditDialogComponent', () => {
 expect(component.objectForm.get('actions').value[0].webhook).toBeNull()
 })
it('should require matching pattern when algorithm is not none', () => {
const triggerGroup = new FormGroup({
matching_algorithm: new FormControl(MATCH_AUTO),
match: new FormControl(''),
})
expect(component.matchingPatternRequired(triggerGroup)).toBe(true)
triggerGroup.get('matching_algorithm').setValue(MATCHING_ALGORITHMS[0].id)
expect(component.matchingPatternRequired(triggerGroup)).toBe(true)
triggerGroup.get('matching_algorithm').setValue(MATCH_NONE)
expect(component.matchingPatternRequired(triggerGroup)).toBe(false)
})
it('should map filter builder values into trigger filters on save', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0)
component.addFilter(triggerGroup as FormGroup)
component.addFilter(triggerGroup as FormGroup)
component.addFilter(triggerGroup as FormGroup)
const filters = component.getFiltersFormArray(triggerGroup as FormGroup)
expect(filters.length).toBe(3)
filters.at(0).get('values').setValue([1])
filters.at(1).get('values').setValue([2, 3])
filters.at(2).get('values').setValue([4])
const addFilterOfType = (type: TriggerFilterType) => {
const newFilter = component.addFilter(triggerGroup as FormGroup)
newFilter.get('type').setValue(type)
return newFilter
}
const correspondentIs = addFilterOfType(TriggerFilterType.CorrespondentIs)
correspondentIs.get('values').setValue(1)
const correspondentNot = addFilterOfType(TriggerFilterType.CorrespondentNot)
correspondentNot.get('values').setValue([1])
const documentTypeIs = addFilterOfType(TriggerFilterType.DocumentTypeIs)
documentTypeIs.get('values').setValue(1)
const documentTypeNot = addFilterOfType(TriggerFilterType.DocumentTypeNot)
documentTypeNot.get('values').setValue([1])
const storagePathIs = addFilterOfType(TriggerFilterType.StoragePathIs)
storagePathIs.get('values').setValue(1)
const storagePathNot = addFilterOfType(TriggerFilterType.StoragePathNot)
storagePathNot.get('values').setValue([1])
const customFieldFilter = addFilterOfType(
TriggerFilterType.CustomFieldQuery
)
const customFieldQuery = JSON.stringify(['AND', [[1, 'exact', 'test']]])
customFieldFilter.get('values').setValue(customFieldQuery)
const formValues = component['getFormValues']()
expect(formValues.triggers[0].filter_has_tags).toEqual([1])
expect(formValues.triggers[0].filter_has_all_tags).toEqual([2, 3])
expect(formValues.triggers[0].filter_has_not_tags).toEqual([4])
expect(formValues.triggers[0].filter_has_correspondent).toEqual(1)
expect(formValues.triggers[0].filter_has_not_correspondents).toEqual([1])
expect(formValues.triggers[0].filter_has_document_type).toEqual(1)
expect(formValues.triggers[0].filter_has_not_document_types).toEqual([1])
expect(formValues.triggers[0].filter_has_storage_path).toEqual(1)
expect(formValues.triggers[0].filter_has_not_storage_paths).toEqual([1])
expect(formValues.triggers[0].filter_custom_field_query).toEqual(
customFieldQuery
)
expect(formValues.triggers[0].filters).toBeUndefined()
})
it('should ignore empty and null filter values when mapping filters', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
const tagsFilter = component.addFilter(triggerGroup)
tagsFilter.get('type').setValue(TriggerFilterType.TagsAny)
tagsFilter.get('values').setValue([])
const correspondentFilter = component.addFilter(triggerGroup)
correspondentFilter.get('type').setValue(TriggerFilterType.CorrespondentIs)
correspondentFilter.get('values').setValue(null)
const formValues = component['getFormValues']()
expect(formValues.triggers[0].filter_has_tags).toEqual([])
expect(formValues.triggers[0].filter_has_correspondent).toBeNull()
})
it('should derive single select filters from array values', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
const addFilterOfType = (type: TriggerFilterType, value: any) => {
const filter = component.addFilter(triggerGroup)
filter.get('type').setValue(type)
filter.get('values').setValue(value)
}
addFilterOfType(TriggerFilterType.CorrespondentIs, [5])
addFilterOfType(TriggerFilterType.DocumentTypeIs, [6])
addFilterOfType(TriggerFilterType.StoragePathIs, [7])
const formValues = component['getFormValues']()
expect(formValues.triggers[0].filter_has_correspondent).toEqual(5)
expect(formValues.triggers[0].filter_has_document_type).toEqual(6)
expect(formValues.triggers[0].filter_has_storage_path).toEqual(7)
})
it('should convert multi-value filter values when aggregating filters', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
const setFilter = (type: TriggerFilterType, value: number): void => {
const filter = component.addFilter(triggerGroup) as FormGroup
filter.get('type').setValue(type)
filter.get('values').setValue(value)
}
setFilter(TriggerFilterType.TagsAll, 11)
setFilter(TriggerFilterType.TagsNone, 12)
setFilter(TriggerFilterType.CorrespondentNot, 13)
setFilter(TriggerFilterType.DocumentTypeNot, 14)
setFilter(TriggerFilterType.StoragePathNot, 15)
const formValues = component['getFormValues']()
expect(formValues.triggers[0].filter_has_all_tags).toEqual([11])
expect(formValues.triggers[0].filter_has_not_tags).toEqual([12])
expect(formValues.triggers[0].filter_has_not_correspondents).toEqual([13])
expect(formValues.triggers[0].filter_has_not_document_types).toEqual([14])
expect(formValues.triggers[0].filter_has_not_storage_paths).toEqual([15])
})
it('should reuse filter type options and update disabled state', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
component.addFilter(triggerGroup)
const optionsFirst = component.getFilterTypeOptions(triggerGroup, 0)
const optionsSecond = component.getFilterTypeOptions(triggerGroup, 0)
expect(optionsFirst).toBe(optionsSecond)
// to force disabled flag
component.addFilter(triggerGroup)
const filterArray = component.getFiltersFormArray(triggerGroup)
const firstFilter = filterArray.at(0)
firstFilter.get('type').setValue(TriggerFilterType.CorrespondentIs)
component.addFilter(triggerGroup)
const updatedFilters = component.getFiltersFormArray(triggerGroup)
const secondFilter = updatedFilters.at(1)
const options = component.getFilterTypeOptions(triggerGroup, 1)
const correspondentIsOption = options.find(
(option) => option.id === TriggerFilterType.CorrespondentIs
)
expect(correspondentIsOption.disabled).toBe(true)
firstFilter.get('type').setValue(TriggerFilterType.DocumentTypeNot)
secondFilter.get('type').setValue(TriggerFilterType.TagsAll)
const postChangeOptions = component.getFilterTypeOptions(triggerGroup, 1)
const correspondentOptionAfter = postChangeOptions.find(
(option) => option.id === TriggerFilterType.CorrespondentIs
)
expect(correspondentOptionAfter.disabled).toBe(false)
})
it('should keep multi-entry filter options enabled and allow duplicates', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
component.filterDefinitions = [
{
id: TriggerFilterType.TagsAny,
name: 'Any tags',
inputType: 'tags',
allowMultipleEntries: true,
allowMultipleValues: true,
} as any,
{
id: TriggerFilterType.CorrespondentIs,
name: 'Correspondent is',
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: false,
selectItems: 'correspondents',
} as any,
]
const firstFilter = component.addFilter(triggerGroup)
firstFilter.get('type').setValue(TriggerFilterType.TagsAny)
const secondFilter = component.addFilter(triggerGroup)
expect(secondFilter).not.toBeNull()
const options = component.getFilterTypeOptions(triggerGroup, 1)
const multiEntryOption = options.find(
(option) => option.id === TriggerFilterType.TagsAny
)
expect(multiEntryOption.disabled).toBe(false)
expect(component.canAddFilter(triggerGroup)).toBe(true)
})
it('should return null when no filter definitions remain available', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
component.filterDefinitions = [
{
id: TriggerFilterType.TagsAny,
name: 'Any tags',
inputType: 'tags',
allowMultipleEntries: false,
allowMultipleValues: true,
} as any,
{
id: TriggerFilterType.CorrespondentIs,
name: 'Correspondent is',
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: false,
selectItems: 'correspondents',
} as any,
]
const firstFilter = component.addFilter(triggerGroup)
firstFilter.get('type').setValue(TriggerFilterType.TagsAny)
const secondFilter = component.addFilter(triggerGroup)
secondFilter.get('type').setValue(TriggerFilterType.CorrespondentIs)
expect(component.canAddFilter(triggerGroup)).toBe(false)
expect(component.addFilter(triggerGroup)).toBeNull()
})
it('should skip filter definitions without handlers when building form array', () => {
const originalDefinitions = component.filterDefinitions
component.filterDefinitions = [
{
id: 999,
name: 'Unsupported',
inputType: 'text',
allowMultipleEntries: false,
allowMultipleValues: false,
} as any,
]
const trigger = {
filter_has_tags: [],
filter_has_all_tags: [],
filter_has_not_tags: [],
filter_has_not_correspondents: [],
filter_has_not_document_types: [],
filter_has_not_storage_paths: [],
filter_has_correspondent: null,
filter_has_document_type: null,
filter_has_storage_path: null,
filter_custom_field_query: null,
} as any
const filters = component['buildFiltersFormArray'](trigger)
expect(filters.length).toBe(0)
component.filterDefinitions = originalDefinitions
})
it('should return null when adding filter for unknown trigger form group', () => {
expect(component.addFilter(new FormGroup({}) as any)).toBeNull()
})
it('should ignore remove filter calls for unknown trigger form group', () => {
expect(() =>
component.removeFilter(new FormGroup({}) as any, 0)
).not.toThrow()
})
it('should teardown custom field query model when removing a custom field filter', () => {
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
component.addFilter(triggerGroup)
const filters = component.getFiltersFormArray(triggerGroup)
const filterGroup = filters.at(0) as FormGroup
filterGroup.get('type').setValue(TriggerFilterType.CustomFieldQuery)
const model = component.getCustomFieldQueryModel(filterGroup)
expect(model).toBeDefined()
expect(
component['getStoredCustomFieldQueryModel'](filterGroup as any)
).toBe(model)
component.removeFilter(triggerGroup, 0)
expect(
component['getStoredCustomFieldQueryModel'](filterGroup as any)
).toBeNull()
})
it('should return readable filter names', () => {
expect(component.getFilterName(TriggerFilterType.TagsAny)).toBe(
'Has any of these tags'
)
expect(component.getFilterName(999 as any)).toBe('')
})
it('should build filter form array from existing trigger filters', () => {
const trigger = workflow.triggers[0]
trigger.filter_has_tags = [1]
trigger.filter_has_all_tags = [2, 3]
trigger.filter_has_not_tags = [4]
trigger.filter_has_correspondent = 5 as any
trigger.filter_has_not_correspondents = [6] as any
trigger.filter_has_document_type = 7 as any
trigger.filter_has_not_document_types = [8] as any
trigger.filter_has_storage_path = 9 as any
trigger.filter_has_not_storage_paths = [10] as any
trigger.filter_custom_field_query = JSON.stringify([
'AND',
[[1, 'exact', 'value']],
]) as any
component.object = workflow
component.ngOnInit()
const triggerGroup = component.triggerFields.at(0) as FormGroup
const filters = component.getFiltersFormArray(triggerGroup)
expect(filters.length).toBe(10)
const customFieldFilter = filters.at(9) as FormGroup
expect(customFieldFilter.get('type').value).toBe(
TriggerFilterType.CustomFieldQuery
)
const model = component.getCustomFieldQueryModel(customFieldFilter)
expect(model.isValid()).toBe(true)
})
it('should expose select metadata helpers', () => {
expect(component.isSelectMultiple(TriggerFilterType.CorrespondentNot)).toBe(
true
)
expect(component.isSelectMultiple(TriggerFilterType.CorrespondentIs)).toBe(
false
)
component.correspondents = [{ id: 1, name: 'C1' } as any]
component.documentTypes = [{ id: 2, name: 'DT' } as any]
component.storagePaths = [{ id: 3, name: 'SP' } as any]
expect(
component.getFilterSelectItems(TriggerFilterType.CorrespondentIs)
).toEqual(component.correspondents)
expect(
component.getFilterSelectItems(TriggerFilterType.DocumentTypeIs)
).toEqual(component.documentTypes)
expect(
component.getFilterSelectItems(TriggerFilterType.StoragePathIs)
).toEqual(component.storagePaths)
expect(component.getFilterSelectItems(TriggerFilterType.TagsAll)).toEqual(
[]
)
expect(
component.isCustomFieldQueryFilter(TriggerFilterType.CustomFieldQuery)
).toBe(true)
})
it('should return empty select items when definition is missing', () => {
const originalDefinitions = component.filterDefinitions
component.filterDefinitions = []
expect(
component.getFilterSelectItems(TriggerFilterType.CorrespondentIs)
).toEqual([])
component.filterDefinitions = originalDefinitions
})
it('should return empty select items when definition has unknown source', () => {
const originalDefinitions = component.filterDefinitions
component.filterDefinitions = [
{
id: TriggerFilterType.CorrespondentIs,
name: 'Correspondent is',
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: false,
selectItems: 'unknown',
} as any,
]
expect(
component.getFilterSelectItems(TriggerFilterType.CorrespondentIs)
).toEqual([])
component.filterDefinitions = originalDefinitions
})
it('should handle custom field query selection change and validation states', () => {
const formGroup = new FormGroup({
values: new FormControl(null),
})
const model = new CustomFieldQueriesModel()
const changeSpy = jest.spyOn(
component as any,
'onCustomFieldQueryModelChanged'
)
component.onCustomFieldQuerySelectionChange(formGroup, model)
expect(changeSpy).toHaveBeenCalledWith(formGroup, model)
expect(component.isCustomFieldQueryValid(formGroup)).toBe(true)
component['setCustomFieldQueryModel'](formGroup as any, model as any)
const validSpy = jest.spyOn(model, 'isValid').mockReturnValue(false)
const emptySpy = jest.spyOn(model, 'isEmpty').mockReturnValue(false)
expect(component.isCustomFieldQueryValid(formGroup)).toBe(false)
expect(validSpy).toHaveBeenCalled()
validSpy.mockReturnValue(true)
emptySpy.mockReturnValue(true)
expect(component.isCustomFieldQueryValid(formGroup)).toBe(true)
emptySpy.mockReturnValue(false)
expect(component.isCustomFieldQueryValid(formGroup)).toBe(true)
component['clearCustomFieldQueryModel'](formGroup as any)
})
it('should recover from invalid custom field query json and update control on changes', () => {
const filterGroup = new FormGroup({
values: new FormControl('not-json'),
})
component['ensureCustomFieldQueryModel'](filterGroup, 'not-json')
const model = component['getStoredCustomFieldQueryModel'](
filterGroup as any
)
expect(model).toBeDefined()
expect(model.queries.length).toBeGreaterThan(0)
const valuesControl = filterGroup.get('values')
expect(valuesControl.value).toBeNull()
const expression = new CustomFieldQueryExpression([
CustomFieldQueryLogicalOperator.And,
[[1, 'exact', 'value']],
])
model.queries = [expression]
jest.spyOn(model, 'isValid').mockReturnValue(true)
jest.spyOn(model, 'isEmpty').mockReturnValue(false)
model.changed.next(model)
expect(valuesControl.value).toEqual(JSON.stringify(expression.serialize()))
component['clearCustomFieldQueryModel'](filterGroup as any)
})
it('should handle custom field query model change edge cases', () => {
const groupWithoutControl = new FormGroup({})
const dummyModel = {
isValid: jest.fn().mockReturnValue(true),
isEmpty: jest.fn().mockReturnValue(false),
}
expect(() =>
component['onCustomFieldQueryModelChanged'](
groupWithoutControl as any,
dummyModel as any
)
).not.toThrow()
const groupWithControl = new FormGroup({
values: new FormControl('initial'),
})
const emptyModel = {
isValid: jest.fn().mockReturnValue(true),
isEmpty: jest.fn().mockReturnValue(true),
}
component['onCustomFieldQueryModelChanged'](
groupWithControl as any,
emptyModel as any
)
expect(groupWithControl.get('values').value).toBeNull()
})
it('should normalize filter values for single and multi selects', () => {
expect(
component['normalizeFilterValue'](TriggerFilterType.TagsAny)
).toEqual([])
expect(
component['normalizeFilterValue'](TriggerFilterType.TagsAny, 5)
).toEqual([5])
expect(
component['normalizeFilterValue'](TriggerFilterType.TagsAny, [5, 6])
).toEqual([5, 6])
expect(
component['normalizeFilterValue'](TriggerFilterType.CorrespondentIs, [7])
).toEqual(7)
expect(
component['normalizeFilterValue'](TriggerFilterType.CorrespondentIs, 8)
).toEqual(8)
const customFieldJson = JSON.stringify(['AND', [[1, 'exact', 'test']]])
expect(
component['normalizeFilterValue'](
TriggerFilterType.CustomFieldQuery,
customFieldJson
)
).toEqual(customFieldJson)
const customFieldObject = ['AND', [[1, 'exact', 'other']]]
expect(
component['normalizeFilterValue'](
TriggerFilterType.CustomFieldQuery,
customFieldObject
)
).toEqual(JSON.stringify(customFieldObject))
expect(
component['normalizeFilterValue'](
TriggerFilterType.CustomFieldQuery,
false
)
).toBeNull()
})
it('should add and remove filter form groups', () => {
component['changeDetector'] = { detectChanges: jest.fn() } as any
component.object = undefined
component.addTrigger()
const triggerGroup = component.triggerFields.at(0) as FormGroup
component.addFilter(triggerGroup)
component.removeFilter(triggerGroup, 0)
expect(component.getFiltersFormArray(triggerGroup).length).toBe(0)
component.addFilter(triggerGroup)
const filterArrayAfterAdd = component.getFiltersFormArray(triggerGroup)
filterArrayAfterAdd.at(0).get('type').setValue(TriggerFilterType.TagsAll)
expect(component.getFiltersFormArray(triggerGroup).length).toBe(1)
})
 it('should remove selected custom field from the form group', () => {
 const formGroup = new FormGroup({
 assign_custom_fields: new FormControl([1, 2, 3]),

View File

@@ -6,6 +6,7 @@ import {
 import { NgTemplateOutlet } from '@angular/common'
 import { Component, OnInit, inject } from '@angular/core'
 import {
+AbstractControl,
 FormArray,
 FormControl,
 FormGroup,
@@ -14,7 +15,7 @@ import {
 } from '@angular/forms'
 import { NgbAccordionModule } from '@ng-bootstrap/ng-bootstrap'
 import { NgxBootstrapIconsModule } from 'ngx-bootstrap-icons'
-import { first } from 'rxjs'
+import { Subscription, first, takeUntil } from 'rxjs'
 import { Correspondent } from 'src/app/data/correspondent'
 import { CustomField, CustomFieldDataType } from 'src/app/data/custom-field'
 import { DocumentType } from 'src/app/data/document-type'
@@ -45,7 +46,12 @@ import { StoragePathService } from 'src/app/services/rest/storage-path.service'
 import { UserService } from 'src/app/services/rest/user.service'
 import { WorkflowService } from 'src/app/services/rest/workflow.service'
 import { SettingsService } from 'src/app/services/settings.service'
+import { CustomFieldQueryExpression } from 'src/app/utils/custom-field-query-element'
 import { ConfirmButtonComponent } from '../../confirm-button/confirm-button.component'
+import {
+CustomFieldQueriesModel,
+CustomFieldsQueryDropdownComponent,
+} from '../../custom-fields-query-dropdown/custom-fields-query-dropdown.component'
 import { CheckComponent } from '../../input/check/check.component'
 import { CustomFieldsValuesComponent } from '../../input/custom-fields-values/custom-fields-values.component'
 import { EntriesComponent } from '../../input/entries/entries.component'
@@ -135,10 +141,235 @@ export const WORKFLOW_ACTION_OPTIONS = [
 },
 ]
export enum TriggerFilterType {
TagsAny = 'tags_any',
TagsAll = 'tags_all',
TagsNone = 'tags_none',
CorrespondentIs = 'correspondent_is',
CorrespondentNot = 'correspondent_not',
DocumentTypeIs = 'document_type_is',
DocumentTypeNot = 'document_type_not',
StoragePathIs = 'storage_path_is',
StoragePathNot = 'storage_path_not',
CustomFieldQuery = 'custom_field_query',
}
interface TriggerFilterDefinition {
id: TriggerFilterType
name: string
inputType: 'tags' | 'select' | 'customFieldQuery'
allowMultipleEntries: boolean
allowMultipleValues: boolean
selectItems?: 'correspondents' | 'documentTypes' | 'storagePaths'
disabled?: boolean
}
type TriggerFilterOption = TriggerFilterDefinition & {
disabled?: boolean
}
type TriggerFilterAggregate = {
filter_has_tags: number[]
filter_has_all_tags: number[]
filter_has_not_tags: number[]
filter_has_not_correspondents: number[]
filter_has_not_document_types: number[]
filter_has_not_storage_paths: number[]
filter_has_correspondent: number | null
filter_has_document_type: number | null
filter_has_storage_path: number | null
filter_custom_field_query: string | null
}
interface FilterHandler {
apply: (aggregate: TriggerFilterAggregate, values: any) => void
extract: (trigger: WorkflowTrigger) => any
hasValue: (value: any) => boolean
}
const CUSTOM_FIELD_QUERY_MODEL_KEY = Symbol('customFieldQueryModel')
const CUSTOM_FIELD_QUERY_SUBSCRIPTION_KEY = Symbol(
'customFieldQuerySubscription'
)
type CustomFieldFilterGroup = FormGroup & {
[CUSTOM_FIELD_QUERY_MODEL_KEY]?: CustomFieldQueriesModel
[CUSTOM_FIELD_QUERY_SUBSCRIPTION_KEY]?: Subscription
}
const TRIGGER_FILTER_DEFINITIONS: TriggerFilterDefinition[] = [
{
id: TriggerFilterType.TagsAny,
name: $localize`Has any of these tags`,
inputType: 'tags',
allowMultipleEntries: false,
allowMultipleValues: true,
},
{
id: TriggerFilterType.TagsAll,
name: $localize`Has all of these tags`,
inputType: 'tags',
allowMultipleEntries: false,
allowMultipleValues: true,
},
{
id: TriggerFilterType.TagsNone,
name: $localize`Does not have these tags`,
inputType: 'tags',
allowMultipleEntries: false,
allowMultipleValues: true,
},
{
id: TriggerFilterType.CorrespondentIs,
name: $localize`Has correspondent`,
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: false,
selectItems: 'correspondents',
},
{
id: TriggerFilterType.CorrespondentNot,
name: $localize`Does not have correspondents`,
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: true,
selectItems: 'correspondents',
},
{
id: TriggerFilterType.DocumentTypeIs,
name: $localize`Has document type`,
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: false,
selectItems: 'documentTypes',
},
{
id: TriggerFilterType.DocumentTypeNot,
name: $localize`Does not have document types`,
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: true,
selectItems: 'documentTypes',
},
{
id: TriggerFilterType.StoragePathIs,
name: $localize`Has storage path`,
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: false,
selectItems: 'storagePaths',
},
{
id: TriggerFilterType.StoragePathNot,
name: $localize`Does not have storage paths`,
inputType: 'select',
allowMultipleEntries: false,
allowMultipleValues: true,
selectItems: 'storagePaths',
},
{
id: TriggerFilterType.CustomFieldQuery,
name: $localize`Matches custom field query`,
inputType: 'customFieldQuery',
allowMultipleEntries: false,
allowMultipleValues: false,
},
]
 const TRIGGER_MATCHING_ALGORITHMS = MATCHING_ALGORITHMS.filter(
 (a) => a.id !== MATCH_AUTO
 )
const FILTER_HANDLERS: Record<TriggerFilterType, FilterHandler> = {
[TriggerFilterType.TagsAny]: {
apply: (aggregate, values) => {
aggregate.filter_has_tags = Array.isArray(values) ? [...values] : [values]
},
extract: (trigger) => trigger.filter_has_tags,
hasValue: (value) => Array.isArray(value) && value.length > 0,
},
[TriggerFilterType.TagsAll]: {
apply: (aggregate, values) => {
aggregate.filter_has_all_tags = Array.isArray(values)
? [...values]
: [values]
},
extract: (trigger) => trigger.filter_has_all_tags,
hasValue: (value) => Array.isArray(value) && value.length > 0,
},
[TriggerFilterType.TagsNone]: {
apply: (aggregate, values) => {
aggregate.filter_has_not_tags = Array.isArray(values)
? [...values]
: [values]
},
extract: (trigger) => trigger.filter_has_not_tags,
hasValue: (value) => Array.isArray(value) && value.length > 0,
},
[TriggerFilterType.CorrespondentIs]: {
apply: (aggregate, values) => {
aggregate.filter_has_correspondent = Array.isArray(values)
? (values[0] ?? null)
: values
},
extract: (trigger) => trigger.filter_has_correspondent,
hasValue: (value) => value !== null && value !== undefined,
},
[TriggerFilterType.CorrespondentNot]: {
apply: (aggregate, values) => {
aggregate.filter_has_not_correspondents = Array.isArray(values)
? [...values]
: [values]
},
extract: (trigger) => trigger.filter_has_not_correspondents,
hasValue: (value) => Array.isArray(value) && value.length > 0,
},
[TriggerFilterType.DocumentTypeIs]: {
apply: (aggregate, values) => {
aggregate.filter_has_document_type = Array.isArray(values)
? (values[0] ?? null)
: values
},
extract: (trigger) => trigger.filter_has_document_type,
hasValue: (value) => value !== null && value !== undefined,
},
[TriggerFilterType.DocumentTypeNot]: {
apply: (aggregate, values) => {
aggregate.filter_has_not_document_types = Array.isArray(values)
? [...values]
: [values]
},
extract: (trigger) => trigger.filter_has_not_document_types,
hasValue: (value) => Array.isArray(value) && value.length > 0,
},
[TriggerFilterType.StoragePathIs]: {
apply: (aggregate, values) => {
aggregate.filter_has_storage_path = Array.isArray(values)
? (values[0] ?? null)
: values
},
extract: (trigger) => trigger.filter_has_storage_path,
hasValue: (value) => value !== null && value !== undefined,
},
[TriggerFilterType.StoragePathNot]: {
apply: (aggregate, values) => {
aggregate.filter_has_not_storage_paths = Array.isArray(values)
? [...values]
: [values]
},
extract: (trigger) => trigger.filter_has_not_storage_paths,
hasValue: (value) => Array.isArray(value) && value.length > 0,
},
[TriggerFilterType.CustomFieldQuery]: {
apply: (aggregate, values) => {
aggregate.filter_custom_field_query = values as string
},
extract: (trigger) => trigger.filter_custom_field_query,
hasValue: (value) =>
typeof value === 'string' && value !== null && value.trim().length > 0,
},
}
 @Component({
 selector: 'pngx-workflow-edit-dialog',
 templateUrl: './workflow-edit-dialog.component.html',
@@ -153,6 +384,7 @@ const TRIGGER_MATCHING_ALGORITHMS = MATCHING_ALGORITHMS.filter(
 TextAreaComponent,
 TagsComponent,
 CustomFieldsValuesComponent,
+CustomFieldsQueryDropdownComponent,
 PermissionsGroupComponent,
 PermissionsUserComponent,
 ConfirmButtonComponent,
@@ -170,6 +402,8 @@ export class WorkflowEditDialogComponent
 {
 public WorkflowTriggerType = WorkflowTriggerType
 public WorkflowActionType = WorkflowActionType
+public TriggerFilterType = TriggerFilterType
+public filterDefinitions = TRIGGER_FILTER_DEFINITIONS
 private correspondentService: CorrespondentService
 private documentTypeService: DocumentTypeService
@@ -189,6 +423,11 @@ export class WorkflowEditDialogComponent
 private allowedActionTypes = []
+private readonly triggerFilterOptionsMap = new WeakMap<
+FormArray,
+TriggerFilterOption[]
+>()
 constructor() {
 super()
 this.service = inject(WorkflowService)
@@ -390,6 +629,416 @@ export class WorkflowEditDialogComponent
 return this.objectForm.get('actions') as FormArray
 }
protected override getFormValues(): any {
const formValues = super.getFormValues()
if (formValues?.triggers?.length) {
formValues.triggers = formValues.triggers.map(
(trigger: any, index: number) => {
const triggerFormGroup = this.triggerFields.at(index) as FormGroup
const filters = this.getFiltersFormArray(triggerFormGroup)
const aggregate: TriggerFilterAggregate = {
filter_has_tags: [],
filter_has_all_tags: [],
filter_has_not_tags: [],
filter_has_not_correspondents: [],
filter_has_not_document_types: [],
filter_has_not_storage_paths: [],
filter_has_correspondent: null,
filter_has_document_type: null,
filter_has_storage_path: null,
filter_custom_field_query: null,
}
for (const control of filters.controls) {
const type = control.get('type').value as TriggerFilterType
const values = control.get('values').value
if (values === null || values === undefined) {
continue
}
if (Array.isArray(values) && values.length === 0) {
continue
}
const handler = FILTER_HANDLERS[type]
handler?.apply(aggregate, values)
}
trigger.filter_has_tags = aggregate.filter_has_tags
trigger.filter_has_all_tags = aggregate.filter_has_all_tags
trigger.filter_has_not_tags = aggregate.filter_has_not_tags
trigger.filter_has_not_correspondents =
aggregate.filter_has_not_correspondents
trigger.filter_has_not_document_types =
aggregate.filter_has_not_document_types
trigger.filter_has_not_storage_paths =
aggregate.filter_has_not_storage_paths
trigger.filter_has_correspondent =
aggregate.filter_has_correspondent ?? null
trigger.filter_has_document_type =
aggregate.filter_has_document_type ?? null
trigger.filter_has_storage_path =
aggregate.filter_has_storage_path ?? null
trigger.filter_custom_field_query =
aggregate.filter_custom_field_query ?? null
delete trigger.filters
return trigger
}
)
}
return formValues
}
public matchingPatternRequired(formGroup: FormGroup): boolean {
return formGroup.get('matching_algorithm').value !== MATCH_NONE
}
private createFilterFormGroup(
type: TriggerFilterType,
initialValue?: any
): FormGroup {
const group = new FormGroup({
type: new FormControl(type),
values: new FormControl(this.normalizeFilterValue(type, initialValue)),
})
group.get('type').valueChanges.subscribe((newType: TriggerFilterType) => {
if (newType === TriggerFilterType.CustomFieldQuery) {
this.ensureCustomFieldQueryModel(group)
} else {
this.clearCustomFieldQueryModel(group)
group.get('values').setValue(this.getDefaultFilterValue(newType), {
emitEvent: false,
})
}
})
if (type === TriggerFilterType.CustomFieldQuery) {
this.ensureCustomFieldQueryModel(group, initialValue)
}
return group
}
private buildFiltersFormArray(trigger: WorkflowTrigger): FormArray {
const filters = new FormArray([])
for (const definition of this.filterDefinitions) {
const handler = FILTER_HANDLERS[definition.id]
if (!handler) {
continue
}
const value = handler.extract(trigger)
if (!handler.hasValue(value)) {
continue
}
filters.push(this.createFilterFormGroup(definition.id, value))
}
return filters
}
getFiltersFormArray(formGroup: FormGroup): FormArray {
return formGroup.get('filters') as FormArray
}
getFilterTypeOptions(formGroup: FormGroup, filterIndex: number) {
const filters = this.getFiltersFormArray(formGroup)
const options = this.getFilterTypeOptionsForArray(filters)
const currentType = filters.at(filterIndex).get('type')
.value as TriggerFilterType
const usedTypes = new Set(
filters.controls.map(
(control) => control.get('type').value as TriggerFilterType
)
)
for (const option of options) {
if (option.allowMultipleEntries) {
option.disabled = false
continue
}
option.disabled = usedTypes.has(option.id) && option.id !== currentType
}
return options
}
canAddFilter(formGroup: FormGroup): boolean {
const filters = this.getFiltersFormArray(formGroup)
const usedTypes = new Set(
filters.controls.map(
(control) => control.get('type').value as TriggerFilterType
)
)
return this.filterDefinitions.some((definition) => {
if (definition.allowMultipleEntries) {
return true
}
return !usedTypes.has(definition.id)
})
}
addFilter(triggerFormGroup: FormGroup): FormGroup | null {
const triggerIndex = this.triggerFields.controls.indexOf(triggerFormGroup)
if (triggerIndex === -1) {
return null
}
const filters = this.getFiltersFormArray(triggerFormGroup)
const availableDefinition = this.filterDefinitions.find((definition) => {
if (definition.allowMultipleEntries) {
return true
}
return !filters.controls.some(
(control) => control.get('type').value === definition.id
)
})
if (!availableDefinition) {
return null
}
filters.push(this.createFilterFormGroup(availableDefinition.id))
triggerFormGroup.markAsDirty()
triggerFormGroup.markAsTouched()
return filters.at(-1) as FormGroup
}
removeFilter(triggerFormGroup: FormGroup, filterIndex: number) {
const triggerIndex = this.triggerFields.controls.indexOf(triggerFormGroup)
if (triggerIndex === -1) {
return
}
const filters = this.getFiltersFormArray(triggerFormGroup)
const filterGroup = filters.at(filterIndex) as FormGroup
if (filterGroup?.get('type').value === TriggerFilterType.CustomFieldQuery) {
this.clearCustomFieldQueryModel(filterGroup)
}
filters.removeAt(filterIndex)
triggerFormGroup.markAsDirty()
triggerFormGroup.markAsTouched()
}
getFilterDefinition(
type: TriggerFilterType
): TriggerFilterDefinition | undefined {
return this.filterDefinitions.find((definition) => definition.id === type)
}
getFilterName(type: TriggerFilterType): string {
return this.getFilterDefinition(type)?.name ?? ''
}
isTagsFilter(type: TriggerFilterType): boolean {
return this.getFilterDefinition(type)?.inputType === 'tags'
}
isCustomFieldQueryFilter(type: TriggerFilterType): boolean {
return this.getFilterDefinition(type)?.inputType === 'customFieldQuery'
}
isMultiValueFilter(type: TriggerFilterType): boolean {
switch (type) {
case TriggerFilterType.TagsAny:
case TriggerFilterType.TagsAll:
case TriggerFilterType.TagsNone:
case TriggerFilterType.CorrespondentNot:
case TriggerFilterType.DocumentTypeNot:
case TriggerFilterType.StoragePathNot:
return true
default:
return false
}
}
isSelectMultiple(type: TriggerFilterType): boolean {
return !this.isTagsFilter(type) && this.isMultiValueFilter(type)
}
getFilterSelectItems(type: TriggerFilterType) {
const definition = this.getFilterDefinition(type)
if (!definition || definition.inputType !== 'select') {
return []
}
switch (definition.selectItems) {
case 'correspondents':
return this.correspondents
case 'documentTypes':
return this.documentTypes
case 'storagePaths':
return this.storagePaths
default:
return []
}
}
getCustomFieldQueryModel(control: AbstractControl): CustomFieldQueriesModel {
return this.ensureCustomFieldQueryModel(control as FormGroup)
}
onCustomFieldQuerySelectionChange(
control: AbstractControl,
model: CustomFieldQueriesModel
) {
this.onCustomFieldQueryModelChanged(control as FormGroup, model)
}
isCustomFieldQueryValid(control: AbstractControl): boolean {
const model = this.getStoredCustomFieldQueryModel(control as FormGroup)
if (!model) {
return true
}
return model.isEmpty() || model.isValid()
}
private getFilterTypeOptionsForArray(
filters: FormArray
): TriggerFilterOption[] {
let cached = this.triggerFilterOptionsMap.get(filters)
if (!cached) {
cached = this.filterDefinitions.map((definition) => ({
...definition,
disabled: false,
}))
this.triggerFilterOptionsMap.set(filters, cached)
}
return cached
}
private ensureCustomFieldQueryModel(
filterGroup: FormGroup,
initialValue?: any
): CustomFieldQueriesModel {
const existingModel = this.getStoredCustomFieldQueryModel(filterGroup)
if (existingModel) {
return existingModel
}
const model = new CustomFieldQueriesModel()
this.setCustomFieldQueryModel(filterGroup, model)
const rawValue =
typeof initialValue === 'string'
? initialValue
: (filterGroup.get('values').value as string)
if (rawValue) {
try {
const parsed = JSON.parse(rawValue)
const expression = new CustomFieldQueryExpression(parsed)
model.queries = [expression]
} catch {
model.clear(false)
model.addInitialAtom()
}
}
const subscription = model.changed
.pipe(takeUntil(this.unsubscribeNotifier))
.subscribe(() => {
this.onCustomFieldQueryModelChanged(filterGroup, model)
})
filterGroup[CUSTOM_FIELD_QUERY_SUBSCRIPTION_KEY]?.unsubscribe()
filterGroup[CUSTOM_FIELD_QUERY_SUBSCRIPTION_KEY] = subscription
this.onCustomFieldQueryModelChanged(filterGroup, model)
return model
}
private clearCustomFieldQueryModel(filterGroup: FormGroup) {
const group = filterGroup as CustomFieldFilterGroup
group[CUSTOM_FIELD_QUERY_SUBSCRIPTION_KEY]?.unsubscribe()
delete group[CUSTOM_FIELD_QUERY_SUBSCRIPTION_KEY]
delete group[CUSTOM_FIELD_QUERY_MODEL_KEY]
}
private getStoredCustomFieldQueryModel(
filterGroup: FormGroup
): CustomFieldQueriesModel | null {
return (
(filterGroup as CustomFieldFilterGroup)[CUSTOM_FIELD_QUERY_MODEL_KEY] ??
null
)
}
private setCustomFieldQueryModel(
filterGroup: FormGroup,
model: CustomFieldQueriesModel
) {
const group = filterGroup as CustomFieldFilterGroup
group[CUSTOM_FIELD_QUERY_MODEL_KEY] = model
}
private onCustomFieldQueryModelChanged(
filterGroup: FormGroup,
model: CustomFieldQueriesModel
) {
const control = filterGroup.get('values')
if (!control) {
return
}
if (!model.isValid()) {
control.setValue(null, { emitEvent: false })
return
}
if (model.isEmpty()) {
control.setValue(null, { emitEvent: false })
return
}
const serialized = JSON.stringify(model.queries[0].serialize())
control.setValue(serialized, { emitEvent: false })
}
private getDefaultFilterValue(type: TriggerFilterType) {
if (type === TriggerFilterType.CustomFieldQuery) {
return null
}
return this.isMultiValueFilter(type) ? [] : null
}
private normalizeFilterValue(type: TriggerFilterType, value?: any) {
if (value === undefined || value === null) {
return this.getDefaultFilterValue(type)
}
if (type === TriggerFilterType.CustomFieldQuery) {
if (typeof value === 'string') {
return value
}
return value ? JSON.stringify(value) : null
}
if (this.isMultiValueFilter(type)) {
return Array.isArray(value) ? [...value] : [value]
}
if (Array.isArray(value)) {
return value.length > 0 ? value[0] : null
}
return value
}
 private createTriggerField(
 trigger: WorkflowTrigger,
 emitEvent: boolean = false
@@ -405,16 +1054,7 @@ export class WorkflowEditDialogComponent
matching_algorithm: new FormControl(trigger.matching_algorithm), matching_algorithm: new FormControl(trigger.matching_algorithm),
match: new FormControl(trigger.match), match: new FormControl(trigger.match),
is_insensitive: new FormControl(trigger.is_insensitive), is_insensitive: new FormControl(trigger.is_insensitive),
filter_has_tags: new FormControl(trigger.filter_has_tags), filters: this.buildFiltersFormArray(trigger),
filter_has_correspondent: new FormControl(
trigger.filter_has_correspondent
),
filter_has_document_type: new FormControl(
trigger.filter_has_document_type
),
filter_has_storage_path: new FormControl(
trigger.filter_has_storage_path
),
schedule_offset_days: new FormControl(trigger.schedule_offset_days), schedule_offset_days: new FormControl(trigger.schedule_offset_days),
schedule_is_recurring: new FormControl(trigger.schedule_is_recurring), schedule_is_recurring: new FormControl(trigger.schedule_is_recurring),
schedule_recurring_interval_days: new FormControl( schedule_recurring_interval_days: new FormControl(
@@ -537,6 +1177,12 @@ export class WorkflowEditDialogComponent
filter_path: null, filter_path: null,
filter_mailrule: null, filter_mailrule: null,
filter_has_tags: [], filter_has_tags: [],
filter_has_all_tags: [],
filter_has_not_tags: [],
filter_has_not_correspondents: [],
filter_has_not_document_types: [],
filter_has_not_storage_paths: [],
filter_custom_field_query: null,
filter_has_correspondent: null, filter_has_correspondent: null,
filter_has_document_type: null, filter_has_document_type: null,
filter_has_storage_path: null, filter_has_storage_path: null,

View File

@@ -1,5 +1,6 @@
<div class="mb-3 paperless-input-select" [class.disabled]="disabled"> <div class="mb-3 paperless-input-select" [class.disabled]="disabled">
<div class="row"> <div class="row">
@if (title || removable) {
<div class="d-flex align-items-center position-relative hidden-button-container" [class.col-md-3]="horizontal"> <div class="d-flex align-items-center position-relative hidden-button-container" [class.col-md-3]="horizontal">
@if (title) { @if (title) {
<label class="form-label" [class.mb-md-0]="horizontal" [for]="inputId">{{title}}</label> <label class="form-label" [class.mb-md-0]="horizontal" [for]="inputId">{{title}}</label>
@@ -10,6 +11,7 @@
</button>
}
</div>
}
<div [class.col-md-9]="horizontal">
<div [class.input-group]="allowCreateNew || showFilter" [class.is-invalid]="error">
<ng-select name="inputId" [(ngModel)]="value"

View File

@@ -1,8 +1,10 @@
<div class="mb-3 paperless-input-select paperless-input-tags" [class.disabled]="disabled" [class.pb-3]="getSuggestions().length > 0"> <div class="mb-3 paperless-input-select paperless-input-tags" [class.disabled]="disabled" [class.pb-3]="getSuggestions().length > 0">
<div class="row"> <div class="row">
@if (title) {
<div class="d-flex align-items-center" [class.col-md-3]="horizontal"> <div class="d-flex align-items-center" [class.col-md-3]="horizontal">
<label class="form-label" [class.mb-md-0]="horizontal" for="tags">{{title}}</label> <label class="form-label" [class.mb-md-0]="horizontal" for="tags">{{title}}</label>
</div> </div>
}
<div class="position-relative" [class.col-md-9]="horizontal"> <div class="position-relative" [class.col-md-9]="horizontal">
<div class="input-group flex-nowrap"> <div class="input-group flex-nowrap">
<ng-select #tagSelect name="tags" [items]="tags" bindLabel="name" bindValue="id" [(ngModel)]="value" <ng-select #tagSelect name="tags" [items]="tags" bindLabel="name" bindValue="id" [(ngModel)]="value"

View File

@@ -40,6 +40,18 @@ export interface WorkflowTrigger extends ObjectWithId {
filter_has_tags?: number[] // Tag.id[]
filter_has_all_tags?: number[] // Tag.id[]
filter_has_not_tags?: number[] // Tag.id[]
filter_has_not_correspondents?: number[] // Correspondent.id[]
filter_has_not_document_types?: number[] // DocumentType.id[]
filter_has_not_storage_paths?: number[] // StoragePath.id[]
filter_custom_field_query?: string
filter_has_correspondent?: number // Correspondent.id
filter_has_document_type?: number // DocumentType.id
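The filter_custom_field_query string holds a JSON-encoded expression of the form ["OPERATOR", [[field_id, lookup, value], ...]], the same shape the backend tests further down build with json.dumps. A minimal sketch of producing such a value (the field id and the "exact" lookup are placeholders for illustration):

import json

CUSTOM_FIELD_ID = 42  # hypothetical id, normally taken from /api/custom_fields/
query = ["AND", [[CUSTOM_FIELD_ID, "exact", "value"]]]
filter_custom_field_query = json.dumps(query)
print(filter_custom_field_query)  # '["AND", [[42, "exact", "value"]]]'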

View File

@@ -6,8 +6,11 @@ from fnmatch import fnmatch
from fnmatch import translate as fnmatch_translate
from typing import TYPE_CHECKING
from rest_framework import serializers
from documents.data_models import ConsumableDocument
from documents.data_models import DocumentSource
from documents.filters import CustomFieldQueryParser
from documents.models import Correspondent
from documents.models import Document
from documents.models import DocumentType
@@ -342,67 +345,147 @@ def consumable_document_matches_workflow(
 def existing_document_matches_workflow(
     document: Document,
     trigger: WorkflowTrigger,
-) -> tuple[bool, str]:
+) -> tuple[bool, str | None]:
     """
     Returns True if the Document matches all filters from the workflow trigger,
     False otherwise. Includes a reason if doesn't match
     """
-    trigger_matched = True
-    reason = ""
+    # Check content matching algorithm
     if trigger.matching_algorithm > MatchingModel.MATCH_NONE and not matches(
         trigger,
         document,
     ):
-        reason = (
+        return (
+            False,
             f"Document content matching settings for algorithm '{trigger.matching_algorithm}' did not match",
         )
-        trigger_matched = False

-    # Document tags vs trigger has_tags
-    if (
-        trigger.filter_has_tags.all().count() > 0
-        and document.tags.filter(
-            id__in=trigger.filter_has_tags.all().values_list("id"),
-        ).count()
-        == 0
-    ):
-        reason = (
-            f"Document tags {document.tags.all()} do not include"
-            f" {trigger.filter_has_tags.all()}",
-        )
-        trigger_matched = False
+    # Check if any tag filters exist to determine if we need to load document tags
+    trigger_has_tags_qs = trigger.filter_has_tags.all()
+    trigger_has_all_tags_qs = trigger.filter_has_all_tags.all()
+    trigger_has_not_tags_qs = trigger.filter_has_not_tags.all()
+
+    has_tags_filter = trigger_has_tags_qs.exists()
+    has_all_tags_filter = trigger_has_all_tags_qs.exists()
+    has_not_tags_filter = trigger_has_not_tags_qs.exists()
+
+    # Load document tags once if any tag filters exist
+    document_tag_ids = None
+    if has_tags_filter or has_all_tags_filter or has_not_tags_filter:
+        document_tag_ids = set(document.tags.values_list("id", flat=True))
+
+    # Document tags vs trigger has_tags (any of)
+    if has_tags_filter:
+        trigger_has_tag_ids = set(trigger_has_tags_qs.values_list("id", flat=True))
+        if not (document_tag_ids & trigger_has_tag_ids):
+            # For error message, load the actual tag objects
+            return (
+                False,
+                f"Document tags {list(document.tags.all())} do not include {list(trigger_has_tags_qs)}",
+            )
+
+    # Document tags vs trigger has_all_tags (all of)
+    if has_all_tags_filter:
+        required_tag_ids = set(trigger_has_all_tags_qs.values_list("id", flat=True))
+        if not required_tag_ids.issubset(document_tag_ids):
+            return (
+                False,
+                f"Document tags {list(document.tags.all())} do not contain all of {list(trigger_has_all_tags_qs)}",
+            )
+
+    # Document tags vs trigger has_not_tags (none of)
+    if has_not_tags_filter:
+        excluded_tag_ids = set(trigger_has_not_tags_qs.values_list("id", flat=True))
+        if document_tag_ids & excluded_tag_ids:
+            return (
+                False,
+                f"Document tags {list(document.tags.all())} include excluded tags {list(trigger_has_not_tags_qs)}",
+            )

     # Document correspondent vs trigger has_correspondent
     if (
-        trigger.filter_has_correspondent is not None
-        and document.correspondent != trigger.filter_has_correspondent
+        trigger.filter_has_correspondent_id is not None
+        and document.correspondent_id != trigger.filter_has_correspondent_id
     ):
-        reason = (
+        return (
+            False,
             f"Document correspondent {document.correspondent} does not match {trigger.filter_has_correspondent}",
         )
-        trigger_matched = False
+
+    if (
+        document.correspondent_id
+        and trigger.filter_has_not_correspondents.filter(
+            id=document.correspondent_id,
+        ).exists()
+    ):
+        return (
+            False,
+            f"Document correspondent {document.correspondent} is excluded by {list(trigger.filter_has_not_correspondents.all())}",
+        )

     # Document document_type vs trigger has_document_type
     if (
-        trigger.filter_has_document_type is not None
-        and document.document_type != trigger.filter_has_document_type
+        trigger.filter_has_document_type_id is not None
+        and document.document_type_id != trigger.filter_has_document_type_id
     ):
-        reason = (
+        return (
+            False,
             f"Document doc type {document.document_type} does not match {trigger.filter_has_document_type}",
         )
-        trigger_matched = False
+
+    if (
+        document.document_type_id
+        and trigger.filter_has_not_document_types.filter(
+            id=document.document_type_id,
+        ).exists()
+    ):
+        return (
+            False,
+            f"Document doc type {document.document_type} is excluded by {list(trigger.filter_has_not_document_types.all())}",
+        )

     # Document storage_path vs trigger has_storage_path
     if (
-        trigger.filter_has_storage_path is not None
-        and document.storage_path != trigger.filter_has_storage_path
+        trigger.filter_has_storage_path_id is not None
+        and document.storage_path_id != trigger.filter_has_storage_path_id
     ):
-        reason = (
+        return (
+            False,
             f"Document storage path {document.storage_path} does not match {trigger.filter_has_storage_path}",
         )
-        trigger_matched = False
+
+    if (
+        document.storage_path_id
+        and trigger.filter_has_not_storage_paths.filter(
+            id=document.storage_path_id,
+        ).exists()
+    ):
+        return (
+            False,
+            f"Document storage path {document.storage_path} is excluded by {list(trigger.filter_has_not_storage_paths.all())}",
+        )
+
+    # Custom field query check
+    if trigger.filter_custom_field_query:
+        parser = CustomFieldQueryParser("filter_custom_field_query")
+        try:
+            custom_field_q, annotations = parser.parse(
+                trigger.filter_custom_field_query,
+            )
+        except serializers.ValidationError:
+            return (False, "Invalid custom field query configuration")
+
+        qs = (
+            Document.objects.filter(id=document.id)
+            .annotate(**annotations)
+            .filter(custom_field_q)
+        )
+        if not qs.exists():
+            return (
+                False,
+                "Document custom fields do not match the configured custom field query",
+            )

     # Document original_filename vs trigger filename
     if (
@@ -414,13 +497,12 @@ def existing_document_matches_workflow(
             trigger.filter_filename.lower(),
         )
     ):
-        reason = (
-            f"Document filename {document.original_filename} does not match"
-            f" {trigger.filter_filename.lower()}",
-        )
-        trigger_matched = False
+        return (
+            False,
+            f"Document filename {document.original_filename} does not match {trigger.filter_filename.lower()}",
+        )

-    return (trigger_matched, reason)
+    return (True, None)
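The three tag filters above reduce to set operations on tag ids. A standalone sketch of that logic, using bare integer ids instead of Django querysets (illustrative only):

document_tag_ids = {1, 2, 5}
any_of = {2, 9}    # filter_has_tags
all_of = {1, 2}    # filter_has_all_tags
none_of = {7}      # filter_has_not_tags

matches_any = not any_of or bool(document_tag_ids & any_of)  # at least one required tag present
matches_all = all_of.issubset(document_tag_ids)              # every required tag present
matches_none = not (document_tag_ids & none_of)              # no excluded tag present
print(matches_any and matches_all and matches_none)  # True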
 def prefilter_documents_by_workflowtrigger(
@@ -433,31 +515,66 @@ def prefilter_documents_by_workflowtrigger(
     document_matches_workflow in run_workflows
     """
-    if trigger.filter_has_tags.all().count() > 0:
-        documents = documents.filter(
-            tags__in=trigger.filter_has_tags.all(),
-        ).distinct()
+    # Filter for documents that have AT LEAST ONE of the specified tags.
+    if trigger.filter_has_tags.exists():
+        documents = documents.filter(tags__in=trigger.filter_has_tags.all()).distinct()
+
+    # Filter for documents that have ALL of the specified tags.
+    if trigger.filter_has_all_tags.exists():
+        for tag in trigger.filter_has_all_tags.all():
+            documents = documents.filter(tags=tag)
+        # Multiple JOINs can create duplicate results.
+        documents = documents.distinct()
+
+    # Exclude documents that have ANY of the specified tags.
+    if trigger.filter_has_not_tags.exists():
+        documents = documents.exclude(tags__in=trigger.filter_has_not_tags.all())

+    # Correspondent, DocumentType, etc. filtering
     if trigger.filter_has_correspondent is not None:
         documents = documents.filter(
             correspondent=trigger.filter_has_correspondent,
         )
+    if trigger.filter_has_not_correspondents.exists():
+        documents = documents.exclude(
+            correspondent__in=trigger.filter_has_not_correspondents.all(),
+        )

     if trigger.filter_has_document_type is not None:
         documents = documents.filter(
             document_type=trigger.filter_has_document_type,
         )
+    if trigger.filter_has_not_document_types.exists():
+        documents = documents.exclude(
+            document_type__in=trigger.filter_has_not_document_types.all(),
+        )

     if trigger.filter_has_storage_path is not None:
         documents = documents.filter(
             storage_path=trigger.filter_has_storage_path,
         )
+    if trigger.filter_has_not_storage_paths.exists():
+        documents = documents.exclude(
+            storage_path__in=trigger.filter_has_not_storage_paths.all(),
+        )

-    if trigger.filter_filename is not None and len(trigger.filter_filename) > 0:
-        # the true fnmatch will actually run later so we just want a loose filter here
+    # Custom Field & Filename Filtering
+
+    if trigger.filter_custom_field_query:
+        parser = CustomFieldQueryParser("filter_custom_field_query")
+        try:
+            custom_field_q, annotations = parser.parse(
+                trigger.filter_custom_field_query,
+            )
+        except serializers.ValidationError:
+            return documents.none()
+
+        documents = documents.annotate(**annotations).filter(custom_field_q)
+
+    if trigger.filter_filename:
         regex = fnmatch_translate(trigger.filter_filename).lstrip("^").rstrip("$")
-        regex = f"(?i){regex}"
-        documents = documents.filter(original_filename__regex=regex)
+        documents = documents.filter(original_filename__iregex=regex)

     return documents
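The filename filter is only a loose database prefilter here; the authoritative check elsewhere in this module lowercases both sides and runs fnmatch. A small sketch of both steps in plain Python (the pattern and filename are examples):

from fnmatch import fnmatch, translate

pattern = "*.pdf"
# Authoritative, case-insensitive wildcard check for a single filename:
print(fnmatch("Invoice-2025.PDF".lower(), pattern))  # True
# Loose prefilter: the wildcard is translated to a regex for an __iregex lookup;
# translate() shows the raw pattern before the anchors are trimmed.
print(translate(pattern))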
@@ -472,13 +589,34 @@ def document_matches_workflow(
     settings from the workflow trigger, False otherwise
     """
+    triggers_queryset = (
+        workflow.triggers.filter(
+            type=trigger_type,
+        )
+        .select_related(
+            "filter_mailrule",
+            "filter_has_document_type",
+            "filter_has_correspondent",
+            "filter_has_storage_path",
+            "schedule_date_custom_field",
+        )
+        .prefetch_related(
+            "filter_has_tags",
+            "filter_has_all_tags",
+            "filter_has_not_tags",
+            "filter_has_not_document_types",
+            "filter_has_not_correspondents",
+            "filter_has_not_storage_paths",
+        )
+    )
+
     trigger_matched = True
-    if workflow.triggers.filter(type=trigger_type).count() == 0:
+    if not triggers_queryset.exists():
         trigger_matched = False
         logger.info(f"Document did not match {workflow}")
         logger.debug(f"No matching triggers with type {trigger_type} found")
     else:
-        for trigger in workflow.triggers.filter(type=trigger_type):
+        for trigger in triggers_queryset:
             if trigger_type == WorkflowTrigger.WorkflowTriggerType.CONSUMPTION:
                 trigger_matched, reason = consumable_document_matches_workflow(
                     document,
View File

@@ -0,0 +1,73 @@
# Generated by Django 5.2.6 on 2025-10-07 18:52
from django.db import migrations
from django.db import models
class Migration(migrations.Migration):
dependencies = [
("documents", "1071_tag_tn_ancestors_count_tag_tn_ancestors_pks_and_more"),
]
operations = [
migrations.AddField(
model_name="workflowtrigger",
name="filter_custom_field_query",
field=models.TextField(
blank=True,
help_text="JSON-encoded custom field query expression.",
null=True,
verbose_name="filter custom field query",
),
),
migrations.AddField(
model_name="workflowtrigger",
name="filter_has_all_tags",
field=models.ManyToManyField(
blank=True,
related_name="workflowtriggers_has_all",
to="documents.tag",
verbose_name="has all of these tag(s)",
),
),
migrations.AddField(
model_name="workflowtrigger",
name="filter_has_not_correspondents",
field=models.ManyToManyField(
blank=True,
related_name="workflowtriggers_has_not_correspondent",
to="documents.correspondent",
verbose_name="does not have these correspondent(s)",
),
),
migrations.AddField(
model_name="workflowtrigger",
name="filter_has_not_document_types",
field=models.ManyToManyField(
blank=True,
related_name="workflowtriggers_has_not_document_type",
to="documents.documenttype",
verbose_name="does not have these document type(s)",
),
),
migrations.AddField(
model_name="workflowtrigger",
name="filter_has_not_storage_paths",
field=models.ManyToManyField(
blank=True,
related_name="workflowtriggers_has_not_storage_path",
to="documents.storagepath",
verbose_name="does not have these storage path(s)",
),
),
migrations.AddField(
model_name="workflowtrigger",
name="filter_has_not_tags",
field=models.ManyToManyField(
blank=True,
related_name="workflowtriggers_has_not",
to="documents.tag",
verbose_name="does not have these tag(s)",
),
),
]

View File

@@ -1065,6 +1065,20 @@ class WorkflowTrigger(models.Model):
verbose_name=_("has these tag(s)"), verbose_name=_("has these tag(s)"),
) )
filter_has_all_tags = models.ManyToManyField(
Tag,
blank=True,
related_name="workflowtriggers_has_all",
verbose_name=_("has all of these tag(s)"),
)
filter_has_not_tags = models.ManyToManyField(
Tag,
blank=True,
related_name="workflowtriggers_has_not",
verbose_name=_("does not have these tag(s)"),
)
filter_has_document_type = models.ForeignKey(
DocumentType,
null=True,
@@ -1073,6 +1087,13 @@ class WorkflowTrigger(models.Model):
verbose_name=_("has this document type"), verbose_name=_("has this document type"),
) )
filter_has_not_document_types = models.ManyToManyField(
DocumentType,
blank=True,
related_name="workflowtriggers_has_not_document_type",
verbose_name=_("does not have these document type(s)"),
)
filter_has_correspondent = models.ForeignKey(
Correspondent,
null=True,
@@ -1081,6 +1102,13 @@ class WorkflowTrigger(models.Model):
verbose_name=_("has this correspondent"), verbose_name=_("has this correspondent"),
) )
filter_has_not_correspondents = models.ManyToManyField(
Correspondent,
blank=True,
related_name="workflowtriggers_has_not_correspondent",
verbose_name=_("does not have these correspondent(s)"),
)
filter_has_storage_path = models.ForeignKey(
StoragePath,
null=True,
@@ -1089,6 +1117,20 @@ class WorkflowTrigger(models.Model):
verbose_name=_("has this storage path"), verbose_name=_("has this storage path"),
) )
filter_has_not_storage_paths = models.ManyToManyField(
StoragePath,
blank=True,
related_name="workflowtriggers_has_not_storage_path",
verbose_name=_("does not have these storage path(s)"),
)
filter_custom_field_query = models.TextField(
_("filter custom field query"),
null=True,
blank=True,
help_text=_("JSON-encoded custom field query expression."),
)
schedule_offset_days = models.IntegerField(
_("schedule offset days"),
default=0,
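A rough Django-shell sketch of wiring up the new model fields (assumes a configured paperless-ngx environment; the tag names and the custom field id are placeholders):

import json

from documents.models import Tag, WorkflowTrigger

urgent = Tag.objects.get(name="urgent")
archived = Tag.objects.get(name="archived")

trigger = WorkflowTrigger.objects.create(
    type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
    filter_custom_field_query=json.dumps(["AND", [[1, "exact", "value"]]]),
)
# Many-to-many filters are assigned after the row exists.
trigger.filter_has_all_tags.set([urgent])
trigger.filter_has_not_tags.set([archived])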

View File

@@ -44,6 +44,7 @@ if settings.AUDIT_LOG_ENABLED:
from documents import bulk_edit
from documents.data_models import DocumentSource
from documents.filters import CustomFieldQueryParser
from documents.models import Correspondent
from documents.models import CustomField
from documents.models import CustomFieldInstance
@@ -2240,6 +2241,12 @@ class WorkflowTriggerSerializer(serializers.ModelSerializer):
"match", "match",
"is_insensitive", "is_insensitive",
"filter_has_tags", "filter_has_tags",
"filter_has_all_tags",
"filter_has_not_tags",
"filter_custom_field_query",
"filter_has_not_correspondents",
"filter_has_not_document_types",
"filter_has_not_storage_paths",
"filter_has_correspondent", "filter_has_correspondent",
"filter_has_document_type", "filter_has_document_type",
"filter_has_storage_path", "filter_has_storage_path",
@@ -2265,6 +2272,20 @@ class WorkflowTriggerSerializer(serializers.ModelSerializer):
):
attrs["filter_path"] = None
if (
"filter_custom_field_query" in attrs
and attrs["filter_custom_field_query"] is not None
and len(attrs["filter_custom_field_query"]) == 0
):
attrs["filter_custom_field_query"] = None
if (
"filter_custom_field_query" in attrs
and attrs["filter_custom_field_query"] is not None
):
parser = CustomFieldQueryParser("filter_custom_field_query")
parser.parse(attrs["filter_custom_field_query"])
trigger_type = attrs.get("type", getattr(self.instance, "type", None))
if (
trigger_type == WorkflowTrigger.WorkflowTriggerType.CONSUMPTION
@@ -2460,6 +2481,20 @@ class WorkflowSerializer(serializers.ModelSerializer):
if triggers is not None and triggers is not serializers.empty:
for trigger in triggers:
filter_has_tags = trigger.pop("filter_has_tags", None)
filter_has_all_tags = trigger.pop("filter_has_all_tags", None)
filter_has_not_tags = trigger.pop("filter_has_not_tags", None)
filter_has_not_correspondents = trigger.pop(
"filter_has_not_correspondents",
None,
)
filter_has_not_document_types = trigger.pop(
"filter_has_not_document_types",
None,
)
filter_has_not_storage_paths = trigger.pop(
"filter_has_not_storage_paths",
None,
)
# Convert sources to strings to handle django-multiselectfield v1.0 changes
WorkflowTriggerSerializer.normalize_workflow_trigger_sources(trigger)
trigger_instance, _ = WorkflowTrigger.objects.update_or_create(
@@ -2468,6 +2503,22 @@ class WorkflowSerializer(serializers.ModelSerializer):
)
if filter_has_tags is not None:
trigger_instance.filter_has_tags.set(filter_has_tags)
if filter_has_all_tags is not None:
trigger_instance.filter_has_all_tags.set(filter_has_all_tags)
if filter_has_not_tags is not None:
trigger_instance.filter_has_not_tags.set(filter_has_not_tags)
if filter_has_not_correspondents is not None:
trigger_instance.filter_has_not_correspondents.set(
filter_has_not_correspondents,
)
if filter_has_not_document_types is not None:
trigger_instance.filter_has_not_document_types.set(
filter_has_not_document_types,
)
if filter_has_not_storage_paths is not None:
trigger_instance.filter_has_not_storage_paths.set(
filter_has_not_storage_paths,
)
set_triggers.append(trigger_instance)
if actions is not None and actions is not serializers.empty:
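For the API, the new filters ride along in the trigger payload exactly as listed in the serializer fields above. A hedged sketch of such a trigger object (all ids are placeholders, and the type value must be one of the WorkflowTrigger.WorkflowTriggerType choices):

import json

trigger_payload = {
    "type": 2,  # placeholder for the desired WorkflowTriggerType value
    "filter_has_tags": [1],
    "filter_has_all_tags": [2],
    "filter_has_not_tags": [3],
    "filter_has_not_correspondents": [4],
    "filter_has_not_document_types": [5],
    "filter_has_not_storage_paths": [6],
    "filter_custom_field_query": json.dumps(["AND", [[7, "exact", "value"]]]),
}
print(json.dumps(trigger_payload, indent=2))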

View File

@@ -184,6 +184,17 @@ class TestApiWorkflows(DirectoriesMixin, APITestCase):
"filter_filename": "*", "filter_filename": "*",
"filter_path": "*/samples/*", "filter_path": "*/samples/*",
"filter_has_tags": [self.t1.id], "filter_has_tags": [self.t1.id],
"filter_has_all_tags": [self.t2.id],
"filter_has_not_tags": [self.t3.id],
"filter_has_not_correspondents": [self.c2.id],
"filter_has_not_document_types": [self.dt2.id],
"filter_has_not_storage_paths": [self.sp2.id],
"filter_custom_field_query": json.dumps(
[
"AND",
[[self.cf1.id, "exact", "value"]],
],
),
"filter_has_document_type": self.dt.id, "filter_has_document_type": self.dt.id,
"filter_has_correspondent": self.c.id, "filter_has_correspondent": self.c.id,
"filter_has_storage_path": self.sp.id, "filter_has_storage_path": self.sp.id,
@@ -223,6 +234,36 @@ class TestApiWorkflows(DirectoriesMixin, APITestCase):
)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(Workflow.objects.count(), 2)
workflow = Workflow.objects.get(name="Workflow 2")
trigger = workflow.triggers.first()
self.assertSetEqual(
set(trigger.filter_has_tags.values_list("id", flat=True)),
{self.t1.id},
)
self.assertSetEqual(
set(trigger.filter_has_all_tags.values_list("id", flat=True)),
{self.t2.id},
)
self.assertSetEqual(
set(trigger.filter_has_not_tags.values_list("id", flat=True)),
{self.t3.id},
)
self.assertSetEqual(
set(trigger.filter_has_not_correspondents.values_list("id", flat=True)),
{self.c2.id},
)
self.assertSetEqual(
set(trigger.filter_has_not_document_types.values_list("id", flat=True)),
{self.dt2.id},
)
self.assertSetEqual(
set(trigger.filter_has_not_storage_paths.values_list("id", flat=True)),
{self.sp2.id},
)
self.assertEqual(
trigger.filter_custom_field_query,
json.dumps(["AND", [[self.cf1.id, "exact", "value"]]]),
)
def test_api_create_invalid_workflow_trigger(self):
"""
@@ -376,6 +417,14 @@ class TestApiWorkflows(DirectoriesMixin, APITestCase):
{
"type": WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
"filter_has_tags": [self.t1.id],
"filter_has_all_tags": [self.t2.id],
"filter_has_not_tags": [self.t3.id],
"filter_has_not_correspondents": [self.c2.id],
"filter_has_not_document_types": [self.dt2.id],
"filter_has_not_storage_paths": [self.sp2.id],
"filter_custom_field_query": json.dumps(
["AND", [[self.cf1.id, "exact", "value"]]],
),
"filter_has_correspondent": self.c.id, "filter_has_correspondent": self.c.id,
"filter_has_document_type": self.dt.id, "filter_has_document_type": self.dt.id,
}, },
@@ -393,6 +442,30 @@ class TestApiWorkflows(DirectoriesMixin, APITestCase):
workflow = Workflow.objects.get(id=response.data["id"])
self.assertEqual(workflow.name, "Workflow Updated")
self.assertEqual(workflow.triggers.first().filter_has_tags.first(), self.t1)
self.assertEqual(
workflow.triggers.first().filter_has_all_tags.first(),
self.t2,
)
self.assertEqual(
workflow.triggers.first().filter_has_not_tags.first(),
self.t3,
)
self.assertEqual(
workflow.triggers.first().filter_has_not_correspondents.first(),
self.c2,
)
self.assertEqual(
workflow.triggers.first().filter_has_not_document_types.first(),
self.dt2,
)
self.assertEqual(
workflow.triggers.first().filter_has_not_storage_paths.first(),
self.sp2,
)
self.assertEqual(
workflow.triggers.first().filter_custom_field_query,
json.dumps(["AND", [[self.cf1.id, "exact", "value"]]]),
)
self.assertEqual(workflow.actions.first().assign_title, "Action New Title")
def test_api_update_workflow_no_trigger_actions(self):

View File

@@ -1,4 +1,5 @@
import datetime
import json
import shutil
import socket
from datetime import timedelta
@@ -31,6 +32,7 @@ from documents import tasks
from documents.data_models import ConsumableDocument
from documents.data_models import DocumentSource
from documents.matching import document_matches_workflow
from documents.matching import existing_document_matches_workflow
from documents.matching import prefilter_documents_by_workflowtrigger
from documents.models import Correspondent
from documents.models import CustomField
@@ -46,6 +48,7 @@ from documents.models import WorkflowActionEmail
from documents.models import WorkflowActionWebhook
from documents.models import WorkflowRun
from documents.models import WorkflowTrigger
from documents.serialisers import WorkflowTriggerSerializer
from documents.signals import document_consumption_finished
from documents.tests.utils import DirectoriesMixin
from documents.tests.utils import DummyProgressManager
@@ -1080,9 +1083,409 @@ class TestWorkflows(
)
expected_str = f"Document did not match {w}"
self.assertIn(expected_str, cm.output[0])
-expected_str = f"Document tags {doc.tags.all()} do not include {trigger.filter_has_tags.all()}"
+expected_str = f"Document tags {list(doc.tags.all())} do not include {list(trigger.filter_has_tags.all())}"
self.assertIn(expected_str, cm.output[1])
def test_document_added_no_match_all_tags(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
)
trigger.filter_has_all_tags.set([self.t1, self.t2])
action = WorkflowAction.objects.create(
assign_title="Doc assign owner",
assign_owner=self.user2,
)
w = Workflow.objects.create(
name="Workflow 1",
order=0,
)
w.triggers.add(trigger)
w.actions.add(action)
w.save()
doc = Document.objects.create(
title="sample test",
correspondent=self.c,
original_filename="sample.pdf",
)
doc.tags.set([self.t1])
doc.save()
with self.assertLogs("paperless.matching", level="DEBUG") as cm:
document_consumption_finished.send(
sender=self.__class__,
document=doc,
)
expected_str = f"Document did not match {w}"
self.assertIn(expected_str, cm.output[0])
expected_str = (
f"Document tags {list(doc.tags.all())} do not contain all of"
f" {list(trigger.filter_has_all_tags.all())}"
)
self.assertIn(expected_str, cm.output[1])
def test_document_added_excluded_tags(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
)
trigger.filter_has_not_tags.set([self.t3])
action = WorkflowAction.objects.create(
assign_title="Doc assign owner",
assign_owner=self.user2,
)
w = Workflow.objects.create(
name="Workflow 1",
order=0,
)
w.triggers.add(trigger)
w.actions.add(action)
w.save()
doc = Document.objects.create(
title="sample test",
correspondent=self.c,
original_filename="sample.pdf",
)
doc.tags.set([self.t3])
doc.save()
with self.assertLogs("paperless.matching", level="DEBUG") as cm:
document_consumption_finished.send(
sender=self.__class__,
document=doc,
)
expected_str = f"Document did not match {w}"
self.assertIn(expected_str, cm.output[0])
expected_str = (
f"Document tags {list(doc.tags.all())} include excluded tags"
f" {list(trigger.filter_has_not_tags.all())}"
)
self.assertIn(expected_str, cm.output[1])
def test_document_added_excluded_correspondent(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
)
trigger.filter_has_not_correspondents.set([self.c])
action = WorkflowAction.objects.create(
assign_title="Doc assign owner",
assign_owner=self.user2,
)
w = Workflow.objects.create(
name="Workflow 1",
order=0,
)
w.triggers.add(trigger)
w.actions.add(action)
w.save()
doc = Document.objects.create(
title="sample test",
correspondent=self.c,
original_filename="sample.pdf",
)
with self.assertLogs("paperless.matching", level="DEBUG") as cm:
document_consumption_finished.send(
sender=self.__class__,
document=doc,
)
expected_str = f"Document did not match {w}"
self.assertIn(expected_str, cm.output[0])
expected_str = (
f"Document correspondent {doc.correspondent} is excluded by"
f" {list(trigger.filter_has_not_correspondents.all())}"
)
self.assertIn(expected_str, cm.output[1])
def test_document_added_excluded_document_types(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
)
trigger.filter_has_not_document_types.set([self.dt])
action = WorkflowAction.objects.create(
assign_title="Doc assign owner",
assign_owner=self.user2,
)
w = Workflow.objects.create(
name="Workflow 1",
order=0,
)
w.triggers.add(trigger)
w.actions.add(action)
w.save()
doc = Document.objects.create(
title="sample test",
document_type=self.dt,
original_filename="sample.pdf",
)
with self.assertLogs("paperless.matching", level="DEBUG") as cm:
document_consumption_finished.send(
sender=self.__class__,
document=doc,
)
expected_str = f"Document did not match {w}"
self.assertIn(expected_str, cm.output[0])
expected_str = (
f"Document doc type {doc.document_type} is excluded by"
f" {list(trigger.filter_has_not_document_types.all())}"
)
self.assertIn(expected_str, cm.output[1])
def test_document_added_excluded_storage_paths(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
)
trigger.filter_has_not_storage_paths.set([self.sp])
action = WorkflowAction.objects.create(
assign_title="Doc assign owner",
assign_owner=self.user2,
)
w = Workflow.objects.create(
name="Workflow 1",
order=0,
)
w.triggers.add(trigger)
w.actions.add(action)
w.save()
doc = Document.objects.create(
title="sample test",
storage_path=self.sp,
original_filename="sample.pdf",
)
with self.assertLogs("paperless.matching", level="DEBUG") as cm:
document_consumption_finished.send(
sender=self.__class__,
document=doc,
)
expected_str = f"Document did not match {w}"
self.assertIn(expected_str, cm.output[0])
expected_str = (
f"Document storage path {doc.storage_path} is excluded by"
f" {list(trigger.filter_has_not_storage_paths.all())}"
)
self.assertIn(expected_str, cm.output[1])
def test_document_added_custom_field_query_no_match(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
filter_custom_field_query=json.dumps(
[
"AND",
[[self.cf1.id, "exact", "expected"]],
],
),
)
action = WorkflowAction.objects.create(
assign_title="Doc assign owner",
assign_owner=self.user2,
)
workflow = Workflow.objects.create(name="Workflow 1", order=0)
workflow.triggers.add(trigger)
workflow.actions.add(action)
workflow.save()
doc = Document.objects.create(
title="sample test",
correspondent=self.c,
original_filename="sample.pdf",
)
CustomFieldInstance.objects.create(
document=doc,
field=self.cf1,
value_text="other",
)
with self.assertLogs("paperless.matching", level="DEBUG") as cm:
document_consumption_finished.send(
sender=self.__class__,
document=doc,
)
expected_str = f"Document did not match {workflow}"
self.assertIn(expected_str, cm.output[0])
self.assertIn(
"Document custom fields do not match the configured custom field query",
cm.output[1],
)
def test_document_added_custom_field_query_match(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
filter_custom_field_query=json.dumps(
[
"AND",
[[self.cf1.id, "exact", "expected"]],
],
),
)
doc = Document.objects.create(
title="sample test",
correspondent=self.c,
original_filename="sample.pdf",
)
CustomFieldInstance.objects.create(
document=doc,
field=self.cf1,
value_text="expected",
)
matched, reason = existing_document_matches_workflow(doc, trigger)
self.assertTrue(matched)
self.assertIsNone(reason)
def test_prefilter_documents_custom_field_query(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
filter_custom_field_query=json.dumps(
[
"AND",
[[self.cf1.id, "exact", "match"]],
],
),
)
doc1 = Document.objects.create(
title="doc 1",
correspondent=self.c,
original_filename="doc1.pdf",
checksum="checksum1",
)
CustomFieldInstance.objects.create(
document=doc1,
field=self.cf1,
value_text="match",
)
doc2 = Document.objects.create(
title="doc 2",
correspondent=self.c,
original_filename="doc2.pdf",
checksum="checksum2",
)
CustomFieldInstance.objects.create(
document=doc2,
field=self.cf1,
value_text="different",
)
filtered = prefilter_documents_by_workflowtrigger(
Document.objects.all(),
trigger,
)
self.assertIn(doc1, filtered)
self.assertNotIn(doc2, filtered)
def test_consumption_trigger_requires_filter_configuration(self):
serializer = WorkflowTriggerSerializer(
data={
"type": WorkflowTrigger.WorkflowTriggerType.CONSUMPTION,
},
)
self.assertFalse(serializer.is_valid())
errors = serializer.errors.get("non_field_errors", [])
self.assertIn(
"File name, path or mail rule filter are required",
[str(error) for error in errors],
)
def test_workflow_trigger_serializer_clears_empty_custom_field_query(self):
serializer = WorkflowTriggerSerializer(
data={
"type": WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
"filter_custom_field_query": "",
},
)
self.assertTrue(serializer.is_valid(), serializer.errors)
self.assertIsNone(serializer.validated_data.get("filter_custom_field_query"))
def test_existing_document_invalid_custom_field_query_configuration(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
filter_custom_field_query="{ not json",
)
document = Document.objects.create(
title="doc invalid query",
original_filename="invalid.pdf",
checksum="checksum-invalid-query",
)
matched, reason = existing_document_matches_workflow(document, trigger)
self.assertFalse(matched)
self.assertEqual(reason, "Invalid custom field query configuration")
def test_prefilter_documents_returns_none_for_invalid_custom_field_query(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
filter_custom_field_query="{ not json",
)
Document.objects.create(
title="doc",
original_filename="doc.pdf",
checksum="checksum-prefilter-invalid",
)
filtered = prefilter_documents_by_workflowtrigger(
Document.objects.all(),
trigger,
)
self.assertEqual(list(filtered), [])
def test_prefilter_documents_applies_all_filters(self):
other_document_type = DocumentType.objects.create(name="Other Type")
other_storage_path = StoragePath.objects.create(
name="Blocked path",
path="/blocked/",
)
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,
filter_has_correspondent=self.c,
filter_has_document_type=self.dt,
filter_has_storage_path=self.sp,
)
trigger.filter_has_tags.set([self.t1])
trigger.filter_has_all_tags.set([self.t1, self.t2])
trigger.filter_has_not_tags.set([self.t3])
trigger.filter_has_not_correspondents.set([self.c2])
trigger.filter_has_not_document_types.set([other_document_type])
trigger.filter_has_not_storage_paths.set([other_storage_path])
allowed_document = Document.objects.create(
title="allowed",
correspondent=self.c,
document_type=self.dt,
storage_path=self.sp,
original_filename="allow.pdf",
checksum="checksum-prefilter-allowed",
)
allowed_document.tags.set([self.t1, self.t2])
blocked_document = Document.objects.create(
title="blocked",
correspondent=self.c2,
document_type=other_document_type,
storage_path=other_storage_path,
original_filename="block.pdf",
checksum="checksum-prefilter-blocked",
)
blocked_document.tags.set([self.t1, self.t3])
filtered = prefilter_documents_by_workflowtrigger(
Document.objects.all(),
trigger,
)
self.assertIn(allowed_document, filtered)
self.assertNotIn(blocked_document, filtered)
def test_document_added_no_match_doctype(self):
trigger = WorkflowTrigger.objects.create(
type=WorkflowTrigger.WorkflowTriggerType.DOCUMENT_ADDED,

View File

@@ -2,7 +2,7 @@ msgid ""
msgstr "" msgstr ""
"Project-Id-Version: paperless-ngx\n" "Project-Id-Version: paperless-ngx\n"
"Report-Msgid-Bugs-To: \n" "Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2025-09-30 16:50+0000\n" "POT-Creation-Date: 2025-10-13 22:25+0000\n"
"PO-Revision-Date: 2022-02-17 04:17\n" "PO-Revision-Date: 2022-02-17 04:17\n"
"Last-Translator: \n" "Last-Translator: \n"
"Language-Team: English\n" "Language-Team: English\n"
@@ -89,7 +89,7 @@ msgstr ""
msgid "Automatic" msgid "Automatic"
msgstr "" msgstr ""
#: documents/models.py:64 documents/models.py:456 documents/models.py:1484 #: documents/models.py:64 documents/models.py:456 documents/models.py:1526
#: paperless_mail/models.py:23 paperless_mail/models.py:143 #: paperless_mail/models.py:23 paperless_mail/models.py:143
msgid "name" msgid "name"
msgstr "" msgstr ""
@@ -264,7 +264,7 @@ msgid "The position of this document in your physical document archive."
msgstr "" msgstr ""
#: documents/models.py:318 documents/models.py:699 documents/models.py:753 #: documents/models.py:318 documents/models.py:699 documents/models.py:753
#: documents/models.py:1527 #: documents/models.py:1569
msgid "document" msgid "document"
msgstr "" msgstr ""
@@ -864,371 +864,399 @@ msgstr ""
msgid "has these tag(s)" msgid "has these tag(s)"
msgstr "" msgstr ""
#: documents/models.py:1073 #: documents/models.py:1072
msgid "has all of these tag(s)"
msgstr ""
#: documents/models.py:1079
msgid "does not have these tag(s)"
msgstr ""
#: documents/models.py:1087
msgid "has this document type" msgid "has this document type"
msgstr "" msgstr ""
#: documents/models.py:1081 #: documents/models.py:1094
msgid "does not have these document type(s)"
msgstr ""
#: documents/models.py:1102
msgid "has this correspondent" msgid "has this correspondent"
msgstr "" msgstr ""
#: documents/models.py:1089 #: documents/models.py:1109
msgid "does not have these correspondent(s)"
msgstr ""
#: documents/models.py:1117
msgid "has this storage path" msgid "has this storage path"
msgstr "" msgstr ""
#: documents/models.py:1093 #: documents/models.py:1124
msgid "does not have these storage path(s)"
msgstr ""
#: documents/models.py:1128
msgid "filter custom field query"
msgstr ""
#: documents/models.py:1131
msgid "JSON-encoded custom field query expression."
msgstr ""
#: documents/models.py:1135
msgid "schedule offset days" msgid "schedule offset days"
msgstr "" msgstr ""
#: documents/models.py:1096 #: documents/models.py:1138
msgid "The number of days to offset the schedule trigger by." msgid "The number of days to offset the schedule trigger by."
msgstr "" msgstr ""
#: documents/models.py:1101 #: documents/models.py:1143
msgid "schedule is recurring" msgid "schedule is recurring"
msgstr "" msgstr ""
#: documents/models.py:1104 #: documents/models.py:1146
msgid "If the schedule should be recurring." msgid "If the schedule should be recurring."
msgstr "" msgstr ""
#: documents/models.py:1109 #: documents/models.py:1151
msgid "schedule recurring delay in days" msgid "schedule recurring delay in days"
msgstr "" msgstr ""
#: documents/models.py:1113 #: documents/models.py:1155
msgid "The number of days between recurring schedule triggers." msgid "The number of days between recurring schedule triggers."
msgstr "" msgstr ""
#: documents/models.py:1118 #: documents/models.py:1160
msgid "schedule date field" msgid "schedule date field"
msgstr "" msgstr ""
#: documents/models.py:1123 #: documents/models.py:1165
msgid "The field to check for a schedule trigger." msgid "The field to check for a schedule trigger."
msgstr "" msgstr ""
#: documents/models.py:1132 #: documents/models.py:1174
msgid "schedule date custom field" msgid "schedule date custom field"
msgstr "" msgstr ""
#: documents/models.py:1136 #: documents/models.py:1178
msgid "workflow trigger" msgid "workflow trigger"
msgstr "" msgstr ""
#: documents/models.py:1137 #: documents/models.py:1179
msgid "workflow triggers" msgid "workflow triggers"
msgstr "" msgstr ""
#: documents/models.py:1145 #: documents/models.py:1187
msgid "email subject" msgid "email subject"
msgstr "" msgstr ""
#: documents/models.py:1149 #: documents/models.py:1191
msgid "" msgid ""
"The subject of the email, can include some placeholders, see documentation." "The subject of the email, can include some placeholders, see documentation."
msgstr "" msgstr ""
#: documents/models.py:1155 #: documents/models.py:1197
msgid "email body" msgid "email body"
msgstr "" msgstr ""
#: documents/models.py:1158 #: documents/models.py:1200
msgid "" msgid ""
"The body (message) of the email, can include some placeholders, see " "The body (message) of the email, can include some placeholders, see "
"documentation." "documentation."
msgstr "" msgstr ""
#: documents/models.py:1164 #: documents/models.py:1206
msgid "emails to" msgid "emails to"
msgstr "" msgstr ""
#: documents/models.py:1167 #: documents/models.py:1209
msgid "The destination email addresses, comma separated." msgid "The destination email addresses, comma separated."
msgstr "" msgstr ""
#: documents/models.py:1173 #: documents/models.py:1215
msgid "include document in email" msgid "include document in email"
msgstr "" msgstr ""
#: documents/models.py:1184 #: documents/models.py:1226
msgid "webhook url" msgid "webhook url"
msgstr "" msgstr ""
#: documents/models.py:1187 #: documents/models.py:1229
msgid "The destination URL for the notification." msgid "The destination URL for the notification."
msgstr "" msgstr ""
#: documents/models.py:1192 #: documents/models.py:1234
msgid "use parameters" msgid "use parameters"
msgstr "" msgstr ""
#: documents/models.py:1197 #: documents/models.py:1239
msgid "send as JSON" msgid "send as JSON"
msgstr "" msgstr ""
#: documents/models.py:1201 #: documents/models.py:1243
msgid "webhook parameters" msgid "webhook parameters"
msgstr "" msgstr ""
#: documents/models.py:1204 #: documents/models.py:1246
msgid "The parameters to send with the webhook URL if body not used." msgid "The parameters to send with the webhook URL if body not used."
msgstr "" msgstr ""
#: documents/models.py:1208 #: documents/models.py:1250
msgid "webhook body" msgid "webhook body"
msgstr "" msgstr ""
#: documents/models.py:1211 #: documents/models.py:1253
msgid "The body to send with the webhook URL if parameters not used." msgid "The body to send with the webhook URL if parameters not used."
msgstr "" msgstr ""
#: documents/models.py:1215 #: documents/models.py:1257
msgid "webhook headers" msgid "webhook headers"
msgstr "" msgstr ""
#: documents/models.py:1218 #: documents/models.py:1260
msgid "The headers to send with the webhook URL." msgid "The headers to send with the webhook URL."
msgstr "" msgstr ""
#: documents/models.py:1223 #: documents/models.py:1265
msgid "include document in webhook" msgid "include document in webhook"
msgstr "" msgstr ""
#: documents/models.py:1234 #: documents/models.py:1276
msgid "Assignment" msgid "Assignment"
msgstr "" msgstr ""
#: documents/models.py:1238 #: documents/models.py:1280
msgid "Removal" msgid "Removal"
msgstr "" msgstr ""
#: documents/models.py:1242 documents/templates/account/password_reset.html:15 #: documents/models.py:1284 documents/templates/account/password_reset.html:15
msgid "Email" msgid "Email"
msgstr "" msgstr ""
#: documents/models.py:1246 #: documents/models.py:1288
msgid "Webhook" msgid "Webhook"
msgstr "" msgstr ""
#: documents/models.py:1250 #: documents/models.py:1292
msgid "Workflow Action Type" msgid "Workflow Action Type"
msgstr "" msgstr ""
#: documents/models.py:1256 #: documents/models.py:1298
msgid "assign title" msgid "assign title"
msgstr "" msgstr ""
#: documents/models.py:1260 #: documents/models.py:1302
msgid "Assign a document title, must be a Jinja2 template, see documentation." msgid "Assign a document title, must be a Jinja2 template, see documentation."
msgstr "" msgstr ""
#: documents/models.py:1268 paperless_mail/models.py:274 #: documents/models.py:1310 paperless_mail/models.py:274
msgid "assign this tag" msgid "assign this tag"
msgstr "" msgstr ""
#: documents/models.py:1277 paperless_mail/models.py:282 #: documents/models.py:1319 paperless_mail/models.py:282
msgid "assign this document type" msgid "assign this document type"
msgstr "" msgstr ""
#: documents/models.py:1286 paperless_mail/models.py:296 #: documents/models.py:1328 paperless_mail/models.py:296
msgid "assign this correspondent" msgid "assign this correspondent"
msgstr "" msgstr ""
#: documents/models.py:1295 #: documents/models.py:1337
msgid "assign this storage path" msgid "assign this storage path"
msgstr "" msgstr ""
#: documents/models.py:1304 #: documents/models.py:1346
msgid "assign this owner" msgid "assign this owner"
msgstr "" msgstr ""
#: documents/models.py:1311 #: documents/models.py:1353
msgid "grant view permissions to these users" msgid "grant view permissions to these users"
msgstr "" msgstr ""
#: documents/models.py:1318 #: documents/models.py:1360
msgid "grant view permissions to these groups" msgid "grant view permissions to these groups"
msgstr "" msgstr ""
#: documents/models.py:1325 #: documents/models.py:1367
msgid "grant change permissions to these users" msgid "grant change permissions to these users"
msgstr "" msgstr ""
#: documents/models.py:1332 #: documents/models.py:1374
msgid "grant change permissions to these groups" msgid "grant change permissions to these groups"
msgstr "" msgstr ""
#: documents/models.py:1339 #: documents/models.py:1381
msgid "assign these custom fields" msgid "assign these custom fields"
msgstr "" msgstr ""
#: documents/models.py:1343 #: documents/models.py:1385
msgid "custom field values" msgid "custom field values"
msgstr "" msgstr ""
#: documents/models.py:1347 #: documents/models.py:1389
msgid "Optional values to assign to the custom fields." msgid "Optional values to assign to the custom fields."
msgstr "" msgstr ""
#: documents/models.py:1356 #: documents/models.py:1398
msgid "remove these tag(s)" msgid "remove these tag(s)"
msgstr "" msgstr ""
#: documents/models.py:1361 #: documents/models.py:1403
msgid "remove all tags" msgid "remove all tags"
msgstr "" msgstr ""
#: documents/models.py:1368 #: documents/models.py:1410
msgid "remove these document type(s)" msgid "remove these document type(s)"
msgstr "" msgstr ""
#: documents/models.py:1373 #: documents/models.py:1415
msgid "remove all document types" msgid "remove all document types"
msgstr "" msgstr ""
#: documents/models.py:1380 #: documents/models.py:1422
msgid "remove these correspondent(s)" msgid "remove these correspondent(s)"
msgstr "" msgstr ""
#: documents/models.py:1385 #: documents/models.py:1427
msgid "remove all correspondents" msgid "remove all correspondents"
msgstr "" msgstr ""
#: documents/models.py:1392 #: documents/models.py:1434
msgid "remove these storage path(s)" msgid "remove these storage path(s)"
msgstr "" msgstr ""
#: documents/models.py:1397 #: documents/models.py:1439
msgid "remove all storage paths" msgid "remove all storage paths"
msgstr "" msgstr ""
#: documents/models.py:1404 #: documents/models.py:1446
msgid "remove these owner(s)" msgid "remove these owner(s)"
msgstr "" msgstr ""
#: documents/models.py:1409 #: documents/models.py:1451
msgid "remove all owners" msgid "remove all owners"
msgstr "" msgstr ""
#: documents/models.py:1416 #: documents/models.py:1458
msgid "remove view permissions for these users" msgid "remove view permissions for these users"
msgstr "" msgstr ""
#: documents/models.py:1423 #: documents/models.py:1465
msgid "remove view permissions for these groups" msgid "remove view permissions for these groups"
msgstr "" msgstr ""
#: documents/models.py:1430 #: documents/models.py:1472
msgid "remove change permissions for these users" msgid "remove change permissions for these users"
msgstr "" msgstr ""
#: documents/models.py:1437 #: documents/models.py:1479
msgid "remove change permissions for these groups" msgid "remove change permissions for these groups"
msgstr "" msgstr ""
#: documents/models.py:1442 #: documents/models.py:1484
msgid "remove all permissions" msgid "remove all permissions"
msgstr "" msgstr ""
#: documents/models.py:1449 #: documents/models.py:1491
msgid "remove these custom fields" msgid "remove these custom fields"
msgstr "" msgstr ""
#: documents/models.py:1454 #: documents/models.py:1496
msgid "remove all custom fields" msgid "remove all custom fields"
msgstr "" msgstr ""
#: documents/models.py:1463 #: documents/models.py:1505
msgid "email" msgid "email"
msgstr "" msgstr ""
#: documents/models.py:1472 #: documents/models.py:1514
msgid "webhook" msgid "webhook"
msgstr "" msgstr ""
#: documents/models.py:1476 #: documents/models.py:1518
msgid "workflow action" msgid "workflow action"
msgstr "" msgstr ""
#: documents/models.py:1477 #: documents/models.py:1519
msgid "workflow actions" msgid "workflow actions"
msgstr "" msgstr ""
#: documents/models.py:1486 paperless_mail/models.py:145 #: documents/models.py:1528 paperless_mail/models.py:145
msgid "order" msgid "order"
msgstr "" msgstr ""
#: documents/models.py:1492 #: documents/models.py:1534
msgid "triggers" msgid "triggers"
msgstr "" msgstr ""
#: documents/models.py:1499 #: documents/models.py:1541
msgid "actions" msgid "actions"
msgstr "" msgstr ""
#: documents/models.py:1502 paperless_mail/models.py:154 #: documents/models.py:1544 paperless_mail/models.py:154
msgid "enabled" msgid "enabled"
msgstr "" msgstr ""
#: documents/models.py:1513 #: documents/models.py:1555
msgid "workflow" msgid "workflow"
msgstr "" msgstr ""
#: documents/models.py:1517 #: documents/models.py:1559
msgid "workflow trigger type" msgid "workflow trigger type"
msgstr "" msgstr ""
#: documents/models.py:1531 #: documents/models.py:1573
msgid "date run" msgid "date run"
msgstr "" msgstr ""
#: documents/models.py:1537 #: documents/models.py:1579
msgid "workflow run" msgid "workflow run"
msgstr "" msgstr ""
#: documents/models.py:1538 #: documents/models.py:1580
msgid "workflow runs" msgid "workflow runs"
msgstr "" msgstr ""
#: documents/serialisers.py:141 #: documents/serialisers.py:143
#, python-format #, python-format
msgid "Invalid regular expression: %(error)s" msgid "Invalid regular expression: %(error)s"
msgstr "" msgstr ""
#: documents/serialisers.py:607 #: documents/serialisers.py:609
msgid "Invalid color." msgid "Invalid color."
msgstr "" msgstr ""
#: documents/serialisers.py:636 #: documents/serialisers.py:638
msgid "Invalid parent tag." msgid "Invalid parent tag."
msgstr "" msgstr ""
#: documents/serialisers.py:1793 #: documents/serialisers.py:1795
#, python-format #, python-format
msgid "File type %(type)s not supported" msgid "File type %(type)s not supported"
msgstr "" msgstr ""
#: documents/serialisers.py:1837 #: documents/serialisers.py:1839
#, python-format #, python-format
msgid "Custom field id must be an integer: %(id)s" msgid "Custom field id must be an integer: %(id)s"
msgstr "" msgstr ""
#: documents/serialisers.py:1844 #: documents/serialisers.py:1846
#, python-format #, python-format
msgid "Custom field with id %(id)s does not exist" msgid "Custom field with id %(id)s does not exist"
msgstr "" msgstr ""
#: documents/serialisers.py:1861 documents/serialisers.py:1871 #: documents/serialisers.py:1863 documents/serialisers.py:1873
msgid "" msgid ""
"Custom fields must be a list of integers or an object mapping ids to values." "Custom fields must be a list of integers or an object mapping ids to values."
msgstr "" msgstr ""
#: documents/serialisers.py:1866 #: documents/serialisers.py:1868
msgid "Some custom fields don't exist or were specified twice." msgid "Some custom fields don't exist or were specified twice."
msgstr "" msgstr ""
#: documents/serialisers.py:1936 #: documents/serialisers.py:1983
msgid "Invalid variable detected." msgid "Invalid variable detected."
msgstr "" msgstr ""

uv.lock (generated, 115 changes)

@@ -1036,11 +1036,11 @@ wheels = [
[[package]]
name = "filelock"
-version = "3.20.0"
+version = "3.19.1"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/58/46/0028a82567109b5ef6e4d2a1f04a583fb513e6cf9527fcdd09afd817deeb/filelock-3.20.0.tar.gz", hash = "sha256:711e943b4ec6be42e1d4e6690b48dc175c822967466bb31c0c293f34334c13f4", size = 18922, upload-time = "2025-10-08T18:03:50.056Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/40/bb/0ab3e58d22305b6f5440629d20683af28959bf793d98d11950e305c1c326/filelock-3.19.1.tar.gz", hash = "sha256:66eda1888b0171c998b35be2bcc0f6d75c388a7ce20c3f3f37aa8e96c2dddf58", size = 17687, upload-time = "2025-08-14T16:56:03.016Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/76/91/7216b27286936c16f5b4d0c530087e4a54eead683e6b0b73dd0c64844af6/filelock-3.20.0-py3-none-any.whl", hash = "sha256:339b4732ffda5cd79b13f4e2711a31b0365ce445d95d243bb996273d072546a2", size = 16054, upload-time = "2025-10-08T18:03:48.35Z" },
+{ url = "https://files.pythonhosted.org/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d", size = 15988, upload-time = "2025-08-14T16:56:01.633Z" },
]
[[package]]
@@ -1795,11 +1795,12 @@ wheels = [
[[package]]
name = "mkdocs-material"
-version = "9.6.21"
+version = "9.6.20"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "babel", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "backrefs", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
+{ name = "click", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "colorama", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "jinja2", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "markdown", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
@@ -1810,9 +1811,9 @@ dependencies = [
{ name = "pymdown-extensions", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" }, { name = "pymdown-extensions", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "requests", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" }, { name = "requests", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/ff/d5/ab83ca9aa314954b0a9e8849780bdd01866a3cfcb15ffb7e3a61ca06ff0b/mkdocs_material-9.6.21.tar.gz", hash = "sha256:b01aa6d2731322438056f360f0e623d3faae981f8f2d8c68b1b973f4f2657870", size = 4043097, upload-time = "2025-09-30T19:11:27.517Z" } sdist = { url = "https://files.pythonhosted.org/packages/ba/ee/6ed7fc739bd7591485c8bec67d5984508d3f2733e708f32714c21593341a/mkdocs_material-9.6.20.tar.gz", hash = "sha256:e1f84d21ec5fb730673c4259b2e0d39f8d32a3fef613e3a8e7094b012d43e790", size = 4037822, upload-time = "2025-09-15T08:48:01.816Z" }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/cf/4f/98681c2030375fe9b057dbfb9008b68f46c07dddf583f4df09bf8075e37f/mkdocs_material-9.6.21-py3-none-any.whl", hash = "sha256:aa6a5ab6fb4f6d381588ac51da8782a4d3757cb3d1b174f81a2ec126e1f22c92", size = 9203097, upload-time = "2025-09-30T19:11:24.063Z" }, { url = "https://files.pythonhosted.org/packages/67/d8/a31dd52e657bf12b20574706d07df8d767e1ab4340f9bfb9ce73950e5e59/mkdocs_material-9.6.20-py3-none-any.whl", hash = "sha256:b8d8c8b0444c7c06dd984b55ba456ce731f0035c5a1533cc86793618eb1e6c82", size = 9193367, upload-time = "2025-09-15T08:47:58.722Z" },
] ]
[[package]] [[package]]
@@ -1921,7 +1922,7 @@ sdist = { url = "https://files.pythonhosted.org/packages/61/68/810093cb579daae42
[[package]]
name = "nltk"
-version = "3.9.2"
+version = "3.9.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
@@ -1929,9 +1930,9 @@ dependencies = [
{ name = "regex", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" }, { name = "regex", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "tqdm", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" }, { name = "tqdm", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
] ]
sdist = { url = "https://files.pythonhosted.org/packages/f9/76/3a5e4312c19a028770f86fd7c058cf9f4ec4321c6cf7526bab998a5b683c/nltk-3.9.2.tar.gz", hash = "sha256:0f409e9b069ca4177c1903c3e843eef90c7e92992fa4931ae607da6de49e1419", size = 2887629, upload-time = "2025-10-01T07:19:23.764Z" } sdist = { url = "https://files.pythonhosted.org/packages/3c/87/db8be88ad32c2d042420b6fd9ffd4a149f9a0d7f0e86b3f543be2eeeedd2/nltk-3.9.1.tar.gz", hash = "sha256:87d127bd3de4bd89a4f81265e5fa59cb1b199b27440175370f7417d2bc7ae868", size = 2904691, upload-time = "2024-08-18T19:48:37.769Z" }
wheels = [ wheels = [
{ url = "https://files.pythonhosted.org/packages/60/90/81ac364ef94209c100e12579629dc92bf7a709a84af32f8c551b02c07e94/nltk-3.9.2-py3-none-any.whl", hash = "sha256:1e209d2b3009110635ed9709a67a1a3e33a10f799490fa71cf4bec218c11c88a", size = 1513404, upload-time = "2025-10-01T07:19:21.648Z" }, { url = "https://files.pythonhosted.org/packages/4d/66/7d9e26593edda06e8cb531874633f7c2372279c3b0f46235539fe546df8b/nltk-3.9.1-py3-none-any.whl", hash = "sha256:4fa26829c5b00715afe3061398a8989dc643b92ce7dd93fb4585a70930d168a1", size = 1505442, upload-time = "2024-08-18T19:48:21.909Z" },
] ]
[[package]] [[package]]
@@ -2279,7 +2280,7 @@ requires-dist = [
{ name = "drf-spectacular", specifier = "~=0.28" }, { name = "drf-spectacular", specifier = "~=0.28" },
{ name = "drf-spectacular-sidecar", specifier = "~=2025.9.1" }, { name = "drf-spectacular-sidecar", specifier = "~=2025.9.1" },
{ name = "drf-writable-nested", specifier = "~=0.7.1" }, { name = "drf-writable-nested", specifier = "~=0.7.1" },
{ name = "filelock", specifier = "~=3.20.0" }, { name = "filelock", specifier = "~=3.19.1" },
{ name = "flower", specifier = "~=2.0.1" }, { name = "flower", specifier = "~=2.0.1" },
{ name = "gotenberg-client", specifier = "~=0.11.0" }, { name = "gotenberg-client", specifier = "~=0.11.0" },
{ name = "granian", extras = ["uvloop"], marker = "extra == 'webserver'", specifier = "~=2.5.1" }, { name = "granian", extras = ["uvloop"], marker = "extra == 'webserver'", specifier = "~=2.5.1" },
@@ -2327,7 +2328,7 @@ dev = [
{ name = "mkdocs-glightbox", specifier = "~=0.5.1" }, { name = "mkdocs-glightbox", specifier = "~=0.5.1" },
{ name = "mkdocs-material", specifier = "~=9.6.4" }, { name = "mkdocs-material", specifier = "~=9.6.4" },
{ name = "pre-commit", specifier = "~=4.3.0" }, { name = "pre-commit", specifier = "~=4.3.0" },
{ name = "pre-commit-uv", specifier = "~=4.2.0" }, { name = "pre-commit-uv", specifier = "~=4.1.3" },
{ name = "pytest", specifier = "~=8.4.1" }, { name = "pytest", specifier = "~=8.4.1" },
{ name = "pytest-cov", specifier = "~=7.0.0" }, { name = "pytest-cov", specifier = "~=7.0.0" },
{ name = "pytest-django", specifier = "~=4.11.1" }, { name = "pytest-django", specifier = "~=4.11.1" },
@@ -2337,7 +2338,7 @@ dev = [
{ name = "pytest-rerunfailures" }, { name = "pytest-rerunfailures" },
{ name = "pytest-sugar" }, { name = "pytest-sugar" },
{ name = "pytest-xdist" }, { name = "pytest-xdist" },
{ name = "ruff", specifier = "~=0.14.0" }, { name = "ruff", specifier = "~=0.13.0" },
] ]
docs = [ docs = [
{ name = "mkdocs-glightbox", specifier = "~=0.5.1" }, { name = "mkdocs-glightbox", specifier = "~=0.5.1" },
@@ -2345,8 +2346,8 @@ docs = [
]
lint = [
{ name = "pre-commit", specifier = "~=4.3.0" },
-{ name = "pre-commit-uv", specifier = "~=4.2.0" },
+{ name = "pre-commit-uv", specifier = "~=4.1.3" },
-{ name = "ruff", specifier = "~=0.14.0" },
+{ name = "ruff", specifier = "~=0.13.0" },
]
testing = [
{ name = "daphne" },
@@ -2641,15 +2642,15 @@ wheels = [
[[package]]
name = "pre-commit-uv"
-version = "4.2.0"
+version = "4.1.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pre-commit", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "uv", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/f6/42/84372bc99a841bfdd8b182a50186471a7f5e873d8e8bcec0d0cb6dabcbb0/pre_commit_uv-4.2.0.tar.gz", hash = "sha256:c32bb1d90235507726eee2aeef2be5fdab431a6f1906e3f1addb0a4e99b369d1", size = 6912, upload-time = "2025-10-09T19:30:48.354Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/3d/0c/e6ab71e93d8e78ffa36a1f8b6ce12014679e2b83b401404c12bb2840078f/pre_commit_uv-4.1.5.tar.gz", hash = "sha256:3f40714152b4f4aa484703b8dbfeb9baa0aaedb17207e0012b3561da756d577d", size = 6920, upload-time = "2025-08-27T14:44:40.178Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/87/9f/ec8491f6b3022489a4d36ce372214c10a34f90b425aa61ff2e0a8dc5b9d5/pre_commit_uv-4.2.0-py3-none-any.whl", hash = "sha256:cc1b56641e6c62d90a4d8b4f0af6f2610f1c397ce81af024e768c0f33715cb81", size = 5650, upload-time = "2025-10-09T19:30:47.257Z" },
+{ url = "https://files.pythonhosted.org/packages/f7/c6/747bc58da9f0665c607890c73b349b3934381e312272f584808182655898/pre_commit_uv-4.1.5-py3-none-any.whl", hash = "sha256:f4805e45615b898c4ca6ea37bdb60a05bb7830f986c303a06a378d6b50c3aa9e", size = 5653, upload-time = "2025-08-27T14:44:39.187Z" },
]
[[package]]
@@ -2865,15 +2866,15 @@ wheels = [
[[package]]
name = "pytest-env"
-version = "1.2.0"
+version = "1.1.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pytest", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "tomli", marker = "(python_full_version < '3.11' and sys_platform == 'darwin') or (python_full_version < '3.11' and sys_platform == 'linux')" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/13/12/9c87d0ca45d5992473208bcef2828169fa7d39b8d7fc6e3401f5c08b8bf7/pytest_env-1.2.0.tar.gz", hash = "sha256:475e2ebe8626cee01f491f304a74b12137742397d6c784ea4bc258f069232b80", size = 8973, upload-time = "2025-10-09T19:15:47.42Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/1f/31/27f28431a16b83cab7a636dce59cf397517807d247caa38ee67d65e71ef8/pytest_env-1.1.5.tar.gz", hash = "sha256:91209840aa0e43385073ac464a554ad2947cc2fd663a9debf88d03b01e0cc1cf", size = 8911, upload-time = "2024-09-17T22:39:18.566Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/27/98/822b924a4a3eb58aacba84444c7439fce32680592f394de26af9c76e2569/pytest_env-1.2.0-py3-none-any.whl", hash = "sha256:d7e5b7198f9b83c795377c09feefa45d56083834e60d04767efd64819fc9da00", size = 6251, upload-time = "2025-10-09T19:15:46.077Z" },
+{ url = "https://files.pythonhosted.org/packages/de/b8/87cfb16045c9d4092cfcf526135d73b88101aac83bc1adcf82dfb5fd3833/pytest_env-1.1.5-py3-none-any.whl", hash = "sha256:ce90cf8772878515c24b31cd97c7fa1f4481cd68d588419fd45f10ecaee6bc30", size = 6141, upload-time = "2024-09-17T22:39:16.942Z" },
]
[[package]]
@@ -2903,15 +2904,15 @@ wheels = [
[[package]]
name = "pytest-rerunfailures"
-version = "16.1"
+version = "16.0.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "packaging", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "pytest", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/de/04/71e9520551fc8fe2cf5c1a1842e4e600265b0815f2016b7c27ec85688682/pytest_rerunfailures-16.1.tar.gz", hash = "sha256:c38b266db8a808953ebd71ac25c381cb1981a78ff9340a14bcb9f1b9bff1899e", size = 30889, upload-time = "2025-10-10T07:06:01.238Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/26/53/a543a76f922a5337d10df22441af8bf68f1b421cadf9aedf8a77943b81f6/pytest_rerunfailures-16.0.1.tar.gz", hash = "sha256:ed4b3a6e7badb0a720ddd93f9de1e124ba99a0cb13bc88561b3c168c16062559", size = 27612, upload-time = "2025-09-02T06:48:25.193Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/77/54/60eabb34445e3db3d3d874dc1dfa72751bfec3265bd611cb13c8b290adea/pytest_rerunfailures-16.1-py3-none-any.whl", hash = "sha256:5d11b12c0ca9a1665b5054052fcc1084f8deadd9328962745ef6b04e26382e86", size = 14093, upload-time = "2025-10-10T07:06:00.019Z" },
+{ url = "https://files.pythonhosted.org/packages/38/73/67dc14cda1942914e70fbb117fceaf11e259362c517bdadd76b0dd752524/pytest_rerunfailures-16.0.1-py3-none-any.whl", hash = "sha256:0bccc0e3b0e3388275c25a100f7077081318196569a121217688ed05e58984b9", size = 13610, upload-time = "2025-09-02T06:48:23.615Z" },
]
[[package]]
@@ -3523,25 +3524,25 @@ wheels = [
[[package]]
name = "ruff"
-version = "0.14.0"
+version = "0.13.2"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/41/b9/9bd84453ed6dd04688de9b3f3a4146a1698e8faae2ceeccce4e14c67ae17/ruff-0.14.0.tar.gz", hash = "sha256:62ec8969b7510f77945df916de15da55311fade8d6050995ff7f680afe582c57", size = 5452071, upload-time = "2025-10-07T18:21:55.763Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/02/df/8d7d8c515d33adfc540e2edf6c6021ea1c5a58a678d8cfce9fae59aabcab/ruff-0.13.2.tar.gz", hash = "sha256:cb12fffd32fb16d32cef4ed16d8c7cdc27ed7c944eaa98d99d01ab7ab0b710ff", size = 5416417, upload-time = "2025-09-25T14:54:09.936Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/3a/4e/79d463a5f80654e93fa653ebfb98e0becc3f0e7cf6219c9ddedf1e197072/ruff-0.14.0-py3-none-linux_armv6l.whl", hash = "sha256:58e15bffa7054299becf4bab8a1187062c6f8cafbe9f6e39e0d5aface455d6b3", size = 12494532, upload-time = "2025-10-07T18:21:00.373Z" },
+{ url = "https://files.pythonhosted.org/packages/6e/84/5716a7fa4758e41bf70e603e13637c42cfb9dbf7ceb07180211b9bbf75ef/ruff-0.13.2-py3-none-linux_armv6l.whl", hash = "sha256:3796345842b55f033a78285e4f1641078f902020d8450cade03aad01bffd81c3", size = 12343254, upload-time = "2025-09-25T14:53:27.784Z" },
-{ url = "https://files.pythonhosted.org/packages/ee/40/e2392f445ed8e02aa6105d49db4bfff01957379064c30f4811c3bf38aece/ruff-0.14.0-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:838d1b065f4df676b7c9957992f2304e41ead7a50a568185efd404297d5701e8", size = 13160768, upload-time = "2025-10-07T18:21:04.73Z" },
+{ url = "https://files.pythonhosted.org/packages/9b/77/c7042582401bb9ac8eff25360e9335e901d7a1c0749a2b28ba4ecb239991/ruff-0.13.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:ff7e4dda12e683e9709ac89e2dd436abf31a4d8a8fc3d89656231ed808e231d2", size = 13040891, upload-time = "2025-09-25T14:53:31.38Z" },
-{ url = "https://files.pythonhosted.org/packages/75/da/2a656ea7c6b9bd14c7209918268dd40e1e6cea65f4bb9880eaaa43b055cd/ruff-0.14.0-py3-none-macosx_11_0_arm64.whl", hash = "sha256:703799d059ba50f745605b04638fa7e9682cc3da084b2092feee63500ff3d9b8", size = 12363376, upload-time = "2025-10-07T18:21:07.833Z" },
+{ url = "https://files.pythonhosted.org/packages/c6/15/125a7f76eb295cb34d19c6778e3a82ace33730ad4e6f28d3427e134a02e0/ruff-0.13.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:c75e9d2a2fafd1fdd895d0e7e24b44355984affdde1c412a6f6d3f6e16b22d46", size = 12243588, upload-time = "2025-09-25T14:53:33.543Z" },
-{ url = "https://files.pythonhosted.org/packages/42/e2/1ffef5a1875add82416ff388fcb7ea8b22a53be67a638487937aea81af27/ruff-0.14.0-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ba9a8925e90f861502f7d974cc60e18ca29c72bb0ee8bfeabb6ade35a3abde7", size = 12608055, upload-time = "2025-10-07T18:21:10.72Z" },
+{ url = "https://files.pythonhosted.org/packages/9e/eb/0093ae04a70f81f8be7fd7ed6456e926b65d238fc122311293d033fdf91e/ruff-0.13.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cceac74e7bbc53ed7d15d1042ffe7b6577bf294611ad90393bf9b2a0f0ec7cb6", size = 12491359, upload-time = "2025-09-25T14:53:35.892Z" },
-{ url = "https://files.pythonhosted.org/packages/4a/32/986725199d7cee510d9f1dfdf95bf1efc5fa9dd714d0d85c1fb1f6be3bc3/ruff-0.14.0-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e41f785498bd200ffc276eb9e1570c019c1d907b07cfb081092c8ad51975bbe7", size = 12318544, upload-time = "2025-10-07T18:21:13.741Z" },
+{ url = "https://files.pythonhosted.org/packages/43/fe/72b525948a6956f07dad4a6f122336b6a05f2e3fd27471cea612349fedb9/ruff-0.13.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6ae3f469b5465ba6d9721383ae9d49310c19b452a161b57507764d7ef15f4b07", size = 12162486, upload-time = "2025-09-25T14:53:38.171Z" },
-{ url = "https://files.pythonhosted.org/packages/9a/ed/4969cefd53315164c94eaf4da7cfba1f267dc275b0abdd593d11c90829a3/ruff-0.14.0-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:30a58c087aef4584c193aebf2700f0fbcfc1e77b89c7385e3139956fa90434e2", size = 14001280, upload-time = "2025-10-07T18:21:16.411Z" },
+{ url = "https://files.pythonhosted.org/packages/6a/e3/0fac422bbbfb2ea838023e0d9fcf1f30183d83ab2482800e2cb892d02dfe/ruff-0.13.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f8f9e3cd6714358238cd6626b9d43026ed19c0c018376ac1ef3c3a04ffb42d8", size = 13871203, upload-time = "2025-09-25T14:53:41.943Z" },
-{ url = "https://files.pythonhosted.org/packages/ab/ad/96c1fc9f8854c37681c9613d825925c7f24ca1acfc62a4eb3896b50bacd2/ruff-0.14.0-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:f8d07350bc7af0a5ce8812b7d5c1a7293cf02476752f23fdfc500d24b79b783c", size = 15027286, upload-time = "2025-10-07T18:21:19.577Z" },
+{ url = "https://files.pythonhosted.org/packages/6b/82/b721c8e3ec5df6d83ba0e45dcf00892c4f98b325256c42c38ef136496cbf/ruff-0.13.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c6ed79584a8f6cbe2e5d7dbacf7cc1ee29cbdb5df1172e77fbdadc8bb85a1f89", size = 14929635, upload-time = "2025-09-25T14:53:43.953Z" },
-{ url = "https://files.pythonhosted.org/packages/b3/00/1426978f97df4fe331074baf69615f579dc4e7c37bb4c6f57c2aad80c87f/ruff-0.14.0-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eec3bbbf3a7d5482b5c1f42d5fc972774d71d107d447919fca620b0be3e3b75e", size = 14451506, upload-time = "2025-10-07T18:21:22.779Z" },
+{ url = "https://files.pythonhosted.org/packages/c4/a0/ad56faf6daa507b83079a1ad7a11694b87d61e6bf01c66bd82b466f21821/ruff-0.13.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aed130b2fde049cea2019f55deb939103123cdd191105f97a0599a3e753d61b0", size = 14338783, upload-time = "2025-09-25T14:53:46.205Z" },
-{ url = "https://files.pythonhosted.org/packages/58/d5/9c1cea6e493c0cf0647674cca26b579ea9d2a213b74b5c195fbeb9678e15/ruff-0.14.0-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:16b68e183a0e28e5c176d51004aaa40559e8f90065a10a559176713fcf435206", size = 13437384, upload-time = "2025-10-07T18:21:25.758Z" },
+{ url = "https://files.pythonhosted.org/packages/47/77/ad1d9156db8f99cd01ee7e29d74b34050e8075a8438e589121fcd25c4b08/ruff-0.13.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1887c230c2c9d65ed1b4e4cfe4d255577ea28b718ae226c348ae68df958191aa", size = 13355322, upload-time = "2025-09-25T14:53:48.164Z" },
-{ url = "https://files.pythonhosted.org/packages/29/b4/4cd6a4331e999fc05d9d77729c95503f99eae3ba1160469f2b64866964e3/ruff-0.14.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eb732d17db2e945cfcbbc52af0143eda1da36ca8ae25083dd4f66f1542fdf82e", size = 13447976, upload-time = "2025-10-07T18:21:28.83Z" },
+{ url = "https://files.pythonhosted.org/packages/64/8b/e87cfca2be6f8b9f41f0bb12dc48c6455e2d66df46fe61bb441a226f1089/ruff-0.13.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5bcb10276b69b3cfea3a102ca119ffe5c6ba3901e20e60cf9efb53fa417633c3", size = 13354427, upload-time = "2025-09-25T14:53:50.486Z" },
-{ url = "https://files.pythonhosted.org/packages/3b/c0/ac42f546d07e4f49f62332576cb845d45c67cf5610d1851254e341d563b6/ruff-0.14.0-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:c958f66ab884b7873e72df38dcabee03d556a8f2ee1b8538ee1c2bbd619883dd", size = 13682850, upload-time = "2025-10-07T18:21:31.842Z" },
+{ url = "https://files.pythonhosted.org/packages/7f/df/bf382f3fbead082a575edb860897287f42b1b3c694bafa16bc9904c11ed3/ruff-0.13.2-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:afa721017aa55a555b2ff7944816587f1cb813c2c0a882d158f59b832da1660d", size = 13537637, upload-time = "2025-09-25T14:53:52.887Z" },
-{ url = "https://files.pythonhosted.org/packages/5f/c4/4b0c9bcadd45b4c29fe1af9c5d1dc0ca87b4021665dfbe1c4688d407aa20/ruff-0.14.0-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:7eb0499a2e01f6e0c285afc5bac43ab380cbfc17cd43a2e1dd10ec97d6f2c42d", size = 12449825, upload-time = "2025-10-07T18:21:35.074Z" },
+{ url = "https://files.pythonhosted.org/packages/51/70/1fb7a7c8a6fc8bd15636288a46e209e81913b87988f26e1913d0851e54f4/ruff-0.13.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:1dbc875cf3720c64b3990fef8939334e74cb0ca65b8dbc61d1f439201a38101b", size = 12340025, upload-time = "2025-09-25T14:53:54.88Z" },
-{ url = "https://files.pythonhosted.org/packages/4b/a8/e2e76288e6c16540fa820d148d83e55f15e994d852485f221b9524514730/ruff-0.14.0-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:4c63b2d99fafa05efca0ab198fd48fa6030d57e4423df3f18e03aa62518c565f", size = 12272599, upload-time = "2025-10-07T18:21:38.08Z" },
+{ url = "https://files.pythonhosted.org/packages/4c/27/1e5b3f1c23ca5dd4106d9d580e5c13d9acb70288bff614b3d7b638378cc9/ruff-0.13.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:5b939a1b2a960e9742e9a347e5bbc9b3c3d2c716f86c6ae273d9cbd64f193f22", size = 12133449, upload-time = "2025-09-25T14:53:57.089Z" },
-{ url = "https://files.pythonhosted.org/packages/18/14/e2815d8eff847391af632b22422b8207704222ff575dec8d044f9ab779b2/ruff-0.14.0-py3-none-musllinux_1_2_i686.whl", hash = "sha256:668fce701b7a222f3f5327f86909db2bbe99c30877c8001ff934c5413812ac02", size = 13193828, upload-time = "2025-10-07T18:21:41.216Z" },
+{ url = "https://files.pythonhosted.org/packages/2d/09/b92a5ccee289f11ab128df57d5911224197d8d55ef3bd2043534ff72ca54/ruff-0.13.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:50e2d52acb8de3804fc5f6e2fa3ae9bdc6812410a9e46837e673ad1f90a18736", size = 13051369, upload-time = "2025-09-25T14:53:59.124Z" },
-{ url = "https://files.pythonhosted.org/packages/44/c6/61ccc2987cf0aecc588ff8f3212dea64840770e60d78f5606cd7dc34de32/ruff-0.14.0-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:a86bf575e05cb68dcb34e4c7dfe1064d44d3f0c04bbc0491949092192b515296", size = 13628617, upload-time = "2025-10-07T18:21:44.04Z" },
+{ url = "https://files.pythonhosted.org/packages/89/99/26c9d1c7d8150f45e346dc045cc49f23e961efceb4a70c47dea0960dea9a/ruff-0.13.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3196bc13ab2110c176b9a4ae5ff7ab676faaa1964b330a1383ba20e1e19645f2", size = 13523644, upload-time = "2025-09-25T14:54:01.622Z" },
]
[[package]]
@@ -4201,25 +4202,25 @@ wheels = [
[[package]]
name = "uv"
-version = "0.9.2"
+version = "0.8.22"
source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/2e/23/70eb7805be75d698c244d5fb085d6af454e7d0417eea18f9ad8cbabd6df9/uv-0.9.2.tar.gz", hash = "sha256:78d58c1489dcff2fa9de8c1829a627c65a04571732dfc862e4dc7b88874df01b", size = 3693596, upload-time = "2025-10-10T19:02:06.236Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/a6/39/231e123458d50dd497cf6d27b592f5d3bc3e2e50f496b56859865a7b22e3/uv-0.8.22.tar.gz", hash = "sha256:e6e1289c411d43e0ca245f46e76457f3807de646d90b656591b6cf46348bed5c", size = 3667007, upload-time = "2025-09-23T20:35:14.736Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/12/af/2fb37e18842148e90327a284ad8db7c89a78c8b884fce298173fda44edb3/uv-0.9.2-py3-none-linux_armv6l.whl", hash = "sha256:9e3ad7f9ca7f327c4d507840b817592a3599746e138d545791ebb2eca15f34a1", size = 20606301, upload-time = "2025-10-10T19:01:19.2Z" },
+{ url = "https://files.pythonhosted.org/packages/7c/e6/bb440171dd8a36d0f9874b4c71778f7bbc83e62ccf42c62bd1583c802793/uv-0.8.22-py3-none-linux_armv6l.whl", hash = "sha256:7350c5f82d9c38944e6466933edcf96a90e0cb85eae5c0e53a5bc716d6f62332", size = 20554993, upload-time = "2025-09-23T20:34:26.549Z" },
-{ url = "https://files.pythonhosted.org/packages/0d/ef/6dc7d0506c69edbfbba595768f96a035a85249671a57d163e31ed47c829b/uv-0.9.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:6bd0e1b4135789ee3855d38da17eca8cc9d5b2e3f96023be191422bd6751f0b8", size = 19593291, upload-time = "2025-10-10T19:01:23.801Z" },
+{ url = "https://files.pythonhosted.org/packages/28/e9/813f7eb9fb9694c4024362782c8933e37887b5195e189f80dc40f2da5958/uv-0.8.22-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:89944e99b04cc8542cb5931306f1c593f00c9d6f2b652fffc4d84d12b915f911", size = 19565276, upload-time = "2025-09-23T20:34:30.436Z" },
-{ url = "https://files.pythonhosted.org/packages/01/b6/d422f2353482ca7c5b8175a35d2e07d14d700f53bd4f95d5e86a3d451996/uv-0.9.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:939bdd13e37330d8fb43683a10a2586c0d21c353184d9ca28251842e249356e4", size = 18175720, upload-time = "2025-10-10T19:01:26.052Z" },
+{ url = "https://files.pythonhosted.org/packages/d7/ca/bf37d86af6e16e45fa2b1a03300784ff3297aa9252a23dfbeaf6e391e72e/uv-0.8.22-py3-none-macosx_11_0_arm64.whl", hash = "sha256:6706b782ad75662df794e186d16b9ffa4946d57c88f21d0eadfd43425794d1b0", size = 18162303, upload-time = "2025-09-23T20:34:32.761Z" },
-{ url = "https://files.pythonhosted.org/packages/d9/ca/53b5819315fe01bec2911b48a2bdb800ac9ab1cf76c5d959199271540eb5/uv-0.9.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.musllinux_1_1_aarch64.whl", hash = "sha256:b75e8762b7b3f7fc15a065bd6fcb56007c19d952c94b9e45968e47bdd7adc289", size = 20024004, upload-time = "2025-10-10T19:01:28.661Z" },
+{ url = "https://files.pythonhosted.org/packages/e4/eb/289b6a59fff1613958499a886283f52403c5ce4f0a8a550b86fbd70e8e4f/uv-0.8.22-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.musllinux_1_1_aarch64.whl", hash = "sha256:d6a33bd5309f8fb77d9fc249bb17f77a23426e6153e43b03ca1cd6640f0a423d", size = 19982769, upload-time = "2025-09-23T20:34:34.962Z" },
-{ url = "https://files.pythonhosted.org/packages/3c/77/4a1d5b132eb072388250855e2e507d4ce5dbd31045f172d6a6266e6e1c95/uv-0.9.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:47a29843d6c2c14533492de585b5b7826a48f54e4796e47db4511b78f7738af5", size = 20199272, upload-time = "2025-10-10T19:01:31.436Z" },
+{ url = "https://files.pythonhosted.org/packages/df/ba/2fcc3ce75be62eecf280f3cbe74d186f371a468fad3167b5a34dee2f904e/uv-0.8.22-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4a982bdd5d239dd6dd2b4219165e209c75af1e1819730454ee46d65b3ccf77a3", size = 20163849, upload-time = "2025-09-23T20:34:37.744Z" },
-{ url = "https://files.pythonhosted.org/packages/44/ad/452124cd1ec0127f6ce277052fabd709aa18f51476a809fba8abb09cc734/uv-0.9.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7cbe195d9a232344a8cf082e4fc4326a1f269fd4efe745d797a84d35008364cf", size = 21139676, upload-time = "2025-10-10T19:01:33.631Z" },
+{ url = "https://files.pythonhosted.org/packages/f4/4d/4fc9a508c2c497a80c41710c96f1782a29edecffcac742f3843af061ba8f/uv-0.8.22-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:58b6fb191a04b922dc3c8fea6660f58545a651843d7d0efa9ae69164fca9e05d", size = 21130147, upload-time = "2025-09-23T20:34:40.414Z" },
-{ url = "https://files.pythonhosted.org/packages/39/f7/54871ac979c31ba0f5897703d884b7b73399fdab83c6c054ec92c624c45a/uv-0.9.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:43ae1b5e4cb578a13299b4b105fc099e4300885d3ac0321b735d8c23d488bb1a", size = 22583678, upload-time = "2025-10-10T19:01:36.008Z" },
+{ url = "https://files.pythonhosted.org/packages/71/79/6bcb3c3c3b7c9cb1a162a76dca2b166752e4ba39ec90e802b252f0a54039/uv-0.8.22-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:8ea724ae9f15c0cb4964e9e2e1b21df65c56ae02a54dc1d8a6ea44a52d819268", size = 22561974, upload-time = "2025-09-23T20:34:42.843Z" },
-{ url = "https://files.pythonhosted.org/packages/75/36/bc10c28b76b565b18b01b1b4a04f51ba19837db8adc8f4f0c9982c125a04/uv-0.9.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e46e0ac8796d888d7885af9243f4ec265e88872a74582bf3a8072c0e75578698", size = 22223038, upload-time = "2025-10-10T19:01:38.444Z" },
+{ url = "https://files.pythonhosted.org/packages/3f/98/89bb29d82ff7e5ab1b5e862d9bdc12b1d3a4d5201cf558432487e29cc448/uv-0.8.22-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7378127cbd6ebce8ba6d9bdb88aa8ea995b579824abb5ec381c63b3a123a43be", size = 22183189, upload-time = "2025-09-23T20:34:45.57Z" },
-{ url = "https://files.pythonhosted.org/packages/fe/91/64f498195168891ae804489386dccd08a5c6a80fd9f35658161c9af8e33a/uv-0.9.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40b722a1891b42edf16573794281000479c0001b52574c10032206f3fb77870a", size = 21358696, upload-time = "2025-10-10T19:01:41.126Z" },
+{ url = "https://files.pythonhosted.org/packages/95/b0/354c7d7d11fff2ee97bb208f0fec6b09ae885c0d591b6eff2d7b84cc6695/uv-0.8.22-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7e761ca7df8a0059b3fae6bc2c1db24583fa00b016e35bd22a5599d7084471a7", size = 21492888, upload-time = "2025-09-23T20:34:48.45Z" },
-{ url = "https://files.pythonhosted.org/packages/e0/77/0ceb3c0fed4f43f3cb387a76170115bdb873822c5c2dc6036d05dd5b2070/uv-0.9.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:50128cbeae27c4cb58973c39a2110169d13676525397a3c2de5394448ea5c58f", size = 21244422, upload-time = "2025-10-10T19:01:43.91Z" },
+{ url = "https://files.pythonhosted.org/packages/3a/a9/a83cee9b8cf63e57ce64ba27c77777cc66410e144fd178368f55af1fa18d/uv-0.8.22-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8efec4ef5acddc35f0867998c44e0b15fc4dace1e4c26d01443871a2fbb04bf6", size = 21252972, upload-time = "2025-09-23T20:34:50.862Z" },
-{ url = "https://files.pythonhosted.org/packages/e8/1c/5d6c9f6f648eda9db1567401c9d921d4f59afbbb4af83b5230b91a96aa48/uv-0.9.2-py3-none-manylinux_2_28_aarch64.whl", hash = "sha256:c6888f2a0d49819311780e158df0d09750f701095e46a59c3f5712f76bae952c", size = 20123453, upload-time = "2025-10-10T19:01:46.489Z" },
+{ url = "https://files.pythonhosted.org/packages/0f/0c/71d5d5d3fca7aa788d63297a06ca26d3585270342277b52312bb693b100c/uv-0.8.22-py3-none-manylinux_2_28_aarch64.whl", hash = "sha256:9eb3b4abfa25e07d7e1bb4c9bb8dbbdd51878356a37c3c4a2ece3d68d4286f28", size = 20115520, upload-time = "2025-09-23T20:34:53.165Z" },
-{ url = "https://files.pythonhosted.org/packages/78/f5/d3896606ca57a358c175a343b34b3e1ebf29b31a6ae6ae6f3daf266db202/uv-0.9.2-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:b7f6f3e1bfc0d2bdadc12e01814169dae4fbd60cedc8f634987d66ae68aab99a", size = 21236450, upload-time = "2025-10-10T19:01:49.292Z" },
+{ url = "https://files.pythonhosted.org/packages/da/90/57fae2798be1e71692872b8304e2e2c345eacbe2070bdcbba6d5a7675fa1/uv-0.8.22-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:b1fdffc2e71892ce648b66317e478fe8884d0007e20cfa582fff3dcea588a450", size = 21168787, upload-time = "2025-09-23T20:34:55.638Z" },
-{ url = "https://files.pythonhosted.org/packages/92/6e/e5b3d1500e0c21b1dcb5be798322885d43b3ca0e696309e30029d55ffa6d/uv-0.9.2-py3-none-musllinux_1_1_armv7l.whl", hash = "sha256:0cc3808e24f169869c7a0ad311cef672fef53cebcf6cc384a17be35c60146f4a", size = 20146874, upload-time = "2025-10-10T19:01:51.565Z" },
+{ url = "https://files.pythonhosted.org/packages/fe/f6/23c8d8fdd1084603795f6344eee8e763ba06f891e863397fe5b7b532cb58/uv-0.8.22-py3-none-musllinux_1_1_armv7l.whl", hash = "sha256:f6ded9bacb31441d788afca397b8b884ebc2e70f903bea0a38806194be4b249c", size = 20170112, upload-time = "2025-09-23T20:34:58.008Z" },
-{ url = "https://files.pythonhosted.org/packages/d6/04/966fe4aed5f4753f2c04af611263a0cbc64e84d21b13979258a64c08e801/uv-0.9.2-py3-none-musllinux_1_1_i686.whl", hash = "sha256:042f8b3773160b769efbbf34f704f99efddb248283325309bb78ffe6e7c96425", size = 20527007, upload-time = "2025-10-10T19:01:53.893Z" },
+{ url = "https://files.pythonhosted.org/packages/96/23/801d517964a7200014897522ae067bf7111fc2e138b38d13d9df9544bf06/uv-0.8.22-py3-none-musllinux_1_1_i686.whl", hash = "sha256:aefa0cb27a86d2145ca9290a1e99c16a17ea26a4f14a89fb7336bc19388427cc", size = 20537608, upload-time = "2025-09-23T20:35:00.44Z" },
-{ url = "https://files.pythonhosted.org/packages/7d/fc/6cb19e86592ffe51c9d2b33ca51dce700e22a42da3de9f4245f735357cb2/uv-0.9.2-py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:d7072e10e2d4e3342476a831e4adf1a7a490d8a3a99c5538d3e13400c4849b29", size = 21469236, upload-time = "2025-10-10T19:01:56.465Z" },
+{ url = "https://files.pythonhosted.org/packages/20/8a/1bd4159089f8df0128e4ceb7f4c31c23a451984a5b49c13489c70e721335/uv-0.8.22-py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:9757f0b0c7d296f1e354db442ed0ce39721c06d11635ce4ee6638c5e809a9cb4", size = 21471224, upload-time = "2025-09-23T20:35:03.718Z" },
]
[[package]]