Mirror of https://github.com/paperless-ngx/paperless-ngx.git, synced 2026-01-28 22:59:03 -06:00

Compare commits: 1 commit, feature/mc...feature/lo

Commit b36cfab43e

docs/api.md (24 changes)
@@ -8,7 +8,7 @@ Further documentation is provided here for some endpoints and features.

## Authorization

The REST api provides five different forms of authentication.
The REST api provides four different forms of authentication.

1. Basic authentication

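For reference, a token-authenticated request against the API might look like the following sketch (host and token are placeholders; `Token` is the Django REST Framework token-auth scheme):

```bash
curl -H "Authorization: Token <your-api-token>" \
  http://localhost:8000/api/documents/
```
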
@@ -52,28 +52,6 @@ The REST api provides five different forms of authentication.
[configuration](configuration.md#PAPERLESS_ENABLE_HTTP_REMOTE_USER_API)),
you can authenticate against the API using Remote User auth.

5. Headless OIDC via [`django-allauth`](https://codeberg.org/allauth/django-allauth)

`django-allauth` exposes API endpoints under `api/auth/` which enable tools
like third-party apps to authenticate with social accounts that are
configured. See
[here](advanced_usage.md#openid-connect-and-social-authentication) for more
information on social accounts.

## Model Context Protocol (MCP)

Paperless-ngx exposes an MCP endpoint powered by `django-mcp-server` so MCP
clients can query data collections, run full-text document search, and invoke
DRF-backed CRUD tools.

- Endpoint: `/mcp/`
- Authentication: identical to the REST API (Basic, Session, Token, or Remote
  User depending on your configuration).

The MCP server uses existing DRF viewsets and permissions. It also exposes a
`query_data_collections` tool for structured querying across published models
and a `search_documents` tool for full-text search.

## Searching for documents

Full text searching is available on the `/api/documents/` endpoint. Two

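As a sketch of exercising the full-text search described above (placeholder host and token; the `query` parameter drives the search):

```bash
curl -H "Authorization: Token <your-api-token>" \
  "http://localhost:8000/api/documents/?query=invoice"
```
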
@@ -659,7 +659,7 @@ system. See the corresponding

: Sync groups from the third party authentication system (e.g. OIDC) to Paperless-ngx. When enabled, users will be added or removed from groups based on their group membership in the third party authentication system. Groups must already exist in Paperless-ngx and have the same name as in the third party authentication system. Groups are updated upon logging in via the third party authentication system, see the corresponding [django-allauth documentation](https://docs.allauth.org/en/dev/socialaccount/signals.html).

: In order to pass groups from the authentication system you will need to update your [PAPERLESS_SOCIALACCOUNT_PROVIDERS](#PAPERLESS_SOCIALACCOUNT_PROVIDERS) setting by adding a top-level "SCOPES" setting which includes "groups", or the custom groups claim configured in [`PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM`](#PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM) e.g.:
: In order to pass groups from the authentication system you will need to update your [PAPERLESS_SOCIALACCOUNT_PROVIDERS](#PAPERLESS_SOCIALACCOUNT_PROVIDERS) setting by adding a top-level "SCOPES" setting which includes "groups", e.g.:

```json
{"openid_connect":{"SCOPE": ["openid","profile","email","groups"]...
@@ -667,12 +667,6 @@ system. See the corresponding

Defaults to False

#### [`PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM=<str>`](#PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM) {#PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM}

: Allows you to define a custom groups claim. See [PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS](#PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS) which is required for this setting to take effect.

Defaults to "groups"

#### [`PAPERLESS_SOCIAL_ACCOUNT_DEFAULT_GROUPS=<comma-separated-list>`](#PAPERLESS_SOCIAL_ACCOUNT_DEFAULT_GROUPS) {#PAPERLESS_SOCIAL_ACCOUNT_DEFAULT_GROUPS}

: A list of group names that users who signup via social accounts will be added to upon signup. Groups listed here must already exist.

@@ -1152,9 +1146,8 @@ via the consumption directory, you can disable the consumer to save resources.

#### [`PAPERLESS_CONSUMER_DELETE_DUPLICATES=<bool>`](#PAPERLESS_CONSUMER_DELETE_DUPLICATES) {#PAPERLESS_CONSUMER_DELETE_DUPLICATES}

: As of version 3.0 Paperless-ngx allows duplicate documents to be consumed by default, _except_ when
this setting is enabled. When enabled, Paperless will check if a document with the same hash already
exists in the system and delete the duplicate file from the consumption directory without consuming it.
: When the consumer detects a duplicate document, it will not touch
the original document. This default behavior can be changed here.

Defaults to false.

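A sketch of enabling the duplicate-deletion behavior described above via the environment (variable name as documented; where you set it depends on your deployment):

```bash
PAPERLESS_CONSUMER_DELETE_DUPLICATES=true
```
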
@@ -36,7 +36,6 @@ dependencies = [
    "django-extensions~=4.1",
    "django-filter~=25.1",
    "django-guardian~=3.2.0",
    "django-mcp-server~=0.5.7",
    "django-multiselectfield~=1.0.1",
    "django-soft-delete~=1.0.18",
    "django-treenode>=0.23.2",

File diff suppressed because it is too large
@@ -97,12 +97,6 @@
<br/><em>(<ng-container i18n>click for full output</ng-container>)</em>
}
</ng-template>
@if (task.duplicate_documents?.length > 0) {
  <div class="small text-warning-emphasis d-flex align-items-center gap-1">
    <i-bs class="lh-1" width="1em" height="1em" name="exclamation-triangle"></i-bs>
    <span i18n>Duplicate(s) detected</span>
  </div>
}
</td>
}
<td class="d-lg-none">

@@ -164,11 +164,9 @@
{{ item.name }}
<span class="ms-auto text-muted small">
  @if (item.dateEnd) {
    {{ item.date | customDate:'mediumDate' }} – {{ item.dateEnd | customDate:'mediumDate' }}
  } @else if (item.dateTilNow) {
    {{ item.dateTilNow | customDate:'mediumDate' }} – <ng-container i18n>now</ng-container>
    {{ item.date | customDate:'MMM d' }} – {{ item.dateEnd | customDate:'mediumDate' }}
  } @else {
    {{ item.date | customDate:'mediumDate' }}
    {{ item.date | customDate:'mediumDate' }} – <ng-container i18n>now</ng-container>
  }
</span>
</div>

@@ -79,34 +79,32 @@ export class DatesDropdownComponent implements OnInit, OnDestroy {
    {
      id: RelativeDate.WITHIN_1_WEEK,
      name: $localize`Within 1 week`,
      dateTilNow: new Date().setDate(new Date().getDate() - 7),
      date: new Date().setDate(new Date().getDate() - 7),
    },
    {
      id: RelativeDate.WITHIN_1_MONTH,
      name: $localize`Within 1 month`,
      dateTilNow: new Date().setMonth(new Date().getMonth() - 1),
      date: new Date().setMonth(new Date().getMonth() - 1),
    },
    {
      id: RelativeDate.WITHIN_3_MONTHS,
      name: $localize`Within 3 months`,
      dateTilNow: new Date().setMonth(new Date().getMonth() - 3),
      date: new Date().setMonth(new Date().getMonth() - 3),
    },
    {
      id: RelativeDate.WITHIN_1_YEAR,
      name: $localize`Within 1 year`,
      dateTilNow: new Date().setFullYear(new Date().getFullYear() - 1),
      date: new Date().setFullYear(new Date().getFullYear() - 1),
    },
    {
      id: RelativeDate.THIS_YEAR,
      name: $localize`This year`,
      date: new Date('1/1/' + new Date().getFullYear()),
      dateEnd: new Date('12/31/' + new Date().getFullYear()),
    },
    {
      id: RelativeDate.THIS_MONTH,
      name: $localize`This month`,
      date: new Date().setDate(1),
      dateEnd: new Date(new Date().getFullYear(), new Date().getMonth() + 1, 0),
    },
    {
      id: RelativeDate.TODAY,

@@ -1,18 +1,9 @@
<div class="row pt-3 pb-3 pb-md-2 align-items-center">
  <div class="col-md text-truncate">
    <h3 class="d-flex align-items-center mb-1" style="line-height: 1.4">
      <span class="text-truncate">{{title}}</span>
      @if (id) {
        <span class="badge bg-primary text-primary-text-contrast ms-3 small fs-normal cursor-pointer" (click)="copyID()">
          @if (copied) {
            <i-bs width="1em" height="1em" name="clipboard-check"></i-bs> <ng-container i18n>Copied!</ng-container>
          } @else {
            ID: {{id}}
          }
        </span>
      }
    <h3 class="text-truncate" style="line-height: 1.4">
      {{title}}
      @if (subTitle) {
        <span class="h6 mb-0 mt-1 d-block d-md-inline fw-normal ms-md-3 text-truncate" style="line-height: 1.4">{{subTitle}}</span>
        <span class="h6 mb-0 d-block d-md-inline fw-normal ms-md-3 text-truncate" style="line-height: 1.4">{{subTitle}}</span>
      }
      @if (info) {
        <button class="btn btn-sm btn-link text-muted me-auto p-0 p-md-2" title="What's this?" i18n-title type="button" [ngbPopover]="infoPopover" [autoClose]="true">

@@ -1,10 +1,5 @@
h3 {
  min-height: calc(1.325rem + 0.9vw);

  .badge {
    font-size: 0.65rem;
    line-height: 1;
  }
}

@media (min-width: 1200px) {

@@ -1,4 +1,3 @@
import { Clipboard } from '@angular/cdk/clipboard'
import { ComponentFixture, TestBed } from '@angular/core/testing'
import { Title } from '@angular/platform-browser'
import { environment } from 'src/environments/environment'
@@ -8,7 +7,6 @@ describe('PageHeaderComponent', () => {
  let component: PageHeaderComponent
  let fixture: ComponentFixture<PageHeaderComponent>
  let titleService: Title
  let clipboard: Clipboard

  beforeEach(async () => {
    TestBed.configureTestingModule({
@@ -17,7 +15,6 @@ describe('PageHeaderComponent', () => {
    }).compileComponents()

    titleService = TestBed.inject(Title)
    clipboard = TestBed.inject(Clipboard)
    fixture = TestBed.createComponent(PageHeaderComponent)
    component = fixture.componentInstance
    fixture.detectChanges()
@@ -27,8 +24,7 @@ describe('PageHeaderComponent', () => {
    component.title = 'Foo'
    component.subTitle = 'Bar'
    fixture.detectChanges()
    expect(fixture.nativeElement.textContent).toContain('Foo')
    expect(fixture.nativeElement.textContent).toContain('Bar')
    expect(fixture.nativeElement.textContent).toContain('Foo Bar')
  })

  it('should set html title', () => {
@@ -36,16 +32,4 @@ describe('PageHeaderComponent', () => {
    component.title = 'Foo Bar'
    expect(titleSpy).toHaveBeenCalledWith(`Foo Bar - ${environment.appTitle}`)
  })

  it('should copy id to clipboard, reset after 3 seconds', () => {
    jest.useFakeTimers()
    component.id = 42 as any
    jest.spyOn(clipboard, 'copy').mockReturnValue(true)
    component.copyID()
    expect(clipboard.copy).toHaveBeenCalledWith('42')
    expect(component.copied).toBe(true)

    jest.advanceTimersByTime(3000)
    expect(component.copied).toBe(false)
  })
})

@@ -1,4 +1,3 @@
import { Clipboard } from '@angular/cdk/clipboard'
import { Component, Input, inject } from '@angular/core'
import { Title } from '@angular/platform-browser'
import { NgbPopoverModule } from '@ng-bootstrap/ng-bootstrap'
@@ -14,11 +13,8 @@ import { environment } from 'src/environments/environment'
})
export class PageHeaderComponent {
  private titleService = inject(Title)
  private clipboard = inject(Clipboard)

  private _title = ''
  public copied: boolean = false
  private copyTimeout: any
  _title = ''

  @Input()
  set title(title: string) {
@@ -30,9 +26,6 @@ export class PageHeaderComponent {
    return this._title
  }

  @Input()
  id: number

  @Input()
  subTitle: string = ''

@@ -41,12 +34,4 @@ export class PageHeaderComponent {

  @Input()
  infoLink: string

  public copyID() {
    this.copied = this.clipboard.copy(this.id.toString())
    clearTimeout(this.copyTimeout)
    this.copyTimeout = setTimeout(() => {
      this.copied = false
    }, 3000)
  }
}

@@ -1,4 +1,4 @@
<pngx-page-header [(title)]="title" [id]="documentId">
<pngx-page-header [(title)]="title">
  @if (archiveContentRenderType === ContentRenderType.PDF && !useNativePdfViewer) {
    @if (previewNumPages) {
      <div class="input-group input-group-sm d-none d-md-flex">
@@ -370,37 +370,6 @@
</ng-template>
</li>
}

@if (document?.duplicate_documents?.length) {
  <li [ngbNavItem]="DocumentDetailNavIDs.Duplicates">
    <a class="text-nowrap" ngbNavLink i18n>
      Duplicates
      <span class="badge text-bg-secondary ms-1">{{ document.duplicate_documents.length }}</span>
    </a>
    <ng-template ngbNavContent>
      <div class="d-flex flex-column gap-2">
        <div class="fst-italic" i18n>Duplicate documents detected:</div>
        <div class="list-group">
          @for (duplicate of document.duplicate_documents; track duplicate.id) {
            <a
              class="list-group-item list-group-item-action d-flex justify-content-between align-items-center"
              [routerLink]="['/documents', duplicate.id, 'details']"
              [class.disabled]="duplicate.deleted_at"
            >
              <span class="d-flex align-items-center gap-2">
                <span>{{ duplicate.title || ('#' + duplicate.id) }}</span>
                @if (duplicate.deleted_at) {
                  <span class="badge text-bg-secondary" i18n>In trash</span>
                }
              </span>
              <span class="text-secondary">#{{ duplicate.id }}</span>
            </a>
          }
        </div>
      </div>
    </ng-template>
  </li>
}
</ul>

<div [ngbNavOutlet]="nav" class="mt-3"></div>

@@ -301,16 +301,16 @@ describe('DocumentDetailComponent', () => {
      .spyOn(openDocumentsService, 'openDocument')
      .mockReturnValueOnce(of(true))
    fixture.detectChanges()
    expect(component.activeNavID).toEqual(component.DocumentDetailNavIDs.Notes)
    expect(component.activeNavID).toEqual(5) // DocumentDetailNavIDs.Notes
  })

  it('should change url on tab switch', () => {
    initNormally()
    const navigateSpy = jest.spyOn(router, 'navigate')
    component.nav.select(component.DocumentDetailNavIDs.Notes)
    component.nav.select(5)
    component.nav.navChange.next({
      activeId: 1,
      nextId: component.DocumentDetailNavIDs.Notes,
      nextId: 5,
      preventDefault: () => {},
    })
    fixture.detectChanges()
@@ -352,18 +352,6 @@ describe('DocumentDetailComponent', () => {
    expect(component.document).toEqual(doc)
  })

  it('should fall back to details tab when duplicates tab is active but no duplicates', () => {
    initNormally()
    component.activeNavID = component.DocumentDetailNavIDs.Duplicates
    const noDupDoc = { ...doc, duplicate_documents: [] }

    component.updateComponent(noDupDoc)

    expect(component.activeNavID).toEqual(
      component.DocumentDetailNavIDs.Details
    )
  })

  it('should load already-opened document via param', () => {
    initNormally()
    jest.spyOn(documentService, 'get').mockReturnValueOnce(of(doc))
@@ -379,38 +367,6 @@ describe('DocumentDetailComponent', () => {
    expect(component.document).toEqual(doc)
  })

  it('should update cached open document duplicates when reloading an open doc', () => {
    const openDoc = { ...doc, duplicate_documents: [{ id: 1, title: 'Old' }] }
    const updatedDuplicates = [
      { id: 2, title: 'Newer duplicate', deleted_at: null },
    ]
    jest
      .spyOn(activatedRoute, 'paramMap', 'get')
      .mockReturnValue(of(convertToParamMap({ id: 3, section: 'details' })))
    jest.spyOn(documentService, 'get').mockReturnValue(
      of({
        ...doc,
        modified: new Date('2024-01-02T00:00:00Z'),
        duplicate_documents: updatedDuplicates,
      })
    )
    jest.spyOn(openDocumentsService, 'getOpenDocument').mockReturnValue(openDoc)
    const saveSpy = jest.spyOn(openDocumentsService, 'save')
    jest.spyOn(openDocumentsService, 'openDocument').mockReturnValue(of(true))
    jest.spyOn(customFieldsService, 'listAll').mockReturnValue(
      of({
        count: customFields.length,
        all: customFields.map((f) => f.id),
        results: customFields,
      })
    )

    fixture.detectChanges()

    expect(openDoc.duplicate_documents).toEqual(updatedDuplicates)
    expect(saveSpy).toHaveBeenCalled()
  })

  it('should disable form if user cannot edit', () => {
    currentUserHasObjectPermissions = false
    initNormally()

@@ -8,7 +8,7 @@ import {
  FormsModule,
  ReactiveFormsModule,
} from '@angular/forms'
import { ActivatedRoute, Router, RouterModule } from '@angular/router'
import { ActivatedRoute, Router } from '@angular/router'
import {
  NgbDateStruct,
  NgbDropdownModule,
@@ -124,7 +124,6 @@ enum DocumentDetailNavIDs {
  Notes = 5,
  Permissions = 6,
  History = 7,
  Duplicates = 8,
}

enum ContentRenderType {
@@ -182,7 +181,6 @@ export enum ZoomSetting {
    NgxBootstrapIconsModule,
    PdfViewerModule,
    TextAreaComponent,
    RouterModule,
  ],
})
export class DocumentDetailComponent
@@ -456,11 +454,6 @@ export class DocumentDetailComponent
    const openDocument = this.openDocumentService.getOpenDocument(
      this.documentId
    )
    // update duplicate documents if present
    if (openDocument && doc?.duplicate_documents) {
      openDocument.duplicate_documents = doc.duplicate_documents
      this.openDocumentService.save()
    }
    const useDoc = openDocument || doc
    if (openDocument) {
      if (
@@ -711,13 +704,6 @@ export class DocumentDetailComponent
    }
    this.title = this.documentTitlePipe.transform(doc.title)
    this.prepareForm(doc)

    if (
      this.activeNavID === DocumentDetailNavIDs.Duplicates &&
      !doc?.duplicate_documents?.length
    ) {
      this.activeNavID = DocumentDetailNavIDs.Details
    }
  }

  get customFieldFormFields(): FormArray {

@@ -14,7 +14,6 @@ import { SortableDirective } from 'src/app/directives/sortable.directive'
import { CustomDatePipe } from 'src/app/pipes/custom-date.pipe'
import { PermissionType } from 'src/app/services/permissions.service'
import { CorrespondentService } from 'src/app/services/rest/correspondent.service'
import { ClearableBadgeComponent } from '../../common/clearable-badge/clearable-badge.component'
import { CorrespondentEditDialogComponent } from '../../common/edit-dialog/correspondent-edit-dialog/correspondent-edit-dialog.component'
import { PageHeaderComponent } from '../../common/page-header/page-header.component'
import { ManagementListComponent } from '../management-list/management-list.component'
@@ -37,7 +36,6 @@ import { ManagementListComponent } from '../management-list/management-list.comp
    NgbDropdownModule,
    NgbPaginationModule,
    NgxBootstrapIconsModule,
    ClearableBadgeComponent,
  ],
})
export class CorrespondentListComponent extends ManagementListComponent<Correspondent> {

@@ -13,7 +13,6 @@ import { IfPermissionsDirective } from 'src/app/directives/if-permissions.direct
import { SortableDirective } from 'src/app/directives/sortable.directive'
import { PermissionType } from 'src/app/services/permissions.service'
import { DocumentTypeService } from 'src/app/services/rest/document-type.service'
import { ClearableBadgeComponent } from '../../common/clearable-badge/clearable-badge.component'
import { DocumentTypeEditDialogComponent } from '../../common/edit-dialog/document-type-edit-dialog/document-type-edit-dialog.component'
import { PageHeaderComponent } from '../../common/page-header/page-header.component'
import { ManagementListComponent } from '../management-list/management-list.component'
@@ -35,7 +34,6 @@ import { ManagementListComponent } from '../management-list/management-list.comp
    NgbDropdownModule,
    NgbPaginationModule,
    NgxBootstrapIconsModule,
    ClearableBadgeComponent,
  ],
})
export class DocumentTypeListComponent extends ManagementListComponent<DocumentType> {

@@ -1,48 +1,17 @@
<pngx-page-header title="{{ typeNamePlural | titlecase }}" info="View, add, edit and delete {{ typeNamePlural }}." infoLink="usage/#terms-and-definitions">

  <div ngbDropdown class="btn-group flex-fill d-sm-none">
    <button class="btn btn-sm btn-outline-primary" id="dropdownSelectMobile" ngbDropdownToggle>
      <i-bs name="text-indent-left"></i-bs>
      <div class="d-none d-sm-inline"> <ng-container i18n>Select</ng-container></div>
      @if (selectedObjects.size > 0) {
        <pngx-clearable-badge [selected]="selectedObjects.size > 0" [number]="selectedObjects.size" (cleared)="selectNone()"></pngx-clearable-badge><span class="visually-hidden">selected</span>
      }
    <button class="btn btn-sm btn-outline-secondary" (click)="clearSelection()" [hidden]="selectedObjects.size === 0">
      <i-bs name="x"></i-bs> <ng-container i18n>Clear selection</ng-container>
    </button>
    <button type="button" class="btn btn-sm btn-outline-primary" (click)="setPermissions()" [disabled]="!userCanBulkEdit(PermissionAction.Change) || selectedObjects.size === 0">
      <i-bs name="person-fill-lock"></i-bs> <ng-container i18n>Permissions</ng-container>
    </button>
    <button type="button" class="btn btn-sm btn-outline-danger" (click)="delete()" [disabled]="!userCanBulkEdit(PermissionAction.Delete) || selectedObjects.size === 0">
      <i-bs name="trash"></i-bs> <ng-container i18n>Delete</ng-container>
    </button>
    <button type="button" class="btn btn-sm btn-outline-primary ms-md-5" (click)="openCreateDialog()" *pngxIfPermissions="{ action: PermissionAction.Add, type: permissionType }">
      <i-bs name="plus-circle"></i-bs> <ng-container i18n>Create</ng-container>
    </button>
    <div ngbDropdownMenu aria-labelledby="dropdownSelectMobile" class="shadow">
      <button ngbDropdownItem (click)="selectNone()" i18n>Select none</button>
      <button ngbDropdownItem (click)="selectPage(true)" i18n>Select page</button>
      <button ngbDropdownItem (click)="selectAll()" i18n>Select all</button>
    </div>
  </div>

  <div class="d-none d-sm-flex flex-fill me-3">
    <div class="input-group input-group-sm">
      <span class="input-group-text border-0" i18n>Select:</span>
    </div>
    <div class="btn-group btn-group-sm flex-nowrap">
      @if (selectedObjects.size > 0) {
        <button class="btn btn-sm btn-outline-secondary" (click)="selectNone()">
          <i-bs name="slash-circle"></i-bs> <ng-container i18n>None</ng-container>
        </button>
      }
      <button class="btn btn-sm btn-outline-primary" (click)="selectPage(true)">
        <i-bs name="file-earmark-check"></i-bs> <ng-container i18n>Page</ng-container>
      </button>
      <button class="btn btn-sm btn-outline-primary" (click)="selectAll()">
        <i-bs name="check-all"></i-bs> <ng-container i18n>All</ng-container>
      </button>
    </div>
  </div>

  <button type="button" class="btn btn-sm btn-outline-primary" (click)="setPermissions()" [disabled]="!userCanBulkEdit(PermissionAction.Change) || selectedObjects.size === 0">
    <i-bs name="person-fill-lock"></i-bs> <ng-container i18n>Permissions</ng-container>
  </button>
  <button type="button" class="btn btn-sm btn-outline-danger" (click)="delete()" [disabled]="!userCanBulkEdit(PermissionAction.Delete) || selectedObjects.size === 0">
    <i-bs name="trash"></i-bs> <ng-container i18n>Delete</ng-container>
  </button>
  <button type="button" class="btn btn-sm btn-outline-primary ms-md-5" (click)="openCreateDialog()" *pngxIfPermissions="{ action: PermissionAction.Add, type: permissionType }">
    <i-bs name="plus-circle"></i-bs> <ng-container i18n>Create</ng-container>
  </button>
</pngx-page-header>

<div class="row mb-3">
@@ -62,7 +31,7 @@
  <tr>
    <th scope="col">
      <div class="form-check m-0 ms-2 me-n2">
        <input type="checkbox" class="form-check-input" id="all-objects" [(ngModel)]="togggleAll" [disabled]="data.length === 0" (change)="selectPage($event.target.checked); $event.stopPropagation();">
        <input type="checkbox" class="form-check-input" id="all-objects" [(ngModel)]="togggleAll" [disabled]="data.length === 0" (click)="toggleAll($event); $event.stopPropagation();">
        <label class="form-check-label" for="all-objects"></label>
      </div>
    </th>

@@ -163,7 +163,8 @@ describe('ManagementListComponent', () => {
    const toastInfoSpy = jest.spyOn(toastService, 'showInfo')
    const reloadSpy = jest.spyOn(component, 'reloadData')

    component.openCreateDialog()
    const createButton = fixture.debugElement.queryAll(By.css('button'))[4]
    createButton.triggerEventHandler('click')

    expect(modal).not.toBeUndefined()
    const editDialog = modal.componentInstance as EditDialogComponent<Tag>
@@ -186,7 +187,8 @@ describe('ManagementListComponent', () => {
    const toastInfoSpy = jest.spyOn(toastService, 'showInfo')
    const reloadSpy = jest.spyOn(component, 'reloadData')

    component.openEditDialog(tags[0])
    const editButton = fixture.debugElement.queryAll(By.css('button'))[7]
    editButton.triggerEventHandler('click')

    expect(modal).not.toBeUndefined()
    const editDialog = modal.componentInstance as EditDialogComponent<Tag>
@@ -210,7 +212,8 @@ describe('ManagementListComponent', () => {
    const deleteSpy = jest.spyOn(tagService, 'delete')
    const reloadSpy = jest.spyOn(component, 'reloadData')

    component.openDeleteDialog(tags[0])
    const deleteButton = fixture.debugElement.queryAll(By.css('button'))[8]
    deleteButton.triggerEventHandler('click')

    expect(modal).not.toBeUndefined()
    const editDialog = modal.componentInstance as ConfirmDialogComponent
@@ -227,21 +230,6 @@ describe('ManagementListComponent', () => {
    expect(reloadSpy).toHaveBeenCalled()
  })

  it('should use the all list length for collection size when provided', fakeAsync(() => {
    jest.spyOn(tagService, 'listFiltered').mockReturnValueOnce(
      of({
        count: 1,
        all: [1, 2, 3],
        results: tags.slice(0, 1),
      })
    )

    component.reloadData()
    tick(100)

    expect(component.collectionSize).toBe(3)
  }))

  it('should support quick filter for objects', () => {
    const expectedUrl = documentListViewService.getQuickFilterUrl([
      { rule_type: FILTER_HAS_TAGS_ALL, value: tags[0].id.toString() },
@@ -276,84 +264,19 @@ describe('ManagementListComponent', () => {
    expect(component.page).toEqual(1)
  })

  it('should support toggle select page in vew', () => {
  it('should support toggle all items in view', () => {
    expect(component.selectedObjects.size).toEqual(0)
    const selectPageSpy = jest.spyOn(component, 'selectPage')
    const toggleAllSpy = jest.spyOn(component, 'toggleAll')
    const checkButton = fixture.debugElement.queryAll(
      By.css('input.form-check-input')
    )[0]
    checkButton.nativeElement.dispatchEvent(new Event('change'))
    checkButton.nativeElement.dispatchEvent(new Event('click'))
    checkButton.nativeElement.checked = true
    checkButton.nativeElement.dispatchEvent(new Event('change'))
    expect(selectPageSpy).toHaveBeenCalled()
    checkButton.nativeElement.dispatchEvent(new Event('click'))
    expect(toggleAllSpy).toHaveBeenCalled()
    expect(component.selectedObjects.size).toEqual(tags.length)
  })

  it('selectNone should clear selection and reset toggle flag', () => {
    component.selectedObjects = new Set([tags[0].id, tags[1].id])
    component.togggleAll = true

    component.selectNone()

    expect(component.selectedObjects.size).toBe(0)
    expect(component.togggleAll).toBe(false)
  })

  it('selectPage should select current page items or clear selection', () => {
    component.selectPage(true)
    expect(component.selectedObjects).toEqual(new Set(tags.map((t) => t.id)))
    expect(component.togggleAll).toBe(true)

    component.togggleAll = true
    component.selectPage(false)
    expect(component.selectedObjects.size).toBe(0)
    expect(component.togggleAll).toBe(false)
  })

  it('selectAll should use all IDs when collection size exists', () => {
    ;(component as any).allIDs = [1, 2, 3, 4]
    component.collectionSize = 4

    component.selectAll()

    expect(component.selectedObjects).toEqual(new Set([1, 2, 3, 4]))
    expect(component.togggleAll).toBe(true)
  })

  it('selectAll should clear selection when collection size is zero', () => {
    component.selectedObjects = new Set([1])
    component.collectionSize = 0
    component.togggleAll = true

    component.selectAll()

    expect(component.selectedObjects.size).toBe(0)
    expect(component.togggleAll).toBe(false)
  })

  it('toggleSelected should toggle object selection and update toggle state', () => {
    component.toggleSelected(tags[0])
    expect(component.selectedObjects.has(tags[0].id)).toBe(true)
    expect(component.togggleAll).toBe(false)

    component.toggleSelected(tags[1])
    component.toggleSelected(tags[2])
    expect(component.togggleAll).toBe(true)

    component.toggleSelected(tags[1])
    expect(component.selectedObjects.has(tags[1].id)).toBe(false)
    expect(component.togggleAll).toBe(false)
  })

  it('areAllPageItemsSelected should return false when page has no selectable items', () => {
    component.data = []
    component.selectedObjects.clear()

    expect((component as any).areAllPageItemsSelected()).toBe(false)

    component.data = tags
  })

  it('should support bulk edit permissions', () => {
    const bulkEditPermsSpy = jest.spyOn(tagService, 'bulk_edit_objects')
    component.toggleSelected(tags[0])

@@ -84,7 +84,6 @@ export abstract class ManagementListComponent<T extends MatchingModel>

  public data: T[] = []
  private unfilteredData: T[] = []
  private allIDs: number[] = []

  public page = 1

@@ -172,8 +171,7 @@ export abstract class ManagementListComponent<T extends MatchingModel>
      tap((c) => {
        this.unfilteredData = c.results
        this.data = this.filterData(c.results)
        this.collectionSize = c.all?.length ?? c.count
        this.allIDs = c.all
        this.collectionSize = c.count
      }),
      delay(100)
    )
@@ -302,6 +300,16 @@ export abstract class ManagementListComponent<T extends MatchingModel>
    return ownsAll
  }

  toggleAll(event: PointerEvent) {
    const checked = (event.target as HTMLInputElement).checked
    this.togggleAll = checked
    if (checked) {
      this.selectedObjects = new Set(this.getSelectableIDs(this.data))
    } else {
      this.clearSelection()
    }
  }

  protected getSelectableIDs(objects: T[]): number[] {
    return objects.map((o) => o.id)
  }
@@ -311,38 +319,10 @@ export abstract class ManagementListComponent<T extends MatchingModel>
    this.selectedObjects.clear()
  }

  selectNone() {
    this.clearSelection()
  }

  selectPage(select: boolean) {
    if (select) {
      this.selectedObjects = new Set(this.getSelectableIDs(this.data))
      this.togggleAll = this.areAllPageItemsSelected()
    } else {
      this.clearSelection()
    }
  }

  selectAll() {
    if (!this.collectionSize) {
      this.clearSelection()
      return
    }
    this.selectedObjects = new Set(this.allIDs)
    this.togggleAll = this.areAllPageItemsSelected()
  }

  toggleSelected(object) {
    this.selectedObjects.has(object.id)
      ? this.selectedObjects.delete(object.id)
      : this.selectedObjects.add(object.id)
    this.togggleAll = this.areAllPageItemsSelected()
  }

  protected areAllPageItemsSelected(): boolean {
    const ids = this.getSelectableIDs(this.data)
    return ids.length > 0 && ids.every((id) => this.selectedObjects.has(id))
  }

  setPermissions() {

@@ -13,7 +13,6 @@ import { IfPermissionsDirective } from 'src/app/directives/if-permissions.direct
import { SortableDirective } from 'src/app/directives/sortable.directive'
import { PermissionType } from 'src/app/services/permissions.service'
import { StoragePathService } from 'src/app/services/rest/storage-path.service'
import { ClearableBadgeComponent } from '../../common/clearable-badge/clearable-badge.component'
import { StoragePathEditDialogComponent } from '../../common/edit-dialog/storage-path-edit-dialog/storage-path-edit-dialog.component'
import { PageHeaderComponent } from '../../common/page-header/page-header.component'
import { ManagementListComponent } from '../management-list/management-list.component'
@@ -35,7 +34,6 @@ import { ManagementListComponent } from '../management-list/management-list.comp
    NgbDropdownModule,
    NgbPaginationModule,
    NgxBootstrapIconsModule,
    ClearableBadgeComponent,
  ],
})
export class StoragePathListComponent extends ManagementListComponent<StoragePath> {

@@ -138,12 +138,16 @@ describe('TagListComponent', () => {
    }

    component.data = [parent as any]
    component.selectPage(true)
    const selectEvent = { target: { checked: true } } as unknown as PointerEvent
    component.toggleAll(selectEvent)

    expect(component.selectedObjects.has(10)).toBe(true)
    expect(component.selectedObjects.has(11)).toBe(true)

    component.selectPage(false)
    const deselectEvent = {
      target: { checked: false },
    } as unknown as PointerEvent
    component.toggleAll(deselectEvent)
    expect(component.selectedObjects.size).toBe(0)
  })
})

@@ -13,7 +13,6 @@ import { IfPermissionsDirective } from 'src/app/directives/if-permissions.direct
import { SortableDirective } from 'src/app/directives/sortable.directive'
import { PermissionType } from 'src/app/services/permissions.service'
import { TagService } from 'src/app/services/rest/tag.service'
import { ClearableBadgeComponent } from '../../common/clearable-badge/clearable-badge.component'
import { TagEditDialogComponent } from '../../common/edit-dialog/tag-edit-dialog/tag-edit-dialog.component'
import { PageHeaderComponent } from '../../common/page-header/page-header.component'
import { ManagementListComponent } from '../management-list/management-list.component'
@@ -35,7 +34,6 @@ import { ManagementListComponent } from '../management-list/management-list.comp
    NgbDropdownModule,
    NgbPaginationModule,
    NgxBootstrapIconsModule,
    ClearableBadgeComponent,
  ],
})
export class TagListComponent extends ManagementListComponent<Tag> {

@@ -159,8 +159,6 @@ export interface Document extends ObjectWithPermissions {

  page_count?: number

  duplicate_documents?: Document[]

  // Frontend only
  __changedFields?: string[]
}

@@ -1,4 +1,3 @@
import { Document } from './document'
import { ObjectWithId } from './object-with-id'

export enum PaperlessTaskType {
@@ -43,7 +42,5 @@ export interface PaperlessTask extends ObjectWithId {

  related_document?: number

  duplicate_documents?: Document[]

  owner?: number
}

@@ -779,45 +779,19 @@ class ConsumerPreflightPlugin(
            Q(checksum=checksum) | Q(archive_checksum=checksum),
        )
        if existing_doc.exists():
            existing_doc = existing_doc.order_by("-created")
            duplicates_in_trash = existing_doc.filter(deleted_at__isnull=False)
            log_msg = (
                f"Consuming duplicate {self.filename}: "
                f"{existing_doc.count()} existing document(s) share the same content."
            )
            msg = ConsumerStatusShortMessage.DOCUMENT_ALREADY_EXISTS
            log_msg = f"Not consuming {self.filename}: It is a duplicate of {existing_doc.get().title} (#{existing_doc.get().pk})."

            if duplicates_in_trash.exists():
                log_msg += " Note: at least one existing document is in the trash."

            self.log.warning(log_msg)
            if existing_doc.first().deleted_at is not None:
                msg = ConsumerStatusShortMessage.DOCUMENT_ALREADY_EXISTS_IN_TRASH
                log_msg += " Note: existing document is in the trash."

            if settings.CONSUMER_DELETE_DUPLICATES:
                duplicate = existing_doc.first()
                duplicate_label = (
                    duplicate.title
                    or duplicate.original_filename
                    or (Path(duplicate.filename).name if duplicate.filename else None)
                    or str(duplicate.pk)
                )

                Path(self.input_doc.original_file).unlink()

                failure_msg = (
                    f"Not consuming {self.filename}: "
                    f"It is a duplicate of {duplicate_label} (#{duplicate.pk})"
                )
                status_msg = ConsumerStatusShortMessage.DOCUMENT_ALREADY_EXISTS

                if duplicates_in_trash.exists():
                    status_msg = (
                        ConsumerStatusShortMessage.DOCUMENT_ALREADY_EXISTS_IN_TRASH
                    )
                    failure_msg += " Note: existing document is in the trash."

                self._fail(
                    status_msg,
                    failure_msg,
                )
            self._fail(
                msg,
                log_msg,
            )

    def pre_check_directories(self):
        """

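A minimal sketch of the checksum-based duplicate lookup this hunk revolves around, assuming the 32-character MD5 `checksum` field and the soft-delete `global_objects` manager seen elsewhere in this diff:

```python
import hashlib
from pathlib import Path

from documents.models import Document


def find_duplicates(path: Path):
    """Return documents whose stored checksum matches this file's MD5."""
    checksum = hashlib.md5(path.read_bytes()).hexdigest()
    # global_objects also includes soft-deleted rows (documents in the trash)
    return Document.global_objects.filter(checksum=checksum)
```
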
@@ -602,7 +602,7 @@ def rewrite_natural_date_keywords(query_string: str) -> str:

        case "this year":
            start = datetime(local_now.year, 1, 1, 0, 0, 0, tzinfo=tz)
            end = datetime(local_now.year, 12, 31, 23, 59, 59, tzinfo=tz)
            end = datetime.combine(today, time.max, tzinfo=tz)

        case "previous week":
            days_since_monday = local_now.weekday()

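The two `end` lines above differ only in their upper bound; a small stdlib-only sketch of the difference:

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

tz = ZoneInfo("UTC")
today = datetime.now(tz).date()

# Fixed bound: last second of the calendar year
end_of_year = datetime(today.year, 12, 31, 23, 59, 59, tzinfo=tz)
# Rolling bound: end of the current day (time.max is 23:59:59.999999)
end_of_today = datetime.combine(today, time.max, tzinfo=tz)
```
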
@@ -1,481 +0,0 @@
from __future__ import annotations

from django.db.models import Q
from django.http import QueryDict
from mcp_server import MCPToolset
from mcp_server import ModelQueryToolset
from mcp_server import drf_publish_create_mcp_tool
from mcp_server import drf_publish_destroy_mcp_tool
from mcp_server import drf_publish_list_mcp_tool
from mcp_server import drf_publish_update_mcp_tool
from rest_framework.response import Response

from documents.models import Correspondent
from documents.models import CustomField
from documents.models import Document
from documents.models import DocumentType
from documents.models import Note
from documents.models import SavedView
from documents.models import ShareLink
from documents.models import StoragePath
from documents.models import Tag
from documents.models import Workflow
from documents.models import WorkflowAction
from documents.models import WorkflowTrigger
from documents.permissions import get_objects_for_user_owner_aware
from documents.views import CorrespondentViewSet
from documents.views import CustomFieldViewSet
from documents.views import DocumentTypeViewSet
from documents.views import SavedViewViewSet
from documents.views import ShareLinkViewSet
from documents.views import StoragePathViewSet
from documents.views import TagViewSet
from documents.views import TasksViewSet
from documents.views import UnifiedSearchViewSet
from documents.views import WorkflowActionViewSet
from documents.views import WorkflowTriggerViewSet
from documents.views import WorkflowViewSet

VIEWSET_ACTIONS = {
    "create": {"post": "create"},
    "list": {"get": "list"},
    "update": {"put": "update"},
    "destroy": {"delete": "destroy"},
}

BODY_SCHEMA = {"type": "object", "additionalProperties": True}

VIEWSET_INSTRUCTIONS = {
    CorrespondentViewSet: "Manage correspondents.",
    TagViewSet: "Manage tags.",
    UnifiedSearchViewSet: "Search and manage documents.",
    DocumentTypeViewSet: "Manage document types.",
    StoragePathViewSet: "Manage storage paths.",
    SavedViewViewSet: "Manage saved views.",
    ShareLinkViewSet: "Manage share links.",
    WorkflowTriggerViewSet: "Manage workflow triggers.",
    WorkflowActionViewSet: "Manage workflow actions.",
    WorkflowViewSet: "Manage workflows.",
    CustomFieldViewSet: "Manage custom fields.",
    TasksViewSet: "List background tasks.",
}


class OwnerAwareQueryToolsetMixin:
    permission: str

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return self.model.objects.none()
        if user.is_superuser:
            return self.model._default_manager.all()
        return get_objects_for_user_owner_aware(user, self.permission, self.model)


class DocumentQueryToolset(ModelQueryToolset):
    model = Document
    search_fields = ["title", "content"]

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return Document.objects.none()
        if user.is_superuser:
            return Document.objects.all()
        return get_objects_for_user_owner_aware(
            user,
            "documents.view_document",
            Document,
        )


class CorrespondentQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = Correspondent
    permission = "documents.view_correspondent"


class TagQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = Tag
    permission = "documents.view_tag"


class DocumentTypeQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = DocumentType
    permission = "documents.view_documenttype"


class StoragePathQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = StoragePath
    permission = "documents.view_storagepath"


class SavedViewQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = SavedView
    permission = "documents.view_savedview"


class ShareLinkQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = ShareLink
    permission = "documents.view_sharelink"


class WorkflowTriggerQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = WorkflowTrigger
    permission = "documents.view_workflowtrigger"


class WorkflowActionQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = WorkflowAction
    permission = "documents.view_workflowaction"


class WorkflowQueryToolset(OwnerAwareQueryToolsetMixin, ModelQueryToolset):
    model = Workflow
    permission = "documents.view_workflow"


class NoteQueryToolset(ModelQueryToolset):
    model = Note

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return Note.objects.none()
        if user.is_superuser:
            return Note.objects.all()
        return Note.objects.filter(
            document__in=get_objects_for_user_owner_aware(
                user,
                "documents.view_document",
                Document,
            ),
        )


class CustomFieldQueryToolset(ModelQueryToolset):
    model = CustomField

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        base = CustomField.objects.all()
        if not user or not user.is_authenticated:
            return base.none()
        if user.is_superuser:
            return base
        return base.filter(
            Q(
                fields__document__id__in=get_objects_for_user_owner_aware(
                    user,
                    "documents.view_document",
                    Document,
                ),
            )
            | Q(fields__document__isnull=True),
        ).distinct()


class DocumentSearchTools(MCPToolset):
    def search_documents(
        self,
        query: str | None = None,
        more_like_id: int | None = None,
        fields: list[str] | None = None,
        page: int | None = None,
        page_size: int | None = None,
        *,
        full_perms: bool | None = None,
    ) -> dict:
        """Search documents using the full-text index."""
        if not query and not more_like_id:
            raise ValueError("Provide either query or more_like_id.")

        request = self.request
        if request is None:
            raise ValueError("Request context is required.")

        viewset = UnifiedSearchViewSet()
        viewset.request = request
        viewset.args = ()
        viewset.kwargs = {}
        viewset.action = "list"
        viewset.format_kwarg = None
        viewset.check_permissions(request)

        query_params = QueryDict(mutable=True)
        if query:
            query_params["query"] = query
        if more_like_id:
            query_params["more_like_id"] = str(more_like_id)
        if full_perms is not None:
            query_params["full_perms"] = str(full_perms).lower()
        if page:
            query_params["page"] = str(page)
        if page_size:
            query_params["page_size"] = str(page_size)
        if fields:
            query_params.setlist("fields", fields)

        request._request.GET = query_params
        response = viewset.list(request)
        if isinstance(response, Response):
            return response.data
        if hasattr(response, "data"):
            return response.data
        return {
            "detail": getattr(response, "content", b"").decode() or "Search failed.",
        }


drf_publish_create_mcp_tool(
    CorrespondentViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    CorrespondentViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
)
drf_publish_update_mcp_tool(
    CorrespondentViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    CorrespondentViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[CorrespondentViewSet],
)

drf_publish_create_mcp_tool(
    TagViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    TagViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
)
drf_publish_update_mcp_tool(
    TagViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    TagViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[TagViewSet],
)

drf_publish_list_mcp_tool(
    UnifiedSearchViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[UnifiedSearchViewSet],
)
drf_publish_update_mcp_tool(
    UnifiedSearchViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[UnifiedSearchViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    UnifiedSearchViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[UnifiedSearchViewSet],
)

drf_publish_create_mcp_tool(
    DocumentTypeViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    DocumentTypeViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
)
drf_publish_update_mcp_tool(
    DocumentTypeViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    DocumentTypeViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[DocumentTypeViewSet],
)

drf_publish_create_mcp_tool(
    StoragePathViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    StoragePathViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
)
drf_publish_update_mcp_tool(
    StoragePathViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    StoragePathViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[StoragePathViewSet],
)

drf_publish_create_mcp_tool(
    SavedViewViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    SavedViewViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
)
drf_publish_update_mcp_tool(
    SavedViewViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    SavedViewViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[SavedViewViewSet],
)

drf_publish_create_mcp_tool(
    ShareLinkViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    ShareLinkViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
)
drf_publish_update_mcp_tool(
    ShareLinkViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    ShareLinkViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[ShareLinkViewSet],
)

drf_publish_create_mcp_tool(
    WorkflowTriggerViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    WorkflowTriggerViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
)
drf_publish_update_mcp_tool(
    WorkflowTriggerViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    WorkflowTriggerViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowTriggerViewSet],
)

drf_publish_create_mcp_tool(
    WorkflowActionViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    WorkflowActionViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
)
drf_publish_update_mcp_tool(
    WorkflowActionViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    WorkflowActionViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowActionViewSet],
)

drf_publish_create_mcp_tool(
    WorkflowViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    WorkflowViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
)
drf_publish_update_mcp_tool(
    WorkflowViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    WorkflowViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[WorkflowViewSet],
)

drf_publish_create_mcp_tool(
    CustomFieldViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    CustomFieldViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
)
drf_publish_update_mcp_tool(
    CustomFieldViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    CustomFieldViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[CustomFieldViewSet],
)

drf_publish_list_mcp_tool(
    TasksViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[TasksViewSet],
)
@@ -1,23 +0,0 @@
# Generated by Django 5.2.7 on 2026-01-14 17:45

from django.db import migrations
from django.db import models


class Migration(migrations.Migration):
    dependencies = [
        ("documents", "0005_workflowtrigger_filter_has_any_correspondents_and_more"),
    ]

    operations = [
        migrations.AlterField(
            model_name="document",
            name="checksum",
            field=models.CharField(
                editable=False,
                max_length=32,
                verbose_name="checksum",
                help_text="The checksum of the original document.",
            ),
        ),
    ]
@@ -1,25 +0,0 @@
# Generated by Django 5.2.6 on 2026-01-24 07:33

import django.db.models.functions.text
from django.db import migrations
from django.db import models


class Migration(migrations.Migration):
    dependencies = [
        ("documents", "0006_alter_document_checksum_unique"),
    ]

    operations = [
        migrations.AddField(
            model_name="document",
            name="content_length",
            field=models.GeneratedField(
                db_persist=True,
                expression=django.db.models.functions.text.Length("content"),
                null=False,
                help_text="Length of the content field in characters. Automatically maintained by the database for faster statistics computation.",
                output_field=models.PositiveIntegerField(default=0),
            ),
        ),
    ]
@@ -20,9 +20,7 @@ if settings.AUDIT_LOG_ENABLED:
    from auditlog.registry import auditlog

from django.db.models import Case
from django.db.models import PositiveIntegerField
from django.db.models.functions import Cast
from django.db.models.functions import Length
from django.db.models.functions import Substr
from django_softdelete.models import SoftDeleteModel

@@ -194,15 +192,6 @@ class Document(SoftDeleteModel, ModelWithOwner):
        ),
    )

    content_length = models.GeneratedField(
        expression=Length("content"),
        output_field=PositiveIntegerField(default=0),
        db_persist=True,
        null=False,
        serialize=False,
        help_text="Length of the content field in characters. Automatically maintained by the database for faster statistics computation.",
    )

    mime_type = models.CharField(_("mime type"), max_length=256, editable=False)

    tags = models.ManyToManyField(
@@ -216,6 +205,7 @@ class Document(SoftDeleteModel, ModelWithOwner):
        _("checksum"),
        max_length=32,
        editable=False,
        unique=True,
        help_text=_("The checksum of the original document."),
    )

@@ -956,7 +946,7 @@ if settings.AUDIT_LOG_ENABLED:
    auditlog.register(
        Document,
        m2m_fields={"tags"},
        exclude_fields=["content_length", "modified"],
        exclude_fields=["modified"],
    )
    auditlog.register(Correspondent)
    auditlog.register(Tag)
@@ -148,29 +148,13 @@ def get_document_count_filter_for_user(user):
    )


def get_objects_for_user_owner_aware(
    user,
    perms,
    Model,
    *,
    include_deleted=False,
) -> QuerySet:
    """
    Returns objects the user owns, are unowned, or has explicit perms.
    When include_deleted is True, soft-deleted items are also included.
    """
    manager = (
        Model.global_objects
        if include_deleted and hasattr(Model, "global_objects")
        else Model.objects
    )

    objects_owned = manager.filter(owner=user)
    objects_unowned = manager.filter(owner__isnull=True)
def get_objects_for_user_owner_aware(user, perms, Model) -> QuerySet:
    objects_owned = Model.objects.filter(owner=user)
    objects_unowned = Model.objects.filter(owner__isnull=True)
    objects_with_perms = get_objects_for_user(
        user=user,
        perms=perms,
        klass=manager.all(),
        klass=Model,
        accept_global_perms=False,
    )
    return objects_owned | objects_unowned | objects_with_perms
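# A usage sketch of the owner-aware helper above, mirroring how the view layer
# calls it elsewhere in this diff (permission string and model as positional
# arguments); `user` is assumed to be an authenticated User and Document to be
# imported at the call site.
visible_docs = get_objects_for_user_owner_aware(
    user,
    "documents.view_document",
    Document,
)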
@@ -23,7 +23,6 @@ from django.core.validators import MinValueValidator
from django.core.validators import RegexValidator
from django.core.validators import integer_validator
from django.db.models import Count
from django.db.models import Q
from django.db.models.functions import Lower
from django.utils.crypto import get_random_string
from django.utils.dateparse import parse_datetime
@@ -73,7 +72,6 @@ from documents.models import WorkflowTrigger
from documents.parsers import is_mime_type_supported
from documents.permissions import get_document_count_filter_for_user
from documents.permissions import get_groups_with_only_permission
from documents.permissions import get_objects_for_user_owner_aware
from documents.permissions import set_permissions_for_object
from documents.regex import validate_regex_pattern
from documents.templating.filepath import validate_filepath_template_and_render
@@ -84,9 +82,6 @@ from documents.validators import url_validator
if TYPE_CHECKING:
    from collections.abc import Iterable

    from django.db.models.query import QuerySet


logger = logging.getLogger("paperless.serializers")


@@ -1019,32 +1014,6 @@ class NotesSerializer(serializers.ModelSerializer):
        return ret


def _get_viewable_duplicates(
    document: Document,
    user: User | None,
) -> QuerySet[Document]:
    checksums = {document.checksum}
    if document.archive_checksum:
        checksums.add(document.archive_checksum)
    duplicates = Document.global_objects.filter(
        Q(checksum__in=checksums) | Q(archive_checksum__in=checksums),
    ).exclude(pk=document.pk)
    duplicates = duplicates.order_by("-created")
    allowed = get_objects_for_user_owner_aware(
        user,
        "documents.view_document",
        Document,
        include_deleted=True,
    )
    return duplicates.filter(id__in=allowed)
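# A sketch of how the helper above feeds the summary serializer below; the
# values() keys match DuplicateDocumentSummarySerializer's fields, exactly as
# the get_duplicate_documents methods in this hunk consume them.
summaries = list(
    _get_viewable_duplicates(document, user).values("id", "title", "deleted_at"),
)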
class DuplicateDocumentSummarySerializer(serializers.Serializer):
    id = serializers.IntegerField()
    title = serializers.CharField()
    deleted_at = serializers.DateTimeField(allow_null=True)


@extend_schema_serializer(
    deprecate_fields=["created_date"],
)
@@ -1062,7 +1031,6 @@ class DocumentSerializer(
    archived_file_name = SerializerMethodField()
    created_date = serializers.DateField(required=False)
    page_count = SerializerMethodField()
    duplicate_documents = SerializerMethodField()

    notes = NotesSerializer(many=True, required=False, read_only=True)

@@ -1088,16 +1056,6 @@ class DocumentSerializer(
    def get_page_count(self, obj) -> int | None:
        return obj.page_count

    @extend_schema_field(DuplicateDocumentSummarySerializer(many=True))
    def get_duplicate_documents(self, obj):
        view = self.context.get("view")
        if view and getattr(view, "action", None) != "retrieve":
            return []
        request = self.context.get("request")
        user = request.user if request else None
        duplicates = _get_viewable_duplicates(obj, user)
        return list(duplicates.values("id", "title", "deleted_at"))

    def get_original_file_name(self, obj) -> str | None:
        return obj.original_filename

@@ -1275,7 +1233,6 @@ class DocumentSerializer(
            "archive_serial_number",
            "original_file_name",
            "archived_file_name",
            "duplicate_documents",
            "owner",
            "permissions",
            "user_can_change",
@@ -2137,12 +2094,10 @@ class TasksViewSerializer(OwnedObjectSerializer):
            "result",
            "acknowledged",
            "related_document",
            "duplicate_documents",
            "owner",
        )

    related_document = serializers.SerializerMethodField()
    duplicate_documents = serializers.SerializerMethodField()
    created_doc_re = re.compile(r"New document id (\d+) created")
    duplicate_doc_re = re.compile(r"It is a duplicate of .* \(#(\d+)\)")

@@ -2167,17 +2122,6 @@ class TasksViewSerializer(OwnedObjectSerializer):

        return result

    @extend_schema_field(DuplicateDocumentSummarySerializer(many=True))
    def get_duplicate_documents(self, obj):
        related_document = self.get_related_document(obj)
        request = self.context.get("request")
        user = request.user if request else None
        document = Document.global_objects.filter(pk=related_document).first()
        if not related_document or not user or not document:
            return []
        duplicates = _get_viewable_duplicates(document, user)
        return list(duplicates.values("id", "title", "deleted_at"))


class RunTaskViewSerializer(serializers.Serializer):
    task_name = serializers.ChoiceField(
@@ -19,6 +19,7 @@ from django.db import DatabaseError
from django.db import close_old_connections
from django.db import connections
from django.db import models
from django.db import transaction
from django.db.models import Q
from django.dispatch import receiver
from django.utils import timezone

@@ -453,62 +454,94 @@ def update_filename_and_move_files(
        # This will in turn cause this logic to move the file where it belongs.
        return

    with FileLock(settings.MEDIA_LOCK):
        try:
            # If this was waiting for the lock, the filename or archive_filename
            # of this document may have been updated. This happens if multiple updates
            # get queued from the UI for the same document
            # So freshen up the data before doing anything
            instance.refresh_from_db()
    move_original = False
    move_archive = False
    old_filename = None
    old_archive_filename = None
    old_source_path = None
    old_archive_path = None

            old_filename = instance.filename
            old_source_path = instance.source_path
    try:
        with transaction.atomic():
            Document.global_objects.select_for_update().get(pk=instance.pk)
            with FileLock(settings.MEDIA_LOCK):
                # If this was waiting for the lock, the filename or archive_filename
                # of this document may have been updated. This happens if multiple updates
                # get queued from the UI for the same document
                # So freshen up the data before doing anything
                instance.refresh_from_db()

            candidate_filename = generate_filename(instance)
            candidate_source_path = (
                settings.ORIGINALS_DIR / candidate_filename
            ).resolve()
            if candidate_filename == Path(old_filename):
                new_filename = Path(old_filename)
            elif (
                candidate_source_path.exists()
                and candidate_source_path != old_source_path
            ):
                # Only fall back to unique search when there is an actual conflict
                new_filename = generate_unique_filename(instance)
            else:
                new_filename = candidate_filename
                old_filename = instance.filename
                old_source_path = instance.source_path

            # Need to convert to string to be able to save it to the db
            instance.filename = str(new_filename)
            move_original = old_filename != instance.filename

            old_archive_filename = instance.archive_filename
            old_archive_path = instance.archive_path

            if instance.has_archive_version:
                archive_candidate = generate_filename(instance, archive_filename=True)
                archive_candidate_path = (
                    settings.ARCHIVE_DIR / archive_candidate
                candidate_filename = generate_filename(instance)
                candidate_source_path = (
                    settings.ORIGINALS_DIR / candidate_filename
                ).resolve()
                if archive_candidate == Path(old_archive_filename):
                    new_archive_filename = Path(old_archive_filename)
                if candidate_filename == Path(old_filename):
                    new_filename = Path(old_filename)
                elif (
                    archive_candidate_path.exists()
                    and archive_candidate_path != old_archive_path
                    candidate_source_path.exists()
                    and candidate_source_path != old_source_path
                ):
                    new_archive_filename = generate_unique_filename(
                    # Only fall back to unique search when there is an actual conflict
                    new_filename = generate_unique_filename(instance)
                else:
                    new_filename = candidate_filename

                # Need to convert to string to be able to save it to the db
                instance.filename = str(new_filename)
                move_original = old_filename != instance.filename

                old_archive_filename = instance.archive_filename
                old_archive_path = instance.archive_path

                if instance.has_archive_version:
                    archive_candidate = generate_filename(
                        instance,
                        archive_filename=True,
                    )
                    archive_candidate_path = (
                        settings.ARCHIVE_DIR / archive_candidate
                    ).resolve()
                    if archive_candidate == Path(old_archive_filename):
                        new_archive_filename = Path(old_archive_filename)
                    elif (
                        archive_candidate_path.exists()
                        and archive_candidate_path != old_archive_path
                    ):
                        new_archive_filename = generate_unique_filename(
                            instance,
                            archive_filename=True,
                        )
                    else:
                        new_archive_filename = archive_candidate

                    instance.archive_filename = str(new_archive_filename)

                    move_archive = old_archive_filename != instance.archive_filename
                else:
                    new_archive_filename = archive_candidate
                    move_archive = False

                instance.archive_filename = str(new_archive_filename)
                if move_original:
                    validate_move(
                        instance,
                        old_source_path,
                        instance.source_path,
                        settings.ORIGINALS_DIR,
                    )
                    create_source_path_directory(instance.source_path)
                    shutil.move(old_source_path, instance.source_path)

                move_archive = old_archive_filename != instance.archive_filename
            else:
                move_archive = False
                if move_archive:
                    validate_move(
                        instance,
                        old_archive_path,
                        instance.archive_path,
                        settings.ARCHIVE_DIR,
                    )
                    create_source_path_directory(instance.archive_path)
                    shutil.move(old_archive_path, instance.archive_path)

            if not move_original and not move_archive:
                # Just update modified. Also, don't save() here to prevent infinite recursion.
@@ -517,26 +550,6 @@ def update_filename_and_move_files(
                )
                return

    if move_original:
        validate_move(
            instance,
            old_source_path,
            instance.source_path,
            settings.ORIGINALS_DIR,
        )
        create_source_path_directory(instance.source_path)
        shutil.move(old_source_path, instance.source_path)

    if move_archive:
        validate_move(
            instance,
            old_archive_path,
            instance.archive_path,
            settings.ARCHIVE_DIR,
        )
        create_source_path_directory(instance.archive_path)
        shutil.move(old_archive_path, instance.archive_path)

            # Don't save() here to prevent infinite recursion.
            Document.global_objects.filter(pk=instance.pk).update(
                filename=instance.filename,
@@ -546,22 +559,24 @@ def update_filename_and_move_files(
            # Clear any caching for this document. Slightly overkill, but not terrible
            clear_document_caches(instance.pk)

        except (OSError, DatabaseError, CannotMoveFilesException) as e:
            logger.warning(f"Exception during file handling: {e}")
            # This happens when either:
            # - moving the files failed due to file system errors
            # - saving to the database failed due to database errors
            # In both cases, we need to revert to the original state.
    except (OSError, DatabaseError, CannotMoveFilesException) as e:
        logger.warning(f"Exception during file handling: {e}")
        # This happens when either:
        # - moving the files failed due to file system errors
        # - saving to the database failed due to database errors
        # In both cases, we need to revert to the original state.

        if move_original or move_archive:
            # Try to move files to their original location.
            try:
                if move_original and instance.source_path.is_file():
                    logger.info("Restoring previous original path")
                    shutil.move(instance.source_path, old_source_path)
                with FileLock(settings.MEDIA_LOCK):
                    if move_original and instance.source_path.is_file():
                        logger.info("Restoring previous original path")
                        shutil.move(instance.source_path, old_source_path)

                    if move_archive and instance.archive_path.is_file():
                        logger.info("Restoring previous archive path")
                        shutil.move(instance.archive_path, old_archive_path)
                    if move_archive and instance.archive_path.is_file():
                        logger.info("Restoring previous archive path")
                        shutil.move(instance.archive_path, old_archive_path)

            except Exception:
                # This is fine, since:
@@ -574,23 +589,29 @@ def update_filename_and_move_files(
                # anyway.
                pass

        # restore old values on the instance
        # restore old values on the instance
        if old_filename is not None:
            instance.filename = old_filename
        if old_archive_filename is not None:
            instance.archive_filename = old_archive_filename

    # finally, remove any empty sub folders. This will do nothing if
    # something has failed above.
    if not old_source_path.is_file():
        delete_empty_directories(
            Path(old_source_path).parent,
            root=settings.ORIGINALS_DIR,
        )
    # finally, remove any empty sub folders. This will do nothing if
    # something has failed above.
    if old_source_path and not old_source_path.is_file():
        delete_empty_directories(
            Path(old_source_path).parent,
            root=settings.ORIGINALS_DIR,
        )

    if instance.has_archive_version and not old_archive_path.is_file():
        delete_empty_directories(
            Path(old_archive_path).parent,
            root=settings.ARCHIVE_DIR,
        )
    if (
        instance.has_archive_version
        and old_archive_path
        and not old_archive_path.is_file()
    ):
        delete_empty_directories(
            Path(old_archive_path).parent,
            root=settings.ARCHIVE_DIR,
        )


@shared_task
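# A condensed sketch of the locking order introduced in the hunk above, assuming
# the same FileLock/transaction primitives: take the database row lock first,
# then the media lock, so concurrent renames of one document serialize before
# touching the filesystem. The helper name is illustrative.
def _locked_rename(instance):
    with transaction.atomic():
        # Row lock: blocks other rename tasks for this document until commit.
        Document.global_objects.select_for_update().get(pk=instance.pk)
        with FileLock(settings.MEDIA_LOCK):
            # Re-read, since another task may have renamed while we waited.
            instance.refresh_from_db()
            # ... compute new filenames and move files as in the signal above ...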
@@ -131,10 +131,6 @@ class TestDocumentApi(DirectoriesMixin, DocumentConsumeDelayMixin, APITestCase):
        self.assertIn("content", results_full[0])
        self.assertIn("id", results_full[0])

        # Content length is used internally for performance reasons.
        # No need to expose this field.
        self.assertNotIn("content_length", results_full[0])

        response = self.client.get("/api/documents/?fields=id", format="json")
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        results = response.data["results"]
@@ -7,7 +7,6 @@ from django.contrib.auth.models import User
from rest_framework import status
from rest_framework.test import APITestCase

from documents.models import Document
from documents.models import PaperlessTask
from documents.tests.utils import DirectoriesMixin
from documents.views import TasksViewSet
@@ -259,7 +258,7 @@ class TestTasks(DirectoriesMixin, APITestCase):
            task_id=str(uuid.uuid4()),
            task_file_name="task_one.pdf",
            status=celery.states.FAILURE,
            result="test.pdf: Unexpected error during ingestion.",
            result="test.pdf: Not consuming test.pdf: It is a duplicate.",
        )

        response = self.client.get(self.ENDPOINT)
@@ -271,7 +270,7 @@ class TestTasks(DirectoriesMixin, APITestCase):

        self.assertEqual(
            returned_data["result"],
            "test.pdf: Unexpected error during ingestion.",
            "test.pdf: Not consuming test.pdf: It is a duplicate.",
        )

    def test_task_name_webui(self):
@@ -326,34 +325,20 @@ class TestTasks(DirectoriesMixin, APITestCase):

        self.assertEqual(returned_data["task_file_name"], "anothertest.pdf")

    def test_task_result_duplicate_warning_includes_count(self):
    def test_task_result_failed_duplicate_includes_related_doc(self):
        """
        GIVEN:
            - A celery task succeeds, but a duplicate exists
            - A celery task failed with a duplicate error
        WHEN:
            - API call is made to get tasks
        THEN:
            - The returned data includes duplicate warning metadata
            - The returned data includes a related document link
        """
        checksum = "duplicate-checksum"
        Document.objects.create(
            title="Existing",
            content="",
            mime_type="application/pdf",
            checksum=checksum,
        )
        created_doc = Document.objects.create(
            title="Created",
            content="",
            mime_type="application/pdf",
            checksum=checksum,
            archive_checksum="another-checksum",
        )
        PaperlessTask.objects.create(
            task_id=str(uuid.uuid4()),
            task_file_name="task_one.pdf",
            status=celery.states.SUCCESS,
            result=f"Success. New document id {created_doc.pk} created",
            status=celery.states.FAILURE,
            result="Not consuming task_one.pdf: It is a duplicate of task_one_existing.pdf (#1234).",
        )

        response = self.client.get(self.ENDPOINT)
@@ -363,7 +348,7 @@ class TestTasks(DirectoriesMixin, APITestCase):

        returned_data = response.data[0]

        self.assertEqual(returned_data["related_document"], str(created_doc.pk))
        self.assertEqual(returned_data["related_document"], "1234")

    def test_run_train_classifier_task(self):
        """
@@ -485,21 +485,21 @@ class TestConsumer(
        with self.get_consumer(self.get_test_file()) as consumer:
            consumer.run()

        with self.get_consumer(self.get_test_file()) as consumer:
            consumer.run()
        with self.assertRaisesMessage(ConsumerError, "It is a duplicate"):
            with self.get_consumer(self.get_test_file()) as consumer:
                consumer.run()

        self.assertEqual(Document.objects.count(), 2)
        self._assert_first_last_send_progress()
        self._assert_first_last_send_progress(last_status="FAILED")

    def testDuplicates2(self):
        with self.get_consumer(self.get_test_file()) as consumer:
            consumer.run()

        with self.get_consumer(self.get_test_archive_file()) as consumer:
            consumer.run()
        with self.assertRaisesMessage(ConsumerError, "It is a duplicate"):
            with self.get_consumer(self.get_test_archive_file()) as consumer:
                consumer.run()

        self.assertEqual(Document.objects.count(), 2)
        self._assert_first_last_send_progress()
        self._assert_first_last_send_progress(last_status="FAILED")

    def testDuplicates3(self):
        with self.get_consumer(self.get_test_archive_file()) as consumer:
@@ -513,10 +513,9 @@ class TestConsumer(

        Document.objects.all().delete()

        with self.get_consumer(self.get_test_file()) as consumer:
            consumer.run()

        self.assertEqual(Document.objects.count(), 1)
        with self.assertRaisesMessage(ConsumerError, "document is in the trash"):
            with self.get_consumer(self.get_test_file()) as consumer:
                consumer.run()

    def testAsnExists(self):
        with self.get_consumer(
@@ -719,45 +718,12 @@ class TestConsumer(
        dst = self.get_test_file()
        self.assertIsFile(dst)

        expected_message = (
            f"{dst.name}: Not consuming {dst.name}: "
            f"It is a duplicate of {document.title} (#{document.pk})"
        )

        with self.assertRaisesMessage(ConsumerError, expected_message):
        with self.assertRaises(ConsumerError):
            with self.get_consumer(dst) as consumer:
                consumer.run()

        self.assertIsNotFile(dst)
        self.assertEqual(Document.objects.count(), 1)
        self._assert_first_last_send_progress(last_status=ProgressStatusOptions.FAILED)

    @override_settings(CONSUMER_DELETE_DUPLICATES=True)
    def test_delete_duplicate_in_trash(self):
        dst = self.get_test_file()
        with self.get_consumer(dst) as consumer:
            consumer.run()

        # Move the existing document to trash
        document = Document.objects.first()
        document.delete()

        dst = self.get_test_file()
        self.assertIsFile(dst)

        expected_message = (
            f"{dst.name}: Not consuming {dst.name}: "
            f"It is a duplicate of {document.title} (#{document.pk})"
            f" Note: existing document is in the trash."
        )

        with self.assertRaisesMessage(ConsumerError, expected_message):
            with self.get_consumer(dst) as consumer:
                consumer.run()

        self.assertIsNotFile(dst)
        self.assertEqual(Document.global_objects.count(), 1)
        self.assertEqual(Document.objects.count(), 0)
        self._assert_first_last_send_progress(last_status="FAILED")

    @override_settings(CONSUMER_DELETE_DUPLICATES=False)
    def test_no_delete_duplicate(self):
@@ -777,12 +743,15 @@ class TestConsumer(
        dst = self.get_test_file()
        self.assertIsFile(dst)

        with self.get_consumer(dst) as consumer:
            consumer.run()
        with self.assertRaisesRegex(
            ConsumerError,
            r"sample\.pdf: Not consuming sample\.pdf: It is a duplicate of sample \(#\d+\)",
        ):
            with self.get_consumer(dst) as consumer:
                consumer.run()

        self.assertIsNotFile(dst)
        self.assertEqual(Document.objects.count(), 2)
        self._assert_first_last_send_progress()
        self.assertIsFile(dst)
        self._assert_first_last_send_progress(last_status="FAILED")

    @override_settings(FILENAME_FORMAT="{title}")
    @mock.patch("documents.parsers.document_consumer_declaration.send")
@@ -180,7 +180,7 @@ class TestRewriteNaturalDateKeywords(SimpleTestCase):
        (
            "added:this year",
            datetime(2025, 7, 15, 12, 0, 0, tzinfo=timezone.utc),
            ("added:[20250101", "TO 20251231"),
            ("added:[20250101", "TO 20250715"),
        ),
        (
            "added:previous year",

@@ -241,10 +241,6 @@ class TestExportImport(
            checksum = hashlib.md5(f.read()).hexdigest()
            self.assertEqual(checksum, element["fields"]["checksum"])

            # Generated field "content_length" should not be exported,
            # it is automatically computed during import.
            self.assertNotIn("content_length", element["fields"])

            if document_exporter.EXPORTER_ARCHIVE_NAME in element:
                fname = (
                    self.target / element[document_exporter.EXPORTER_ARCHIVE_NAME]
@@ -35,6 +35,7 @@ from django.db.models import Model
from django.db.models import Q
from django.db.models import Sum
from django.db.models import When
from django.db.models.functions import Length
from django.db.models.functions import Lower
from django.db.models.manager import Manager
from django.http import FileResponse
@@ -478,11 +479,11 @@ class TagViewSet(ModelViewSet, PermissionsAwareDocumentCountMixin):

        if descendant_pks:
            filter_q = self.get_document_count_filter()
            children_source = list(
            children_source = (
                Tag.objects.filter(pk__in=descendant_pks | {t.pk for t in all_tags})
                .select_related("owner")
                .annotate(document_count=Count("documents", filter=filter_q))
                .order_by(*ordering),
                .order_by(*ordering)
            )
        else:
            children_source = all_tags
@@ -494,11 +495,7 @@ class TagViewSet(ModelViewSet, PermissionsAwareDocumentCountMixin):

        page = self.paginate_queryset(queryset)
        serializer = self.get_serializer(page, many=True)
        response = self.get_paginated_response(serializer.data)
        if descendant_pks:
            # Include children in the "all" field, if needed
            response.data["all"] = [tag.pk for tag in children_source]
        return response
        return self.get_paginated_response(serializer.data)

    def perform_update(self, serializer):
        old_parent = self.get_object().get_parent()
@@ -2325,19 +2322,23 @@ class StatisticsView(GenericAPIView):
        user = request.user if request.user is not None else None

        documents = (
            Document.objects.all()
            if user is None
            else get_objects_for_user_owner_aware(
                user,
                "documents.view_document",
                Document,
            (
                Document.objects.all()
                if user is None
                else get_objects_for_user_owner_aware(
                    user,
                    "documents.view_document",
                    Document,
                )
            )
            .only("mime_type", "content")
            .prefetch_related("tags")
        )
        tags = (
            Tag.objects.all()
            if user is None
            else get_objects_for_user_owner_aware(user, "documents.view_tag", Tag)
        ).only("id", "is_inbox_tag")
        )
        correspondent_count = (
            Correspondent.objects.count()
            if user is None
@@ -2366,33 +2367,31 @@ class StatisticsView(GenericAPIView):
        ).count()
        )

        inbox_tag_pks = list(
            tags.filter(is_inbox_tag=True).values_list("pk", flat=True),
        )
        documents_total = documents.count()

        inbox_tags = tags.filter(is_inbox_tag=True)

        documents_inbox = (
            documents.filter(tags__id__in=inbox_tag_pks).values("id").distinct().count()
            if inbox_tag_pks
            documents.filter(tags__id__in=inbox_tags).distinct().count()
            if inbox_tags.exists()
            else None
        )

        # Single SQL request for document stats and mime type counts
        mime_type_stats = list(
        document_file_type_counts = (
            documents.values("mime_type")
            .annotate(
                mime_type_count=Count("id"),
                mime_type_chars=Sum("content_length"),
            )
            .order_by("-mime_type_count"),
            .annotate(mime_type_count=Count("mime_type"))
            .order_by("-mime_type_count")
            if documents_total > 0
            else []
        )

        # Calculate totals from grouped results
        documents_total = sum(row["mime_type_count"] for row in mime_type_stats)
        character_count = sum(row["mime_type_chars"] or 0 for row in mime_type_stats)
        document_file_type_counts = [
            {"mime_type": row["mime_type"], "mime_type_count": row["mime_type_count"]}
            for row in mime_type_stats
        ]
        character_count = (
            documents.annotate(
                characters=Length("content"),
            )
            .aggregate(Sum("characters"))
            .get("characters__sum")
        )

        current_asn = Document.objects.aggregate(
            Max("archive_serial_number", default=0),
@@ -2405,9 +2404,11 @@ class StatisticsView(GenericAPIView):
            "documents_total": documents_total,
            "documents_inbox": documents_inbox,
            "inbox_tag": (
                inbox_tag_pks[0] if inbox_tag_pks else None
                inbox_tags.first().pk if inbox_tags.exists() else None
            ),  # backwards compatibility
            "inbox_tags": (inbox_tag_pks if inbox_tag_pks else None),
            "inbox_tags": (
                [tag.pk for tag in inbox_tags] if inbox_tags.exists() else None
            ),
            "document_file_type_counts": document_file_type_counts,
            "character_count": character_count,
            "tag_count": len(tags),
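# A condensed sketch of the single-query aggregation on the removed side of the
# hunk above: grouping by mime type while summing the persisted content_length
# column lets both the per-type counts and the character total fall out of one
# grouped query instead of a separate Length("content") aggregate.
mime_type_stats = list(
    documents.values("mime_type")
    .annotate(
        mime_type_count=Count("id"),
        mime_type_chars=Sum("content_length"),
    )
    .order_by("-mime_type_count"),
)
documents_total = sum(row["mime_type_count"] for row in mime_type_stats)
character_count = sum(row["mime_type_chars"] or 0 for row in mime_type_stats)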
File diff suppressed because it is too large.
@@ -3,15 +3,12 @@ from urllib.parse import quote

from allauth.account.adapter import DefaultAccountAdapter
from allauth.core import context
from allauth.headless.tokens.sessions import SessionTokenStrategy
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from django.conf import settings
from django.contrib.auth.models import Group
from django.contrib.auth.models import User
from django.forms import ValidationError
from django.http import HttpRequest
from django.urls import reverse
from rest_framework.authtoken.models import Token

from documents.models import Document
from paperless.signals import handle_social_account_updated
@@ -162,11 +159,3 @@ class CustomSocialAccountAdapter(DefaultSocialAccountAdapter):
            exception,
            extra_context,
        )


class DrfTokenStrategy(SessionTokenStrategy):
    def create_access_token(self, request: HttpRequest) -> str | None:
        if not request.user.is_authenticated:
            return None
        token, _ = Token.objects.get_or_create(user=request.user)
        return token.key
@@ -1,82 +0,0 @@
from mcp_server import drf_publish_create_mcp_tool
from mcp_server import drf_publish_destroy_mcp_tool
from mcp_server import drf_publish_list_mcp_tool
from mcp_server import drf_publish_update_mcp_tool

from paperless.views import ApplicationConfigurationViewSet
from paperless.views import GroupViewSet
from paperless.views import UserViewSet

VIEWSET_ACTIONS = {
    "create": {"post": "create"},
    "list": {"get": "list"},
    "update": {"put": "update"},
    "destroy": {"delete": "destroy"},
}

BODY_SCHEMA = {"type": "object", "additionalProperties": True}

VIEWSET_INSTRUCTIONS = {
    UserViewSet: "Manage Paperless users.",
    GroupViewSet: "Manage Paperless groups.",
    ApplicationConfigurationViewSet: "Manage application configuration.",
}


drf_publish_create_mcp_tool(
    UserViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    UserViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
)
drf_publish_update_mcp_tool(
    UserViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    UserViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[UserViewSet],
)

drf_publish_create_mcp_tool(
    GroupViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    GroupViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
)
drf_publish_update_mcp_tool(
    GroupViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    GroupViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[GroupViewSet],
)

drf_publish_list_mcp_tool(
    ApplicationConfigurationViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[ApplicationConfigurationViewSet],
)
drf_publish_update_mcp_tool(
    ApplicationConfigurationViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[ApplicationConfigurationViewSet],
    body_schema=BODY_SCHEMA,
)
@@ -345,10 +345,8 @@ INSTALLED_APPS = [
    "allauth.account",
    "allauth.socialaccount",
    "allauth.mfa",
    "allauth.headless",
    "drf_spectacular",
    "drf_spectacular_sidecar",
    "mcp_server",
    "treenode",
    *env_apps,
]
@@ -541,12 +539,6 @@ SOCIALACCOUNT_PROVIDERS = json.loads(
)
SOCIAL_ACCOUNT_DEFAULT_GROUPS = __get_list("PAPERLESS_SOCIAL_ACCOUNT_DEFAULT_GROUPS")
SOCIAL_ACCOUNT_SYNC_GROUPS = __get_boolean("PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS")
SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM: Final[str] = os.getenv(
    "PAPERLESS_SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM",
    "groups",
)

HEADLESS_TOKEN_STRATEGY = "paperless.adapter.DrfTokenStrategy"

MFA_TOTP_ISSUER = "Paperless-ngx"

@@ -613,17 +605,6 @@ def _parse_remote_user_settings() -> str:

HTTP_REMOTE_USER_HEADER_NAME = _parse_remote_user_settings()

DJANGO_MCP_AUTHENTICATION_CLASSES = REST_FRAMEWORK["DEFAULT_AUTHENTICATION_CLASSES"]
DJANGO_MCP_GLOBAL_SERVER_CONFIG = {
    "name": "paperless-ngx",
    "instructions": (
        "Use the MCP tools to search, query, and manage Paperless-ngx data. "
        "Use `search_documents` for full-text search, and `query_data_collections` "
        "for structured queries against available collections. "
        "Write operations are exposed via DRF-backed tools for create/update/delete."
    ),
}

# X-Frame options for embedded PDF display:
X_FRAME_OPTIONS = "SAMEORIGIN"
@@ -40,19 +40,15 @@ def handle_social_account_updated(sender, request, sociallogin, **kwargs):

    extra_data = sociallogin.account.extra_data or {}
    social_account_groups = extra_data.get(
        settings.SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM,
        "groups",
        [],
    )  # pre-allauth 65.11.0 structure

    if not social_account_groups:
        # allauth 65.11.0+ nests claims under `userinfo`/`id_token`
        social_account_groups = (
            extra_data.get("userinfo", {}).get(
                settings.SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM,
            )
            or extra_data.get("id_token", {}).get(
                settings.SOCIAL_ACCOUNT_SYNC_GROUPS_CLAIM,
            )
            extra_data.get("userinfo", {}).get("groups")
            or extra_data.get("id_token", {}).get("groups")
            or []
        )
    if settings.SOCIAL_ACCOUNT_SYNC_GROUPS and social_account_groups is not None:
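# Illustrative extra_data shapes handled by the lookup above (values are assumed
# examples): older allauth versions stored claims at the top level, while
# 65.11.0+ nests them under `userinfo`/`id_token`, per the comments in the hunk.
legacy_extra_data = {"groups": ["accounting"]}
nested_extra_data = {"userinfo": {"groups": ["accounting"]}, "id_token": {}}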
@@ -4,7 +4,6 @@ from allauth.account.adapter import get_adapter
from allauth.core import context
from allauth.socialaccount.adapter import get_adapter as get_social_adapter
from django.conf import settings
from django.contrib.auth.models import AnonymousUser
from django.contrib.auth.models import Group
from django.contrib.auth.models import User
from django.forms import ValidationError
@@ -12,9 +11,6 @@ from django.http import HttpRequest
from django.test import TestCase
from django.test import override_settings
from django.urls import reverse
from rest_framework.authtoken.models import Token

from paperless.adapter import DrfTokenStrategy


class TestCustomAccountAdapter(TestCase):
@@ -185,74 +181,3 @@ class TestCustomSocialAccountAdapter(TestCase):
        self.assertTrue(
            any("Test authentication error" in message for message in log_cm.output),
        )


class TestDrfTokenStrategy(TestCase):
    def test_create_access_token_creates_new_token(self):
        """
        GIVEN:
            - A user with no existing DRF token
        WHEN:
            - create_access_token is called
        THEN:
            - A new token is created and its key is returned
        """

        user = User.objects.create_user("testuser")
        request = HttpRequest()
        request.user = user

        strategy = DrfTokenStrategy()
        token_key = strategy.create_access_token(request)

        # Verify a token was created
        self.assertIsNotNone(token_key)
        self.assertTrue(Token.objects.filter(user=user).exists())

        # Verify the returned key matches the created token
        token = Token.objects.get(user=user)
        self.assertEqual(token_key, token.key)

    def test_create_access_token_returns_existing_token(self):
        """
        GIVEN:
            - A user with an existing DRF token
        WHEN:
            - create_access_token is called again
        THEN:
            - The same token key is returned (no new token created)
        """

        user = User.objects.create_user("testuser")
        existing_token = Token.objects.create(user=user)

        request = HttpRequest()
        request.user = user

        strategy = DrfTokenStrategy()
        token_key = strategy.create_access_token(request)

        # Verify the existing token key is returned
        self.assertEqual(token_key, existing_token.key)

        # Verify only one token exists (no duplicate created)
        self.assertEqual(Token.objects.filter(user=user).count(), 1)

    def test_create_access_token_returns_none_for_unauthenticated_user(self):
        """
        GIVEN:
            - An unauthenticated request
        WHEN:
            - create_access_token is called
        THEN:
            - None is returned and no token is created
        """

        request = HttpRequest()
        request.user = AnonymousUser()

        strategy = DrfTokenStrategy()
        token_key = strategy.create_access_token(request)

        self.assertIsNone(token_key)
        self.assertEqual(Token.objects.count(), 0)
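# A client-side sketch of consuming the key returned by DrfTokenStrategy above:
# it is a standard DRF token, so clients send it in the Authorization header.
# The base URL is illustrative.
import requests

response = requests.get(
    "http://localhost:8000/api/documents/",
    headers={"Authorization": f"Token {token_key}"},
    timeout=10,
)
response.raise_for_status()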
@@ -228,7 +228,6 @@ urlpatterns = [
            ],
        ),
    ),
    re_path("^auth/headless/", include("allauth.headless.urls")),
    re_path(
        "^$",  # Redirect to the API swagger view
        RedirectView.as_view(url="schema/view/"),
@@ -356,7 +355,6 @@ urlpatterns = [
            ],
        ),
    ),
    path("", include("mcp_server.urls")),
    # Root of the Frontend
    re_path(
        r".*",
@@ -1,129 +0,0 @@
from mcp_server import ModelQueryToolset
from mcp_server import drf_publish_create_mcp_tool
from mcp_server import drf_publish_destroy_mcp_tool
from mcp_server import drf_publish_list_mcp_tool
from mcp_server import drf_publish_update_mcp_tool

from documents.permissions import get_objects_for_user_owner_aware
from paperless_mail.models import MailAccount
from paperless_mail.models import MailRule
from paperless_mail.models import ProcessedMail
from paperless_mail.views import MailAccountViewSet
from paperless_mail.views import MailRuleViewSet
from paperless_mail.views import ProcessedMailViewSet

VIEWSET_ACTIONS = {
    "create": {"post": "create"},
    "list": {"get": "list"},
    "update": {"put": "update"},
    "destroy": {"delete": "destroy"},
}

BODY_SCHEMA = {"type": "object", "additionalProperties": True}

VIEWSET_INSTRUCTIONS = {
    MailAccountViewSet: "Manage mail accounts.",
    MailRuleViewSet: "Manage mail rules.",
    ProcessedMailViewSet: "List processed mail.",
}


class MailAccountQueryToolset(ModelQueryToolset):
    model = MailAccount

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return MailAccount.objects.none()
        if user.is_superuser:
            return MailAccount.objects.all()
        return get_objects_for_user_owner_aware(
            user,
            "paperless_mail.view_mailaccount",
            MailAccount,
        )


class MailRuleQueryToolset(ModelQueryToolset):
    model = MailRule

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return MailRule.objects.none()
        if user.is_superuser:
            return MailRule.objects.all()
        return get_objects_for_user_owner_aware(
            user,
            "paperless_mail.view_mailrule",
            MailRule,
        )


class ProcessedMailQueryToolset(ModelQueryToolset):
    model = ProcessedMail

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return ProcessedMail.objects.none()
        if user.is_superuser:
            return ProcessedMail.objects.all()
        return get_objects_for_user_owner_aware(
            user,
            "paperless_mail.view_processedmail",
            ProcessedMail,
        )


drf_publish_create_mcp_tool(
    MailAccountViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    MailAccountViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
)
drf_publish_update_mcp_tool(
    MailAccountViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    MailAccountViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[MailAccountViewSet],
)

drf_publish_create_mcp_tool(
    MailRuleViewSet,
    actions=VIEWSET_ACTIONS["create"],
    instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_list_mcp_tool(
    MailRuleViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
)
drf_publish_update_mcp_tool(
    MailRuleViewSet,
    actions=VIEWSET_ACTIONS["update"],
    instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
    body_schema=BODY_SCHEMA,
)
drf_publish_destroy_mcp_tool(
    MailRuleViewSet,
    actions=VIEWSET_ACTIONS["destroy"],
    instructions=VIEWSET_INSTRUCTIONS[MailRuleViewSet],
)

drf_publish_list_mcp_tool(
    ProcessedMailViewSet,
    actions=VIEWSET_ACTIONS["list"],
    instructions=VIEWSET_INSTRUCTIONS[ProcessedMailViewSet],
)
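# The three toolsets above share one shape; a hedged template for any further
# owner-aware model, assuming a hypothetical `Thing` model and permission string:
class ThingQueryToolset(ModelQueryToolset):
    model = Thing  # hypothetical model, for illustration only

    def get_queryset(self):
        user = getattr(self.request, "user", None)
        if not user or not user.is_authenticated:
            return Thing.objects.none()
        if user.is_superuser:
            return Thing.objects.all()
        return get_objects_for_user_owner_aware(user, "app.view_thing", Thing)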
uv.lock (generated): 156
@@ -1038,22 +1038,6 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/2f/23/63a7d868373a73d25c4a5c2dd3cce3aaeb22fbee82560d42b6e93ba01403/django_guardian-3.2.0-py3-none-any.whl", hash = "sha256:0768565a057988a93fc4a1d93649c4a794abfd7473a8408a079cfbf83c559d77", size = 134674, upload-time = "2025-09-16T10:35:51.69Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "django-mcp-server"
|
||||
version = "0.5.7"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "django", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "djangorestframework", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "inflection", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "mcp", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "uritemplate", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/b2/70/e2cf268b77d0aa171b72763325279284561dbbd9b80ed4fd6975b4b7bd9c/django_mcp_server-0.5.7.tar.gz", hash = "sha256:5077f8fabf5fb621b5ce490afd0db60f21e57b3a451ed14a9f44aef545ea4eee", size = 23910, upload-time = "2025-10-10T17:13:34.681Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/2c/01/f78a11f51437f70b4ff2d9f131d47acf82c2a4cf78d63e9cf291e3727054/django_mcp_server-0.5.7-py3-none-any.whl", hash = "sha256:04b58bf02623aaee59708c3661ffe17981acd4532587c38b6cfe2c9e7090c6d3", size = 26389, upload-time = "2025-10-10T17:13:33.56Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "django-multiselectfield"
|
||||
version = "1.0.1"
|
||||
@@ -1722,15 +1706,6 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/45/4b/2b81e876abf77b4af3372aff731f4f6722840ebc7dcfd85778eaba271733/httpx_oauth-0.16.1-py3-none-any.whl", hash = "sha256:2fcad82f80f28d0473a0fc4b4eda223dc952050af7e3a8c8781342d850f09fb5", size = 38056, upload-time = "2024-12-20T07:23:00.394Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "httpx-sse"
|
||||
version = "0.4.3"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/0f/4c/751061ffa58615a32c31b2d82e8482be8dd4a89154f003147acee90f2be9/httpx_sse-0.4.3.tar.gz", hash = "sha256:9b1ed0127459a66014aec3c56bebd93da3c1bc8bb6618c8082039a44889a755d", size = 15943, upload-time = "2025-10-10T21:48:22.271Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/fd/6668e5aec43ab844de6fc74927e155a3b37bf40d7c3790e49fc0406b6578/httpx_sse-0.4.3-py3-none-any.whl", hash = "sha256:0ac1c9fe3c0afad2e0ebb25a934a59f4c7823b60792691f779fad2c5568830fc", size = 8960, upload-time = "2025-10-10T21:48:21.158Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "huggingface-hub"
|
||||
version = "0.30.2"
|
||||
@@ -2403,30 +2378,6 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/be/2f/5108cb3ee4ba6501748c4908b908e55f42a5b66245b4cfe0c99326e1ef6e/marshmallow-3.26.2-py3-none-any.whl", hash = "sha256:013fa8a3c4c276c24d26d84ce934dc964e2aa794345a0f8c7e5a7191482c8a73", size = 50964, upload-time = "2025-12-22T06:53:51.801Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "mcp"
|
||||
version = "1.26.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "anyio", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "httpx", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "httpx-sse", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "jsonschema", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "pydantic", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "pydantic-settings", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "pyjwt", extra = ["crypto"], marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "python-multipart", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "sse-starlette", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "starlette", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "typing-extensions", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "typing-inspection", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "uvicorn", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/fc/6d/62e76bbb8144d6ed86e202b5edd8a4cb631e7c8130f3f4893c3f90262b10/mcp-1.26.0.tar.gz", hash = "sha256:db6e2ef491eecc1a0d93711a76f28dec2e05999f93afd48795da1c1137142c66", size = 608005, upload-time = "2026-01-24T19:40:32.468Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/fd/d9/eaa1f80170d2b7c5ba23f3b59f766f3a0bb41155fbc32a69adfa1adaaef9/mcp-1.26.0-py3-none-any.whl", hash = "sha256:904a21c33c25aa98ddbeb47273033c435e595bbacfdb177f4bd87f6dceebe1ca", size = 233615, upload-time = "2026-01-24T19:40:30.652Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "mdurl"
|
||||
version = "0.1.2"
|
||||
@@ -2986,7 +2937,6 @@ dependencies = [
|
||||
{ name = "django-extensions", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "django-filter", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "django-guardian", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "django-mcp-server", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "django-multiselectfield", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "django-soft-delete", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "django-treenode", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
@@ -3135,7 +3085,6 @@ requires-dist = [
|
||||
{ name = "django-extensions", specifier = "~=4.1" },
|
||||
{ name = "django-filter", specifier = "~=25.1" },
|
||||
{ name = "django-guardian", specifier = "~=3.2.0" },
|
||||
{ name = "django-mcp-server", specifier = "~=0.5.7" },
|
||||
{ name = "django-multiselectfield", specifier = "~=1.0.1" },
|
||||
{ name = "django-soft-delete", specifier = "~=1.0.18" },
|
||||
{ name = "django-treenode", specifier = ">=0.23.2" },
|
||||
@@ -3841,20 +3790,6 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/88/9d/b06ca6acfe4abb296110fb1273a4d848a0bfb2ff65f3ee92127b3244e16b/pydantic_core-2.41.5-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f14f8f046c14563f8eb3f45f499cc658ab8d10072961e07225e507adb700e93f", size = 2316992, upload-time = "2025-11-04T13:43:43.602Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pydantic-settings"
|
||||
version = "2.12.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "pydantic", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "python-dotenv", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "typing-inspection", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/43/4b/ac7e0aae12027748076d72a8764ff1c9d82ca75a7a52622e67ed3f765c54/pydantic_settings-2.12.0.tar.gz", hash = "sha256:005538ef951e3c2a68e1c08b292b5f2e71490def8589d4221b95dab00dafcfd0", size = 194184, upload-time = "2025-11-10T14:25:47.013Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/c1/60/5d4751ba3f4a40a6891f24eec885f51afd78d208498268c734e256fb13c4/pydantic_settings-2.12.0-py3-none-any.whl", hash = "sha256:fddb9fd99a5b18da837b29710391e945b1e30c135477f484084ee513adb93809", size = 51880, upload-time = "2025-11-10T14:25:45.546Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pygments"
|
||||
version = "2.19.2"
|
||||
@@ -4072,15 +4007,6 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/6c/73/9f872cb81fc5c3bb48f7227872c28975f998f3e7c2b1c16e95e6432bbb90/python_magic-0.4.27-py2.py3-none-any.whl", hash = "sha256:c212960ad306f700aa0d01e5d7a325d20548ff97eb9920dcd29513174f0294d3", size = 13840, upload-time = "2022-06-07T20:16:57.763Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "python-multipart"
|
||||
version = "0.0.22"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/94/01/979e98d542a70714b0cb2b6728ed0b7c46792b695e3eaec3e20711271ca3/python_multipart-0.0.22.tar.gz", hash = "sha256:7340bef99a7e0032613f56dc36027b959fd3b30a787ed62d310e951f7c3a3a58", size = 37612, upload-time = "2026-01-25T10:15:56.219Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/1b/d0/397f9626e711ff749a95d96b7af99b9c566a9bb5129b8e4c10fc4d100304/python_multipart-0.0.22-py3-none-any.whl", hash = "sha256:2b2cd894c83d21bf49d702499531c7bafd057d730c201782048f7945d82de155", size = 24579, upload-time = "2026-01-25T10:15:54.811Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pytz"
|
||||
version = "2025.2"
|
||||
@@ -5022,32 +4948,6 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/a9/5c/bfd6bd0bf979426d405cc6e71eceb8701b148b16c21d2dc3c261efc61c7b/sqlparse-0.5.3-py3-none-any.whl", hash = "sha256:cf2196ed3418f3ba5de6af7e82c694a9fbdbfecccdfc72e281548517081f16ca", size = 44415, upload-time = "2024-12-10T12:05:27.824Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "sse-starlette"
|
||||
version = "3.2.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "anyio", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "starlette", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/8b/8d/00d280c03ffd39aaee0e86ec81e2d3b9253036a0f93f51d10503adef0e65/sse_starlette-3.2.0.tar.gz", hash = "sha256:8127594edfb51abe44eac9c49e59b0b01f1039d0c7461c6fd91d4e03b70da422", size = 27253, upload-time = "2026-01-17T13:11:05.62Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/96/7f/832f015020844a8b8f7a9cbc103dd76ba8e3875004c41e08440ea3a2b41a/sse_starlette-3.2.0-py3-none-any.whl", hash = "sha256:5876954bd51920fc2cd51baee47a080eb88a37b5b784e615abb0b283f801cdbf", size = 12763, upload-time = "2026-01-17T13:11:03.775Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "starlette"
|
||||
version = "0.52.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "anyio", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
|
||||
{ name = "typing-extensions", marker = "(python_full_version < '3.13' and sys_platform == 'darwin') or (python_full_version < '3.13' and sys_platform == 'linux')" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/c4/68/79977123bb7be889ad680d79a40f339082c1978b5cfcf62c2d8d196873ac/starlette-0.52.1.tar.gz", hash = "sha256:834edd1b0a23167694292e94f597773bc3f89f362be6effee198165a35d62933", size = 2653702, upload-time = "2026-01-18T13:34:11.062Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/81/0d/13d1d239a25cbfb19e740db83143e95c772a1fe10202dda4b76792b114dd/starlette-0.52.1-py3-none-any.whl", hash = "sha256:0029d43eb3d273bc4f83a08720b4912ea4b071087a3b48db01b7c839f7954d74", size = 74272, upload-time = "2026-01-18T13:34:09.188Z" },
]

[[package]]
name = "sympy"
version = "1.13.3"
@@ -5208,13 +5108,13 @@ dependencies = [
{ name = "typing-extensions", marker = "sys_platform == 'darwin'" },
]
wheels = [
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:bf1e68cfb935ae2046374ff02a7aa73dda70351b46342846f557055b3a540bf0" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp311-none-macosx_11_0_arm64.whl", hash = "sha256:a52952a8c90a422c14627ea99b9826b7557203b46b4d0772d3ca5c7699692425" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:287242dd1f830846098b5eca847f817aa5c6015ea57ab4c1287809efea7b77eb" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:8924d10d36eac8fe0652a060a03fc2ae52980841850b9a1a2ddb0f27a4f181cd" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp313-none-macosx_11_0_arm64.whl", hash = "sha256:bcee64ae7aa65876ceeae6dcaebe75109485b213528c74939602208a20706e3f" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:defadbeb055cfcf5def58f70937145aecbd7a4bc295238ded1d0e85ae2cf0e1d" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:886f84b181f766f53265ba0a1d503011e60f53fff9d569563ef94f24160e1072" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp310-none-macosx_11_0_arm64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp311-none-macosx_11_0_arm64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp312-none-macosx_11_0_arm64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp313-cp313t-macosx_11_0_arm64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp313-none-macosx_11_0_arm64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp314-cp314-macosx_11_0_arm64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1-cp314-cp314t-macosx_11_0_arm64.whl" },
]

[[package]]
@@ -5238,20 +5138,20 @@ dependencies = [
{ name = "typing-extensions", marker = "sys_platform == 'linux'" },
]
wheels = [
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:10866c8a48c4aa5ae3f48538dc8a055b99c57d9c6af2bf5dd715374d9d6ddca3" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:7210713b66943fdbfcc237b2e782871b649123ac5d29f548ce8c85be4223ab38" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:0e611cfb16724e62252b67d31073bc5c490cb83e92ecdc1192762535e0e44487" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:3de2adb9b4443dc9210ef1f1b16da3647ace53553166d6360bbbd7edd6f16e4d" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:3bf9b442a51a2948e41216a76d7ab00f0694cfcaaa51b6f9bcab57b7f89843e6" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:7417d8c565f219d3455654cb431c6d892a3eb40246055e14d645422de13b9ea1" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:3e532e553b37ee859205a9b2d1c7977fd6922f53bbb1b9bfdd5bdc00d1a60ed4" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:39b3dff6d8fba240ae0d1bede4ca11c2531ae3b47329206512d99e17907ff74b" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:01b1884f724977a20c7da2f640f1c7b37f4a2c117a7f4a6c1c0424d14cb86322" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:031a597147fa81b1e6d79ccf1ad3ccc7fafa27941d6cf26ff5caaa384fb20e92" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314-manylinux_2_28_aarch64.whl", hash = "sha256:65010ab4aacce6c9a1ddfc935f986c003ca8638ded04348fd326c3e74346237c" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314-manylinux_2_28_x86_64.whl", hash = "sha256:88adf5157db5da1d54b1c9fe4a6c1d20ceef00e75d854e206a87dbf69e3037dc" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:3ac2b8df2c55430e836dcda31940d47f1f5f94b8731057b6f20300ebea394dd9" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:5b688445f928f13563b7418b17c57e97bf955ab559cf73cd8f2b961f8572dbb3" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp310-cp310-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp310-cp310-manylinux_2_28_x86_64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp311-cp311-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp311-cp311-manylinux_2_28_x86_64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp312-cp312-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp312-cp312-manylinux_2_28_x86_64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313-manylinux_2_28_x86_64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313t-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp313-cp313t-manylinux_2_28_x86_64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314-manylinux_2_28_x86_64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314t-manylinux_2_28_aarch64.whl" },
{ url = "https://download.pytorch.org/whl/cpu/torch-2.9.1%2Bcpu-cp314-cp314t-manylinux_2_28_x86_64.whl" },
]

[[package]]
@@ -5595,20 +5495,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/5d/34/257747253ad446fd155e39f0c30afda4597b3b9e28f44a9de5dee76a6509/uv-0.9.6-py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:b31377ebf2d0499afc5abe3fe1abded5ca843f3a1161b432fe26eb0ce15bab8e", size = 21597889, upload-time = "2025-10-29T19:40:36.963Z" },
]

[[package]]
name = "uvicorn"
version = "0.40.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "h11", marker = "sys_platform == 'darwin' or sys_platform == 'linux'" },
{ name = "typing-extensions", marker = "(python_full_version < '3.11' and sys_platform == 'darwin') or (python_full_version < '3.11' and sys_platform == 'linux')" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c3/d1/8f3c683c9561a4e6689dd3b1d345c815f10f86acd044ee1fb9a4dcd0b8c5/uvicorn-0.40.0.tar.gz", hash = "sha256:839676675e87e73694518b5574fd0f24c9d97b46bea16df7b8c05ea1a51071ea", size = 81761, upload-time = "2025-12-21T14:16:22.45Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3d/d8/2083a1daa7439a66f3a48589a57d576aa117726762618f6bb09fe3798796/uvicorn-0.40.0-py3-none-any.whl", hash = "sha256:c6c8f55bc8bf13eb6fa9ff87ad62308bbbc33d0b67f84293151efe87e0d5f2ee", size = 68502, upload-time = "2025-12-21T14:16:21.041Z" },
]

[[package]]
name = "uvloop"
version = "0.21.0"