Compare commits
1660 Commits
Author | SHA1 | Date
---|---|---
*(Commit rows omitted: the author avatars, commit messages, authors, and dates were rendered as images/interactive widgets and did not survive extraction. The original table listed 1660 commits by SHA1 only, from `aad814d342` at the top of the listing through `a4535c11e4` at the bottom.)*
f78e93a364 | ||
![]() |
75d2a3a45f | ||
![]() |
6d3feaebfd | ||
![]() |
781929e9a8 | ||
![]() |
1871ef1a72 | ||
![]() |
5e9a7b94ba | ||
![]() |
51a5746611 | ||
![]() |
16fc7ebecc | ||
![]() |
10a5d50ce9 | ||
![]() |
454264a87f | ||
![]() |
7ecb76dddc | ||
![]() |
5fc7c15039 | ||
![]() |
44f860d9b0 | ||
![]() |
64eabbe8d0 | ||
![]() |
197938eaab | ||
![]() |
02a40055f5 | ||
![]() |
72bacc016a | ||
![]() |
aeecc10e45 | ||
![]() |
2b3edbaa46 | ||
![]() |
270f8677a7 | ||
![]() |
447edd1355 | ||
![]() |
024921212a | ||
![]() |
5d08a34365 | ||
![]() |
20763e7c26 | ||
![]() |
b33ba4c902 | ||
![]() |
fae5e834b9 | ||
![]() |
4cb4bd13ad | ||
![]() |
896304ccaa | ||
![]() |
9ae186e6f9 | ||
![]() |
c7690c05f5 | ||
![]() |
7273a8c7a5 | ||
![]() |
4195d5746f | ||
![]() |
8b90b51b1a | ||
![]() |
e74af5c73c | ||
![]() |
99c2442b28 | ||
![]() |
3c2df48a1a | ||
![]() |
a0c1c48dca | ||
![]() |
4e05aba0a5 | ||
![]() |
299a69a2de | ||
![]() |
7bc077ac08 | ||
![]() |
64752f6b57 | ||
![]() |
c2880bcf9a | ||
![]() |
159dcdbda5 | ||
![]() |
1838fa971e | ||
![]() |
d8d111f093 | ||
![]() |
31a03b1d30 | ||
![]() |
5004771d79 | ||
![]() |
92b9fc1ba9 | ||
![]() |
585cc24dd5 | ||
![]() |
f261c70f1e | ||
![]() |
8c9dfa449c | ||
![]() |
d94ca2962e | ||
![]() |
3c7eacf923 | ||
![]() |
643486b14b | ||
![]() |
87045da1e2 | ||
![]() |
a109723ada | ||
![]() |
151573a26e | ||
![]() |
284e0d3f60 | ||
![]() |
7048af276a | ||
![]() |
e6cd3c1970 | ||
![]() |
623ac441d5 | ||
![]() |
003201bc1b | ||
![]() |
1bf6d9165f | ||
![]() |
4b49bd9de8 | ||
![]() |
69f82d503a | ||
![]() |
0cfa5211e9 | ||
![]() |
6c7ff54aad | ||
![]() |
0b53a8981c | ||
![]() |
c4dbd58efd | ||
![]() |
959f80604a | ||
![]() |
dee691b72b | ||
![]() |
a4829ce26a | ||
![]() |
7ed4dedd5e | ||
![]() |
93d272f50b | ||
![]() |
6fe5674ac3 | ||
![]() |
6024a862d6 | ||
![]() |
195f3a5dbf | ||
![]() |
94f0808a2f | ||
![]() |
e3f062b981 | ||
![]() |
22142203ce | ||
![]() |
412d9f5cd2 | ||
![]() |
133532a463 | ||
![]() |
c9683808c9 | ||
![]() |
b25f083687 | ||
![]() |
62ba4b9730 | ||
![]() |
150c7f26a5 | ||
![]() |
4b4111ec03 | ||
![]() |
9e33344808 | ||
![]() |
bba1fc7194 | ||
![]() |
efaa1c4dd7 | ||
![]() |
a88b318d7d | ||
![]() |
2460c3e076 | ||
![]() |
9763b72f81 | ||
![]() |
19ab62c06c | ||
![]() |
eb8f37d846 | ||
![]() |
5c9e2d7070 | ||
![]() |
da9f2b1a8c | ||
![]() |
985f298c46 | ||
![]() |
2bb63b2d02 | ||
![]() |
ac75c61c8c | ||
![]() |
f8f0915a32 | ||
![]() |
7b87511e88 | ||
![]() |
bb05c2218f | ||
![]() |
e96e8472d9 | ||
![]() |
3191c15889 | ||
![]() |
d4af7aa411 | ||
![]() |
7b3719101a | ||
![]() |
4def3bf5c2 | ||
![]() |
3daee46c3d | ||
![]() |
fbebd8d7c0 | ||
![]() |
af5cb35531 | ||
![]() |
61a2dca81f | ||
![]() |
4aa8e9b800 | ||
![]() |
b81fe1695d | ||
![]() |
3625e5080c | ||
![]() |
d689a707a4 | ||
![]() |
55e1745889 | ||
![]() |
c21775980f | ||
![]() |
33e597f5bb | ||
![]() |
6fa2ca648a | ||
![]() |
adecf5d927 | ||
![]() |
e69d7d804b | ||
![]() |
43ec058593 | ||
![]() |
a4d96061de | ||
![]() |
a0eecb83cf | ||
![]() |
9955315a10 | ||
![]() |
ee7097b497 | ||
![]() |
387c23d27a | ||
![]() |
359593728e | ||
![]() |
9708832ccd | ||
![]() |
aa2ae8fe4c | ||
![]() |
729845662f | ||
![]() |
6ff28c92a4 | ||
![]() |
d19bf59f47 | ||
![]() |
a340b9c8a1 | ||
![]() |
d7939ca958 | ||
![]() |
00d67d53bf | ||
![]() |
b869ad02a1 | ||
![]() |
91d4941438 | ||
![]() |
5746e8b56d | ||
![]() |
8e83f90952 | ||
![]() |
80910c72cf | ||
![]() |
ca4ece3ccd | ||
![]() |
ac6c0484ed | ||
![]() |
1e4923835b | ||
![]() |
7be9ae9c02 | ||
![]() |
da38efebdf | ||
![]() |
0fd51e35e1 | ||
![]() |
59e0c1fe4e | ||
![]() |
cfe9528884 | ||
![]() |
f1eecd146d | ||
![]() |
1b45637e9c | ||
![]() |
76acf2b01d | ||
![]() |
eda2bd2dbd | ||
![]() |
6819decec3 | ||
![]() |
c2220aa1ef | ||
![]() |
0d87e529f3 | ||
![]() |
24ce1830eb | ||
![]() |
dfed4176ed | ||
![]() |
be8615741e | ||
![]() |
fd1f6aa960 | ||
![]() |
067a6107f5 | ||
![]() |
62782be08e | ||
![]() |
428fe4a372 | ||
![]() |
91e3302e54 | ||
![]() |
906d5d0bab | ||
![]() |
06c62abfbd | ||
![]() |
31e4a0a88b | ||
![]() |
1663450c1f | ||
![]() |
d840308392 | ||
![]() |
a08467342c | ||
![]() |
cf82cb35c9 | ||
![]() |
53fff1d54a | ||
![]() |
60cf260b71 | ||
![]() |
b9d1499d04 | ||
![]() |
3fe68d7bbe | ||
![]() |
2eeb02638b | ||
![]() |
d8e02c6fa0 | ||
![]() |
3c8d7f2dee | ||
![]() |
d71d388c08 | ||
![]() |
e2093436ac | ||
![]() |
b7e2013589 | ||
![]() |
f31cee75f3 | ||
![]() |
ec27f3c053 | ||
![]() |
737f00df3a | ||
![]() |
e6804dad2f | ||
![]() |
1875e9852e | ||
![]() |
f021e7fcc3 | ||
![]() |
4cf9ed9d26 | ||
![]() |
f8b77d7ef7 | ||
![]() |
31850c3351 | ||
![]() |
446842ecfc | ||
![]() |
8159b7574c | ||
![]() |
7050f29cff | ||
![]() |
2f32565476 | ||
![]() |
ceeb2da3fe | ||
![]() |
6dc5c1de32 | ||
![]() |
a5ab6f2558 | ||
![]() |
f846c2934c | ||
![]() |
1adb5e724d | ||
![]() |
8fad13b500 | ||
![]() |
6b2ee0e301 | ||
![]() |
7a241950d4 | ||
![]() |
c1a1f6d74e | ||
![]() |
b99422da12 | ||
![]() |
2ec695fba7 | ||
![]() |
109c07b23b | ||
![]() |
bf34c955ff | ||
![]() |
6ece5240a5 | ||
![]() |
211fbf0cf6 | ||
![]() |
f2d635671d | ||
![]() |
8b204cac99 | ||
![]() |
d15c701510 | ||
![]() |
c73688d167 | ||
![]() |
32da039d5f | ||
![]() |
79da613cb6 | ||
![]() |
2973e4672a | ||
![]() |
18e0012a59 | ||
![]() |
2554ced198 | ||
![]() |
fad13c148e | ||
![]() |
dbaa606a9f | ||
![]() |
c0bccc6a95 | ||
![]() |
b1a4eec7be | ||
![]() |
bb8a0d26e2 | ||
![]() |
629a5dd61e | ||
![]() |
b21970fd53 | ||
![]() |
4279ba13e9 | ||
![]() |
28d70438ec | ||
![]() |
ca6454f9fd | ||
![]() |
0ffc9955b2 | ||
![]() |
cf53d0866a | ||
![]() |
3f19c0ed03 | ||
![]() |
355efadf87 | ||
![]() |
927a9781ad | ||
![]() |
70eb22df42 | ||
![]() |
f461485aa0 | ||
![]() |
b6f1ced455 | ||
![]() |
10f36870e6 | ||
![]() |
fdaf9e9b46 | ||
![]() |
57b709824f | ||
![]() |
c7b46ac861 | ||
![]() |
bf28a512c6 | ||
![]() |
4333bd58cf | ||
![]() |
96a29883cd | ||
![]() |
59e359ff98 | ||
![]() |
4603813896 |
@@ -1,9 +0,0 @@
{
    "qpdf": {
        "version": "11.2.0"
    },
    "jbig2enc": {
        "version": "0.29",
        "git_tag": "0.29"
    }
}
28
.codecov.yml
Normal file
@@ -0,0 +1,28 @@
codecov:
  require_ci_to_pass: true
# https://docs.codecov.com/docs/flags#recommended-automatic-flag-management
# Require each flag to have 1 upload before notification
flag_management:
  individual_flags:
    - name: backend
      paths:
        - src/
    - name: frontend
      paths:
        - src-ui/
# https://docs.codecov.com/docs/pull-request-comments
# codecov will only comment if coverage changes
comment:
  require_changes: true
coverage:
  status:
    project:
      default:
        # https://docs.codecov.com/docs/commit-status#threshold
        threshold: 1%
    patch:
      default:
        # For the changed lines only, target 75% covered, but
        # allow as low as 50%
        target: 75%
        threshold: 25%
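To make the patch-status numbers above concrete: with `target: 75%` and `threshold: 25%`, the check on changed lines passes down to 50% coverage. The following is a rough sketch of that rule, not Codecov's actual implementation; the function name and default values are illustrative only.

```python
def patch_status_passes(
    patch_coverage: float,
    target: float = 75.0,
    threshold: float = 25.0,
) -> bool:
    """Return True if patch coverage is within `threshold` of `target`.

    Mirrors the codecov.yml patch status above: target 75%, but a 25%
    threshold allows the check to pass as low as 50% coverage.
    """
    return patch_coverage >= target - threshold


# 50% is the lowest passing value under these settings
print(patch_status_passes(50.0))  # True
print(patch_status_passes(49.0))  # False
```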
@@ -1,21 +1,28 @@
+# Tool caches
 **/__pycache__
-/src-ui/.vscode
-/src-ui/node_modules
-/src-ui/dist
+**/.ruff_cache/
+**/.mypy_cache/
+# Virtual environment & similar
+.venv/
+./src-ui/node_modules
+./src-ui/dist
+# IDE folders
+.idea/
+.vscode/
+./src-ui/.vscode
+# VCS
 .git
-/export
-/consume
-/media
-/data
-/docs
-.pytest_cache
-/dist
-/scripts
-/resources
+# Test related
+**/.pytest_cache
 **/tests
 **/*.spec.ts
 **/htmlcov
-/src/.pytest_cache
-.idea
-.venv/
-.vscode/
+# Local folders
+./export
+./consume
+./media
+./data
+./docs
+./dist
+./scripts
+./resources
1
.env
@@ -1,2 +1 @@
 COMPOSE_PROJECT_NAME=paperless
-export PROMPT="(pipenv-projectname)$P$G"
14
.github/DISCUSSION_TEMPLATE/feature-requests.yml
vendored
Normal file
@@ -0,0 +1,14 @@
title: "[Feature Request] "
body:
  - type: textarea
    id: description
    attributes:
      label: Description
      description: A clear and concise description of what you would like to see.
    validations:
      required: true
  - type: textarea
    id: other
    attributes:
      label: Other
      description: Add any other context or information about the feature request here.
13
.github/ISSUE_TEMPLATE/bug-report.yml
vendored
@@ -6,14 +6,21 @@ body:
   - type: markdown
     attributes:
       value: |
-        Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperlessngx:matrix.org).
+        ### ⚠️ Please remember: issues are for *bugs*
+        That is, something you believe affects every single user of Paperless-ngx, not just you. If you're not sure, start with one of the other options below.
 
-        Before opening an issue, please double check:
+        Also, note that **Paperless-ngx does not perform OCR itself**, that is handled by other tools. Problems with OCR of specific files should likely be raised 'upstream', see https://github.com/ocrmypdf/OCRmyPDF/issues or https://github.com/tesseract-ocr/tesseract/issues
+  - type: markdown
+    attributes:
+      value: |
+        #### Have a question? 👉 [Start a new discussion](https://github.com/paperless-ngx/paperless-ngx/discussions/new) or [ask in chat](https://matrix.to/#/#paperlessngx:matrix.org).
 
+        #### Before opening an issue, please double check:
+
         - [The troubleshooting documentation](https://docs.paperless-ngx.com/troubleshooting/).
         - [The installation instructions](https://docs.paperless-ngx.com/setup/#installation).
         - [Existing issues and discussions](https://github.com/paperless-ngx/paperless-ngx/search?q=&type=issues).
-        - Disable any customer container initialization scripts, if using any
+        - Disable any customer container initialization scripts, if using
 
         If you encounter issues while installing or configuring Paperless-ngx, please post in the ["Support" section of the discussions](https://github.com/paperless-ngx/paperless-ngx/discussions/new?category=support).
   - type: textarea
7
.github/PULL_REQUEST_TEMPLATE.md
vendored
@@ -20,11 +20,16 @@ NOTE: Please check only one box!
 - [ ] Bug fix (non-breaking change which fixes an issue)
 - [ ] New feature (non-breaking change which adds functionality)
 - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
-- [ ] Other (please explain)
+- [ ] Other (please explain):
 
 ## Checklist:
 
+<!--
+NOTE: PRs that do not address the following will not be merged, please do not skip any relevant items.
+-->
+
 - [ ] I have read & agree with the [contributing guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).
+- [ ] If applicable, I have included testing coverage for new code in this PR, for [backend](https://docs.paperless-ngx.com/development/#testing) and / or [front-end](https://docs.paperless-ngx.com/development/#testing-and-code-style) changes.
 - [ ] If applicable, I have tested my code for new features & regressions on both mobile & desktop devices, using the latest version of major browsers.
 - [ ] If applicable, I have checked that all tests pass, see [documentation](https://docs.paperless-ngx.com/development/#back-end-development).
 - [ ] I have run all `pre-commit` hooks, see [documentation](https://docs.paperless-ngx.com/development/#code-formatting-with-pre-commit-hooks).
|
42
.github/dependabot.yml
vendored
@@ -8,7 +8,7 @@ updates:
|
|||||||
target-branch: "dev"
|
target-branch: "dev"
|
||||||
# Look for `package.json` and `lock` files in the `/src-ui` directory
|
# Look for `package.json` and `lock` files in the `/src-ui` directory
|
||||||
directory: "/src-ui"
|
directory: "/src-ui"
|
||||||
# Check the npm registry for updates every month
|
open-pull-requests-limit: 10
|
||||||
schedule:
|
schedule:
|
||||||
interval: "monthly"
|
interval: "monthly"
|
||||||
labels:
|
labels:
|
||||||
@@ -17,6 +17,21 @@ updates:
|
|||||||
# Add reviewers
|
# Add reviewers
|
||||||
reviewers:
|
reviewers:
|
||||||
- "paperless-ngx/frontend"
|
- "paperless-ngx/frontend"
|
||||||
|
groups:
|
||||||
|
frontend-angular-dependencies:
|
||||||
|
patterns:
|
||||||
|
- "@angular*"
|
||||||
|
- "@ng-*"
|
||||||
|
- "ngx-*"
|
||||||
|
- "ng2-pdf-viewer"
|
||||||
|
frontend-jest-dependencies:
|
||||||
|
patterns:
|
||||||
|
- "@types/jest"
|
||||||
|
- "jest*"
|
||||||
|
frontend-eslint-dependencies:
|
||||||
|
patterns:
|
||||||
|
- "@typescript-eslint*"
|
||||||
|
- "eslint"
|
||||||
|
|
||||||
# Enable version updates for Python
|
# Enable version updates for Python
|
||||||
- package-ecosystem: "pip"
|
- package-ecosystem: "pip"
|
||||||
@@ -32,8 +47,25 @@ updates:
|
|||||||
# Add reviewers
|
# Add reviewers
|
||||||
reviewers:
|
reviewers:
|
||||||
- "paperless-ngx/backend"
|
- "paperless-ngx/backend"
|
||||||
|
groups:
|
||||||
|
development:
|
||||||
|
patterns:
|
||||||
|
- "*pytest*"
|
||||||
|
- "black"
|
||||||
|
- "ruff"
|
||||||
|
- "mkdocs-material"
|
||||||
|
django:
|
||||||
|
patterns:
|
||||||
|
- "*django*"
|
||||||
|
major-versions:
|
||||||
|
update-types:
|
||||||
|
- "major"
|
||||||
|
small-changes:
|
||||||
|
update-types:
|
||||||
|
- "minor"
|
||||||
|
- "patch"
|
||||||
|
|
||||||
# Enable updates for Github Actions
|
# Enable updates for GitHub Actions
|
||||||
- package-ecosystem: "github-actions"
|
- package-ecosystem: "github-actions"
|
||||||
target-branch: "dev"
|
target-branch: "dev"
|
||||||
directory: "/"
|
directory: "/"
|
||||||
@@ -46,3 +78,9 @@ updates:
|
|||||||
# Add reviewers
|
# Add reviewers
|
||||||
reviewers:
|
reviewers:
|
||||||
- "paperless-ngx/ci-cd"
|
- "paperless-ngx/ci-cd"
|
||||||
|
groups:
|
||||||
|
actions:
|
||||||
|
update-types:
|
||||||
|
- "major"
|
||||||
|
- "minor"
|
||||||
|
- "patch"
|
||||||
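The `groups` blocks added above bucket dependency updates by glob `patterns` so related packages land in a single PR. The helper below is a hypothetical illustration of how such globs classify npm package names; it is not part of Dependabot, and the `group_for` function and `GROUPS` mapping are assumptions made for the example.

```python
from fnmatch import fnmatch
from typing import Optional

# Mirror of the frontend `groups` patterns from the dependabot.yml diff above
GROUPS = {
    "frontend-angular-dependencies": ["@angular*", "@ng-*", "ngx-*", "ng2-pdf-viewer"],
    "frontend-jest-dependencies": ["@types/jest", "jest*"],
    "frontend-eslint-dependencies": ["@typescript-eslint*", "eslint"],
}


def group_for(package: str) -> Optional[str]:
    """Return the first group whose glob patterns match the package name."""
    for group, patterns in GROUPS.items():
        if any(fnmatch(package, pattern) for pattern in patterns):
            return group
    return None


print(group_for("@angular/core"))  # frontend-angular-dependencies
print(group_for("lodash"))         # None
```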
|
4
.github/release-drafter.yml
vendored
@@ -40,7 +40,7 @@ categories:
|
|||||||
labels:
|
labels:
|
||||||
- 'frontend'
|
- 'frontend'
|
||||||
- 'backend'
|
- 'backend'
|
||||||
collapse-after: 0
|
collapse-after: 1
|
||||||
include-labels:
|
include-labels:
|
||||||
- 'enhancement'
|
- 'enhancement'
|
||||||
- 'bug'
|
- 'bug'
|
||||||
@@ -54,6 +54,8 @@ include-labels:
|
|||||||
- 'ci-cd'
|
- 'ci-cd'
|
||||||
- 'breaking-change'
|
- 'breaking-change'
|
||||||
- 'notable'
|
- 'notable'
|
||||||
|
exclude-labels:
|
||||||
|
- 'skip-changelog'
|
||||||
category-template: '### $TITLE'
|
category-template: '### $TITLE'
|
||||||
change-template: '- $TITLE @$AUTHOR ([#$NUMBER]($URL))'
|
change-template: '- $TITLE @$AUTHOR ([#$NUMBER]($URL))'
|
||||||
change-title-escapes: '\<*_&#@'
|
change-title-escapes: '\<*_&#@'
|
||||||
|
402
.github/scripts/cleanup-tags.py
vendored
@@ -1,402 +0,0 @@
#!/usr/bin/env python3
import json
import logging
import os
import shutil
import subprocess
from argparse import ArgumentParser
from typing import Dict
from typing import Final
from typing import List
from typing import Optional

from common import get_log_level
from github import ContainerPackage
from github import GithubBranchApi
from github import GithubContainerRegistryApi

logger = logging.getLogger("cleanup-tags")


class DockerManifest2:
    """
    Data class wrapping the Docker Image Manifest Version 2.

    See https://docs.docker.com/registry/spec/manifest-v2-2/
    """

    def __init__(self, data: Dict) -> None:
        self._data = data
        # This is the sha256: digest string. Corresponds to GitHub API name
        # if the package is an untagged package
        self.digest = self._data["digest"]
        platform_data_os = self._data["platform"]["os"]
        platform_arch = self._data["platform"]["architecture"]
        platform_variant = self._data["platform"].get(
            "variant",
            "",
        )
        self.platform = f"{platform_data_os}/{platform_arch}{platform_variant}"


class RegistryTagsCleaner:
    """
    This is the base class for the image registry cleaning. Given a package
    name, it will keep all images which are tagged and all untagged images
    referred to by a manifest. This results in only images which have been untagged
    and cannot be referenced except by their SHA in being removed. None of these
    images should be referenced, so it is fine to delete them.
    """

    def __init__(
        self,
        package_name: str,
        repo_owner: str,
        repo_name: str,
        package_api: GithubContainerRegistryApi,
        branch_api: Optional[GithubBranchApi],
    ):
        self.actually_delete = False
        self.package_api = package_api
        self.branch_api = branch_api
        self.package_name = package_name
        self.repo_owner = repo_owner
        self.repo_name = repo_name
        self.tags_to_delete: List[str] = []
        self.tags_to_keep: List[str] = []

        # Get the information about all versions of the given package
        # These are active, not deleted, the default returned from the API
        self.all_package_versions = self.package_api.get_active_package_versions(
            self.package_name,
        )

        # Get a mapping from a tag like "1.7.0" or "feature-xyz" to the ContainerPackage
        # tagged with it. It makes certain lookups easy
        self.all_pkgs_tags_to_version: Dict[str, ContainerPackage] = {}
        for pkg in self.all_package_versions:
            for tag in pkg.tags:
                self.all_pkgs_tags_to_version[tag] = pkg
        logger.info(
            f"Located {len(self.all_package_versions)} versions of package {self.package_name}",
        )

        self.decide_what_tags_to_keep()

    def clean(self):
        """
        This method will delete image versions, based on the selected tags to delete
        """
        for tag_to_delete in self.tags_to_delete:
            package_version_info = self.all_pkgs_tags_to_version[tag_to_delete]

            if self.actually_delete:
                logger.info(
                    f"Deleting {tag_to_delete} (id {package_version_info.id})",
                )
                self.package_api.delete_package_version(
                    package_version_info,
                )
            else:
                logger.info(
                    f"Would delete {tag_to_delete} (id {package_version_info.id})",
                )
        else:
            logger.info("No tags to delete")

    def clean_untagged(self, is_manifest_image: bool):
        """
        This method will delete untagged images, that is those which are not named. It
        handles if the image tag is actually a manifest, which points to images that look otherwise
        untagged.
        """

        def _clean_untagged_manifest():
            """
            Handles the deletion of untagged images, but where the package is a manifest, ie a multi
            arch image, which means some "untagged" images need to exist still.

            Ok, bear with me, these are annoying.

            Our images are multi-arch, so the manifest is more like a pointer to a sha256 digest.
            These images are untagged, but pointed to, and so should not be removed (or every pull fails).

            So for each image getting kept, parse the manifest to find the digest(s) it points to. Then
            remove those from the list of untagged images. The final result is the untagged, not pointed to
            version which should be safe to remove.

            Example:
                Tag: ghcr.io/paperless-ngx/paperless-ngx:1.7.1 refers to
                    amd64: sha256:b9ed4f8753bbf5146547671052d7e91f68cdfc9ef049d06690b2bc866fec2690
                    armv7: sha256:81605222df4ba4605a2ba4893276e5d08c511231ead1d5da061410e1bbec05c3
                    arm64: sha256:374cd68db40734b844705bfc38faae84cc4182371de4bebd533a9a365d5e8f3b
                each of which appears as untagged image, but isn't really.

                So from the list of untagged packages, remove those digests. Once all tags which
                are being kept are checked, the remaining untagged packages are actually untagged
                with no referrals in a manifest to them.
            """
            # Simplify the untagged data, mapping name (which is a digest) to the version
            # At the moment, these are the images which APPEAR untagged.
            untagged_versions = {}
            for x in self.all_package_versions:
                if x.untagged:
                    untagged_versions[x.name] = x

            skips = 0

            # Parse manifests to locate digests pointed to
            for tag in sorted(self.tags_to_keep):
                full_name = f"ghcr.io/{self.repo_owner}/{self.package_name}:{tag}"
                logger.info(f"Checking manifest for {full_name}")
                try:
                    proc = subprocess.run(
                        [
                            shutil.which("docker"),
                            "manifest",
                            "inspect",
                            full_name,
                        ],
                        capture_output=True,
                    )

                    manifest_list = json.loads(proc.stdout)
                    for manifest_data in manifest_list["manifests"]:
                        manifest = DockerManifest2(manifest_data)

                        if manifest.digest in untagged_versions:
                            logger.info(
                                f"Skipping deletion of {manifest.digest},"
                                f" referred to by {full_name}"
                                f" for {manifest.platform}",
                            )
                            del untagged_versions[manifest.digest]
                            skips += 1

                except Exception as err:
                    self.actually_delete = False
                    logger.exception(err)
                    return

            logger.info(
                f"Skipping deletion of {skips} packages referred to by a manifest",
            )

            # Delete the untagged and not pointed at packages
            logger.info(f"Deleting untagged packages of {self.package_name}")
            for to_delete_name in untagged_versions:
                to_delete_version = untagged_versions[to_delete_name]

                if self.actually_delete:
                    logger.info(
                        f"Deleting id {to_delete_version.id} named {to_delete_version.name}",
                    )
                    self.package_api.delete_package_version(
                        to_delete_version,
                    )
                else:
                    logger.info(
                        f"Would delete {to_delete_name} (id {to_delete_version.id})",
                    )

        def _clean_untagged_non_manifest():
            """
            If the package is not a multi-arch manifest, images without tags are safe to delete.
            """

            for package in self.all_package_versions:
                if package.untagged:
                    if self.actually_delete:
                        logger.info(
                            f"Deleting id {package.id} named {package.name}",
                        )
                        self.package_api.delete_package_version(
                            package,
                        )
                    else:
                        logger.info(
                            f"Would delete {package.name} (id {package.id})",
                        )
                else:
                    logger.info(
                        f"Not deleting tag {package.tags[0]} of package {self.package_name}",
                    )

        logger.info("Beginning untagged image cleaning")

        if is_manifest_image:
            _clean_untagged_manifest()
        else:
            _clean_untagged_non_manifest()

    def decide_what_tags_to_keep(self):
        """
        This method holds the logic to delete what tags to keep and there fore
        what tags to delete.

        By default, any image with at least 1 tag will be kept
        """
        # By default, keep anything which is tagged
        self.tags_to_keep = list(set(self.all_pkgs_tags_to_version.keys()))


class MainImageTagsCleaner(RegistryTagsCleaner):
    def decide_what_tags_to_keep(self):
        """
        Overrides the default logic for deciding what images to keep. Images tagged as "feature-"
        will be removed, if the corresponding branch no longer exists.
        """

        # Default to everything gets kept still
        super().decide_what_tags_to_keep()

        # Locate the feature branches
        feature_branches = {}
        for branch in self.branch_api.get_branches(
            repo=self.repo_name,
        ):
            if branch.name.startswith("feature-"):
                logger.debug(f"Found feature branch {branch.name}")
                feature_branches[branch.name] = branch

        logger.info(f"Located {len(feature_branches)} feature branches")

        if not len(feature_branches):
            # Our work here is done, delete nothing
            return

        # Filter to packages which are tagged with feature-*
        packages_tagged_feature: List[ContainerPackage] = []
        for package in self.all_package_versions:
            if package.tag_matches("feature-"):
                packages_tagged_feature.append(package)

        # Map tags like "feature-xyz" to a ContainerPackage
        feature_pkgs_tags_to_versions: Dict[str, ContainerPackage] = {}
        for pkg in packages_tagged_feature:
            for tag in pkg.tags:
                feature_pkgs_tags_to_versions[tag] = pkg

        logger.info(
            f'Located {len(feature_pkgs_tags_to_versions)} versions of package {self.package_name} tagged "feature-"',
        )

        # All the feature tags minus all the feature branches leaves us feature tags
        # with no corresponding branch
        self.tags_to_delete = list(
            set(feature_pkgs_tags_to_versions.keys()) - set(feature_branches.keys()),
        )

        # All the tags minus the set of going to be deleted tags leaves us the
        # tags which will be kept around
        self.tags_to_keep = list(
            set(self.all_pkgs_tags_to_version.keys()) - set(self.tags_to_delete),
        )
        logger.info(
            f"Located {len(self.tags_to_delete)} versions of package {self.package_name} to delete",
        )


class LibraryTagsCleaner(RegistryTagsCleaner):
    """
    Exists for the off change that someday, the installer library images
    will need their own logic
    """

    pass


def _main():
    parser = ArgumentParser(
        description="Using the GitHub API locate and optionally delete container"
        " tags which no longer have an associated feature branch",
    )

    # Requires an affirmative command to actually do a delete
    parser.add_argument(
        "--delete",
        action="store_true",
        default=False,
        help="If provided, actually delete the container tags",
    )

    # When a tagged image is updated, the previous version remains, but it no longer tagged
    # Add this option to remove them as well
    parser.add_argument(
        "--untagged",
        action="store_true",
        default=False,
        help="If provided, delete untagged containers as well",
    )

    # If given, the package is assumed to be a multi-arch manifest. Cache packages are
    # not multi-arch, all other types are
    parser.add_argument(
        "--is-manifest",
        action="store_true",
        default=False,
        help="If provided, the package is assumed to be a multi-arch manifest following schema v2",
    )

    # Allows configuration of log level for debugging
    parser.add_argument(
        "--loglevel",
        default="info",
        help="Configures the logging level",
    )

    # Get the name of the package being processed this round
    parser.add_argument(
        "package",
        help="The package to process",
    )

    args = parser.parse_args()

    logging.basicConfig(
        level=get_log_level(args),
        datefmt="%Y-%m-%d %H:%M:%S",
        format="%(asctime)s %(levelname)-8s %(message)s",
    )

    # Must be provided in the environment
    repo_owner: Final[str] = os.environ["GITHUB_REPOSITORY_OWNER"]
    repo: Final[str] = os.environ["GITHUB_REPOSITORY"]
    gh_token: Final[str] = os.environ["TOKEN"]

    # Find all branches named feature-*
    # Note: Only relevant to the main application, but simpler to
    # leave in for all packages
    with GithubBranchApi(gh_token) as branch_api:
        with GithubContainerRegistryApi(gh_token, repo_owner) as container_api:
            if args.package in {"paperless-ngx", "paperless-ngx/builder/cache/app"}:
                cleaner = MainImageTagsCleaner(
                    args.package,
                    repo_owner,
                    repo,
                    container_api,
                    branch_api,
                )
            else:
                cleaner = LibraryTagsCleaner(
                    args.package,
                    repo_owner,
                    repo,
                    container_api,
|
|
||||||
None,
|
|
||||||
)
|
|
||||||
|
|
||||||
# Set if actually doing a delete vs dry run
|
|
||||||
cleaner.actually_delete = args.delete
|
|
||||||
|
|
||||||
# Clean images with tags
|
|
||||||
cleaner.clean()
|
|
||||||
|
|
||||||
# Clean images which are untagged
|
|
||||||
cleaner.clean_untagged(args.is_manifest)
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
_main()
|
|
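The deletion logic above is plain set arithmetic over tag names. A minimal sketch, using made-up tag and branch names rather than real registry data:

```python
# Hypothetical data standing in for the API results the cleaner collects
feature_tags = {"feature-cool-ui", "feature-old-work"}
feature_branches = {"feature-cool-ui"}
all_tags = {"latest", "1.9.2"} | feature_tags

# Feature tags with no corresponding branch are slated for deletion
tags_to_delete = feature_tags - feature_branches
# Everything else is kept
tags_to_keep = all_tags - tags_to_delete

print(sorted(tags_to_delete))
```

Only `feature-old-work` lands in the delete set here, since its branch no longer exists; `latest`, versioned tags, and feature tags with a live branch are kept.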
.github/scripts/common.py (vendored, 48 lines deleted)
@@ -1,48 +0,0 @@
#!/usr/bin/env python3
import logging


def get_image_tag(
    repo_name: str,
    pkg_name: str,
    pkg_version: str,
) -> str:
    """
    Returns a string representing the normal image for a given package
    """
    return f"ghcr.io/{repo_name.lower()}/builder/{pkg_name}:{pkg_version}"


def get_cache_image_tag(
    repo_name: str,
    pkg_name: str,
    pkg_version: str,
    branch_name: str,
) -> str:
    """
    Returns a string representing the expected image cache tag for a given package

    Registry type caching is utilized for the builder images, to allow fast
    rebuilds, generally almost instant for the same version
    """
    return f"ghcr.io/{repo_name.lower()}/builder/cache/{pkg_name}:{pkg_version}"


def get_log_level(args) -> int:
    """
    Returns a logging level, based on the loglevel from the parsed arguments

    :param args: the parsed command line arguments
    :return: a logging level constant, defaulting to logging.INFO
    """
    levels = {
        "critical": logging.CRITICAL,
        "error": logging.ERROR,
        "warn": logging.WARNING,
        "warning": logging.WARNING,
        "info": logging.INFO,
        "debug": logging.DEBUG,
    }
    level = levels.get(args.loglevel.lower())
    if level is None:
        level = logging.INFO
    return level
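`get_log_level` is a case-insensitive lookup that silently falls back to `logging.INFO` for unrecognized values. A small sketch of that behavior, where `Namespace` stands in for the parsed CLI arguments:

```python
import logging
from argparse import Namespace

# Same mapping shape as get_log_level above
levels = {
    "critical": logging.CRITICAL,
    "error": logging.ERROR,
    "warn": logging.WARNING,
    "warning": logging.WARNING,
    "info": logging.INFO,
    "debug": logging.DEBUG,
}

def to_level(args) -> int:
    # Case-insensitive lookup with an INFO fallback
    return levels.get(args.loglevel.lower(), logging.INFO)

debug_level = to_level(Namespace(loglevel="DEBUG"))
fallback_level = to_level(Namespace(loglevel="verbose"))  # unknown name
```

`"DEBUG"` resolves despite the mixed case, while `"verbose"` is not in the mapping and degrades to INFO rather than raising.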
.github/scripts/get-build-json.py (vendored, 92 lines deleted)
@@ -1,92 +0,0 @@
#!/usr/bin/env python3
"""
This is a helper script for the multi-stage Docker image builder.
It provides a single point of configuration for package version control.
The output JSON object is used by the CI workflow to determine what versions
to build and pull into the final Docker image.

Python package information is obtained from the Pipfile.lock.  As this is
kept updated by dependabot, it usually will need no further configuration.
The sole exception currently is pikepdf, which has a dependency on qpdf,
and is configured here to use the latest version of qpdf built by the workflow.

Other package version information is configured directly below, generally by
setting the version and Git information, if any.

"""
import argparse
import json
import os
from pathlib import Path
from typing import Final

from common import get_cache_image_tag
from common import get_image_tag


def _main():
    parser = argparse.ArgumentParser(
        description="Generate a JSON object of information required to build the given package, based on the Pipfile.lock",
    )
    parser.add_argument(
        "package",
        help="The name of the package to generate JSON for",
    )

    PIPFILE_LOCK_PATH: Final[Path] = Path("Pipfile.lock")
    BUILD_CONFIG_PATH: Final[Path] = Path(".build-config.json")

    # Read the main config file
    build_json: Final = json.loads(BUILD_CONFIG_PATH.read_text())

    # Read Pipfile.lock file
    pipfile_data: Final = json.loads(PIPFILE_LOCK_PATH.read_text())

    args: Final = parser.parse_args()

    # Read from environment variables set by GitHub Actions
    repo_name: Final[str] = os.environ["GITHUB_REPOSITORY"]
    branch_name: Final[str] = os.environ["GITHUB_REF_NAME"]

    # Default output values
    version = None
    extra_config = {}

    if args.package in pipfile_data["default"]:
        # Read the version from Pipfile.lock
        pkg_data = pipfile_data["default"][args.package]
        pkg_version = pkg_data["version"].split("==")[-1]
        version = pkg_version

        # Any extra/special values needed
        if args.package == "pikepdf":
            extra_config["qpdf_version"] = build_json["qpdf"]["version"]

    elif args.package in build_json:
        version = build_json[args.package]["version"]

    else:
        raise NotImplementedError(args.package)

    # The JSON object we'll output
    output = {
        "name": args.package,
        "version": version,
        "image_tag": get_image_tag(repo_name, args.package, version),
        "cache_tag": get_cache_image_tag(
            repo_name,
            args.package,
            version,
            branch_name,
        ),
    }

    # Add anything special a package may need
    output.update(extra_config)

    # Output the JSON info to stdout
    print(json.dumps(output))


if __name__ == "__main__":
    _main()
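Pipfile.lock pins versions as `"==<version>"`, which is why the script splits on `"=="` and takes the last element. A sketch with a hypothetical lock fragment (the real file is much larger and maintained by dependabot):

```python
import json

# Hypothetical Pipfile.lock fragment; real entries pin with "==<version>"
lock = json.loads('{"default": {"pikepdf": {"version": "==6.2.0"}}}')

pkg_data = lock["default"]["pikepdf"]
# split("==")[-1] drops the "==" prefix and keeps the bare version
pkg_version = pkg_data["version"].split("==")[-1]

print(pkg_version)
```

Using `[-1]` rather than `[1]` also behaves sensibly if the pin marker were ever absent, since splitting a plain version string returns it unchanged.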
.github/scripts/github.py (vendored, 274 lines deleted)
@@ -1,274 +0,0 @@
#!/usr/bin/env python3
"""
This module contains some useful classes for interacting with the Github API.
The full documentation for the API can be found here: https://docs.github.com/en/rest

Mostly, this focusses on two areas, repo branches and repo packages, as the use case
is cleaning up container images which are no longer referred to.

"""
import functools
import logging
import re
import urllib.parse
from typing import Dict
from typing import List
from typing import Optional

import httpx

logger = logging.getLogger("github-api")


class _GithubApiBase:
    """
    A base class for interacting with the Github API.  It
    will handle the session and setting authorization headers.
    """

    def __init__(self, token: str) -> None:
        self._token = token
        self._client: Optional[httpx.Client] = None

    def __enter__(self) -> "_GithubApiBase":
        """
        Sets up the required headers for auth and response
        type from the API
        """
        self._client = httpx.Client()
        self._client.headers.update(
            {
                "Accept": "application/vnd.github.v3+json",
                "Authorization": f"token {self._token}",
            },
        )
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        """
        Ensures the authorization token is cleaned up no matter
        the reason for the exit
        """
        if "Accept" in self._client.headers:
            del self._client.headers["Accept"]
        if "Authorization" in self._client.headers:
            del self._client.headers["Authorization"]

        # Close the session as well
        self._client.close()
        self._client = None

    def _read_all_pages(self, endpoint):
        """
        Helper function to read all pages of an endpoint, utilizing the
        next.url until exhausted.  Assumes the endpoint returns a list
        """
        internal_data = []

        while True:
            resp = self._client.get(endpoint)
            if resp.status_code == 200:
                internal_data += resp.json()
                if "next" in resp.links:
                    endpoint = resp.links["next"]["url"]
                else:
                    logger.debug("Exiting pagination loop")
                    break
            else:
                logger.warning(f"Request to {endpoint} returned HTTP {resp.status_code}")
                resp.raise_for_status()

        return internal_data


class _EndpointResponse:
    """
    For all endpoint JSON responses, store the full
    response data, for ease of extending later, if need be.
    """

    def __init__(self, data: Dict) -> None:
        self._data = data


class GithubBranch(_EndpointResponse):
    """
    Simple wrapper for a repository branch, only extracts name information
    for now.
    """

    def __init__(self, data: Dict) -> None:
        super().__init__(data)
        self.name = self._data["name"]


class GithubBranchApi(_GithubApiBase):
    """
    Wrapper around branch API.

    See https://docs.github.com/en/rest/branches/branches

    """

    def __init__(self, token: str) -> None:
        super().__init__(token)

        self._ENDPOINT = "https://api.github.com/repos/{REPO}/branches"

    def get_branches(self, repo: str) -> List[GithubBranch]:
        """
        Returns all current branches of the given repository owned by the given
        owner or organization.
        """
        # The environment GITHUB_REPOSITORY already contains the owner in the correct location
        endpoint = self._ENDPOINT.format(REPO=repo)
        internal_data = self._read_all_pages(endpoint)
        return [GithubBranch(branch) for branch in internal_data]


class ContainerPackage(_EndpointResponse):
    """
    Data class wrapping the JSON response from the package related
    endpoints
    """

    def __init__(self, data: Dict):
        super().__init__(data)
        # This is a numerical ID, required for interactions with this
        # specific package, including deletion of it or restoration
        self.id: int = self._data["id"]

        # A string name.  This might be an actual name or it could be a
        # digest string like "sha256:"
        self.name: str = self._data["name"]

        # URL to the package, including its ID, can be used for deletion
        # or restoration without needing to build up a URL ourselves
        self.url: str = self._data["url"]

        # The list of tags applied to this image.  May be an empty list
        self.tags: List[str] = self._data["metadata"]["container"]["tags"]

    @functools.cached_property
    def untagged(self) -> bool:
        """
        Returns True if the image has no tags applied to it, False otherwise
        """
        return len(self.tags) == 0

    @functools.cache
    def tag_matches(self, pattern: str) -> bool:
        """
        Returns True if the image has at least one tag which matches the given regex,
        False otherwise
        """
        for tag in self.tags:
            if re.match(pattern, tag) is not None:
                return True
        return False

    def __repr__(self):
        return f"Package {self.name}"


class GithubContainerRegistryApi(_GithubApiBase):
    """
    Class wrapper to deal with the Github packages API.  This class only deals with
    container type packages, the only type published by paperless-ngx.
    """

    def __init__(self, token: str, owner_or_org: str) -> None:
        super().__init__(token)
        self._owner_or_org = owner_or_org
        if self._owner_or_org == "paperless-ngx":
            # https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-an-organization
            self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
            # https://docs.github.com/en/rest/packages#delete-package-version-for-an-organization
            self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/orgs/{ORG}/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
        else:
            # https://docs.github.com/en/rest/packages#get-all-package-versions-for-a-package-owned-by-the-authenticated-user
            self._PACKAGES_VERSIONS_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions"
            # https://docs.github.com/en/rest/packages#delete-a-package-version-for-the-authenticated-user
            self._PACKAGE_VERSION_DELETE_ENDPOINT = "https://api.github.com/user/packages/{PACKAGE_TYPE}/{PACKAGE_NAME}/versions/{PACKAGE_VERSION_ID}"
        self._PACKAGE_VERSION_RESTORE_ENDPOINT = (
            f"{self._PACKAGE_VERSION_DELETE_ENDPOINT}/restore"
        )

    def get_active_package_versions(
        self,
        package_name: str,
    ) -> List[ContainerPackage]:
        """
        Returns all the versions of a given package (container images) from
        the API
        """

        package_type: str = "container"
        # Need to quote this for slashes in the name
        package_name = urllib.parse.quote(package_name, safe="")

        endpoint = self._PACKAGES_VERSIONS_ENDPOINT.format(
            ORG=self._owner_or_org,
            PACKAGE_TYPE=package_type,
            PACKAGE_NAME=package_name,
        )

        pkgs = []

        for data in self._read_all_pages(endpoint):
            pkgs.append(ContainerPackage(data))

        return pkgs

    def get_deleted_package_versions(
        self,
        package_name: str,
    ) -> List[ContainerPackage]:
        package_type: str = "container"
        # Need to quote this for slashes in the name
        package_name = urllib.parse.quote(package_name, safe="")

        endpoint = (
            self._PACKAGES_VERSIONS_ENDPOINT.format(
                ORG=self._owner_or_org,
                PACKAGE_TYPE=package_type,
                PACKAGE_NAME=package_name,
            )
            + "?state=deleted"
        )

        pkgs = []

        for data in self._read_all_pages(endpoint):
            pkgs.append(ContainerPackage(data))

        return pkgs

    def delete_package_version(self, package_data: ContainerPackage):
        """
        Deletes the given package version from the GHCR
        """
        resp = self._client.delete(package_data.url)
        if resp.status_code != 204:
            logger.warning(
                f"Request to delete {package_data.url} returned HTTP {resp.status_code}",
            )

    def restore_package_version(
        self,
        package_name: str,
        package_data: ContainerPackage,
    ):
        package_type: str = "container"
        endpoint = self._PACKAGE_VERSION_RESTORE_ENDPOINT.format(
            ORG=self._owner_or_org,
            PACKAGE_TYPE=package_type,
            PACKAGE_NAME=package_name,
            PACKAGE_VERSION_ID=package_data.id,
        )

        resp = self._client.post(endpoint)
        if resp.status_code != 204:
            logger.warning(
                f"Request to restore {endpoint} returned HTTP {resp.status_code}",
            )
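`tag_matches` uses `re.match`, which only matches at the start of each tag string. A sketch of that behavior with made-up tags (a plain function here, rather than the class method above):

```python
import re

# Made-up tags standing in for ContainerPackage.tags
tags = ["feature-awesome-change", "dev", "2.0.1"]

def tag_matches(tags, pattern: str) -> bool:
    # re.match anchors at the beginning of the string, so "feature-"
    # matches "feature-awesome-change" but "beta" matches nothing here
    return any(re.match(pattern, tag) is not None for tag in tags)

has_feature = tag_matches(tags, "feature-")
has_beta = tag_matches(tags, "beta")
```

Because of the implicit start anchor, a pattern like `"feature-"` selects feature-branch tags without accidentally matching a tag that merely contains the substring later in the name.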
.github/stale.yml (vendored, 23 lines deleted)
@@ -1,23 +0,0 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 30

# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7

# Only issues or pull requests with all of these labels are checked if stale. Defaults to `[]` (disabled)
onlyLabels: [cant-reproduce]

# Label to use when marking an issue as stale
staleLabel: stale

# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs. Thank you
  for your contributions.

# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false

# See https://github.com/marketplace/stale for more info on the app
# and https://github.com/probot/stale for the configuration docs
|
.github/workflows/ci.yml (vendored, 516 changed lines)
@@ -13,282 +13,306 @@ on:
     branches-ignore:
       - 'translations**'
 
+env:
+  # This is the version of pipenv all the steps will use
+  # If changing this, change Dockerfile
+  DEFAULT_PIP_ENV_VERSION: "2023.10.24"
+  # This is the default version of Python to use in most steps which aren't specific
+  DEFAULT_PYTHON_VERSION: "3.10"
+
 jobs:
   pre-commit:
+    # We want to run on external PRs, but not on our own internal PRs as they'll be run
+    # by the push to the branch. Without this if check, checks are duplicated since
+    # internal PRs match both the push and pull_request events.
+    if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
+
     name: Linting Checks
     runs-on: ubuntu-22.04
     steps:
       -
         name: Checkout repository
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
 
       -
-        name: Install tools
+        name: Install python
         uses: actions/setup-python@v4
         with:
-          python-version: "3.9"
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
 
       -
         name: Check files
         uses: pre-commit/action@v3.0.0
 
   documentation:
-    name: "Build Documentation"
+    name: "Build & Deploy Documentation"
     runs-on: ubuntu-22.04
     needs:
       - pre-commit
     steps:
       -
         name: Checkout
-        uses: actions/checkout@v3
-      -
-        name: Install pipenv
-        run: |
-          pipx install pipenv==2022.11.30
+        uses: actions/checkout@v4
       -
         name: Set up Python
+        id: setup-python
         uses: actions/setup-python@v4
         with:
-          python-version: 3.8
+          python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
           cache: "pipenv"
           cache-dependency-path: 'Pipfile.lock'
+      -
+        name: Install pipenv
+        run: |
+          pip install --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }}
       -
         name: Install dependencies
         run: |
-          pipenv sync --dev
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
       -
         name: List installed Python dependencies
         run: |
-          pipenv run pip list
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run pip list
       -
         name: Make documentation
         run: |
-          pipenv run mkdocs build --config-file ./mkdocs.yml
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run mkdocs build --config-file ./mkdocs.yml
+      -
+        name: Deploy documentation
+        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
+        run: |
+          echo "docs.paperless-ngx.com" > "${{ github.workspace }}/docs/CNAME"
+          git config --global user.name "${{ github.actor }}"
+          git config --global user.email "${{ github.actor }}@users.noreply.github.com"
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run mkdocs gh-deploy --force --no-history
       -
         name: Upload artifact
         uses: actions/upload-artifact@v3
         with:
           name: documentation
           path: site/
-  documentation-deploy:
-    name: "Deploy Documentation"
-    runs-on: ubuntu-22.04
-    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
-    needs:
-      - documentation
-    steps:
-      -
-        name: Checkout
-        uses: actions/checkout@v3
-      -
-        name: Deploy docs
-        uses: mhausenblas/mkdocs-deploy-gh-pages@master
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          CUSTOM_DOMAIN: docs.paperless-ngx.com
-          CONFIG_FILE: mkdocs.yml
-          EXTRA_PACKAGES: build-base
+          retention-days: 7
 
   tests-backend:
-    name: "Tests (${{ matrix.python-version }})"
+    name: "Backend Tests (Python ${{ matrix.python-version }})"
    runs-on: ubuntu-22.04
     needs:
       - pre-commit
     strategy:
       matrix:
-        python-version: ['3.8', '3.9', '3.10']
+        python-version: ['3.9', '3.10', '3.11']
       fail-fast: false
-    env:
-      # Enable Tika end to end testing
-      TIKA_LIVE: 1
-      # Enable paperless_mail testing against real server
-      PAPERLESS_MAIL_TEST_HOST: ${{ secrets.TEST_MAIL_HOST }}
-      PAPERLESS_MAIL_TEST_USER: ${{ secrets.TEST_MAIL_USER }}
-      PAPERLESS_MAIL_TEST_PASSWD: ${{ secrets.TEST_MAIL_PASSWD }}
-      # Skip Tests which require convert
-      PAPERLESS_TEST_SKIP_CONVERT: 1
-      # Enable Gotenberg end to end testing
-      GOTENBERG_LIVE: 1
     steps:
       -
         name: Checkout
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
       -
         name: Start containers
         run: |
-          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml pull --quiet
-          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml up --detach
-      -
-        name: Install pipenv
-        run: |
-          pipx install pipenv==2022.11.30
+          docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml pull --quiet
+          docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml up --detach
       -
         name: Set up Python
+        id: setup-python
         uses: actions/setup-python@v4
         with:
           python-version: "${{ matrix.python-version }}"
           cache: "pipenv"
           cache-dependency-path: 'Pipfile.lock'
+      -
+        name: Install pipenv
+        run: |
+          pip install --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }}
       -
         name: Install system dependencies
         run: |
           sudo apt-get update -qq
           sudo apt-get install -qq --no-install-recommends unpaper tesseract-ocr imagemagick ghostscript libzbar0 poppler-utils
+      -
+        name: Configure ImageMagick
+        run: |
+          sudo cp docker/imagemagick-policy.xml /etc/ImageMagick-6/policy.xml
       -
         name: Install Python dependencies
         run: |
-          pipenv sync --dev
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run python --version
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
       -
         name: List installed Python dependencies
         run: |
-          pipenv run pip list
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run pip list
       -
         name: Tests
-        run: |
-          cd src/
-          pipenv run pytest -rfEp
-      -
-        name: Get changed files
-        id: changed-files-specific
-        uses: tj-actions/changed-files@v34
-        with:
-          files: |
-            src/**
-      -
-        name: List all changed files
-        run: |
-          for file in ${{ steps.changed-files-specific.outputs.all_changed_files }}; do
-            echo "${file} was changed"
-          done
-      -
-        name: Publish coverage results
-        if: matrix.python-version == '3.9' && steps.changed-files-specific.outputs.any_changed == 'true'
         env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          # https://github.com/coveralls-clients/coveralls-python/issues/251
+          PAPERLESS_CI_TEST: 1
+          # Enable paperless_mail testing against real server
+          PAPERLESS_MAIL_TEST_HOST: ${{ secrets.TEST_MAIL_HOST }}
+          PAPERLESS_MAIL_TEST_USER: ${{ secrets.TEST_MAIL_USER }}
+          PAPERLESS_MAIL_TEST_PASSWD: ${{ secrets.TEST_MAIL_PASSWD }}
         run: |
           cd src/
-          pipenv run coveralls --service=github
+          pipenv --python ${{ steps.setup-python.outputs.python-version }} run pytest -ra
+      -
+        name: Upload coverage
+        if: ${{ matrix.python-version == env.DEFAULT_PYTHON_VERSION }}
+        uses: actions/upload-artifact@v3
+        with:
+          name: backend-coverage-report
+          path: src/coverage.xml
+          retention-days: 7
+          if-no-files-found: warn
       -
         name: Stop containers
         if: always()
         run: |
-          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml logs
-          docker compose --file ${GITHUB_WORKSPACE}/docker/compose/docker-compose.ci-test.yml down
+          docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml logs
+          docker compose --file ${{ github.workspace }}/docker/compose/docker-compose.ci-test.yml down
 
-  tests-frontend:
-    name: "Tests Frontend"
+  install-frontend-depedendencies:
+    name: "Install Frontend Dependendencies"
     runs-on: ubuntu-22.04
     needs:
       - pre-commit
-    strategy:
-      matrix:
-        node-version: [16.x]
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       -
-        name: Use Node.js ${{ matrix.node-version }}
-        uses: actions/setup-node@v3
+        name: Use Node.js 20
+        uses: actions/setup-node@v4
         with:
-          node-version: ${{ matrix.node-version }}
-      - run: cd src-ui && npm ci
-      - run: cd src-ui && npm run lint
-      - run: cd src-ui && npm run test
-      - run: cd src-ui && npm run e2e:ci
+          node-version: 20.x
+          cache: 'npm'
+          cache-dependency-path: 'src-ui/package-lock.json'
+      - name: Cache frontend depdendencies
+        id: cache-frontend-deps
+        uses: actions/cache@v3
+        with:
+          path: |
+            ~/.npm
+            ~/.cache
+          key: ${{ runner.os }}-frontenddeps-${{ hashFiles('src-ui/package-lock.json') }}
+      -
+        name: Install dependencies
+        if: steps.cache-frontend-deps.outputs.cache-hit != 'true'
+        run: cd src-ui && npm ci
+      -
+        name: Install Playwright
+        if: steps.cache-frontend-deps.outputs.cache-hit != 'true'
+        run: cd src-ui && npx playwright install --with-deps
 
-  prepare-docker-build:
-    name: Prepare Docker Pipeline Data
-    if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
+  tests-frontend:
+    name: "Frontend Tests (Node ${{ matrix.node-version }} - ${{ matrix.shard-index }}/${{ matrix.shard-count }})"
+    runs-on: ubuntu-22.04
+    needs:
+      - install-frontend-depedendencies
+    strategy:
+      fail-fast: false
+      matrix:
+        node-version: [20.x]
+        shard-index: [1, 2, 3, 4]
+        shard-count: [4]
+    steps:
+      - uses: actions/checkout@v4
+      -
+        name: Use Node.js 20
+        uses: actions/setup-node@v4
+        with:
+          node-version: 20.x
+          cache: 'npm'
+          cache-dependency-path: 'src-ui/package-lock.json'
+      - name: Cache frontend depdendencies
+        id: cache-frontend-deps
+        uses: actions/cache@v3
+        with:
+          path: |
+            ~/.npm
+            ~/.cache
+          key: ${{ runner.os }}-frontenddeps-${{ hashFiles('src-ui/package-lock.json') }}
+      - name: Re-link Angular cli
+        run: cd src-ui && npm link @angular/cli
+      -
+        name: Linting checks
+        run: cd src-ui && npm run lint
+      -
+        name: Run Jest unit tests
+        run: cd src-ui && npm run test -- --max-workers=2 --shard=${{ matrix.shard-index }}/${{ matrix.shard-count }}
+      -
+        name: Upload Jest coverage
+        if: always()
+        uses: actions/upload-artifact@v3
+        with:
+          name: jest-coverage-report-${{ matrix.shard-index }}
+          path: |
+            src-ui/coverage/coverage-final.json
+            src-ui/coverage/lcov.info
+            src-ui/coverage/clover.xml
+          retention-days: 7
+          if-no-files-found: warn
+      -
+        name: Run Playwright e2e tests
+        run: cd src-ui && npx playwright test --shard ${{ matrix.shard-index }}/${{ matrix.shard-count }}
+      -
+        name: Upload Playwright test results
+        if: always()
+        uses: actions/upload-artifact@v3
+        with:
+          name: playwright-report
+          path: src-ui/playwright-report
+          retention-days: 7
+
+  tests-coverage-upload:
+    name: "Upload Coverage"
     runs-on: ubuntu-22.04
-    # If the push triggered the installer library workflow, wait for it to
-    # complete here. This ensures the required versions for the final
-    # image have been built, while not waiting at all if the versions haven't changed
-    concurrency:
-      group: build-installer-library
|
|
||||||
cancel-in-progress: false
|
|
||||||
needs:
|
needs:
|
||||||
- documentation
|
|
||||||
- tests-backend
|
- tests-backend
|
||||||
- tests-frontend
|
- tests-frontend
|
||||||
steps:
|
steps:
|
||||||
-
|
-
|
||||||
name: Set ghcr repository name
|
uses: actions/checkout@v4
|
||||||
id: set-ghcr-repository
|
|
||||||
run: |
|
|
||||||
ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
|
|
||||||
echo "repository=${ghcr_name}" >> $GITHUB_OUTPUT
|
|
||||||
-
|
-
|
||||||
name: Checkout
|
name: Download frontend coverage
|
||||||
uses: actions/checkout@v3
|
uses: actions/download-artifact@v3
|
||||||
-
|
|
||||||
name: Set up Python
|
|
||||||
uses: actions/setup-python@v4
|
|
||||||
with:
|
with:
|
||||||
python-version: "3.9"
|
path: src-ui/coverage/
|
||||||
-
|
-
|
||||||
name: Setup qpdf image
|
name: Upload frontend coverage to Codecov
|
||||||
id: qpdf-setup
|
uses: codecov/codecov-action@v3
|
||||||
run: |
|
with:
|
||||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)
|
# not required for public repos, but intermittently fails otherwise
|
||||||
|
token: ${{ secrets.CODECOV_TOKEN }}
|
||||||
echo ${build_json}
|
flags: frontend
|
||||||
|
directory: src-ui/coverage/
|
||||||
echo "qpdf-json=${build_json}" >> $GITHUB_OUTPUT
|
# dont include backend coverage files here
|
||||||
|
files: '!coverage.xml'
|
||||||
-
|
-
|
||||||
name: Setup psycopg2 image
|
name: Download backend coverage
|
||||||
id: psycopg2-setup
|
uses: actions/download-artifact@v3
|
||||||
run: |
|
with:
|
||||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)
|
name: backend-coverage-report
|
||||||
|
path: src/
|
||||||
echo ${build_json}
|
|
||||||
|
|
||||||
echo "psycopg2-json=${build_json}" >> $GITHUB_OUTPUT
|
|
||||||
-
|
-
|
||||||
name: Setup pikepdf image
|
name: Upload coverage to Codecov
|
||||||
id: pikepdf-setup
|
uses: codecov/codecov-action@v3
|
||||||
run: |
|
with:
|
||||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)
|
# not required for public repos, but intermittently fails otherwise
|
||||||
|
token: ${{ secrets.CODECOV_TOKEN }}
|
||||||
|
# future expansion
|
||||||
|
flags: backend
|
||||||
|
directory: src/
|
||||||
|
|
||||||
echo ${build_json}
|
|
||||||
|
|
||||||
echo "pikepdf-json=${build_json}" >> $GITHUB_OUTPUT
|
|
||||||
-
|
|
||||||
name: Setup jbig2enc image
|
|
||||||
id: jbig2enc-setup
|
|
||||||
run: |
|
|
||||||
build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)
|
|
||||||
|
|
||||||
echo ${build_json}
|
|
||||||
|
|
||||||
echo "jbig2enc-json=${build_json}" >> $GITHUB_OUTPUT
|
|
||||||
|
|
||||||
outputs:
|
|
||||||
|
|
||||||
ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
|
|
||||||
|
|
||||||
qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
|
|
||||||
|
|
||||||
pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
|
|
||||||
|
|
||||||
psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
|
|
||||||
|
|
||||||
jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json}}
|
|
||||||
|
|
||||||
# build and push image to docker hub.
|
|
||||||
build-docker-image:
|
build-docker-image:
|
||||||
|
name: Build Docker image for ${{ github.ref_name }}
|
||||||
runs-on: ubuntu-22.04
|
runs-on: ubuntu-22.04
|
||||||
|
if: github.event_name == 'push' && (startsWith(github.ref, 'refs/heads/feature-') || github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/beta' || contains(github.ref, 'beta.rc') || startsWith(github.ref, 'refs/tags/v'))
|
||||||
concurrency:
|
concurrency:
|
||||||
group: ${{ github.workflow }}-build-docker-image-${{ github.ref_name }}
|
group: ${{ github.workflow }}-build-docker-image-${{ github.ref_name }}
|
||||||
cancel-in-progress: true
|
cancel-in-progress: true
|
||||||
needs:
|
needs:
|
||||||
- prepare-docker-build
|
- tests-backend
|
||||||
|
- tests-frontend
|
||||||
steps:
|
steps:
|
||||||
-
|
-
|
||||||
name: Check pushing to Docker Hub
|
name: Check pushing to Docker Hub
|
||||||
id: docker-hub
|
id: push-other-places
|
||||||
# Only push to Dockerhub from the main repo AND the ref is either:
|
# Only push to Dockerhub from the main repo AND the ref is either:
|
||||||
# main
|
# main
|
||||||
# dev
|
# dev
|
||||||
@@ -296,21 +320,29 @@ jobs:
|
|||||||
# a tag
|
# a tag
|
||||||
# Otherwise forks would require a Docker Hub account and secrets setup
|
# Otherwise forks would require a Docker Hub account and secrets setup
|
||||||
run: |
|
run: |
|
||||||
if [[ ${{ needs.prepare-docker-build.outputs.ghcr-repository }} == "paperless-ngx/paperless-ngx" && ( ${{ github.ref_name }} == "main" || ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
|
if [[ ${{ github.repository_owner }} == "paperless-ngx" && ( ${{ github.ref_name }} == "dev" || ${{ github.ref_name }} == "beta" || ${{ startsWith(github.ref, 'refs/tags/v') }} == "true" ) ]] ; then
|
||||||
echo "Enabling DockerHub image push"
|
echo "Enabling DockerHub image push"
|
||||||
echo "enable=true" >> $GITHUB_OUTPUT
|
echo "enable=true" >> $GITHUB_OUTPUT
|
||||||
else
|
else
|
||||||
echo "Not pushing to DockerHub"
|
echo "Not pushing to DockerHub"
|
||||||
echo "enable=false" >> $GITHUB_OUTPUT
|
echo "enable=false" >> $GITHUB_OUTPUT
|
||||||
fi
|
fi
|
||||||
|
-
|
||||||
|
name: Set ghcr repository name
|
||||||
|
id: set-ghcr-repository
|
||||||
|
run: |
|
||||||
|
ghcr_name=$(echo "${{ github.repository }}" | awk '{ print tolower($0) }')
|
||||||
|
echo "Name is ${ghcr_name}"
|
||||||
|
echo "ghcr-repository=${ghcr_name}" >> $GITHUB_OUTPUT
|
||||||
-
|
-
|
||||||
name: Gather Docker metadata
|
name: Gather Docker metadata
|
||||||
id: docker-meta
|
id: docker-meta
|
||||||
uses: docker/metadata-action@v4
|
uses: docker/metadata-action@v5
|
||||||
with:
|
with:
|
||||||
images: |
|
images: |
|
||||||
ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}
|
ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}
|
||||||
name=paperlessngx/paperless-ngx,enable=${{ steps.docker-hub.outputs.enable }}
|
name=paperlessngx/paperless-ngx,enable=${{ steps.push-other-places.outputs.enable }}
|
||||||
|
name=quay.io/paperlessngx/paperless-ngx,enable=${{ steps.push-other-places.outputs.enable }}
|
||||||
tags: |
|
tags: |
|
||||||
# Tag branches with branch name
|
# Tag branches with branch name
|
||||||
type=ref,event=branch
|
type=ref,event=branch
|
||||||
@@ -320,51 +352,59 @@ jobs:
|
|||||||
type=semver,pattern={{major}}.{{minor}}
|
type=semver,pattern={{major}}.{{minor}}
|
||||||
-
|
-
|
||||||
name: Checkout
|
name: Checkout
|
||||||
uses: actions/checkout@v3
|
uses: actions/checkout@v4
|
||||||
|
# If https://github.com/docker/buildx/issues/1044 is resolved,
|
||||||
|
# the append input with a native arm64 arch could be used to
|
||||||
|
# significantly speed up building
|
||||||
-
|
-
|
||||||
name: Set up Docker Buildx
|
name: Set up Docker Buildx
|
||||||
uses: docker/setup-buildx-action@v2
|
uses: docker/setup-buildx-action@v3
|
||||||
-
|
-
|
||||||
name: Set up QEMU
|
name: Set up QEMU
|
||||||
uses: docker/setup-qemu-action@v2
|
uses: docker/setup-qemu-action@v3
|
||||||
|
with:
|
||||||
|
platforms: arm64
|
||||||
-
|
-
|
||||||
name: Login to Github Container Registry
|
name: Login to GitHub Container Registry
|
||||||
uses: docker/login-action@v2
|
uses: docker/login-action@v3
|
||||||
with:
|
with:
|
||||||
registry: ghcr.io
|
registry: ghcr.io
|
||||||
username: ${{ github.actor }}
|
username: ${{ github.actor }}
|
||||||
password: ${{ secrets.GITHUB_TOKEN }}
|
password: ${{ secrets.GITHUB_TOKEN }}
|
||||||
-
|
-
|
||||||
name: Login to Docker Hub
|
name: Login to Docker Hub
|
||||||
uses: docker/login-action@v2
|
uses: docker/login-action@v3
|
||||||
# Don't attempt to login is not pushing to Docker Hub
|
# Don't attempt to login is not pushing to Docker Hub
|
||||||
if: steps.docker-hub.outputs.enable == 'true'
|
if: steps.push-other-places.outputs.enable == 'true'
|
||||||
with:
|
with:
|
||||||
username: ${{ secrets.DOCKERHUB_USERNAME }}
|
username: ${{ secrets.DOCKERHUB_USERNAME }}
|
||||||
password: ${{ secrets.DOCKERHUB_TOKEN }}
|
password: ${{ secrets.DOCKERHUB_TOKEN }}
|
||||||
|
-
|
||||||
|
name: Login to Quay.io
|
||||||
|
uses: docker/login-action@v3
|
||||||
|
# Don't attempt to login is not pushing to Quay.io
|
||||||
|
if: steps.push-other-places.outputs.enable == 'true'
|
||||||
|
with:
|
||||||
|
registry: quay.io
|
||||||
|
username: ${{ secrets.QUAY_USERNAME }}
|
||||||
|
password: ${{ secrets.QUAY_ROBOT_TOKEN }}
|
||||||
-
|
-
|
||||||
name: Build and push
|
name: Build and push
|
||||||
uses: docker/build-push-action@v3
|
uses: docker/build-push-action@v5
|
||||||
with:
|
with:
|
||||||
context: .
|
context: .
|
||||||
file: ./Dockerfile
|
file: ./Dockerfile
|
||||||
platforms: linux/amd64,linux/arm/v7,linux/arm64
|
platforms: linux/amd64,linux/arm64
|
||||||
push: ${{ github.event_name != 'pull_request' }}
|
push: ${{ github.event_name != 'pull_request' }}
|
||||||
tags: ${{ steps.docker-meta.outputs.tags }}
|
tags: ${{ steps.docker-meta.outputs.tags }}
|
||||||
labels: ${{ steps.docker-meta.outputs.labels }}
|
labels: ${{ steps.docker-meta.outputs.labels }}
|
||||||
build-args: |
|
# Get cache layers from this branch, then dev
|
||||||
JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
|
|
||||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
|
||||||
PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
|
|
||||||
PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
|
|
||||||
# Get cache layers from this branch, then dev, then main
|
|
||||||
# This allows new branches to get at least some cache benefits, generally from dev
|
# This allows new branches to get at least some cache benefits, generally from dev
|
||||||
cache-from: |
|
cache-from: |
|
||||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
type=registry,ref=ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
||||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:dev
|
type=registry,ref=ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}/builder/cache/app:dev
|
||||||
type=registry,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:main
|
|
||||||
cache-to: |
|
cache-to: |
|
||||||
type=registry,mode=max,ref=ghcr.io/${{ needs.prepare-docker-build.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
type=registry,mode=max,ref=ghcr.io/${{ steps.set-ghcr-repository.outputs.ghcr-repository }}/builder/cache/app:${{ github.ref_name }}
|
||||||
-
|
-
|
||||||
name: Inspect image
|
name: Inspect image
|
||||||
run: |
|
run: |
|
||||||
@@ -380,31 +420,40 @@ jobs:
|
|||||||
with:
|
with:
|
||||||
name: frontend-compiled
|
name: frontend-compiled
|
||||||
path: src/documents/static/frontend/
|
path: src/documents/static/frontend/
|
||||||
|
retention-days: 7
|
||||||
|
|
||||||
build-release:
|
build-release:
|
||||||
|
name: "Build Release"
|
||||||
needs:
|
needs:
|
||||||
- build-docker-image
|
- build-docker-image
|
||||||
|
- documentation
|
||||||
runs-on: ubuntu-22.04
|
runs-on: ubuntu-22.04
|
||||||
steps:
|
steps:
|
||||||
-
|
-
|
||||||
name: Checkout
|
name: Checkout
|
||||||
uses: actions/checkout@v3
|
uses: actions/checkout@v4
|
||||||
-
|
|
||||||
name: Install pipenv
|
|
||||||
run: |
|
|
||||||
pip3 install --upgrade pip setuptools wheel pipx
|
|
||||||
pipx install pipenv
|
|
||||||
-
|
-
|
||||||
name: Set up Python
|
name: Set up Python
|
||||||
|
id: setup-python
|
||||||
uses: actions/setup-python@v4
|
uses: actions/setup-python@v4
|
||||||
with:
|
with:
|
||||||
python-version: 3.9
|
python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
|
||||||
cache: "pipenv"
|
cache: "pipenv"
|
||||||
cache-dependency-path: 'Pipfile.lock'
|
cache-dependency-path: 'Pipfile.lock'
|
||||||
|
-
|
||||||
|
name: Install pipenv + tools
|
||||||
|
run: |
|
||||||
|
pip install --upgrade --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }} setuptools wheel
|
||||||
-
|
-
|
||||||
name: Install Python dependencies
|
name: Install Python dependencies
|
||||||
run: |
|
run: |
|
||||||
pipenv sync --dev
|
pipenv --python ${{ steps.setup-python.outputs.python-version }} sync --dev
|
||||||
|
-
|
||||||
|
name: Patch whitenoise
|
||||||
|
run: |
|
||||||
|
curl --fail --silent --show-error --location --output 484.patch https://github.com/evansd/whitenoise/pull/484.patch
|
||||||
|
patch -d $(pipenv --venv)/lib/python3.10/site-packages --verbose -p2 < 484.patch
|
||||||
|
rm 484.patch
|
||||||
-
|
-
|
||||||
name: Install system dependencies
|
name: Install system dependencies
|
||||||
run: |
|
run: |
|
||||||
@@ -425,35 +474,62 @@ jobs:
|
|||||||
-
|
-
|
||||||
name: Generate requirements file
|
name: Generate requirements file
|
||||||
run: |
|
run: |
|
||||||
pipenv requirements > requirements.txt
|
pipenv --python ${{ steps.setup-python.outputs.python-version }} requirements > requirements.txt
|
||||||
-
|
-
|
||||||
name: Compile messages
|
name: Compile messages
|
||||||
run: |
|
run: |
|
||||||
cd src/
|
cd src/
|
||||||
pipenv run python3 manage.py compilemessages
|
pipenv --python ${{ steps.setup-python.outputs.python-version }} run python3 manage.py compilemessages
|
||||||
-
|
-
|
||||||
name: Collect static files
|
name: Collect static files
|
||||||
run: |
|
run: |
|
||||||
cd src/
|
cd src/
|
||||||
pipenv run python3 manage.py collectstatic --no-input
|
pipenv --python ${{ steps.setup-python.outputs.python-version }} run python3 manage.py collectstatic --no-input
|
||||||
-
|
-
|
||||||
name: Move files
|
name: Move files
|
||||||
run: |
|
run: |
|
||||||
mkdir dist
|
echo "Making dist folders"
|
||||||
mkdir dist/paperless-ngx
|
for directory in dist \
|
||||||
mkdir dist/paperless-ngx/scripts
|
dist/paperless-ngx \
|
||||||
cp .dockerignore .env Dockerfile Pipfile Pipfile.lock requirements.txt LICENSE README.md dist/paperless-ngx/
|
dist/paperless-ngx/scripts;
|
||||||
cp paperless.conf.example dist/paperless-ngx/paperless.conf
|
do
|
||||||
cp gunicorn.conf.py dist/paperless-ngx/gunicorn.conf.py
|
mkdir --verbose --parents ${directory}
|
||||||
cp -r docker/ dist/paperless-ngx/docker
|
done
|
||||||
cp scripts/*.service scripts/*.sh dist/paperless-ngx/scripts/
|
|
||||||
cp -r src/ dist/paperless-ngx/src
|
echo "Copying basic files"
|
||||||
cp -r docs/_build/html/ dist/paperless-ngx/docs
|
for file_name in .dockerignore \
|
||||||
mv static dist/paperless-ngx
|
.env \
|
||||||
|
Dockerfile \
|
||||||
|
Pipfile \
|
||||||
|
Pipfile.lock \
|
||||||
|
requirements.txt \
|
||||||
|
LICENSE \
|
||||||
|
README.md \
|
||||||
|
paperless.conf.example \
|
||||||
|
gunicorn.conf.py
|
||||||
|
do
|
||||||
|
cp --verbose ${file_name} dist/paperless-ngx/
|
||||||
|
done
|
||||||
|
mv --verbose dist/paperless-ngx/paperless.conf.example dist/paperless-ngx/paperless.conf
|
||||||
|
|
||||||
|
echo "Copying Docker related files"
|
||||||
|
cp --recursive docker/ dist/paperless-ngx/docker
|
||||||
|
|
||||||
|
echo "Copying startup scripts"
|
||||||
|
cp --verbose scripts/*.service scripts/*.sh scripts/*.socket dist/paperless-ngx/scripts/
|
||||||
|
|
||||||
|
echo "Copying source files"
|
||||||
|
cp --recursive src/ dist/paperless-ngx/src
|
||||||
|
echo "Copying documentation"
|
||||||
|
cp --recursive docs/_build/html/ dist/paperless-ngx/docs
|
||||||
|
|
||||||
|
mv --verbose static dist/paperless-ngx
|
||||||
-
|
-
|
||||||
name: Make release package
|
name: Make release package
|
||||||
run: |
|
run: |
|
||||||
|
echo "Creating release archive"
|
||||||
cd dist
|
cd dist
|
||||||
|
sudo chown -R 1000:1000 paperless-ngx/
|
||||||
tar -cJf paperless-ngx.tar.xz paperless-ngx/
|
tar -cJf paperless-ngx.tar.xz paperless-ngx/
|
||||||
-
|
-
|
||||||
name: Upload release artifact
|
name: Upload release artifact
|
||||||
@@ -461,8 +537,10 @@ jobs:
|
|||||||
with:
|
with:
|
||||||
name: release
|
name: release
|
||||||
path: dist/paperless-ngx.tar.xz
|
path: dist/paperless-ngx.tar.xz
|
||||||
|
retention-days: 7
|
||||||
|
|
||||||
publish-release:
|
publish-release:
|
||||||
|
name: "Publish Release"
|
||||||
runs-on: ubuntu-22.04
|
runs-on: ubuntu-22.04
|
||||||
outputs:
|
outputs:
|
||||||
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
prerelease: ${{ steps.get_version.outputs.prerelease }}
|
||||||
@@ -491,7 +569,7 @@ jobs:
|
|||||||
-
|
-
|
||||||
name: Create Release and Changelog
|
name: Create Release and Changelog
|
||||||
id: create-release
|
id: create-release
|
||||||
uses: paperless-ngx/release-drafter@master
|
uses: release-drafter/release-drafter@v5
|
||||||
with:
|
with:
|
||||||
name: Paperless-ngx ${{ steps.get_version.outputs.version }}
|
name: Paperless-ngx ${{ steps.get_version.outputs.version }}
|
||||||
tag: ${{ steps.get_version.outputs.version }}
|
tag: ${{ steps.get_version.outputs.version }}
|
||||||
@@ -512,6 +590,7 @@ jobs:
|
|||||||
asset_content_type: application/x-xz
|
asset_content_type: application/x-xz
|
||||||
|
|
||||||
append-changelog:
|
append-changelog:
|
||||||
|
name: "Append Changelog"
|
||||||
runs-on: ubuntu-22.04
|
runs-on: ubuntu-22.04
|
||||||
needs:
|
needs:
|
||||||
- publish-release
|
- publish-release
|
||||||
@@ -519,21 +598,20 @@ jobs:
|
|||||||
steps:
|
steps:
|
||||||
-
|
-
|
||||||
name: Checkout
|
name: Checkout
|
||||||
uses: actions/checkout@v3
|
uses: actions/checkout@v4
|
||||||
with:
|
with:
|
||||||
ref: main
|
ref: main
|
||||||
-
|
|
||||||
name: Install pipenv
|
|
||||||
run: |
|
|
||||||
pip3 install --upgrade pip setuptools wheel pipx
|
|
||||||
pipx install pipenv
|
|
||||||
-
|
-
|
||||||
name: Set up Python
|
name: Set up Python
|
||||||
uses: actions/setup-python@v4
|
uses: actions/setup-python@v4
|
||||||
with:
|
with:
|
||||||
python-version: 3.9
|
python-version: ${{ env.DEFAULT_PYTHON_VERSION }}
|
||||||
cache: "pipenv"
|
cache: "pipenv"
|
||||||
cache-dependency-path: 'Pipfile.lock'
|
cache-dependency-path: 'Pipfile.lock'
|
||||||
|
-
|
||||||
|
name: Install pipenv + tools
|
||||||
|
run: |
|
||||||
|
pip install --upgrade --user pipenv==${{ env.DEFAULT_PIP_ENV_VERSION }} setuptools wheel
|
||||||
-
|
-
|
||||||
name: Append Changelog to docs
|
name: Append Changelog to docs
|
||||||
id: append-Changelog
|
id: append-Changelog
|
||||||
@@ -554,7 +632,7 @@ jobs:
|
|||||||
git push origin ${{ needs.publish-release.outputs.version }}-changelog
|
git push origin ${{ needs.publish-release.outputs.version }}-changelog
|
||||||
-
|
-
|
||||||
name: Create Pull Request
|
name: Create Pull Request
|
||||||
uses: actions/github-script@v6
|
uses: actions/github-script@v7
|
||||||
with:
|
with:
|
||||||
script: |
|
script: |
|
||||||
const { repo, owner } = context.repo;
|
const { repo, owner } = context.repo;
|
||||||
@@ -570,5 +648,5 @@ jobs:
|
|||||||
owner,
|
owner,
|
||||||
repo,
|
repo,
|
||||||
issue_number: result.data.number,
|
issue_number: result.data.number,
|
||||||
labels: ['documentation']
|
labels: ['documentation', 'skip-changelog']
|
||||||
});
|
});
|
||||||
|
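The "Check pushing to Docker Hub" gate above is easier to read once you remember that the `${{ ... }}` expressions are substituted by the runner before bash ever executes the step, so it reduces to a plain `[[ ]]` test. A local stand-in with the substituted values placed in shell variables (the values below are illustrative, not taken from any particular run):

```shell
# Stand-in for the workflow's push gate: push to Docker Hub only from the
# paperless-ngx org and only for dev, beta, or a version tag.
repository_owner="paperless-ngx"  # ${{ github.repository_owner }}
ref_name="dev"                    # ${{ github.ref_name }}
is_version_tag="false"            # ${{ startsWith(github.ref, 'refs/tags/v') }}

if [[ ${repository_owner} == "paperless-ngx" && ( ${ref_name} == "dev" || ${ref_name} == "beta" || ${is_version_tag} == "true" ) ]] ; then
  echo "enable=true"
else
  echo "enable=false"
fi
# prints: enable=true
```

In the workflow the same `echo` is redirected to `$GITHUB_OUTPUT`, which is how later steps read `steps.push-other-places.outputs.enable`.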
.github/workflows/cleanup-tags.yml (vendored, 95 changed lines)
--- a/.github/workflows/cleanup-tags.yml
+++ b/.github/workflows/cleanup-tags.yml
@@ -12,9 +12,6 @@ on:
  push:
    paths:
      - ".github/workflows/cleanup-tags.yml"
-      - ".github/scripts/cleanup-tags.py"
-      - ".github/scripts/github.py"
-      - ".github/scripts/common.py"

concurrency:
  group: registry-tags-cleanup
@@ -26,68 +23,48 @@ jobs:
    if: github.repository_owner == 'paperless-ngx'
    runs-on: ubuntu-22.04
    strategy:
+      fail-fast: false
      matrix:
-        include:
-          - primary-name: "paperless-ngx"
-            cache-name: "paperless-ngx/builder/cache/app"
-          - primary-name: "paperless-ngx/builder/qpdf"
-            cache-name: "paperless-ngx/builder/cache/qpdf"
-          - primary-name: "paperless-ngx/builder/pikepdf"
-            cache-name: "paperless-ngx/builder/cache/pikepdf"
-          - primary-name: "paperless-ngx/builder/jbig2enc"
-            cache-name: "paperless-ngx/builder/cache/jbig2enc"
-          - primary-name: "paperless-ngx/builder/psycopg2"
-            cache-name: "paperless-ngx/builder/cache/psycopg2"
+        primary-name: ["paperless-ngx", "paperless-ngx/builder/cache/app"]
    env:
      # Requires a personal access token with the OAuth scope delete:packages
      TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
    steps:
      -
-        name: Checkout
-        uses: actions/checkout@v3
-      -
-        name: Login to Github Container Registry
-        uses: docker/login-action@v2
-        with:
-          registry: ghcr.io
-          username: ${{ github.actor }}
-          password: ${{ secrets.GITHUB_TOKEN }}
-      -
-        name: Set up Python
-        uses: actions/setup-python@v4
-        with:
-          python-version: "3.10"
-      -
-        name: Install httpx
-        run: |
-          python -m pip install httpx
-      #
-      # Clean up primary package
-      #
-      -
-        name: Cleanup for package "${{ matrix.primary-name }}"
+        name: Clean temporary images
        if: "${{ env.TOKEN != '' }}"
-        run: |
-          python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --untagged --is-manifest --delete "${{ matrix.primary-name }}"
-      #
-      # Clean up registry cache package
-      #
+        uses: stumpylog/image-cleaner-action/ephemeral@v0.4.0
+        with:
+          token: "${{ env.TOKEN }}"
+          owner: "${{ github.repository_owner }}"
+          is_org: "true"
+          package_name: "${{ matrix.primary-name }}"
+          scheme: "branch"
+          repo_name: "paperless-ngx"
+          match_regex: "feature-"
+          do_delete: "true"

+  cleanup-untagged-images:
+    name: Cleanup Untagged Images Tags for ${{ matrix.primary-name }}
+    if: github.repository_owner == 'paperless-ngx'
+    runs-on: ubuntu-22.04
+    needs:
+      - cleanup-images
+    strategy:
+      fail-fast: false
+      matrix:
+        primary-name: ["paperless-ngx", "paperless-ngx/builder/cache/app"]
+    env:
+      # Requires a personal access token with the OAuth scope delete:packages
+      TOKEN: ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }}
+    steps:
      -
-        name: Cleanup for package "${{ matrix.cache-name }}"
+        name: Clean untagged images
        if: "${{ env.TOKEN != '' }}"
-        run: |
-          python ${GITHUB_WORKSPACE}/.github/scripts/cleanup-tags.py --untagged --delete "${{ matrix.cache-name }}"
-      #
-      # Verify tags which are left still pull
-      #
-      -
-        name: Check all tags still pull
-        run: |
-          ghcr_name=$(echo "ghcr.io/${GITHUB_REPOSITORY_OWNER}/${{ matrix.primary-name }}" | awk '{ print tolower($0) }')
-          echo "Pulling all tags of ${ghcr_name}"
-          docker pull --quiet --all-tags ${ghcr_name}
-          docker image list
+        uses: stumpylog/image-cleaner-action/untagged@v0.4.0
+        with:
+          token: "${{ env.TOKEN }}"
+          owner: "${{ github.repository_owner }}"
+          is_org: "true"
+          package_name: "${{ matrix.primary-name }}"
+          do_delete: "true"
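Both cleanup jobs guard their steps with `if: "${{ env.TOKEN != '' }}"`, so forks that never configure the `delete:packages` personal access token skip deletion instead of failing. The same guard expressed as plain shell (TOKEN deliberately left empty here to show the skip path):

```shell
# Stand-in for the TOKEN guard on the cleanup steps: run the cleanup only
# when the personal access token secret is actually configured.
TOKEN=""  # would be ${{ secrets.GHA_CONTAINER_DELETE_TOKEN }} in CI
if [ -n "${TOKEN}" ]; then
  echo "token present: cleanup would run"
else
  echo "token absent: cleanup skipped"
fi
# prints: token absent: cleanup skipped
```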
.github/workflows/codeql-analysis.yml (vendored, 2 changed lines)
--- a/.github/workflows/codeql-analysis.yml
+++ b/.github/workflows/codeql-analysis.yml
@@ -38,7 +38,7 @@ jobs:

    steps:
    - name: Checkout repository
-      uses: actions/checkout@v3
+      uses: actions/checkout@v4

    # Initializes the CodeQL tools for scanning.
    - name: Initialize CodeQL
.github/workflows/crowdin.yml (vendored, new file, 33 lines)
--- /dev/null
+++ b/.github/workflows/crowdin.yml
@@ -0,0 +1,33 @@
+name: Crowdin Action
+
+on:
+  workflow_dispatch:
+  schedule:
+    - cron: '2 */12 * * *'
+  push:
+    paths: [
+      'src/locale/**',
+      'src-ui/src/locale/**'
+    ]
+    branches: [ dev ]
+
+jobs:
+  synchronize-with-crowdin:
+    name: Crowdin Sync
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v4
+      - name: crowdin action
+        uses: crowdin/github-action@v1
+        with:
+          upload_translations: false
+          download_translations: true
+          crowdin_branch_name: 'dev'
+          localization_branch_name: l10n_dev
+          pull_request_labels: 'skip-changelog, translation'
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          CROWDIN_PROJECT_ID: ${{ secrets.CROWDIN_PROJECT_ID }}
+          CROWDIN_PERSONAL_TOKEN: ${{ secrets.CROWDIN_PERSONAL_TOKEN }}
.github/workflows/installer-library.yml (vendored, deleted, 171 lines)
--- a/.github/workflows/installer-library.yml
+++ /dev/null
@@ -1,171 +0,0 @@
-# This workflow will run to update the installer library of
-# Docker images. These are the images which provide updated wheels
-# .deb installation packages or maybe just some compiled library
-
-name: Build Image Library
-
-on:
-  push:
-    # Must match one of these branches AND one of the paths
-    # to be triggered
-    branches:
-      - "main"
-      - "dev"
-      - "library-*"
-      - "feature-*"
-    paths:
-      # Trigger the workflow if a Dockerfile changed
-      - "docker-builders/**"
-      # Trigger if a package was updated
-      - ".build-config.json"
-      - "Pipfile.lock"
-      # Also trigger on workflow changes related to the library
-      - ".github/workflows/installer-library.yml"
-      - ".github/workflows/reusable-workflow-builder.yml"
-      - ".github/scripts/**"
-
-# Set a workflow level concurrency group so primary workflow
-# can wait for this to complete if needed
-# DO NOT CHANGE without updating main workflow group
-concurrency:
-  group: build-installer-library
-  cancel-in-progress: false
-
-jobs:
-  prepare-docker-build:
-    name: Prepare Docker Image Version Data
-    runs-on: ubuntu-22.04
-    steps:
-      -
-        name: Set ghcr repository name
-        id: set-ghcr-repository
-        run: |
-          ghcr_name=$(echo "${GITHUB_REPOSITORY}" | awk '{ print tolower($0) }')
-          echo "repository=${ghcr_name}" >> $GITHUB_OUTPUT
-      -
-        name: Checkout
-        uses: actions/checkout@v3
-      -
-        name: Set up Python
-        uses: actions/setup-python@v4
-        with:
-          python-version: "3.9"
-      -
-        name: Install jq
-        run: |
-          sudo apt-get update
-          sudo apt-get install jq
-      -
-        name: Setup qpdf image
-        id: qpdf-setup
-        run: |
-          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py qpdf)
-
-          echo ${build_json}
-
-          echo "qpdf-json=${build_json}" >> $GITHUB_OUTPUT
-      -
-        name: Setup psycopg2 image
-        id: psycopg2-setup
-        run: |
-          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py psycopg2)
-
-          echo ${build_json}
-
-          echo "psycopg2-json=${build_json}" >> $GITHUB_OUTPUT
-      -
-        name: Setup pikepdf image
-        id: pikepdf-setup
-        run: |
-          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py pikepdf)
-
-          echo ${build_json}
-
-          echo "pikepdf-json=${build_json}" >> $GITHUB_OUTPUT
-      -
-        name: Setup jbig2enc image
-        id: jbig2enc-setup
-        run: |
-          build_json=$(python ${GITHUB_WORKSPACE}/.github/scripts/get-build-json.py jbig2enc)
-
-          echo ${build_json}
-
-          echo "jbig2enc-json=${build_json}" >> $GITHUB_OUTPUT
-      -
-        name: Setup other versions
-        id: cache-bust-setup
-        run: |
-          pillow_version=$(jq ".default.pillow.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-          lxml_version=$(jq ".default.lxml.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-
-          echo "Pillow is ${pillow_version}"
-          echo "lxml is ${lxml_version}"
-
-          echo "pillow-version=${pillow_version}" >> $GITHUB_OUTPUT
-          echo "lxml-version=${lxml_version}" >> $GITHUB_OUTPUT
-    outputs:
-      ghcr-repository: ${{ steps.set-ghcr-repository.outputs.repository }}
-      qpdf-json: ${{ steps.qpdf-setup.outputs.qpdf-json }}
-      pikepdf-json: ${{ steps.pikepdf-setup.outputs.pikepdf-json }}
-      psycopg2-json: ${{ steps.psycopg2-setup.outputs.psycopg2-json }}
-      jbig2enc-json: ${{ steps.jbig2enc-setup.outputs.jbig2enc-json }}
-      pillow-version: ${{ steps.cache-bust-setup.outputs.pillow-version }}
-      lxml-version: ${{ steps.cache-bust-setup.outputs.lxml-version }}
-
-  build-qpdf-debs:
-    name: qpdf
-    needs:
-      - prepare-docker-build
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.qpdf
-      build-platforms: linux/amd64
-      build-json: ${{ needs.prepare-docker-build.outputs.qpdf-json }}
-      build-args: |
-        QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
-
-  build-jbig2enc:
-    name: jbig2enc
-    needs:
-      - prepare-docker-build
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.jbig2enc
-      build-json: ${{ needs.prepare-docker-build.outputs.jbig2enc-json }}
-      build-args: |
-        JBIG2ENC_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.jbig2enc-json).version }}
-
-  build-psycopg2-wheel:
-    name: psycopg2
-    needs:
-      - prepare-docker-build
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.psycopg2
-      build-json: ${{ needs.prepare-docker-build.outputs.psycopg2-json }}
-      build-args: |
-        PSYCOPG2_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.psycopg2-json).version }}
-
-  build-pikepdf-wheel:
-    name: pikepdf
-    needs:
-      - prepare-docker-build
-      - build-qpdf-debs
-    uses: ./.github/workflows/reusable-workflow-builder.yml
-    with:
-      dockerfile: ./docker-builders/Dockerfile.pikepdf
-      build-json: ${{ needs.prepare-docker-build.outputs.pikepdf-json }}
-      build-args: |
-        REPO=${{ needs.prepare-docker-build.outputs.ghcr-repository }}
|
|
||||||
QPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.qpdf-json).version }}
|
|
||||||
PIKEPDF_VERSION=${{ fromJSON(needs.prepare-docker-build.outputs.pikepdf-json).version }}
|
|
||||||
PILLOW_VERSION=${{ needs.prepare-docker-build.outputs.pillow-version }}
|
|
||||||
LXML_VERSION=${{ needs.prepare-docker-build.outputs.lxml-version }}
|
|
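The `cache-bust-setup` step above extracts pinned versions from `Pipfile.lock` with `jq` and `sed`. The same extraction can be sketched in plain Python; the lock-file fragment below is a made-up example shaped like pipenv's output, not the real file:

```python
import json

def pinned_version(lock_text: str, package: str, category: str = "default") -> str:
    """Replicate `jq ".default.<pkg>.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g'`:
    read the pinned specifier (e.g. "==9.4.0") and strip every "=" character."""
    lock = json.loads(lock_text)
    version = lock[category][package]["version"]
    return version.replace("=", "")

# Hypothetical Pipfile.lock fragment.
sample_lock = """
{
  "default": {
    "pillow": {"version": "==9.4.0"},
    "lxml": {"version": "==4.9.2"}
  }
}
"""

print(pinned_version(sample_lock, "pillow"))  # 9.4.0
print(pinned_version(sample_lock, "lxml"))    # 4.9.2
```

The resulting strings feed the `pillow-version` and `lxml-version` job outputs, whose only purpose is to bust the Docker layer cache when a pin changes.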
.github/workflows/project-actions.yml vendored (30 lines changed)
@@ -1,10 +1,6 @@
 name: Project Automations
 on:
-  issues:
-    types:
-      - opened
-      - reopened
   pull_request_target: #_target allows access to secrets
     types:
       - opened
@@ -16,25 +12,7 @@ on:
 permissions:
   contents: read
-
-env:
-  todo: Todo
-  done: Done
-  in_progress: In Progress
-
 jobs:
-  issue_opened_or_reopened:
-    name: issue_opened_or_reopened
-    runs-on: ubuntu-22.04
-    if: github.event_name == 'issues' && (github.event.action == 'opened' || github.event.action == 'reopened')
-    steps:
-      - name: Add issue to project and set status to ${{ env.todo }}
-        uses: leonsteinhaeuser/project-beta-automations@v2.0.1
-        with:
-          gh_token: ${{ secrets.GH_TOKEN }}
-          organization: paperless-ngx
-          project_id: 2
-          resource_node_id: ${{ github.event.issue.node_id }}
-          status_value: ${{ env.todo }} # Target status
   pr_opened_or_reopened:
     name: pr_opened_or_reopened
     runs-on: ubuntu-22.04
@@ -43,14 +21,6 @@ jobs:
       pull-requests: write
     if: github.event_name == 'pull_request_target' && (github.event.action == 'opened' || github.event.action == 'reopened') && github.event.pull_request.user.login != 'dependabot'
    steps:
-      - name: Add PR to project and set status to "Needs Review"
-        uses: leonsteinhaeuser/project-beta-automations@v2.0.1
-        with:
-          gh_token: ${{ secrets.GH_TOKEN }}
-          organization: paperless-ngx
-          project_id: 2
-          resource_node_id: ${{ github.event.pull_request.node_id }}
-          status_value: "Needs Review" # Target status
       - name: Label PR with release-drafter
         uses: release-drafter/release-drafter@v5
         env:
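The `if:` guard on `pr_opened_or_reopened` routes events by name, action, and author. A small Python sketch of the same predicate (the argument values are hypothetical stand-ins for fields of the GitHub webhook payload):

```python
def should_run_pr_job(event_name: str, action: str, author: str) -> bool:
    """Mirror the workflow guard:
    github.event_name == 'pull_request_target'
      && (action == 'opened' || action == 'reopened')
      && github.event.pull_request.user.login != 'dependabot'"""
    return (
        event_name == "pull_request_target"
        and action in ("opened", "reopened")
        and author != "dependabot"
    )

print(should_run_pr_job("pull_request_target", "opened", "some-user"))   # True
print(should_run_pr_job("pull_request_target", "opened", "dependabot"))  # False
print(should_run_pr_job("issues", "opened", "some-user"))                # False
```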
.github/workflows/release-chart.yml vendored (deleted, 31 lines)
@@ -1,31 +0,0 @@
----
-name: Release Charts
-
-on:
-  push:
-    tags:
-      - v*
-
-jobs:
-  release_chart:
-    name: "Release Chart"
-    runs-on: ubuntu-22.04
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
-
-      - name: Configure Git
-        run: |
-          git config user.name "$GITHUB_ACTOR"
-          git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
-      - name: Install Helm
-        uses: azure/setup-helm@v3
-        with:
-          version: v3.10.0
-
-      - name: Run chart-releaser
-        uses: helm/chart-releaser-action@v1.4.1
-        env:
-          CR_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
.github/workflows/repo-maintenance.yml vendored (new file, 109 lines)
@@ -0,0 +1,109 @@
+name: 'Repository Maintenance'
+
+on:
+  schedule:
+    - cron: '0 3 * * *'
+  workflow_dispatch:
+
+permissions:
+  issues: write
+  pull-requests: write
+  discussions: write
+
+concurrency:
+  group: lock
+
+jobs:
+  stale:
+    name: 'Stale'
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/stale@v8
+        with:
+          days-before-stale: 7
+          days-before-close: 14
+          any-of-labels: 'cant-reproduce,not a bug'
+          stale-issue-label: stale
+          stale-pr-label: stale
+          stale-issue-message: >
+            This issue has been automatically marked as stale because it has not had
+            recent activity. It will be closed if no further activity occurs. Thank you
+            for your contributions.
+  lock-threads:
+    name: 'Lock Old Threads'
+    runs-on: ubuntu-latest
+    steps:
+      - uses: dessant/lock-threads@v5
+        with:
+          issue-inactive-days: '30'
+          pr-inactive-days: '30'
+          discussion-inactive-days: '30'
+          log-output: true
+          issue-comment: >
+            This issue has been automatically locked since there
+            has not been any recent activity after it was closed.
+            Please open a new discussion or issue for related concerns.
+          pr-comment: >
+            This pull request has been automatically locked since there
+            has not been any recent activity after it was closed.
+            Please open a new discussion or issue for related concerns.
+          discussion-comment: >
+            This discussion has been automatically locked since there
+            has not been any recent activity after it was closed.
+            Please open a new discussion for related concerns.
+  close-answered-discussions:
+    name: 'Close Answered Discussions'
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/github-script@v7
+        with:
+          script: |
+            function sleep(ms) {
+              return new Promise(resolve => setTimeout(resolve, ms));
+            }
+
+            const query = `query($owner:String!, $name:String!) {
+              repository(owner:$owner, name:$name){
+                discussions(first:100, answered:true, states:[OPEN]) {
+                  nodes {
+                    id,
+                    number
+                  }
+                }
+              }
+            }`;
+            const variables = {
+              owner: context.repo.owner,
+              name: context.repo.repo,
+            }
+            const result = await github.graphql(query, variables)
+
+            console.log(`Found ${result.repository.discussions.nodes.length} open answered discussions`)
+
+            for (const discussion of result.repository.discussions.nodes) {
+              console.log(`Closing discussion #${discussion.number} (${discussion.id})`)
+
+              const addCommentMutation = `mutation($discussion:ID!, $body:String!) {
+                addDiscussionComment(input:{discussionId:$discussion, body:$body}) {
+                  clientMutationId
+                }
+              }`;
+              const commentVariables = {
+                discussion: discussion.id,
+                body: 'This discussion has been automatically closed because it was marked as answered.',
+              }
+              await github.graphql(addCommentMutation, commentVariables)
+
+              const closeDiscussionMutation = `mutation($discussion:ID!, $reason:DiscussionCloseReason!) {
+                closeDiscussion(input:{discussionId:$discussion, reason:$reason}) {
+                  clientMutationId
+                }
+              }`;
+              const closeVariables = {
+                discussion: discussion.id,
+                reason: "RESOLVED",
+              }
+              await github.graphql(closeDiscussionMutation, closeVariables)
+
+              await sleep(1000)
+            }
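The `close-answered-discussions` script walks the GraphQL result and pauses one second between mutations so the API is not hammered. The control flow can be sketched in Python; here `close_fn` is a hypothetical stand-in for the comment-and-close pair of GraphQL mutations, and the injectable `sleeper` exists only so the pause can be faked in a test:

```python
import time

def close_answered(discussions, close_fn, pause_seconds=1.0, sleeper=time.sleep):
    """For each open answered discussion node: run close_fn on its id,
    record its number, then pause before the next one."""
    closed = []
    for discussion in discussions:
        close_fn(discussion["id"])
        closed.append(discussion["number"])
        sleeper(pause_seconds)
    return closed

# Hypothetical nodes, shaped like the GraphQL `discussions.nodes` result.
nodes = [{"id": "D_1", "number": 10}, {"id": "D_2", "number": 11}]
calls = []
print(close_answered(nodes, calls.append, sleeper=lambda s: None))  # [10, 11]
```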
.github/workflows/reusable-workflow-builder.yml vendored (deleted, 57 lines)
@@ -1,57 +0,0 @@
-name: Reusable Image Builder
-
-on:
-  workflow_call:
-    inputs:
-      dockerfile:
-        required: true
-        type: string
-      build-json:
-        required: true
-        type: string
-      build-args:
-        required: false
-        default: ""
-        type: string
-      build-platforms:
-        required: false
-        default: linux/amd64,linux/arm64,linux/arm/v7
-        type: string
-
-concurrency:
-  group: ${{ github.workflow }}-${{ fromJSON(inputs.build-json).name }}-${{ fromJSON(inputs.build-json).version }}
-  cancel-in-progress: false
-
-jobs:
-  build-image:
-    name: Build ${{ fromJSON(inputs.build-json).name }} @ ${{ fromJSON(inputs.build-json).version }}
-    runs-on: ubuntu-22.04
-    steps:
-      -
-        name: Checkout
-        uses: actions/checkout@v3
-      -
-        name: Login to Github Container Registry
-        uses: docker/login-action@v2
-        with:
-          registry: ghcr.io
-          username: ${{ github.actor }}
-          password: ${{ secrets.GITHUB_TOKEN }}
-      -
-        name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v2
-      -
-        name: Set up QEMU
-        uses: docker/setup-qemu-action@v2
-      -
-        name: Build ${{ fromJSON(inputs.build-json).name }}
-        uses: docker/build-push-action@v3
-        with:
-          context: .
-          file: ${{ inputs.dockerfile }}
-          tags: ${{ fromJSON(inputs.build-json).image_tag }}
-          platforms: ${{ inputs.build-platforms }}
-          build-args: ${{ inputs.build-args }}
-          push: true
-          cache-from: type=registry,ref=${{ fromJSON(inputs.build-json).cache_tag }}
-          cache-to: type=registry,mode=max,ref=${{ fromJSON(inputs.build-json).cache_tag }}
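The deleted reusable workflow keyed both its concurrency group and its image tags on fields pulled out of the `build-json` input with `fromJSON`. The same parsing in Python; the payload below is a hypothetical example of the JSON emitted by `get-build-json.py`, limited to the fields the workflow actually reads:

```python
import json

def concurrency_group(workflow: str, build_json: str) -> str:
    """Mimic `${{ github.workflow }}-${{ fromJSON(...).name }}-${{ fromJSON(...).version }}`."""
    info = json.loads(build_json)
    return f"{workflow}-{info['name']}-{info['version']}"

# Hypothetical build-json payload (name, version, image_tag, cache_tag).
payload = json.dumps({
    "name": "qpdf",
    "version": "11.1.0",
    "image_tag": "ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:11.1.0",
    "cache_tag": "ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:11.1.0",
})
print(concurrency_group("ci", payload))  # ci-qpdf-11.1.0
```

Keying the group on name and version means two runs building the same tool version queue behind each other instead of racing to push the same tag.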
.gitignore vendored (1 line added)
@@ -73,6 +73,7 @@ virtualenv
 .venv/
 /docker-compose.env
 /docker-compose.yml
+.ruff_cache/
 
 # Used for development
 scripts/import-for-development
.pre-commit-config.yaml
@@ -5,20 +5,19 @@
 repos:
   # General hooks
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.4.0
+    rev: v4.5.0
     hooks:
       - id: check-docstring-first
       - id: check-json
        exclude: "tsconfig.*json"
       - id: check-yaml
-        exclude: "charts/paperless-ngx/templates/common.yaml"
       - id: check-toml
       - id: check-executables-have-shebangs
       - id: end-of-file-fixer
         exclude_types:
           - svg
           - pofile
-        exclude: "^(LICENSE|charts/paperless-ngx/README.md)$"
+        exclude: "(^LICENSE$)"
       - id: mixed-line-ending
         args:
           - "--fix=lf"
@@ -28,51 +27,26 @@ repos:
       - id: check-case-conflict
       - id: detect-private-key
   - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: "v2.7.1"
+    rev: 'v3.1.0'
     hooks:
       - id: prettier
         types_or:
           - javascript
           - ts
           - markdown
-        exclude: "(^Pipfile\\.lock$)|(^charts/paperless-ngx/README.md$)"
+        exclude: "(^Pipfile\\.lock$)"
   # Python hooks
-  - repo: https://github.com/asottile/reorder_python_imports
-    rev: v3.9.0
-    hooks:
-      - id: reorder-python-imports
-        exclude: "(migrations)"
-  - repo: https://github.com/asottile/yesqa
-    rev: "v1.4.0"
-    hooks:
-      - id: yesqa
-        exclude: "(migrations)"
-  - repo: https://github.com/asottile/add-trailing-comma
-    rev: "v2.4.0"
-    hooks:
-      - id: add-trailing-comma
-        exclude: "(migrations)"
-  - repo: https://github.com/PyCQA/flake8
-    rev: 6.0.0
-    hooks:
-      - id: flake8
-        files: ^src/
-        args:
-          - "--config=./src/setup.cfg"
-  - repo: https://github.com/psf/black
-    rev: 22.12.0
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: 'v0.1.5'
+    hooks:
+      - id: ruff
+  - repo: https://github.com/psf/black-pre-commit-mirror
+    rev: 23.11.0
     hooks:
       - id: black
-  - repo: https://github.com/asottile/pyupgrade
-    rev: v3.3.1
-    hooks:
-      - id: pyupgrade
-        exclude: "(migrations)"
-        args:
-          - "--py38-plus"
   # Dockerfile hooks
   - repo: https://github.com/AleksaC/hadolint-py
-    rev: v2.10.0
+    rev: v2.12.0.3
     hooks:
       - id: hadolint
   # Shell script hooks
@@ -83,6 +57,6 @@ repos:
         args:
           - "--tab"
   - repo: https://github.com/shellcheck-py/shellcheck-py
-    rev: "v0.9.0.2"
+    rev: "v0.9.0.6"
     hooks:
       - id: shellcheck
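The `exclude` values in the pre-commit config are Python regular expressions that pre-commit matches against each repo-relative file path with `re.search`. A quick check of the updated `end-of-file-fixer` exclude against the old one:

```python
import re

# Patterns from the config above; pre-commit applies them with re.search.
old_exclude = r"^(LICENSE|charts/paperless-ngx/README.md)$"
new_exclude = r"(^LICENSE$)"

def excluded(pattern: str, path: str) -> bool:
    return re.search(pattern, path) is not None

print(excluded(new_exclude, "LICENSE"))                         # True
print(excluded(new_exclude, "charts/paperless-ngx/README.md"))  # False
print(excluded(old_exclude, "charts/paperless-ngx/README.md"))  # True
```

With the chart removed from the repository, the chart README no longer needs to be carved out, so the pattern collapses to just `LICENSE`.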
.prettierrc (20 lines changed)
@@ -1,4 +1,16 @@
-# https://prettier.io/docs/en/options.html#semicolons
-semi: false
-# https://prettier.io/docs/en/options.html#quotes
-singleQuote: true
+{
+  # https://prettier.io/docs/en/options.html#semicolons
+  "semi": false,
+  # https://prettier.io/docs/en/options.html#quotes
+  "singleQuote": true,
+  # https://prettier.io/docs/en/options.html#trailing-commas
+  "trailingComma": "es5",
+  "overrides": [
+    {
+      "files": ["index.md", "administration.md"],
+      "options": {
+        "tabWidth": 4
+      }
+    }
+  ]
+}
.python-version (new file)
@@ -0,0 +1 @@
+3.9.18
23
.ruff.toml
Normal file
@@ -0,0 +1,23 @@
|
|||||||
|
# https://beta.ruff.rs/docs/settings/
|
||||||
|
# https://beta.ruff.rs/docs/rules/
|
||||||
|
extend-select = ["I", "W", "UP", "COM", "DJ", "EXE", "ISC", "ICN", "G201", "INP", "PIE", "RSE", "SIM", "TID", "PLC", "PLE", "RUF"]
|
||||||
|
# TODO PTH
|
||||||
|
ignore = ["DJ001", "SIM105", "RUF012"]
|
||||||
|
fix = true
|
||||||
|
line-length = 88
|
||||||
|
respect-gitignore = true
|
||||||
|
src = ["src"]
|
||||||
|
target-version = "py39"
|
||||||
|
output-format = "grouped"
|
||||||
|
show-fixes = true
|
||||||
|
|
||||||
|
[per-file-ignores]
|
||||||
|
".github/scripts/*.py" = ["E501", "INP001", "SIM117"]
|
||||||
|
"docker/wait-for-redis.py" = ["INP001"]
|
||||||
|
"*/tests/*.py" = ["E501", "SIM117"]
|
||||||
|
"*/migrations/*.py" = ["E501", "SIM"]
|
||||||
|
"src/paperless_tesseract/tests/test_parser.py" = ["RUF001"]
|
||||||
|
"src/documents/models.py" = ["SIM115"]
|
||||||
|
|
||||||
|
[isort]
|
||||||
|
force-single-line = true
|
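The `per-file-ignores` table in the new `.ruff.toml` maps path globs to rule codes that are suppressed for matching files. Ruff has its own glob-matching logic; the `fnmatch`-based sketch below only approximates it, but it shows how the table is meant to be read:

```python
from fnmatch import fnmatch

# per-file-ignores entries from the .ruff.toml above.
per_file_ignores = {
    ".github/scripts/*.py": ["E501", "INP001", "SIM117"],
    "docker/wait-for-redis.py": ["INP001"],
    "*/tests/*.py": ["E501", "SIM117"],
    "*/migrations/*.py": ["E501", "SIM"],
    "src/paperless_tesseract/tests/test_parser.py": ["RUF001"],
    "src/documents/models.py": ["SIM115"],
}

def ignored_codes(path: str) -> set:
    """Collect every rule code suppressed for the given path (approximation)."""
    codes = set()
    for pattern, rules in per_file_ignores.items():
        if fnmatch(path, pattern):
            codes.update(rules)
    return codes

print(sorted(ignored_codes("src/documents/tests/test_api.py")))  # ['E501', 'SIM117']
```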
CONTRIBUTING.md
@@ -11,7 +11,7 @@ If you want to implement something big:
 
 ## Python
 
-Paperless supports python 3.8 and 3.9. We format Python code with [Black](https://github.com/psf/black).
+Paperless supports python 3.9 - 3.11. We format Python code with [Black](https://github.com/psf/black).
 
 ## Branches
 
@@ -45,7 +45,7 @@ Examples of `non-trivial` PRs might include:
 
 - Additional features
 - Large changes to many distinct files
-- Breaking or depreciation of existing features
+- Breaking or deprecation of existing features
 
 Our community review process for `non-trivial` PRs is the following:
 
@@ -58,6 +58,13 @@ Our community review process for `non-trivial` PRs is the following:
 
 This process might be slow as community members have different schedules and time to dedicate to the Paperless project. However it ensures community code reviews are as brilliantly thorough as they once were with @jonaswinkler.
 
+# AI-Generated Code
+
+This project does not specifically prohibit the use of AI-generated code _during the process_ of creating a PR, however:
+
+1. Any code present in the final PR that was generated using AI sources should be clearly attributed as such and must not violate copyright protections.
+2. We will not accept PRs that are entirely or mostly AI-derived.
+
 # Translating Paperless-ngx
 
 Some notes about translation:
Dockerfile (203 lines changed)
@@ -1,25 +1,11 @@
-# syntax=docker/dockerfile:1.4
-
-# Pull the installer images from the library
-# These are all built previously
-# They provide either a .deb or .whl
-
-ARG JBIG2ENC_VERSION
-ARG QPDF_VERSION
-ARG PIKEPDF_VERSION
-ARG PSYCOPG2_VERSION
-
-FROM ghcr.io/paperless-ngx/paperless-ngx/builder/jbig2enc:${JBIG2ENC_VERSION} as jbig2enc-builder
-FROM ghcr.io/paperless-ngx/paperless-ngx/builder/qpdf:${QPDF_VERSION} as qpdf-builder
-FROM ghcr.io/paperless-ngx/paperless-ngx/builder/pikepdf:${PIKEPDF_VERSION} as pikepdf-builder
-FROM ghcr.io/paperless-ngx/paperless-ngx/builder/psycopg2:${PSYCOPG2_VERSION} as psycopg2-builder
-
-FROM --platform=$BUILDPLATFORM node:16-bullseye-slim AS compile-frontend
-
-# This stage compiles the frontend
-# This stage runs once for the native platform, as the outputs are not
-# dependent on target arch
-# Inputs: None
+# syntax=docker/dockerfile:1
+# https://github.com/moby/buildkit/blob/master/frontend/dockerfile/docs/reference.md
+
+# Stage: compile-frontend
+# Purpose: Compiles the frontend
+# Notes:
+#  - Does NPM stuff with Typescript and such
+FROM --platform=$BUILDPLATFORM docker.io/node:20-bookworm-slim AS compile-frontend
 
 COPY ./src-ui /src/src-ui
@@ -30,14 +16,12 @@ RUN set -eux \
 RUN set -eux \
   && ./node_modules/.bin/ng build --configuration production
 
-FROM --platform=$BUILDPLATFORM python:3.9-slim-bullseye as pipenv-base
-
-# This stage generates the requirements.txt file using pipenv
-# This stage runs once for the native platform, as the outputs are not
-# dependent on target arch
-# This way, pipenv dependencies are not left in the final image
-# nor can pipenv mess up the final image somehow
-# Inputs: None
+# Stage: pipenv-base
+# Purpose: Generates a requirements.txt file for building
+# Comments:
+#  - pipenv dependencies are not left in the final image
+#  - pipenv can't touch the final image somehow
+FROM --platform=$BUILDPLATFORM docker.io/python:3.11-alpine as pipenv-base
 
 WORKDIR /usr/src/pipenv
@@ -45,11 +29,15 @@ COPY Pipfile* ./
 
 RUN set -eux \
   && echo "Installing pipenv" \
-  && python3 -m pip install --no-cache-dir --upgrade pipenv==2022.11.30 \
+  && python3 -m pip install --no-cache-dir --upgrade pipenv==2023.10.24 \
   && echo "Generating requirement.txt" \
   && pipenv requirements > requirements.txt
 
-FROM python:3.9-slim-bullseye as main-app
+# Stage: main-app
+# Purpose: The final image
+# Comments:
+#  - Don't leave anything extra in here
+FROM docker.io/python:3.11-slim-bookworm as main-app
 
 LABEL org.opencontainers.image.authors="paperless-ngx team <hello@paperless-ngx.com>"
 LABEL org.opencontainers.image.documentation="https://docs.paperless-ngx.com/"
@@ -58,30 +46,22 @@ LABEL org.opencontainers.image.url="https://github.com/paperless-ngx/paperless-n
 LABEL org.opencontainers.image.licenses="GPL-3.0-only"
 
 ARG DEBIAN_FRONTEND=noninteractive
-# Buildx provided
-ARG TARGETARCH
-ARG TARGETVARIANT
-
-# Workflow provided
-ARG QPDF_VERSION
+
+# Buildx provided, must be defined to use though
+ARG TARGETARCH
+
+# Can be workflow provided, defaults set for manual building
+ARG JBIG2ENC_VERSION=0.29
+ARG QPDF_VERSION=11.6.3
+ARG GS_VERSION=10.02.0
 
 #
 # Begin installation and configuration
 # Order the steps below from least often changed to most
 #
 
-# copy jbig2enc
-# Basically will never change again
-COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/.libs/libjbig2enc* /usr/local/lib/
-COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/jbig2 /usr/local/bin/
-COPY --from=jbig2enc-builder /usr/src/jbig2enc/src/*.h /usr/local/include/
-
 # Packages need for running
 ARG RUNTIME_PACKAGES="\
-  # Python
-  python3 \
-  python3-pip \
-  python3-setuptools \
   # General utils
   curl \
   # Docker specific
@@ -95,23 +75,11 @@ ARG RUNTIME_PACKAGES="\
   gnupg \
   icc-profiles-free \
   imagemagick \
-  # Image processing
-  liblept5 \
-  liblcms2-2 \
-  libtiff5 \
-  libfreetype6 \
-  libwebp6 \
-  libopenjp2-7 \
-  libimagequant0 \
-  libraqm0 \
-  libjpeg62-turbo \
   # PostgreSQL
   libpq5 \
   postgresql-client \
   # MySQL / MariaDB
   mariadb-client \
-  # For Numpy
-  libatlas3-base \
   # OCRmyPDF dependencies
   tesseract-ocr \
   tesseract-ocr-eng \
@@ -121,11 +89,12 @@ ARG RUNTIME_PACKAGES="\
   tesseract-ocr-spa \
   unpaper \
   pngquant \
-  # pikepdf / qpdf
   jbig2dec \
+  # lxml
   libxml2 \
   libxslt1.1 \
-  libgnutls30 \
+  # itself
+  qpdf \
   # Mime type detection
   file \
   libmagic1 \
@@ -133,9 +102,7 @@ ARG RUNTIME_PACKAGES="\
   zlib1g \
   # Barcode splitter
   libzbar0 \
-  poppler-utils \
-  # RapidFuzz on armv7
-  libatomic1"
+  poppler-utils"
 
 # Install basic runtime packages.
 # These change very infrequently
@@ -143,9 +110,39 @@ RUN set -eux \
   echo "Installing system packages" \
   && apt-get update \
   && apt-get install --yes --quiet --no-install-recommends ${RUNTIME_PACKAGES} \
-  && rm -rf /var/lib/apt/lists/* \
+  && echo "Installing pre-built updates" \
+  && echo "Installing qpdf ${QPDF_VERSION}" \
+  && curl --fail --silent --show-error --location \
+    --output libqpdf29_${QPDF_VERSION}-1_${TARGETARCH}.deb \
+    https://github.com/paperless-ngx/builder/releases/download/qpdf-${QPDF_VERSION}/libqpdf29_${QPDF_VERSION}-1_${TARGETARCH}.deb \
+  && curl --fail --silent --show-error --location \
+    --output qpdf_${QPDF_VERSION}-1_${TARGETARCH}.deb \
+    https://github.com/paperless-ngx/builder/releases/download/qpdf-${QPDF_VERSION}/qpdf_${QPDF_VERSION}-1_${TARGETARCH}.deb \
+  && dpkg --install ./libqpdf29_${QPDF_VERSION}-1_${TARGETARCH}.deb \
+  && dpkg --install ./qpdf_${QPDF_VERSION}-1_${TARGETARCH}.deb \
+  && echo "Installing Ghostscript ${GS_VERSION}" \
+  && curl --fail --silent --show-error --location \
+    --output libgs10_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
+    https://github.com/paperless-ngx/builder/releases/download/ghostscript-${GS_VERSION}/libgs10_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
+  && curl --fail --silent --show-error --location \
+    --output ghostscript_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
+    https://github.com/paperless-ngx/builder/releases/download/ghostscript-${GS_VERSION}/ghostscript_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
+  && curl --fail --silent --show-error --location \
+    --output libgs10-common_${GS_VERSION}.dfsg-2_all.deb \
+    https://github.com/paperless-ngx/builder/releases/download/ghostscript-${GS_VERSION}/libgs10-common_${GS_VERSION}.dfsg-2_all.deb \
+  && dpkg --install ./libgs10-common_${GS_VERSION}.dfsg-2_all.deb \
+  && dpkg --install ./libgs10_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
+  && dpkg --install ./ghostscript_${GS_VERSION}.dfsg-2_${TARGETARCH}.deb \
+  && echo "Installing jbig2enc" \
+  && curl --fail --silent --show-error --location \
+    --output jbig2enc_${JBIG2ENC_VERSION}-1_${TARGETARCH}.deb \
+    https://github.com/paperless-ngx/builder/releases/download/jbig2enc-${JBIG2ENC_VERSION}/jbig2enc_${JBIG2ENC_VERSION}-1_${TARGETARCH}.deb \
+  && dpkg --install ./jbig2enc_${JBIG2ENC_VERSION}-1_${TARGETARCH}.deb \
+  && echo "Cleaning up image layer" \
+  && rm --force --verbose *.deb \
+  && rm --recursive --force --verbose /var/lib/apt/lists/* \
   && echo "Installing supervisor" \
-  && python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor==4.2.4
+  && python3 -m pip install --default-timeout=1000 --upgrade --no-cache-dir supervisor==4.2.5
 
 # Copy gunicorn config
 # Changes very infrequently
@@ -154,7 +151,6 @@ WORKDIR /usr/src/paperless/
 COPY gunicorn.conf.py .
 
 # setup docker-specific things
-# Use mounts to avoid copying installer files into the image
 # These change sometimes, but rarely
 WORKDIR /usr/src/paperless/src/docker/
@@ -195,23 +191,6 @@ RUN set -eux \
   && chmod +x install_management_commands.sh \
   && ./install_management_commands.sh
 
-# Install the built packages from the installer library images
-# Use mounts to avoid copying installer files into the image
-# These change sometimes
-RUN --mount=type=bind,from=qpdf-builder,target=/qpdf \
-  --mount=type=bind,from=psycopg2-builder,target=/psycopg2 \
-  --mount=type=bind,from=pikepdf-builder,target=/pikepdf \
-  set -eux \
-  && echo "Installing qpdf" \
-  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/libqpdf29_*.deb \
-  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/qpdf_*.deb \
-  && echo "Installing pikepdf and dependencies" \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/*.whl \
-  && python3 -m pip list \
-  && echo "Installing psycopg2" \
-  && python3 -m pip install --no-cache-dir /psycopg2/usr/src/wheels/psycopg2*.whl \
-  && python3 -m pip list
-
 WORKDIR /usr/src/paperless/src/
 
 # Python dependencies
@@ -223,43 +202,61 @@ COPY --from=pipenv-base /usr/src/pipenv/requirements.txt ./
 ARG BUILD_PACKAGES="\
   build-essential \
   git \
+  # https://www.psycopg.org/docs/install.html#prerequisites
+  libpq-dev \
+  # https://github.com/PyMySQL/mysqlclient#linux
   default-libmysqlclient-dev \
-  python3-dev"
+  pkg-config"
 
-RUN set -eux \
+# hadolint ignore=DL3042
+RUN --mount=type=cache,target=/root/.cache/pip/,id=pip-cache \
+  set -eux \
   && echo "Installing build system packages" \
   && apt-get update \
   && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
   && python3 -m pip install --no-cache-dir --upgrade wheel \
   && echo "Installing Python requirements" \
-  && python3 -m pip install --default-timeout=1000 --no-cache-dir --requirement requirements.txt \
+  && python3 -m pip install --default-timeout=1000 --requirement requirements.txt \
+  && echo "Patching whitenoise for compression speedup" \
+  && curl --fail --silent --show-error --location --output 484.patch https://github.com/evansd/whitenoise/pull/484.patch \
+  && patch -d /usr/local/lib/python3.11/site-packages --verbose -p2 < 484.patch \
+  && rm 484.patch \
   && echo "Installing NLTK data" \
-  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/local/share/nltk_data" snowball_data \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" snowball_data \
|
||||||
|
|
**Dockerfile**

````diff
 # Copy gunicorn config
 # Changes very infrequently
@@ -154,7 +151,6 @@ WORKDIR /usr/src/paperless/
 COPY gunicorn.conf.py .

 # setup docker-specific things
-# Use mounts to avoid copying installer files into the image
 # These change sometimes, but rarely
 WORKDIR /usr/src/paperless/src/docker/

@@ -195,23 +191,6 @@ RUN set -eux \
   && chmod +x install_management_commands.sh \
   && ./install_management_commands.sh

-# Install the built packages from the installer library images
-# Use mounts to avoid copying installer files into the image
-# These change sometimes
-RUN --mount=type=bind,from=qpdf-builder,target=/qpdf \
-  --mount=type=bind,from=psycopg2-builder,target=/psycopg2 \
-  --mount=type=bind,from=pikepdf-builder,target=/pikepdf \
-  set -eux \
-  && echo "Installing qpdf" \
-  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/libqpdf29_*.deb \
-  && apt-get install --yes --no-install-recommends /qpdf/usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/qpdf_*.deb \
-  && echo "Installing pikepdf and dependencies" \
-  && python3 -m pip install --no-cache-dir /pikepdf/usr/src/wheels/*.whl \
-  && python3 -m pip list \
-  && echo "Installing psycopg2" \
-  && python3 -m pip install --no-cache-dir /psycopg2/usr/src/wheels/psycopg2*.whl \
-  && python3 -m pip list

 WORKDIR /usr/src/paperless/src/

 # Python dependencies
@@ -223,43 +202,61 @@ COPY --from=pipenv-base /usr/src/pipenv/requirements.txt ./
 ARG BUILD_PACKAGES="\
   build-essential \
   git \
+  # https://www.psycopg.org/docs/install.html#prerequisites
+  libpq-dev \
+  # https://github.com/PyMySQL/mysqlclient#linux
   default-libmysqlclient-dev \
-  python3-dev"
+  pkg-config"

-RUN set -eux \
+# hadolint ignore=DL3042
+RUN --mount=type=cache,target=/root/.cache/pip/,id=pip-cache \
+  set -eux \
   && echo "Installing build system packages" \
   && apt-get update \
   && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
   && python3 -m pip install --no-cache-dir --upgrade wheel \
   && echo "Installing Python requirements" \
-  && python3 -m pip install --default-timeout=1000 --no-cache-dir --requirement requirements.txt \
+  && python3 -m pip install --default-timeout=1000 --requirement requirements.txt \
+  && echo "Patching whitenoise for compression speedup" \
+  && curl --fail --silent --show-error --location --output 484.patch https://github.com/evansd/whitenoise/pull/484.patch \
+  && patch -d /usr/local/lib/python3.11/site-packages --verbose -p2 < 484.patch \
+  && rm 484.patch \
   && echo "Installing NLTK data" \
-  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/local/share/nltk_data" snowball_data \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" snowball_data \
-  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/local/share/nltk_data" stopwords \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" stopwords \
-  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/local/share/nltk_data" punkt \
+  && python3 -W ignore::RuntimeWarning -m nltk.downloader -d "/usr/share/nltk_data" punkt \
   && echo "Cleaning up image" \
-  && apt-get -y purge ${BUILD_PACKAGES} \
+  && apt-get --yes purge ${BUILD_PACKAGES} \
-  && apt-get -y autoremove --purge \
+  && apt-get --yes autoremove --purge \
   && apt-get clean --yes \
-  && rm -rf /var/lib/apt/lists/* \
+  && rm --recursive --force --verbose /var/lib/apt/lists/* \
-  && rm -rf /tmp/* \
+  && rm --recursive --force --verbose /tmp/* \
-  && rm -rf /var/tmp/* \
+  && rm --recursive --force --verbose /var/tmp/* \
-  && rm -rf /var/cache/apt/archives/* \
+  && rm --recursive --force --verbose /var/cache/apt/archives/* \
-  && truncate -s 0 /var/log/*log
+  && truncate --size 0 /var/log/*log

 # copy backend
-COPY ./src ./
+COPY --chown=1000:1000 ./src ./

 # copy frontend
-COPY --from=compile-frontend /src/src/documents/static/frontend/ ./documents/static/frontend/
+COPY --from=compile-frontend --chown=1000:1000 /src/src/documents/static/frontend/ ./documents/static/frontend/

 # add users, setup scripts
+# Mount the compiled frontend to expected location
 RUN set -eux \
-  && addgroup --gid 1000 paperless \
-  && useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
-  && chown -R paperless:paperless ../ \
-  && gosu paperless python3 manage.py collectstatic --clear --no-input \
-  && gosu paperless python3 manage.py compilemessages
+  && echo "Setting up user/group" \
+  && addgroup --gid 1000 paperless \
+  && useradd --uid 1000 --gid paperless --home-dir /usr/src/paperless paperless \
+  && echo "Creating volume directories" \
+  && mkdir --parents --verbose /usr/src/paperless/data \
+  && mkdir --parents --verbose /usr/src/paperless/media \
+  && mkdir --parents --verbose /usr/src/paperless/consume \
+  && mkdir --parents --verbose /usr/src/paperless/export \
+  && echo "Adjusting all permissions" \
+  && chown --from root:root --changes --recursive paperless:paperless /usr/src/paperless \
+  && echo "Collecting static files" \
+  && gosu paperless python3 manage.py collectstatic --clear --no-input --link \
+  && gosu paperless python3 manage.py compilemessages

 VOLUME ["/usr/src/paperless/data", \
   "/usr/src/paperless/media", \
````
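The cleanup step above swaps short flags for their GNU long-option spellings (`rm -rf` becomes `rm --recursive --force`, `truncate -s 0` becomes `truncate --size 0`) without changing behavior. A minimal sketch of the same idioms, using throwaway `/tmp/demo` paths as stand-ins for the image's cache and log locations:

```shell
# Stand-ins for the image's cache directory and log file
mkdir --parents /tmp/demo/cache
echo "old log data" > /tmp/demo/app.log

rm --recursive --force /tmp/demo/cache   # long-option spelling of rm -rf
truncate --size 0 /tmp/demo/app.log      # long-option spelling of truncate -s 0

wc --bytes < /tmp/demo/app.log           # the log file still exists, now empty
```

Truncating rather than deleting the logs keeps the paths present for anything that holds them open, which is why the Dockerfile zeroes `/var/log/*log` instead of removing it.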
**Pipfile** (125 lines changed)

````diff
@@ -3,88 +3,93 @@ url = "https://pypi.python.org/simple"
 verify_ssl = true
 name = "pypi"

-[[source]]
-url = "https://www.piwheels.org/simple"
-verify_ssl = true
-name = "piwheels"

 [packages]
 dateparser = "~=1.1"
-django = "~=4.1"
+# WARNING: django does not use semver.
+# Only patch versions are guaranteed to not introduce breaking changes.
+django = "~=4.2.7"
+django-auditlog = "*"
+django-celery-results = "*"
+django-compression-middleware = "*"
 django-cors-headers = "*"
 django-extensions = "*"
-django-filter = "~=22.1"
+django-filter = "~=23.3"
+django-guardian = "*"
+django-multiselectfield = "*"
 djangorestframework = "~=3.14"
+djangorestframework-guardian = "*"
+drf-writable-nested = "*"
+bleach = "*"
+celery = {extras = ["redis"], version = "*"}
+channels = "~=4.0"
+channels-redis = "*"
+concurrent-log-handler = "*"
 filelock = "*"
+flower = "*"
+gotenberg-client = "*"
 gunicorn = "*"
 imap-tools = "*"
+inotifyrecursive = "~=0.3"
 langdetect = "*"
+mysqlclient = "*"
+nltk = "*"
+ocrmypdf = "~=15.0"
 pathvalidate = "*"
-pillow = "~=9.3"
+pdf2image = "*"
-pikepdf = "*"
-python-gnupg = "*"
-python-dotenv = "*"
-python-dateutil = "*"
-python-magic = "*"
 psycopg2 = "*"
+python-dateutil = "*"
+python-dotenv = "*"
+python-gnupg = "*"
+python-ipware = "*"
+python-magic = "*"
+pyzbar = "*"
 rapidfuzz = "*"
 redis = {extras = ["hiredis"], version = "*"}
-scikit-learn = "~=1.1"
+scikit-learn = "~=1.3"
-numpy = "*"
-whitenoise = "~=6.2"
-watchdog = "~=2.1"
-whoosh="~=2.7"
-inotifyrecursive = "~=0.3"
-ocrmypdf = "~=14.0"
-tqdm = "*"
-tika = "*"
-# TODO: This will sadly also install daphne+dependencies,
-# which an ASGI server we don't need. Adds about 15MB image size.
-channels = "~=3.0"
-uvicorn = {extras = ["standard"], version = "*"}
-concurrent-log-handler = "*"
-"pdfminer.six" = "*"
-"backports.zoneinfo" = {version = "*", markers = "python_version < '3.9'"}
-"importlib-resources" = {version = "*", markers = "python_version < '3.9'"}
-zipp = {version = "*", markers = "python_version < '3.9'"}
-pyzbar = "*"
-mysqlclient = "*"
-celery = {extras = ["redis"], version = "*"}
-django-celery-results = "*"
 setproctitle = "*"
-nltk = "*"
+tika-client = "*"
-pdf2image = "*"
+tqdm = "*"
-flower = "*"
+uvicorn = {extras = ["standard"], version = "*"}
-bleach = "*"
+watchdog = "~=3.0"
+whitenoise = "~=6.6"
-#
+whoosh="~=2.7"
-# Packages locked due to issues (try to check if these are fixed in a release every so often)
+zxing-cpp = {version = "*", platform_machine = "== 'x86_64'"}
-#
-# Pin this until piwheels is building 1.9 (see https://www.piwheels.org/project/scipy/)
-scipy = "==1.8.1"
-# Newer versions aren't builting yet (see https://www.piwheels.org/project/cryptography/)
-cryptography = "==38.0.1"
-# Locked version until https://github.com/django/channels_redis/issues/332
-# is resolved
-channels-redis = "==3.4.1"

 [dev-packages]
-coveralls = "*"
+# Linting
+black = "*"
+pre-commit = "*"
+ruff = "*"
+# Testing
 factory-boy = "*"
-pycodestyle = "*"
 pytest = "*"
 pytest-cov = "*"
 pytest-django = "*"
+pytest-httpx = "*"
 pytest-env = "*"
 pytest-sugar = "*"
 pytest-xdist = "*"
-tox = "*"
+pytest-rerunfailures = "*"
-black = "*"
-pre-commit = "*"
-sphinx-autobuild = "*"
-myst-parser = "*"
 imagehash = "*"
+daphne = "*"
+# Documentation
 mkdocs-material = "*"
+mkdocs-glightbox = "*"

+[typing-dev]
+mypy = "*"
+types-Pillow = "*"
+django-filter-stubs = "*"
+types-python-dateutil = "*"
+djangorestframework-stubs = {extras= ["compatible-mypy"], version="*"}
+celery-types = "*"
+django-stubs = {extras= ["compatible-mypy"], version="*"}
+types-dateparser = "*"
+types-bleach = "*"
+types-redis = "*"
+types-tqdm = "*"
+types-Markdown = "*"
+types-Pygments = "*"
+types-colorama = "*"
+types-psycopg2 = "*"
+types-setuptools = "*"
````
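The new comment on the `django` entry is the point of the `~=4.2.7` pin: a compatible-release specifier with three components holds `major.minor` fixed and allows only the patch level to move up. A rough shell sketch of that rule (a simplified check for illustration, not how pipenv actually resolves versions):

```shell
# "~=4.2.7" means >=4.2.7 and ==4.2.*: patch upgrades only
is_compatible() {
  local v=$1
  # major.minor must match the pin exactly; patch must be >= the pinned patch
  [ "${v%.*}" = "4.2" ] && [ "${v##*.}" -ge 7 ]
}

is_compatible 4.2.8 && echo "4.2.8 allowed (patch bump)"
is_compatible 4.3.0 || echo "4.3.0 rejected (minor bump)"
```

This is why the looser `~=4.1` pin was tightened: with only two components, `~=` would have admitted any `4.x` minor release.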
**Pipfile.lock** (5,318 lines changed; generated file, diff collapsed)

**README.md** (67 lines changed)

````diff
@@ -1,13 +1,16 @@
 [](https://github.com/paperless-ngx/paperless-ngx/actions)
 [](https://crowdin.com/project/paperless-ngx)
 [](https://docs.paperless-ngx.com)
-[](https://coveralls.io/github/paperless-ngx/paperless-ngx?branch=master)
+[](https://codecov.io/gh/paperless-ngx/paperless-ngx)
 [](https://matrix.to/#/%23paperlessngx%3Amatrix.org)
 [](https://demo.paperless-ngx.com)

 <p align="center">
-  <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png#gh-light-mode-only" width="50%" />
-  <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/White%20logo%20-%20no%20background.png#gh-dark-mode-only" width="50%" />
+  <picture>
+    <source media="(prefers-color-scheme: dark)" srcset="https://github.com/paperless-ngx/paperless-ngx/blob/main/resources/logo/web/png/White%20logo%20-%20no%20background.png" width="50%">
+    <source media="(prefers-color-scheme: light)" srcset="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png" width="50%">
+    <img src="https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/png/Black%20logo%20-%20no%20background.png" width="50%">
+  </picture>
 </p>

 <!-- omit in toc -->
@@ -16,8 +19,7 @@

 Paperless-ngx is a document management system that transforms your physical documents into a searchable online archive so you can keep, well, _less paper_.

-Paperless-ngx forked from [paperless-ng](https://github.com/jonaswinkler/paperless-ng) to continue the great work and distribute responsibility of supporting and advancing the project among a team of people. [Consider joining us!](#community-support) Discussion of this transition can be found in issues
-[#1599](https://github.com/jonaswinkler/paperless-ng/issues/1599) and [#1632](https://github.com/jonaswinkler/paperless-ng/issues/1632).
+Paperless-ngx is the official successor to the original [Paperless](https://github.com/the-paperless-project/paperless) & [Paperless-ng](https://github.com/jonaswinkler/paperless-ng) projects and is designed to distribute the responsibility of advancing and supporting the project among a team of people. [Consider joining us!](#community-support)

 A demo is available at [demo.paperless-ngx.com](https://demo.paperless-ngx.com) using login `demo` / `demo`. _Note: demo content is reset frequently and confidential information should not be uploaded._

@@ -33,37 +35,19 @@

 # Features

-
-
+<picture>
+  <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docs/assets/screenshots/documents-smallcards-dark.png">
+  <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docs/assets/screenshots/documents-smallcards.png">
+  <img src="https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/docs/assets/screenshots/documents-smallcards.png">
+</picture>

-- Organize and index your scanned documents with tags, correspondents, types, and more.
+A full list of [features](https://docs.paperless-ngx.com/#features) and [screenshots](https://docs.paperless-ngx.com/#screenshots) are available in the [documentation](https://docs.paperless-ngx.com/).
-- Performs OCR on your documents, adds selectable text to image only documents and adds tags, correspondents and document types to your documents.
-- Supports PDF documents, images, plain text files, and Office documents (Word, Excel, Powerpoint, and LibreOffice equivalents).
-  - Office document support is optional and provided by Apache Tika (see [configuration](https://docs.paperless-ngx.com/configuration/#tika))
-- Paperless stores your documents plain on disk. Filenames and folders are managed by paperless and their format can be configured freely.
-- Single page application front end.
-  - Includes a dashboard that shows basic statistics and has document upload.
-  - Filtering by tags, correspondents, types, and more.
-  - Customizable views can be saved and displayed on the dashboard.
-- Full text search helps you find what you need.
-  - Auto completion suggests relevant words from your documents.
-  - Results are sorted by relevance to your search query.
-  - Highlighting shows you which parts of the document matched the query.
-  - Searching for similar documents ("More like this")
-- Email processing: Paperless adds documents from your email accounts.
-  - Configure multiple accounts and filters for each account.
-  - When adding documents from mail, paperless can move these mail to a new folder, mark them as read, flag them as important or delete them.
-- Machine learning powered document matching.
-  - Paperless-ngx learns from your documents and will be able to automatically assign tags, correspondents and types to documents once you've stored a few documents in paperless.
-- Optimized for multi core systems: Paperless-ngx consumes multiple documents in parallel.
-- The integrated sanity checker makes sure that your document archive is in good health.
-- [More screenshots are available in the documentation](https://docs.paperless-ngx.com/#screenshots).

 # Getting started

-The easiest way to deploy paperless is docker-compose. The files in the [`/docker/compose` directory](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose) are configured to pull the image from Github Packages.
+The easiest way to deploy paperless is `docker compose`. The files in the [`/docker/compose` directory](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose) are configured to pull the image from GitHub Packages.

-If you'd like to jump right in, you can configure a docker-compose environment with our install script:
+If you'd like to jump right in, you can configure a `docker compose` environment with our install script:

 ```bash
 bash -c "$(curl -L https://raw.githubusercontent.com/paperless-ngx/paperless-ngx/main/install-paperless-ngx.sh)"
@@ -85,7 +69,7 @@ If you feel like contributing to the project, please do! Bug fixes, enhancements

 ## Community Support

-People interested in continuing the work on paperless-ngx are encouraged to reach out here on github and in the [Matrix Room](https://matrix.to/#/#paperless:adnidor.de). If you would like to contribute to the project on an ongoing basis there are multiple [teams](https://github.com/orgs/paperless-ngx/people) (frontend, ci/cd, etc) that could use your help so please reach out!
+People interested in continuing the work on paperless-ngx are encouraged to reach out here on github and in the [Matrix Room](https://matrix.to/#/#paperless:matrix.org). If you would like to contribute to the project on an ongoing basis there are multiple [teams](https://github.com/orgs/paperless-ngx/people) (frontend, ci/cd, etc) that could use your help so please reach out!

 ## Translation

@@ -101,22 +85,9 @@ For bugs please [open an issue](https://github.com/paperless-ngx/paperless-ngx/i

 # Affiliated Projects

-Paperless has been around a while now, and people are starting to build stuff on top of it. If you're one of those people, we can add your project to this list:
+Please see [the wiki](https://github.com/paperless-ngx/paperless-ngx/wiki/Affiliated-Projects) for a user-maintained list of affiliated projects and software that is compatible with Paperless-ngx.

-- [Paperless App](https://github.com/bauerj/paperless_app): An Android/iOS app for Paperless-ngx. Also works with the original Paperless and Paperless-ng.
-- [Paperless Share](https://github.com/qcasey/paperless_share). Share any files from your Android application with paperless. Very simple, but works with all of the mobile scanning apps out there that allow you to share scanned documents.
-- [Scan to Paperless](https://github.com/sbrunner/scan-to-paperless): Scan and prepare (crop, deskew, OCR, ...) your documents for Paperless.
-- [Paperless Mobile](https://github.com/astubenbord/paperless-mobile): A modern, feature rich mobile application for Paperless.

-These projects also exist, but their status and compatibility with paperless-ngx is unknown.

-- [paperless-cli](https://github.com/stgarf/paperless-cli): A golang command line binary to interact with a Paperless instance.

-This project also exists, but needs updates to be compatible with paperless-ngx.

-- [Paperless Desktop](https://github.com/thomasbrueggemann/paperless-desktop): A desktop UI for your Paperless installation. Runs on Mac, Linux, and Windows.
-  Known issues on Mac: (Could not load reminders and documents)

 # Important Note

-Document scanners are typically used to scan sensitive documents. Things like your social insurance number, tax records, invoices, etc. Everything is stored in the clear without encryption. This means that Paperless should never be run on an untrusted host. Instead, I recommend that if you do want to use it, run it locally on a server in your own home.
+> Document scanners are typically used to scan sensitive documents like your social insurance number, tax records, invoices, etc. **Paperless-ngx should never be run on an untrusted host** because information is stored in clear text without encryption. No guarantees are made regarding security (but we do try!) and you use the app at your own risk.
+> **The safest way to run Paperless-ngx is on a local server in your own home with backups in place**.
````
**build-docker-image.sh** (deleted)

````diff
@@ -1,81 +0,0 @@
-#!/usr/bin/env bash
-
-# Helper script for building the Docker image locally.
-# Parses and provides the nessecary versions of other images to Docker
-# before passing in the rest of script args.
-
-# First Argument: The Dockerfile to build
-# Other Arguments: Additional arguments to docker build
-
-# Example Usage:
-#   ./build-docker-image.sh Dockerfile -t paperless-ngx:my-awesome-feature
-
-set -eu
-
-if ! command -v jq &> /dev/null ; then
-  echo "jq required"
-  exit 1
-elif [ ! -f "$1" ]; then
-  echo "$1 is not a file, please provide the Dockerfile"
-  exit 1
-fi
-
-# Get the branch name (used for caching)
-branch_name=$(git rev-parse --abbrev-ref HEAD)
-
-# Parse eithe Pipfile.lock or the .build-config.json
-jbig2enc_version=$(jq ".jbig2enc.version" .build-config.json | sed 's/"//g')
-qpdf_version=$(jq ".qpdf.version" .build-config.json | sed 's/"//g')
-psycopg2_version=$(jq ".default.psycopg2.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-pikepdf_version=$(jq ".default.pikepdf.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-pillow_version=$(jq ".default.pillow.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-lxml_version=$(jq ".default.lxml.version" Pipfile.lock | sed 's/=//g' | sed 's/"//g')
-
-base_filename="$(basename -- "${1}")"
-build_args_str=""
-cache_from_str=""
-
-case "${base_filename}" in
-
-  *.jbig2enc)
-    build_args_str="--build-arg JBIG2ENC_VERSION=${jbig2enc_version}"
-    cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/jbig2enc:${jbig2enc_version}"
-    ;;
-
-  *.psycopg2)
-    build_args_str="--build-arg PSYCOPG2_VERSION=${psycopg2_version}"
-    cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/psycopg2:${psycopg2_version}"
-    ;;
-
-  *.qpdf)
-    build_args_str="--build-arg QPDF_VERSION=${qpdf_version}"
-    cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/qpdf:${qpdf_version}"
-    ;;
-
-  *.pikepdf)
-    build_args_str="--build-arg QPDF_VERSION=${qpdf_version} --build-arg PIKEPDF_VERSION=${pikepdf_version} --build-arg PILLOW_VERSION=${pillow_version} --build-arg LXML_VERSION=${lxml_version}"
-    cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/pikepdf:${pikepdf_version}"
-    ;;
-
-  Dockerfile)
-    build_args_str="--build-arg QPDF_VERSION=${qpdf_version} --build-arg PIKEPDF_VERSION=${pikepdf_version} --build-arg PSYCOPG2_VERSION=${psycopg2_version} --build-arg JBIG2ENC_VERSION=${jbig2enc_version}"
-    cache_from_str="--cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:${branch_name} --cache-from ghcr.io/paperless-ngx/paperless-ngx/builder/cache/app:dev"
-    ;;
-
-  *)
-    echo "Unable to match ${base_filename}"
-    exit 1
-    ;;
-esac
-
-read -r -a build_args_arr <<< "${build_args_str}"
-read -r -a cache_from_arr <<< "${cache_from_str}"
-
-set -eux
-
-docker buildx build --file "${1}" \
-  --progress=plain \
-  --output=type=docker \
-  "${cache_from_arr[@]}" \
-  "${build_args_arr[@]}" \
-  "${@:2}" .
````
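One pattern worth noting from the deleted helper: it accumulated flags in a plain string and split it into an array with `read -r -a` before handing it to `docker buildx build`, so each flag and value arrived as its own argument instead of one quoted blob. A standalone sketch (the version numbers here are illustrative, not real pins):

```shell
# Flags collected as one whitespace-separated string, as the script did
build_args_str="--build-arg QPDF_VERSION=1.2.3 --build-arg JBIG2ENC_VERSION=4.5.6"

# Word-split into a bash array; "${build_args_arr[@]}" then expands to
# four separate arguments, which is what docker expects
read -r -a build_args_arr <<< "${build_args_str}"

echo "${#build_args_arr[@]}"   # 4
echo "${build_args_arr[1]}"    # QPDF_VERSION=1.2.3
```

Passing `"${build_args_str}"` unsplit would instead hand docker a single argument containing spaces, which fails; the array expansion avoids that while still keeping each value intact.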
The chart's ignore-pattern file was also deleted:

````diff
@@ -1,26 +0,0 @@
-# Patterns to ignore when building packages.
-# This supports shell glob matching, relative path matching, and
-# negation (prefixed with !). Only one pattern per line.
-.DS_Store
-# Common VCS dirs
-.git/
-.gitignore
-.bzr/
-.bzrignore
-.hg/
-.hgignore
-.svn/
-# Common backup files
-*.swp
-*.bak
-*.tmp
-*~
-# Various IDEs
-.project
-.idea/
-*.tmproj
-.vscode/
-# OWNERS file for Kubernetes
-OWNERS
-# helm-docs templates
-*.gotmpl
````
So was the Helm chart definition:

````diff
@@ -1,35 +0,0 @@
----
-apiVersion: v2
-appVersion: "1.9.2"
-description: Paperless-ngx - Index and archive all of your scanned paper documents
-name: paperless
-version: 10.0.1
-kubeVersion: ">=1.16.0-0"
-keywords:
-  - paperless
-  - paperless-ngx
-  - dms
-  - document
-home: https://github.com/paperless-ngx/paperless-ngx/tree/main/charts/paperless-ngx
-icon: https://github.com/paperless-ngx/paperless-ngx/raw/main/resources/logo/web/svg/square.svg
-sources:
-  - https://github.com/paperless-ngx/paperless-ngx
-maintainers:
-  - name: Paperless-ngx maintainers
-dependencies:
-  - name: common
-    repository: https://library-charts.k8s-at-home.com
-    version: 4.5.2
-  - name: postgresql
-    version: 11.6.12
-    repository: https://charts.bitnami.com/bitnami
-    condition: postgresql.enabled
-  - name: redis
-    version: 16.13.1
-    repository: https://charts.bitnami.com/bitnami
-    condition: redis.enabled
-deprecated: false
-annotations:
-  artifacthub.io/changes: |
-    - kind: changed
-      description: Moved to Paperless-ngx ownership
````
The final deleted file is the standard Apache License, Version 2.0 text (201 lines; the capture ends partway through section 4):

````diff
@@ -1,201 +0,0 @@
-                                 Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
- [... remainder of the standard Apache License 2.0 text, deleted with the chart ...]
````
|
|
||||||
attribution notices from the Source form of the Work,
|
|
||||||
excluding those notices that do not pertain to any part of
|
|
||||||
the Derivative Works; and
|
|
||||||
|
|
||||||
(d) If the Work includes a "NOTICE" text file as part of its
|
|
||||||
distribution, then any Derivative Works that You distribute must
|
|
||||||
include a readable copy of the attribution notices contained
|
|
||||||
within such NOTICE file, excluding those notices that do not
|
|
||||||
pertain to any part of the Derivative Works, in at least one
|
|
||||||
of the following places: within a NOTICE text file distributed
|
|
||||||
as part of the Derivative Works; within the Source form or
|
|
||||||
documentation, if provided along with the Derivative Works; or,
|
|
||||||
within a display generated by the Derivative Works, if and
|
|
||||||
wherever such third-party notices normally appear. The contents
|
|
||||||
of the NOTICE file are for informational purposes only and
|
|
||||||
do not modify the License. You may add Your own attribution
|
|
||||||
notices within Derivative Works that You distribute, alongside
|
|
||||||
or as an addendum to the NOTICE text from the Work, provided
|
|
||||||
that such additional attribution notices cannot be construed
|
|
||||||
as modifying the License.
|
|
||||||
|
|
||||||
You may add Your own copyright statement to Your modifications and
|
|
||||||
may provide additional or different license terms and conditions
|
|
||||||
for use, reproduction, or distribution of Your modifications, or
|
|
||||||
for any such Derivative Works as a whole, provided Your use,
|
|
||||||
reproduction, and distribution of the Work otherwise complies with
|
|
||||||
the conditions stated in this License.
|
|
||||||
|
|
||||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
|
||||||
any Contribution intentionally submitted for inclusion in the Work
|
|
||||||
by You to the Licensor shall be under the terms and conditions of
|
|
||||||
this License, without any additional terms or conditions.
|
|
||||||
Notwithstanding the above, nothing herein shall supersede or modify
|
|
||||||
the terms of any separate license agreement you may have executed
|
|
||||||
with Licensor regarding such Contributions.
|
|
||||||
|
|
||||||
6. Trademarks. This License does not grant permission to use the trade
|
|
||||||
names, trademarks, service marks, or product names of the Licensor,
|
|
||||||
except as required for reasonable and customary use in describing the
|
|
||||||
origin of the Work and reproducing the content of the NOTICE file.
|
|
||||||
|
|
||||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
|
||||||
agreed to in writing, Licensor provides the Work (and each
|
|
||||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
|
||||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
|
||||||
implied, including, without limitation, any warranties or conditions
|
|
||||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
|
||||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
|
||||||
appropriateness of using or redistributing the Work and assume any
|
|
||||||
risks associated with Your exercise of permissions under this License.
|
|
||||||
|
|
||||||
8. Limitation of Liability. In no event and under no legal theory,
|
|
||||||
whether in tort (including negligence), contract, or otherwise,
|
|
||||||
unless required by applicable law (such as deliberate and grossly
|
|
||||||
negligent acts) or agreed to in writing, shall any Contributor be
|
|
||||||
liable to You for damages, including any direct, indirect, special,
|
|
||||||
incidental, or consequential damages of any character arising as a
|
|
||||||
result of this License or out of the use or inability to use the
|
|
||||||
Work (including but not limited to damages for loss of goodwill,
|
|
||||||
work stoppage, computer failure or malfunction, or any and all
|
|
||||||
other commercial damages or losses), even if such Contributor
|
|
||||||
has been advised of the possibility of such damages.
|
|
||||||
|
|
||||||
9. Accepting Warranty or Additional Liability. While redistributing
|
|
||||||
the Work or Derivative Works thereof, You may choose to offer,
|
|
||||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
|
||||||
or other liability obligations and/or rights consistent with this
|
|
||||||
License. However, in accepting such obligations, You may act only
|
|
||||||
on Your own behalf and on Your sole responsibility, not on behalf
|
|
||||||
of any other Contributor, and only if You agree to indemnify,
|
|
||||||
defend, and hold each Contributor harmless for any liability
|
|
||||||
incurred by, or claims asserted against, such Contributor by reason
|
|
||||||
of your accepting any such warranty or additional liability.
|
|
||||||
|
|
||||||
END OF TERMS AND CONDITIONS
|
|
||||||
|
|
||||||
APPENDIX: How to apply the Apache License to your work.
|
|
||||||
|
|
||||||
To apply the Apache License to your work, attach the following
|
|
||||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
|
||||||
replaced with your own identifying information. (Don't include
|
|
||||||
the brackets!) The text should be enclosed in the appropriate
|
|
||||||
comment syntax for the file format. We also recommend that a
|
|
||||||
file or class name and description of purpose be included on the
|
|
||||||
same "printed page" as the copyright notice for easier
|
|
||||||
identification within third-party archives.
|
|
||||||
|
|
||||||
Copyright 2020 k8s@Home
|
|
||||||
|
|
||||||
Licensed under the Apache License, Version 2.0 (the "License");
|
|
||||||
you may not use this file except in compliance with the License.
|
|
||||||
You may obtain a copy of the License at
|
|
||||||
|
|
||||||
http://www.apache.org/licenses/LICENSE-2.0
|
|
||||||
|
|
||||||
Unless required by applicable law or agreed to in writing, software
|
|
||||||
distributed under the License is distributed on an "AS IS" BASIS,
|
|
||||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
|
||||||
See the License for the specific language governing permissions and
|
|
||||||
limitations under the License.
|
|
@@ -1,50 +0,0 @@
# paperless

Paperless-ngx - Index and archive all of your scanned paper documents

**Homepage:** <https://github.com/paperless-ngx/paperless-ngx/tree/main/charts/paperless-ngx>

## Maintainers

| Name | Email | Url |
| ---- | ------ | --- |
| Paperless-ngx maintainers | | |

## Source Code

* <https://github.com/paperless-ngx/paperless-ngx>

## Requirements

Kubernetes: `>=1.16.0-0`

| Repository | Name | Version |
|------------|------|---------|
| https://charts.bitnami.com/bitnami | postgresql | 11.6.12 |
| https://charts.bitnami.com/bitnami | redis | 16.13.1 |
| https://library-charts.k8s-at-home.com | common | 4.5.2 |

## Values

| Key | Type | Default | Description |
|-----|------|---------|-------------|
| env | object | See below | See the following files for additional environment variables: https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose/ https://github.com/paperless-ngx/paperless-ngx/blob/main/paperless.conf.example |
| env.COMPOSE_PROJECT_NAME | string | `"paperless"` | Project name |
| env.PAPERLESS_DBHOST | string | `nil` | Database host to use |
| env.PAPERLESS_OCR_LANGUAGE | string | `"eng"` | OCR languages to install |
| env.PAPERLESS_PORT | int | `8000` | Port to use |
| env.PAPERLESS_REDIS | string | `nil` | Redis to use |
| image.pullPolicy | string | `"IfNotPresent"` | image pull policy |
| image.repository | string | `"ghcr.io/paperless-ngx/paperless-ngx"` | image repository |
| image.tag | string | chart.appVersion | image tag |
| ingress.main | object | See values.yaml | Enable and configure ingress settings for the chart under this key. |
| persistence.consume | object | See values.yaml | Configure volume to monitor for new documents. |
| persistence.data | object | See values.yaml | Configure persistence for data. |
| persistence.export | object | See values.yaml | Configure export volume. |
| persistence.media | object | See values.yaml | Configure persistence for media. |
| postgresql | object | See values.yaml | Enable and configure postgresql database subchart under this key. For more options see [postgresql chart documentation](https://github.com/bitnami/charts/tree/master/bitnami/postgresql) |
| redis | object | See values.yaml | Enable and configure redis subchart under this key. For more options see [redis chart documentation](https://github.com/bitnami/charts/tree/master/bitnami/redis) |
| service | object | See values.yaml | Configures service settings for the chart. |
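The values documented above are overridden at install time with `--set` flags or a values file. A minimal sketch of such an install; the Helm repository URL and the release name are assumptions for illustration, not taken from this README:

```shell
# Sketch only: repo URL and release name are assumptions.
helm repo add k8s-at-home https://charts.k8s-at-home.com
helm repo update
helm install paperless k8s-at-home/paperless \
  --set env.PAPERLESS_OCR_LANGUAGE=eng \
  --set persistence.data.enabled=true \
  --set persistence.media.enabled=true
```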
@@ -1,8 +0,0 @@
{{- define "custom.custom.configuration.header" -}}
## Custom configuration
{{- end -}}

{{- define "custom.custom.configuration" -}}
{{ template "custom.custom.configuration.header" . }}
N/A
{{- end -}}
@@ -1,26 +0,0 @@
env:
  PAPERLESS_REDIS: redis://paperless-redis-headless:6379

persistence:
  data:
    enabled: true
    type: emptyDir
  media:
    enabled: true
    type: emptyDir
  consume:
    enabled: true
    type: emptyDir
  export:
    enabled: true
    type: emptyDir

redis:
  enabled: true
  architecture: standalone
  auth:
    enabled: false
  master:
    persistence:
      enabled: false
  fullnameOverride: paperless-redis
@@ -1,4 +0,0 @@
{{- include "common.notes.defaultNotes" . }}
2. Create a super user by running the command:
  export POD_NAME=$(kubectl get pods --namespace {{ .Release.Namespace }} -l "app.kubernetes.io/name={{ include "common.names.name" . }},app.kubernetes.io/instance={{ .Release.Name }}" -o jsonpath="{.items[0].metadata.name}")
  kubectl exec -it --namespace {{ .Release.Namespace }} $POD_NAME -- bash -c "python manage.py createsuperuser"
@@ -1,11 +0,0 @@
{{/* Make sure all variables are set properly */}}
{{- include "common.values.setup" . }}

{{/* Append the hardcoded settings */}}
{{- define "paperless.harcodedValues" -}}
env:
  PAPERLESS_URL: http{{ if ne (len .Values.ingress.main.tls) 0 }}s{{ end }}://{{ (first .Values.ingress.main.hosts).host }}
{{- end -}}
{{- $_ := merge .Values (include "paperless.harcodedValues" . | fromYaml) -}}

{{ include "common.all" . }}
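The `PAPERLESS_URL` template above picks the URL scheme from the ingress TLS configuration. The same decision can be sketched in Python; this is a hypothetical mirror of the template logic for illustration, not chart code:

```python
def paperless_url(hosts, tls):
    """Mirror of the Helm template: https when any ingress TLS entry exists."""
    scheme = "https" if len(tls) != 0 else "http"
    # The template uses the host of the first ingress entry
    return f"{scheme}://{hosts[0]['host']}"

print(paperless_url([{"host": "paperless.example.com"}], tls=[]))
# -> http://paperless.example.com
```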
@@ -1,107 +0,0 @@
#
# IMPORTANT NOTE
#
# This chart inherits from our common library chart. You can check the default values/options here:
# https://github.com/k8s-at-home/library-charts/tree/main/charts/stable/common/values.yaml
#

image:
  # -- image repository
  repository: ghcr.io/paperless-ngx/paperless-ngx
  # -- image pull policy
  pullPolicy: IfNotPresent
  # -- image tag
  # @default -- chart.appVersion
  tag:

# -- See the following files for additional environment variables:
# https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose/
# https://github.com/paperless-ngx/paperless-ngx/blob/main/paperless.conf.example
# @default -- See below
env:
  # -- Project name
  COMPOSE_PROJECT_NAME: paperless
  # -- Redis to use
  PAPERLESS_REDIS:
  # -- OCR languages to install
  PAPERLESS_OCR_LANGUAGE: eng
  # USERMAP_UID: 1000
  # USERMAP_GID: 1000
  # PAPERLESS_TIME_ZONE: Europe/London
  # -- Database host to use
  PAPERLESS_DBHOST:
  # -- Port to use
  PAPERLESS_PORT: 8000
  # -- Username for the root user
  # PAPERLESS_ADMIN_USER: admin
  # -- Password for the root user
  # PAPERLESS_ADMIN_PASSWORD: admin
  # PAPERLESS_URL: <set to main ingress by default>

# -- Configures service settings for the chart.
# @default -- See values.yaml
service:
  main:
    ports:
      http:
        port: 8000

ingress:
  # -- Enable and configure ingress settings for the chart under this key.
  # @default -- See values.yaml
  main:
    enabled: false

persistence:
  # -- Configure persistence for data.
  # @default -- See values.yaml
  data:
    enabled: false
    mountPath: /usr/src/paperless/data
    accessMode: ReadWriteOnce
    emptyDir:
      enabled: false
  # -- Configure persistence for media.
  # @default -- See values.yaml
  media:
    enabled: false
    mountPath: /usr/src/paperless/media
    accessMode: ReadWriteOnce
    emptyDir:
      enabled: false
  # -- Configure volume to monitor for new documents.
  # @default -- See values.yaml
  consume:
    enabled: false
    mountPath: /usr/src/paperless/consume
    accessMode: ReadWriteOnce
    emptyDir:
      enabled: false
  # -- Configure export volume.
  # @default -- See values.yaml
  export:
    enabled: false
    mountPath: /usr/src/paperless/export
    accessMode: ReadWriteOnce
    emptyDir:
      enabled: false

# -- Enable and configure postgresql database subchart under this key.
# For more options see [postgresql chart documentation](https://github.com/bitnami/charts/tree/master/bitnami/postgresql)
# @default -- See values.yaml
postgresql:
  enabled: false
  postgresqlUsername: paperless
  postgresqlPassword: paperless
  postgresqlDatabase: paperless
  persistence:
    enabled: false
    # storageClass: ""

# -- Enable and configure redis subchart under this key.
# For more options see [redis chart documentation](https://github.com/bitnami/charts/tree/master/bitnami/redis)
# @default -- See values.yaml
redis:
  enabled: false
  auth:
    enabled: false
@@ -1,4 +1,6 @@
-commit_message: '[ci skip]'
+project_id_env: CROWDIN_PROJECT_ID
+api_token_env: CROWDIN_PERSONAL_TOKEN
+preserve_hierarchy: true
 files:
   - source: /src/locale/en_US/LC_MESSAGES/django.po
     translation: /src/locale/%locale_with_underscore%/LC_MESSAGES/django.po
@@ -1,35 +0,0 @@
# This Dockerfile compiles the jbig2enc library
# Inputs:
#  - JBIG2ENC_VERSION - the Git tag to checkout and build

FROM debian:bullseye-slim as main

LABEL org.opencontainers.image.description="An intermediate image with jbig2enc built"

ARG DEBIAN_FRONTEND=noninteractive
ARG JBIG2ENC_VERSION

ARG BUILD_PACKAGES="\
  build-essential \
  automake \
  libtool \
  libleptonica-dev \
  zlib1g-dev \
  git \
  ca-certificates"

WORKDIR /usr/src/jbig2enc

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && echo "Building jbig2enc" \
  && git clone --quiet --branch $JBIG2ENC_VERSION https://github.com/agl/jbig2enc . \
  && ./autogen.sh \
  && ./configure \
  && make \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*
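The Dockerfile above is parameterised by a single build argument. A hypothetical local build invocation; the Dockerfile filename, version value, and image tag are all assumptions for illustration:

```shell
# Hypothetical invocation; filename, version, and tag are assumptions.
docker build \
  --file Dockerfile.jbig2enc \
  --build-arg JBIG2ENC_VERSION=0.29 \
  --tag jbig2enc-builder:0.29 \
  .
```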
@@ -1,102 +0,0 @@
# This Dockerfile builds the pikepdf wheel
# Inputs:
#  - REPO - Docker repository to pull qpdf from
#  - QPDF_VERSION - The image qpdf version to copy .deb files from
#  - PIKEPDF_VERSION - Version of pikepdf to build wheel for

# Default to pulling from the main repo registry when manually building
ARG REPO="paperless-ngx/paperless-ngx"

ARG QPDF_VERSION
FROM ghcr.io/${REPO}/builder/qpdf:${QPDF_VERSION} as qpdf-builder

# This does nothing, except provide a name for a copy below

FROM python:3.9-slim-bullseye as main

LABEL org.opencontainers.image.description="An intermediate image with pikepdf wheel built"

# Buildx provided
ARG TARGETARCH
ARG TARGETVARIANT

ARG DEBIAN_FRONTEND=noninteractive
# Workflow provided
ARG QPDF_VERSION
ARG PIKEPDF_VERSION
# These are not used, but will still bust the cache if one changes
# Otherwise, the main image will try to build things (and fail)
ARG PILLOW_VERSION
ARG LXML_VERSION

ARG BUILD_PACKAGES="\
  build-essential \
  python3-dev \
  python3-pip \
  # qpdf requirement - https://github.com/qpdf/qpdf#crypto-providers
  libgnutls28-dev \
  # lxml requirements - https://lxml.de/installation.html
  libxml2-dev \
  libxslt1-dev \
  # Pillow requirements - https://pillow.readthedocs.io/en/stable/installation.html#external-libraries
  # JPEG functionality
  libjpeg62-turbo-dev \
  # compressed PNG
  zlib1g-dev \
  # compressed TIFF
  libtiff-dev \
  # type related services
  libfreetype-dev \
  # color management
  liblcms2-dev \
  # WebP format
  libwebp-dev \
  # JPEG 2000
  libopenjp2-7-dev \
  # improved color quantization
  libimagequant-dev \
  # complex text layout support
  libraqm-dev"

WORKDIR /usr/src

COPY --from=qpdf-builder /usr/src/qpdf/${QPDF_VERSION}/${TARGETARCH}${TARGETVARIANT}/*.deb ./

# As this is a base image for a multi-stage final image
# the added size of the install is basically irrelevant

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && echo "Installing qpdf" \
  && dpkg --install libqpdf29_*.deb \
  && dpkg --install libqpdf-dev_*.deb \
  && echo "Installing Python tools" \
  && python3 -m pip install --no-cache-dir --upgrade \
    pip \
    wheel \
    # https://pikepdf.readthedocs.io/en/latest/installation.html#requirements
    pybind11 \
  && echo "Building pikepdf wheel ${PIKEPDF_VERSION}" \
  && mkdir wheels \
  && python3 -m pip wheel \
    # Build the package at the required version
    pikepdf==${PIKEPDF_VERSION} \
    # Look to piwheels for additional pre-built wheels
    --extra-index-url https://www.piwheels.org/simple \
    # Output the *.whl into this directory
    --wheel-dir wheels \
    # Do not use a binary package for the package being built
    --no-binary=pikepdf \
    # Do use binary packages for dependencies
    --prefer-binary \
    # Don't cache build files
    --no-cache-dir \
  && ls -ahl wheels \
  && echo "Gathering package data" \
  && dpkg-query -f '${Package;-40}${Version}\n' -W > ./wheels/pkg-list.txt \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*
@@ -1,50 +0,0 @@
# This Dockerfile builds the psycopg2 wheel
# Inputs:
#  - PSYCOPG2_VERSION - Version to build

FROM python:3.9-slim-bullseye as main

LABEL org.opencontainers.image.description="An intermediate image with psycopg2 wheel built"

ARG PSYCOPG2_VERSION
ARG DEBIAN_FRONTEND=noninteractive

ARG BUILD_PACKAGES="\
  build-essential \
  python3-dev \
  python3-pip \
  # https://www.psycopg.org/docs/install.html#prerequisites
  libpq-dev"

WORKDIR /usr/src

# As this is a base image for a multi-stage final image
# the added size of the install is basically irrelevant

RUN set -eux \
  && echo "Installing build tools" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${BUILD_PACKAGES} \
  && echo "Installing Python tools" \
  && python3 -m pip install --no-cache-dir --upgrade pip wheel \
  && echo "Building psycopg2 wheel ${PSYCOPG2_VERSION}" \
  && cd /usr/src \
  && mkdir wheels \
  && python3 -m pip wheel \
    # Build the package at the required version
    psycopg2==${PSYCOPG2_VERSION} \
    # Output the *.whl into this directory
    --wheel-dir wheels \
    # Do not use a binary package for the package being built
    --no-binary=psycopg2 \
    # Do use binary packages for dependencies
    --prefer-binary \
    # Don't cache build files
    --no-cache-dir \
  && ls -ahl wheels/ \
  && echo "Gathering package data" \
  && dpkg-query -f '${Package;-40}${Version}\n' -W > ./wheels/pkg-list.txt \
  && echo "Cleaning up image" \
  && apt-get -y purge ${BUILD_PACKAGES} \
  && apt-get -y autoremove --purge \
  && rm -rf /var/lib/apt/lists/*
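Both wheel-builder Dockerfiles record the installed packages with `dpkg-query -f '${Package;-40}${Version}\n' -W`. The `;-40` field modifier left-justifies the package name in a 40-column field so the versions line up; the equivalent formatting in Python:

```python
def pkg_list_line(package: str, version: str) -> str:
    # Equivalent of dpkg-query's ${Package;-40}${Version}: the package name
    # is left-justified and padded to 40 columns, then the version follows.
    return f"{package:<40}{version}"

print(pkg_list_line("libpq-dev", "13.9-0+deb11u1"))
```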
@@ -1,156 +0,0 @@
#
# Stage: pre-build
# Purpose:
#  - Installs common packages
#  - Sets common environment variables related to dpkg
#  - Acquires the qpdf source from bookworm
# Useful Links:
#  - https://qpdf.readthedocs.io/en/stable/installation.html#system-requirements
#  - https://wiki.debian.org/Multiarch/HOWTO
#  - https://wiki.debian.org/CrossCompiling
#

FROM debian:bullseye-slim as pre-build

ARG QPDF_VERSION

ARG COMMON_BUILD_PACKAGES="\
  cmake \
  debhelper \
  debian-keyring \
  devscripts \
  dpkg-dev \
  equivs \
  packaging-dev \
  libtool"

ENV DEB_BUILD_OPTIONS="terse nocheck nodoc parallel=2"

WORKDIR /usr/src

RUN set -eux \
  && echo "Installing common packages" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${COMMON_BUILD_PACKAGES} \
  && echo "Getting qpdf source" \
  && echo "deb-src http://deb.debian.org/debian/ bookworm main" > /etc/apt/sources.list.d/bookworm-src.list \
  && apt-get update --quiet \
  && apt-get source --yes --quiet qpdf=${QPDF_VERSION}-1/bookworm

#
# Stage: amd64-builder
# Purpose: Builds qpdf for x86_64 (native build)
#
FROM pre-build as amd64-builder

ARG AMD64_BUILD_PACKAGES="\
  build-essential \
  libjpeg62-turbo-dev:amd64 \
  libgnutls28-dev:amd64 \
  zlib1g-dev:amd64"

WORKDIR /usr/src/qpdf-${QPDF_VERSION}

RUN set -eux \
  && echo "Beginning amd64" \
  && echo "Install amd64 packages" \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${AMD64_BUILD_PACKAGES} \
  && echo "Building amd64" \
  && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean \
  && echo "Removing debug files" \
  && rm -f ../libqpdf29-dbgsym* \
  && rm -f ../qpdf-dbgsym* \
  && echo "Gathering package data" \
  && dpkg-query -f '${Package;-40}${Version}\n' -W > ../pkg-list.txt

#
# Stage: armhf-builder
# Purpose:
#  - Sets armhf specific environment
#  - Builds qpdf for armhf (cross compile)
#
FROM pre-build as armhf-builder

ARG ARMHF_PACKAGES="\
  crossbuild-essential-armhf \
  libjpeg62-turbo-dev:armhf \
  libgnutls28-dev:armhf \
  zlib1g-dev:armhf"

WORKDIR /usr/src/qpdf-${QPDF_VERSION}

ENV CXX="/usr/bin/arm-linux-gnueabihf-g++" \
    CC="/usr/bin/arm-linux-gnueabihf-gcc"

RUN set -eux \
  && echo "Beginning armhf" \
  && echo "Install armhf packages" \
  && dpkg --add-architecture armhf \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${ARMHF_PACKAGES} \
  && echo "Building armhf" \
  && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean --host-arch armhf \
  && echo "Removing debug files" \
  && rm -f ../libqpdf29-dbgsym* \
  && rm -f ../qpdf-dbgsym* \
  && echo "Gathering package data" \
  && dpkg-query -f '${Package;-40}${Version}\n' -W > ../pkg-list.txt

#
# Stage: aarch64-builder
# Purpose:
#  - Sets aarch64 specific environment
#  - Builds qpdf for aarch64 (cross compile)
#
FROM pre-build as aarch64-builder

ARG ARM64_PACKAGES="\
  crossbuild-essential-arm64 \
  libjpeg62-turbo-dev:arm64 \
  libgnutls28-dev:arm64 \
  zlib1g-dev:arm64"

ENV CXX="/usr/bin/aarch64-linux-gnu-g++" \
    CC="/usr/bin/aarch64-linux-gnu-gcc"

WORKDIR /usr/src/qpdf-${QPDF_VERSION}

RUN set -eux \
  && echo "Beginning arm64" \
  && echo "Install arm64 packages" \
  && dpkg --add-architecture arm64 \
  && apt-get update --quiet \
  && apt-get install --yes --quiet --no-install-recommends ${ARM64_PACKAGES} \
  && echo "Building arm64" \
  && dpkg-buildpackage --build=binary --unsigned-source --unsigned-changes --post-clean --host-arch arm64 \
  && echo "Removing debug files" \
  && rm -f ../libqpdf29-dbgsym* \
  && rm -f ../qpdf-dbgsym* \
  && echo "Gathering package data" \
  && dpkg-query -f '${Package;-40}${Version}\n' -W > ../pkg-list.txt

#
# Stage: package
# Purpose: Holds the compiled .deb files in arch/variant specific folders
#
FROM alpine:3.17 as package

LABEL org.opencontainers.image.description="An image with qpdf installers stored in architecture & version specific folders"

ARG QPDF_VERSION

WORKDIR /usr/src/qpdf/${QPDF_VERSION}/amd64

COPY --from=amd64-builder /usr/src/*.deb ./
COPY --from=amd64-builder /usr/src/pkg-list.txt ./

# Note this is ${TARGETARCH}${TARGETVARIANT} for armv7
WORKDIR /usr/src/qpdf/${QPDF_VERSION}/armv7

COPY --from=armhf-builder /usr/src/*.deb ./
COPY --from=armhf-builder /usr/src/pkg-list.txt ./

WORKDIR /usr/src/qpdf/${QPDF_VERSION}/arm64

COPY --from=aarch64-builder /usr/src/*.deb ./
COPY --from=aarch64-builder /usr/src/pkg-list.txt ./
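The package stage above files each builder's .deb artifacts under an arch/variant specific folder (`amd64`, `armv7`, `arm64`), matching buildx's `${TARGETARCH}${TARGETVARIANT}` naming so the downstream pikepdf Dockerfile can COPY by platform. A small sketch of that path scheme; the version string is a placeholder, not taken from this file:

```python
def qpdf_pkg_dir(qpdf_version: str, targetarch: str, targetvariant: str = "") -> str:
    # Mirrors the WORKDIR layout of the package stage:
    # /usr/src/qpdf/<QPDF_VERSION>/<TARGETARCH><TARGETVARIANT>
    return f"/usr/src/qpdf/{qpdf_version}/{targetarch}{targetvariant}"

print(qpdf_pkg_dir("x.y.z", "arm", "v7"))   # -> /usr/src/qpdf/x.y.z/armv7
print(qpdf_pkg_dir("x.y.z", "amd64"))       # -> /usr/src/qpdf/x.y.z/amd64
```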
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless testing with actual gotenberg
+# Docker Compose file for running paperless testing with actual gotenberg
 # and Tika containers for a more end to end test of the Tika related functionality
 # Can be used locally or by the CI to start the nessecary containers with the
 # correct networking for the tests
@@ -6,7 +6,7 @@
 version: "3.7"
 services:
   gotenberg:
-    image: docker.io/gotenberg/gotenberg:7.6
+    image: docker.io/gotenberg/gotenberg:7.10
     hostname: gotenberg
     container_name: gotenberg
     network_mode: host
@@ -17,6 +17,8 @@ services:
       - "gotenberg"
       - "--chromium-disable-javascript=true"
       - "--chromium-allow-list=file:///tmp/.*"
+      - "--log-level=warn"
+      - "--log-format=text"
   tika:
     image: ghcr.io/paperless-ngx/tika:latest
     hostname: tika
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the Docker Hub.
+# docker compose file for running paperless from the Docker Hub.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -10,7 +10,7 @@
 # as this file and mounted to the correct folders inside the container.
 # - Paperless listens on port 8000.
 #
-# In addition to that, this docker-compose file adds the following optional
+# In addition to that, this Docker Compose file adds the following optional
 # configurations:
 #
 # - Instead of SQLite (default), MariaDB is used as the database server.
@@ -23,9 +23,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker-compose pull'.
-# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker-compose up -d'.
+# - Run 'docker compose pull'.
+# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -59,7 +59,7 @@ services:
       - gotenberg
       - tika
     ports:
-      - 8000:8000
+      - "8000:8000"
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8000"]
       interval: 30s
@@ -83,7 +83,7 @@ services:
       PAPERLESS_TIKA_ENDPOINT: http://tika:9998

   gotenberg:
-    image: docker.io/gotenberg/gotenberg:7.6
+    image: docker.io/gotenberg/gotenberg:7.10
     restart: unless-stopped
     # The gotenberg chromium route is used to convert .eml files. We do not
     # want to allow external content like tracking pixels or even javascript.
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the Docker Hub.
+# Docker Compose file for running paperless from the Docker Hub.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -10,7 +10,7 @@
 # as this file and mounted to the correct folders inside the container.
 # - Paperless listens on port 8000.
 #
-# In addition to that, this docker-compose file adds the following optional
+# In addition to that, this Docker Compose file adds the following optional
 # configurations:
 #
 # - Instead of SQLite (default), MariaDB is used as the database server.
@@ -19,9 +19,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker-compose pull'.
-# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker-compose up -d'.
+# - Run 'docker compose pull'.
+# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -53,7 +53,7 @@ services:
       - db
       - broker
     ports:
-      - 8000:8000
+      - "8000:8000"
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8000"]
       interval: 30s
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the Docker Hub.
+# Docker Compose file for running paperless from the Docker Hub.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -10,7 +10,7 @@
 # as this file and mounted to the correct folders inside the container.
 # - Paperless listens on port 8010.
 #
-# In addition to that, this docker-compose file adds the following optional
+# In addition to that, this Docker Compose file adds the following optional
 # configurations:
 #
 # - Instead of SQLite (default), PostgreSQL is used as the database server.
@@ -37,7 +37,7 @@ services:
       - redisdata:/data

   db:
-    image: docker.io/library/postgres:13
+    image: docker.io/library/postgres:15
     restart: unless-stopped
     volumes:
       - pgdata:/var/lib/postgresql/data
@@ -53,7 +53,7 @@ services:
       - db
       - broker
     ports:
-      - 8010:8000
+      - "8010:8000"
     healthcheck:
       test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
       interval: 30s
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the docker container registry.
+# Docker Compose file for running paperless from the docker container registry.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -10,7 +10,7 @@
 # as this file and mounted to the correct folders inside the container.
 # - Paperless listens on port 8000.
 #
-# In addition to that, this docker-compose file adds the following optional
+# In addition to that, this Docker Compose file adds the following optional
 # configurations:
 #
 # - Instead of SQLite (default), PostgreSQL is used as the database server.
@@ -23,9 +23,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker-compose pull'.
-# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker-compose up -d'.
+# - Run 'docker compose pull'.
+# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -39,7 +39,7 @@ services:
       - redisdata:/data

   db:
-    image: docker.io/library/postgres:13
+    image: docker.io/library/postgres:15
     restart: unless-stopped
     volumes:
       - pgdata:/var/lib/postgresql/data
@@ -57,7 +57,7 @@ services:
       - gotenberg
       - tika
     ports:
-      - 8000:8000
+      - "8000:8000"
     healthcheck:
       test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
       interval: 30s
@@ -77,7 +77,7 @@ services:
       PAPERLESS_TIKA_ENDPOINT: http://tika:9998

   gotenberg:
-    image: docker.io/gotenberg/gotenberg:7.6
+    image: docker.io/gotenberg/gotenberg:7.10
     restart: unless-stopped

     # The gotenberg chromium route is used to convert .eml files. We do not
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the Docker Hub.
+# Docker Compose file for running paperless from the Docker Hub.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -10,7 +10,7 @@
 # as this file and mounted to the correct folders inside the container.
 # - Paperless listens on port 8000.
 #
-# In addition to that, this docker-compose file adds the following optional
+# In addition to that, this Docker Compose file adds the following optional
 # configurations:
 #
 # - Instead of SQLite (default), PostgreSQL is used as the database server.
@@ -19,9 +19,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker-compose pull'.
-# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker-compose up -d'.
+# - Run 'docker compose pull'.
+# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -35,7 +35,7 @@ services:
       - redisdata:/data

   db:
-    image: docker.io/library/postgres:13
+    image: docker.io/library/postgres:15
     restart: unless-stopped
     volumes:
       - pgdata:/var/lib/postgresql/data
@@ -51,7 +51,7 @@ services:
       - db
       - broker
     ports:
-      - 8000:8000
+      - "8000:8000"
     healthcheck:
       test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
       interval: 30s
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the docker container registry.
+# Docker Compose file for running paperless from the docker container registry.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 # All compose files of paperless configure paperless in the following way:
@@ -11,7 +11,7 @@
 #
 # SQLite is used as the database. The SQLite file is stored in the data volume.
 #
-# In addition to that, this docker-compose file adds the following optional
+# In addition to that, this Docker Compose file adds the following optional
 # configurations:
 #
 # - Apache Tika and Gotenberg servers are started with paperless and paperless
@@ -23,9 +23,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker-compose pull'.
-# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker-compose up -d'.
+# - Run 'docker compose pull'.
+# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -46,7 +46,7 @@ services:
       - gotenberg
       - tika
     ports:
-      - 8000:8000
+      - "8000:8000"
     healthcheck:
       test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
       interval: 30s
@@ -65,7 +65,7 @@ services:
       PAPERLESS_TIKA_ENDPOINT: http://tika:9998

   gotenberg:
-    image: docker.io/gotenberg/gotenberg:7.6
+    image: docker.io/gotenberg/gotenberg:7.10
     restart: unless-stopped

     # The gotenberg chromium route is used to convert .eml files. We do not
@@ -1,4 +1,4 @@
-# docker-compose file for running paperless from the Docker Hub.
+# Docker Compose file for running paperless from the Docker Hub.
 # This file contains everything paperless needs to run.
 # Paperless supports amd64, arm and arm64 hardware.
 #
@@ -16,9 +16,9 @@
 #
 # - Copy this file as 'docker-compose.yml' and the files 'docker-compose.env'
 #   and '.env' into a folder.
-# - Run 'docker-compose pull'.
-# - Run 'docker-compose run --rm webserver createsuperuser' to create a user.
-# - Run 'docker-compose up -d'.
+# - Run 'docker compose pull'.
+# - Run 'docker compose run --rm webserver createsuperuser' to create a user.
+# - Run 'docker compose up -d'.
 #
 # For more extensive installation and update instructions, refer to the
 # documentation.
@@ -37,7 +37,7 @@ services:
     depends_on:
       - broker
     ports:
-      - 8000:8000
+      - "8000:8000"
     healthcheck:
       test: ["CMD", "curl", "-fs", "-S", "--max-time", "2", "http://localhost:8000"]
       interval: 30s
@@ -80,7 +80,7 @@ django_checks() {

 search_index() {

-    local -r index_version=1
+    local -r index_version=7
     local -r index_version_file=${DATA_DIR}/.index_version

     if [[ (! -f "${index_version_file}") || $(<"${index_version_file}") != "$index_version" ]]; then
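The `index_version` bump above drives a stamp-file check: the script rebuilds the search index whenever the stored version differs from the compiled-in one. A minimal POSIX sketch of that pattern (function and file names here are illustrative):

```shell
# Rebuild is needed when the stamp file is missing or holds a different version.
needs_rebuild() {
    version="$1"
    stamp="$2"
    if [ ! -f "$stamp" ] || [ "$(cat "$stamp")" != "$version" ]; then
        echo yes
    else
        echo no
    fi
}

stamp_file=$(mktemp)
echo 1 > "$stamp_file"
needs_rebuild 7 "$stamp_file"   # stored 1, current 7 -> yes
echo 7 > "$stamp_file"
needs_rebuild 7 "$stamp_file"   # versions match -> no
```

After a rebuild, the real script would write the current version back to the stamp file so the next start is a no-op.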
@@ -13,8 +13,12 @@ for line in $(printenv)
 do
     # Extract the name of the environment variable
     env_name=${line%%=*}
-    # Check if it ends in "_FILE"
-    if [[ ${env_name} == *_FILE ]]; then
+    # Check if it starts with "PAPERLESS_" and ends in "_FILE"
+    if [[ ${env_name} == PAPERLESS_*_FILE ]]; then
+        # This should have been named different..
+        if [[ ${env_name} == "PAPERLESS_OCR_SKIP_ARCHIVE_FILE" ]]; then
+            continue
+        fi
         # Extract the value of the environment
         env_value=${line#*=}

@@ -32,8 +36,7 @@ do
         export "${non_file_env_name}"="${val}"

     else
-        echo "File ${env_value} doesn't exist"
-        exit 1
+        echo "File ${env_value} referenced by ${env_name} doesn't exist"
     fi
     fi
 done
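The script above leans on two shell parameter expansions to split each `printenv` line into name and value, and on a suffix strip to derive the variable it actually exports (the exact derivation of `non_file_env_name` is not shown in this hunk, so the last line is an assumption). A quick sketch:

```shell
line="PAPERLESS_DBPASS_FILE=/run/secrets/dbpass"

env_name=${line%%=*}    # remove the longest '=*' suffix -> variable name
env_value=${line#*=}    # remove the shortest '*=' prefix -> variable value
# Assumed derivation: strip the _FILE suffix to get the exported name.
non_file_env_name=${env_name%_FILE}

echo "$env_name"            # -> PAPERLESS_DBPASS_FILE
echo "$env_value"           # -> /run/secrets/dbpass
echo "$non_file_env_name"   # -> PAPERLESS_DBPASS
```

This is the usual Docker secrets convention: `PAPERLESS_DBPASS_FILE` points at a file whose contents become `PAPERLESS_DBPASS`.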
@@ -3,5 +3,10 @@
 echo "Checking if we should start flower..."

 if [[ -n "${PAPERLESS_ENABLE_FLOWER}" ]]; then
-    celery --app paperless flower
+    # Small delay to allow celery to be up first
+    echo "Starting flower in 5s"
+    sleep 5
+    celery --app paperless flower --conf=/usr/src/paperless/src/paperless/flowerconfig.py
+else
+    echo "Not starting flower"
 fi
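Note that `[[ -n "${PAPERLESS_ENABLE_FLOWER}" ]]` only tests for a non-empty value, so any value at all, even `false` or `0`, enables flower. A small sketch of that semantics (the helper name is illustrative):

```shell
# Mirrors the script's check: non-empty means enabled, regardless of content.
is_enabled() {
    if [ -n "$1" ]; then echo on; else echo off; fi
}

is_enabled ""        # -> off (variable unset or empty)
is_enabled "1"       # -> on
is_enabled "false"   # -> on: only emptiness is tested, not truthiness
```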
@@ -13,6 +13,7 @@ for command in decrypt_documents \
     document_retagger \
     document_thumbnails \
     document_sanity_checker \
+    document_fuzzy_match \
     manage_superuser;
 do
     echo "installing $command..."
@@ -15,6 +15,7 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
+environment = HOME="/usr/src/paperless",USER="paperless"

 [program:consumer]
 command=python3 manage.py document_consumer
@@ -25,10 +26,11 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
+environment = HOME="/usr/src/paperless",USER="paperless"

 [program:celery]

-command = celery --app paperless worker --loglevel INFO
+command = celery --app paperless worker --loglevel INFO --without-mingle --without-gossip
 user=paperless
 stopasgroup = true
 stopwaitsecs = 60
@@ -37,6 +39,7 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
+environment = HOME="/usr/src/paperless",USER="paperless"

 [program:celery-beat]

@@ -48,6 +51,7 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
+environment = HOME="/usr/src/paperless",USER="paperless"

 [program:celery-flower]
 command = /usr/local/bin/flower-conditional.sh
@@ -58,3 +62,4 @@ stdout_logfile=/dev/stdout
 stdout_logfile_maxbytes=0
 stderr_logfile=/dev/stderr
 stderr_logfile_maxbytes=0
+environment = HOME="/usr/src/paperless",USER="paperless"
@@ -12,13 +12,12 @@ from typing import Final
 from redis import Redis

 if __name__ == "__main__":
-
     MAX_RETRY_COUNT: Final[int] = 5
     RETRY_SLEEP_SECONDS: Final[int] = 5

     REDIS_URL: Final[str] = os.getenv("PAPERLESS_REDIS", "redis://localhost:6379")

-    print(f"Waiting for Redis...", flush=True)
+    print("Waiting for Redis...", flush=True)

     attempt = 0
     with Redis.from_url(url=REDIS_URL) as client:
@@ -29,7 +28,7 @@ if __name__ == "__main__":
         except Exception as e:
             print(
                 f"Redis ping #{attempt} failed.\n"
-                f"Error: {str(e)}.\n"
+                f"Error: {e!s}.\n"
                 f"Waiting {RETRY_SLEEP_SECONDS}s",
                 flush=True,
             )
@@ -37,8 +36,8 @@ if __name__ == "__main__":
             attempt += 1

     if attempt >= MAX_RETRY_COUNT:
-        print(f"Failed to connect to redis using environment variable PAPERLESS_REDIS.")
+        print("Failed to connect to redis using environment variable PAPERLESS_REDIS.")
         sys.exit(os.EX_UNAVAILABLE)
     else:
-        print(f"Connected to Redis broker.")
+        print("Connected to Redis broker.")
         sys.exit(os.EX_OK)
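The wait-for-redis script is a bounded retry loop: ping the broker, and on failure sleep and bump an attempt counter until `MAX_RETRY_COUNT` is hit. The same shape in shell, with the Redis ping replaced by an illustrative file-based probe so the sketch is self-contained:

```shell
max_retries=5
attempt=0
ready_flag=$(mktemp -u)   # probe target; created later to simulate Redis coming up

while [ "$attempt" -lt "$max_retries" ]; do
    if [ -f "$ready_flag" ]; then   # stand-in for a successful Redis ping
        echo "connected"
        break
    fi
    echo "ping #$attempt failed"
    attempt=$((attempt + 1))
    if [ "$attempt" -eq 2 ]; then
        touch "$ready_flag"         # the "broker" comes up before the third try
    fi
done
```

The loop exits either by `break` on success or by exhausting `max_retries`, mirroring the `os.EX_OK` / `os.EX_UNAVAILABLE` split in the Python script.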
@@ -5,17 +5,19 @@
 Multiple options exist for making backups of your paperless instance,
 depending on how you installed paperless.

-Before making backups, make sure that paperless is not running.
+Before making a backup, it's probably best to make sure that paperless is not actively
+consuming documents at that time.

 Options available to any installation of paperless:

 - Use the [document exporter](#exporter). The document exporter exports all your documents,
-  thumbnails and metadata to a specific folder. You may import your
-  documents into a fresh instance of paperless again or store your
+  thumbnails, metadata, and database contents to a specific folder. You may import your
+  documents and settings into a fresh instance of paperless again or store your
   documents in another DMS with this export.
-- The document exporter is also able to update an already existing
-  export. Therefore, incremental backups with `rsync` are entirely
-  possible.
+
+  The document exporter is also able to update an already existing
+  export. Therefore, incremental backups with `rsync` are entirely
+  possible.

 !!! caution
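Since the exporter can update an existing export in place, an incremental off-site backup can be as simple as re-running the export and mirroring the folder with `rsync`. A hedged sketch with illustrative paths (the exact exporter invocation may differ; see the exporter section of the docs) — the commands are echoed here rather than executed:

```shell
# Paths are examples only; adjust to your installation.
export_dir=/path/to/paperless/export
backup_dir=/mnt/backup/paperless-export

# Re-run the exporter into the same folder, then mirror only the delta.
echo "docker compose run --rm webserver document_exporter ../export"
echo "rsync -a --delete $export_dir/ $backup_dir/"
```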
@@ -25,31 +27,37 @@ Options available to any installation of paperless:

 Options available to docker installations:

 - Backup the docker volumes. These usually reside within
   `/var/lib/docker/volumes` on the host and you need to be root in
   order to access them.

   Paperless uses 4 volumes:

   - `paperless_media`: This is where your documents are stored.
-  - `paperless_data`: This is where auxillary data is stored. This
+  - `paperless_data`: This is where auxiliary data is stored. This
     folder also contains the SQLite database, if you use it.
   - `paperless_pgdata`: Exists only if you use PostgreSQL and
     contains the database.
   - `paperless_dbdata`: Exists only if you use MariaDB and contains
     the database.

 Options available to bare-metal and non-docker installations:

 - Backup the entire paperless folder. This ensures that if your
   paperless instance crashes at some point or your disk fails, you can
   simply copy the folder back into place and it works.

   When using PostgreSQL or MariaDB, you'll also have to backup the
   database.

 ### Restoring {#migrating-restoring}

+If you've backed-up Paperless-ngx using the [document exporter](#exporter),
+restoring can simply be done with the [document importer](#importer).
+
+Of course, other backup strategies require restoring any volumes, folders and database
+copies you created in the steps above.
+
 ## Updating Paperless {#updating}

 ### Docker Route {#docker-updating}
@@ -63,30 +71,30 @@ First of all, ensure that paperless is stopped.

 ```shell-session
 $ cd /path/to/paperless
-$ docker-compose down
+$ docker compose down
 ```

 After that, [make a backup](#backup).

 1. If you pull the image from the docker hub, all you need to do is:

    ```shell-session
-   $ docker-compose pull
-   $ docker-compose up
+   $ docker compose pull
+   $ docker compose up
    ```

-   The docker-compose files refer to the `latest` version, which is
+   The Docker Compose files refer to the `latest` version, which is
    always the latest stable release.

-2. If you built the image yourself, do the following:
+1. If you built the image yourself, do the following:

    ```shell-session
    $ git pull
-   $ docker-compose build
-   $ docker-compose up
+   $ docker compose build
+   $ docker compose up
    ```

-Running `docker-compose up` will also apply any new database migrations.
+Running `docker compose up` will also apply any new database migrations.
 If you see everything working, press CTRL+C once to gracefully stop
 paperless. Then you can start paperless-ngx with `-d` to have it run in
 the background.
@@ -94,11 +102,11 @@ the background.
 !!! note

     In version 0.9.14, the update process was changed. In 0.9.13 and
-    earlier, the docker-compose files specified exact versions and pull
+    earlier, the Docker Compose files specified exact versions and pull
     won't automatically update to newer versions. In order to enable
     updates as described above, either get the new `docker-compose.yml`
     file from
-    [here](https://github.com/paperless-ngx/paperless-ngx/tree/master/docker/compose)
+    [here](https://github.com/paperless-ngx/paperless-ngx/tree/main/docker/compose)
     or edit the `docker-compose.yml` file, find the line that says

     ```
@@ -139,7 +147,7 @@ following:
|
|||||||
1. Update dependencies. New paperless versions may require additional
|
1. Update dependencies. New paperless versions may require additional
|
||||||
dependencies. The dependencies required are listed in the section
|
dependencies. The dependencies required are listed in the section
|
||||||
about
|
about
|
||||||
[bare metal installations](/setup#bare_metal).
|
[bare metal installations](setup.md#bare_metal).
|
||||||
|
|
||||||
2. Update python requirements. Keep in mind to activate your virtual
|
2. Update python requirements. Keep in mind to activate your virtual
|
||||||
environment before that, if you use one.
|
environment before that, if you use one.
|
||||||
@@ -148,16 +156,35 @@ following:
|
|||||||
$ pip install -r requirements.txt
|
$ pip install -r requirements.txt
|
||||||
```
|
```
|
||||||
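If you use a virtual environment, the activation step might look like this (the `.venv` location is an assumption; adjust to wherever your environment lives):

```shell-session
$ source /path/to/paperless/.venv/bin/activate
$ pip install -r requirements.txt
```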
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
At times, some dependencies will be removed from requirements.txt.
|
||||||
|
Comparing the versions and removing no longer needed dependencies
|
||||||
|
will keep your system or virtual environment clean and prevent
|
||||||
|
possible conflicts.
|
||||||
|
|
||||||
3. Migrate the database.
|
3. Migrate the database.
|
||||||
|
|
||||||
```shell-session
|
```shell-session
|
||||||
$ cd src
|
$ cd src
|
||||||
$ python3 manage.py migrate
|
$ python3 manage.py migrate # (1)
|
||||||
```
|
```
|
||||||
|
|
||||||
|
1. Including `sudo -Hu <paperless_user>` may be required
|
||||||
|
|
||||||
This might not actually do anything. Not every new paperless version
|
This might not actually do anything. Not every new paperless version
|
||||||
comes with new database migrations.
|
comes with new database migrations.
|
||||||
|
|
||||||
|
### Database Upgrades
|
||||||
|
|
||||||
|
In general, paperless does not require a specific version of PostgreSQL or MariaDB and it is
|
||||||
|
safe to update them to newer versions. However, you should always take a backup and follow
|
||||||
|
the instructions from your database's documentation for how to upgrade between major versions.
|
||||||
|
|
||||||
|
For PostgreSQL, refer to [Upgrading a PostgreSQL Cluster](https://www.postgresql.org/docs/current/upgrading.html).
|
||||||
|
|
||||||
|
For MariaDB, refer to [Upgrading MariaDB](https://mariadb.com/kb/en/upgrading/)
|
||||||
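Before a major PostgreSQL upgrade, a logical dump gives you a fallback. A minimal sketch, assuming the `db` service name and the `paperless` database/user from the stock compose files:

```shell-session
$ docker compose exec db pg_dump -U paperless -d paperless -Fc > paperless.dump
```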
|
|
||||||
## Downgrading Paperless {#downgrade-paperless}
|
## Downgrading Paperless {#downgrade-paperless}
|
||||||
|
|
||||||
Downgrades are possible. However, some updates also contain database
|
Downgrades are possible. However, some updates also contain database
|
||||||
@@ -193,11 +220,11 @@ Paperless comes with some management commands that perform various
|
|||||||
maintenance tasks on your paperless instance. You can invoke these
|
maintenance tasks on your paperless instance. You can invoke these
|
||||||
commands in the following way:
|
commands in the following way:
|
||||||
|
|
||||||
With docker-compose, while paperless is running:
|
With Docker Compose, while paperless is running:
|
||||||
|
|
||||||
```shell-session
|
```shell-session
|
||||||
$ cd /path/to/paperless
|
$ cd /path/to/paperless
|
||||||
$ docker-compose exec webserver <command> <arguments>
|
$ docker compose exec webserver <command> <arguments>
|
||||||
```
|
```
|
||||||
|
|
||||||
With docker, while paperless is running:
|
With docker, while paperless is running:
|
||||||
@@ -210,30 +237,38 @@ Bare metal:
|
|||||||
|
|
||||||
```shell-session
|
```shell-session
|
||||||
$ cd /path/to/paperless/src
|
$ cd /path/to/paperless/src
|
||||||
$ python3 manage.py <command> <arguments>
|
$ python3 manage.py <command> <arguments> # (1)
|
||||||
```
|
```
|
||||||
|
|
||||||
|
1. Including `sudo -Hu <paperless_user>` may be required
|
||||||
|
|
||||||
All commands have built-in help, which can be accessed by executing them
|
All commands have built-in help, which can be accessed by executing them
|
||||||
with the argument `--help`.
|
with the argument `--help`.
|
||||||
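For example, to see the exporter's options under Docker Compose:

```shell-session
$ docker compose exec webserver document_exporter --help
```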
|
|
||||||
### Document exporter {#exporter}
|
### Document exporter {#exporter}
|
||||||
|
|
||||||
The document exporter exports all your data from paperless into a folder
|
The document exporter exports all your data (including your settings
|
||||||
for backup or migration to another DMS.
|
and database contents) from paperless into a folder for backup or
|
||||||
|
migration to another DMS.
|
||||||
|
|
||||||
If you use the document exporter within a cronjob to backup your data
|
If you use the document exporter within a cronjob to backup your data
|
||||||
you might use the `-T` flag behind exec to suppress "The input device
|
you might use the `-T` flag behind exec to suppress "The input device
|
||||||
is not a TTY" errors. For example:
|
is not a TTY" errors. For example:
|
||||||
`docker-compose exec -T webserver document_exporter ../export`
|
`docker compose exec -T webserver document_exporter ../export`
|
||||||
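Building on that command, a nightly crontab entry might look like this (the schedule and paths are illustrative):

```
0 3 * * * cd /path/to/paperless && docker compose exec -T webserver document_exporter ../export
```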
|
|
||||||
```
|
```
|
||||||
document_exporter target [-c] [-f] [-d]
|
document_exporter target [-c] [-d] [-f] [-na] [-nt] [-p] [-sm] [-z]
|
||||||
|
|
||||||
optional arguments:
|
optional arguments:
|
||||||
-c, --compare-checksums
|
-c, --compare-checksums
|
||||||
-f, --use-filename-format
|
-d, --delete
|
||||||
-d, --delete
|
-f, --use-filename-format
|
||||||
-z --zip
|
-na, --no-archive
|
||||||
|
-nt, --no-thumbnail
|
||||||
|
-p, --use-folder-prefix
|
||||||
|
-sm, --split-manifest
|
||||||
|
-z, --zip
|
||||||
|
-zn, --zip-name
|
||||||
```
|
```
|
||||||
|
|
||||||
`target` is a folder to which the data gets written. This includes
|
`target` is a folder to which the data gets written. This includes
|
||||||
@@ -249,23 +284,54 @@ will assume that the contents of the export directory are a previous
|
|||||||
export and will attempt to update the previous export. Paperless will
|
export and will attempt to update the previous export. Paperless will
|
||||||
only export changed and added files. Paperless determines whether a file
|
only export changed and added files. Paperless determines whether a file
|
||||||
has changed by inspecting the file attributes "date/time modified" and
|
has changed by inspecting the file attributes "date/time modified" and
|
||||||
"size". If that does not work out for you, specify
|
"size". If that does not work out for you, specify `-c` or
|
||||||
`--compare-checksums` and paperless will attempt to compare file
|
`--compare-checksums` and paperless will attempt to compare file
|
||||||
checksums instead. This is slower.
|
checksums instead. This is slower.
|
||||||
|
|
||||||
Paperless will not remove any existing files in the export directory. If
|
Paperless will not remove any existing files in the export directory. If
|
||||||
you want paperless to also remove files that do not belong to the
|
you want paperless to also remove files that do not belong to the
|
||||||
current export such as files from deleted documents, specify `--delete`.
|
current export such as files from deleted documents, specify `-d` or `--delete`.
|
||||||
Be careful when pointing paperless to a directory that already contains
|
Be careful when pointing paperless to a directory that already contains
|
||||||
other files.
|
other files.
|
||||||
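Combining the flags above, an incremental export that verifies checksums and prunes files from deleted documents could be invoked as:

```shell-session
$ docker compose exec webserver document_exporter ../export -c -d
```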
|
|
||||||
If `-z` or `--zip` is provided, the export will be a zipfile
|
|
||||||
in the target directory, named according to the current date.
|
|
||||||
|
|
||||||
The filenames generated by this command follow the format
|
The filenames generated by this command follow the format
|
||||||
`[date created] [correspondent] [title].[extension]`. If you want
|
`[date created] [correspondent] [title].[extension]`. If you want
|
||||||
paperless to use `PAPERLESS_FILENAME_FORMAT` for exported filenames
|
paperless to use [`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) for exported filenames
|
||||||
instead, specify `--use-filename-format`.
|
instead, specify `-f` or `--use-filename-format`.
|
||||||
|
|
||||||
|
If `-na` or `--no-archive` is provided, no archive files will be exported,
|
||||||
|
only the original files.
|
||||||
|
|
||||||
|
If `-nt` or `--no-thumbnail` is provided, thumbnail files will not be exported.
|
||||||
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
When using the `-na`/`--no-archive` or `-nt`/`--no-thumbnail` options
|
||||||
|
the exporter will not output these files for backup. After importing,
|
||||||
|
the [sanity checker](#sanity-checker) will warn about missing thumbnails and archive files
|
||||||
|
until they are regenerated with `document_thumbnails` or [`document_archiver`](#archiver).
|
||||||
|
It can make sense to omit these files from backup as their content and checksum
|
||||||
|
can change (e.g. with a new archiver algorithm) and may then consume additional space in
|
||||||
|
a deduplicated backup.
|
||||||
|
|
||||||
|
If `-p` or `--use-folder-prefix` is provided, files will be exported
|
||||||
|
in dedicated folders according to their nature: `archive`, `originals`,
|
||||||
|
`thumbnails`, or `json`.
|
||||||
|
|
||||||
|
If `-sm` or `--split-manifest` is provided, information about each document
|
||||||
|
will be placed in individual JSON files, instead of a single JSON file. The main
|
||||||
|
manifest.json will still contain application-wide information (e.g. tags, correspondents,
|
||||||
|
document types, etc.)
|
||||||
|
|
||||||
|
If `-z` or `--zip` is provided, the export will be a zip file
|
||||||
|
in the target directory, named according to the current local date or the
|
||||||
|
value set in `-zn` or `--zip-name`.
|
||||||
|
|
||||||
|
!!! warning
|
||||||
|
|
||||||
|
If exporting with the file name format, there may be errors due to
|
||||||
|
your operating system's maximum path lengths. Try adjusting the export
|
||||||
|
target or consider not using the filename format.
|
||||||
|
|
||||||
### Document importer {#importer}
|
### Document importer {#importer}
|
||||||
|
|
||||||
@@ -296,7 +362,7 @@ currently-imported docs. This problem is common enough that there are
|
|||||||
tools for it.
|
tools for it.
|
||||||
|
|
||||||
```
|
```
|
||||||
document_retagger [-h] [-c] [-T] [-t] [-i] [--use-first] [-f]
|
document_retagger [-h] [-c] [-T] [-t] [-i] [--id-range] [--use-first] [-f]
|
||||||
|
|
||||||
optional arguments:
|
optional arguments:
|
||||||
-c, --correspondent
|
-c, --correspondent
|
||||||
@@ -304,6 +370,7 @@ optional arguments:
|
|||||||
-t, --document_type
|
-t, --document_type
|
||||||
-s, --storage_path
|
-s, --storage_path
|
||||||
-i, --inbox-only
|
-i, --inbox-only
|
||||||
|
--id-range
|
||||||
--use-first
|
--use-first
|
||||||
-f, --overwrite
|
-f, --overwrite
|
||||||
```
|
```
|
||||||
@@ -320,6 +387,11 @@ Specify `-i` to have the document retagger work on documents tagged with
|
|||||||
inbox tags only. This is useful when you don't want to mess with your
|
inbox tags only. This is useful when you don't want to mess with your
|
||||||
already processed documents.
|
already processed documents.
|
||||||
|
|
||||||
|
Specify `--id-range 1 100` to have the document retagger work only on a
|
||||||
|
specific range of document IDs. This can be useful if you have a lot of
|
||||||
|
documents and want to test the matching rules only on a subset of
|
||||||
|
documents.
|
||||||
|
|
||||||
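Combining these flags, limiting a test run of the retagger to document types on inbox documents within a small ID range might look like:

```shell-session
$ docker compose exec webserver document_retagger -t -i --id-range 1 100
```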
When multiple document types or correspondents match a single document,
|
When multiple document types or correspondents match a single document,
|
||||||
the retagger won't assign these to the document. Specify `--use-first`
|
the retagger won't assign these to the document. Specify `--use-first`
|
||||||
to override this behavior and just use the first correspondent or type
|
to override this behavior and just use the first correspondent or type
|
||||||
@@ -336,7 +408,7 @@ that don't match a document anymore get removed as well.
|
|||||||
### Managing the Automatic matching algorithm
|
### Managing the Automatic matching algorithm
|
||||||
|
|
||||||
The _Auto_ matching algorithm requires a trained neural network to work.
|
The _Auto_ matching algorithm requires a trained neural network to work.
|
||||||
This network needs to be updated whenever somethings in your data
|
This network needs to be updated whenever something in your data
|
||||||
changes. The docker image takes care of that automatically with the task
|
changes. The docker image takes care of that automatically with the task
|
||||||
scheduler. You can manually renew the classifier by invoking the
|
scheduler. You can manually renew the classifier by invoking the
|
||||||
following management command:
|
following management command:
|
||||||
@@ -347,6 +419,17 @@ document_create_classifier
|
|||||||
|
|
||||||
This command takes no arguments.
|
This command takes no arguments.
|
||||||
|
|
||||||
|
### Document thumbnails {#thumbnails}
|
||||||
|
|
||||||
|
Use this command to re-create document thumbnails. Optionally include the `--document {id}` option to generate thumbnails for a specific document only.
|
||||||
|
|
||||||
|
You may also specify `--processes` to control the number of processes used to generate new thumbnails. The default is to utilize
|
||||||
|
a quarter of the available processors.
|
||||||
|
|
||||||
|
```
|
||||||
|
document_thumbnails
|
||||||
|
```
|
||||||
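For example, regenerating the thumbnail of a single document (the ID is illustrative):

```shell-session
$ docker compose exec webserver document_thumbnails --document 42
```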
|
|
||||||
### Managing the document search index {#index}
|
### Managing the document search index {#index}
|
||||||
|
|
||||||
The document search index is responsible for delivering search results
|
The document search index is responsible for delivering search results
|
||||||
@@ -370,7 +453,7 @@ task scheduler.
|
|||||||
### Managing filenames {#renamer}
|
### Managing filenames {#renamer}
|
||||||
|
|
||||||
If you use paperless' feature to
|
If you use paperless' feature to
|
||||||
[assign custom filenames to your documents](/advanced_usage#file-name-handling), you can use this command to move all your files after
|
[assign custom filenames to your documents](advanced_usage.md#file-name-handling), you can use this command to move all your files after
|
||||||
changing the naming scheme.
|
changing the naming scheme.
|
||||||
|
|
||||||
!!! warning
|
!!! warning
|
||||||
@@ -395,19 +478,19 @@ collection for issues.
|
|||||||
|
|
||||||
The issues detected by the sanity checker are as follows:
|
The issues detected by the sanity checker are as follows:
|
||||||
|
|
||||||
- Missing original files.
|
- Missing original files.
|
||||||
- Missing archive files.
|
- Missing archive files.
|
||||||
- Inaccessible original files due to improper permissions.
|
- Inaccessible original files due to improper permissions.
|
||||||
- Inaccessible archive files due to improper permissions.
|
- Inaccessible archive files due to improper permissions.
|
||||||
- Corrupted original documents by comparing their checksum against
|
- Corrupted original documents by comparing their checksum against
|
||||||
what is stored in the database.
|
what is stored in the database.
|
||||||
- Corrupted archive documents by comparing their checksum against what
|
- Corrupted archive documents by comparing their checksum against what
|
||||||
is stored in the database.
|
is stored in the database.
|
||||||
- Missing thumbnails.
|
- Missing thumbnails.
|
||||||
- Inaccessible thumbnails due to improper permissions.
|
- Inaccessible thumbnails due to improper permissions.
|
||||||
- Documents without any content (warning).
|
- Documents without any content (warning).
|
||||||
- Orphaned files in the media directory (warning). These are files
|
- Orphaned files in the media directory (warning). These are files
|
||||||
that are not referenced by any document im paperless.
|
that are not referenced by any document in paperless.
|
||||||
|
|
||||||
```
|
```
|
||||||
document_sanity_checker
|
document_sanity_checker
|
||||||
@@ -429,12 +512,13 @@ mail_fetcher
|
|||||||
The command takes no arguments and processes all your mail accounts and
|
The command takes no arguments and processes all your mail accounts and
|
||||||
rules.
|
rules.
|
||||||
|
|
||||||
!!! note
|
!!! tip
|
||||||
|
|
||||||
As of October 2022 Microsoft no longer supports IMAP authentication
|
To use OAuth access tokens for mail fetching,
|
||||||
for Exchange servers, thus Exchange is no longer supported until a
|
select the box to indicate the password is actually
|
||||||
solution is implemented in the Python IMAP library used by Paperless.
|
a token when creating or editing a mail account. The
|
||||||
See [learn.microsoft.com](https://learn.microsoft.com/en-us/exchange/clients-and-mobile-in-exchange-online/deprecation-of-basic-authentication-exchange-online)
|
details for creating a token depend on your email
|
||||||
|
provider.
|
||||||
|
|
||||||
### Creating archived documents {#archiver}
|
### Creating archived documents {#archiver}
|
||||||
|
|
||||||
@@ -475,7 +559,7 @@ Documents can be stored in Paperless using GnuPG encryption.
|
|||||||
|
|
||||||
!!! warning
|
!!! warning
|
||||||
|
|
||||||
Encryption is deprecated since [paperless-ng 0.9](/changelog#paperless-ng-090) and doesn't really
|
Encryption is deprecated since [paperless-ng 0.9](changelog.md#paperless-ng-090) and doesn't really
|
||||||
provide any additional security, since you have to store the passphrase
|
provide any additional security, since you have to store the passphrase
|
||||||
in a configuration file on the same system as the encrypted documents
|
in a configuration file on the same system as the encrypted documents
|
||||||
for paperless to work. Furthermore, the entire text content of the
|
for paperless to work. Furthermore, the entire text content of the
|
||||||
@@ -496,9 +580,30 @@ Enabling encryption is no longer supported.
|
|||||||
|
|
||||||
Basic usage to disable encryption of your document store:
|
Basic usage to disable encryption of your document store:
|
||||||
|
|
||||||
(Note: If `PAPERLESS_PASSPHRASE` isn't set already, you need to specify
|
(Note: If [`PAPERLESS_PASSPHRASE`](configuration.md#PAPERLESS_PASSPHRASE) isn't set already, you need to specify
|
||||||
it here)
|
it here)
|
||||||
|
|
||||||
```
|
```
|
||||||
decrypt_documents [--passphrase SECR3TP4SSPHRA$E]
|
decrypt_documents [--passphrase SECR3TP4SSPHRA$E]
|
||||||
```
|
```
|
||||||
|
|
||||||
|
### Detecting duplicates {#fuzzy_duplicate}
|
||||||
|
|
||||||
|
Paperless already catches and prevents the upload of exactly matching documents,
|
||||||
|
however, a new scan of an existing document may not produce an exact bit-for-bit
|
||||||
|
duplicate. The content, however, should be identical or close, which allows detection.
|
||||||
|
|
||||||
|
This tool does a fuzzy match over document content, looking for
|
||||||
|
those which look close according to a given ratio.
|
||||||
|
|
||||||
|
At this time, other metadata (such as correspondent or type) is not
|
||||||
|
taken into account by the detection.
|
||||||
|
|
||||||
|
```
|
||||||
|
document_fuzzy_match [--ratio] [--processes N]
|
||||||
|
```
|
||||||
|
|
||||||
|
| Option | Required | Default | Description |
|
||||||
|
| ----------- | -------- | ------------------- | ------------------------------------------------------------------------------------------------------------------------------ |
|
||||||
|
| --ratio | No | 85.0 | a number between 0 and 100, setting how similar a document must be for it to be reported. Higher numbers mean more similarity. |
|
||||||
|
| --processes | No | 1/4 of system cores | Number of processes to use for matching. Setting 1 disables multiple processes |
|
||||||
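For example, a stricter pass that only reports near-identical content, using more worker processes:

```shell-session
$ docker compose exec webserver document_fuzzy_match --ratio 95 --processes 4
```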
|
@@ -1,6 +1,6 @@
|
|||||||
# Advanced Topics
|
# Advanced Topics
|
||||||
|
|
||||||
Paperless offers a couple features that automate certain tasks and make
|
Paperless offers a couple of features that automate certain tasks and make
|
||||||
your life easier.
|
your life easier.
|
||||||
|
|
||||||
## Matching tags, correspondents, document types, and storage paths {#matching}
|
## Matching tags, correspondents, document types, and storage paths {#matching}
|
||||||
@@ -9,7 +9,7 @@ Paperless will compare the matching algorithms defined by every tag,
|
|||||||
correspondent, document type, and storage path in your database to see
|
correspondent, document type, and storage path in your database to see
|
||||||
if they apply to the text in a document. In other words, if you define a
|
if they apply to the text in a document. In other words, if you define a
|
||||||
tag called `Home Utility` that had a `match` property of `bc hydro` and
|
tag called `Home Utility` that had a `match` property of `bc hydro` and
|
||||||
a `matching_algorithm` of `literal`, Paperless will automatically tag
|
a `matching_algorithm` of `Exact`, Paperless will automatically tag
|
||||||
your newly-consumed document with your `Home Utility` tag so long as the
|
your newly-consumed document with your `Home Utility` tag so long as the
|
||||||
text `bc hydro` appears in the body of the document somewhere.
|
text `bc hydro` appears in the body of the document somewhere.
|
||||||
|
|
||||||
@@ -25,18 +25,20 @@ documents.
|
|||||||
|
|
||||||
The following algorithms are available:
|
The following algorithms are available:
|
||||||
|
|
||||||
|
- **None:** No matching will be performed.
|
||||||
- **Any:** Looks for any occurrence of any word provided in match in
|
- **Any:** Looks for any occurrence of any word provided in match in
|
||||||
the PDF. If you define the match as `Bank1 Bank2`, it will match
|
the PDF. If you define the match as `Bank1 Bank2`, it will match
|
||||||
documents containing either of these terms.
|
documents containing either of these terms.
|
||||||
- **All:** Requires that every word provided appears in the PDF,
|
- **All:** Requires that every word provided appears in the PDF,
|
||||||
albeit not in the order provided.
|
albeit not in the order provided.
|
||||||
- **Literal:** Matches only if the match appears exactly as provided
|
- **Exact:** Matches only if the match appears exactly as provided
|
||||||
(i.e. preserve ordering) in the PDF.
|
(i.e. preserve ordering) in the PDF.
|
||||||
- **Regular expression:** Parses the match as a regular expression and
|
- **Regular expression:** Parses the match as a regular expression and
|
||||||
tries to find a match within the document.
|
tries to find a match within the document.
|
||||||
- **Fuzzy match:** I don't know. Look at the source.
|
- **Fuzzy match:** Uses partial matching based on locating the match text
|
||||||
|
inside the document, using a [partial ratio](https://maxbachmann.github.io/RapidFuzz/Usage/fuzz.html#partial-ratio).
|
||||||
- **Auto:** Tries to automatically match new documents. This does not
|
- **Auto:** Tries to automatically match new documents. This does not
|
||||||
require you to set a match. See the notes below.
|
require you to set a match. See the [notes below](#automatic-matching).
|
||||||
|
|
||||||
When using the _any_ or _all_ matching algorithms, you can search for
|
When using the _any_ or _all_ matching algorithms, you can search for
|
||||||
terms that consist of multiple words by enclosing them in double quotes.
|
terms that consist of multiple words by enclosing them in double quotes.
|
||||||
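As a rough analogy (paperless performs these checks internally; `grep` is used here purely to illustrate the semantics of each mode, not as paperless code):

```bash
#!/usr/bin/env bash
# Stand-in illustration of the matching modes using grep.
text="invoice from bc hydro for march"

# Exact: the words must appear in the given order.
echo "$text" | grep -q "bc hydro" && echo "Exact: match"

# All: every word must appear, in any order.
if echo "$text" | grep -q "bc" && echo "$text" | grep -q "hydro"; then
  echo "All: match"
fi

# Any: at least one of the words must appear.
if echo "$text" | grep -q "bc" || echo "$text" | grep -q "hydro"; then
  echo "Any: match"
fi

# Regular expression: the match is treated as a pattern.
echo "$text" | grep -Eq "bc +hydro" && echo "Regex: match"
```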
@@ -91,7 +93,7 @@ when using this feature:
|
|||||||
decide when not to assign a certain tag, correspondent, document
|
decide when not to assign a certain tag, correspondent, document
|
||||||
type, or storage path. This will usually be the case as you start
|
type, or storage path. This will usually be the case as you start
|
||||||
filling up paperless with documents. Example: If all your documents
|
filling up paperless with documents. Example: If all your documents
|
||||||
are either from "Webshop" and "Bank", paperless will assign one
|
are either from "Webshop" or "Bank", paperless will assign one
|
||||||
of these correspondents to ANY new document, if both are set to
|
of these correspondents to ANY new document, if both are set to
|
||||||
automatic matching.
|
automatic matching.
|
||||||
|
|
||||||
@@ -100,12 +102,12 @@ when using this feature:
|
|||||||
Sometimes you may want to do something arbitrary whenever a document is
|
Sometimes you may want to do something arbitrary whenever a document is
|
||||||
consumed. Rather than try to predict what you may want to do, Paperless
|
consumed. Rather than try to predict what you may want to do, Paperless
|
||||||
lets you execute scripts of your own choosing just before or after a
|
lets you execute scripts of your own choosing just before or after a
|
||||||
document is consumed using a couple simple hooks.
|
document is consumed using a couple of simple hooks.
|
||||||
|
|
||||||
Just write a script, put it somewhere that Paperless can read & execute,
|
Just write a script, put it somewhere that Paperless can read & execute,
|
||||||
and then put the path to that script in `paperless.conf` or
|
and then put the path to that script in `paperless.conf` or
|
||||||
`docker-compose.env` with the variable name of either
|
`docker-compose.env` with the variable name of either
|
||||||
`PAPERLESS_PRE_CONSUME_SCRIPT` or `PAPERLESS_POST_CONSUME_SCRIPT`.
|
[`PAPERLESS_PRE_CONSUME_SCRIPT`](configuration.md#PAPERLESS_PRE_CONSUME_SCRIPT) or [`PAPERLESS_POST_CONSUME_SCRIPT`](configuration.md#PAPERLESS_POST_CONSUME_SCRIPT).
|
||||||
|
|
||||||
!!! info
|
!!! info
|
||||||
|
|
||||||
@@ -121,7 +123,18 @@ Executed after the consumer sees a new document in the consumption
|
|||||||
folder, but before any processing of the document is performed. This
|
folder, but before any processing of the document is performed. This
|
||||||
script can access the following relevant environment variables set:
|
script can access the following relevant environment variables set:
|
||||||
|
|
||||||
- `DOCUMENT_SOURCE_PATH`
|
| Environment Variable | Description |
|
||||||
|
| ----------------------- | ------------------------------------------------------------ |
|
||||||
|
| `DOCUMENT_SOURCE_PATH` | Original path of the consumed document |
|
||||||
|
| `DOCUMENT_WORKING_PATH` | Path to a copy of the original that consumption will work on |
|
||||||
|
| `TASK_ID` | UUID of the task used to process the new document (if any) |
|
||||||
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
Pre-consume scripts which modify the document should only change
|
||||||
|
the `DOCUMENT_WORKING_PATH` file or a second consume task may
|
||||||
|
be triggered, leading to failures as two tasks work on the
|
||||||
|
same document path.
|
||||||
|
|
||||||
A simple but common example for this would be creating a simple script
|
A simple but common example for this would be creating a simple script
|
||||||
like this:
|
like this:
|
||||||
@@ -130,7 +143,7 @@ like this:
|
|||||||
|
|
||||||
```bash
|
```bash
|
||||||
#!/usr/bin/env bash
|
#!/usr/bin/env bash
|
||||||
pdf2pdfocr.py -i ${DOCUMENT_SOURCE_PATH}
|
pdf2pdfocr.py -i ${DOCUMENT_WORKING_PATH}
|
||||||
```
|
```
|
||||||
|
|
||||||
`/etc/paperless.conf`
|
`/etc/paperless.conf`
|
||||||
@@ -157,26 +170,37 @@ Executed after the consumer has successfully processed a document and
|
|||||||
has moved it into paperless. It receives the following environment
|
has moved it into paperless. It receives the following environment
|
||||||
variables:
|
variables:
|
||||||
|
|
||||||
- `DOCUMENT_ID`
|
| Environment Variable | Description |
|
||||||
- `DOCUMENT_FILE_NAME`
|
| ---------------------------- | ---------------------------------------------- |
|
||||||
- `DOCUMENT_CREATED`
|
| `DOCUMENT_ID` | Database primary key of the document |
|
||||||
- `DOCUMENT_MODIFIED`
|
| `DOCUMENT_FILE_NAME` | Formatted filename, not including paths |
|
||||||
- `DOCUMENT_ADDED`
|
| `DOCUMENT_CREATED` | Date & time when document was created |
|
||||||
- `DOCUMENT_SOURCE_PATH`
|
| `DOCUMENT_MODIFIED` | Date & time when document was last modified |
|
||||||
- `DOCUMENT_ARCHIVE_PATH`
|
| `DOCUMENT_ADDED` | Date & time when document was added |
|
||||||
- `DOCUMENT_THUMBNAIL_PATH`
|
| `DOCUMENT_SOURCE_PATH` | Path to the original document file |
|
||||||
- `DOCUMENT_DOWNLOAD_URL`
|
| `DOCUMENT_ARCHIVE_PATH` | Path to the generated archive file (if any) |
|
||||||
- `DOCUMENT_THUMBNAIL_URL`
|
| `DOCUMENT_THUMBNAIL_PATH` | Path to the generated thumbnail |
|
||||||
- `DOCUMENT_CORRESPONDENT`
|
| `DOCUMENT_DOWNLOAD_URL` | URL for document download |
|
||||||
- `DOCUMENT_TAGS`
|
| `DOCUMENT_THUMBNAIL_URL` | URL for the document thumbnail |
|
||||||
- `DOCUMENT_ORIGINAL_FILENAME`
|
| `DOCUMENT_CORRESPONDENT` | Assigned correspondent (if any) |
|
||||||
|
| `DOCUMENT_TAGS` | Comma separated list of tags applied (if any) |
|
||||||
|
| `DOCUMENT_ORIGINAL_FILENAME` | Filename of original document |
|
||||||
|
| `TASK_ID` | Task UUID used to import the document (if any) |
|
||||||
|
|
||||||
The script can be in any language, but for a simple shell script
|
The script can be in any language. A simple shell script example:
|
||||||
example, you can take a look at
|
|
||||||
[post-consumption-example.sh](https://github.com/paperless-ngx/paperless-ngx/blob/main/scripts/post-consumption-example.sh)
|
|
||||||
in this project.
|
|
||||||
|
|
||||||
The post consumption script cannot cancel the consumption process.
|
```bash title="post-consumption-example"
|
||||||
|
--8<-- "./scripts/post-consumption-example.sh"
|
||||||
|
```
|
||||||
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
The post consumption script cannot cancel the consumption process.
|
||||||
|
|
||||||
|
!!! warning
|
||||||
|
|
||||||
|
The post consumption script should not modify the document files
|
||||||
|
directly.
|
||||||
|
|
||||||
The script's stdout and stderr will be logged line by line to the
|
The script's stdout and stderr will be logged line by line to the
|
||||||
webserver log, along with the exit code of the script.
|
webserver log, along with the exit code of the script.
|
||||||
@@ -212,8 +236,8 @@ webserver:
|
|||||||
|
|
||||||
Troubleshooting:
|
Troubleshooting:
|
||||||
|
|
||||||
- Monitor the docker-compose log
|
- Monitor the Docker Compose log
|
||||||
`cd ~/paperless-ngx; docker-compose logs -f`
|
`cd ~/paperless-ngx; docker compose logs -f`
|
||||||
- Check your script's permission e.g. in case of permission error
|
- Check your script's permission e.g. in case of permission error
|
||||||
`sudo chmod 755 post-consumption-example.sh`
|
`sudo chmod 755 post-consumption-example.sh`
|
||||||
- Pipe your script's output to a log file e.g.
|
- Pipe your script's output to a log file e.g.
|
||||||
@@ -227,7 +251,7 @@ document. You will end up getting files like `0000123.pdf` in your media
|
|||||||
directory. This isn't necessarily a bad thing, because you normally
|
directory. This isn't necessarily a bad thing, because you normally
|
||||||
don't have to access these files manually. However, if you wish to name
|
don't have to access these files manually. However, if you wish to name
|
||||||
your files differently, you can do that by adjusting the
|
your files differently, you can do that by adjusting the
|
||||||
`PAPERLESS_FILENAME_FORMAT` configuration option. Paperless adds the
|
[`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) configuration option. Paperless adds the
|
||||||
correct file extension e.g. `.pdf`, `.jpg` automatically.
|
correct file extension e.g. `.pdf`, `.jpg` automatically.
|
||||||
|
|
||||||
This variable allows you to configure the filename (folders are allowed)
|
This variable allows you to configure the filename (folders are allowed)
|
||||||
@@ -288,6 +312,9 @@ Paperless provides the following placeholders within filenames:
|
|||||||
- `{added_month_name_short}`: Month added abbreviated name, as per
|
- `{added_month_name_short}`: Month added abbreviated name, as per
|
||||||
locale
|
locale
|
||||||
- `{added_day}`: Day added only (number 01-31).
|
- `{added_day}`: Day added only (number 01-31).
|
||||||
|
- `{owner_username}`: Username of document owner, if any, or "none"
|
||||||
|
- `{original_name}`: Document original filename, minus the extension, if any, or "none"
|
||||||
|
- `{doc_pk}`: The paperless identifier (primary key) for the document.
|
||||||
|
|
||||||
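The placeholder expansion described above can be illustrated with plain Python string formatting. This is only a sketch: the format string, document values, and output path below are invented examples, and Paperless's real renderer additionally sanitizes characters and substitutes defaults such as "none" for missing values.

```python
# Illustrative only: approximates how a PAPERLESS_FILENAME_FORMAT
# template expands into a storage path. Not Paperless's actual code.
FILENAME_FORMAT = "{created_year}/{correspondent}/{title}"

doc = {
    "created_year": "2022",
    "correspondent": "ACME Insurance",
    "title": "Statement January",
}

# Paperless appends the correct extension (e.g. ".pdf") automatically.
path = FILENAME_FORMAT.format(**doc) + ".pdf"
print(path)  # 2022/ACME Insurance/Statement January.pdf
```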
Paperless will try to conserve the information from your database as
|
Paperless will try to conserve the information from your database as
|
||||||
much as possible. However, some characters that you can use in document
|
much as possible. However, some characters that you can use in document
|
||||||
@@ -317,7 +344,7 @@ value.
|
|||||||
Paperless checks the filename of a document whenever it is saved.
|
Paperless checks the filename of a document whenever it is saved.
|
||||||
Therefore, you need to update the filenames of your documents and move
|
Therefore, you need to update the filenames of your documents and move
|
||||||
them after altering this setting by invoking the
|
them after altering this setting by invoking the
|
||||||
[`document renamer`](/administration#renamer).
|
[`document renamer`](administration.md#renamer).
|
||||||
|
|
||||||
!!! warning
|
!!! warning
|
||||||
|
|
||||||
@@ -336,16 +363,23 @@ value.
|
|||||||
However, keep in mind that inside docker, if files get stored outside of
|
However, keep in mind that inside docker, if files get stored outside of
|
||||||
the predefined volumes, they will be lost after a restart of paperless.
|
the predefined volumes, they will be lost after a restart of paperless.
|
||||||
|
|
||||||
|
!!! warning
|
||||||
|
|
||||||
|
With file name handling enabled, in particular when using `{tag_list}`,
|
||||||
|
you may run into the limits of your operating system's maximum
|
||||||
|
path lengths. Such files will retain their previous path instead,
|
||||||
|
and the issue will be logged.
|
||||||
|
|
||||||
## Storage paths
|
## Storage paths
|
||||||
|
|
||||||
One of the best things in Paperless is that you can not only access the
|
One of the best things in Paperless is that you can not only access the
|
||||||
documents via the web interface, but also via the file system.
|
documents via the web interface, but also via the file system.
|
||||||
|
|
||||||
When as single storage layout is not sufficient for your use case,
|
When a single storage layout is not sufficient for your use case,
|
||||||
storage paths come to the rescue. Storage paths allow you to configure
|
storage paths come to the rescue. Storage paths allow you to configure
|
||||||
more precisely where each document is stored in the file system.
|
more precisely where each document is stored in the file system.
|
||||||
|
|
||||||
- Each storage path is a `PAPERLESS_FILENAME_FORMAT` and
|
- Each storage path is a [`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) and
|
||||||
follows the rules described above
|
follows the rules described above
|
||||||
- Each document is assigned a storage path using the matching
|
- Each document is assigned a storage path using the matching
|
||||||
algorithms described above, but can be overwritten at any time
|
algorithms described above, but can be overwritten at any time
|
||||||
@@ -373,7 +407,7 @@ structure as in the previous example above.
|
|||||||
Statement January.pdf
|
Statement January.pdf
|
||||||
Statement February.pdf
|
Statement February.pdf
|
||||||
|
|
||||||
Insurances/ # Insurances
|
Insurances/ # Insurances
|
||||||
Healthcare 123/
|
Healthcare 123/
|
||||||
2022-01-01 Statement January.pdf
|
2022-01-01 Statement January.pdf
|
||||||
2022-02-02 Letter.pdf
|
2022-02-02 Letter.pdf
|
||||||
@@ -385,14 +419,7 @@ structure as in the previous example above.
|
|||||||
!!! tip
|
!!! tip
|
||||||
|
|
||||||
Defining a storage path is optional. If no storage path is defined for a
|
Defining a storage path is optional. If no storage path is defined for a
|
||||||
document, the global `PAPERLESS_FILENAME_FORMAT` is applied.
|
document, the global [`PAPERLESS_FILENAME_FORMAT`](configuration.md#PAPERLESS_FILENAME_FORMAT) is applied.
|
||||||
|
|
||||||
!!! warning
|
|
||||||
|
|
||||||
If you adjust the format of an existing storage path, old documents
|
|
||||||
don't get relocated automatically. You need to run the
|
|
||||||
[document renamer](/administration#renamer) to
|
|
||||||
adjust their paths.
|
|
||||||
|
|
||||||
## Celery Monitoring {#celery-monitoring}
|
## Celery Monitoring {#celery-monitoring}
|
||||||
|
|
||||||
@@ -465,7 +492,7 @@ database to be case sensitive. This would prevent a user from creating a
|
|||||||
tag `Name` and `NAME` as they are considered the same.
|
tag `Name` and `NAME` as they are considered the same.
|
||||||
|
|
||||||
Per Django documentation, to enable this requires manual intervention.
|
Per Django documentation, to enable this requires manual intervention.
|
||||||
To enable case sensetive tables, you can execute the following command
|
To enable case sensitive tables, you can execute the following command
|
||||||
against each table:
|
against each table:
|
||||||
|
|
||||||
`ALTER TABLE <table_name> CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`
|
`ALTER TABLE <table_name> CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`
|
||||||
@@ -474,3 +501,121 @@ You can also set the default for new tables (this does NOT affect
|
|||||||
existing tables) with:
|
existing tables) with:
|
||||||
|
|
||||||
`ALTER DATABASE <db_name> CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`
|
`ALTER DATABASE <db_name> CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;`
|
||||||
|
|
||||||
|
!!! warning
|
||||||
|
|
||||||
|
Using MariaDB version 10.4+ is recommended. Using the `utf8mb3` character set on
|
||||||
|
an older system may fix issues that can arise while setting up Paperless-ngx but
|
||||||
|
`utf8mb3` can cause issues with consumption (where `utf8mb4` does not).
|
||||||
|
|
||||||
|
## Barcodes {#barcodes}
|
||||||
|
|
||||||
|
Paperless is able to utilize barcodes for automatically performing some tasks.
|
||||||
|
|
||||||
|
At this time, the library utilized for detection of barcodes supports the following types:
|
||||||
|
|
||||||
|
- EAN-13/UPC-A
|
||||||
|
- UPC-E
|
||||||
|
- EAN-8
|
||||||
|
- Code 128
|
||||||
|
- Code 93
|
||||||
|
- Code 39
|
||||||
|
- Codabar
|
||||||
|
- Interleaved 2 of 5
|
||||||
|
- QR Code
|
||||||
|
- SQ Code
|
||||||
|
|
||||||
|
You may check for updates on the [zbar library homepage](https://github.com/mchehab/zbar).
|
||||||
|
For usage in Paperless, the type of barcode does not matter, only the contents of it.
|
||||||
|
|
||||||
|
For how to enable barcode usage, see [the configuration](configuration.md#barcodes).
|
||||||
|
The two settings may be enabled independently, but do have interactions as explained
|
||||||
|
below.
|
||||||
|
|
||||||
|
### Document Splitting {#document-splitting}
|
||||||
|
|
||||||
|
When enabled, Paperless will look for a barcode with the configured value and create a new document
|
||||||
|
starting from the next page. The page with the barcode on it will _not_ be retained. It
|
||||||
|
is expected to be a page existing only for triggering the split.
|
||||||
|
|
||||||
|
### Archive Serial Number Assignment
|
||||||
|
|
||||||
|
When enabled, the value of the barcode (as an integer) will be used to set the document's
|
||||||
|
archive serial number, allowing quick reference back to the original, paper document.
|
||||||
|
|
||||||
|
If document splitting via barcode is also enabled, documents will be split when an ASN
|
||||||
|
barcode is located. However, differing from the splitting, the page with the
|
||||||
|
barcode _will_ be retained. This allows application of a barcode to any page, including
|
||||||
|
one which holds data to keep in the document.
|
||||||
|
|
||||||
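The two barcode behaviors described above can be sketched as a single page-partitioning step: a split-barcode page starts a new document and is discarded, while an ASN-barcode page starts a new document and is retained. The function and page numbers below are purely illustrative, not Paperless's actual implementation.

```python
# Hypothetical sketch of barcode-driven splitting. Split-marker pages
# are dropped; ASN-barcode pages are kept in the document they start.
def split_pages(pages, split_marks, asn_marks):
    docs, current = [], []
    for page in pages:
        if page in split_marks:
            if current:
                docs.append(current)
            current = []          # the split marker page itself is discarded
        elif page in asn_marks:
            if current:
                docs.append(current)
            current = [page]      # the ASN page is retained in the new document
        else:
            current.append(page)
    if current:
        docs.append(current)
    return docs

# Pages 1-6 with a split marker on page 3 and an ASN barcode on page 5:
print(split_pages([1, 2, 3, 4, 5, 6], {3}, {5}))
# [[1, 2], [4], [5, 6]]
```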
|
## Automatic collation of double-sided documents {#collate}
|
||||||
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
If your scanner supports double-sided scanning natively, you do not need this feature.
|
||||||
|
|
||||||
|
This feature is turned off by default, see [configuration](configuration.md#collate) on how to turn it on.
|
||||||
|
|
||||||
|
### Summary
|
||||||
|
|
||||||
|
If you have a scanner with an automatic document feeder (ADF) that only scans a single side,
|
||||||
|
this feature makes scanning double-sided documents much more convenient by automatically
|
||||||
|
collating two separate scans into one document, reordering the pages as necessary.
|
||||||
|
|
||||||
|
### Usage example
|
||||||
|
|
||||||
|
Suppose you have a double-sided document with 6 pages (3 sheets of paper). First,
|
||||||
|
put the stack into your ADF as normal, ensuring that page 1 is scanned first. Your ADF
|
||||||
|
will now scan pages 1, 3, and 5. Then you (or your scanner, if it supports it) upload
|
||||||
|
the scan into the correct sub-directory of the consume folder (`double-sided` by default;
|
||||||
|
keep in mind that Paperless will _not_ automatically create the directory for you.)
|
||||||
|
Paperless will then process the scan and move it into an internal staging area.
|
||||||
|
|
||||||
|
The next step is to turn your stack upside down (without reordering the sheets of paper),
|
||||||
|
and scan it once again; your ADF will now scan pages 6, 4, and 2, in that order. Once this
|
||||||
|
scan is copied into the sub-directory, Paperless will collate the previous scan with the
|
||||||
|
new one, reversing the order of the pages on the second, "even numbered" scan. The
|
||||||
|
resulting document will have the pages 1-6 in the correct order, and this new file will
|
||||||
|
then be processed as normal.
|
||||||
|
|
||||||
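The reordering step in the example above amounts to reversing the second scan and interleaving it with the first. The following is a minimal sketch of that logic under the stated assumptions (odd pages scanned in order, even pages scanned in reverse), not Paperless's actual collation code.

```python
# Sketch of collation: the first scan holds odd pages in order (1, 3, 5),
# the second holds even pages in reverse order (6, 4, 2), so it is
# reversed before interleaving.
def collate(odd_scan, even_scan):
    evens = list(reversed(even_scan))
    pages = []
    for i, odd in enumerate(odd_scan):
        pages.append(odd)
        if i < len(evens):          # trailing empty pages may be omitted
            pages.append(evens[i])
    return pages

print(collate([1, 3, 5], [6, 4, 2]))  # [1, 2, 3, 4, 5, 6]
```

Note the guard on `evens`: as the tip below the usage example says, trailing empty even pages may be left out, so `collate([1, 3, 5], [4, 2])` still yields pages 1 through 5 in order.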
|
!!! tip
|
||||||
|
|
||||||
|
When scanning the even numbered pages, you can omit the last empty pages, if there are
|
||||||
|
any. For example, if page 6 is empty, you only need to scan pages 2 and 4. _Do not_ omit
|
||||||
|
empty pages in the middle of the document.
|
||||||
|
|
||||||
|
### Things that could go wrong
|
||||||
|
|
||||||
|
Paperless will notice when the first, "odd numbered" scan has fewer pages than the second
|
||||||
|
scan (this can happen when e.g. the ADF skipped a few pages in the first pass). In that
|
||||||
|
case, Paperless will remove the staging copy as well as the scan, and give you an error
|
||||||
|
message asking you to restart the process from scratch, by scanning the odd pages again,
|
||||||
|
followed by the even pages.
|
||||||
|
|
||||||
|
It's important that the scan files get consumed in the correct order, and one at a time.
|
||||||
|
You therefore need to make sure that Paperless is running while you upload the files into
|
||||||
|
the directory; and if you're using [polling](configuration.md#polling), make sure that
|
||||||
|
`CONSUMER_POLLING` is set to a value lower than it takes for the second scan to appear,
|
||||||
|
like 5-10 or even lower.
|
||||||
|
|
||||||
|
Another thing that might happen is that you start a double sided scan, but then forget
|
||||||
|
to upload the second file. To avoid collating the wrong documents if you then come back
|
||||||
|
a day later to scan a new double-sided document, Paperless will only keep an "odd numbered
|
||||||
|
pages" file for up to 30 minutes. If more time passes, it will consider the next incoming
|
||||||
|
scan a completely new "odd numbered pages" one. The old staging file will get discarded.
|
||||||
|
|
||||||
|
### Interaction with "subdirs as tags"
|
||||||
|
|
||||||
|
The collation feature can be used together with the [subdirs as tags](configuration.md#consume_config)
|
||||||
|
feature (but this is not a requirement). Just create a correctly named double-sided subdir
|
||||||
|
in the hierarchy and upload your scans there. For example, both `double-sided/foo/bar` as
|
||||||
|
well as `foo/bar/double-sided` will cause the collated document to be treated as if it
|
||||||
|
were uploaded into `foo/bar` and receive both `foo` and `bar` tags, but not `double-sided`.
|
||||||
|
|
||||||
|
### Interaction with document splitting
|
||||||
|
|
||||||
|
You can use the [document splitting](#document-splitting) feature, but if you use a normal
|
||||||
|
single-sided split marker page, the split document(s) will have an empty page at the front (or
|
||||||
|
whatever else was on the backside of the split marker page.) You can work around that by having
|
||||||
|
a split marker page that has the split barcode on _both_ sides. This way, the extra page will
|
||||||
|
get automatically removed.
|
||||||
|
79
docs/api.md
@@ -6,7 +6,7 @@ provides a browsable API for most of its endpoints, which you can
|
|||||||
inspect at `http://<paperless-host>:<port>/api/`. This also documents
|
inspect at `http://<paperless-host>:<port>/api/`. This also documents
|
||||||
most of the available filters and ordering fields.
|
most of the available filters and ordering fields.
|
||||||
|
|
||||||
The API provides 5 main endpoints:
|
The API provides the following main endpoints:
|
||||||
|
|
||||||
- `/api/documents/`: Full CRUD support, except POSTing new documents.
|
- `/api/documents/`: Full CRUD support, except POSTing new documents.
|
||||||
See below.
|
See below.
|
||||||
@@ -14,12 +14,18 @@ The API provides 5 main endpoints:
|
|||||||
- `/api/document_types/`: Full CRUD support.
|
- `/api/document_types/`: Full CRUD support.
|
||||||
- `/api/logs/`: Read-Only.
|
- `/api/logs/`: Read-Only.
|
||||||
- `/api/tags/`: Full CRUD support.
|
- `/api/tags/`: Full CRUD support.
|
||||||
|
- `/api/tasks/`: Read-only.
|
||||||
- `/api/mail_accounts/`: Full CRUD support.
|
- `/api/mail_accounts/`: Full CRUD support.
|
||||||
- `/api/mail_rules/`: Full CRUD support.
|
- `/api/mail_rules/`: Full CRUD support.
|
||||||
|
- `/api/users/`: Full CRUD support.
|
||||||
|
- `/api/groups/`: Full CRUD support.
|
||||||
|
- `/api/share_links/`: Full CRUD support.
|
||||||
|
- `/api/custom_fields/`: Full CRUD support.
|
||||||
|
- `/api/profile/`: GET, PATCH
|
||||||
|
|
||||||
All of these endpoints except for the logging endpoint allow you to
|
All of these endpoints except for the logging endpoint allow you to
|
||||||
fetch, edit and delete individual objects by appending their primary key
|
fetch (and edit and delete where appropriate) individual objects by
|
||||||
to the path, for example `/api/documents/454/`.
|
appending their primary key to the path, e.g. `/api/documents/454/`.
|
||||||
|
|
||||||
The objects served by the document endpoint contain the following
|
The objects served by the document endpoint contain the following
|
||||||
fields:
|
fields:
|
||||||
@@ -44,6 +50,11 @@ fields:
|
|||||||
Read-only.
|
Read-only.
|
||||||
- `archived_file_name`: Verbose filename of the archived document.
|
- `archived_file_name`: Verbose filename of the archived document.
|
||||||
Read-only. Null if no archived document is available.
|
Read-only. Null if no archived document is available.
|
||||||
|
- `notes`: Array of notes associated with the document.
|
||||||
|
- `set_permissions`: Allows setting document permissions. Optional,
|
||||||
|
write-only. See [below](#permissions).
|
||||||
|
- `custom_fields`: Array of custom fields & values, specified as
|
||||||
|
`{ field: CUSTOM_FIELD_ID, value: VALUE }`
|
||||||
|
|
||||||
## Downloading documents
|
## Downloading documents
|
||||||
|
|
||||||
@@ -119,6 +130,11 @@ File metadata is reported as a list of objects in the following form:
|
|||||||
depends on the file type and the metadata available in that specific
|
depends on the file type and the metadata available in that specific
|
||||||
document. Paperless only reports PDF metadata at this point.
|
document. Paperless only reports PDF metadata at this point.
|
||||||
|
|
||||||
|
## Documents additional endpoints
|
||||||
|
|
||||||
|
- `/api/documents/<id>/notes/`: Retrieve notes for a document.
|
||||||
|
- `/api/documents/<id>/share_links/`: Retrieve share links for a document.
|
||||||
|
|
||||||
## Authorization
|
## Authorization
|
||||||
|
|
||||||
The REST API provides three different forms of authentication.
|
The REST API provides three different forms of authentication.
|
||||||
@@ -142,6 +158,10 @@ The REST api provides three different forms of authentication.
|
|||||||
|
|
||||||
3. Token authentication
|
3. Token authentication
|
||||||
|
|
||||||
|
You can create (or re-create) an API token by opening the "My Profile"
|
||||||
|
link in the user dropdown found in the web UI and clicking the circular
|
||||||
|
arrow button.
|
||||||
|
|
||||||
Paperless also offers an endpoint to acquire authentication tokens.
|
Paperless also offers an endpoint to acquire authentication tokens.
|
||||||
|
|
||||||
POST a username and password as a form or json string to
|
POST a username and password as a form or json string to
|
||||||
@@ -153,7 +173,7 @@ The REST api provides three different forms of authentication.
|
|||||||
Authorization: Token <token>
|
Authorization: Token <token>
|
||||||
```
|
```
|
||||||
|
|
||||||
Tokens can be managed and revoked in the paperless admin.
|
Tokens can also be managed in the Django admin.
|
||||||
|
|
||||||
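Once a token is obtained, it is attached to each request in the `Authorization` header using exactly the format shown above. A minimal sketch with the standard library follows; the host, port, and token value are placeholders, and the request is only constructed, never sent.

```python
import urllib.request

# Placeholder values -- substitute your own host and API token.
TOKEN = "deadbeef"
req = urllib.request.Request(
    "http://localhost:8000/api/documents/",
    headers={"Authorization": f"Token {TOKEN}"},
)

# The header as it would be transmitted with the request:
print(req.get_header("Authorization"))  # Token deadbeef
```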
## Searching for documents
|
## Searching for documents
|
||||||
|
|
||||||
@@ -162,7 +182,7 @@ specific query parameters cause the API to return full text search
|
|||||||
results:
|
results:
|
||||||
|
|
||||||
- `/api/documents/?query=your%20search%20query`: Search for a document
|
- `/api/documents/?query=your%20search%20query`: Search for a document
|
||||||
using a full text query. For details on the syntax, see [Basic Usage - Searching](/usage#basic-usage_searching).
|
using a full text query. For details on the syntax, see [Basic Usage - Searching](usage.md#basic-usage_searching).
|
||||||
- `/api/documents/?more_like=1234`: Search for documents similar to
|
- `/api/documents/?more_like=1234`: Search for documents similar to
|
||||||
the document with id 1234.
|
the document with id 1234.
|
||||||
|
|
||||||
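The two search query parameters above are ordinary URL query strings, so they can be built with standard URL encoding. The host and port below are placeholders for your own instance.

```python
from urllib.parse import urlencode

# Placeholder base URL -- substitute your paperless host and port.
base = "http://localhost:8000/api/documents/"

# Full text search query (spaces are percent/plus encoded automatically):
query_url = base + "?" + urlencode({"query": "your search query"})

# "More like this" search for documents similar to document 1234:
similar_url = base + "?" + urlencode({"more_like": 1234})

print(query_url)    # http://localhost:8000/api/documents/?query=your+search+query
print(similar_url)  # http://localhost:8000/api/documents/?more_like=1234
```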
@@ -254,11 +274,52 @@ The endpoint supports the following optional form fields:
|
|||||||
- `document_type`: Similar to correspondent.
|
- `document_type`: Similar to correspondent.
|
||||||
- `tags`: Similar to correspondent. Specify this multiple times to
|
- `tags`: Similar to correspondent. Specify this multiple times to
|
||||||
have multiple tags added to the document.
|
have multiple tags added to the document.
|
||||||
|
- `archive_serial_number`: An optional archive serial number to set.
|
||||||
|
|
||||||
The endpoint will immediately return "OK" if the document consumption
|
The endpoint will immediately return HTTP 200 if the document consumption
|
||||||
process was started successfully. No additional status information about
|
process was started successfully, with the UUID of the consumption task
|
||||||
the consumption process itself is available, since that happens in a
|
as the data. No additional status information about the consumption process
|
||||||
different process.
|
itself is available immediately, since that happens in a different process.
|
||||||
|
However, querying the tasks endpoint with the returned UUID e.g.
|
||||||
|
`/api/tasks/?task_id={uuid}` will provide information on the state of the
|
||||||
|
consumption including the ID of a created document if consumption succeeded.
|
||||||
|
|
||||||
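The upload-then-poll flow described above can be sketched as a small helper that reads the task state out of a tasks-endpoint response. The helper name and the exact response fields (`status`, `related_document`) are loose approximations of the `/api/tasks/` payload, used here only for illustration.

```python
# Hypothetical helper: POSTing a document returns a task UUID; querying
# /api/tasks/?task_id={uuid} returns a list with that task's state.
def consumption_result(task_response):
    """Return (status, document_id) from a task query result."""
    if not task_response:
        return ("UNKNOWN", None)
    task = task_response[0]
    return (task.get("status"), task.get("related_document"))

# Example payload shape as it might look for a finished consumption:
tasks = [{"task_id": "abc-123", "status": "SUCCESS", "related_document": "42"}]
print(consumption_result(tasks))  # ('SUCCESS', '42')
```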
|
## Permissions
|
||||||
|
|
||||||
|
All objects (documents, tags, etc.) allow setting object-level permissions
|
||||||
|
with optional `owner` and / or a `set_permissions` parameters which are of
|
||||||
|
the form:
|
||||||
|
|
||||||
|
```
|
||||||
|
"owner": ...,
|
||||||
|
"set_permissions": {
|
||||||
|
"view": {
|
||||||
|
"users": [...],
|
||||||
|
"groups": [...],
|
||||||
|
},
|
||||||
|
"change": {
|
||||||
|
"users": [...],
|
||||||
|
"groups": [...],
|
||||||
|
},
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
Arrays should contain user or group ID numbers.
|
||||||
|
|
||||||
|
If these parameters are supplied the object's permissions will be overwritten,
|
||||||
|
assuming the authenticated user has permission to do so (the user must be
|
||||||
|
the object owner or a superuser).
|
||||||
|
|
||||||
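As a concrete sketch of the structure above: the payload is ordinary JSON, and per the note, the `users` and `groups` arrays carry ID numbers rather than names. The IDs below are arbitrary example values.

```python
import json

# Example set_permissions payload. User and group IDs (1, 2, 3) are
# placeholders -- the arrays must contain ID numbers, not names.
payload = {
    "owner": 1,
    "set_permissions": {
        "view": {"users": [2, 3], "groups": [1]},
        "change": {"users": [2], "groups": []},
    },
}

body = json.dumps(payload)
print(json.loads(body)["set_permissions"]["view"]["users"])  # [2, 3]
```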
|
### Retrieving full permissions
|
||||||
|
|
||||||
|
By default, the API will return a truncated version of object-level
|
||||||
|
permissions, returning `user_can_change` indicating whether the current user
|
||||||
|
can edit the object (either because they are the object owner or have permissions
|
||||||
|
granted). You can pass the parameter `full_perms=true` to API calls to view the
|
||||||
|
full permissions of objects in a format that mirrors the `set_permissions`
|
||||||
|
parameter above.
|
||||||
|
|
||||||
## API Versioning
|
## API Versioning
|
||||||
|
|
||||||
|
@@ -20,6 +20,28 @@
|
|||||||
margin-left: 4%;
|
margin-left: 4%;
|
||||||
float: left;
|
float: left;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.grid-flipped-left {
|
||||||
|
width: 66%;
|
||||||
|
float: left;
|
||||||
|
}
|
||||||
|
|
||||||
|
.grid-flipped-right {
|
||||||
|
width: 29%;
|
||||||
|
margin-left: 4%;
|
||||||
|
float: left;
|
||||||
|
}
|
||||||
|
|
||||||
|
.grid-half-left {
|
||||||
|
width: 48%;
|
||||||
|
float: left;
|
||||||
|
}
|
||||||
|
|
||||||
|
.grid-half-right {
|
||||||
|
width: 48%;
|
||||||
|
margin-left: 4%;
|
||||||
|
float: left;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
.grid-left > p {
|
.grid-left > p {
|
||||||
@@ -31,6 +53,48 @@
|
|||||||
margin: 0;
|
margin: 0;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
.clear {
|
||||||
|
clear: both;
|
||||||
|
margin-bottom: 20px;
|
||||||
|
display: block;
|
||||||
|
}
|
||||||
|
|
||||||
.index-callout {
|
.index-callout {
|
||||||
margin-right: .5rem;
|
margin-right: .5rem;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/* make code in headers not bold */
|
||||||
|
h4 code {
|
||||||
|
font-weight: normal;
|
||||||
|
}
|
||||||
|
|
||||||
|
/* Hide config vars from sidebar, toc and move the border on mobile case their hidden */
|
||||||
|
.md-nav.md-nav--secondary .md-nav__item .md-nav__link[href*="PAPERLESS_"],
|
||||||
|
.md-nav.md-nav--secondary .md-nav__item .md-nav__link[href*="USERMAP_"] {
|
||||||
|
display: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
@media screen and (max-width: 76.1875em) {
|
||||||
|
.md-nav--primary .md-nav__item {
|
||||||
|
border-top: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
.md-nav--primary .md-nav__link {
|
||||||
|
border-top: .05rem solid var(--md-default-fg-color--lightest);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/* Show search shortcut key */
|
||||||
|
[data-md-toggle="search"]:not(:checked) ~ .md-header .md-search__form::after {
|
||||||
|
position: absolute;
|
||||||
|
top: .3rem;
|
||||||
|
right: .3rem;
|
||||||
|
display: block;
|
||||||
|
padding: .1rem .4rem;
|
||||||
|
color: var(--md-default-fg-color--lighter);
|
||||||
|
font-weight: bold;
|
||||||
|
font-size: .8rem;
|
||||||
|
border: .05rem solid var(--md-default-fg-color--lighter);
|
||||||
|
border-radius: .1rem;
|
||||||
|
content: "/";
|
||||||
|
}
|
||||||
|
Binary screenshot changes:
- docs/assets/screenshots/consumption_template.png (new file)
- docs/assets/screenshots/custom_field1.png (new file)
- docs/assets/screenshots/custom_field2.png (new file)
- docs/assets/screenshots/documents-smallcards-slimsidebar.png (new file)
- docs/assets/screenshots/mobile1.png (new file)
- docs/assets/screenshots/mobile2.png (new file)
- docs/assets/screenshots/mobile3.png (new file)
- docs/assets/screenshots/new-correspondent.png (new file)
- docs/assets/screenshots/new-document_type.png (new file)
- docs/assets/screenshots/new-storage_path.png (new file)
- docs/assets/screenshots/permissions_document.png (new file)
- docs/assets/screenshots/permissions_global.png (new file)
- plus size updates to several existing screenshots and one removed screenshot
1419
docs/changelog.md
@@ -1,18 +1,18 @@
|
|||||||
# Development
|
# Development
|
||||||
|
|
||||||
This section describes the steps you need to take to start development
|
This section describes the steps you need to take to start development
|
||||||
on paperless-ngx.
|
on Paperless-ngx.
|
||||||
|
|
||||||
Check out the source from github. The repository is organized in the
|
Check out the source from GitHub. The repository is organized in the
|
||||||
following way:
|
following way:
|
||||||
|
|
||||||
- `main` always represents the latest release and will only see
|
- `main` always represents the latest release and will only see
|
||||||
changes when a new release is made.
|
changes when a new release is made.
|
||||||
- `dev` contains the code that will be in the next release.
|
- `dev` contains the code that will be in the next release.
|
||||||
- `feature-X` contain bigger changes that will be in some release, but
|
- `feature-X` contains bigger changes that will be in some release, but
|
||||||
not necessarily the next one.
|
not necessarily the next one.
|
||||||
|
|
||||||
When making functional changes to paperless, _always_ make your changes
|
When making functional changes to Paperless-ngx, _always_ make your changes
|
||||||
on the `dev` branch.
|
on the `dev` branch.
|
||||||
|
|
||||||
Apart from that, the folder structure is as follows:
|
Apart from that, the folder structure is as follows:
|
||||||
@@ -24,9 +24,9 @@ Apart from that, the folder structure is as follows:
|
|||||||
development.
|
development.
|
||||||
- `docker/` - Files required to build the docker image.
|
- `docker/` - Files required to build the docker image.
|
||||||
|
|
||||||
## Contributing to Paperless
|
## Contributing to Paperless-ngx
|
||||||
|
|
||||||
Maybe you've been using Paperless for a while and want to add a feature
|
Maybe you've been using Paperless-ngx for a while and want to add a feature
|
||||||
or two, or maybe you've come across a bug that you have some ideas how
|
or two, or maybe you've come across a bug that you have some ideas how
|
||||||
to solve. The beauty of open source software is that you can see what's
|
to solve. The beauty of open source software is that you can see what's
|
||||||
wrong and help to get it fixed for everyone!
|
wrong and help to get it fixed for everyone!
|
||||||
@@ -36,13 +36,13 @@ conduct](https://github.com/paperless-ngx/paperless-ngx/blob/main/CODE_OF_CONDUC
|
|||||||
and other important information in the [contributing
|
and other important information in the [contributing
|
||||||
guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).
|
guidelines](https://github.com/paperless-ngx/paperless-ngx/blob/main/CONTRIBUTING.md).
|
||||||
|
|
||||||
## Code formatting with pre-commit Hooks
|
## Code formatting with pre-commit hooks
|
||||||
|
|
||||||
To ensure a consistent style and formatting across the project source,
|
To ensure a consistent style and formatting across the project source,
|
||||||
the project utilizes a Git [`pre-commit`](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks)
|
the project utilizes Git [`pre-commit`](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks)
|
||||||
hook to perform some formatting and linting before a commit is allowed.
|
hooks to perform some formatting and linting before a commit is allowed.
|
||||||
That way, everyone uses the same style and some common issues can be caught
|
That way, everyone uses the same style and some common issues can be caught
|
||||||
early on. See below for installation instructions.
|
early on.
|
||||||
|
|
||||||
Once installed, hooks will run when you commit. If the formatting isn't
|
Once installed, hooks will run when you commit. If the formatting isn't
|
||||||
quite right or a linter catches something, the commit will be rejected.
|
quite right or a linter catches something, the commit will be rejected.
|
||||||
@@ -51,129 +51,115 @@ as the Python formatting tool `black`, will format failing
|
|||||||
files, so all you need to do is `git add` those files again
|
files, so all you need to do is `git add` those files again
|
||||||
and retry your commit.
|
and retry your commit.
|
||||||
|
|
||||||
## Initial setup and first start
|
## General setup
|
||||||
|
|
||||||
After you forked and cloned the code from github you need to perform a
|
After you forked and cloned the code from GitHub you need to perform a
|
||||||
first-time setup. To do the setup you need to perform the steps from the
|
first-time setup.
|
||||||
following chapters in a certain order:
|
|
||||||
|
!!! note
|
||||||
|
|
||||||
|
Every command is executed directly from the root folder of the project unless specified otherwise.
|
||||||
|
|
||||||
1. Install prerequisites + pipenv as mentioned in
|
1. Install prerequisites + pipenv as mentioned in
|
||||||
[Bare metal route](/setup#bare_metal)
|
[Bare metal route](setup.md#bare_metal).
|
||||||
|
|
||||||
2. Copy `paperless.conf.example` to `paperless.conf` and enable debug
|
2. Copy `paperless.conf.example` to `paperless.conf` and enable debug
|
||||||
mode.
|
mode within the file via `PAPERLESS_DEBUG=true`.
|
||||||
|
|
||||||
3. Install the Angular CLI interface:
|
3. Create `consume` and `media` directories:
|
||||||
|
|
||||||
```shell-session
|
```bash
|
||||||
$ npm install -g @angular/cli
|
$ mkdir -p consume media
|
||||||
```
|
```
|
||||||
|
|
||||||
4. Install pre-commit hooks
|
4. Install the Python dependencies:
|
||||||
|
|
||||||
```shell-session
|
```bash
|
||||||
pre-commit install
|
$ pipenv install --dev
|
||||||
```
|
```
|
||||||
|
|
||||||
5. Create `consume` and `media` folders in the cloned root folder.
|
!!! note
|
||||||
|
|
||||||
```shell-session
|
Using a virtual environment is highly recommended. You can spawn one via `pipenv shell`.
|
||||||
mkdir -p consume media
|
Make sure you're using Python 3.10.x or lower. Otherwise you might
|
||||||
|
get issues with building dependencies. You can use
|
||||||
|
[pyenv](https://github.com/pyenv/pyenv) to install a specific
|
||||||
|
Python version.
|
||||||
|
|
||||||
|
5. Install pre-commit hooks:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
$ pre-commit install
|
||||||
```
|
```
|
||||||
|
|
||||||
6. Apply migrations and create a superuser for your development instance:

    ```bash
    # src/
    $ python3 manage.py migrate
    $ python3 manage.py createsuperuser
    ```

7. You can now either ...

    - install redis or

    - use the included `scripts/start_services.sh` to use docker to fire
      up a redis instance (and some other services such as tika,
      gotenberg and a database server) or

    - spin up a bare redis container

        ```bash
        $ docker run -d -p 6379:6379 --restart unless-stopped redis:latest
        ```

8. Continue with either back-end or front-end development – or both :-).

## Back end development

The back end is a [Django](https://www.djangoproject.com/) application.
[PyCharm](https://www.jetbrains.com/de-de/pycharm/) as well as
[Visual Studio Code](https://code.visualstudio.com) work well for
development, but you can use whatever you want.

Configure the IDE to use the `src/` folder as the base source folder.
Configure the following launch configurations in your IDE:

- `python3 manage.py runserver`
- `python3 manage.py document_consumer`
- `celery --app paperless worker -l DEBUG` (or any other log level)

To start them all:

```bash
# src/
$ python3 manage.py runserver & \
  python3 manage.py document_consumer & \
  celery --app paperless worker -l DEBUG
```

You might need the front end to test your back end code.
This assumes that you have Angular installed on your system.
Go to the [Front end development](#front-end-development) section for further details.
To build the front end once, use this command:

```bash
# src-ui/
$ npm install
$ ng build --configuration production
```

### Testing

- Run `pytest` in the `src/` directory to execute all tests. This also
  generates an HTML coverage report. When running tests, `paperless.conf`
  is loaded as well. However, the tests rely on the default
  configuration. This is not ideal, but for now, make sure no settings
  except for `DEBUG` are overridden when testing.

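A test in this suite is a plain pytest-style function. The sketch below is purely illustrative — the helper and its behavior are invented for the example, not taken from the Paperless-ngx test suite:

```python
# test_example.py -- hypothetical example of a pytest-style test.
# The helper below is invented for illustration; real tests exercise
# the actual application code under src/.

def normalize_title(title: str) -> str:
    """Collapse runs of whitespace and lowercase a document title."""
    return " ".join(title.split()).lower()

def test_normalize_title():
    assert normalize_title("  Tax   Report 2023 ") == "tax report 2023"
```

pytest discovers any `test_*.py` file automatically, so running `pytest` picks this up without extra configuration.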
### Code formatting with pre-commit hooks

Coding style is enforced by the Git pre-commit hooks. These will
ensure your code is formatted and do some linting when you do a `git commit`.
The hooks may modify files and interact with each other; it may take a
couple of `git add`, `git commit` cycles to satisfy them.

!!! note

    The line length rule E501 is generally useful for getting multiple
    source files next to each other on the screen. However, in some cases,
    it is just not possible to make some lines fit. Append `# noqa: E501`
    to disable this check for those lines.

## Front end development

The front end is built using Angular. In order to get started, you need Node.js (version 14.15+) and
`npm`.

!!! note

    The following commands are all performed in the `src-ui` directory. You will need a running back end (including an active session) to connect to the back end API. To spin it up, refer to the commands under the section [above](#back-end-development).

1. Install the Angular CLI. You might need sudo privileges to perform this command:

    ```bash
    $ npm install -g @angular/cli
    ```

2. Make sure that it's on your path.

3. Install all necessary modules:

    ```bash
    $ npm install
    ```

4. You can launch a development server by running:

    ```bash
    $ ng serve
    ```

    This will automatically update whenever you save. However, in-place
    compilation might fail on syntax errors, in which case you need to
    restart it.

    By default, the development server is available on `http://localhost:4200/`
    and is configured to access the API at `http://localhost:8000/api/`,
    which is the default of the back end. If you enabled `DEBUG` on the
    back end, several security overrides for allowed hosts, CORS and
    X-Frame-Options are in place so that the front end behaves exactly as
    in production.

### Testing and code style

The front end code (.ts, .html, .scss) uses `prettier` for code
formatting via the Git `pre-commit` hooks which run automatically on
commit. See [above](#code-formatting-with-pre-commit-hooks) for
installation instructions. You can also run this via the CLI with a
command such as:

```bash
$ git ls-files -- '*.ts' | xargs pre-commit run prettier --files
```

Front end testing uses Jest and Playwright. Unit tests and e2e tests,
respectively, can be run non-interactively with:

```bash
$ ng test
$ npx playwright test
```

Playwright also includes a UI which can be run with:

```bash
$ npx playwright test --ui
```

### Building the front end

In order to build the front end and serve it as part of Django, execute:

```bash
# src-ui/
$ ng build --configuration production
```

This will build the front end and put it in a location from which the
back end can serve it as static content, so you can also verify
that authentication is working.

## Localization

Paperless-ngx is available in many different languages. Since Paperless-ngx
consists both of a Django application and an Angular front end, both
these parts have to be translated separately.

### Front end localization

- The Angular front end does localization according to the [Angular
  documentation](https://angular.io/guide/i18n).
- The source language of the project is "en_US".
- The source strings end up in the file `src-ui/messages.xlf`.
- The translated strings need to be placed in the
  `src-ui/src/locale/` folder.
- In order to extract added or changed strings from the source files,
  call `ng extract-i18n`.

Adding new languages requires adding the translated files in the
`src-ui/src/locale/` folder and adjusting a couple of files.

1. Adjust `src-ui/angular.json`:

    ```json
    "i18n": {
    ```

2. Add the language to the available options in
    `src-ui/src/app/services/settings.service.ts`:

    ```typescript
    getLanguageOptions(): LanguageOption[] {
    ```

3. Import and register the Angular data for this locale in
    `src-ui/src/app/app.module.ts`:

    ```typescript
    import localeDe from '@angular/common/locales/de'
    ```

### Back end localization

A majority of the strings that appear in the back end appear only when
the admin is used. However, some of these are still shown on the front
end (such as error messages).

- The Django application does localization according to the [Django
  documentation](https://docs.djangoproject.com/en/3.1/topics/i18n/translation/).
- The source language of the project is "en_US".
- Localization files end up in the folder `src/locale/`.
- In order to extract strings from the application, call
  `python3 manage.py makemessages -l en_US`. This is important after
  making changes to translatable strings.

Adding new languages requires adding the translated files in the
`src/locale/` folder and adjusting the file
`src/paperless/settings.py` to include the new language:

```python
LANGUAGES = [
```

## Documentation

The documentation is built using material-mkdocs, see their [documentation](https://squidfunk.github.io/mkdocs-material/reference/).
If you want to build the documentation locally, this is how you do it:

1. Have an active pipenv shell (`pipenv shell`) and install Python dependencies:

    ```bash
    $ pipenv install --dev
    ```

2. Build the documentation:

    ```bash
    $ mkdocs build --config-file mkdocs.yml
    ```

    _alternatively..._

3. Serve the documentation. This will spin up a copy of the
    documentation at http://127.0.0.1:8000 that will automatically
    refresh every time you change something.

    ```bash
    $ mkdocs serve
    ```

## Building the Docker image

The docker image is primarily built by the GitHub actions workflow, but
it can be faster when developing to build and tag an image locally.

Building the image works as with any image:

```bash
docker build --file Dockerfile --tag paperless:local --progress simple .
```

## Extending Paperless-ngx

Paperless-ngx does not have any fancy plugin systems and probably never
will. However, some parts of the application have been designed to allow
easy integration of additional features without any modification to the
base code.

### Making custom parsers

Paperless-ngx uses parsers to add documents. A parser is
responsible for:

- Retrieving the content from the original
- Creating a thumbnail
- _optional:_ Retrieving a created date from the original
- _optional:_ Creating an archived document from the original

Custom parsers can be added to Paperless-ngx to support more file types. In
order to do that, you need to write the parser itself and announce its
existence to Paperless-ngx.

The parser itself must extend `documents.parsers.DocumentParser` and
must implement the methods `parse` and `get_thumbnail`. You can provide
your own implementation to `get_date` if you don't want to rely on
Paperless-ngx's default date guessing mechanisms.

```python
class MyCustomParser(DocumentParser):
```

The parser is provided with a temporary working directory that is guaranteed
to be empty and removed after consumption finished. You can use that
directory to store any intermediate files and also use it to store the
thumbnail / archived document.

After that, you need to announce your parser to Paperless-ngx. You need to
connect a handler to the `document_consumer_declaration` signal. Have a
look in the file `src/paperless_tesseract/apps.py` on how that's done.
The handler is a method that returns information about your parser:

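The shape of such a handler can be sketched as follows. This is an illustrative, self-contained sketch: the class, mime type, and extension are placeholders, and in a real integration the parser must be an actual `DocumentParser` subclass as described above.

```python
# Illustrative sketch of a document_consumer_declaration handler.
# MyCustomParser, the mime type, and the extension are hypothetical
# placeholders invented for this example.

class MyCustomParser:  # stand-in for a real DocumentParser subclass
    pass

def myparser_consumer_declaration(sender, **kwargs):
    return {
        "parser": MyCustomParser,
        "weight": 0,
        "mime_types": {
            "application/x-my-format": ".myx",  # mime type -> default extension
        },
    }

declaration = myparser_consumer_declaration(sender=None)
```

The meaning of each key is described below.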
- `parser` is a reference to a class that extends `DocumentParser`.
- `weight` is used whenever two or more parsers are able to parse a
  file: The parser with the higher weight wins. This can be used to
  override the parsers provided by Paperless-ngx.
- `mime_types` is a dictionary. The keys are the mime types your
  parser supports and the value is the default file extension that
  Paperless-ngx should use when storing files and serving them for
  download. We could guess that from the file extensions, but some
  mime types have many extensions associated with them and the Python
  methods responsible for guessing the extension do not always return
  the same value.

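The weight rule can be modeled with a small sketch. This is not Paperless-ngx's actual selection code, just an illustration of the behavior described above, with made-up declarations:

```python
# Model of weight-based parser selection: among all declared parsers
# that support a given mime type, the one with the highest weight wins.

def select_parser(declarations, mime_type):
    candidates = [d for d in declarations if mime_type in d["mime_types"]]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d["weight"])["parser"]

declarations = [
    {"parser": "default_pdf_parser", "weight": 0,
     "mime_types": {"application/pdf": ".pdf"}},
    {"parser": "my_custom_parser", "weight": 10,
     "mime_types": {"application/pdf": ".pdf"}},
]
```

Here `select_parser(declarations, "application/pdf")` would pick the custom parser, since its weight of 10 beats the default's 0.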
# docs/faq.md

## _What's the general plan for Paperless-ngx?_

**A:** While Paperless-ngx is already considered largely
"feature-complete", it is a community-driven project and development
will be guided in this way. New features can be submitted via
[GitHub discussions](https://github.com/paperless-ngx/paperless-ngx/discussions)
and "up-voted" by the community, but this is not a
guarantee that the feature will be implemented. This project will always be
open to collaboration in the form of PRs, ideas etc.

## _I'm using docker. Where are my documents?_

**A:** By default, your documents are stored inside the docker volume
`paperless_media`. Docker manages this volume automatically for you. It
is a persistent storage and will persist as long as you don't
explicitly delete it. The actual location depends on your host operating
system. Don't move files around manually. This folder is meant to be
entirely managed by docker and paperless.

!!! note

    Files consumed from the consumption directory are re-created inside
    this media directory and are removed from the consumption directory
    itself.

## Let's say I want to switch tools in a year. Can I easily move to other systems?

**A:** Your documents are stored as plain files inside the media folder.
Here are a couple of notes about that.

- By default, paperless uses the internal ID of each document as its
  filename. This might not be very convenient for export. However, you
  can adjust the way files are stored in paperless by
  [configuring the filename format](advanced_usage.md#file-name-handling).
- [The exporter](administration.md#exporter) is
  another easy way to get your files out of paperless with reasonable
  file names.

WebP images are processed with OCR and converted into PDF documents.

- Plain text documents are supported as well and are added verbatim to
  paperless.
- With the optional Tika integration enabled (see [Tika configuration](https://docs.paperless-ngx.com/configuration#tika)),
  Paperless also supports various Office documents (.docx, .doc, .odt,
  .ppt, .pptx, .odp, .xls, .xlsx, .ods).

!!! note

    You can adjust some of the settings so that paperless uses less
    processing power. See [setup](setup.md#less-powerful-devices) for details.

## _How do I install paperless-ngx on Raspberry Pi?_

**A:** Docker images are available for arm64 hardware, so just
follow the [Docker Compose instructions](https://docs.paperless-ngx.com/setup/#installation). Apart from more required disk
space compared to a bare metal installation, docker comes with close to
zero overhead, even on Raspberry Pi.

If you decide to go with the bare metal route, be aware that some of
the python requirements do not have precompiled packages for ARM /
ARM64. Installation of these will require additional development
libraries and compilation will take a long time.

!!! note

    For ARMv7 (32-bit) systems, paperless may still function, but it could require
    modifications to the Dockerfile (if using Docker) or additional
    tools for installing bare metal. It is suggested to upgrade to arm64
    instead.

## _How do I run this on Unraid?_

**A:** Paperless-ngx is available as a community
app for Unraid. [Tooa](https://github.com/Tooa) created a container template for that.

**A:** I honestly don't know! As for all other devices that might be
able to run paperless, you're a bit on your own. If you can't run the
docker image, the documentation has instructions for bare metal
installs.

## _How do I proxy this with NGINX?_

**A:** See [the wiki](https://github.com/paperless-ngx/paperless-ngx/wiki/Using-a-Reverse-Proxy-with-Paperless-ngx#nginx).

## _How do I get WebSocket support with Apache mod_wsgi_?

# docs/index.md

**Paperless-ngx** is a _community-supported_ open-source document management system that transforms your
|
**Paperless-ngx** is a _community-supported_ open-source document management system that transforms your
|
||||||
physical documents into a searchable online archive so you can keep, well, _less paper_.
|
physical documents into a searchable online archive so you can keep, well, _less paper_.
|
||||||
|
|
||||||
[Get started](/setup){ .md-button .md-button--primary .index-callout }
|
[Get started](setup.md){ .md-button .md-button--primary .index-callout }
|
||||||
[Demo](https://demo.paperless-ngx.com){ .md-button .md-button--secondary target=\_blank }
|
[Demo](https://demo.paperless-ngx.com){ .md-button .md-button--secondary target=\_blank }
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
@@ -15,103 +15,161 @@ physical documents into a searchable online archive so you can keep, well, _less
 </div>
 <div class="clear"></div>

-## Why This Exists
+## Features

-Paper is a nightmare. Environmental issues aside, there's no excuse for
-it in the 21st century. It takes up space, collects dust, doesn't
-support any form of a search feature, indexing is tedious, it's heavy
-and prone to damage & loss.
-
-This software is designed to make "going paperless" easier. No more worrying
-about finding stuff again, feed documents right from the post box into
-the scanner and then shred them. Perhaps you might find it useful too.
+- **Organize and index** your scanned documents with tags, correspondents, types, and more.
+- Performs **OCR** on your documents, adding searchable and selectable text, even to documents scanned with only images.
+- Utilizes the open-source Tesseract engine to recognize more than 100 languages.
+- Documents are saved as PDF/A format which is designed for long term storage, alongside the unaltered originals.
+- Uses machine-learning to automatically add tags, correspondents and document types to your documents.
+- Supports PDF documents, images, plain text files, Office documents (Word, Excel, Powerpoint, and LibreOffice equivalents)[^1] and more.
+- Paperless stores your documents plain on disk. Filenames and folders are managed by paperless and their format can be configured freely with different configurations assigned to different documents.
+- **Beautiful, modern web application** that features:
+  - Customizable dashboard with statistics.
+  - Filtering by tags, correspondents, types, and more.
+  - Bulk editing of tags, correspondents, types and more.
+  - Drag-and-drop uploading of documents throughout the app.
+  - Customizable views can be saved and displayed on the dashboard and / or sidebar.
+  - Support for custom fields of various data types.
+  - Shareable public links with optional expiration.
+- **Full text search** helps you find what you need:
+  - Auto completion suggests relevant words from your documents.
+  - Results are sorted by relevance to your search query.
+  - Highlighting shows you which parts of the document matched the query.
+  - Searching for similar documents ("More like this")
+- **Email processing**[^1]: import documents from your email accounts:
+  - Configure multiple accounts and rules for each account.
+  - After processing, paperless can perform actions on the messages such as marking as read, deleting and more.
+- A built-in robust **multi-user permissions** system that supports 'global' permissions as well as per document or object.
+- A powerful templating system that gives you more control over the consumption pipeline.
+- **Optimized** for multi core systems: Paperless-ngx consumes multiple documents in parallel.
+- The integrated sanity checker makes sure that your document archive is in good health.
+
+[^1]: Office document and email consumption support is optional and provided by Apache Tika (see [configuration](https://docs.paperless-ngx.com/configuration/#tika))

 ## Paperless, a history

-Paperless is a simple Django application running in two parts: a
-_Consumer_ (the thing that does the indexing) and the _Web server_ (the
-part that lets you search & download already-indexed documents). If you
-want to learn more about its functions keep on reading after the
-installation section.
+Paperless-ngx is the official successor to the original [Paperless](https://github.com/the-paperless-project/paperless) & [Paperless-ng](https://github.com/jonaswinkler/paperless-ng) projects and is designed to distribute the responsibility of advancing and supporting the project among a team of people. [Consider joining us!](https://github.com/paperless-ngx/paperless-ngx#community-support)

-Paperless-ngx is a document management system that transforms your
-physical documents into a searchable online archive so you can keep,
-well, _less paper_.
-
-Paperless-ngx forked from paperless-ng to continue the great work and
-distribute responsibility of supporting and advancing the project among
-a team of people.
-
-NG stands for both Angular (the framework used for the Frontend) and
-next-gen. Publishing this project under a different name also avoids
-confusion between paperless and paperless-ngx.
-
-If you want to learn about what's different in paperless-ngx from
-Paperless, check out these resources in the documentation:
-
-- [Some screenshots](#screenshots) of the new UI are available.
-- Read [this section](/advanced_usage#automatic-matching) if you want to learn about how paperless automates all
-  tagging using machine learning.
-- Paperless now comes with a [proper email consumer](/usage#usage-email) that's fully tested and production ready.
-- Paperless creates searchable PDF/A documents from whatever you put into the consumption directory. This means
-  that you can select text in image-only documents coming from your scanner.
-- See [this note](/administration#encryption) about GnuPG encryption in paperless-ngx.
-- Paperless is now integrated with a
-  [task processing queue](/setup#task_processor) that tells you at a glance when and why something is not working.
-- The [changelog](/changelog) contains a detailed list of all changes in paperless-ngx.
+Further discussion of the transition between these projects can be found at
+[ng#1599](https://github.com/jonaswinkler/paperless-ng/issues/1599) and [ng#1632](https://github.com/jonaswinkler/paperless-ng/issues/1632).

 ## Screenshots

-This is what Paperless-ngx looks like.
+Paperless-ngx aims to be as nice to use as it is useful. Check out some screenshots below.

-The dashboard shows customizable views on your document and allows
-document uploads:
-
-![](assets/screenshots/dashboard.png)
-
-The document list provides three different styles to scroll through your
-documents:
-
-![](assets/screenshots/documents-table.png)
-
-![](assets/screenshots/documents-smallcards.png)
-
-![](assets/screenshots/documents-largecards.png)
-
-Paperless-ngx also supports dark mode:
-
-![](assets/screenshots/documents-smallcards-dark.png)
-
-Extensive filtering mechanisms:
-
-![](assets/screenshots/documents-filter.png)
-
-Bulk editing of document tags, correspondents, etc.:
-
-![](assets/screenshots/bulk-edit.png)
-
-Side-by-side editing of documents:
-
-![](assets/screenshots/editing.png)
-
-Tag editing. This looks about the same for correspondents and document
-types.
-
-![](assets/screenshots/new-tag.png)
-
-Searching provides auto complete and highlights the results.
-
-![](assets/screenshots/search-preview.png)
-
-![](assets/screenshots/search-results.png)
-
-Fancy mail filters!
-
-![](assets/screenshots/mail-rules-edited.png)
+<div class="grid-flipped-left" markdown>
+![](assets/screenshots/dashboard.png)
+</div>
+<div class="grid-flipped-right" markdown>
+The dashboard shows saved views which can be sorted. Documents can be uploaded with the button or dropped anywhere in the application.
+</div>
+<div class="clear"></div>
+
+The document list provides three different styles to browse your documents.
+
+![](assets/screenshots/documents-table.png){: style="width:32%"}
+![](assets/screenshots/documents-smallcards.png){: style="width:32%"}
+![](assets/screenshots/documents-largecards.png){: style="width:32%"}
+
+<div class="clear"></div>
+
+<div class="grid-left" markdown>
+Use the 'slim' sidebar to focus on your docs and minimize the UI.
+</div>
+<div class="grid-right" markdown>
+![](…)
+</div>
+<div class="clear"></div>
+
+Of course, Paperless-ngx also supports dark mode:
+
+![](assets/screenshots/documents-smallcards-dark.png)
+
+<div class="clear"></div>
+
+<div class="grid-left" markdown>
+Quickly find documents with extensive filtering mechanisms.
+</div>
+<div class="grid-right" markdown>
+![](assets/screenshots/documents-filter.png)
+</div>
+<div class="clear"></div>
+<div class="grid-left" markdown>
+And perform bulk edit operations to set tags, correspondents, etc. as well as permissions.
+</div>
+<div class="grid-right" markdown>
+![](assets/screenshots/bulk-edit.png)
+</div>
+<div class="clear"></div>
+
+Side-by-side editing of documents.
+
+![](assets/screenshots/editing.png)
+
+<div class="grid-left" markdown>
+Support for custom fields.
+
+![](…)
+
+</div>
+<div class="grid-right" markdown>
+![](…)
+</div>
+<div class="clear"></div>
+
+<div class="grid-left" markdown>
+A robust permissions system with support for 'global' and document / object permissions.
+
+![](…)
+
+</div>
+<div class="grid-right" markdown>
+![](…)
+</div>
+<div class="clear"></div>
+
+<div class="grid-left" markdown>
+Searching provides auto complete and highlights the results.
+
+![](assets/screenshots/search-preview.png)
+
+</div>
+<div class="grid-right" markdown>
+![](assets/screenshots/search-results.png)
+</div>
+<div class="clear"></div>
+
+Tag, correspondent, document type and storage path editing.
+
+![](…){: style="width:21%; float: left"}
+![](…){: style="width:21%; margin-left: 4%; float: left"}
+![](…){: style="width:21%; margin-left: 4%; float: left"}
+![](…){: style="width:21%; margin-left: 4%; float: left"}
+
+<div class="clear"></div>
+
+<div class="grid-half-left" markdown>
+Mail rules support various filters and actions for incoming e-mails.
+
+![](assets/screenshots/mail-rules-edited.png)
+
+</div>
+<div class="grid-half-right" markdown>
+Consumption templates provide finer control over the document pipeline.
+
+![](…)
+
+</div>
+<div class="clear"></div>
+
+<div class="clear"></div>

 Mobile devices are supported.

-![](assets/screenshots/mobile.png)
+![](…){: style="width:32%"}
+![](…){: style="width:32%"}
+![](…){: style="width:32%"}

 ## Support

@@ -131,7 +189,7 @@ People interested in continuing the work on paperless-ngx are encouraged to reac

 ### Translation

-Paperless-ngx is available in many languages that are coordinated on [Crowdin](https://crwd.in/paperless-ngx). If you want to help out by translating paperless-ngx into your language, please head over to https://crwd.in/paperless-ngx, and thank you!
+Paperless-ngx is available in many languages that are coordinated on [Crowdin](https://crwd.in/paperless-ngx). If you want to help out by translating paperless-ngx into your language, please head over to the [Paperless-ngx project at Crowdin](https://crwd.in/paperless-ngx), and thank you!

 ## Scanners & Software
